# Action Connection

Action connections extend your installed integrations and allow you to take action in your third-party systems (e.g. AWS, GitLab, and Statuspage) with Datadog’s Workflow Automation and App Builder products.

Datadog’s Integrations automatically provide authentication for Slack, Microsoft Teams, PagerDuty, Opsgenie, JIRA, GitHub, and Statuspage. You do not need additional connections in order to access these tools within Workflow Automation and App Builder.

We offer granular access control for editing and resolving connections.

## [Get an existing Action Connection](https://docs.datadoghq.com/api/latest/action-connection/#get-an-existing-action-connection)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#get-an-existing-action-connection-v2)

GET, using the endpoint for your Datadog site:

* https://api.ap1.datadoghq.com/api/v2/actions/connections/{connection_id}
* https://api.ap2.datadoghq.com/api/v2/actions/connections/{connection_id}
* https://api.datadoghq.eu/api/v2/actions/connections/{connection_id}
* https://api.ddog-gov.com/api/v2/actions/connections/{connection_id}
* https://api.datadoghq.com/api/v2/actions/connections/{connection_id}
* https://api.us3.datadoghq.com/api/v2/actions/connections/{connection_id}
* https://api.us5.datadoghq.com/api/v2/actions/connections/{connection_id}

### Overview

Get an existing Action Connection. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key).

### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| connection_id [_required_] | string | The ID of the action connection |

### Response

* [200](https://docs.datadoghq.com/api/latest/action-connection/#GetActionConnection-200-v2)
* [400](https://docs.datadoghq.com/api/latest/action-connection/#GetActionConnection-400-v2)
* [403](https://docs.datadoghq.com/api/latest/action-connection/#GetActionConnection-403-v2)
* [404](https://docs.datadoghq.com/api/latest/action-connection/#GetActionConnection-404-v2)
* [429](https://docs.datadoghq.com/api/latest/action-connection/#GetActionConnection-429-v2)

Successfully get Action Connection

* [Model](https://docs.datadoghq.com/api/latest/action-connection/)
* [Example](https://docs.datadoghq.com/api/latest/action-connection/)

The response for found connection

Field Type Description data object Data related to the connection. attributes [_required_] object The definition of `ActionConnectionAttributes` object. integration [_required_] The definition of `ActionConnectionIntegration` object. Option 1 object The definition of `AWSIntegration` object. credentials [_required_] The definition of `AWSCredentials` object. Option 1 object The definition of `AWSAssumeRole` object. account_id [_required_] string AWS account the connection is created for external_id string External ID used to scope which connection can be used to assume the role principal_id string AWS account that will assume the role role [_required_] string Role to assume type [_required_] enum The definition of `AWSAssumeRoleType` object. Allowed enum values: `AWSAssumeRole` type [_required_] enum The definition of `AWSIntegrationType` object. Allowed enum values: `AWS` Option 2 object The definition of the `AnthropicIntegration` object.
credentials [_required_] The definition of the `AnthropicCredentials` object. Option 1 object The definition of the `AnthropicAPIKey` object. api_token [_required_] string The `AnthropicAPIKey` `api_token`. type [_required_] enum The definition of the `AnthropicAPIKey` object. Allowed enum values: `AnthropicAPIKey` type [_required_] enum The definition of the `AnthropicIntegrationType` object. Allowed enum values: `Anthropic` Option 3 object The definition of the `AsanaIntegration` object. credentials [_required_] The definition of the `AsanaCredentials` object. Option 1 object The definition of the `AsanaAccessToken` object. access_token [_required_] string The `AsanaAccessToken` `access_token`. type [_required_] enum The definition of the `AsanaAccessToken` object. Allowed enum values: `AsanaAccessToken` type [_required_] enum The definition of the `AsanaIntegrationType` object. Allowed enum values: `Asana` Option 4 object The definition of the `AzureIntegration` object. credentials [_required_] The definition of the `AzureCredentials` object. Option 1 object The definition of the `AzureTenant` object. app_client_id [_required_] string The Client ID, also known as the Application ID in Azure, is a unique identifier for an application. It's used to identify the application during the authentication process. Your Application (client) ID is listed in the application's overview page. You can navigate to your application via the Azure Directory. client_secret [_required_] string The Client Secret is a confidential piece of information known only to the application and Azure AD. It's used to prove the application's identity. Your Client Secret is available from the application’s secrets page. You can navigate to your application via the Azure Directory. custom_scopes string If provided, the custom scope to be requested from Microsoft when acquiring an OAuth 2 access token. This custom scope is used only in conjunction with the HTTP action. A resource's scope is constructed by using the identifier URI for the resource and .default, separated by a forward slash (/) as follows:{identifierURI}/.default. tenant_id [_required_] string The Tenant ID, also known as the Directory ID in Azure, is a unique identifier that represents an Azure AD instance. Your Tenant ID (Directory ID) is listed in your Active Directory overview page under the 'Tenant information' section. type [_required_] enum The definition of the `AzureTenant` object. Allowed enum values: `AzureTenant` type [_required_] enum The definition of the `AzureIntegrationType` object. Allowed enum values: `Azure` Option 5 object The definition of the `CircleCIIntegration` object. credentials [_required_] The definition of the `CircleCICredentials` object. Option 1 object The definition of the `CircleCIAPIKey` object. api_token [_required_] string The `CircleCIAPIKey` `api_token`. type [_required_] enum The definition of the `CircleCIAPIKey` object. Allowed enum values: `CircleCIAPIKey` type [_required_] enum The definition of the `CircleCIIntegrationType` object. Allowed enum values: `CircleCI` Option 6 object The definition of the `ClickupIntegration` object. credentials [_required_] The definition of the `ClickupCredentials` object. Option 1 object The definition of the `ClickupAPIKey` object. api_token [_required_] string The `ClickupAPIKey` `api_token`. type [_required_] enum The definition of the `ClickupAPIKey` object. Allowed enum values: `ClickupAPIKey` type [_required_] enum The definition of the `ClickupIntegrationType` object. 
Allowed enum values: `Clickup` Option 7 object The definition of the `CloudflareIntegration` object. credentials [_required_] The definition of the `CloudflareCredentials` object. Option 1 object The definition of the `CloudflareAPIToken` object. api_token [_required_] string The `CloudflareAPIToken` `api_token`. type [_required_] enum The definition of the `CloudflareAPIToken` object. Allowed enum values: `CloudflareAPIToken` Option 2 object The definition of the `CloudflareGlobalAPIToken` object. auth_email [_required_] string The `CloudflareGlobalAPIToken` `auth_email`. global_api_key [_required_] string The `CloudflareGlobalAPIToken` `global_api_key`. type [_required_] enum The definition of the `CloudflareGlobalAPIToken` object. Allowed enum values: `CloudflareGlobalAPIToken` type [_required_] enum The definition of the `CloudflareIntegrationType` object. Allowed enum values: `Cloudflare` Option 8 object The definition of the `ConfigCatIntegration` object. credentials [_required_] The definition of the `ConfigCatCredentials` object. Option 1 object The definition of the `ConfigCatSDKKey` object. api_password [_required_] string The `ConfigCatSDKKey` `api_password`. api_username [_required_] string The `ConfigCatSDKKey` `api_username`. sdk_key [_required_] string The `ConfigCatSDKKey` `sdk_key`. type [_required_] enum The definition of the `ConfigCatSDKKey` object. Allowed enum values: `ConfigCatSDKKey` type [_required_] enum The definition of the `ConfigCatIntegrationType` object. Allowed enum values: `ConfigCat` Option 9 object The definition of the `DatadogIntegration` object. credentials [_required_] The definition of the `DatadogCredentials` object. Option 1 object The definition of the `DatadogAPIKey` object. api_key [_required_] string The `DatadogAPIKey` `api_key`. app_key [_required_] string The `DatadogAPIKey` `app_key`. datacenter [_required_] string The `DatadogAPIKey` `datacenter`. subdomain string Custom subdomain used for Datadog URLs generated with this Connection. For example, if this org uses `https://acme.datadoghq.com` to access Datadog, set this field to `acme`. If this field is omitted, generated URLs will use the default site URL for its datacenter (see ). type [_required_] enum The definition of the `DatadogAPIKey` object. Allowed enum values: `DatadogAPIKey` type [_required_] enum The definition of the `DatadogIntegrationType` object. Allowed enum values: `Datadog` Option 10 object The definition of the `FastlyIntegration` object. credentials [_required_] The definition of the `FastlyCredentials` object. Option 1 object The definition of the `FastlyAPIKey` object. api_key [_required_] string The `FastlyAPIKey` `api_key`. type [_required_] enum The definition of the `FastlyAPIKey` object. Allowed enum values: `FastlyAPIKey` type [_required_] enum The definition of the `FastlyIntegrationType` object. Allowed enum values: `Fastly` Option 11 object The definition of the `FreshserviceIntegration` object. credentials [_required_] The definition of the `FreshserviceCredentials` object. Option 1 object The definition of the `FreshserviceAPIKey` object. api_key [_required_] string The `FreshserviceAPIKey` `api_key`. domain [_required_] string The `FreshserviceAPIKey` `domain`. type [_required_] enum The definition of the `FreshserviceAPIKey` object. Allowed enum values: `FreshserviceAPIKey` type [_required_] enum The definition of the `FreshserviceIntegrationType` object. Allowed enum values: `Freshservice` Option 12 object The definition of the `GCPIntegration` object. 
credentials [_required_] The definition of the `GCPCredentials` object. Option 1 object The definition of the `GCPServiceAccount` object. private_key [_required_] string The `GCPServiceAccount` `private_key`. service_account_email [_required_] string The `GCPServiceAccount` `service_account_email`. type [_required_] enum The definition of the `GCPServiceAccount` object. Allowed enum values: `GCPServiceAccount` type [_required_] enum The definition of the `GCPIntegrationType` object. Allowed enum values: `GCP` Option 13 object The definition of the `GeminiIntegration` object. credentials [_required_] The definition of the `GeminiCredentials` object. Option 1 object The definition of the `GeminiAPIKey` object. api_key [_required_] string The `GeminiAPIKey` `api_key`. type [_required_] enum The definition of the `GeminiAPIKey` object. Allowed enum values: `GeminiAPIKey` type [_required_] enum The definition of the `GeminiIntegrationType` object. Allowed enum values: `Gemini` Option 14 object The definition of the `GitlabIntegration` object. credentials [_required_] The definition of the `GitlabCredentials` object. Option 1 object The definition of the `GitlabAPIKey` object. api_token [_required_] string The `GitlabAPIKey` `api_token`. type [_required_] enum The definition of the `GitlabAPIKey` object. Allowed enum values: `GitlabAPIKey` type [_required_] enum The definition of the `GitlabIntegrationType` object. Allowed enum values: `Gitlab` Option 15 object The definition of the `GreyNoiseIntegration` object. credentials [_required_] The definition of the `GreyNoiseCredentials` object. Option 1 object The definition of the `GreyNoiseAPIKey` object. api_key [_required_] string The `GreyNoiseAPIKey` `api_key`. type [_required_] enum The definition of the `GreyNoiseAPIKey` object. Allowed enum values: `GreyNoiseAPIKey` type [_required_] enum The definition of the `GreyNoiseIntegrationType` object. Allowed enum values: `GreyNoise` Option 16 object The definition of `HTTPIntegration` object. base_url [_required_] string Base HTTP url for the integration credentials [_required_] The definition of `HTTPCredentials` object. Option 1 object The definition of `HTTPTokenAuth` object. body object The definition of `HTTPBody` object. content string Serialized body content content_type string Content type of the body headers [object] The `HTTPTokenAuth` `headers`. name [_required_] string The `HTTPHeader` `name`. value [_required_] string The `HTTPHeader` `value`. tokens [object] The `HTTPTokenAuth` `tokens`. name [_required_] string The `HTTPToken` `name`. type [_required_] enum The definition of `TokenType` object. Allowed enum values: `SECRET` value [_required_] string The `HTTPToken` `value`. type [_required_] enum The definition of `HTTPTokenAuthType` object. Allowed enum values: `HTTPTokenAuth` url_parameters [object] The `HTTPTokenAuth` `url_parameters`. name [_required_] string Name for tokens. value [_required_] string The `UrlParam` `value`. type [_required_] enum The definition of `HTTPIntegrationType` object. Allowed enum values: `HTTP` Option 17 object The definition of the `LaunchDarklyIntegration` object. credentials [_required_] The definition of the `LaunchDarklyCredentials` object. Option 1 object The definition of the `LaunchDarklyAPIKey` object. api_token [_required_] string The `LaunchDarklyAPIKey` `api_token`. type [_required_] enum The definition of the `LaunchDarklyAPIKey` object. 
Allowed enum values: `LaunchDarklyAPIKey` type [_required_] enum The definition of the `LaunchDarklyIntegrationType` object. Allowed enum values: `LaunchDarkly` Option 18 object The definition of the `NotionIntegration` object. credentials [_required_] The definition of the `NotionCredentials` object. Option 1 object The definition of the `NotionAPIKey` object. api_token [_required_] string The `NotionAPIKey` `api_token`. type [_required_] enum The definition of the `NotionAPIKey` object. Allowed enum values: `NotionAPIKey` type [_required_] enum The definition of the `NotionIntegrationType` object. Allowed enum values: `Notion` Option 19 object The definition of the `OktaIntegration` object. credentials [_required_] The definition of the `OktaCredentials` object. Option 1 object The definition of the `OktaAPIToken` object. api_token [_required_] string The `OktaAPIToken` `api_token`. domain [_required_] string The `OktaAPIToken` `domain`. type [_required_] enum The definition of the `OktaAPIToken` object. Allowed enum values: `OktaAPIToken` type [_required_] enum The definition of the `OktaIntegrationType` object. Allowed enum values: `Okta` Option 20 object The definition of the `OpenAIIntegration` object. credentials [_required_] The definition of the `OpenAICredentials` object. Option 1 object The definition of the `OpenAIAPIKey` object. api_token [_required_] string The `OpenAIAPIKey` `api_token`. type [_required_] enum The definition of the `OpenAIAPIKey` object. Allowed enum values: `OpenAIAPIKey` type [_required_] enum The definition of the `OpenAIIntegrationType` object. Allowed enum values: `OpenAI` Option 21 object The definition of the `ServiceNowIntegration` object. credentials [_required_] The definition of the `ServiceNowCredentials` object. Option 1 object The definition of the `ServiceNowBasicAuth` object. instance [_required_] string The `ServiceNowBasicAuth` `instance`. password [_required_] string The `ServiceNowBasicAuth` `password`. type [_required_] enum The definition of the `ServiceNowBasicAuth` object. Allowed enum values: `ServiceNowBasicAuth` username [_required_] string The `ServiceNowBasicAuth` `username`. type [_required_] enum The definition of the `ServiceNowIntegrationType` object. Allowed enum values: `ServiceNow` Option 22 object The definition of the `SplitIntegration` object. credentials [_required_] The definition of the `SplitCredentials` object. Option 1 object The definition of the `SplitAPIKey` object. api_key [_required_] string The `SplitAPIKey` `api_key`. type [_required_] enum The definition of the `SplitAPIKey` object. Allowed enum values: `SplitAPIKey` type [_required_] enum The definition of the `SplitIntegrationType` object. Allowed enum values: `Split` Option 23 object The definition of the `StatsigIntegration` object. credentials [_required_] The definition of the `StatsigCredentials` object. Option 1 object The definition of the `StatsigAPIKey` object. api_key [_required_] string The `StatsigAPIKey` `api_key`. type [_required_] enum The definition of the `StatsigAPIKey` object. Allowed enum values: `StatsigAPIKey` type [_required_] enum The definition of the `StatsigIntegrationType` object. Allowed enum values: `Statsig` Option 24 object The definition of the `VirusTotalIntegration` object. credentials [_required_] The definition of the `VirusTotalCredentials` object. Option 1 object The definition of the `VirusTotalAPIKey` object. api_key [_required_] string The `VirusTotalAPIKey` `api_key`. 
type [_required_] enum The definition of the `VirusTotalAPIKey` object. Allowed enum values: `VirusTotalAPIKey` type [_required_] enum The definition of the `VirusTotalIntegrationType` object. Allowed enum values: `VirusTotal` name [_required_] string Name of the connection id string The connection identifier type [_required_] enum The definition of `ActionConnectionDataType` object. Allowed enum values: `action_connection` ``` { "data": { "attributes": { "integration": { "credentials": { "account_id": "111222333444", "external_id": "33a1011635c44b38a064cf14e82e1d8f", "principal_id": "123456789012", "role": "my-role", "type": "AWSAssumeRole" }, "type": "AWS" }, "name": "My AWS Connection" }, "id": "string", "type": "action_connection" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. 
status string Status code of the response. title string Short human-readable summary of the error.

```
{ "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] }
```

Too Many Requests

* [Model](https://docs.datadoghq.com/api/latest/action-connection/)
* [Example](https://docs.datadoghq.com/api/latest/action-connection/)

API error response.

Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error.

```
{ "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript)

##### Get an existing Action Connection

```
# Path parameters
export connection_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your site,
# for example api.datadoghq.eu, api.us3.datadoghq.com, or api.ap1.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v2/actions/connections/${connection_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get an existing Action Connection

```
"""
Get an existing Action Connection returns "Successfully get Action Connection" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ActionConnectionApi(api_client)
    response = api_instance.get_action_connection(
        connection_id="cb460d51-3c88-4e87-adac-d47131d0423d",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, setting `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
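The examples above and below use Datadog's official client libraries. If you only need a quick check, the same request can be made with any HTTP client. The following sketch is an editorial illustration rather than one of the generated examples: it assumes the Python `requests` package, reuses the `DD_API_KEY`/`DD_APP_KEY` environment variables and headers from the curl command, and branches on the 200/400/403/404/429 statuses documented above.

```
# Illustrative sketch (not an official Datadog example): call the endpoint
# directly and inspect the documented response statuses.
import os
import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")  # e.g. datadoghq.eu, us3.datadoghq.com
connection_id = "cb460d51-3c88-4e87-adac-d47131d0423d"  # example ID from this page

resp = requests.get(
    f"https://api.{DD_SITE}/api/v2/actions/connections/{connection_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    timeout=10,
)

if resp.status_code == 200:
    connection = resp.json()["data"]
    print(connection["attributes"]["name"], connection["id"])
elif resp.status_code in (400, 403, 404, 429):
    # Error payloads follow the documented model: {"errors": [{"title": ..., "detail": ...}]}
    for error in resp.json().get("errors", []):
        print(f"{resp.status_code} {error.get('title')}: {error.get('detail')}")
else:
    resp.raise_for_status()
```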
##### Get an existing Action Connection

```
# Get an existing Action Connection returns "Successfully get Action Connection" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new
p api_instance.get_action_connection("cb460d51-3c88-4e87-adac-d47131d0423d")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get an existing Action Connection

```
// Get an existing Action Connection returns "Successfully get Action Connection" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewActionConnectionApi(apiClient)
	resp, r, err := api.GetActionConnection(ctx, "cb460d51-3c88-4e87-adac-d47131d0423d")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.GetActionConnection`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.GetActionConnection`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get an existing Action Connection

```
// Get an existing Action Connection returns "Successfully get Action Connection" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ActionConnectionApi;
import com.datadog.api.client.v2.model.GetActionConnectionResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient);

    try {
      GetActionConnectionResponse result =
          apiInstance.getActionConnection("cb460d51-3c88-4e87-adac-d47131d0423d");
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling ActionConnectionApi#getActionConnection");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get an existing Action Connection

```
// Get an existing Action Connection returns "Successfully get Action Connection"
// response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = ActionConnectionAPI::with_config(configuration);
    let resp = api
        .get_action_connection("cb460d51-3c88-4e87-adac-d47131d0423d".to_string())
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get an existing Action Connection

```
/**
 * Get an existing Action Connection returns "Successfully get Action Connection" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.ActionConnectionApi(configuration);

const params: v2.ActionConnectionApiGetActionConnectionRequest = {
  connectionId: "cb460d51-3c88-4e87-adac-d47131d0423d",
};

apiInstance
  .getActionConnection(params)
  .then((data: v2.GetActionConnectionResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
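Across all of the response variants in the model above, the `integration` and `credentials` objects are discriminated unions: the `type` field tells you which option you received. The short sketch below is illustrative only; it walks the 200 response example documented on this page (assumed here to be already decoded into a Python dict, for instance from the raw request shown earlier) and reads the AWS role details.

```
# Illustrative sketch: walking the documented response model.
# The payload is the example 200 response from this page.
example_response = {
    "data": {
        "attributes": {
            "integration": {
                "credentials": {
                    "account_id": "111222333444",
                    "external_id": "33a1011635c44b38a064cf14e82e1d8f",
                    "principal_id": "123456789012",
                    "role": "my-role",
                    "type": "AWSAssumeRole",
                },
                "type": "AWS",
            },
            "name": "My AWS Connection",
        },
        "id": "string",
        "type": "action_connection",
    }
}

attributes = example_response["data"]["attributes"]
integration = attributes["integration"]

print(f"Connection: {attributes['name']}")
print(f"Integration type: {integration['type']}")

# The credentials object is a tagged union; its `type` field selects the variant.
if integration["credentials"]["type"] == "AWSAssumeRole":
    creds = integration["credentials"]
    print(f"Assumes role {creds['role']} in account {creds['account_id']}")
```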
* * *

## [Create a new Action Connection](https://docs.datadoghq.com/api/latest/action-connection/#create-a-new-action-connection)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#create-a-new-action-connection-v2)

POST, using the endpoint for your Datadog site:

* https://api.ap1.datadoghq.com/api/v2/actions/connections
* https://api.ap2.datadoghq.com/api/v2/actions/connections
* https://api.datadoghq.eu/api/v2/actions/connections
* https://api.ddog-gov.com/api/v2/actions/connections
* https://api.datadoghq.com/api/v2/actions/connections
* https://api.us3.datadoghq.com/api/v2/actions/connections
* https://api.us5.datadoghq.com/api/v2/actions/connections

### Overview

Create a new Action Connection. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key).

### Request

#### Body Data (required)

* [Model](https://docs.datadoghq.com/api/latest/action-connection/)
* [Example](https://docs.datadoghq.com/api/latest/action-connection/)

Field Type Description data [_required_] object Data related to the connection. attributes [_required_] object The definition of `ActionConnectionAttributes` object. integration [_required_] The definition of `ActionConnectionIntegration` object. Option 1 object The definition of `AWSIntegration` object. credentials [_required_] The definition of `AWSCredentials` object. Option 1 object The definition of `AWSAssumeRole` object. account_id [_required_] string AWS account the connection is created for external_id string External ID used to scope which connection can be used to assume the role principal_id string AWS account that will assume the role role [_required_] string Role to assume type [_required_] enum The definition of `AWSAssumeRoleType` object. Allowed enum values: `AWSAssumeRole` type [_required_] enum The definition of `AWSIntegrationType` object.
Allowed enum values: `AWS` Option 2 object The definition of the `AnthropicIntegration` object. credentials [_required_] The definition of the `AnthropicCredentials` object. Option 1 object The definition of the `AnthropicAPIKey` object. api_token [_required_] string The `AnthropicAPIKey` `api_token`. type [_required_] enum The definition of the `AnthropicAPIKey` object. Allowed enum values: `AnthropicAPIKey` type [_required_] enum The definition of the `AnthropicIntegrationType` object. Allowed enum values: `Anthropic` Option 3 object The definition of the `AsanaIntegration` object. credentials [_required_] The definition of the `AsanaCredentials` object. Option 1 object The definition of the `AsanaAccessToken` object. access_token [_required_] string The `AsanaAccessToken` `access_token`. type [_required_] enum The definition of the `AsanaAccessToken` object. Allowed enum values: `AsanaAccessToken` type [_required_] enum The definition of the `AsanaIntegrationType` object. Allowed enum values: `Asana` Option 4 object The definition of the `AzureIntegration` object. credentials [_required_] The definition of the `AzureCredentials` object. Option 1 object The definition of the `AzureTenant` object. app_client_id [_required_] string The Client ID, also known as the Application ID in Azure, is a unique identifier for an application. It's used to identify the application during the authentication process. Your Application (client) ID is listed in the application's overview page. You can navigate to your application via the Azure Directory. client_secret [_required_] string The Client Secret is a confidential piece of information known only to the application and Azure AD. It's used to prove the application's identity. Your Client Secret is available from the application’s secrets page. You can navigate to your application via the Azure Directory. custom_scopes string If provided, the custom scope to be requested from Microsoft when acquiring an OAuth 2 access token. This custom scope is used only in conjunction with the HTTP action. A resource's scope is constructed by using the identifier URI for the resource and .default, separated by a forward slash (/) as follows:{identifierURI}/.default. tenant_id [_required_] string The Tenant ID, also known as the Directory ID in Azure, is a unique identifier that represents an Azure AD instance. Your Tenant ID (Directory ID) is listed in your Active Directory overview page under the 'Tenant information' section. type [_required_] enum The definition of the `AzureTenant` object. Allowed enum values: `AzureTenant` type [_required_] enum The definition of the `AzureIntegrationType` object. Allowed enum values: `Azure` Option 5 object The definition of the `CircleCIIntegration` object. credentials [_required_] The definition of the `CircleCICredentials` object. Option 1 object The definition of the `CircleCIAPIKey` object. api_token [_required_] string The `CircleCIAPIKey` `api_token`. type [_required_] enum The definition of the `CircleCIAPIKey` object. Allowed enum values: `CircleCIAPIKey` type [_required_] enum The definition of the `CircleCIIntegrationType` object. Allowed enum values: `CircleCI` Option 6 object The definition of the `ClickupIntegration` object. credentials [_required_] The definition of the `ClickupCredentials` object. Option 1 object The definition of the `ClickupAPIKey` object. api_token [_required_] string The `ClickupAPIKey` `api_token`. type [_required_] enum The definition of the `ClickupAPIKey` object. 
Allowed enum values: `ClickupAPIKey` type [_required_] enum The definition of the `ClickupIntegrationType` object. Allowed enum values: `Clickup` Option 7 object The definition of the `CloudflareIntegration` object. credentials [_required_] The definition of the `CloudflareCredentials` object. Option 1 object The definition of the `CloudflareAPIToken` object. api_token [_required_] string The `CloudflareAPIToken` `api_token`. type [_required_] enum The definition of the `CloudflareAPIToken` object. Allowed enum values: `CloudflareAPIToken` Option 2 object The definition of the `CloudflareGlobalAPIToken` object. auth_email [_required_] string The `CloudflareGlobalAPIToken` `auth_email`. global_api_key [_required_] string The `CloudflareGlobalAPIToken` `global_api_key`. type [_required_] enum The definition of the `CloudflareGlobalAPIToken` object. Allowed enum values: `CloudflareGlobalAPIToken` type [_required_] enum The definition of the `CloudflareIntegrationType` object. Allowed enum values: `Cloudflare` Option 8 object The definition of the `ConfigCatIntegration` object. credentials [_required_] The definition of the `ConfigCatCredentials` object. Option 1 object The definition of the `ConfigCatSDKKey` object. api_password [_required_] string The `ConfigCatSDKKey` `api_password`. api_username [_required_] string The `ConfigCatSDKKey` `api_username`. sdk_key [_required_] string The `ConfigCatSDKKey` `sdk_key`. type [_required_] enum The definition of the `ConfigCatSDKKey` object. Allowed enum values: `ConfigCatSDKKey` type [_required_] enum The definition of the `ConfigCatIntegrationType` object. Allowed enum values: `ConfigCat` Option 9 object The definition of the `DatadogIntegration` object. credentials [_required_] The definition of the `DatadogCredentials` object. Option 1 object The definition of the `DatadogAPIKey` object. api_key [_required_] string The `DatadogAPIKey` `api_key`. app_key [_required_] string The `DatadogAPIKey` `app_key`. datacenter [_required_] string The `DatadogAPIKey` `datacenter`. subdomain string Custom subdomain used for Datadog URLs generated with this Connection. For example, if this org uses `https://acme.datadoghq.com` to access Datadog, set this field to `acme`. If this field is omitted, generated URLs will use the default site URL for its datacenter (see ). type [_required_] enum The definition of the `DatadogAPIKey` object. Allowed enum values: `DatadogAPIKey` type [_required_] enum The definition of the `DatadogIntegrationType` object. Allowed enum values: `Datadog` Option 10 object The definition of the `FastlyIntegration` object. credentials [_required_] The definition of the `FastlyCredentials` object. Option 1 object The definition of the `FastlyAPIKey` object. api_key [_required_] string The `FastlyAPIKey` `api_key`. type [_required_] enum The definition of the `FastlyAPIKey` object. Allowed enum values: `FastlyAPIKey` type [_required_] enum The definition of the `FastlyIntegrationType` object. Allowed enum values: `Fastly` Option 11 object The definition of the `FreshserviceIntegration` object. credentials [_required_] The definition of the `FreshserviceCredentials` object. Option 1 object The definition of the `FreshserviceAPIKey` object. api_key [_required_] string The `FreshserviceAPIKey` `api_key`. domain [_required_] string The `FreshserviceAPIKey` `domain`. type [_required_] enum The definition of the `FreshserviceAPIKey` object. 
Allowed enum values: `FreshserviceAPIKey` type [_required_] enum The definition of the `FreshserviceIntegrationType` object. Allowed enum values: `Freshservice` Option 12 object The definition of the `GCPIntegration` object. credentials [_required_] The definition of the `GCPCredentials` object. Option 1 object The definition of the `GCPServiceAccount` object. private_key [_required_] string The `GCPServiceAccount` `private_key`. service_account_email [_required_] string The `GCPServiceAccount` `service_account_email`. type [_required_] enum The definition of the `GCPServiceAccount` object. Allowed enum values: `GCPServiceAccount` type [_required_] enum The definition of the `GCPIntegrationType` object. Allowed enum values: `GCP` Option 13 object The definition of the `GeminiIntegration` object. credentials [_required_] The definition of the `GeminiCredentials` object. Option 1 object The definition of the `GeminiAPIKey` object. api_key [_required_] string The `GeminiAPIKey` `api_key`. type [_required_] enum The definition of the `GeminiAPIKey` object. Allowed enum values: `GeminiAPIKey` type [_required_] enum The definition of the `GeminiIntegrationType` object. Allowed enum values: `Gemini` Option 14 object The definition of the `GitlabIntegration` object. credentials [_required_] The definition of the `GitlabCredentials` object. Option 1 object The definition of the `GitlabAPIKey` object. api_token [_required_] string The `GitlabAPIKey` `api_token`. type [_required_] enum The definition of the `GitlabAPIKey` object. Allowed enum values: `GitlabAPIKey` type [_required_] enum The definition of the `GitlabIntegrationType` object. Allowed enum values: `Gitlab` Option 15 object The definition of the `GreyNoiseIntegration` object. credentials [_required_] The definition of the `GreyNoiseCredentials` object. Option 1 object The definition of the `GreyNoiseAPIKey` object. api_key [_required_] string The `GreyNoiseAPIKey` `api_key`. type [_required_] enum The definition of the `GreyNoiseAPIKey` object. Allowed enum values: `GreyNoiseAPIKey` type [_required_] enum The definition of the `GreyNoiseIntegrationType` object. Allowed enum values: `GreyNoise` Option 16 object The definition of `HTTPIntegration` object. base_url [_required_] string Base HTTP url for the integration credentials [_required_] The definition of `HTTPCredentials` object. Option 1 object The definition of `HTTPTokenAuth` object. body object The definition of `HTTPBody` object. content string Serialized body content content_type string Content type of the body headers [object] The `HTTPTokenAuth` `headers`. name [_required_] string The `HTTPHeader` `name`. value [_required_] string The `HTTPHeader` `value`. tokens [object] The `HTTPTokenAuth` `tokens`. name [_required_] string The `HTTPToken` `name`. type [_required_] enum The definition of `TokenType` object. Allowed enum values: `SECRET` value [_required_] string The `HTTPToken` `value`. type [_required_] enum The definition of `HTTPTokenAuthType` object. Allowed enum values: `HTTPTokenAuth` url_parameters [object] The `HTTPTokenAuth` `url_parameters`. name [_required_] string Name for tokens. value [_required_] string The `UrlParam` `value`. type [_required_] enum The definition of `HTTPIntegrationType` object. Allowed enum values: `HTTP` Option 17 object The definition of the `LaunchDarklyIntegration` object. credentials [_required_] The definition of the `LaunchDarklyCredentials` object. Option 1 object The definition of the `LaunchDarklyAPIKey` object. 
api_token [_required_] string The `LaunchDarklyAPIKey` `api_token`. type [_required_] enum The definition of the `LaunchDarklyAPIKey` object. Allowed enum values: `LaunchDarklyAPIKey` type [_required_] enum The definition of the `LaunchDarklyIntegrationType` object. Allowed enum values: `LaunchDarkly` Option 18 object The definition of the `NotionIntegration` object. credentials [_required_] The definition of the `NotionCredentials` object. Option 1 object The definition of the `NotionAPIKey` object. api_token [_required_] string The `NotionAPIKey` `api_token`. type [_required_] enum The definition of the `NotionAPIKey` object. Allowed enum values: `NotionAPIKey` type [_required_] enum The definition of the `NotionIntegrationType` object. Allowed enum values: `Notion` Option 19 object The definition of the `OktaIntegration` object. credentials [_required_] The definition of the `OktaCredentials` object. Option 1 object The definition of the `OktaAPIToken` object. api_token [_required_] string The `OktaAPIToken` `api_token`. domain [_required_] string The `OktaAPIToken` `domain`. type [_required_] enum The definition of the `OktaAPIToken` object. Allowed enum values: `OktaAPIToken` type [_required_] enum The definition of the `OktaIntegrationType` object. Allowed enum values: `Okta` Option 20 object The definition of the `OpenAIIntegration` object. credentials [_required_] The definition of the `OpenAICredentials` object. Option 1 object The definition of the `OpenAIAPIKey` object. api_token [_required_] string The `OpenAIAPIKey` `api_token`. type [_required_] enum The definition of the `OpenAIAPIKey` object. Allowed enum values: `OpenAIAPIKey` type [_required_] enum The definition of the `OpenAIIntegrationType` object. Allowed enum values: `OpenAI` Option 21 object The definition of the `ServiceNowIntegration` object. credentials [_required_] The definition of the `ServiceNowCredentials` object. Option 1 object The definition of the `ServiceNowBasicAuth` object. instance [_required_] string The `ServiceNowBasicAuth` `instance`. password [_required_] string The `ServiceNowBasicAuth` `password`. type [_required_] enum The definition of the `ServiceNowBasicAuth` object. Allowed enum values: `ServiceNowBasicAuth` username [_required_] string The `ServiceNowBasicAuth` `username`. type [_required_] enum The definition of the `ServiceNowIntegrationType` object. Allowed enum values: `ServiceNow` Option 22 object The definition of the `SplitIntegration` object. credentials [_required_] The definition of the `SplitCredentials` object. Option 1 object The definition of the `SplitAPIKey` object. api_key [_required_] string The `SplitAPIKey` `api_key`. type [_required_] enum The definition of the `SplitAPIKey` object. Allowed enum values: `SplitAPIKey` type [_required_] enum The definition of the `SplitIntegrationType` object. Allowed enum values: `Split` Option 23 object The definition of the `StatsigIntegration` object. credentials [_required_] The definition of the `StatsigCredentials` object. Option 1 object The definition of the `StatsigAPIKey` object. api_key [_required_] string The `StatsigAPIKey` `api_key`. type [_required_] enum The definition of the `StatsigAPIKey` object. Allowed enum values: `StatsigAPIKey` type [_required_] enum The definition of the `StatsigIntegrationType` object. Allowed enum values: `Statsig` Option 24 object The definition of the `VirusTotalIntegration` object. credentials [_required_] The definition of the `VirusTotalCredentials` object. 
Option 1 object The definition of the `VirusTotalAPIKey` object. api_key [_required_] string The `VirusTotalAPIKey` `api_key`. type [_required_] enum The definition of the `VirusTotalAPIKey` object. Allowed enum values: `VirusTotalAPIKey` type [_required_] enum The definition of the `VirusTotalIntegrationType` object. Allowed enum values: `VirusTotal` name [_required_] string Name of the connection id string The connection identifier type [_required_] enum The definition of `ActionConnectionDataType` object. Allowed enum values: `action_connection` ``` { "data": { "type": "action_connection", "attributes": { "name": "Cassette Connection exampleactionconnection", "integration": { "type": "AWS", "credentials": { "type": "AWSAssumeRole", "role": "MyRoleUpdated", "account_id": "123456789123" } } } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/action-connection/#CreateActionConnection-201-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#CreateActionConnection-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#CreateActionConnection-403-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#CreateActionConnection-429-v2) Successfully created Action Connection * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) The response for a created connection Field Type Description data object Data related to the connection. attributes [_required_] object The definition of `ActionConnectionAttributes` object. integration [_required_] The definition of `ActionConnectionIntegration` object. Option 1 object The definition of `AWSIntegration` object. credentials [_required_] The definition of `AWSCredentials` object. Option 1 object The definition of `AWSAssumeRole` object. account_id [_required_] string AWS account the connection is created for external_id string External ID used to scope which connection can be used to assume the role principal_id string AWS account that will assume the role role [_required_] string Role to assume type [_required_] enum The definition of `AWSAssumeRoleType` object. Allowed enum values: `AWSAssumeRole` type [_required_] enum The definition of `AWSIntegrationType` object. Allowed enum values: `AWS` Option 2 object The definition of the `AnthropicIntegration` object. credentials [_required_] The definition of the `AnthropicCredentials` object. Option 1 object The definition of the `AnthropicAPIKey` object. api_token [_required_] string The `AnthropicAPIKey` `api_token`. type [_required_] enum The definition of the `AnthropicAPIKey` object. Allowed enum values: `AnthropicAPIKey` type [_required_] enum The definition of the `AnthropicIntegrationType` object. Allowed enum values: `Anthropic` Option 3 object The definition of the `AsanaIntegration` object. credentials [_required_] The definition of the `AsanaCredentials` object. Option 1 object The definition of the `AsanaAccessToken` object. access_token [_required_] string The `AsanaAccessToken` `access_token`. type [_required_] enum The definition of the `AsanaAccessToken` object. Allowed enum values: `AsanaAccessToken` type [_required_] enum The definition of the `AsanaIntegrationType` object. Allowed enum values: `Asana` Option 4 object The definition of the `AzureIntegration` object. credentials [_required_] The definition of the `AzureCredentials` object. Option 1 object The definition of the `AzureTenant` object. 
app_client_id [_required_] string The Client ID, also known as the Application ID in Azure, is a unique identifier for an application. It's used to identify the application during the authentication process. Your Application (client) ID is listed in the application's overview page. You can navigate to your application via the Azure Directory. client_secret [_required_] string The Client Secret is a confidential piece of information known only to the application and Azure AD. It's used to prove the application's identity. Your Client Secret is available from the application’s secrets page. You can navigate to your application via the Azure Directory. custom_scopes string If provided, the custom scope to be requested from Microsoft when acquiring an OAuth 2 access token. This custom scope is used only in conjunction with the HTTP action. A resource's scope is constructed by using the identifier URI for the resource and .default, separated by a forward slash (/) as follows:{identifierURI}/.default. tenant_id [_required_] string The Tenant ID, also known as the Directory ID in Azure, is a unique identifier that represents an Azure AD instance. Your Tenant ID (Directory ID) is listed in your Active Directory overview page under the 'Tenant information' section. type [_required_] enum The definition of the `AzureTenant` object. Allowed enum values: `AzureTenant` type [_required_] enum The definition of the `AzureIntegrationType` object. Allowed enum values: `Azure` Option 5 object The definition of the `CircleCIIntegration` object. credentials [_required_] The definition of the `CircleCICredentials` object. Option 1 object The definition of the `CircleCIAPIKey` object. api_token [_required_] string The `CircleCIAPIKey` `api_token`. type [_required_] enum The definition of the `CircleCIAPIKey` object. Allowed enum values: `CircleCIAPIKey` type [_required_] enum The definition of the `CircleCIIntegrationType` object. Allowed enum values: `CircleCI` Option 6 object The definition of the `ClickupIntegration` object. credentials [_required_] The definition of the `ClickupCredentials` object. Option 1 object The definition of the `ClickupAPIKey` object. api_token [_required_] string The `ClickupAPIKey` `api_token`. type [_required_] enum The definition of the `ClickupAPIKey` object. Allowed enum values: `ClickupAPIKey` type [_required_] enum The definition of the `ClickupIntegrationType` object. Allowed enum values: `Clickup` Option 7 object The definition of the `CloudflareIntegration` object. credentials [_required_] The definition of the `CloudflareCredentials` object. Option 1 object The definition of the `CloudflareAPIToken` object. api_token [_required_] string The `CloudflareAPIToken` `api_token`. type [_required_] enum The definition of the `CloudflareAPIToken` object. Allowed enum values: `CloudflareAPIToken` Option 2 object The definition of the `CloudflareGlobalAPIToken` object. auth_email [_required_] string The `CloudflareGlobalAPIToken` `auth_email`. global_api_key [_required_] string The `CloudflareGlobalAPIToken` `global_api_key`. type [_required_] enum The definition of the `CloudflareGlobalAPIToken` object. Allowed enum values: `CloudflareGlobalAPIToken` type [_required_] enum The definition of the `CloudflareIntegrationType` object. Allowed enum values: `Cloudflare` Option 8 object The definition of the `ConfigCatIntegration` object. credentials [_required_] The definition of the `ConfigCatCredentials` object. Option 1 object The definition of the `ConfigCatSDKKey` object. 
api_password [_required_] string The `ConfigCatSDKKey` `api_password`. api_username [_required_] string The `ConfigCatSDKKey` `api_username`. sdk_key [_required_] string The `ConfigCatSDKKey` `sdk_key`. type [_required_] enum The definition of the `ConfigCatSDKKey` object. Allowed enum values: `ConfigCatSDKKey` type [_required_] enum The definition of the `ConfigCatIntegrationType` object. Allowed enum values: `ConfigCat` Option 9 object The definition of the `DatadogIntegration` object. credentials [_required_] The definition of the `DatadogCredentials` object. Option 1 object The definition of the `DatadogAPIKey` object. api_key [_required_] string The `DatadogAPIKey` `api_key`. app_key [_required_] string The `DatadogAPIKey` `app_key`. datacenter [_required_] string The `DatadogAPIKey` `datacenter`. subdomain string Custom subdomain used for Datadog URLs generated with this Connection. For example, if this org uses `https://acme.datadoghq.com` to access Datadog, set this field to `acme`. If this field is omitted, generated URLs will use the default site URL for its datacenter (see ). type [_required_] enum The definition of the `DatadogAPIKey` object. Allowed enum values: `DatadogAPIKey` type [_required_] enum The definition of the `DatadogIntegrationType` object. Allowed enum values: `Datadog` Option 10 object The definition of the `FastlyIntegration` object. credentials [_required_] The definition of the `FastlyCredentials` object. Option 1 object The definition of the `FastlyAPIKey` object. api_key [_required_] string The `FastlyAPIKey` `api_key`. type [_required_] enum The definition of the `FastlyAPIKey` object. Allowed enum values: `FastlyAPIKey` type [_required_] enum The definition of the `FastlyIntegrationType` object. Allowed enum values: `Fastly` Option 11 object The definition of the `FreshserviceIntegration` object. credentials [_required_] The definition of the `FreshserviceCredentials` object. Option 1 object The definition of the `FreshserviceAPIKey` object. api_key [_required_] string The `FreshserviceAPIKey` `api_key`. domain [_required_] string The `FreshserviceAPIKey` `domain`. type [_required_] enum The definition of the `FreshserviceAPIKey` object. Allowed enum values: `FreshserviceAPIKey` type [_required_] enum The definition of the `FreshserviceIntegrationType` object. Allowed enum values: `Freshservice` Option 12 object The definition of the `GCPIntegration` object. credentials [_required_] The definition of the `GCPCredentials` object. Option 1 object The definition of the `GCPServiceAccount` object. private_key [_required_] string The `GCPServiceAccount` `private_key`. service_account_email [_required_] string The `GCPServiceAccount` `service_account_email`. type [_required_] enum The definition of the `GCPServiceAccount` object. Allowed enum values: `GCPServiceAccount` type [_required_] enum The definition of the `GCPIntegrationType` object. Allowed enum values: `GCP` Option 13 object The definition of the `GeminiIntegration` object. credentials [_required_] The definition of the `GeminiCredentials` object. Option 1 object The definition of the `GeminiAPIKey` object. api_key [_required_] string The `GeminiAPIKey` `api_key`. type [_required_] enum The definition of the `GeminiAPIKey` object. Allowed enum values: `GeminiAPIKey` type [_required_] enum The definition of the `GeminiIntegrationType` object. Allowed enum values: `Gemini` Option 14 object The definition of the `GitlabIntegration` object. 
credentials [_required_] The definition of the `GitlabCredentials` object. Option 1 object The definition of the `GitlabAPIKey` object. api_token [_required_] string The `GitlabAPIKey` `api_token`. type [_required_] enum The definition of the `GitlabAPIKey` object. Allowed enum values: `GitlabAPIKey` type [_required_] enum The definition of the `GitlabIntegrationType` object. Allowed enum values: `Gitlab` Option 15 object The definition of the `GreyNoiseIntegration` object. credentials [_required_] The definition of the `GreyNoiseCredentials` object. Option 1 object The definition of the `GreyNoiseAPIKey` object. api_key [_required_] string The `GreyNoiseAPIKey` `api_key`. type [_required_] enum The definition of the `GreyNoiseAPIKey` object. Allowed enum values: `GreyNoiseAPIKey` type [_required_] enum The definition of the `GreyNoiseIntegrationType` object. Allowed enum values: `GreyNoise` Option 16 object The definition of `HTTPIntegration` object. base_url [_required_] string Base HTTP url for the integration credentials [_required_] The definition of `HTTPCredentials` object. Option 1 object The definition of `HTTPTokenAuth` object. body object The definition of `HTTPBody` object. content string Serialized body content content_type string Content type of the body headers [object] The `HTTPTokenAuth` `headers`. name [_required_] string The `HTTPHeader` `name`. value [_required_] string The `HTTPHeader` `value`. tokens [object] The `HTTPTokenAuth` `tokens`. name [_required_] string The `HTTPToken` `name`. type [_required_] enum The definition of `TokenType` object. Allowed enum values: `SECRET` value [_required_] string The `HTTPToken` `value`. type [_required_] enum The definition of `HTTPTokenAuthType` object. Allowed enum values: `HTTPTokenAuth` url_parameters [object] The `HTTPTokenAuth` `url_parameters`. name [_required_] string Name for tokens. value [_required_] string The `UrlParam` `value`. type [_required_] enum The definition of `HTTPIntegrationType` object. Allowed enum values: `HTTP` Option 17 object The definition of the `LaunchDarklyIntegration` object. credentials [_required_] The definition of the `LaunchDarklyCredentials` object. Option 1 object The definition of the `LaunchDarklyAPIKey` object. api_token [_required_] string The `LaunchDarklyAPIKey` `api_token`. type [_required_] enum The definition of the `LaunchDarklyAPIKey` object. Allowed enum values: `LaunchDarklyAPIKey` type [_required_] enum The definition of the `LaunchDarklyIntegrationType` object. Allowed enum values: `LaunchDarkly` Option 18 object The definition of the `NotionIntegration` object. credentials [_required_] The definition of the `NotionCredentials` object. Option 1 object The definition of the `NotionAPIKey` object. api_token [_required_] string The `NotionAPIKey` `api_token`. type [_required_] enum The definition of the `NotionAPIKey` object. Allowed enum values: `NotionAPIKey` type [_required_] enum The definition of the `NotionIntegrationType` object. Allowed enum values: `Notion` Option 19 object The definition of the `OktaIntegration` object. credentials [_required_] The definition of the `OktaCredentials` object. Option 1 object The definition of the `OktaAPIToken` object. api_token [_required_] string The `OktaAPIToken` `api_token`. domain [_required_] string The `OktaAPIToken` `domain`. type [_required_] enum The definition of the `OktaAPIToken` object. Allowed enum values: `OktaAPIToken` type [_required_] enum The definition of the `OktaIntegrationType` object. 
Allowed enum values: `Okta` Option 20 object The definition of the `OpenAIIntegration` object. credentials [_required_] The definition of the `OpenAICredentials` object. Option 1 object The definition of the `OpenAIAPIKey` object. api_token [_required_] string The `OpenAIAPIKey` `api_token`. type [_required_] enum The definition of the `OpenAIAPIKey` object. Allowed enum values: `OpenAIAPIKey` type [_required_] enum The definition of the `OpenAIIntegrationType` object. Allowed enum values: `OpenAI` Option 21 object The definition of the `ServiceNowIntegration` object. credentials [_required_] The definition of the `ServiceNowCredentials` object. Option 1 object The definition of the `ServiceNowBasicAuth` object. instance [_required_] string The `ServiceNowBasicAuth` `instance`. password [_required_] string The `ServiceNowBasicAuth` `password`. type [_required_] enum The definition of the `ServiceNowBasicAuth` object. Allowed enum values: `ServiceNowBasicAuth` username [_required_] string The `ServiceNowBasicAuth` `username`. type [_required_] enum The definition of the `ServiceNowIntegrationType` object. Allowed enum values: `ServiceNow` Option 22 object The definition of the `SplitIntegration` object. credentials [_required_] The definition of the `SplitCredentials` object. Option 1 object The definition of the `SplitAPIKey` object. api_key [_required_] string The `SplitAPIKey` `api_key`. type [_required_] enum The definition of the `SplitAPIKey` object. Allowed enum values: `SplitAPIKey` type [_required_] enum The definition of the `SplitIntegrationType` object. Allowed enum values: `Split` Option 23 object The definition of the `StatsigIntegration` object. credentials [_required_] The definition of the `StatsigCredentials` object. Option 1 object The definition of the `StatsigAPIKey` object. api_key [_required_] string The `StatsigAPIKey` `api_key`. type [_required_] enum The definition of the `StatsigAPIKey` object. Allowed enum values: `StatsigAPIKey` type [_required_] enum The definition of the `StatsigIntegrationType` object. Allowed enum values: `Statsig` Option 24 object The definition of the `VirusTotalIntegration` object. credentials [_required_] The definition of the `VirusTotalCredentials` object. Option 1 object The definition of the `VirusTotalAPIKey` object. api_key [_required_] string The `VirusTotalAPIKey` `api_key`. type [_required_] enum The definition of the `VirusTotalAPIKey` object. Allowed enum values: `VirusTotalAPIKey` type [_required_] enum The definition of the `VirusTotalIntegrationType` object. Allowed enum values: `VirusTotal` name [_required_] string Name of the connection id string The connection identifier type [_required_] enum The definition of `ActionConnectionDataType` object. Allowed enum values: `action_connection` ``` { "data": { "attributes": { "integration": { "credentials": { "account_id": "111222333444", "external_id": "33a1011635c44b38a064cf14e82e1d8f", "principal_id": "123456789012", "role": "my-role", "type": "AWSAssumeRole" }, "type": "AWS" }, "name": "My AWS Connection" }, "id": "string", "type": "action_connection" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Create a new Action Connection returns "Successfully created Action Connection" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions/connections" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "action_connection", "attributes": { "name": "Cassette Connection exampleactionconnection", "integration": { "type": "AWS", "credentials": { "type": "AWSAssumeRole", "role": "MyRoleUpdated", "account_id": "123456789123" } } } } } EOF ``` ##### Create a new Action Connection returns "Successfully created Action Connection" response ``` // Create a new Action Connection returns "Successfully created Action Connection" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateActionConnectionRequest{ Data: datadogV2.ActionConnectionData{ Type: datadogV2.ACTIONCONNECTIONDATATYPE_ACTION_CONNECTION, Attributes: datadogV2.ActionConnectionAttributes{ Name: "Cassette Connection exampleactionconnection", Integration: datadogV2.ActionConnectionIntegration{ AWSIntegration: &datadogV2.AWSIntegration{ Type: datadogV2.AWSINTEGRATIONTYPE_AWS, Credentials: datadogV2.AWSCredentials{ AWSAssumeRole: &datadogV2.AWSAssumeRole{ Type: datadogV2.AWSASSUMEROLETYPE_AWSASSUMEROLE, Role: "MyRoleUpdated", AccountId: "123456789123", }}, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) resp, r, err := api.CreateActionConnection(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.CreateActionConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.CreateActionConnection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new 
Action Connection returns "Successfully created Action Connection" response ``` // Create a new Action Connection returns "Successfully created Action Connection" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; import com.datadog.api.client.v2.model.AWSAssumeRole; import com.datadog.api.client.v2.model.AWSAssumeRoleType; import com.datadog.api.client.v2.model.AWSCredentials; import com.datadog.api.client.v2.model.AWSIntegration; import com.datadog.api.client.v2.model.AWSIntegrationType; import com.datadog.api.client.v2.model.ActionConnectionAttributes; import com.datadog.api.client.v2.model.ActionConnectionData; import com.datadog.api.client.v2.model.ActionConnectionDataType; import com.datadog.api.client.v2.model.ActionConnectionIntegration; import com.datadog.api.client.v2.model.CreateActionConnectionRequest; import com.datadog.api.client.v2.model.CreateActionConnectionResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); CreateActionConnectionRequest body = new CreateActionConnectionRequest() .data( new ActionConnectionData() .type(ActionConnectionDataType.ACTION_CONNECTION) .attributes( new ActionConnectionAttributes() .name("Cassette Connection exampleactionconnection") .integration( new ActionConnectionIntegration( new AWSIntegration() .type(AWSIntegrationType.AWS) .credentials( new AWSCredentials( new AWSAssumeRole() .type(AWSAssumeRoleType.AWSASSUMEROLE) .role("MyRoleUpdated") .accountId("123456789123"))))))); try { CreateActionConnectionResponse result = apiInstance.createActionConnection(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#createActionConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new Action Connection returns "Successfully created Action Connection" response ``` """ Create a new Action Connection returns "Successfully created Action Connection" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi from datadog_api_client.v2.model.action_connection_attributes import ActionConnectionAttributes from datadog_api_client.v2.model.action_connection_data import ActionConnectionData from datadog_api_client.v2.model.action_connection_data_type import ActionConnectionDataType from datadog_api_client.v2.model.aws_assume_role import AWSAssumeRole from datadog_api_client.v2.model.aws_assume_role_type import AWSAssumeRoleType from datadog_api_client.v2.model.aws_integration import AWSIntegration from datadog_api_client.v2.model.aws_integration_type import AWSIntegrationType from datadog_api_client.v2.model.create_action_connection_request import CreateActionConnectionRequest body = CreateActionConnectionRequest( 
data=ActionConnectionData( type=ActionConnectionDataType.ACTION_CONNECTION, attributes=ActionConnectionAttributes( name="Cassette Connection exampleactionconnection", integration=AWSIntegration( type=AWSIntegrationType.AWS, credentials=AWSAssumeRole( type=AWSAssumeRoleType.AWSASSUMEROLE, role="MyRoleUpdated", account_id="123456789123", ), ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) response = api_instance.create_action_connection(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new Action Connection returns "Successfully created Action Connection" response ``` # Create a new Action Connection returns "Successfully created Action Connection" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new body = DatadogAPIClient::V2::CreateActionConnectionRequest.new({ data: DatadogAPIClient::V2::ActionConnectionData.new({ type: DatadogAPIClient::V2::ActionConnectionDataType::ACTION_CONNECTION, attributes: DatadogAPIClient::V2::ActionConnectionAttributes.new({ name: "Cassette Connection exampleactionconnection", integration: DatadogAPIClient::V2::AWSIntegration.new({ type: DatadogAPIClient::V2::AWSIntegrationType::AWS, credentials: DatadogAPIClient::V2::AWSAssumeRole.new({ type: DatadogAPIClient::V2::AWSAssumeRoleType::AWSASSUMEROLE, role: "MyRoleUpdated", account_id: "123456789123", }), }), }), }), }) p api_instance.create_action_connection(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new Action Connection returns "Successfully created Action Connection" response ``` // Create a new Action Connection returns "Successfully created Action Connection" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; use datadog_api_client::datadogV2::model::AWSAssumeRole; use datadog_api_client::datadogV2::model::AWSAssumeRoleType; use datadog_api_client::datadogV2::model::AWSCredentials; use datadog_api_client::datadogV2::model::AWSIntegration; use datadog_api_client::datadogV2::model::AWSIntegrationType; use datadog_api_client::datadogV2::model::ActionConnectionAttributes; use datadog_api_client::datadogV2::model::ActionConnectionData; use datadog_api_client::datadogV2::model::ActionConnectionDataType; use datadog_api_client::datadogV2::model::ActionConnectionIntegration; use datadog_api_client::datadogV2::model::CreateActionConnectionRequest; #[tokio::main] async fn main() { let body = CreateActionConnectionRequest::new(ActionConnectionData::new( ActionConnectionAttributes::new( ActionConnectionIntegration::AWSIntegration(Box::new(AWSIntegration::new( AWSCredentials::AWSAssumeRole(Box::new(AWSAssumeRole::new( "123456789123".to_string(), "MyRoleUpdated".to_string(), AWSAssumeRoleType::AWSASSUMEROLE, ))), 
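                // AWSAssumeRole::new takes (account_id, role, type); the next argument to
                // AWSIntegration::new is the integration type tag.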
AWSIntegrationType::AWS, ))), "Cassette Connection exampleactionconnection".to_string(), ), ActionConnectionDataType::ACTION_CONNECTION, )); let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api.create_action_connection(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new Action Connection returns "Successfully created Action Connection" response ``` /** * Create a new Action Connection returns "Successfully created Action Connection" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); const params: v2.ActionConnectionApiCreateActionConnectionRequest = { body: { data: { type: "action_connection", attributes: { name: "Cassette Connection exampleactionconnection", integration: { type: "AWS", credentials: { type: "AWSAssumeRole", role: "MyRoleUpdated", accountId: "123456789123", }, }, }, }, }, }; apiInstance .createActionConnection(params) .then((data: v2.CreateActionConnectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing Action Connection](https://docs.datadoghq.com/api/latest/action-connection/#update-an-existing-action-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#update-an-existing-action-connection-v2) PATCH https://api.ap1.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.ap2.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.datadoghq.eu/api/v2/actions/connections/{connection_id}https://api.ddog-gov.com/api/v2/actions/connections/{connection_id}https://api.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.us3.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.us5.datadoghq.com/api/v2/actions/connections/{connection_id} ### Overview Update an existing Action Connection. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). ### Arguments #### Path Parameters Name Type Description connection_id [_required_] string The ID of the action connection ### Request #### Body Data (required) Update an existing Action Connection request body * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) Field Type Description data [_required_] object Data related to the connection update. attributes [_required_] object The definition of `ActionConnectionAttributesUpdate` object. 
integration The definition of `ActionConnectionIntegrationUpdate` object. Option 1 object The definition of `AWSIntegrationUpdate` object. credentials The definition of `AWSCredentialsUpdate` object. Option 1 object The definition of `AWSAssumeRoleUpdate` object. account_id string AWS account the connection is created for generate_new_external_id boolean The `AWSAssumeRoleUpdate` `generate_new_external_id`. role string Role to assume type [_required_] enum The definition of `AWSAssumeRoleType` object. Allowed enum values: `AWSAssumeRole` type [_required_] enum The definition of `AWSIntegrationType` object. Allowed enum values: `AWS` Option 2 object The definition of the `AnthropicIntegrationUpdate` object. credentials The definition of the `AnthropicCredentialsUpdate` object. Option 1 object The definition of the `AnthropicAPIKey` object. api_token string The `AnthropicAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `AnthropicAPIKey` object. Allowed enum values: `AnthropicAPIKey` type [_required_] enum The definition of the `AnthropicIntegrationType` object. Allowed enum values: `Anthropic` Option 3 object The definition of the `AsanaIntegrationUpdate` object. credentials The definition of the `AsanaCredentialsUpdate` object. Option 1 object The definition of the `AsanaAccessToken` object. access_token string The `AsanaAccessTokenUpdate` `access_token`. type [_required_] enum The definition of the `AsanaAccessToken` object. Allowed enum values: `AsanaAccessToken` type [_required_] enum The definition of the `AsanaIntegrationType` object. Allowed enum values: `Asana` Option 4 object The definition of the `AzureIntegrationUpdate` object. credentials The definition of the `AzureCredentialsUpdate` object. Option 1 object The definition of the `AzureTenant` object. app_client_id string The Client ID, also known as the Application ID in Azure, is a unique identifier for an application. It's used to identify the application during the authentication process. Your Application (client) ID is listed in the application's overview page. You can navigate to your application via the Azure Directory. client_secret string The Client Secret is a confidential piece of information known only to the application and Azure AD. It's used to prove the application's identity. Your Client Secret is available from the application’s secrets page. You can navigate to your application via the Azure Directory. custom_scopes string If provided, the custom scope to be requested from Microsoft when acquiring an OAuth 2 access token. This custom scope is used only in conjunction with the HTTP action. A resource's scope is constructed by using the identifier URI for the resource and .default, separated by a forward slash (/) as follows:{identifierURI}/.default. tenant_id string The Tenant ID, also known as the Directory ID in Azure, is a unique identifier that represents an Azure AD instance. Your Tenant ID (Directory ID) is listed in your Active Directory overview page under the 'Tenant information' section. type [_required_] enum The definition of the `AzureTenant` object. Allowed enum values: `AzureTenant` type [_required_] enum The definition of the `AzureIntegrationType` object. Allowed enum values: `Azure` Option 5 object The definition of the `CircleCIIntegrationUpdate` object. credentials The definition of the `CircleCICredentialsUpdate` object. Option 1 object The definition of the `CircleCIAPIKey` object. api_token string The `CircleCIAPIKeyUpdate` `api_token`. 
type [_required_] enum The definition of the `CircleCIAPIKey` object. Allowed enum values: `CircleCIAPIKey` type [_required_] enum The definition of the `CircleCIIntegrationType` object. Allowed enum values: `CircleCI` Option 6 object The definition of the `ClickupIntegrationUpdate` object. credentials The definition of the `ClickupCredentialsUpdate` object. Option 1 object The definition of the `ClickupAPIKey` object. api_token string The `ClickupAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `ClickupAPIKey` object. Allowed enum values: `ClickupAPIKey` type [_required_] enum The definition of the `ClickupIntegrationType` object. Allowed enum values: `Clickup` Option 7 object The definition of the `CloudflareIntegrationUpdate` object. credentials The definition of the `CloudflareCredentialsUpdate` object. Option 1 object The definition of the `CloudflareAPIToken` object. api_token string The `CloudflareAPITokenUpdate` `api_token`. type [_required_] enum The definition of the `CloudflareAPIToken` object. Allowed enum values: `CloudflareAPIToken` Option 2 object The definition of the `CloudflareGlobalAPIToken` object. auth_email string The `CloudflareGlobalAPITokenUpdate` `auth_email`. global_api_key string The `CloudflareGlobalAPITokenUpdate` `global_api_key`. type [_required_] enum The definition of the `CloudflareGlobalAPIToken` object. Allowed enum values: `CloudflareGlobalAPIToken` type [_required_] enum The definition of the `CloudflareIntegrationType` object. Allowed enum values: `Cloudflare` Option 8 object The definition of the `ConfigCatIntegrationUpdate` object. credentials The definition of the `ConfigCatCredentialsUpdate` object. Option 1 object The definition of the `ConfigCatSDKKey` object. api_password string The `ConfigCatSDKKeyUpdate` `api_password`. api_username string The `ConfigCatSDKKeyUpdate` `api_username`. sdk_key string The `ConfigCatSDKKeyUpdate` `sdk_key`. type [_required_] enum The definition of the `ConfigCatSDKKey` object. Allowed enum values: `ConfigCatSDKKey` type [_required_] enum The definition of the `ConfigCatIntegrationType` object. Allowed enum values: `ConfigCat` Option 9 object The definition of the `DatadogIntegrationUpdate` object. credentials The definition of the `DatadogCredentialsUpdate` object. Option 1 object The definition of the `DatadogAPIKey` object. api_key string The `DatadogAPIKeyUpdate` `api_key`. app_key string The `DatadogAPIKeyUpdate` `app_key`. datacenter string The `DatadogAPIKeyUpdate` `datacenter`. subdomain string Custom subdomain used for Datadog URLs generated with this Connection. For example, if this org uses `https://acme.datadoghq.com` to access Datadog, set this field to `acme`. If this field is omitted, generated URLs will use the default site URL for its datacenter (see ). type [_required_] enum The definition of the `DatadogAPIKey` object. Allowed enum values: `DatadogAPIKey` type [_required_] enum The definition of the `DatadogIntegrationType` object. Allowed enum values: `Datadog` Option 10 object The definition of the `FastlyIntegrationUpdate` object. credentials The definition of the `FastlyCredentialsUpdate` object. Option 1 object The definition of the `FastlyAPIKey` object. api_key string The `FastlyAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `FastlyAPIKey` object. Allowed enum values: `FastlyAPIKey` type [_required_] enum The definition of the `FastlyIntegrationType` object. 
Allowed enum values: `Fastly` Option 11 object The definition of the `FreshserviceIntegrationUpdate` object. credentials The definition of the `FreshserviceCredentialsUpdate` object. Option 1 object The definition of the `FreshserviceAPIKey` object. api_key string The `FreshserviceAPIKeyUpdate` `api_key`. domain string The `FreshserviceAPIKeyUpdate` `domain`. type [_required_] enum The definition of the `FreshserviceAPIKey` object. Allowed enum values: `FreshserviceAPIKey` type [_required_] enum The definition of the `FreshserviceIntegrationType` object. Allowed enum values: `Freshservice` Option 12 object The definition of the `GCPIntegrationUpdate` object. credentials The definition of the `GCPCredentialsUpdate` object. Option 1 object The definition of the `GCPServiceAccount` object. private_key string The `GCPServiceAccountUpdate` `private_key`. service_account_email string The `GCPServiceAccountUpdate` `service_account_email`. type [_required_] enum The definition of the `GCPServiceAccount` object. Allowed enum values: `GCPServiceAccount` type [_required_] enum The definition of the `GCPIntegrationType` object. Allowed enum values: `GCP` Option 13 object The definition of the `GeminiIntegrationUpdate` object. credentials The definition of the `GeminiCredentialsUpdate` object. Option 1 object The definition of the `GeminiAPIKey` object. api_key string The `GeminiAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `GeminiAPIKey` object. Allowed enum values: `GeminiAPIKey` type [_required_] enum The definition of the `GeminiIntegrationType` object. Allowed enum values: `Gemini` Option 14 object The definition of the `GitlabIntegrationUpdate` object. credentials The definition of the `GitlabCredentialsUpdate` object. Option 1 object The definition of the `GitlabAPIKey` object. api_token string The `GitlabAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `GitlabAPIKey` object. Allowed enum values: `GitlabAPIKey` type [_required_] enum The definition of the `GitlabIntegrationType` object. Allowed enum values: `Gitlab` Option 15 object The definition of the `GreyNoiseIntegrationUpdate` object. credentials The definition of the `GreyNoiseCredentialsUpdate` object. Option 1 object The definition of the `GreyNoiseAPIKey` object. api_key string The `GreyNoiseAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `GreyNoiseAPIKey` object. Allowed enum values: `GreyNoiseAPIKey` type [_required_] enum The definition of the `GreyNoiseIntegrationType` object. Allowed enum values: `GreyNoise` Option 16 object The definition of `HTTPIntegrationUpdate` object. base_url string Base HTTP url for the integration credentials The definition of `HTTPCredentialsUpdate` object. Option 1 object The definition of `HTTPTokenAuthUpdate` object. body object The definition of `HTTPBody` object. content string Serialized body content content_type string Content type of the body headers [object] The `HTTPTokenAuthUpdate` `headers`. deleted boolean Should the header be deleted. name [_required_] string The `HTTPHeaderUpdate` `name`. value string The `HTTPHeaderUpdate` `value`. tokens [object] The `HTTPTokenAuthUpdate` `tokens`. deleted boolean Should the header be deleted. name [_required_] string The `HTTPToken` `name`. type [_required_] enum The definition of `TokenType` object. Allowed enum values: `SECRET` value [_required_] string The `HTTPToken` `value`. type [_required_] enum The definition of `HTTPTokenAuthType` object. 
Allowed enum values: `HTTPTokenAuth` url_parameters [object] The `HTTPTokenAuthUpdate` `url_parameters`. deleted boolean Should the header be deleted. name [_required_] string Name for tokens. value string The `UrlParamUpdate` `value`. type [_required_] enum The definition of `HTTPIntegrationType` object. Allowed enum values: `HTTP` Option 17 object The definition of the `LaunchDarklyIntegrationUpdate` object. credentials The definition of the `LaunchDarklyCredentialsUpdate` object. Option 1 object The definition of the `LaunchDarklyAPIKey` object. api_token string The `LaunchDarklyAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `LaunchDarklyAPIKey` object. Allowed enum values: `LaunchDarklyAPIKey` type [_required_] enum The definition of the `LaunchDarklyIntegrationType` object. Allowed enum values: `LaunchDarkly` Option 18 object The definition of the `NotionIntegrationUpdate` object. credentials The definition of the `NotionCredentialsUpdate` object. Option 1 object The definition of the `NotionAPIKey` object. api_token string The `NotionAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `NotionAPIKey` object. Allowed enum values: `NotionAPIKey` type [_required_] enum The definition of the `NotionIntegrationType` object. Allowed enum values: `Notion` Option 19 object The definition of the `OktaIntegrationUpdate` object. credentials The definition of the `OktaCredentialsUpdate` object. Option 1 object The definition of the `OktaAPIToken` object. api_token string The `OktaAPITokenUpdate` `api_token`. domain string The `OktaAPITokenUpdate` `domain`. type [_required_] enum The definition of the `OktaAPIToken` object. Allowed enum values: `OktaAPIToken` type [_required_] enum The definition of the `OktaIntegrationType` object. Allowed enum values: `Okta` Option 20 object The definition of the `OpenAIIntegrationUpdate` object. credentials The definition of the `OpenAICredentialsUpdate` object. Option 1 object The definition of the `OpenAIAPIKey` object. api_token string The `OpenAIAPIKeyUpdate` `api_token`. type [_required_] enum The definition of the `OpenAIAPIKey` object. Allowed enum values: `OpenAIAPIKey` type [_required_] enum The definition of the `OpenAIIntegrationType` object. Allowed enum values: `OpenAI` Option 21 object The definition of the `ServiceNowIntegrationUpdate` object. credentials The definition of the `ServiceNowCredentialsUpdate` object. Option 1 object The definition of the `ServiceNowBasicAuth` object. instance string The `ServiceNowBasicAuthUpdate` `instance`. password string The `ServiceNowBasicAuthUpdate` `password`. type [_required_] enum The definition of the `ServiceNowBasicAuth` object. Allowed enum values: `ServiceNowBasicAuth` username string The `ServiceNowBasicAuthUpdate` `username`. type [_required_] enum The definition of the `ServiceNowIntegrationType` object. Allowed enum values: `ServiceNow` Option 22 object The definition of the `SplitIntegrationUpdate` object. credentials The definition of the `SplitCredentialsUpdate` object. Option 1 object The definition of the `SplitAPIKey` object. api_key string The `SplitAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `SplitAPIKey` object. Allowed enum values: `SplitAPIKey` type [_required_] enum The definition of the `SplitIntegrationType` object. Allowed enum values: `Split` Option 23 object The definition of the `StatsigIntegrationUpdate` object. credentials The definition of the `StatsigCredentialsUpdate` object. 
Option 1 object The definition of the `StatsigAPIKey` object. api_key string The `StatsigAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `StatsigAPIKey` object. Allowed enum values: `StatsigAPIKey` type [_required_] enum The definition of the `StatsigIntegrationType` object. Allowed enum values: `Statsig` Option 24 object The definition of the `VirusTotalIntegrationUpdate` object. credentials The definition of the `VirusTotalCredentialsUpdate` object. Option 1 object The definition of the `VirusTotalAPIKey` object. api_key string The `VirusTotalAPIKeyUpdate` `api_key`. type [_required_] enum The definition of the `VirusTotalAPIKey` object. Allowed enum values: `VirusTotalAPIKey` type [_required_] enum The definition of the `VirusTotalIntegrationType` object. Allowed enum values: `VirusTotal` name string Name of the connection type [_required_] enum The definition of `ActionConnectionDataType` object. Allowed enum values: `action_connection` ``` { "data": { "type": "action_connection", "attributes": { "name": "Cassette Connection", "integration": { "type": "AWS", "credentials": { "type": "AWSAssumeRole", "role": "MyRoleUpdated", "account_id": "123456789123" } } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/action-connection/#UpdateActionConnection-200-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#UpdateActionConnection-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#UpdateActionConnection-403-v2) * [404](https://docs.datadoghq.com/api/latest/action-connection/#UpdateActionConnection-404-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#UpdateActionConnection-429-v2) Successfully updated Action Connection * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) The response for an updated connection. Field Type Description data object Data related to the connection. attributes [_required_] object The definition of `ActionConnectionAttributes` object. integration [_required_] The definition of `ActionConnectionIntegration` object. Option 1 object The definition of `AWSIntegration` object. credentials [_required_] The definition of `AWSCredentials` object. Option 1 object The definition of `AWSAssumeRole` object. account_id [_required_] string AWS account the connection is created for external_id string External ID used to scope which connection can be used to assume the role principal_id string AWS account that will assume the role role [_required_] string Role to assume type [_required_] enum The definition of `AWSAssumeRoleType` object. Allowed enum values: `AWSAssumeRole` type [_required_] enum The definition of `AWSIntegrationType` object. Allowed enum values: `AWS` Option 2 object The definition of the `AnthropicIntegration` object. credentials [_required_] The definition of the `AnthropicCredentials` object. Option 1 object The definition of the `AnthropicAPIKey` object. api_token [_required_] string The `AnthropicAPIKey` `api_token`. type [_required_] enum The definition of the `AnthropicAPIKey` object. Allowed enum values: `AnthropicAPIKey` type [_required_] enum The definition of the `AnthropicIntegrationType` object. Allowed enum values: `Anthropic` Option 3 object The definition of the `AsanaIntegration` object. credentials [_required_] The definition of the `AsanaCredentials` object. Option 1 object The definition of the `AsanaAccessToken` object. 
access_token [_required_] string The `AsanaAccessToken` `access_token`. type [_required_] enum The definition of the `AsanaAccessToken` object. Allowed enum values: `AsanaAccessToken` type [_required_] enum The definition of the `AsanaIntegrationType` object. Allowed enum values: `Asana` Option 4 object The definition of the `AzureIntegration` object. credentials [_required_] The definition of the `AzureCredentials` object. Option 1 object The definition of the `AzureTenant` object. app_client_id [_required_] string The Client ID, also known as the Application ID in Azure, is a unique identifier for an application. It's used to identify the application during the authentication process. Your Application (client) ID is listed in the application's overview page. You can navigate to your application via the Azure Directory. client_secret [_required_] string The Client Secret is a confidential piece of information known only to the application and Azure AD. It's used to prove the application's identity. Your Client Secret is available from the application’s secrets page. You can navigate to your application via the Azure Directory. custom_scopes string If provided, the custom scope to be requested from Microsoft when acquiring an OAuth 2 access token. This custom scope is used only in conjunction with the HTTP action. A resource's scope is constructed by using the identifier URI for the resource and .default, separated by a forward slash (/) as follows:{identifierURI}/.default. tenant_id [_required_] string The Tenant ID, also known as the Directory ID in Azure, is a unique identifier that represents an Azure AD instance. Your Tenant ID (Directory ID) is listed in your Active Directory overview page under the 'Tenant information' section. type [_required_] enum The definition of the `AzureTenant` object. Allowed enum values: `AzureTenant` type [_required_] enum The definition of the `AzureIntegrationType` object. Allowed enum values: `Azure` Option 5 object The definition of the `CircleCIIntegration` object. credentials [_required_] The definition of the `CircleCICredentials` object. Option 1 object The definition of the `CircleCIAPIKey` object. api_token [_required_] string The `CircleCIAPIKey` `api_token`. type [_required_] enum The definition of the `CircleCIAPIKey` object. Allowed enum values: `CircleCIAPIKey` type [_required_] enum The definition of the `CircleCIIntegrationType` object. Allowed enum values: `CircleCI` Option 6 object The definition of the `ClickupIntegration` object. credentials [_required_] The definition of the `ClickupCredentials` object. Option 1 object The definition of the `ClickupAPIKey` object. api_token [_required_] string The `ClickupAPIKey` `api_token`. type [_required_] enum The definition of the `ClickupAPIKey` object. Allowed enum values: `ClickupAPIKey` type [_required_] enum The definition of the `ClickupIntegrationType` object. Allowed enum values: `Clickup` Option 7 object The definition of the `CloudflareIntegration` object. credentials [_required_] The definition of the `CloudflareCredentials` object. Option 1 object The definition of the `CloudflareAPIToken` object. api_token [_required_] string The `CloudflareAPIToken` `api_token`. type [_required_] enum The definition of the `CloudflareAPIToken` object. Allowed enum values: `CloudflareAPIToken` Option 2 object The definition of the `CloudflareGlobalAPIToken` object. auth_email [_required_] string The `CloudflareGlobalAPIToken` `auth_email`. 
global_api_key [_required_] string The `CloudflareGlobalAPIToken` `global_api_key`. type [_required_] enum The definition of the `CloudflareGlobalAPIToken` object. Allowed enum values: `CloudflareGlobalAPIToken` type [_required_] enum The definition of the `CloudflareIntegrationType` object. Allowed enum values: `Cloudflare` Option 8 object The definition of the `ConfigCatIntegration` object. credentials [_required_] The definition of the `ConfigCatCredentials` object. Option 1 object The definition of the `ConfigCatSDKKey` object. api_password [_required_] string The `ConfigCatSDKKey` `api_password`. api_username [_required_] string The `ConfigCatSDKKey` `api_username`. sdk_key [_required_] string The `ConfigCatSDKKey` `sdk_key`. type [_required_] enum The definition of the `ConfigCatSDKKey` object. Allowed enum values: `ConfigCatSDKKey` type [_required_] enum The definition of the `ConfigCatIntegrationType` object. Allowed enum values: `ConfigCat` Option 9 object The definition of the `DatadogIntegration` object. credentials [_required_] The definition of the `DatadogCredentials` object. Option 1 object The definition of the `DatadogAPIKey` object. api_key [_required_] string The `DatadogAPIKey` `api_key`. app_key [_required_] string The `DatadogAPIKey` `app_key`. datacenter [_required_] string The `DatadogAPIKey` `datacenter`. subdomain string Custom subdomain used for Datadog URLs generated with this Connection. For example, if this org uses `https://acme.datadoghq.com` to access Datadog, set this field to `acme`. If this field is omitted, generated URLs will use the default site URL for its datacenter (see ). type [_required_] enum The definition of the `DatadogAPIKey` object. Allowed enum values: `DatadogAPIKey` type [_required_] enum The definition of the `DatadogIntegrationType` object. Allowed enum values: `Datadog` Option 10 object The definition of the `FastlyIntegration` object. credentials [_required_] The definition of the `FastlyCredentials` object. Option 1 object The definition of the `FastlyAPIKey` object. api_key [_required_] string The `FastlyAPIKey` `api_key`. type [_required_] enum The definition of the `FastlyAPIKey` object. Allowed enum values: `FastlyAPIKey` type [_required_] enum The definition of the `FastlyIntegrationType` object. Allowed enum values: `Fastly` Option 11 object The definition of the `FreshserviceIntegration` object. credentials [_required_] The definition of the `FreshserviceCredentials` object. Option 1 object The definition of the `FreshserviceAPIKey` object. api_key [_required_] string The `FreshserviceAPIKey` `api_key`. domain [_required_] string The `FreshserviceAPIKey` `domain`. type [_required_] enum The definition of the `FreshserviceAPIKey` object. Allowed enum values: `FreshserviceAPIKey` type [_required_] enum The definition of the `FreshserviceIntegrationType` object. Allowed enum values: `Freshservice` Option 12 object The definition of the `GCPIntegration` object. credentials [_required_] The definition of the `GCPCredentials` object. Option 1 object The definition of the `GCPServiceAccount` object. private_key [_required_] string The `GCPServiceAccount` `private_key`. service_account_email [_required_] string The `GCPServiceAccount` `service_account_email`. type [_required_] enum The definition of the `GCPServiceAccount` object. Allowed enum values: `GCPServiceAccount` type [_required_] enum The definition of the `GCPIntegrationType` object. 
Allowed enum values: `GCP` Option 13 object The definition of the `GeminiIntegration` object. credentials [_required_] The definition of the `GeminiCredentials` object. Option 1 object The definition of the `GeminiAPIKey` object. api_key [_required_] string The `GeminiAPIKey` `api_key`. type [_required_] enum The definition of the `GeminiAPIKey` object. Allowed enum values: `GeminiAPIKey` type [_required_] enum The definition of the `GeminiIntegrationType` object. Allowed enum values: `Gemini` Option 14 object The definition of the `GitlabIntegration` object. credentials [_required_] The definition of the `GitlabCredentials` object. Option 1 object The definition of the `GitlabAPIKey` object. api_token [_required_] string The `GitlabAPIKey` `api_token`. type [_required_] enum The definition of the `GitlabAPIKey` object. Allowed enum values: `GitlabAPIKey` type [_required_] enum The definition of the `GitlabIntegrationType` object. Allowed enum values: `Gitlab` Option 15 object The definition of the `GreyNoiseIntegration` object. credentials [_required_] The definition of the `GreyNoiseCredentials` object. Option 1 object The definition of the `GreyNoiseAPIKey` object. api_key [_required_] string The `GreyNoiseAPIKey` `api_key`. type [_required_] enum The definition of the `GreyNoiseAPIKey` object. Allowed enum values: `GreyNoiseAPIKey` type [_required_] enum The definition of the `GreyNoiseIntegrationType` object. Allowed enum values: `GreyNoise` Option 16 object The definition of `HTTPIntegration` object. base_url [_required_] string Base HTTP url for the integration credentials [_required_] The definition of `HTTPCredentials` object. Option 1 object The definition of `HTTPTokenAuth` object. body object The definition of `HTTPBody` object. content string Serialized body content content_type string Content type of the body headers [object] The `HTTPTokenAuth` `headers`. name [_required_] string The `HTTPHeader` `name`. value [_required_] string The `HTTPHeader` `value`. tokens [object] The `HTTPTokenAuth` `tokens`. name [_required_] string The `HTTPToken` `name`. type [_required_] enum The definition of `TokenType` object. Allowed enum values: `SECRET` value [_required_] string The `HTTPToken` `value`. type [_required_] enum The definition of `HTTPTokenAuthType` object. Allowed enum values: `HTTPTokenAuth` url_parameters [object] The `HTTPTokenAuth` `url_parameters`. name [_required_] string Name for tokens. value [_required_] string The `UrlParam` `value`. type [_required_] enum The definition of `HTTPIntegrationType` object. Allowed enum values: `HTTP` Option 17 object The definition of the `LaunchDarklyIntegration` object. credentials [_required_] The definition of the `LaunchDarklyCredentials` object. Option 1 object The definition of the `LaunchDarklyAPIKey` object. api_token [_required_] string The `LaunchDarklyAPIKey` `api_token`. type [_required_] enum The definition of the `LaunchDarklyAPIKey` object. Allowed enum values: `LaunchDarklyAPIKey` type [_required_] enum The definition of the `LaunchDarklyIntegrationType` object. Allowed enum values: `LaunchDarkly` Option 18 object The definition of the `NotionIntegration` object. credentials [_required_] The definition of the `NotionCredentials` object. Option 1 object The definition of the `NotionAPIKey` object. api_token [_required_] string The `NotionAPIKey` `api_token`. type [_required_] enum The definition of the `NotionAPIKey` object. 
Allowed enum values: `NotionAPIKey` type [_required_] enum The definition of the `NotionIntegrationType` object. Allowed enum values: `Notion` Option 19 object The definition of the `OktaIntegration` object. credentials [_required_] The definition of the `OktaCredentials` object. Option 1 object The definition of the `OktaAPIToken` object. api_token [_required_] string The `OktaAPIToken` `api_token`. domain [_required_] string The `OktaAPIToken` `domain`. type [_required_] enum The definition of the `OktaAPIToken` object. Allowed enum values: `OktaAPIToken` type [_required_] enum The definition of the `OktaIntegrationType` object. Allowed enum values: `Okta` Option 20 object The definition of the `OpenAIIntegration` object. credentials [_required_] The definition of the `OpenAICredentials` object. Option 1 object The definition of the `OpenAIAPIKey` object. api_token [_required_] string The `OpenAIAPIKey` `api_token`. type [_required_] enum The definition of the `OpenAIAPIKey` object. Allowed enum values: `OpenAIAPIKey` type [_required_] enum The definition of the `OpenAIIntegrationType` object. Allowed enum values: `OpenAI` Option 21 object The definition of the `ServiceNowIntegration` object. credentials [_required_] The definition of the `ServiceNowCredentials` object. Option 1 object The definition of the `ServiceNowBasicAuth` object. instance [_required_] string The `ServiceNowBasicAuth` `instance`. password [_required_] string The `ServiceNowBasicAuth` `password`. type [_required_] enum The definition of the `ServiceNowBasicAuth` object. Allowed enum values: `ServiceNowBasicAuth` username [_required_] string The `ServiceNowBasicAuth` `username`. type [_required_] enum The definition of the `ServiceNowIntegrationType` object. Allowed enum values: `ServiceNow` Option 22 object The definition of the `SplitIntegration` object. credentials [_required_] The definition of the `SplitCredentials` object. Option 1 object The definition of the `SplitAPIKey` object. api_key [_required_] string The `SplitAPIKey` `api_key`. type [_required_] enum The definition of the `SplitAPIKey` object. Allowed enum values: `SplitAPIKey` type [_required_] enum The definition of the `SplitIntegrationType` object. Allowed enum values: `Split` Option 23 object The definition of the `StatsigIntegration` object. credentials [_required_] The definition of the `StatsigCredentials` object. Option 1 object The definition of the `StatsigAPIKey` object. api_key [_required_] string The `StatsigAPIKey` `api_key`. type [_required_] enum The definition of the `StatsigAPIKey` object. Allowed enum values: `StatsigAPIKey` type [_required_] enum The definition of the `StatsigIntegrationType` object. Allowed enum values: `Statsig` Option 24 object The definition of the `VirusTotalIntegration` object. credentials [_required_] The definition of the `VirusTotalCredentials` object. Option 1 object The definition of the `VirusTotalAPIKey` object. api_key [_required_] string The `VirusTotalAPIKey` `api_key`. type [_required_] enum The definition of the `VirusTotalAPIKey` object. Allowed enum values: `VirusTotalAPIKey` type [_required_] enum The definition of the `VirusTotalIntegrationType` object. Allowed enum values: `VirusTotal` name [_required_] string Name of the connection id string The connection identifier type [_required_] enum The definition of `ActionConnectionDataType` object. 
Allowed enum values: `action_connection` ``` { "data": { "attributes": { "integration": { "credentials": { "account_id": "111222333444", "external_id": "33a1011635c44b38a064cf14e82e1d8f", "principal_id": "123456789012", "role": "my-role", "type": "AWSAssumeRole" }, "type": "AWS" }, "name": "My AWS Connection" }, "id": "string", "type": "action_connection" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Update an existing Action Connection returns "Successfully updated Action Connection" response Copy ``` # Path parameters export connection_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions/connections/${connection_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "action_connection", "attributes": { "name": "Cassette Connection", "integration": { "type": "AWS", "credentials": { "type": "AWSAssumeRole", "role": "MyRoleUpdated", "account_id": "123456789123" } } } } } EOF ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` // Update an existing Action Connection returns "Successfully updated Action Connection" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpdateActionConnectionRequest{ Data: datadogV2.ActionConnectionDataUpdate{ Type: datadogV2.ACTIONCONNECTIONDATATYPE_ACTION_CONNECTION, Attributes: datadogV2.ActionConnectionAttributesUpdate{ Name: datadog.PtrString("Cassette Connection"), Integration: &datadogV2.ActionConnectionIntegrationUpdate{ AWSIntegrationUpdate: &datadogV2.AWSIntegrationUpdate{ Type: datadogV2.AWSINTEGRATIONTYPE_AWS, Credentials: 
&datadogV2.AWSCredentialsUpdate{ AWSAssumeRoleUpdate: &datadogV2.AWSAssumeRoleUpdate{ Type: datadogV2.AWSASSUMEROLETYPE_AWSASSUMEROLE, Role: datadog.PtrString("MyRoleUpdated"), AccountId: datadog.PtrString("123456789123"), }}, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) resp, r, err := api.UpdateActionConnection(ctx, "cb460d51-3c88-4e87-adac-d47131d0423d", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.UpdateActionConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.UpdateActionConnection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` // Update an existing Action Connection returns "Successfully updated Action Connection" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; import com.datadog.api.client.v2.model.AWSAssumeRoleType; import com.datadog.api.client.v2.model.AWSAssumeRoleUpdate; import com.datadog.api.client.v2.model.AWSCredentialsUpdate; import com.datadog.api.client.v2.model.AWSIntegrationType; import com.datadog.api.client.v2.model.AWSIntegrationUpdate; import com.datadog.api.client.v2.model.ActionConnectionAttributesUpdate; import com.datadog.api.client.v2.model.ActionConnectionDataType; import com.datadog.api.client.v2.model.ActionConnectionDataUpdate; import com.datadog.api.client.v2.model.ActionConnectionIntegrationUpdate; import com.datadog.api.client.v2.model.UpdateActionConnectionRequest; import com.datadog.api.client.v2.model.UpdateActionConnectionResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); UpdateActionConnectionRequest body = new UpdateActionConnectionRequest() .data( new ActionConnectionDataUpdate() .type(ActionConnectionDataType.ACTION_CONNECTION) .attributes( new ActionConnectionAttributesUpdate() .name("Cassette Connection") .integration( new ActionConnectionIntegrationUpdate( new AWSIntegrationUpdate() .type(AWSIntegrationType.AWS) .credentials( new AWSCredentialsUpdate( new AWSAssumeRoleUpdate() .type(AWSAssumeRoleType.AWSASSUMEROLE) .role("MyRoleUpdated") .accountId("123456789123"))))))); try { UpdateActionConnectionResponse result = apiInstance.updateActionConnection("cb460d51-3c88-4e87-adac-d47131d0423d", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#updateActionConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` """ Update an existing Action Connection returns "Successfully updated Action Connection" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi from datadog_api_client.v2.model.action_connection_attributes_update import ActionConnectionAttributesUpdate from datadog_api_client.v2.model.action_connection_data_type import ActionConnectionDataType from datadog_api_client.v2.model.action_connection_data_update import ActionConnectionDataUpdate from datadog_api_client.v2.model.aws_assume_role_type import AWSAssumeRoleType from datadog_api_client.v2.model.aws_assume_role_update import AWSAssumeRoleUpdate from datadog_api_client.v2.model.aws_integration_type import AWSIntegrationType from datadog_api_client.v2.model.aws_integration_update import AWSIntegrationUpdate from datadog_api_client.v2.model.update_action_connection_request import UpdateActionConnectionRequest body = UpdateActionConnectionRequest( data=ActionConnectionDataUpdate( type=ActionConnectionDataType.ACTION_CONNECTION, attributes=ActionConnectionAttributesUpdate( name="Cassette Connection", integration=AWSIntegrationUpdate( type=AWSIntegrationType.AWS, credentials=AWSAssumeRoleUpdate( type=AWSAssumeRoleType.AWSASSUMEROLE, role="MyRoleUpdated", account_id="123456789123", ), ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) response = api_instance.update_action_connection(connection_id="cb460d51-3c88-4e87-adac-d47131d0423d", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` # Update an existing Action Connection returns "Successfully updated Action Connection" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new body = DatadogAPIClient::V2::UpdateActionConnectionRequest.new({ data: DatadogAPIClient::V2::ActionConnectionDataUpdate.new({ type: DatadogAPIClient::V2::ActionConnectionDataType::ACTION_CONNECTION, attributes: DatadogAPIClient::V2::ActionConnectionAttributesUpdate.new({ name: "Cassette Connection", integration: DatadogAPIClient::V2::AWSIntegrationUpdate.new({ type: DatadogAPIClient::V2::AWSIntegrationType::AWS, credentials: DatadogAPIClient::V2::AWSAssumeRoleUpdate.new({ type: DatadogAPIClient::V2::AWSAssumeRoleType::AWSASSUMEROLE, role: "MyRoleUpdated", account_id: "123456789123", }), }), }), }), }) p api_instance.update_action_connection("cb460d51-3c88-4e87-adac-d47131d0423d", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` // Update an existing Action Connection returns "Successfully updated Action // Connection" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; use datadog_api_client::datadogV2::model::AWSAssumeRoleType; use datadog_api_client::datadogV2::model::AWSAssumeRoleUpdate; use datadog_api_client::datadogV2::model::AWSCredentialsUpdate; use datadog_api_client::datadogV2::model::AWSIntegrationType; use datadog_api_client::datadogV2::model::AWSIntegrationUpdate; use datadog_api_client::datadogV2::model::ActionConnectionAttributesUpdate; use datadog_api_client::datadogV2::model::ActionConnectionDataType; use datadog_api_client::datadogV2::model::ActionConnectionDataUpdate; use datadog_api_client::datadogV2::model::ActionConnectionIntegrationUpdate; use datadog_api_client::datadogV2::model::UpdateActionConnectionRequest; #[tokio::main] async fn main() { let body = UpdateActionConnectionRequest::new(ActionConnectionDataUpdate::new( ActionConnectionAttributesUpdate::new() .integration(ActionConnectionIntegrationUpdate::AWSIntegrationUpdate( Box::new( AWSIntegrationUpdate::new(AWSIntegrationType::AWS).credentials( AWSCredentialsUpdate::AWSAssumeRoleUpdate(Box::new( AWSAssumeRoleUpdate::new(AWSAssumeRoleType::AWSASSUMEROLE) .account_id("123456789123".to_string()) .role("MyRoleUpdated".to_string()), )), ), ), )) .name("Cassette Connection".to_string()), ActionConnectionDataType::ACTION_CONNECTION, )); let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .update_action_connection("cb460d51-3c88-4e87-adac-d47131d0423d".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing Action Connection returns "Successfully updated Action Connection" response ``` /** * Update an existing Action Connection returns "Successfully updated Action Connection" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); const params: v2.ActionConnectionApiUpdateActionConnectionRequest = { body: { data: { type: "action_connection", attributes: { name: "Cassette Connection", integration: { type: "AWS", credentials: { type: "AWSAssumeRole", role: "MyRoleUpdated", accountId: "123456789123", }, }, }, }, }, connectionId: "cb460d51-3c88-4e87-adac-d47131d0423d", }; apiInstance .updateActionConnection(params) .then((data: v2.UpdateActionConnectionResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing Action Connection](https://docs.datadoghq.com/api/latest/action-connection/#delete-an-existing-action-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#delete-an-existing-action-connection-v2) DELETE https://api.ap1.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.ap2.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.datadoghq.eu/api/v2/actions/connections/{connection_id}https://api.ddog-gov.com/api/v2/actions/connections/{connection_id}https://api.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.us3.datadoghq.com/api/v2/actions/connections/{connection_id}https://api.us5.datadoghq.com/api/v2/actions/connections/{connection_id} ### Overview Delete an existing Action Connection. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `connection_write` permission. ### Arguments #### Path Parameters Name Type Description connection_id [_required_] string The ID of the action connection ### Response * [204](https://docs.datadoghq.com/api/latest/action-connection/#DeleteActionConnection-204-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#DeleteActionConnection-403-v2) * [404](https://docs.datadoghq.com/api/latest/action-connection/#DeleteActionConnection-404-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#DeleteActionConnection-429-v2) The resource was deleted successfully. Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Delete an existing Action Connection ``` # Path parameters export connection_id="CHANGE_ME" # Curl command (adjust the host for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v2/actions/connections/${connection_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing Action Connection ``` """ Delete an existing Action Connection returns "The resource was deleted successfully."
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi # there is a valid "action_connection" in the system ACTION_CONNECTION_DATA_ID = environ["ACTION_CONNECTION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) api_instance.delete_action_connection( connection_id=ACTION_CONNECTION_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing Action Connection ``` # Delete an existing Action Connection returns "The resource was deleted successfully." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new # there is a valid "action_connection" in the system ACTION_CONNECTION_DATA_ID = ENV["ACTION_CONNECTION_DATA_ID"] api_instance.delete_action_connection(ACTION_CONNECTION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an existing Action Connection ``` // Delete an existing Action Connection returns "The resource was deleted successfully." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "action_connection" in the system ActionConnectionDataID := os.Getenv("ACTION_CONNECTION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) r, err := api.DeleteActionConnection(ctx, ActionConnectionDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.DeleteActionConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing Action Connection ``` // Delete an existing Action Connection returns "The resource was deleted successfully." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); // there is a valid "action_connection" in the system String ACTION_CONNECTION_DATA_ID = System.getenv("ACTION_CONNECTION_DATA_ID"); try { apiInstance.deleteActionConnection(ACTION_CONNECTION_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#deleteActionConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing Action Connection ``` // Delete an existing Action Connection returns "The resource was deleted // successfully." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; #[tokio::main] async fn main() { // there is a valid "action_connection" in the system let action_connection_data_id = std::env::var("ACTION_CONNECTION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .delete_action_connection(action_connection_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing Action Connection ``` /** * Delete an existing Action Connection returns "The resource was deleted successfully." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); // there is a valid "action_connection" in the system const ACTION_CONNECTION_DATA_ID = process.env .ACTION_CONNECTION_DATA_ID as string; const params: v2.ActionConnectionApiDeleteActionConnectionRequest = { connectionId: ACTION_CONNECTION_DATA_ID, }; apiInstance .deleteActionConnection(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Register a new App Key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key-v2) PUT https://api.ap1.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.ap2.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.eu/api/v2/actions/app_key_registrations/{app_key_id}https://api.ddog-gov.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us3.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us5.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id} ### Overview Register a new App Key This endpoint requires any of the following permissions: * `user_access_manage` * `user_app_keys` * `service_account_write` ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the app key ### Response * [201](https://docs.datadoghq.com/api/latest/action-connection/#RegisterAppKey-201-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#RegisterAppKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#RegisterAppKey-403-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#RegisterAppKey-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) The response object after creating an app key registration. Field Type Description data object Data related to the app key registration. id uuid The app key registration identifier type [_required_] enum The definition of `AppKeyRegistrationDataType` object. Allowed enum values: `app_key_registration` ``` { "data": { "id": "string", "type": "app_key_registration" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Register a new App Key ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command (adjust the host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v2/actions/app_key_registrations/${app_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Register a new App Key ``` """ Register a new App Key returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) response = api_instance.register_app_key( app_key_id="b7feea52-994e-4714-a100-1bd9eff5aee1", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Register a new App Key ``` # Register a new App Key returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new p api_instance.register_app_key("b7feea52-994e-4714-a100-1bd9eff5aee1") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Register a new App Key ``` // Register a new App Key returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) resp, r, err := api.RegisterAppKey(ctx, "b7feea52-994e-4714-a100-1bd9eff5aee1") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.RegisterAppKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ")
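// Print the registration result; the response data carries the app key registration id and the type "app_key_registration".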
fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.RegisterAppKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Register a new App Key ``` // Register a new App Key returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; import com.datadog.api.client.v2.model.RegisterAppKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); try { RegisterAppKeyResponse result = apiInstance.registerAppKey("b7feea52-994e-4714-a100-1bd9eff5aee1"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#registerAppKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Register a new App Key ``` // Register a new App Key returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .register_app_key("b7feea52-994e-4714-a100-1bd9eff5aee1".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Register a new App Key ``` /** * Register a new App Key returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); const params: v2.ActionConnectionApiRegisterAppKeyRequest = { appKeyId: "b7feea52-994e-4714-a100-1bd9eff5aee1", }; apiInstance .registerAppKey(params) .then((data: v2.RegisterAppKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List App Key Registrations](https://docs.datadoghq.com/api/latest/action-connection/#list-app-key-registrations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#list-app-key-registrations-v2) GET https://api.ap1.datadoghq.com/api/v2/actions/app_key_registrationshttps://api.ap2.datadoghq.com/api/v2/actions/app_key_registrationshttps://api.datadoghq.eu/api/v2/actions/app_key_registrationshttps://api.ddog-gov.com/api/v2/actions/app_key_registrationshttps://api.datadoghq.com/api/v2/actions/app_key_registrationshttps://api.us3.datadoghq.com/api/v2/actions/app_key_registrationshttps://api.us5.datadoghq.com/api/v2/actions/app_key_registrations ### Overview List App Key Registrations This endpoint requires the `org_app_keys_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer The number of App Key Registrations to return per page. page[number] integer The page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/action-connection/#ListAppKeyRegistrations-200-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#ListAppKeyRegistrations-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#ListAppKeyRegistrations-403-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#ListAppKeyRegistrations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) A paginated list of app key registrations. Field Type Description data [object] An array of app key registrations. id uuid The app key registration identifier type [_required_] enum The definition of `AppKeyRegistrationDataType` object. Allowed enum values: `app_key_registration` meta object The definition of `ListAppKeyRegistrationsResponseMeta` object. total int64 The total number of app key registrations. total_filtered int64 The total number of app key registrations that match the specified filters. ``` { "data": [ { "id": "string", "type": "app_key_registration" } ], "meta": { "total": 1, "total_filtered": 1 } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### List App Key Registrations ``` # Curl command (adjust the host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/actions/app_key_registrations" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List App Key Registrations ``` """ List App Key Registrations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) response = api_instance.list_app_key_registrations() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List App Key Registrations ``` # List App Key Registrations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new p api_instance.list_app_key_registrations() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List App Key Registrations ``` // List App Key Registrations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) resp, r, err := api.ListAppKeyRegistrations(ctx, *datadogV2.NewListAppKeyRegistrationsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.ListAppKeyRegistrations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.ListAppKeyRegistrations`:\n%s\n",
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List App Key Registrations ``` // List App Key Registrations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; import com.datadog.api.client.v2.model.ListAppKeyRegistrationsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); try { ListAppKeyRegistrationsResponse result = apiInstance.listAppKeyRegistrations(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#listAppKeyRegistrations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List App Key Registrations ``` // List App Key Registrations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; use datadog_api_client::datadogV2::api_action_connection::ListAppKeyRegistrationsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .list_app_key_registrations(ListAppKeyRegistrationsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List App Key Registrations ``` /** * List App Key Registrations returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); apiInstance .listAppKeyRegistrations() .then((data: v2.ListAppKeyRegistrationsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unregister an App Key](https://docs.datadoghq.com/api/latest/action-connection/#unregister-an-app-key) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#unregister-an-app-key-v2) DELETE https://api.ap1.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.ap2.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.eu/api/v2/actions/app_key_registrations/{app_key_id}https://api.ddog-gov.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us3.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us5.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id} ### Overview Unregister an App Key This endpoint requires any of the following permissions: * `user_access_manage` * `user_app_keys` * `service_account_write` ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the app key ### Response * [204](https://docs.datadoghq.com/api/latest/action-connection/#UnregisterAppKey-204-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#UnregisterAppKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#UnregisterAppKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/action-connection/#UnregisterAppKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#UnregisterAppKey-429-v2) No Content Bad request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. 
parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Unregister an App Key ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command (adjust the host for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v2/actions/app_key_registrations/${app_key_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Unregister an App Key ``` """ Unregister an App Key returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) api_instance.unregister_app_key( app_key_id="57cc69ae-9214-4ecc-8df8-43ecc1d92d99", ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Unregister an App Key ``` # Unregister an App Key returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new api_instance.unregister_app_key("57cc69ae-9214-4ecc-8df8-43ecc1d92d99") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Unregister an App Key ``` // Unregister an App Key returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) r, err := api.UnregisterAppKey(ctx, "57cc69ae-9214-4ecc-8df8-43ecc1d92d99") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.UnregisterAppKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and
then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Unregister an App Key ``` // Unregister an App Key returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); try { apiInstance.unregisterAppKey("57cc69ae-9214-4ecc-8df8-43ecc1d92d99"); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#unregisterAppKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Unregister an App Key ``` // Unregister an App Key returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .unregister_app_key("57cc69ae-9214-4ecc-8df8-43ecc1d92d99".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Unregister an App Key ``` /** * Unregister an App Key returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); const params: v2.ActionConnectionApiUnregisterAppKeyRequest = { appKeyId: "57cc69ae-9214-4ecc-8df8-43ecc1d92d99", }; apiInstance .unregisterAppKey(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an existing App Key Registration](https://docs.datadoghq.com/api/latest/action-connection/#get-an-existing-app-key-registration) * [v2 (latest)](https://docs.datadoghq.com/api/latest/action-connection/#get-an-existing-app-key-registration-v2) GET https://api.ap1.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.ap2.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.eu/api/v2/actions/app_key_registrations/{app_key_id}https://api.ddog-gov.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us3.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id}https://api.us5.datadoghq.com/api/v2/actions/app_key_registrations/{app_key_id} ### Overview Get an existing App Key Registration This endpoint requires the `org_app_keys_read` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the app key ### Response * [200](https://docs.datadoghq.com/api/latest/action-connection/#GetAppKeyRegistration-200-v2) * [400](https://docs.datadoghq.com/api/latest/action-connection/#GetAppKeyRegistration-400-v2) * [403](https://docs.datadoghq.com/api/latest/action-connection/#GetAppKeyRegistration-403-v2) * [404](https://docs.datadoghq.com/api/latest/action-connection/#GetAppKeyRegistration-404-v2) * [429](https://docs.datadoghq.com/api/latest/action-connection/#GetAppKeyRegistration-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) The response object after getting an app key registration. Field Type Description data object Data related to the app key registration. id uuid The app key registration identifier type [_required_] enum The definition of `AppKeyRegistrationDataType` object. Allowed enum values: `app_key_registration` ``` { "data": { "id": "string", "type": "app_key_registration" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/action-connection/) * [Example](https://docs.datadoghq.com/api/latest/action-connection/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/action-connection/?code-lang=typescript) ##### Get an existing App Key Registration Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions/app_key_registrations/${app_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an existing App Key Registration ``` """ Get an existing App Key Registration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.action_connection_api import ActionConnectionApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionConnectionApi(api_client) response = api_instance.get_app_key_registration( app_key_id="b7feea52-994e-4714-a100-1bd9eff5aee1", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an existing App Key Registration ``` # Get an existing App Key Registration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionConnectionAPI.new p api_instance.get_app_key_registration("b7feea52-994e-4714-a100-1bd9eff5aee1") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an existing App Key Registration ``` // Get an existing App Key Registration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionConnectionApi(apiClient) resp, r, err := api.GetAppKeyRegistration(ctx, "b7feea52-994e-4714-a100-1bd9eff5aee1") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionConnectionApi.GetAppKeyRegistration`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionConnectionApi.GetAppKeyRegistration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an existing App Key Registration ``` // Get an existing App Key Registration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionConnectionApi; import com.datadog.api.client.v2.model.GetAppKeyRegistrationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionConnectionApi apiInstance = new ActionConnectionApi(defaultClient); try { GetAppKeyRegistrationResponse result = apiInstance.getAppKeyRegistration("b7feea52-994e-4714-a100-1bd9eff5aee1"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionConnectionApi#getAppKeyRegistration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an existing App Key Registration ``` // Get an existing App Key Registration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_action_connection::ActionConnectionAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ActionConnectionAPI::with_config(configuration); let resp = api .get_app_key_registration("b7feea52-994e-4714-a100-1bd9eff5aee1".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an existing App Key Registration ``` /** * Get an existing App Key Registration returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionConnectionApi(configuration); const params: v2.ActionConnectionApiGetAppKeyRegistrationRequest = { appKeyId: "b7feea52-994e-4714-a100-1bd9eff5aee1", }; apiInstance .getAppKeyRegistration(params) .then((data: v2.GetAppKeyRegistrationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d281a251-581b-46d0-bf2a-22262fe396d1&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=dea7f038-498d-4446-8f7c-77b275287acd&pt=Action%20Connection&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faction-connection%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d281a251-581b-46d0-bf2a-22262fe396d1&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=dea7f038-498d-4446-8f7c-77b275287acd&pt=Action%20Connection&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faction-connection%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=60345456-6951-4902-870b-140634b8edef&bo=2&sid=bf71a970f0be11f0814e1f63b85d6398&vid=bf72bca0f0be11f09feacba66159d584&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Action%20Connection&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faction-connection%2F&r=<=2210&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=554673) --- # Source: https://docs.datadoghq.com/api/latest/actions-datastores/ # Actions Datastores Leverage the Actions Datastore API to create, modify, and delete items in datastores owned by your organization. ## [List datastores](https://docs.datadoghq.com/api/latest/actions-datastores/#list-datastores) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#list-datastores-v2) GET https://api.ap1.datadoghq.com/api/v2/actions-datastoreshttps://api.ap2.datadoghq.com/api/v2/actions-datastoreshttps://api.datadoghq.eu/api/v2/actions-datastoreshttps://api.ddog-gov.com/api/v2/actions-datastoreshttps://api.datadoghq.com/api/v2/actions-datastoreshttps://api.us3.datadoghq.com/api/v2/actions-datastoreshttps://api.us5.datadoghq.com/api/v2/actions-datastores ### Overview Lists all datastores for the organization. This endpoint requires the `apps_datastore_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastores-200-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastores-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) A collection of datastores returned by list operations. Field Type Description data [_required_] [object] An array of datastore objects containing their configurations and metadata. attributes object Detailed information about a datastore. created_at date-time Timestamp when the datastore was created. creator_user_id int64 The numeric ID of the user who created the datastore. creator_user_uuid string The UUID of the user who created the datastore. 
description string A human-readable description about the datastore. modified_at date-time Timestamp when the datastore was last modified. name string The display name of the datastore. org_id int64 The ID of the organization that owns this datastore. primary_column_name string The name of the primary key column for this datastore. Primary column names: * Must abide by both [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) * Cannot exceed 63 characters primary_key_generation_strategy enum Can be set to `uuid` to automatically generate primary keys when new items are added. Default value is `none`, which requires you to supply a primary key for each new item. Allowed enum values: `none,uuid` id string The unique identifier of the datastore. type [_required_] enum The resource type for datastores. Allowed enum values: `datastores` default: `datastores` ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "creator_user_id": "integer", "creator_user_uuid": "string", "description": "string", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "org_id": "integer", "primary_column_name": "", "primary_key_generation_strategy": "string" }, "id": "string", "type": "datastores" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### List datastores Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List datastores ``` """ List datastores returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.list_datastores() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List datastores ``` # List datastores returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new p api_instance.list_datastores() ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List datastores ``` // List datastores returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.ListDatastores(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.ListDatastores`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.ListDatastores`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List datastores ``` // List datastores returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.DatastoreArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); try { DatastoreArray result = apiInstance.listDatastores(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#listDatastores"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List datastores ``` // List datastores returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api.list_datastores().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
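# DD_SITE is a single Datadog site, such as datadoghq.com, datadoghq.eu, or us5.datadoghq.com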
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List datastores ``` /** * List datastores returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); apiInstance .listDatastores() .then((data: v2.DatastoreArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create datastore](https://docs.datadoghq.com/api/latest/actions-datastores/#create-datastore) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#create-datastore-v2) POST https://api.ap1.datadoghq.com/api/v2/actions-datastoreshttps://api.ap2.datadoghq.com/api/v2/actions-datastoreshttps://api.datadoghq.eu/api/v2/actions-datastoreshttps://api.ddog-gov.com/api/v2/actions-datastoreshttps://api.datadoghq.com/api/v2/actions-datastoreshttps://api.us3.datadoghq.com/api/v2/actions-datastoreshttps://api.us5.datadoghq.com/api/v2/actions-datastores ### Overview Creates a new datastore. This endpoint requires the `apps_datastore_manage` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the configuration needed to create a new datastore. attributes object Configuration and metadata to create a new datastore. description string A human-readable description about the datastore. name [_required_] string The display name for the new datastore. org_access enum The organization access level for the datastore. For example, 'contributor'. Allowed enum values: `contributor,viewer,manager` primary_column_name [_required_] string The name of the primary key column for this datastore. Primary column names: * Must abide by both [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) * Cannot exceed 63 characters primary_key_generation_strategy enum Can be set to `uuid` to automatically generate primary keys when new items are added. Default value is `none`, which requires you to supply a primary key for each new item. Allowed enum values: `none,uuid` id string Optional ID for the new datastore. If not provided, one will be generated automatically. type [_required_] enum The resource type for datastores. 
Allowed enum values: `datastores` default: `datastores` ``` { "data": { "attributes": { "name": "datastore-name", "primary_column_name": "primaryKey" }, "type": "datastores" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#CreateDatastore-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#CreateDatastore-400-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#CreateDatastore-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Response after successfully creating a new datastore, containing the datastore’s assigned ID. Field Type Description data object The newly created datastore's data. id string The unique identifier assigned to the newly created datastore. type [_required_] enum The resource type for datastores. Allowed enum values: `datastores` default: `datastores` ``` { "data": { "id": "string", "type": "datastores" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Create datastore returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "datastore-name", "primary_column_name": "primaryKey" }, "type": "datastores" } } EOF ``` ##### Create datastore returns "OK" response ``` // Create datastore returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateAppsDatastoreRequest{ Data: &datadogV2.CreateAppsDatastoreRequestData{ Attributes: &datadogV2.CreateAppsDatastoreRequestDataAttributes{ Name: "datastore-name", PrimaryColumnName: "primaryKey", }, Type: datadogV2.DATASTOREDATATYPE_DATASTORES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.CreateDatastore(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.CreateDatastore`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.CreateDatastore`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create datastore returns "OK" response ``` // Create datastore returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.CreateAppsDatastoreRequest; import com.datadog.api.client.v2.model.CreateAppsDatastoreRequestData; import com.datadog.api.client.v2.model.CreateAppsDatastoreRequestDataAttributes; import com.datadog.api.client.v2.model.CreateAppsDatastoreResponse; import com.datadog.api.client.v2.model.DatastoreDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); CreateAppsDatastoreRequest body = 
new CreateAppsDatastoreRequest() .data( new CreateAppsDatastoreRequestData() .attributes( new CreateAppsDatastoreRequestDataAttributes() .name("datastore-name") .primaryColumnName("primaryKey")) .type(DatastoreDataType.DATASTORES)); try { CreateAppsDatastoreResponse result = apiInstance.createDatastore(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#createDatastore"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create datastore returns "OK" response ``` """ Create datastore returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.create_apps_datastore_request import CreateAppsDatastoreRequest from datadog_api_client.v2.model.create_apps_datastore_request_data import CreateAppsDatastoreRequestData from datadog_api_client.v2.model.create_apps_datastore_request_data_attributes import ( CreateAppsDatastoreRequestDataAttributes, ) from datadog_api_client.v2.model.datastore_data_type import DatastoreDataType body = CreateAppsDatastoreRequest( data=CreateAppsDatastoreRequestData( attributes=CreateAppsDatastoreRequestDataAttributes( name="datastore-name", primary_column_name="primaryKey", ), type=DatastoreDataType.DATASTORES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.create_datastore(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create datastore returns "OK" response ``` # Create datastore returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new body = DatadogAPIClient::V2::CreateAppsDatastoreRequest.new({ data: DatadogAPIClient::V2::CreateAppsDatastoreRequestData.new({ attributes: DatadogAPIClient::V2::CreateAppsDatastoreRequestDataAttributes.new({ name: "datastore-name", primary_column_name: "primaryKey", }), type: DatadogAPIClient::V2::DatastoreDataType::DATASTORES, }), }) p api_instance.create_datastore(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create datastore returns "OK" response ``` // Create datastore returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::CreateAppsDatastoreRequest; use datadog_api_client::datadogV2::model::CreateAppsDatastoreRequestData; use datadog_api_client::datadogV2::model::CreateAppsDatastoreRequestDataAttributes; use datadog_api_client::datadogV2::model::DatastoreDataType; #[tokio::main] async fn main() { let body = CreateAppsDatastoreRequest::new().data( CreateAppsDatastoreRequestData::new(DatastoreDataType::DATASTORES).attributes( CreateAppsDatastoreRequestDataAttributes::new( "datastore-name".to_string(), "primaryKey".to_string(), ), ), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api.create_datastore(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create datastore returns "OK" response ``` /** * Create datastore returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); const params: v2.ActionsDatastoresApiCreateDatastoreRequest = { body: { data: { attributes: { name: "datastore-name", primaryColumnName: "primaryKey", }, type: "datastores", }, }, }; apiInstance .createDatastore(params) .then((data: v2.CreateAppsDatastoreResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get datastore](https://docs.datadoghq.com/api/latest/actions-datastores/#get-datastore) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#get-datastore-v2) GET https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}https://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id} ### Overview Retrieves a specific datastore by its ID. This endpoint requires the `apps_datastore_read` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. 
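The `datastore_id` in this path is the ID assigned when the datastore is created (returned in the Create datastore response above). The following is a minimal sketch of how the two endpoints fit together using the Python client from the examples in this section, assuming `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` are set in the environment and that the Python client exposes the newly assigned ID as `data.id`, mirroring the Create datastore 200 response schema:

```
"""
Create a datastore, then retrieve it by the ID returned in the create response.
Sketch only: it reuses the documented create_datastore and get_datastore calls.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi
from datadog_api_client.v2.model.create_apps_datastore_request import CreateAppsDatastoreRequest
from datadog_api_client.v2.model.create_apps_datastore_request_data import CreateAppsDatastoreRequestData
from datadog_api_client.v2.model.create_apps_datastore_request_data_attributes import (
    CreateAppsDatastoreRequestDataAttributes,
)
from datadog_api_client.v2.model.datastore_data_type import DatastoreDataType

body = CreateAppsDatastoreRequest(
    data=CreateAppsDatastoreRequestData(
        # The request schema above also documents optional attributes such as
        # description, org_access, and primary_key_generation_strategy.
        attributes=CreateAppsDatastoreRequestDataAttributes(
            name="datastore-name",
            primary_column_name="primaryKey",
        ),
        type=DatastoreDataType.DATASTORES,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ActionsDatastoresApi(api_client)
    created = api_instance.create_datastore(body=body)
    # Assumption: the create response exposes the assigned ID as `data.id`,
    # matching the Create datastore 200 response schema.
    datastore_id = created.data.id
    # Fetch the datastore back using that ID as the path parameter.
    response = api_instance.get_datastore(datastore_id=datastore_id)
    print(response)
```

The same ID is what the examples below expect in the `DATASTORE_DATA_ID` environment variable or `datastore_id` placeholder.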
### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#GetDatastore-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#GetDatastore-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#GetDatastore-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#GetDatastore-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) A datastore’s complete configuration and metadata. Field Type Description data object Core information about a datastore, including its unique identifier and attributes. attributes object Detailed information about a datastore. created_at date-time Timestamp when the datastore was created. creator_user_id int64 The numeric ID of the user who created the datastore. creator_user_uuid string The UUID of the user who created the datastore. description string A human-readable description about the datastore. modified_at date-time Timestamp when the datastore was last modified. name string The display name of the datastore. org_id int64 The ID of the organization that owns this datastore. primary_column_name string The name of the primary key column for this datastore. Primary column names: * Must abide by both [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) * Cannot exceed 63 characters primary_key_generation_strategy enum Can be set to `uuid` to automatically generate primary keys when new items are added. Default value is `none`, which requires you to supply a primary key for each new item. Allowed enum values: `none,uuid` id string The unique identifier of the datastore. type [_required_] enum The resource type for datastores. Allowed enum values: `datastores` default: `datastores` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "creator_user_id": "integer", "creator_user_uuid": "string", "description": "string", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "org_id": "integer", "primary_column_name": "", "primary_key_generation_strategy": "string" }, "id": "string", "type": "datastores" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Get datastore Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get datastore ``` """ Get datastore returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.get_datastore( datastore_id=DATASTORE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get datastore ``` # Get datastore returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] p api_instance.get_datastore(DATASTORE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get datastore ``` // Get datastore returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.GetDatastore(ctx, DatastoreDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.GetDatastore`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.GetDatastore`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get datastore ``` // Get datastore returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.Datastore; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); try { Datastore result = apiInstance.getDatastore(DATASTORE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#getDatastore"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get datastore ``` // Get datastore returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api.get_datastore(datastore_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get datastore ``` /** * Get datastore returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiGetDatastoreRequest = { datastoreId: DATASTORE_DATA_ID, }; apiInstance .getDatastore(params) .then((data: v2.Datastore) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update datastore](https://docs.datadoghq.com/api/latest/actions-datastores/#update-datastore) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#update-datastore-v2) PATCH https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}https://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id} ### Overview Updates an existing datastore’s attributes. This endpoint requires the `apps_datastore_manage` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the datastore identifier and the attributes to update. attributes object Attributes that can be updated on a datastore. description string A human-readable description about the datastore. name string The display name of the datastore. id string The unique identifier of the datastore to update. type [_required_] enum The resource type for datastores. 
Allowed enum values: `datastores` default: `datastores` ``` { "data": { "attributes": { "name": "updated name" }, "type": "datastores", "id": "string" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastore-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastore-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastore-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastore-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) A datastore’s complete configuration and metadata. Field Type Description data object Core information about a datastore, including its unique identifier and attributes. attributes object Detailed information about a datastore. created_at date-time Timestamp when the datastore was created. creator_user_id int64 The numeric ID of the user who created the datastore. creator_user_uuid string The UUID of the user who created the datastore. description string A human-readable description about the datastore. modified_at date-time Timestamp when the datastore was last modified. name string The display name of the datastore. org_id int64 The ID of the organization that owns this datastore. primary_column_name string The name of the primary key column for this datastore. Primary column names: * Must abide by both [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) * Cannot exceed 63 characters primary_key_generation_strategy enum Can be set to `uuid` to automatically generate primary keys when new items are added. Default value is `none`, which requires you to supply a primary key for each new item. Allowed enum values: `none,uuid` id string The unique identifier of the datastore. type [_required_] enum The resource type for datastores. Allowed enum values: `datastores` default: `datastores` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "creator_user_id": "integer", "creator_user_uuid": "string", "description": "string", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "org_id": "integer", "primary_column_name": "", "primary_key_generation_strategy": "string" }, "id": "string", "type": "datastores" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Update datastore returns "OK" response Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "updated name" }, "type": "datastores", "id": "string" } } EOF ``` ##### Update datastore returns "OK" response ``` // Update datastore returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") body := datadogV2.UpdateAppsDatastoreRequest{ Data: &datadogV2.UpdateAppsDatastoreRequestData{ Attributes: &datadogV2.UpdateAppsDatastoreRequestDataAttributes{ Name: datadog.PtrString("updated name"), }, Type: datadogV2.DATASTOREDATATYPE_DATASTORES, Id: datadog.PtrString(DatastoreDataID), }, } ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.UpdateDatastore(ctx, DatastoreDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.UpdateDatastore`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.UpdateDatastore`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update datastore returns "OK" response ``` // Update datastore returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.Datastore; import com.datadog.api.client.v2.model.DatastoreDataType; import com.datadog.api.client.v2.model.UpdateAppsDatastoreRequest; import com.datadog.api.client.v2.model.UpdateAppsDatastoreRequestData; import com.datadog.api.client.v2.model.UpdateAppsDatastoreRequestDataAttributes; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); UpdateAppsDatastoreRequest body = new UpdateAppsDatastoreRequest() .data( new UpdateAppsDatastoreRequestData() .attributes(new UpdateAppsDatastoreRequestDataAttributes().name("updated name")) .type(DatastoreDataType.DATASTORES) .id(DATASTORE_DATA_ID)); try { Datastore result = apiInstance.updateDatastore(DATASTORE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#updateDatastore"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update datastore returns "OK" response ``` """ Update datastore returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.datastore_data_type import DatastoreDataType from datadog_api_client.v2.model.update_apps_datastore_request import UpdateAppsDatastoreRequest from datadog_api_client.v2.model.update_apps_datastore_request_data import UpdateAppsDatastoreRequestData from datadog_api_client.v2.model.update_apps_datastore_request_data_attributes import ( UpdateAppsDatastoreRequestDataAttributes, ) # there is a valid "datastore" in the system DATASTORE_DATA_ID = 
environ["DATASTORE_DATA_ID"] body = UpdateAppsDatastoreRequest( data=UpdateAppsDatastoreRequestData( attributes=UpdateAppsDatastoreRequestDataAttributes( name="updated name", ), type=DatastoreDataType.DATASTORES, id=DATASTORE_DATA_ID, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.update_datastore(datastore_id=DATASTORE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update datastore returns "OK" response ``` # Update datastore returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] body = DatadogAPIClient::V2::UpdateAppsDatastoreRequest.new({ data: DatadogAPIClient::V2::UpdateAppsDatastoreRequestData.new({ attributes: DatadogAPIClient::V2::UpdateAppsDatastoreRequestDataAttributes.new({ name: "updated name", }), type: DatadogAPIClient::V2::DatastoreDataType::DATASTORES, id: DATASTORE_DATA_ID, }), }) p api_instance.update_datastore(DATASTORE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update datastore returns "OK" response ``` // Update datastore returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::DatastoreDataType; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreRequest; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreRequestData; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreRequestDataAttributes; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let body = UpdateAppsDatastoreRequest::new().data( UpdateAppsDatastoreRequestData::new(DatastoreDataType::DATASTORES) .attributes( UpdateAppsDatastoreRequestDataAttributes::new().name("updated name".to_string()), ) .id(datastore_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api.update_datastore(datastore_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update datastore returns "OK" response ``` /** * Update datastore returns "OK" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiUpdateDatastoreRequest = { body: { data: { attributes: { name: "updated name", }, type: "datastores", id: DATASTORE_DATA_ID, }, }, datastoreId: DATASTORE_DATA_ID, }; apiInstance .updateDatastore(params) .then((data: v2.Datastore) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete datastore](https://docs.datadoghq.com/api/latest/actions-datastores/#delete-datastore) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#delete-datastore-v2) DELETE https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}https://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}https://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}https://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id} ### Overview Deletes a datastore by its unique identifier. This endpoint requires the `apps_datastore_manage` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastore-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastore-400-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastore-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. 
Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript)

##### Delete datastore

```
# Path parameters
export datastore_id="CHANGE_ME"

# Curl command
# Replace api.datadoghq.com with the API host for your Datadog site:
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, api.datadoghq.eu,
# api.ddog-gov.com, api.datadoghq.com, api.us3.datadoghq.com, or api.us5.datadoghq.com
curl -X DELETE "https://api.datadoghq.com/api/v2/actions-datastores/${datastore_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete datastore

```
"""
Delete datastore returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi

# there is a valid "datastore" in the system
DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ActionsDatastoresApi(api_client)
    api_instance.delete_datastore(
        datastore_id=DATASTORE_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, substituting your Datadog site (one of `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, `ddog-gov.com`) and keys:

```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py"
```

##### Delete datastore

```
# Delete datastore returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new

# there is a valid "datastore" in the system
DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"]
p api_instance.delete_datastore(DATASTORE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, substituting your Datadog site (one of `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, `ddog-gov.com`) and keys:

```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb"
```

##### Delete datastore

```
// Delete datastore returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "datastore" in the system
	DatastoreDataID := os.Getenv("DATASTORE_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewActionsDatastoresApi(apiClient)
	r, err := api.DeleteDatastore(ctx, DatastoreDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.DeleteDatastore`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete datastore ``` // Delete datastore returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); try { apiInstance.deleteDatastore(DATASTORE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#deleteDatastore"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete datastore ``` // Delete datastore returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api.delete_datastore(datastore_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete datastore ``` /** * Delete datastore returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiDeleteDatastoreRequest = { datastoreId: DATASTORE_DATA_ID, }; apiInstance .deleteDatastore(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List datastore items](https://docs.datadoghq.com/api/latest/actions-datastores/#list-datastore-items) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#list-datastore-items-v2) GET https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items ### Overview Lists items from a datastore. You can filter the results by specifying either an item key or a filter query parameter, but not both at the same time. Supports server-side pagination for large datasets. This endpoint requires the `apps_datastore_read` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. #### Query Strings Name Type Description filter string Optional query filter to search items using the [logs search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). item_key string Optional primary key value to retrieve a specific item. Cannot be used together with the filter parameter. page[limit] integer Optional field to limit the number of items to return per page for pagination. Up to 100 items can be returned per page. page[offset] integer Optional field to offset the number of items to skip from the beginning of the result set for pagination. sort string Optional field to sort results by. Prefix with ‘-’ for descending order (e.g., ‘-created_at’). ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) A collection of datastore items with pagination and schema metadata. Field Type Description data [_required_] [object] An array of datastore items with their content and metadata. attributes object Metadata and content of a datastore item. created_at date-time Timestamp when the item was first created. modified_at date-time Timestamp when the item was last modified. org_id int64 The ID of the organization that owns this item. primary_column_name string The name of the primary key column for this datastore. 
### Response

* [200](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-200-v2)
* [400](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-400-v2)
* [404](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-404-v2)
* [429](https://docs.datadoghq.com/api/latest/actions-datastores/#ListDatastoreItems-429-v2)

OK

A collection of datastore items with pagination and schema metadata.

* `data` [_required_] ([object]): An array of datastore items with their content and metadata.
  * `attributes` (object): Metadata and content of a datastore item.
    * `created_at` (date-time): Timestamp when the item was first created.
    * `modified_at` (date-time): Timestamp when the item was last modified.
    * `org_id` (int64): The ID of the organization that owns this item.
    * `primary_column_name` (string): The name of the primary key column for this datastore. Primary column names must abide by [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) and cannot exceed 63 characters.
    * `signature` (string): A unique signature identifying this item version.
    * `store_id` (string): The unique identifier of the datastore containing this item.
    * `value` (object): The data content (as key-value pairs) of a datastore item.
  * `id` (string): The unique identifier of the datastore.
  * `type` [_required_] (enum): The resource type for datastore items. Allowed enum values: `items`. Default: `items`.
* `meta` (object): Metadata about the included items, including pagination info and datastore schema.
  * `page` (object): Pagination information for a collection of datastore items.
    * `hasMore` (boolean): Whether there are additional pages of items beyond the current page.
    * `totalCount` (int64): The total number of items in the datastore, ignoring any filters.
    * `totalFilteredCount` (int64): The total number of items that match the current filter criteria.
  * `schema` (object): Schema information about the datastore, including its primary key and field definitions.
    * `fields` ([object]): An array describing the columns available in this datastore.
      * `name` [_required_] (string): The name of this column in the datastore.
      * `type` [_required_] (string): The data type of this column. For example, `string`, `number`, or `boolean`.
    * `primary_key` (string): The name of the primary key column for this datastore.

```
{
  "data": [
    {
      "attributes": {
        "created_at": "2019-09-19T10:00:00.000Z",
        "modified_at": "2019-09-19T10:00:00.000Z",
        "org_id": "integer",
        "primary_column_name": "",
        "signature": "string",
        "store_id": "string",
        "value": {}
      },
      "id": "string",
      "type": "items"
    }
  ],
  "meta": {
    "page": {
      "hasMore": false,
      "totalCount": "integer",
      "totalFilteredCount": "integer"
    },
    "schema": {
      "fields": [
        {
          "name": "",
          "type": ""
        }
      ],
      "primary_key": "string"
    }
  }
}
```

Bad Request

API error response.

Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error.

```
{ "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] }
```

Not Found

API error response.

Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error.
pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### List datastore items Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}/items" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List datastore items ``` """ List datastore items returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.list_datastore_items( datastore_id=DATASTORE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List datastore items ``` # List datastore items returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] p api_instance.list_datastore_items(DATASTORE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List datastore items ``` // List datastore items returns "OK" response package main import ( "context" "encoding/json" "fmt" 
"os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.ListDatastoreItems(ctx, DatastoreDataID, *datadogV2.NewListDatastoreItemsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.ListDatastoreItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.ListDatastoreItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List datastore items ``` // List datastore items returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.ItemApiPayloadArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); try { ItemApiPayloadArray result = apiInstance.listDatastoreItems(DATASTORE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#listDatastoreItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List datastore items ``` // List datastore items returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::api_actions_datastores::ListDatastoreItemsOptionalParams; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api .list_datastore_items( datastore_data_id.clone(), ListDatastoreItemsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the 
example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List datastore items ``` /** * List datastore items returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiListDatastoreItemsRequest = { datastoreId: DATASTORE_DATA_ID, }; apiInstance .listDatastoreItems(params) .then((data: v2.ItemApiPayloadArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete datastore item](https://docs.datadoghq.com/api/latest/actions-datastores/#delete-datastore-item) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#delete-datastore-item-v2) DELETE https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items ### Overview Deletes an item from a datastore by its key. This endpoint requires the `apps_datastore_write` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the information needed to identify and delete a specific datastore item. attributes object Attributes specifying which datastore item to delete by its primary key. id string Optional unique identifier of the item to delete. item_key [_required_] string The primary key value that identifies the item to delete. Cannot exceed 256 characters. type [_required_] enum The resource type for datastore items. 
Allowed enum values: `items` default: `items` ``` { "data": { "attributes": { "item_key": "test-key" }, "type": "items" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastoreItem-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastoreItem-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastoreItem-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#DeleteDatastoreItem-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Response from successfully deleting a datastore item. Field Type Description data object Data containing the identifier of the datastore item that was successfully deleted. id string The unique identifier of the item that was deleted. type [_required_] enum The resource type for datastore items. Allowed enum values: `items` default: `items` ``` { "data": { "id": "string", "type": "items" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Delete datastore item returns "OK" response Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}/items" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "item_key": "test-key" }, "type": "items" } } EOF ``` ##### Delete datastore item returns "OK" response ``` // Delete datastore item returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") body := datadogV2.DeleteAppsDatastoreItemRequest{ Data: &datadogV2.DeleteAppsDatastoreItemRequestData{ Attributes: &datadogV2.DeleteAppsDatastoreItemRequestDataAttributes{ ItemKey: "test-key", }, Type: datadogV2.DATASTOREITEMSDATATYPE_ITEMS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.DeleteDatastoreItem(ctx, DatastoreDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.DeleteDatastoreItem`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.DeleteDatastoreItem`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete datastore item returns "OK" response ``` // Delete datastore item returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.DatastoreItemsDataType; import com.datadog.api.client.v2.model.DeleteAppsDatastoreItemRequest; import com.datadog.api.client.v2.model.DeleteAppsDatastoreItemRequestData; import com.datadog.api.client.v2.model.DeleteAppsDatastoreItemRequestDataAttributes; import com.datadog.api.client.v2.model.DeleteAppsDatastoreItemResponse; public class Example { public static void main(String[] 
args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); DeleteAppsDatastoreItemRequest body = new DeleteAppsDatastoreItemRequest() .data( new DeleteAppsDatastoreItemRequestData() .attributes( new DeleteAppsDatastoreItemRequestDataAttributes().itemKey("test-key")) .type(DatastoreItemsDataType.ITEMS)); try { DeleteAppsDatastoreItemResponse result = apiInstance.deleteDatastoreItem(DATASTORE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#deleteDatastoreItem"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete datastore item returns "OK" response ``` """ Delete datastore item returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.datastore_items_data_type import DatastoreItemsDataType from datadog_api_client.v2.model.delete_apps_datastore_item_request import DeleteAppsDatastoreItemRequest from datadog_api_client.v2.model.delete_apps_datastore_item_request_data import DeleteAppsDatastoreItemRequestData from datadog_api_client.v2.model.delete_apps_datastore_item_request_data_attributes import ( DeleteAppsDatastoreItemRequestDataAttributes, ) # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] body = DeleteAppsDatastoreItemRequest( data=DeleteAppsDatastoreItemRequestData( attributes=DeleteAppsDatastoreItemRequestDataAttributes( item_key="test-key", ), type=DatastoreItemsDataType.ITEMS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.delete_datastore_item(datastore_id=DATASTORE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete datastore item returns "OK" response ``` # Delete datastore item returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] body = DatadogAPIClient::V2::DeleteAppsDatastoreItemRequest.new({ data: DatadogAPIClient::V2::DeleteAppsDatastoreItemRequestData.new({ attributes: DatadogAPIClient::V2::DeleteAppsDatastoreItemRequestDataAttributes.new({ item_key: "test-key", }), type: DatadogAPIClient::V2::DatastoreItemsDataType::ITEMS, }), }) p 
api_instance.delete_datastore_item(DATASTORE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete datastore item returns "OK" response ``` // Delete datastore item returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::DatastoreItemsDataType; use datadog_api_client::datadogV2::model::DeleteAppsDatastoreItemRequest; use datadog_api_client::datadogV2::model::DeleteAppsDatastoreItemRequestData; use datadog_api_client::datadogV2::model::DeleteAppsDatastoreItemRequestDataAttributes; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let body = DeleteAppsDatastoreItemRequest::new().data( DeleteAppsDatastoreItemRequestData::new(DatastoreItemsDataType::ITEMS).attributes( DeleteAppsDatastoreItemRequestDataAttributes::new("test-key".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api .delete_datastore_item(datastore_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete datastore item returns "OK" response ``` /** * Delete datastore item returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiDeleteDatastoreItemRequest = { body: { data: { attributes: { itemKey: "test-key", }, type: "items", }, }, datastoreId: DATASTORE_DATA_ID, }; apiInstance .deleteDatastoreItem(params) .then((data: v2.DeleteAppsDatastoreItemResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update datastore item](https://docs.datadoghq.com/api/latest/actions-datastores/#update-datastore-item) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#update-datastore-item-v2) PATCH https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}/itemshttps://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}/itemshttps://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items ### Overview Partially updates an item in a datastore by its key. This endpoint requires the `apps_datastore_write` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the item identifier and the changes to apply during the update operation. attributes object Attributes for updating a datastore item, including the item key and changes to apply. id string The unique identifier of the item being updated. item_changes [_required_] object Changes to apply to a datastore item using set operations. ops_set object Set operation that contains key-value pairs to set on the datastore item. item_key [_required_] string The primary key that identifies the item to update. Cannot exceed 256 characters. id string The unique identifier of the datastore item. type [_required_] enum The resource type for datastore items. Allowed enum values: `items` default: `items` ``` { "data": { "attributes": { "item_changes": {}, "item_key": "test-key" }, "type": "items" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) A single datastore item with its content and metadata. Field Type Description data object Core data and metadata for a single datastore item. attributes object Metadata and content of a datastore item. created_at date-time Timestamp when the item was first created. modified_at date-time Timestamp when the item was last modified. org_id int64 The ID of the organization that owns this item. primary_column_name string The name of the primary key column for this datastore. 
### Response

* [200](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-200-v2)
* [400](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-400-v2)
* [404](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-404-v2)
* [429](https://docs.datadoghq.com/api/latest/actions-datastores/#UpdateDatastoreItem-429-v2)

OK

A single datastore item with its content and metadata.

* `data` (object): Core data and metadata for a single datastore item.
  * `attributes` (object): Metadata and content of a datastore item.
    * `created_at` (date-time): Timestamp when the item was first created.
    * `modified_at` (date-time): Timestamp when the item was last modified.
    * `org_id` (int64): The ID of the organization that owns this item.
    * `primary_column_name` (string): The name of the primary key column for this datastore. Primary column names must abide by [PostgreSQL naming conventions](https://www.postgresql.org/docs/7.0/syntax525.htm) and cannot exceed 63 characters.
    * `signature` (string): A unique signature identifying this item version.
    * `store_id` (string): The unique identifier of the datastore containing this item.
    * `value` (object): The data content (as key-value pairs) of a datastore item.
  * `id` (string): The unique identifier of the datastore.
  * `type` [_required_] (enum): The resource type for datastore items. Allowed enum values: `items`. Default: `items`.

```
{
  "data": {
    "attributes": {
      "created_at": "2019-09-19T10:00:00.000Z",
      "modified_at": "2019-09-19T10:00:00.000Z",
      "org_id": "integer",
      "primary_column_name": "",
      "signature": "string",
      "store_id": "string",
      "value": {}
    },
    "id": "string",
    "type": "items"
  }
}
```

Bad Request

API error response.

Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error.

```
{ "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] }
```

Not Found

API error response.

Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error.

```
{ "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] }
```

Too many requests

API error response.

Field Type Description errors [_required_] [string] A list of errors.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Update datastore item returns "OK" response Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}/items" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "item_changes": {}, "item_key": "test-key" }, "type": "items" } } EOF ``` ##### Update datastore item returns "OK" response ``` // Update datastore item returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") body := datadogV2.UpdateAppsDatastoreItemRequest{ Data: &datadogV2.UpdateAppsDatastoreItemRequestData{ Attributes: &datadogV2.UpdateAppsDatastoreItemRequestDataAttributes{ ItemChanges: datadogV2.UpdateAppsDatastoreItemRequestDataAttributesItemChanges{}, ItemKey: "test-key", }, Type: datadogV2.UPDATEAPPSDATASTOREITEMREQUESTDATATYPE_ITEMS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.UpdateDatastoreItem(ctx, DatastoreDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.UpdateDatastoreItem`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.UpdateDatastoreItem`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update datastore item returns "OK" response ``` // Update datastore item returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.ItemApiPayload; import com.datadog.api.client.v2.model.UpdateAppsDatastoreItemRequest; import com.datadog.api.client.v2.model.UpdateAppsDatastoreItemRequestData; import com.datadog.api.client.v2.model.UpdateAppsDatastoreItemRequestDataAttributes; import 
com.datadog.api.client.v2.model.UpdateAppsDatastoreItemRequestDataAttributesItemChanges; import com.datadog.api.client.v2.model.UpdateAppsDatastoreItemRequestDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); UpdateAppsDatastoreItemRequest body = new UpdateAppsDatastoreItemRequest() .data( new UpdateAppsDatastoreItemRequestData() .attributes( new UpdateAppsDatastoreItemRequestDataAttributes() .itemChanges( new UpdateAppsDatastoreItemRequestDataAttributesItemChanges()) .itemKey("test-key")) .type(UpdateAppsDatastoreItemRequestDataType.ITEMS)); try { ItemApiPayload result = apiInstance.updateDatastoreItem(DATASTORE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#updateDatastoreItem"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update datastore item returns "OK" response ``` """ Update datastore item returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.update_apps_datastore_item_request import UpdateAppsDatastoreItemRequest from datadog_api_client.v2.model.update_apps_datastore_item_request_data import UpdateAppsDatastoreItemRequestData from datadog_api_client.v2.model.update_apps_datastore_item_request_data_attributes import ( UpdateAppsDatastoreItemRequestDataAttributes, ) from datadog_api_client.v2.model.update_apps_datastore_item_request_data_attributes_item_changes import ( UpdateAppsDatastoreItemRequestDataAttributesItemChanges, ) from datadog_api_client.v2.model.update_apps_datastore_item_request_data_type import ( UpdateAppsDatastoreItemRequestDataType, ) # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] body = UpdateAppsDatastoreItemRequest( data=UpdateAppsDatastoreItemRequestData( attributes=UpdateAppsDatastoreItemRequestDataAttributes( item_changes=UpdateAppsDatastoreItemRequestDataAttributesItemChanges(), item_key="test-key", ), type=UpdateAppsDatastoreItemRequestDataType.ITEMS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.update_datastore_item(datastore_id=DATASTORE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update datastore 
item returns "OK" response ``` # Update datastore item returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] body = DatadogAPIClient::V2::UpdateAppsDatastoreItemRequest.new({ data: DatadogAPIClient::V2::UpdateAppsDatastoreItemRequestData.new({ attributes: DatadogAPIClient::V2::UpdateAppsDatastoreItemRequestDataAttributes.new({ item_changes: DatadogAPIClient::V2::UpdateAppsDatastoreItemRequestDataAttributesItemChanges.new({}), item_key: "test-key", }), type: DatadogAPIClient::V2::UpdateAppsDatastoreItemRequestDataType::ITEMS, }), }) p api_instance.update_datastore_item(DATASTORE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update datastore item returns "OK" response ``` // Update datastore item returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreItemRequest; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreItemRequestData; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreItemRequestDataAttributes; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreItemRequestDataAttributesItemChanges; use datadog_api_client::datadogV2::model::UpdateAppsDatastoreItemRequestDataType; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let body = UpdateAppsDatastoreItemRequest::new().data( UpdateAppsDatastoreItemRequestData::new(UpdateAppsDatastoreItemRequestDataType::ITEMS) .attributes(UpdateAppsDatastoreItemRequestDataAttributes::new( UpdateAppsDatastoreItemRequestDataAttributesItemChanges::new(), "test-key".to_string(), )), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api .update_datastore_item(datastore_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update datastore item returns "OK" response ``` /** * Update datastore item returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiUpdateDatastoreItemRequest = { body: { data: { attributes: { itemChanges: {}, itemKey: "test-key", }, type: "items", }, }, datastoreId: DATASTORE_DATA_ID, }; apiInstance .updateDatastoreItem(params) .then((data: v2.ItemApiPayload) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Bulk write datastore items](https://docs.datadoghq.com/api/latest/actions-datastores/#bulk-write-datastore-items) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#bulk-write-datastore-items-v2) POST https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulk ### Overview Creates or replaces multiple items in a datastore by their keys in a single operation. This endpoint requires the `apps_datastore_write` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The unique identifier of the datastore to retrieve. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the items to insert and their configuration for the bulk insert operation. attributes object Configuration for bulk inserting multiple items into a datastore. conflict_mode enum How to handle conflicts when inserting items that already exist in the datastore. Allowed enum values: `fail_on_conflict,overwrite_on_conflict` values [_required_] [object] An array of items to add to the datastore, where each item is a set of key-value pairs representing the item's data. Up to 100 items can be updated in a single request. type [_required_] enum The resource type for datastore items. Allowed enum values: `items` default: `items` ``` { "data": { "attributes": { "values": [ { "id": "cust_3141", "name": "Johnathan" }, { "id": "cust_3142", "name": "Mary" } ] }, "type": "items" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkWriteDatastoreItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkWriteDatastoreItems-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkWriteDatastoreItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkWriteDatastoreItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Response after successfully inserting multiple items into a datastore, containing the identifiers of the created items. Field Type Description data [_required_] [object] An array of data objects containing the identifiers of the successfully inserted items. id string The unique identifier assigned to the inserted item. type [_required_] enum The resource type for datastore items. 
Allowed enum values: `items` default: `items` ``` { "data": [ { "id": "string", "type": "items" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Bulk write datastore items returns "OK" response Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}/items/bulk" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "values": [ { "id": "cust_3141", "name": "Johnathan" }, { "id": "cust_3142", "name": "Mary" } ] }, "type": "items" } } EOF ``` ##### Bulk write datastore items returns "OK" response ``` // Bulk write datastore items returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") body := datadogV2.BulkPutAppsDatastoreItemsRequest{ Data: &datadogV2.BulkPutAppsDatastoreItemsRequestData{ Attributes: &datadogV2.BulkPutAppsDatastoreItemsRequestDataAttributes{ Values: []map[string]interface{}{ { "id": "cust_3141", "name": "Johnathan", }, { "id": "cust_3142", "name": "Mary", }, }, }, Type: datadogV2.DATASTOREITEMSDATATYPE_ITEMS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.BulkWriteDatastoreItems(ctx, DatastoreDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.BulkWriteDatastoreItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.BulkWriteDatastoreItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Bulk write datastore items returns "OK" response ``` // Bulk write datastore items returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.BulkPutAppsDatastoreItemsRequest; import com.datadog.api.client.v2.model.BulkPutAppsDatastoreItemsRequestData; import 
com.datadog.api.client.v2.model.BulkPutAppsDatastoreItemsRequestDataAttributes; import com.datadog.api.client.v2.model.DatastoreItemsDataType; import com.datadog.api.client.v2.model.PutAppsDatastoreItemResponseArray; import java.util.Arrays; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); BulkPutAppsDatastoreItemsRequest body = new BulkPutAppsDatastoreItemsRequest() .data( new BulkPutAppsDatastoreItemsRequestData() .attributes( new BulkPutAppsDatastoreItemsRequestDataAttributes() .values( Arrays.asList( Map.ofEntries( Map.entry("id", "cust_3141"), Map.entry("name", "Johnathan")), Map.ofEntries( Map.entry("id", "cust_3142"), Map.entry("name", "Mary"))))) .type(DatastoreItemsDataType.ITEMS)); try { PutAppsDatastoreItemResponseArray result = apiInstance.bulkWriteDatastoreItems(DATASTORE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#bulkWriteDatastoreItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Bulk write datastore items returns "OK" response ``` """ Bulk write datastore items returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.bulk_put_apps_datastore_items_request import BulkPutAppsDatastoreItemsRequest from datadog_api_client.v2.model.bulk_put_apps_datastore_items_request_data import BulkPutAppsDatastoreItemsRequestData from datadog_api_client.v2.model.bulk_put_apps_datastore_items_request_data_attributes import ( BulkPutAppsDatastoreItemsRequestDataAttributes, ) from datadog_api_client.v2.model.datastore_items_data_type import DatastoreItemsDataType # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] body = BulkPutAppsDatastoreItemsRequest( data=BulkPutAppsDatastoreItemsRequestData( attributes=BulkPutAppsDatastoreItemsRequestDataAttributes( values=[ dict([("id", "cust_3141"), ("name", "Johnathan")]), dict([("id", "cust_3142"), ("name", "Mary")]), ], ), type=DatastoreItemsDataType.ITEMS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.bulk_write_datastore_items(datastore_id=DATASTORE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" 
``` ##### Bulk write datastore items returns "OK" response ``` # Bulk write datastore items returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] body = DatadogAPIClient::V2::BulkPutAppsDatastoreItemsRequest.new({ data: DatadogAPIClient::V2::BulkPutAppsDatastoreItemsRequestData.new({ attributes: DatadogAPIClient::V2::BulkPutAppsDatastoreItemsRequestDataAttributes.new({ values: [ { "id": "cust_3141", "name": "Johnathan", }, { "id": "cust_3142", "name": "Mary", }, ], }), type: DatadogAPIClient::V2::DatastoreItemsDataType::ITEMS, }), }) p api_instance.bulk_write_datastore_items(DATASTORE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Bulk write datastore items returns "OK" response ``` // Bulk write datastore items returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::BulkPutAppsDatastoreItemsRequest; use datadog_api_client::datadogV2::model::BulkPutAppsDatastoreItemsRequestData; use datadog_api_client::datadogV2::model::BulkPutAppsDatastoreItemsRequestDataAttributes; use datadog_api_client::datadogV2::model::DatastoreItemsDataType; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let body = BulkPutAppsDatastoreItemsRequest::new().data( BulkPutAppsDatastoreItemsRequestData::new(DatastoreItemsDataType::ITEMS).attributes( BulkPutAppsDatastoreItemsRequestDataAttributes::new(vec![ BTreeMap::from([ ("id".to_string(), Value::from("cust_3141")), ("name".to_string(), Value::from("Johnathan")), ]), BTreeMap::from([ ("id".to_string(), Value::from("cust_3142")), ("name".to_string(), Value::from("Mary")), ]), ]), ), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api .bulk_write_datastore_items(datastore_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Bulk write datastore items returns "OK" response ``` /** * Bulk write datastore items returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiBulkWriteDatastoreItemsRequest = { body: { data: { attributes: { values: [ { id: "cust_3141", name: "Johnathan", }, { id: "cust_3142", name: "Mary", }, 
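            // Up to 100 items can be written in a single bulk request; the optional
            // conflict_mode attribute (documented above) controls whether writes to existing keys fail or overwrite.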
], }, type: "items", }, }, datastoreId: DATASTORE_DATA_ID, }; apiInstance .bulkWriteDatastoreItems(params) .then((data: v2.PutAppsDatastoreItemResponseArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Bulk delete datastore items](https://docs.datadoghq.com/api/latest/actions-datastores/#bulk-delete-datastore-items) * [v2 (latest)](https://docs.datadoghq.com/api/latest/actions-datastores/#bulk-delete-datastore-items-v2) DELETE https://api.ap1.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.ap2.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.datadoghq.eu/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.ddog-gov.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.us3.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulkhttps://api.us5.datadoghq.com/api/v2/actions-datastores/{datastore_id}/items/bulk ### Overview Deletes multiple items from a datastore by their keys in a single operation. This endpoint requires the `apps_datastore_write` permission. ### Arguments #### Path Parameters Name Type Description datastore_id [_required_] string The ID of the datastore. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) Field Type Description data object Data wrapper containing the data needed to delete items from a datastore. attributes object Attributes of request data to delete items from a datastore. item_keys [string] List of primary keys identifying items to delete from datastore. Up to 100 items can be deleted in a single request. id string ID for the datastore of the items to delete. type [_required_] enum Items resource type. Allowed enum values: `items` default: `items` ``` { "data": { "attributes": { "item_keys": [ "test-key" ] }, "type": "items" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkDeleteDatastoreItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkDeleteDatastoreItems-400-v2) * [404](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkDeleteDatastoreItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkDeleteDatastoreItems-429-v2) * [500](https://docs.datadoghq.com/api/latest/actions-datastores/#BulkDeleteDatastoreItems-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) The definition of `DeleteAppsDatastoreItemResponseArray` object. Field Type Description data [_required_] [object] The `DeleteAppsDatastoreItemResponseArray` `data`. id string The unique identifier of the item that was deleted. type [_required_] enum The resource type for datastore items. 
Allowed enum values: `items` default: `items` ``` { "data": [ { "id": "string", "type": "items" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/actions-datastores/) * [Example](https://docs.datadoghq.com/api/latest/actions-datastores/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/actions-datastores/?code-lang=typescript) ##### Bulk delete datastore items returns "OK" response Copy ``` # Path parameters export datastore_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/actions-datastores/${datastore_id}/items/bulk" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "item_keys": [ "test-key" ] }, "type": "items" } } EOF ``` ##### Bulk delete datastore items returns "OK" response ``` // Bulk delete datastore items returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "datastore" in the system DatastoreDataID := os.Getenv("DATASTORE_DATA_ID") body := datadogV2.BulkDeleteAppsDatastoreItemsRequest{ Data: &datadogV2.BulkDeleteAppsDatastoreItemsRequestData{ Attributes: &datadogV2.BulkDeleteAppsDatastoreItemsRequestDataAttributes{ ItemKeys: []string{ "test-key", }, }, Type: datadogV2.BULKDELETEAPPSDATASTOREITEMSREQUESTDATATYPE_ITEMS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewActionsDatastoresApi(apiClient) resp, r, err := api.BulkDeleteDatastoreItems(ctx, DatastoreDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ActionsDatastoresApi.BulkDeleteDatastoreItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ActionsDatastoresApi.BulkDeleteDatastoreItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Bulk delete datastore items returns "OK" response ``` // Bulk delete datastore items returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ActionsDatastoresApi; import com.datadog.api.client.v2.model.BulkDeleteAppsDatastoreItemsRequest; import 
com.datadog.api.client.v2.model.BulkDeleteAppsDatastoreItemsRequestData; import com.datadog.api.client.v2.model.BulkDeleteAppsDatastoreItemsRequestDataAttributes; import com.datadog.api.client.v2.model.BulkDeleteAppsDatastoreItemsRequestDataType; import com.datadog.api.client.v2.model.DeleteAppsDatastoreItemResponseArray; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ActionsDatastoresApi apiInstance = new ActionsDatastoresApi(defaultClient); // there is a valid "datastore" in the system String DATASTORE_DATA_ID = System.getenv("DATASTORE_DATA_ID"); BulkDeleteAppsDatastoreItemsRequest body = new BulkDeleteAppsDatastoreItemsRequest() .data( new BulkDeleteAppsDatastoreItemsRequestData() .attributes( new BulkDeleteAppsDatastoreItemsRequestDataAttributes() .itemKeys(Collections.singletonList("test-key"))) .type(BulkDeleteAppsDatastoreItemsRequestDataType.ITEMS)); try { DeleteAppsDatastoreItemResponseArray result = apiInstance.bulkDeleteDatastoreItems(DATASTORE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ActionsDatastoresApi#bulkDeleteDatastoreItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Bulk delete datastore items returns "OK" response ``` """ Bulk delete datastore items returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.actions_datastores_api import ActionsDatastoresApi from datadog_api_client.v2.model.bulk_delete_apps_datastore_items_request import BulkDeleteAppsDatastoreItemsRequest from datadog_api_client.v2.model.bulk_delete_apps_datastore_items_request_data import ( BulkDeleteAppsDatastoreItemsRequestData, ) from datadog_api_client.v2.model.bulk_delete_apps_datastore_items_request_data_attributes import ( BulkDeleteAppsDatastoreItemsRequestDataAttributes, ) from datadog_api_client.v2.model.bulk_delete_apps_datastore_items_request_data_type import ( BulkDeleteAppsDatastoreItemsRequestDataType, ) # there is a valid "datastore" in the system DATASTORE_DATA_ID = environ["DATASTORE_DATA_ID"] body = BulkDeleteAppsDatastoreItemsRequest( data=BulkDeleteAppsDatastoreItemsRequestData( attributes=BulkDeleteAppsDatastoreItemsRequestDataAttributes( item_keys=[ "test-key", ], ), type=BulkDeleteAppsDatastoreItemsRequestDataType.ITEMS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ActionsDatastoresApi(api_client) response = api_instance.bulk_delete_datastore_items(datastore_id=DATASTORE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python3 "example.py" ``` ##### Bulk delete datastore items returns "OK" response ``` # Bulk delete datastore items returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ActionsDatastoresAPI.new # there is a valid "datastore" in the system DATASTORE_DATA_ID = ENV["DATASTORE_DATA_ID"] body = DatadogAPIClient::V2::BulkDeleteAppsDatastoreItemsRequest.new({ data: DatadogAPIClient::V2::BulkDeleteAppsDatastoreItemsRequestData.new({ attributes: DatadogAPIClient::V2::BulkDeleteAppsDatastoreItemsRequestDataAttributes.new({ item_keys: [ "test-key", ], }), type: DatadogAPIClient::V2::BulkDeleteAppsDatastoreItemsRequestDataType::ITEMS, }), }) p api_instance.bulk_delete_datastore_items(DATASTORE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Bulk delete datastore items returns "OK" response ``` // Bulk delete datastore items returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_actions_datastores::ActionsDatastoresAPI; use datadog_api_client::datadogV2::model::BulkDeleteAppsDatastoreItemsRequest; use datadog_api_client::datadogV2::model::BulkDeleteAppsDatastoreItemsRequestData; use datadog_api_client::datadogV2::model::BulkDeleteAppsDatastoreItemsRequestDataAttributes; use datadog_api_client::datadogV2::model::BulkDeleteAppsDatastoreItemsRequestDataType; #[tokio::main] async fn main() { // there is a valid "datastore" in the system let datastore_data_id = std::env::var("DATASTORE_DATA_ID").unwrap(); let body = BulkDeleteAppsDatastoreItemsRequest::new().data( BulkDeleteAppsDatastoreItemsRequestData::new( BulkDeleteAppsDatastoreItemsRequestDataType::ITEMS, ) .attributes( BulkDeleteAppsDatastoreItemsRequestDataAttributes::new() .item_keys(vec!["test-key".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = ActionsDatastoresAPI::with_config(configuration); let resp = api .bulk_delete_datastore_items(datastore_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Bulk delete datastore items returns "OK" response ``` /** * Bulk delete datastore items returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ActionsDatastoresApi(configuration); // there is a valid "datastore" in the system const DATASTORE_DATA_ID = process.env.DATASTORE_DATA_ID as string; const params: v2.ActionsDatastoresApiBulkDeleteDatastoreItemsRequest = { body: { data: { attributes: { itemKeys: ["test-key"], }, type: "items", }, }, datastoreId: DATASTORE_DATA_ID, }; apiInstance .bulkDeleteDatastoreItems(params) .then((data: v2.DeleteAppsDatastoreItemResponseArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/agentless-scanning # Agentless Scanning Datadog Agentless Scanning provides visibility into risks and vulnerabilities within your hosts, running containers, and serverless functions—all without requiring teams to install Agents on every host or where Agents cannot be installed. Agentless Scanning also offers Sensitive Data Scanning capabilities for your storage. See the Agentless Scanning documentation to learn more. ## [List AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/awshttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/awshttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws ### Overview Fetches the scan options configured for AWS accounts. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint.
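As a quick illustration of how the returned scan options can be consumed, the sketch below uses the same Python client as the Code Example section to list the scan options and flag accounts where host OS vulnerability scanning is disabled. This is a minimal sketch rather than an official example: it assumes `DD_API_KEY` and `DD_APP_KEY` are set in the environment and that the field names match the response model documented under Response.

```
"""
Minimal sketch: list AWS scan options and flag accounts where host OS
vulnerability scanning is disabled. Assumes the datadog-api-client package
is installed and DD_API_KEY / DD_APP_KEY are set in the environment.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    response = api_instance.list_aws_scan_options()
    # Field access mirrors the documented response model (data[].attributes.vuln_host_os);
    # adjust if your client version exposes these fields differently.
    for account in response.data:
        if not account.attributes.vuln_host_os:
            print("Host OS vulnerability scanning is disabled for account", account.id)
```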
### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsScanOptions-200-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsScanOptions-403-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes a list of AWS scan options. Field Type Description data [object] A list of AWS scan options. attributes object Attributes for the AWS scan options. lambda boolean Indicates if scanning of Lambda functions is enabled. sensitive_data boolean Indicates if scanning for sensitive data is enabled. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id string The ID of the AWS account. type enum The type of the resource. The value should always be `aws_scan_options`. Allowed enum values: `aws_scan_options` default: `aws_scan_options` ``` { "data": [ { "attributes": { "lambda": true, "sensitive_data": false, "vuln_containers_os": true, "vuln_host_os": true }, "id": "184366314700", "type": "aws_scan_options" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### List AWS scan options Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List AWS scan options ``` """ List AWS scan options returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.list_aws_scan_options() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List AWS scan options ``` # List AWS scan options returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.list_aws_scan_options() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List AWS scan options ``` // List AWS scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.ListAwsScanOptions(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.ListAwsScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.ListAwsScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
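# Set DD_SITE to your Datadog site (for example, datadoghq.com) and provide valid DD_API_KEY and DD_APP_KEY values before running.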
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List AWS scan options ``` // List AWS scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsScanOptionsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { AwsScanOptionsListResponse result = apiInstance.listAwsScanOptions(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#listAwsScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List AWS scan options ``` // List AWS scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.list_aws_scan_options().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List AWS scan options ``` /** * List AWS scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); apiInstance .listAwsScanOptions() .then((data: v2.AwsScanOptionsListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options-v2) POST https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/awshttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/awshttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/awshttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws ### Overview Activate Agentless scan options for an AWS account. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Request #### Body Data (required) The definition of the new scan options. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data [_required_] object Object for the scan options of a single AWS account. attributes [_required_] object Attributes for the AWS scan options to create. lambda [_required_] boolean Indicates if scanning of Lambda functions is enabled. sensitive_data [_required_] boolean Indicates if scanning for sensitive data is enabled. vuln_containers_os [_required_] boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os [_required_] boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The ID of the AWS account. type [_required_] enum The type of the resource. The value should always be `aws_scan_options`. Allowed enum values: `aws_scan_options` default: `aws_scan_options` ``` { "data": { "attributes": { "lambda": true, "sensitive_data": false, "vuln_containers_os": true, "vuln_host_os": true }, "id": "123456789012", "type": "aws_scan_options" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsScanOptions-201-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsScanOptions-403-v2) * [409](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsScanOptions-409-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsScanOptions-429-v2) Agentless scan options enabled successfully. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes the scan options of an AWS account. Field Type Description data object Single AWS Scan Options entry. attributes object Attributes for the AWS scan options. lambda boolean Indicates if scanning of Lambda functions is enabled. sensitive_data boolean Indicates if scanning for sensitive data is enabled. 
vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id string The ID of the AWS account. type enum The type of the resource. The value should always be `aws_scan_options`. Allowed enum values: `aws_scan_options` default: `aws_scan_options` ``` { "data": { "attributes": { "lambda": true, "sensitive_data": false, "vuln_containers_os": true, "vuln_host_os": true }, "id": "184366314700", "type": "aws_scan_options" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Create AWS scan options Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "lambda": true, "sensitive_data": false, "vuln_containers_os": true, "vuln_host_os": true }, "id": "123456789012", "type": "aws_scan_options" } } EOF ``` ##### Create AWS scan options ``` """ Create AWS scan options returns "Agentless scan options enabled successfully." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.aws_scan_options_create_attributes import AwsScanOptionsCreateAttributes from datadog_api_client.v2.model.aws_scan_options_create_data import AwsScanOptionsCreateData from datadog_api_client.v2.model.aws_scan_options_create_request import AwsScanOptionsCreateRequest from datadog_api_client.v2.model.aws_scan_options_type import AwsScanOptionsType body = AwsScanOptionsCreateRequest( data=AwsScanOptionsCreateData( id="000000000003", type=AwsScanOptionsType.AWS_SCAN_OPTIONS, attributes=AwsScanOptionsCreateAttributes( _lambda=True, sensitive_data=False, vuln_containers_os=True, vuln_host_os=True, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.create_aws_scan_options(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create AWS scan options ``` # Create AWS scan options returns "Agentless scan options enabled successfully." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::AwsScanOptionsCreateRequest.new({ data: DatadogAPIClient::V2::AwsScanOptionsCreateData.new({ id: "000000000003", type: DatadogAPIClient::V2::AwsScanOptionsType::AWS_SCAN_OPTIONS, attributes: DatadogAPIClient::V2::AwsScanOptionsCreateAttributes.new({ lambda: true, sensitive_data: false, vuln_containers_os: true, vuln_host_os: true, }), }), }) p api_instance.create_aws_scan_options(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create AWS scan options ``` // Create AWS scan options returns "Agentless scan options enabled successfully." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AwsScanOptionsCreateRequest{ Data: datadogV2.AwsScanOptionsCreateData{ Id: "000000000003", Type: datadogV2.AWSSCANOPTIONSTYPE_AWS_SCAN_OPTIONS, Attributes: datadogV2.AwsScanOptionsCreateAttributes{ Lambda: true, SensitiveData: false, VulnContainersOs: true, VulnHostOs: true, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.CreateAwsScanOptions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.CreateAwsScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.CreateAwsScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create AWS scan options ``` // Create AWS scan options returns "Agentless scan options enabled successfully." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsScanOptionsCreateAttributes; import com.datadog.api.client.v2.model.AwsScanOptionsCreateData; import com.datadog.api.client.v2.model.AwsScanOptionsCreateRequest; import com.datadog.api.client.v2.model.AwsScanOptionsResponse; import com.datadog.api.client.v2.model.AwsScanOptionsType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); AwsScanOptionsCreateRequest body = new AwsScanOptionsCreateRequest() .data( new AwsScanOptionsCreateData() .id("000000000003") .type(AwsScanOptionsType.AWS_SCAN_OPTIONS) .attributes( new AwsScanOptionsCreateAttributes() .lambda(true) .sensitiveData(false) .vulnContainersOs(true) .vulnHostOs(true))); try { AwsScanOptionsResponse result = apiInstance.createAwsScanOptions(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#createAwsScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create AWS scan options ``` // Create AWS scan options returns "Agentless scan options enabled successfully." 
// response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::AwsScanOptionsCreateAttributes; use datadog_api_client::datadogV2::model::AwsScanOptionsCreateData; use datadog_api_client::datadogV2::model::AwsScanOptionsCreateRequest; use datadog_api_client::datadogV2::model::AwsScanOptionsType; #[tokio::main] async fn main() { let body = AwsScanOptionsCreateRequest::new(AwsScanOptionsCreateData::new( AwsScanOptionsCreateAttributes::new(true, false, true, true), "000000000003".to_string(), AwsScanOptionsType::AWS_SCAN_OPTIONS, )); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.create_aws_scan_options(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create AWS scan options ``` /** * Create AWS scan options returns "Agentless scan options enabled successfully." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiCreateAwsScanOptionsRequest = { body: { data: { id: "000000000003", type: "aws_scan_options", attributes: { lambda: true, sensitiveData: false, vulnContainersOs: true, vulnHostOs: true, }, }, }, }; apiInstance .createAwsScanOptions(params) .then((data: v2.AwsScanOptionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id} ### Overview Fetches the Agentless scan options for an activated account. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string The ID of an AWS account. 
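For a quick check that an account is activated, the sketch below requests its scan options from a single site endpoint and filters the response down to the `attributes` object. This is a minimal sketch, not the documented example: the `api.datadoghq.com` host, the placeholder account ID, and the use of the `jq` CLI are assumptions for illustration.

```
# Minimal sketch: fetch the scan options for one AWS account and show only the flags.
# Assumes DD_API_KEY and DD_APP_KEY are exported and that jq is installed.
export account_id="123456789012"
curl -s "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" | jq '.data.attributes'
```

A `200` response contains the `lambda`, `sensitive_data`, `vuln_containers_os`, and `vuln_host_os` booleans described in the response model below; a `404` returns an `errors` array instead.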
### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsScanOptions-200-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes the scan options of an AWS account. Field Type Description data object Single AWS Scan Options entry. attributes object Attributes for the AWS scan options. lambda boolean Indicates if scanning of Lambda functions is enabled. sensitive_data boolean Indicates if scanning for sensitive data is enabled. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id string The ID of the AWS account. type enum The type of the resource. The value should always be `aws_scan_options`. Allowed enum values: `aws_scan_options` default: `aws_scan_options` ``` { "data": { "attributes": { "lambda": true, "sensitive_data": false, "vuln_containers_os": true, "vuln_host_os": true }, "id": "184366314700", "type": "aws_scan_options" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Get AWS scan options

```
# Path parameters
export account_id="123456789012"
# Curl command
curl -X GET "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get AWS scan options

```
"""
Get AWS scan options returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi

# there is a valid "aws_scan_options" in the system
AWS_SCAN_OPTIONS_ID = environ["AWS_SCAN_OPTIONS_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    response = api_instance.get_aws_scan_options(
        account_id=AWS_SCAN_OPTIONS_ID,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get AWS scan options

```
# Get AWS scan options returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new

# there is a valid "aws_scan_options" in the system
AWS_SCAN_OPTIONS_ID = ENV["AWS_SCAN_OPTIONS_ID"]
p api_instance.get_aws_scan_options(AWS_SCAN_OPTIONS_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get AWS scan options

```
// Get AWS scan options returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "aws_scan_options" in the system
	AwsScanOptionsID := os.Getenv("AWS_SCAN_OPTIONS_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewAgentlessScanningApi(apiClient)
	resp, r, err := api.GetAwsScanOptions(ctx, AwsScanOptionsID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.GetAwsScanOptions`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.GetAwsScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get AWS scan options ``` // Get AWS scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsScanOptionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); // there is a valid "aws_scan_options" in the system String AWS_SCAN_OPTIONS_ID = System.getenv("AWS_SCAN_OPTIONS_ID"); try { AwsScanOptionsResponse result = apiInstance.getAwsScanOptions(AWS_SCAN_OPTIONS_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#getAwsScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get AWS scan options ``` // Get AWS scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { // there is a valid "aws_scan_options" in the system let aws_scan_options_id = std::env::var("AWS_SCAN_OPTIONS_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.get_aws_scan_options(aws_scan_options_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get AWS scan options ``` /** * Get AWS scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); // there is a valid "aws_scan_options" in the system const AWS_SCAN_OPTIONS_ID = process.env.AWS_SCAN_OPTIONS_ID as string; const params: v2.AgentlessScanningApiGetAwsScanOptionsRequest = { accountId: AWS_SCAN_OPTIONS_ID, }; apiInstance .getAwsScanOptions(params) .then((data: v2.AwsScanOptionsResponse) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options-v2) PATCH https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id} ### Overview Update the Agentless scan options for an activated account. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string The ID of an AWS account. ### Request #### Body Data (required) New definition of the scan options. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data [_required_] object Object for the scan options of a single AWS account. attributes [_required_] object Attributes for the AWS scan options to update. lambda boolean Indicates if scanning of Lambda functions is enabled. sensitive_data boolean Indicates if scanning for sensitive data is enabled. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The ID of the AWS account. type [_required_] enum The type of the resource. The value should always be `aws_scan_options`. Allowed enum values: `aws_scan_options` default: `aws_scan_options` ``` { "data": { "type": "aws_scan_options", "id": "000000000002", "attributes": { "vuln_host_os": true, "vuln_containers_os": true, "lambda": false } } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAwsScanOptions-204-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAwsScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAwsScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAwsScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAwsScanOptions-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Authorized

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field | Type | Description
--- | --- | ---
errors [_required_] | [string] | A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field | Type | Description
--- | --- | ---
errors [_required_] | [string] | A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field | Type | Description
--- | --- | ---
errors [_required_] | [string] | A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Update AWS scan options returns "No Content" response

```
# Path parameters
export account_id="123456789012"
# Curl command
curl -X PATCH "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/${account_id}" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "type": "aws_scan_options",
    "id": "000000000002",
    "attributes": {
      "vuln_host_os": true,
      "vuln_containers_os": true,
      "lambda": false
    }
  }
}
EOF
```

##### Update AWS scan options returns "No Content" response

```
// Update AWS scan options returns "No Content" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.AwsScanOptionsUpdateRequest{
		Data: datadogV2.AwsScanOptionsUpdateData{
			Type: datadogV2.AWSSCANOPTIONSTYPE_AWS_SCAN_OPTIONS,
			Id:   "000000000002",
			Attributes: datadogV2.AwsScanOptionsUpdateAttributes{
				VulnHostOs:       datadog.PtrBool(true),
				VulnContainersOs: datadog.PtrBool(true),
				Lambda:           datadog.PtrBool(false),
			},
		},
	}

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewAgentlessScanningApi(apiClient)
	r, err := api.UpdateAwsScanOptions(ctx, "000000000002", body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.UpdateAwsScanOptions`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
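# Before running, set DD_SITE to your Datadog site (one of datadoghq.com, us3.datadoghq.com,
# us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
# and supply valid DD_API_KEY and DD_APP_KEY values: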
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update AWS scan options returns "No Content" response ``` // Update AWS scan options returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsScanOptionsType; import com.datadog.api.client.v2.model.AwsScanOptionsUpdateAttributes; import com.datadog.api.client.v2.model.AwsScanOptionsUpdateData; import com.datadog.api.client.v2.model.AwsScanOptionsUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); AwsScanOptionsUpdateRequest body = new AwsScanOptionsUpdateRequest() .data( new AwsScanOptionsUpdateData() .type(AwsScanOptionsType.AWS_SCAN_OPTIONS) .id("000000000002") .attributes( new AwsScanOptionsUpdateAttributes() .vulnHostOs(true) .vulnContainersOs(true) .lambda(false))); try { apiInstance.updateAwsScanOptions("000000000002", body); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#updateAwsScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update AWS scan options returns "No Content" response ``` """ Update AWS scan options returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.aws_scan_options_type import AwsScanOptionsType from datadog_api_client.v2.model.aws_scan_options_update_attributes import AwsScanOptionsUpdateAttributes from datadog_api_client.v2.model.aws_scan_options_update_data import AwsScanOptionsUpdateData from datadog_api_client.v2.model.aws_scan_options_update_request import AwsScanOptionsUpdateRequest body = AwsScanOptionsUpdateRequest( data=AwsScanOptionsUpdateData( type=AwsScanOptionsType.AWS_SCAN_OPTIONS, id="000000000002", attributes=AwsScanOptionsUpdateAttributes( vuln_host_os=True, vuln_containers_os=True, _lambda=False, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) api_instance.update_aws_scan_options(account_id="000000000002", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update AWS scan options returns "No Content" response ``` # Update AWS scan options returns "No Content" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::AwsScanOptionsUpdateRequest.new({ data: DatadogAPIClient::V2::AwsScanOptionsUpdateData.new({ type: DatadogAPIClient::V2::AwsScanOptionsType::AWS_SCAN_OPTIONS, id: "000000000002", attributes: DatadogAPIClient::V2::AwsScanOptionsUpdateAttributes.new({ vuln_host_os: true, vuln_containers_os: true, lambda: false, }), }), }) api_instance.update_aws_scan_options("000000000002", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update AWS scan options returns "No Content" response ``` // Update AWS scan options returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::AwsScanOptionsType; use datadog_api_client::datadogV2::model::AwsScanOptionsUpdateAttributes; use datadog_api_client::datadogV2::model::AwsScanOptionsUpdateData; use datadog_api_client::datadogV2::model::AwsScanOptionsUpdateRequest; #[tokio::main] async fn main() { let body = AwsScanOptionsUpdateRequest::new(AwsScanOptionsUpdateData::new( AwsScanOptionsUpdateAttributes::new() .lambda(false) .vuln_containers_os(true) .vuln_host_os(true), "000000000002".to_string(), AwsScanOptionsType::AWS_SCAN_OPTIONS, )); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .update_aws_scan_options("000000000002".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update AWS scan options returns "No Content" response ``` /** * Update AWS scan options returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiUpdateAwsScanOptionsRequest = { body: { data: { type: "aws_scan_options", id: "000000000002", attributes: { vulnHostOs: true, vulnContainersOs: true, lambda: false, }, }, }, accountId: "000000000002", }; apiInstance .updateAwsScanOptions(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options-v2) DELETE https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/aws/{account_id} ### Overview Delete Agentless scan options for an AWS account. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string The ID of an AWS account. ### Response * [204](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAwsScanOptions-204-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAwsScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAwsScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAwsScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAwsScanOptions-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Delete AWS scan options

```
# Path parameters
export account_id="123456789012"
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/aws/${account_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete AWS scan options

```
"""
Delete AWS scan options returns "No Content" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    api_instance.delete_aws_scan_options(
        account_id="account_id",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete AWS scan options

```
# Delete AWS scan options returns "No Content" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new
api_instance.delete_aws_scan_options("account_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete AWS scan options

```
// Delete AWS scan options returns "No Content" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewAgentlessScanningApi(apiClient)
	r, err := api.DeleteAwsScanOptions(ctx, "account_id")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.DeleteAwsScanOptions`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run
"main.go" ``` ##### Delete AWS scan options ``` // Delete AWS scan options returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { apiInstance.deleteAwsScanOptions("123456789012"); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#deleteAwsScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete AWS scan options ``` // Delete AWS scan options returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.delete_aws_scan_options("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete AWS scan options ``` /** * Delete AWS scan options returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiDeleteAwsScanOptionsRequest = { accountId: "account_id", }; apiInstance .deleteAwsScanOptions(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/azurehttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/azurehttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure ### Overview Fetches the scan options configured for Azure accounts. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAzureScanOptions-200-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAzureScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing a list of Azure scan options. Field Type Description data [_required_] [object] A list of Azure scan options. attributes object Attributes for Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The Azure subscription ID. type [_required_] enum The type of the resource. The value should always be `azure_scan_options`. Allowed enum values: `azure_scan_options` default: `azure_scan_options` ``` { "data": [ { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### List Azure scan options Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Azure scan options ``` """ List Azure scan options returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.list_azure_scan_options() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Azure scan options ``` # List Azure scan options returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.list_azure_scan_options() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Azure scan options ``` // List Azure scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.ListAzureScanOptions(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.ListAzureScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.ListAzureScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
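# DD_SITE selects the regional endpoint (for example datadoghq.com), and DD_API_KEY and
# DD_APP_KEY authenticate the request made by the compiled example: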
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Azure scan options ``` // List Azure scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AzureScanOptionsArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { AzureScanOptionsArray result = apiInstance.listAzureScanOptions(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#listAzureScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Azure scan options ``` // List Azure scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.list_azure_scan_options().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Azure scan options ``` /** * List Azure scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); apiInstance .listAzureScanOptions() .then((data: v2.AzureScanOptionsArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options-v2) POST https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/azurehttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/azurehttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/azurehttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure ### Overview Activate Agentless scan options for an Azure subscription. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data object Single Azure scan options entry. attributes object Attributes for Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The Azure subscription ID. type [_required_] enum The type of the resource. The value should always be `azure_scan_options`. Allowed enum values: `azure_scan_options` default: `azure_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAzureScanOptions-201-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAzureScanOptions-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing Azure scan options for a single subscription. Field Type Description data object Single Azure scan options entry. attributes object Attributes for Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The Azure subscription ID. type [_required_] enum The type of the resource. The value should always be `azure_scan_options`. 
Allowed enum values: `azure_scan_options` default: `azure_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Create Azure scan options returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } } EOF ``` ##### Create Azure scan options returns "Created" response ``` // Create Azure scan options returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AzureScanOptions{ Data: &datadogV2.AzureScanOptionsData{ Attributes: &datadogV2.AzureScanOptionsDataAttributes{ VulnContainersOs: datadog.PtrBool(true), VulnHostOs: datadog.PtrBool(true), }, Id: "12345678-90ab-cdef-1234-567890abcdef", Type: datadogV2.AZURESCANOPTIONSDATATYPE_AZURE_SCAN_OPTIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.CreateAzureScanOptions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.CreateAzureScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.CreateAzureScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Azure scan options returns "Created" response ``` // Create Azure scan options returns "Created" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AzureScanOptions; import com.datadog.api.client.v2.model.AzureScanOptionsData; import com.datadog.api.client.v2.model.AzureScanOptionsDataAttributes; import com.datadog.api.client.v2.model.AzureScanOptionsDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); AzureScanOptions body = new AzureScanOptions() .data( new AzureScanOptionsData() .attributes( new AzureScanOptionsDataAttributes() .vulnContainersOs(true) .vulnHostOs(true)) .id("12345678-90ab-cdef-1234-567890abcdef") .type(AzureScanOptionsDataType.AZURE_SCAN_OPTIONS)); try { AzureScanOptions result = apiInstance.createAzureScanOptions(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#createAzureScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Azure scan options returns "Created" response ``` """ Create Azure scan options returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.azure_scan_options import AzureScanOptions from datadog_api_client.v2.model.azure_scan_options_data import AzureScanOptionsData from datadog_api_client.v2.model.azure_scan_options_data_attributes import AzureScanOptionsDataAttributes from datadog_api_client.v2.model.azure_scan_options_data_type import AzureScanOptionsDataType body = AzureScanOptions( data=AzureScanOptionsData( attributes=AzureScanOptionsDataAttributes( vuln_containers_os=True, vuln_host_os=True, ), id="12345678-90ab-cdef-1234-567890abcdef", type=AzureScanOptionsDataType.AZURE_SCAN_OPTIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.create_azure_scan_options(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Azure scan options returns "Created" response ``` # Create Azure scan options returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::AzureScanOptions.new({ data: DatadogAPIClient::V2::AzureScanOptionsData.new({ attributes: DatadogAPIClient::V2::AzureScanOptionsDataAttributes.new({ vuln_containers_os: true, vuln_host_os: true, }), id: 
"12345678-90ab-cdef-1234-567890abcdef", type: DatadogAPIClient::V2::AzureScanOptionsDataType::AZURE_SCAN_OPTIONS, }), }) p api_instance.create_azure_scan_options(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Azure scan options returns "Created" response ``` // Create Azure scan options returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::AzureScanOptions; use datadog_api_client::datadogV2::model::AzureScanOptionsData; use datadog_api_client::datadogV2::model::AzureScanOptionsDataAttributes; use datadog_api_client::datadogV2::model::AzureScanOptionsDataType; #[tokio::main] async fn main() { let body = AzureScanOptions::new().data( AzureScanOptionsData::new( "12345678-90ab-cdef-1234-567890abcdef".to_string(), AzureScanOptionsDataType::AZURE_SCAN_OPTIONS, ) .attributes( AzureScanOptionsDataAttributes::new() .vuln_containers_os(true) .vuln_host_os(true), ), ); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.create_azure_scan_options(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Azure scan options returns "Created" response ``` /** * Create Azure scan options returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiCreateAzureScanOptionsRequest = { body: { data: { attributes: { vulnContainersOs: true, vulnHostOs: true, }, id: "12345678-90ab-cdef-1234-567890abcdef", type: "azure_scan_options", }, }, }; apiInstance .createAzureScanOptions(params) .then((data: v2.AzureScanOptions) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id} ### Overview Fetches the Agentless scan options for an activated subscription. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description subscription_id [_required_] string The Azure subscription ID. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAzureScanOptions-200-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAzureScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAzureScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAzureScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAzureScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing Azure scan options for a single subscription. Field Type Description data object Single Azure scan options entry. attributes object Attributes for Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The Azure subscription ID. type [_required_] enum The type of the resource. The value should always be `azure_scan_options`. Allowed enum values: `azure_scan_options` default: `azure_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. 
Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Get Azure scan options

```
# Path parameters
export subscription_id="12345678-90ab-cdef-1234-567890abcdef"
# Curl command
curl -X GET "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/azure/${subscription_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get Azure scan options

```
"""
Get Azure scan options returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    response = api_instance.get_azure_scan_options(
        subscription_id="12345678-90ab-cdef-1234-567890abcdef",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get Azure scan options

```
# Get Azure scan options returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new
p api_instance.get_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get Azure scan options

```
// Get Azure scan options returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.GetAzureScanOptions(ctx, "12345678-90ab-cdef-1234-567890abcdef") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.GetAzureScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.GetAzureScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Azure scan options ``` // Get Azure scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AzureScanOptions; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { AzureScanOptions result = apiInstance.getAzureScanOptions("12345678-90ab-cdef-1234-567890abcdef"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#getAzureScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Azure scan options ``` // Get Azure scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .get_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Azure scan options ``` /** * Get Azure scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: 
v2.AgentlessScanningApiGetAzureScanOptionsRequest = { subscriptionId: "12345678-90ab-cdef-1234-567890abcdef", }; apiInstance .getAzureScanOptions(params) .then((data: v2.AzureScanOptions) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options-v2) PATCH https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id} ### Overview Update the Agentless scan options for an activated subscription. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description subscription_id [_required_] string The Azure subscription ID. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data object Data object for updating the scan options of a single Azure subscription. attributes object Attributes for updating Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The Azure subscription ID. type [_required_] enum Azure scan options resource type. Allowed enum values: `azure_scan_options` default: `azure_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": false, "vuln_host_os": false }, "id": "12345678-90ab-cdef-1234-567890abcdef", "type": "azure_scan_options" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAzureScanOptions-200-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateAzureScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing Azure scan options for a single subscription. Field Type Description data object Single Azure scan options entry. attributes object Attributes for Azure scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. 
id [_required_] string The Azure subscription ID.
type [_required_] enum The type of the resource. The value should always be `azure_scan_options`. Allowed enum values: `azure_scan_options` default: `azure_scan_options`

```
{
  "data": {
    "attributes": {
      "vuln_containers_os": true,
      "vuln_host_os": true
    },
    "id": "12345678-90ab-cdef-1234-567890abcdef",
    "type": "azure_scan_options"
  }
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Update Azure scan options

```
# Path parameters
export subscription_id="12345678-90ab-cdef-1234-567890abcdef"
# Curl command
curl -X PATCH "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/azure/${subscription_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "12345678-90ab-cdef-1234-567890abcdef",
    "type": "azure_scan_options"
  }
}
EOF
```

##### Update Azure scan options

```
"""
Update Azure scan options returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi
from datadog_api_client.v2.model.azure_scan_options_input_update import AzureScanOptionsInputUpdate
from datadog_api_client.v2.model.azure_scan_options_input_update_data import AzureScanOptionsInputUpdateData
from datadog_api_client.v2.model.azure_scan_options_input_update_data_type import AzureScanOptionsInputUpdateDataType

body = AzureScanOptionsInputUpdate(
    data=AzureScanOptionsInputUpdateData(
        id="12345678-90ab-cdef-1234-567890abcdef",
        type=AzureScanOptionsInputUpdateDataType.AZURE_SCAN_OPTIONS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    response = api_instance.update_azure_scan_options(subscription_id="12345678-90ab-cdef-1234-567890abcdef", body=body)

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Update Azure scan options

```
# Update Azure scan options returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new
body =
DatadogAPIClient::V2::AzureScanOptionsInputUpdate.new({ data: DatadogAPIClient::V2::AzureScanOptionsInputUpdateData.new({ id: "12345678-90ab-cdef-1234-567890abcdef", type: DatadogAPIClient::V2::AzureScanOptionsInputUpdateDataType::AZURE_SCAN_OPTIONS, }), }) p api_instance.update_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Azure scan options ``` // Update Azure scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AzureScanOptionsInputUpdate{ Data: &datadogV2.AzureScanOptionsInputUpdateData{ Id: "12345678-90ab-cdef-1234-567890abcdef", Type: datadogV2.AZURESCANOPTIONSINPUTUPDATEDATATYPE_AZURE_SCAN_OPTIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.UpdateAzureScanOptions(ctx, "12345678-90ab-cdef-1234-567890abcdef", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.UpdateAzureScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.UpdateAzureScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Azure scan options ``` // Update Azure scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AzureScanOptions; import com.datadog.api.client.v2.model.AzureScanOptionsInputUpdate; import com.datadog.api.client.v2.model.AzureScanOptionsInputUpdateData; import com.datadog.api.client.v2.model.AzureScanOptionsInputUpdateDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); AzureScanOptionsInputUpdate body = new AzureScanOptionsInputUpdate() .data( new AzureScanOptionsInputUpdateData() .id("12345678-90ab-cdef-1234-567890abcdef") .type(AzureScanOptionsInputUpdateDataType.AZURE_SCAN_OPTIONS)); try { AzureScanOptions result = apiInstance.updateAzureScanOptions("12345678-90ab-cdef-1234-567890abcdef", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#updateAzureScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + 
e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Azure scan options ``` // Update Azure scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::AzureScanOptionsInputUpdate; use datadog_api_client::datadogV2::model::AzureScanOptionsInputUpdateData; use datadog_api_client::datadogV2::model::AzureScanOptionsInputUpdateDataType; #[tokio::main] async fn main() { let body = AzureScanOptionsInputUpdate::new().data(AzureScanOptionsInputUpdateData::new( "12345678-90ab-cdef-1234-567890abcdef".to_string(), AzureScanOptionsInputUpdateDataType::AZURE_SCAN_OPTIONS, )); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .update_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Azure scan options ``` /** * Update Azure scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiUpdateAzureScanOptionsRequest = { body: { data: { id: "12345678-90ab-cdef-1234-567890abcdef", type: "azure_scan_options", }, }, subscriptionId: "12345678-90ab-cdef-1234-567890abcdef", }; apiInstance .updateAzureScanOptions(params) .then((data: v2.AzureScanOptions) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options-v2) DELETE https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure/{subscription_id} ### Overview Delete Agentless scan options for an Azure subscription. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description subscription_id [_required_] string The Azure subscription ID. ### Response * [204](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAzureScanOptions-204-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteAzureScanOptions-429-v2) No Content Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Delete Azure scan options Copy ``` # Path parameters export subscription_id="12345678-90ab-cdef-1234-567890abcdef" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/azure/${subscription_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Azure scan options ``` """ Delete Azure scan options returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) api_instance.delete_azure_scan_options( subscription_id="12345678-90ab-cdef-1234-567890abcdef", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Azure scan options ``` # Delete Azure scan options returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new api_instance.delete_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Azure scan options ``` // Delete Azure scan options returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) r, err := api.DeleteAzureScanOptions(ctx, "12345678-90ab-cdef-1234-567890abcdef") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.DeleteAzureScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Azure scan options ``` // Delete Azure scan options returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { apiInstance.deleteAzureScanOptions("12345678-90ab-cdef-1234-567890abcdef"); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#deleteAzureScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Azure scan options ``` // Delete Azure scan options returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .delete_azure_scan_options("12345678-90ab-cdef-1234-567890abcdef".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Azure scan options ``` /** * Delete Azure scan options returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiDeleteAzureScanOptionsRequest = { subscriptionId: "12345678-90ab-cdef-1234-567890abcdef", }; apiInstance .deleteAzureScanOptions(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/gcphttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/gcphttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp ### Overview Fetches the scan options configured for all GCP projects. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListGcpScanOptions-200-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListGcpScanOptions-403-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListGcpScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing a list of GCP scan options. Field Type Description data [_required_] [object] A list of GCP scan options. attributes object Attributes for GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. Allowed enum values: `gcp_scan_options` default: `gcp_scan_options` ``` { "data": [ { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "company-project-id", "type": "gcp_scan_options" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### List GCP scan options Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List GCP scan options ``` """ List GCP scan options returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.list_gcp_scan_options() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List GCP scan options ``` # List GCP scan options returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.list_gcp_scan_options() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List GCP scan options ``` // List GCP scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.ListGcpScanOptions(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.ListGcpScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.ListGcpScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List GCP scan options ``` // List GCP scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.GcpScanOptionsArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { GcpScanOptionsArray result = apiInstance.listGcpScanOptions(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#listGcpScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List GCP scan options ``` // List GCP scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.list_gcp_scan_options().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List GCP scan options ``` /** * List GCP scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); apiInstance .listGcpScanOptions() .then((data: v2.GcpScanOptionsArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options-v2) POST https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.datadoghq.eu/api/v2/agentless_scanning/accounts/gcphttps://api.ddog-gov.com/api/v2/agentless_scanning/accounts/gcphttps://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/gcphttps://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp ### Overview Activate Agentless scan options for a GCP project. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Request #### Body Data (required) The definition of the new scan options. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data object Single GCP scan options entry. attributes object Attributes for GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. Allowed enum values: `gcp_scan_options` default: `gcp_scan_options` ``` { "data": { "id": "new-project", "type": "gcp_scan_options", "attributes": { "vuln_host_os": true, "vuln_containers_os": true } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateGcpScanOptions-201-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateGcpScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateGcpScanOptions-403-v2) * [409](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateGcpScanOptions-409-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateGcpScanOptions-429-v2) Agentless scan options enabled successfully. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing GCP scan options for a single project. Field Type Description data object Single GCP scan options entry. attributes object Attributes for GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. 
Allowed enum values: `gcp_scan_options` default: `gcp_scan_options`

```
{
  "data": {
    "attributes": {
      "vuln_containers_os": true,
      "vuln_host_os": true
    },
    "id": "company-project-id",
    "type": "gcp_scan_options"
  }
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Authorized

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Conflict

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/)
* [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript)

##### Create GCP scan options returns "Agentless scan options enabled successfully." response

```
# Curl command
curl -X POST "https://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcp" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "new-project",
    "type": "gcp_scan_options",
    "attributes": {
      "vuln_host_os": true,
      "vuln_containers_os": true
    }
  }
}
EOF
```

##### Create GCP scan options returns "Agentless scan options enabled successfully." response

```
// Create GCP scan options returns "Agentless scan options enabled successfully."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GcpScanOptions{ Data: &datadogV2.GcpScanOptionsData{ Id: "new-project", Type: datadogV2.GCPSCANOPTIONSDATATYPE_GCP_SCAN_OPTIONS, Attributes: &datadogV2.GcpScanOptionsDataAttributes{ VulnHostOs: datadog.PtrBool(true), VulnContainersOs: datadog.PtrBool(true), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.CreateGcpScanOptions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.CreateGcpScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.CreateGcpScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create GCP scan options returns "Agentless scan options enabled successfully." response ``` // Create GCP scan options returns "Agentless scan options enabled successfully." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.GcpScanOptions; import com.datadog.api.client.v2.model.GcpScanOptionsData; import com.datadog.api.client.v2.model.GcpScanOptionsDataAttributes; import com.datadog.api.client.v2.model.GcpScanOptionsDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); GcpScanOptions body = new GcpScanOptions() .data( new GcpScanOptionsData() .id("new-project") .type(GcpScanOptionsDataType.GCP_SCAN_OPTIONS) .attributes( new GcpScanOptionsDataAttributes() .vulnHostOs(true) .vulnContainersOs(true))); try { GcpScanOptions result = apiInstance.createGcpScanOptions(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#createGcpScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create GCP scan options returns "Agentless scan options enabled successfully." response ``` """ Create GCP scan options returns "Agentless scan options enabled successfully." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.gcp_scan_options import GcpScanOptions from datadog_api_client.v2.model.gcp_scan_options_data import GcpScanOptionsData from datadog_api_client.v2.model.gcp_scan_options_data_attributes import GcpScanOptionsDataAttributes from datadog_api_client.v2.model.gcp_scan_options_data_type import GcpScanOptionsDataType body = GcpScanOptions( data=GcpScanOptionsData( id="new-project", type=GcpScanOptionsDataType.GCP_SCAN_OPTIONS, attributes=GcpScanOptionsDataAttributes( vuln_host_os=True, vuln_containers_os=True, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.create_gcp_scan_options(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create GCP scan options returns "Agentless scan options enabled successfully." response ``` # Create GCP scan options returns "Agentless scan options enabled successfully." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::GcpScanOptions.new({ data: DatadogAPIClient::V2::GcpScanOptionsData.new({ id: "new-project", type: DatadogAPIClient::V2::GcpScanOptionsDataType::GCP_SCAN_OPTIONS, attributes: DatadogAPIClient::V2::GcpScanOptionsDataAttributes.new({ vuln_host_os: true, vuln_containers_os: true, }), }), }) p api_instance.create_gcp_scan_options(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create GCP scan options returns "Agentless scan options enabled successfully." response ``` // Create GCP scan options returns "Agentless scan options enabled successfully." 
// response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::GcpScanOptions; use datadog_api_client::datadogV2::model::GcpScanOptionsData; use datadog_api_client::datadogV2::model::GcpScanOptionsDataAttributes; use datadog_api_client::datadogV2::model::GcpScanOptionsDataType; #[tokio::main] async fn main() { let body = GcpScanOptions::new().data( GcpScanOptionsData::new( "new-project".to_string(), GcpScanOptionsDataType::GCP_SCAN_OPTIONS, ) .attributes( GcpScanOptionsDataAttributes::new() .vuln_containers_os(true) .vuln_host_os(true), ), ); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.create_gcp_scan_options(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create GCP scan options returns "Agentless scan options enabled successfully." response ``` /** * Create GCP scan options returns "Agentless scan options enabled successfully." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiCreateGcpScanOptionsRequest = { body: { data: { id: "new-project", type: "gcp_scan_options", attributes: { vulnHostOs: true, vulnContainersOs: true, }, }, }, }; apiInstance .createGcpScanOptions(params) .then((data: v2.GcpScanOptions) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id} ### Overview Fetches the Agentless scan options for an activated GCP project. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint.
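A typical flow pairs this endpoint with the create endpoint above: enable scan options for a project, then read the configuration back to confirm the flags were applied. The sketch below uses the Python client calls shown on this page (`create_gcp_scan_options` and `get_gcp_scan_options`); the project ID is a placeholder.

```
"""
Enable Agentless scan options for a GCP project, then read them back.
A minimal sketch based on the Python client calls documented on this page;
"my-gcp-project" is a placeholder project ID.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi
from datadog_api_client.v2.model.gcp_scan_options import GcpScanOptions
from datadog_api_client.v2.model.gcp_scan_options_data import GcpScanOptionsData
from datadog_api_client.v2.model.gcp_scan_options_data_attributes import GcpScanOptionsDataAttributes
from datadog_api_client.v2.model.gcp_scan_options_data_type import GcpScanOptionsDataType

body = GcpScanOptions(
    data=GcpScanOptionsData(
        id="my-gcp-project",  # placeholder GCP project ID
        type=GcpScanOptionsDataType.GCP_SCAN_OPTIONS,
        attributes=GcpScanOptionsDataAttributes(
            vuln_host_os=True,
            vuln_containers_os=True,
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = AgentlessScanningApi(api_client)
    # POST /api/v2/agentless_scanning/accounts/gcp
    api.create_gcp_scan_options(body=body)
    # GET /api/v2/agentless_scanning/accounts/gcp/{project_id}
    options = api.get_gcp_scan_options(project_id="my-gcp-project")
    # The response mirrors the model above: data.attributes.vuln_host_os / vuln_containers_os.
    print(options)
```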
### Arguments #### Path Parameters Name Type Description project_id [_required_] string The GCP project ID. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetGcpScanOptions-200-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetGcpScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetGcpScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetGcpScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetGcpScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing GCP scan options for a single project. Field Type Description data object Single GCP scan options entry. attributes object Attributes for GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. Allowed enum values: `gcp_scan_options` default: `gcp_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "company-project-id", "type": "gcp_scan_options" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Get GCP scan options Copy ``` # Path parameters export project_id="company-project-id" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/${project_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get GCP scan options ``` """ Get GCP scan options returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.get_gcp_scan_options( project_id="api-spec-test", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get GCP scan options ``` # Get GCP scan options returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.get_gcp_scan_options("api-spec-test") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get GCP scan options ``` // Get GCP scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.GetGcpScanOptions(ctx, "api-spec-test") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.GetGcpScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.GetGcpScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example 
to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get GCP scan options ``` // Get GCP scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.GcpScanOptions; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { GcpScanOptions result = apiInstance.getGcpScanOptions("api-spec-test"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#getGcpScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get GCP scan options ``` // Get GCP scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.get_gcp_scan_options("api-spec-test".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get GCP scan options ``` /** * Get GCP scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiGetGcpScanOptionsRequest = { projectId: "api-spec-test", }; apiInstance .getGcpScanOptions(params) .then((data: v2.GcpScanOptions) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options-v2) PATCH https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id} ### Overview Update the Agentless scan options for an activated GCP project. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description project_id [_required_] string The GCP project ID. ### Request #### Body Data (required) New definition of the scan options. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data object Data object for updating the scan options of a single GCP project. attributes object Attributes for updating GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. Allowed enum values: `gcp_scan_options` default: `gcp_scan_options` ``` { "data": { "id": "api-spec-test", "type": "gcp_scan_options", "attributes": { "vuln_containers_os": false } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateGcpScanOptions-200-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateGcpScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateGcpScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateGcpScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#UpdateGcpScanOptions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object containing GCP scan options for a single project. Field Type Description data object Single GCP scan options entry. attributes object Attributes for GCP scan options configuration. vuln_containers_os boolean Indicates if scanning for vulnerabilities in containers is enabled. vuln_host_os boolean Indicates if scanning for vulnerabilities in hosts is enabled. id [_required_] string The GCP project ID. type [_required_] enum GCP scan options resource type. 
Allowed enum values: `gcp_scan_options` default: `gcp_scan_options` ``` { "data": { "attributes": { "vuln_containers_os": true, "vuln_host_os": true }, "id": "company-project-id", "type": "gcp_scan_options" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Update GCP scan options returns "OK" response Copy ``` # Path parameters export project_id="company-project-id" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/${project_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "api-spec-test", "type": "gcp_scan_options", "attributes": { "vuln_containers_os": false } } } EOF ``` ##### Update GCP scan options returns "OK" response ``` // Update GCP scan options returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GcpScanOptionsInputUpdate{ Data: &datadogV2.GcpScanOptionsInputUpdateData{ Id: "api-spec-test", Type: datadogV2.GCPSCANOPTIONSINPUTUPDATEDATATYPE_GCP_SCAN_OPTIONS, Attributes: &datadogV2.GcpScanOptionsInputUpdateDataAttributes{ VulnContainersOs: datadog.PtrBool(false), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.UpdateGcpScanOptions(ctx, 
"api-spec-test", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.UpdateGcpScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.UpdateGcpScanOptions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update GCP scan options returns "OK" response ``` // Update GCP scan options returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.GcpScanOptions; import com.datadog.api.client.v2.model.GcpScanOptionsInputUpdate; import com.datadog.api.client.v2.model.GcpScanOptionsInputUpdateData; import com.datadog.api.client.v2.model.GcpScanOptionsInputUpdateDataAttributes; import com.datadog.api.client.v2.model.GcpScanOptionsInputUpdateDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); GcpScanOptionsInputUpdate body = new GcpScanOptionsInputUpdate() .data( new GcpScanOptionsInputUpdateData() .id("api-spec-test") .type(GcpScanOptionsInputUpdateDataType.GCP_SCAN_OPTIONS) .attributes( new GcpScanOptionsInputUpdateDataAttributes().vulnContainersOs(false))); try { GcpScanOptions result = apiInstance.updateGcpScanOptions("api-spec-test", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#updateGcpScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update GCP scan options returns "OK" response ``` """ Update GCP scan options returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.gcp_scan_options_input_update import GcpScanOptionsInputUpdate from datadog_api_client.v2.model.gcp_scan_options_input_update_data import GcpScanOptionsInputUpdateData from datadog_api_client.v2.model.gcp_scan_options_input_update_data_attributes import ( GcpScanOptionsInputUpdateDataAttributes, ) from datadog_api_client.v2.model.gcp_scan_options_input_update_data_type import GcpScanOptionsInputUpdateDataType body = GcpScanOptionsInputUpdate( data=GcpScanOptionsInputUpdateData( id="api-spec-test", type=GcpScanOptionsInputUpdateDataType.GCP_SCAN_OPTIONS, attributes=GcpScanOptionsInputUpdateDataAttributes( vuln_containers_os=False, ), ), ) configuration = Configuration() 
with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.update_gcp_scan_options(project_id="api-spec-test", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update GCP scan options returns "OK" response ``` # Update GCP scan options returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::GcpScanOptionsInputUpdate.new({ data: DatadogAPIClient::V2::GcpScanOptionsInputUpdateData.new({ id: "api-spec-test", type: DatadogAPIClient::V2::GcpScanOptionsInputUpdateDataType::GCP_SCAN_OPTIONS, attributes: DatadogAPIClient::V2::GcpScanOptionsInputUpdateDataAttributes.new({ vuln_containers_os: false, }), }), }) p api_instance.update_gcp_scan_options("api-spec-test", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update GCP scan options returns "OK" response ``` // Update GCP scan options returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::GcpScanOptionsInputUpdate; use datadog_api_client::datadogV2::model::GcpScanOptionsInputUpdateData; use datadog_api_client::datadogV2::model::GcpScanOptionsInputUpdateDataAttributes; use datadog_api_client::datadogV2::model::GcpScanOptionsInputUpdateDataType; #[tokio::main] async fn main() { let body = GcpScanOptionsInputUpdate::new().data( GcpScanOptionsInputUpdateData::new( "api-spec-test".to_string(), GcpScanOptionsInputUpdateDataType::GCP_SCAN_OPTIONS, ) .attributes(GcpScanOptionsInputUpdateDataAttributes::new().vuln_containers_os(false)), ); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .update_gcp_scan_options("api-spec-test".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update GCP scan options returns "OK" response ``` /** * Update GCP scan options returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiUpdateGcpScanOptionsRequest = { body: { data: { id: "api-spec-test", type: "gcp_scan_options", attributes: { vulnContainersOs: false, }, }, }, projectId: "api-spec-test", }; apiInstance 
.updateGcpScanOptions(params) .then((data: v2.GcpScanOptions) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options-v2) DELETE https://api.ap1.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.eu/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.ddog-gov.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/{project_id} ### Overview Delete Agentless scan options for a GCP project. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Arguments #### Path Parameters Name Type Description project_id [_required_] string The GCP project ID. ### Response * [204](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteGcpScanOptions-204-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteGcpScanOptions-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteGcpScanOptions-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteGcpScanOptions-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#DeleteGcpScanOptions-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Delete GCP scan options Copy ``` # Path parameters export project_id="company-project-id" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/accounts/gcp/${project_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete GCP scan options ``` """ Delete GCP scan options returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) api_instance.delete_gcp_scan_options( project_id="company-project-id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete GCP scan options ``` # Delete GCP scan options returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new api_instance.delete_gcp_scan_options("company-project-id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete GCP scan options ``` // Delete GCP scan options returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) r, err := api.DeleteGcpScanOptions(ctx, "company-project-id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.DeleteGcpScanOptions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete GCP scan options ``` // Delete GCP scan options returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { apiInstance.deleteGcpScanOptions("company-project-id"); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#deleteGcpScanOptions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete GCP scan options ``` // Delete GCP scan options returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .delete_gcp_scan_options("company-project-id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete GCP scan options ``` /** * Delete GCP scan options returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiDeleteGcpScanOptionsRequest = { projectId: "company-project-id", }; apiInstance .deleteGcpScanOptions(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.datadoghq.eu/api/v2/agentless_scanning/ondemand/awshttps://api.ddog-gov.com/api/v2/agentless_scanning/ondemand/awshttps://api.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.us3.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws ### Overview Fetches the most recent 1000 AWS on demand tasks. This endpoint requires the `security_monitoring_findings_read` permission. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsOnDemandTasks-200-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsOnDemandTasks-403-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#ListAwsOnDemandTasks-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes a list of AWS on demand tasks. Field Type Description data [object] A list of on demand tasks. attributes object Attributes for the AWS on demand task. arn string The arn of the resource to scan. assigned_at string Specifies the assignment timestamp if the task has been already assigned to a scanner. created_at string The task submission timestamp. status string Indicates the status of the task. QUEUED: the task has been submitted successfully and the resource has not been assigned to a scanner yet. ASSIGNED: the task has been assigned. ABORTED: the scan has been aborted after a period of time due to technical reasons, such as resource not found, insufficient permissions, or the absence of a configured scanner. id string The UUID of the task. type enum The type of the on demand task. The value should always be `aws_resource`. Allowed enum values: `aws_resource` default: `aws_resource` ``` { "data": [ { "attributes": { "arn": "arn:aws:ec2:us-east-1:727000456123:instance/i-0eabb50529b67a1ba", "assigned_at": "2025-02-11T18:25:04.550564Z", "created_at": "2025-02-11T18:13:24.576915Z", "status": "QUEUED" }, "id": "6d09294c-9ad9-42fd-a759-a0c1599b4828", "type": "aws_resource" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### List AWS on demand tasks Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List AWS on demand tasks ``` """ List AWS on demand tasks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.list_aws_on_demand_tasks() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List AWS on demand tasks ``` # List AWS on demand tasks returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.list_aws_on_demand_tasks() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List AWS on demand tasks ``` // List AWS on demand tasks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.ListAwsOnDemandTasks(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.ListAwsOnDemandTasks`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`AgentlessScanningApi.ListAwsOnDemandTasks`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List AWS on demand tasks ``` // List AWS on demand tasks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsOnDemandListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { AwsOnDemandListResponse result = apiInstance.listAwsOnDemandTasks(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#listAwsOnDemandTasks"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List AWS on demand tasks ``` // List AWS on demand tasks returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.list_aws_on_demand_tasks().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List AWS on demand tasks ``` /** * List AWS on demand tasks returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); apiInstance .listAwsOnDemandTasks() .then((data: v2.AwsOnDemandListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task-v2) POST https://api.ap1.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.ap2.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.datadoghq.eu/api/v2/agentless_scanning/ondemand/awshttps://api.ddog-gov.com/api/v2/agentless_scanning/ondemand/awshttps://api.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.us3.datadoghq.com/api/v2/agentless_scanning/ondemand/awshttps://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws ### Overview Trigger the scan of an AWS resource with a high priority. Agentless scanning must be activated for the AWS account containing the resource to scan. This endpoint requires the `security_monitoring_findings_write` permission. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. ### Request #### Body Data (required) The definition of the on demand task. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Field Type Description data [_required_] object Object for a single AWS on demand task. attributes [_required_] object Attributes for the AWS on demand task. arn [_required_] string The arn of the resource to scan. Agentless supports the scan of EC2 instances, lambda functions, AMI, ECR, RDS and S3 buckets. type [_required_] enum The type of the on demand task. The value should always be `aws_resource`. Allowed enum values: `aws_resource` default: `aws_resource` ``` { "data": { "attributes": { "arn": "arn:aws:lambda:us-west-2:123456789012:function:my-function" }, "type": "aws_resource" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsOnDemandTask-201-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsOnDemandTask-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsOnDemandTask-403-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#CreateAwsOnDemandTask-429-v2) AWS on demand task created successfully. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes an AWS on demand task. Field Type Description data object Single AWS on demand task. attributes object Attributes for the AWS on demand task. arn string The arn of the resource to scan. assigned_at string Specifies the assignment timestamp if the task has been already assigned to a scanner. created_at string The task submission timestamp. status string Indicates the status of the task. QUEUED: the task has been submitted successfully and the resource has not been assigned to a scanner yet. ASSIGNED: the task has been assigned. 
ABORTED: the scan has been aborted after a period of time due to technical reasons, such as resource not found, insufficient permissions, or the absence of a configured scanner. id string The UUID of the task. type enum The type of the on demand task. The value should always be `aws_resource`. Allowed enum values: `aws_resource` default: `aws_resource` ``` { "data": { "attributes": { "arn": "arn:aws:ec2:us-east-1:727000456123:instance/i-0eabb50529b67a1ba", "assigned_at": "2025-02-11T18:25:04.550564Z", "created_at": "2025-02-11T18:13:24.576915Z", "status": "QUEUED" }, "id": "6d09294c-9ad9-42fd-a759-a0c1599b4828", "type": "aws_resource" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Create AWS on demand task returns "AWS on demand task created successfully." response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "arn": "arn:aws:lambda:us-west-2:123456789012:function:my-function" }, "type": "aws_resource" } } EOF ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` // Create AWS on demand task returns "AWS on demand task created successfully." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AwsOnDemandCreateRequest{ Data: datadogV2.AwsOnDemandCreateData{ Attributes: datadogV2.AwsOnDemandCreateAttributes{ Arn: "arn:aws:lambda:us-west-2:123456789012:function:my-function", }, Type: datadogV2.AWSONDEMANDTYPE_AWS_RESOURCE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.CreateAwsOnDemandTask(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.CreateAwsOnDemandTask`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.CreateAwsOnDemandTask`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` // Create AWS on demand task returns "AWS on demand task created successfully." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsOnDemandCreateAttributes; import com.datadog.api.client.v2.model.AwsOnDemandCreateData; import com.datadog.api.client.v2.model.AwsOnDemandCreateRequest; import com.datadog.api.client.v2.model.AwsOnDemandResponse; import com.datadog.api.client.v2.model.AwsOnDemandType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); AwsOnDemandCreateRequest body = new AwsOnDemandCreateRequest() .data( new AwsOnDemandCreateData() .attributes( new AwsOnDemandCreateAttributes() .arn("arn:aws:lambda:us-west-2:123456789012:function:my-function")) .type(AwsOnDemandType.AWS_RESOURCE)); try { AwsOnDemandResponse result = apiInstance.createAwsOnDemandTask(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#createAwsOnDemandTask"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` """ Create AWS on demand task returns "AWS on demand task created successfully." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi from datadog_api_client.v2.model.aws_on_demand_create_attributes import AwsOnDemandCreateAttributes from datadog_api_client.v2.model.aws_on_demand_create_data import AwsOnDemandCreateData from datadog_api_client.v2.model.aws_on_demand_create_request import AwsOnDemandCreateRequest from datadog_api_client.v2.model.aws_on_demand_type import AwsOnDemandType body = AwsOnDemandCreateRequest( data=AwsOnDemandCreateData( attributes=AwsOnDemandCreateAttributes( arn="arn:aws:lambda:us-west-2:123456789012:function:my-function", ), type=AwsOnDemandType.AWS_RESOURCE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.create_aws_on_demand_task(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` # Create AWS on demand task returns "AWS on demand task created successfully." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new body = DatadogAPIClient::V2::AwsOnDemandCreateRequest.new({ data: DatadogAPIClient::V2::AwsOnDemandCreateData.new({ attributes: DatadogAPIClient::V2::AwsOnDemandCreateAttributes.new({ arn: "arn:aws:lambda:us-west-2:123456789012:function:my-function", }), type: DatadogAPIClient::V2::AwsOnDemandType::AWS_RESOURCE, }), }) p api_instance.create_aws_on_demand_task(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` // Create AWS on demand task returns "AWS on demand task created successfully." 
// response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; use datadog_api_client::datadogV2::model::AwsOnDemandCreateAttributes; use datadog_api_client::datadogV2::model::AwsOnDemandCreateData; use datadog_api_client::datadogV2::model::AwsOnDemandCreateRequest; use datadog_api_client::datadogV2::model::AwsOnDemandType; #[tokio::main] async fn main() { let body = AwsOnDemandCreateRequest::new(AwsOnDemandCreateData::new( AwsOnDemandCreateAttributes::new( "arn:aws:lambda:us-west-2:123456789012:function:my-function".to_string(), ), AwsOnDemandType::AWS_RESOURCE, )); let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api.create_aws_on_demand_task(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create AWS on demand task returns "AWS on demand task created successfully." response ``` /** * Create AWS on demand task returns "AWS on demand task created successfully." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiCreateAwsOnDemandTaskRequest = { body: { data: { attributes: { arn: "arn:aws:lambda:us-west-2:123456789012:function:my-function", }, type: "aws_resource", }, }, }; apiInstance .createAwsOnDemandTask(params) .then((data: v2.AwsOnDemandResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task) * [v2 (latest)](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task-v2) GET https://api.ap1.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.ap2.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.datadoghq.eu/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.ddog-gov.com/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.us3.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/{task_id}https://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/{task_id} ### Overview Fetch the data of a specific on demand task. This endpoint requires the `security_monitoring_findings_read` permission. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#agentless-scanning) to access this endpoint. 
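Because a newly submitted task starts in the `QUEUED` state and later moves to `ASSIGNED` (or `ABORTED`), a common pattern is to create the task and then poll this endpoint until the status changes. The sketch below reuses the Python client calls shown in the code examples on this page; the task ID, the poll interval, and the `response.data.attributes.status` attribute path are illustrative assumptions, not documented defaults.

```python
"""
Minimal polling sketch: wait for an AWS on demand task to leave the
QUEUED state using the Python client (assumptions noted in comments).
"""
import time

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi

# Hypothetical task ID returned by a previous "Create AWS on demand task" call.
TASK_ID = "6d09294c-9ad9-42fd-a759-a0c1599b4828"

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AgentlessScanningApi(api_client)
    while True:
        response = api_instance.get_aws_on_demand_task(task_id=TASK_ID)
        # Attribute path assumed from the response model documented in this section.
        status = response.data.attributes.status
        print(f"task {TASK_ID}: {status}")
        if status != "QUEUED":  # ASSIGNED or ABORTED
            break
        time.sleep(10)  # illustrative poll interval
```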
### Arguments #### Path Parameters Name Type Description task_id [_required_] string The UUID of the task. ### Response * [200](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsOnDemandTask-200-v2) * [400](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsOnDemandTask-400-v2) * [403](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsOnDemandTask-403-v2) * [404](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsOnDemandTask-404-v2) * [429](https://docs.datadoghq.com/api/latest/agentless-scanning/#GetAwsOnDemandTask-429-v2) OK. * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) Response object that includes an AWS on demand task. Field Type Description data object Single AWS on demand task. attributes object Attributes for the AWS on demand task. arn string The arn of the resource to scan. assigned_at string Specifies the assignment timestamp if the task has been already assigned to a scanner. created_at string The task submission timestamp. status string Indicates the status of the task. QUEUED: the task has been submitted successfully and the resource has not been assigned to a scanner yet. ASSIGNED: the task has been assigned. ABORTED: the scan has been aborted after a period of time due to technical reasons, such as resource not found, insufficient permissions, or the absence of a configured scanner. id string The UUID of the task. type enum The type of the on demand task. The value should always be `aws_resource`. Allowed enum values: `aws_resource` default: `aws_resource` ``` { "data": { "attributes": { "arn": "arn:aws:ec2:us-east-1:727000456123:instance/i-0eabb50529b67a1ba", "assigned_at": "2025-02-11T18:25:04.550564Z", "created_at": "2025-02-11T18:13:24.576915Z", "status": "QUEUED" }, "id": "6d09294c-9ad9-42fd-a759-a0c1599b4828", "type": "aws_resource" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [Example](https://docs.datadoghq.com/api/latest/agentless-scanning/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/agentless-scanning/?code-lang=typescript) ##### Get AWS on demand task Copy ``` # Path parameters export task_id="6d09294c-9ad9-42fd-a759-a0c1599b4828" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/agentless_scanning/ondemand/aws/${task_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get AWS on demand task ``` """ Get AWS on demand task returns "OK." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.agentless_scanning_api import AgentlessScanningApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AgentlessScanningApi(api_client) response = api_instance.get_aws_on_demand_task( task_id="63d6b4f5-e5d0-4d90-824a-9580f05f026a", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get AWS on demand task ``` # Get AWS on demand task returns "OK." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AgentlessScanningAPI.new p api_instance.get_aws_on_demand_task("63d6b4f5-e5d0-4d90-824a-9580f05f026a") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get AWS on demand task ``` // Get AWS on demand task returns "OK." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAgentlessScanningApi(apiClient) resp, r, err := api.GetAwsOnDemandTask(ctx, "63d6b4f5-e5d0-4d90-824a-9580f05f026a") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AgentlessScanningApi.GetAwsOnDemandTask`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AgentlessScanningApi.GetAwsOnDemandTask`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get AWS on demand task ``` // Get AWS on demand task returns "OK." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AgentlessScanningApi; import com.datadog.api.client.v2.model.AwsOnDemandResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AgentlessScanningApi apiInstance = new AgentlessScanningApi(defaultClient); try { AwsOnDemandResponse result = apiInstance.getAwsOnDemandTask("63d6b4f5-e5d0-4d90-824a-9580f05f026a"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AgentlessScanningApi#getAwsOnDemandTask"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get AWS on demand task ``` // Get AWS on demand task returns "OK." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_agentless_scanning::AgentlessScanningAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AgentlessScanningAPI::with_config(configuration); let resp = api .get_aws_on_demand_task("63d6b4f5-e5d0-4d90-824a-9580f05f026a".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get AWS on demand task ``` /** * Get AWS on demand task returns "OK." 
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AgentlessScanningApi(configuration); const params: v2.AgentlessScanningApiGetAwsOnDemandTaskRequest = { taskId: "63d6b4f5-e5d0-4d90-824a-9580f05f026a", }; apiInstance .getAwsOnDemandTask(params) .then((data: v2.AwsOnDemandResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/api-management # API Management Configure your API endpoints through the Datadog API. ## [Create a new API](https://docs.datadoghq.com/api/latest/api-management/#create-a-new-api) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/api-management/#create-a-new-api-v2) **Note** : This endpoint is deprecated. POST https://api.ap1.datadoghq.com/api/v2/apicatalog/openapihttps://api.ap2.datadoghq.com/api/v2/apicatalog/openapihttps://api.datadoghq.eu/api/v2/apicatalog/openapihttps://api.ddog-gov.com/api/v2/apicatalog/openapihttps://api.datadoghq.com/api/v2/apicatalog/openapihttps://api.us3.datadoghq.com/api/v2/apicatalog/openapihttps://api.us5.datadoghq.com/api/v2/apicatalog/openapi ### Overview Create a new API from the [OpenAPI](https://spec.openapis.org/oas/latest.html) specification given. See the [API Catalog documentation](https://docs.datadoghq.com/api_catalog/add_metadata/) for additional information about the possible metadata. It returns the created API ID. This endpoint requires the `apm_api_catalog_write` permission. OAuth apps require the `apm_api_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#api-management) to access this endpoint.
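For orientation before the request and response models below: the `openapi_spec_file` form field carries an ordinary OpenAPI document. The hedged sketch that follows writes a minimal, illustrative OpenAPI 3 spec to the `openapi-spec.yaml` file referenced by the client examples further down, then uploads it with the Python client; the spec content and the `failed_endpoints` attribute access are assumptions based on the models documented in this section, not a Datadog-provided sample.

```python
"""
Minimal sketch: write an illustrative OpenAPI 3 spec to openapi-spec.yaml
and upload it with the Python client (unstable operation, as in the
examples below). The spec content here is an assumption.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.api_management_api import APIManagementApi

SPEC = """\
openapi: 3.0.0
info:
  title: Payments API
  version: 1.0.0
paths:
  /payments:
    get:
      operationId: ListPayments
      responses:
        "200":
          description: OK
"""

# Write the spec file that the upload call reads back.
with open("openapi-spec.yaml", "w") as f:
    f.write(SPEC)

configuration = Configuration()
configuration.unstable_operations["create_open_api"] = True
with ApiClient(configuration) as api_client:
    api_instance = APIManagementApi(api_client)
    response = api_instance.create_open_api(openapi_spec_file=open("openapi-spec.yaml", "rb"))
    attrs = response.data.attributes
    # The returned ID identifies the created API; failed_endpoints lists any
    # paths the catalog could not parse (may be absent when everything parses).
    print(response.data.id, getattr(attrs, "failed_endpoints", []))
```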
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Expand All Field Type Description openapi_spec_file binary Binary `OpenAPI` spec file ``` { "openapi_spec_file": "string" } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/api-management/#CreateOpenAPI-201-v2) * [400](https://docs.datadoghq.com/api/latest/api-management/#CreateOpenAPI-400-v2) * [403](https://docs.datadoghq.com/api/latest/api-management/#CreateOpenAPI-403-v2) * [429](https://docs.datadoghq.com/api/latest/api-management/#CreateOpenAPI-429-v2) API created successfully * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Response for `CreateOpenAPI` operation. Field Type Description data object Data envelope for `CreateOpenAPIResponse`. attributes object Attributes for `CreateOpenAPI`. failed_endpoints [object] List of endpoints which couldn't be parsed. method string The endpoint method. path string The endpoint path. id uuid API identifier. ``` { "data": { "attributes": { "failed_endpoints": [ { "method": "string", "path": "string" } ] }, "id": "90646597-5fdb-4a17-a240-647003f8c028" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/api-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/api-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/api-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/api-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/api-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/api-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/api-management/?code-lang=typescript) ##### Create a new API Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apicatalog/openapi" \ -H "Accept: application/json" \ -H "Content-Type: multipart/form-data" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -F openapi_spec_file=@string ``` ##### Create a new API ``` """ Create a new API returns "API created successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.api_management_api import APIManagementApi configuration = Configuration() configuration.unstable_operations["create_open_api"] = True with ApiClient(configuration) as api_client: api_instance = APIManagementApi(api_client) response = api_instance.create_open_api( openapi_spec_file=open("openapi-spec.yaml", "rb"), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new API ``` # Create a new API returns "API created successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_open_api".to_sym] = true end api_instance = DatadogAPIClient::V2::APIManagementAPI.new opts = { openapi_spec_file: File.open("openapi-spec.yaml", "r"), } p api_instance.create_open_api(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new API ``` // Create a new API returns "API created successfully" response package main import ( "context" "encoding/json" "fmt" "io" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateOpenAPI", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPIManagementApi(apiClient) resp, r, err := api.CreateOpenAPI(ctx, *datadogV2.NewCreateOpenAPIOptionalParameters().WithOpenapiSpecFile(func() io.Reader { fp, _ := os.Open("openapi-spec.yaml"); return fp 
}())) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APIManagementApi.CreateOpenAPI`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APIManagementApi.CreateOpenAPI`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new API ``` // Create a new API returns "API created successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApiManagementApi; import com.datadog.api.client.v2.api.ApiManagementApi.CreateOpenAPIOptionalParameters; import com.datadog.api.client.v2.model.CreateOpenAPIResponse; import java.io.File; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createOpenAPI", true); ApiManagementApi apiInstance = new ApiManagementApi(defaultClient); try { CreateOpenAPIResponse result = apiInstance.createOpenAPI( new CreateOpenAPIOptionalParameters().openapiSpecFile(new File("openapi-spec.yaml"))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApiManagementApi#createOpenAPI"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new API ``` // Create a new API returns "API created successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_api_management::APIManagementAPI; use datadog_api_client::datadogV2::api_api_management::CreateOpenAPIOptionalParams; use std::fs; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateOpenAPI", true); let api = APIManagementAPI::with_config(configuration); let resp = api .create_open_api( CreateOpenAPIOptionalParams::default() .openapi_spec_file(fs::read("openapi-spec.yaml").unwrap()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new API ``` /** * Create a new API returns "API created successfully" response */ import * as fs from "fs"; import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.createOpenAPI"] = true; const apiInstance = new v2.APIManagementApi(configuration); const params: v2.APIManagementApiCreateOpenAPIRequest = { openapiSpecFile: { data: Buffer.from(fs.readFileSync("openapi-spec.yaml", "utf8")), name: "openapi-spec.yaml", }, }; apiInstance .createOpenAPI(params) .then((data: v2.CreateOpenAPIResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an API](https://docs.datadoghq.com/api/latest/api-management/#update-an-api) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/api-management/#update-an-api-v2) **Note** : This endpoint is deprecated. PUT https://api.ap1.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.ap2.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.datadoghq.eu/api/v2/apicatalog/api/{id}/openapihttps://api.ddog-gov.com/api/v2/apicatalog/api/{id}/openapihttps://api.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.us3.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.us5.datadoghq.com/api/v2/apicatalog/api/{id}/openapi ### Overview Update information about a specific API. The given content will replace all API content of the given ID. The ID is returned by the create API, or can be found in the URL in the API catalog UI. This endpoint requires the `apm_api_catalog_write` permission. OAuth apps require the `apm_api_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#api-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the API to modify ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Expand All Field Type Description openapi_spec_file binary Binary `OpenAPI` spec file ``` { "openapi_spec_file": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/api-management/#UpdateOpenAPI-200-v2) * [400](https://docs.datadoghq.com/api/latest/api-management/#UpdateOpenAPI-400-v2) * [403](https://docs.datadoghq.com/api/latest/api-management/#UpdateOpenAPI-403-v2) * [404](https://docs.datadoghq.com/api/latest/api-management/#UpdateOpenAPI-404-v2) * [429](https://docs.datadoghq.com/api/latest/api-management/#UpdateOpenAPI-429-v2) API updated successfully * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Response for `UpdateOpenAPI`. Field Type Description data object Data envelope for `UpdateOpenAPIResponse`. attributes object Attributes for `UpdateOpenAPI`. failed_endpoints [object] List of endpoints which couldn't be parsed. method string The endpoint method. path string The endpoint path. id uuid API identifier. 
``` { "data": { "attributes": { "failed_endpoints": [ { "method": "string", "path": "string" } ] }, "id": "90646597-5fdb-4a17-a240-647003f8c028" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy API not found error * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/api-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/api-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/api-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/api-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/api-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/api-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/api-management/?code-lang=typescript) ##### Update an API Copy ``` # Path parameters export id="90646597-5fdb-4a17-a240-647003f8c028" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apicatalog/api/${id}/openapi" \ -H "Accept: application/json" \ -H "Content-Type: multipart/form-data" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -F openapi_spec_file=@string ``` ##### Update an API ``` """ Update an API returns "API updated successfully" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.api_management_api import APIManagementApi # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = environ["MANAGED_API_DATA_ID"] configuration = Configuration() configuration.unstable_operations["update_open_api"] = True with ApiClient(configuration) as api_client: api_instance = APIManagementApi(api_client) response = api_instance.update_open_api( id=MANAGED_API_DATA_ID, openapi_spec_file=open("openapi-spec.yaml", "rb"), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an API ``` # Update an API returns "API updated successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_open_api".to_sym] = true end api_instance = DatadogAPIClient::V2::APIManagementAPI.new # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = ENV["MANAGED_API_DATA_ID"] opts = { openapi_spec_file: File.open("openapi-spec.yaml", "r"), } p api_instance.update_open_api(MANAGED_API_DATA_ID, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an API ``` // Update an API returns "API updated successfully" response package main import ( "context" "encoding/json" "fmt" "io" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "managed_api" in the system ManagedAPIDataID := uuid.MustParse(os.Getenv("MANAGED_API_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateOpenAPI", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPIManagementApi(apiClient) resp, r, err := api.UpdateOpenAPI(ctx, ManagedAPIDataID, *datadogV2.NewUpdateOpenAPIOptionalParameters().WithOpenapiSpecFile(func() io.Reader { fp, _ := os.Open("openapi-spec.yaml"); return fp }())) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APIManagementApi.UpdateOpenAPI`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APIManagementApi.UpdateOpenAPI`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an API ``` // Update an API returns "API updated successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApiManagementApi; import com.datadog.api.client.v2.api.ApiManagementApi.UpdateOpenAPIOptionalParameters; import com.datadog.api.client.v2.model.UpdateOpenAPIResponse; import java.io.File; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateOpenAPI", true); ApiManagementApi apiInstance = new ApiManagementApi(defaultClient); // there is a valid "managed_api" in the system UUID MANAGED_API_DATA_ID = null; try { MANAGED_API_DATA_ID = UUID.fromString(System.getenv("MANAGED_API_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { UpdateOpenAPIResponse result = apiInstance.updateOpenAPI( MANAGED_API_DATA_ID, new UpdateOpenAPIOptionalParameters().openapiSpecFile(new File("openapi-spec.yaml"))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApiManagementApi#updateOpenAPI"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an API ``` // Update an API returns "API updated successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_api_management::APIManagementAPI; use datadog_api_client::datadogV2::api_api_management::UpdateOpenAPIOptionalParams; use std::fs; #[tokio::main] async fn main() { // there is a valid "managed_api" in the system let managed_api_data_id = uuid::Uuid::parse_str(&std::env::var("MANAGED_API_DATA_ID").unwrap()) .expect("Invalid UUID"); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateOpenAPI", true); let api = 
APIManagementAPI::with_config(configuration); let resp = api .update_open_api( managed_api_data_id.clone(), UpdateOpenAPIOptionalParams::default() .openapi_spec_file(fs::read("openapi-spec.yaml").unwrap()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an API ``` /** * Update an API returns "API updated successfully" response */ import * as fs from "fs"; import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateOpenAPI"] = true; const apiInstance = new v2.APIManagementApi(configuration); // there is a valid "managed_api" in the system const MANAGED_API_DATA_ID = process.env.MANAGED_API_DATA_ID as string; const params: v2.APIManagementApiUpdateOpenAPIRequest = { id: MANAGED_API_DATA_ID, openapiSpecFile: { data: Buffer.from(fs.readFileSync("openapi-spec.yaml", "utf8")), name: "openapi-spec.yaml", }, }; apiInstance .updateOpenAPI(params) .then((data: v2.UpdateOpenAPIResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an API](https://docs.datadoghq.com/api/latest/api-management/#get-an-api) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/api-management/#get-an-api-v2) **Note** : This endpoint is deprecated. GET https://api.ap1.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.ap2.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.datadoghq.eu/api/v2/apicatalog/api/{id}/openapihttps://api.ddog-gov.com/api/v2/apicatalog/api/{id}/openapihttps://api.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.us3.datadoghq.com/api/v2/apicatalog/api/{id}/openapihttps://api.us5.datadoghq.com/api/v2/apicatalog/api/{id}/openapi ### Overview Retrieve information about a specific API in [OpenAPI](https://spec.openapis.org/oas/latest.html) format file. This endpoint requires the `apm_api_catalog_read` permission. OAuth apps require the `apm_api_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#api-management) to access this endpoint. 
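Since this endpoint returns the stored API definition as an OpenAPI file rather than a JSON envelope, a typical use is to persist the download locally. The sketch below extends the Python example further down in this section; the output file name, the binary write mode, and the use of `.read()` on the returned file object are assumptions of this sketch.

```python
"""
Minimal sketch: download an API's stored OpenAPI spec and save it locally.
Assumes MANAGED_API_DATA_ID holds an existing API ID, as in the examples below.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.api_management_api import APIManagementApi

configuration = Configuration()
configuration.unstable_operations["get_open_api"] = True

with ApiClient(configuration) as api_client:
    api_instance = APIManagementApi(api_client)
    # The response is a file-like object containing the OpenAPI document.
    spec = api_instance.get_open_api(id=environ["MANAGED_API_DATA_ID"])
    with open("downloaded-openapi-spec.yaml", "wb") as out:  # illustrative file name
        out.write(spec.read())
```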
### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the API to retrieve ### Response * [200](https://docs.datadoghq.com/api/latest/api-management/#GetOpenAPI-200-v2) * [400](https://docs.datadoghq.com/api/latest/api-management/#GetOpenAPI-400-v2) * [403](https://docs.datadoghq.com/api/latest/api-management/#GetOpenAPI-403-v2) * [404](https://docs.datadoghq.com/api/latest/api-management/#GetOpenAPI-404-v2) * [429](https://docs.datadoghq.com/api/latest/api-management/#GetOpenAPI-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Expand All Field Type Description No response body ``` {} ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy API not found error * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. 
title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/api-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/api-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/api-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/api-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/api-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/api-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/api-management/?code-lang=typescript) ##### Get an API Copy ``` # Path parameters export id="90646597-5fdb-4a17-a240-647003f8c028" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apicatalog/api/${id}/openapi" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an API ``` """ Get an API returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.api_management_api import APIManagementApi # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = environ["MANAGED_API_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_open_api"] = True with ApiClient(configuration) as api_client: api_instance = APIManagementApi(api_client) response = api_instance.get_open_api( id=MANAGED_API_DATA_ID, ) print(response.read()) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an API ``` # Get an API returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_open_api".to_sym] = true end api_instance = DatadogAPIClient::V2::APIManagementAPI.new # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = ENV["MANAGED_API_DATA_ID"] p api_instance.get_open_api(MANAGED_API_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an API ``` // Get an API returns "OK" response package main import ( "context" "fmt" "io/ioutil" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "managed_api" in the system ManagedAPIDataID := uuid.MustParse(os.Getenv("MANAGED_API_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetOpenAPI", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPIManagementApi(apiClient) resp, r, err := api.GetOpenAPI(ctx, ManagedAPIDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APIManagementApi.GetOpenAPI`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := ioutil.ReadAll(resp) fmt.Fprintf(os.Stdout, "Response from `APIManagementApi.GetOpenAPI`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an API ``` // Get an API returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApiManagementApi; import java.io.File; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getOpenAPI", true); ApiManagementApi apiInstance = new ApiManagementApi(defaultClient); // there is a valid "managed_api" in the system UUID MANAGED_API_DATA_ID = null; try { MANAGED_API_DATA_ID = UUID.fromString(System.getenv("MANAGED_API_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { File result = apiInstance.getOpenAPI(MANAGED_API_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApiManagementApi#getOpenAPI"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an API ``` // Get an API returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_api_management::APIManagementAPI; #[tokio::main] async fn main() { // there is a valid "managed_api" in the system let managed_api_data_id = uuid::Uuid::parse_str(&std::env::var("MANAGED_API_DATA_ID").unwrap()) .expect("Invalid UUID"); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetOpenAPI", true); let api = APIManagementAPI::with_config(configuration); let resp = api.get_open_api(managed_api_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an API ``` /** * Get an API returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getOpenAPI"] = true; const apiInstance = new v2.APIManagementApi(configuration); // there is a valid "managed_api" in the system const MANAGED_API_DATA_ID = process.env.MANAGED_API_DATA_ID as string; const params: v2.APIManagementApiGetOpenAPIRequest = { id: MANAGED_API_DATA_ID, }; apiInstance .getOpenAPI(params) .then((data: client.HttpFile) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List APIs](https://docs.datadoghq.com/api/latest/api-management/#list-apis) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/api-management/#list-apis-v2) **Note** : This endpoint is deprecated. GET https://api.ap1.datadoghq.com/api/v2/apicatalog/apihttps://api.ap2.datadoghq.com/api/v2/apicatalog/apihttps://api.datadoghq.eu/api/v2/apicatalog/apihttps://api.ddog-gov.com/api/v2/apicatalog/apihttps://api.datadoghq.com/api/v2/apicatalog/apihttps://api.us3.datadoghq.com/api/v2/apicatalog/apihttps://api.us5.datadoghq.com/api/v2/apicatalog/api ### Overview List APIs and their IDs. This endpoint requires the `apm_api_catalog_read` permission. OAuth apps require the `apm_api_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#api-management) to access this endpoint. ### Arguments #### Query Strings Name Type Description query string Filter APIs by name page[limit] integer Number of items per page. page[offset] integer Offset for pagination. ### Response * [200](https://docs.datadoghq.com/api/latest/api-management/#ListAPIs-200-v2) * [400](https://docs.datadoghq.com/api/latest/api-management/#ListAPIs-400-v2) * [403](https://docs.datadoghq.com/api/latest/api-management/#ListAPIs-403-v2) * [429](https://docs.datadoghq.com/api/latest/api-management/#ListAPIs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) Response for `ListAPIs`. Field Type Description data [object] List of API items. attributes object Attributes for `ListAPIsResponseData`. name string API name. id uuid API identifier. meta object Metadata for `ListAPIsResponse`. pagination object Pagination metadata information for `ListAPIsResponse`. limit int64 Number of items in the current page. offset int64 Offset for pagination. total_count int64 Total number of items. 
``` { "data": [ { "attributes": { "name": "Payments API" }, "id": "90646597-5fdb-4a17-a240-647003f8c028" } ], "meta": { "pagination": { "limit": 20, "offset": 0, "total_count": 35 } } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/api-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/api-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/api-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/api-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/api-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/api-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/api-management/?code-lang=typescript) ##### List APIs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apicatalog/api" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List APIs ``` """ List APIs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.api_management_api import APIManagementApi configuration = Configuration() configuration.unstable_operations["list_apis"] = True with ApiClient(configuration) as api_client: api_instance = APIManagementApi(api_client) response = api_instance.list_apis() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List APIs ``` # List APIs returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_apis".to_sym] = true end api_instance = DatadogAPIClient::V2::APIManagementAPI.new p api_instance.list_apis() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List APIs ``` // List APIs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListAPIs", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPIManagementApi(apiClient) resp, r, err := api.ListAPIs(ctx, *datadogV2.NewListAPIsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APIManagementApi.ListAPIs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APIManagementApi.ListAPIs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List APIs ``` // List APIs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApiManagementApi; import com.datadog.api.client.v2.model.ListAPIsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listAPIs", true); ApiManagementApi apiInstance = new ApiManagementApi(defaultClient); try { ListAPIsResponse result = apiInstance.listAPIs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApiManagementApi#listAPIs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List APIs ``` // List APIs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_api_management::APIManagementAPI; use datadog_api_client::datadogV2::api_api_management::ListAPIsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListAPIs", true); let api = APIManagementAPI::with_config(configuration); let resp = api.list_apis(ListAPIsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List APIs ``` /** * List APIs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listAPIs"] = true; const apiInstance = new v2.APIManagementApi(configuration); apiInstance .listAPIs() .then((data: v2.ListAPIsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an API](https://docs.datadoghq.com/api/latest/api-management/#delete-an-api) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/api-management/#delete-an-api-v2) **Note** : This endpoint is deprecated. 
DELETE https://api.ap1.datadoghq.com/api/v2/apicatalog/api/{id}https://api.ap2.datadoghq.com/api/v2/apicatalog/api/{id}https://api.datadoghq.eu/api/v2/apicatalog/api/{id}https://api.ddog-gov.com/api/v2/apicatalog/api/{id}https://api.datadoghq.com/api/v2/apicatalog/api/{id}https://api.us3.datadoghq.com/api/v2/apicatalog/api/{id}https://api.us5.datadoghq.com/api/v2/apicatalog/api/{id} ### Overview Delete a specific API by ID. This endpoint requires the `apm_api_catalog_write` permission. OAuth apps require the `apm_api_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#api-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the API to delete ### Response * [204](https://docs.datadoghq.com/api/latest/api-management/#DeleteOpenAPI-204-v2) * [400](https://docs.datadoghq.com/api/latest/api-management/#DeleteOpenAPI-400-v2) * [403](https://docs.datadoghq.com/api/latest/api-management/#DeleteOpenAPI-403-v2) * [404](https://docs.datadoghq.com/api/latest/api-management/#DeleteOpenAPI-404-v2) * [429](https://docs.datadoghq.com/api/latest/api-management/#DeleteOpenAPI-429-v2) API deleted successfully Bad request * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy API not found error * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/api-management/) * [Example](https://docs.datadoghq.com/api/latest/api-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/api-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/api-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/api-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/api-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/api-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/api-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/api-management/?code-lang=typescript) ##### Delete an API Copy ``` # Path parameters export id="90646597-5fdb-4a17-a240-647003f8c028" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apicatalog/api/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an API ``` """ Delete an API returns "API deleted successfully" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.api_management_api import APIManagementApi # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = environ["MANAGED_API_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_open_api"] = True with ApiClient(configuration) as api_client: api_instance = APIManagementApi(api_client) api_instance.delete_open_api( id=MANAGED_API_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an API ``` # Delete an API returns "API deleted successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_open_api".to_sym] = true end api_instance = DatadogAPIClient::V2::APIManagementAPI.new # there is a valid "managed_api" in the system MANAGED_API_DATA_ID = ENV["MANAGED_API_DATA_ID"] api_instance.delete_open_api(MANAGED_API_DATA_ID) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an API ``` // Delete an API returns "API deleted successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "managed_api" in the system ManagedAPIDataID := uuid.MustParse(os.Getenv("MANAGED_API_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteOpenAPI", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPIManagementApi(apiClient) r, err := api.DeleteOpenAPI(ctx, ManagedAPIDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APIManagementApi.DeleteOpenAPI`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an API ``` // Delete an API returns "API deleted successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApiManagementApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteOpenAPI", true); ApiManagementApi apiInstance = new ApiManagementApi(defaultClient); // there is a valid "managed_api" in the system UUID MANAGED_API_DATA_ID = null; try { MANAGED_API_DATA_ID = UUID.fromString(System.getenv("MANAGED_API_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { apiInstance.deleteOpenAPI(MANAGED_API_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ApiManagementApi#deleteOpenAPI"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an API ``` // Delete an API returns "API deleted successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_api_management::APIManagementAPI; #[tokio::main] async fn main() { // there is a valid "managed_api" in the system let managed_api_data_id = uuid::Uuid::parse_str(&std::env::var("MANAGED_API_DATA_ID").unwrap()) .expect("Invalid UUID"); let mut configuration = datadog::Configuration::new(); 
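// DeleteOpenAPI is flagged as an unstable operation in this client, so it is explicitly enabled on the configuration before the call below.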
configuration.set_unstable_operation_enabled("v2.DeleteOpenAPI", true); let api = APIManagementAPI::with_config(configuration); let resp = api.delete_open_api(managed_api_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an API ``` /** * Delete an API returns "API deleted successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteOpenAPI"] = true; const apiInstance = new v2.APIManagementApi(configuration); // there is a valid "managed_api" in the system const MANAGED_API_DATA_ID = process.env.MANAGED_API_DATA_ID as string; const params: v2.APIManagementApiDeleteOpenAPIRequest = { id: MANAGED_API_DATA_ID, }; apiInstance .deleteOpenAPI(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/apm-retention-filters/ # APM Retention Filters Manage configuration of [APM retention filters](https://app.datadoghq.com/apm/traces/retention-filters) for your organization. You need an API and application key with Admin rights to interact with this endpoint.
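For orientation before the individual endpoints, the minimal sketch below uses the same Python client as the examples that follow to list the retention filters in an organization and print a short summary of each. It assumes the returned model exposes the documented response fields (`data`, `attributes.name`, `attributes.enabled`, `attributes.rate`) as attributes, and that `DD_API_KEY`, `DD_APP_KEY` (with Admin rights), and `DD_SITE` are set in the environment.

```
"""
Minimal sketch: list APM retention filters and summarize them.
Assumes DD_API_KEY, DD_APP_KEY (Admin rights), and DD_SITE are set in the environment,
and that the response model exposes the documented fields as attributes.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi

configuration = Configuration()  # reads keys and site from the environment

with ApiClient(configuration) as api_client:
    api_instance = APMRetentionFiltersApi(api_client)
    response = api_instance.list_apm_retention_filters()

    # data is the documented ordered list of retention filters
    for retention_filter in response.data:
        attrs = retention_filter.attributes
        print(f"{retention_filter.id}: {attrs.name} (enabled={attrs.enabled}, rate={attrs.rate})")
```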
See [retention filters](https://docs.datadoghq.com/tracing/trace_pipeline/trace_retention/#retention-filters) on the Trace Retention page for more information. ## [List all APM retention filters](https://docs.datadoghq.com/api/latest/apm-retention-filters/#list-all-apm-retention-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#list-all-apm-retention-filters-v2) GET https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.ap2.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.datadoghq.eu/api/v2/apm/config/retention-filtershttps://api.ddog-gov.com/api/v2/apm/config/retention-filtershttps://api.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.us3.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.us5.datadoghq.com/api/v2/apm/config/retention-filters ### Overview Get the list of APM retention filters. This endpoint requires any of the following permissions: * `apm_retention_filter_read` * `apm_pipelines_read` ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ListApmRetentionFilters-200-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ListApmRetentionFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ListApmRetentionFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) An ordered list of retention filters. Field Type Description data [_required_] [object] A list of retention filters objects. attributes [_required_] object The attributes of the retention filter. created_at int64 The creation timestamp of the retention filter. created_by string The creator of the retention filter. editable boolean Shows whether the filter can be edited. enabled boolean The status of the retention filter (Enabled/Disabled). execution_order int64 The execution order of the retention filter. filter object The spans filter used to index spans. query string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type enum The type of retention filter. Allowed enum values: `spans-sampling-processor,spans-errors-sampling-processor,spans-appsec-sampling-processor` default: `spans-sampling-processor` modified_at int64 The modification timestamp of the retention filter. modified_by string The modifier of the retention filter. name string The name of the retention filter. rate double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. 
Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ``` { "data": [ { "attributes": { "created_at": "integer", "created_by": "string", "editable": true, "enabled": true, "execution_order": "integer", "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "modified_at": "integer", "modified_by": "string", "name": "my retention filter", "rate": 1, "trace_rate": 1 }, "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### List all APM retention filters Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all APM retention filters ``` """ List all APM retention filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.list_apm_retention_filters() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all APM retention filters ``` # List all APM retention filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new p api_instance.list_apm_retention_filters() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all APM retention filters ``` // List all APM retention filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.ListApmRetentionFilters(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.ListApmRetentionFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.ListApmRetentionFilters`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all APM retention filters ``` // List all APM retention filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.RetentionFiltersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); try { RetentionFiltersResponse result = apiInstance.listApmRetentionFilters(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#listApmRetentionFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all APM retention filters ``` // List all APM retention filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api.list_apm_retention_filters().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all APM retention filters ``` /** * List all APM retention filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); apiInstance .listApmRetentionFilters() .then((data: v2.RetentionFiltersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a retention filter](https://docs.datadoghq.com/api/latest/apm-retention-filters/#create-a-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#create-a-retention-filter-v2) POST https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.ap2.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.datadoghq.eu/api/v2/apm/config/retention-filtershttps://api.ddog-gov.com/api/v2/apm/config/retention-filtershttps://api.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.us3.datadoghq.com/api/v2/apm/config/retention-filtershttps://api.us5.datadoghq.com/api/v2/apm/config/retention-filters ### Overview Create a retention filter to index spans in your organization. Returns the retention filter definition when the request is successful. Default filters with types spans-errors-sampling-processor and spans-appsec-sampling-processor cannot be created. This endpoint requires any of the following permissions: * `apm_retention_filter_write` * `apm_pipelines_write` ### Request #### Body Data (required) The definition of the new retention filter. * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) Field Type Description data [_required_] object The body of the retention filter to be created. attributes [_required_] object The object describing the configuration of the retention filter to create/update. enabled [_required_] boolean Enable/Disable the retention filter. filter [_required_] object The spans filter. Spans matching this filter will be indexed and stored. query [_required_] string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type [_required_] enum The type of retention filter. The value should always be spans-sampling-processor. Allowed enum values: `spans-sampling-processor` default: `spans-sampling-processor` name [_required_] string The name of the retention filter. rate [_required_] double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. type [_required_] enum The type of the resource. 
Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ##### Create a retention filter returns "OK" response ``` { "data": { "attributes": { "enabled": true, "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "name": "my retention filter", "rate": 1.0 }, "type": "apm_retention_filter" } } ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` { "data": { "attributes": { "enabled": true, "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "name": "my retention filter", "rate": 1.0, "trace_rate": 1.0 }, "type": "apm_retention_filter" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#CreateApmRetentionFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/apm-retention-filters/#CreateApmRetentionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#CreateApmRetentionFilter-403-v2) * [409](https://docs.datadoghq.com/api/latest/apm-retention-filters/#CreateApmRetentionFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#CreateApmRetentionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) The retention filters definition. Field Type Description data object The definition of the retention filter. attributes [_required_] object The attributes of the retention filter. created_at int64 The creation timestamp of the retention filter. created_by string The creator of the retention filter. editable boolean Shows whether the filter can be edited. enabled boolean The status of the retention filter (Enabled/Disabled). execution_order int64 The execution order of the retention filter. filter object The spans filter used to index spans. query string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type enum The type of retention filter. The value should always be spans-sampling-processor. Allowed enum values: `spans-sampling-processor` default: `spans-sampling-processor` modified_at int64 The modification timestamp of the retention filter. modified_by string The modifier of the retention filter. name string The name of the retention filter. rate double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ``` { "data": { "attributes": { "created_at": "integer", "created_by": "string", "editable": true, "enabled": true, "execution_order": "integer", "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "modified_at": "integer", "modified_by": "string", "name": "my retention filter", "rate": 1, "trace_rate": 1 }, "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### Create a retention filter returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "name": "my retention filter", "rate": 1.0 }, "type": "apm_retention_filter" } } EOF ``` ##### Create a retention filter with trace rate returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "name": "my retention filter", "rate": 1.0, "trace_rate": 1.0 }, "type": "apm_retention_filter" } } EOF ``` ##### Create a retention filter returns "OK" response ``` // Create a retention filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RetentionFilterCreateRequest{ Data: 
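// The request body nests the filter definition (enabled flag, span query, filter type, name, and sampling rate) under Data.Attributes.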
datadogV2.RetentionFilterCreateData{ Attributes: datadogV2.RetentionFilterCreateAttributes{ Enabled: true, Filter: datadogV2.SpansFilterCreate{ Query: "@http.status_code:200 service:my-service", }, FilterType: datadogV2.RETENTIONFILTERTYPE_SPANS_SAMPLING_PROCESSOR, Name: "my retention filter", Rate: 1.0, }, Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.CreateApmRetentionFilter(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.CreateApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.CreateApmRetentionFilter`:\n%s\n", responseContent) } ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` // Create a retention filter with trace rate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RetentionFilterCreateRequest{ Data: datadogV2.RetentionFilterCreateData{ Attributes: datadogV2.RetentionFilterCreateAttributes{ Enabled: true, Filter: datadogV2.SpansFilterCreate{ Query: "@http.status_code:200 service:my-service", }, FilterType: datadogV2.RETENTIONFILTERTYPE_SPANS_SAMPLING_PROCESSOR, Name: "my retention filter", Rate: 1.0, TraceRate: datadog.PtrFloat64(1.0), }, Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.CreateApmRetentionFilter(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.CreateApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.CreateApmRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a retention filter returns "OK" response ``` // Create a retention filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.ApmRetentionFilterType; import com.datadog.api.client.v2.model.RetentionFilterCreateAttributes; import com.datadog.api.client.v2.model.RetentionFilterCreateData; import com.datadog.api.client.v2.model.RetentionFilterCreateRequest; import com.datadog.api.client.v2.model.RetentionFilterCreateResponse; import com.datadog.api.client.v2.model.RetentionFilterType; import com.datadog.api.client.v2.model.SpansFilterCreate; public class Example { public static void main(String[] args) { ApiClient 
defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); RetentionFilterCreateRequest body = new RetentionFilterCreateRequest() .data( new RetentionFilterCreateData() .attributes( new RetentionFilterCreateAttributes() .enabled(true) .filter( new SpansFilterCreate() .query("@http.status_code:200 service:my-service")) .filterType(RetentionFilterType.SPANS_SAMPLING_PROCESSOR) .name("my retention filter") .rate(1.0)) .type(ApmRetentionFilterType.apm_retention_filter)); try { RetentionFilterCreateResponse result = apiInstance.createApmRetentionFilter(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#createApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` // Create a retention filter with trace rate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.ApmRetentionFilterType; import com.datadog.api.client.v2.model.RetentionFilterCreateAttributes; import com.datadog.api.client.v2.model.RetentionFilterCreateData; import com.datadog.api.client.v2.model.RetentionFilterCreateRequest; import com.datadog.api.client.v2.model.RetentionFilterCreateResponse; import com.datadog.api.client.v2.model.RetentionFilterType; import com.datadog.api.client.v2.model.SpansFilterCreate; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); RetentionFilterCreateRequest body = new RetentionFilterCreateRequest() .data( new RetentionFilterCreateData() .attributes( new RetentionFilterCreateAttributes() .enabled(true) .filter( new SpansFilterCreate() .query("@http.status_code:200 service:my-service")) .filterType(RetentionFilterType.SPANS_SAMPLING_PROCESSOR) .name("my retention filter") .rate(1.0) .traceRate(1.0)) .type(ApmRetentionFilterType.apm_retention_filter)); try { RetentionFilterCreateResponse result = apiInstance.createApmRetentionFilter(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#createApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a retention filter returns "OK" response ``` """ Create a retention filter returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType from 
datadog_api_client.v2.model.retention_filter_create_attributes import RetentionFilterCreateAttributes from datadog_api_client.v2.model.retention_filter_create_data import RetentionFilterCreateData from datadog_api_client.v2.model.retention_filter_create_request import RetentionFilterCreateRequest from datadog_api_client.v2.model.retention_filter_type import RetentionFilterType from datadog_api_client.v2.model.spans_filter_create import SpansFilterCreate body = RetentionFilterCreateRequest( data=RetentionFilterCreateData( attributes=RetentionFilterCreateAttributes( enabled=True, filter=SpansFilterCreate( query="@http.status_code:200 service:my-service", ), filter_type=RetentionFilterType.SPANS_SAMPLING_PROCESSOR, name="my retention filter", rate=1.0, ), type=ApmRetentionFilterType.apm_retention_filter, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.create_apm_retention_filter(body=body) print(response) ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` """ Create a retention filter with trace rate returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType from datadog_api_client.v2.model.retention_filter_create_attributes import RetentionFilterCreateAttributes from datadog_api_client.v2.model.retention_filter_create_data import RetentionFilterCreateData from datadog_api_client.v2.model.retention_filter_create_request import RetentionFilterCreateRequest from datadog_api_client.v2.model.retention_filter_type import RetentionFilterType from datadog_api_client.v2.model.spans_filter_create import SpansFilterCreate body = RetentionFilterCreateRequest( data=RetentionFilterCreateData( attributes=RetentionFilterCreateAttributes( enabled=True, filter=SpansFilterCreate( query="@http.status_code:200 service:my-service", ), filter_type=RetentionFilterType.SPANS_SAMPLING_PROCESSOR, name="my retention filter", rate=1.0, trace_rate=1.0, ), type=ApmRetentionFilterType.apm_retention_filter, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.create_apm_retention_filter(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a retention filter returns "OK" response ``` # Create a retention filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new body = DatadogAPIClient::V2::RetentionFilterCreateRequest.new({ data: DatadogAPIClient::V2::RetentionFilterCreateData.new({ attributes: DatadogAPIClient::V2::RetentionFilterCreateAttributes.new({ enabled: true, filter: DatadogAPIClient::V2::SpansFilterCreate.new({ query: "@http.status_code:200 service:my-service", }), filter_type: DatadogAPIClient::V2::RetentionFilterType::SPANS_SAMPLING_PROCESSOR, name: "my retention filter", rate: 1.0, }), type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, 
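# The resource type is always apm_retention_filter; the attributes above define the span query, sampling rate, and name.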
}), }) p api_instance.create_apm_retention_filter(body) ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` # Create a retention filter with trace rate returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new body = DatadogAPIClient::V2::RetentionFilterCreateRequest.new({ data: DatadogAPIClient::V2::RetentionFilterCreateData.new({ attributes: DatadogAPIClient::V2::RetentionFilterCreateAttributes.new({ enabled: true, filter: DatadogAPIClient::V2::SpansFilterCreate.new({ query: "@http.status_code:200 service:my-service", }), filter_type: DatadogAPIClient::V2::RetentionFilterType::SPANS_SAMPLING_PROCESSOR, name: "my retention filter", rate: 1.0, trace_rate: 1.0, }), type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, }), }) p api_instance.create_apm_retention_filter(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a retention filter returns "OK" response ``` // Create a retention filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; use datadog_api_client::datadogV2::model::ApmRetentionFilterType; use datadog_api_client::datadogV2::model::RetentionFilterCreateAttributes; use datadog_api_client::datadogV2::model::RetentionFilterCreateData; use datadog_api_client::datadogV2::model::RetentionFilterCreateRequest; use datadog_api_client::datadogV2::model::RetentionFilterType; use datadog_api_client::datadogV2::model::SpansFilterCreate; #[tokio::main] async fn main() { let body = RetentionFilterCreateRequest::new(RetentionFilterCreateData::new( RetentionFilterCreateAttributes::new( true, SpansFilterCreate::new("@http.status_code:200 service:my-service".to_string()), RetentionFilterType::SPANS_SAMPLING_PROCESSOR, "my retention filter".to_string(), 1.0, ), ApmRetentionFilterType::apm_retention_filter, )); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api.create_apm_retention_filter(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` // Create a retention filter with trace rate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; use datadog_api_client::datadogV2::model::ApmRetentionFilterType; use datadog_api_client::datadogV2::model::RetentionFilterCreateAttributes; use datadog_api_client::datadogV2::model::RetentionFilterCreateData; use datadog_api_client::datadogV2::model::RetentionFilterCreateRequest; use datadog_api_client::datadogV2::model::RetentionFilterType; use datadog_api_client::datadogV2::model::SpansFilterCreate; #[tokio::main] async fn main() { let body = RetentionFilterCreateRequest::new(RetentionFilterCreateData::new( RetentionFilterCreateAttributes::new( true, SpansFilterCreate::new("@http.status_code:200 service:my-service".to_string()), RetentionFilterType::SPANS_SAMPLING_PROCESSOR, "my retention filter".to_string(), 1.0, ) .trace_rate(1.0 
as f64), ApmRetentionFilterType::apm_retention_filter, )); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api.create_apm_retention_filter(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a retention filter returns "OK" response ``` /** * Create a retention filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); const params: v2.APMRetentionFiltersApiCreateApmRetentionFilterRequest = { body: { data: { attributes: { enabled: true, filter: { query: "@http.status_code:200 service:my-service", }, filterType: "spans-sampling-processor", name: "my retention filter", rate: 1.0, }, type: "apm_retention_filter", }, }, }; apiInstance .createApmRetentionFilter(params) .then((data: v2.RetentionFilterCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a retention filter with trace rate returns "OK" response ``` /** * Create a retention filter with trace rate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); const params: v2.APMRetentionFiltersApiCreateApmRetentionFilterRequest = { body: { data: { attributes: { enabled: true, filter: { query: "@http.status_code:200 service:my-service", }, filterType: "spans-sampling-processor", name: "my retention filter", rate: 1.0, traceRate: 1.0, }, type: "apm_retention_filter", }, }, }; apiInstance .createApmRetentionFilter(params) .then((data: v2.RetentionFilterCreateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a given APM retention filter](https://docs.datadoghq.com/api/latest/apm-retention-filters/#get-a-given-apm-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#get-a-given-apm-retention-filter-v2) GET https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.ap2.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.eu/api/v2/apm/config/retention-filters/{filter_id}https://api.ddog-gov.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us3.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id} ### Overview Get an APM retention filter. This endpoint requires any of the following permissions: * `apm_retention_filter_read` * `apm_pipelines_read` ### Arguments #### Path Parameters Name Type Description filter_id [_required_] string The ID of the retention filter. ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#GetApmRetentionFilter-200-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#GetApmRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/apm-retention-filters/#GetApmRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#GetApmRetentionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) The retention filters definition. Field Type Description data object The definition of the retention filter. attributes [_required_] object The attributes of the retention filter. created_at int64 The creation timestamp of the retention filter. created_by string The creator of the retention filter. editable boolean Shows whether the filter can be edited. enabled boolean The status of the retention filter (Enabled/Disabled). execution_order int64 The execution order of the retention filter. filter object The spans filter used to index spans. query string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type enum The type of retention filter. Allowed enum values: `spans-sampling-processor,spans-errors-sampling-processor,spans-appsec-sampling-processor` default: `spans-sampling-processor` modified_at int64 The modification timestamp of the retention filter. modified_by string The modifier of the retention filter. name string The name of the retention filter. rate double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. 
Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ``` { "data": { "attributes": { "created_at": "integer", "created_by": "string", "editable": true, "enabled": true, "execution_order": "integer", "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "modified_at": "integer", "modified_by": "string", "name": "my retention filter", "rate": 1, "trace_rate": 1 }, "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### Get a given APM retention filter Copy ``` # Path parameters export filter_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/apm/config/retention-filters/${filter_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a given APM retention filter ``` """ Get a given APM retention filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = environ["RETENTION_FILTER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.get_apm_retention_filter( filter_id=RETENTION_FILTER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a
given APM retention filter ``` # Get a given APM retention filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = ENV["RETENTION_FILTER_DATA_ID"] p api_instance.get_apm_retention_filter(RETENTION_FILTER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a given APM retention filter ``` // Get a given APM retention filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "retention_filter" in the system RetentionFilterDataID := os.Getenv("RETENTION_FILTER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.GetApmRetentionFilter(ctx, RetentionFilterDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.GetApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.GetApmRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a given APM retention filter ``` // Get a given APM retention filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.RetentionFilterResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); // there is a valid "retention_filter" in the system String RETENTION_FILTER_DATA_ID = System.getenv("RETENTION_FILTER_DATA_ID"); try { RetentionFilterResponse result = apiInstance.getApmRetentionFilter(RETENTION_FILTER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#getApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" java "Example.java" ``` ##### Get a given APM retention filter ``` // Get a given APM retention filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; #[tokio::main] async fn main() { // there is a valid "retention_filter" in the system let retention_filter_data_id = std::env::var("RETENTION_FILTER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api .get_apm_retention_filter(retention_filter_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a given APM retention filter ``` /** * Get a given APM retention filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); // there is a valid "retention_filter" in the system const RETENTION_FILTER_DATA_ID = process.env.RETENTION_FILTER_DATA_ID as string; const params: v2.APMRetentionFiltersApiGetApmRetentionFilterRequest = { filterId: RETENTION_FILTER_DATA_ID, }; apiInstance .getApmRetentionFilter(params) .then((data: v2.RetentionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a retention filter](https://docs.datadoghq.com/api/latest/apm-retention-filters/#update-a-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#update-a-retention-filter-v2) PUT https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.ap2.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.eu/api/v2/apm/config/retention-filters/{filter_id}https://api.ddog-gov.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us3.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id} ### Overview Update a retention filter from your organization. Default filters (filters with types spans-errors-sampling-processor and spans-appsec-sampling-processor) cannot be renamed or removed. This endpoint requires any of the following permissions: * `apm_retention_filter_write` * `apm_pipelines_write` ### Arguments #### Path Parameters Name Type Description filter_id [_required_] string The ID of the retention filter. ### Request #### Body Data (required) The updated definition of the retention filter. 
* [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) Field Type Description data [_required_] object The body of the retention filter to be updated. attributes [_required_] object The object describing the configuration of the retention filter to create/update. enabled [_required_] boolean Enable/Disable the retention filter. filter [_required_] object The spans filter. Spans matching this filter will be indexed and stored. query [_required_] string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type [_required_] enum The type of retention filter. Allowed enum values: `spans-sampling-processor,spans-errors-sampling-processor,spans-appsec-sampling-processor` default: `spans-sampling-processor` name [_required_] string The name of the retention filter. rate [_required_] double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ##### Update a retention filter returns "OK" response ``` { "data": { "attributes": { "name": "test", "rate": 0.9, "filter": { "query": "@_top_level:1 test:service-demo" }, "enabled": true, "filter_type": "spans-sampling-processor" }, "id": "test-id", "type": "apm_retention_filter" } } ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` { "data": { "attributes": { "name": "test", "rate": 0.9, "trace_rate": 1.0, "filter": { "query": "@_top_level:1 test:service-demo" }, "enabled": true, "filter_type": "spans-sampling-processor" }, "id": "test-id", "type": "apm_retention_filter" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#UpdateApmRetentionFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/apm-retention-filters/#UpdateApmRetentionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#UpdateApmRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/apm-retention-filters/#UpdateApmRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#UpdateApmRetentionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) The retention filters definition. Field Type Description data object The definition of the retention filter. attributes [_required_] object The attributes of the retention filter. created_at int64 The creation timestamp of the retention filter. created_by string The creator of the retention filter. editable boolean Shows whether the filter can be edited. enabled boolean The status of the retention filter (Enabled/Disabled). execution_order int64 The execution order of the retention filter. filter object The spans filter used to index spans. query string The search query - following the [span search syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/). filter_type enum The type of retention filter. 
Allowed enum values: `spans-sampling-processor,spans-errors-sampling-processor,spans-appsec-sampling-processor` default: `spans-sampling-processor` modified_at int64 The modification timestamp of the retention filter. modified_by string The modifier of the retention filter. name string The name of the retention filter. rate double Sample rate to apply to spans going through this retention filter. A value of 1.0 keeps all spans matching the query. trace_rate double Sample rate to apply to traces containing spans going through this retention filter. A value of 1.0 keeps all traces with spans matching the query. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ``` { "data": { "attributes": { "created_at": "integer", "created_by": "string", "editable": true, "enabled": true, "execution_order": "integer", "filter": { "query": "@http.status_code:200 service:my-service" }, "filter_type": "spans-sampling-processor", "modified_at": "integer", "modified_by": "string", "name": "my retention filter", "rate": 1, "trace_rate": 1 }, "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### Update a retention filter returns "OK" response Copy ``` # Path parameters export filter_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v2/apm/config/retention-filters/${filter_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "test", "rate": 0.9, "filter": { "query": "@_top_level:1 test:service-demo" }, "enabled": true, "filter_type": "spans-sampling-processor" }, "id": "test-id", "type": "apm_retention_filter" } } EOF ``` ##### Update a retention filter with trace rate returns "OK" response Copy ``` # Path parameters export filter_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v2/apm/config/retention-filters/${filter_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "test", "rate": 0.9, "trace_rate": 1.0, "filter": { "query": "@_top_level:1 test:service-demo" }, "enabled": true, "filter_type": "spans-sampling-processor" }, "id": "test-id", "type": "apm_retention_filter" } } EOF ``` ##### Update a retention filter returns "OK" response ``` // Update a retention filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "retention_filter" in the system RetentionFilterDataID := os.Getenv("RETENTION_FILTER_DATA_ID") body := datadogV2.RetentionFilterUpdateRequest{ Data: datadogV2.RetentionFilterUpdateData{ Attributes: datadogV2.RetentionFilterUpdateAttributes{ Name: "test", Rate: 0.9, Filter: datadogV2.SpansFilterCreate{ Query: "@_top_level:1 test:service-demo", }, Enabled: true, FilterType: datadogV2.RETENTIONFILTERALLTYPE_SPANS_SAMPLING_PROCESSOR, }, Id: "test-id", Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.UpdateApmRetentionFilter(ctx, RetentionFilterDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.UpdateApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.UpdateApmRetentionFilter`:\n%s\n", responseContent) } ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` // Update a retention filter with trace rate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "retention_filter" in the system RetentionFilterDataID := os.Getenv("RETENTION_FILTER_DATA_ID") body := datadogV2.RetentionFilterUpdateRequest{ Data: datadogV2.RetentionFilterUpdateData{ Attributes: datadogV2.RetentionFilterUpdateAttributes{ Name: "test", Rate: 0.9, TraceRate: datadog.PtrFloat64(1.0), Filter: datadogV2.SpansFilterCreate{ Query: "@_top_level:1 test:service-demo", }, Enabled: true, FilterType: datadogV2.RETENTIONFILTERALLTYPE_SPANS_SAMPLING_PROCESSOR, }, Id: "test-id", Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) resp, r, err := api.UpdateApmRetentionFilter(ctx, RetentionFilterDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.UpdateApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMRetentionFiltersApi.UpdateApmRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a retention filter returns "OK" response ``` // Update a retention filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.ApmRetentionFilterType; import com.datadog.api.client.v2.model.RetentionFilterAllType; import com.datadog.api.client.v2.model.RetentionFilterResponse; import com.datadog.api.client.v2.model.RetentionFilterUpdateAttributes; import com.datadog.api.client.v2.model.RetentionFilterUpdateData; import com.datadog.api.client.v2.model.RetentionFilterUpdateRequest; import com.datadog.api.client.v2.model.SpansFilterCreate; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); // there is a valid "retention_filter" in the system String RETENTION_FILTER_DATA_ID = System.getenv("RETENTION_FILTER_DATA_ID"); RetentionFilterUpdateRequest body = new RetentionFilterUpdateRequest() .data( new RetentionFilterUpdateData() .attributes( new RetentionFilterUpdateAttributes() .name("test") .rate(0.9) .filter( new SpansFilterCreate().query("@_top_level:1 test:service-demo")) .enabled(true) .filterType(RetentionFilterAllType.SPANS_SAMPLING_PROCESSOR)) .id("test-id") 
.type(ApmRetentionFilterType.apm_retention_filter)); try { RetentionFilterResponse result = apiInstance.updateApmRetentionFilter(RETENTION_FILTER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#updateApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` // Update a retention filter with trace rate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.ApmRetentionFilterType; import com.datadog.api.client.v2.model.RetentionFilterAllType; import com.datadog.api.client.v2.model.RetentionFilterResponse; import com.datadog.api.client.v2.model.RetentionFilterUpdateAttributes; import com.datadog.api.client.v2.model.RetentionFilterUpdateData; import com.datadog.api.client.v2.model.RetentionFilterUpdateRequest; import com.datadog.api.client.v2.model.SpansFilterCreate; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); // there is a valid "retention_filter" in the system String RETENTION_FILTER_DATA_ID = System.getenv("RETENTION_FILTER_DATA_ID"); RetentionFilterUpdateRequest body = new RetentionFilterUpdateRequest() .data( new RetentionFilterUpdateData() .attributes( new RetentionFilterUpdateAttributes() .name("test") .rate(0.9) .traceRate(1.0) .filter( new SpansFilterCreate().query("@_top_level:1 test:service-demo")) .enabled(true) .filterType(RetentionFilterAllType.SPANS_SAMPLING_PROCESSOR)) .id("test-id") .type(ApmRetentionFilterType.apm_retention_filter)); try { RetentionFilterResponse result = apiInstance.updateApmRetentionFilter(RETENTION_FILTER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#updateApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a retention filter returns "OK" response ``` """ Update a retention filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType from datadog_api_client.v2.model.retention_filter_all_type import RetentionFilterAllType from datadog_api_client.v2.model.retention_filter_update_attributes import RetentionFilterUpdateAttributes from datadog_api_client.v2.model.retention_filter_update_data import RetentionFilterUpdateData from 
datadog_api_client.v2.model.retention_filter_update_request import RetentionFilterUpdateRequest from datadog_api_client.v2.model.spans_filter_create import SpansFilterCreate # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = environ["RETENTION_FILTER_DATA_ID"] body = RetentionFilterUpdateRequest( data=RetentionFilterUpdateData( attributes=RetentionFilterUpdateAttributes( name="test", rate=0.9, filter=SpansFilterCreate( query="@_top_level:1 test:service-demo", ), enabled=True, filter_type=RetentionFilterAllType.SPANS_SAMPLING_PROCESSOR, ), id="test-id", type=ApmRetentionFilterType.apm_retention_filter, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.update_apm_retention_filter(filter_id=RETENTION_FILTER_DATA_ID, body=body) print(response) ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` """ Update a retention filter with trace rate returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType from datadog_api_client.v2.model.retention_filter_all_type import RetentionFilterAllType from datadog_api_client.v2.model.retention_filter_update_attributes import RetentionFilterUpdateAttributes from datadog_api_client.v2.model.retention_filter_update_data import RetentionFilterUpdateData from datadog_api_client.v2.model.retention_filter_update_request import RetentionFilterUpdateRequest from datadog_api_client.v2.model.spans_filter_create import SpansFilterCreate # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = environ["RETENTION_FILTER_DATA_ID"] body = RetentionFilterUpdateRequest( data=RetentionFilterUpdateData( attributes=RetentionFilterUpdateAttributes( name="test", rate=0.9, trace_rate=1.0, filter=SpansFilterCreate( query="@_top_level:1 test:service-demo", ), enabled=True, filter_type=RetentionFilterAllType.SPANS_SAMPLING_PROCESSOR, ), id="test-id", type=ApmRetentionFilterType.apm_retention_filter, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) response = api_instance.update_apm_retention_filter(filter_id=RETENTION_FILTER_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a retention filter returns "OK" response ``` # Update a retention filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = ENV["RETENTION_FILTER_DATA_ID"] body = DatadogAPIClient::V2::RetentionFilterUpdateRequest.new({ data: DatadogAPIClient::V2::RetentionFilterUpdateData.new({ attributes: DatadogAPIClient::V2::RetentionFilterUpdateAttributes.new({ name: "test", rate: 0.9, filter: DatadogAPIClient::V2::SpansFilterCreate.new({ query: "@_top_level:1 test:service-demo", }), enabled: true, filter_type: 
DatadogAPIClient::V2::RetentionFilterAllType::SPANS_SAMPLING_PROCESSOR, }), id: "test-id", type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, }), }) p api_instance.update_apm_retention_filter(RETENTION_FILTER_DATA_ID, body) ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` # Update a retention filter with trace rate returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = ENV["RETENTION_FILTER_DATA_ID"] body = DatadogAPIClient::V2::RetentionFilterUpdateRequest.new({ data: DatadogAPIClient::V2::RetentionFilterUpdateData.new({ attributes: DatadogAPIClient::V2::RetentionFilterUpdateAttributes.new({ name: "test", rate: 0.9, trace_rate: 1.0, filter: DatadogAPIClient::V2::SpansFilterCreate.new({ query: "@_top_level:1 test:service-demo", }), enabled: true, filter_type: DatadogAPIClient::V2::RetentionFilterAllType::SPANS_SAMPLING_PROCESSOR, }), id: "test-id", type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, }), }) p api_instance.update_apm_retention_filter(RETENTION_FILTER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a retention filter returns "OK" response ``` // Update a retention filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; use datadog_api_client::datadogV2::model::ApmRetentionFilterType; use datadog_api_client::datadogV2::model::RetentionFilterAllType; use datadog_api_client::datadogV2::model::RetentionFilterUpdateAttributes; use datadog_api_client::datadogV2::model::RetentionFilterUpdateData; use datadog_api_client::datadogV2::model::RetentionFilterUpdateRequest; use datadog_api_client::datadogV2::model::SpansFilterCreate; #[tokio::main] async fn main() { // there is a valid "retention_filter" in the system let retention_filter_data_id = std::env::var("RETENTION_FILTER_DATA_ID").unwrap(); let body = RetentionFilterUpdateRequest::new(RetentionFilterUpdateData::new( RetentionFilterUpdateAttributes::new( true, SpansFilterCreate::new("@_top_level:1 test:service-demo".to_string()), RetentionFilterAllType::SPANS_SAMPLING_PROCESSOR, "test".to_string(), 0.9, ), "test-id".to_string(), ApmRetentionFilterType::apm_retention_filter, )); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api .update_apm_retention_filter(retention_filter_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` // Update a retention filter with trace rate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; use datadog_api_client::datadogV2::model::ApmRetentionFilterType; use datadog_api_client::datadogV2::model::RetentionFilterAllType; use datadog_api_client::datadogV2::model::RetentionFilterUpdateAttributes; use 
datadog_api_client::datadogV2::model::RetentionFilterUpdateData; use datadog_api_client::datadogV2::model::RetentionFilterUpdateRequest; use datadog_api_client::datadogV2::model::SpansFilterCreate; #[tokio::main] async fn main() { // there is a valid "retention_filter" in the system let retention_filter_data_id = std::env::var("RETENTION_FILTER_DATA_ID").unwrap(); let body = RetentionFilterUpdateRequest::new(RetentionFilterUpdateData::new( RetentionFilterUpdateAttributes::new( true, SpansFilterCreate::new("@_top_level:1 test:service-demo".to_string()), RetentionFilterAllType::SPANS_SAMPLING_PROCESSOR, "test".to_string(), 0.9, ) .trace_rate(1.0 as f64), "test-id".to_string(), ApmRetentionFilterType::apm_retention_filter, )); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api .update_apm_retention_filter(retention_filter_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a retention filter returns "OK" response ``` /** * Update a retention filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); // there is a valid "retention_filter" in the system const RETENTION_FILTER_DATA_ID = process.env.RETENTION_FILTER_DATA_ID as string; const params: v2.APMRetentionFiltersApiUpdateApmRetentionFilterRequest = { body: { data: { attributes: { name: "test", rate: 0.9, filter: { query: "@_top_level:1 test:service-demo", }, enabled: true, filterType: "spans-sampling-processor", }, id: "test-id", type: "apm_retention_filter", }, }, filterId: RETENTION_FILTER_DATA_ID, }; apiInstance .updateApmRetentionFilter(params) .then((data: v2.RetentionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a retention filter with trace rate returns "OK" response ``` /** * Update a retention filter with trace rate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); // there is a valid "retention_filter" in the system const RETENTION_FILTER_DATA_ID = process.env.RETENTION_FILTER_DATA_ID as string; const params: v2.APMRetentionFiltersApiUpdateApmRetentionFilterRequest = { body: { data: { attributes: { name: "test", rate: 0.9, traceRate: 1.0, filter: { query: "@_top_level:1 test:service-demo", }, enabled: true, filterType: "spans-sampling-processor", }, id: "test-id", type: "apm_retention_filter", }, }, filterId: RETENTION_FILTER_DATA_ID, }; apiInstance .updateApmRetentionFilter(params) .then((data: v2.RetentionFilterResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a retention filter](https://docs.datadoghq.com/api/latest/apm-retention-filters/#delete-a-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#delete-a-retention-filter-v2) DELETE https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.ap2.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.eu/api/v2/apm/config/retention-filters/{filter_id}https://api.ddog-gov.com/api/v2/apm/config/retention-filters/{filter_id}https://api.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us3.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id}https://api.us5.datadoghq.com/api/v2/apm/config/retention-filters/{filter_id} ### Overview Delete a specific retention filter from your organization. Default filters with types spans-errors-sampling-processor and spans-appsec-sampling-processor cannot be deleted. This endpoint requires any of the following permissions: * `apm_retention_filter_write` * `apm_pipelines_write` ### Arguments #### Path Parameters Name Type Description filter_id [_required_] string The ID of the retention filter. ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#DeleteApmRetentionFilter-200-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#DeleteApmRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/apm-retention-filters/#DeleteApmRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#DeleteApmRetentionFilter-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### Delete a retention filter Copy ``` # Path parameters export filter_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute the API host for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v2/apm/config/retention-filters/${filter_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a retention filter ``` """ Delete a retention filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = environ["RETENTION_FILTER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) api_instance.delete_apm_retention_filter( filter_id=RETENTION_FILTER_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a retention filter ``` # Delete a retention filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new # there is a valid "retention_filter" in the system RETENTION_FILTER_DATA_ID = ENV["RETENTION_FILTER_DATA_ID"] p api_instance.delete_apm_retention_filter(RETENTION_FILTER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete a retention filter ``` // Delete a retention filter returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "retention_filter" in the system RetentionFilterDataID := os.Getenv("RETENTION_FILTER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) r, err := api.DeleteApmRetentionFilter(ctx, RetentionFilterDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling
`APMRetentionFiltersApi.DeleteApmRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a retention filter ``` // Delete a retention filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); // there is a valid "retention_filter" in the system String RETENTION_FILTER_DATA_ID = System.getenv("RETENTION_FILTER_DATA_ID"); try { apiInstance.deleteApmRetentionFilter(RETENTION_FILTER_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ApmRetentionFiltersApi#deleteApmRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a retention filter ``` // Delete a retention filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; #[tokio::main] async fn main() { // there is a valid "retention_filter" in the system let retention_filter_data_id = std::env::var("RETENTION_FILTER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api .delete_apm_retention_filter(retention_filter_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a retention filter ``` /** * Delete a retention filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); // there is a valid "retention_filter" in the system const RETENTION_FILTER_DATA_ID = process.env.RETENTION_FILTER_DATA_ID as string; const params: v2.APMRetentionFiltersApiDeleteApmRetentionFilterRequest = { filterId: RETENTION_FILTER_DATA_ID, }; apiInstance .deleteApmRetentionFilter(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Re-order retention filters](https://docs.datadoghq.com/api/latest/apm-retention-filters/#re-order-retention-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm-retention-filters/#re-order-retention-filters-v2) PUT https://api.ap1.datadoghq.com/api/v2/apm/config/retention-filters-execution-orderhttps://api.ap2.datadoghq.com/api/v2/apm/config/retention-filters-execution-orderhttps://api.datadoghq.eu/api/v2/apm/config/retention-filters-execution-orderhttps://api.ddog-gov.com/api/v2/apm/config/retention-filters-execution-orderhttps://api.datadoghq.com/api/v2/apm/config/retention-filters-execution-orderhttps://api.us3.datadoghq.com/api/v2/apm/config/retention-filters-execution-orderhttps://api.us5.datadoghq.com/api/v2/apm/config/retention-filters-execution-order ### Overview Re-order the execution order of retention filters. This endpoint requires any of the following permissions: * `apm_retention_filter_write` * `apm_pipelines_write` ### Request #### Body Data (required) The list of retention filters in the new order. * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) Field Type Description data [_required_] [object] A list of retention filters objects. id [_required_] string The ID of the retention filter. type [_required_] enum The type of the resource. Allowed enum values: `apm_retention_filter` default: `apm_retention_filter` ``` { "data": [ { "id": "jdZrilSJQLqzb6Cu7aub9Q", "type": "apm_retention_filter" }, { "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ReorderApmRetentionFilters-200-v2) * [400](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ReorderApmRetentionFilters-400-v2) * [403](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ReorderApmRetentionFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/apm-retention-filters/#ReorderApmRetentionFilters-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/apm-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm-retention-filters/?code-lang=typescript) ##### Re-order retention filters returns "OK" response Copy ``` # Curl command (api.datadoghq.com shown; substitute the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v2/apm/config/retention-filters-execution-order" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "jdZrilSJQLqzb6Cu7aub9Q", "type": "apm_retention_filter" }, { "id": "7RBOb7dLSYWI01yc3pIH8w", "type": "apm_retention_filter" } ] } EOF ``` ##### Re-order retention filters returns "OK" response ``` // Re-order retention filters returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ReorderRetentionFiltersRequest{ Data: []datadogV2.RetentionFilterWithoutAttributes{ { Id: "jdZrilSJQLqzb6Cu7aub9Q", Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, { Id: "7RBOb7dLSYWI01yc3pIH8w", Type: datadogV2.APMRETENTIONFILTERTYPE_apm_retention_filter, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMRetentionFiltersApi(apiClient) r, err := api.ReorderApmRetentionFilters(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMRetentionFiltersApi.ReorderApmRetentionFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Re-order retention filters returns "OK" response ``` // Re-order retention filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApmRetentionFiltersApi; import com.datadog.api.client.v2.model.ApmRetentionFilterType; import com.datadog.api.client.v2.model.ReorderRetentionFiltersRequest; import com.datadog.api.client.v2.model.RetentionFilterWithoutAttributes; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmRetentionFiltersApi apiInstance = new ApmRetentionFiltersApi(defaultClient); ReorderRetentionFiltersRequest body = new ReorderRetentionFiltersRequest() .data( Arrays.asList( new RetentionFilterWithoutAttributes()
.id("jdZrilSJQLqzb6Cu7aub9Q") .type(ApmRetentionFilterType.apm_retention_filter), new RetentionFilterWithoutAttributes() .id("7RBOb7dLSYWI01yc3pIH8w") .type(ApmRetentionFilterType.apm_retention_filter))); try { apiInstance.reorderApmRetentionFilters(body); } catch (ApiException e) { System.err.println( "Exception when calling ApmRetentionFiltersApi#reorderApmRetentionFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Re-order retention filters returns "OK" response ``` """ Re-order retention filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType from datadog_api_client.v2.model.reorder_retention_filters_request import ReorderRetentionFiltersRequest from datadog_api_client.v2.model.retention_filter_without_attributes import RetentionFilterWithoutAttributes body = ReorderRetentionFiltersRequest( data=[ RetentionFilterWithoutAttributes( id="jdZrilSJQLqzb6Cu7aub9Q", type=ApmRetentionFilterType.apm_retention_filter, ), RetentionFilterWithoutAttributes( id="7RBOb7dLSYWI01yc3pIH8w", type=ApmRetentionFilterType.apm_retention_filter, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMRetentionFiltersApi(api_client) api_instance.reorder_apm_retention_filters(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Re-order retention filters returns "OK" response ``` # Re-order retention filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMRetentionFiltersAPI.new body = DatadogAPIClient::V2::ReorderRetentionFiltersRequest.new({ data: [ DatadogAPIClient::V2::RetentionFilterWithoutAttributes.new({ id: "jdZrilSJQLqzb6Cu7aub9Q", type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, }), DatadogAPIClient::V2::RetentionFilterWithoutAttributes.new({ id: "7RBOb7dLSYWI01yc3pIH8w", type: DatadogAPIClient::V2::ApmRetentionFilterType::APM_RETENTION_FILTER, }), ], }) p api_instance.reorder_apm_retention_filters(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Re-order retention filters returns "OK" response ``` // Re-order retention filters returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_apm_retention_filters::APMRetentionFiltersAPI; use datadog_api_client::datadogV2::model::ApmRetentionFilterType; use datadog_api_client::datadogV2::model::ReorderRetentionFiltersRequest; use datadog_api_client::datadogV2::model::RetentionFilterWithoutAttributes; #[tokio::main] async fn main() { let body = ReorderRetentionFiltersRequest::new(vec![ RetentionFilterWithoutAttributes::new( "jdZrilSJQLqzb6Cu7aub9Q".to_string(), ApmRetentionFilterType::apm_retention_filter, ), RetentionFilterWithoutAttributes::new( "7RBOb7dLSYWI01yc3pIH8w".to_string(), ApmRetentionFilterType::apm_retention_filter, ), ]); let configuration = datadog::Configuration::new(); let api = APMRetentionFiltersAPI::with_config(configuration); let resp = api.reorder_apm_retention_filters(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Re-order retention filters returns "OK" response ``` /** * Re-order retention filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMRetentionFiltersApi(configuration); const params: v2.APMRetentionFiltersApiReorderApmRetentionFiltersRequest = { body: { data: [ { id: "jdZrilSJQLqzb6Cu7aub9Q", type: "apm_retention_filter", }, { id: "7RBOb7dLSYWI01yc3pIH8w", type: "apm_retention_filter", }, ], }, }; apiInstance .reorderApmRetentionFilters(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
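The request body for this endpoint is the list of retention filters in their new execution order, so a common workflow is to read the current filters first and then submit a permutation of that list. The Python sketch below illustrates this; it is an illustrative addition rather than one of the generated client examples, and it assumes the `list_apm_retention_filters` method on the same `APMRetentionFiltersApi` client used above.

```python
"""
Read the current retention filter order, then submit a new order.

Illustrative sketch: assumes `list_apm_retention_filters()` is available on
the same APMRetentionFiltersApi client used in the generated examples above.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.apm_retention_filters_api import APMRetentionFiltersApi
from datadog_api_client.v2.model.apm_retention_filter_type import ApmRetentionFilterType
from datadog_api_client.v2.model.reorder_retention_filters_request import ReorderRetentionFiltersRequest
from datadog_api_client.v2.model.retention_filter_without_attributes import RetentionFilterWithoutAttributes

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = APMRetentionFiltersApi(api_client)

    # Current filters, in their current execution order.
    current_ids = [f.id for f in api_instance.list_apm_retention_filters().data]

    # Example permutation: move the last filter to the front.
    new_order = current_ids[-1:] + current_ids[:-1]

    body = ReorderRetentionFiltersRequest(
        data=[
            RetentionFilterWithoutAttributes(
                id=filter_id,
                # Built from the raw enum value to avoid depending on the
                # generated constant name.
                type=ApmRetentionFilterType("apm_retention_filter"),
            )
            for filter_id in new_order
        ],
    )
    api_instance.reorder_apm_retention_filters(body=body)
```

* * *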
--- # Source: https://docs.datadoghq.com/api/latest/apm # APM Observe, troubleshoot, and improve cloud-scale applications with all telemetry in context ## [Get service list](https://docs.datadoghq.com/api/latest/apm/#get-service-list) * [v2 (latest)](https://docs.datadoghq.com/api/latest/apm/#get-service-list-v2) GET https://api.ap1.datadoghq.com/api/v2/apm/serviceshttps://api.ap2.datadoghq.com/api/v2/apm/serviceshttps://api.datadoghq.eu/api/v2/apm/serviceshttps://api.ddog-gov.com/api/v2/apm/serviceshttps://api.datadoghq.com/api/v2/apm/serviceshttps://api.us3.datadoghq.com/api/v2/apm/serviceshttps://api.us5.datadoghq.com/api/v2/apm/services ### Overview OAuth apps require the `apm_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#apm) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/apm/#GetServiceList-200-v2) * [429](https://docs.datadoghq.com/api/latest/apm/#GetServiceList-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/apm/) * [Example](https://docs.datadoghq.com/api/latest/apm/) Field Type Description data object attributes object metadata [object] isTraced boolean isUsm boolean services [string] id string type [_required_] enum Services list resource type. Allowed enum values: `services_list` default: `services_list` ``` { "data": { "attributes": { "metadata": [ { "isTraced": false, "isUsm": false } ], "services": [] }, "id": "string", "type": "services_list" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/apm/) * [Example](https://docs.datadoghq.com/api/latest/apm/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/apm/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/apm/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/apm/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/apm/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/apm/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/apm/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/apm/?code-lang=typescript) ##### Get service list Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get service list ``` """ Get service list returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.apm_api import APMApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = APMApi(api_client) response = api_instance.get_service_list() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get service list ``` # Get service list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::APMAPI.new p api_instance.get_service_list() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get service list ``` // Get service list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAPMApi(apiClient) resp, r, err := api.GetServiceList(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `APMApi.GetServiceList`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `APMApi.GetServiceList`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get service list ``` // Get service list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.ApmApi; import com.datadog.api.client.v2.model.ServiceList; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApmApi apiInstance = new ApmApi(defaultClient); try { ServiceList result = apiInstance.getServiceList(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ApmApi#getServiceList"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get service list ``` // Get service list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_apm::APMAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = APMAPI::with_config(configuration); let resp = api.get_service_list().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get service list ``` /** * Get service list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.APMApi(configuration); apiInstance .getServiceList() .then((data: v2.ServiceList) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6941c77a-407e-4ba9-83f0-fa5729676259&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=fcb0ae40-1271-469d-bf9b-028e294d8e41&pt=APM&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fapm%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6941c77a-407e-4ba9-83f0-fa5729676259&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=fcb0ae40-1271-469d-bf9b-028e294d8e41&pt=APM&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fapm%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=90881391-8e3d-40b1-b348-262a698fcec0&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=APM&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fapm%2F&r=<=11343&evt=pageLoad&sv=2&asc=G&cdb=AQAC&rn=103209) --- # Source: https://docs.datadoghq.com/api/latest/app-builder/ # App Builder Datadog App Builder provides a low-code solution to rapidly develop and integrate secure, customized applications into your monitoring stack that are built to accelerate remediation at scale. These API endpoints allow you to create, read, update, delete, and publish apps. ## [List Apps](https://docs.datadoghq.com/api/latest/app-builder/#list-apps) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#list-apps-v2) GET https://api.ap1.datadoghq.com/api/v2/app-builder/appshttps://api.ap2.datadoghq.com/api/v2/app-builder/appshttps://api.datadoghq.eu/api/v2/app-builder/appshttps://api.ddog-gov.com/api/v2/app-builder/appshttps://api.datadoghq.com/api/v2/app-builder/appshttps://api.us3.datadoghq.com/api/v2/app-builder/appshttps://api.us5.datadoghq.com/api/v2/app-builder/apps ### Overview List all apps, with optional filters and sorting. This endpoint is paginated. Only basic app information such as the app ID, name, and description is returned by this endpoint. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `apps_run` permission. ### Arguments #### Query Strings Name Type Description limit integer The number of apps to return per page. page integer The page number to return. filter[user_name] string Filter apps by the app creator. Usually the user’s email. filter[user_uuid] string Filter apps by the app creator’s UUID. filter[name] string Filter by app name. filter[query] string Filter apps by the app name or the app creator. 
filter[deployed] boolean Filter apps by whether they are published. filter[tags] string Filter apps by tags. filter[favorite] boolean Filter apps by whether you have added them to your favorites. filter[self_service] boolean Filter apps by whether they are enabled for self-service. sort array The fields and direction to sort apps by. ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#ListApps-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#ListApps-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#ListApps-403-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#ListApps-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) A paginated list of apps matching the specified filters and sorting. Field Type Description data [object] An array of app definitions. attributes [_required_] object Basic information about the app such as name, description, and tags. description string A human-readable description for the app. favorite boolean Whether the app is marked as a favorite by the current user. name string The name of the app. selfService boolean Whether the app is enabled for use in the Datadog self-service hub. tags [string] A list of tags for the app, which can be used to filter apps. id [_required_] uuid The ID of the app. meta object Metadata of an app. created_at date-time Timestamp of when the app was created. deleted_at date-time Timestamp of when the app was deleted. org_id int64 The Datadog organization ID that owns the app. updated_at date-time Timestamp of when the app was last updated. updated_since_deployment boolean Whether the app was updated since it was last published. Published apps are pinned to a specific version and do not automatically update when the app is updated. user_id int64 The ID of the user who created the app. user_name string The name (or email address) of the user who created the app. user_uuid uuid The UUID of the user who created the app. version int64 The version number of the app. This starts at 1 and increments with each update. relationships object The app's publication information. deployment object Information pointing to the app's publication status. data object Data object containing the deployment ID. id uuid The deployment ID. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` included [object] Data on the version of the app that was published. attributes object The attributes object containing the version ID of the published app. app_version_id uuid The version ID of the app that was published. For an unpublished app, this is always the nil UUID (`00000000-0000-0000-0000-000000000000`). id uuid The deployment ID. meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. 
user_uuid uuid The UUID of the user who published the app. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Pagination metadata. page object Information on the total number of apps, to be used for pagination. totalCount int64 The total number of apps under the Datadog organization, disregarding any filters applied. totalFilteredCount int64 The total number of apps that match the specified filters. ``` { "data": [ { "attributes": { "description": "string", "favorite": false, "name": "string", "selfService": false, "tags": [ "service:webshop-backend", "team:webshop" ] }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "deleted_at": "2019-09-19T10:00:00.000Z", "org_id": "integer", "updated_at": "2019-09-19T10:00:00.000Z", "updated_since_deployment": false, "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "version": "integer" }, "relationships": { "deployment": { "data": { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "deployment" }, "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" } } }, "type": "appDefinitions" } ], "included": [ { "attributes": { "app_version_id": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "type": "deployment" } ], "meta": { "page": { "totalCount": "integer", "totalFilteredCount": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### List Apps Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Apps ``` """ List Apps returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.list_apps() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Apps ``` # List Apps returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new p api_instance.list_apps() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Apps ``` // List Apps returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.ListApps(ctx, *datadogV2.NewListAppsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.ListApps`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.ListApps`:\n%s\n", responseContent) } 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Apps ``` // List Apps returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.ListAppsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); try { ListAppsResponse result = apiInstance.listApps(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#listApps"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Apps ``` // List Apps returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; use datadog_api_client::datadogV2::api_app_builder::ListAppsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.list_apps(ListAppsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Apps ``` /** * List Apps returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); apiInstance .listApps() .then((data: v2.ListAppsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create App](https://docs.datadoghq.com/api/latest/app-builder/#create-app) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#create-app-v2) POST https://api.ap1.datadoghq.com/api/v2/app-builder/appshttps://api.ap2.datadoghq.com/api/v2/app-builder/appshttps://api.datadoghq.eu/api/v2/app-builder/appshttps://api.ddog-gov.com/api/v2/app-builder/appshttps://api.datadoghq.com/api/v2/app-builder/appshttps://api.us3.datadoghq.com/api/v2/app-builder/appshttps://api.us5.datadoghq.com/api/v2/app-builder/apps ### Overview Create a new app, returning the app ID. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires all of the following permissions: * `apps_write` * `connections_resolve` * `workflows_run` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) Field Type Description data object The data object containing the app definition. attributes object App definition attributes such as name, description, and components. components [object] The UI components that make up the app. events [object] Events to listen for on the grid component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the grid component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this grid component. This name is also visible in the app editor. properties [_required_] object Properties of a grid component. backgroundColor string The background color of the grid. default: `default` children [object] The child components of the grid. events [object] Events to listen for on the UI component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the UI component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this UI component. This name is also visible in the app editor. properties [_required_] object Properties of a UI component. Different component types can have their own additional unique properties. 
See the [components documentation](https://docs.datadoghq.com/service_management/app_builder/components/) for more detail on each component type and its properties. children [object] The child components of the UI component. isVisible Whether the UI component is visible. If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. type [_required_] enum The UI component type. Allowed enum values: `table,textInput,textArea,button,text,select,modal,schemaForm,checkbox,tabs,vegaChart,radioButtons,numberInput,fileInput,jsonInput,gridCell,dateRangePicker,search,container,calloutValue` isVisible Whether the grid component and its children are visible. If a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 string Option 2 boolean default: `true` type [_required_] enum The grid component type. Allowed enum values: `grid` default: `grid` description string A human-readable description for the app. name string The name of the app. queries [ ] An array of queries, such as external actions and state variables, that the app uses. Option 1 object An action query. This query type is used to trigger an action, such as sending a HTTP request. events [object] Events to listen for downstream of the action query. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id [_required_] uuid The ID of the action query. name [_required_] string A unique identifier for this action query. This name is also used to access the query's result throughout the app. properties [_required_] object The properties of the action query. condition Whether to run this query. If specified, the query will only run if this condition evaluates to `true` in JavaScript and all other conditions are also met. Option 1 boolean Option 2 string debounceInMs The minimum time in milliseconds that must pass before the query can be triggered again. This is useful for preventing accidental double-clicks from triggering the query multiple times. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. mockedOutputs The mocked outputs of the action query. This is useful for testing the app without actually running the action. Option 1 string Option 2 object The mocked outputs of the action query. enabled [_required_] Whether to enable the mocked outputs for testing. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The mocked outputs of the action query, serialized as JSON. onlyTriggerManually Determines when this query is executed. If set to `false`, the query will run when the app loads and whenever any query arguments change. If set to `true`, the query will only run when manually triggered from elsewhere in the app. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The post-query transformation function, which is a JavaScript function that changes the query's `.outputs` property after the query's execution. 
pollingIntervalInMs If specified, the app will poll the query at the specified interval in milliseconds. The minimum polling interval is 15 seconds. The query will only poll when the app's browser tab is active. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. requiresConfirmation Whether to prompt the user to confirm this query before it runs. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. showToastOnError Whether to display a toast to the user when the query returns an error. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. spec [_required_] The definition of the action query. Option 1 string Option 2 object The action query spec object. connectionGroup object The connection group to use for an action query. id uuid The ID of the connection group. tags [string] The tags of the connection group. connectionId string The ID of the custom connection to use for this action query. fqn [_required_] string The fully qualified name of the action type. inputs The inputs to the action query. These are the values that are passed to the action when it is triggered. Option 1 string Option 2 object The inputs to the action query. See the [Actions Catalog](https://docs.datadoghq.com/actions/actions_catalog/) for more detail on each action and its inputs. type [_required_] enum The action query type. Allowed enum values: `action` default: `action` Option 2 object A data transformer, which is custom JavaScript code that executes and transforms data when its inputs change. id [_required_] uuid The ID of the data transformer. name [_required_] string A unique identifier for this data transformer. This name is also used to access the transformer's result throughout the app. properties [_required_] object The properties of the data transformer. outputs string A JavaScript function that returns the transformed data. type [_required_] enum The data transform type. Allowed enum values: `dataTransform` default: `dataTransform` Option 3 object A variable, which can be set and read by other components in the app. id [_required_] uuid The ID of the state variable. name [_required_] string A unique identifier for this state variable. This name is also used to access the variable's value throughout the app. properties [_required_] object The properties of the state variable. defaultValue The default value of the state variable. type [_required_] enum The state variable type. Allowed enum values: `stateVariable` default: `stateVariable` rootInstanceName string The name of the root component of the app. This must be a `grid` component that contains all other components. tags [string] A list of tags for the app, which can be used to filter apps. type [_required_] enum The app definition type. 
Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": { "type": "appDefinitions", "attributes": { "rootInstanceName": "grid0", "components": [ { "name": "grid0", "type": "grid", "properties": { "children": [ { "type": "gridCell", "name": "gridCell0", "properties": { "children": [ { "name": "text0", "type": "text", "properties": { "content": "# Cat Facts", "contentType": "markdown", "textAlign": "left", "verticalAlign": "top", "isVisible": true }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 0, "width": 4, "height": 5 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell2", "properties": { "children": [ { "name": "table0", "type": "table", "properties": { "data": "${fetchFacts?.outputs?.body?.data}", "columns": [ { "dataPath": "fact", "header": "fact", "isHidden": false, "id": "0ae2ae9e-0280-4389-83c6-1c5949f7e674" }, { "dataPath": "length", "header": "length", "isHidden": true, "id": "c9048611-0196-4a00-9366-1ef9e3ec0408" }, { "id": "8fa9284b-7a58-4f13-9959-57b7d8a7fe8f", "dataPath": "Due Date", "header": "Unused Old Column", "disableSortBy": false, "formatter": { "type": "formatted_time", "format": "LARGE_WITHOUT_TIME" }, "isDeleted": true } ], "summary": true, "pageSize": "${pageSize?.value}", "paginationType": "server_side", "isLoading": "${fetchFacts?.isLoading}", "rowButtons": [], "isWrappable": false, "isScrollable": "vertical", "isSubRowsEnabled": false, "globalFilter": false, "isVisible": true, "totalCount": "${fetchFacts?.outputs?.body?.total}" }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 5, "width": 12, "height": 96 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell1", "properties": { "children": [ { "name": "text1", "type": "text", "properties": { "content": "## Random Fact\n\n${randomFact?.outputs?.fact}", "contentType": "markdown", "textAlign": "left", "verticalAlign": "top", "isVisible": true }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 101, "width": 12, "height": 16 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell3", "properties": { "children": [ { "name": "button0", "type": "button", "properties": { "label": "Increase Page Size", "level": "default", "isPrimary": true, "isBorderless": false, "isLoading": false, "isDisabled": false, "isVisible": true, "iconLeft": "angleUp", "iconRight": "" }, "events": [ { "variableName": "pageSize", "value": "${pageSize?.value + 1}", "name": "click", "type": "setStateVariableValue" } ] } ], "isVisible": "true", "layout": { "default": { "x": 10, "y": 134, "width": 2, "height": 4 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell4", "properties": { "children": [ { "name": "button1", "type": "button", "properties": { "label": "Decrease Page Size", "level": "default", "isPrimary": true, "isBorderless": false, "isLoading": false, "isDisabled": false, "isVisible": true, "iconLeft": "angleDown", "iconRight": "" }, "events": [ { "variableName": "pageSize", "value": "${pageSize?.value - 1}", "name": "click", "type": "setStateVariableValue" } ] } ], "isVisible": "true", "layout": { "default": { "x": 10, "y": 138, "width": 2, "height": 4 } } }, "events": [] } ], "backgroundColor": "default" }, "events": [] } ], "queries": [ { "id": "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5", "type": "action", "name": "fetchFacts", "events": [], "properties": { "spec": { "fqn": "com.datadoghq.http.request", "connectionId": "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3", "inputs": { "verb": "GET", "url": 
"https://catfact.ninja/facts", "urlParams": [ { "key": "limit", "value": "${pageSize.value.toString()}" }, { "key": "page", "value": "${(table0.pageIndex + 1).toString()}" } ] } } } }, { "type": "stateVariable", "name": "pageSize", "properties": { "defaultValue": "${20}" }, "id": "afd03c81-4075-4432-8618-ba09d52d2f2d" }, { "type": "dataTransform", "name": "randomFact", "properties": { "outputs": "${(() => {const facts = fetchFacts.outputs.body.data\nreturn facts[Math.floor(Math.random()*facts.length)]\n})()}" }, "id": "0fb22859-47dc-4137-9e41-7b67d04c525c" } ], "name": "Example Cat Facts Viewer", "description": "This is a slightly complicated example app that fetches and displays cat facts" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/app-builder/#CreateApp-201-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#CreateApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#CreateApp-403-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#CreateApp-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after a new app is successfully created, with the app ID. Field Type Description data object The data object containing the app ID. id [_required_] uuid The ID of the created app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Create App returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "appDefinitions", "attributes": { "rootInstanceName": "grid0", "components": [ { "name": "grid0", "type": "grid", "properties": { "children": [ { "type": "gridCell", "name": "gridCell0", "properties": { "children": [ { "name": "text0", "type": "text", "properties": { "content": "# Cat Facts", "contentType": "markdown", "textAlign": "left", "verticalAlign": "top", "isVisible": true }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 0, "width": 4, "height": 5 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell2", "properties": { "children": [ { "name": "table0", "type": "table", "properties": { "data": "${fetchFacts?.outputs?.body?.data}", "columns": [ { "dataPath": "fact", "header": "fact", "isHidden": false, "id": "0ae2ae9e-0280-4389-83c6-1c5949f7e674" }, { "dataPath": "length", "header": "length", "isHidden": true, "id": "c9048611-0196-4a00-9366-1ef9e3ec0408" }, { "id": "8fa9284b-7a58-4f13-9959-57b7d8a7fe8f", "dataPath": "Due Date", "header": "Unused Old Column", "disableSortBy": false, "formatter": { "type": "formatted_time", "format": "LARGE_WITHOUT_TIME" }, "isDeleted": true } ], "summary": true, "pageSize": "${pageSize?.value}", "paginationType": "server_side", "isLoading": "${fetchFacts?.isLoading}", "rowButtons": [], "isWrappable": false, "isScrollable": "vertical", "isSubRowsEnabled": false, "globalFilter": false, "isVisible": true, "totalCount": "${fetchFacts?.outputs?.body?.total}" }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 5, "width": 12, "height": 96 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell1", "properties": { "children": [ { "name": "text1", "type": "text", "properties": { "content": "## Random Fact\n\n${randomFact?.outputs?.fact}", "contentType": "markdown", "textAlign": "left", "verticalAlign": "top", "isVisible": true }, "events": [] } ], "isVisible": "true", "layout": { "default": { "x": 0, "y": 101, "width": 12, "height": 16 } } }, 
"events": [] }, { "type": "gridCell", "name": "gridCell3", "properties": { "children": [ { "name": "button0", "type": "button", "properties": { "label": "Increase Page Size", "level": "default", "isPrimary": true, "isBorderless": false, "isLoading": false, "isDisabled": false, "isVisible": true, "iconLeft": "angleUp", "iconRight": "" }, "events": [ { "variableName": "pageSize", "value": "${pageSize?.value + 1}", "name": "click", "type": "setStateVariableValue" } ] } ], "isVisible": "true", "layout": { "default": { "x": 10, "y": 134, "width": 2, "height": 4 } } }, "events": [] }, { "type": "gridCell", "name": "gridCell4", "properties": { "children": [ { "name": "button1", "type": "button", "properties": { "label": "Decrease Page Size", "level": "default", "isPrimary": true, "isBorderless": false, "isLoading": false, "isDisabled": false, "isVisible": true, "iconLeft": "angleDown", "iconRight": "" }, "events": [ { "variableName": "pageSize", "value": "${pageSize?.value - 1}", "name": "click", "type": "setStateVariableValue" } ] } ], "isVisible": "true", "layout": { "default": { "x": 10, "y": 138, "width": 2, "height": 4 } } }, "events": [] } ], "backgroundColor": "default" }, "events": [] } ], "queries": [ { "id": "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5", "type": "action", "name": "fetchFacts", "events": [], "properties": { "spec": { "fqn": "com.datadoghq.http.request", "connectionId": "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3", "inputs": { "verb": "GET", "url": "https://catfact.ninja/facts", "urlParams": [ { "key": "limit", "value": "${pageSize.value.toString()}" }, { "key": "page", "value": "${(table0.pageIndex + 1).toString()}" } ] } } } }, { "type": "stateVariable", "name": "pageSize", "properties": { "defaultValue": "${20}" }, "id": "afd03c81-4075-4432-8618-ba09d52d2f2d" }, { "type": "dataTransform", "name": "randomFact", "properties": { "outputs": "${(() => {const facts = fetchFacts.outputs.body.data\nreturn facts[Math.floor(Math.random()*facts.length)]\n})()}" }, "id": "0fb22859-47dc-4137-9e41-7b67d04c525c" } ], "name": "Example Cat Facts Viewer", "description": "This is a slightly complicated example app that fetches and displays cat facts" } } } EOF ``` ##### Create App returns "Created" response ``` // Create App returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { body := datadogV2.CreateAppRequest{ Data: &datadogV2.CreateAppRequestData{ Type: datadogV2.APPDEFINITIONTYPE_APPDEFINITIONS, Attributes: &datadogV2.CreateAppRequestDataAttributes{ RootInstanceName: datadog.PtrString("grid0"), Components: []datadogV2.ComponentGrid{ { Name: "grid0", Type: datadogV2.COMPONENTGRIDTYPE_GRID, Properties: datadogV2.ComponentGridProperties{ Children: []datadogV2.Component{ { Type: datadogV2.COMPONENTTYPE_GRIDCELL, Name: "gridCell0", Properties: datadogV2.ComponentProperties{ Children: []datadogV2.Component{ { Name: "text0", Type: datadogV2.COMPONENTTYPE_TEXT, Properties: datadogV2.ComponentProperties{ IsVisible: &datadogV2.ComponentPropertiesIsVisible{ Bool: datadog.PtrBool(true)}, }, Events: []datadogV2.AppBuilderEvent{}, }, }, IsVisible: &datadogV2.ComponentPropertiesIsVisible{ String: datadog.PtrString("true")}, }, Events: []datadogV2.AppBuilderEvent{}, }, { Type: datadogV2.COMPONENTTYPE_GRIDCELL, Name: "gridCell2", Properties: datadogV2.ComponentProperties{ Children: []datadogV2.Component{ { Name: "table0", 
Type: datadogV2.COMPONENTTYPE_TABLE, Properties: datadogV2.ComponentProperties{ IsVisible: &datadogV2.ComponentPropertiesIsVisible{ Bool: datadog.PtrBool(true)}, }, Events: []datadogV2.AppBuilderEvent{}, }, }, IsVisible: &datadogV2.ComponentPropertiesIsVisible{ String: datadog.PtrString("true")}, }, Events: []datadogV2.AppBuilderEvent{}, }, { Type: datadogV2.COMPONENTTYPE_GRIDCELL, Name: "gridCell1", Properties: datadogV2.ComponentProperties{ Children: []datadogV2.Component{ { Name: "text1", Type: datadogV2.COMPONENTTYPE_TEXT, Properties: datadogV2.ComponentProperties{ IsVisible: &datadogV2.ComponentPropertiesIsVisible{ Bool: datadog.PtrBool(true)}, }, Events: []datadogV2.AppBuilderEvent{}, }, }, IsVisible: &datadogV2.ComponentPropertiesIsVisible{ String: datadog.PtrString("true")}, }, Events: []datadogV2.AppBuilderEvent{}, }, { Type: datadogV2.COMPONENTTYPE_GRIDCELL, Name: "gridCell3", Properties: datadogV2.ComponentProperties{ Children: []datadogV2.Component{ { Name: "button0", Type: datadogV2.COMPONENTTYPE_BUTTON, Properties: datadogV2.ComponentProperties{ IsVisible: &datadogV2.ComponentPropertiesIsVisible{ Bool: datadog.PtrBool(true)}, }, Events: []datadogV2.AppBuilderEvent{ { Name: datadogV2.APPBUILDEREVENTNAME_CLICK.Ptr(), Type: datadogV2.APPBUILDEREVENTTYPE_SETSTATEVARIABLEVALUE.Ptr(), }, }, }, }, IsVisible: &datadogV2.ComponentPropertiesIsVisible{ String: datadog.PtrString("true")}, }, Events: []datadogV2.AppBuilderEvent{}, }, { Type: datadogV2.COMPONENTTYPE_GRIDCELL, Name: "gridCell4", Properties: datadogV2.ComponentProperties{ Children: []datadogV2.Component{ { Name: "button1", Type: datadogV2.COMPONENTTYPE_BUTTON, Properties: datadogV2.ComponentProperties{ IsVisible: &datadogV2.ComponentPropertiesIsVisible{ Bool: datadog.PtrBool(true)}, }, Events: []datadogV2.AppBuilderEvent{ { Name: datadogV2.APPBUILDEREVENTNAME_CLICK.Ptr(), Type: datadogV2.APPBUILDEREVENTTYPE_SETSTATEVARIABLEVALUE.Ptr(), }, }, }, }, IsVisible: &datadogV2.ComponentPropertiesIsVisible{ String: datadog.PtrString("true")}, }, Events: []datadogV2.AppBuilderEvent{}, }, }, BackgroundColor: datadog.PtrString("default"), }, Events: []datadogV2.AppBuilderEvent{}, }, }, Queries: []datadogV2.Query{ datadogV2.Query{ ActionQuery: &datadogV2.ActionQuery{ Id: uuid.MustParse("92ff0bb8-553b-4f31-87c7-ef5bd16d47d5"), Type: datadogV2.ACTIONQUERYTYPE_ACTION, Name: "fetchFacts", Events: []datadogV2.AppBuilderEvent{}, Properties: datadogV2.ActionQueryProperties{ Spec: datadogV2.ActionQuerySpec{ ActionQuerySpecObject: &datadogV2.ActionQuerySpecObject{ Fqn: "com.datadoghq.http.request", ConnectionId: datadog.PtrString("5e63f4a8-4ce6-47de-ba11-f6617c1d54f3"), Inputs: &datadogV2.ActionQuerySpecInputs{ ActionQuerySpecInput: map[string]interface{}{ "verb": "GET", "url": "https://catfact.ninja/facts", "urlParams": "[{'key': 'limit', 'value': '${pageSize.value.toString()}'}, {'key': 'page', 'value': '${(table0.pageIndex + 1).toString()}'}]", }}, }}, }, }}, datadogV2.Query{ StateVariable: &datadogV2.StateVariable{ Type: datadogV2.STATEVARIABLETYPE_STATEVARIABLE, Name: "pageSize", Properties: datadogV2.StateVariableProperties{ DefaultValue: "${20}", }, Id: uuid.MustParse("afd03c81-4075-4432-8618-ba09d52d2f2d"), }}, datadogV2.Query{ DataTransform: &datadogV2.DataTransform{ Type: datadogV2.DATATRANSFORMTYPE_DATATRANSFORM, Name: "randomFact", Properties: datadogV2.DataTransformProperties{ Outputs: datadog.PtrString(`${(() => {const facts = fetchFacts.outputs.body.data return facts[Math.floor(Math.random()*facts.length)] })()}`), }, Id: 
uuid.MustParse("0fb22859-47dc-4137-9e41-7b67d04c525c"), }}, }, Name: datadog.PtrString("Example Cat Facts Viewer"), Description: datadog.PtrString("This is a slightly complicated example app that fetches and displays cat facts"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.CreateApp(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.CreateApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.CreateApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create App returns "Created" response ``` // Create App returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.ActionQuery; import com.datadog.api.client.v2.model.ActionQueryProperties; import com.datadog.api.client.v2.model.ActionQuerySpec; import com.datadog.api.client.v2.model.ActionQuerySpecInputs; import com.datadog.api.client.v2.model.ActionQuerySpecObject; import com.datadog.api.client.v2.model.ActionQueryType; import com.datadog.api.client.v2.model.AppBuilderEvent; import com.datadog.api.client.v2.model.AppBuilderEventName; import com.datadog.api.client.v2.model.AppBuilderEventType; import com.datadog.api.client.v2.model.AppDefinitionType; import com.datadog.api.client.v2.model.Component; import com.datadog.api.client.v2.model.ComponentGrid; import com.datadog.api.client.v2.model.ComponentGridProperties; import com.datadog.api.client.v2.model.ComponentGridType; import com.datadog.api.client.v2.model.ComponentProperties; import com.datadog.api.client.v2.model.ComponentPropertiesIsVisible; import com.datadog.api.client.v2.model.ComponentType; import com.datadog.api.client.v2.model.CreateAppRequest; import com.datadog.api.client.v2.model.CreateAppRequestData; import com.datadog.api.client.v2.model.CreateAppRequestDataAttributes; import com.datadog.api.client.v2.model.CreateAppResponse; import com.datadog.api.client.v2.model.DataTransform; import com.datadog.api.client.v2.model.DataTransformProperties; import com.datadog.api.client.v2.model.DataTransformType; import com.datadog.api.client.v2.model.Query; import com.datadog.api.client.v2.model.StateVariable; import com.datadog.api.client.v2.model.StateVariableProperties; import com.datadog.api.client.v2.model.StateVariableType; import java.util.Arrays; import java.util.Collections; import java.util.Map; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); CreateAppRequest body = new CreateAppRequest() .data( new CreateAppRequestData() .type(AppDefinitionType.APPDEFINITIONS) .attributes( new CreateAppRequestDataAttributes() .rootInstanceName("grid0") .components( Collections.singletonList( new ComponentGrid() 
.name("grid0") .type(ComponentGridType.GRID) .properties( new ComponentGridProperties() .children( Arrays.asList( new Component() .type(ComponentType.GRIDCELL) .name("gridCell0") .properties( new ComponentProperties() .children( Collections.singletonList( new Component() .name("text0") .type( ComponentType .TEXT) .properties( new ComponentProperties() .isVisible( new ComponentPropertiesIsVisible( true))))) .isVisible( new ComponentPropertiesIsVisible( "true"))), new Component() .type(ComponentType.GRIDCELL) .name("gridCell2") .properties( new ComponentProperties() .children( Collections.singletonList( new Component() .name("table0") .type( ComponentType .TABLE) .properties( new ComponentProperties() .isVisible( new ComponentPropertiesIsVisible( true))))) .isVisible( new ComponentPropertiesIsVisible( "true"))), new Component() .type(ComponentType.GRIDCELL) .name("gridCell1") .properties( new ComponentProperties() .children( Collections.singletonList( new Component() .name("text1") .type( ComponentType .TEXT) .properties( new ComponentProperties() .isVisible( new ComponentPropertiesIsVisible( true))))) .isVisible( new ComponentPropertiesIsVisible( "true"))), new Component() .type(ComponentType.GRIDCELL) .name("gridCell3") .properties( new ComponentProperties() .children( Collections.singletonList( new Component() .name("button0") .type( ComponentType .BUTTON) .properties( new ComponentProperties() .isVisible( new ComponentPropertiesIsVisible( true))) .events( Collections .singletonList( new AppBuilderEvent() .name( AppBuilderEventName .CLICK) .type( AppBuilderEventType .SETSTATEVARIABLEVALUE))))) .isVisible( new ComponentPropertiesIsVisible( "true"))), new Component() .type(ComponentType.GRIDCELL) .name("gridCell4") .properties( new ComponentProperties() .children( Collections.singletonList( new Component() .name("button1") .type( ComponentType .BUTTON) .properties( new ComponentProperties() .isVisible( new ComponentPropertiesIsVisible( true))) .events( Collections .singletonList( new AppBuilderEvent() .name( AppBuilderEventName .CLICK) .type( AppBuilderEventType .SETSTATEVARIABLEVALUE))))) .isVisible( new ComponentPropertiesIsVisible( "true"))))) .backgroundColor("default")))) .queries( Arrays.asList( new Query( new ActionQuery() .id( UUID.fromString( "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5")) .type(ActionQueryType.ACTION) .name("fetchFacts") .properties( new ActionQueryProperties() .spec( new ActionQuerySpec( new ActionQuerySpecObject() .fqn("com.datadoghq.http.request") .connectionId( "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3") .inputs( new ActionQuerySpecInputs( Map.ofEntries( Map.entry( "verb", "GET"), Map.entry( "url", "https://catfact.ninja/facts"), Map.entry( "urlParams", "[{'key': 'limit'," + " 'value':" + " '${pageSize.value.toString()}'}," + " {'key':" + " 'page'," + " 'value':" + " '${(table0.pageIndex" + " + 1).toString()}'}]")))))))), new Query( new StateVariable() .type(StateVariableType.STATEVARIABLE) .name("pageSize") .properties( new StateVariableProperties().defaultValue("${20}")) .id( UUID.fromString( "afd03c81-4075-4432-8618-ba09d52d2f2d"))), new Query( new DataTransform() .type(DataTransformType.DATATRANSFORM) .name("randomFact") .properties( new DataTransformProperties() .outputs( """ ${(() => {const facts = fetchFacts.outputs.body.data return facts[Math.floor(Math.random()*facts.length)] })()} """)) .id( UUID.fromString( "0fb22859-47dc-4137-9e41-7b67d04c525c"))))) .name("Example Cat Facts Viewer") .description( "This is a slightly complicated example app that fetches and" + 
" displays cat facts"))); try { CreateAppResponse result = apiInstance.createApp(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#createApp"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create App returns "Created" response ``` """ Create App returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi from datadog_api_client.v2.model.action_query import ActionQuery from datadog_api_client.v2.model.action_query_properties import ActionQueryProperties from datadog_api_client.v2.model.action_query_spec_input import ActionQuerySpecInput from datadog_api_client.v2.model.action_query_spec_object import ActionQuerySpecObject from datadog_api_client.v2.model.action_query_type import ActionQueryType from datadog_api_client.v2.model.app_builder_event import AppBuilderEvent from datadog_api_client.v2.model.app_builder_event_name import AppBuilderEventName from datadog_api_client.v2.model.app_builder_event_type import AppBuilderEventType from datadog_api_client.v2.model.app_definition_type import AppDefinitionType from datadog_api_client.v2.model.component import Component from datadog_api_client.v2.model.component_grid import ComponentGrid from datadog_api_client.v2.model.component_grid_properties import ComponentGridProperties from datadog_api_client.v2.model.component_grid_type import ComponentGridType from datadog_api_client.v2.model.component_properties import ComponentProperties from datadog_api_client.v2.model.component_type import ComponentType from datadog_api_client.v2.model.create_app_request import CreateAppRequest from datadog_api_client.v2.model.create_app_request_data import CreateAppRequestData from datadog_api_client.v2.model.create_app_request_data_attributes import CreateAppRequestDataAttributes from datadog_api_client.v2.model.data_transform import DataTransform from datadog_api_client.v2.model.data_transform_properties import DataTransformProperties from datadog_api_client.v2.model.data_transform_type import DataTransformType from datadog_api_client.v2.model.state_variable import StateVariable from datadog_api_client.v2.model.state_variable_properties import StateVariableProperties from datadog_api_client.v2.model.state_variable_type import StateVariableType from uuid import UUID body = CreateAppRequest( data=CreateAppRequestData( type=AppDefinitionType.APPDEFINITIONS, attributes=CreateAppRequestDataAttributes( root_instance_name="grid0", components=[ ComponentGrid( name="grid0", type=ComponentGridType.GRID, properties=ComponentGridProperties( children=[ Component( type=ComponentType.GRIDCELL, name="gridCell0", properties=ComponentProperties( children=[ Component( name="text0", type=ComponentType.TEXT, properties=ComponentProperties( content="# Cat Facts", content_type="markdown", text_align="left", vertical_align="top", is_visible=True, ), events=[], ), ], is_visible="true", 
layout=dict([("default", "{'x': 0, 'y': 0, 'width': 4, 'height': 5}")]), ), events=[], ), Component( type=ComponentType.GRIDCELL, name="gridCell2", properties=ComponentProperties( children=[ Component( name="table0", type=ComponentType.TABLE, properties=ComponentProperties( data="${fetchFacts?.outputs?.body?.data}", columns=[ dict( [ ("dataPath", "fact"), ("header", "fact"), ("isHidden", "False"), ("id", "0ae2ae9e-0280-4389-83c6-1c5949f7e674"), ] ), dict( [ ("dataPath", "length"), ("header", "length"), ("isHidden", "True"), ("id", "c9048611-0196-4a00-9366-1ef9e3ec0408"), ] ), dict( [ ("id", "8fa9284b-7a58-4f13-9959-57b7d8a7fe8f"), ("dataPath", "Due Date"), ("header", "Unused Old Column"), ("disableSortBy", "False"), ( "formatter", "{'type': 'formatted_time', 'format': 'LARGE_WITHOUT_TIME'}", ), ("isDeleted", "True"), ] ), ], summary=True, page_size="${pageSize?.value}", pagination_type="server_side", is_loading="${fetchFacts?.isLoading}", row_buttons=[], is_wrappable=False, is_scrollable="vertical", is_sub_rows_enabled=False, global_filter=False, is_visible=True, total_count="${fetchFacts?.outputs?.body?.total}", ), events=[], ), ], is_visible="true", layout=dict([("default", "{'x': 0, 'y': 5, 'width': 12, 'height': 96}")]), ), events=[], ), Component( type=ComponentType.GRIDCELL, name="gridCell1", properties=ComponentProperties( children=[ Component( name="text1", type=ComponentType.TEXT, properties=ComponentProperties( content="## Random Fact\n\n${randomFact?.outputs?.fact}", content_type="markdown", text_align="left", vertical_align="top", is_visible=True, ), events=[], ), ], is_visible="true", layout=dict([("default", "{'x': 0, 'y': 101, 'width': 12, 'height': 16}")]), ), events=[], ), Component( type=ComponentType.GRIDCELL, name="gridCell3", properties=ComponentProperties( children=[ Component( name="button0", type=ComponentType.BUTTON, properties=ComponentProperties( label="Increase Page Size", level="default", is_primary=True, is_borderless=False, is_loading=False, is_disabled=False, is_visible=True, icon_left="angleUp", icon_right="", ), events=[ AppBuilderEvent( variable_name="pageSize", value="${pageSize?.value + 1}", name=AppBuilderEventName.CLICK, type=AppBuilderEventType.SETSTATEVARIABLEVALUE, ), ], ), ], is_visible="true", layout=dict([("default", "{'x': 10, 'y': 134, 'width': 2, 'height': 4}")]), ), events=[], ), Component( type=ComponentType.GRIDCELL, name="gridCell4", properties=ComponentProperties( children=[ Component( name="button1", type=ComponentType.BUTTON, properties=ComponentProperties( label="Decrease Page Size", level="default", is_primary=True, is_borderless=False, is_loading=False, is_disabled=False, is_visible=True, icon_left="angleDown", icon_right="", ), events=[ AppBuilderEvent( variable_name="pageSize", value="${pageSize?.value - 1}", name=AppBuilderEventName.CLICK, type=AppBuilderEventType.SETSTATEVARIABLEVALUE, ), ], ), ], is_visible="true", layout=dict([("default", "{'x': 10, 'y': 138, 'width': 2, 'height': 4}")]), ), events=[], ), ], background_color="default", ), events=[], ), ], queries=[ ActionQuery( id=UUID("92ff0bb8-553b-4f31-87c7-ef5bd16d47d5"), type=ActionQueryType.ACTION, name="fetchFacts", events=[], properties=ActionQueryProperties( spec=ActionQuerySpecObject( fqn="com.datadoghq.http.request", connection_id="5e63f4a8-4ce6-47de-ba11-f6617c1d54f3", inputs=ActionQuerySpecInput( [ ("verb", "GET"), ("url", "https://catfact.ninja/facts"), ( "urlParams", "[{'key': 'limit', 'value': '${pageSize.value.toString()}'}, {'key': 'page', 'value': 
'${(table0.pageIndex + 1).toString()}'}]", ), ] ), ), ), ), StateVariable( type=StateVariableType.STATEVARIABLE, name="pageSize", properties=StateVariableProperties( default_value="${20}", ), id=UUID("afd03c81-4075-4432-8618-ba09d52d2f2d"), ), DataTransform( type=DataTransformType.DATATRANSFORM, name="randomFact", properties=DataTransformProperties( outputs="${(() => {const facts = fetchFacts.outputs.body.data\nreturn facts[Math.floor(Math.random()*facts.length)]\n})()}", ), id=UUID("0fb22859-47dc-4137-9e41-7b67d04c525c"), ), ], name="Example Cat Facts Viewer", description="This is a slightly complicated example app that fetches and displays cat facts", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.create_app(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create App returns "Created" response ``` # Create App returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new body = DatadogAPIClient::V2::CreateAppRequest.new({ data: DatadogAPIClient::V2::CreateAppRequestData.new({ type: DatadogAPIClient::V2::AppDefinitionType::APPDEFINITIONS, attributes: DatadogAPIClient::V2::CreateAppRequestDataAttributes.new({ root_instance_name: "grid0", components: [ DatadogAPIClient::V2::ComponentGrid.new({ name: "grid0", type: DatadogAPIClient::V2::ComponentGridType::GRID, properties: DatadogAPIClient::V2::ComponentGridProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ type: DatadogAPIClient::V2::ComponentType::GRIDCELL, name: "gridCell0", properties: DatadogAPIClient::V2::ComponentProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ name: "text0", type: DatadogAPIClient::V2::ComponentType::TEXT, properties: DatadogAPIClient::V2::ComponentProperties.new({ is_visible: true, }), events: [], }), ], is_visible: "true", }), events: [], }), DatadogAPIClient::V2::Component.new({ type: DatadogAPIClient::V2::ComponentType::GRIDCELL, name: "gridCell2", properties: DatadogAPIClient::V2::ComponentProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ name: "table0", type: DatadogAPIClient::V2::ComponentType::TABLE, properties: DatadogAPIClient::V2::ComponentProperties.new({ is_visible: true, }), events: [], }), ], is_visible: "true", }), events: [], }), DatadogAPIClient::V2::Component.new({ type: DatadogAPIClient::V2::ComponentType::GRIDCELL, name: "gridCell1", properties: DatadogAPIClient::V2::ComponentProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ name: "text1", type: DatadogAPIClient::V2::ComponentType::TEXT, properties: DatadogAPIClient::V2::ComponentProperties.new({ is_visible: true, }), events: [], }), ], is_visible: "true", }), events: [], }), DatadogAPIClient::V2::Component.new({ type: DatadogAPIClient::V2::ComponentType::GRIDCELL, name: "gridCell3", properties: DatadogAPIClient::V2::ComponentProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ name: "button0", type: DatadogAPIClient::V2::ComponentType::BUTTON, properties: DatadogAPIClient::V2::ComponentProperties.new({ is_visible: true, }), events: [ 
DatadogAPIClient::V2::AppBuilderEvent.new({ name: DatadogAPIClient::V2::AppBuilderEventName::CLICK, type: DatadogAPIClient::V2::AppBuilderEventType::SETSTATEVARIABLEVALUE, }), ], }), ], is_visible: "true", }), events: [], }), DatadogAPIClient::V2::Component.new({ type: DatadogAPIClient::V2::ComponentType::GRIDCELL, name: "gridCell4", properties: DatadogAPIClient::V2::ComponentProperties.new({ children: [ DatadogAPIClient::V2::Component.new({ name: "button1", type: DatadogAPIClient::V2::ComponentType::BUTTON, properties: DatadogAPIClient::V2::ComponentProperties.new({ is_visible: true, }), events: [ DatadogAPIClient::V2::AppBuilderEvent.new({ name: DatadogAPIClient::V2::AppBuilderEventName::CLICK, type: DatadogAPIClient::V2::AppBuilderEventType::SETSTATEVARIABLEVALUE, }), ], }), ], is_visible: "true", }), events: [], }), ], background_color: "default", }), events: [], }), ], queries: [ DatadogAPIClient::V2::ActionQuery.new({ id: "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5", type: DatadogAPIClient::V2::ActionQueryType::ACTION, name: "fetchFacts", events: [], properties: DatadogAPIClient::V2::ActionQueryProperties.new({ spec: { "fqn": "com.datadoghq.http.request", "connectionId": "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3", "inputs": "{'verb': 'GET', 'url': 'https://catfact.ninja/facts', 'urlParams': [{'key': 'limit', 'value': '${pageSize.value.toString()}'}, {'key': 'page', 'value': '${(table0.pageIndex + 1).toString()}'}]}", }, }), }), DatadogAPIClient::V2::StateVariable.new({ type: DatadogAPIClient::V2::StateVariableType::STATEVARIABLE, name: "pageSize", properties: DatadogAPIClient::V2::StateVariableProperties.new({ default_value: "${20}", }), id: "afd03c81-4075-4432-8618-ba09d52d2f2d", }), DatadogAPIClient::V2::DataTransform.new({ type: DatadogAPIClient::V2::DataTransformType::DATATRANSFORM, name: "randomFact", properties: DatadogAPIClient::V2::DataTransformProperties.new({ outputs: '${(() => {const facts = fetchFacts.outputs.body.data\nreturn facts[Math.floor(Math.random()*facts.length)]\n})()}', }), id: "0fb22859-47dc-4137-9e41-7b67d04c525c", }), ], name: "Example Cat Facts Viewer", description: "This is a slightly complicated example app that fetches and displays cat facts", }), }), }) p api_instance.create_app(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create App returns "Created" response ``` // Create App returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; use datadog_api_client::datadogV2::model::ActionQuery; use datadog_api_client::datadogV2::model::ActionQueryProperties; use datadog_api_client::datadogV2::model::ActionQuerySpec; use datadog_api_client::datadogV2::model::ActionQuerySpecInputs; use datadog_api_client::datadogV2::model::ActionQuerySpecObject; use datadog_api_client::datadogV2::model::ActionQueryType; use datadog_api_client::datadogV2::model::AppBuilderEvent; use datadog_api_client::datadogV2::model::AppBuilderEventName; use datadog_api_client::datadogV2::model::AppBuilderEventType; use datadog_api_client::datadogV2::model::AppDefinitionType; use datadog_api_client::datadogV2::model::Component; use datadog_api_client::datadogV2::model::ComponentGrid; use 
datadog_api_client::datadogV2::model::ComponentGridProperties; use datadog_api_client::datadogV2::model::ComponentGridType; use datadog_api_client::datadogV2::model::ComponentProperties; use datadog_api_client::datadogV2::model::ComponentPropertiesIsVisible; use datadog_api_client::datadogV2::model::ComponentType; use datadog_api_client::datadogV2::model::CreateAppRequest; use datadog_api_client::datadogV2::model::CreateAppRequestData; use datadog_api_client::datadogV2::model::CreateAppRequestDataAttributes; use datadog_api_client::datadogV2::model::DataTransform; use datadog_api_client::datadogV2::model::DataTransformProperties; use datadog_api_client::datadogV2::model::DataTransformType; use datadog_api_client::datadogV2::model::Query; use datadog_api_client::datadogV2::model::StateVariable; use datadog_api_client::datadogV2::model::StateVariableProperties; use datadog_api_client::datadogV2::model::StateVariableType; use serde_json::Value; use std::collections::BTreeMap; use uuid::Uuid; #[tokio::main] async fn main() { let body = CreateAppRequest ::new().data( CreateAppRequestData::new( AppDefinitionType::APPDEFINITIONS, ).attributes( CreateAppRequestDataAttributes::new() .components( vec![ ComponentGrid::new( "grid0".to_string(), ComponentGridProperties::new() .background_color("default".to_string()) .children( vec![ Component::new( "gridCell0".to_string(), ComponentProperties::new() .children( vec![ Component::new( "text0".to_string(), ComponentProperties ::new().is_visible( ComponentPropertiesIsVisible::Bool(true), ), ComponentType::TEXT, ).events(vec![]) ], ) .is_visible( ComponentPropertiesIsVisible::String("true".to_string()), ), ComponentType::GRIDCELL, ).events(vec![]), Component::new( "gridCell2".to_string(), ComponentProperties::new() .children( vec![ Component::new( "table0".to_string(), ComponentProperties ::new().is_visible( ComponentPropertiesIsVisible::Bool(true), ), ComponentType::TABLE, ).events(vec![]) ], ) .is_visible( ComponentPropertiesIsVisible::String("true".to_string()), ), ComponentType::GRIDCELL, ).events(vec![]), Component::new( "gridCell1".to_string(), ComponentProperties::new() .children( vec![ Component::new( "text1".to_string(), ComponentProperties ::new().is_visible( ComponentPropertiesIsVisible::Bool(true), ), ComponentType::TEXT, ).events(vec![]) ], ) .is_visible( ComponentPropertiesIsVisible::String("true".to_string()), ), ComponentType::GRIDCELL, ).events(vec![]), Component::new( "gridCell3".to_string(), ComponentProperties::new() .children( vec![ Component::new( "button0".to_string(), ComponentProperties ::new().is_visible( ComponentPropertiesIsVisible::Bool(true), ), ComponentType::BUTTON, ).events( vec![ AppBuilderEvent::new() .name(AppBuilderEventName::CLICK) .type_( AppBuilderEventType::SETSTATEVARIABLEVALUE, ) ], ) ], ) .is_visible( ComponentPropertiesIsVisible::String("true".to_string()), ), ComponentType::GRIDCELL, ).events(vec![]), Component::new( "gridCell4".to_string(), ComponentProperties::new() .children( vec![ Component::new( "button1".to_string(), ComponentProperties ::new().is_visible( ComponentPropertiesIsVisible::Bool(true), ), ComponentType::BUTTON, ).events( vec![ AppBuilderEvent::new() .name(AppBuilderEventName::CLICK) .type_( AppBuilderEventType::SETSTATEVARIABLEVALUE, ) ], ) ], ) .is_visible( ComponentPropertiesIsVisible::String("true".to_string()), ), ComponentType::GRIDCELL, ).events(vec![]) ], ), ComponentGridType::GRID, ).events(vec![]) ], ) .description( "This is a slightly complicated example app that fetches and displays 
cat facts".to_string(), ) .name("Example Cat Facts Viewer".to_string()) .queries( vec![ Query::ActionQuery( Box::new( ActionQuery::new( Uuid::parse_str( "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5", ).expect("invalid UUID"), "fetchFacts".to_string(), ActionQueryProperties::new( ActionQuerySpec::ActionQuerySpecObject( Box::new( ActionQuerySpecObject::new( "com.datadoghq.http.request".to_string(), ) .connection_id( "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3".to_string(), ) .inputs( ActionQuerySpecInputs::ActionQuerySpecInput( BTreeMap::from( [ ("verb".to_string(), Value::from("GET")), ( "url".to_string(), Value::from( "https://catfact.ninja/facts", ), ), ], ), ), ), ), ), ), ActionQueryType::ACTION, ).events(vec![]), ), ), Query::StateVariable( Box::new( StateVariable::new( Uuid::parse_str( "afd03c81-4075-4432-8618-ba09d52d2f2d", ).expect("invalid UUID"), "pageSize".to_string(), StateVariableProperties::new().default_value(Value::from("${20}")), StateVariableType::STATEVARIABLE, ), ), ), Query::DataTransform( Box::new( DataTransform::new( Uuid::parse_str( "0fb22859-47dc-4137-9e41-7b67d04c525c", ).expect("invalid UUID"), "randomFact".to_string(), DataTransformProperties ::new().outputs( r#"${(() => {const facts = fetchFacts.outputs.body.data return facts[Math.floor(Math.random()*facts.length)] })()}"#.to_string(), ), DataTransformType::DATATRANSFORM, ), ), ) ], ) .root_instance_name("grid0".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.create_app(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create App returns "Created" response ``` /** * Create App returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); const params: v2.AppBuilderApiCreateAppRequest = { body: { data: { type: "appDefinitions", attributes: { rootInstanceName: "grid0", components: [ { name: "grid0", type: "grid", properties: { children: [ { type: "gridCell", name: "gridCell0", properties: { children: [ { name: "text0", type: "text", properties: { isVisible: true, }, events: [], }, ], isVisible: "true", }, events: [], }, { type: "gridCell", name: "gridCell2", properties: { children: [ { name: "table0", type: "table", properties: { isVisible: true, }, events: [], }, ], isVisible: "true", }, events: [], }, { type: "gridCell", name: "gridCell1", properties: { children: [ { name: "text1", type: "text", properties: { isVisible: true, }, events: [], }, ], isVisible: "true", }, events: [], }, { type: "gridCell", name: "gridCell3", properties: { children: [ { name: "button0", type: "button", properties: { isVisible: true, }, events: [ { name: "click", type: "setStateVariableValue", }, ], }, ], isVisible: "true", }, events: [], }, { type: "gridCell", name: "gridCell4", properties: { children: [ { name: "button1", type: "button", properties: { isVisible: true, }, events: [ { name: "click", type: "setStateVariableValue", }, ], }, ], isVisible: "true", }, events: [], }, ], backgroundColor: 
"default", }, events: [], }, ], queries: [ { id: "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5", type: "action", name: "fetchFacts", events: [], properties: { spec: { fqn: "com.datadoghq.http.request", connectionId: "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3", inputs: { verb: "GET", url: "https://catfact.ninja/facts", urlParams: "[{'key': 'limit', 'value': '${pageSize.value.toString()}'}, {'key': 'page', 'value': '${(table0.pageIndex + 1).toString()}'}]", }, }, }, }, { type: "stateVariable", name: "pageSize", properties: { defaultValue: "${20}", }, id: "afd03c81-4075-4432-8618-ba09d52d2f2d", }, { type: "dataTransform", name: "randomFact", properties: { outputs: "${(() => {const facts = fetchFacts.outputs.body.data\nreturn facts[Math.floor(Math.random()*facts.length)]\n})()}", }, id: "0fb22859-47dc-4137-9e41-7b67d04c525c", }, ], name: "Example Cat Facts Viewer", description: "This is a slightly complicated example app that fetches and displays cat facts", }, }, }, }; apiInstance .createApp(params) .then((data: v2.CreateAppResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Multiple Apps](https://docs.datadoghq.com/api/latest/app-builder/#delete-multiple-apps) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#delete-multiple-apps-v2) DELETE https://api.ap1.datadoghq.com/api/v2/app-builder/appshttps://api.ap2.datadoghq.com/api/v2/app-builder/appshttps://api.datadoghq.eu/api/v2/app-builder/appshttps://api.ddog-gov.com/api/v2/app-builder/appshttps://api.datadoghq.com/api/v2/app-builder/appshttps://api.us3.datadoghq.com/api/v2/app-builder/appshttps://api.us5.datadoghq.com/api/v2/app-builder/apps ### Overview Delete multiple apps in a single request from a list of app IDs. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `apps_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) Field Type Description data [object] An array of objects containing the IDs of the apps to delete. id [_required_] uuid The ID of the app to delete. type [_required_] enum The app definition type. 
Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": [ { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApps-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApps-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApps-403-v2) * [404](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApps-404-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApps-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after multiple apps are successfully deleted. Field Type Description data [object] An array of objects containing the IDs of the deleted apps. id [_required_] uuid The ID of the deleted app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": [ { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Delete Multiple Apps returns "OK" response Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } ] } EOF ``` ##### Delete Multiple Apps returns "OK" response ``` // Delete Multiple Apps returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) body := datadogV2.DeleteAppsRequest{ Data: []datadogV2.DeleteAppsRequestDataItems{ { Id: AppDataID, Type: datadogV2.APPDEFINITIONTYPE_APPDEFINITIONS, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.DeleteApps(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.DeleteApps`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.DeleteApps`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Multiple Apps returns "OK" response ``` // Delete Multiple Apps returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.AppDefinitionType; import com.datadog.api.client.v2.model.DeleteAppsRequest; import com.datadog.api.client.v2.model.DeleteAppsRequestDataItems; import com.datadog.api.client.v2.model.DeleteAppsResponse; import java.util.Collections; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); // there is a valid "app" in the system UUID APP_DATA_ID = null; try { APP_DATA_ID = UUID.fromString(System.getenv("APP_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } DeleteAppsRequest body = new DeleteAppsRequest() .data( Collections.singletonList( new DeleteAppsRequestDataItems() .id(APP_DATA_ID) .type(AppDefinitionType.APPDEFINITIONS))); try { DeleteAppsResponse result = apiInstance.deleteApps(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#deleteApps"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Multiple Apps returns "OK" response ``` """ Delete Multiple Apps returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi from datadog_api_client.v2.model.app_definition_type import AppDefinitionType from datadog_api_client.v2.model.delete_apps_request import DeleteAppsRequest from datadog_api_client.v2.model.delete_apps_request_data_items import DeleteAppsRequestDataItems # there is a valid "app" in the system APP_DATA_ID = environ["APP_DATA_ID"] body = DeleteAppsRequest( data=[ DeleteAppsRequestDataItems( id=APP_DATA_ID, type=AppDefinitionType.APPDEFINITIONS, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.delete_apps(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Multiple Apps returns "OK" response ``` # Delete Multiple Apps returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new # there is a valid "app" in the system APP_DATA_ID = ENV["APP_DATA_ID"] body = DatadogAPIClient::V2::DeleteAppsRequest.new({ data: [ 
DatadogAPIClient::V2::DeleteAppsRequestDataItems.new({ id: APP_DATA_ID, type: DatadogAPIClient::V2::AppDefinitionType::APPDEFINITIONS, }), ], }) p api_instance.delete_apps(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Multiple Apps returns "OK" response ``` // Delete Multiple Apps returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; use datadog_api_client::datadogV2::model::AppDefinitionType; use datadog_api_client::datadogV2::model::DeleteAppsRequest; use datadog_api_client::datadogV2::model::DeleteAppsRequestDataItems; #[tokio::main] async fn main() { // there is a valid "app" in the system let app_data_id = uuid::Uuid::parse_str(&std::env::var("APP_DATA_ID").unwrap()).expect("Invalid UUID"); let body = DeleteAppsRequest::new().data(vec![DeleteAppsRequestDataItems::new( app_data_id.clone(), AppDefinitionType::APPDEFINITIONS, )]); let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.delete_apps(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Multiple Apps returns "OK" response ``` /** * Delete Multiple Apps returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); // there is a valid "app" in the system const APP_DATA_ID = process.env.APP_DATA_ID as string; const params: v2.AppBuilderApiDeleteAppsRequest = { body: { data: [ { id: APP_DATA_ID, type: "appDefinitions", }, ], }, }; apiInstance .deleteApps(params) .then((data: v2.DeleteAppsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get App](https://docs.datadoghq.com/api/latest/app-builder/#get-app) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#get-app-v2) GET https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}https://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}https://api.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id} ### Overview Get the full definition of an app. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires all of the following permissions: * `apps_run` * `connections_read` ### Arguments #### Path Parameters Name Type Description app_id [_required_] string The ID of the app to retrieve. #### Query Strings Name Type Description version string The version number of the app to retrieve. If not specified, the latest version is returned. Version numbers start at 1 and increment with each update. The special values `latest` and `deployed` can be used to retrieve the latest version or the published version, respectively. ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-403-v2) * [404](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-404-v2) * [410](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-410-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#GetApp-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The full app definition response object. Field Type Description data object The data object containing the app definition. attributes [_required_] object The app definition attributes, such as name, description, and components. components [object] The UI components that make up the app. events [object] Events to listen for on the grid component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the grid component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this grid component. This name is also visible in the app editor. properties [_required_] object Properties of a grid component. backgroundColor string The background color of the grid. 
default: `default` children [object] The child components of the grid. events [object] Events to listen for on the UI component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the UI component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this UI component. This name is also visible in the app editor. properties [_required_] object Properties of a UI component. Different component types can have their own additional unique properties. See the [components documentation](https://docs.datadoghq.com/service_management/app_builder/components/) for more detail on each component type and its properties. children [object] The child components of the UI component. isVisible Whether the UI component is visible. If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. type [_required_] enum The UI component type. Allowed enum values: `table,textInput,textArea,button,text,select,modal,schemaForm,checkbox,tabs,vegaChart,radioButtons,numberInput,fileInput,jsonInput,gridCell,dateRangePicker,search,container,calloutValue` isVisible Whether the grid component and its children are visible. If a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 string Option 2 boolean default: `true` type [_required_] enum The grid component type. Allowed enum values: `grid` default: `grid` description string A human-readable description for the app. favorite boolean Whether the app is marked as a favorite by the current user. name string The name of the app. queries [ ] An array of queries, such as external actions and state variables, that the app uses. Option 1 object An action query. This query type is used to trigger an action, such as sending a HTTP request. events [object] Events to listen for downstream of the action query. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id [_required_] uuid The ID of the action query. name [_required_] string A unique identifier for this action query. This name is also used to access the query's result throughout the app. properties [_required_] object The properties of the action query. condition Whether to run this query. If specified, the query will only run if this condition evaluates to `true` in JavaScript and all other conditions are also met. Option 1 boolean Option 2 string debounceInMs The minimum time in milliseconds that must pass before the query can be triggered again. This is useful for preventing accidental double-clicks from triggering the query multiple times. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. mockedOutputs The mocked outputs of the action query. 
This is useful for testing the app without actually running the action. Option 1 string Option 2 object The mocked outputs of the action query. enabled [_required_] Whether to enable the mocked outputs for testing. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The mocked outputs of the action query, serialized as JSON. onlyTriggerManually Determines when this query is executed. If set to `false`, the query will run when the app loads and whenever any query arguments change. If set to `true`, the query will only run when manually triggered from elsewhere in the app. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The post-query transformation function, which is a JavaScript function that changes the query's `.outputs` property after the query's execution. pollingIntervalInMs If specified, the app will poll the query at the specified interval in milliseconds. The minimum polling interval is 15 seconds. The query will only poll when the app's browser tab is active. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. requiresConfirmation Whether to prompt the user to confirm this query before it runs. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. showToastOnError Whether to display a toast to the user when the query returns an error. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. spec [_required_] The definition of the action query. Option 1 string Option 2 object The action query spec object. connectionGroup object The connection group to use for an action query. id uuid The ID of the connection group. tags [string] The tags of the connection group. connectionId string The ID of the custom connection to use for this action query. fqn [_required_] string The fully qualified name of the action type. inputs The inputs to the action query. These are the values that are passed to the action when it is triggered. Option 1 string Option 2 object The inputs to the action query. See the [Actions Catalog](https://docs.datadoghq.com/actions/actions_catalog/) for more detail on each action and its inputs. type [_required_] enum The action query type. Allowed enum values: `action` default: `action` Option 2 object A data transformer, which is custom JavaScript code that executes and transforms data when its inputs change. id [_required_] uuid The ID of the data transformer. name [_required_] string A unique identifier for this data transformer. This name is also used to access the transformer's result throughout the app. properties [_required_] object The properties of the data transformer. outputs string A JavaScript function that returns the transformed data. type [_required_] enum The data transform type. Allowed enum values: `dataTransform` default: `dataTransform` Option 3 object A variable, which can be set and read by other components in the app. id [_required_] uuid The ID of the state variable. name [_required_] string A unique identifier for this state variable. This name is also used to access the variable's value throughout the app. properties [_required_] object The properties of the state variable. defaultValue The default value of the state variable. 
type [_required_] enum The state variable type. Allowed enum values: `stateVariable` default: `stateVariable` rootInstanceName string The name of the root component of the app. This must be a `grid` component that contains all other components. tags [string] A list of tags for the app, which can be used to filter apps. id [_required_] uuid The ID of the app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` included [object] Data on the version of the app that was published. attributes object The attributes object containing the version ID of the published app. app_version_id uuid The version ID of the app that was published. For an unpublished app, this is always the nil UUID (`00000000-0000-0000-0000-000000000000`). id uuid The deployment ID. meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Metadata of an app. created_at date-time Timestamp of when the app was created. deleted_at date-time Timestamp of when the app was deleted. org_id int64 The Datadog organization ID that owns the app. updated_at date-time Timestamp of when the app was last updated. updated_since_deployment boolean Whether the app was updated since it was last published. Published apps are pinned to a specific version and do not automatically update when the app is updated. user_id int64 The ID of the user who created the app. user_name string The name (or email address) of the user who created the app. user_uuid uuid The UUID of the user who created the app. version int64 The version number of the app. This starts at 1 and increments with each update. relationship object The app's publication relationship and custom connections. connections [object] Array of custom connections used by the app. attributes object The custom connection attributes. name string The name of the custom connection. onPremRunner object Information about the Private Action Runner used by the custom connection, if the custom connection is associated with a Private Action Runner. id string The Private Action Runner ID. url string The URL of the Private Action Runner. id uuid The ID of the custom connection. type enum The custom connection type. Allowed enum values: `custom_connections` default: `custom_connections` deployment object Information pointing to the app's publication status. data object Data object containing the deployment ID. id uuid The deployment ID. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. 
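To make the query fields above more concrete, the sketch below shows the approximate shape of one action query as it appears under `data.attributes.queries` in this response. It is an illustrative composite only: the field names come from the descriptions above and the values are borrowed from the Create App example earlier on this page, not from a real app; the full generated response example follows below.

```
# Illustrative shape of a single action query in the Get App response
# (data.attributes.queries). Values are borrowed from the Create App
# example earlier on this page; this is not an authoritative schema.
action_query = {
    "id": "92ff0bb8-553b-4f31-87c7-ef5bd16d47d5",  # UUID of the action query
    "type": "action",
    "name": "fetchFacts",  # used to access the query's result throughout the app
    "events": [],
    "properties": {
        # Run on load and whenever query arguments change.
        "onlyTriggerManually": False,
        # Poll every 30 seconds while the browser tab is active
        # (the documented minimum polling interval is 15 seconds).
        "pollingIntervalInMs": 30000,
        # Show a toast to the user if the query returns an error.
        "showToastOnError": True,
        "spec": {
            "fqn": "com.datadoghq.http.request",
            "connectionId": "5e63f4a8-4ce6-47de-ba11-f6617c1d54f3",
            "inputs": {"verb": "GET", "url": "https://catfact.ninja/facts"},
        },
    },
}
```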
``` { "data": { "attributes": { "components": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "string", "name": "", "properties": { "backgroundColor": "string", "children": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "string", "name": "", "properties": { "children": [], "isVisible": { "type": "undefined" } }, "type": "text" } ], "isVisible": { "type": "undefined" } }, "type": "grid" } ], "description": "string", "favorite": false, "name": "string", "queries": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "name": "fetchPendingOrders", "properties": { "condition": { "type": "undefined" }, "debounceInMs": { "example": "undefined", "format": "undefined", "type": "undefined" }, "mockedOutputs": { "type": "undefined" }, "onlyTriggerManually": { "type": "undefined" }, "outputs": "${((outputs) => {return outputs.body.data})(self.rawOutputs)}", "pollingIntervalInMs": { "example": "undefined", "format": "undefined", "minimum": "undefined", "type": "undefined" }, "requiresConfirmation": { "type": "undefined" }, "showToastOnError": { "type": "undefined" }, "spec": { "type": "" } }, "type": "action" } ], "rootInstanceName": "string", "tags": [ "service:webshop-backend", "team:webshop" ] }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" }, "included": [ { "attributes": { "app_version_id": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "type": "deployment" } ], "meta": { "created_at": "2019-09-19T10:00:00.000Z", "deleted_at": "2019-09-19T10:00:00.000Z", "org_id": "integer", "updated_at": "2019-09-19T10:00:00.000Z", "updated_since_deployment": false, "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "version": "integer" }, "relationship": { "connections": [ { "attributes": { "name": "string", "onPremRunner": { "id": "string", "url": "string" } }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "custom_connections" } ], "deployment": { "data": { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "deployment" }, "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" } } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Gone * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Get App Copy ``` # Path parameters export app_id="65bb1f25-52e1-4510-9f8d-22d1516ed693" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps/${app_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get App ``` """ Get App returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi # there is a valid "app" in the system APP_DATA_ID = environ["APP_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.get_app( app_id=APP_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get App ``` # Get App returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new # there is a valid "app" in the system APP_DATA_ID = ENV["APP_DATA_ID"] p api_instance.get_app(APP_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get App ``` // Get App returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.GetApp(ctx, AppDataID, *datadogV2.NewGetAppOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.GetApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.GetApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
* * *

## [Update App](https://docs.datadoghq.com/api/latest/app-builder/#update-app)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#update-app-v2)

PATCH

https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}
https://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}
https://api.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id}

### Overview

Update an existing app. This creates a new version of the app.

This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access).

This endpoint requires all of the following permissions:

* `apps_write`
* `connections_resolve`
* `workflows_run`
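Before the argument and request-body reference below, a hedged sketch of the partial-update behavior those fields describe: attributes omitted from the PATCH body are left unchanged, while list-valued fields such as `components`, `queries`, and `tags` are replaced wholesale when present. The `description=` and `tags=` keyword arguments are assumptions inferred from the documented attributes rather than copied from an official example; the tag values reuse the ones shown in the response examples.

```
"""
Sketch: a partial update that only touches description and tags.
Components and queries are omitted, so they remain unchanged; the tags list
provided here replaces any existing tags. The description= and tags= kwargs
are assumed from the documented attributes below.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.app_builder_api import AppBuilderApi
from datadog_api_client.v2.model.app_definition_type import AppDefinitionType
from datadog_api_client.v2.model.update_app_request import UpdateAppRequest
from datadog_api_client.v2.model.update_app_request_data import UpdateAppRequestData
from datadog_api_client.v2.model.update_app_request_data_attributes import UpdateAppRequestDataAttributes

APP_DATA_ID = environ["APP_DATA_ID"]

body = UpdateAppRequest(
    data=UpdateAppRequestData(
        attributes=UpdateAppRequestDataAttributes(
            description="Pending-order triage app",
            tags=["service:webshop-backend", "team:webshop"],
        ),
        id=APP_DATA_ID,  # must match the app_id in the URL path
        type=AppDefinitionType.APPDEFINITIONS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    response = AppBuilderApi(api_client).update_app(app_id=APP_DATA_ID, body=body)
    print(response)
```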
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update App](https://docs.datadoghq.com/api/latest/app-builder/#update-app) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#update-app-v2) PATCH https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}https://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}https://api.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}https://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id} ### Overview Update an existing app. This creates a new version of the app. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires all of the following permissions: * `apps_write` * `connections_resolve` * `workflows_run` ### Arguments #### Path Parameters Name Type Description app_id [_required_] string The ID of the app to update. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) Field Type Description data object The data object containing the new app definition. Any fields not included in the request remain unchanged. attributes object App definition attributes to be updated, such as name, description, and components. components [object] The new UI components that make up the app. If this field is set, all existing components are replaced with the new components under this field. events [object] Events to listen for on the grid component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the grid component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this grid component. This name is also visible in the app editor. properties [_required_] object Properties of a grid component. backgroundColor string The background color of the grid. default: `default` children [object] The child components of the grid. events [object] Events to listen for on the UI component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the UI component. 
This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this UI component. This name is also visible in the app editor. properties [_required_] object Properties of a UI component. Different component types can have their own additional unique properties. See the [components documentation](https://docs.datadoghq.com/service_management/app_builder/components/) for more detail on each component type and its properties. children [object] The child components of the UI component. isVisible Whether the UI component is visible. If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. type [_required_] enum The UI component type. Allowed enum values: `table,textInput,textArea,button,text,select,modal,schemaForm,checkbox,tabs,vegaChart,radioButtons,numberInput,fileInput,jsonInput,gridCell,dateRangePicker,search,container,calloutValue` isVisible Whether the grid component and its children are visible. If a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 string Option 2 boolean default: `true` type [_required_] enum The grid component type. Allowed enum values: `grid` default: `grid` description string The new human-readable description for the app. name string The new name of the app. queries [ ] The new array of queries, such as external actions and state variables, that the app uses. If this field is set, all existing queries are replaced with the new queries under this field. Option 1 object An action query. This query type is used to trigger an action, such as sending a HTTP request. events [object] Events to listen for downstream of the action query. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id [_required_] uuid The ID of the action query. name [_required_] string A unique identifier for this action query. This name is also used to access the query's result throughout the app. properties [_required_] object The properties of the action query. condition Whether to run this query. If specified, the query will only run if this condition evaluates to `true` in JavaScript and all other conditions are also met. Option 1 boolean Option 2 string debounceInMs The minimum time in milliseconds that must pass before the query can be triggered again. This is useful for preventing accidental double-clicks from triggering the query multiple times. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. mockedOutputs The mocked outputs of the action query. This is useful for testing the app without actually running the action. Option 1 string Option 2 object The mocked outputs of the action query. enabled [_required_] Whether to enable the mocked outputs for testing. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The mocked outputs of the action query, serialized as JSON. onlyTriggerManually Determines when this query is executed. 
If set to `false`, the query will run when the app loads and whenever any query arguments change. If set to `true`, the query will only run when manually triggered from elsewhere in the app. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The post-query transformation function, which is a JavaScript function that changes the query's `.outputs` property after the query's execution. pollingIntervalInMs If specified, the app will poll the query at the specified interval in milliseconds. The minimum polling interval is 15 seconds. The query will only poll when the app's browser tab is active. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. requiresConfirmation Whether to prompt the user to confirm this query before it runs. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. showToastOnError Whether to display a toast to the user when the query returns an error. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. spec [_required_] The definition of the action query. Option 1 string Option 2 object The action query spec object. connectionGroup object The connection group to use for an action query. id uuid The ID of the connection group. tags [string] The tags of the connection group. connectionId string The ID of the custom connection to use for this action query. fqn [_required_] string The fully qualified name of the action type. inputs The inputs to the action query. These are the values that are passed to the action when it is triggered. Option 1 string Option 2 object The inputs to the action query. See the [Actions Catalog](https://docs.datadoghq.com/actions/actions_catalog/) for more detail on each action and its inputs. type [_required_] enum The action query type. Allowed enum values: `action` default: `action` Option 2 object A data transformer, which is custom JavaScript code that executes and transforms data when its inputs change. id [_required_] uuid The ID of the data transformer. name [_required_] string A unique identifier for this data transformer. This name is also used to access the transformer's result throughout the app. properties [_required_] object The properties of the data transformer. outputs string A JavaScript function that returns the transformed data. type [_required_] enum The data transform type. Allowed enum values: `dataTransform` default: `dataTransform` Option 3 object A variable, which can be set and read by other components in the app. id [_required_] uuid The ID of the state variable. name [_required_] string A unique identifier for this state variable. This name is also used to access the variable's value throughout the app. properties [_required_] object The properties of the state variable. defaultValue The default value of the state variable. type [_required_] enum The state variable type. Allowed enum values: `stateVariable` default: `stateVariable` rootInstanceName string The new name of the root component of the app. This must be a `grid` component that contains all other components. tags [string] The new list of tags for the app, which can be used to filter apps. If this field is set, any existing tags not included in the request are removed. id uuid The ID of the app to update. The app ID must match the ID in the URL path. 
type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": { "attributes": { "name": "Updated Name", "rootInstanceName": "grid0" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#UpdateApp-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#UpdateApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#UpdateApp-403-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#UpdateApp-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after an app is successfully updated. Field Type Description data object The data object containing the updated app definition. attributes [_required_] object The updated app definition attributes, such as name, description, and components. components [object] The UI components that make up the app. events [object] Events to listen for on the grid component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the grid component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this grid component. This name is also visible in the app editor. properties [_required_] object Properties of a grid component. backgroundColor string The background color of the grid. default: `default` children [object] The child components of the grid. events [object] Events to listen for on the UI component. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id string The ID of the UI component. This property is deprecated; use `name` to identify individual components instead. name [_required_] string A unique identifier for this UI component. This name is also visible in the app editor. properties [_required_] object Properties of a UI component. Different component types can have their own additional unique properties. See the [components documentation](https://docs.datadoghq.com/service_management/app_builder/components/) for more detail on each component type and its properties. children [object] The child components of the UI component. isVisible Whether the UI component is visible. If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. type [_required_] enum The UI component type. Allowed enum values: `table,textInput,textArea,button,text,select,modal,schemaForm,checkbox,tabs,vegaChart,radioButtons,numberInput,fileInput,jsonInput,gridCell,dateRangePicker,search,container,calloutValue` isVisible Whether the grid component and its children are visible. If a string, it must be a valid JavaScript expression that evaluates to a boolean. 
Option 1 string Option 2 boolean default: `true` type [_required_] enum The grid component type. Allowed enum values: `grid` default: `grid` description string The human-readable description for the app. favorite boolean Whether the app is marked as a favorite by the current user. name string The name of the app. queries [ ] An array of queries, such as external actions and state variables, that the app uses. Option 1 object An action query. This query type is used to trigger an action, such as sending a HTTP request. events [object] Events to listen for downstream of the action query. name enum The triggering action for the event. Allowed enum values: `pageChange,tableRowClick,_tableRowButtonClick,change,submit,click,toggleOpen,close,open,executionFinished` type enum The response to the event. Allowed enum values: `custom,setComponentState,triggerQuery,openModal,closeModal,openUrl,downloadFile,setStateVariableValue` id [_required_] uuid The ID of the action query. name [_required_] string A unique identifier for this action query. This name is also used to access the query's result throughout the app. properties [_required_] object The properties of the action query. condition Whether to run this query. If specified, the query will only run if this condition evaluates to `true` in JavaScript and all other conditions are also met. Option 1 boolean Option 2 string debounceInMs The minimum time in milliseconds that must pass before the query can be triggered again. This is useful for preventing accidental double-clicks from triggering the query multiple times. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. mockedOutputs The mocked outputs of the action query. This is useful for testing the app without actually running the action. Option 1 string Option 2 object The mocked outputs of the action query. enabled [_required_] Whether to enable the mocked outputs for testing. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The mocked outputs of the action query, serialized as JSON. onlyTriggerManually Determines when this query is executed. If set to `false`, the query will run when the app loads and whenever any query arguments change. If set to `true`, the query will only run when manually triggered from elsewhere in the app. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. outputs string The post-query transformation function, which is a JavaScript function that changes the query's `.outputs` property after the query's execution. pollingIntervalInMs If specified, the app will poll the query at the specified interval in milliseconds. The minimum polling interval is 15 seconds. The query will only poll when the app's browser tab is active. Option 1 double Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a number. requiresConfirmation Whether to prompt the user to confirm this query before it runs. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. showToastOnError Whether to display a toast to the user when the query returns an error. Option 1 boolean Option 2 string If this is a string, it must be a valid JavaScript expression that evaluates to a boolean. spec [_required_] The definition of the action query. 
Option 1 string Option 2 object The action query spec object. connectionGroup object The connection group to use for an action query. id uuid The ID of the connection group. tags [string] The tags of the connection group. connectionId string The ID of the custom connection to use for this action query. fqn [_required_] string The fully qualified name of the action type. inputs The inputs to the action query. These are the values that are passed to the action when it is triggered. Option 1 string Option 2 object The inputs to the action query. See the [Actions Catalog](https://docs.datadoghq.com/actions/actions_catalog/) for more detail on each action and its inputs. type [_required_] enum The action query type. Allowed enum values: `action` default: `action` Option 2 object A data transformer, which is custom JavaScript code that executes and transforms data when its inputs change. id [_required_] uuid The ID of the data transformer. name [_required_] string A unique identifier for this data transformer. This name is also used to access the transformer's result throughout the app. properties [_required_] object The properties of the data transformer. outputs string A JavaScript function that returns the transformed data. type [_required_] enum The data transform type. Allowed enum values: `dataTransform` default: `dataTransform` Option 3 object A variable, which can be set and read by other components in the app. id [_required_] uuid The ID of the state variable. name [_required_] string A unique identifier for this state variable. This name is also used to access the variable's value throughout the app. properties [_required_] object The properties of the state variable. defaultValue The default value of the state variable. type [_required_] enum The state variable type. Allowed enum values: `stateVariable` default: `stateVariable` rootInstanceName string The name of the root component of the app. This must be a `grid` component that contains all other components. tags [string] A list of tags for the app, which can be used to filter apps. id [_required_] uuid The ID of the updated app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` included [object] Data on the version of the app that was published. attributes object The attributes object containing the version ID of the published app. app_version_id uuid The version ID of the app that was published. For an unpublished app, this is always the nil UUID (`00000000-0000-0000-0000-000000000000`). id uuid The deployment ID. meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Metadata of an app. created_at date-time Timestamp of when the app was created. deleted_at date-time Timestamp of when the app was deleted. org_id int64 The Datadog organization ID that owns the app. updated_at date-time Timestamp of when the app was last updated. updated_since_deployment boolean Whether the app was updated since it was last published. Published apps are pinned to a specific version and do not automatically update when the app is updated. user_id int64 The ID of the user who created the app. 
user_name string The name (or email address) of the user who created the app. user_uuid uuid The UUID of the user who created the app. version int64 The version number of the app. This starts at 1 and increments with each update. relationship object The app's publication relationship and custom connections. connections [object] Array of custom connections used by the app. attributes object The custom connection attributes. name string The name of the custom connection. onPremRunner object Information about the Private Action Runner used by the custom connection, if the custom connection is associated with a Private Action Runner. id string The Private Action Runner ID. url string The URL of the Private Action Runner. id uuid The ID of the custom connection. type enum The custom connection type. Allowed enum values: `custom_connections` default: `custom_connections` deployment object Information pointing to the app's publication status. data object Data object containing the deployment ID. id uuid The deployment ID. type enum The deployment type. Allowed enum values: `deployment` default: `deployment` meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. ``` { "data": { "attributes": { "components": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "string", "name": "", "properties": { "backgroundColor": "string", "children": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "string", "name": "", "properties": { "children": [], "isVisible": { "type": "undefined" } }, "type": "text" } ], "isVisible": { "type": "undefined" } }, "type": "grid" } ], "description": "string", "favorite": false, "name": "string", "queries": [ { "events": [ { "name": "click", "type": "triggerQuery" } ], "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "name": "fetchPendingOrders", "properties": { "condition": { "type": "undefined" }, "debounceInMs": { "example": "undefined", "format": "undefined", "type": "undefined" }, "mockedOutputs": { "type": "undefined" }, "onlyTriggerManually": { "type": "undefined" }, "outputs": "${((outputs) => {return outputs.body.data})(self.rawOutputs)}", "pollingIntervalInMs": { "example": "undefined", "format": "undefined", "minimum": "undefined", "type": "undefined" }, "requiresConfirmation": { "type": "undefined" }, "showToastOnError": { "type": "undefined" }, "spec": { "type": "" } }, "type": "action" } ], "rootInstanceName": "string", "tags": [ "service:webshop-backend", "team:webshop" ] }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" }, "included": [ { "attributes": { "app_version_id": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "type": "deployment" } ], "meta": { "created_at": "2019-09-19T10:00:00.000Z", "deleted_at": "2019-09-19T10:00:00.000Z", "org_id": "integer", "updated_at": "2019-09-19T10:00:00.000Z", "updated_since_deployment": false, "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "version": "integer" }, "relationship": { "connections": [ { "attributes": { "name": "string", "onPremRunner": { 
"id": "string", "url": "string" } }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "custom_connections" } ], "deployment": { "data": { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "deployment" }, "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" } } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Update App returns "OK" response Copy ``` # Path parameters export app_id="65bb1f25-52e1-4510-9f8d-22d1516ed693" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps/${app_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Updated Name", "rootInstanceName": "grid0" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } } EOF ``` ##### Update App returns "OK" response ``` // Update App returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) body := datadogV2.UpdateAppRequest{ Data: &datadogV2.UpdateAppRequestData{ Attributes: &datadogV2.UpdateAppRequestDataAttributes{ Name: datadog.PtrString("Updated Name"), RootInstanceName: datadog.PtrString("grid0"), }, Id: &AppDataID, Type: datadogV2.APPDEFINITIONTYPE_APPDEFINITIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.UpdateApp(ctx, AppDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.UpdateApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.UpdateApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update App returns "OK" response ``` // Update App returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.AppDefinitionType; import com.datadog.api.client.v2.model.UpdateAppRequest; import com.datadog.api.client.v2.model.UpdateAppRequestData; import com.datadog.api.client.v2.model.UpdateAppRequestDataAttributes; import com.datadog.api.client.v2.model.UpdateAppResponse; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
##### Update App returns "OK" response

```
// Update App returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.AppBuilderApi;
import com.datadog.api.client.v2.model.AppDefinitionType;
import com.datadog.api.client.v2.model.UpdateAppRequest;
import com.datadog.api.client.v2.model.UpdateAppRequestData;
import com.datadog.api.client.v2.model.UpdateAppRequestDataAttributes;
import com.datadog.api.client.v2.model.UpdateAppResponse;
import java.util.UUID;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    AppBuilderApi apiInstance = new AppBuilderApi(defaultClient);

    // there is a valid "app" in the system
    UUID APP_DATA_ID = null;
    try {
      APP_DATA_ID = UUID.fromString(System.getenv("APP_DATA_ID"));
    } catch (IllegalArgumentException e) {
      System.err.println("Error parsing UUID: " + e.getMessage());
    }

    UpdateAppRequest body =
        new UpdateAppRequest()
            .data(
                new UpdateAppRequestData()
                    .attributes(
                        new UpdateAppRequestDataAttributes()
                            .name("Updated Name")
                            .rootInstanceName("grid0"))
                    .id(APP_DATA_ID)
                    .type(AppDefinitionType.APPDEFINITIONS));

    try {
      UpdateAppResponse result = apiInstance.updateApp(APP_DATA_ID, body);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling AppBuilderApi#updateApp");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Update App returns "OK" response

```
"""
Update App returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.app_builder_api import AppBuilderApi
from datadog_api_client.v2.model.app_definition_type import AppDefinitionType
from datadog_api_client.v2.model.update_app_request import UpdateAppRequest
from datadog_api_client.v2.model.update_app_request_data import UpdateAppRequestData
from datadog_api_client.v2.model.update_app_request_data_attributes import UpdateAppRequestDataAttributes

# there is a valid "app" in the system
APP_DATA_ID = environ["APP_DATA_ID"]

body = UpdateAppRequest(
    data=UpdateAppRequestData(
        attributes=UpdateAppRequestDataAttributes(
            name="Updated Name",
            root_instance_name="grid0",
        ),
        id=APP_DATA_ID,
        type=AppDefinitionType.APPDEFINITIONS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AppBuilderApi(api_client)
    response = api_instance.update_app(app_id=APP_DATA_ID, body=body)
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
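Each successful update creates a new version of the app, so `meta.version` in the response increments, and for a published app `updated_since_deployment` reports whether there are changes since the last publication (the published deployment stays pinned to its version). A hedged follow-up to the Python example above that surfaces those fields; it assumes the response model mirrors the documented JSON and exposes `to_dict()`, like other generated Datadog Python models.

```
"""
Sketch: update an app's name, then read the version bookkeeping off the response.
Assumes the UpdateAppResponse model mirrors the documented JSON and exposes
to_dict(), like other generated Datadog Python models.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.app_builder_api import AppBuilderApi
from datadog_api_client.v2.model.app_definition_type import AppDefinitionType
from datadog_api_client.v2.model.update_app_request import UpdateAppRequest
from datadog_api_client.v2.model.update_app_request_data import UpdateAppRequestData
from datadog_api_client.v2.model.update_app_request_data_attributes import UpdateAppRequestDataAttributes

APP_DATA_ID = environ["APP_DATA_ID"]

body = UpdateAppRequest(
    data=UpdateAppRequestData(
        attributes=UpdateAppRequestDataAttributes(name="Updated Name"),
        id=APP_DATA_ID,
        type=AppDefinitionType.APPDEFINITIONS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    updated = AppBuilderApi(api_client).update_app(app_id=APP_DATA_ID, body=body).to_dict()
    meta = updated.get("meta", {})
    # Each successful update increments the app's version number.
    print("version:", meta.get("version"))
    # True when the app has changed since it was last published.
    print("updated_since_deployment:", meta.get("updated_since_deployment"))
```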
##### Update App returns "OK" response

```
# Update App returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::AppBuilderAPI.new

# there is a valid "app" in the system
APP_DATA_ID = ENV["APP_DATA_ID"]

body = DatadogAPIClient::V2::UpdateAppRequest.new({
  data: DatadogAPIClient::V2::UpdateAppRequestData.new({
    attributes: DatadogAPIClient::V2::UpdateAppRequestDataAttributes.new({
      name: "Updated Name",
      root_instance_name: "grid0",
    }),
    id: APP_DATA_ID,
    type: DatadogAPIClient::V2::AppDefinitionType::APPDEFINITIONS,
  }),
})
p api_instance.update_app(APP_DATA_ID, body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Update App returns "OK" response

```
// Update App returns "OK" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI;
use datadog_api_client::datadogV2::model::AppDefinitionType;
use datadog_api_client::datadogV2::model::UpdateAppRequest;
use datadog_api_client::datadogV2::model::UpdateAppRequestData;
use datadog_api_client::datadogV2::model::UpdateAppRequestDataAttributes;

#[tokio::main]
async fn main() {
    // there is a valid "app" in the system
    let app_data_id =
        uuid::Uuid::parse_str(&std::env::var("APP_DATA_ID").unwrap()).expect("Invalid UUID");
    let body = UpdateAppRequest::new().data(
        UpdateAppRequestData::new(AppDefinitionType::APPDEFINITIONS)
            .attributes(
                UpdateAppRequestDataAttributes::new()
                    .name("Updated Name".to_string())
                    .root_instance_name("grid0".to_string()),
            )
            .id(app_data_id.clone()),
    );
    let configuration = datadog::Configuration::new();
    let api = AppBuilderAPI::with_config(configuration);
    let resp = api.update_app(app_data_id.clone(), body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Update App returns "OK" response

```
/**
 * Update App returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.AppBuilderApi(configuration);

// there is a valid "app" in the system
const APP_DATA_ID = process.env.APP_DATA_ID as string;

const params: v2.AppBuilderApiUpdateAppRequest = {
  body: {
    data: {
      attributes: {
        name: "Updated Name",
        rootInstanceName: "grid0",
      },
      id: APP_DATA_ID,
      type: "appDefinitions",
    },
  },
  appId: APP_DATA_ID,
};

apiInstance
  .updateApp(params)
  .then((data: v2.UpdateAppResponse) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Delete App](https://docs.datadoghq.com/api/latest/app-builder/#delete-app)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#delete-app-v2)

DELETE

https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}
https://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}
https://api.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}
https://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id}

### Overview

Delete a single app.
This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `apps_write` permission. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string The ID of the app to delete. ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-403-v2) * [404](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-404-v2) * [410](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-410-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#DeleteApp-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after an app is successfully deleted. Field Type Description data object The definition of `DeleteAppResponseData` object. id [_required_] uuid The ID of the deleted app. type [_required_] enum The app definition type. Allowed enum values: `appDefinitions` default: `appDefinitions` ``` { "data": { "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "type": "appDefinitions" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Gone * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Delete App Copy ``` # Path parameters export app_id="65bb1f25-52e1-4510-9f8d-22d1516ed693" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps/${app_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete App ``` """ Delete App returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi # there is a valid "app" in the system APP_DATA_ID = environ["APP_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.delete_app( app_id=APP_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete App ``` # Delete App returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new # there is a valid "app" in the system APP_DATA_ID = ENV["APP_DATA_ID"] p api_instance.delete_app(APP_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete App ``` // Delete App returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.DeleteApp(ctx, AppDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.DeleteApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.DeleteApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete App ``` // Delete App returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.DeleteAppResponse; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); // there is a valid "app" in the system UUID APP_DATA_ID = null; try { APP_DATA_ID = UUID.fromString(System.getenv("APP_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { DeleteAppResponse result = apiInstance.deleteApp(APP_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#deleteApp"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete App ``` // Delete App returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; #[tokio::main] async fn main() { // there is a valid "app" in the system let app_data_id = uuid::Uuid::parse_str(&std::env::var("APP_DATA_ID").unwrap()).expect("Invalid UUID"); let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.delete_app(app_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete App ``` /** * Delete App returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); // there is a valid "app" in the system const APP_DATA_ID = process.env.APP_DATA_ID as string; const params: v2.AppBuilderApiDeleteAppRequest = { appId: APP_DATA_ID, }; apiInstance .deleteApp(params) .then((data: v2.DeleteAppResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Publish App](https://docs.datadoghq.com/api/latest/app-builder/#publish-app) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#publish-app-v2) POST https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id}/deployment ### Overview Publish an app for use by other users. To ensure the app is accessible to the correct users, you also need to set a [Restriction Policy](https://docs.datadoghq.com/api/latest/restriction-policies/) on the app if a policy does not yet exist. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `apps_write` permission. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string The ID of the app to publish. ### Response * [201](https://docs.datadoghq.com/api/latest/app-builder/#PublishApp-201-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#PublishApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#PublishApp-403-v2) * [404](https://docs.datadoghq.com/api/latest/app-builder/#PublishApp-404-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#PublishApp-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after an app is successfully published. Field Type Description data object The version of the app that was published. attributes object The attributes object containing the version ID of the published app. app_version_id uuid The version ID of the app that was published. For an unpublished app, this is always the nil UUID (`00000000-0000-0000-0000-000000000000`). id uuid The deployment ID. meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. type enum The deployment type. 
Allowed enum values: `deployment` default: `deployment` ``` { "data": { "attributes": { "app_version_id": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "type": "deployment" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Publish App Copy ``` # Path parameters export app_id="65bb1f25-52e1-4510-9f8d-22d1516ed693" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps/${app_id}/deployment" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Publish App ``` """ Publish App returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi # there is a valid "app" in the system APP_DATA_ID = environ["APP_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.publish_app( app_id=APP_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Publish App ``` # Publish App returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new # there is a valid "app" in the system APP_DATA_ID = ENV["APP_DATA_ID"] p api_instance.publish_app(APP_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Publish App ``` // Publish App returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.PublishApp(ctx, AppDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.PublishApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.PublishApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Publish App ``` // Publish App returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.PublishAppResponse; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); // there is a valid "app" in the system UUID APP_DATA_ID = null; try { APP_DATA_ID = UUID.fromString(System.getenv("APP_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { PublishAppResponse result = apiInstance.publishApp(APP_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#publishApp"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Publish App ``` // Publish App returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; #[tokio::main] async fn main() { // there is a valid "app" in the system let app_data_id = uuid::Uuid::parse_str(&std::env::var("APP_DATA_ID").unwrap()).expect("Invalid UUID"); let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.publish_app(app_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Publish App ``` /** * Publish App returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); // there is a valid "app" in the system const APP_DATA_ID = 
process.env.APP_DATA_ID as string; const params: v2.AppBuilderApiPublishAppRequest = { appId: APP_DATA_ID, }; apiInstance .publishApp(params) .then((data: v2.PublishAppResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unpublish App](https://docs.datadoghq.com/api/latest/app-builder/#unpublish-app) * [v2 (latest)](https://docs.datadoghq.com/api/latest/app-builder/#unpublish-app-v2) DELETE https://api.ap1.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.ap2.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.datadoghq.eu/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.ddog-gov.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.us3.datadoghq.com/api/v2/app-builder/apps/{app_id}/deploymenthttps://api.us5.datadoghq.com/api/v2/app-builder/apps/{app_id}/deployment ### Overview Unpublish an app, removing the live version of the app. Unpublishing creates a new instance of a `deployment` object on the app, with a nil `app_version_id` (`00000000-0000-0000-0000-000000000000`). The app can still be updated and published again in the future. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `apps_write` permission. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string The ID of the app to unpublish. ### Response * [200](https://docs.datadoghq.com/api/latest/app-builder/#UnpublishApp-200-v2) * [400](https://docs.datadoghq.com/api/latest/app-builder/#UnpublishApp-400-v2) * [403](https://docs.datadoghq.com/api/latest/app-builder/#UnpublishApp-403-v2) * [404](https://docs.datadoghq.com/api/latest/app-builder/#UnpublishApp-404-v2) * [429](https://docs.datadoghq.com/api/latest/app-builder/#UnpublishApp-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) The response object after an app is successfully unpublished. Field Type Description data object The version of the app that was published. attributes object The attributes object containing the version ID of the published app. app_version_id uuid The version ID of the app that was published. For an unpublished app, this is always the nil UUID (`00000000-0000-0000-0000-000000000000`). id uuid The deployment ID. meta object Metadata object containing the publication creation information. created_at date-time Timestamp of when the app was published. user_id int64 The ID of the user who published the app. user_name string The name (or email address) of the user who published the app. user_uuid uuid The UUID of the user who published the app. type enum The deployment type. 
Allowed enum values: `deployment` default: `deployment` ``` { "data": { "attributes": { "app_version_id": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "id": "65bb1f25-52e1-4510-9f8d-22d1516ed693", "meta": { "created_at": "2019-09-19T10:00:00.000Z", "user_id": "integer", "user_name": "string", "user_uuid": "65bb1f25-52e1-4510-9f8d-22d1516ed693" }, "type": "deployment" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/app-builder/) * [Example](https://docs.datadoghq.com/api/latest/app-builder/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/app-builder/?code-lang=typescript) ##### Unpublish App Copy ``` # Path parameters export app_id="65bb1f25-52e1-4510-9f8d-22d1516ed693" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/app-builder/apps/${app_id}/deployment" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Unpublish App ``` """ Unpublish App returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.app_builder_api import AppBuilderApi # there is a valid "app" in the system APP_DATA_ID = environ["APP_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AppBuilderApi(api_client) response = api_instance.unpublish_app( app_id=APP_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Unpublish App ``` # Unpublish App returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AppBuilderAPI.new # there is a valid "app" in the system APP_DATA_ID = ENV["APP_DATA_ID"] p api_instance.unpublish_app(APP_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Unpublish App ``` // Unpublish App returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "app" in the system AppDataID := uuid.MustParse(os.Getenv("APP_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewAppBuilderApi(apiClient) resp, r, err := api.UnpublishApp(ctx, AppDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AppBuilderApi.UnpublishApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AppBuilderApi.UnpublishApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Unpublish App ``` // Unpublish App returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AppBuilderApi; import com.datadog.api.client.v2.model.UnpublishAppResponse; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AppBuilderApi apiInstance = new AppBuilderApi(defaultClient); // there is a valid "app" in the system UUID APP_DATA_ID = null; try { APP_DATA_ID = UUID.fromString(System.getenv("APP_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { UnpublishAppResponse result = apiInstance.unpublishApp(APP_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AppBuilderApi#unpublishApp"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Unpublish App ``` // Unpublish App returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_app_builder::AppBuilderAPI; #[tokio::main] async fn main() { // there is a valid "app" in the system let app_data_id = uuid::Uuid::parse_str(&std::env::var("APP_DATA_ID").unwrap()).expect("Invalid UUID"); let configuration = datadog::Configuration::new(); let api = AppBuilderAPI::with_config(configuration); let resp = api.unpublish_app(app_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Unpublish App ``` /** * Unpublish App returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AppBuilderApi(configuration); // there is a valid "app" in the system const APP_DATA_ID = 
process.env.APP_DATA_ID as string; const params: v2.AppBuilderApiUnpublishAppRequest = { appId: APP_DATA_ID, }; apiInstance .unpublishApp(params) .then((data: v2.UnpublishAppResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/application-security/ # Application Security [Datadog Application Security](https://docs.datadoghq.com/security/application_security/) provides protection against application-level attacks that aim to exploit code-level vulnerabilities, such as Server-Side-Request-Forgery (SSRF), SQL injection, Log4Shell, and Reflected Cross-Site-Scripting (XSS). You can monitor and protect apps hosted directly on a server, Docker, Kubernetes, Amazon ECS, and (for supported languages) AWS Fargate.
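The two sections below document the WAF exclusion filter (passlist) endpoints. As a quick orientation before the field-by-field reference, here is a minimal sketch of calling the create endpoint directly over HTTP with Python's `requests` package rather than the official `datadog-api-client` library. The endpoint path, headers, and body fields (`description`, `enabled`, `ip_list`, `type`) come from the reference below; the site URL, the description text, and the IP range are placeholder assumptions.

```
# Minimal sketch (not from the reference): create an IP-based WAF exclusion
# filter (passlist entry) by calling the documented HTTP endpoint directly
# with the `requests` package instead of the official datadog-api-client.
# Assumes DD_API_KEY and DD_APP_KEY are set; the site URL, description, and
# IP range below are placeholders.
import os

import requests

DD_SITE_URL = "https://api.datadoghq.com"  # adjust for your Datadog site

body = {
    "data": {
        "type": "exclusion_filter",
        "attributes": {
            "description": "Passlist an internal scanner",  # hypothetical
            "enabled": True,
            # Documented optional field: client IPs matched by the filter
            # (CIDR notation is supported).
            "ip_list": ["203.0.113.0/24"],
        },
    }
}

resp = requests.post(
    f"{DD_SITE_URL}/api/v2/remote_config/products/asm/waf/exclusion_filters",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=body,
)
resp.raise_for_status()
# The response carries the generated filter identifier, which can be passed
# to the GET endpoint documented in the next section.
print(resp.json()["data"]["id"])
```

The official client examples in the sections below express the same request through typed models; `ip_list` can also be combined with `path_glob`, `parameters`, or `scope` entries, which are described field by field in the request schema.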
## [Get a WAF exclusion filter](https://docs.datadoghq.com/api/latest/application-security/#get-a-waf-exclusion-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#get-a-waf-exclusion-filter-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id} ### Overview Retrieve a specific WAF exclusion filter using its identifier. This endpoint requires the `appsec_protect_read` permission. ### Arguments #### Path Parameters Name Type Description exclusion_filter_id [_required_] string The identifier of the WAF exclusion filter. ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafExclusionFilter-200-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafExclusionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafExclusionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafExclusionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object for a single WAF exclusion filter. Field Type Description data object A JSON:API resource for an WAF exclusion filter. attributes object Attributes describing a WAF exclusion filter. description string A description for the exclusion filter. enabled boolean Indicates whether the exclusion filter is enabled. event_query string The event query matched by the legacy exclusion filter. Cannot be created nor updated. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). metadata object Extra information about the exclusion filter. added_at date-time The creation date of the exclusion filter. added_by string The handle of the user who created the exclusion filter. added_by_name string The name of the user who created the exclusion filter. modified_at date-time The last modification date of the exclusion filter. modified_by string The handle of the user who last modified the exclusion filter. modified_by_name string The name of the user who last modified the exclusion filter. on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. rules_target [object] The WAF rules targeted by the exclusion filter. 
rule_id string Target a single WAF rule based on its identifier. tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. env string Deploy on this environment. service string Deploy on this service. search_query string Generated event search query for traces matching the exclusion filter. id string The identifier of the WAF exclusion filter. type enum Type of the resource. The value should always be `exclusion_filter`. Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "event_query": "string", "ip_list": [ "198.51.100.72" ], "metadata": { "added_at": "2019-09-19T10:00:00.000Z", "added_by": "string", "added_by_name": "string", "modified_at": "2019-09-19T10:00:00.000Z", "modified_by": "string", "modified_by_name": "string" }, "on_match": "string", "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "rule_id": "dog-913-009", "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ], "search_query": "string" }, "id": "3dd-0uc-h1s", "type": "exclusion_filter" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Get a WAF exclusion filter Copy ``` # Path parameters export exclusion_filter_id="3b5-v82-ns6" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/${exclusion_filter_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a WAF exclusion filter ``` """ Get a WAF exclusion filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = environ["EXCLUSION_FILTER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.get_application_security_waf_exclusion_filter( exclusion_filter_id=EXCLUSION_FILTER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a WAF exclusion filter ``` # Get a WAF exclusion filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = ENV["EXCLUSION_FILTER_DATA_ID"] p api_instance.get_application_security_waf_exclusion_filter(EXCLUSION_FILTER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a WAF exclusion filter ``` // Get a WAF exclusion filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "exclusion_filter" in the system ExclusionFilterDataID := os.Getenv("EXCLUSION_FILTER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, 
err := api.GetApplicationSecurityWafExclusionFilter(ctx, ExclusionFilterDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.GetApplicationSecurityWafExclusionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.GetApplicationSecurityWafExclusionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a WAF exclusion filter ``` // Get a WAF exclusion filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); // there is a valid "exclusion_filter" in the system String EXCLUSION_FILTER_DATA_ID = System.getenv("EXCLUSION_FILTER_DATA_ID"); try { ApplicationSecurityWafExclusionFilterResponse result = apiInstance.getApplicationSecurityWafExclusionFilter(EXCLUSION_FILTER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#getApplicationSecurityWafExclusionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a WAF exclusion filter ``` // Get a WAF exclusion filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { // there is a valid "exclusion_filter" in the system let exclusion_filter_data_id = std::env::var("EXCLUSION_FILTER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .get_application_security_waf_exclusion_filter(exclusion_filter_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a WAF exclusion filter ``` /** * Get a WAF exclusion filter returns "OK" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); // there is a valid "exclusion_filter" in the system const EXCLUSION_FILTER_DATA_ID = process.env.EXCLUSION_FILTER_DATA_ID as string; const params: v2.ApplicationSecurityApiGetApplicationSecurityWafExclusionFilterRequest = { exclusionFilterId: EXCLUSION_FILTER_DATA_ID, }; apiInstance .getApplicationSecurityWafExclusionFilter(params) .then((data: v2.ApplicationSecurityWafExclusionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a WAF exclusion filter](https://docs.datadoghq.com/api/latest/application-security/#create-a-waf-exclusion-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#create-a-waf-exclusion-filter-v2) POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters ### Overview Create a new WAF exclusion filter with the given parameters. A request matched by an exclusion filter will be ignored by the Application Security WAF product. Go to to review existing exclusion filters (also called passlist entries). This endpoint requires the `appsec_protect_write` permission. ### Request #### Body Data (required) The definition of the new WAF exclusion filter. * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Field Type Description data [_required_] object Object for creating a single WAF exclusion filter. attributes [_required_] object Attributes for creating a WAF exclusion filter. description [_required_] string A description for the exclusion filter. enabled [_required_] boolean Indicates whether the exclusion filter is enabled. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. rules_target [object] The WAF rules targeted by the exclusion filter. rule_id string Target a single WAF rule based on its identifier. 
tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. env string Deploy on this environment. service string Deploy on this service. type [_required_] enum Type of the resource. The value should always be `exclusion_filter`. Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ] }, "type": "exclusion_filter" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafExclusionFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafExclusionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafExclusionFilter-403-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafExclusionFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafExclusionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object for a single WAF exclusion filter. Field Type Description data object A JSON:API resource for an WAF exclusion filter. attributes object Attributes describing a WAF exclusion filter. description string A description for the exclusion filter. enabled boolean Indicates whether the exclusion filter is enabled. event_query string The event query matched by the legacy exclusion filter. Cannot be created nor updated. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). metadata object Extra information about the exclusion filter. added_at date-time The creation date of the exclusion filter. added_by string The handle of the user who created the exclusion filter. added_by_name string The name of the user who created the exclusion filter. modified_at date-time The last modification date of the exclusion filter. modified_by string The handle of the user who last modified the exclusion filter. modified_by_name string The name of the user who last modified the exclusion filter. on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. rules_target [object] The WAF rules targeted by the exclusion filter. rule_id string Target a single WAF rule based on its identifier. tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. 
env string Deploy on this environment. service string Deploy on this service. search_query string Generated event search query for traces matching the exclusion filter. id string The identifier of the WAF exclusion filter. type enum Type of the resource. The value should always be `exclusion_filter`. Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "event_query": "string", "ip_list": [ "198.51.100.72" ], "metadata": { "added_at": "2019-09-19T10:00:00.000Z", "added_by": "string", "added_by_name": "string", "modified_at": "2019-09-19T10:00:00.000Z", "modified_by": "string", "modified_by_name": "string" }, "on_match": "string", "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "rule_id": "dog-913-009", "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ], "search_query": "string" }, "id": "3dd-0uc-h1s", "type": "exclusion_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Create a WAF exclusion filter returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ] }, "type": "exclusion_filter" } } EOF ``` ##### Create a WAF exclusion filter returns "OK" response ``` // Create a WAF exclusion filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ApplicationSecurityWafExclusionFilterCreateRequest{ Data: datadogV2.ApplicationSecurityWafExclusionFilterCreateData{ Attributes: datadogV2.ApplicationSecurityWafExclusionFilterCreateAttributes{ Description: "Exclude false positives on a path", Enabled: true, Parameters: []string{ "list.search.query", }, PathGlob: datadog.PtrString("/accounts/*"), RulesTarget: []datadogV2.ApplicationSecurityWafExclusionFilterRulesTarget{ { Tags: &datadogV2.ApplicationSecurityWafExclusionFilterRulesTargetTags{ Category: datadog.PtrString("attack_attempt"), Type: datadog.PtrString("lfi"), }, }, }, Scope: []datadogV2.ApplicationSecurityWafExclusionFilterScope{ { Env: datadog.PtrString("www"), Service: datadog.PtrString("prod"), }, }, }, Type: datadogV2.APPLICATIONSECURITYWAFEXCLUSIONFILTERTYPE_EXCLUSION_FILTER, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.CreateApplicationSecurityWafExclusionFilter(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.CreateApplicationSecurityWafExclusionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.CreateApplicationSecurityWafExclusionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
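# Set DD_SITE to your Datadog site, for example datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com, and fill in DD_API_KEY and
# DD_APP_KEY with your API key and application key before running the command below.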
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a WAF exclusion filter returns "OK" response ``` // Create a WAF exclusion filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterCreateAttributes; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterCreateData; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterCreateRequest; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterResponse; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterRulesTarget; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterRulesTargetTags; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterScope; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); ApplicationSecurityWafExclusionFilterCreateRequest body = new ApplicationSecurityWafExclusionFilterCreateRequest() .data( new ApplicationSecurityWafExclusionFilterCreateData() .attributes( new ApplicationSecurityWafExclusionFilterCreateAttributes() .description("Exclude false positives on a path") .enabled(true) .parameters(Collections.singletonList("list.search.query")) .pathGlob("/accounts/*") .rulesTarget( Collections.singletonList( new ApplicationSecurityWafExclusionFilterRulesTarget() .tags( new ApplicationSecurityWafExclusionFilterRulesTargetTags() .category("attack_attempt") .type("lfi")))) .scope( Collections.singletonList( new ApplicationSecurityWafExclusionFilterScope() .env("www") .service("prod")))) .type(ApplicationSecurityWafExclusionFilterType.EXCLUSION_FILTER)); try { ApplicationSecurityWafExclusionFilterResponse result = apiInstance.createApplicationSecurityWafExclusionFilter(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling" + " ApplicationSecurityApi#createApplicationSecurityWafExclusionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a WAF exclusion filter returns "OK" response ``` """ Create a WAF exclusion filter returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi from datadog_api_client.v2.model.application_security_waf_exclusion_filter_create_attributes import ( ApplicationSecurityWafExclusionFilterCreateAttributes, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_create_data import ( 
ApplicationSecurityWafExclusionFilterCreateData, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_create_request import ( ApplicationSecurityWafExclusionFilterCreateRequest, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_rules_target import ( ApplicationSecurityWafExclusionFilterRulesTarget, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_rules_target_tags import ( ApplicationSecurityWafExclusionFilterRulesTargetTags, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_scope import ( ApplicationSecurityWafExclusionFilterScope, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_type import ( ApplicationSecurityWafExclusionFilterType, ) body = ApplicationSecurityWafExclusionFilterCreateRequest( data=ApplicationSecurityWafExclusionFilterCreateData( attributes=ApplicationSecurityWafExclusionFilterCreateAttributes( description="Exclude false positives on a path", enabled=True, parameters=[ "list.search.query", ], path_glob="/accounts/*", rules_target=[ ApplicationSecurityWafExclusionFilterRulesTarget( tags=ApplicationSecurityWafExclusionFilterRulesTargetTags( category="attack_attempt", type="lfi", ), ), ], scope=[ ApplicationSecurityWafExclusionFilterScope( env="www", service="prod", ), ], ), type=ApplicationSecurityWafExclusionFilterType.EXCLUSION_FILTER, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.create_application_security_waf_exclusion_filter(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a WAF exclusion filter returns "OK" response ``` # Create a WAF exclusion filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new body = DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterCreateData.new({ attributes: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterCreateAttributes.new({ description: "Exclude false positives on a path", enabled: true, parameters: [ "list.search.query", ], path_glob: "/accounts/*", rules_target: [ DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterRulesTarget.new({ tags: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterRulesTargetTags.new({ category: "attack_attempt", type: "lfi", }), }), ], scope: [ DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterScope.new({ env: "www", service: "prod", }), ], }), type: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterType::EXCLUSION_FILTER, }), }) p api_instance.create_application_security_waf_exclusion_filter(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a WAF exclusion filter returns "OK" 
response ``` // Create a WAF exclusion filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterCreateAttributes; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterCreateData; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterCreateRequest; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterRulesTarget; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterRulesTargetTags; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterScope; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterType; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = ApplicationSecurityWafExclusionFilterCreateRequest::new( ApplicationSecurityWafExclusionFilterCreateData::new( ApplicationSecurityWafExclusionFilterCreateAttributes::new( "Exclude false positives on a path".to_string(), true, ) .parameters(vec!["list.search.query".to_string()]) .path_glob("/accounts/*".to_string()) .rules_target(vec![ApplicationSecurityWafExclusionFilterRulesTarget::new( ) .tags( ApplicationSecurityWafExclusionFilterRulesTargetTags::new() .category("attack_attempt".to_string()) .type_("lfi".to_string()) .additional_properties(BTreeMap::from([])), )]) .scope(vec![ApplicationSecurityWafExclusionFilterScope::new() .env("www".to_string()) .service("prod".to_string())]), ApplicationSecurityWafExclusionFilterType::EXCLUSION_FILTER, ), ); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .create_application_security_waf_exclusion_filter(body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a WAF exclusion filter returns "OK" response ``` /** * Create a WAF exclusion filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); const params: v2.ApplicationSecurityApiCreateApplicationSecurityWafExclusionFilterRequest = { body: { data: { attributes: { description: "Exclude false positives on a path", enabled: true, parameters: ["list.search.query"], pathGlob: "/accounts/*", rulesTarget: [ { tags: { category: "attack_attempt", type: "lfi", }, }, ], scope: [ { env: "www", service: "prod", }, ], }, type: "exclusion_filter", }, }, }; apiInstance .createApplicationSecurityWafExclusionFilter(params) .then((data: v2.ApplicationSecurityWafExclusionFilterResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all WAF exclusion filters](https://docs.datadoghq.com/api/latest/application-security/#list-all-waf-exclusion-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#list-all-waf-exclusion-filters-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filtershttps://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters ### Overview Retrieve a list of WAF exclusion filters. This endpoint requires the `appsec_protect_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWafExclusionFilters-200-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWafExclusionFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWafExclusionFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object for multiple WAF exclusion filters. Field Type Description data [object] A list of WAF exclusion filters. attributes object Attributes describing a WAF exclusion filter. description string A description for the exclusion filter. enabled boolean Indicates whether the exclusion filter is enabled. event_query string The event query matched by the legacy exclusion filter. Cannot be created nor updated. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). metadata object Extra information about the exclusion filter. added_at date-time The creation date of the exclusion filter. added_by string The handle of the user who created the exclusion filter. added_by_name string The name of the user who created the exclusion filter. modified_at date-time The last modification date of the exclusion filter. modified_by string The handle of the user who last modified the exclusion filter. modified_by_name string The name of the user who last modified the exclusion filter. on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. 
rules_target [object] The WAF rules targeted by the exclusion filter. rule_id string Target a single WAF rule based on its identifier. tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. env string Deploy on this environment. service string Deploy on this service. search_query string Generated event search query for traces matching the exclusion filter. id string The identifier of the WAF exclusion filter. type enum Type of the resource. The value should always be `exclusion_filter`. Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": [ { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "event_query": "string", "ip_list": [ "198.51.100.72" ], "metadata": { "added_at": "2019-09-19T10:00:00.000Z", "added_by": "string", "added_by_name": "string", "modified_at": "2019-09-19T10:00:00.000Z", "modified_by": "string", "modified_by_name": "string" }, "on_match": "string", "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "rule_id": "dog-913-009", "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ], "search_query": "string" }, "id": "3dd-0uc-h1s", "type": "exclusion_filter" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### List all WAF exclusion filters Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all WAF exclusion filters ``` """ List all WAF exclusion filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.list_application_security_waf_exclusion_filters() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all WAF exclusion filters ``` # List all WAF exclusion filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new p api_instance.list_application_security_waf_exclusion_filters() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all WAF exclusion filters ``` // List all WAF exclusion filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.ListApplicationSecurityWafExclusionFilters(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.ListApplicationSecurityWafExclusionFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.ListApplicationSecurityWafExclusionFilters`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all WAF exclusion filters ``` // List all WAF exclusion filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFiltersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); try { ApplicationSecurityWafExclusionFiltersResponse result = apiInstance.listApplicationSecurityWafExclusionFilters(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling" + " ApplicationSecurityApi#listApplicationSecurityWafExclusionFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all WAF exclusion filters ``` // List all WAF exclusion filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api.list_application_security_waf_exclusion_filters().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all WAF exclusion filters ``` /** * List all WAF exclusion filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); apiInstance .listApplicationSecurityWafExclusionFilters() .then((data: v2.ApplicationSecurityWafExclusionFiltersResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a WAF exclusion filter](https://docs.datadoghq.com/api/latest/application-security/#update-a-waf-exclusion-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#update-a-waf-exclusion-filter-v2) PUT https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id} ### Overview Update a specific WAF exclusion filter using its identifier. Returns the exclusion filter object when the request is successful. This endpoint requires the `appsec_protect_write` permission. ### Arguments #### Path Parameters Name Type Description exclusion_filter_id [_required_] string The identifier of the WAF exclusion filter. ### Request #### Body Data (required) The exclusion filter to update. * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Field Type Description data [_required_] object Object for updating a single WAF exclusion filter. attributes [_required_] object Attributes for updating a WAF exclusion filter. description [_required_] string A description for the exclusion filter. enabled [_required_] boolean Indicates whether the exclusion filter is enabled. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. rules_target [object] The WAF rules targeted by the exclusion filter. rule_id string Target a single WAF rule based on its identifier. tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. env string Deploy on this environment. service string Deploy on this service. type [_required_] enum Type of the resource. The value should always be `exclusion_filter`. 
Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": false, "ip_list": [ "198.51.100.72" ], "on_match": "monitor" }, "type": "exclusion_filter" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-404-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafExclusionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object for a single WAF exclusion filter. Field Type Description data object A JSON:API resource for an WAF exclusion filter. attributes object Attributes describing a WAF exclusion filter. description string A description for the exclusion filter. enabled boolean Indicates whether the exclusion filter is enabled. event_query string The event query matched by the legacy exclusion filter. Cannot be created nor updated. ip_list [string] The client IP addresses matched by the exclusion filter (CIDR notation is supported). metadata object Extra information about the exclusion filter. added_at date-time The creation date of the exclusion filter. added_by string The handle of the user who created the exclusion filter. added_by_name string The name of the user who created the exclusion filter. modified_at date-time The last modification date of the exclusion filter. modified_by string The handle of the user who last modified the exclusion filter. modified_by_name string The name of the user who last modified the exclusion filter. on_match enum The action taken when the exclusion filter matches. When set to `monitor`, security traces are emitted but the requests are not blocked. By default, security traces are not emitted and the requests are not blocked. Allowed enum values: `monitor` parameters [string] A list of parameters matched by the exclusion filter in the HTTP query string and HTTP request body. Nested parameters can be matched by joining fields with a dot character. path_glob string The HTTP path glob expression matched by the exclusion filter. rules_target [object] The WAF rules targeted by the exclusion filter. rule_id string Target a single WAF rule based on its identifier. tags object Target multiple WAF rules based on their tags. category string The category of the targeted WAF rules. type string The type of the targeted WAF rules. scope [object] The services where the exclusion filter is deployed. env string Deploy on this environment. service string Deploy on this service. search_query string Generated event search query for traces matching the exclusion filter. id string The identifier of the WAF exclusion filter. type enum Type of the resource. The value should always be `exclusion_filter`. 
Allowed enum values: `exclusion_filter` default: `exclusion_filter` ``` { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": true, "event_query": "string", "ip_list": [ "198.51.100.72" ], "metadata": { "added_at": "2019-09-19T10:00:00.000Z", "added_by": "string", "added_by_name": "string", "modified_at": "2019-09-19T10:00:00.000Z", "modified_by": "string", "modified_by_name": "string" }, "on_match": "string", "parameters": [ "list.search.query" ], "path_glob": "/accounts/*", "rules_target": [ { "rule_id": "dog-913-009", "tags": { "category": "attack_attempt", "type": "lfi" } } ], "scope": [ { "env": "www", "service": "prod" } ], "search_query": "string" }, "id": "3dd-0uc-h1s", "type": "exclusion_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Update a WAF exclusion filter returns "OK" response Copy ``` # Path parameters export exclusion_filter_id="3b5-v82-ns6" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/${exclusion_filter_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Exclude false positives on a path", "enabled": false, "ip_list": [ "198.51.100.72" ], "on_match": "monitor" }, "type": "exclusion_filter" } } EOF ``` ##### Update a WAF exclusion filter returns "OK" response ``` // Update a WAF exclusion filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "exclusion_filter" in the system ExclusionFilterDataID := os.Getenv("EXCLUSION_FILTER_DATA_ID") body := datadogV2.ApplicationSecurityWafExclusionFilterUpdateRequest{ Data: datadogV2.ApplicationSecurityWafExclusionFilterUpdateData{ Attributes: datadogV2.ApplicationSecurityWafExclusionFilterUpdateAttributes{ Description: "Exclude false positives on a path", Enabled: false, IpList: []string{ "198.51.100.72", }, OnMatch: datadogV2.APPLICATIONSECURITYWAFEXCLUSIONFILTERONMATCH_MONITOR.Ptr(), }, Type: datadogV2.APPLICATIONSECURITYWAFEXCLUSIONFILTERTYPE_EXCLUSION_FILTER, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.UpdateApplicationSecurityWafExclusionFilter(ctx, ExclusionFilterDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.UpdateApplicationSecurityWafExclusionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.UpdateApplicationSecurityWafExclusionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a WAF exclusion filter returns "OK" response ``` // Update a WAF exclusion filter returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterOnMatch; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterResponse; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterType; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterUpdateAttributes; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterUpdateData; import com.datadog.api.client.v2.model.ApplicationSecurityWafExclusionFilterUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); // there is a valid "exclusion_filter" in the system String EXCLUSION_FILTER_DATA_ID = System.getenv("EXCLUSION_FILTER_DATA_ID"); ApplicationSecurityWafExclusionFilterUpdateRequest body = new ApplicationSecurityWafExclusionFilterUpdateRequest() .data( new ApplicationSecurityWafExclusionFilterUpdateData() .attributes( new ApplicationSecurityWafExclusionFilterUpdateAttributes() .description("Exclude false positives on a path") .enabled(false) .ipList(Collections.singletonList("198.51.100.72")) .onMatch(ApplicationSecurityWafExclusionFilterOnMatch.MONITOR)) .type(ApplicationSecurityWafExclusionFilterType.EXCLUSION_FILTER)); try { ApplicationSecurityWafExclusionFilterResponse result = apiInstance.updateApplicationSecurityWafExclusionFilter(EXCLUSION_FILTER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling" + " ApplicationSecurityApi#updateApplicationSecurityWafExclusionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a WAF exclusion filter returns "OK" response ``` """ Update a WAF exclusion filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi from datadog_api_client.v2.model.application_security_waf_exclusion_filter_on_match import ( ApplicationSecurityWafExclusionFilterOnMatch, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_type import ( ApplicationSecurityWafExclusionFilterType, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_update_attributes import ( ApplicationSecurityWafExclusionFilterUpdateAttributes, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_update_data import ( ApplicationSecurityWafExclusionFilterUpdateData, ) from datadog_api_client.v2.model.application_security_waf_exclusion_filter_update_request import ( ApplicationSecurityWafExclusionFilterUpdateRequest, ) # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = 
environ["EXCLUSION_FILTER_DATA_ID"] body = ApplicationSecurityWafExclusionFilterUpdateRequest( data=ApplicationSecurityWafExclusionFilterUpdateData( attributes=ApplicationSecurityWafExclusionFilterUpdateAttributes( description="Exclude false positives on a path", enabled=False, ip_list=[ "198.51.100.72", ], on_match=ApplicationSecurityWafExclusionFilterOnMatch.MONITOR, ), type=ApplicationSecurityWafExclusionFilterType.EXCLUSION_FILTER, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.update_application_security_waf_exclusion_filter( exclusion_filter_id=EXCLUSION_FILTER_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a WAF exclusion filter returns "OK" response ``` # Update a WAF exclusion filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = ENV["EXCLUSION_FILTER_DATA_ID"] body = DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterUpdateRequest.new({ data: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterUpdateData.new({ attributes: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterUpdateAttributes.new({ description: "Exclude false positives on a path", enabled: false, ip_list: [ "198.51.100.72", ], on_match: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterOnMatch::MONITOR, }), type: DatadogAPIClient::V2::ApplicationSecurityWafExclusionFilterType::EXCLUSION_FILTER, }), }) p api_instance.update_application_security_waf_exclusion_filter(EXCLUSION_FILTER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a WAF exclusion filter returns "OK" response ``` // Update a WAF exclusion filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterOnMatch; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterType; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterUpdateAttributes; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterUpdateData; use datadog_api_client::datadogV2::model::ApplicationSecurityWafExclusionFilterUpdateRequest; #[tokio::main] async fn main() { // there is a valid "exclusion_filter" in the system let exclusion_filter_data_id = std::env::var("EXCLUSION_FILTER_DATA_ID").unwrap(); let body = ApplicationSecurityWafExclusionFilterUpdateRequest::new( ApplicationSecurityWafExclusionFilterUpdateData::new( ApplicationSecurityWafExclusionFilterUpdateAttributes::new( "Exclude false positives on a path".to_string(), false, ) 
.ip_list(vec!["198.51.100.72".to_string()]) .on_match(ApplicationSecurityWafExclusionFilterOnMatch::MONITOR), ApplicationSecurityWafExclusionFilterType::EXCLUSION_FILTER, ), ); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .update_application_security_waf_exclusion_filter(exclusion_filter_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a WAF exclusion filter returns "OK" response ``` /** * Update a WAF exclusion filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); // there is a valid "exclusion_filter" in the system const EXCLUSION_FILTER_DATA_ID = process.env.EXCLUSION_FILTER_DATA_ID as string; const params: v2.ApplicationSecurityApiUpdateApplicationSecurityWafExclusionFilterRequest = { body: { data: { attributes: { description: "Exclude false positives on a path", enabled: false, ipList: ["198.51.100.72"], onMatch: "monitor", }, type: "exclusion_filter", }, }, exclusionFilterId: EXCLUSION_FILTER_DATA_ID, }; apiInstance .updateApplicationSecurityWafExclusionFilter(params) .then((data: v2.ApplicationSecurityWafExclusionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a WAF exclusion filter](https://docs.datadoghq.com/api/latest/application-security/#delete-a-waf-exclusion-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#delete-a-waf-exclusion-filter-v2) DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/{exclusion_filter_id} ### Overview Delete a specific WAF exclusion filter using its identifier. This endpoint requires the `appsec_protect_write` permission. ### Arguments #### Path Parameters Name Type Description exclusion_filter_id [_required_] string The identifier of the WAF exclusion filter. 
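The `exclusion_filter_id` is the `data.id` value returned when an exclusion filter is created or listed. As a reference only, here is a minimal Python sketch, assuming the same `datadog-api-client` package used in the examples below and that `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` are set in the environment; it resolves a filter's ID through the list endpoint before deleting it (the description string used for matching is illustrative):

```
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ApplicationSecurityApi(api_client)

    # List all WAF exclusion filters and keep the ID of the first one whose
    # description matches (the description value here is illustrative).
    filters = api_instance.list_application_security_waf_exclusion_filters()
    exclusion_filter_id = next(
        (f.id for f in filters.data if f.attributes.description == "Exclude false positives on a path"),
        None,
    )

    # Delete it using the exclusion_filter_id path parameter documented above.
    if exclusion_filter_id:
        api_instance.delete_application_security_waf_exclusion_filter(
            exclusion_filter_id=exclusion_filter_id,
        )
```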
### Response * [204](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafExclusionFilter-204-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafExclusionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafExclusionFilter-404-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafExclusionFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafExclusionFilter-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Delete a WAF exclusion filter Copy ``` # Path parameters export exclusion_filter_id="3b5-v82-ns6" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/exclusion_filters/${exclusion_filter_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a WAF exclusion filter ``` """ Delete a WAF exclusion filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = environ["EXCLUSION_FILTER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) api_instance.delete_application_security_waf_exclusion_filter( exclusion_filter_id=EXCLUSION_FILTER_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a WAF exclusion filter ``` # Delete a WAF exclusion filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new # there is a valid "exclusion_filter" in the system EXCLUSION_FILTER_DATA_ID = ENV["EXCLUSION_FILTER_DATA_ID"] api_instance.delete_application_security_waf_exclusion_filter(EXCLUSION_FILTER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a WAF exclusion filter ``` // Delete a WAF exclusion filter returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "exclusion_filter" in the system ExclusionFilterDataID := os.Getenv("EXCLUSION_FILTER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) r, err := 
api.DeleteApplicationSecurityWafExclusionFilter(ctx, ExclusionFilterDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.DeleteApplicationSecurityWafExclusionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a WAF exclusion filter ``` // Delete a WAF exclusion filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); // there is a valid "exclusion_filter" in the system String EXCLUSION_FILTER_DATA_ID = System.getenv("EXCLUSION_FILTER_DATA_ID"); try { apiInstance.deleteApplicationSecurityWafExclusionFilter(EXCLUSION_FILTER_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling" + " ApplicationSecurityApi#deleteApplicationSecurityWafExclusionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a WAF exclusion filter ``` // Delete a WAF exclusion filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { // there is a valid "exclusion_filter" in the system let exclusion_filter_data_id = std::env::var("EXCLUSION_FILTER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .delete_application_security_waf_exclusion_filter(exclusion_filter_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a WAF exclusion filter ``` /** * Delete a WAF exclusion filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); // there is a valid "exclusion_filter" in the system const EXCLUSION_FILTER_DATA_ID = process.env.EXCLUSION_FILTER_DATA_ID as string; const params: 
v2.ApplicationSecurityApiDeleteApplicationSecurityWafExclusionFilterRequest = { exclusionFilterId: EXCLUSION_FILTER_DATA_ID, }; apiInstance .deleteApplicationSecurityWafExclusionFilter(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a WAF custom rule](https://docs.datadoghq.com/api/latest/application-security/#get-a-waf-custom-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#get-a-waf-custom-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id} ### Overview Retrieve a WAF custom rule by ID. ### Arguments #### Path Parameters Name Type Description custom_rule_id [_required_] string The ID of the custom rule. ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafCustomRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafCustomRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#GetApplicationSecurityWafCustomRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object that includes a single WAF custom rule. Field Type Description data object Object for a single WAF custom rule. attributes object A WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions for which the WAF Custom Rule will triggers, all conditions needs to match in order for the WAF rule to trigger. operator [_required_] enum Operator to use for the WAF Condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The scope of the WAF custom rule. data string Identifier of a list of data from the denylist. 
Can only be used as substitution from the list parameter. inputs [_required_] [object] List of inputs on which at least one should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of value to use with the condition. Only used with the phrase_match, !phrase_match, exact_match and !exact_match operator. options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case sensitive. min_length int64 Only evaluate this condition if the value has a minimum amount of characters. regex string Regex to use with the condition. Only used with match_regex and !match_regex operator. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. metadata object Metadata associated with the WAF Custom Rule. added_at date-time The date and time the WAF custom rule was created. added_by string The handle of the user who created the WAF custom rule. added_by_name string The name of the user who created the WAF custom rule. modified_at date-time The date and time the WAF custom rule was last updated. modified_by string The handle of the user who last updated the WAF custom rule. modified_by_name string The name of the user who last updated the WAF custom rule. name [_required_] string The Name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF Custom Rule. The concatenation of category and type will form the security activity field associated with the traces. category [_required_] enum The category of the WAF Rule, can be either `business_logic`, `attack_attempt` or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule, associated with the category will form the security activity. id string The ID of the custom rule. type enum The type of the resource. The value should always be `custom_rule`. 
Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": { "attributes": { "action": { "action": "block_request", "parameters": { "location": "/blocking", "status_code": 403 } }, "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "data": "blocked_users", "inputs": [ { "address": "server.db.statement", "key_path": [] } ], "list": [], "options": { "case_sensitive": false, "min_length": "integer" }, "regex": "path.*", "value": "custom_tag" } } ], "enabled": false, "metadata": { "added_at": "2021-01-01T00:00:00Z", "added_by": "john.doe@datadoghq.com", "added_by_name": "John Doe", "modified_at": "2021-01-01T00:00:00Z", "modified_by": "john.doe@datadoghq.com", "modified_by_name": "John Doe" }, "name": "Block request from bad useragent", "path_glob": "/api/search/*", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "id": "2857c47d-1e3a-4300-8b2f-dc24089c084b", "type": "custom_rule" } } ``` Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Get a WAF custom rule ``` # Path parameters export custom_rule_id="3b5-v82-ns6" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/${custom_rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a WAF custom rule ``` """ Get a WAF custom rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.get_application_security_waf_custom_rule( custom_rule_id="custom_rule_id", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
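# DD_SITE selects the Datadog site for your account (for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com)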
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a WAF custom rule ``` # Get a WAF custom rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new p api_instance.get_application_security_waf_custom_rule("custom_rule_id") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a WAF custom rule ``` // Get a WAF custom rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.GetApplicationSecurityWafCustomRule(ctx, "custom_rule_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.GetApplicationSecurityWafCustomRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.GetApplicationSecurityWafCustomRule`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a WAF custom rule ``` // Get a WAF custom rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); try { ApplicationSecurityWafCustomRuleResponse result = apiInstance.getApplicationSecurityWafCustomRule("3b5-v82-ns6"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#getApplicationSecurityWafCustomRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a WAF custom rule ``` // Get a WAF custom rule returns "OK" response use datadog_api_client::datadog; use
datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .get_application_security_waf_custom_rule("custom_rule_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a WAF custom rule ``` /** * Get a WAF custom rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); const params: v2.ApplicationSecurityApiGetApplicationSecurityWafCustomRuleRequest = { customRuleId: "custom_rule_id", }; apiInstance .getApplicationSecurityWafCustomRule(params) .then((data: v2.ApplicationSecurityWafCustomRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a WAF custom rule](https://docs.datadoghq.com/api/latest/application-security/#create-a-waf-custom-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#create-a-waf-custom-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules ### Overview Create a new WAF custom rule with the given parameters. ### Request #### Body Data (required) The definition of the new WAF Custom Rule. * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Field Type Description data [_required_] object Object for a single WAF custom rule. attributes [_required_] object Create a new WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. 
default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions for which the WAF Custom Rule will triggers, all conditions needs to match in order for the WAF rule to trigger operator [_required_] enum Operator to use for the WAF Condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The scope of the WAF custom rule. data string Identifier of a list of data from the denylist. Can only be used as substitution from the list parameter. inputs [_required_] [object] List of inputs on which at least one should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of value to use with the condition. Only used with the phrase_match, !phrase_match, exact_match and !exact_match operator. options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case sensitive. min_length int64 Only evaluate this condition if the value has a minimum amount of characters. regex string Regex to use with the condition. Only used with match_regex and !match_regex operator. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. name [_required_] string The Name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF Custom Rule. The concatenation of category and type will form the security activity field associated with the traces. category [_required_] enum The category of the WAF Rule, can be either `business_logic`, `attack_attempt` or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule, associated with the category will form the security activity. type [_required_] enum The type of the resource. The value should always be `custom_rule`. 
Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": { "attributes": { "action": { "action": "block_request", "parameters": { "location": "/blocking", "status_code": 403 } }, "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "data": "blocked_users", "inputs": [ { "address": "server.db.statement", "key_path": [] } ], "list": [], "options": { "case_sensitive": false, "min_length": "integer" }, "regex": "path.*", "value": "custom_tag" } } ], "enabled": false, "name": "Block request from a bad useragent", "path_glob": "/api/search/*", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "type": "custom_rule" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafCustomRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafCustomRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafCustomRule-403-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafCustomRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#CreateApplicationSecurityWafCustomRule-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object that includes a single WAF custom rule. Field Type Description data object Object for a single WAF custom rule. attributes object A WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions for which the WAF Custom Rule will triggers, all conditions needs to match in order for the WAF rule to trigger. operator [_required_] enum Operator to use for the WAF Condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The scope of the WAF custom rule. data string Identifier of a list of data from the denylist. Can only be used as substitution from the list parameter. inputs [_required_] [object] List of inputs on which at least one should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. 
Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of value to use with the condition. Only used with the phrase_match, !phrase_match, exact_match and !exact_match operator. options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case sensitive. min_length int64 Only evaluate this condition if the value has a minimum amount of characters. regex string Regex to use with the condition. Only used with match_regex and !match_regex operator. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. metadata object Metadata associated with the WAF Custom Rule. added_at date-time The date and time the WAF custom rule was created. added_by string The handle of the user who created the WAF custom rule. added_by_name string The name of the user who created the WAF custom rule. modified_at date-time The date and time the WAF custom rule was last updated. modified_by string The handle of the user who last updated the WAF custom rule. modified_by_name string The name of the user who last updated the WAF custom rule. name [_required_] string The Name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF Custom Rule. The concatenation of category and type will form the security activity field associated with the traces. category [_required_] enum The category of the WAF Rule, can be either `business_logic`, `attack_attempt` or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule, associated with the category will form the security activity. id string The ID of the custom rule. type enum The type of the resource. The value should always be `custom_rule`. 
Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": { "attributes": { "action": { "action": "block_request", "parameters": { "location": "/blocking", "status_code": 403 } }, "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "data": "blocked_users", "inputs": [ { "address": "server.db.statement", "key_path": [] } ], "list": [], "options": { "case_sensitive": false, "min_length": "integer" }, "regex": "path.*", "value": "custom_tag" } } ], "enabled": false, "metadata": { "added_at": "2021-01-01T00:00:00Z", "added_by": "john.doe@datadoghq.com", "added_by_name": "John Doe", "modified_at": "2021-01-01T00:00:00Z", "modified_by": "john.doe@datadoghq.com", "modified_by_name": "John Doe" }, "name": "Block request from bad useragent", "path_glob": "/api/search/*", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "id": "2857c47d-1e3a-4300-8b2f-dc24089c084b", "type": "custom_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Create a WAF custom rule ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "inputs": [ { "address": "server.db.statement" } ] } } ], "enabled": false, "name": "Block request from a bad useragent", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "type": "custom_rule" } } EOF ``` ##### Create a WAF custom rule ``` """ Create a WAF custom rule returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi from datadog_api_client.v2.model.application_security_waf_custom_rule_action import ( ApplicationSecurityWafCustomRuleAction, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_action_action import ( ApplicationSecurityWafCustomRuleActionAction, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_action_parameters import ( ApplicationSecurityWafCustomRuleActionParameters, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition import ( ApplicationSecurityWafCustomRuleCondition, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_input import ( ApplicationSecurityWafCustomRuleConditionInput, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_input_address import ( ApplicationSecurityWafCustomRuleConditionInputAddress, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_operator import ( ApplicationSecurityWafCustomRuleConditionOperator, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_options import ( ApplicationSecurityWafCustomRuleConditionOptions, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_parameters import ( ApplicationSecurityWafCustomRuleConditionParameters, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_create_attributes import ( ApplicationSecurityWafCustomRuleCreateAttributes, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_create_data import ( ApplicationSecurityWafCustomRuleCreateData, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_create_request import ( ApplicationSecurityWafCustomRuleCreateRequest, ) from
datadog_api_client.v2.model.application_security_waf_custom_rule_scope import ApplicationSecurityWafCustomRuleScope from datadog_api_client.v2.model.application_security_waf_custom_rule_tags import ApplicationSecurityWafCustomRuleTags from datadog_api_client.v2.model.application_security_waf_custom_rule_tags_category import ( ApplicationSecurityWafCustomRuleTagsCategory, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_type import ApplicationSecurityWafCustomRuleType body = ApplicationSecurityWafCustomRuleCreateRequest( data=ApplicationSecurityWafCustomRuleCreateData( attributes=ApplicationSecurityWafCustomRuleCreateAttributes( action=ApplicationSecurityWafCustomRuleAction( action=ApplicationSecurityWafCustomRuleActionAction.BLOCK_REQUEST, parameters=ApplicationSecurityWafCustomRuleActionParameters( location="/blocking", status_code=403, ), ), blocking=False, conditions=[ ApplicationSecurityWafCustomRuleCondition( operator=ApplicationSecurityWafCustomRuleConditionOperator.MATCH_REGEX, parameters=ApplicationSecurityWafCustomRuleConditionParameters( data="blocked_users", inputs=[ ApplicationSecurityWafCustomRuleConditionInput( address=ApplicationSecurityWafCustomRuleConditionInputAddress.SERVER_DB_STATEMENT, key_path=[], ), ], list=[], options=ApplicationSecurityWafCustomRuleConditionOptions( case_sensitive=False, min_length=0, ), regex="path.*", value="custom_tag", ), ), ], enabled=False, name="Block request from a bad useragent", path_glob="/api/search/*", scope=[ ApplicationSecurityWafCustomRuleScope( env="prod", service="billing-service", ), ], tags=ApplicationSecurityWafCustomRuleTags( category=ApplicationSecurityWafCustomRuleTagsCategory.BUSINESS_LOGIC, type="users.login.success", ), ), type=ApplicationSecurityWafCustomRuleType.CUSTOM_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.create_application_security_waf_custom_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a WAF custom rule ``` # Create a WAF custom rule returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new body = DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleCreateData.new({ attributes: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleCreateAttributes.new({ action: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleAction.new({ action: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleActionAction::BLOCK_REQUEST, parameters: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleActionParameters.new({ location: "/blocking", status_code: 403, }), }), blocking: false, conditions: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleCondition.new({ operator: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionOperator::MATCH_REGEX, parameters: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionParameters.new({ data: "blocked_users", inputs: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionInput.new({ address: 
DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionInputAddress::SERVER_DB_STATEMENT, key_path: [], }), ], list: [], options: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionOptions.new({ case_sensitive: false, min_length: 0, }), regex: "path.*", value: "custom_tag", }), }), ], enabled: false, name: "Block request from a bad useragent", path_glob: "/api/search/*", scope: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleScope.new({ env: "prod", service: "billing-service", }), ], tags: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleTags.new({ category: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleTagsCategory::BUSINESS_LOGIC, type: "users.login.success", }), }), type: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleType::CUSTOM_RULE, }), }) p api_instance.create_application_security_waf_custom_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a WAF custom rule ``` // Create a WAF custom rule returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ApplicationSecurityWafCustomRuleCreateRequest{ Data: datadogV2.ApplicationSecurityWafCustomRuleCreateData{ Attributes: datadogV2.ApplicationSecurityWafCustomRuleCreateAttributes{ Action: &datadogV2.ApplicationSecurityWafCustomRuleAction{ Action: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULEACTIONACTION_BLOCK_REQUEST.Ptr(), Parameters: &datadogV2.ApplicationSecurityWafCustomRuleActionParameters{ Location: datadog.PtrString("/blocking"), StatusCode: datadog.PtrInt64(403), }, }, Blocking: false, Conditions: []datadogV2.ApplicationSecurityWafCustomRuleCondition{ { Operator: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULECONDITIONOPERATOR_MATCH_REGEX, Parameters: datadogV2.ApplicationSecurityWafCustomRuleConditionParameters{ Data: datadog.PtrString("blocked_users"), Inputs: []datadogV2.ApplicationSecurityWafCustomRuleConditionInput{ { Address: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULECONDITIONINPUTADDRESS_SERVER_DB_STATEMENT, KeyPath: []string{}, }, }, List: []string{}, Options: &datadogV2.ApplicationSecurityWafCustomRuleConditionOptions{ CaseSensitive: datadog.PtrBool(false), MinLength: datadog.PtrInt64(0), }, Regex: datadog.PtrString("path.*"), Value: datadog.PtrString("custom_tag"), }, }, }, Enabled: false, Name: "Block request from a bad useragent", PathGlob: datadog.PtrString("/api/search/*"), Scope: []datadogV2.ApplicationSecurityWafCustomRuleScope{ { Env: "prod", Service: "billing-service", }, }, Tags: datadogV2.ApplicationSecurityWafCustomRuleTags{ Category: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULETAGSCATEGORY_BUSINESS_LOGIC, Type: "users.login.success", }, }, Type: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULETYPE_CUSTOM_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.CreateApplicationSecurityWafCustomRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`ApplicationSecurityApi.CreateApplicationSecurityWafCustomRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.CreateApplicationSecurityWafCustomRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a WAF custom rule ``` // Create a WAF custom rule returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleAction; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleActionAction; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleActionParameters; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleCondition; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionInput; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionInputAddress; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionOperator; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionOptions; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionParameters; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleCreateAttributes; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleCreateData; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleCreateRequest; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleResponse; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleScope; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleTags; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleTagsCategory; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); ApplicationSecurityWafCustomRuleCreateRequest body = new ApplicationSecurityWafCustomRuleCreateRequest() .data( new ApplicationSecurityWafCustomRuleCreateData() .attributes( new ApplicationSecurityWafCustomRuleCreateAttributes() .action( new ApplicationSecurityWafCustomRuleAction() .action( ApplicationSecurityWafCustomRuleActionAction.BLOCK_REQUEST) .parameters( new ApplicationSecurityWafCustomRuleActionParameters() .location("/blocking") .statusCode(403L))) .blocking(false) .conditions( Collections.singletonList( new ApplicationSecurityWafCustomRuleCondition() .operator( ApplicationSecurityWafCustomRuleConditionOperator .MATCH_REGEX) .parameters( new ApplicationSecurityWafCustomRuleConditionParameters() .data("blocked_users") .inputs( Collections.singletonList( new ApplicationSecurityWafCustomRuleConditionInput() .address( ApplicationSecurityWafCustomRuleConditionInputAddress .SERVER_DB_STATEMENT))) .options( new 
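// condition options: case-insensitive matching with no minimum value length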
ApplicationSecurityWafCustomRuleConditionOptions() .caseSensitive(false) .minLength(0L)) .regex("path.*") .value("custom_tag")))) .enabled(false) .name("Block request from a bad useragent") .pathGlob("/api/search/*") .scope( Collections.singletonList( new ApplicationSecurityWafCustomRuleScope() .env("prod") .service("billing-service"))) .tags( new ApplicationSecurityWafCustomRuleTags() .category( ApplicationSecurityWafCustomRuleTagsCategory.BUSINESS_LOGIC) .type("users.login.success"))) .type(ApplicationSecurityWafCustomRuleType.CUSTOM_RULE)); try { ApplicationSecurityWafCustomRuleResponse result = apiInstance.createApplicationSecurityWafCustomRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#createApplicationSecurityWafCustomRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a WAF custom rule ``` // Create a WAF custom rule returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleAction; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleActionAction; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleActionParameters; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleCondition; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionInput; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionInputAddress; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionOperator; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionOptions; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionParameters; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleCreateAttributes; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleCreateData; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleCreateRequest; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleScope; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleTags; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleTagsCategory; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleType; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = ApplicationSecurityWafCustomRuleCreateRequest::new( ApplicationSecurityWafCustomRuleCreateData::new( ApplicationSecurityWafCustomRuleCreateAttributes::new( false, vec![ ApplicationSecurityWafCustomRuleCondition::new( ApplicationSecurityWafCustomRuleConditionOperator::MATCH_REGEX, ApplicationSecurityWafCustomRuleConditionParameters::new( vec![ ApplicationSecurityWafCustomRuleConditionInput::new( 
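// the address picks which request data the condition inspects; SERVER_DB_STATEMENT targets database statements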
ApplicationSecurityWafCustomRuleConditionInputAddress::SERVER_DB_STATEMENT, ).key_path(vec![]) ], ) .data("blocked_users".to_string()) .list(vec![]) .options( ApplicationSecurityWafCustomRuleConditionOptions::new() .case_sensitive(false) .min_length(0), ) .regex("path.*".to_string()) .value("custom_tag".to_string()), ) ], false, "Block request from a bad useragent".to_string(), ApplicationSecurityWafCustomRuleTags::new( ApplicationSecurityWafCustomRuleTagsCategory::BUSINESS_LOGIC, "users.login.success".to_string(), ).additional_properties(BTreeMap::from([])), ) .action( ApplicationSecurityWafCustomRuleAction::new() .action(ApplicationSecurityWafCustomRuleActionAction::BLOCK_REQUEST) .parameters( ApplicationSecurityWafCustomRuleActionParameters::new() .location("/blocking".to_string()) .status_code(403), ), ) .path_glob("/api/search/*".to_string()) .scope( vec![ ApplicationSecurityWafCustomRuleScope::new( "prod".to_string(), "billing-service".to_string(), ) ], ), ApplicationSecurityWafCustomRuleType::CUSTOM_RULE, ), ); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api.create_application_security_waf_custom_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a WAF custom rule ``` /** * Create a WAF custom rule returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); const params: v2.ApplicationSecurityApiCreateApplicationSecurityWafCustomRuleRequest = { body: { data: { attributes: { action: { action: "block_request", parameters: { location: "/blocking", statusCode: 403, }, }, blocking: false, conditions: [ { operator: "match_regex", parameters: { data: "blocked_users", inputs: [ { address: "server.db.statement", keyPath: [], }, ], list: [], options: { caseSensitive: false, minLength: 0, }, regex: "path.*", value: "custom_tag", }, }, ], enabled: false, name: "Block request from a bad useragent", pathGlob: "/api/search/*", scope: [ { env: "prod", service: "billing-service", }, ], tags: { category: "business_logic", type: "users.login.success", }, }, type: "custom_rule", }, }, }; apiInstance .createApplicationSecurityWafCustomRule(params) .then((data: v2.ApplicationSecurityWafCustomRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all WAF custom rules](https://docs.datadoghq.com/api/latest/application-security/#list-all-waf-custom-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#list-all-waf-custom-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_ruleshttps://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules ### Overview Retrieve a list of WAF custom rule. ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWAFCustomRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWAFCustomRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#ListApplicationSecurityWAFCustomRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object that includes a list of WAF custom rules. Field Type Description data [object] The WAF custom rule data. attributes object A WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions for which the WAF Custom Rule will triggers, all conditions needs to match in order for the WAF rule to trigger. operator [_required_] enum Operator to use for the WAF Condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The scope of the WAF custom rule. data string Identifier of a list of data from the denylist. Can only be used as substitution from the list parameter. inputs [_required_] [object] List of inputs on which at least one should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. 
Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of value to use with the condition. Only used with the phrase_match, !phrase_match, exact_match and !exact_match operator. options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case sensitive. min_length int64 Only evaluate this condition if the value has a minimum amount of characters. regex string Regex to use with the condition. Only used with match_regex and !match_regex operator. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. metadata object Metadata associated with the WAF Custom Rule. added_at date-time The date and time the WAF custom rule was created. added_by string The handle of the user who created the WAF custom rule. added_by_name string The name of the user who created the WAF custom rule. modified_at date-time The date and time the WAF custom rule was last updated. modified_by string The handle of the user who last updated the WAF custom rule. modified_by_name string The name of the user who last updated the WAF custom rule. name [_required_] string The Name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF Custom Rule. The concatenation of category and type will form the security activity field associated with the traces. category [_required_] enum The category of the WAF Rule, can be either `business_logic`, `attack_attempt` or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule, associated with the category will form the security activity. id string The ID of the custom rule. type enum The type of the resource. The value should always be `custom_rule`. 
Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": [ { "attributes": { "action": { "action": "block_request", "parameters": { "location": "/blocking", "status_code": 403 } }, "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "data": "blocked_users", "inputs": [ { "address": "server.db.statement", "key_path": [] } ], "list": [], "options": { "case_sensitive": false, "min_length": "integer" }, "regex": "path.*", "value": "custom_tag" } } ], "enabled": false, "metadata": { "added_at": "2021-01-01T00:00:00Z", "added_by": "john.doe@datadoghq.com", "added_by_name": "John Doe", "modified_at": "2021-01-01T00:00:00Z", "modified_by": "john.doe@datadoghq.com", "modified_by_name": "John Doe" }, "name": "Block request from bad useragent", "path_glob": "/api/search/*", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "id": "2857c47d-1e3a-4300-8b2f-dc24089c084b", "type": "custom_rule" } ] } ``` Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### List all WAF custom rules ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all WAF custom rules ``` """ List all WAF custom rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.list_application_security_waf_custom_rules() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all WAF custom rules
``` # List all WAF custom rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new p api_instance.list_application_security_waf_custom_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all WAF custom rules ``` // List all WAF custom rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.ListApplicationSecurityWAFCustomRules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.ListApplicationSecurityWAFCustomRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.ListApplicationSecurityWAFCustomRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all WAF custom rules ``` // List all WAF custom rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); try { ApplicationSecurityWafCustomRuleListResponse result = apiInstance.listApplicationSecurityWAFCustomRules(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#listApplicationSecurityWAFCustomRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all WAF custom rules ``` // List all WAF custom rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = 
ApplicationSecurityAPI::with_config(configuration); let resp = api.list_application_security_waf_custom_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all WAF custom rules ``` /** * List all WAF custom rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); apiInstance .listApplicationSecurityWAFCustomRules() .then((data: v2.ApplicationSecurityWafCustomRuleListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a WAF Custom Rule](https://docs.datadoghq.com/api/latest/application-security/#update-a-waf-custom-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#update-a-waf-custom-rule-v2) PUT https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id} ### Overview Update a specific WAF custom Rule. Returns the Custom Rule object when the request is successful. ### Arguments #### Path Parameters Name Type Description custom_rule_id [_required_] string The ID of the custom rule. ### Request #### Body Data (required) New definition of the WAF Custom Rule. * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Field Type Description data [_required_] object Object for a single WAF Custom Rule. attributes [_required_] object Update a WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. 
default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions under which the WAF custom rule triggers; all conditions must match for the rule to trigger. operator [_required_] enum Operator to use for the WAF condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The parameters of the WAF condition. data string Identifier of a list of data from the denylist. Can only be used as a substitution for the list parameter. inputs [_required_] [object] List of inputs, at least one of which should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of values to use with the condition. Only used with the phrase_match, !phrase_match, exact_match, and !exact_match operators. options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case-sensitive. min_length int64 Only evaluate this condition if the value has a minimum number of characters. regex string Regex to use with the condition. Only used with the match_regex and !match_regex operators. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. name [_required_] string The name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF custom rule. The concatenation of category and type forms the security activity field associated with the traces. category [_required_] enum The category of the WAF rule; can be either `business_logic`, `attack_attempt`, or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule; combined with the category, it forms the security activity. type [_required_] enum The type of the resource. The value should always be `custom_rule`. 
Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": { "type": "custom_rule", "attributes": { "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "inputs": [ { "address": "server.request.query", "key_path": [ "id" ] } ], "regex": "badactor" } } ], "enabled": false, "name": "test", "path_glob": "/test", "scope": [ { "env": "test", "service": "test" } ], "tags": { "category": "attack_attempt", "type": "test" } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-404-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#UpdateApplicationSecurityWafCustomRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) Response object that includes a single WAF custom rule. Field Type Description data object Object for a single WAF custom rule. attributes object A WAF custom rule. action object The definition of `ApplicationSecurityWafCustomRuleAction` object. action enum Override the default action to take when the WAF custom rule would block. Allowed enum values: `redirect_request,block_request` default: `block_request` parameters object The definition of `ApplicationSecurityWafCustomRuleActionParameters` object. location string The location to redirect to when the WAF custom rule triggers. status_code int64 The status code to return when the WAF custom rule triggers. default: `403` blocking [_required_] boolean Indicates whether the WAF custom rule will block the request. conditions [_required_] [object] Conditions for which the WAF Custom Rule will triggers, all conditions needs to match in order for the WAF rule to trigger. operator [_required_] enum Operator to use for the WAF Condition. Allowed enum values: `match_regex,!match_regex,phrase_match,!phrase_match,is_xss,is_sqli,exact_match,!exact_match,ip_match,!ip_match,capture_data` parameters [_required_] object The scope of the WAF custom rule. data string Identifier of a list of data from the denylist. Can only be used as substitution from the list parameter. inputs [_required_] [object] List of inputs on which at least one should match with the given operator. address [_required_] enum Input from the request on which the condition should apply. Allowed enum values: `server.db.statement,server.io.fs.file,server.io.net.url,server.sys.shell.cmd,server.request.method,server.request.uri.raw,server.request.path_params,server.request.query,server.request.headers.no_cookies,server.request.cookies,server.request.trailers,server.request.body,server.response.status,server.response.headers.no_cookies,server.response.trailers,grpc.server.request.metadata,grpc.server.request.message,grpc.server.method,graphql.server.all_resolvers,usr.id,http.client_ip` key_path [string] Specific path for the input. list [string] List of value to use with the condition. Only used with the phrase_match, !phrase_match, exact_match and !exact_match operator. 
options object Options for the operator of this condition. case_sensitive boolean Evaluate the value as case sensitive. min_length int64 Only evaluate this condition if the value has a minimum amount of characters. regex string Regex to use with the condition. Only used with match_regex and !match_regex operator. value string Store the captured value in the specified tag name. Only used with the capture_data operator. enabled [_required_] boolean Indicates whether the WAF custom rule is enabled. metadata object Metadata associated with the WAF Custom Rule. added_at date-time The date and time the WAF custom rule was created. added_by string The handle of the user who created the WAF custom rule. added_by_name string The name of the user who created the WAF custom rule. modified_at date-time The date and time the WAF custom rule was last updated. modified_by string The handle of the user who last updated the WAF custom rule. modified_by_name string The name of the user who last updated the WAF custom rule. name [_required_] string The Name of the WAF custom rule. path_glob string The path glob for the WAF custom rule. scope [object] The scope of the WAF custom rule. env [_required_] string The environment scope for the WAF custom rule. service [_required_] string The service scope for the WAF custom rule. tags [_required_] object Tags associated with the WAF Custom Rule. The concatenation of category and type will form the security activity field associated with the traces. category [_required_] enum The category of the WAF Rule, can be either `business_logic`, `attack_attempt` or `security_response`. Allowed enum values: `attack_attempt,business_logic,security_response` type [_required_] string The type of the WAF rule, associated with the category will form the security activity. id string The ID of the custom rule. type enum The type of the resource. The value should always be `custom_rule`. Allowed enum values: `custom_rule` default: `custom_rule` ``` { "data": { "attributes": { "action": { "action": "block_request", "parameters": { "location": "/blocking", "status_code": 403 } }, "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "data": "blocked_users", "inputs": [ { "address": "server.db.statement", "key_path": [] } ], "list": [], "options": { "case_sensitive": false, "min_length": "integer" }, "regex": "path.*", "value": "custom_tag" } } ], "enabled": false, "metadata": { "added_at": "2021-01-01T00:00:00Z", "added_by": "john.doe@datadoghq.com", "added_by_name": "John Doe", "modified_at": "2021-01-01T00:00:00Z", "modified_by": "john.doe@datadoghq.com", "modified_by_name": "John Doe" }, "name": "Block request from bad useragent", "path_glob": "/api/search/*", "scope": [ { "env": "prod", "service": "billing-service" } ], "tags": { "category": "business_logic", "type": "users.login.success" } }, "id": "2857c47d-1e3a-4300-8b2f-dc24089c084b", "type": "custom_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Update a WAF Custom Rule returns "OK" response Copy ``` # Path parameters export custom_rule_id="3b5-v82-ns6" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/${custom_rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "custom_rule", "attributes": { "blocking": false, "conditions": [ { "operator": "match_regex", "parameters": { "inputs": [ { "address": "server.request.query", "key_path": [ "id" ] } ], "regex": "badactor" } } ], "enabled": false, "name": "test", "path_glob": "/test", "scope": [ { "env": "test", "service": "test" } ], "tags": { "category": "attack_attempt", "type": "test" } } } } EOF ``` ##### Update a WAF Custom Rule returns "OK" response ``` // Update a WAF Custom Rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "custom_rule" in the system CustomRuleDataID := os.Getenv("CUSTOM_RULE_DATA_ID") body := datadogV2.ApplicationSecurityWafCustomRuleUpdateRequest{ Data: datadogV2.ApplicationSecurityWafCustomRuleUpdateData{ Type: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULETYPE_CUSTOM_RULE, Attributes: datadogV2.ApplicationSecurityWafCustomRuleUpdateAttributes{ Blocking: false, Conditions: []datadogV2.ApplicationSecurityWafCustomRuleCondition{ { Operator: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULECONDITIONOPERATOR_MATCH_REGEX, Parameters: datadogV2.ApplicationSecurityWafCustomRuleConditionParameters{ Inputs: []datadogV2.ApplicationSecurityWafCustomRuleConditionInput{ { 
Address: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULECONDITIONINPUTADDRESS_SERVER_REQUEST_QUERY, KeyPath: []string{ "id", }, }, }, Regex: datadog.PtrString("badactor"), }, }, }, Enabled: false, Name: "test", PathGlob: datadog.PtrString("/test"), Scope: []datadogV2.ApplicationSecurityWafCustomRuleScope{ { Env: "test", Service: "test", }, }, Tags: datadogV2.ApplicationSecurityWafCustomRuleTags{ Category: datadogV2.APPLICATIONSECURITYWAFCUSTOMRULETAGSCATEGORY_ATTACK_ATTEMPT, Type: "test", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) resp, r, err := api.UpdateApplicationSecurityWafCustomRule(ctx, CustomRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.UpdateApplicationSecurityWafCustomRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ApplicationSecurityApi.UpdateApplicationSecurityWafCustomRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a WAF Custom Rule returns "OK" response ``` // Update a WAF Custom Rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleCondition; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionInput; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionInputAddress; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionOperator; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleConditionParameters; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleResponse; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleScope; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleTags; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleTagsCategory; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleType; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleUpdateAttributes; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleUpdateData; import com.datadog.api.client.v2.model.ApplicationSecurityWafCustomRuleUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); // there is a valid "custom_rule" in the system String CUSTOM_RULE_DATA_ID = System.getenv("CUSTOM_RULE_DATA_ID"); ApplicationSecurityWafCustomRuleUpdateRequest body = new ApplicationSecurityWafCustomRuleUpdateRequest() .data( new ApplicationSecurityWafCustomRuleUpdateData() .type(ApplicationSecurityWafCustomRuleType.CUSTOM_RULE) .attributes( new ApplicationSecurityWafCustomRuleUpdateAttributes() .blocking(false) .conditions( 
Collections.singletonList( new ApplicationSecurityWafCustomRuleCondition() .operator( ApplicationSecurityWafCustomRuleConditionOperator .MATCH_REGEX) .parameters( new ApplicationSecurityWafCustomRuleConditionParameters() .inputs( Collections.singletonList( new ApplicationSecurityWafCustomRuleConditionInput() .address( ApplicationSecurityWafCustomRuleConditionInputAddress .SERVER_REQUEST_QUERY) .keyPath( Collections.singletonList("id")))) .regex("badactor")))) .enabled(false) .name("test") .pathGlob("/test") .scope( Collections.singletonList( new ApplicationSecurityWafCustomRuleScope() .env("test") .service("test"))) .tags( new ApplicationSecurityWafCustomRuleTags() .category( ApplicationSecurityWafCustomRuleTagsCategory.ATTACK_ATTEMPT) .type("test")))); try { ApplicationSecurityWafCustomRuleResponse result = apiInstance.updateApplicationSecurityWafCustomRule(CUSTOM_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#updateApplicationSecurityWafCustomRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a WAF Custom Rule returns "OK" response ``` """ Update a WAF Custom Rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi from datadog_api_client.v2.model.application_security_waf_custom_rule_condition import ( ApplicationSecurityWafCustomRuleCondition, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_input import ( ApplicationSecurityWafCustomRuleConditionInput, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_input_address import ( ApplicationSecurityWafCustomRuleConditionInputAddress, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_operator import ( ApplicationSecurityWafCustomRuleConditionOperator, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_condition_parameters import ( ApplicationSecurityWafCustomRuleConditionParameters, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_scope import ApplicationSecurityWafCustomRuleScope from datadog_api_client.v2.model.application_security_waf_custom_rule_tags import ApplicationSecurityWafCustomRuleTags from datadog_api_client.v2.model.application_security_waf_custom_rule_tags_category import ( ApplicationSecurityWafCustomRuleTagsCategory, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_type import ApplicationSecurityWafCustomRuleType from datadog_api_client.v2.model.application_security_waf_custom_rule_update_attributes import ( ApplicationSecurityWafCustomRuleUpdateAttributes, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_update_data import ( ApplicationSecurityWafCustomRuleUpdateData, ) from datadog_api_client.v2.model.application_security_waf_custom_rule_update_request import ( 
ApplicationSecurityWafCustomRuleUpdateRequest, ) # there is a valid "custom_rule" in the system CUSTOM_RULE_DATA_ID = environ["CUSTOM_RULE_DATA_ID"] body = ApplicationSecurityWafCustomRuleUpdateRequest( data=ApplicationSecurityWafCustomRuleUpdateData( type=ApplicationSecurityWafCustomRuleType.CUSTOM_RULE, attributes=ApplicationSecurityWafCustomRuleUpdateAttributes( blocking=False, conditions=[ ApplicationSecurityWafCustomRuleCondition( operator=ApplicationSecurityWafCustomRuleConditionOperator.MATCH_REGEX, parameters=ApplicationSecurityWafCustomRuleConditionParameters( inputs=[ ApplicationSecurityWafCustomRuleConditionInput( address=ApplicationSecurityWafCustomRuleConditionInputAddress.SERVER_REQUEST_QUERY, key_path=[ "id", ], ), ], regex="badactor", ), ), ], enabled=False, name="test", path_glob="/test", scope=[ ApplicationSecurityWafCustomRuleScope( env="test", service="test", ), ], tags=ApplicationSecurityWafCustomRuleTags( category=ApplicationSecurityWafCustomRuleTagsCategory.ATTACK_ATTEMPT, type="test", ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) response = api_instance.update_application_security_waf_custom_rule(custom_rule_id=CUSTOM_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a WAF Custom Rule returns "OK" response ``` # Update a WAF Custom Rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new # there is a valid "custom_rule" in the system CUSTOM_RULE_DATA_ID = ENV["CUSTOM_RULE_DATA_ID"] body = DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleUpdateRequest.new({ data: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleUpdateData.new({ type: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleType::CUSTOM_RULE, attributes: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleUpdateAttributes.new({ blocking: false, conditions: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleCondition.new({ operator: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionOperator::MATCH_REGEX, parameters: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionParameters.new({ inputs: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionInput.new({ address: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleConditionInputAddress::SERVER_REQUEST_QUERY, key_path: [ "id", ], }), ], regex: "badactor", }), }), ], enabled: false, name: "test", path_glob: "/test", scope: [ DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleScope.new({ env: "test", service: "test", }), ], tags: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleTags.new({ category: DatadogAPIClient::V2::ApplicationSecurityWafCustomRuleTagsCategory::ATTACK_ATTEMPT, type: "test", }), }), }), }) p api_instance.update_application_security_waf_custom_rule(CUSTOM_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a WAF Custom Rule returns "OK" response ``` // Update a WAF Custom Rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleCondition; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionInput; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionInputAddress; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionOperator; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleConditionParameters; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleScope; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleTags; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleTagsCategory; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleType; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleUpdateAttributes; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleUpdateData; use datadog_api_client::datadogV2::model::ApplicationSecurityWafCustomRuleUpdateRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "custom_rule" in the system let custom_rule_data_id = std::env::var("CUSTOM_RULE_DATA_ID").unwrap(); let body = ApplicationSecurityWafCustomRuleUpdateRequest::new( ApplicationSecurityWafCustomRuleUpdateData::new( ApplicationSecurityWafCustomRuleUpdateAttributes::new( false, vec![ ApplicationSecurityWafCustomRuleCondition::new( ApplicationSecurityWafCustomRuleConditionOperator::MATCH_REGEX, ApplicationSecurityWafCustomRuleConditionParameters::new( vec![ ApplicationSecurityWafCustomRuleConditionInput::new( ApplicationSecurityWafCustomRuleConditionInputAddress::SERVER_REQUEST_QUERY, ).key_path(vec!["id".to_string()]) ], ).regex("badactor".to_string()), ) ], false, "test".to_string(), ApplicationSecurityWafCustomRuleTags::new( ApplicationSecurityWafCustomRuleTagsCategory::ATTACK_ATTEMPT, "test".to_string(), ).additional_properties(BTreeMap::from([])), ) .path_glob("/test".to_string()) .scope(vec![ApplicationSecurityWafCustomRuleScope::new("test".to_string(), "test".to_string())]), ApplicationSecurityWafCustomRuleType::CUSTOM_RULE, ), ); let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .update_application_security_waf_custom_rule(custom_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a WAF Custom Rule returns "OK" response ``` /** * Update a WAF Custom Rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); // 
there is a valid "custom_rule" in the system const CUSTOM_RULE_DATA_ID = process.env.CUSTOM_RULE_DATA_ID as string; const params: v2.ApplicationSecurityApiUpdateApplicationSecurityWafCustomRuleRequest = { body: { data: { type: "custom_rule", attributes: { blocking: false, conditions: [ { operator: "match_regex", parameters: { inputs: [ { address: "server.request.query", keyPath: ["id"], }, ], regex: "badactor", }, }, ], enabled: false, name: "test", pathGlob: "/test", scope: [ { env: "test", service: "test", }, ], tags: { category: "attack_attempt", type: "test", }, }, }, }, customRuleId: CUSTOM_RULE_DATA_ID, }; apiInstance .updateApplicationSecurityWafCustomRule(params) .then((data: v2.ApplicationSecurityWafCustomRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a WAF Custom Rule](https://docs.datadoghq.com/api/latest/application-security/#delete-a-waf-custom-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/application-security/#delete-a-waf-custom-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/{custom_rule_id} ### Overview Delete a specific WAF custom rule. ### Arguments #### Path Parameters Name Type Description custom_rule_id [_required_] string The ID of the custom rule. ### Response * [204](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafCustomRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafCustomRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafCustomRule-404-v2) * [409](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafCustomRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/application-security/#DeleteApplicationSecurityWafCustomRule-429-v2) No Content Not Authorized * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/application-security/) * [Example](https://docs.datadoghq.com/api/latest/application-security/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/application-security/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/application-security/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/application-security/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/application-security/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/application-security/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/application-security/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/application-security/?code-lang=typescript) ##### Delete a WAF Custom Rule Copy ``` # Path parameters export custom_rule_id="3b5-v82-ns6" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/asm/waf/custom_rules/${custom_rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a WAF Custom Rule ``` """ Delete a WAF Custom Rule returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.application_security_api import ApplicationSecurityApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ApplicationSecurityApi(api_client) api_instance.delete_application_security_waf_custom_rule( custom_rule_id="custom_rule_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a WAF Custom Rule ``` # Delete a WAF Custom Rule returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ApplicationSecurityAPI.new api_instance.delete_application_security_waf_custom_rule("custom_rule_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a WAF Custom Rule ``` // Delete a WAF Custom Rule returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewApplicationSecurityApi(apiClient) r, err := api.DeleteApplicationSecurityWafCustomRule(ctx, "custom_rule_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ApplicationSecurityApi.DeleteApplicationSecurityWafCustomRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a WAF Custom Rule ``` // Delete a WAF Custom Rule returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ApplicationSecurityApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ApplicationSecurityApi apiInstance = new ApplicationSecurityApi(defaultClient); try { apiInstance.deleteApplicationSecurityWafCustomRule("3b5-v82-ns6"); } catch (ApiException e) { System.err.println( "Exception when calling ApplicationSecurityApi#deleteApplicationSecurityWafCustomRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a WAF Custom Rule ``` // Delete a WAF Custom Rule returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_application_security::ApplicationSecurityAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ApplicationSecurityAPI::with_config(configuration); let resp = api .delete_application_security_waf_custom_rule("custom_rule_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a WAF Custom Rule ``` /** * Delete a WAF Custom Rule returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ApplicationSecurityApi(configuration); const params: v2.ApplicationSecurityApiDeleteApplicationSecurityWafCustomRuleRequest = { customRuleId: "custom_rule_id", }; apiInstance .deleteApplicationSecurityWafCustomRule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/audit/ # Audit Search your Audit Logs events over HTTP. ## [Search Audit Logs events](https://docs.datadoghq.com/api/latest/audit/#search-audit-logs-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/audit/#search-audit-logs-events-v2) POST https://api.ap1.datadoghq.com/api/v2/audit/events/searchhttps://api.ap2.datadoghq.com/api/v2/audit/events/searchhttps://api.datadoghq.eu/api/v2/audit/events/searchhttps://api.ddog-gov.com/api/v2/audit/events/searchhttps://api.datadoghq.com/api/v2/audit/events/searchhttps://api.us3.datadoghq.com/api/v2/audit/events/searchhttps://api.us5.datadoghq.com/api/v2/audit/events/search ### Overview The list endpoint returns Audit Logs events that match an Audit search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to build complex Audit Logs event filtering and searches. This endpoint requires the `audit_logs_read` permission. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) Field Type Description filter object Search and filter query settings. from string Minimum time for the requested events. Supports date, math, and regular timestamps (in milliseconds). default: `now-15m` query string Search query following the Audit Logs search syntax. default: `*` to string Maximum time for the requested events. Supports date, math, and regular timestamps (in milliseconds). default: `now` options object Global query options that are used during the query. Note: Specify either timezone or time offset, not both. Otherwise, the query fails. 
time_offset int64 Time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing events. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of events in the response. default: `10` sort enum Sort parameters when querying events. Allowed enum values: `timestamp,-timestamp` ##### Search Audit Logs events returns "OK" response ``` { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` { "filter": { "from": "now-15m", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/audit/#SearchAuditLogs-200-v2) * [400](https://docs.datadoghq.com/api/latest/audit/#SearchAuditLogs-400-v2) * [403](https://docs.datadoghq.com/api/latest/audit/#SearchAuditLogs-403-v2) * [429](https://docs.datadoghq.com/api/latest/audit/#SearchAuditLogs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) Response object with all events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from Audit Logs events. message string Message of the event. service string Name of the application or service generating Audit Logs events. This name is used to correlate Audit Logs to APM, so make sure you specify the same value when you use both products. tags [string] Array of tags associated with your event. timestamp date-time Timestamp of your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `audit` default: `audit` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 Time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string Unique code for this type of warning. detail string Detailed explanation of this specific warning. title string Short human-readable summary of the warning. 
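The `meta.page.after` cursor described above is what drives pagination when you call this endpoint directly rather than through a client library's pagination helper. The following is a minimal sketch of that loop with the Python client used in the code examples below; it assumes that `AuditLogsQueryPageOptions` accepts a `cursor` argument and that the response object exposes `meta.page.after` (and event fields such as `id`) as attributes, mirroring the request and response schemas on this page.

```python
# Minimal cursor-pagination sketch for the Audit Logs search endpoint.
# Assumes AuditLogsQueryPageOptions accepts `cursor` and that the response
# exposes `meta.page.after`, mirroring the schemas documented on this page.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.audit_api import AuditApi
from datadog_api_client.v2.model.audit_logs_query_filter import AuditLogsQueryFilter
from datadog_api_client.v2.model.audit_logs_query_page_options import AuditLogsQueryPageOptions
from datadog_api_client.v2.model.audit_logs_search_events_request import AuditLogsSearchEventsRequest
from datadog_api_client.v2.model.audit_logs_sort import AuditLogsSort

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = AuditApi(api_client)
    cursor = None
    while True:
        page_kwargs = {"limit": 25}
        if cursor:
            # Resume from the cursor returned by the previous page.
            page_kwargs["cursor"] = cursor
        body = AuditLogsSearchEventsRequest(
            filter=AuditLogsQueryFilter(_from="now-15m", to="now"),
            page=AuditLogsQueryPageOptions(**page_kwargs),
            sort=AuditLogsSort.TIMESTAMP_ASCENDING,
        )
        response = api.search_audit_logs(body=body)
        for event in getattr(response, "data", None) or []:
            print(event.id)
        # meta.page.after is only returned when more results are available.
        page_meta = getattr(getattr(response, "meta", None), "page", None)
        cursor = getattr(page_meta, "after", None)
        if not cursor:
            break
```

If you prefer not to manage the cursor yourself, the `search_audit_logs_with_pagination` helper shown in the Python code example below wraps the same loop.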
``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "message": "string", "service": "web-app", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "audit" } ], "links": { "next": "https://app.datadoghq.com/api/v2/audit/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/audit/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/audit/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/audit/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/audit/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/audit/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/audit/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/audit/?code-lang=typescript) ##### Search Audit Logs events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/audit/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } EOF ``` ##### Search Audit Logs events returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/audit/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### 
Search Audit Logs events returns "OK" response ``` // Search Audit Logs events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AuditLogsSearchEventsRequest{ Filter: &datadogV2.AuditLogsQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@type:session AND @session.type:user"), To: datadog.PtrString("now"), }, Options: &datadogV2.AuditLogsQueryOptions{ TimeOffset: datadog.PtrInt64(0), Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.AuditLogsQueryPageOptions{ Limit: datadog.PtrInt32(25), }, Sort: datadogV2.AUDITLOGSSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuditApi(apiClient) resp, r, err := api.SearchAuditLogs(ctx, *datadogV2.NewSearchAuditLogsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuditApi.SearchAuditLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuditApi.SearchAuditLogs`:\n%s\n", responseContent) } ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` // Search Audit Logs events returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AuditLogsSearchEventsRequest{ Filter: &datadogV2.AuditLogsQueryFilter{ From: datadog.PtrString("now-15m"), To: datadog.PtrString("now"), }, Options: &datadogV2.AuditLogsQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.AuditLogsQueryPageOptions{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.AUDITLOGSSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuditApi(apiClient) resp, _ := api.SearchAuditLogsWithPagination(ctx, *datadogV2.NewSearchAuditLogsOptionalParameters().WithBody(body)) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuditApi.SearchAuditLogs`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search Audit Logs events returns "OK" response ``` // Search Audit Logs events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuditApi; import com.datadog.api.client.v2.api.AuditApi.SearchAuditLogsOptionalParameters; import com.datadog.api.client.v2.model.AuditLogsEventsResponse; import com.datadog.api.client.v2.model.AuditLogsQueryFilter; import 
com.datadog.api.client.v2.model.AuditLogsQueryOptions; import com.datadog.api.client.v2.model.AuditLogsQueryPageOptions; import com.datadog.api.client.v2.model.AuditLogsSearchEventsRequest; import com.datadog.api.client.v2.model.AuditLogsSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuditApi apiInstance = new AuditApi(defaultClient); AuditLogsSearchEventsRequest body = new AuditLogsSearchEventsRequest() .filter( new AuditLogsQueryFilter() .from("now-15m") .query("@type:session AND @session.type:user") .to("now")) .options(new AuditLogsQueryOptions().timeOffset(0L).timezone("GMT")) .page(new AuditLogsQueryPageOptions().limit(25)) .sort(AuditLogsSort.TIMESTAMP_ASCENDING); try { AuditLogsEventsResponse result = apiInstance.searchAuditLogs(new SearchAuditLogsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuditApi#searchAuditLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` // Search Audit Logs events returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.AuditApi; import com.datadog.api.client.v2.api.AuditApi.SearchAuditLogsOptionalParameters; import com.datadog.api.client.v2.model.AuditLogsEvent; import com.datadog.api.client.v2.model.AuditLogsQueryFilter; import com.datadog.api.client.v2.model.AuditLogsQueryOptions; import com.datadog.api.client.v2.model.AuditLogsQueryPageOptions; import com.datadog.api.client.v2.model.AuditLogsSearchEventsRequest; import com.datadog.api.client.v2.model.AuditLogsSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuditApi apiInstance = new AuditApi(defaultClient); AuditLogsSearchEventsRequest body = new AuditLogsSearchEventsRequest() .filter(new AuditLogsQueryFilter().from("now-15m").to("now")) .options(new AuditLogsQueryOptions().timezone("GMT")) .page(new AuditLogsQueryPageOptions().limit(2)) .sort(AuditLogsSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchAuditLogsWithPagination( new SearchAuditLogsOptionalParameters().body(body)); for (AuditLogsEvent item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println("Exception when calling AuditApi#searchAuditLogsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search Audit Logs events returns "OK" response ``` """ Search Audit Logs events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.audit_api import AuditApi from datadog_api_client.v2.model.audit_logs_query_filter import AuditLogsQueryFilter from datadog_api_client.v2.model.audit_logs_query_options import 
AuditLogsQueryOptions from datadog_api_client.v2.model.audit_logs_query_page_options import AuditLogsQueryPageOptions from datadog_api_client.v2.model.audit_logs_search_events_request import AuditLogsSearchEventsRequest from datadog_api_client.v2.model.audit_logs_sort import AuditLogsSort body = AuditLogsSearchEventsRequest( filter=AuditLogsQueryFilter( _from="now-15m", query="@type:session AND @session.type:user", to="now", ), options=AuditLogsQueryOptions( time_offset=0, timezone="GMT", ), page=AuditLogsQueryPageOptions( limit=25, ), sort=AuditLogsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuditApi(api_client) response = api_instance.search_audit_logs(body=body) print(response) ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` """ Search Audit Logs events returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.audit_api import AuditApi from datadog_api_client.v2.model.audit_logs_query_filter import AuditLogsQueryFilter from datadog_api_client.v2.model.audit_logs_query_options import AuditLogsQueryOptions from datadog_api_client.v2.model.audit_logs_query_page_options import AuditLogsQueryPageOptions from datadog_api_client.v2.model.audit_logs_search_events_request import AuditLogsSearchEventsRequest from datadog_api_client.v2.model.audit_logs_sort import AuditLogsSort body = AuditLogsSearchEventsRequest( filter=AuditLogsQueryFilter( _from="now-15m", to="now", ), options=AuditLogsQueryOptions( timezone="GMT", ), page=AuditLogsQueryPageOptions( limit=2, ), sort=AuditLogsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuditApi(api_client) items = api_instance.search_audit_logs_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search Audit Logs events returns "OK" response ``` # Search Audit Logs events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuditAPI.new body = DatadogAPIClient::V2::AuditLogsSearchEventsRequest.new({ filter: DatadogAPIClient::V2::AuditLogsQueryFilter.new({ from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }), options: DatadogAPIClient::V2::AuditLogsQueryOptions.new({ time_offset: 0, timezone: "GMT", }), page: DatadogAPIClient::V2::AuditLogsQueryPageOptions.new({ limit: 25, }), sort: DatadogAPIClient::V2::AuditLogsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } p api_instance.search_audit_logs(opts) ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` # Search Audit Logs events returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuditAPI.new body = DatadogAPIClient::V2::AuditLogsSearchEventsRequest.new({ filter: DatadogAPIClient::V2::AuditLogsQueryFilter.new({ from: "now-15m", to: "now", }), options: DatadogAPIClient::V2::AuditLogsQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::AuditLogsQueryPageOptions.new({ limit: 2, }), sort: 
DatadogAPIClient::V2::AuditLogsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } api_instance.search_audit_logs_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search Audit Logs events returns "OK" response ``` // Search Audit Logs events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_audit::AuditAPI; use datadog_api_client::datadogV2::api_audit::SearchAuditLogsOptionalParams; use datadog_api_client::datadogV2::model::AuditLogsQueryFilter; use datadog_api_client::datadogV2::model::AuditLogsQueryOptions; use datadog_api_client::datadogV2::model::AuditLogsQueryPageOptions; use datadog_api_client::datadogV2::model::AuditLogsSearchEventsRequest; use datadog_api_client::datadogV2::model::AuditLogsSort; #[tokio::main] async fn main() { let body = AuditLogsSearchEventsRequest::new() .filter( AuditLogsQueryFilter::new() .from("now-15m".to_string()) .query("@type:session AND @session.type:user".to_string()) .to("now".to_string()), ) .options( AuditLogsQueryOptions::new() .time_offset(0) .timezone("GMT".to_string()), ) .page(AuditLogsQueryPageOptions::new().limit(25)) .sort(AuditLogsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = AuditAPI::with_config(configuration); let resp = api .search_audit_logs(SearchAuditLogsOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` // Search Audit Logs events returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_audit::AuditAPI; use datadog_api_client::datadogV2::api_audit::SearchAuditLogsOptionalParams; use datadog_api_client::datadogV2::model::AuditLogsQueryFilter; use datadog_api_client::datadogV2::model::AuditLogsQueryOptions; use datadog_api_client::datadogV2::model::AuditLogsQueryPageOptions; use datadog_api_client::datadogV2::model::AuditLogsSearchEventsRequest; use datadog_api_client::datadogV2::model::AuditLogsSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = AuditLogsSearchEventsRequest::new() .filter( AuditLogsQueryFilter::new() .from("now-15m".to_string()) .to("now".to_string()), ) .options(AuditLogsQueryOptions::new().timezone("GMT".to_string())) .page(AuditLogsQueryPageOptions::new().limit(2)) .sort(AuditLogsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = AuditAPI::with_config(configuration); let response = api.search_audit_logs_with_pagination(SearchAuditLogsOptionalParams::default().body(body)); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search Audit Logs events returns "OK" response ``` /** * Search Audit Logs events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuditApi(configuration); const params: v2.AuditApiSearchAuditLogsRequest = { body: { filter: { from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }, options: { timeOffset: 0, timezone: "GMT", }, page: { limit: 25, }, sort: "timestamp", }, }; apiInstance .searchAuditLogs(params) .then((data: v2.AuditLogsEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search Audit Logs events returns "OK" response with pagination ``` /** * Search Audit Logs events returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuditApi(configuration); const params: v2.AuditApiSearchAuditLogsRequest = { body: { filter: { from: "now-15m", to: "now", }, options: { timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchAuditLogsWithPagination( params )) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of Audit Logs events](https://docs.datadoghq.com/api/latest/audit/#get-a-list-of-audit-logs-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/audit/#get-a-list-of-audit-logs-events-v2) GET https://api.ap1.datadoghq.com/api/v2/audit/eventshttps://api.ap2.datadoghq.com/api/v2/audit/eventshttps://api.datadoghq.eu/api/v2/audit/eventshttps://api.ddog-gov.com/api/v2/audit/eventshttps://api.datadoghq.com/api/v2/audit/eventshttps://api.us3.datadoghq.com/api/v2/audit/eventshttps://api.us5.datadoghq.com/api/v2/audit/events ### Overview List endpoint returns events that match a Audit Logs search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to see your latest Audit Logs events. This endpoint requires the `audit_logs_read` permission. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following Audit Logs syntax. filter[from] string Minimum timestamp for requested events. filter[to] string Maximum timestamp for requested events. sort enum Order of events in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of events in the response. 
### Response * [200](https://docs.datadoghq.com/api/latest/audit/#ListAuditLogs-200-v2) * [400](https://docs.datadoghq.com/api/latest/audit/#ListAuditLogs-400-v2) * [403](https://docs.datadoghq.com/api/latest/audit/#ListAuditLogs-403-v2) * [429](https://docs.datadoghq.com/api/latest/audit/#ListAuditLogs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) Response object with all events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from Audit Logs events. message string Message of the event. service string Name of the application or service generating Audit Logs events. This name is used to correlate Audit Logs to APM, so make sure you specify the same value when you use both products. tags [string] Array of tags associated with your event. timestamp date-time Timestamp of your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `audit` default: `audit` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 Time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string Unique code for this type of warning. detail string Detailed explanation of this specific warning. title string Short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "message": "string", "service": "web-app", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "audit" } ], "links": { "next": "https://app.datadoghq.com/api/v2/audit/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/audit/) * [Example](https://docs.datadoghq.com/api/latest/audit/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/audit/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/audit/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/audit/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/audit/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/audit/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/audit/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/audit/?code-lang=typescript) ##### Get a list of Audit Logs events Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/audit/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of Audit Logs events ``` """ Get a list of Audit Logs events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.audit_api import AuditApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuditApi(api_client) response = api_instance.list_audit_logs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of Audit Logs events ``` # Get a list of Audit Logs events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuditAPI.new p api_instance.list_audit_logs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of Audit Logs events ``` // Get a list of Audit Logs events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuditApi(apiClient) resp, r, err := api.ListAuditLogs(ctx, *datadogV2.NewListAuditLogsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuditApi.ListAuditLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuditApi.ListAuditLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of Audit Logs events ``` // Get a list of Audit Logs events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuditApi; import com.datadog.api.client.v2.model.AuditLogsEventsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuditApi apiInstance = new AuditApi(defaultClient); try { AuditLogsEventsResponse result = apiInstance.listAuditLogs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuditApi#listAuditLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of Audit Logs events ``` // Get a list of Audit Logs events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_audit::AuditAPI; use datadog_api_client::datadogV2::api_audit::ListAuditLogsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AuditAPI::with_config(configuration); let resp = api .list_audit_logs(ListAuditLogsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of Audit Logs events ``` /** * Get a list of Audit Logs events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuditApi(configuration); apiInstance .listAuditLogs() .then((data: v2.AuditLogsEventsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=dc19c66a-60f3-4198-8ffe-e06e69e560a9&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=830fa690-e35b-4750-8694-b7601907dc02&pt=Audit&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faudit%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=dc19c66a-60f3-4198-8ffe-e06e69e560a9&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=830fa690-e35b-4750-8694-b7601907dc02&pt=Audit&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faudit%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=1797db4b-5d35-42d0-9f38-b8480cc60a7c&bo=2&sid=d4936d40f0be11f0ac22fbecbc28d170&vid=d4939990f0be11f08645c313e3c6d719&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Audit&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Faudit%2F&r=<=1156&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=661964) --- # Source: https://docs.datadoghq.com/api/latest/authentication/ # Authentication All requests to Datadog’s API must be authenticated. Requests that write data require reporting access and require an `API key`. Requests that read data require full access and also require an `application key`. **Note:** All Datadog API clients are configured by default to consume Datadog US site APIs. If you are on the Datadog EU site, set the environment variable `DATADOG_HOST` to `https://api.datadoghq.eu` or override this value directly when creating your client. [Manage your account’s API and application keys](https://app.datadoghq.com/organization-settings/) in Datadog, and see the [API and Application Keys page](https://docs.datadoghq.com/account_management/api-app-keys/) in the documentation. ## [Validate API key](https://docs.datadoghq.com/api/latest/authentication/#validate-api-key) * [v1 (latest)](https://docs.datadoghq.com/api/latest/authentication/#validate-api-key-v1) GET https://api.ap1.datadoghq.com/api/v1/validatehttps://api.ap2.datadoghq.com/api/v1/validatehttps://api.datadoghq.eu/api/v1/validatehttps://api.ddog-gov.com/api/v1/validatehttps://api.datadoghq.com/api/v1/validatehttps://api.us3.datadoghq.com/api/v1/validatehttps://api.us5.datadoghq.com/api/v1/validate ### Overview Check if the API key (not the APP key) is valid. If invalid, a 403 is returned. ### Response * [200](https://docs.datadoghq.com/api/latest/authentication/#Validate-200-v1) * [403](https://docs.datadoghq.com/api/latest/authentication/#Validate-403-v1) * [429](https://docs.datadoghq.com/api/latest/authentication/#Validate-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/authentication/) * [Example](https://docs.datadoghq.com/api/latest/authentication/) Represent validation endpoint responses. 
Expand All Field Type Description valid boolean Return `true` if the authentication response is valid. ``` { "valid": true } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/authentication/) * [Example](https://docs.datadoghq.com/api/latest/authentication/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authentication/) * [Example](https://docs.datadoghq.com/api/latest/authentication/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authentication/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/authentication/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authentication/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/authentication/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authentication/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/authentication/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authentication/?code-lang=typescript) ##### Validate API key Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/validate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" ``` ##### Validate API key ``` """ Validate API key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.authentication_api import AuthenticationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthenticationApi(api_client) response = api_instance.validate() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Validate API key ``` # Validate API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AuthenticationAPI.new p api_instance.validate() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Validate API key ``` // Validate API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAuthenticationApi(apiClient) resp, r, err := api.Validate(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`AuthenticationApi.Validate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuthenticationApi.Validate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Validate API key ``` // Validate API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AuthenticationApi; import com.datadog.api.client.v1.model.AuthenticationValidationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthenticationApi apiInstance = new AuthenticationApi(defaultClient); try { AuthenticationValidationResponse result = apiInstance.validate(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuthenticationApi#validate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Validate API key ``` // Validate API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_authentication::AuthenticationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AuthenticationAPI::with_config(configuration); let resp = api.validate().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Validate API key ``` /** * Validate API key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AuthenticationApi(configuration); apiInstance .validate() .then((data: v1.AuthenticationValidationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=76592dc0-8f68-4f6f-bb88-deaea1f8857a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=bd7b4083-acf3-4486-9141-73bb34aed41a&pt=Authentication&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fauthentication%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=76592dc0-8f68-4f6f-bb88-deaea1f8857a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=bd7b4083-acf3-4486-9141-73bb34aed41a&pt=Authentication&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fauthentication%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=b7fb8d81-b6ba-4547-a82a-aacd1c89087a&bo=2&sid=d872ad90f0be11f08643b917bdf1bf9a&vid=d8733b60f0be11f081df2b1decc9a94d&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Authentication&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fauthentication%2F&r=<=1132&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=667086) --- # Source: https://docs.datadoghq.com/api/latest/authn-mappings/ # AuthN Mappings [The AuthN Mappings API](https://docs.datadoghq.com/account_management/authn_mapping/?tab=example) is used to automatically map groups of users to roles in Datadog using attributes sent from Identity Providers. Use these endpoints to manage your AuthN Mappings. ## [Get an AuthN Mapping by UUID](https://docs.datadoghq.com/api/latest/authn-mappings/#get-an-authn-mapping-by-uuid) * [v2 (latest)](https://docs.datadoghq.com/api/latest/authn-mappings/#get-an-authn-mapping-by-uuid-v2) GET https://api.ap1.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.ap2.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.eu/api/v2/authn_mappings/{authn_mapping_id}https://api.ddog-gov.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us3.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us5.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id} ### Overview Get an AuthN Mapping specified by the AuthN Mapping UUID. This endpoint requires the `user_access_read` permission. ### Arguments #### Path Parameters Name Type Description authn_mapping_id [_required_] string The UUID of the AuthN Mapping. 
### Response * [200](https://docs.datadoghq.com/api/latest/authn-mappings/#GetAuthNMapping-200-v2) * [403](https://docs.datadoghq.com/api/latest/authn-mappings/#GetAuthNMapping-403-v2) * [404](https://docs.datadoghq.com/api/latest/authn-mappings/#GetAuthNMapping-404-v2) * [429](https://docs.datadoghq.com/api/latest/authn-mappings/#GetAuthNMapping-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) AuthN Mapping response from the API. Field Type Description data object The AuthN Mapping object returned by API. attributes object Attributes of AuthN Mapping. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. created_at date-time Creation time of the AuthN Mapping. modified_at date-time Time of last AuthN Mapping modification. saml_assertion_attribute_id string The ID of the SAML assertion attribute. id [_required_] string ID of the AuthN Mapping. relationships object All relationships associated with AuthN Mapping. role object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` saml_assertion_attribute object AuthN Mapping relationship to SAML Assertion Attribute. data [_required_] object Data of AuthN Mapping relationship to SAML Assertion Attribute. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` team object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. Allowed enum values: `authn_mappings` default: `authn_mappings` included [ ] Included data in the AuthN Mapping response. Option 1 object SAML assertion attribute. attributes object Key/Value pair of attributes used in SAML assertion attributes. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object Team. attributes object Team attributes. 
avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team handle string The team's identifier link_count int32 The number of links belonging to the team name string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team id string The ID of the Team. type enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "attribute_key": "member-of", "attribute_value": "Development", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "saml_assertion_attribute_id": "0" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } }, "saml_assertion_attribute": { "data": { "id": "0", "type": "saml_assertion_attributes" } }, "team": { "data": { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team" } } }, "type": "authn_mappings" }, "included": [ { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "0", "type": "saml_assertion_attributes" } ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=typescript) ##### Get an AuthN Mapping by UUID Copy ``` # Path parameters export authn_mapping_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/authn_mappings/${authn_mapping_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an AuthN Mapping by UUID ``` """ Get an AuthN Mapping by UUID returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = environ["AUTHN_MAPPING_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthNMappingsApi(api_client) response = api_instance.get_authn_mapping( authn_mapping_id=AUTHN_MAPPING_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an AuthN Mapping by UUID ``` # Get an AuthN Mapping by UUID returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuthNMappingsAPI.new # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = ENV["AUTHN_MAPPING_DATA_ID"] p api_instance.get_authn_mapping(AUTHN_MAPPING_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an AuthN Mapping by UUID ``` // Get an AuthN Mapping by UUID returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "authn_mapping" in the system AuthnMappingDataID := os.Getenv("AUTHN_MAPPING_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuthNMappingsApi(apiClient) resp, r, err := api.GetAuthNMapping(ctx, AuthnMappingDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuthNMappingsApi.GetAuthNMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP 
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuthNMappingsApi.GetAuthNMapping`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an AuthN Mapping by UUID ``` // Get an AuthN Mapping by UUID returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuthNMappingsApi; import com.datadog.api.client.v2.model.AuthNMappingResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthNMappingsApi apiInstance = new AuthNMappingsApi(defaultClient); // there is a valid "authn_mapping" in the system String AUTHN_MAPPING_DATA_ID = System.getenv("AUTHN_MAPPING_DATA_ID"); try { AuthNMappingResponse result = apiInstance.getAuthNMapping(AUTHN_MAPPING_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuthNMappingsApi#getAuthNMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an AuthN Mapping by UUID ``` // Get an AuthN Mapping by UUID returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_authn_mappings::AuthNMappingsAPI; #[tokio::main] async fn main() { // there is a valid "authn_mapping" in the system let authn_mapping_data_id = std::env::var("AUTHN_MAPPING_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = AuthNMappingsAPI::with_config(configuration); let resp = api.get_authn_mapping(authn_mapping_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an AuthN Mapping by UUID ``` /** * Get an AuthN Mapping by UUID returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuthNMappingsApi(configuration); // there is a valid "authn_mapping" in the system const AUTHN_MAPPING_DATA_ID = process.env.AUTHN_MAPPING_DATA_ID as string; const params: v2.AuthNMappingsApiGetAuthNMappingRequest = { authnMappingId: AUTHN_MAPPING_DATA_ID, }; apiInstance .getAuthNMapping(params) .then((data: v2.AuthNMappingResponse) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an AuthN Mapping](https://docs.datadoghq.com/api/latest/authn-mappings/#edit-an-authn-mapping) * [v2 (latest)](https://docs.datadoghq.com/api/latest/authn-mappings/#edit-an-authn-mapping-v2) PATCH https://api.ap1.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.ap2.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.eu/api/v2/authn_mappings/{authn_mapping_id}https://api.ddog-gov.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us3.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us5.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id} ### Overview Edit an AuthN Mapping. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description authn_mapping_id [_required_] string The UUID of the AuthN Mapping. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) Field Type Description data [_required_] object Data for updating an AuthN Mapping. attributes object Key/Value pair of attributes used for update request. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. id [_required_] string ID of the AuthN Mapping. relationships Relationship of AuthN Mapping update object to a Role or Team. Option 1 object Relationship of AuthN Mapping to a Role. role [_required_] object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` Option 2 object Relationship of AuthN Mapping to a Team. team [_required_] object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. 
Allowed enum values: `authn_mappings` default: `authn_mappings` ``` { "data": { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "string", "type": "roles" } } }, "type": "authn_mappings" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-200-v2) * [400](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-400-v2) * [403](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-403-v2) * [404](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-404-v2) * [409](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-409-v2) * [422](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-422-v2) * [429](https://docs.datadoghq.com/api/latest/authn-mappings/#UpdateAuthNMapping-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) AuthN Mapping response from the API. Field Type Description data object The AuthN Mapping object returned by API. attributes object Attributes of AuthN Mapping. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. created_at date-time Creation time of the AuthN Mapping. modified_at date-time Time of last AuthN Mapping modification. saml_assertion_attribute_id string The ID of the SAML assertion attribute. id [_required_] string ID of the AuthN Mapping. relationships object All relationships associated with AuthN Mapping. role object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` saml_assertion_attribute object AuthN Mapping relationship to SAML Assertion Attribute. data [_required_] object Data of AuthN Mapping relationship to SAML Assertion Attribute. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` team object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. Allowed enum values: `authn_mappings` default: `authn_mappings` included [ ] Included data in the AuthN Mapping response. Option 1 object SAML assertion attribute. attributes object Key/Value pair of attributes used in SAML assertion attributes. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. 
user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object Team. attributes object Team attributes. avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team handle string The team's identifier link_count int32 The number of links belonging to the team name string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team id string The ID of the Team. type enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "attribute_key": "member-of", "attribute_value": "Development", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "saml_assertion_attribute_id": "0" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } }, "saml_assertion_attribute": { "data": { "id": "0", "type": "saml_assertion_attributes" } }, "team": { "data": { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team" } } }, "type": "authn_mappings" }, "included": [ { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "0", "type": "saml_assertion_attributes" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=typescript) ##### Edit an AuthN Mapping returns "OK" response Copy ``` # Path parameters export authn_mapping_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/authn_mappings/${authn_mapping_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "string", "type": "roles" } } }, "type": "authn_mappings" } } EOF ``` ##### Edit an AuthN Mapping returns "OK" response ``` // Edit an AuthN Mapping returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "authn_mapping" in the system AuthnMappingDataID := os.Getenv("AUTHN_MAPPING_DATA_ID") // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV2.AuthNMappingUpdateRequest{ Data: datadogV2.AuthNMappingUpdateData{ Attributes: &datadogV2.AuthNMappingUpdateAttributes{ AttributeKey: datadog.PtrString("member-of"), AttributeValue: datadog.PtrString("Development"), }, Id: AuthnMappingDataID, Relationships: &datadogV2.AuthNMappingUpdateRelationships{ AuthNMappingRelationshipToRole: &datadogV2.AuthNMappingRelationshipToRole{ Role: datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString(RoleDataID), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, }, }}, Type: datadogV2.AUTHNMAPPINGSTYPE_AUTHN_MAPPINGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuthNMappingsApi(apiClient) resp, r, err := api.UpdateAuthNMapping(ctx, AuthnMappingDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuthNMappingsApi.UpdateAuthNMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuthNMappingsApi.UpdateAuthNMapping`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an AuthN Mapping returns "OK" response ``` // Edit an AuthN Mapping 
returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuthNMappingsApi; import com.datadog.api.client.v2.model.AuthNMappingRelationshipToRole; import com.datadog.api.client.v2.model.AuthNMappingResponse; import com.datadog.api.client.v2.model.AuthNMappingUpdateAttributes; import com.datadog.api.client.v2.model.AuthNMappingUpdateData; import com.datadog.api.client.v2.model.AuthNMappingUpdateRelationships; import com.datadog.api.client.v2.model.AuthNMappingUpdateRequest; import com.datadog.api.client.v2.model.AuthNMappingsType; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthNMappingsApi apiInstance = new AuthNMappingsApi(defaultClient); // there is a valid "authn_mapping" in the system String AUTHN_MAPPING_DATA_ID = System.getenv("AUTHN_MAPPING_DATA_ID"); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); AuthNMappingUpdateRequest body = new AuthNMappingUpdateRequest() .data( new AuthNMappingUpdateData() .attributes( new AuthNMappingUpdateAttributes() .attributeKey("member-of") .attributeValue("Development")) .id(AUTHN_MAPPING_DATA_ID) .relationships( new AuthNMappingUpdateRelationships( new AuthNMappingRelationshipToRole() .role( new RelationshipToRole() .data( new RelationshipToRoleData() .id(ROLE_DATA_ID) .type(RolesType.ROLES))))) .type(AuthNMappingsType.AUTHN_MAPPINGS)); try { AuthNMappingResponse result = apiInstance.updateAuthNMapping(AUTHN_MAPPING_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuthNMappingsApi#updateAuthNMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an AuthN Mapping returns "OK" response ``` """ Edit an AuthN Mapping returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi from datadog_api_client.v2.model.authn_mapping_relationship_to_role import AuthNMappingRelationshipToRole from datadog_api_client.v2.model.authn_mapping_update_attributes import AuthNMappingUpdateAttributes from datadog_api_client.v2.model.authn_mapping_update_data import AuthNMappingUpdateData from datadog_api_client.v2.model.authn_mapping_update_request import AuthNMappingUpdateRequest from datadog_api_client.v2.model.authn_mappings_type import AuthNMappingsType from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = environ["AUTHN_MAPPING_DATA_ID"] # 
there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = AuthNMappingUpdateRequest( data=AuthNMappingUpdateData( attributes=AuthNMappingUpdateAttributes( attribute_key="member-of", attribute_value="Development", ), id=AUTHN_MAPPING_DATA_ID, relationships=AuthNMappingRelationshipToRole( role=RelationshipToRole( data=RelationshipToRoleData( id=ROLE_DATA_ID, type=RolesType.ROLES, ), ), ), type=AuthNMappingsType.AUTHN_MAPPINGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthNMappingsApi(api_client) response = api_instance.update_authn_mapping(authn_mapping_id=AUTHN_MAPPING_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an AuthN Mapping returns "OK" response ``` # Edit an AuthN Mapping returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuthNMappingsAPI.new # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = ENV["AUTHN_MAPPING_DATA_ID"] # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::AuthNMappingUpdateRequest.new({ data: DatadogAPIClient::V2::AuthNMappingUpdateData.new({ attributes: DatadogAPIClient::V2::AuthNMappingUpdateAttributes.new({ attribute_key: "member-of", attribute_value: "Development", }), id: AUTHN_MAPPING_DATA_ID, relationships: DatadogAPIClient::V2::AuthNMappingRelationshipToRole.new({ role: DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, }), }), }), type: DatadogAPIClient::V2::AuthNMappingsType::AUTHN_MAPPINGS, }), }) p api_instance.update_authn_mapping(AUTHN_MAPPING_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an AuthN Mapping returns "OK" response ``` // Edit an AuthN Mapping returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_authn_mappings::AuthNMappingsAPI; use datadog_api_client::datadogV2::model::AuthNMappingRelationshipToRole; use datadog_api_client::datadogV2::model::AuthNMappingUpdateAttributes; use datadog_api_client::datadogV2::model::AuthNMappingUpdateData; use datadog_api_client::datadogV2::model::AuthNMappingUpdateRelationships; use datadog_api_client::datadogV2::model::AuthNMappingUpdateRequest; use datadog_api_client::datadogV2::model::AuthNMappingsType; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "authn_mapping" in the system let authn_mapping_data_id = std::env::var("AUTHN_MAPPING_DATA_ID").unwrap(); // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = 
AuthNMappingUpdateRequest::new( AuthNMappingUpdateData::new( authn_mapping_data_id.clone(), AuthNMappingsType::AUTHN_MAPPINGS, ) .attributes( AuthNMappingUpdateAttributes::new() .attribute_key("member-of".to_string()) .attribute_value("Development".to_string()), ) .relationships( AuthNMappingUpdateRelationships::AuthNMappingRelationshipToRole(Box::new( AuthNMappingRelationshipToRole::new( RelationshipToRole::new().data( RelationshipToRoleData::new() .id(role_data_id.clone()) .type_(RolesType::ROLES), ), ), )), ), ); let configuration = datadog::Configuration::new(); let api = AuthNMappingsAPI::with_config(configuration); let resp = api .update_authn_mapping(authn_mapping_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an AuthN Mapping returns "OK" response ``` /** * Edit an AuthN Mapping returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuthNMappingsApi(configuration); // there is a valid "authn_mapping" in the system const AUTHN_MAPPING_DATA_ID = process.env.AUTHN_MAPPING_DATA_ID as string; // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.AuthNMappingsApiUpdateAuthNMappingRequest = { body: { data: { attributes: { attributeKey: "member-of", attributeValue: "Development", }, id: AUTHN_MAPPING_DATA_ID, relationships: { role: { data: { id: ROLE_DATA_ID, type: "roles", }, }, }, type: "authn_mappings", }, }, authnMappingId: AUTHN_MAPPING_DATA_ID, }; apiInstance .updateAuthNMapping(params) .then((data: v2.AuthNMappingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an AuthN Mapping](https://docs.datadoghq.com/api/latest/authn-mappings/#delete-an-authn-mapping) * [v2 (latest)](https://docs.datadoghq.com/api/latest/authn-mappings/#delete-an-authn-mapping-v2) DELETE https://api.ap1.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.ap2.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.eu/api/v2/authn_mappings/{authn_mapping_id}https://api.ddog-gov.com/api/v2/authn_mappings/{authn_mapping_id}https://api.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us3.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id}https://api.us5.datadoghq.com/api/v2/authn_mappings/{authn_mapping_id} ### Overview Delete an AuthN Mapping specified by AuthN Mapping UUID. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description authn_mapping_id [_required_] string The UUID of the AuthN Mapping. 
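Before the response reference below, here is a minimal Python sketch of calling this endpoint and reporting the error statuses it documents. It assumes the `datadog-api-client` Python package, and that request failures surface as `ApiException` from `datadog_api_client.exceptions` (with `status` and `body` attributes), as in other OpenAPI-generated Datadog clients; adjust the import if your client version differs.

```
"""
Delete an AuthN Mapping and report the documented error responses (sketch).
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
# ASSUMPTION: ApiException is exposed here, as in other generated Datadog clients.
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi

# The UUID of an existing AuthN Mapping.
AUTHN_MAPPING_DATA_ID = environ["AUTHN_MAPPING_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AuthNMappingsApi(api_client)
    try:
        # A successful delete returns HTTP 204 with no body, so the call returns None.
        api_instance.delete_authn_mapping(authn_mapping_id=AUTHN_MAPPING_DATA_ID)
        print("AuthN Mapping deleted")
    except ApiException as e:
        # Expected failures: 403 (missing user_access_manage), 404 (unknown UUID), 429 (rate limited).
        print(f"Delete failed with status {e.status}: {e.body}")
```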
### Response * [204](https://docs.datadoghq.com/api/latest/authn-mappings/#DeleteAuthNMapping-204-v2) * [403](https://docs.datadoghq.com/api/latest/authn-mappings/#DeleteAuthNMapping-403-v2) * [404](https://docs.datadoghq.com/api/latest/authn-mappings/#DeleteAuthNMapping-404-v2) * [429](https://docs.datadoghq.com/api/latest/authn-mappings/#DeleteAuthNMapping-429-v2) OK Authentication Error * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=typescript) ##### Delete an AuthN Mapping Copy ``` # Path parameters export authn_mapping_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/authn_mappings/${authn_mapping_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an AuthN Mapping ``` """ Delete an AuthN Mapping returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = environ["AUTHN_MAPPING_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthNMappingsApi(api_client) api_instance.delete_authn_mapping( authn_mapping_id=AUTHN_MAPPING_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an AuthN Mapping ``` # Delete an AuthN Mapping returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuthNMappingsAPI.new # there is a valid "authn_mapping" in the system AUTHN_MAPPING_DATA_ID = ENV["AUTHN_MAPPING_DATA_ID"] api_instance.delete_authn_mapping(AUTHN_MAPPING_DATA_ID) ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AuthN Mapping ``` // Delete an AuthN Mapping returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "authn_mapping" in the system AuthnMappingDataID := os.Getenv("AUTHN_MAPPING_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuthNMappingsApi(apiClient) r, err := api.DeleteAuthNMapping(ctx, AuthnMappingDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuthNMappingsApi.DeleteAuthNMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an AuthN Mapping ``` // Delete an AuthN Mapping returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuthNMappingsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthNMappingsApi apiInstance = new AuthNMappingsApi(defaultClient); // there is a valid "authn_mapping" in the system String AUTHN_MAPPING_DATA_ID = System.getenv("AUTHN_MAPPING_DATA_ID"); try { apiInstance.deleteAuthNMapping(AUTHN_MAPPING_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling AuthNMappingsApi#deleteAuthNMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an AuthN Mapping ``` // Delete an AuthN Mapping returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_authn_mappings::AuthNMappingsAPI; #[tokio::main] async fn main() { // there is a valid "authn_mapping" in the system let authn_mapping_data_id = std::env::var("AUTHN_MAPPING_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = AuthNMappingsAPI::with_config(configuration); let resp = api .delete_authn_mapping(authn_mapping_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save 
the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an AuthN Mapping ``` /** * Delete an AuthN Mapping returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuthNMappingsApi(configuration); // there is a valid "authn_mapping" in the system const AUTHN_MAPPING_DATA_ID = process.env.AUTHN_MAPPING_DATA_ID as string; const params: v2.AuthNMappingsApiDeleteAuthNMappingRequest = { authnMappingId: AUTHN_MAPPING_DATA_ID, }; apiInstance .deleteAuthNMapping(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all AuthN Mappings](https://docs.datadoghq.com/api/latest/authn-mappings/#list-all-authn-mappings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/authn-mappings/#list-all-authn-mappings-v2) GET https://api.ap1.datadoghq.com/api/v2/authn_mappingshttps://api.ap2.datadoghq.com/api/v2/authn_mappingshttps://api.datadoghq.eu/api/v2/authn_mappingshttps://api.ddog-gov.com/api/v2/authn_mappingshttps://api.datadoghq.com/api/v2/authn_mappingshttps://api.us3.datadoghq.com/api/v2/authn_mappingshttps://api.us5.datadoghq.com/api/v2/authn_mappings ### Overview List all AuthN Mappings in the org. This endpoint requires the `user_access_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort enum Sort AuthN Mappings depending on the given field. Allowed enum values: `created_at, -created_at, role_id, -role_id, saml_assertion_attribute_id, -saml_assertion_attribute_id, role.name, -role.name, saml_assertion_attribute.attribute_key, -saml_assertion_attribute.attribute_key, saml_assertion_attribute.attribute_value, -saml_assertion_attribute.attribute_value` filter string Filter all mappings by the given string. resource_type enum Filter by mapping resource type. Defaults to “role” if not specified. Allowed enum values: `role, team` ### Response * [200](https://docs.datadoghq.com/api/latest/authn-mappings/#ListAuthNMappings-200-v2) * [403](https://docs.datadoghq.com/api/latest/authn-mappings/#ListAuthNMappings-403-v2) * [429](https://docs.datadoghq.com/api/latest/authn-mappings/#ListAuthNMappings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) Array of AuthN Mappings response. Field Type Description data [object] Array of returned AuthN Mappings. attributes object Attributes of AuthN Mapping. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. created_at date-time Creation time of the AuthN Mapping. 
modified_at date-time Time of last AuthN Mapping modification. saml_assertion_attribute_id string The ID of the SAML assertion attribute. id [_required_] string ID of the AuthN Mapping. relationships object All relationships associated with AuthN Mapping. role object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` saml_assertion_attribute object AuthN Mapping relationship to SAML Assertion Attribute. data [_required_] object Data of AuthN Mapping relationship to SAML Assertion Attribute. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` team object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. Allowed enum values: `authn_mappings` default: `authn_mappings` included [ ] Included data in the AuthN Mapping response. Option 1 object SAML assertion attribute. attributes object Key/Value pair of attributes used in SAML assertion attributes. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object Team. attributes object Team attributes. avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team handle string The team's identifier link_count int32 The number of links belonging to the team name string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team id string The ID of the Team. type enum Team type Allowed enum values: `team` default: `team` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. 
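The `meta.page` counters described above make it possible to page through mappings with the `page[size]` and `page[number]` query strings. The sketch below is a hedged illustration with the Python client: it assumes those query strings are exposed as the snake_case keyword arguments `page_size` and `page_number` on `list_authn_mappings()`, following the client's usual convention. The JSON example of the full response shape follows below.

```
"""
Page through all AuthN Mappings (sketch).
ASSUMPTION: page[size] / page[number] map to the keyword arguments
page_size / page_number on list_authn_mappings().
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi

PAGE_SIZE = 50  # the documented maximum is 100

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AuthNMappingsApi(api_client)
    page_number = 0
    mappings = []
    while True:
        response = api_instance.list_authn_mappings(
            page_size=PAGE_SIZE,
            page_number=page_number,
        )
        batch = response.data or []
        mappings.extend(batch)
        # A short page means the last page was reached; response.meta.page.total_count
        # gives the exact total if you prefer to compare against it instead.
        if len(batch) < PAGE_SIZE:
            break
        page_number += 1

print(f"Fetched {len(mappings)} AuthN Mappings")
```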
``` { "data": [ { "attributes": { "attribute_key": "member-of", "attribute_value": "Development", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "saml_assertion_attribute_id": "0" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } }, "saml_assertion_attribute": { "data": { "id": "0", "type": "saml_assertion_attributes" } }, "team": { "data": { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team" } } }, "type": "authn_mappings" } ], "included": [ { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "0", "type": "saml_assertion_attributes" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=typescript) ##### List all AuthN Mappings Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/authn_mappings" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all AuthN Mappings ``` """ List all AuthN Mappings returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthNMappingsApi(api_client) response = api_instance.list_authn_mappings() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all AuthN Mappings ``` # List all AuthN Mappings returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuthNMappingsAPI.new p api_instance.list_authn_mappings() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all AuthN Mappings ``` // List all AuthN Mappings returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuthNMappingsApi(apiClient) resp, r, err := api.ListAuthNMappings(ctx, *datadogV2.NewListAuthNMappingsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuthNMappingsApi.ListAuthNMappings`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuthNMappingsApi.ListAuthNMappings`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all AuthN Mappings ``` // List all AuthN Mappings returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuthNMappingsApi; import com.datadog.api.client.v2.model.AuthNMappingsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthNMappingsApi apiInstance = new AuthNMappingsApi(defaultClient); try { AuthNMappingsResponse result = apiInstance.listAuthNMappings(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuthNMappingsApi#listAuthNMappings"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all AuthN Mappings ``` // List all AuthN Mappings returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_authn_mappings::AuthNMappingsAPI; use datadog_api_client::datadogV2::api_authn_mappings::ListAuthNMappingsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AuthNMappingsAPI::with_config(configuration); let resp = api .list_authn_mappings(ListAuthNMappingsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all AuthN Mappings ``` /** * List all AuthN Mappings returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuthNMappingsApi(configuration); apiInstance .listAuthNMappings() .then((data: v2.AuthNMappingsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an AuthN Mapping](https://docs.datadoghq.com/api/latest/authn-mappings/#create-an-authn-mapping) * [v2 (latest)](https://docs.datadoghq.com/api/latest/authn-mappings/#create-an-authn-mapping-v2) POST https://api.ap1.datadoghq.com/api/v2/authn_mappingshttps://api.ap2.datadoghq.com/api/v2/authn_mappingshttps://api.datadoghq.eu/api/v2/authn_mappingshttps://api.ddog-gov.com/api/v2/authn_mappingshttps://api.datadoghq.com/api/v2/authn_mappingshttps://api.us3.datadoghq.com/api/v2/authn_mappingshttps://api.us5.datadoghq.com/api/v2/authn_mappings ### Overview Create an AuthN Mapping. This endpoint requires the `user_access_manage` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) Field Type Description data [_required_] object Data for creating an AuthN Mapping. attributes object Key/Value pair of attributes used for create request. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. relationships Relationship of AuthN Mapping create object to a Role or Team. Option 1 object Relationship of AuthN Mapping to a Role. role [_required_] object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` Option 2 object Relationship of AuthN Mapping to a Team. team [_required_] object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. 
Allowed enum values: `authn_mappings` default: `authn_mappings` ``` { "data": { "attributes": { "attribute_key": "exampleauthnmapping", "attribute_value": "Example-AuthN-Mapping" }, "relationships": { "role": { "data": { "id": "string", "type": "roles" } } }, "type": "authn_mappings" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/authn-mappings/#CreateAuthNMapping-200-v2) * [400](https://docs.datadoghq.com/api/latest/authn-mappings/#CreateAuthNMapping-400-v2) * [403](https://docs.datadoghq.com/api/latest/authn-mappings/#CreateAuthNMapping-403-v2) * [404](https://docs.datadoghq.com/api/latest/authn-mappings/#CreateAuthNMapping-404-v2) * [429](https://docs.datadoghq.com/api/latest/authn-mappings/#CreateAuthNMapping-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) AuthN Mapping response from the API. Field Type Description data object The AuthN Mapping object returned by API. attributes object Attributes of AuthN Mapping. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. created_at date-time Creation time of the AuthN Mapping. modified_at date-time Time of last AuthN Mapping modification. saml_assertion_attribute_id string The ID of the SAML assertion attribute. id [_required_] string ID of the AuthN Mapping. relationships object All relationships associated with AuthN Mapping. role object Relationship to role. data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` saml_assertion_attribute object AuthN Mapping relationship to SAML Assertion Attribute. data [_required_] object Data of AuthN Mapping relationship to SAML Assertion Attribute. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` team object Relationship to team. data object Relationship to Team object. id string The unique identifier of the team. type enum Team type Allowed enum values: `team` default: `team` type [_required_] enum AuthN Mappings resource type. Allowed enum values: `authn_mappings` default: `authn_mappings` included [ ] Included data in the AuthN Mapping response. Option 1 object SAML assertion attribute. attributes object Key/Value pair of attributes used in SAML assertion attributes. attribute_key string Key portion of a key/value pair of the attribute sent from the Identity Provider. attribute_value string Value portion of a key/value pair of the attribute sent from the Identity Provider. id [_required_] string The ID of the SAML assertion attribute. type [_required_] enum SAML assertion attributes resource type. Allowed enum values: `saml_assertion_attributes` default: `saml_assertion_attributes` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. 
permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object Team. attributes object Team attributes. avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team handle string The team's identifier link_count int32 The number of links belonging to the team name string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team id string The ID of the Team. type enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "attribute_key": "member-of", "attribute_value": "Development", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "saml_assertion_attribute_id": "0" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "role": { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } }, "saml_assertion_attribute": { "data": { "id": "0", "type": "saml_assertion_attributes" } }, "team": { "data": { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team" } } }, "type": "authn_mappings" }, "included": [ { "attributes": { "attribute_key": "member-of", "attribute_value": "Development" }, "id": "0", "type": "saml_assertion_attributes" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Example](https://docs.datadoghq.com/api/latest/authn-mappings/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/authn-mappings/?code-lang=typescript) ##### Create an AuthN Mapping returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/authn_mappings" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "attribute_key": "exampleauthnmapping", "attribute_value": "Example-AuthN-Mapping" }, "relationships": { "role": { "data": { "id": "string", "type": "roles" } } }, "type": "authn_mappings" } } EOF ``` ##### Create an AuthN Mapping returns "OK" response ``` // Create an AuthN Mapping returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV2.AuthNMappingCreateRequest{ Data: datadogV2.AuthNMappingCreateData{ Attributes: &datadogV2.AuthNMappingCreateAttributes{ AttributeKey: datadog.PtrString("exampleauthnmapping"), AttributeValue: datadog.PtrString("Example-AuthN-Mapping"), }, Relationships: &datadogV2.AuthNMappingCreateRelationships{ AuthNMappingRelationshipToRole: &datadogV2.AuthNMappingRelationshipToRole{ Role: datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString(RoleDataID), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, }, }}, Type: datadogV2.AUTHNMAPPINGSTYPE_AUTHN_MAPPINGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAuthNMappingsApi(apiClient) resp, r, err := api.CreateAuthNMapping(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AuthNMappingsApi.CreateAuthNMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AuthNMappingsApi.CreateAuthNMapping`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an AuthN Mapping returns "OK" response ``` // Create an AuthN Mapping returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AuthNMappingsApi; import 
com.datadog.api.client.v2.model.AuthNMappingCreateAttributes; import com.datadog.api.client.v2.model.AuthNMappingCreateData; import com.datadog.api.client.v2.model.AuthNMappingCreateRelationships; import com.datadog.api.client.v2.model.AuthNMappingCreateRequest; import com.datadog.api.client.v2.model.AuthNMappingRelationshipToRole; import com.datadog.api.client.v2.model.AuthNMappingResponse; import com.datadog.api.client.v2.model.AuthNMappingsType; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AuthNMappingsApi apiInstance = new AuthNMappingsApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); AuthNMappingCreateRequest body = new AuthNMappingCreateRequest() .data( new AuthNMappingCreateData() .attributes( new AuthNMappingCreateAttributes() .attributeKey("exampleauthnmapping") .attributeValue("Example-AuthN-Mapping")) .relationships( new AuthNMappingCreateRelationships( new AuthNMappingRelationshipToRole() .role( new RelationshipToRole() .data( new RelationshipToRoleData() .id(ROLE_DATA_ID) .type(RolesType.ROLES))))) .type(AuthNMappingsType.AUTHN_MAPPINGS)); try { AuthNMappingResponse result = apiInstance.createAuthNMapping(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AuthNMappingsApi#createAuthNMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an AuthN Mapping returns "OK" response ``` """ Create an AuthN Mapping returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi from datadog_api_client.v2.model.authn_mapping_create_attributes import AuthNMappingCreateAttributes from datadog_api_client.v2.model.authn_mapping_create_data import AuthNMappingCreateData from datadog_api_client.v2.model.authn_mapping_create_request import AuthNMappingCreateRequest from datadog_api_client.v2.model.authn_mapping_relationship_to_role import AuthNMappingRelationshipToRole from datadog_api_client.v2.model.authn_mappings_type import AuthNMappingsType from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = AuthNMappingCreateRequest( data=AuthNMappingCreateData( attributes=AuthNMappingCreateAttributes( attribute_key="exampleauthnmapping", attribute_value="Example-AuthN-Mapping", ), relationships=AuthNMappingRelationshipToRole( role=RelationshipToRole( data=RelationshipToRoleData( id=ROLE_DATA_ID, type=RolesType.ROLES, ), ), ), 
type=AuthNMappingsType.AUTHN_MAPPINGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AuthNMappingsApi(api_client) response = api_instance.create_authn_mapping(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an AuthN Mapping returns "OK" response ``` # Create an AuthN Mapping returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AuthNMappingsAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::AuthNMappingCreateRequest.new({ data: DatadogAPIClient::V2::AuthNMappingCreateData.new({ attributes: DatadogAPIClient::V2::AuthNMappingCreateAttributes.new({ attribute_key: "exampleauthnmapping", attribute_value: "Example-AuthN-Mapping", }), relationships: DatadogAPIClient::V2::AuthNMappingRelationshipToRole.new({ role: DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, }), }), }), type: DatadogAPIClient::V2::AuthNMappingsType::AUTHN_MAPPINGS, }), }) p api_instance.create_authn_mapping(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an AuthN Mapping returns "OK" response ``` // Create an AuthN Mapping returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_authn_mappings::AuthNMappingsAPI; use datadog_api_client::datadogV2::model::AuthNMappingCreateAttributes; use datadog_api_client::datadogV2::model::AuthNMappingCreateData; use datadog_api_client::datadogV2::model::AuthNMappingCreateRelationships; use datadog_api_client::datadogV2::model::AuthNMappingCreateRequest; use datadog_api_client::datadogV2::model::AuthNMappingRelationshipToRole; use datadog_api_client::datadogV2::model::AuthNMappingsType; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = AuthNMappingCreateRequest::new( AuthNMappingCreateData::new(AuthNMappingsType::AUTHN_MAPPINGS) .attributes( AuthNMappingCreateAttributes::new() .attribute_key("exampleauthnmapping".to_string()) .attribute_value("Example-AuthN-Mapping".to_string()), ) .relationships( AuthNMappingCreateRelationships::AuthNMappingRelationshipToRole(Box::new( AuthNMappingRelationshipToRole::new( RelationshipToRole::new().data( RelationshipToRoleData::new() .id(role_data_id.clone()) .type_(RolesType::ROLES), ), ), )), ), ); let configuration = datadog::Configuration::new(); let api = AuthNMappingsAPI::with_config(configuration); let resp = api.create_authn_mapping(body).await; if let Ok(value) = resp { println!("{:#?}", 
value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an AuthN Mapping returns "OK" response ``` /** * Create an AuthN Mapping returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AuthNMappingsApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.AuthNMappingsApiCreateAuthNMappingRequest = { body: { data: { attributes: { attributeKey: "exampleauthnmapping", attributeValue: "Example-AuthN-Mapping", }, relationships: { role: { data: { id: ROLE_DATA_ID, type: "roles", }, }, }, type: "authn_mappings", }, }, }; apiInstance .createAuthNMapping(params) .then((data: v2.AuthNMappingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
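##### Create, update, and delete an AuthN Mapping (sketch)

The per-endpoint examples above each stand alone; the sketch below chains them with the Python client to show that the `data.id` returned by the create call is the `authn_mapping_id` used by the update and delete calls. It reuses only classes and methods that appear in the examples on this page; the `ROLE_DATA_ID` environment variable and the updated attribute value are illustrative.

```
"""
Create an AuthN Mapping, update its attribute pair, then delete it (sketch).
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.authn_mappings_api import AuthNMappingsApi
from datadog_api_client.v2.model.authn_mapping_create_attributes import AuthNMappingCreateAttributes
from datadog_api_client.v2.model.authn_mapping_create_data import AuthNMappingCreateData
from datadog_api_client.v2.model.authn_mapping_create_request import AuthNMappingCreateRequest
from datadog_api_client.v2.model.authn_mapping_relationship_to_role import AuthNMappingRelationshipToRole
from datadog_api_client.v2.model.authn_mapping_update_attributes import AuthNMappingUpdateAttributes
from datadog_api_client.v2.model.authn_mapping_update_data import AuthNMappingUpdateData
from datadog_api_client.v2.model.authn_mapping_update_request import AuthNMappingUpdateRequest
from datadog_api_client.v2.model.authn_mappings_type import AuthNMappingsType
from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole
from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData
from datadog_api_client.v2.model.roles_type import RolesType

# A valid role UUID (illustrative environment variable, as in the examples above).
ROLE_DATA_ID = environ["ROLE_DATA_ID"]

role_relationship = AuthNMappingRelationshipToRole(
    role=RelationshipToRole(
        data=RelationshipToRoleData(id=ROLE_DATA_ID, type=RolesType.ROLES),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = AuthNMappingsApi(api_client)

    # 1. Create: the response body carries the new mapping's ID.
    created = api.create_authn_mapping(
        body=AuthNMappingCreateRequest(
            data=AuthNMappingCreateData(
                attributes=AuthNMappingCreateAttributes(
                    attribute_key="member-of",
                    attribute_value="Development",
                ),
                relationships=role_relationship,
                type=AuthNMappingsType.AUTHN_MAPPINGS,
            ),
        )
    )
    mapping_id = created.data.id

    # 2. Update: PATCH requires the same ID in both the path and the body.
    api.update_authn_mapping(
        authn_mapping_id=mapping_id,
        body=AuthNMappingUpdateRequest(
            data=AuthNMappingUpdateData(
                attributes=AuthNMappingUpdateAttributes(
                    attribute_key="member-of",
                    attribute_value="Staging",  # illustrative new value
                ),
                id=mapping_id,
                type=AuthNMappingsType.AUTHN_MAPPINGS,
            ),
        ),
    )

    # 3. Delete: returns HTTP 204 on success.
    api.delete_authn_mapping(authn_mapping_id=mapping_id)
```

Run it like the other Python examples on this page: set `DD_SITE`, `DD_API_KEY`, and `DD_APP_KEY`, save the script, and execute it with `python3`.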
---

# Source: https://docs.datadoghq.com/api/latest/aws-integration/
[Setup](https://docs.datadoghq.com/containers/cluster_agent/setup/) * [Commands & Options](https://docs.datadoghq.com/containers/cluster_agent/commands/) * [Cluster Checks](https://docs.datadoghq.com/containers/cluster_agent/clusterchecks/) * [Endpoint Checks](https://docs.datadoghq.com/containers/cluster_agent/endpointschecks/) * [Admission Controller](https://docs.datadoghq.com/containers/cluster_agent/admission_controller/) * [Amazon ECS](https://docs.datadoghq.com/containers/amazon_ecs/) * [APM](https://docs.datadoghq.com/containers/amazon_ecs/apm/) * [Log collection](https://docs.datadoghq.com/containers/amazon_ecs/logs/) * [Tag extraction](https://docs.datadoghq.com/containers/amazon_ecs/tags/) * [Data collected](https://docs.datadoghq.com/containers/amazon_ecs/data_collected/) * [Managed Instances](https://docs.datadoghq.com/containers/amazon_ecs/managed_instances/) * [AWS Fargate with ECS](https://docs.datadoghq.com/integrations/ecs_fargate/) * [Datadog Operator](https://docs.datadoghq.com/containers/datadog_operator) * [Advanced Install](https://docs.datadoghq.com/containers/datadog_operator/advanced_install) * [Configuration](https://docs.datadoghq.com/containers/datadog_operator/config) * [Custom Checks](https://docs.datadoghq.com/containers/datadog_operator/custom_check) * [Data Collected](https://docs.datadoghq.com/containers/datadog_operator/data_collected) * [Secret Management](https://docs.datadoghq.com/containers/datadog_operator/secret_management) * [DatadogDashboard CRD](https://docs.datadoghq.com/containers/datadog_operator/crd_dashboard) * [DatadogMonitor CRD](https://docs.datadoghq.com/containers/datadog_operator/crd_monitor) * [DatadogSLO CRD](https://docs.datadoghq.com/containers/datadog_operator/crd_slo) * [Troubleshooting](https://docs.datadoghq.com/containers/troubleshooting/) * [Duplicate hosts](https://docs.datadoghq.com/containers/troubleshooting/duplicate_hosts) * [Cluster Agent](https://docs.datadoghq.com/containers/troubleshooting/cluster-agent) * [Cluster Checks](https://docs.datadoghq.com/containers/troubleshooting/cluster-and-endpoint-checks) * [HPA and Metrics Provider](https://docs.datadoghq.com/containers/troubleshooting/hpa) * [Admission Controller](https://docs.datadoghq.com/containers/troubleshooting/admission-controller) * [Log Collection](https://docs.datadoghq.com/containers/troubleshooting/log-collection) * [Guides](https://docs.datadoghq.com/containers/guide) * [Processes](https://docs.datadoghq.com/infrastructure/process) * [Increase Process Retention](https://docs.datadoghq.com/infrastructure/process/increase_process_retention/) * [Serverless](https://docs.datadoghq.com/serverless) * [AWS Lambda](https://docs.datadoghq.com/serverless/aws_lambda) * [Instrumentation](https://docs.datadoghq.com/serverless/aws_lambda/instrumentation) * [Managed Instances](https://docs.datadoghq.com/serverless/aws_lambda/managed_instances) * [Lambda Metrics](https://docs.datadoghq.com/serverless/aws_lambda/metrics) * [Distributed Tracing](https://docs.datadoghq.com/serverless/aws_lambda/distributed_tracing) * [Log Collection](https://docs.datadoghq.com/serverless/aws_lambda/logs) * [Remote Instrumentation](https://docs.datadoghq.com/serverless/aws_lambda/remote_instrumentation) * [Advanced Configuration](https://docs.datadoghq.com/serverless/aws_lambda/configuration) * [Continuous Profiler](https://docs.datadoghq.com/serverless/aws_lambda/profiling) * [Securing Functions](https://docs.datadoghq.com/serverless/aws_lambda/securing_functions) * [Deployment 
Tracking](https://docs.datadoghq.com/serverless/aws_lambda/deployment_tracking) * [OpenTelemetry](https://docs.datadoghq.com/serverless/aws_lambda/opentelemetry) * [Troubleshooting](https://docs.datadoghq.com/serverless/aws_lambda/troubleshooting) * [Lambda Web Adapter](https://docs.datadoghq.com/serverless/aws_lambda/lwa) * [FIPS Compliance](https://docs.datadoghq.com/serverless/aws_lambda/fips-compliance) * [AWS Step Functions](https://docs.datadoghq.com/serverless/step_functions) * [Installation](https://docs.datadoghq.com/serverless/step_functions/installation) * [Merge Step Functions and Lambda Traces](https://docs.datadoghq.com/serverless/step_functions/merge-step-functions-lambda) * [Enhanced Metrics](https://docs.datadoghq.com/serverless/step_functions/enhanced-metrics) * [Redrive Executions](https://docs.datadoghq.com/serverless/step_functions/redrive) * [Distributed Map States](https://docs.datadoghq.com/serverless/step_functions/distributed-maps) * [Troubleshooting](https://docs.datadoghq.com/serverless/step_functions/troubleshooting) * [AWS Fargate](https://docs.datadoghq.com/integrations/ecs_fargate) * [Azure App Service](https://docs.datadoghq.com/serverless/azure_app_service) * [Linux - Code](https://docs.datadoghq.com/serverless/azure_app_service/linux_code) * [Linux - Container](https://docs.datadoghq.com/serverless/azure_app_service/linux_container) * [Windows - Code](https://docs.datadoghq.com/serverless/azure_app_service/windows_code) * [Azure Container Apps](https://docs.datadoghq.com/serverless/azure_container_apps) * [In-Container](https://docs.datadoghq.com/serverless/azure_container_apps/in_container) * [Sidecar](https://docs.datadoghq.com/serverless/azure_container_apps/sidecar) * [Azure Functions](https://docs.datadoghq.com/serverless/azure_functions) * [Google Cloud Run](https://docs.datadoghq.com/serverless/google_cloud_run) * [Containers](https://docs.datadoghq.com/serverless/google_cloud_run/containers) * [Functions](https://docs.datadoghq.com/serverless/google_cloud_run/functions) * [Functions (1st generation)](https://docs.datadoghq.com/serverless/google_cloud_run/functions_1st_gen) * [Libraries & Integrations](https://docs.datadoghq.com/serverless/libraries_integrations) * [Glossary](https://docs.datadoghq.com/serverless/glossary) * [Guides](https://docs.datadoghq.com/serverless/guide/) * [Network Monitoring](https://docs.datadoghq.com/network_monitoring/) * [Cloud Network Monitoring](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/) * [Setup](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/setup/) * [Network Health](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/network_health/) * [Network Analytics](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/network_analytics/) * [Network Map](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/network_map/) * [Guides](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/guide/) * [Supported Cloud Services](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/supported_cloud_services/) * [Terms and Concepts](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/glossary) * [DNS Monitoring](https://docs.datadoghq.com/network_monitoring/dns/) * [Network Device Monitoring](https://docs.datadoghq.com/network_monitoring/devices) * [Setup](https://docs.datadoghq.com/network_monitoring/devices/setup) * 
[Integrations](https://docs.datadoghq.com/network_monitoring/devices/integrations) * [Profiles](https://docs.datadoghq.com/network_monitoring/devices/profiles) * [Configuration Management](https://docs.datadoghq.com/network_monitoring/devices/config_management) * [Maps](https://docs.datadoghq.com/network_monitoring/devices/topology) * [SNMP Metrics Reference](https://docs.datadoghq.com/network_monitoring/devices/data) * [Troubleshooting](https://docs.datadoghq.com/network_monitoring/devices/troubleshooting) * [Guides](https://docs.datadoghq.com/network_monitoring/devices/guide/) * [Terms and Concepts](https://docs.datadoghq.com/network_monitoring/devices/glossary) * [NetFlow Monitoring](https://docs.datadoghq.com/network_monitoring/netflow/) * [Monitors](https://docs.datadoghq.com/monitors/types/netflow) * [Network Path](https://docs.datadoghq.com/network_monitoring/network_path/) * [Setup](https://docs.datadoghq.com/network_monitoring/network_path/setup/) * [List View](https://docs.datadoghq.com/network_monitoring/network_path/list_view/) * [Path View](https://docs.datadoghq.com/network_monitoring/network_path/path_view/) * [Guides](https://docs.datadoghq.com/network_monitoring/network_path/guide) * [Terms and Concepts](https://docs.datadoghq.com/network_monitoring/network_path/glossary) * [Storage Management](https://docs.datadoghq.com/infrastructure/storage_management/) * [Amazon S3](https://docs.datadoghq.com/infrastructure/storage_management/amazon_s3/) * [Google Cloud Storage](https://docs.datadoghq.com/infrastructure/storage_management/google_cloud_storage/) * [Azure Blob Storage](https://docs.datadoghq.com/infrastructure/storage_management/azure_blob_storage/) * [Cloud Cost ](https://docs.datadoghq.com/api/latest/aws-integration/) * [Cloud Cost](https://docs.datadoghq.com/cloud_cost_management/) * [Datadog Costs](https://docs.datadoghq.com/cloud_cost_management/datadog_costs) * [Setup](https://docs.datadoghq.com/cloud_cost_management/setup/) * [AWS](https://docs.datadoghq.com/cloud_cost_management/setup/aws/) * [Azure](https://docs.datadoghq.com/cloud_cost_management/setup/azure/) * [Google Cloud](https://docs.datadoghq.com/cloud_cost_management/setup/google_cloud/) * [Oracle](https://docs.datadoghq.com/cloud_cost_management/setup/oracle/) * [SaaS Integrations](https://docs.datadoghq.com/cloud_cost_management/setup/saas_costs/) * [Custom](https://docs.datadoghq.com/cloud_cost_management/setup/custom) * [Tags](https://docs.datadoghq.com/cloud_cost_management/tags) * [Tag Explorer](https://docs.datadoghq.com/cloud_cost_management/tags/tag_explorer) * [Multisource Querying](https://docs.datadoghq.com/cloud_cost_management/tags/multisource_querying) * [Allocation](https://docs.datadoghq.com/cloud_cost_management/allocation) * [Tag Pipelines](https://docs.datadoghq.com/cloud_cost_management/tags/tag_pipelines) * [Container Cost Allocation](https://docs.datadoghq.com/cloud_cost_management/allocation/container_cost_allocation) * [BigQuery Costs](https://docs.datadoghq.com/cloud_cost_management/allocation/bigquery) * [Custom Allocation Rules](https://docs.datadoghq.com/cloud_cost_management/allocation/custom_allocation_rules) * [Reporting](https://docs.datadoghq.com/cloud_cost_management/reporting) * [Explorer](https://docs.datadoghq.com/cloud_cost_management/reporting/explorer) * [Scheduled Reports](https://docs.datadoghq.com/cloud_cost_management/reporting/scheduled_reports) * [Recommendations](https://docs.datadoghq.com/cloud_cost_management/recommendations) * [Custom 
Recommendations](https://docs.datadoghq.com/cloud_cost_management/recommendations/custom_recommendations) * [Planning](https://docs.datadoghq.com/cloud_cost_management/planning) * [Budgets](https://docs.datadoghq.com/cloud_cost_management/planning/budgets) * [Commitment Programs](https://docs.datadoghq.com/cloud_cost_management/planning/commitment_programs) * [Cost Changes](https://docs.datadoghq.com/cloud_cost_management/cost_changes) * [Monitors](https://docs.datadoghq.com/cloud_cost_management/cost_changes/monitors) * [Anomalies](https://docs.datadoghq.com/cloud_cost_management/cost_changes/anomalies) * [Real-Time Costs](https://docs.datadoghq.com/cloud_cost_management/cost_changes/real_time_costs) * [Application Performance ](https://docs.datadoghq.com/api/latest/aws-integration/) * [APM](https://docs.datadoghq.com/tracing/) * [APM Terms and Concepts](https://docs.datadoghq.com/tracing/glossary/) * [Application Instrumentation](https://docs.datadoghq.com/tracing/trace_collection/) * [Single Step Instrumentation](https://docs.datadoghq.com/tracing/trace_collection/single-step-apm/) * [Manually managed SDKs](https://docs.datadoghq.com/tracing/trace_collection/dd_libraries/) * [Code-based Custom Instrumentation](https://docs.datadoghq.com/tracing/trace_collection/custom_instrumentation/) * [Dynamic Instrumentation](https://docs.datadoghq.com/tracing/trace_collection/dynamic_instrumentation/) * [Library Compatibility](https://docs.datadoghq.com/tracing/trace_collection/compatibility/) * [Library Configuration](https://docs.datadoghq.com/tracing/trace_collection/library_config/) * [Configuration at Runtime](https://docs.datadoghq.com/tracing/trace_collection/runtime_config/) * [Trace Context Propagation](https://docs.datadoghq.com/tracing/trace_collection/trace_context_propagation/) * [Serverless Application Tracing](https://docs.datadoghq.com/serverless/distributed_tracing/) * [Proxy Tracing](https://docs.datadoghq.com/tracing/trace_collection/proxy_setup/) * [Span Tag Semantics](https://docs.datadoghq.com/tracing/trace_collection/tracing_naming_convention) * [Span Links](https://docs.datadoghq.com/tracing/trace_collection/span_links) * [APM Metrics Collection](https://docs.datadoghq.com/tracing/metrics/) * [Trace Metrics](https://docs.datadoghq.com/tracing/metrics/metrics_namespace/) * [Runtime Metrics](https://docs.datadoghq.com/tracing/metrics/runtime_metrics/) * [Trace Pipeline Configuration](https://docs.datadoghq.com/tracing/trace_pipeline/) * [Ingestion Mechanisms](https://docs.datadoghq.com/tracing/trace_pipeline/ingestion_mechanisms/) * [Ingestion Controls](https://docs.datadoghq.com/tracing/trace_pipeline/ingestion_controls/) * [Adaptive Sampling](https://docs.datadoghq.com/tracing/trace_pipeline/adaptive_sampling/) * [Generate Metrics](https://docs.datadoghq.com/tracing/trace_pipeline/generate_metrics/) * [Trace Retention](https://docs.datadoghq.com/tracing/trace_pipeline/trace_retention/) * [Usage Metrics](https://docs.datadoghq.com/tracing/trace_pipeline/metrics/) * [Correlate Traces with Other Telemetry](https://docs.datadoghq.com/tracing/other_telemetry/) * [Correlate DBM and Traces](https://docs.datadoghq.com/database_monitoring/connect_dbm_and_apm/) * [Correlate Logs and Traces](https://docs.datadoghq.com/tracing/other_telemetry/connect_logs_and_traces/) * [Correlate RUM and Traces](https://docs.datadoghq.com/tracing/other_telemetry/rum) * [Correlate Synthetics and Traces](https://docs.datadoghq.com/tracing/other_telemetry/synthetics/) * [Correlate Profiles and 
Traces](https://docs.datadoghq.com/profiler/connect_traces_and_profiles/) * [Trace Explorer](https://docs.datadoghq.com/tracing/trace_explorer/) * [Search Spans](https://docs.datadoghq.com/tracing/trace_explorer/search/) * [Query Syntax](https://docs.datadoghq.com/tracing/trace_explorer/query_syntax/) * [Trace Queries](https://docs.datadoghq.com/tracing/trace_explorer/trace_queries/) * [Span Tags and Attributes](https://docs.datadoghq.com/tracing/trace_explorer/span_tags_attributes/) * [Span Visualizations](https://docs.datadoghq.com/tracing/trace_explorer/visualize/) * [Trace View](https://docs.datadoghq.com/tracing/trace_explorer/trace_view/) * [Tag Analysis](https://docs.datadoghq.com/tracing/trace_explorer/tag_analysis/) * [Recommendations](https://docs.datadoghq.com/tracing/recommendations/) * [Code Origin for Spans](https://docs.datadoghq.com/tracing/code_origin) * [Service Observability](https://docs.datadoghq.com/tracing/services/) * [Software Catalog](https://docs.datadoghq.com/internal_developer_portal/software_catalog/) * [Service Page](https://docs.datadoghq.com/tracing/services/service_page/) * [Resource Page](https://docs.datadoghq.com/tracing/services/resource_page/) * [Deployment Tracking](https://docs.datadoghq.com/tracing/services/deployment_tracking/) * [Service Map](https://docs.datadoghq.com/tracing/services/services_map/) * [Inferred Services](https://docs.datadoghq.com/tracing/services/inferred_services) * [Remapping Rules for Inferred Entities](https://docs.datadoghq.com/tracing/services/inferred_entity_remapping_rules/) * [Service Remapping Rules](https://docs.datadoghq.com/tracing/services/service_remapping_rules) * [Service Override Removal](https://docs.datadoghq.com/tracing/services/service_override_removal) * [APM Monitors](https://docs.datadoghq.com/monitors/create/types/apm/) * [Endpoint Observability](https://docs.datadoghq.com/internal_developer_portal/software_catalog/endpoints/) * [Explore Endpoints](https://docs.datadoghq.com/internal_developer_portal/software_catalog/endpoints/explore_endpoints) * [Monitor Endpoints](https://docs.datadoghq.com/internal_developer_portal/software_catalog/endpoints/monitor_endpoints) * [Live Debugger](https://docs.datadoghq.com/tracing/live_debugger/) * [Error Tracking](https://docs.datadoghq.com/tracing/error_tracking/) * [Issue States](https://docs.datadoghq.com/tracing/error_tracking/issue_states) * [Error Tracking Explorer](https://docs.datadoghq.com/tracing/error_tracking/explorer) * [Error Grouping](https://docs.datadoghq.com/tracing/error_tracking/error_grouping) * [Monitors](https://docs.datadoghq.com/tracing/error_tracking/monitors) * [Identify Suspect Commits](https://docs.datadoghq.com/tracing/error_tracking/suspect_commits) * [Exception Replay](https://docs.datadoghq.com/tracing/error_tracking/exception_replay) * [Troubleshooting](https://docs.datadoghq.com/error_tracking/troubleshooting) * [Data Security](https://docs.datadoghq.com/tracing/configure_data_security/) * [Guides](https://docs.datadoghq.com/tracing/guide/) * [Troubleshooting](https://docs.datadoghq.com/tracing/troubleshooting/) * [Agent Rate Limits](https://docs.datadoghq.com/tracing/troubleshooting/agent_rate_limits) * [Agent APM metrics](https://docs.datadoghq.com/tracing/troubleshooting/agent_apm_metrics) * [Agent Resource Usage](https://docs.datadoghq.com/tracing/troubleshooting/agent_apm_resource_usage) * [Correlated Logs](https://docs.datadoghq.com/tracing/troubleshooting/correlated-logs-not-showing-up-in-the-trace-id-panel) * [PHP 5 Deep 
Call Stacks](https://docs.datadoghq.com/tracing/troubleshooting/php_5_deep_call_stacks) * [.NET diagnostic tool](https://docs.datadoghq.com/tracing/troubleshooting/dotnet_diagnostic_tool) * [APM Quantization](https://docs.datadoghq.com/tracing/troubleshooting/quantization) * [Go Compile-Time Instrumentation](https://docs.datadoghq.com/tracing/troubleshooting/go_compile_time) * [Tracer Startup Logs](https://docs.datadoghq.com/tracing/troubleshooting/tracer_startup_logs) * [Tracer Debug Logs](https://docs.datadoghq.com/tracing/troubleshooting/tracer_debug_logs) * [Connection Errors](https://docs.datadoghq.com/tracing/troubleshooting/connection_errors) * [Continuous Profiler](https://docs.datadoghq.com/profiler/) * [Enabling the Profiler](https://docs.datadoghq.com/profiler/enabling/) * [Supported Language and Tracer Versions](https://docs.datadoghq.com/profiler/enabling/supported_versions/) * [Java](https://docs.datadoghq.com/profiler/enabling/java/) * [Python](https://docs.datadoghq.com/profiler/enabling/python/) * [Go](https://docs.datadoghq.com/profiler/enabling/go/) * [Ruby](https://docs.datadoghq.com/profiler/enabling/ruby/) * [Node.js](https://docs.datadoghq.com/profiler/enabling/nodejs/) * [.NET](https://docs.datadoghq.com/profiler/enabling/dotnet/) * [PHP](https://docs.datadoghq.com/profiler/enabling/php/) * [C/C++/Rust](https://docs.datadoghq.com/profiler/enabling/ddprof/) * [Profile Types](https://docs.datadoghq.com/profiler/profile_types/) * [Profile Visualizations](https://docs.datadoghq.com/profiler/profile_visualizations/) * [Investigate Slow Traces or Endpoints](https://docs.datadoghq.com/profiler/connect_traces_and_profiles/) * [Compare Profiles](https://docs.datadoghq.com/profiler/compare_profiles) * [Automated Analysis](https://docs.datadoghq.com/profiler/automated_analysis) * [Profiler Troubleshooting](https://docs.datadoghq.com/profiler/profiler_troubleshooting/) * [Java](https://docs.datadoghq.com/profiler/profiler_troubleshooting/java/) * [Python](https://docs.datadoghq.com/profiler/profiler_troubleshooting/python/) * [Go](https://docs.datadoghq.com/profiler/profiler_troubleshooting/go/) * [Ruby](https://docs.datadoghq.com/profiler/profiler_troubleshooting/ruby/) * [Node.js](https://docs.datadoghq.com/profiler/profiler_troubleshooting/nodejs/) * [.NET](https://docs.datadoghq.com/profiler/profiler_troubleshooting/dotnet/) * [PHP](https://docs.datadoghq.com/profiler/profiler_troubleshooting/php/) * [C/C++/Rust](https://docs.datadoghq.com/profiler/profiler_troubleshooting/ddprof/) * [Guides](https://docs.datadoghq.com/profiler/guide/) * [Database Monitoring](https://docs.datadoghq.com/database_monitoring/) * [Agent Integration Overhead](https://docs.datadoghq.com/database_monitoring/agent_integration_overhead) * [Setup Architectures](https://docs.datadoghq.com/database_monitoring/architecture/) * [Setting Up Postgres](https://docs.datadoghq.com/database_monitoring/setup_postgres/) * [Self-hosted](https://docs.datadoghq.com/database_monitoring/setup_postgres/selfhosted) * [RDS](https://docs.datadoghq.com/database_monitoring/setup_postgres/rds) * [Aurora](https://docs.datadoghq.com/database_monitoring/setup_postgres/aurora) * [Google Cloud SQL](https://docs.datadoghq.com/database_monitoring/setup_postgres/gcsql) * [AlloyDB](https://docs.datadoghq.com/database_monitoring/setup_postgres/alloydb) * [Azure](https://docs.datadoghq.com/database_monitoring/setup_postgres/azure) * [Supabase](https://docs.datadoghq.com/database_monitoring/setup_postgres/supabase) * 
[Heroku](https://docs.datadoghq.com/database_monitoring/setup_postgres/heroku) * [Advanced Configuration](https://docs.datadoghq.com/database_monitoring/setup_postgres/advanced_configuration) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/setup_postgres/troubleshooting/) * [Setting Up MySQL](https://docs.datadoghq.com/database_monitoring/setup_mysql/) * [Self-hosted](https://docs.datadoghq.com/database_monitoring/setup_mysql/selfhosted) * [RDS](https://docs.datadoghq.com/database_monitoring/setup_mysql/rds) * [Aurora](https://docs.datadoghq.com/database_monitoring/setup_mysql/aurora) * [Google Cloud SQL](https://docs.datadoghq.com/database_monitoring/setup_mysql/gcsql) * [Azure](https://docs.datadoghq.com/database_monitoring/setup_mysql/azure) * [Advanced Configuration](https://docs.datadoghq.com/database_monitoring/setup_mysql/advanced_configuration) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/setup_mysql/troubleshooting/) * [Setting Up SQL Server](https://docs.datadoghq.com/database_monitoring/setup_sql_server/) * [Self-hosted](https://docs.datadoghq.com/database_monitoring/setup_sql_server/selfhosted/) * [RDS](https://docs.datadoghq.com/database_monitoring/setup_sql_server/rds/) * [Azure](https://docs.datadoghq.com/database_monitoring/setup_sql_server/azure/) * [Google Cloud SQL](https://docs.datadoghq.com/database_monitoring/setup_sql_server/gcsql/) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/setup_sql_server/troubleshooting/) * [Setting Up Oracle](https://docs.datadoghq.com/database_monitoring/setup_oracle/) * [Self-hosted](https://docs.datadoghq.com/database_monitoring/setup_oracle/selfhosted/) * [RDS](https://docs.datadoghq.com/database_monitoring/setup_oracle/rds/) * [RAC](https://docs.datadoghq.com/database_monitoring/setup_oracle/rac/) * [Exadata](https://docs.datadoghq.com/database_monitoring/setup_oracle/exadata/) * [Autonomous Database](https://docs.datadoghq.com/database_monitoring/setup_oracle/autonomous_database/) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/setup_oracle/troubleshooting/) * [Setting Up Amazon DocumentDB](https://docs.datadoghq.com/database_monitoring/setup_documentdb/) * [Amazon DocumentDB](https://docs.datadoghq.com/database_monitoring/setup_documentdb/amazon_documentdb) * [Setting Up MongoDB](https://docs.datadoghq.com/database_monitoring/setup_mongodb/) * [Self-hosted](https://docs.datadoghq.com/database_monitoring/setup_mongodb/selfhosted) * [MongoDB Atlas](https://docs.datadoghq.com/database_monitoring/setup_mongodb/mongodbatlas) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/setup_mongodb/troubleshooting/) * [Connecting DBM and Traces](https://docs.datadoghq.com/database_monitoring/connect_dbm_and_apm/) * [Data Collected](https://docs.datadoghq.com/database_monitoring/data_collected) * [Exploring Database Hosts](https://docs.datadoghq.com/database_monitoring/database_hosts/) * [Exploring Query Metrics](https://docs.datadoghq.com/database_monitoring/query_metrics/) * [Exploring Query Samples](https://docs.datadoghq.com/database_monitoring/query_samples/) * [Exploring Database Schemas](https://docs.datadoghq.com/database_monitoring/schema_explorer) * [Exploring Recommendations](https://docs.datadoghq.com/database_monitoring/recommendations/) * [Troubleshooting](https://docs.datadoghq.com/database_monitoring/troubleshooting/) * [Guides](https://docs.datadoghq.com/database_monitoring/guide/) * [Data Streams 
Monitoring](https://docs.datadoghq.com/data_streams/) * [Setup](https://docs.datadoghq.com/data_streams/setup) * [Kafka Messages](https://docs.datadoghq.com/data_streams/messages) * [Schema Tracking](https://docs.datadoghq.com/data_streams/schema_tracking) * [Dead Letter Queues](https://docs.datadoghq.com/data_streams/dead_letter_queues) * [Metrics and Tags](https://docs.datadoghq.com/data_streams/metrics_and_tags) * [Data Observability ](https://docs.datadoghq.com/api/latest/aws-integration/) * [Data Observability](https://docs.datadoghq.com/data_observability/) * [Quality Monitoring](https://docs.datadoghq.com/data_observability/quality_monitoring/) * [Data Warehouses](https://docs.datadoghq.com/data_observability/quality_monitoring/data_warehouses) * [Snowflake](https://docs.datadoghq.com/data_observability/quality_monitoring/data_warehouses/snowflake) * [Databricks](https://docs.datadoghq.com/data_observability/quality_monitoring/data_warehouses/databricks) * [BigQuery](https://docs.datadoghq.com/data_observability/quality_monitoring/data_warehouses/bigquery) * [Business Intelligence Integrations](https://docs.datadoghq.com/data_observability/quality_monitoring/business_intelligence) * [Tableau](https://docs.datadoghq.com/data_observability/quality_monitoring/business_intelligence/tableau) * [Sigma](https://docs.datadoghq.com/data_observability/quality_monitoring/business_intelligence/sigma) * [Metabase](https://docs.datadoghq.com/data_observability/quality_monitoring/business_intelligence/metabase) * [Power BI](https://docs.datadoghq.com/data_observability/quality_monitoring/business_intelligence/powerbi) * [Jobs Monitoring](https://docs.datadoghq.com/data_observability/jobs_monitoring/) * [Databricks](https://docs.datadoghq.com/data_observability/jobs_monitoring/databricks) * [Airflow](https://docs.datadoghq.com/data_observability/jobs_monitoring/airflow) * [dbt](https://docs.datadoghq.com/data_observability/jobs_monitoring/dbt) * [Spark on Kubernetes](https://docs.datadoghq.com/data_observability/jobs_monitoring/kubernetes) * [Spark on Amazon EMR](https://docs.datadoghq.com/data_observability/jobs_monitoring/emr) * [Spark on Google Dataproc](https://docs.datadoghq.com/data_observability/jobs_monitoring/dataproc) * [Custom Jobs (OpenLineage)](https://docs.datadoghq.com/data_observability/jobs_monitoring/openlineage) * [Datadog Agent for OpenLineage Proxy](https://docs.datadoghq.com/data_observability/jobs_monitoring/openlineage/datadog_agent_for_openlineage) * [Digital Experience ](https://docs.datadoghq.com/api/latest/aws-integration/) * [Real User Monitoring](https://docs.datadoghq.com/real_user_monitoring/) * [Application Monitoring](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/) * [Browser](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/browser/) * [Android and Android TV](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/android) * [iOS and tvOS](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/ios) * [Flutter](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/flutter) * [Kotlin Multiplatform](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/kotlin_multiplatform) * [React Native](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/react_native) * [Roku](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/roku) * [Unity](https://docs.datadoghq.com/real_user_monitoring/application_monitoring/unity) * 
[Platform](https://docs.datadoghq.com/real_user_monitoring/platform) * [Dashboards](https://docs.datadoghq.com/real_user_monitoring/platform/dashboards/) * [Monitors](https://docs.datadoghq.com/monitors/types/real_user_monitoring/) * [Generate Custom Metrics](https://docs.datadoghq.com/real_user_monitoring/platform/generate_metrics) * [Exploring RUM Data](https://docs.datadoghq.com/real_user_monitoring/explorer/) * [Search RUM Events](https://docs.datadoghq.com/real_user_monitoring/explorer/search/) * [Search Syntax](https://docs.datadoghq.com/real_user_monitoring/explorer/search_syntax/) * [Group](https://docs.datadoghq.com/real_user_monitoring/explorer/group/) * [Visualize](https://docs.datadoghq.com/real_user_monitoring/explorer/visualize/) * [Events](https://docs.datadoghq.com/real_user_monitoring/explorer/events/) * [Export](https://docs.datadoghq.com/real_user_monitoring/explorer/export/) * [Saved Views](https://docs.datadoghq.com/real_user_monitoring/explorer/saved_views/) * [Watchdog Insights for RUM](https://docs.datadoghq.com/real_user_monitoring/explorer/watchdog_insights/) * [Correlate RUM with Other Telemetry](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/) * [Correlate LLM with RUM](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/llm_observability) * [Correlate Logs with RUM](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/logs) * [Correlate Profiling with RUM](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/profiling/) * [Correlate Synthetics with RUM](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/synthetics/) * [Correlate Traces with RUM](https://docs.datadoghq.com/real_user_monitoring/correlate_with_other_telemetry/apm) * [Feature Flag Tracking](https://docs.datadoghq.com/real_user_monitoring/feature_flag_tracking) * [Setup](https://docs.datadoghq.com/real_user_monitoring/feature_flag_tracking/setup) * [Using Feature Flags](https://docs.datadoghq.com/real_user_monitoring/feature_flag_tracking/using_feature_flags) * [Error Tracking](https://docs.datadoghq.com/real_user_monitoring/error_tracking/) * [Explorer](https://docs.datadoghq.com/real_user_monitoring/error_tracking/explorer/) * [Issue States](https://docs.datadoghq.com/real_user_monitoring/error_tracking/issue_states) * [Track Browser Errors](https://docs.datadoghq.com/real_user_monitoring/error_tracking/browser/) * [Track Mobile Errors](https://docs.datadoghq.com/real_user_monitoring/error_tracking/mobile/) * [Error Grouping](https://docs.datadoghq.com/real_user_monitoring/error_tracking/error_grouping) * [Monitors](https://docs.datadoghq.com/real_user_monitoring/error_tracking/monitors) * [Identify Suspect Commits](https://docs.datadoghq.com/real_user_monitoring/error_tracking/suspect_commits) * [Troubleshooting](https://docs.datadoghq.com/real_user_monitoring/error_tracking/troubleshooting) * [RUM Without Limits](https://docs.datadoghq.com/real_user_monitoring/rum_without_limits/) * [Metrics](https://docs.datadoghq.com/real_user_monitoring/rum_without_limits/metrics) * [Retention Filters](https://docs.datadoghq.com/real_user_monitoring/rum_without_limits/retention_filters) * [Operations Monitoring](https://docs.datadoghq.com/real_user_monitoring/operations_monitoring/) * [Ownership of Views](https://docs.datadoghq.com/real_user_monitoring/ownership_of_views/) * [Guides](https://docs.datadoghq.com/real_user_monitoring/guide/) * [Data 
Security](https://docs.datadoghq.com/data_security/real_user_monitoring/) * [Synthetic Testing and Monitoring](https://docs.datadoghq.com/synthetics/) * [API Testing](https://docs.datadoghq.com/synthetics/api_tests/) * [HTTP](https://docs.datadoghq.com/synthetics/api_tests/http_tests) * [SSL](https://docs.datadoghq.com/synthetics/api_tests/ssl_tests) * [DNS](https://docs.datadoghq.com/synthetics/api_tests/dns_tests) * [WebSocket](https://docs.datadoghq.com/synthetics/api_tests/websocket_tests) * [TCP](https://docs.datadoghq.com/synthetics/api_tests/tcp_tests) * [UDP](https://docs.datadoghq.com/synthetics/api_tests/udp_tests) * [ICMP](https://docs.datadoghq.com/synthetics/api_tests/icmp_tests) * [GRPC](https://docs.datadoghq.com/synthetics/api_tests/grpc_tests) * [Error codes](https://docs.datadoghq.com/synthetics/api_tests/errors) * [Multistep API Testing](https://docs.datadoghq.com/synthetics/multistep) * [Browser Testing](https://docs.datadoghq.com/synthetics/browser_tests/) * [Recording Steps](https://docs.datadoghq.com/synthetics/browser_tests/test_steps) * [Browser Testing Results](https://docs.datadoghq.com/synthetics/browser_tests/test_results) * [Advanced Options for Steps](https://docs.datadoghq.com/synthetics/browser_tests/advanced_options) * [Authentication in Browser Testing](https://docs.datadoghq.com/synthetics/browser_tests/app-that-requires-login) * [Network Path Testing](https://docs.datadoghq.com/synthetics/network_path_tests/) * [Terms and Concepts](https://docs.datadoghq.com/synthetics/network_path_tests/glossary) * [Mobile Application Testing](https://docs.datadoghq.com/synthetics/mobile_app_testing/) * [Testing Steps](https://docs.datadoghq.com/synthetics/mobile_app_testing/mobile_app_tests/steps) * [Testing Results](https://docs.datadoghq.com/synthetics/mobile_app_testing/mobile_app_tests/results) * [Advanced Options for Steps](https://docs.datadoghq.com/synthetics/mobile_app_testing/mobile_app_tests/advanced_options) * [Supported Devices](https://docs.datadoghq.com/synthetics/mobile_app_testing/devices/) * [Restricted Networks](https://docs.datadoghq.com/synthetics/mobile_app_testing/mobile_app_tests/restricted_networks) * [Settings](https://docs.datadoghq.com/synthetics/mobile_app_testing/settings) * [Test Suites](https://docs.datadoghq.com/synthetics/test_suites/) * [Platform](https://docs.datadoghq.com/synthetics/platform/) * [Dashboards](https://docs.datadoghq.com/synthetics/platform/dashboards) * [Metrics](https://docs.datadoghq.com/synthetics/platform/metrics/) * [Test Coverage](https://docs.datadoghq.com/synthetics/platform/test_coverage) * [Private Locations](https://docs.datadoghq.com/synthetics/platform/private_locations) * [Connect APM](https://docs.datadoghq.com/synthetics/platform/apm/) * [Settings](https://docs.datadoghq.com/synthetics/platform/settings) * [Exploring Synthetics Data](https://docs.datadoghq.com/synthetics/explore/) * [Saved Views](https://docs.datadoghq.com/synthetics/explore/saved_views) * [Results Explorer](https://docs.datadoghq.com/synthetics/explore/results_explorer) * [Guides](https://docs.datadoghq.com/synthetics/guide/) * [Notifications](https://docs.datadoghq.com/synthetics/notifications/) * [Template Variables](https://docs.datadoghq.com/synthetics/notifications/template_variables) * [Conditional Alerting](https://docs.datadoghq.com/synthetics/notifications/conditional_alerting) * [Advanced Notifications](https://docs.datadoghq.com/synthetics/notifications/advanced_notifications) * [Integrate with 
Statuspage](https://docs.datadoghq.com/synthetics/notifications/statuspage) * [Troubleshooting](https://docs.datadoghq.com/synthetics/troubleshooting/) * [Data Security](https://docs.datadoghq.com/data_security/synthetics/) * [Continuous Testing](https://docs.datadoghq.com/continuous_testing/) * [Local and Staging Environments](https://docs.datadoghq.com/continuous_testing/environments) * [Testing Multiple Environments](https://docs.datadoghq.com/continuous_testing/environments/multiple_env) * [Testing With Proxy, Firewall, or VPN](https://docs.datadoghq.com/continuous_testing/environments/proxy_firewall_vpn) * [CI/CD Integrations](https://docs.datadoghq.com/continuous_testing/cicd_integrations) * [Configuration](https://docs.datadoghq.com/continuous_testing/cicd_integrations/configuration) * [Azure DevOps Extension](https://docs.datadoghq.com/continuous_testing/cicd_integrations/azure_devops_extension) * [CircleCI Orb](https://docs.datadoghq.com/continuous_testing/cicd_integrations/circleci_orb) * [GitHub Actions](https://docs.datadoghq.com/continuous_testing/cicd_integrations/github_actions) * [GitLab](https://docs.datadoghq.com/continuous_testing/cicd_integrations/gitlab) * [Jenkins](https://docs.datadoghq.com/continuous_testing/cicd_integrations/jenkins) * [Bitrise (Upload Application)](https://docs.datadoghq.com/continuous_testing/cicd_integrations/bitrise_upload) * [Bitrise (Run Tests)](https://docs.datadoghq.com/continuous_testing/cicd_integrations/bitrise_run) * [Settings](https://docs.datadoghq.com/continuous_testing/settings) * [Results Explorer](https://docs.datadoghq.com/continuous_testing/results_explorer/) * [Metrics](https://docs.datadoghq.com/continuous_testing/metrics/) * [Guides](https://docs.datadoghq.com/continuous_testing/guide/) * [Troubleshooting](https://docs.datadoghq.com/continuous_testing/troubleshooting/) * [Product Analytics](https://docs.datadoghq.com/product_analytics) * [Vizualizing with Charts](https://docs.datadoghq.com/product_analytics/charts) * [Chart Basics](https://docs.datadoghq.com/product_analytics/charts/chart_basics) * [Pathways Diagram](https://docs.datadoghq.com/product_analytics/charts/pathways) * [Funnel Analysis](https://docs.datadoghq.com/product_analytics/charts/funnel_analysis) * [Retention Analysis](https://docs.datadoghq.com/product_analytics/charts/retention_analysis) * [Analytics Explorer](https://docs.datadoghq.com/product_analytics/charts/analytics_explorer) * [Dashboards](https://docs.datadoghq.com/product_analytics/dashboards) * [Segments](https://docs.datadoghq.com/product_analytics/segmentation/) * [Managing Profiles](https://docs.datadoghq.com/product_analytics/profiles) * [Experiments](https://docs.datadoghq.com/product_analytics/experimentation/) * [Define Metrics](https://docs.datadoghq.com/product_analytics/experimentation/defining_metrics) * [Reading Experiment Results](https://docs.datadoghq.com/product_analytics/experimentation/reading_results) * [Minimum Detectable Effects](https://docs.datadoghq.com/product_analytics/experimentation/minimum_detectable_effect) * [Guides](https://docs.datadoghq.com/product_analytics/guide/) * [Troubleshooting](https://docs.datadoghq.com/product_analytics/troubleshooting/) * [Session Replay](https://docs.datadoghq.com/session_replay/) * [Browser](https://docs.datadoghq.com/session_replay/browser) * [Setup](https://docs.datadoghq.com/session_replay/browser/setup_and_configuration) * [Privacy Options](https://docs.datadoghq.com/session_replay/browser/privacy_options) * [Developer 
Tools](https://docs.datadoghq.com/session_replay/browser/dev_tools) * [Troubleshooting](https://docs.datadoghq.com/session_replay/browser/troubleshooting) * [Mobile](https://docs.datadoghq.com/session_replay/mobile) * [Setup and Configuration](https://docs.datadoghq.com/session_replay/mobile/setup_and_configuration) * [Privacy Options](https://docs.datadoghq.com/session_replay/mobile/privacy_options) * [Developer Tools](https://docs.datadoghq.com/session_replay/mobile/dev_tools) * [Impact on App Performance](https://docs.datadoghq.com/session_replay/mobile/app_performance) * [Troubleshooting](https://docs.datadoghq.com/session_replay/mobile/troubleshooting) * [Playlists](https://docs.datadoghq.com/session_replay/playlists) * [Heatmaps](https://docs.datadoghq.com/session_replay/heatmaps) * [Software Delivery ](https://docs.datadoghq.com/api/latest/aws-integration/) * [CI Visibility](https://docs.datadoghq.com/continuous_integration/) * [Pipeline Visibility](https://docs.datadoghq.com/continuous_integration/pipelines/) * [AWS CodePipeline](https://docs.datadoghq.com/continuous_integration/pipelines/awscodepipeline/) * [Azure Pipelines](https://docs.datadoghq.com/continuous_integration/pipelines/azure/) * [Buildkite](https://docs.datadoghq.com/continuous_integration/pipelines/buildkite/) * [CircleCI](https://docs.datadoghq.com/continuous_integration/pipelines/circleci/) * [Codefresh](https://docs.datadoghq.com/continuous_integration/pipelines/codefresh/) * [GitHub Actions](https://docs.datadoghq.com/continuous_integration/pipelines/github/) * [GitLab](https://docs.datadoghq.com/continuous_integration/pipelines/gitlab/) * [Jenkins](https://docs.datadoghq.com/continuous_integration/pipelines/jenkins/) * [TeamCity](https://docs.datadoghq.com/continuous_integration/pipelines/teamcity/) * [Other CI Providers](https://docs.datadoghq.com/continuous_integration/pipelines/custom/) * [Custom Commands](https://docs.datadoghq.com/continuous_integration/pipelines/custom_commands/) * [Custom Tags and Measures](https://docs.datadoghq.com/continuous_integration/pipelines/custom_tags_and_measures/) * [Search and Manage](https://docs.datadoghq.com/continuous_integration/search/) * [Explorer](https://docs.datadoghq.com/continuous_integration/explorer) * [Search Syntax](https://docs.datadoghq.com/continuous_integration/explorer/search_syntax/) * [Search Pipeline Executions](https://docs.datadoghq.com/continuous_integration/explorer/facets/) * [Export](https://docs.datadoghq.com/continuous_integration/explorer/export/) * [Saved Views](https://docs.datadoghq.com/continuous_integration/explorer/saved_views/) * [Monitors](https://docs.datadoghq.com/monitors/types/ci/?tab=pipelines) * [Guides](https://docs.datadoghq.com/continuous_integration/guides/) * [Troubleshooting](https://docs.datadoghq.com/continuous_integration/troubleshooting/) * [CD Visibility](https://docs.datadoghq.com/continuous_delivery/) * [Deployment Visibility](https://docs.datadoghq.com/continuous_delivery/deployments) * [Argo CD](https://docs.datadoghq.com/continuous_delivery/deployments/argocd) * [CI Providers](https://docs.datadoghq.com/continuous_delivery/deployments/ciproviders) * [Explore Deployments](https://docs.datadoghq.com/continuous_delivery/explorer) * [Search Syntax](https://docs.datadoghq.com/continuous_delivery/explorer/search_syntax) * [Facets](https://docs.datadoghq.com/continuous_delivery/explorer/facets) * [Saved Views](https://docs.datadoghq.com/continuous_delivery/explorer/saved_views) * 
[Features](https://docs.datadoghq.com/continuous_delivery/features) * [Code Changes Detection](https://docs.datadoghq.com/continuous_delivery/features/code_changes_detection) * [Rollback Detection](https://docs.datadoghq.com/continuous_delivery/features/rollbacks_detection) * [Monitors](https://docs.datadoghq.com/monitors/types/ci/?tab=deployments) * [Deployment Gates](https://docs.datadoghq.com/deployment_gates/) * [Setup](https://docs.datadoghq.com/deployment_gates/setup/) * [Explore](https://docs.datadoghq.com/deployment_gates/explore/) * [Test Optimization](https://docs.datadoghq.com/tests/) * [Setup](https://docs.datadoghq.com/tests/setup/) * [.NET](https://docs.datadoghq.com/tests/setup/dotnet/) * [Java and JVM Languages](https://docs.datadoghq.com/tests/setup/java/) * [JavaScript and TypeScript](https://docs.datadoghq.com/tests/setup/javascript/) * [Python](https://docs.datadoghq.com/tests/setup/python/) * [Ruby](https://docs.datadoghq.com/tests/setup/ruby/) * [Swift](https://docs.datadoghq.com/tests/setup/swift/) * [Go](https://docs.datadoghq.com/tests/setup/go/) * [JUnit Report Uploads](https://docs.datadoghq.com/tests/setup/junit_xml/) * [Network Settings](https://docs.datadoghq.com/tests/network/) * [Tests in Containers](https://docs.datadoghq.com/tests/containers/) * [Repositories](https://docs.datadoghq.com/tests/repositories) * [Explorer](https://docs.datadoghq.com/tests/explorer/) * [Search Syntax](https://docs.datadoghq.com/tests/explorer/search_syntax) * [Search Test Runs](https://docs.datadoghq.com/tests/explorer/facets/) * [Export](https://docs.datadoghq.com/tests/explorer/export/) * [Saved Views](https://docs.datadoghq.com/tests/explorer/saved_views/) * [Monitors](https://docs.datadoghq.com/monitors/types/ci/?tab=tests) * [Test Health](https://docs.datadoghq.com/tests/test_health) * [Flaky Test Management](https://docs.datadoghq.com/tests/flaky_management) * [Working with Flaky Tests](https://docs.datadoghq.com/tests/flaky_tests) * [Early Flake Detection](https://docs.datadoghq.com/tests/flaky_tests/early_flake_detection) * [Auto Test Retries](https://docs.datadoghq.com/tests/flaky_tests/auto_test_retries) * [Test Impact Analysis](https://docs.datadoghq.com/tests/test_impact_analysis) * [Setup](https://docs.datadoghq.com/tests/test_impact_analysis/setup/) * [How It Works](https://docs.datadoghq.com/tests/test_impact_analysis/how_it_works/) * [Troubleshooting](https://docs.datadoghq.com/tests/test_impact_analysis/troubleshooting/) * [Developer Workflows](https://docs.datadoghq.com/tests/developer_workflows) * [Code Coverage](https://docs.datadoghq.com/tests/code_coverage) * [Instrument Browser Tests with RUM](https://docs.datadoghq.com/tests/browser_tests) * [Instrument Swift Tests with RUM](https://docs.datadoghq.com/tests/swift_tests) * [Correlate Logs and Tests](https://docs.datadoghq.com/tests/correlate_logs_and_tests) * [Guides](https://docs.datadoghq.com/tests/guides/) * [Troubleshooting](https://docs.datadoghq.com/tests/troubleshooting/) * [Code Coverage](https://docs.datadoghq.com/code_coverage/) * [Setup](https://docs.datadoghq.com/code_coverage/setup/) * [Data Collected](https://docs.datadoghq.com/code_coverage/data_collected/) * [PR Gates](https://docs.datadoghq.com/pr_gates/) * [Setup](https://docs.datadoghq.com/pr_gates/setup) * [DORA Metrics](https://docs.datadoghq.com/dora_metrics/) * [Setup](https://docs.datadoghq.com/dora_metrics/setup) * [Deployment Data Sources](https://docs.datadoghq.com/dora_metrics/setup/deployments) * [Failure Data 
[Delete a failure event](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-failure-event) * [Delete a deployment event](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-deployment-event) * [Downtimes](https://docs.datadoghq.com/api/latest/downtimes/) * [Get all downtimes](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-downtimes) * [Schedule a downtime](https://docs.datadoghq.com/api/latest/aws-integration/#schedule-a-downtime) * [Cancel downtimes by scope](https://docs.datadoghq.com/api/latest/aws-integration/#cancel-downtimes-by-scope) * [Cancel a downtime](https://docs.datadoghq.com/api/latest/aws-integration/#cancel-a-downtime) * [Get a downtime](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-downtime) * [Update a downtime](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-downtime) * [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#get-active-downtimes-for-a-monitor) * [Embeddable Graphs](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Revoke embed](https://docs.datadoghq.com/api/latest/aws-integration/#revoke-embed) * [Enable embed](https://docs.datadoghq.com/api/latest/aws-integration/#enable-embed) * [Get specific embed](https://docs.datadoghq.com/api/latest/aws-integration/#get-specific-embed) * [Create embed](https://docs.datadoghq.com/api/latest/aws-integration/#create-embed) * [Get all embeds](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-embeds) * [Error Tracking](https://docs.datadoghq.com/api/latest/error-tracking/) * [Search error tracking issues](https://docs.datadoghq.com/api/latest/aws-integration/#search-error-tracking-issues) * [Get the details of an error tracking issue](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-details-of-an-error-tracking-issue) * [Update the state of an issue](https://docs.datadoghq.com/api/latest/aws-integration/#update-the-state-of-an-issue) * [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/aws-integration/#update-the-assignee-of-an-issue) * [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/aws-integration/#remove-the-assignee-of-an-issue) * [Events](https://docs.datadoghq.com/api/latest/events/) * [Get a list of events](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-events) * [Post an event](https://docs.datadoghq.com/api/latest/aws-integration/#post-an-event) * [Get an event](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-event) * [Search events](https://docs.datadoghq.com/api/latest/aws-integration/#search-events) * [Fastly Integration](https://docs.datadoghq.com/api/latest/fastly-integration/) * [List Fastly accounts](https://docs.datadoghq.com/api/latest/aws-integration/#list-fastly-accounts) * [Add Fastly account](https://docs.datadoghq.com/api/latest/aws-integration/#add-fastly-account) * [Get Fastly account](https://docs.datadoghq.com/api/latest/aws-integration/#get-fastly-account) * [Update Fastly account](https://docs.datadoghq.com/api/latest/aws-integration/#update-fastly-account) * [Delete Fastly account](https://docs.datadoghq.com/api/latest/aws-integration/#delete-fastly-account) * [List Fastly services](https://docs.datadoghq.com/api/latest/aws-integration/#list-fastly-services) * [Add Fastly service](https://docs.datadoghq.com/api/latest/aws-integration/#add-fastly-service) * [Get Fastly service](https://docs.datadoghq.com/api/latest/aws-integration/#get-fastly-service) * [Update Fastly 
service](https://docs.datadoghq.com/api/latest/aws-integration/#update-fastly-service) * [Delete Fastly service](https://docs.datadoghq.com/api/latest/aws-integration/#delete-fastly-service) * [Fleet Automation](https://docs.datadoghq.com/api/latest/fleet-automation/) * [List all available Agent versions](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-available-agent-versions) * [List all Datadog Agents](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-datadog-agents) * [Get detailed information about an agent](https://docs.datadoghq.com/api/latest/aws-integration/#get-detailed-information-about-an-agent) * [List all deployments](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-deployments) * [Create a configuration deployment](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-configuration-deployment) * [Upgrade hosts](https://docs.datadoghq.com/api/latest/aws-integration/#upgrade-hosts) * [Get a configuration deployment by ID](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-configuration-deployment-by-id) * [Cancel a deployment](https://docs.datadoghq.com/api/latest/aws-integration/#cancel-a-deployment) * [List all schedules](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-schedules) * [Create a schedule](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-schedule) * [Get a schedule by ID](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-schedule-by-id) * [Update a schedule](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-schedule) * [Delete a schedule](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-schedule) * [Trigger a schedule deployment](https://docs.datadoghq.com/api/latest/aws-integration/#trigger-a-schedule-deployment) * [GCP Integration](https://docs.datadoghq.com/api/latest/gcp-integration/) * [List all GCP integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-gcp-integrations) * [List all GCP STS-enabled service accounts](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-gcp-sts-enabled-service-accounts) * [Create a GCP integration](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-gcp-integration) * [Create a new entry for your service account](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-entry-for-your-service-account) * [Delete a GCP integration](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-gcp-integration) * [Delete an STS enabled GCP Account](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-sts-enabled-gcp-account) * [Update a GCP integration](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-gcp-integration) * [Update STS Service Account](https://docs.datadoghq.com/api/latest/aws-integration/#update-sts-service-account) * [Create a Datadog GCP principal](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-datadog-gcp-principal) * [List delegate account](https://docs.datadoghq.com/api/latest/aws-integration/#list-delegate-account) * [Hosts](https://docs.datadoghq.com/api/latest/hosts/) * [Get all hosts for your organization](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-hosts-for-your-organization) * [Get the total number of active hosts](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-total-number-of-active-hosts) * [Mute a host](https://docs.datadoghq.com/api/latest/aws-integration/#mute-a-host) * [Unmute a 
host](https://docs.datadoghq.com/api/latest/aws-integration/#unmute-a-host) * [Incident Services](https://docs.datadoghq.com/api/latest/incident-services/) * [Get details of an incident service](https://docs.datadoghq.com/api/latest/aws-integration/#get-details-of-an-incident-service) * [Delete an existing incident service](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-existing-incident-service) * [Update an existing incident service](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-existing-incident-service) * [Get a list of all incident services](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-all-incident-services) * [Create a new incident service](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-incident-service) * [Incident Teams](https://docs.datadoghq.com/api/latest/incident-teams/) * [Get details of an incident team](https://docs.datadoghq.com/api/latest/aws-integration/#get-details-of-an-incident-team) * [Delete an existing incident team](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-existing-incident-team) * [Update an existing incident team](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-existing-incident-team) * [Get a list of all incident teams](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-all-incident-teams) * [Create a new incident team](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-incident-team) * [Incidents](https://docs.datadoghq.com/api/latest/incidents/) * [Create an incident](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident) * [Get the details of an incident](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-details-of-an-incident) * [Update an existing incident](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-existing-incident) * [Delete an existing incident](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-existing-incident) * [Get a list of incidents](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-incidents) * [Search for incidents](https://docs.datadoghq.com/api/latest/aws-integration/#search-for-incidents) * [List an incident's impacts](https://docs.datadoghq.com/api/latest/aws-integration/#list-an-incidents-impacts) * [Create an incident impact](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident-impact) * [Delete an incident impact](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-incident-impact) * [Get a list of an incident's integration metadata](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-an-incidents-integration-metadata) * [Create an incident integration metadata](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident-integration-metadata) * [Get incident integration metadata details](https://docs.datadoghq.com/api/latest/aws-integration/#get-incident-integration-metadata-details) * [Update an existing incident integration metadata](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-existing-incident-integration-metadata) * [Delete an incident integration metadata](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-incident-integration-metadata) * [Get a list of an incident's todos](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-an-incidents-todos) * [Create an incident todo](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident-todo) * [Get incident todo 
details](https://docs.datadoghq.com/api/latest/aws-integration/#get-incident-todo-details) * [Update an incident todo](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-incident-todo) * [Delete an incident todo](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-incident-todo) * [Create an incident type](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident-type) * [Get a list of incident types](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-incident-types) * [Get incident type details](https://docs.datadoghq.com/api/latest/aws-integration/#get-incident-type-details) * [Update an incident type](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-incident-type) * [Delete an incident type](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-incident-type) * [List incident notification templates](https://docs.datadoghq.com/api/latest/aws-integration/#list-incident-notification-templates) * [Create incident notification template](https://docs.datadoghq.com/api/latest/aws-integration/#create-incident-notification-template) * [Get incident notification template](https://docs.datadoghq.com/api/latest/aws-integration/#get-incident-notification-template) * [Update incident notification template](https://docs.datadoghq.com/api/latest/aws-integration/#update-incident-notification-template) * [Delete a notification template](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-notification-template) * [List incident notification rules](https://docs.datadoghq.com/api/latest/aws-integration/#list-incident-notification-rules) * [Create an incident notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-incident-notification-rule) * [Get an incident notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-incident-notification-rule) * [Update an incident notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-incident-notification-rule) * [Delete an incident notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-incident-notification-rule) * [List incident attachments](https://docs.datadoghq.com/api/latest/aws-integration/#list-incident-attachments) * [Create incident attachment](https://docs.datadoghq.com/api/latest/aws-integration/#create-incident-attachment) * [Delete incident attachment](https://docs.datadoghq.com/api/latest/aws-integration/#delete-incident-attachment) * [Update incident attachment](https://docs.datadoghq.com/api/latest/aws-integration/#update-incident-attachment) * [IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Get IP Allowlist](https://docs.datadoghq.com/api/latest/aws-integration/#get-ip-allowlist) * [Update IP Allowlist](https://docs.datadoghq.com/api/latest/aws-integration/#update-ip-allowlist) * [IP Ranges](https://docs.datadoghq.com/api/latest/ip-ranges/) * [List IP Ranges](https://docs.datadoghq.com/api/latest/aws-integration/#list-ip-ranges) * [Key Management](https://docs.datadoghq.com/api/latest/key-management/) * [Delete an application key owned by current user](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-application-key-owned-by-current-user) * [Get all API keys](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-api-keys) * [Create an API key](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-api-key) * [Edit an application key owned by current 
user](https://docs.datadoghq.com/api/latest/aws-integration/#edit-an-application-key-owned-by-current-user) * [Get API key](https://docs.datadoghq.com/api/latest/aws-integration/#get-api-key) * [Get one application key owned by current user](https://docs.datadoghq.com/api/latest/aws-integration/#get-one-application-key-owned-by-current-user) * [Create an application key for current user](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-application-key-for-current-user) * [Edit an API key](https://docs.datadoghq.com/api/latest/aws-integration/#edit-an-api-key) * [Delete an API key](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-api-key) * [Get all application keys owned by current user](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-application-keys-owned-by-current-user) * [Get all application keys](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-application-keys) * [Create an application key](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-application-key) * [Get an application key](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-application-key) * [Edit an application key](https://docs.datadoghq.com/api/latest/aws-integration/#edit-an-application-key) * [Delete an application key](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-application-key) * [Logs](https://docs.datadoghq.com/api/latest/logs/) * [Send logs](https://docs.datadoghq.com/api/latest/aws-integration/#send-logs) * [Aggregate events](https://docs.datadoghq.com/api/latest/aws-integration/#aggregate-events) * [Search logs](https://docs.datadoghq.com/api/latest/aws-integration/#search-logs) * [Search logs (POST)](https://docs.datadoghq.com/api/latest/aws-integration/#search-logs-post) * [Search logs (GET)](https://docs.datadoghq.com/api/latest/aws-integration/#search-logs-get) * [Logs Archives](https://docs.datadoghq.com/api/latest/logs-archives/) * [Get all archives](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-archives) * [Create an archive](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-archive) * [Get an archive](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-archive) * [Update an archive](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-archive) * [Delete an archive](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-archive) * [List read roles for an archive](https://docs.datadoghq.com/api/latest/aws-integration/#list-read-roles-for-an-archive) * [Grant role to an archive](https://docs.datadoghq.com/api/latest/aws-integration/#grant-role-to-an-archive) * [Revoke role from an archive](https://docs.datadoghq.com/api/latest/aws-integration/#revoke-role-from-an-archive) * [Get archive order](https://docs.datadoghq.com/api/latest/aws-integration/#get-archive-order) * [Update archive order](https://docs.datadoghq.com/api/latest/aws-integration/#update-archive-order) * [Logs Custom Destinations](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Get all custom destinations](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-custom-destinations) * [Create a custom destination](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-custom-destination) * [Get a custom destination](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-custom-destination) * [Update a custom destination](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-custom-destination) * [Delete a custom 
destination](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-custom-destination) * [Logs Indexes](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Get all indexes](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-indexes) * [Get an index](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-index) * [Create an index](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-index) * [Update an index](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-index) * [Delete an index](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-index) * [Get indexes order](https://docs.datadoghq.com/api/latest/aws-integration/#get-indexes-order) * [Update indexes order](https://docs.datadoghq.com/api/latest/aws-integration/#update-indexes-order) * [Logs Metrics](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Get all log-based metrics](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-log-based-metrics) * [Create a log-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-log-based-metric) * [Get a log-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-log-based-metric) * [Update a log-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-log-based-metric) * [Delete a log-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-log-based-metric) * [Logs Pipelines](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Get pipeline order](https://docs.datadoghq.com/api/latest/aws-integration/#get-pipeline-order) * [Update pipeline order](https://docs.datadoghq.com/api/latest/aws-integration/#update-pipeline-order) * [Get all pipelines](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-pipelines) * [Create a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-pipeline) * [Get a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-pipeline) * [Delete a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-pipeline) * [Update a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-pipeline) * [Logs Restriction Queries](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [List restriction queries](https://docs.datadoghq.com/api/latest/aws-integration/#list-restriction-queries) * [Create a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-restriction-query) * [Get a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-restriction-query) * [Replace a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#replace-a-restriction-query) * [Update a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-restriction-query) * [Delete a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-restriction-query) * [List roles for a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#list-roles-for-a-restriction-query) * [Grant role to a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#grant-role-to-a-restriction-query) * [Revoke role from a restriction query](https://docs.datadoghq.com/api/latest/aws-integration/#revoke-role-from-a-restriction-query) * [Get all restriction queries for a given user](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-restriction-queries-for-a-given-user) * [Get restriction query for a given 
role](https://docs.datadoghq.com/api/latest/aws-integration/#get-restriction-query-for-a-given-role) * [Metrics](https://docs.datadoghq.com/api/latest/metrics/) * [Create a tag configuration](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-tag-configuration) * [Get active metrics list](https://docs.datadoghq.com/api/latest/aws-integration/#get-active-metrics-list) * [Query timeseries data across multiple products](https://docs.datadoghq.com/api/latest/aws-integration/#query-timeseries-data-across-multiple-products) * [Submit distribution points](https://docs.datadoghq.com/api/latest/aws-integration/#submit-distribution-points) * [Submit metrics](https://docs.datadoghq.com/api/latest/aws-integration/#submit-metrics) * [Get metric metadata](https://docs.datadoghq.com/api/latest/aws-integration/#get-metric-metadata) * [List tag configuration by name](https://docs.datadoghq.com/api/latest/aws-integration/#list-tag-configuration-by-name) * [Query scalar data across multiple products](https://docs.datadoghq.com/api/latest/aws-integration/#query-scalar-data-across-multiple-products) * [Edit metric metadata](https://docs.datadoghq.com/api/latest/aws-integration/#edit-metric-metadata) * [Update a tag configuration](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-tag-configuration) * [Delete a tag configuration](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-tag-configuration) * [Search metrics](https://docs.datadoghq.com/api/latest/aws-integration/#search-metrics) * [Get a list of metrics](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-metrics) * [Query timeseries points](https://docs.datadoghq.com/api/latest/aws-integration/#query-timeseries-points) * [List tags by metric name](https://docs.datadoghq.com/api/latest/aws-integration/#list-tags-by-metric-name) * [List active tags and aggregations](https://docs.datadoghq.com/api/latest/aws-integration/#list-active-tags-and-aggregations) * [List distinct metric volumes by metric name](https://docs.datadoghq.com/api/latest/aws-integration/#list-distinct-metric-volumes-by-metric-name) * [Configure tags for multiple metrics](https://docs.datadoghq.com/api/latest/aws-integration/#configure-tags-for-multiple-metrics) * [Delete tags for multiple metrics](https://docs.datadoghq.com/api/latest/aws-integration/#delete-tags-for-multiple-metrics) * [Tag Configuration Cardinality Estimator](https://docs.datadoghq.com/api/latest/aws-integration/#tag-configuration-cardinality-estimator) * [Related Assets to a Metric](https://docs.datadoghq.com/api/latest/aws-integration/#related-assets-to-a-metric) * [Get tag key cardinality details](https://docs.datadoghq.com/api/latest/aws-integration/#get-tag-key-cardinality-details) * [Microsoft Teams Integration](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Create tenant-based handle](https://docs.datadoghq.com/api/latest/aws-integration/#create-tenant-based-handle) * [Create Workflows webhook handle](https://docs.datadoghq.com/api/latest/aws-integration/#create-workflows-webhook-handle) * [Delete tenant-based handle](https://docs.datadoghq.com/api/latest/aws-integration/#delete-tenant-based-handle) * [Delete Workflows webhook handle](https://docs.datadoghq.com/api/latest/aws-integration/#delete-workflows-webhook-handle) * [Get all tenant-based handles](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-tenant-based-handles) * [Get all Workflows webhook 
handles](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-workflows-webhook-handles) * [Get channel information by name](https://docs.datadoghq.com/api/latest/aws-integration/#get-channel-information-by-name) * [Get tenant-based handle information](https://docs.datadoghq.com/api/latest/aws-integration/#get-tenant-based-handle-information) * [Get Workflows webhook handle information](https://docs.datadoghq.com/api/latest/aws-integration/#get-workflows-webhook-handle-information) * [Update tenant-based handle](https://docs.datadoghq.com/api/latest/aws-integration/#update-tenant-based-handle) * [Update Workflows webhook handle](https://docs.datadoghq.com/api/latest/aws-integration/#update-workflows-webhook-handle) * [Monitors](https://docs.datadoghq.com/api/latest/monitors/) * [Create a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-monitor) * [Monitors search](https://docs.datadoghq.com/api/latest/aws-integration/#monitors-search) * [Unmute a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#unmute-a-monitor) * [Get all monitors](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-monitors) * [Monitors group search](https://docs.datadoghq.com/api/latest/aws-integration/#monitors-group-search) * [Mute a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#mute-a-monitor) * [Edit a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#edit-a-monitor) * [Unmute all monitors](https://docs.datadoghq.com/api/latest/aws-integration/#unmute-all-monitors) * [Get a monitor's details](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-monitors-details) * [Mute all monitors](https://docs.datadoghq.com/api/latest/aws-integration/#mute-all-monitors) * [Delete a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-monitor) * [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/aws-integration/#check-if-a-monitor-can-be-deleted) * [Validate a monitor](https://docs.datadoghq.com/api/latest/aws-integration/#validate-a-monitor) * [Validate an existing monitor](https://docs.datadoghq.com/api/latest/aws-integration/#validate-an-existing-monitor) * [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-monitor-configuration-policy) * [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-monitor-configuration-policies) * [Create a monitor configuration policy](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-monitor-configuration-policy) * [Edit a monitor configuration policy](https://docs.datadoghq.com/api/latest/aws-integration/#edit-a-monitor-configuration-policy) * [Delete a monitor configuration policy](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-monitor-configuration-policy) * [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-monitor-notification-rule) * [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-monitor-notification-rules) * [Create a monitor notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-monitor-notification-rule) * [Update a monitor notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-monitor-notification-rule) * [Delete a monitor notification rule](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-monitor-notification-rule) * [Get a monitor user 
template](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-monitor-user-template) * [Get all monitor user templates](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-monitor-user-templates) * [Create a monitor user template](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-monitor-user-template) * [Update a monitor user template to a new version](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-monitor-user-template-to-a-new-version) * [Delete a monitor user template](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-monitor-user-template) * [Validate a monitor user template](https://docs.datadoghq.com/api/latest/aws-integration/#validate-a-monitor-user-template) * [Validate an existing monitor user template](https://docs.datadoghq.com/api/latest/aws-integration/#validate-an-existing-monitor-user-template) * [Network Device Monitoring](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Get the list of devices](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-list-of-devices) * [Get the device details](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-device-details) * [Get the list of interfaces of the device](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-list-of-interfaces-of-the-device) * [Get the list of tags for a device](https://docs.datadoghq.com/api/latest/aws-integration/#get-the-list-of-tags-for-a-device) * [Update the tags for a device](https://docs.datadoghq.com/api/latest/aws-integration/#update-the-tags-for-a-device) * [Notebooks](https://docs.datadoghq.com/api/latest/notebooks/) * [Create a notebook](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-notebook) * [Get all notebooks](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-notebooks) * [Delete a notebook](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-notebook) * [Update a notebook](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-notebook) * [Get a notebook](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-notebook) * [Observability Pipelines](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [List pipelines](https://docs.datadoghq.com/api/latest/aws-integration/#list-pipelines) * [Create a new pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-pipeline) * [Get a specific pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-specific-pipeline) * [Update a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-pipeline) * [Delete a pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-pipeline) * [Validate an observability pipeline](https://docs.datadoghq.com/api/latest/aws-integration/#validate-an-observability-pipeline) * [Okta Integration](https://docs.datadoghq.com/api/latest/okta-integration/) * [List Okta accounts](https://docs.datadoghq.com/api/latest/aws-integration/#list-okta-accounts) * [Add Okta account](https://docs.datadoghq.com/api/latest/aws-integration/#add-okta-account) * [Get Okta account](https://docs.datadoghq.com/api/latest/aws-integration/#get-okta-account) * [Update Okta account](https://docs.datadoghq.com/api/latest/aws-integration/#update-okta-account) * [Delete Okta account](https://docs.datadoghq.com/api/latest/aws-integration/#delete-okta-account) * [On-Call](https://docs.datadoghq.com/api/latest/on-call/) * [Create On-Call 
schedule](https://docs.datadoghq.com/api/latest/aws-integration/#create-on-call-schedule) * [Get On-Call schedule](https://docs.datadoghq.com/api/latest/aws-integration/#get-on-call-schedule) * [Delete On-Call schedule](https://docs.datadoghq.com/api/latest/aws-integration/#delete-on-call-schedule) * [Update On-Call schedule](https://docs.datadoghq.com/api/latest/aws-integration/#update-on-call-schedule) * [Create On-Call escalation policy](https://docs.datadoghq.com/api/latest/aws-integration/#create-on-call-escalation-policy) * [Update On-Call escalation policy](https://docs.datadoghq.com/api/latest/aws-integration/#update-on-call-escalation-policy) * [Get On-Call escalation policy](https://docs.datadoghq.com/api/latest/aws-integration/#get-on-call-escalation-policy) * [Delete On-Call escalation policy](https://docs.datadoghq.com/api/latest/aws-integration/#delete-on-call-escalation-policy) * [Get On-Call team routing rules](https://docs.datadoghq.com/api/latest/aws-integration/#get-on-call-team-routing-rules) * [Set On-Call team routing rules](https://docs.datadoghq.com/api/latest/aws-integration/#set-on-call-team-routing-rules) * [Get scheduled on-call user](https://docs.datadoghq.com/api/latest/aws-integration/#get-scheduled-on-call-user) * [Get team on-call users](https://docs.datadoghq.com/api/latest/aws-integration/#get-team-on-call-users) * [Delete an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-on-call-notification-channel-for-a-user) * [Get an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-on-call-notification-channel-for-a-user) * [List On-Call notification channels for a user](https://docs.datadoghq.com/api/latest/aws-integration/#list-on-call-notification-channels-for-a-user) * [Create an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-on-call-notification-channel-for-a-user) * [Create an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-on-call-notification-rule-for-a-user) * [List On-Call notification rules for a user](https://docs.datadoghq.com/api/latest/aws-integration/#list-on-call-notification-rules-for-a-user) * [Delete an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-on-call-notification-rule-for-a-user) * [Get an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-on-call-notification-rule-for-a-user) * [Update an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-on-call-notification-rule-for-a-user) * [On-Call Paging](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Create On-Call Page](https://docs.datadoghq.com/api/latest/aws-integration/#create-on-call-page) * [Acknowledge On-Call Page](https://docs.datadoghq.com/api/latest/aws-integration/#acknowledge-on-call-page) * [Escalate On-Call Page](https://docs.datadoghq.com/api/latest/aws-integration/#escalate-on-call-page) * [Resolve On-Call Page](https://docs.datadoghq.com/api/latest/aws-integration/#resolve-on-call-page) * [Opsgenie Integration](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Get all service objects](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-service-objects) * [Create a new service object](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-service-object) * 
[Get a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-single-service-object) * [Update a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-single-service-object) * [Delete a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-single-service-object) * [Org Connections](https://docs.datadoghq.com/api/latest/org-connections/) * [List Org Connections](https://docs.datadoghq.com/api/latest/aws-integration/#list-org-connections) * [Create Org Connection](https://docs.datadoghq.com/api/latest/aws-integration/#create-org-connection) * [Update Org Connection](https://docs.datadoghq.com/api/latest/aws-integration/#update-org-connection) * [Delete Org Connection](https://docs.datadoghq.com/api/latest/aws-integration/#delete-org-connection) * [Organizations](https://docs.datadoghq.com/api/latest/organizations/) * [Create a child organization](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-child-organization) * [List your managed organizations](https://docs.datadoghq.com/api/latest/aws-integration/#list-your-managed-organizations) * [Get organization information](https://docs.datadoghq.com/api/latest/aws-integration/#get-organization-information) * [Update your organization](https://docs.datadoghq.com/api/latest/aws-integration/#update-your-organization) * [Upload IdP metadata](https://docs.datadoghq.com/api/latest/aws-integration/#upload-idp-metadata) * [Spin-off Child Organization](https://docs.datadoghq.com/api/latest/aws-integration/#spin-off-child-organization) * [List Org Configs](https://docs.datadoghq.com/api/latest/aws-integration/#list-org-configs) * [Get a specific Org Config value](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-specific-org-config-value) * [Update a specific Org Config](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-specific-org-config) * [PagerDuty Integration](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Create a new service object](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-service-object) * [Get a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-single-service-object) * [Update a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-single-service-object) * [Delete a single service object](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-single-service-object) * [Powerpack](https://docs.datadoghq.com/api/latest/powerpack/) * [Get all powerpacks](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-powerpacks) * [Create a new powerpack](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-powerpack) * [Delete a powerpack](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-powerpack) * [Get a Powerpack](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-powerpack) * [Update a powerpack](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-powerpack) * [Processes](https://docs.datadoghq.com/api/latest/processes/) * [Get all processes](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-processes) * [Product Analytics](https://docs.datadoghq.com/api/latest/product-analytics/) * [Send server-side events](https://docs.datadoghq.com/api/latest/aws-integration/#send-server-side-events) * [Reference Tables](https://docs.datadoghq.com/api/latest/reference-tables/) * [Create reference table 
upload](https://docs.datadoghq.com/api/latest/aws-integration/#create-reference-table-upload) * [Create reference table](https://docs.datadoghq.com/api/latest/aws-integration/#create-reference-table) * [List tables](https://docs.datadoghq.com/api/latest/aws-integration/#list-tables) * [Get table](https://docs.datadoghq.com/api/latest/aws-integration/#get-table) * [Get rows by id](https://docs.datadoghq.com/api/latest/aws-integration/#get-rows-by-id) * [Update reference table](https://docs.datadoghq.com/api/latest/aws-integration/#update-reference-table) * [Delete table](https://docs.datadoghq.com/api/latest/aws-integration/#delete-table) * [Upsert rows](https://docs.datadoghq.com/api/latest/aws-integration/#upsert-rows) * [Delete rows](https://docs.datadoghq.com/api/latest/aws-integration/#delete-rows) * [Restriction Policies](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Update a restriction policy](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-restriction-policy) * [Get a restriction policy](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-restriction-policy) * [Delete a restriction policy](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-restriction-policy) * [Roles](https://docs.datadoghq.com/api/latest/roles/) * [List permissions](https://docs.datadoghq.com/api/latest/aws-integration/#list-permissions) * [List roles](https://docs.datadoghq.com/api/latest/aws-integration/#list-roles) * [Create role](https://docs.datadoghq.com/api/latest/aws-integration/#create-role) * [Get a role](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-role) * [Update a role](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-role) * [Delete role](https://docs.datadoghq.com/api/latest/aws-integration/#delete-role) * [List permissions for a role](https://docs.datadoghq.com/api/latest/aws-integration/#list-permissions-for-a-role) * [Grant permission to a role](https://docs.datadoghq.com/api/latest/aws-integration/#grant-permission-to-a-role) * [Revoke permission](https://docs.datadoghq.com/api/latest/aws-integration/#revoke-permission) * [Get all users of a role](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-users-of-a-role) * [Add a user to a role](https://docs.datadoghq.com/api/latest/aws-integration/#add-a-user-to-a-role) * [Remove a user from a role](https://docs.datadoghq.com/api/latest/aws-integration/#remove-a-user-from-a-role) * [Create a new role by cloning an existing role](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-role-by-cloning-an-existing-role) * [List role templates](https://docs.datadoghq.com/api/latest/aws-integration/#list-role-templates) * [RUM](https://docs.datadoghq.com/api/latest/rum/) * [Search RUM events](https://docs.datadoghq.com/api/latest/aws-integration/#search-rum-events) * [Get a list of RUM events](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-list-of-rum-events) * [Aggregate RUM events](https://docs.datadoghq.com/api/latest/aws-integration/#aggregate-rum-events) * [Update a RUM application](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-rum-application) * [Get a RUM application](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-rum-application) * [Delete a RUM application](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-rum-application) * [Create a new RUM application](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-new-rum-application) * [List all the RUM 
applications](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-the-rum-applications) * [Rum Audience Management](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Query accounts](https://docs.datadoghq.com/api/latest/aws-integration/#query-accounts) * [Create connection](https://docs.datadoghq.com/api/latest/aws-integration/#create-connection) * [Update connection](https://docs.datadoghq.com/api/latest/aws-integration/#update-connection) * [Query event filtered users](https://docs.datadoghq.com/api/latest/aws-integration/#query-event-filtered-users) * [Get account facet info](https://docs.datadoghq.com/api/latest/aws-integration/#get-account-facet-info) * [Delete connection](https://docs.datadoghq.com/api/latest/aws-integration/#delete-connection) * [List connections](https://docs.datadoghq.com/api/latest/aws-integration/#list-connections) * [Query users](https://docs.datadoghq.com/api/latest/aws-integration/#query-users) * [Get user facet info](https://docs.datadoghq.com/api/latest/aws-integration/#get-user-facet-info) * [Get mapping](https://docs.datadoghq.com/api/latest/aws-integration/#get-mapping) * [Rum Metrics](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Get all rum-based metrics](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-rum-based-metrics) * [Create a rum-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-rum-based-metric) * [Get a rum-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-rum-based-metric) * [Update a rum-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-rum-based-metric) * [Delete a rum-based metric](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-rum-based-metric) * [Rum Retention Filters](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Get all RUM retention filters](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-rum-retention-filters) * [Get a RUM retention filter](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-rum-retention-filter) * [Create a RUM retention filter](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-rum-retention-filter) * [Update a RUM retention filter](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-rum-retention-filter) * [Delete a RUM retention filter](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-rum-retention-filter) * [Order RUM retention filters](https://docs.datadoghq.com/api/latest/aws-integration/#order-rum-retention-filters) * [SCIM](https://docs.datadoghq.com/api/latest/scim/) * [List users](https://docs.datadoghq.com/api/latest/aws-integration/#list-users) * [Create user](https://docs.datadoghq.com/api/latest/aws-integration/#create-user) * [Get user](https://docs.datadoghq.com/api/latest/aws-integration/#get-user) * [Update user](https://docs.datadoghq.com/api/latest/aws-integration/#update-user) * [Patch user](https://docs.datadoghq.com/api/latest/aws-integration/#patch-user) * [Delete user](https://docs.datadoghq.com/api/latest/aws-integration/#delete-user) * [List groups](https://docs.datadoghq.com/api/latest/aws-integration/#list-groups) * [Create group](https://docs.datadoghq.com/api/latest/aws-integration/#create-group) * [Get group](https://docs.datadoghq.com/api/latest/aws-integration/#get-group) * [Update group](https://docs.datadoghq.com/api/latest/aws-integration/#update-group) * [Patch group](https://docs.datadoghq.com/api/latest/aws-integration/#patch-group) * [Delete 
permissions](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-user-permissions) * [Send invitation emails](https://docs.datadoghq.com/api/latest/aws-integration/#send-invitation-emails) * [Get a user invitation](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-user-invitation) * [Webhooks Integration](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Create a webhooks integration](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-webhooks-integration) * [Get a webhook integration](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-webhook-integration) * [Update a webhook](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-webhook) * [Delete a webhook](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-webhook) * [Create a custom variable](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-custom-variable) * [Get a custom variable](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-custom-variable) * [Update a custom variable](https://docs.datadoghq.com/api/latest/aws-integration/#update-a-custom-variable) * [Delete a custom variable](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-custom-variable) * [Workflow Automation](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Get an existing Workflow](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-existing-workflow) * [Create a Workflow](https://docs.datadoghq.com/api/latest/aws-integration/#create-a-workflow) * [Update an existing Workflow](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-existing-workflow) * [Delete an existing Workflow](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-existing-workflow) * [List workflow instances](https://docs.datadoghq.com/api/latest/aws-integration/#list-workflow-instances) * [Execute a workflow](https://docs.datadoghq.com/api/latest/aws-integration/#execute-a-workflow) * [Get a workflow instance](https://docs.datadoghq.com/api/latest/aws-integration/#get-a-workflow-instance) * [Cancel a workflow instance](https://docs.datadoghq.com/api/latest/aws-integration/#cancel-a-workflow-instance) [Docs](https://docs.datadoghq.com/) > [API Reference](https://docs.datadoghq.com/api/latest/) > [AWS Integration](https://docs.datadoghq.com/api/latest/aws-integration/) Language English [English](https://docs.datadoghq.com/api/latest/aws-integration/?lang_pref=en) [Français](https://docs.datadoghq.com/fr/api/latest/aws-integration/?lang_pref=fr) [日本語](https://docs.datadoghq.com/ja/api/latest/aws-integration/?lang_pref=ja) [한국어](https://docs.datadoghq.com/ko/api/latest/aws-integration/?lang_pref=ko) [Español](https://docs.datadoghq.com/es/api/latest/aws-integration/?lang_pref=es) Datadog Site [![Site help](https://datadog-docs.imgix.net/images/icons/help-druids.svg)](https://docs.datadoghq.com/getting_started/site/) US1 US1 US3 US5 EU AP1 AP2 US1-FED # AWS Integration Configure your Datadog-AWS integration directly through the Datadog API. For more information, see the [AWS integration page](https://docs.datadoghq.com/integrations/amazon_web_services). 
## [Get all AWS tag filters](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-aws-tag-filters)

* [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-aws-tag-filters-v1)

GET https://api.ap1.datadoghq.com/api/v1/integration/aws/filtering
https://api.ap2.datadoghq.com/api/v1/integration/aws/filtering
https://api.datadoghq.eu/api/v1/integration/aws/filtering
https://api.ddog-gov.com/api/v1/integration/aws/filtering
https://api.datadoghq.com/api/v1/integration/aws/filtering
https://api.us3.datadoghq.com/api/v1/integration/aws/filtering
https://api.us5.datadoghq.com/api/v1/integration/aws/filtering

### Overview

Get all AWS tag filters. This endpoint requires the `aws_configuration_read` permission.

### Arguments

#### Query Strings

| Name | Type | Description |
| --- | --- | --- |
| account_id [_required_] | string | Only return AWS filters that match this `account_id`. |

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSTagFilters-200-v1)
* [400](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSTagFilters-400-v1)
* [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSTagFilters-403-v1)
* [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSTagFilters-429-v1)

OK

An array of tag filter rules by `namespace` and tag filter string.

| Field | Type | Description |
| --- | --- | --- |
| [object] | | An array of tag filters. |
| namespace | enum | The namespace associated with the tag filter entry. Allowed enum values: `elb,application_elb,sqs,rds,custom,network_elb,lambda,step_functions` |
| tag_filter_str | string | The tag filter string. |

```
{
  "filters": [
    {
      "namespace": "string",
      "tag_filter_str": "prod*"
    }
  ]
}
```

Bad Request

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

Authentication Error

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Get all AWS tag filters

```
# Required query arguments
export account_id="CHANGE_ME"
# Curl command (uses the US1 site; substitute your Datadog site's API host if needed)
curl -X GET "https://api.datadoghq.com/api/v1/integration/aws/filtering?account_id=${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get all AWS tag filters

```
"""
Get all AWS tag filters returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.list_aws_tag_filters(
        account_id="account_id",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get all AWS tag filters

```
# Get all AWS tag filters returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new
p api_instance.list_aws_tag_filters("account_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all AWS tag filters

```
// Get all AWS tag filters returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV1.NewAWSIntegrationApi(apiClient)
    resp, r, err := api.ListAWSTagFilters(ctx, "account_id")

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSTagFilters`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSTagFilters`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get all AWS tag filters

```
// Get all AWS tag filters returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.AwsIntegrationApi;
import com.datadog.api.client.v1.model.AWSTagFilterListResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient);

    try {
      AWSTagFilterListResponse result = apiInstance.listAWSTagFilters("account_id");
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling AwsIntegrationApi#listAWSTagFilters");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get all AWS tag filters

```
// Get all AWS tag filters returns "OK" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = AWSIntegrationAPI::with_config(configuration);
    let resp = api.list_aws_tag_filters("account_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get all AWS tag filters

```
/**
 * Get all AWS tag filters returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.AWSIntegrationApi(configuration);

const params: v1.AWSIntegrationApiListAWSTagFiltersRequest = {
  accountId: "account_id",
};

apiInstance
  .listAWSTagFilters(params)
  .then((data: v1.AWSTagFilterListResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
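If you need to work with the returned filters rather than print the whole response, the 200 payload above is a `filters` array of `namespace`/`tag_filter_str` pairs. A minimal sketch in Python, assuming the same client setup as the example above and that the response object exposes the fields shown in the JSON model (`filters`, `namespace`, `tag_filter_str`):

```
# Sketch: read individual tag filter entries from the ListAWSTagFilters response.
# Assumes the response mirrors the JSON model above (a top-level `filters` array).
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.list_aws_tag_filters(account_id="123456789012")

    for tag_filter in response.filters:
        # Each entry carries the namespace it applies to and the filter pattern,
        # for example ("elb", "prod*").
        print(tag_filter.namespace, tag_filter.tag_filter_str)
```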
* * *

## [List available namespaces](https://docs.datadoghq.com/api/latest/aws-integration/#list-available-namespaces)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#list-available-namespaces-v2)

GET https://api.ap1.datadoghq.com/api/v2/integration/aws/available_namespaces
https://api.ap2.datadoghq.com/api/v2/integration/aws/available_namespaces
https://api.datadoghq.eu/api/v2/integration/aws/available_namespaces
https://api.ddog-gov.com/api/v2/integration/aws/available_namespaces
https://api.datadoghq.com/api/v2/integration/aws/available_namespaces
https://api.us3.datadoghq.com/api/v2/integration/aws/available_namespaces
https://api.us5.datadoghq.com/api/v2/integration/aws/available_namespaces

### Overview

Get a list of available AWS CloudWatch namespaces that can send metrics to Datadog. This endpoint requires the `aws_configuration_read` permission.

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSNamespaces-200-v2)
* [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSNamespaces-403-v2)
* [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSNamespaces-429-v2)

AWS Namespaces List object

AWS Namespaces response body.

| Field | Type | Description |
| --- | --- | --- |
| data [_required_] | object | AWS Namespaces response data. |
| attributes | object | AWS Namespaces response attributes. |
| namespaces [_required_] | [string] | AWS CloudWatch namespace. |
| id [_required_] | string | The `AWSNamespacesResponseData` `id`. default: `namespaces` |
| type [_required_] | enum | The `AWSNamespacesResponseData` `type`. Allowed enum values: `namespaces` default: `namespaces` |

```
{
  "data": {
    "attributes": {
      "namespaces": [
        "AWS/ApiGateway"
      ]
    },
    "id": "namespaces",
    "type": "namespaces"
  }
}
```

Forbidden

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### List available namespaces

```
# Curl command (uses the US1 site; substitute your Datadog site's API host if needed)
curl -X GET "https://api.datadoghq.com/api/v2/integration/aws/available_namespaces" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### List available namespaces

```
"""
List available namespaces returns "AWS Namespaces List object" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.list_aws_namespaces()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### List available namespaces

```
# List available namespaces returns "AWS Namespaces List object" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new
p api_instance.list_aws_namespaces()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### List available namespaces

```
// List available namespaces returns "AWS Namespaces List object" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewAWSIntegrationApi(apiClient)
    resp, r, err := api.ListAWSNamespaces(ctx)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSNamespaces`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSNamespaces`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### List available namespaces

```
// List available namespaces returns "AWS Namespaces List object" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.AwsIntegrationApi;
import com.datadog.api.client.v2.model.AWSNamespacesResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient);

    try {
      AWSNamespacesResponse result = apiInstance.listAWSNamespaces();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling AwsIntegrationApi#listAWSNamespaces");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### List available namespaces

```
// List available namespaces returns "AWS Namespaces List object" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = AWSIntegrationAPI::with_config(configuration);
    let resp = api.list_aws_namespaces().await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### List available namespaces

```
/**
 * List available namespaces returns "AWS Namespaces List object" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.AWSIntegrationApi(configuration);

apiInstance
  .listAWSNamespaces()
  .then((data: v2.AWSNamespacesResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
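The namespaces list is most useful as a lookup before you depend on a metric source. A minimal sketch in Python, assuming the response object exposes the `data.attributes.namespaces` path shown in the JSON model above:

```
# Sketch: check whether a specific CloudWatch namespace can send metrics to Datadog.
# Assumes the response mirrors the JSON model above (data -> attributes -> namespaces).
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.list_aws_namespaces()

    available = set(response.data.attributes.namespaces)
    wanted = "AWS/ApiGateway"  # example namespace taken from the response model above
    print(f"{wanted} available: {wanted in available}")
```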
* * *

## [Set an AWS tag filter](https://docs.datadoghq.com/api/latest/aws-integration/#set-an-aws-tag-filter)

* [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-integration/#set-an-aws-tag-filter-v1)

POST https://api.ap1.datadoghq.com/api/v1/integration/aws/filtering
https://api.ap2.datadoghq.com/api/v1/integration/aws/filtering
https://api.datadoghq.eu/api/v1/integration/aws/filtering
https://api.ddog-gov.com/api/v1/integration/aws/filtering
https://api.datadoghq.com/api/v1/integration/aws/filtering
https://api.us3.datadoghq.com/api/v1/integration/aws/filtering
https://api.us5.datadoghq.com/api/v1/integration/aws/filtering

### Overview

Set an AWS tag filter. This endpoint requires the `aws_configuration_edit` permission.

### Request

#### Body Data (required)

Set an AWS tag filter using an `aws_account_identifier`, `namespace`, and filtering string. Namespace options are `application_elb`, `elb`, `lambda`, `network_elb`, `rds`, `sqs`, and `custom`.

| Field | Type | Description |
| --- | --- | --- |
| account_id | string | Your AWS Account ID without dashes. |
| namespace | enum | The namespace associated with the tag filter entry. Allowed enum values: `elb,application_elb,sqs,rds,custom,network_elb,lambda,step_functions` |
| tag_filter_str | string | The tag filter string. |

```
{
  "account_id": "123456789012",
  "namespace": "string",
  "tag_filter_str": "prod*"
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSTagFilter-200-v1)
* [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSTagFilter-400-v1)
* [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSTagFilter-403-v1)
* [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSTagFilter-429-v1)

OK

No response body

```
{}
```

Bad Request

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

Authentication Error

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Set an AWS tag filter

```
# Curl command (uses the US1 site; substitute your Datadog site's API host if needed)
curl -X POST "https://api.datadoghq.com/api/v1/integration/aws/filtering" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{}
EOF
```

##### Set an AWS tag filter

```
"""
Set an AWS tag filter returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi
from datadog_api_client.v1.model.aws_namespace import AWSNamespace
from datadog_api_client.v1.model.aws_tag_filter_create_request import AWSTagFilterCreateRequest

body = AWSTagFilterCreateRequest(
    account_id="123456789012",
    namespace=AWSNamespace.ELB,
    tag_filter_str="prod*",
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.create_aws_tag_filter(body=body)

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Set an AWS tag filter

```
# Set an AWS tag filter returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new

body = DatadogAPIClient::V1::AWSTagFilterCreateRequest.new({
  account_id: "123456789012",
  namespace: DatadogAPIClient::V1::AWSNamespace::ELB,
  tag_filter_str: "prod*",
})
p api_instance.create_aws_tag_filter(body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Set an AWS tag filter

```
// Set an AWS tag filter returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    body := datadogV1.AWSTagFilterCreateRequest{
        AccountId:    datadog.PtrString("123456789012"),
        Namespace:    datadogV1.AWSNAMESPACE_ELB.Ptr(),
        TagFilterStr: datadog.PtrString("prod*"),
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV1.NewAWSIntegrationApi(apiClient)
    resp, r, err := api.CreateAWSTagFilter(ctx, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSTagFilter`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSTagFilter`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Set an AWS tag filter

```
// Set an AWS tag filter returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.AwsIntegrationApi;
import com.datadog.api.client.v1.model.AWSNamespace;
import com.datadog.api.client.v1.model.AWSTagFilterCreateRequest;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient);

    AWSTagFilterCreateRequest body =
        new AWSTagFilterCreateRequest()
            .accountId("123456789012")
            .namespace(AWSNamespace.ELB)
            .tagFilterStr("prod*");

    try {
      apiInstance.createAWSTagFilter(body);
    } catch (ApiException e) {
      System.err.println("Exception when calling AwsIntegrationApi#createAWSTagFilter");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Set an AWS tag filter

```
// Set an AWS tag filter returns "OK" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI;
use datadog_api_client::datadogV1::model::AWSNamespace;
use datadog_api_client::datadogV1::model::AWSTagFilterCreateRequest;

#[tokio::main]
async fn main() {
    let body = AWSTagFilterCreateRequest::new()
        .account_id("123456789012".to_string())
        .namespace(AWSNamespace::ELB)
        .tag_filter_str("prod*".to_string());
    let configuration = datadog::Configuration::new();
    let api = AWSIntegrationAPI::with_config(configuration);
    let resp = api.create_aws_tag_filter(body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Set an AWS tag filter

```
/**
 * Set an AWS tag filter returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.AWSIntegrationApi(configuration);

const params: v1.AWSIntegrationApiCreateAWSTagFilterRequest = {
  body: {
    accountId: "123456789012",
    namespace: "elb",
    tagFilterStr: "prod*",
  },
};

apiInstance
  .createAWSTagFilter(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
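A successful call returns an empty body, so a common pattern is to set the filter and then read the account's filters back with [Get all AWS tag filters](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-aws-tag-filters). A minimal sketch in Python, reusing the placeholder account ID and filter values from the examples above:

```
# Sketch: create a tag filter, then list the account's filters to confirm it was applied.
# The account ID, namespace, and filter string are the placeholder values used above.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi
from datadog_api_client.v1.model.aws_namespace import AWSNamespace
from datadog_api_client.v1.model.aws_tag_filter_create_request import AWSTagFilterCreateRequest

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)

    api_instance.create_aws_tag_filter(
        body=AWSTagFilterCreateRequest(
            account_id="123456789012",
            namespace=AWSNamespace.ELB,
            tag_filter_str="prod*",
        )
    )

    # The 200 response has no body, so verify by listing the filters back.
    response = api_instance.list_aws_tag_filters(account_id="123456789012")
    for tag_filter in response.filters:
        print(tag_filter.namespace, tag_filter.tag_filter_str)
```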
v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiCreateAWSTagFilterRequest = { body: { accountId: "123456789012", namespace: "elb", tagFilterStr: "prod*", }, }; apiInstance .createAWSTagFilter(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a tag filtering entry](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-tag-filtering-entry) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-integration/#delete-a-tag-filtering-entry-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/aws/filteringhttps://api.ap2.datadoghq.com/api/v1/integration/aws/filteringhttps://api.datadoghq.eu/api/v1/integration/aws/filteringhttps://api.ddog-gov.com/api/v1/integration/aws/filteringhttps://api.datadoghq.com/api/v1/integration/aws/filteringhttps://api.us3.datadoghq.com/api/v1/integration/aws/filteringhttps://api.us5.datadoghq.com/api/v1/integration/aws/filtering ### Overview Delete a tag filtering entry. This endpoint requires the `aws_configuration_edit` permission. ### Request #### Body Data (required) Delete a tag filtering entry for a given AWS account and `dd-aws` namespace. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description account_id string The unique identifier of your AWS account. namespace enum The namespace associated with the tag filter entry. Allowed enum values: `elb,application_elb,sqs,rds,custom,network_elb,lambda,step_functions` ``` { "account_id": "FAKEAC0FAKEAC2FAKEAC", "namespace": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSTagFilter-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSTagFilter-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSTagFilter-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSTagFilter-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. 
Field Type Description

errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Delete a tag filtering entry

```
# Curl command (use the endpoint URL for your Datadog site, listed above)
curl -X DELETE "https://api.datadoghq.com/api/v1/integration/aws/filtering" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "account_id": "FAKEAC0FAKEAC2FAKEAC",
  "namespace": "elb"
}
EOF
```

##### Delete a tag filtering entry

```
"""
Delete a tag filtering entry returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi
from datadog_api_client.v1.model.aws_namespace import AWSNamespace
from datadog_api_client.v1.model.aws_tag_filter_delete_request import AWSTagFilterDeleteRequest

body = AWSTagFilterDeleteRequest(
    account_id="FAKEAC0FAKEAC2FAKEAC",
    namespace=AWSNamespace.ELB,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.delete_aws_tag_filter(body=body)
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete a tag filtering entry

```
# Delete a tag filtering entry returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new
body = DatadogAPIClient::V1::AWSTagFilterDeleteRequest.new({
  account_id: "FAKEAC0FAKEAC2FAKEAC",
  namespace: DatadogAPIClient::V1::AWSNamespace::ELB,
})
p api_instance.delete_aws_tag_filter(body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete a tag filtering entry

```
// Delete a tag filtering entry returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	body := datadogV1.AWSTagFilterDeleteRequest{
		AccountId: datadog.PtrString("FAKEAC0FAKEAC2FAKEAC"),
		Namespace: datadogV1.AWSNAMESPACE_ELB.Ptr(),
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
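	// Credentials are read from the environment (DD_API_KEY, DD_APP_KEY, and DD_SITE),
	// as shown in the run instructions below.
	// Note: a successful delete returns 200 OK with no response body.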
apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.DeleteAWSTagFilter(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.DeleteAWSTagFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.DeleteAWSTagFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a tag filtering entry ``` // Delete a tag filtering entry returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSNamespace; import com.datadog.api.client.v1.model.AWSTagFilterDeleteRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSTagFilterDeleteRequest body = new AWSTagFilterDeleteRequest() .accountId("FAKEAC0FAKEAC2FAKEAC") .namespace(AWSNamespace.ELB); try { apiInstance.deleteAWSTagFilter(body); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#deleteAWSTagFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a tag filtering entry ``` // Delete a tag filtering entry returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSNamespace; use datadog_api_client::datadogV1::model::AWSTagFilterDeleteRequest; #[tokio::main] async fn main() { let body = AWSTagFilterDeleteRequest::new() .account_id("FAKEAC0FAKEAC2FAKEAC".to_string()) .namespace(AWSNamespace::ELB); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.delete_aws_tag_filter(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a tag filtering entry ``` /** * Delete a tag filtering entry returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiDeleteAWSTagFilterRequest = { body: { accountId: "FAKEAC0FAKEAC2FAKEAC", namespace: "elb", }, }; apiInstance .deleteAWSTagFilter(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an AWS integration by config ID](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-aws-integration-by-config-id) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#get-an-aws-integration-by-config-id-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ap2.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.eu/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ddog-gov.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us3.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us5.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id} ### Overview Get an AWS Account Integration Config by config ID. This endpoint requires the `aws_configuration_read` permission. ### Arguments #### Path Parameters Name Type Description aws_account_config_id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSAccount-429-v2) AWS Account object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS Account response body. Expand All Field Type Description _required_] object AWS Account response data. object AWS Account response attributes. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. aws_partition enum AWS partition your AWS account is scoped to. Defaults to `aws`. 
See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. created_at date-time Timestamp of when the account integration was created. object AWS Logs Collection config. object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. modified_at date-time Timestamp of when the account integration was updated. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. object AWS Traces Collection config. AWS X-Ray services to collect traces from. Defaults to `include_only`. 
object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. type [_required_] enum AWS Account resource type. Allowed enum values: `account` default: `account` ``` { "data": { "attributes": { "account_tags": [ "env:prod" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "aws_regions": { "include_all": true }, "created_at": "2019-09-19T10:00:00.000Z", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "env:prod" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": false, "collect_custom_metrics": false, "enabled": true, "namespace_filters": { "exclude_only": [ "AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage" ] }, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "datadog:true" ] } ] }, "modified_at": "2019-09-19T10:00:00.000Z", "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": true }, "traces_config": { "xray_services": { "include_all": false } } }, "id": "00000000-abcd-0001-0000-000000000000", "type": "account" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Get an AWS integration by config ID Copy ``` # Path parameters export aws_account_config_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/accounts/${aws_account_config_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an AWS integration by config ID ``` """ Get an AWS integration by config ID returns "AWS Account object" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi # there is a valid "aws_account_v2" in the system AWS_ACCOUNT_V2_DATA_ID = environ["AWS_ACCOUNT_V2_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.get_aws_account( aws_account_config_id=AWS_ACCOUNT_V2_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an AWS integration by config ID ``` # Get an AWS integration by config ID returns "AWS Account object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new # there is a valid "aws_account_v2" in the system AWS_ACCOUNT_V2_DATA_ID = ENV["AWS_ACCOUNT_V2_DATA_ID"] p api_instance.get_aws_account(AWS_ACCOUNT_V2_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an AWS integration by config ID ``` // Get an AWS integration by config ID returns "AWS Account object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "aws_account_v2" in the system AwsAccountV2DataID := os.Getenv("AWS_ACCOUNT_V2_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.GetAWSAccount(ctx, AwsAccountV2DataID) if 
err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.GetAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.GetAWSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an AWS integration by config ID ``` // Get an AWS integration by config ID returns "AWS Account object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); // there is a valid "aws_account_v2" in the system String AWS_ACCOUNT_V2_DATA_ID = System.getenv("AWS_ACCOUNT_V2_DATA_ID"); try { AWSAccountResponse result = apiInstance.getAWSAccount(AWS_ACCOUNT_V2_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#getAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an AWS integration by config ID ``` // Get an AWS integration by config ID returns "AWS Account object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "aws_account_v2" in the system let aws_account_v2_data_id = std::env::var("AWS_ACCOUNT_V2_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.get_aws_account(aws_account_v2_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an AWS integration by config ID ``` /** * Get an AWS integration by config ID returns "AWS Account object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); // there is a valid "aws_account_v2" in the system const AWS_ACCOUNT_V2_DATA_ID = 
process.env.AWS_ACCOUNT_V2_DATA_ID as string; const params: v2.AWSIntegrationApiGetAWSAccountRequest = { awsAccountConfigId: AWS_ACCOUNT_V2_DATA_ID, }; apiInstance .getAWSAccount(params) .then((data: v2.AWSAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Generate a new external ID](https://docs.datadoghq.com/api/latest/aws-integration/#generate-a-new-external-id) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#generate-a-new-external-id-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#generate-a-new-external-id-v2) PUT https://api.ap1.datadoghq.com/api/v1/integration/aws/generate_new_external_idhttps://api.ap2.datadoghq.com/api/v1/integration/aws/generate_new_external_idhttps://api.datadoghq.eu/api/v1/integration/aws/generate_new_external_idhttps://api.ddog-gov.com/api/v1/integration/aws/generate_new_external_idhttps://api.datadoghq.com/api/v1/integration/aws/generate_new_external_idhttps://api.us3.datadoghq.com/api/v1/integration/aws/generate_new_external_idhttps://api.us5.datadoghq.com/api/v1/integration/aws/generate_new_external_id ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Generate a new AWS external ID for a given AWS account ID and role name pair. This endpoint requires the `aws_configuration_edit` permission. ### Request #### Body Data (required) Your Datadog role delegation name. For more information about your AWS account Role name, see the [Datadog AWS integration configuration info](https://docs.datadoghq.com/integrations/amazon_web_services/#setup). * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description access_key_id string Your AWS access key ID. Only required if your AWS account is a GovCloud or China account. account_id string Your AWS Account ID without dashes. object An object (in the form `{"namespace1":true/false, "namespace2":true/false}`) containing user-supplied overrides for AWS namespace metric collection. **Important** : This field only contains namespaces explicitly configured through API calls, not the comprehensive enabled or disabled status of all namespaces. If a namespace is absent from this field, it uses Datadog's internal defaults (all namespaces enabled by default, except `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage`). For a complete view of all namespace statuses, use the V2 AWS Integration API instead. boolean A list of additional properties. cspm_resource_collection_enabled boolean Whether Datadog collects cloud security posture management resources from your AWS account. This includes additional resources not covered under the general `resource_collection`. excluded_regions [string] An array of [AWS regions](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints) to exclude from metrics collection. extended_resource_collection_enabled boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. 
Required for `cspm_resource_collection`. filter_tags [string] The array of EC2 tags (in the form `key:value`) defines a filter that Datadog uses when collecting metrics from EC2. Wildcards, such as `?` (for single characters) and `*` (for multiple characters) can also be used. Only hosts that match one of the defined tags will be imported into Datadog. The rest will be ignored. Host matching a given tag can also be excluded by adding `!` before the tag. For example, `env:production,instance-type:c1.*,!region:us-east-1` host_tags [string] Array of tags (in the form `key:value`) to add to all hosts and metrics reporting through this integration. metrics_collection_enabled boolean Whether Datadog collects metrics for this AWS account. default: `true` resource_collection_enabled boolean **DEPRECATED** : Deprecated in favor of 'extended_resource_collection_enabled'. Whether Datadog collects a standard set of resources from your AWS account. role_name string Your Datadog role delegation name. secret_access_key string Your AWS secret access key. Only required if your AWS account is a GovCloud or China account. ``` { "access_key_id": "string", "account_id": "123456789012", "account_specific_namespace_rules": { "": false }, "cspm_resource_collection_enabled": true, "excluded_regions": [ "us-east-1", "us-west-2" ], "extended_resource_collection_enabled": true, "filter_tags": [ "$KEY:$VALUE" ], "host_tags": [ "$KEY:$VALUE" ], "metrics_collection_enabled": false, "resource_collection_enabled": true, "role_name": "DatadogAWSIntegrationRole", "secret_access_key": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) The Response returned by the AWS Create Account call. Expand All Field Type Description external_id string AWS external_id. ``` { "external_id": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Generate a new external ID Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/generate_new_external_id" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Generate a new external ID ``` """ Generate a new external ID returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_account import AWSAccount body = AWSAccount( account_id="123456789012", account_specific_namespace_rules=dict( auto_scaling=False, opswork=False, ), cspm_resource_collection_enabled=True, excluded_regions=[ "us-east-1", "us-west-2", ], extended_resource_collection_enabled=True, filter_tags=[ "$KEY:$VALUE", ], host_tags=[ "$KEY:$VALUE", ], metrics_collection_enabled=False, resource_collection_enabled=True, role_name="DatadogAWSIntegrationRole", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_new_aws_external_id(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Generate a new external ID ``` # Generate a new external ID returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccount.new({ account_id: "123456789012", account_specific_namespace_rules: { auto_scaling: false, opswork: false, }, cspm_resource_collection_enabled: true, excluded_regions: [ "us-east-1", "us-west-2", ], extended_resource_collection_enabled: true, filter_tags: [ "$KEY:$VALUE", ], host_tags: [ "$KEY:$VALUE", ], metrics_collection_enabled: false, resource_collection_enabled: true, role_name: "DatadogAWSIntegrationRole", }) p api_instance.create_new_aws_external_id(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "OK" response 
package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccount{ AccountId: datadog.PtrString("123456789012"), AccountSpecificNamespaceRules: map[string]bool{ "auto_scaling": false, "opswork": false, }, CspmResourceCollectionEnabled: datadog.PtrBool(true), ExcludedRegions: []string{ "us-east-1", "us-west-2", }, ExtendedResourceCollectionEnabled: datadog.PtrBool(true), FilterTags: []string{ "$KEY:$VALUE", }, HostTags: []string{ "$KEY:$VALUE", }, MetricsCollectionEnabled: datadog.PtrBool(false), ResourceCollectionEnabled: datadog.PtrBool(true), RoleName: datadog.PtrString("DatadogAWSIntegrationRole"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateNewAWSExternalID(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateNewAWSExternalID`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateNewAWSExternalID`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccount; import com.datadog.api.client.v1.model.AWSAccountCreateResponse; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccount body = new AWSAccount() .accountId("123456789012") .accountSpecificNamespaceRules( Map.ofEntries(Map.entry("auto_scaling", false), Map.entry("opswork", false))) .cspmResourceCollectionEnabled(true) .excludedRegions(Arrays.asList("us-east-1", "us-west-2")) .extendedResourceCollectionEnabled(true) .filterTags(Collections.singletonList("$KEY:$VALUE")) .hostTags(Collections.singletonList("$KEY:$VALUE")) .metricsCollectionEnabled(false) .resourceCollectionEnabled(true) .roleName("DatadogAWSIntegrationRole"); try { AWSAccountCreateResponse result = apiInstance.createNewAWSExternalID(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createNewAWSExternalID"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
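# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com)
# and supply a valid API key and application key: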
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccount; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = AWSAccount::new() .account_id("123456789012".to_string()) .account_specific_namespace_rules(BTreeMap::from([ ("auto_scaling".to_string(), false), ("opswork".to_string(), false), ])) .cspm_resource_collection_enabled(true) .excluded_regions(vec!["us-east-1".to_string(), "us-west-2".to_string()]) .extended_resource_collection_enabled(true) .filter_tags(vec!["$KEY:$VALUE".to_string()]) .host_tags(vec!["$KEY:$VALUE".to_string()]) .metrics_collection_enabled(false) .resource_collection_enabled(true) .role_name("DatadogAWSIntegrationRole".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_new_aws_external_id(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Generate a new external ID ``` /** * Generate a new external ID returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiCreateNewAWSExternalIDRequest = { body: { accountId: "123456789012", accountSpecificNamespaceRules: { auto_scaling: false, opswork: false, }, cspmResourceCollectionEnabled: true, excludedRegions: ["us-east-1", "us-west-2"], extendedResourceCollectionEnabled: true, filterTags: ["$KEY:$VALUE"], hostTags: ["$KEY:$VALUE"], metricsCollectionEnabled: false, resourceCollectionEnabled: true, roleName: "DatadogAWSIntegrationRole", }, }; apiInstance .createNewAWSExternalID(params) .then((data: v1.AWSAccountCreateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/integration/aws/generate_new_external_idhttps://api.ap2.datadoghq.com/api/v2/integration/aws/generate_new_external_idhttps://api.datadoghq.eu/api/v2/integration/aws/generate_new_external_idhttps://api.ddog-gov.com/api/v2/integration/aws/generate_new_external_idhttps://api.datadoghq.com/api/v2/integration/aws/generate_new_external_idhttps://api.us3.datadoghq.com/api/v2/integration/aws/generate_new_external_idhttps://api.us5.datadoghq.com/api/v2/integration/aws/generate_new_external_id ### Overview Generate a new external ID for AWS role-based authentication. This endpoint requires the `aws_configuration_edit` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-200-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-403-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateNewAWSExternalID-429-v2) AWS External ID object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS External ID response body. Expand All Field Type Description _required_] object AWS External ID response body. object AWS External ID response body. external_id [_required_] string AWS IAM External ID for associated role. id [_required_] string The `AWSNewExternalIDResponseData` `id`. default: `external_id` type [_required_] enum The `AWSNewExternalIDResponseData` `type`. Allowed enum values: `external_id` default: `external_id` ``` { "data": { "attributes": { "external_id": "acb8f6b8a844443dbb726d07dcb1a870" }, "id": "external_id", "type": "external_id" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Generate a new external ID Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/generate_new_external_id" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Generate a new external ID ``` """ Generate a new external ID returns "AWS External ID object" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_new_aws_external_id() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Generate a new external ID ``` # Generate a new external ID returns "AWS External ID object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new p api_instance.create_new_aws_external_id() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "AWS External ID object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateNewAWSExternalID(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateNewAWSExternalID`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateNewAWSExternalID`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "AWS External ID object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSNewExternalIDResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSNewExternalIDResponse result = apiInstance.createNewAWSExternalID(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createNewAWSExternalID"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Generate a new external ID ``` // Generate a new external ID returns "AWS External ID object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_new_aws_external_id().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Generate a new external ID ``` /** * Generate a new external ID returns "AWS External ID object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); apiInstance .createNewAWSExternalID() .then((data: v2.AWSNewExternalIDResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List namespace rules](https://docs.datadoghq.com/api/latest/aws-integration/#list-namespace-rules) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-integration/#list-namespace-rules-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/aws/available_namespace_ruleshttps://api.ap2.datadoghq.com/api/v1/integration/aws/available_namespace_ruleshttps://api.datadoghq.eu/api/v1/integration/aws/available_namespace_ruleshttps://api.ddog-gov.com/api/v1/integration/aws/available_namespace_ruleshttps://api.datadoghq.com/api/v1/integration/aws/available_namespace_ruleshttps://api.us3.datadoghq.com/api/v1/integration/aws/available_namespace_ruleshttps://api.us5.datadoghq.com/api/v1/integration/aws/available_namespace_rules ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** List all namespace rules for a given Datadog-AWS integration. This endpoint takes no arguments. This endpoint requires the `aws_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAvailableAWSNamespaces-200-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAvailableAWSNamespaces-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAvailableAWSNamespaces-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description string ``` [ "namespace1", "namespace2", "namespace3" ] ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python-legacy) ##### List namespace rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/available_namespace_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List namespace rules ``` """ List namespace rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.list_available_aws_namespaces() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List namespace rules ``` # List namespace rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new p api_instance.list_available_aws_namespaces() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List namespace rules ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.aws_integration_list_namespaces ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List namespace rules ``` // List namespace rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.ListAvailableAWSNamespaces(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAvailableAWSNamespaces`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAvailableAWSNamespaces`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List namespace rules ``` // List namespace rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { List result = apiInstance.listAvailableAWSNamespaces(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#listAvailableAWSNamespaces"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List namespace rules ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AwsIntegration.list_namespace_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### List namespace rules ``` // List namespace rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.list_available_aws_namespaces().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List namespace rules ``` /** * List namespace rules returns "OK" response */ import 
{ client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); apiInstance .listAvailableAWSNamespaces() .then((data: string[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get AWS integration IAM permissions](https://docs.datadoghq.com/api/latest/aws-integration/#get-aws-integration-iam-permissions) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#get-aws-integration-iam-permissions-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/aws/iam_permissionshttps://api.ap2.datadoghq.com/api/v2/integration/aws/iam_permissionshttps://api.datadoghq.eu/api/v2/integration/aws/iam_permissionshttps://api.ddog-gov.com/api/v2/integration/aws/iam_permissionshttps://api.datadoghq.com/api/v2/integration/aws/iam_permissionshttps://api.us3.datadoghq.com/api/v2/integration/aws/iam_permissionshttps://api.us5.datadoghq.com/api/v2/integration/aws/iam_permissions ### Overview Get all AWS IAM permissions required for the AWS integration. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissions-200-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissions-429-v2) AWS IAM Permissions object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS Integration IAM Permissions response body. Expand All Field Type Description _required_] object AWS Integration IAM Permissions response data. object AWS Integration IAM Permissions response attributes. permissions [_required_] [string] List of AWS IAM permissions required for the integration. id string The `AWSIntegrationIamPermissionsResponseData` `id`. default: `permissions` type enum The `AWSIntegrationIamPermissionsResponseData` `type`. Allowed enum values: `permissions` default: `permissions` ``` { "data": { "attributes": { "permissions": [ "account:GetContactInformation", "amplify:ListApps", "amplify:ListArtifacts", "amplify:ListBackendEnvironments", "amplify:ListBranches" ] }, "id": "permissions", "type": "permissions" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Get AWS integration IAM permissions

```
# Curl command
curl -X GET "https://api.datadoghq.com/api/v2/integration/aws/iam_permissions" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get AWS integration IAM permissions

```
"""
Get AWS integration IAM permissions returns "AWS IAM Permissions object" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.get_aws_integration_iam_permissions()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get AWS integration IAM permissions

```
# Get AWS integration IAM permissions returns "AWS IAM Permissions object" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new
p api_instance.get_aws_integration_iam_permissions()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb"
```

##### Get AWS integration IAM permissions

```
// Get AWS integration IAM permissions returns "AWS IAM Permissions object" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewAWSIntegrationApi(apiClient)
	resp, r, err := api.GetAWSIntegrationIAMPermissions(ctx)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.GetAWSIntegrationIAMPermissions`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.GetAWSIntegrationIAMPermissions`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get AWS integration IAM permissions ``` // Get AWS integration IAM permissions returns "AWS IAM Permissions object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSIntegrationIamPermissionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSIntegrationIamPermissionsResponse result = apiInstance.getAWSIntegrationIAMPermissions(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling AwsIntegrationApi#getAWSIntegrationIAMPermissions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get AWS integration IAM permissions ``` // Get AWS integration IAM permissions returns "AWS IAM Permissions object" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.get_aws_integration_iam_permissions().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get AWS integration IAM permissions ``` /** * Get AWS integration IAM permissions returns "AWS IAM Permissions object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); apiInstance .getAWSIntegrationIAMPermissions() .then((data: v2.AWSIntegrationIamPermissionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations)

* [v1](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations-v2)

GET https://api.ap1.datadoghq.com/api/v1/integration/awshttps://api.ap2.datadoghq.com/api/v1/integration/awshttps://api.datadoghq.eu/api/v1/integration/awshttps://api.ddog-gov.com/api/v1/integration/awshttps://api.datadoghq.com/api/v1/integration/awshttps://api.us3.datadoghq.com/api/v1/integration/awshttps://api.us5.datadoghq.com/api/v1/integration/aws

### Overview

**This endpoint is deprecated - use the V2 endpoints instead.**

List all Datadog-AWS integrations available in your Datadog organization.

This endpoint requires the `aws_configuration_read` permission.

### Arguments

#### Query Strings

Name Type Description account_id string Only return AWS accounts that match this `account_id`. role_name string Only return AWS accounts that match this `role_name`. access_key_id string Only return AWS accounts that match this `access_key_id`.

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

List of enabled AWS accounts.

Field Type Description accounts [object] List of enabled AWS accounts. access_key_id string Your AWS access key ID. Only required if your AWS account is a GovCloud or China account. account_id string Your AWS Account ID without dashes. account_specific_namespace_rules object An object (in the form `{"namespace1":true/false, "namespace2":true/false}`) containing user-supplied overrides for AWS namespace metric collection. **Important**: This field only contains namespaces explicitly configured through API calls, not the comprehensive enabled or disabled status of all namespaces. If a namespace is absent from this field, it uses Datadog's internal defaults (all namespaces enabled by default, except `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage`). For a complete view of all namespace statuses, use the V2 AWS Integration API instead. See the sketch after the response models below for a realistic example. boolean A list of additional properties. cspm_resource_collection_enabled boolean Whether Datadog collects cloud security posture management resources from your AWS account. This includes additional resources not covered under the general `resource_collection`. excluded_regions [string] An array of [AWS regions](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints) to exclude from metrics collection.
extended_resource_collection_enabled boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Required for `cspm_resource_collection`. filter_tags [string] The array of EC2 tags (in the form `key:value`) defines a filter that Datadog uses when collecting metrics from EC2. Wildcards, such as `?` (for single characters) and `*` (for multiple characters), can also be used. Only hosts that match one of the defined tags are imported into Datadog; the rest are ignored. Hosts matching a given tag can also be excluded by adding `!` before the tag. For example, `env:production,instance-type:c1.*,!region:us-east-1`. host_tags [string] Array of tags (in the form `key:value`) to add to all hosts and metrics reporting through this integration. metrics_collection_enabled boolean Whether Datadog collects metrics for this AWS account. default: `true` resource_collection_enabled boolean **DEPRECATED**: Deprecated in favor of `extended_resource_collection_enabled`. Whether Datadog collects a standard set of resources from your AWS account. role_name string Your Datadog role delegation name. secret_access_key string Your AWS secret access key. Only required if your AWS account is a GovCloud or China account.

```
{ "accounts": [ { "access_key_id": "string", "account_id": "123456789012", "account_specific_namespace_rules": { "": false }, "cspm_resource_collection_enabled": true, "excluded_regions": [ "us-east-1", "us-west-2" ], "extended_resource_collection_enabled": true, "filter_tags": [ "$KEY:$VALUE" ], "host_tags": [ "$KEY:$VALUE" ], "metrics_collection_enabled": false, "resource_collection_enabled": true, "role_name": "DatadogAWSIntegrationRole", "secret_access_key": "string" } ] }
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Authentication Error

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```
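To make the two filter-style fields described above more concrete, here is a small illustrative sketch. The namespace names and tag values are hypothetical examples chosen for this sketch, not values returned by the API:

```python
# Hypothetical values, for illustration only.

# `account_specific_namespace_rules`: keyed by CloudWatch namespace.
# Only namespaces explicitly configured through the API appear here;
# anything absent falls back to Datadog's internal defaults.
account_specific_namespace_rules = {
    "AWS/EC2": True,                 # explicitly enabled
    "AWS/SQS": False,                # explicitly disabled (also excluded by default)
    "AWS/ElasticMapReduce": False,
}

# `filter_tags`: import only production c1.* instances outside us-east-1.
# `*` and `?` act as wildcards; a leading `!` excludes hosts matching the tag.
filter_tags = ["env:production", "instance-type:c1.*", "!region:us-east-1"]
```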
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python-legacy)

##### List all AWS integrations

```
# Curl command
curl -X GET "https://api.datadoghq.com/api/v1/integration/aws" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### List all AWS integrations

```
"""
List all AWS integrations returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.list_aws_accounts()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### List all AWS integrations

```
# List all AWS integrations returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new
p api_instance.list_aws_accounts()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb"
```

##### List all AWS integrations

```
require 'rubygems'
require 'dogapi'

api_key = ''
app_key = ''

dog = Dogapi::Client.new(api_key, app_key)

dog.aws_integration_list
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb"
```

##### List all AWS integrations ``` // List all AWS integrations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient :=
datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.ListAWSAccounts(ctx, *datadogV1.NewListAWSAccountsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all AWS integrations ``` // List all AWS integrations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccountListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSAccountListResponse result = apiInstance.listAWSAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#listAWSAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all AWS integrations ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AwsIntegration.list() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### List all AWS integrations ``` // List all AWS integrations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::api_aws_integration::ListAWSAccountsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api .list_aws_accounts(ListAWSAccountsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all AWS integrations ``` /** * List all AWS integrations returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); apiInstance .listAWSAccounts() .then((data: v1.AWSAccountListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/integration/aws/accountshttps://api.ap2.datadoghq.com/api/v2/integration/aws/accountshttps://api.datadoghq.eu/api/v2/integration/aws/accountshttps://api.ddog-gov.com/api/v2/integration/aws/accountshttps://api.datadoghq.com/api/v2/integration/aws/accountshttps://api.us3.datadoghq.com/api/v2/integration/aws/accountshttps://api.us5.datadoghq.com/api/v2/integration/aws/accounts ### Overview Get a list of AWS Account Integration Configs. This endpoint requires the `aws_configuration_read` permission. ### Arguments #### Query Strings Name Type Description aws_account_id string Optional query parameter to filter accounts by AWS Account ID. If not provided, all accounts are returned. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-200-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-403-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSAccounts-429-v2) AWS Accounts List object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS Accounts response body. Expand All Field Type Description _required_] [object] List of AWS Account Integration Configs. object AWS Account response attributes. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. aws_partition enum AWS partition your AWS account is scoped to. Defaults to `aws`. See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. created_at date-time Timestamp of when the account integration was created. object AWS Logs Collection config. 
object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. modified_at date-time Timestamp of when the account integration was updated. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. object AWS Traces Collection config. AWS X-Ray services to collect traces from. Defaults to `include_only`. object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. type [_required_] enum AWS Account resource type. 
Allowed enum values: `account` default: `account` ``` { "data": [ { "attributes": { "account_tags": [ "env:prod" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "aws_regions": { "include_all": true }, "created_at": "2019-09-19T10:00:00.000Z", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "env:prod" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": false, "collect_custom_metrics": false, "enabled": true, "namespace_filters": { "exclude_only": [ "AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage" ] }, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "datadog:true" ] } ] }, "modified_at": "2019-09-19T10:00:00.000Z", "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": true }, "traces_config": { "xray_services": { "include_all": false } } }, "id": "00000000-abcd-0001-0000-000000000000", "type": "account" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### List all AWS integrations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all AWS integrations ``` """ List all AWS integrations returns "AWS Accounts List object" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.list_aws_accounts() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all AWS integrations ``` # List all AWS integrations returns "AWS Accounts List object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new p api_instance.list_aws_accounts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all AWS integrations ``` // List all AWS integrations returns "AWS Accounts List object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.ListAWSAccounts(ctx, *datadogV2.NewListAWSAccountsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all AWS integrations ``` // List all AWS integrations returns "AWS Accounts List object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSAccountsResponse result = apiInstance.listAWSAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#listAWSAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all AWS integrations ``` // List all AWS integrations returns "AWS Accounts List object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use 
datadog_api_client::datadogV2::api_aws_integration::ListAWSAccountsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api .list_aws_accounts(ListAWSAccountsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### List all AWS integrations

```
/**
 * List all AWS integrations returns "AWS Accounts List object" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.AWSIntegrationApi(configuration);

apiInstance
  .listAWSAccounts()
  .then((data: v2.AWSAccountsResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Delete an AWS integration](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-aws-integration)

* [v1](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-aws-integration-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-aws-integration-v2)

DELETE https://api.ap1.datadoghq.com/api/v1/integration/awshttps://api.ap2.datadoghq.com/api/v1/integration/awshttps://api.datadoghq.eu/api/v1/integration/awshttps://api.ddog-gov.com/api/v1/integration/awshttps://api.datadoghq.com/api/v1/integration/awshttps://api.us3.datadoghq.com/api/v1/integration/awshttps://api.us5.datadoghq.com/api/v1/integration/aws

### Overview

**This endpoint is deprecated - use the V2 endpoints instead.**

Delete a Datadog-AWS integration matching the specified `account_id` and `role_name` parameters.

This endpoint requires the `aws_configurations_manage` permission.

### Request

#### Body Data (required)

AWS request object

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

Field Type Description access_key_id string Your AWS access key ID. Only required if your AWS account is a GovCloud or China account. account_id string Your AWS Account ID without dashes. role_name string Your Datadog role delegation name.
``` { "account_id": "163662907100", "role_name": "DatadogAWSIntegrationRole" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-403-v1) * [409](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-409-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby-legacy)

##### Delete an AWS integration returns "OK" response

```
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v1/integration/aws" \
 -H "Accept: application/json" \
 -H "Content-Type: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
 -d @- << EOF
{
  "account_id": "163662907100",
  "role_name": "DatadogAWSIntegrationRole"
}
EOF
```

##### Delete an AWS integration returns "OK" response

```
// Delete an AWS integration returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	body := datadogV1.AWSAccountDeleteRequest{
		AccountId: datadog.PtrString("163662907100"),
		RoleName:  datadog.PtrString("DatadogAWSIntegrationRole"),
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewAWSIntegrationApi(apiClient)
	resp, r, err := api.DeleteAWSAccount(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.DeleteAWSAccount`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.DeleteAWSAccount`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete an AWS integration returns "OK" response ``` // Delete an AWS integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccountDeleteRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccountDeleteRequest body = new AWSAccountDeleteRequest() .accountId("163662907100") .roleName("DatadogAWSIntegrationRole"); try { apiInstance.deleteAWSAccount(body); } catch (ApiException e) { System.err.println("Exception when calling
AwsIntegrationApi#deleteAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an AWS integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) account_id = "" role_name = "" api.AwsIntegration.delete(account_id=account_id, role_name=role_name) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete an AWS integration returns "OK" response ``` """ Delete an AWS integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_account_delete_request import AWSAccountDeleteRequest body = AWSAccountDeleteRequest( account_id="163662907100", role_name="DatadogAWSIntegrationRole", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.delete_aws_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an AWS integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' config = { "account_id": '', "role_name": 'DatadogAWSIntegrationRole' } dog = Dogapi::Client.new(api_key, app_key) dog.aws_integration_delete(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AWS integration returns "OK" response ``` # Delete an AWS integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccountDeleteRequest.new({ account_id: "163662907100", role_name: "DatadogAWSIntegrationRole", }) p api_instance.delete_aws_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AWS integration returns "OK" response ``` // Delete an AWS integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccountDeleteRequest; #[tokio::main] async fn main() { let body = AWSAccountDeleteRequest::new() .account_id("163662907100".to_string()) .role_name("DatadogAWSIntegrationRole".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.delete_aws_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an AWS integration returns "OK" response ``` /** * Delete an AWS integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiDeleteAWSAccountRequest = { body: { accountId: "163662907100", roleName: "DatadogAWSIntegrationRole", }, }; apiInstance .deleteAWSAccount(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` DELETE https://api.ap1.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ap2.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.eu/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ddog-gov.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us3.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us5.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id} ### Overview Delete an AWS Account Integration Config by config ID. This endpoint requires the `aws_configurations_manage` permission. ### Arguments #### Path Parameters Name Type Description aws_account_config_id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. 
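As the path parameter description above notes, the config ID comes from the List all AWS integrations (v2) endpoint. The following is a minimal, hypothetical Python sketch that chains the two calls; it assumes the client's optional `aws_account_id` keyword mirrors the documented query string and that each entry in `data` exposes the config ID as `id`:

```python
"""
Hypothetical sketch: look up an account's config ID, then delete that
AWS Account Integration Config. Not an official example.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)

    # Assumption: `aws_account_id` maps to the List endpoint's query string.
    accounts = api_instance.list_aws_accounts(aws_account_id="123456789012")
    if not accounts.data:
        raise SystemExit("No AWS Account Integration Config found")

    # Assumption: each config in `data` exposes its unique Datadog ID as `id`.
    config_id = accounts.data[0].id

    api_instance.delete_aws_account(aws_account_config_id=config_id)
```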
### Response

* [204](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSAccount-429-v2)

No Content

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Delete an AWS integration

```
# Path parameters
export aws_account_config_id="CHANGE_ME"
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v2/integration/aws/accounts/${aws_account_config_id}" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an AWS integration

```
"""
Delete an AWS integration returns "No Content" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

# there is a valid "aws_account_v2" in the system
AWS_ACCOUNT_V2_DATA_ID = environ["AWS_ACCOUNT_V2_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    api_instance.delete_aws_account(
        aws_account_config_id=AWS_ACCOUNT_V2_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an AWS integration ``` # Delete an AWS integration returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new # there is a valid "aws_account_v2" in the system AWS_ACCOUNT_V2_DATA_ID = ENV["AWS_ACCOUNT_V2_DATA_ID"] api_instance.delete_aws_account(AWS_ACCOUNT_V2_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AWS integration ``` // Delete an AWS integration returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "aws_account_v2" in the system AwsAccountV2DataID := os.Getenv("AWS_ACCOUNT_V2_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) r, err := api.DeleteAWSAccount(ctx, AwsAccountV2DataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.DeleteAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an AWS integration ``` // Delete an AWS integration returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); // there is a valid "aws_account_v2" in the system String AWS_ACCOUNT_V2_DATA_ID = System.getenv("AWS_ACCOUNT_V2_DATA_ID"); try { apiInstance.deleteAWSAccount(AWS_ACCOUNT_V2_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#deleteAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an AWS integration ``` // Delete an AWS integration returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; 
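// Expects AWS_ACCOUNT_V2_DATA_ID (the Datadog AWS Account Integration Config ID, not the AWS account ID) in the environment.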
#[tokio::main]
async fn main() {
    // there is a valid "aws_account_v2" in the system
    let aws_account_v2_data_id = std::env::var("AWS_ACCOUNT_V2_DATA_ID").unwrap();
    let configuration = datadog::Configuration::new();
    let api = AWSIntegrationAPI::with_config(configuration);
    let resp = api.delete_aws_account(aws_account_v2_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Delete an AWS integration

```
/**
 * Delete an AWS integration returns "No Content" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.AWSIntegrationApi(configuration);

// there is a valid "aws_account_v2" in the system
const AWS_ACCOUNT_V2_DATA_ID = process.env.AWS_ACCOUNT_V2_DATA_ID as string;

const params: v2.AWSIntegrationApiDeleteAWSAccountRequest = {
  awsAccountConfigId: AWS_ACCOUNT_V2_DATA_ID,
};

apiInstance
  .deleteAWSAccount(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Get AWS integration standard IAM permissions](https://docs.datadoghq.com/api/latest/aws-integration/#get-aws-integration-standard-iam-permissions)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#get-aws-integration-standard-iam-permissions-v2)

GET https://api.ap1.datadoghq.com/api/v2/integration/aws/iam_permissions/standard
https://api.ap2.datadoghq.com/api/v2/integration/aws/iam_permissions/standard
https://api.datadoghq.eu/api/v2/integration/aws/iam_permissions/standard
https://api.ddog-gov.com/api/v2/integration/aws/iam_permissions/standard
https://api.datadoghq.com/api/v2/integration/aws/iam_permissions/standard
https://api.us3.datadoghq.com/api/v2/integration/aws/iam_permissions/standard
https://api.us5.datadoghq.com/api/v2/integration/aws/iam_permissions/standard

### Overview

Get all standard AWS IAM permissions required for the AWS integration.

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissionsStandard-200-v2)
* [429](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissionsStandard-429-v2)

AWS integration standard IAM permissions.

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/)
* [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

AWS Integration IAM Permissions response body. Field Type Description data [_required_] object AWS Integration IAM Permissions response data. attributes object AWS Integration IAM Permissions response attributes. permissions [_required_] [string] List of AWS IAM permissions required for the integration. id string The `AWSIntegrationIamPermissionsResponseData` `id`.
default: `permissions` type enum The `AWSIntegrationIamPermissionsResponseData` `type`. Allowed enum values: `permissions` default: `permissions` ``` { "data": { "attributes": { "permissions": [ "account:GetContactInformation", "amplify:ListApps", "amplify:ListArtifacts", "amplify:ListBackendEnvironments", "amplify:ListBranches" ] }, "id": "permissions", "type": "permissions" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Get AWS integration standard IAM permissions Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/iam_permissions/standard" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get AWS integration standard IAM permissions ``` """ Get AWS integration standard IAM permissions returns "AWS integration standard IAM permissions." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.get_aws_integration_iam_permissions_standard() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get AWS integration standard IAM permissions ``` # Get AWS integration standard IAM permissions returns "AWS integration standard IAM permissions." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new p api_instance.get_aws_integration_iam_permissions_standard() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get AWS integration standard IAM permissions ``` // Get AWS integration standard IAM permissions returns "AWS integration standard IAM permissions." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.GetAWSIntegrationIAMPermissionsStandard(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.GetAWSIntegrationIAMPermissionsStandard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.GetAWSIntegrationIAMPermissionsStandard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get AWS integration standard IAM permissions ``` // Get AWS integration standard IAM permissions returns "AWS integration standard IAM permissions." // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSIntegrationIamPermissionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSIntegrationIamPermissionsResponse result = apiInstance.getAWSIntegrationIAMPermissionsStandard(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling AwsIntegrationApi#getAWSIntegrationIAMPermissionsStandard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get AWS integration standard IAM permissions ``` // Get AWS integration standard IAM permissions returns "AWS integration standard // IAM permissions." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.get_aws_integration_iam_permissions_standard().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get AWS integration standard IAM permissions ``` /** * Get AWS integration standard IAM permissions returns "AWS integration standard IAM permissions." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); apiInstance .getAWSIntegrationIAMPermissionsStandard() .then((data: v2.AWSIntegrationIamPermissionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an AWS integration](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-aws-integration) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-aws-integration-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-aws-integration-v2) POST https://api.ap1.datadoghq.com/api/v1/integration/awshttps://api.ap2.datadoghq.com/api/v1/integration/awshttps://api.datadoghq.eu/api/v1/integration/awshttps://api.ddog-gov.com/api/v1/integration/awshttps://api.datadoghq.com/api/v1/integration/awshttps://api.us3.datadoghq.com/api/v1/integration/awshttps://api.us5.datadoghq.com/api/v1/integration/aws ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Create a Datadog-Amazon Web Services integration. Using the `POST` method updates your integration configuration by adding your new configuration to the existing one in your Datadog organization. A unique AWS Account ID for role based authentication. This endpoint requires the `aws_configurations_manage` permission. ### Request #### Body Data (required) AWS Request Object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description access_key_id string Your AWS access key ID. Only required if your AWS account is a GovCloud or China account. account_id string Your AWS Account ID without dashes. object An object (in the form `{"namespace1":true/false, "namespace2":true/false}`) containing user-supplied overrides for AWS namespace metric collection. **Important** : This field only contains namespaces explicitly configured through API calls, not the comprehensive enabled or disabled status of all namespaces. 
If a namespace is absent from this field, it uses Datadog's internal defaults (all namespaces enabled by default, except `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage`). For a complete view of all namespace statuses, use the V2 AWS Integration API instead. boolean A list of additional properties. cspm_resource_collection_enabled boolean Whether Datadog collects cloud security posture management resources from your AWS account. This includes additional resources not covered under the general `resource_collection`. excluded_regions [string] An array of [AWS regions](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints) to exclude from metrics collection. extended_resource_collection_enabled boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Required for `cspm_resource_collection`. filter_tags [string] The array of EC2 tags (in the form `key:value`) defines a filter that Datadog uses when collecting metrics from EC2. Wildcards, such as `?` (for single characters) and `*` (for multiple characters) can also be used. Only hosts that match one of the defined tags will be imported into Datadog. The rest will be ignored. Host matching a given tag can also be excluded by adding `!` before the tag. For example, `env:production,instance-type:c1.*,!region:us-east-1` host_tags [string] Array of tags (in the form `key:value`) to add to all hosts and metrics reporting through this integration. metrics_collection_enabled boolean Whether Datadog collects metrics for this AWS account. default: `true` resource_collection_enabled boolean **DEPRECATED** : Deprecated in favor of 'extended_resource_collection_enabled'. Whether Datadog collects a standard set of resources from your AWS account. role_name string Your Datadog role delegation name. secret_access_key string Your AWS secret access key. Only required if your AWS account is a GovCloud or China account. ``` { "account_id": "163662907100", "account_specific_namespace_rules": { "auto_scaling": false }, "cspm_resource_collection_enabled": true, "excluded_regions": [ "us-east-1", "us-west-2" ], "extended_resource_collection_enabled": true, "filter_tags": [ "$KEY:$VALUE" ], "host_tags": [ "$KEY:$VALUE" ], "metrics_collection_enabled": false, "role_name": "DatadogAWSIntegrationRole" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-403-v1) * [409](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-409-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) The Response returned by the AWS Create Account call. Expand All Field Type Description external_id string AWS external_id. ``` { "external_id": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby-legacy) ##### Create an AWS integration returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "163662907100", "account_specific_namespace_rules": { "auto_scaling": false }, "cspm_resource_collection_enabled": true, "excluded_regions": [ "us-east-1", "us-west-2" ], "extended_resource_collection_enabled": true, "filter_tags": [ "$KEY:$VALUE" ], "host_tags": [ "$KEY:$VALUE" ], "metrics_collection_enabled": false, "role_name": "DatadogAWSIntegrationRole" } EOF ``` ##### Create an AWS integration returns "OK" response ``` // Create an AWS integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccount{ AccountId: datadog.PtrString("163662907100"), AccountSpecificNamespaceRules: map[string]bool{ "auto_scaling": false, }, CspmResourceCollectionEnabled: datadog.PtrBool(true), ExcludedRegions: []string{ "us-east-1", "us-west-2", }, ExtendedResourceCollectionEnabled: datadog.PtrBool(true), FilterTags: []string{ "$KEY:$VALUE", }, HostTags: []string{ "$KEY:$VALUE", }, MetricsCollectionEnabled: datadog.PtrBool(false), RoleName: datadog.PtrString("DatadogAWSIntegrationRole"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
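// Credentials are read from the DD_API_KEY and DD_APP_KEY environment variables by NewDefaultContext above.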
apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateAWSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an AWS integration returns "OK" response ``` // Create an AWS integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccount; import com.datadog.api.client.v1.model.AWSAccountCreateResponse; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccount body = new AWSAccount() .accountId("163662907100") .accountSpecificNamespaceRules(Map.ofEntries(Map.entry("auto_scaling", false))) .cspmResourceCollectionEnabled(true) .excludedRegions(Arrays.asList("us-east-1", "us-west-2")) .extendedResourceCollectionEnabled(true) .filterTags(Collections.singletonList("$KEY:$VALUE")) .hostTags(Collections.singletonList("$KEY:$VALUE")) .metricsCollectionEnabled(false) .roleName("DatadogAWSIntegrationRole"); try { AWSAccountCreateResponse result = apiInstance.createAWSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an AWS integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AwsIntegration.create( account_id="", host_tags=["tag:example"], filter_tags=["filter:example"], role_name="", account_specific_namespace_rules={'namespace1': True/False, 'namespace2': True/False}, excluded_regions=["us-east-1", "us-west-1"] ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Create an AWS integration returns "OK" 
response ``` """ Create an AWS integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_account import AWSAccount body = AWSAccount( account_id="163662907100", account_specific_namespace_rules=dict( auto_scaling=False, ), cspm_resource_collection_enabled=True, excluded_regions=[ "us-east-1", "us-west-2", ], extended_resource_collection_enabled=True, filter_tags=[ "$KEY:$VALUE", ], host_tags=[ "$KEY:$VALUE", ], metrics_collection_enabled=False, role_name="DatadogAWSIntegrationRole", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_aws_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an AWS integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": "", "filter_tags": [":"], "host_tags": [":"], "role_name": "DatadogAWSIntegrationRole", "account_specific_namespace_rules": {"auto_scaling": false, "opsworks": false}, "excluded_regions": ["us-east-1", "us-west-1"] } dog.aws_integration_create(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an AWS integration returns "OK" response ``` # Create an AWS integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccount.new({ account_id: "163662907100", account_specific_namespace_rules: { auto_scaling: false, }, cspm_resource_collection_enabled: true, excluded_regions: [ "us-east-1", "us-west-2", ], extended_resource_collection_enabled: true, filter_tags: [ "$KEY:$VALUE", ], host_tags: [ "$KEY:$VALUE", ], metrics_collection_enabled: false, role_name: "DatadogAWSIntegrationRole", }) p api_instance.create_aws_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an AWS integration returns "OK" response ``` // Create an AWS integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccount; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = AWSAccount::new() .account_id("163662907100".to_string()) .account_specific_namespace_rules(BTreeMap::from([("auto_scaling".to_string(), false)])) 
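// Overrides metric collection for the auto_scaling namespace only; namespaces not listed here keep Datadog's defaults.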
.cspm_resource_collection_enabled(true) .excluded_regions(vec!["us-east-1".to_string(), "us-west-2".to_string()]) .extended_resource_collection_enabled(true) .filter_tags(vec!["$KEY:$VALUE".to_string()]) .host_tags(vec!["$KEY:$VALUE".to_string()]) .metrics_collection_enabled(false) .role_name("DatadogAWSIntegrationRole".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_aws_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an AWS integration returns "OK" response ``` /** * Create an AWS integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiCreateAWSAccountRequest = { body: { accountId: "163662907100", accountSpecificNamespaceRules: { auto_scaling: false, }, cspmResourceCollectionEnabled: true, excludedRegions: ["us-east-1", "us-west-2"], extendedResourceCollectionEnabled: true, filterTags: ["$KEY:$VALUE"], hostTags: ["$KEY:$VALUE"], metricsCollectionEnabled: false, roleName: "DatadogAWSIntegrationRole", }, }; apiInstance .createAWSAccount(params) .then((data: v1.AWSAccountCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/integration/aws/accountshttps://api.ap2.datadoghq.com/api/v2/integration/aws/accountshttps://api.datadoghq.eu/api/v2/integration/aws/accountshttps://api.ddog-gov.com/api/v2/integration/aws/accountshttps://api.datadoghq.com/api/v2/integration/aws/accountshttps://api.us3.datadoghq.com/api/v2/integration/aws/accountshttps://api.us5.datadoghq.com/api/v2/integration/aws/accounts ### Overview Create a new AWS Account Integration Config. This endpoint requires the `aws_configurations_manage` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description _required_] object AWS Account Create Request data. _required_] object The AWS Account Integration Config to be created. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. _required_] AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. 
external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. aws_partition [_required_] enum AWS partition your AWS account is scoped to. Defaults to `aws`. See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. object AWS Logs Collection config. object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. 
object AWS Traces Collection config. AWS X-Ray services to collect traces from. Defaults to `include_only`. object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. type [_required_] enum AWS Account resource type. Allowed enum values: `account` default: `account` ##### Create an AWS account returns "AWS Account object" response ``` { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "role_name": "DatadogIntegrationRole" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-403-v2) * [409](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-409-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSAccount-429-v2) AWS Account object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS Account response body. Expand All Field Type Description _required_] object AWS Account response data. object AWS Account response attributes. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. 
aws_partition enum AWS partition your AWS account is scoped to. Defaults to `aws`. See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. created_at date-time Timestamp of when the account integration was created. object AWS Logs Collection config. object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. modified_at date-time Timestamp of when the account integration was updated. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. object AWS Traces Collection config. 
AWS X-Ray services to collect traces from. Defaults to `include_only`. object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. type [_required_] enum AWS Account resource type. Allowed enum values: `account` default: `account` ``` { "data": { "attributes": { "account_tags": [ "env:prod" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "aws_regions": { "include_all": true }, "created_at": "2019-09-19T10:00:00.000Z", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "env:prod" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": false, "collect_custom_metrics": false, "enabled": true, "namespace_filters": { "exclude_only": [ "AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage" ] }, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "datadog:true" ] } ] }, "modified_at": "2019-09-19T10:00:00.000Z", "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": true }, "traces_config": { "xray_services": { "include_all": false } } }, "id": "00000000-abcd-0001-0000-000000000000", "type": "account" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Create an AWS account returns "AWS Account object" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "role_name": "DatadogIntegrationRole" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } EOF ``` ##### Create an AWS integration returns "AWS Account object" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } EOF ``` ##### Create an AWS account returns "AWS Account object" response ``` // Create an AWS account returns "AWS Account object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
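// datadogV2 provides the v2 AWSIntegrationApi client and the request models built below.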
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AWSAccountCreateRequest{ Data: datadogV2.AWSAccountCreateRequestData{ Attributes: datadogV2.AWSAccountCreateRequestAttributes{ AccountTags: *datadog.NewNullableList(&[]string{ "key:value", }), AuthConfig: datadogV2.AWSAuthConfig{ AWSAuthConfigRole: &datadogV2.AWSAuthConfigRole{ RoleName: "DatadogIntegrationRole", }}, AwsAccountId: "123456789012", AwsPartition: datadogV2.AWSACCOUNTPARTITION_AWS, CcmConfig: &datadogV2.AWSCCMConfig{ DataExportConfigs: []datadogV2.DataExportConfig{ { BucketName: datadog.PtrString("my-bucket"), BucketRegion: datadog.PtrString("us-east-1"), ReportName: datadog.PtrString("my-report"), ReportPrefix: datadog.PtrString("reports"), ReportType: datadog.PtrString("CUR2.0"), }, }, }, LogsConfig: &datadogV2.AWSLogsConfig{ LambdaForwarder: &datadogV2.AWSLambdaForwarderConfig{ Lambdas: []string{ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", }, LogSourceConfig: &datadogV2.AWSLambdaForwarderConfigLogSourceConfig{ TagFilters: []datadogV2.AWSLogSourceTagFilter{ { Source: datadog.PtrString("s3"), Tags: *datadog.NewNullableList(&[]string{ "test:test", }), }, }, }, Sources: []string{ "s3", }, }, }, MetricsConfig: &datadogV2.AWSMetricsConfig{ AutomuteEnabled: datadog.PtrBool(true), CollectCloudwatchAlarms: datadog.PtrBool(true), CollectCustomMetrics: datadog.PtrBool(true), Enabled: datadog.PtrBool(true), TagFilters: []datadogV2.AWSNamespaceTagFilter{ { Namespace: datadog.PtrString("AWS/EC2"), Tags: *datadog.NewNullableList(&[]string{ "key:value", }), }, }, }, ResourcesConfig: &datadogV2.AWSResourcesConfig{ CloudSecurityPostureManagementCollection: datadog.PtrBool(false), ExtendedCollection: datadog.PtrBool(false), }, TracesConfig: &datadogV2.AWSTracesConfig{}, }, Type: datadogV2.AWSACCOUNTTYPE_ACCOUNT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateAWSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSAccount`:\n%s\n", responseContent) } ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` // Create an AWS integration returns "AWS Account object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AWSAccountCreateRequest{ Data: datadogV2.AWSAccountCreateRequestData{ Attributes: datadogV2.AWSAccountCreateRequestAttributes{ AccountTags: *datadog.NewNullableList(&[]string{ "key:value", }), AuthConfig: datadogV2.AWSAuthConfig{ AWSAuthConfigKeys: &datadogV2.AWSAuthConfigKeys{ AccessKeyId: "AKIAIOSFODNN7EXAMPLE", SecretAccessKey: datadog.PtrString("wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"), }}, AwsAccountId: "123456789012", AwsPartition: datadogV2.AWSACCOUNTPARTITION_AWS, CcmConfig: &datadogV2.AWSCCMConfig{ DataExportConfigs: []datadogV2.DataExportConfig{ { BucketName: datadog.PtrString("my-bucket"), BucketRegion: datadog.PtrString("us-east-1"), ReportName: datadog.PtrString("my-report"), ReportPrefix: datadog.PtrString("reports"), 
ReportType: datadog.PtrString("CUR2.0"), }, }, }, LogsConfig: &datadogV2.AWSLogsConfig{ LambdaForwarder: &datadogV2.AWSLambdaForwarderConfig{ Lambdas: []string{ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", }, LogSourceConfig: &datadogV2.AWSLambdaForwarderConfigLogSourceConfig{ TagFilters: []datadogV2.AWSLogSourceTagFilter{ { Source: datadog.PtrString("s3"), Tags: *datadog.NewNullableList(&[]string{ "test:test", }), }, }, }, Sources: []string{ "s3", }, }, }, MetricsConfig: &datadogV2.AWSMetricsConfig{ AutomuteEnabled: datadog.PtrBool(true), CollectCloudwatchAlarms: datadog.PtrBool(true), CollectCustomMetrics: datadog.PtrBool(true), Enabled: datadog.PtrBool(true), TagFilters: []datadogV2.AWSNamespaceTagFilter{ { Namespace: datadog.PtrString("AWS/EC2"), Tags: *datadog.NewNullableList(&[]string{ "key:value", }), }, }, }, ResourcesConfig: &datadogV2.AWSResourcesConfig{ CloudSecurityPostureManagementCollection: datadog.PtrBool(false), ExtendedCollection: datadog.PtrBool(false), }, TracesConfig: &datadogV2.AWSTracesConfig{}, }, Type: datadogV2.AWSACCOUNTTYPE_ACCOUNT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateAWSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an AWS account returns "AWS Account object" response ``` // Create an AWS account returns "AWS Account object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSAccountCreateRequest; import com.datadog.api.client.v2.model.AWSAccountCreateRequestAttributes; import com.datadog.api.client.v2.model.AWSAccountCreateRequestData; import com.datadog.api.client.v2.model.AWSAccountPartition; import com.datadog.api.client.v2.model.AWSAccountResponse; import com.datadog.api.client.v2.model.AWSAccountType; import com.datadog.api.client.v2.model.AWSAuthConfig; import com.datadog.api.client.v2.model.AWSAuthConfigRole; import com.datadog.api.client.v2.model.AWSCCMConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfigLogSourceConfig; import com.datadog.api.client.v2.model.AWSLogSourceTagFilter; import com.datadog.api.client.v2.model.AWSLogsConfig; import com.datadog.api.client.v2.model.AWSMetricsConfig; import com.datadog.api.client.v2.model.AWSNamespaceTagFilter; import com.datadog.api.client.v2.model.AWSResourcesConfig; import com.datadog.api.client.v2.model.AWSTracesConfig; import com.datadog.api.client.v2.model.DataExportConfig; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccountCreateRequest body = new AWSAccountCreateRequest() .data( new AWSAccountCreateRequestData() .attributes( new AWSAccountCreateRequestAttributes() .accountTags(Collections.singletonList("key:value")) .authConfig( new AWSAuthConfig( new AWSAuthConfigRole().roleName("DatadogIntegrationRole"))) .awsAccountId("123456789012") .awsPartition(AWSAccountPartition.AWS) .ccmConfig( new AWSCCMConfig() .dataExportConfigs( Collections.singletonList( new DataExportConfig() .bucketName("my-bucket") .bucketRegion("us-east-1") .reportName("my-report") .reportPrefix("reports") .reportType("CUR2.0")))) .logsConfig( new AWSLogsConfig() .lambdaForwarder( new AWSLambdaForwarderConfig() .lambdas( Collections.singletonList( "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder")) .logSourceConfig( new AWSLambdaForwarderConfigLogSourceConfig() .tagFilters( Collections.singletonList( new AWSLogSourceTagFilter() .source("s3") .tags( Collections.singletonList( "test:test"))))) .sources(Collections.singletonList("s3")))) .metricsConfig( new AWSMetricsConfig() .automuteEnabled(true) .collectCloudwatchAlarms(true) .collectCustomMetrics(true) .enabled(true) .tagFilters( Collections.singletonList( new AWSNamespaceTagFilter() .namespace("AWS/EC2") .tags(Collections.singletonList("key:value"))))) .resourcesConfig( new AWSResourcesConfig() .cloudSecurityPostureManagementCollection(false) .extendedCollection(false)) .tracesConfig(new AWSTracesConfig())) .type(AWSAccountType.ACCOUNT)); try { AWSAccountResponse result = apiInstance.createAWSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` // Create an AWS integration returns "AWS Account object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSAccountCreateRequest; import com.datadog.api.client.v2.model.AWSAccountCreateRequestAttributes; import com.datadog.api.client.v2.model.AWSAccountCreateRequestData; import com.datadog.api.client.v2.model.AWSAccountPartition; import com.datadog.api.client.v2.model.AWSAccountResponse; import com.datadog.api.client.v2.model.AWSAccountType; import com.datadog.api.client.v2.model.AWSAuthConfig; import com.datadog.api.client.v2.model.AWSAuthConfigKeys; import com.datadog.api.client.v2.model.AWSCCMConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfigLogSourceConfig; import com.datadog.api.client.v2.model.AWSLogSourceTagFilter; import com.datadog.api.client.v2.model.AWSLogsConfig; import com.datadog.api.client.v2.model.AWSMetricsConfig; import com.datadog.api.client.v2.model.AWSNamespaceTagFilter; import com.datadog.api.client.v2.model.AWSResourcesConfig; import com.datadog.api.client.v2.model.AWSTracesConfig; import com.datadog.api.client.v2.model.DataExportConfig; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); 
AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccountCreateRequest body = new AWSAccountCreateRequest() .data( new AWSAccountCreateRequestData() .attributes( new AWSAccountCreateRequestAttributes() .accountTags(Collections.singletonList("key:value")) .authConfig( new AWSAuthConfig( new AWSAuthConfigKeys() .accessKeyId("AKIAIOSFODNN7EXAMPLE") .secretAccessKey( "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"))) .awsAccountId("123456789012") .awsPartition(AWSAccountPartition.AWS) .ccmConfig( new AWSCCMConfig() .dataExportConfigs( Collections.singletonList( new DataExportConfig() .bucketName("my-bucket") .bucketRegion("us-east-1") .reportName("my-report") .reportPrefix("reports") .reportType("CUR2.0")))) .logsConfig( new AWSLogsConfig() .lambdaForwarder( new AWSLambdaForwarderConfig() .lambdas( Collections.singletonList( "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder")) .logSourceConfig( new AWSLambdaForwarderConfigLogSourceConfig() .tagFilters( Collections.singletonList( new AWSLogSourceTagFilter() .source("s3") .tags( Collections.singletonList( "test:test"))))) .sources(Collections.singletonList("s3")))) .metricsConfig( new AWSMetricsConfig() .automuteEnabled(true) .collectCloudwatchAlarms(true) .collectCustomMetrics(true) .enabled(true) .tagFilters( Collections.singletonList( new AWSNamespaceTagFilter() .namespace("AWS/EC2") .tags(Collections.singletonList("key:value"))))) .resourcesConfig( new AWSResourcesConfig() .cloudSecurityPostureManagementCollection(false) .extendedCollection(false)) .tracesConfig(new AWSTracesConfig())) .type(AWSAccountType.ACCOUNT)); try { AWSAccountResponse result = apiInstance.createAWSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an AWS account returns "AWS Account object" response ``` """ Create an AWS account returns "AWS Account object" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v2.model.aws_account_create_request import AWSAccountCreateRequest from datadog_api_client.v2.model.aws_account_create_request_attributes import AWSAccountCreateRequestAttributes from datadog_api_client.v2.model.aws_account_create_request_data import AWSAccountCreateRequestData from datadog_api_client.v2.model.aws_account_partition import AWSAccountPartition from datadog_api_client.v2.model.aws_account_type import AWSAccountType from datadog_api_client.v2.model.aws_auth_config_role import AWSAuthConfigRole from datadog_api_client.v2.model.aws_lambda_forwarder_config import AWSLambdaForwarderConfig from datadog_api_client.v2.model.aws_lambda_forwarder_config_log_source_config import ( AWSLambdaForwarderConfigLogSourceConfig, ) from datadog_api_client.v2.model.aws_log_source_tag_filter import AWSLogSourceTagFilter from 
datadog_api_client.v2.model.aws_logs_config import AWSLogsConfig from datadog_api_client.v2.model.aws_metrics_config import AWSMetricsConfig from datadog_api_client.v2.model.aws_namespace_tag_filter import AWSNamespaceTagFilter from datadog_api_client.v2.model.aws_resources_config import AWSResourcesConfig from datadog_api_client.v2.model.aws_traces_config import AWSTracesConfig from datadog_api_client.v2.model.awsccm_config import AWSCCMConfig from datadog_api_client.v2.model.data_export_config import DataExportConfig body = AWSAccountCreateRequest( data=AWSAccountCreateRequestData( attributes=AWSAccountCreateRequestAttributes( account_tags=[ "key:value", ], auth_config=AWSAuthConfigRole( role_name="DatadogIntegrationRole", ), aws_account_id="123456789012", aws_partition=AWSAccountPartition.AWS, ccm_config=AWSCCMConfig( data_export_configs=[ DataExportConfig( bucket_name="my-bucket", bucket_region="us-east-1", report_name="my-report", report_prefix="reports", report_type="CUR2.0", ), ], ), logs_config=AWSLogsConfig( lambda_forwarder=AWSLambdaForwarderConfig( lambdas=[ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config=AWSLambdaForwarderConfigLogSourceConfig( tag_filters=[ AWSLogSourceTagFilter( source="s3", tags=[ "test:test", ], ), ], ), sources=[ "s3", ], ), ), metrics_config=AWSMetricsConfig( automute_enabled=True, collect_cloudwatch_alarms=True, collect_custom_metrics=True, enabled=True, tag_filters=[ AWSNamespaceTagFilter( namespace="AWS/EC2", tags=[ "key:value", ], ), ], ), resources_config=AWSResourcesConfig( cloud_security_posture_management_collection=False, extended_collection=False, ), traces_config=AWSTracesConfig(), ), type=AWSAccountType.ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_aws_account(body=body) print(response) ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` """ Create an AWS integration returns "AWS Account object" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v2.model.aws_account_create_request import AWSAccountCreateRequest from datadog_api_client.v2.model.aws_account_create_request_attributes import AWSAccountCreateRequestAttributes from datadog_api_client.v2.model.aws_account_create_request_data import AWSAccountCreateRequestData from datadog_api_client.v2.model.aws_account_partition import AWSAccountPartition from datadog_api_client.v2.model.aws_account_type import AWSAccountType from datadog_api_client.v2.model.aws_auth_config_keys import AWSAuthConfigKeys from datadog_api_client.v2.model.aws_lambda_forwarder_config import AWSLambdaForwarderConfig from datadog_api_client.v2.model.aws_lambda_forwarder_config_log_source_config import ( AWSLambdaForwarderConfigLogSourceConfig, ) from datadog_api_client.v2.model.aws_log_source_tag_filter import AWSLogSourceTagFilter from datadog_api_client.v2.model.aws_logs_config import AWSLogsConfig from datadog_api_client.v2.model.aws_metrics_config import AWSMetricsConfig from datadog_api_client.v2.model.aws_namespace_tag_filter import AWSNamespaceTagFilter from datadog_api_client.v2.model.aws_resources_config import AWSResourcesConfig from datadog_api_client.v2.model.aws_traces_config import AWSTracesConfig from datadog_api_client.v2.model.awsccm_config import AWSCCMConfig from 
datadog_api_client.v2.model.data_export_config import DataExportConfig body = AWSAccountCreateRequest( data=AWSAccountCreateRequestData( attributes=AWSAccountCreateRequestAttributes( account_tags=[ "key:value", ], auth_config=AWSAuthConfigKeys( access_key_id="AKIAIOSFODNN7EXAMPLE", secret_access_key="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", ), aws_account_id="123456789012", aws_partition=AWSAccountPartition.AWS, ccm_config=AWSCCMConfig( data_export_configs=[ DataExportConfig( bucket_name="my-bucket", bucket_region="us-east-1", report_name="my-report", report_prefix="reports", report_type="CUR2.0", ), ], ), logs_config=AWSLogsConfig( lambda_forwarder=AWSLambdaForwarderConfig( lambdas=[ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config=AWSLambdaForwarderConfigLogSourceConfig( tag_filters=[ AWSLogSourceTagFilter( source="s3", tags=[ "test:test", ], ), ], ), sources=[ "s3", ], ), ), metrics_config=AWSMetricsConfig( automute_enabled=True, collect_cloudwatch_alarms=True, collect_custom_metrics=True, enabled=True, tag_filters=[ AWSNamespaceTagFilter( namespace="AWS/EC2", tags=[ "key:value", ], ), ], ), resources_config=AWSResourcesConfig( cloud_security_posture_management_collection=False, extended_collection=False, ), traces_config=AWSTracesConfig(), ), type=AWSAccountType.ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_aws_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an AWS account returns "AWS Account object" response ``` # Create an AWS account returns "AWS Account object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new body = DatadogAPIClient::V2::AWSAccountCreateRequest.new({ data: DatadogAPIClient::V2::AWSAccountCreateRequestData.new({ attributes: DatadogAPIClient::V2::AWSAccountCreateRequestAttributes.new({ account_tags: [ "key:value", ], auth_config: DatadogAPIClient::V2::AWSAuthConfigRole.new({ role_name: "DatadogIntegrationRole", }), aws_account_id: "123456789012", aws_partition: DatadogAPIClient::V2::AWSAccountPartition::AWS, ccm_config: DatadogAPIClient::V2::AWSCCMConfig.new({ data_export_configs: [ DatadogAPIClient::V2::DataExportConfig.new({ bucket_name: "my-bucket", bucket_region: "us-east-1", report_name: "my-report", report_prefix: "reports", report_type: "CUR2.0", }), ], }), logs_config: DatadogAPIClient::V2::AWSLogsConfig.new({ lambda_forwarder: DatadogAPIClient::V2::AWSLambdaForwarderConfig.new({ lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config: DatadogAPIClient::V2::AWSLambdaForwarderConfigLogSourceConfig.new({ tag_filters: [ DatadogAPIClient::V2::AWSLogSourceTagFilter.new({ source: "s3", tags: [ "test:test", ], }), ], }), sources: [ "s3", ], }), }), metrics_config: DatadogAPIClient::V2::AWSMetricsConfig.new({ automute_enabled: true, collect_cloudwatch_alarms: true, collect_custom_metrics: true, enabled: true, tag_filters: [ DatadogAPIClient::V2::AWSNamespaceTagFilter.new({ namespace: "AWS/EC2", tags: [ "key:value", ], }), ], }), 
resources_config: DatadogAPIClient::V2::AWSResourcesConfig.new({ cloud_security_posture_management_collection: false, extended_collection: false, }), traces_config: DatadogAPIClient::V2::AWSTracesConfig.new({}), }), type: DatadogAPIClient::V2::AWSAccountType::ACCOUNT, }), }) p api_instance.create_aws_account(body) ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` # Create an AWS integration returns "AWS Account object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new body = DatadogAPIClient::V2::AWSAccountCreateRequest.new({ data: DatadogAPIClient::V2::AWSAccountCreateRequestData.new({ attributes: DatadogAPIClient::V2::AWSAccountCreateRequestAttributes.new({ account_tags: [ "key:value", ], auth_config: DatadogAPIClient::V2::AWSAuthConfigKeys.new({ access_key_id: "AKIAIOSFODNN7EXAMPLE", secret_access_key: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", }), aws_account_id: "123456789012", aws_partition: DatadogAPIClient::V2::AWSAccountPartition::AWS, ccm_config: DatadogAPIClient::V2::AWSCCMConfig.new({ data_export_configs: [ DatadogAPIClient::V2::DataExportConfig.new({ bucket_name: "my-bucket", bucket_region: "us-east-1", report_name: "my-report", report_prefix: "reports", report_type: "CUR2.0", }), ], }), logs_config: DatadogAPIClient::V2::AWSLogsConfig.new({ lambda_forwarder: DatadogAPIClient::V2::AWSLambdaForwarderConfig.new({ lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config: DatadogAPIClient::V2::AWSLambdaForwarderConfigLogSourceConfig.new({ tag_filters: [ DatadogAPIClient::V2::AWSLogSourceTagFilter.new({ source: "s3", tags: [ "test:test", ], }), ], }), sources: [ "s3", ], }), }), metrics_config: DatadogAPIClient::V2::AWSMetricsConfig.new({ automute_enabled: true, collect_cloudwatch_alarms: true, collect_custom_metrics: true, enabled: true, tag_filters: [ DatadogAPIClient::V2::AWSNamespaceTagFilter.new({ namespace: "AWS/EC2", tags: [ "key:value", ], }), ], }), resources_config: DatadogAPIClient::V2::AWSResourcesConfig.new({ cloud_security_posture_management_collection: false, extended_collection: false, }), traces_config: DatadogAPIClient::V2::AWSTracesConfig.new({}), }), type: DatadogAPIClient::V2::AWSAccountType::ACCOUNT, }), }) p api_instance.create_aws_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an AWS account returns "AWS Account object" response ``` // Create an AWS account returns "AWS Account object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV2::model::AWSAccountCreateRequest; use datadog_api_client::datadogV2::model::AWSAccountCreateRequestAttributes; use datadog_api_client::datadogV2::model::AWSAccountCreateRequestData; use datadog_api_client::datadogV2::model::AWSAccountPartition; use datadog_api_client::datadogV2::model::AWSAccountType; use datadog_api_client::datadogV2::model::AWSAuthConfig; use datadog_api_client::datadogV2::model::AWSAuthConfigRole; use datadog_api_client::datadogV2::model::AWSCCMConfig; use datadog_api_client::datadogV2::model::AWSLambdaForwarderConfig; use 
datadog_api_client::datadogV2::model::AWSLambdaForwarderConfigLogSourceConfig; use datadog_api_client::datadogV2::model::AWSLogSourceTagFilter; use datadog_api_client::datadogV2::model::AWSLogsConfig; use datadog_api_client::datadogV2::model::AWSMetricsConfig; use datadog_api_client::datadogV2::model::AWSNamespaceTagFilter; use datadog_api_client::datadogV2::model::AWSResourcesConfig; use datadog_api_client::datadogV2::model::AWSTracesConfig; use datadog_api_client::datadogV2::model::DataExportConfig; #[tokio::main] async fn main() { let body = AWSAccountCreateRequest::new(AWSAccountCreateRequestData::new( AWSAccountCreateRequestAttributes::new( AWSAuthConfig::AWSAuthConfigRole(Box::new(AWSAuthConfigRole::new( "DatadogIntegrationRole".to_string(), ))), "123456789012".to_string(), AWSAccountPartition::AWS, ) .account_tags(Some(vec!["key:value".to_string()])) .ccm_config(AWSCCMConfig::new().data_export_configs(vec![ DataExportConfig::new() .bucket_name("my-bucket".to_string()) .bucket_region("us-east-1".to_string()) .report_name("my-report".to_string()) .report_prefix("reports".to_string()) .report_type("CUR2.0".to_string()) ])) .logs_config( AWSLogsConfig::new().lambda_forwarder( AWSLambdaForwarderConfig::new() .lambdas(vec![ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" .to_string(), ]) .log_source_config(AWSLambdaForwarderConfigLogSourceConfig::new().tag_filters( vec![ AWSLogSourceTagFilter::new() .source("s3".to_string()) .tags(Some(vec!["test:test".to_string()])) ], )) .sources(vec!["s3".to_string()]), ), ) .metrics_config( AWSMetricsConfig::new() .automute_enabled(true) .collect_cloudwatch_alarms(true) .collect_custom_metrics(true) .enabled(true) .tag_filters(vec![AWSNamespaceTagFilter::new() .namespace("AWS/EC2".to_string()) .tags(Some(vec!["key:value".to_string()]))]), ) .resources_config( AWSResourcesConfig::new() .cloud_security_posture_management_collection(false) .extended_collection(false), ) .traces_config(AWSTracesConfig::new()), AWSAccountType::ACCOUNT, )); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_aws_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` // Create an AWS integration returns "AWS Account object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV2::model::AWSAccountCreateRequest; use datadog_api_client::datadogV2::model::AWSAccountCreateRequestAttributes; use datadog_api_client::datadogV2::model::AWSAccountCreateRequestData; use datadog_api_client::datadogV2::model::AWSAccountPartition; use datadog_api_client::datadogV2::model::AWSAccountType; use datadog_api_client::datadogV2::model::AWSAuthConfig; use datadog_api_client::datadogV2::model::AWSAuthConfigKeys; use datadog_api_client::datadogV2::model::AWSCCMConfig; use datadog_api_client::datadogV2::model::AWSLambdaForwarderConfig; use datadog_api_client::datadogV2::model::AWSLambdaForwarderConfigLogSourceConfig; use datadog_api_client::datadogV2::model::AWSLogSourceTagFilter; use datadog_api_client::datadogV2::model::AWSLogsConfig; use datadog_api_client::datadogV2::model::AWSMetricsConfig; use datadog_api_client::datadogV2::model::AWSNamespaceTagFilter; use datadog_api_client::datadogV2::model::AWSResourcesConfig; use 
datadog_api_client::datadogV2::model::AWSTracesConfig; use datadog_api_client::datadogV2::model::DataExportConfig; #[tokio::main] async fn main() { let body = AWSAccountCreateRequest::new(AWSAccountCreateRequestData::new( AWSAccountCreateRequestAttributes::new( AWSAuthConfig::AWSAuthConfigKeys(Box::new( AWSAuthConfigKeys::new("AKIAIOSFODNN7EXAMPLE".to_string()) .secret_access_key("wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY".to_string()), )), "123456789012".to_string(), AWSAccountPartition::AWS, ) .account_tags(Some(vec!["key:value".to_string()])) .ccm_config(AWSCCMConfig::new().data_export_configs(vec![ DataExportConfig::new() .bucket_name("my-bucket".to_string()) .bucket_region("us-east-1".to_string()) .report_name("my-report".to_string()) .report_prefix("reports".to_string()) .report_type("CUR2.0".to_string()) ])) .logs_config( AWSLogsConfig::new().lambda_forwarder( AWSLambdaForwarderConfig::new() .lambdas(vec![ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" .to_string(), ]) .log_source_config(AWSLambdaForwarderConfigLogSourceConfig::new().tag_filters( vec![ AWSLogSourceTagFilter::new() .source("s3".to_string()) .tags(Some(vec!["test:test".to_string()])) ], )) .sources(vec!["s3".to_string()]), ), ) .metrics_config( AWSMetricsConfig::new() .automute_enabled(true) .collect_cloudwatch_alarms(true) .collect_custom_metrics(true) .enabled(true) .tag_filters(vec![AWSNamespaceTagFilter::new() .namespace("AWS/EC2".to_string()) .tags(Some(vec!["key:value".to_string()]))]), ) .resources_config( AWSResourcesConfig::new() .cloud_security_posture_management_collection(false) .extended_collection(false), ) .traces_config(AWSTracesConfig::new()), AWSAccountType::ACCOUNT, )); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_aws_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an AWS account returns "AWS Account object" response ``` /** * Create an AWS account returns "AWS Account object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); const params: v2.AWSIntegrationApiCreateAWSAccountRequest = { body: { data: { attributes: { accountTags: ["key:value"], authConfig: { roleName: "DatadogIntegrationRole", }, awsAccountId: "123456789012", awsPartition: "aws", ccmConfig: { dataExportConfigs: [ { bucketName: "my-bucket", bucketRegion: "us-east-1", reportName: "my-report", reportPrefix: "reports", reportType: "CUR2.0", }, ], }, logsConfig: { lambdaForwarder: { lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], logSourceConfig: { tagFilters: [ { source: "s3", tags: ["test:test"], }, ], }, sources: ["s3"], }, }, metricsConfig: { automuteEnabled: true, collectCloudwatchAlarms: true, collectCustomMetrics: true, enabled: true, tagFilters: [ { namespace: "AWS/EC2", tags: ["key:value"], }, ], }, resourcesConfig: { cloudSecurityPostureManagementCollection: false, extendedCollection: false, }, 
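// Leaving tracesConfig empty keeps the default AWS X-Ray trace collection settings (xray_services defaults to include_only: [], per the traces_config fields described in the request model later on this page).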
tracesConfig: {}, }, type: "account", }, }, }; apiInstance .createAWSAccount(params) .then((data: v2.AWSAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an AWS integration returns "AWS Account object" response ``` /** * Create an AWS integration returns "AWS Account object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); const params: v2.AWSIntegrationApiCreateAWSAccountRequest = { body: { data: { attributes: { accountTags: ["key:value"], authConfig: { accessKeyId: "AKIAIOSFODNN7EXAMPLE", secretAccessKey: "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY", }, awsAccountId: "123456789012", awsPartition: "aws", ccmConfig: { dataExportConfigs: [ { bucketName: "my-bucket", bucketRegion: "us-east-1", reportName: "my-report", reportPrefix: "reports", reportType: "CUR2.0", }, ], }, logsConfig: { lambdaForwarder: { lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], logSourceConfig: { tagFilters: [ { source: "s3", tags: ["test:test"], }, ], }, sources: ["s3"], }, }, metricsConfig: { automuteEnabled: true, collectCloudwatchAlarms: true, collectCustomMetrics: true, enabled: true, tagFilters: [ { namespace: "AWS/EC2", tags: ["key:value"], }, ], }, resourcesConfig: { cloudSecurityPostureManagementCollection: false, extendedCollection: false, }, tracesConfig: {}, }, type: "account", }, }, }; apiInstance .createAWSAccount(params) .then((data: v2.AWSAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get resource collection IAM permissions](https://docs.datadoghq.com/api/latest/aws-integration/#get-resource-collection-iam-permissions) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#get-resource-collection-iam-permissions-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.ap2.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.datadoghq.eu/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.ddog-gov.com/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.us3.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collectionhttps://api.us5.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collection ### Overview Get all resource collection AWS IAM permissions required for the AWS integration. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissionsResourceCollection-200-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#GetAWSIntegrationIAMPermissionsResourceCollection-429-v2) AWS integration resource collection IAM permissions. 
* [Model](https://docs.datadoghq.com/api/latest/aws-integration/)
* [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

AWS Integration IAM Permissions response body.

* `data` [_required_] (object): AWS Integration IAM Permissions response data.
  * `attributes` (object): AWS Integration IAM Permissions response attributes.
    * `permissions` [_required_] ([string]): List of AWS IAM permissions required for the integration.
  * `id` (string): The `AWSIntegrationIamPermissionsResponseData` `id`. default: `permissions`
  * `type` (enum): The `AWSIntegrationIamPermissionsResponseData` `type`. Allowed enum values: `permissions`. default: `permissions`

```
{
  "data": {
    "attributes": {
      "permissions": [
        "account:GetContactInformation",
        "amplify:ListApps",
        "amplify:ListArtifacts",
        "amplify:ListBackendEnvironments",
        "amplify:ListBranches"
      ]
    },
    "id": "permissions",
    "type": "permissions"
  }
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/)
* [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

API error response.

* `errors` [_required_] ([string]): A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript)

##### Get resource collection IAM permissions

```
# Curl command
# Replace api.datadoghq.com with the endpoint for your Datadog site (api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com).
curl -X GET "https://api.datadoghq.com/api/v2/integration/aws/iam_permissions/resource_collection" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get resource collection IAM permissions

```
"""
Get resource collection IAM permissions returns "AWS integration resource collection IAM permissions." response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)
    response = api_instance.get_aws_integration_iam_permissions_resource_collection()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get resource collection IAM permissions

```
# Get resource collection IAM permissions returns "AWS integration resource collection IAM permissions." response
require "datadog_api_client"

api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new
p api_instance.get_aws_integration_iam_permissions_resource_collection()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get resource collection IAM permissions

```
// Get resource collection IAM permissions returns "AWS integration resource collection IAM permissions." response
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewAWSIntegrationApi(apiClient)
	resp, r, err := api.GetAWSIntegrationIAMPermissionsResourceCollection(ctx)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.GetAWSIntegrationIAMPermissionsResourceCollection`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.GetAWSIntegrationIAMPermissionsResourceCollection`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get resource collection IAM permissions

```
// Get resource collection IAM permissions returns "AWS integration resource collection IAM
// permissions." response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.AwsIntegrationApi;
import com.datadog.api.client.v2.model.AWSIntegrationIamPermissionsResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient);

    try {
      AWSIntegrationIamPermissionsResponse result =
          apiInstance.getAWSIntegrationIAMPermissionsResourceCollection();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println(
          "Exception when calling"
              + " AwsIntegrationApi#getAWSIntegrationIAMPermissionsResourceCollection");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get resource collection IAM permissions

```
// Get resource collection IAM permissions returns "AWS integration resource
// collection IAM permissions."
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api .get_aws_integration_iam_permissions_resource_collection() .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get resource collection IAM permissions ``` /** * Get resource collection IAM permissions returns "AWS integration resource collection IAM permissions." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); apiInstance .getAWSIntegrationIAMPermissionsResourceCollection() .then((data: v2.AWSIntegrationIamPermissionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an AWS integration](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-aws-integration) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-aws-integration-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#update-an-aws-integration-v2) PUT https://api.ap1.datadoghq.com/api/v1/integration/awshttps://api.ap2.datadoghq.com/api/v1/integration/awshttps://api.datadoghq.eu/api/v1/integration/awshttps://api.ddog-gov.com/api/v1/integration/awshttps://api.datadoghq.com/api/v1/integration/awshttps://api.us3.datadoghq.com/api/v1/integration/awshttps://api.us5.datadoghq.com/api/v1/integration/aws ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Update a Datadog-Amazon Web Services integration. This endpoint requires the `aws_configuration_edit` permission. ### Arguments #### Query Strings Name Type Description account_id string Only return AWS accounts that matches this `account_id`. role_name string Only return AWS accounts that match this `role_name`. Required if `account_id` is specified. access_key_id string Only return AWS accounts that matches this `access_key_id`. Required if none of the other two options are specified. ### Request #### Body Data (required) AWS request object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description access_key_id string Your AWS access key ID. Only required if your AWS account is a GovCloud or China account. account_id string Your AWS Account ID without dashes. 
* `account_specific_namespace_rules` (object): An object (in the form `{"namespace1":true/false, "namespace2":true/false}`) containing user-supplied overrides for AWS namespace metric collection. **Important**: this field only contains namespaces explicitly configured through API calls, not the comprehensive enabled or disabled status of all namespaces. If a namespace is absent from this field, it uses Datadog's internal defaults (all namespaces enabled by default, except `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage`). For a complete view of all namespace statuses, use the V2 AWS Integration API instead. Each namespace key maps to a boolean (a list of additional properties).
* `cspm_resource_collection_enabled` (boolean): Whether Datadog collects cloud security posture management resources from your AWS account. This includes additional resources not covered under the general `resource_collection`.
* `excluded_regions` ([string]): An array of [AWS regions](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints) to exclude from metrics collection.
* `extended_resource_collection_enabled` (boolean): Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Required for `cspm_resource_collection`.
* `filter_tags` ([string]): An array of EC2 tags (in the form `key:value`) that defines a filter Datadog uses when collecting metrics from EC2. Wildcards, such as `?` (for single characters) and `*` (for multiple characters), can also be used. Only hosts that match one of the defined tags are imported into Datadog; the rest are ignored. Hosts matching a given tag can also be excluded by adding `!` before the tag. For example, `env:production,instance-type:c1.*,!region:us-east-1`.
* `host_tags` ([string]): Array of tags (in the form `key:value`) to add to all hosts and metrics reporting through this integration.
* `metrics_collection_enabled` (boolean): Whether Datadog collects metrics for this AWS account. default: `true`
* `resource_collection_enabled` (boolean): **DEPRECATED**: Deprecated in favor of `extended_resource_collection_enabled`. Whether Datadog collects a standard set of resources from your AWS account.
* `role_name` (string): Your Datadog role delegation name.
* `secret_access_key` (string): Your AWS secret access key. Only required if your AWS account is a GovCloud or China account.

```
{
  "account_id": "163662907100",
  "account_specific_namespace_rules": {
    "auto_scaling": false
  },
  "cspm_resource_collection_enabled": false,
  "excluded_regions": [
    "us-east-1",
    "us-west-2"
  ],
  "extended_resource_collection_enabled": true,
  "filter_tags": [
    "$KEY:$VALUE"
  ],
  "host_tags": [
    "$KEY:$VALUE"
  ],
  "metrics_collection_enabled": true,
  "role_name": "DatadogAWSIntegrationRole"
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-200-v1)
* [400](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-400-v1)
* [403](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-403-v1)
* [409](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-409-v1)
* [429](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/)
* [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

No response body

```
{}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/aws-integration/)
* [Example](https://docs.datadoghq.com/api/latest/aws-integration/)

Error response object.
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Update an AWS integration returns "OK" response Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "163662907100", "account_specific_namespace_rules": { "auto_scaling": false }, "cspm_resource_collection_enabled": false, "excluded_regions": [ "us-east-1", "us-west-2" ], "extended_resource_collection_enabled": true, "filter_tags": [ "$KEY:$VALUE" ], "host_tags": [ "$KEY:$VALUE" ], "metrics_collection_enabled": true, "role_name": "DatadogAWSIntegrationRole" } EOF ``` ##### Update an AWS integration returns "OK" response ``` // Update an AWS integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccount{ AccountId: datadog.PtrString("163662907100"), AccountSpecificNamespaceRules: map[string]bool{ "auto_scaling": false, }, CspmResourceCollectionEnabled: datadog.PtrBool(false), ExcludedRegions: []string{ "us-east-1", "us-west-2", }, ExtendedResourceCollectionEnabled: datadog.PtrBool(true), FilterTags: []string{ "$KEY:$VALUE", }, HostTags: []string{ "$KEY:$VALUE", }, MetricsCollectionEnabled: datadog.PtrBool(true), RoleName: datadog.PtrString("DatadogAWSIntegrationRole"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.UpdateAWSAccount(ctx, body, *datadogV1.NewUpdateAWSAccountOptionalParameters().WithAccountId("163662907100").WithRoleName("DatadogAWSIntegrationRole")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.UpdateAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.UpdateAWSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an AWS integration returns "OK" response ``` // Update an AWS integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.api.AwsIntegrationApi.UpdateAWSAccountOptionalParameters; import com.datadog.api.client.v1.model.AWSAccount; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSAccount body = new AWSAccount() .accountId("163662907100") .accountSpecificNamespaceRules(Map.ofEntries(Map.entry("auto_scaling", false))) .cspmResourceCollectionEnabled(false) .excludedRegions(Arrays.asList("us-east-1", "us-west-2")) .extendedResourceCollectionEnabled(true) .filterTags(Collections.singletonList("$KEY:$VALUE")) .hostTags(Collections.singletonList("$KEY:$VALUE")) .metricsCollectionEnabled(true) .roleName("DatadogAWSIntegrationRole"); try { apiInstance.updateAWSAccount( body, new UpdateAWSAccountOptionalParameters() .accountId("163662907100") .roleName("DatadogAWSIntegrationRole")); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#updateAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an AWS integration returns "OK" response ``` """ Update an AWS integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_account import AWSAccount body = AWSAccount( account_id="163662907100", account_specific_namespace_rules=dict( auto_scaling=False, ), cspm_resource_collection_enabled=False, excluded_regions=[ "us-east-1", "us-west-2", ], extended_resource_collection_enabled=True, filter_tags=[ "$KEY:$VALUE", ], host_tags=[ "$KEY:$VALUE", ], metrics_collection_enabled=True, role_name="DatadogAWSIntegrationRole", ) configuration = 
Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.update_aws_account( account_id="163662907100", role_name="DatadogAWSIntegrationRole", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an AWS integration returns "OK" response ``` # Update an AWS integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccount.new({ account_id: "163662907100", account_specific_namespace_rules: { auto_scaling: false, }, cspm_resource_collection_enabled: false, excluded_regions: [ "us-east-1", "us-west-2", ], extended_resource_collection_enabled: true, filter_tags: [ "$KEY:$VALUE", ], host_tags: [ "$KEY:$VALUE", ], metrics_collection_enabled: true, role_name: "DatadogAWSIntegrationRole", }) opts = { account_id: "163662907100", role_name: "DatadogAWSIntegrationRole", } p api_instance.update_aws_account(body, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an AWS integration returns "OK" response ``` // Update an AWS integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::api_aws_integration::UpdateAWSAccountOptionalParams; use datadog_api_client::datadogV1::model::AWSAccount; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = AWSAccount::new() .account_id("163662907100".to_string()) .account_specific_namespace_rules(BTreeMap::from([("auto_scaling".to_string(), false)])) .cspm_resource_collection_enabled(false) .excluded_regions(vec!["us-east-1".to_string(), "us-west-2".to_string()]) .extended_resource_collection_enabled(true) .filter_tags(vec!["$KEY:$VALUE".to_string()]) .host_tags(vec!["$KEY:$VALUE".to_string()]) .metrics_collection_enabled(true) .role_name("DatadogAWSIntegrationRole".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api .update_aws_account( body, UpdateAWSAccountOptionalParams::default() .account_id("163662907100".to_string()) .role_name("DatadogAWSIntegrationRole".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an AWS integration returns "OK" response ``` /** * Update an AWS integration returns "OK" response */ import { client, v1 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiUpdateAWSAccountRequest = { body: { accountId: "163662907100", accountSpecificNamespaceRules: { auto_scaling: false, }, cspmResourceCollectionEnabled: false, excludedRegions: ["us-east-1", "us-west-2"], extendedResourceCollectionEnabled: true, filterTags: ["$KEY:$VALUE"], hostTags: ["$KEY:$VALUE"], metricsCollectionEnabled: true, roleName: "DatadogAWSIntegrationRole", }, accountId: "163662907100", roleName: "DatadogAWSIntegrationRole", }; apiInstance .updateAWSAccount(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ap2.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.eu/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.ddog-gov.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us3.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id}https://api.us5.datadoghq.com/api/v2/integration/aws/accounts/{aws_account_config_id} ### Overview Update an AWS Account Integration Config by config ID. This endpoint requires the `aws_configuration_edit` permission. ### Arguments #### Path Parameters Name Type Description aws_account_config_id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description _required_] object AWS Account Update Request data. _required_] object The AWS Account Integration Config to be updated. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. aws_partition enum AWS partition your AWS account is scoped to. Defaults to `aws`. See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. 
include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. object AWS Logs Collection config. object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. object AWS Traces Collection config. AWS X-Ray services to collect traces from. Defaults to `include_only`. object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. id string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. 
type [_required_] enum AWS Account resource type. Allowed enum values: `account` default: `account` ``` { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "role_name": "DatadogIntegrationRole" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#UpdateAWSAccount-429-v2) AWS Account object * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) AWS Account response body. Expand All Field Type Description _required_] object AWS Account response data. object AWS Account response attributes. account_tags [string] Tags to apply to all hosts and metrics reporting for this account. Defaults to `[]`. AWS Authentication config. object AWS Authentication config to integrate your account using an access key pair. access_key_id [_required_] string AWS Access Key ID. secret_access_key string AWS Secret Access Key. object AWS Authentication config to integrate your account using an IAM role. external_id string AWS IAM External ID for associated role. role_name [_required_] string AWS IAM Role name. aws_account_id [_required_] string AWS Account ID. aws_partition enum AWS partition your AWS account is scoped to. Defaults to `aws`. See [Partitions](https://docs.aws.amazon.com/whitepapers/latest/aws-fault-isolation-boundaries/partitions.html) in the AWS documentation for more information. Allowed enum values: `aws,aws-cn,aws-us-gov` AWS Regions to collect data from. Defaults to `include_all`. object Include all regions. Defaults to `true`. include_all [_required_] boolean Include all regions. object Include only these regions. include_only [_required_] [string] Include only these regions. created_at date-time Timestamp of when the account integration was created. object AWS Logs Collection config. object Log Autosubscription configuration for Datadog Forwarder Lambda functions. Automatically set up triggers for existing and new logs for some services, ensuring no logs from new resources are missed and saving time spent on manual configuration. lambdas [string] List of Datadog Lambda Log Forwarder ARNs in your AWS account. Defaults to `[]`. object Log source configuration. [object] List of AWS log source tag filters. Defaults to `[]`. source string The AWS log source to which the tag filters defined in `tags` are applied. tags [string] The AWS resource tags to filter on for the log source specified by `source`. 
sources [string] List of service IDs set to enable automatic log collection. Discover the list of available services with the [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) endpoint. object AWS Metrics Collection config. automute_enabled boolean Enable EC2 automute for AWS metrics. Defaults to `true`. collect_cloudwatch_alarms boolean Enable CloudWatch alarms collection. Defaults to `false`. collect_custom_metrics boolean Enable custom metrics collection. Defaults to `false`. enabled boolean Enable AWS metrics collection. Defaults to `true`. AWS Metrics namespace filters. Defaults to `exclude_only`. object Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. exclude_only [_required_] [string] Exclude only these namespaces from metrics collection. Defaults to `["AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage"]`. `AWS/SQS`, `AWS/ElasticMapReduce`, and `AWS/Usage` are excluded by default to reduce your AWS CloudWatch costs from `GetMetricData` API calls. object Include only these namespaces. include_only [_required_] [string] Include only these namespaces. [object] AWS Metrics collection tag filters list. Defaults to `[]`. namespace string The AWS service for which the tag filters defined in `tags` will be applied. tags [string] The AWS resource tags to filter on for the service specified by `namespace`. modified_at date-time Timestamp of when the account integration was updated. object AWS Resources Collection config. cloud_security_posture_management_collection boolean Enable Cloud Security Management to scan AWS resources for vulnerabilities, misconfigurations, identity risks, and compliance violations. Defaults to `false`. Requires `extended_collection` to be set to `true`. extended_collection boolean Whether Datadog collects additional attributes and configuration information about the resources in your AWS account. Defaults to `true`. Required for `cloud_security_posture_management_collection`. object AWS Traces Collection config. AWS X-Ray services to collect traces from. Defaults to `include_only`. object Include all services. include_all [_required_] boolean Include all services. object Include only these services. Defaults to `[]`. include_only [_required_] [string] Include only these services. id [_required_] string Unique Datadog ID of the AWS Account Integration Config. To get the config ID for an account, use the [List all AWS integrations](https://docs.datadoghq.com/api/latest/aws-integration/#list-all-aws-integrations) endpoint and query by AWS Account ID. type [_required_] enum AWS Account resource type. 
Allowed enum values: `account` default: `account` ``` { "data": { "attributes": { "account_tags": [ "env:prod" ], "auth_config": { "access_key_id": "AKIAIOSFODNN7EXAMPLE", "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY" }, "aws_account_id": "123456789012", "aws_partition": "aws", "aws_regions": { "include_all": true }, "created_at": "2019-09-19T10:00:00.000Z", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "env:prod" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": false, "collect_custom_metrics": false, "enabled": true, "namespace_filters": { "exclude_only": [ "AWS/SQS", "AWS/ElasticMapReduce", "AWS/Usage" ] }, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "datadog:true" ] } ] }, "modified_at": "2019-09-19T10:00:00.000Z", "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": true }, "traces_config": { "xray_services": { "include_all": false } } }, "id": "00000000-abcd-0001-0000-000000000000", "type": "account" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Update an AWS integration returns "AWS Account object" response Copy ``` # Path parameters export aws_account_config_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/accounts/${aws_account_config_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_tags": [ "key:value" ], "auth_config": { "role_name": "DatadogIntegrationRole" }, "aws_account_id": "123456789012", "aws_partition": "aws", "logs_config": { "lambda_forwarder": { "lambdas": [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder" ], "log_source_config": { "tag_filters": [ { "source": "s3", "tags": [ "test:test" ] } ] }, "sources": [ "s3" ] } }, "metrics_config": { "automute_enabled": true, "collect_cloudwatch_alarms": true, "collect_custom_metrics": true, "enabled": true, "tag_filters": [ { "namespace": "AWS/EC2", "tags": [ "key:value" ] } ] }, "resources_config": { "cloud_security_posture_management_collection": false, "extended_collection": false }, "traces_config": {} }, "type": "account" } } EOF ``` ##### Update an AWS integration returns "AWS Account object" response ``` // Update an AWS integration returns "AWS Account object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "aws_account_v2" in the system AwsAccountV2DataID := os.Getenv("AWS_ACCOUNT_V2_DATA_ID") body := datadogV2.AWSAccountUpdateRequest{ Data: datadogV2.AWSAccountUpdateRequestData{ Attributes: datadogV2.AWSAccountUpdateRequestAttributes{ AccountTags: *datadog.NewNullableList(&[]string{ "key:value", }), AuthConfig: &datadogV2.AWSAuthConfig{ AWSAuthConfigRole: &datadogV2.AWSAuthConfigRole{ RoleName: "DatadogIntegrationRole", }}, AwsAccountId: "123456789012", AwsPartition: datadogV2.AWSACCOUNTPARTITION_AWS.Ptr(), CcmConfig: &datadogV2.AWSCCMConfig{ DataExportConfigs: []datadogV2.DataExportConfig{ { BucketName: datadog.PtrString("updated-bucket"), BucketRegion: datadog.PtrString("us-west-2"), ReportName: datadog.PtrString("updated-report"), ReportPrefix: datadog.PtrString("cost-reports"), ReportType: datadog.PtrString("CUR2.0"), }, }, }, LogsConfig: &datadogV2.AWSLogsConfig{ LambdaForwarder: &datadogV2.AWSLambdaForwarderConfig{ Lambdas: []string{ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", }, LogSourceConfig: &datadogV2.AWSLambdaForwarderConfigLogSourceConfig{ TagFilters: []datadogV2.AWSLogSourceTagFilter{ { Source: datadog.PtrString("s3"), Tags: 
*datadog.NewNullableList(&[]string{ "test:test", }), }, }, }, Sources: []string{ "s3", }, }, }, MetricsConfig: &datadogV2.AWSMetricsConfig{ AutomuteEnabled: datadog.PtrBool(true), CollectCloudwatchAlarms: datadog.PtrBool(true), CollectCustomMetrics: datadog.PtrBool(true), Enabled: datadog.PtrBool(true), TagFilters: []datadogV2.AWSNamespaceTagFilter{ { Namespace: datadog.PtrString("AWS/EC2"), Tags: *datadog.NewNullableList(&[]string{ "key:value", }), }, }, }, ResourcesConfig: &datadogV2.AWSResourcesConfig{ CloudSecurityPostureManagementCollection: datadog.PtrBool(false), ExtendedCollection: datadog.PtrBool(false), }, TracesConfig: &datadogV2.AWSTracesConfig{}, }, Type: datadogV2.AWSACCOUNTTYPE_ACCOUNT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.UpdateAWSAccount(ctx, AwsAccountV2DataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.UpdateAWSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.UpdateAWSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an AWS integration returns "AWS Account object" response ``` // Update an AWS integration returns "AWS Account object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSAccountPartition; import com.datadog.api.client.v2.model.AWSAccountResponse; import com.datadog.api.client.v2.model.AWSAccountType; import com.datadog.api.client.v2.model.AWSAccountUpdateRequest; import com.datadog.api.client.v2.model.AWSAccountUpdateRequestAttributes; import com.datadog.api.client.v2.model.AWSAccountUpdateRequestData; import com.datadog.api.client.v2.model.AWSAuthConfig; import com.datadog.api.client.v2.model.AWSAuthConfigRole; import com.datadog.api.client.v2.model.AWSCCMConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfig; import com.datadog.api.client.v2.model.AWSLambdaForwarderConfigLogSourceConfig; import com.datadog.api.client.v2.model.AWSLogSourceTagFilter; import com.datadog.api.client.v2.model.AWSLogsConfig; import com.datadog.api.client.v2.model.AWSMetricsConfig; import com.datadog.api.client.v2.model.AWSNamespaceTagFilter; import com.datadog.api.client.v2.model.AWSResourcesConfig; import com.datadog.api.client.v2.model.AWSTracesConfig; import com.datadog.api.client.v2.model.DataExportConfig; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); // there is a valid "aws_account_v2" in the system String AWS_ACCOUNT_V2_DATA_ID = System.getenv("AWS_ACCOUNT_V2_DATA_ID"); AWSAccountUpdateRequest body = new AWSAccountUpdateRequest() .data( new AWSAccountUpdateRequestData() .attributes( new AWSAccountUpdateRequestAttributes() 
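// The chained setters below populate the update attributes documented above:
// account tags, role-based authConfig, awsAccountId and awsPartition, a
// ccmConfig cost-data export (bucket and report settings), the logsConfig
// Lambda forwarder with tag-filtered log sources, metricsConfig tag filters,
// resourcesConfig collection flags, and an empty tracesConfig.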
.accountTags(Collections.singletonList("key:value")) .authConfig( new AWSAuthConfig( new AWSAuthConfigRole().roleName("DatadogIntegrationRole"))) .awsAccountId("123456789012") .awsPartition(AWSAccountPartition.AWS) .ccmConfig( new AWSCCMConfig() .dataExportConfigs( Collections.singletonList( new DataExportConfig() .bucketName("updated-bucket") .bucketRegion("us-west-2") .reportName("updated-report") .reportPrefix("cost-reports") .reportType("CUR2.0")))) .logsConfig( new AWSLogsConfig() .lambdaForwarder( new AWSLambdaForwarderConfig() .lambdas( Collections.singletonList( "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder")) .logSourceConfig( new AWSLambdaForwarderConfigLogSourceConfig() .tagFilters( Collections.singletonList( new AWSLogSourceTagFilter() .source("s3") .tags( Collections.singletonList( "test:test"))))) .sources(Collections.singletonList("s3")))) .metricsConfig( new AWSMetricsConfig() .automuteEnabled(true) .collectCloudwatchAlarms(true) .collectCustomMetrics(true) .enabled(true) .tagFilters( Collections.singletonList( new AWSNamespaceTagFilter() .namespace("AWS/EC2") .tags(Collections.singletonList("key:value"))))) .resourcesConfig( new AWSResourcesConfig() .cloudSecurityPostureManagementCollection(false) .extendedCollection(false)) .tracesConfig(new AWSTracesConfig())) .type(AWSAccountType.ACCOUNT)); try { AWSAccountResponse result = apiInstance.updateAWSAccount(AWS_ACCOUNT_V2_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#updateAWSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an AWS integration returns "AWS Account object" response ``` """ Update an AWS integration returns "AWS Account object" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v2.model.aws_account_partition import AWSAccountPartition from datadog_api_client.v2.model.aws_account_type import AWSAccountType from datadog_api_client.v2.model.aws_account_update_request import AWSAccountUpdateRequest from datadog_api_client.v2.model.aws_account_update_request_attributes import AWSAccountUpdateRequestAttributes from datadog_api_client.v2.model.aws_account_update_request_data import AWSAccountUpdateRequestData from datadog_api_client.v2.model.aws_auth_config_role import AWSAuthConfigRole from datadog_api_client.v2.model.aws_lambda_forwarder_config import AWSLambdaForwarderConfig from datadog_api_client.v2.model.aws_lambda_forwarder_config_log_source_config import ( AWSLambdaForwarderConfigLogSourceConfig, ) from datadog_api_client.v2.model.aws_log_source_tag_filter import AWSLogSourceTagFilter from datadog_api_client.v2.model.aws_logs_config import AWSLogsConfig from datadog_api_client.v2.model.aws_metrics_config import AWSMetricsConfig from datadog_api_client.v2.model.aws_namespace_tag_filter import AWSNamespaceTagFilter from 
datadog_api_client.v2.model.aws_resources_config import AWSResourcesConfig from datadog_api_client.v2.model.aws_traces_config import AWSTracesConfig from datadog_api_client.v2.model.awsccm_config import AWSCCMConfig from datadog_api_client.v2.model.data_export_config import DataExportConfig # there is a valid "aws_account_v2" in the system AWS_ACCOUNT_V2_DATA_ID = environ["AWS_ACCOUNT_V2_DATA_ID"] body = AWSAccountUpdateRequest( data=AWSAccountUpdateRequestData( attributes=AWSAccountUpdateRequestAttributes( account_tags=[ "key:value", ], auth_config=AWSAuthConfigRole( role_name="DatadogIntegrationRole", ), aws_account_id="123456789012", aws_partition=AWSAccountPartition.AWS, ccm_config=AWSCCMConfig( data_export_configs=[ DataExportConfig( bucket_name="updated-bucket", bucket_region="us-west-2", report_name="updated-report", report_prefix="cost-reports", report_type="CUR2.0", ), ], ), logs_config=AWSLogsConfig( lambda_forwarder=AWSLambdaForwarderConfig( lambdas=[ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config=AWSLambdaForwarderConfigLogSourceConfig( tag_filters=[ AWSLogSourceTagFilter( source="s3", tags=[ "test:test", ], ), ], ), sources=[ "s3", ], ), ), metrics_config=AWSMetricsConfig( automute_enabled=True, collect_cloudwatch_alarms=True, collect_custom_metrics=True, enabled=True, tag_filters=[ AWSNamespaceTagFilter( namespace="AWS/EC2", tags=[ "key:value", ], ), ], ), resources_config=AWSResourcesConfig( cloud_security_posture_management_collection=False, extended_collection=False, ), traces_config=AWSTracesConfig(), ), type=AWSAccountType.ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.update_aws_account(aws_account_config_id=AWS_ACCOUNT_V2_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an AWS integration returns "AWS Account object" response ``` # Update an AWS integration returns "AWS Account object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new # there is a valid "aws_account_v2" in the system AWS_ACCOUNT_V2_DATA_ID = ENV["AWS_ACCOUNT_V2_DATA_ID"] body = DatadogAPIClient::V2::AWSAccountUpdateRequest.new({ data: DatadogAPIClient::V2::AWSAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::AWSAccountUpdateRequestAttributes.new({ account_tags: [ "key:value", ], auth_config: DatadogAPIClient::V2::AWSAuthConfigRole.new({ role_name: "DatadogIntegrationRole", }), aws_account_id: "123456789012", aws_partition: DatadogAPIClient::V2::AWSAccountPartition::AWS, ccm_config: DatadogAPIClient::V2::AWSCCMConfig.new({ data_export_configs: [ DatadogAPIClient::V2::DataExportConfig.new({ bucket_name: "updated-bucket", bucket_region: "us-west-2", report_name: "updated-report", report_prefix: "cost-reports", report_type: "CUR2.0", }), ], }), logs_config: DatadogAPIClient::V2::AWSLogsConfig.new({ lambda_forwarder: DatadogAPIClient::V2::AWSLambdaForwarderConfig.new({ lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], log_source_config: 
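# log_source_config below limits automatic log collection for the "s3" source
# to resources carrying the "test:test" tag, per the tag filter semantics
# described in the request body reference above.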
DatadogAPIClient::V2::AWSLambdaForwarderConfigLogSourceConfig.new({ tag_filters: [ DatadogAPIClient::V2::AWSLogSourceTagFilter.new({ source: "s3", tags: [ "test:test", ], }), ], }), sources: [ "s3", ], }), }), metrics_config: DatadogAPIClient::V2::AWSMetricsConfig.new({ automute_enabled: true, collect_cloudwatch_alarms: true, collect_custom_metrics: true, enabled: true, tag_filters: [ DatadogAPIClient::V2::AWSNamespaceTagFilter.new({ namespace: "AWS/EC2", tags: [ "key:value", ], }), ], }), resources_config: DatadogAPIClient::V2::AWSResourcesConfig.new({ cloud_security_posture_management_collection: false, extended_collection: false, }), traces_config: DatadogAPIClient::V2::AWSTracesConfig.new({}), }), type: DatadogAPIClient::V2::AWSAccountType::ACCOUNT, }), }) p api_instance.update_aws_account(AWS_ACCOUNT_V2_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an AWS integration returns "AWS Account object" response ``` // Update an AWS integration returns "AWS Account object" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV2::model::AWSAccountPartition; use datadog_api_client::datadogV2::model::AWSAccountType; use datadog_api_client::datadogV2::model::AWSAccountUpdateRequest; use datadog_api_client::datadogV2::model::AWSAccountUpdateRequestAttributes; use datadog_api_client::datadogV2::model::AWSAccountUpdateRequestData; use datadog_api_client::datadogV2::model::AWSAuthConfig; use datadog_api_client::datadogV2::model::AWSAuthConfigRole; use datadog_api_client::datadogV2::model::AWSCCMConfig; use datadog_api_client::datadogV2::model::AWSLambdaForwarderConfig; use datadog_api_client::datadogV2::model::AWSLambdaForwarderConfigLogSourceConfig; use datadog_api_client::datadogV2::model::AWSLogSourceTagFilter; use datadog_api_client::datadogV2::model::AWSLogsConfig; use datadog_api_client::datadogV2::model::AWSMetricsConfig; use datadog_api_client::datadogV2::model::AWSNamespaceTagFilter; use datadog_api_client::datadogV2::model::AWSResourcesConfig; use datadog_api_client::datadogV2::model::AWSTracesConfig; use datadog_api_client::datadogV2::model::DataExportConfig; #[tokio::main] async fn main() { // there is a valid "aws_account_v2" in the system let aws_account_v2_data_id = std::env::var("AWS_ACCOUNT_V2_DATA_ID").unwrap(); let body = AWSAccountUpdateRequest::new( AWSAccountUpdateRequestData::new( AWSAccountUpdateRequestAttributes::new("123456789012".to_string()) .account_tags(Some(vec!["key:value".to_string()])) .auth_config( AWSAuthConfig::AWSAuthConfigRole( Box::new(AWSAuthConfigRole::new("DatadogIntegrationRole".to_string())), ), ) .aws_partition(AWSAccountPartition::AWS) .ccm_config( AWSCCMConfig ::new().data_export_configs( vec![ DataExportConfig::new() .bucket_name("updated-bucket".to_string()) .bucket_region("us-west-2".to_string()) .report_name("updated-report".to_string()) .report_prefix("cost-reports".to_string()) .report_type("CUR2.0".to_string()) ], ), ) .logs_config( AWSLogsConfig ::new().lambda_forwarder( AWSLambdaForwarderConfig::new() .lambdas( vec![ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder".to_string() ], ) 
.log_source_config( AWSLambdaForwarderConfigLogSourceConfig ::new().tag_filters( vec![ AWSLogSourceTagFilter::new() .source("s3".to_string()) .tags(Some(vec!["test:test".to_string()])) ], ), ) .sources(vec!["s3".to_string()]), ), ) .metrics_config( AWSMetricsConfig::new() .automute_enabled(true) .collect_cloudwatch_alarms(true) .collect_custom_metrics(true) .enabled(true) .tag_filters( vec![ AWSNamespaceTagFilter::new() .namespace("AWS/EC2".to_string()) .tags(Some(vec!["key:value".to_string()])) ], ), ) .resources_config( AWSResourcesConfig::new() .cloud_security_posture_management_collection(false) .extended_collection(false), ) .traces_config(AWSTracesConfig::new()), AWSAccountType::ACCOUNT, ), ); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api .update_aws_account(aws_account_v2_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an AWS integration returns "AWS Account object" response ``` /** * Update an AWS integration returns "AWS Account object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); // there is a valid "aws_account_v2" in the system const AWS_ACCOUNT_V2_DATA_ID = process.env.AWS_ACCOUNT_V2_DATA_ID as string; const params: v2.AWSIntegrationApiUpdateAWSAccountRequest = { body: { data: { attributes: { accountTags: ["key:value"], authConfig: { roleName: "DatadogIntegrationRole", }, awsAccountId: "123456789012", awsPartition: "aws", ccmConfig: { dataExportConfigs: [ { bucketName: "updated-bucket", bucketRegion: "us-west-2", reportName: "updated-report", reportPrefix: "cost-reports", reportType: "CUR2.0", }, ], }, logsConfig: { lambdaForwarder: { lambdas: [ "arn:aws:lambda:us-east-1:123456789012:function:DatadogLambdaLogForwarder", ], logSourceConfig: { tagFilters: [ { source: "s3", tags: ["test:test"], }, ], }, sources: ["s3"], }, }, metricsConfig: { automuteEnabled: true, collectCloudwatchAlarms: true, collectCustomMetrics: true, enabled: true, tagFilters: [ { namespace: "AWS/EC2", tags: ["key:value"], }, ], }, resourcesConfig: { cloudSecurityPostureManagementCollection: false, extendedCollection: false, }, tracesConfig: {}, }, type: "account", }, }, awsAccountConfigId: AWS_ACCOUNT_V2_DATA_ID, }; apiInstance .updateAWSAccount(params) .then((data: v2.AWSAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all Amazon EventBridge sources](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-amazon-eventbridge-sources) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-amazon-eventbridge-sources-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#get-all-amazon-eventbridge-sources-v2) GET https://api.ap1.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v1/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Get all Amazon EventBridge sources. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) An object describing the EventBridge configuration for multiple accounts. Expand All Field Type Description [object] List of accounts with their event sources. accountId string Your AWS Account ID without dashes. [object] Array of AWS event sources associated with this account. name string The event source name. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). tags [string] Array of tags (in the form `key:value`) which are added to all hosts and metrics reporting through the main AWS integration. isInstalled boolean True if the EventBridge sub-integration is enabled for your organization. ``` { "accounts": [ { "accountId": "123456789012", "eventHubs": [ { "name": "string", "region": "string" } ], "tags": [ "$KEY:$VALUE" ] } ], "isInstalled": false } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Get all Amazon EventBridge sources Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Amazon EventBridge sources ``` """ Get all Amazon EventBridge sources returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.list_aws_event_bridge_sources() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Amazon EventBridge sources ``` # Get all Amazon EventBridge sources returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new p api_instance.list_aws_event_bridge_sources() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.ListAWSEventBridgeSources(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSEventBridgeSources`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSEventBridgeSources`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSEventBridgeListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSEventBridgeListResponse result = apiInstance.listAWSEventBridgeSources(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#listAWSEventBridgeSources"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.list_aws_event_bridge_sources().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Amazon EventBridge sources ``` /** * Get all Amazon EventBridge sources returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); apiInstance .listAWSEventBridgeSources() .then((data: v1.AWSEventBridgeListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v2/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge ### Overview Get all Amazon EventBridge sources. This endpoint requires the `integrations_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-403-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#ListAWSEventBridgeSources-429-v2) Amazon EventBridge sources list. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Amazon EventBridge list response body. Expand All Field Type Description _required_] object Amazon EventBridge list response data. _required_] object An object describing the EventBridge configuration for multiple accounts. [object] List of accounts with their event sources. account_id string Your AWS Account ID without dashes. [object] Array of AWS event sources associated with this account. name string The event source name. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). tags [string] Array of tags (in the form `key:value`) which are added to all hosts and metrics reporting through the main AWS integration. is_installed boolean True if the EventBridge integration is enabled for your organization. id [_required_] string The ID of the Amazon EventBridge list response data. default: `get_event_bridge` type [_required_] enum Amazon EventBridge resource type. Allowed enum values: `event_bridge` default: `event_bridge` ``` { "data": { "attributes": { "accounts": [ { "account_id": "123456789012", "event_hubs": [ { "name": "app-alerts-zyxw3210", "region": "us-east-1" } ], "tags": [ "$KEY:$VALUE" ] } ], "is_installed": false }, "id": "get_event_bridge", "type": "event_bridge" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Get all Amazon EventBridge sources Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Amazon EventBridge sources ``` """ Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.list_aws_event_bridge_sources() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Amazon EventBridge sources ``` # Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new p api_instance.list_aws_event_bridge_sources() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.ListAWSEventBridgeSources(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.ListAWSEventBridgeSources`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.ListAWSEventBridgeSources`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSEventBridgeListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); try { AWSEventBridgeListResponse result = apiInstance.listAWSEventBridgeSources(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#listAWSEventBridgeSources"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Amazon EventBridge sources ``` // Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.list_aws_event_bridge_sources().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Amazon EventBridge sources ``` /** * Get all Amazon EventBridge sources returns "Amazon EventBridge sources list." 
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); apiInstance .listAWSEventBridgeSources() .then((data: v2.AWSEventBridgeListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an Amazon EventBridge source](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-amazon-eventbridge-source) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-amazon-eventbridge-source-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#create-an-amazon-eventbridge-source-v2) POST https://api.ap1.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v1/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Create an Amazon EventBridge source. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Create an Amazon EventBridge source for an AWS account with a given name and region. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description account_id string Your AWS Account ID without dashes. create_event_bus boolean True if Datadog should create the event bus in addition to the event source. Requires the `events:CreateEventBus` permission. event_generator_name string The given part of the event source name, which is then combined with an assigned suffix to form the full name. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). ``` { "account_id": "123456789012", "create_event_bus": true, "event_generator_name": "app-alerts", "region": "us-east-1" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) A created EventBridge source. Expand All Field Type Description event_source_name string The event source name. has_bus boolean True if the event bus was created in addition to the source. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). status enum The event source status "created". 
Allowed enum values: `created` ``` { "event_source_name": "app-alerts-zyxw3210", "has_bus": true, "region": "us-east-1", "status": "created" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Create an Amazon EventBridge source Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create an Amazon EventBridge source ``` """ Create an Amazon EventBridge source returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_event_bridge_create_request import AWSEventBridgeCreateRequest body = AWSEventBridgeCreateRequest( account_id="123456789012", create_event_bus=True, event_generator_name="app-alerts", region="us-east-1", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_aws_event_bridge_source(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an Amazon EventBridge source ``` # Create an Amazon EventBridge source returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSEventBridgeCreateRequest.new({ account_id: "123456789012", create_event_bus: 
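# Creating the event bus as well as the source requires the
# events:CreateEventBus permission, as noted in the field description above.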
true, event_generator_name: "app-alerts", region: "us-east-1", }) p api_instance.create_aws_event_bridge_source(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSEventBridgeCreateRequest{ AccountId: datadog.PtrString("123456789012"), CreateEventBus: datadog.PtrBool(true), EventGeneratorName: datadog.PtrString("app-alerts"), Region: datadog.PtrString("us-east-1"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateAWSEventBridgeSource(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSEventBridgeSource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSEventBridgeSource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSEventBridgeCreateRequest; import com.datadog.api.client.v1.model.AWSEventBridgeCreateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSEventBridgeCreateRequest body = new AWSEventBridgeCreateRequest() .accountId("123456789012") .createEventBus(true) .eventGeneratorName("app-alerts") .region("us-east-1"); try { AWSEventBridgeCreateResponse result = apiInstance.createAWSEventBridgeSource(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createAWSEventBridgeSource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create 
an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSEventBridgeCreateRequest; #[tokio::main] async fn main() { let body = AWSEventBridgeCreateRequest::new() .account_id("123456789012".to_string()) .create_event_bus(true) .event_generator_name("app-alerts".to_string()) .region("us-east-1".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_aws_event_bridge_source(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an Amazon EventBridge source ``` /** * Create an Amazon EventBridge source returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiCreateAWSEventBridgeSourceRequest = { body: { accountId: "123456789012", createEventBus: true, eventGeneratorName: "app-alerts", region: "us-east-1", }, }; apiInstance .createAWSEventBridgeSource(params) .then((data: v1.AWSEventBridgeCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v2/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge ### Overview Create an Amazon EventBridge source. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Create an Amazon EventBridge source for an AWS account with a given name and region. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description _required_] object Amazon EventBridge create request data. _required_] object The EventBridge source to be created. account_id [_required_] string AWS Account ID. create_event_bus boolean Set to true if Datadog should create the event bus in addition to the event source. Requires the `events:CreateEventBus` permission. event_generator_name [_required_] string The given part of the event source name, which is then combined with an assigned suffix to form the full name. 
region [_required_] string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). type [_required_] enum Amazon EventBridge resource type. Allowed enum values: `event_bridge` default: `event_bridge` ``` { "data": { "attributes": { "account_id": "123456789012", "create_event_bus": true, "event_generator_name": "app-alerts", "region": "us-east-1" }, "type": "event_bridge" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-403-v2) * [409](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-409-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#CreateAWSEventBridgeSource-429-v2) Amazon EventBridge source created. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Amazon EventBridge create response body. Expand All Field Type Description _required_] object Amazon EventBridge create response data. _required_] object A created EventBridge source. event_source_name string The event source name. has_bus boolean True if the event bus was created in addition to the source. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). status enum The event source status "created". Allowed enum values: `created` id string The ID of the Amazon EventBridge create response data. default: `create_event_bridge` type [_required_] enum Amazon EventBridge resource type. Allowed enum values: `event_bridge` default: `event_bridge` ``` { "data": { "attributes": { "event_source_name": "app-alerts-zyxw3210", "has_bus": true, "region": "us-east-1", "status": "created" }, "id": "create_event_bridge", "type": "event_bridge" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Create an Amazon EventBridge source Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_id": "123456789012", "event_generator_name": "app-alerts", "region": "us-east-1" }, "type": "event_bridge" } } EOF ``` ##### Create an Amazon EventBridge source ``` """ Create an Amazon EventBridge source returns "Amazon EventBridge source created." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v2.model.aws_event_bridge_create_request import AWSEventBridgeCreateRequest from datadog_api_client.v2.model.aws_event_bridge_create_request_attributes import AWSEventBridgeCreateRequestAttributes from datadog_api_client.v2.model.aws_event_bridge_create_request_data import AWSEventBridgeCreateRequestData from datadog_api_client.v2.model.aws_event_bridge_type import AWSEventBridgeType body = AWSEventBridgeCreateRequest( data=AWSEventBridgeCreateRequestData( attributes=AWSEventBridgeCreateRequestAttributes( account_id="123456789012", create_event_bus=True, event_generator_name="app-alerts", region="us-east-1", ), type=AWSEventBridgeType.EVENT_BRIDGE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.create_aws_event_bridge_source(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an Amazon EventBridge source ``` # Create an Amazon EventBridge source returns "Amazon EventBridge source created." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new body = DatadogAPIClient::V2::AWSEventBridgeCreateRequest.new({ data: DatadogAPIClient::V2::AWSEventBridgeCreateRequestData.new({ attributes: DatadogAPIClient::V2::AWSEventBridgeCreateRequestAttributes.new({ account_id: "123456789012", create_event_bus: true, event_generator_name: "app-alerts", region: "us-east-1", }), type: DatadogAPIClient::V2::AWSEventBridgeType::EVENT_BRIDGE, }), }) p api_instance.create_aws_event_bridge_source(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "Amazon EventBridge source created." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AWSEventBridgeCreateRequest{ Data: datadogV2.AWSEventBridgeCreateRequestData{ Attributes: datadogV2.AWSEventBridgeCreateRequestAttributes{ AccountId: "123456789012", CreateEventBus: datadog.PtrBool(true), EventGeneratorName: "app-alerts", Region: "us-east-1", }, Type: datadogV2.AWSEVENTBRIDGETYPE_EVENT_BRIDGE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.CreateAWSEventBridgeSource(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.CreateAWSEventBridgeSource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.CreateAWSEventBridgeSource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "Amazon EventBridge source created." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSEventBridgeCreateRequest; import com.datadog.api.client.v2.model.AWSEventBridgeCreateRequestAttributes; import com.datadog.api.client.v2.model.AWSEventBridgeCreateRequestData; import com.datadog.api.client.v2.model.AWSEventBridgeCreateResponse; import com.datadog.api.client.v2.model.AWSEventBridgeType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSEventBridgeCreateRequest body = new AWSEventBridgeCreateRequest() .data( new AWSEventBridgeCreateRequestData() .attributes( new AWSEventBridgeCreateRequestAttributes() .accountId("123456789012") .createEventBus(true) .eventGeneratorName("app-alerts") .region("us-east-1")) .type(AWSEventBridgeType.EVENT_BRIDGE)); try { AWSEventBridgeCreateResponse result = apiInstance.createAWSEventBridgeSource(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#createAWSEventBridgeSource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an Amazon EventBridge source ``` // Create an Amazon EventBridge source returns "Amazon EventBridge source // created." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV2::model::AWSEventBridgeCreateRequest; use datadog_api_client::datadogV2::model::AWSEventBridgeCreateRequestAttributes; use datadog_api_client::datadogV2::model::AWSEventBridgeCreateRequestData; use datadog_api_client::datadogV2::model::AWSEventBridgeType; #[tokio::main] async fn main() { let body = AWSEventBridgeCreateRequest::new(AWSEventBridgeCreateRequestData::new( AWSEventBridgeCreateRequestAttributes::new( "123456789012".to_string(), "app-alerts".to_string(), "us-east-1".to_string(), ) .create_event_bus(true), AWSEventBridgeType::EVENT_BRIDGE, )); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.create_aws_event_bridge_source(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an Amazon EventBridge source ``` /** * Create an Amazon EventBridge source returns "Amazon EventBridge source created." 
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); const params: v2.AWSIntegrationApiCreateAWSEventBridgeSourceRequest = { body: { data: { attributes: { accountId: "123456789012", createEventBus: true, eventGeneratorName: "app-alerts", region: "us-east-1", }, type: "event_bridge", }, }, }; apiInstance .createAWSEventBridgeSource(params) .then((data: v2.AWSEventBridgeCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an Amazon EventBridge source](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-amazon-eventbridge-source) * [v1](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-amazon-eventbridge-source-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-integration/#delete-an-amazon-eventbridge-source-v2) DELETE https://api.ap1.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v1/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v1/integration/aws/event_bridgehttps://api.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v1/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge ### Overview **This endpoint is deprecated - use the V2 endpoints instead.** Delete an Amazon EventBridge source. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Delete the Amazon EventBridge source with the given name, region, and associated AWS account. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description account_id string Your AWS Account ID without dashes. event_generator_name string The event source name. region string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). ``` { "account_id": "123456789012", "event_generator_name": "app-alerts-zyxw3210", "region": "us-east-1" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) An indicator of the successful deletion of an EventBridge source. Expand All Field Type Description status enum The event source status "empty". 
Allowed enum values: `empty` ``` { "status": "empty" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Delete an Amazon EventBridge source Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Delete an Amazon EventBridge source ``` """ Delete an Amazon EventBridge source returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v1.model.aws_event_bridge_delete_request import AWSEventBridgeDeleteRequest body = AWSEventBridgeDeleteRequest( account_id="123456789012", event_generator_name="app-alerts-zyxw3210", region="us-east-1", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.delete_aws_event_bridge_source(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an Amazon EventBridge source ``` # Delete an Amazon EventBridge source returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSIntegrationAPI.new body = DatadogAPIClient::V1::AWSEventBridgeDeleteRequest.new({ account_id: "123456789012", event_generator_name: "app-alerts-zyxw3210", region: "us-east-1", }) p api_instance.delete_aws_event_bridge_source(body) 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSEventBridgeDeleteRequest{ AccountId: datadog.PtrString("123456789012"), EventGeneratorName: datadog.PtrString("app-alerts-zyxw3210"), Region: datadog.PtrString("us-east-1"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSIntegrationApi(apiClient) resp, r, err := api.DeleteAWSEventBridgeSource(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.DeleteAWSEventBridgeSource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.DeleteAWSEventBridgeSource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsIntegrationApi; import com.datadog.api.client.v1.model.AWSEventBridgeDeleteRequest; import com.datadog.api.client.v1.model.AWSEventBridgeDeleteResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSEventBridgeDeleteRequest body = new AWSEventBridgeDeleteRequest() .accountId("123456789012") .eventGeneratorName("app-alerts-zyxw3210") .region("us-east-1"); try { AWSEventBridgeDeleteResponse result = apiInstance.deleteAWSEventBridgeSource(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#deleteAWSEventBridgeSource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV1::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV1::model::AWSEventBridgeDeleteRequest; #[tokio::main] async fn main() { let body = AWSEventBridgeDeleteRequest::new() .account_id("123456789012".to_string()) .event_generator_name("app-alerts-zyxw3210".to_string()) .region("us-east-1".to_string()); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.delete_aws_event_bridge_source(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an Amazon EventBridge source ``` /** * Delete an Amazon EventBridge source returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSIntegrationApi(configuration); const params: v1.AWSIntegrationApiDeleteAWSEventBridgeSourceRequest = { body: { accountId: "123456789012", eventGeneratorName: "app-alerts-zyxw3210", region: "us-east-1", }, }; apiInstance .deleteAWSEventBridgeSource(params) .then((data: v1.AWSEventBridgeDeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` DELETE https://api.ap1.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.ap2.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.eu/api/v2/integration/aws/event_bridgehttps://api.ddog-gov.com/api/v2/integration/aws/event_bridgehttps://api.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us3.datadoghq.com/api/v2/integration/aws/event_bridgehttps://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge ### Overview Delete an Amazon EventBridge source. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Delete the Amazon EventBridge source with the given name, region, and associated AWS account. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Expand All Field Type Description _required_] object Amazon EventBridge delete request data. _required_] object The EventBridge source to be deleted. account_id [_required_] string AWS Account ID. event_generator_name [_required_] string The event source name. region [_required_] string The event source's [AWS region](https://docs.aws.amazon.com/general/latest/gr/rande.html#regional-endpoints). type [_required_] enum Amazon EventBridge resource type. 
Allowed enum values: `event_bridge` default: `event_bridge` ``` { "data": { "attributes": { "account_id": "123456789012", "event_generator_name": "app-alerts-zyxw3210", "region": "us-east-1" }, "type": "event_bridge" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-200-v2) * [400](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-400-v2) * [403](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-403-v2) * [429](https://docs.datadoghq.com/api/latest/aws-integration/#DeleteAWSEventBridgeSource-429-v2) Amazon EventBridge source deleted. * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) Amazon EventBridge delete response body. Expand All Field Type Description _required_] object Amazon EventBridge delete response data. _required_] object The EventBridge source delete response attributes. status enum The event source status "empty". Allowed enum values: `empty` id string The ID of the Amazon EventBridge list response data. default: `delete_event_bridge` type [_required_] enum Amazon EventBridge resource type. Allowed enum values: `event_bridge` default: `event_bridge` ``` { "data": { "attributes": { "status": "empty" }, "id": "delete_event_bridge", "type": "event_bridge" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-integration/?code-lang=typescript) ##### Delete an Amazon EventBridge source Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/event_bridge" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_id": "123456789012", "event_generator_name": "app-alerts-zyxw3210", "region": "us-east-1" }, "type": "event_bridge" } } EOF ``` ##### Delete an Amazon EventBridge source ``` """ Delete an Amazon EventBridge source returns "Amazon EventBridge source deleted." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi from datadog_api_client.v2.model.aws_event_bridge_delete_request import AWSEventBridgeDeleteRequest from datadog_api_client.v2.model.aws_event_bridge_delete_request_attributes import AWSEventBridgeDeleteRequestAttributes from datadog_api_client.v2.model.aws_event_bridge_delete_request_data import AWSEventBridgeDeleteRequestData from datadog_api_client.v2.model.aws_event_bridge_type import AWSEventBridgeType body = AWSEventBridgeDeleteRequest( data=AWSEventBridgeDeleteRequestData( attributes=AWSEventBridgeDeleteRequestAttributes( account_id="123456789012", event_generator_name="app-alerts-zyxw3210", region="us-east-1", ), type=AWSEventBridgeType.EVENT_BRIDGE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSIntegrationApi(api_client) response = api_instance.delete_aws_event_bridge_source(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an Amazon EventBridge source ``` # Delete an Amazon EventBridge source returns "Amazon EventBridge source deleted." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSIntegrationAPI.new body = DatadogAPIClient::V2::AWSEventBridgeDeleteRequest.new({ data: DatadogAPIClient::V2::AWSEventBridgeDeleteRequestData.new({ attributes: DatadogAPIClient::V2::AWSEventBridgeDeleteRequestAttributes.new({ account_id: "123456789012", event_generator_name: "app-alerts-zyxw3210", region: "us-east-1", }), type: DatadogAPIClient::V2::AWSEventBridgeType::EVENT_BRIDGE, }), }) p api_instance.delete_aws_event_bridge_source(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "Amazon EventBridge source deleted." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AWSEventBridgeDeleteRequest{ Data: datadogV2.AWSEventBridgeDeleteRequestData{ Attributes: datadogV2.AWSEventBridgeDeleteRequestAttributes{ AccountId: "123456789012", EventGeneratorName: "app-alerts-zyxw3210", Region: "us-east-1", }, Type: datadogV2.AWSEVENTBRIDGETYPE_EVENT_BRIDGE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSIntegrationApi(apiClient) resp, r, err := api.DeleteAWSEventBridgeSource(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSIntegrationApi.DeleteAWSEventBridgeSource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSIntegrationApi.DeleteAWSEventBridgeSource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "Amazon EventBridge source deleted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsIntegrationApi; import com.datadog.api.client.v2.model.AWSEventBridgeDeleteRequest; import com.datadog.api.client.v2.model.AWSEventBridgeDeleteRequestAttributes; import com.datadog.api.client.v2.model.AWSEventBridgeDeleteRequestData; import com.datadog.api.client.v2.model.AWSEventBridgeDeleteResponse; import com.datadog.api.client.v2.model.AWSEventBridgeType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsIntegrationApi apiInstance = new AwsIntegrationApi(defaultClient); AWSEventBridgeDeleteRequest body = new AWSEventBridgeDeleteRequest() .data( new AWSEventBridgeDeleteRequestData() .attributes( new AWSEventBridgeDeleteRequestAttributes() .accountId("123456789012") .eventGeneratorName("app-alerts-zyxw3210") .region("us-east-1")) .type(AWSEventBridgeType.EVENT_BRIDGE)); try { AWSEventBridgeDeleteResponse result = apiInstance.deleteAWSEventBridgeSource(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsIntegrationApi#deleteAWSEventBridgeSource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an Amazon EventBridge source ``` // Delete an Amazon EventBridge source returns "Amazon EventBridge source // deleted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_integration::AWSIntegrationAPI; use datadog_api_client::datadogV2::model::AWSEventBridgeDeleteRequest; use datadog_api_client::datadogV2::model::AWSEventBridgeDeleteRequestAttributes; use datadog_api_client::datadogV2::model::AWSEventBridgeDeleteRequestData; use datadog_api_client::datadogV2::model::AWSEventBridgeType; #[tokio::main] async fn main() { let body = AWSEventBridgeDeleteRequest::new(AWSEventBridgeDeleteRequestData::new( AWSEventBridgeDeleteRequestAttributes::new( "123456789012".to_string(), "app-alerts-zyxw3210".to_string(), "us-east-1".to_string(), ), AWSEventBridgeType::EVENT_BRIDGE, )); let configuration = datadog::Configuration::new(); let api = AWSIntegrationAPI::with_config(configuration); let resp = api.delete_aws_event_bridge_source(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an Amazon EventBridge source ``` /** * Delete an Amazon EventBridge source returns "Amazon EventBridge source deleted." 
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSIntegrationApi(configuration); const params: v2.AWSIntegrationApiDeleteAWSEventBridgeSourceRequest = { body: { data: { attributes: { accountId: "123456789012", eventGeneratorName: "app-alerts-zyxw3210", region: "us-east-1", }, type: "event_bridge", }, }, }; apiInstance .deleteAWSEventBridgeSource(params) .then((data: v2.AWSEventBridgeDeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
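Note how the create and delete calls fit together: the create request takes the short `event_generator_name` (for example `app-alerts`), the create response returns the fully suffixed `event_source_name` (for example `app-alerts-zyxw3210`), and the v2 delete request expects that suffixed name as its `event_generator_name`. The Python sketch below chains the two v2 calls using the client classes from the examples above. It is a minimal, illustrative example, not part of the official reference; the attribute access on the create response assumes the response model exposes the documented fields (`data.attributes.event_source_name`) directly.

```
"""
Create an Amazon EventBridge source, then delete it using the suffixed
name returned by the create call. Minimal sketch; error handling omitted.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.aws_integration_api import AWSIntegrationApi
from datadog_api_client.v2.model.aws_event_bridge_create_request import AWSEventBridgeCreateRequest
from datadog_api_client.v2.model.aws_event_bridge_create_request_attributes import AWSEventBridgeCreateRequestAttributes
from datadog_api_client.v2.model.aws_event_bridge_create_request_data import AWSEventBridgeCreateRequestData
from datadog_api_client.v2.model.aws_event_bridge_delete_request import AWSEventBridgeDeleteRequest
from datadog_api_client.v2.model.aws_event_bridge_delete_request_attributes import AWSEventBridgeDeleteRequestAttributes
from datadog_api_client.v2.model.aws_event_bridge_delete_request_data import AWSEventBridgeDeleteRequestData
from datadog_api_client.v2.model.aws_event_bridge_type import AWSEventBridgeType

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSIntegrationApi(api_client)

    # Create the source; Datadog combines the generator name with an assigned suffix.
    create_body = AWSEventBridgeCreateRequest(
        data=AWSEventBridgeCreateRequestData(
            attributes=AWSEventBridgeCreateRequestAttributes(
                account_id="123456789012",
                create_event_bus=True,
                event_generator_name="app-alerts",
                region="us-east-1",
            ),
            type=AWSEventBridgeType.EVENT_BRIDGE,
        ),
    )
    created = api_instance.create_aws_event_bridge_source(body=create_body)

    # The response carries the full event source name, e.g. "app-alerts-zyxw3210".
    # (Assumes the response model exposes the documented fields as attributes.)
    source_name = created.data.attributes.event_source_name

    # Delete the source using the suffixed name from the create response.
    delete_body = AWSEventBridgeDeleteRequest(
        data=AWSEventBridgeDeleteRequestData(
            attributes=AWSEventBridgeDeleteRequestAttributes(
                account_id="123456789012",
                event_generator_name=source_name,
                region="us-east-1",
            ),
            type=AWSEventBridgeType.EVENT_BRIDGE,
        ),
    )
    deleted = api_instance.delete_aws_event_bridge_source(body=delete_body)
    print(deleted)
```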
--- # Source: https://docs.datadoghq.com/api/latest/aws-logs-integration/ # AWS Logs Integration Configure your Datadog-AWS-Logs integration directly through Datadog API. For more information, see the [AWS integration page](https://docs.datadoghq.com/integrations/amazon_web_services/#log-collection). ## [List all AWS Logs integrations](https://docs.datadoghq.com/api/latest/aws-logs-integration/#list-all-aws-logs-integrations) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#list-all-aws-logs-integrations-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/aws/logshttps://api.ap2.datadoghq.com/api/v1/integration/aws/logshttps://api.datadoghq.eu/api/v1/integration/aws/logshttps://api.ddog-gov.com/api/v1/integration/aws/logshttps://api.datadoghq.com/api/v1/integration/aws/logshttps://api.us3.datadoghq.com/api/v1/integration/aws/logshttps://api.us5.datadoghq.com/api/v1/integration/aws/logs ### Overview List all Datadog-AWS Logs integrations configured in your Datadog account. This endpoint requires the `aws_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsIntegrations-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsIntegrations-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsIntegrations-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsIntegrations-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Field Type Description account_id string Your AWS Account ID without dashes. lambdas [object] List of ARNs configured in your Datadog account. arn string Available ARN IDs. services [string] Array of services IDs.
``` [ { "account_id": "123456789101", "lambdas": [], "services": [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda" ] } ] ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### List all AWS Logs integrations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all AWS Logs integrations ``` """ List all AWS Logs integrations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.list_aws_logs_integrations() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all AWS Logs integrations ``` # List all AWS Logs integrations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new p api_instance.list_aws_logs_integrations() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then 
save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all AWS Logs integrations ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.aws_logs_integrations_list ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all AWS Logs integrations ``` // List all AWS Logs integrations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.ListAWSLogsIntegrations(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.ListAWSLogsIntegrations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.ListAWSLogsIntegrations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all AWS Logs integrations ``` // List all AWS Logs integrations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSLogsListResponse; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); try { List result = apiInstance.listAWSLogsIntegrations(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#listAWSLogsIntegrations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all AWS Logs integrations ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AwsLogsIntegration.list() ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### List all AWS Logs integrations ``` // List all AWS Logs integrations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_logs_integration::AWSLogsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.list_aws_logs_integrations().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all AWS Logs integrations ``` /** * List all AWS Logs integrations returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); apiInstance .listAWSLogsIntegrations() .then((data: v1.AWSLogsListResponse[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add AWS Log Lambda ARN](https://docs.datadoghq.com/api/latest/aws-logs-integration/#add-aws-log-lambda-arn) * [v1 (latest)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#add-aws-log-lambda-arn-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/aws/logshttps://api.ap2.datadoghq.com/api/v1/integration/aws/logshttps://api.datadoghq.eu/api/v1/integration/aws/logshttps://api.ddog-gov.com/api/v1/integration/aws/logshttps://api.datadoghq.com/api/v1/integration/aws/logshttps://api.us3.datadoghq.com/api/v1/integration/aws/logshttps://api.us5.datadoghq.com/api/v1/integration/aws/logs ### Overview Attach the Lambda ARN of the Lambda created for the Datadog-AWS log collection to your AWS account ID to enable log collection. This endpoint requires the `aws_configuration_edit` permission. ### Request #### Body Data (required) AWS Log Lambda Async request body. * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description account_id [_required_] string Your AWS Account ID without dashes. lambda_arn [_required_] string ARN of the Datadog Lambda created during the Datadog-Amazon Web services Log collection setup. 
``` { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CreateAWSLambdaARN-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CreateAWSLambdaARN-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CreateAWSLambdaARN-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CreateAWSLambdaARN-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Add AWS Log Lambda ARN Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } EOF ``` ##### Add AWS Log Lambda ARN ``` """ Add AWS Log Lambda ARN returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi from datadog_api_client.v1.model.aws_account_and_lambda_request import AWSAccountAndLambdaRequest body = AWSAccountAndLambdaRequest( account_id="1234567", 
lambda_arn="arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.create_aws_lambda_arn(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add AWS Log Lambda ARN ``` # Add AWS Log Lambda ARN returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccountAndLambdaRequest.new({ account_id: "1234567", lambda_arn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }) p api_instance.create_aws_lambda_arn(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add AWS Log Lambda ARN ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": '', "lambda_arn": 'arn:aws:lambda:::function:' } dog.aws_logs_add_lambda(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add AWS Log Lambda ARN ``` // Add AWS Log Lambda ARN returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccountAndLambdaRequest{ AccountId: "1234567", LambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.CreateAWSLambdaARN(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.CreateAWSLambdaARN`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.CreateAWSLambdaARN`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add AWS Log Lambda ARN ``` // Add AWS Log Lambda ARN returns "OK" response import com.datadog.api.client.ApiClient; import 
com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccountAndLambdaRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); AWSAccountAndLambdaRequest body = new AWSAccountAndLambdaRequest() .accountId("1234567") .lambdaArn("arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest"); try { apiInstance.createAWSLambdaARN(body); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#createAWSLambdaARN"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add AWS Log Lambda ARN ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) account_id = "" lambda_arn = "arn:aws:lambda:::function:" api.AwsLogsIntegration.add_log_lambda_arn(account_id=account_id, lambda_arn=lambda_arn) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Add AWS Log Lambda ARN ``` // Add AWS Log Lambda ARN returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_logs_integration::AWSLogsIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccountAndLambdaRequest; #[tokio::main] async fn main() { let body = AWSAccountAndLambdaRequest::new( "1234567".to_string(), "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest".to_string(), ); let configuration = datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.create_aws_lambda_arn(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add AWS Log Lambda ARN ``` /** * Add AWS Log Lambda ARN returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); const params: v1.AWSLogsIntegrationApiCreateAWSLambdaARNRequest = { body: { accountId: "1234567", lambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }, }; apiInstance .createAWSLambdaARN(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an AWS Logs integration](https://docs.datadoghq.com/api/latest/aws-logs-integration/#delete-an-aws-logs-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#delete-an-aws-logs-integration-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/aws/logshttps://api.ap2.datadoghq.com/api/v1/integration/aws/logshttps://api.datadoghq.eu/api/v1/integration/aws/logshttps://api.ddog-gov.com/api/v1/integration/aws/logshttps://api.datadoghq.com/api/v1/integration/aws/logshttps://api.us3.datadoghq.com/api/v1/integration/aws/logshttps://api.us5.datadoghq.com/api/v1/integration/aws/logs ### Overview Delete a Datadog-AWS logs configuration by removing the specific Lambda ARN associated with a given AWS account. This endpoint requires the `aws_configuration_edit` permission. ### Request #### Body Data (required) Delete AWS Lambda ARN request body. * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description account_id [_required_] string Your AWS Account ID without dashes. lambda_arn [_required_] string ARN of the Datadog Lambda created during the Datadog-Amazon Web services Log collection setup. ``` { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#DeleteAWSLambdaARN-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#DeleteAWSLambdaARN-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#DeleteAWSLambdaARN-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#DeleteAWSLambdaARN-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Delete an AWS Logs integration Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } EOF ``` ##### Delete an AWS Logs integration ``` """ Delete an AWS Logs integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi from datadog_api_client.v1.model.aws_account_and_lambda_request import AWSAccountAndLambdaRequest body = AWSAccountAndLambdaRequest( account_id="1234567", lambda_arn="arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.delete_aws_lambda_arn(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an AWS Logs integration ``` # Delete an AWS Logs integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccountAndLambdaRequest.new({ account_id: "1234567", lambda_arn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }) p api_instance.delete_aws_lambda_arn(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AWS Logs integration ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": '', "lambda_arn": 'arn:aws:lambda:::function:' } dog.aws_logs_integration_delete(config) ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an AWS Logs integration ``` // Delete an AWS Logs integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccountAndLambdaRequest{ AccountId: "1234567", LambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.DeleteAWSLambdaARN(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.DeleteAWSLambdaARN`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.DeleteAWSLambdaARN`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an AWS Logs integration ``` // Delete an AWS Logs integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccountAndLambdaRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); AWSAccountAndLambdaRequest body = new AWSAccountAndLambdaRequest() .accountId("1234567") .lambdaArn("arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest"); try { apiInstance.deleteAWSLambdaARN(body); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#deleteAWSLambdaARN"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an AWS Logs integration ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) account_id = "" lambda_arn = "arn:aws:lambda:::function:" api.AwsLogsIntegration.delete_config(account_id=account_id, lambda_arn=lambda_arn) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete an AWS Logs integration ``` // Delete an AWS Logs integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_logs_integration::AWSLogsIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccountAndLambdaRequest; #[tokio::main] async fn main() { let body = AWSAccountAndLambdaRequest::new( "1234567".to_string(), "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest".to_string(), ); let configuration = datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.delete_aws_lambda_arn(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an AWS Logs integration ``` /** * Delete an AWS Logs integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); const params: v1.AWSLogsIntegrationApiDeleteAWSLambdaARNRequest = { body: { accountId: "1234567", lambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }, }; apiInstance .deleteAWSLambdaARN(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services) * [v1](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#get-list-of-aws-log-ready-services-v2) GET https://api.ap1.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.ap2.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.datadoghq.eu/api/v1/integration/aws/logs/serviceshttps://api.ddog-gov.com/api/v1/integration/aws/logs/serviceshttps://api.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.us3.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.us5.datadoghq.com/api/v1/integration/aws/logs/services ### Overview **This endpoint is deprecated - use the V2 endpoint instead.** Get the list of current AWS services that Datadog offers automatic log collection. Use returned service IDs with the services parameter for the Enable an AWS service log collection API endpoint. 
This endpoint requires the `aws_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-200-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description id string Key value in returned object. label string Name of service available for configuration with Datadog logs. ``` [ { "id": "s3", "label": "S3 Access Logs" }, { "id": "elb", "label": "Classic ELB Access Logs" }, { "id": "elbv2", "label": "Application ELB Access Logs" }, { "id": "cloudfront", "label": "CloudFront Access Logs" }, { "id": "redshift", "label": "Redshift Logs" }, { "id": "lambda", "label": "Lambda Cloudwatch Logs" } ] ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Get list of AWS log ready services Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get list of AWS log ready services ``` """ Get list of AWS log ready services returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.list_aws_logs_services() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get list of AWS log ready services ``` # Get list of AWS log ready services returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new p api_instance.list_aws_logs_services() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get list of AWS log ready services ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.aws_logs_list_services ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.ListAWSLogsServices(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.ListAWSLogsServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.ListAWSLogsServices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSLogsListServicesResponse; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); try { List result = apiInstance.listAWSLogsServices(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#listAWSLogsServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get list of AWS log ready services ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AwsLogsIntegration.list_log_services() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_logs_integration::AWSLogsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.list_aws_logs_services().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get list of AWS log ready services ``` /** * Get list of AWS log ready services returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); apiInstance .listAWSLogsServices() .then((data: v1.AWSLogsListServicesResponse[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/integration/aws/logs/serviceshttps://api.ap2.datadoghq.com/api/v2/integration/aws/logs/serviceshttps://api.datadoghq.eu/api/v2/integration/aws/logs/serviceshttps://api.ddog-gov.com/api/v2/integration/aws/logs/serviceshttps://api.datadoghq.com/api/v2/integration/aws/logs/serviceshttps://api.us3.datadoghq.com/api/v2/integration/aws/logs/serviceshttps://api.us5.datadoghq.com/api/v2/integration/aws/logs/services ### Overview Get a list of AWS services that can send logs to Datadog. This endpoint requires the `aws_configuration_read` permission. 
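The IDs returned by this endpoint are the values that the enable endpoints accept in their `services` array. As a rough sketch of how the two calls can be chained with the Python client used in the examples on this page (the response attribute path follows the model described under Response below, and `"1234567"` is a placeholder AWS account ID):

```python
"""
Sketch only: list the log-ready AWS services (v2) and enable automatic log
collection for them on one AWS account (v1). Credentials come from the
DD_API_KEY / DD_APP_KEY / DD_SITE environment variables, as in the other
examples on this page.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_logs_integration_api import (
    AWSLogsIntegrationApi as AWSLogsIntegrationApiV1,
)
from datadog_api_client.v1.model.aws_logs_services_request import AWSLogsServicesRequest
from datadog_api_client.v2.api.aws_logs_integration_api import (
    AWSLogsIntegrationApi as AWSLogsIntegrationApiV2,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    # v2: fetch the list of AWS services that can send logs to Datadog
    services_response = AWSLogsIntegrationApiV2(api_client).list_aws_logs_services()
    service_ids = list(services_response.data.attributes.logs_services)

    # v1: enable automatic log collection for those services
    # ("1234567" is a placeholder AWS account ID)
    body = AWSLogsServicesRequest(account_id="1234567", services=service_ids)
    print(AWSLogsIntegrationApiV1(api_client).enable_aws_log_services(body=body))
```

Note that the deprecated v1 services endpoint above returns a flat array of `id`/`label` objects, while this v2 endpoint nests the IDs under `data.attributes.logs_services`.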
### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-200-v2) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-403-v2) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#ListAWSLogsServices-429-v2) AWS Logs Services List object * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) AWS Logs Services response body Field Type Description data [_required_] object AWS Logs Services response body attributes object AWS Logs Services response body logs_services [_required_] [string] List of AWS services that can send logs to Datadog id [_required_] string The `AWSLogsServicesResponseData` `id`. default: `logs_services` type [_required_] enum The `AWSLogsServicesResponseData` `type`. Allowed enum values: `logs_services` default: `logs_services` ``` { "data": { "attributes": { "logs_services": [ "s3" ] }, "id": "logs_services", "type": "logs_services" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) ##### Get list of AWS log ready services Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/aws/logs/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get list of AWS log ready services ``` """ Get list of AWS log ready services returns "AWS Logs Services List object" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.aws_logs_integration_api import AWSLogsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.list_aws_logs_services() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" 
``` ##### Get list of AWS log ready services ``` # Get list of AWS log ready services returns "AWS Logs Services List object" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::AWSLogsIntegrationAPI.new p api_instance.list_aws_logs_services() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "AWS Logs Services List object" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.ListAWSLogsServices(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.ListAWSLogsServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.ListAWSLogsServices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "AWS Logs Services List object" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.AwsLogsIntegrationApi; import com.datadog.api.client.v2.model.AWSLogsServicesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); try { AWSLogsServicesResponse result = apiInstance.listAWSLogsServices(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#listAWSLogsServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get list of AWS log ready services ``` // Get list of AWS log ready services returns "AWS Logs Services List object" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_aws_logs_integration::AWSLogsIntegrationAPI; #[tokio::main] async fn main() { let configuration = 
datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.list_aws_logs_services().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get list of AWS log ready services ``` /** * Get list of AWS log ready services returns "AWS Logs Services List object" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.AWSLogsIntegrationApi(configuration); apiInstance .listAWSLogsServices() .then((data: v2.AWSLogsServicesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Enable an AWS Logs integration](https://docs.datadoghq.com/api/latest/aws-logs-integration/#enable-an-aws-logs-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#enable-an-aws-logs-integration-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.ap2.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.datadoghq.eu/api/v1/integration/aws/logs/serviceshttps://api.ddog-gov.com/api/v1/integration/aws/logs/serviceshttps://api.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.us3.datadoghq.com/api/v1/integration/aws/logs/serviceshttps://api.us5.datadoghq.com/api/v1/integration/aws/logs/services ### Overview Enable automatic log collection for a list of services. This should be run after running `CreateAWSLambdaARN` to save the configuration. This endpoint requires the `aws_configuration_edit` permission. ### Request #### Body Data (required) Enable AWS Log Services request body. * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description account_id [_required_] string Your AWS Account ID without dashes. services [_required_] [string] Array of services IDs set to enable automatic log collection. Discover the list of available services with the get list of AWS log ready services API endpoint. 
``` { "account_id": "1234567", "services": [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#EnableAWSLogServices-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#EnableAWSLogServices-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#EnableAWSLogServices-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#EnableAWSLogServices-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Enable an AWS Logs integration Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs/services" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "1234567", "services": [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda" ] } EOF ``` ##### Enable an AWS Logs integration ``` """ Enable an AWS Logs integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi from datadog_api_client.v1.model.aws_logs_services_request import AWSLogsServicesRequest body = AWSLogsServicesRequest( account_id="1234567", services=[ "s3", "elb", "elbv2", 
"cloudfront", "redshift", "lambda", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.enable_aws_log_services(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Enable an AWS Logs integration ``` # Enable an AWS Logs integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new body = DatadogAPIClient::V1::AWSLogsServicesRequest.new({ account_id: "1234567", services: [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda", ], }) p api_instance.enable_aws_log_services(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Enable an AWS Logs integration ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": '', "services": ['s3', 'elb', 'elbv2', 'cloudfront', 'redshift', 'lambda'] } dog.aws_logs_save_services(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Enable an AWS Logs integration ``` // Enable an AWS Logs integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSLogsServicesRequest{ AccountId: "1234567", Services: []string{ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.EnableAWSLogServices(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.EnableAWSLogServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.EnableAWSLogServices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Enable an AWS Logs integration ``` // Enable an AWS Logs integration returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSLogsServicesRequest; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); AWSLogsServicesRequest body = new AWSLogsServicesRequest() .accountId("1234567") .services(Arrays.asList("s3", "elb", "elbv2", "cloudfront", "redshift", "lambda")); try { apiInstance.enableAWSLogServices(body); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#enableAWSLogServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Enable an AWS Logs integration ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) account_id = "" services = ["s3", "elb", "elbv2", "cloudfront", "redshift", "lambda"] api.AwsLogsIntegration.save_services(account_id=account_id, services=services) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Enable an AWS Logs integration ``` /** * Enable an AWS Logs integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); const params: v1.AWSLogsIntegrationApiEnableAWSLogServicesRequest = { body: { accountId: "1234567", services: ["s3", "elb", "elbv2", "cloudfront", "redshift", "lambda"], }, }; apiInstance .enableAWSLogServices(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Check permissions for log services](https://docs.datadoghq.com/api/latest/aws-logs-integration/#check-permissions-for-log-services) * [v1 (latest)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#check-permissions-for-log-services-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/aws/logs/services_asynchttps://api.ap2.datadoghq.com/api/v1/integration/aws/logs/services_asynchttps://api.datadoghq.eu/api/v1/integration/aws/logs/services_asynchttps://api.ddog-gov.com/api/v1/integration/aws/logs/services_asynchttps://api.datadoghq.com/api/v1/integration/aws/logs/services_asynchttps://api.us3.datadoghq.com/api/v1/integration/aws/logs/services_asynchttps://api.us5.datadoghq.com/api/v1/integration/aws/logs/services_async ### Overview Test if permissions are present to add log-forwarding triggers for the given services and AWS account. The input is the same as for `EnableAWSLogServices`. The check runs asynchronously, so this endpoint can be polled repeatedly in a non-blocking fashion until the request completes (see the polling sketch below). * Returns a status of `created` when it's checking if the permissions exist in the AWS account. * Returns a status of `waiting` while checking. * Returns a status of `checked and ok` if the Lambda exists. * Returns a status of `error` if the Lambda does not exist. This endpoint requires the `aws_configuration_read` permission. ### Request #### Body Data (required) Check AWS Logs Async Services request body.
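Because the check is asynchronous, a caller typically submits the request once and then re-sends the same body until the returned status leaves `created`/`waiting`. The sketch below is illustrative rather than one of the generated client examples: it uses the Python client shown later in this section, an assumed two-second retry interval, and `to_dict()` only to read the `status` and `errors` fields.

```python
# Illustrative polling loop for CheckAWSLogsServicesAsync (not an official example).
# Assumes DD_API_KEY and DD_APP_KEY are set in the environment, as in the examples below.
import time

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi
from datadog_api_client.v1.model.aws_logs_services_request import AWSLogsServicesRequest

body = AWSLogsServicesRequest(
    account_id="1234567",
    services=["s3", "elb", "elbv2", "cloudfront", "redshift", "lambda"],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AWSLogsIntegrationApi(api_client)
    while True:
        result = api_instance.check_aws_logs_services_async(body=body).to_dict()
        status = result.get("status")  # "created", "waiting", "checked and ok", or "error"
        if status not in ("created", "waiting"):
            break
        time.sleep(2)  # assumed retry interval; poll intermittently instead of blocking
    print(status, result.get("errors", []))
```

The same loop shape applies to the Lambda check endpoint further down this page; only the request model changes.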
* [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description account_id [_required_] string Your AWS Account ID without dashes. services [_required_] [string] Array of services IDs set to enable automatic log collection. Discover the list of available services with the get list of AWS log ready services API endpoint. ``` { "account_id": "1234567", "services": [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsServicesAsync-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsServicesAsync-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsServicesAsync-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsServicesAsync-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) A list of all Datadog-AWS logs integrations available in your Datadog organization. Field Type Description errors [object] List of errors. code string Code properties message string Message content. status string Status of the properties. ``` { "errors": [ { "code": "no_such_config", "message": "AWS account 12345 has no Lambda config to update" } ], "status": "created" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Check permissions for log services Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs/services_async" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "1234567", "services": [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda" ] } EOF ``` ##### Check permissions for log services ``` """ Check permissions for log services returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi from datadog_api_client.v1.model.aws_logs_services_request import AWSLogsServicesRequest body = AWSLogsServicesRequest( account_id="1234567", services=[ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.check_aws_logs_services_async(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Check permissions for log services ``` # Check permissions for log services returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new body = DatadogAPIClient::V1::AWSLogsServicesRequest.new({ account_id: "1234567", services: [ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda", ], }) p api_instance.check_aws_logs_services_async(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check permissions for log services ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": '', "services": ['s3', 'elb', 'elbv2', 'cloudfront', 'redshift', 'lambda'] } 
dog.aws_logs_check_services(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check permissions for log services ``` // Check permissions for log services returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSLogsServicesRequest{ AccountId: "1234567", Services: []string{ "s3", "elb", "elbv2", "cloudfront", "redshift", "lambda", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.CheckAWSLogsServicesAsync(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.CheckAWSLogsServicesAsync`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.CheckAWSLogsServicesAsync`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Check permissions for log services ``` // Check permissions for log services returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSLogsAsyncResponse; import com.datadog.api.client.v1.model.AWSLogsServicesRequest; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); AWSLogsServicesRequest body = new AWSLogsServicesRequest() .accountId("1234567") .services(Arrays.asList("s3", "elb", "elbv2", "cloudfront", "redshift", "lambda")); try { AWSLogsAsyncResponse result = apiInstance.checkAWSLogsServicesAsync(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#checkAWSLogsServicesAsync"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Check permissions for log services ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) 
account_id = "" services = ["s3", "elb", "elbv2", "cloudfront", "redshift", "lambda"] api.AwsLogsIntegration.check_services(account_id=account_id, services=services) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Check permissions for log services ``` /** * Check permissions for log services returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); const params: v1.AWSLogsIntegrationApiCheckAWSLogsServicesAsyncRequest = { body: { accountId: "1234567", services: ["s3", "elb", "elbv2", "cloudfront", "redshift", "lambda"], }, }; apiInstance .checkAWSLogsServicesAsync(params) .then((data: v1.AWSLogsAsyncResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Check that an AWS Lambda Function exists](https://docs.datadoghq.com/api/latest/aws-logs-integration/#check-that-an-aws-lambda-function-exists) * [v1 (latest)](https://docs.datadoghq.com/api/latest/aws-logs-integration/#check-that-an-aws-lambda-function-exists-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/aws/logs/check_asynchttps://api.ap2.datadoghq.com/api/v1/integration/aws/logs/check_asynchttps://api.datadoghq.eu/api/v1/integration/aws/logs/check_asynchttps://api.ddog-gov.com/api/v1/integration/aws/logs/check_asynchttps://api.datadoghq.com/api/v1/integration/aws/logs/check_asynchttps://api.us3.datadoghq.com/api/v1/integration/aws/logs/check_asynchttps://api.us5.datadoghq.com/api/v1/integration/aws/logs/check_async ### Overview Test if permissions are present to add log-forwarding triggers for the given services and AWS account. The input is the same as for Enable an AWS service log collection. Subsequent requests will always repeat the above, so this endpoint can be polled intermittently instead of blocking. * Returns a status of `created` when it's checking if the Lambda exists in the account. * Returns a status of `waiting` while checking. * Returns a status of `checked and ok` if the Lambda exists. * Returns a status of `error` if the Lambda does not exist. This endpoint requires the `aws_configuration_read` permission. ### Request #### Body Data (required) Check AWS Log Lambda Async request body. * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Expand All Field Type Description account_id [_required_] string Your AWS Account ID without dashes. lambda_arn [_required_] string ARN of the Datadog Lambda created during the Datadog-Amazon Web Services log collection setup. ``` { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsLambdaAsync-200-v1) * [400](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsLambdaAsync-400-v1) * [403](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsLambdaAsync-403-v1) * [429](https://docs.datadoghq.com/api/latest/aws-logs-integration/#CheckAWSLogsLambdaAsync-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) A list of all Datadog-AWS logs integrations available in your Datadog organization. Field Type Description errors [object] List of errors. code string Code properties message string Message content. status string Status of the properties.
``` { "errors": [ { "code": "no_such_config", "message": "AWS account 12345 has no Lambda config to update" } ], "status": "created" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [Example](https://docs.datadoghq.com/api/latest/aws-logs-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/aws-logs-integration/?code-lang=python-legacy) ##### Check that an AWS Lambda Function exists Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/aws/logs/check_async" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "account_id": "1234567", "lambda_arn": "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest" } EOF ``` ##### Check that an AWS Lambda Function exists ``` """ Check that an AWS Lambda Function exists returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.aws_logs_integration_api import AWSLogsIntegrationApi from datadog_api_client.v1.model.aws_account_and_lambda_request import AWSAccountAndLambdaRequest body = AWSAccountAndLambdaRequest( account_id="1234567", lambda_arn="arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AWSLogsIntegrationApi(api_client) response = api_instance.check_aws_logs_lambda_async(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Check that an AWS Lambda Function exists ``` # Check that an AWS Lambda Function exists returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AWSLogsIntegrationAPI.new body = DatadogAPIClient::V1::AWSAccountAndLambdaRequest.new({ account_id: "1234567", lambda_arn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }) p api_instance.check_aws_logs_lambda_async(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check that an AWS Lambda Function exists ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "account_id": '', "lambda_arn": 'arn:aws:lambda:::function:' } dog.aws_logs_check_lambda(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check that an AWS Lambda Function exists ``` // Check that an AWS Lambda Function exists returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AWSAccountAndLambdaRequest{ AccountId: "1234567", LambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAWSLogsIntegrationApi(apiClient) resp, r, err := api.CheckAWSLogsLambdaAsync(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AWSLogsIntegrationApi.CheckAWSLogsLambdaAsync`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AWSLogsIntegrationApi.CheckAWSLogsLambdaAsync`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Check that an AWS Lambda Function exists ``` // Check that an AWS Lambda Function exists returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AwsLogsIntegrationApi; import com.datadog.api.client.v1.model.AWSAccountAndLambdaRequest; import com.datadog.api.client.v1.model.AWSLogsAsyncResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); 
AwsLogsIntegrationApi apiInstance = new AwsLogsIntegrationApi(defaultClient); AWSAccountAndLambdaRequest body = new AWSAccountAndLambdaRequest() .accountId("1234567") .lambdaArn("arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest"); try { AWSLogsAsyncResponse result = apiInstance.checkAWSLogsLambdaAsync(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AwsLogsIntegrationApi#checkAWSLogsLambdaAsync"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Check that an AWS Lambda Function exists ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) account_id = "" lambda_arn = "arn:aws:lambda:::function:" api.AwsLogsIntegration.check_lambda(account_id=account_id, lambda_arn=lambda_arn) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Check that an AWS Lambda Function exists ``` // Check that an AWS Lambda Function exists returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_aws_logs_integration::AWSLogsIntegrationAPI; use datadog_api_client::datadogV1::model::AWSAccountAndLambdaRequest; #[tokio::main] async fn main() { let body = AWSAccountAndLambdaRequest::new( "1234567".to_string(), "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest".to_string(), ); let configuration = datadog::Configuration::new(); let api = AWSLogsIntegrationAPI::with_config(configuration); let resp = api.check_aws_logs_lambda_async(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Check that an AWS Lambda Function exists ``` /** * Check that an AWS Lambda Function exists returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AWSLogsIntegrationApi(configuration); const params: v1.AWSLogsIntegrationApiCheckAWSLogsLambdaAsyncRequest = { body: { accountId: "1234567", lambdaArn: "arn:aws:lambda:us-east-1:1234567:function:LogsCollectionAPITest", }, }; apiInstance .checkAWSLogsLambdaAsync(params) .then((data: v1.AWSLogsAsyncResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/azure-integration/ # Azure Integration Configure your Datadog-Azure integration directly through the Datadog API. For more information, see the [Datadog-Azure integration page](https://docs.datadoghq.com/integrations/azure). ## [List all Azure integrations](https://docs.datadoghq.com/api/latest/azure-integration/#list-all-azure-integrations) * [v1 (latest)](https://docs.datadoghq.com/api/latest/azure-integration/#list-all-azure-integrations-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/azurehttps://api.ap2.datadoghq.com/api/v1/integration/azurehttps://api.datadoghq.eu/api/v1/integration/azurehttps://api.ddog-gov.com/api/v1/integration/azurehttps://api.datadoghq.com/api/v1/integration/azurehttps://api.us3.datadoghq.com/api/v1/integration/azurehttps://api.us5.datadoghq.com/api/v1/integration/azure ### Overview List all Datadog-Azure integrations configured in your Datadog account. This endpoint requires the `azure_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/azure-integration/#ListAzureIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/azure-integration/#ListAzureIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/azure-integration/#ListAzureIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/azure-integration/#ListAzureIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Accounts configured for your organization.
Field Type Description app_service_plan_filters string Limit the Azure app service plans that are pulled into Datadog using tags. Only app service plans that match one of the defined tags are imported into Datadog. automute boolean Silence monitors for expected Azure VM shutdowns. client_id string Your Azure web application ID. client_secret string Your Azure web application secret key. container_app_filters string Limit the Azure container apps that are pulled into Datadog using tags. Only container apps that match one of the defined tags are imported into Datadog. cspm_enabled boolean When enabled, Datadog’s Cloud Security Management product scans resource configurations monitored by this app registration. Note: This requires resource_collection_enabled to be set to true. custom_metrics_enabled boolean Enable custom metrics for your organization. errors [string] Errors in your configuration. host_filters string Limit the Azure instances that are pulled into Datadog by using tags. Only hosts that match one of the defined tags are imported into Datadog. metrics_enabled boolean Enable Azure metrics for your organization. metrics_enabled_default boolean Enable Azure metrics for your organization for resource providers where no resource provider config is specified. new_client_id string Your New Azure web application ID. new_tenant_name string Your New Azure Active Directory ID. resource_collection_enabled boolean When enabled, Datadog collects metadata and configuration info from cloud resources (compute instances, databases, load balancers, etc.) monitored by this app registration. resource_provider_configs [object] Configuration settings applied to resources from the specified Azure resource providers. metrics_enabled boolean Collect metrics for resources from this provider. namespace string The provider namespace to apply this configuration to. tenant_name string Your Azure Active Directory ID. usage_metrics_enabled boolean Enable azure.usage metrics for your organization. ``` { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "testc7f6-1234-5678-9101-3fcbf464test", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "metrics_enabled": true, "metrics_enabled_default": true, "new_client_id": "new1c7f6-1234-5678-9101-3fcbf464test", "new_tenant_name": "new1c44-1234-5678-9101-cc00736ftest", "resource_collection_enabled": true, "resource_provider_configs": [ { "metrics_enabled": true, "namespace": "Microsoft.Compute" } ], "tenant_name": "testc44-1234-5678-9101-cc00736ftest", "usage_metrics_enabled": true } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python-legacy) ##### List all Azure integrations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/azure" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all Azure integrations ``` """ List all Azure integrations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AzureIntegrationApi(api_client) response = api_instance.list_azure_integration() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all Azure integrations ``` # List all Azure integrations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AzureIntegrationAPI.new p api_instance.list_azure_integration() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all Azure integrations ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.azure_integration_list ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all Azure integrations ``` // List 
all Azure integrations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAzureIntegrationApi(apiClient) resp, r, err := api.ListAzureIntegration(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AzureIntegrationApi.ListAzureIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AzureIntegrationApi.ListAzureIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all Azure integrations ``` // List all Azure integrations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AzureIntegrationApi; import com.datadog.api.client.v1.model.AzureAccount; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AzureIntegrationApi apiInstance = new AzureIntegrationApi(defaultClient); try { List<AzureAccount> result = apiInstance.listAzureIntegration(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling AzureIntegrationApi#listAzureIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all Azure integrations ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AzureIntegration.list() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### List all Azure integrations ``` // List all Azure integrations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_azure_integration::AzureIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = AzureIntegrationAPI::with_config(configuration); let resp = api.list_azure_integration().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all Azure integrations ``` /** * List all Azure integrations returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AzureIntegrationApi(configuration); apiInstance .listAzureIntegration() .then((data: v1.AzureAccount[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an Azure integration](https://docs.datadoghq.com/api/latest/azure-integration/#create-an-azure-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/azure-integration/#create-an-azure-integration-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/azurehttps://api.ap2.datadoghq.com/api/v1/integration/azurehttps://api.datadoghq.eu/api/v1/integration/azurehttps://api.ddog-gov.com/api/v1/integration/azurehttps://api.datadoghq.com/api/v1/integration/azurehttps://api.us3.datadoghq.com/api/v1/integration/azurehttps://api.us5.datadoghq.com/api/v1/integration/azure ### Overview Create a Datadog-Azure integration. Using the `POST` method updates your integration configuration by adding your new configuration to the existing one in your Datadog organization. Using the `PUT` method updates your integration configuration by replacing your current configuration with the new one sent to your Datadog organization. This endpoint requires the `azure_configurations_manage` permission. ### Request #### Body Data (required) Create a Datadog-Azure integration for your Datadog account request body. * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Field Type Description app_service_plan_filters string Limit the Azure app service plans that are pulled into Datadog using tags. Only app service plans that match one of the defined tags are imported into Datadog. automute boolean Silence monitors for expected Azure VM shutdowns. client_id string Your Azure web application ID. client_secret string Your Azure web application secret key. container_app_filters string Limit the Azure container apps that are pulled into Datadog using tags. Only container apps that match one of the defined tags are imported into Datadog. cspm_enabled boolean When enabled, Datadog’s Cloud Security Management product scans resource configurations monitored by this app registration. Note: This requires resource_collection_enabled to be set to true. custom_metrics_enabled boolean Enable custom metrics for your organization. errors [string] Errors in your configuration. host_filters string Limit the Azure instances that are pulled into Datadog by using tags. Only hosts that match one of the defined tags are imported into Datadog. 
metrics_enabled boolean Enable Azure metrics for your organization. metrics_enabled_default boolean Enable Azure metrics for your organization for resource providers where no resource provider config is specified. new_client_id string Your New Azure web application ID. new_tenant_name string Your New Azure Active Directory ID. resource_collection_enabled boolean When enabled, Datadog collects metadata and configuration info from cloud resources (compute instances, databases, load balancers, etc.) monitored by this app registration. resource_provider_configs [object] Configuration settings applied to resources from the specified Azure resource providers. metrics_enabled boolean Collect metrics for resources from this provider. namespace string The provider namespace to apply this configuration to. tenant_name string Your Azure Active Directory ID. usage_metrics_enabled boolean Enable azure.usage metrics for your organization. ``` { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "new_client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "new_tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "resource_collection_enabled": true, "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/azure-integration/#CreateAzureIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/azure-integration/#CreateAzureIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/azure-integration/#CreateAzureIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/azure-integration/#CreateAzureIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby-legacy) ##### Create an Azure integration returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/azure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "new_client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "new_tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "resource_collection_enabled": true, "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } EOF ``` ##### Create an Azure integration returns "OK" response ``` // Create an Azure integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AzureAccount{ AppServicePlanFilters: datadog.PtrString("key:value,filter:example"), Automute: datadog.PtrBool(true), ClientId: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), ClientSecret: datadog.PtrString("TestingRh2nx664kUy5dIApvM54T4AtO"), ContainerAppFilters: datadog.PtrString("key:value,filter:example"), CspmEnabled: datadog.PtrBool(true), CustomMetricsEnabled: datadog.PtrBool(true), Errors: []string{ "*", }, HostFilters: datadog.PtrString("key:value,filter:example"), NewClientId: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), NewTenantName: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), ResourceCollectionEnabled: datadog.PtrBool(true), TenantName: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAzureIntegrationApi(apiClient) resp, r, err := api.CreateAzureIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AzureIntegrationApi.CreateAzureIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AzureIntegrationApi.CreateAzureIntegration`:\n%s\n", 
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an Azure integration returns "OK" response ``` // Create an Azure integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AzureIntegrationApi; import com.datadog.api.client.v1.model.AzureAccount; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AzureIntegrationApi apiInstance = new AzureIntegrationApi(defaultClient); AzureAccount body = new AzureAccount() .appServicePlanFilters("key:value,filter:example") .automute(true) .clientId("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .clientSecret("TestingRh2nx664kUy5dIApvM54T4AtO") .containerAppFilters("key:value,filter:example") .cspmEnabled(true) .customMetricsEnabled(true) .errors(Collections.singletonList("*")) .hostFilters("key:value,filter:example") .newClientId("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .newTenantName("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .resourceCollectionEnabled(true) .tenantName("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"); try { apiInstance.createAzureIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling AzureIntegrationApi#createAzureIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an Azure integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AzureIntegration.create( tenant_name="", host_filters=":,:", client_id="", client_secret="" ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Create an Azure integration returns "OK" response ``` """ Create an Azure integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi from datadog_api_client.v1.model.azure_account import AzureAccount body = AzureAccount( app_service_plan_filters="key:value,filter:example", automute=True, client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", client_secret="TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters="key:value,filter:example", cspm_enabled=True, custom_metrics_enabled=True, errors=[ "*", ], host_filters="key:value,filter:example", 
new_client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", new_tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resource_collection_enabled=True, tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AzureIntegrationApi(api_client) response = api_instance.create_azure_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an Azure integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' config = { "tenant_name": "", "client_id": "", "client_secret": "", "host_filters": ":,:" } dog = Dogapi::Client.new(api_key, app_key) dog.azure_integration_create(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an Azure integration returns "OK" response ``` # Create an Azure integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AzureIntegrationAPI.new body = DatadogAPIClient::V1::AzureAccount.new({ app_service_plan_filters: "key:value,filter:example", automute: true, client_id: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", client_secret: "TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters: "key:value,filter:example", cspm_enabled: true, custom_metrics_enabled: true, errors: [ "*", ], host_filters: "key:value,filter:example", new_client_id: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", new_tenant_name: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resource_collection_enabled: true, tenant_name: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }) p api_instance.create_azure_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an Azure integration returns "OK" response ``` // Create an Azure integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_azure_integration::AzureIntegrationAPI; use datadog_api_client::datadogV1::model::AzureAccount; #[tokio::main] async fn main() { let body = AzureAccount::new() .app_service_plan_filters("key:value,filter:example".to_string()) .automute(true) .client_id("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d".to_string()) .client_secret("TestingRh2nx664kUy5dIApvM54T4AtO".to_string()) .container_app_filters("key:value,filter:example".to_string()) .cspm_enabled(true) .custom_metrics_enabled(true) .errors(vec!["*".to_string()]) .host_filters("key:value,filter:example".to_string()) .new_client_id("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d".to_string()) .new_tenant_name("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d".to_string()) .resource_collection_enabled(true) .tenant_name("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d".to_string()); let configuration = datadog::Configuration::new(); let api =
AzureIntegrationAPI::with_config(configuration); let resp = api.create_azure_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an Azure integration returns "OK" response ``` /** * Create an Azure integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AzureIntegrationApi(configuration); const params: v1.AzureIntegrationApiCreateAzureIntegrationRequest = { body: { appServicePlanFilters: "key:value,filter:example", automute: true, clientId: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", clientSecret: "TestingRh2nx664kUy5dIApvM54T4AtO", containerAppFilters: "key:value,filter:example", cspmEnabled: true, customMetricsEnabled: true, errors: ["*"], hostFilters: "key:value,filter:example", newClientId: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", newTenantName: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resourceCollectionEnabled: true, tenantName: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }, }; apiInstance .createAzureIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an Azure integration](https://docs.datadoghq.com/api/latest/azure-integration/#delete-an-azure-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/azure-integration/#delete-an-azure-integration-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/azurehttps://api.ap2.datadoghq.com/api/v1/integration/azurehttps://api.datadoghq.eu/api/v1/integration/azurehttps://api.ddog-gov.com/api/v1/integration/azurehttps://api.datadoghq.com/api/v1/integration/azurehttps://api.us3.datadoghq.com/api/v1/integration/azurehttps://api.us5.datadoghq.com/api/v1/integration/azure ### Overview Delete a given Datadog-Azure integration from your Datadog account. This endpoint requires the `azure_configurations_manage` permission. ### Request #### Body Data (required) Delete a given Datadog-Azure integration request body. * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Field Type Description app_service_plan_filters string Limit the Azure app service plans that are pulled into Datadog using tags. Only app service plans that match one of the defined tags are imported into Datadog. automute boolean Silence monitors for expected Azure VM shutdowns. client_id string Your Azure web application ID. client_secret string Your Azure web application secret key. container_app_filters string Limit the Azure container apps that are pulled into Datadog using tags. 
Only container apps that match one of the defined tags are imported into Datadog. cspm_enabled boolean When enabled, Datadog’s Cloud Security Management product scans resource configurations monitored by this app registration. Note: This requires resource_collection_enabled to be set to true. custom_metrics_enabled boolean Enable custom metrics for your organization. errors [string] Errors in your configuration. host_filters string Limit the Azure instances that are pulled into Datadog by using tags. Only hosts that match one of the defined tags are imported into Datadog. metrics_enabled boolean Enable Azure metrics for your organization. metrics_enabled_default boolean Enable Azure metrics for your organization for resource providers where no resource provider config is specified. new_client_id string Your New Azure web application ID. new_tenant_name string Your New Azure Active Directory ID. resource_collection_enabled boolean When enabled, Datadog collects metadata and configuration info from cloud resources (compute instances, databases, load balancers, etc.) monitored by this app registration. resource_provider_configs [object] Configuration settings applied to resources from the specified Azure resource providers. metrics_enabled boolean Collect metrics for resources from this provider. namespace string The provider namespace to apply this configuration to. tenant_name string Your Azure Active Directory ID. usage_metrics_enabled boolean Enable azure.usage metrics for your organization. ``` { "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/azure-integration/#DeleteAzureIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/azure-integration/#DeleteAzureIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/azure-integration/#DeleteAzureIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/azure-integration/#DeleteAzureIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby-legacy) ##### Delete an Azure integration returns "OK" response Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/azure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } EOF ``` ##### Delete an Azure integration returns "OK" response ``` // Delete an Azure integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AzureAccount{ ClientId: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), TenantName: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAzureIntegrationApi(apiClient) resp, r, err := api.DeleteAzureIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AzureIntegrationApi.DeleteAzureIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AzureIntegrationApi.DeleteAzureIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an Azure integration returns "OK" response ``` // Delete an Azure integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AzureIntegrationApi; import com.datadog.api.client.v1.model.AzureAccount; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AzureIntegrationApi apiInstance = new AzureIntegrationApi(defaultClient); AzureAccount body = new AzureAccount() .clientId("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .tenantName("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"); try { 
apiInstance.deleteAzureIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling AzureIntegrationApi#deleteAzureIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an Azure integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AzureIntegration.delete( tenant_name="", client_id="" ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete an Azure integration returns "OK" response ``` """ Delete an Azure integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi from datadog_api_client.v1.model.azure_account import AzureAccount body = AzureAccount( client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AzureIntegrationApi(api_client) response = api_instance.delete_azure_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an Azure integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "tenant_name": '', "client_id": '' } dog.azure_integration_delete(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an Azure integration returns "OK" response ``` # Delete an Azure integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AzureIntegrationAPI.new body = DatadogAPIClient::V1::AzureAccount.new({ client_id: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", tenant_name: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }) p api_instance.delete_azure_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an Azure integration returns "OK" response ``` // Delete an Azure integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_azure_integration::AzureIntegrationAPI; use datadog_api_client::datadogV1::model::AzureAccount; #[tokio::main] async fn main() { let body = AzureAccount::new() .client_id("".to_string()) .tenant_name("".to_string()); let configuration = datadog::Configuration::new(); let api = AzureIntegrationAPI::with_config(configuration); let resp = api.delete_azure_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an Azure integration returns "OK" response ``` /** * Delete an Azure integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AzureIntegrationApi(configuration); const params: v1.AzureIntegrationApiDeleteAzureIntegrationRequest = { body: { clientId: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", tenantName: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }, }; apiInstance .deleteAzureIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an Azure integration](https://docs.datadoghq.com/api/latest/azure-integration/#update-an-azure-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/azure-integration/#update-an-azure-integration-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/azurehttps://api.ap2.datadoghq.com/api/v1/integration/azurehttps://api.datadoghq.eu/api/v1/integration/azurehttps://api.ddog-gov.com/api/v1/integration/azurehttps://api.datadoghq.com/api/v1/integration/azurehttps://api.us3.datadoghq.com/api/v1/integration/azurehttps://api.us5.datadoghq.com/api/v1/integration/azure ### Overview Update a Datadog-Azure integration. Requires an existing `tenant_name` and `client_id`. Any other fields supplied will overwrite existing values. To overwrite `tenant_name` or `client_id`, use `new_tenant_name` and `new_client_id`. To leave a field unchanged, do not supply that field in the payload. This endpoint requires the `azure_configuration_edit` permission. ### Request #### Body Data (required) Update a Datadog-Azure integration request body. 
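To make the partial-update behaviour described above concrete, here is a minimal sketch (not one of the official per-language examples below; it reuses the same `datadog_api_client` Python calls shown further down, and the filter value is illustrative) that changes only `host_filters` for an existing tenant/client pair and leaves every other field untouched:

```python
"""
Minimal sketch: update only host_filters on an existing Azure integration.
Any field omitted from the request body is left unchanged by the API.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi
from datadog_api_client.v1.model.azure_account import AzureAccount

body = AzureAccount(
    client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d",    # existing client_id
    tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d",  # existing tenant_name
    host_filters="env:prod,team:infra",                  # illustrative replacement filter list
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = AzureIntegrationApi(api_client)
    api_instance.update_azure_integration(body=body)
```

Renaming the credentials follows the same pattern: send the existing pair plus `new_client_id` / `new_tenant_name`.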
* [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Field Type Description app_service_plan_filters string Limit the Azure app service plans that are pulled into Datadog using tags. Only app service plans that match one of the defined tags are imported into Datadog. automute boolean Silence monitors for expected Azure VM shutdowns. client_id string Your Azure web application ID. client_secret string Your Azure web application secret key. container_app_filters string Limit the Azure container apps that are pulled into Datadog using tags. Only container apps that match one of the defined tags are imported into Datadog. cspm_enabled boolean When enabled, Datadog’s Cloud Security Management product scans resource configurations monitored by this app registration. Note: This requires resource_collection_enabled to be set to true. custom_metrics_enabled boolean Enable custom metrics for your organization. errors [string] Errors in your configuration. host_filters string Limit the Azure instances that are pulled into Datadog by using tags. Only hosts that match one of the defined tags are imported into Datadog. metrics_enabled boolean Enable Azure metrics for your organization. metrics_enabled_default boolean Enable Azure metrics for your organization for resource providers where no resource provider config is specified. new_client_id string Your New Azure web application ID. new_tenant_name string Your New Azure Active Directory ID. resource_collection_enabled boolean When enabled, Datadog collects metadata and configuration info from cloud resources (compute instances, databases, load balancers, etc.) monitored by this app registration. resource_provider_configs [object] Configuration settings applied to resources from the specified Azure resource providers. metrics_enabled boolean Collect metrics for resources from this provider. namespace string The provider namespace to apply this configuration to. tenant_name string Your Azure Active Directory ID. usage_metrics_enabled boolean Enable azure.usage metrics for your organization. ``` { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "new_client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "new_tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "resource_collection_enabled": true, "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=typescript) ##### Update an Azure integration returns "OK" response Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/azure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "new_client_id": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "new_tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "resource_collection_enabled": true, "tenant_name": "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d" } EOF ``` ##### Update an Azure integration returns "OK" response ``` // Update an Azure integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AzureAccount{ AppServicePlanFilters: datadog.PtrString("key:value,filter:example"), Automute: datadog.PtrBool(true), ClientId: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), ClientSecret: datadog.PtrString("TestingRh2nx664kUy5dIApvM54T4AtO"), ContainerAppFilters: datadog.PtrString("key:value,filter:example"), CspmEnabled: datadog.PtrBool(true), CustomMetricsEnabled: datadog.PtrBool(true), Errors: []string{ "*", }, HostFilters: datadog.PtrString("key:value,filter:example"), NewClientId: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), NewTenantName: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), ResourceCollectionEnabled: datadog.PtrBool(true), TenantName: datadog.PtrString("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"), } ctx := 
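	// NewDefaultContext reads the DD_API_KEY / DD_APP_KEY credentials (see the run instructions below) from the environment.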
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAzureIntegrationApi(apiClient) resp, r, err := api.UpdateAzureIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AzureIntegrationApi.UpdateAzureIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AzureIntegrationApi.UpdateAzureIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an Azure integration returns "OK" response ``` // Update an Azure integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AzureIntegrationApi; import com.datadog.api.client.v1.model.AzureAccount; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AzureIntegrationApi apiInstance = new AzureIntegrationApi(defaultClient); AzureAccount body = new AzureAccount() .appServicePlanFilters("key:value,filter:example") .automute(true) .clientId("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .clientSecret("TestingRh2nx664kUy5dIApvM54T4AtO") .containerAppFilters("key:value,filter:example") .cspmEnabled(true) .customMetricsEnabled(true) .errors(Collections.singletonList("*")) .hostFilters("key:value,filter:example") .newClientId("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .newTenantName("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .resourceCollectionEnabled(true) .tenantName("9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d"); try { apiInstance.updateAzureIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling AzureIntegrationApi#updateAzureIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an Azure integration returns "OK" response ``` """ Update an Azure integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi from datadog_api_client.v1.model.azure_account import AzureAccount body = AzureAccount( app_service_plan_filters="key:value,filter:example", automute=True, client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", client_secret="TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters="key:value,filter:example", cspm_enabled=True, custom_metrics_enabled=True, errors=[ "*", ], host_filters="key:value,filter:example", new_client_id="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", 
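    # Supplying new_client_id / new_tenant_name overwrites the stored client_id / tenant_name, as noted in the overview above.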
new_tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resource_collection_enabled=True, tenant_name="9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AzureIntegrationApi(api_client) response = api_instance.update_azure_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an Azure integration returns "OK" response ``` # Update an Azure integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AzureIntegrationAPI.new body = DatadogAPIClient::V1::AzureAccount.new({ app_service_plan_filters: "key:value,filter:example", automute: true, client_id: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", client_secret: "TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters: "key:value,filter:example", cspm_enabled: true, custom_metrics_enabled: true, errors: [ "*", ], host_filters: "key:value,filter:example", new_client_id: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", new_tenant_name: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resource_collection_enabled: true, tenant_name: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }) p api_instance.update_azure_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an Azure integration returns "OK" response ``` // Update an Azure integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_azure_integration::AzureIntegrationAPI; use datadog_api_client::datadogV1::model::AzureAccount; #[tokio::main] async fn main() { let body = AzureAccount::new() .app_service_plan_filters("key:value,filter:example".to_string()) .automute(true) .client_id("".to_string()) .client_secret("TestingRh2nx664kUy5dIApvM54T4AtO".to_string()) .container_app_filters("key:value,filter:example".to_string()) .cspm_enabled(true) .custom_metrics_enabled(true) .errors(vec!["*".to_string()]) .host_filters("key:value,filter:example".to_string()) .new_client_id("".to_string()) .new_tenant_name("".to_string()) .resource_collection_enabled(true) .tenant_name("".to_string()); let configuration = datadog::Configuration::new(); let api = AzureIntegrationAPI::with_config(configuration); let resp = api.update_azure_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an Azure integration returns "OK" response ``` /** * Update an Azure integration returns "OK" response */ import { client, v1 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AzureIntegrationApi(configuration); const params: v1.AzureIntegrationApiUpdateAzureIntegrationRequest = { body: { appServicePlanFilters: "key:value,filter:example", automute: true, clientId: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", clientSecret: "TestingRh2nx664kUy5dIApvM54T4AtO", containerAppFilters: "key:value,filter:example", cspmEnabled: true, customMetricsEnabled: true, errors: ["*"], hostFilters: "key:value,filter:example", newClientId: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", newTenantName: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", resourceCollectionEnabled: true, tenantName: "9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", }, }; apiInstance .updateAzureIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Azure integration host filters](https://docs.datadoghq.com/api/latest/azure-integration/#update-azure-integration-host-filters) * [v1 (latest)](https://docs.datadoghq.com/api/latest/azure-integration/#update-azure-integration-host-filters-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/azure/host_filtershttps://api.ap2.datadoghq.com/api/v1/integration/azure/host_filtershttps://api.datadoghq.eu/api/v1/integration/azure/host_filtershttps://api.ddog-gov.com/api/v1/integration/azure/host_filtershttps://api.datadoghq.com/api/v1/integration/azure/host_filtershttps://api.us3.datadoghq.com/api/v1/integration/azure/host_filtershttps://api.us5.datadoghq.com/api/v1/integration/azure/host_filters ### Overview Update the defined list of host filters for a given Datadog-Azure integration. This endpoint requires the `azure_configuration_edit` permission. ### Request #### Body Data (required) Update a Datadog-Azure integration’s host filters request body. * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Field Type Description app_service_plan_filters string Limit the Azure app service plans that are pulled into Datadog using tags. Only app service plans that match one of the defined tags are imported into Datadog. automute boolean Silence monitors for expected Azure VM shutdowns. client_id string Your Azure web application ID. client_secret string Your Azure web application secret key. container_app_filters string Limit the Azure container apps that are pulled into Datadog using tags. Only container apps that match one of the defined tags are imported into Datadog. cspm_enabled boolean When enabled, Datadog’s Cloud Security Management product scans resource configurations monitored by this app registration. Note: This requires resource_collection_enabled to be set to true. custom_metrics_enabled boolean Enable custom metrics for your organization. errors [string] Errors in your configuration. host_filters string Limit the Azure instances that are pulled into Datadog by using tags. Only hosts that match one of the defined tags are imported into Datadog. 
metrics_enabled boolean Enable Azure metrics for your organization. metrics_enabled_default boolean Enable Azure metrics for your organization for resource providers where no resource provider config is specified. new_client_id string Your New Azure web application ID. new_tenant_name string Your New Azure Active Directory ID. resource_collection_enabled boolean When enabled, Datadog collects metadata and configuration info from cloud resources (compute instances, databases, load balancers, etc.) monitored by this app registration. resource_provider_configs [object] Configuration settings applied to resources from the specified Azure resource providers. metrics_enabled boolean Collect metrics for resources from this provider. namespace string The provider namespace to apply this configuration to. tenant_name string Your Azure Active Directory ID. usage_metrics_enabled boolean Enable azure.usage metrics for your organization. ``` { "app_service_plan_filters": "key:value,filter:example", "automute": true, "client_id": "testc7f6-1234-5678-9101-3fcbf464test", "client_secret": "TestingRh2nx664kUy5dIApvM54T4AtO", "container_app_filters": "key:value,filter:example", "cspm_enabled": true, "custom_metrics_enabled": true, "errors": [ "*" ], "host_filters": "key:value,filter:example", "metrics_enabled": true, "metrics_enabled_default": true, "new_client_id": "new1c7f6-1234-5678-9101-3fcbf464test", "new_tenant_name": "new1c44-1234-5678-9101-cc00736ftest", "resource_collection_enabled": true, "resource_provider_configs": [ { "metrics_enabled": true, "namespace": "Microsoft.Compute" } ], "tenant_name": "testc44-1234-5678-9101-cc00736ftest", "usage_metrics_enabled": true } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureHostFilters-200-v1) * [400](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureHostFilters-400-v1) * [403](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureHostFilters-403-v1) * [429](https://docs.datadoghq.com/api/latest/azure-integration/#UpdateAzureHostFilters-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/azure-integration/) * [Example](https://docs.datadoghq.com/api/latest/azure-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/azure-integration/?code-lang=python-legacy) ##### Update Azure integration host filters Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/azure/host_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Update Azure integration host filters ``` """ Update Azure integration host filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.azure_integration_api import AzureIntegrationApi from datadog_api_client.v1.model.azure_account import AzureAccount from datadog_api_client.v1.model.resource_provider_config import ResourceProviderConfig body = AzureAccount( app_service_plan_filters="key:value,filter:example", automute=True, client_id="testc7f6-1234-5678-9101-3fcbf464test", client_secret="TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters="key:value,filter:example", cspm_enabled=True, custom_metrics_enabled=True, errors=[ "*", ], host_filters="key:value,filter:example", metrics_enabled=True, metrics_enabled_default=True, new_client_id="new1c7f6-1234-5678-9101-3fcbf464test", new_tenant_name="new1c44-1234-5678-9101-cc00736ftest", resource_collection_enabled=True, resource_provider_configs=[ ResourceProviderConfig( metrics_enabled=True, namespace="Microsoft.Compute", ), ], tenant_name="testc44-1234-5678-9101-cc00736ftest", usage_metrics_enabled=True, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = AzureIntegrationApi(api_client) response = api_instance.update_azure_host_filters(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Azure integration host filters ``` # Update Azure integration host filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::AzureIntegrationAPI.new body = DatadogAPIClient::V1::AzureAccount.new({ app_service_plan_filters: "key:value,filter:example", automute: true, client_id: "testc7f6-1234-5678-9101-3fcbf464test", client_secret: "TestingRh2nx664kUy5dIApvM54T4AtO", container_app_filters: "key:value,filter:example", cspm_enabled: true, 
custom_metrics_enabled: true, errors: [ "*", ], host_filters: "key:value,filter:example", metrics_enabled: true, metrics_enabled_default: true, new_client_id: "new1c7f6-1234-5678-9101-3fcbf464test", new_tenant_name: "new1c44-1234-5678-9101-cc00736ftest", resource_collection_enabled: true, resource_provider_configs: [ DatadogAPIClient::V1::ResourceProviderConfig.new({ metrics_enabled: true, namespace: "Microsoft.Compute", }), ], tenant_name: "testc44-1234-5678-9101-cc00736ftest", usage_metrics_enabled: true, }) p api_instance.update_azure_host_filters(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Azure integration host filters ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' config= { "tenant_name": "", "client_id": "", "host_filters": ":" } dog = Dogapi::Client.new(api_key, app_key) dog.azure_integration_update_host_filters(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Azure integration host filters ``` // Update Azure integration host filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AzureAccount{ AppServicePlanFilters: datadog.PtrString("key:value,filter:example"), Automute: datadog.PtrBool(true), ClientId: datadog.PtrString("testc7f6-1234-5678-9101-3fcbf464test"), ClientSecret: datadog.PtrString("TestingRh2nx664kUy5dIApvM54T4AtO"), ContainerAppFilters: datadog.PtrString("key:value,filter:example"), CspmEnabled: datadog.PtrBool(true), CustomMetricsEnabled: datadog.PtrBool(true), Errors: []string{ "*", }, HostFilters: datadog.PtrString("key:value,filter:example"), MetricsEnabled: datadog.PtrBool(true), MetricsEnabledDefault: datadog.PtrBool(true), NewClientId: datadog.PtrString("new1c7f6-1234-5678-9101-3fcbf464test"), NewTenantName: datadog.PtrString("new1c44-1234-5678-9101-cc00736ftest"), ResourceCollectionEnabled: datadog.PtrBool(true), ResourceProviderConfigs: []datadogV1.ResourceProviderConfig{ { MetricsEnabled: datadog.PtrBool(true), Namespace: datadog.PtrString("Microsoft.Compute"), }, }, TenantName: datadog.PtrString("testc44-1234-5678-9101-cc00736ftest"), UsageMetricsEnabled: datadog.PtrBool(true), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewAzureIntegrationApi(apiClient) resp, r, err := api.UpdateAzureHostFilters(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `AzureIntegrationApi.UpdateAzureHostFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `AzureIntegrationApi.UpdateAzureHostFilters`:\n%s\n", responseContent) } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Azure integration host filters ``` // Update Azure integration host filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.AzureIntegrationApi; import com.datadog.api.client.v1.model.AzureAccount; import com.datadog.api.client.v1.model.ResourceProviderConfig; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); AzureIntegrationApi apiInstance = new AzureIntegrationApi(defaultClient); AzureAccount body = new AzureAccount() .appServicePlanFilters("key:value,filter:example") .automute(true) .clientId("testc7f6-1234-5678-9101-3fcbf464test") .clientSecret("TestingRh2nx664kUy5dIApvM54T4AtO") .containerAppFilters("key:value,filter:example") .cspmEnabled(true) .customMetricsEnabled(true) .errors(Collections.singletonList("*")) .hostFilters("key:value,filter:example") .metricsEnabled(true) .metricsEnabledDefault(true) .newClientId("new1c7f6-1234-5678-9101-3fcbf464test") .newTenantName("new1c44-1234-5678-9101-cc00736ftest") .resourceCollectionEnabled(true) .resourceProviderConfigs( Collections.singletonList( new ResourceProviderConfig() .metricsEnabled(true) .namespace("Microsoft.Compute"))) .tenantName("testc44-1234-5678-9101-cc00736ftest") .usageMetricsEnabled(true); try { apiInstance.updateAzureHostFilters(body); } catch (ApiException e) { System.err.println("Exception when calling AzureIntegrationApi#updateAzureHostFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Azure integration host filters ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.AzureIntegration.update_host_filters( tenant_name="", host_filters="new:filters", client_id="" ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Update Azure integration host filters ``` // Update Azure integration host filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_azure_integration::AzureIntegrationAPI; use datadog_api_client::datadogV1::model::AzureAccount; use datadog_api_client::datadogV1::model::ResourceProviderConfig; #[tokio::main] async fn main() { let body = AzureAccount::new() 
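        // The builder-style setters below populate the optional AzureAccount fields sent with this request.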
.app_service_plan_filters("key:value,filter:example".to_string()) .automute(true) .client_id("testc7f6-1234-5678-9101-3fcbf464test".to_string()) .client_secret("TestingRh2nx664kUy5dIApvM54T4AtO".to_string()) .container_app_filters("key:value,filter:example".to_string()) .cspm_enabled(true) .custom_metrics_enabled(true) .errors(vec!["*".to_string()]) .host_filters("key:value,filter:example".to_string()) .metrics_enabled(true) .metrics_enabled_default(true) .new_client_id("new1c7f6-1234-5678-9101-3fcbf464test".to_string()) .new_tenant_name("new1c44-1234-5678-9101-cc00736ftest".to_string()) .resource_collection_enabled(true) .resource_provider_configs(vec![ResourceProviderConfig::new() .metrics_enabled(true) .namespace("Microsoft.Compute".to_string())]) .tenant_name("testc44-1234-5678-9101-cc00736ftest".to_string()) .usage_metrics_enabled(true); let configuration = datadog::Configuration::new(); let api = AzureIntegrationAPI::with_config(configuration); let resp = api.update_azure_host_filters(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Azure integration host filters ``` /** * Update Azure integration host filters returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.AzureIntegrationApi(configuration); const params: v1.AzureIntegrationApiUpdateAzureHostFiltersRequest = { body: { appServicePlanFilters: "key:value,filter:example", automute: true, clientId: "testc7f6-1234-5678-9101-3fcbf464test", clientSecret: "TestingRh2nx664kUy5dIApvM54T4AtO", containerAppFilters: "key:value,filter:example", cspmEnabled: true, customMetricsEnabled: true, errors: ["*"], hostFilters: "key:value,filter:example", metricsEnabled: true, metricsEnabledDefault: true, newClientId: "new1c7f6-1234-5678-9101-3fcbf464test", newTenantName: "new1c44-1234-5678-9101-cc00736ftest", resourceCollectionEnabled: true, resourceProviderConfigs: [ { metricsEnabled: true, namespace: "Microsoft.Compute", }, ], tenantName: "testc44-1234-5678-9101-cc00736ftest", usageMetricsEnabled: true, }, }; apiInstance .updateAzureHostFilters(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=e60261d0-a2ca-498b-8dc4-58f4f0392a3d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d39c3104-3d49-4c22-b0bf-39a455fdcacc&pt=Azure%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fazure-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=e60261d0-a2ca-498b-8dc4-58f4f0392a3d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d39c3104-3d49-4c22-b0bf-39a455fdcacc&pt=Azure%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fazure-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=4448af69-a3cf-4f18-a4b0-e2164dc231b9&bo=2&sid=2b97e1f0f0bf11f0a95ed35e5ee0944a&vid=2b983270f0bf11f0b99eb30165f83ff9&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Azure%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fazure-integration%2F&r=<=1518&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=465664) --- # Source: https://docs.datadoghq.com/api/latest/case-management-attribute/ # Case Management Attribute View and configure custom attributes within Case Management. See the [Case Management page](https://docs.datadoghq.com/service_management/case_management/) for more information. ## [Get all custom attributes](https://docs.datadoghq.com/api/latest/case-management-attribute/#get-all-custom-attributes) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-attribute/#get-all-custom-attributes-v2) GET https://api.ap1.datadoghq.com/api/v2/cases/types/custom_attributeshttps://api.ap2.datadoghq.com/api/v2/cases/types/custom_attributeshttps://api.datadoghq.eu/api/v2/cases/types/custom_attributeshttps://api.ddog-gov.com/api/v2/cases/types/custom_attributeshttps://api.datadoghq.com/api/v2/cases/types/custom_attributeshttps://api.us3.datadoghq.com/api/v2/cases/types/custom_attributeshttps://api.us5.datadoghq.com/api/v2/cases/types/custom_attributes ### Overview Get all custom attributes ### Response * [200](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributes-200-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributes-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributes-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) Custom attribute configs response. 
Field Type Description data [object] List of custom attribute configs of case type attributes object Custom attribute resource attributes case_type_id [_required_] string Custom attribute config identifier. description string Custom attribute description. display_name [_required_] string Custom attribute name. is_multi [_required_] boolean Whether multiple values can be set. key [_required_] string Custom attribute key. This will be the value used to search on this custom attribute. type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` id string Custom attribute configs identifier type enum Custom attributes config JSON:API resource type Allowed enum values: `custom_attribute` default: `custom_attribute` ``` { "data": [ { "attributes": { "case_type_id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "description": "AWS Region, must be a valid region supported by AWS", "display_name": "AWS Region", "is_multi": true, "key": "aws_region", "type": "NUMBER" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "custom_attribute" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=typescript) ##### Get all custom attributes Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/types/custom_attributes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all custom attributes ``` """ Get all custom attributes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementAttributeApi(api_client) response = api_instance.get_all_custom_attributes() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all custom attributes ``` # Get all custom attributes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAttributeAPI.new p api_instance.get_all_custom_attributes() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all custom attributes ``` // Get all custom attributes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementAttributeApi(apiClient) resp, r, err := api.GetAllCustomAttributes(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementAttributeApi.GetAllCustomAttributes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementAttributeApi.GetAllCustomAttributes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and 
then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all custom attributes ``` // Get all custom attributes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementAttributeApi; import com.datadog.api.client.v2.model.CustomAttributeConfigsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementAttributeApi apiInstance = new CaseManagementAttributeApi(defaultClient); try { CustomAttributeConfigsResponse result = apiInstance.getAllCustomAttributes(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CaseManagementAttributeApi#getAllCustomAttributes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all custom attributes ``` // Get all custom attributes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_attribute::CaseManagementAttributeAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementAttributeAPI::with_config(configuration); let resp = api.get_all_custom_attributes().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all custom attributes ``` /** * Get all custom attributes returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementAttributeApi(configuration); apiInstance .getAllCustomAttributes() .then((data: v2.CustomAttributeConfigsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all custom attributes config of case type](https://docs.datadoghq.com/api/latest/case-management-attribute/#get-all-custom-attributes-config-of-case-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-attribute/#get-all-custom-attributes-config-of-case-type-v2) GET https://api.ap1.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.ap2.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.datadoghq.eu/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.ddog-gov.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.us3.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.us5.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes ### Overview Get all custom attribute config of case type ### Arguments #### Path Parameters Name Type Description case_type_id [_required_] string Case type’s UUID ### Response * [200](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributeConfigsByCaseType-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributeConfigsByCaseType-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributeConfigsByCaseType-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributeConfigsByCaseType-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-attribute/#GetAllCustomAttributeConfigsByCaseType-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) Custom attribute configs response. Field Type Description data [object] List of custom attribute configs of case type attributes object Custom attribute resource attributes case_type_id [_required_] string Custom attribute config identifier. description string Custom attribute description. display_name [_required_] string Custom attribute name. is_multi [_required_] boolean Whether multiple values can be set key [_required_] string Custom attribute key. 
This will be the value used to search on this custom attribute type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` id string Custom attribute configs identifier type enum Custom attributes config JSON:API resource type Allowed enum values: `custom_attribute` default: `custom_attribute` ``` { "data": [ { "attributes": { "case_type_id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "description": "AWS Region, must be a valid region supported by AWS", "display_name": "AWS Region", "is_multi": true, "key": "aws_region", "type": "NUMBER" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "custom_attribute" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=typescript) ##### Get all custom attributes config of case type Copy ``` # Path parameters export case_type_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de505" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/types/${case_type_id}/custom_attributes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all custom attributes config of case type ``` """ Get all custom attributes config of case type returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi # there is a valid "case_type" in the system CASE_TYPE_ID = environ["CASE_TYPE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementAttributeApi(api_client) response = api_instance.get_all_custom_attribute_configs_by_case_type( case_type_id=CASE_TYPE_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all custom attributes config of case type ``` # Get all custom attributes config of case type returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAttributeAPI.new # there is a valid "case_type" in the system CASE_TYPE_ID = ENV["CASE_TYPE_ID"] p api_instance.get_all_custom_attribute_configs_by_case_type(CASE_TYPE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all custom attributes config of case type ``` // Get all custom attributes config of case type returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case_type" in the system CaseTypeID := os.Getenv("CASE_TYPE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewCaseManagementAttributeApi(apiClient) resp, r, err := api.GetAllCustomAttributeConfigsByCaseType(ctx, CaseTypeID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementAttributeApi.GetAllCustomAttributeConfigsByCaseType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementAttributeApi.GetAllCustomAttributeConfigsByCaseType`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all custom attributes config of case type ``` // Get all custom attributes config of case type returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementAttributeApi; import com.datadog.api.client.v2.model.CustomAttributeConfigsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementAttributeApi apiInstance = new CaseManagementAttributeApi(defaultClient); // there is a valid "case_type" in the system String CASE_TYPE_ID = System.getenv("CASE_TYPE_ID"); try { CustomAttributeConfigsResponse result = apiInstance.getAllCustomAttributeConfigsByCaseType(CASE_TYPE_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling" + " CaseManagementAttributeApi#getAllCustomAttributeConfigsByCaseType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all custom attributes config of case type ``` // Get all custom attributes config of case type returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_attribute::CaseManagementAttributeAPI; #[tokio::main] async fn main() { // there is a valid "case_type" in the system let case_type_id = std::env::var("CASE_TYPE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CaseManagementAttributeAPI::with_config(configuration); let resp = api .get_all_custom_attribute_configs_by_case_type(case_type_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all custom attributes config of case type ``` /** * Get all custom attributes 
config of case type returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementAttributeApi(configuration); // there is a valid "case_type" in the system const CASE_TYPE_ID = process.env.CASE_TYPE_ID as string; const params: v2.CaseManagementAttributeApiGetAllCustomAttributeConfigsByCaseTypeRequest = { caseTypeId: CASE_TYPE_ID, }; apiInstance .getAllCustomAttributeConfigsByCaseType(params) .then((data: v2.CustomAttributeConfigsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create custom attribute config for a case type](https://docs.datadoghq.com/api/latest/case-management-attribute/#create-custom-attribute-config-for-a-case-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-attribute/#create-custom-attribute-config-for-a-case-type-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.ap2.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.datadoghq.eu/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.ddog-gov.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.us3.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributeshttps://api.us5.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes ### Overview Create custom attribute config for a case type ### Arguments #### Path Parameters Name Type Description case_type_id [_required_] string Case type’s UUID ### Request #### Body Data (required) Custom attribute config payload * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) Field Type Description data [_required_] object Custom attribute config attributes [_required_] object Custom attribute config resource attributes description string Custom attribute description. display_name [_required_] string Custom attribute name. is_multi [_required_] boolean Whether multiple values can be set key [_required_] string Custom attribute key. 
This will be the value used to search on this custom attribute type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` type [_required_] enum Custom attributes config JSON:API resource type Allowed enum values: `custom_attribute` default: `custom_attribute` ``` { "data": { "attributes": { "display_name": "AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "is_multi": true, "key": "region_d9fe56bc9274fbb6", "type": "NUMBER" }, "type": "custom_attribute" } } ```
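The request schema above also accepts the optional `description` field and any of the `URL`, `TEXT`, `NUMBER`, and `SELECT` types. As a complement to the generated-client examples further down, here is a minimal Python sketch of an alternative payload using a `TEXT` attribute; the key, display name, description, and case type UUID are illustrative placeholders, and it assumes the same client models shown in the Python example below (including a `description` argument on the attributes model).

```python
# Hypothetical single-valued TEXT attribute; names and IDs are placeholders.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi
from datadog_api_client.v2.model.custom_attribute_config_attributes_create import CustomAttributeConfigAttributesCreate
from datadog_api_client.v2.model.custom_attribute_config_create import CustomAttributeConfigCreate
from datadog_api_client.v2.model.custom_attribute_config_create_request import CustomAttributeConfigCreateRequest
from datadog_api_client.v2.model.custom_attribute_config_resource_type import CustomAttributeConfigResourceType
from datadog_api_client.v2.model.custom_attribute_type import CustomAttributeType

body = CustomAttributeConfigCreateRequest(
    data=CustomAttributeConfigCreate(
        attributes=CustomAttributeConfigAttributesCreate(
            display_name="Service Tier",  # required: name shown in the UI
            description="Tier of the service tied to the case",  # optional, per the schema above (assumed kwarg)
            is_multi=False,  # required: single-valued attribute
            key="service_tier",  # required: the key used to search on this attribute
            type=CustomAttributeType.TEXT,  # one of URL, TEXT, NUMBER, SELECT
        ),
        type=CustomAttributeConfigResourceType.CUSTOM_ATTRIBUTE,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CaseManagementAttributeApi(api_client)
    # case_type_id is the case type's UUID (placeholder here).
    response = api_instance.create_custom_attribute_config(
        case_type_id="00000000-0000-0000-0000-000000000000",
        body=body,
    )
    print(response)
```

As with the examples below, the client reads `DD_API_KEY`, `DD_APP_KEY`, and optionally `DD_SITE` from the environment.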
### Response * [201](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-201-v2) * [400](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-attribute/#CreateCustomAttributeConfig-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) Custom attribute config response. Field Type Description data object The definition of `CustomAttributeConfig` object. attributes object Custom attribute resource attributes case_type_id [_required_] string Custom attribute config identifier. description string Custom attribute description. display_name [_required_] string Custom attribute name. is_multi [_required_] boolean Whether multiple values can be set key [_required_] string Custom attribute key. This will be the value used to search on this custom attribute type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` id string Custom attribute configs identifier type enum Custom attributes config JSON:API resource type Allowed enum values: `custom_attribute` default: `custom_attribute` ``` { "data": { "attributes": { "case_type_id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "description": "AWS Region, must be a valid region supported by AWS", "display_name": "AWS Region", "is_multi": true, "key": "aws_region", "type": "NUMBER" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "custom_attribute" } } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=typescript) ##### Create custom attribute config for a case type returns "CREATED" response ``` # Path parameters export case_type_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de505" # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cases/types/${case_type_id}/custom_attributes" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "display_name": "AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", "is_multi": true, "key": "region_d9fe56bc9274fbb6", "type": "NUMBER" }, "type": "custom_attribute" } } EOF ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` // Create custom attribute config for a case type returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case_type" in the system CaseTypeID := os.Getenv("CASE_TYPE_ID") body := datadogV2.CustomAttributeConfigCreateRequest{ Data: datadogV2.CustomAttributeConfigCreate{ Attributes: datadogV2.CustomAttributeConfigAttributesCreate{ DisplayName: "AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", IsMulti: true, Key: "region_d9fe56bc9274fbb6", Type: datadogV2.CUSTOMATTRIBUTETYPE_NUMBER, }, Type: datadogV2.CUSTOMATTRIBUTECONFIGRESOURCETYPE_CUSTOM_ATTRIBUTE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementAttributeApi(apiClient) resp, r, err := api.CreateCustomAttributeConfig(ctx, CaseTypeID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementAttributeApi.CreateCustomAttributeConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`CaseManagementAttributeApi.CreateCustomAttributeConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` // Create custom attribute config for a case type returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementAttributeApi; import com.datadog.api.client.v2.model.CustomAttributeConfigAttributesCreate; import com.datadog.api.client.v2.model.CustomAttributeConfigCreate; import com.datadog.api.client.v2.model.CustomAttributeConfigCreateRequest; import com.datadog.api.client.v2.model.CustomAttributeConfigResourceType; import com.datadog.api.client.v2.model.CustomAttributeConfigResponse; import com.datadog.api.client.v2.model.CustomAttributeType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementAttributeApi apiInstance = new CaseManagementAttributeApi(defaultClient); // there is a valid "case_type" in the system String CASE_TYPE_ID = System.getenv("CASE_TYPE_ID"); CustomAttributeConfigCreateRequest body = new CustomAttributeConfigCreateRequest() .data( new CustomAttributeConfigCreate() .attributes( new CustomAttributeConfigAttributesCreate() .displayName("AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d") .isMulti(true) .key("region_d9fe56bc9274fbb6") .type(CustomAttributeType.NUMBER)) .type(CustomAttributeConfigResourceType.CUSTOM_ATTRIBUTE)); try { CustomAttributeConfigResponse result = apiInstance.createCustomAttributeConfig(CASE_TYPE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CaseManagementAttributeApi#createCustomAttributeConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` """ Create custom attribute config for a case type returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi from datadog_api_client.v2.model.custom_attribute_config_attributes_create import CustomAttributeConfigAttributesCreate from datadog_api_client.v2.model.custom_attribute_config_create import CustomAttributeConfigCreate from datadog_api_client.v2.model.custom_attribute_config_create_request import CustomAttributeConfigCreateRequest from datadog_api_client.v2.model.custom_attribute_config_resource_type import CustomAttributeConfigResourceType from 
datadog_api_client.v2.model.custom_attribute_type import CustomAttributeType # there is a valid "case_type" in the system CASE_TYPE_ID = environ["CASE_TYPE_ID"] body = CustomAttributeConfigCreateRequest( data=CustomAttributeConfigCreate( attributes=CustomAttributeConfigAttributesCreate( display_name="AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", is_multi=True, key="region_d9fe56bc9274fbb6", type=CustomAttributeType.NUMBER, ), type=CustomAttributeConfigResourceType.CUSTOM_ATTRIBUTE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementAttributeApi(api_client) response = api_instance.create_custom_attribute_config(case_type_id=CASE_TYPE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` # Create custom attribute config for a case type returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAttributeAPI.new # there is a valid "case_type" in the system CASE_TYPE_ID = ENV["CASE_TYPE_ID"] body = DatadogAPIClient::V2::CustomAttributeConfigCreateRequest.new({ data: DatadogAPIClient::V2::CustomAttributeConfigCreate.new({ attributes: DatadogAPIClient::V2::CustomAttributeConfigAttributesCreate.new({ display_name: "AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", is_multi: true, key: "region_d9fe56bc9274fbb6", type: DatadogAPIClient::V2::CustomAttributeType::NUMBER, }), type: DatadogAPIClient::V2::CustomAttributeConfigResourceType::CUSTOM_ATTRIBUTE, }), }) p api_instance.create_custom_attribute_config(CASE_TYPE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` // Create custom attribute config for a case type returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_attribute::CaseManagementAttributeAPI; use datadog_api_client::datadogV2::model::CustomAttributeConfigAttributesCreate; use datadog_api_client::datadogV2::model::CustomAttributeConfigCreate; use datadog_api_client::datadogV2::model::CustomAttributeConfigCreateRequest; use datadog_api_client::datadogV2::model::CustomAttributeConfigResourceType; use datadog_api_client::datadogV2::model::CustomAttributeType; #[tokio::main] async fn main() { // there is a valid "case_type" in the system let case_type_id = std::env::var("CASE_TYPE_ID").unwrap(); let body = CustomAttributeConfigCreateRequest::new(CustomAttributeConfigCreate::new( CustomAttributeConfigAttributesCreate::new( "AWS Region ".to_string(), true, "region_d9fe56bc9274fbb6".to_string(), CustomAttributeType::NUMBER, ), CustomAttributeConfigResourceType::CUSTOM_ATTRIBUTE, )); let configuration = datadog::Configuration::new(); let api = 
CaseManagementAttributeAPI::with_config(configuration); let resp = api .create_custom_attribute_config(case_type_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create custom attribute config for a case type returns "CREATED" response ``` /** * Create custom attribute config for a case type returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementAttributeApi(configuration); // there is a valid "case_type" in the system const CASE_TYPE_ID = process.env.CASE_TYPE_ID as string; const params: v2.CaseManagementAttributeApiCreateCustomAttributeConfigRequest = { body: { data: { attributes: { displayName: "AWS Region 9b1deb4d-3b7d-4bad-9bdd-2b0d7b3dcb6d", isMulti: true, key: "region_d9fe56bc9274fbb6", type: "NUMBER", }, type: "custom_attribute", }, }, caseTypeId: CASE_TYPE_ID, }; apiInstance .createCustomAttributeConfig(params) .then((data: v2.CustomAttributeConfigResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete custom attributes config](https://docs.datadoghq.com/api/latest/case-management-attribute/#delete-custom-attributes-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-attribute/#delete-custom-attributes-config-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.ap2.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.datadoghq.eu/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.ddog-gov.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.us3.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id}https://api.us5.datadoghq.com/api/v2/cases/types/{case_type_id}/custom_attributes/{custom_attribute_id} ### Overview Delete custom attribute config ### Arguments #### Path Parameters Name Type Description case_type_id [_required_] string Case type’s UUID custom_attribute_id [_required_] string Case Custom attribute’s UUID ### Response * [204](https://docs.datadoghq.com/api/latest/case-management-attribute/#DeleteCustomAttributeConfig-204-v2) * [400](https://docs.datadoghq.com/api/latest/case-management-attribute/#DeleteCustomAttributeConfig-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-attribute/#DeleteCustomAttributeConfig-401-v2) * 
[403](https://docs.datadoghq.com/api/latest/case-management-attribute/#DeleteCustomAttributeConfig-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-attribute/#DeleteCustomAttributeConfig-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Example](https://docs.datadoghq.com/api/latest/case-management-attribute/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-attribute/?code-lang=typescript) ##### Delete custom attributes config Copy ``` # Path parameters export case_type_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de505" export custom_attribute_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de505" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/types/${case_type_id}/custom_attributes/${custom_attribute_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete custom attributes config ``` """ Delete custom attributes config returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi # there is a valid "case_type" in the system CASE_TYPE_ID = environ["CASE_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ID = environ["CUSTOM_ATTRIBUTE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementAttributeApi(api_client) api_instance.delete_custom_attribute_config( case_type_id=CASE_TYPE_ID, custom_attribute_id=CUSTOM_ATTRIBUTE_ID, ) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete custom attributes config ``` # Delete custom attributes config returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAttributeAPI.new # there is a valid "case_type" in the system CASE_TYPE_ID = ENV["CASE_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ID = ENV["CUSTOM_ATTRIBUTE_ID"] api_instance.delete_custom_attribute_config(CASE_TYPE_ID, CUSTOM_ATTRIBUTE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete custom attributes config ``` // Delete custom attributes config returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case_type" in the system CaseTypeID := os.Getenv("CASE_TYPE_ID") // there is a valid "custom_attribute" in the system CustomAttributeID := os.Getenv("CUSTOM_ATTRIBUTE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementAttributeApi(apiClient) r, err := api.DeleteCustomAttributeConfig(ctx, CaseTypeID, CustomAttributeID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementAttributeApi.DeleteCustomAttributeConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete custom attributes config ``` // Delete custom attributes config returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementAttributeApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementAttributeApi apiInstance = new CaseManagementAttributeApi(defaultClient); // there is a valid "case_type" in the system String CASE_TYPE_ID = System.getenv("CASE_TYPE_ID"); // there is a valid "custom_attribute" in the system String CUSTOM_ATTRIBUTE_ID = System.getenv("CUSTOM_ATTRIBUTE_ID"); try { apiInstance.deleteCustomAttributeConfig(CASE_TYPE_ID, CUSTOM_ATTRIBUTE_ID); } catch (ApiException e) { System.err.println( "Exception when calling CaseManagementAttributeApi#deleteCustomAttributeConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } 
} ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete custom attributes config ``` // Delete custom attributes config returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_attribute::CaseManagementAttributeAPI; #[tokio::main] async fn main() { // there is a valid "case_type" in the system let case_type_id = std::env::var("CASE_TYPE_ID").unwrap(); // there is a valid "custom_attribute" in the system let custom_attribute_id = std::env::var("CUSTOM_ATTRIBUTE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CaseManagementAttributeAPI::with_config(configuration); let resp = api .delete_custom_attribute_config(case_type_id.clone(), custom_attribute_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete custom attributes config ``` /** * Delete custom attributes config returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementAttributeApi(configuration); // there is a valid "case_type" in the system const CASE_TYPE_ID = process.env.CASE_TYPE_ID as string; // there is a valid "custom_attribute" in the system const CUSTOM_ATTRIBUTE_ID = process.env.CUSTOM_ATTRIBUTE_ID as string; const params: v2.CaseManagementAttributeApiDeleteCustomAttributeConfigRequest = { caseTypeId: CASE_TYPE_ID, customAttributeId: CUSTOM_ATTRIBUTE_ID, }; apiInstance .deleteCustomAttributeConfig(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
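The endpoints on this page compose naturally: list the configs on a case type, create a new one, and delete it by the `id` returned in the CREATED response. The sketch below strings them together with the same Python client used in the examples above; `CASE_TYPE_ID`, the key, and the display name are illustrative, and error handling is omitted for brevity.

```python
# End-to-end sketch: list, create, then delete a custom attribute config.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_attribute_api import CaseManagementAttributeApi
from datadog_api_client.v2.model.custom_attribute_config_attributes_create import CustomAttributeConfigAttributesCreate
from datadog_api_client.v2.model.custom_attribute_config_create import CustomAttributeConfigCreate
from datadog_api_client.v2.model.custom_attribute_config_create_request import CustomAttributeConfigCreateRequest
from datadog_api_client.v2.model.custom_attribute_config_resource_type import CustomAttributeConfigResourceType
from datadog_api_client.v2.model.custom_attribute_type import CustomAttributeType

CASE_TYPE_ID = environ["CASE_TYPE_ID"]  # UUID of an existing case type

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = CaseManagementAttributeApi(api_client)

    # 1. List the configs already attached to this case type and print their keys.
    existing = api.get_all_custom_attribute_configs_by_case_type(case_type_id=CASE_TYPE_ID)
    existing_keys = [config.attributes.key for config in existing.data]  # assumes `data` is returned, as in the 200 example above
    print(existing_keys)

    # 2. Create a new, single-valued TEXT attribute (illustrative key and name).
    created = api.create_custom_attribute_config(
        case_type_id=CASE_TYPE_ID,
        body=CustomAttributeConfigCreateRequest(
            data=CustomAttributeConfigCreate(
                attributes=CustomAttributeConfigAttributesCreate(
                    display_name="Example Attribute",
                    is_multi=False,
                    key="example_attribute",
                    type=CustomAttributeType.TEXT,
                ),
                type=CustomAttributeConfigResourceType.CUSTOM_ATTRIBUTE,
            ),
        ),
    )

    # 3. Delete it again, using the id returned in the CREATED response.
    api.delete_custom_attribute_config(
        case_type_id=CASE_TYPE_ID,
        custom_attribute_id=created.data.id,
    )
```

The same three calls map directly onto the curl examples above if you prefer to script against the HTTP endpoints.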
* * * --- # Source: https://docs.datadoghq.com/api/latest/case-management-type/ # Case Management Type View and configure case types within Case Management. See the [Case Management page](https://docs.datadoghq.com/service_management/case_management/) for more information. ## [Get all case types](https://docs.datadoghq.com/api/latest/case-management-type/#get-all-case-types) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-type/#get-all-case-types-v2) GET https://api.ap1.datadoghq.com/api/v2/cases/typeshttps://api.ap2.datadoghq.com/api/v2/cases/typeshttps://api.datadoghq.eu/api/v2/cases/typeshttps://api.ddog-gov.com/api/v2/cases/typeshttps://api.datadoghq.com/api/v2/cases/typeshttps://api.us3.datadoghq.com/api/v2/cases/typeshttps://api.us5.datadoghq.com/api/v2/cases/types ### Overview Get all case types ### Response * [200](https://docs.datadoghq.com/api/latest/case-management-type/#GetAllCaseTypes-200-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-type/#GetAllCaseTypes-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-type/#GetAllCaseTypes-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-type/#GetAllCaseTypes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) Case types response. Field Type Description data [object] List of case types attributes object Case Type resource attributes deleted_at date-time Timestamp of when the case type was deleted description string Case type description. emoji string Case type emoji. name [_required_] string Case type name. id string Case type's identifier type enum Case type resource type Allowed enum values: `case_type` default: `case_type` ``` { "data": [ { "attributes": { "deleted_at": "2019-09-19T10:00:00.000Z", "description": "Investigations done in case management", "emoji": "🕵🏻‍♂️", "name": "Investigation" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "case_type" } ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=typescript) ##### Get all case types ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/cases/types" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all case types ``` """ Get all case types returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_type_api import CaseManagementTypeApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementTypeApi(api_client) response = api_instance.get_all_case_types() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all case types ``` # Get all case types returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementTypeAPI.new p api_instance.get_all_case_types() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all case types ``` // Get all case types returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementTypeApi(apiClient) resp, r, err := api.GetAllCaseTypes(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementTypeApi.GetAllCaseTypes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementTypeApi.GetAllCaseTypes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all case types ``` // Get all case types returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementTypeApi; import com.datadog.api.client.v2.model.CaseTypesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementTypeApi apiInstance = new CaseManagementTypeApi(defaultClient); try { CaseTypesResponse result = apiInstance.getAllCaseTypes(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementTypeApi#getAllCaseTypes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all case types ``` // Get all case types returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_type::CaseManagementTypeAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementTypeAPI::with_config(configuration); let resp = api.get_all_case_types().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all case types ``` /** * Get all case types returns "OK" response */ import { 
client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementTypeApi(configuration); apiInstance .getAllCaseTypes() .then((data: v2.CaseTypesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a case type](https://docs.datadoghq.com/api/latest/case-management-type/#create-a-case-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-type/#create-a-case-type-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/typeshttps://api.ap2.datadoghq.com/api/v2/cases/typeshttps://api.datadoghq.eu/api/v2/cases/typeshttps://api.ddog-gov.com/api/v2/cases/typeshttps://api.datadoghq.com/api/v2/cases/typeshttps://api.us3.datadoghq.com/api/v2/cases/typeshttps://api.us5.datadoghq.com/api/v2/cases/types ### Overview Create a Case Type ### Request #### Body Data (required) Case type payload * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) Field Type Description data [_required_] object Case type attributes [_required_] object Case Type resource attributes deleted_at date-time Timestamp of when the case type was deleted description string Case type description. emoji string Case type emoji. name [_required_] string Case type name. type [_required_] enum Case type resource type Allowed enum values: `case_type` default: `case_type` ``` { "data": { "attributes": { "description": "Investigations done in case management", "emoji": "\ud83d\udc51", "name": "Investigation" }, "type": "case_type" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/case-management-type/#CreateCaseType-201-v2) * [400](https://docs.datadoghq.com/api/latest/case-management-type/#CreateCaseType-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-type/#CreateCaseType-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-type/#CreateCaseType-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-type/#CreateCaseType-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) Case type response Field Type Description data object The definition of `CaseType` object. attributes object Case Type resource attributes deleted_at date-time Timestamp of when the case type was deleted description string Case type description. emoji string Case type emoji. name [_required_] string Case type name. 
id string Case type's identifier type enum Case type resource type Allowed enum values: `case_type` default: `case_type` ``` { "data": { "attributes": { "deleted_at": "2019-09-19T10:00:00.000Z", "description": "Investigations done in case management", "emoji": "🕵🏻‍♂️", "name": "Investigation" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "case_type" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=typescript) ##### Create a case type returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/types" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Investigations done in case management", "emoji": "\ud83d\udc51", "name": "Investigation" }, "type": "case_type" } } EOF ``` ##### Create a case type returns "CREATED" response ``` // Create a case type returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CaseTypeCreateRequest{ Data: datadogV2.CaseTypeCreate{ Attributes: datadogV2.CaseTypeResourceAttributes{ Description: datadog.PtrString("Investigations done in case management"), Emoji: datadog.PtrString("👑"), Name: "Investigation", }, Type: datadogV2.CASETYPERESOURCETYPE_CASE_TYPE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
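// Note: this example supplies no keys in code; the DD_API_KEY, DD_APP_KEY, and DD_SITE values
// exported in the run instructions below are picked up from the environment.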
apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementTypeApi(apiClient) resp, r, err := api.CreateCaseType(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementTypeApi.CreateCaseType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementTypeApi.CreateCaseType`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a case type returns "CREATED" response ``` // Create a case type returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementTypeApi; import com.datadog.api.client.v2.model.CaseTypeCreate; import com.datadog.api.client.v2.model.CaseTypeCreateRequest; import com.datadog.api.client.v2.model.CaseTypeResourceAttributes; import com.datadog.api.client.v2.model.CaseTypeResourceType; import com.datadog.api.client.v2.model.CaseTypeResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementTypeApi apiInstance = new CaseManagementTypeApi(defaultClient); CaseTypeCreateRequest body = new CaseTypeCreateRequest() .data( new CaseTypeCreate() .attributes( new CaseTypeResourceAttributes() .description("Investigations done in case management") .emoji("👑") .name("Investigation")) .type(CaseTypeResourceType.CASE_TYPE)); try { CaseTypeResponse result = apiInstance.createCaseType(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementTypeApi#createCaseType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a case type returns "CREATED" response ``` """ Create a case type returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_type_api import CaseManagementTypeApi from datadog_api_client.v2.model.case_type_create import CaseTypeCreate from datadog_api_client.v2.model.case_type_create_request import CaseTypeCreateRequest from datadog_api_client.v2.model.case_type_resource_attributes import CaseTypeResourceAttributes from datadog_api_client.v2.model.case_type_resource_type import CaseTypeResourceType body = CaseTypeCreateRequest( data=CaseTypeCreate( attributes=CaseTypeResourceAttributes( description="Investigations done in case management", emoji="👑", name="Investigation", ), type=CaseTypeResourceType.CASE_TYPE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = 
CaseManagementTypeApi(api_client) response = api_instance.create_case_type(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a case type returns "CREATED" response ``` # Create a case type returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementTypeAPI.new body = DatadogAPIClient::V2::CaseTypeCreateRequest.new({ data: DatadogAPIClient::V2::CaseTypeCreate.new({ attributes: DatadogAPIClient::V2::CaseTypeResourceAttributes.new({ description: "Investigations done in case management", emoji: "👑", name: "Investigation", }), type: DatadogAPIClient::V2::CaseTypeResourceType::CASE_TYPE, }), }) p api_instance.create_case_type(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a case type returns "CREATED" response ``` // Create a case type returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_type::CaseManagementTypeAPI; use datadog_api_client::datadogV2::model::CaseTypeCreate; use datadog_api_client::datadogV2::model::CaseTypeCreateRequest; use datadog_api_client::datadogV2::model::CaseTypeResourceAttributes; use datadog_api_client::datadogV2::model::CaseTypeResourceType; #[tokio::main] async fn main() { let body = CaseTypeCreateRequest::new(CaseTypeCreate::new( CaseTypeResourceAttributes::new("Investigation".to_string()) .description("Investigations done in case management".to_string()) .emoji("👑".to_string()), CaseTypeResourceType::CASE_TYPE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementTypeAPI::with_config(configuration); let resp = api.create_case_type(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a case type returns "CREATED" response ``` /** * Create a case type returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementTypeApi(configuration); const params: v2.CaseManagementTypeApiCreateCaseTypeRequest = { body: { data: { attributes: { description: "Investigations done in case management", emoji: "👑", name: "Investigation", }, type: "case_type", }, }, }; apiInstance .createCaseType(params) .then((data: v2.CaseTypeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a case type](https://docs.datadoghq.com/api/latest/case-management-type/#delete-a-case-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management-type/#delete-a-case-type-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cases/types/{case_type_id}https://api.ap2.datadoghq.com/api/v2/cases/types/{case_type_id}https://api.datadoghq.eu/api/v2/cases/types/{case_type_id}https://api.ddog-gov.com/api/v2/cases/types/{case_type_id}https://api.datadoghq.com/api/v2/cases/types/{case_type_id}https://api.us3.datadoghq.com/api/v2/cases/types/{case_type_id}https://api.us5.datadoghq.com/api/v2/cases/types/{case_type_id} ### Overview Delete a case type ### Arguments #### Path Parameters Name Type Description case_type_id [_required_] string Case type’s UUID ### Response * [204](https://docs.datadoghq.com/api/latest/case-management-type/#DeleteCaseType-204-v2) * [401](https://docs.datadoghq.com/api/latest/case-management-type/#DeleteCaseType-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management-type/#DeleteCaseType-403-v2) * [429](https://docs.datadoghq.com/api/latest/case-management-type/#DeleteCaseType-429-v2) No Content Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management-type/) * [Example](https://docs.datadoghq.com/api/latest/case-management-type/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management-type/?code-lang=typescript) ##### Delete a case type Copy ``` # Path parameters export case_type_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de505" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/types/${case_type_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a case type ``` """ Delete a case type returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_type_api import CaseManagementTypeApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementTypeApi(api_client) api_instance.delete_case_type( case_type_id="case_type_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a case type ``` # Delete a case type returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementTypeAPI.new api_instance.delete_case_type("case_type_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a case type ``` // Delete a case type returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementTypeApi(apiClient) r, err := api.DeleteCaseType(ctx, "case_type_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementTypeApi.DeleteCaseType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete 
a case type ``` // Delete a case type returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementTypeApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementTypeApi apiInstance = new CaseManagementTypeApi(defaultClient); try { apiInstance.deleteCaseType("f98a5a5b-e0ff-45d4-b2f5-afe6e74de505"); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementTypeApi#deleteCaseType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a case type ``` // Delete a case type returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management_type::CaseManagementTypeAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementTypeAPI::with_config(configuration); let resp = api.delete_case_type("case_type_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a case type ``` /** * Delete a case type returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementTypeApi(configuration); const params: v2.CaseManagementTypeApiDeleteCaseTypeRequest = { caseTypeId: "case_type_id", }; apiInstance .deleteCaseType(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/case-management # Case Management View and manage cases and projects within Case Management. See the [Case Management page](https://docs.datadoghq.com/service_management/case_management/) for more information. ## [Create a project](https://docs.datadoghq.com/api/latest/case-management/#create-a-project) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#create-a-project-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/projectshttps://api.ap2.datadoghq.com/api/v2/cases/projectshttps://api.datadoghq.eu/api/v2/cases/projectshttps://api.ddog-gov.com/api/v2/cases/projectshttps://api.datadoghq.com/api/v2/cases/projectshttps://api.us3.datadoghq.com/api/v2/cases/projectshttps://api.us5.datadoghq.com/api/v2/cases/projects ### Overview Create a project. OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Request #### Body Data (required) Project payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Project create attributes [_required_] object Project creation attributes key [_required_] string Project's key.
Cannot be "CASE" name [_required_] string name type [_required_] enum Project resource type Allowed enum values: `project` default: `project` ``` { "data": { "attributes": { "key": "SEC", "name": "Security Investigation" }, "type": "project" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-201-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#CreateProject-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Project response Field Type Description data object A Project attributes [_required_] object Project attributes key string The project's key name string Project's name id [_required_] string The Project's identifier relationships object Project relationships member_team object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. member_user object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` type [_required_] enum Project resource type Allowed enum values: `project` default: `project` ``` { "data": { "attributes": { "key": "CASEM", "name": "string" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "member_team": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "member_user": { "data": [ { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } ] } }, "type": "project" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Create a project Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/projects" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "key": "SEC", "name": "Security Investigation" }, "type": "project" } } EOF ``` ##### Create a project ``` """ Create a project returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.project_create import ProjectCreate from datadog_api_client.v2.model.project_create_attributes import ProjectCreateAttributes from datadog_api_client.v2.model.project_create_request import ProjectCreateRequest from datadog_api_client.v2.model.project_resource_type import ProjectResourceType body = ProjectCreateRequest( data=ProjectCreate( attributes=ProjectCreateAttributes( key="SEC", name="Security Investigation", ), type=ProjectResourceType.PROJECT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.create_project(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a project ``` # Create a project returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new body = DatadogAPIClient::V2::ProjectCreateRequest.new({ data: DatadogAPIClient::V2::ProjectCreate.new({ attributes: DatadogAPIClient::V2::ProjectCreateAttributes.new({ key: "SEC", name: "Security Investigation", }), type: DatadogAPIClient::V2::ProjectResourceType::PROJECT, }), }) p api_instance.create_project(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a project ``` // Create a project returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ProjectCreateRequest{ Data: datadogV2.ProjectCreate{ Attributes: datadogV2.ProjectCreateAttributes{ Key: "SEC", Name: "Security Investigation", }, Type: datadogV2.PROJECTRESOURCETYPE_PROJECT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.CreateProject(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.CreateProject`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.CreateProject`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a project ``` // Create a project returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.ProjectCreate; import com.datadog.api.client.v2.model.ProjectCreateAttributes; import com.datadog.api.client.v2.model.ProjectCreateRequest; import com.datadog.api.client.v2.model.ProjectResourceType; import com.datadog.api.client.v2.model.ProjectResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); ProjectCreateRequest body = new ProjectCreateRequest() .data( new ProjectCreate() .attributes( new ProjectCreateAttributes().key("SEC").name("Security Investigation")) .type(ProjectResourceType.PROJECT)); try { ProjectResponse result = apiInstance.createProject(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#createProject"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a project ``` // Create a project returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::ProjectCreate; use datadog_api_client::datadogV2::model::ProjectCreateAttributes; use datadog_api_client::datadogV2::model::ProjectCreateRequest; use 
datadog_api_client::datadogV2::model::ProjectResourceType; #[tokio::main] async fn main() { let body = ProjectCreateRequest::new(ProjectCreate::new( ProjectCreateAttributes::new("SEC".to_string(), "Security Investigation".to_string()), ProjectResourceType::PROJECT, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.create_project(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a project ``` /** * Create a project returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); const params: v2.CaseManagementApiCreateProjectRequest = { body: { data: { attributes: { key: "SEC", name: "Security Investigation", }, type: "project", }, }, }; apiInstance .createProject(params) .then((data: v2.ProjectResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search cases](https://docs.datadoghq.com/api/latest/case-management/#search-cases) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#search-cases-v2) GET https://api.ap1.datadoghq.com/api/v2/caseshttps://api.ap2.datadoghq.com/api/v2/caseshttps://api.datadoghq.eu/api/v2/caseshttps://api.ddog-gov.com/api/v2/caseshttps://api.datadoghq.com/api/v2/caseshttps://api.us3.datadoghq.com/api/v2/caseshttps://api.us5.datadoghq.com/api/v2/cases ### Overview Search cases. OAuth apps require the `cases_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. 
sort[field] enum Specify which field to sort Allowed enum values: `created_at, priority, status` filter string Search query sort[asc] boolean Specify if order is ascending or not ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#SearchCases-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Response with cases Field Type Description data [object] Cases response data attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` meta object Cases response metadata page object Pagination metadata current int64 Current page number size int64 Number of cases in current page total int64 Total number of pages ``` { "data": [ { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } ], "meta": { "page": { "current": "integer", "size": "integer", "total": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Search cases Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search cases ``` """ Search cases returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.search_cases() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search cases ``` # Search cases returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new p api_instance.search_cases() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search cases ``` // Search cases returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.SearchCases(ctx, *datadogV2.NewSearchCasesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.SearchCases`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.SearchCases`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ```
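The examples above call the search endpoint with no options. As an illustrative sketch only (not one of the official samples), the query strings documented under Arguments can be combined in a single request; the `datadoghq.com` host and the `filter` text below are placeholder choices:

```
# Search cases matching a text filter, 25 per page, newest first
curl -G "https://api.datadoghq.com/api/v2/cases" \
  --data-urlencode "filter=Memory leak" \
  --data-urlencode "page[size]=25" \
  --data-urlencode "sort[field]=created_at" \
  --data-urlencode "sort[asc]=false" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The client libraries expose the same options through their optional-parameter helpers, such as `NewSearchCasesOptionalParameters` in the Go example above and `SearchCasesOptionalParams` in the Rust example that follows.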
##### Search cases ``` // Search cases returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CasesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); try { CasesResponse result = apiInstance.searchCases(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#searchCases"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search cases ``` // Search cases returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::api_case_management::SearchCasesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.search_cases(SearchCasesOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search cases ``` /** * Search cases returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); apiInstance .searchCases() .then((data: v2.CasesResponse) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a case](https://docs.datadoghq.com/api/latest/case-management/#create-a-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#create-a-case-v2) POST https://api.ap1.datadoghq.com/api/v2/caseshttps://api.ap2.datadoghq.com/api/v2/caseshttps://api.datadoghq.eu/api/v2/caseshttps://api.ddog-gov.com/api/v2/caseshttps://api.datadoghq.com/api/v2/caseshttps://api.us3.datadoghq.com/api/v2/caseshttps://api.us5.datadoghq.com/api/v2/cases ### Overview Create a Case OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Request #### Body Data (required) Case payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case creation data attributes [_required_] object Case creation attributes custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` title [_required_] string Title type_id [_required_] string Case type UUID relationships object Relationships formed with the case on creation assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project [_required_] object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "priority": "NOT_DEFINED", "title": "Security breach investigation in 0cfbc5cbc676ee71", "type_id": "00000000-0000-0000-0000-000000000001" }, "relationships": { "assignee": { "data": { "id": "string", "type": "user" } }, "project": { "data": { "id": "d4bbe1af-f36e-42f1-87c1-493ca35c320e", "type": "project" } } }, "type": "case" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-201-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#CreateCase-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Create a case returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "priority": "NOT_DEFINED", "title": "Security breach investigation in 0cfbc5cbc676ee71", "type_id": "00000000-0000-0000-0000-000000000001" }, "relationships": { "assignee": { "data": { "id": "string", "type": "user" } }, "project": { "data": { "id": "d4bbe1af-f36e-42f1-87c1-493ca35c320e", "type": "project" } } }, "type": "case" } } EOF ``` ##### Create a case returns "CREATED" response ``` // Create a case returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.CaseCreateRequest{ Data: datadogV2.CaseCreate{ Attributes: datadogV2.CaseCreateAttributes{ Priority: datadogV2.CASEPRIORITY_NOT_DEFINED.Ptr(), Title: "Security breach investigation in 0cfbc5cbc676ee71", TypeId: "00000000-0000-0000-0000-000000000001", }, Relationships: &datadogV2.CaseCreateRelationships{ Assignee: *datadogV2.NewNullableNullableUserRelationship(&datadogV2.NullableUserRelationship{ Data: *datadogV2.NewNullableNullableUserRelationshipData(&datadogV2.NullableUserRelationshipData{ Id: UserDataID, Type: datadogV2.USERRESOURCETYPE_USER, }), }), Project: datadogV2.ProjectRelationship{ Data: datadogV2.ProjectRelationshipData{ Id: "d4bbe1af-f36e-42f1-87c1-493ca35c320e", Type: datadogV2.PROJECTRESOURCETYPE_PROJECT, }, }, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.CreateCase(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.CreateCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.CreateCase`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a case returns "CREATED" response ``` // Create a case returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseCreate; import com.datadog.api.client.v2.model.CaseCreateAttributes; import com.datadog.api.client.v2.model.CaseCreateRelationships; import com.datadog.api.client.v2.model.CaseCreateRequest; import com.datadog.api.client.v2.model.CasePriority; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.NullableUserRelationship; import com.datadog.api.client.v2.model.NullableUserRelationshipData; import com.datadog.api.client.v2.model.ProjectRelationship; import com.datadog.api.client.v2.model.ProjectRelationshipData; import com.datadog.api.client.v2.model.ProjectResourceType; import com.datadog.api.client.v2.model.UserResourceType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); CaseCreateRequest body = new CaseCreateRequest() .data( new CaseCreate() .attributes( new CaseCreateAttributes() .priority(CasePriority.NOT_DEFINED) .title("Security breach investigation in 0cfbc5cbc676ee71") .typeId("00000000-0000-0000-0000-000000000001")) .relationships( new CaseCreateRelationships() .assignee( new NullableUserRelationship() .data( new NullableUserRelationshipData() .id(USER_DATA_ID) .type(UserResourceType.USER))) .project( new ProjectRelationship() .data( new ProjectRelationshipData() .id("d4bbe1af-f36e-42f1-87c1-493ca35c320e") .type(ProjectResourceType.PROJECT)))) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.createCase(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#createCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a case returns "CREATED" response ``` """ Create a case returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_create import CaseCreate from datadog_api_client.v2.model.case_create_attributes import CaseCreateAttributes from datadog_api_client.v2.model.case_create_relationships import CaseCreateRelationships from datadog_api_client.v2.model.case_create_request import CaseCreateRequest 
from datadog_api_client.v2.model.case_priority import CasePriority from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.nullable_user_relationship import NullableUserRelationship from datadog_api_client.v2.model.nullable_user_relationship_data import NullableUserRelationshipData from datadog_api_client.v2.model.project_relationship import ProjectRelationship from datadog_api_client.v2.model.project_relationship_data import ProjectRelationshipData from datadog_api_client.v2.model.project_resource_type import ProjectResourceType from datadog_api_client.v2.model.user_resource_type import UserResourceType # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = CaseCreateRequest( data=CaseCreate( attributes=CaseCreateAttributes( priority=CasePriority.NOT_DEFINED, title="Security breach investigation in 0cfbc5cbc676ee71", type_id="00000000-0000-0000-0000-000000000001", ), relationships=CaseCreateRelationships( assignee=NullableUserRelationship( data=NullableUserRelationshipData( id=USER_DATA_ID, type=UserResourceType.USER, ), ), project=ProjectRelationship( data=ProjectRelationshipData( id="d4bbe1af-f36e-42f1-87c1-493ca35c320e", type=ProjectResourceType.PROJECT, ), ), ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.create_case(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a case returns "CREATED" response ``` # Create a case returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::CaseCreateRequest.new({ data: DatadogAPIClient::V2::CaseCreate.new({ attributes: DatadogAPIClient::V2::CaseCreateAttributes.new({ priority: DatadogAPIClient::V2::CasePriority::NOT_DEFINED, title: "Security breach investigation in 0cfbc5cbc676ee71", type_id: "00000000-0000-0000-0000-000000000001", }), relationships: DatadogAPIClient::V2::CaseCreateRelationships.new({ assignee: DatadogAPIClient::V2::NullableUserRelationship.new({ data: DatadogAPIClient::V2::NullableUserRelationshipData.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::UserResourceType::USER, }), }), project: DatadogAPIClient::V2::ProjectRelationship.new({ data: DatadogAPIClient::V2::ProjectRelationshipData.new({ id: "d4bbe1af-f36e-42f1-87c1-493ca35c320e", type: DatadogAPIClient::V2::ProjectResourceType::PROJECT, }), }), }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.create_case(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a case returns "CREATED" response ``` // Create a case returns "CREATED" response use datadog_api_client::datadog; use
datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseCreate; use datadog_api_client::datadogV2::model::CaseCreateAttributes; use datadog_api_client::datadogV2::model::CaseCreateRelationships; use datadog_api_client::datadogV2::model::CaseCreateRequest; use datadog_api_client::datadogV2::model::CasePriority; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::NullableUserRelationship; use datadog_api_client::datadogV2::model::NullableUserRelationshipData; use datadog_api_client::datadogV2::model::ProjectRelationship; use datadog_api_client::datadogV2::model::ProjectRelationshipData; use datadog_api_client::datadogV2::model::ProjectResourceType; use datadog_api_client::datadogV2::model::UserResourceType; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = CaseCreateRequest::new( CaseCreate::new( CaseCreateAttributes::new( "Security breach investigation in 0cfbc5cbc676ee71".to_string(), "00000000-0000-0000-0000-000000000001".to_string(), ) .priority(CasePriority::NOT_DEFINED), CaseResourceType::CASE, ) .relationships( CaseCreateRelationships::new(ProjectRelationship::new(ProjectRelationshipData::new( "d4bbe1af-f36e-42f1-87c1-493ca35c320e".to_string(), ProjectResourceType::PROJECT, ))) .assignee(Some(NullableUserRelationship::new(Some( NullableUserRelationshipData::new(user_data_id.clone(), UserResourceType::USER), )))), ), ); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.create_case(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a case returns "CREATED" response ``` /** * Create a case returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.CaseManagementApiCreateCaseRequest = { body: { data: { attributes: { priority: "NOT_DEFINED", title: "Security breach investigation in 0cfbc5cbc676ee71", typeId: "00000000-0000-0000-0000-000000000001", }, relationships: { assignee: { data: { id: USER_DATA_ID, type: "user", }, }, project: { data: { id: "d4bbe1af-f36e-42f1-87c1-493ca35c320e", type: "project", }, }, }, type: "case", }, }, }; apiInstance .createCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
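The 201 response carries both the new case's UUID (`data.id`) and its key (`data.attributes.key`), and the read endpoints below accept either value as `case_id`. A minimal sketch with the Python client, assuming attribute access mirrors the `CaseResponse` model above; `CASE_KEY` is a hypothetical stand-in for an identifier returned by a create call:

```python
"""
Fetch a case back using an identifier from the create response.
Sketch only: CASE_KEY is a hypothetical environment variable holding either
the case UUID (data.id) or its key (data.attributes.key, e.g. "CASEM-4523").
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_api import CaseManagementApi

CASE_KEY = environ["CASE_KEY"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = CaseManagementApi(api_client)
    case = api.get_case(case_id=CASE_KEY)
    # A few fields from the CaseResponse model documented above
    print(case.data.id)                 # case UUID
    print(case.data.attributes.key)     # e.g. "CASEM-4523"
    print(case.data.attributes.status)  # OPEN, IN_PROGRESS, or CLOSED
```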
``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Get all projects Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/projects" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all projects ``` """ Get all projects returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.get_projects() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all projects ``` # Get all projects returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new p api_instance.get_projects() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all projects ``` // Get all projects 
returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.GetProjects(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.GetProjects`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.GetProjects`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all projects ``` // Get all projects returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.ProjectsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); try { ProjectsResponse result = apiInstance.getProjects(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#getProjects"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all projects ``` // Get all projects returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.get_projects().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all projects ``` /** * Get all projects returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); apiInstance .getProjects() .then((data: v2.ProjectsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the details of a case](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-case-v2) GET https://api.ap1.datadoghq.com/api/v2/cases/{case_id}https://api.ap2.datadoghq.com/api/v2/cases/{case_id}https://api.datadoghq.eu/api/v2/cases/{case_id}https://api.ddog-gov.com/api/v2/cases/{case_id}https://api.datadoghq.com/api/v2/cases/{case_id}https://api.us3.datadoghq.com/api/v2/cases/{case_id}https://api.us5.datadoghq.com/api/v2/cases/{case_id} ### Overview Get the details of case by `case_id` OAuth apps require the `cases_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#GetCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#GetCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#GetCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#GetCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#GetCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#GetCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Get the details of a case Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the details of a case ``` """ Get the details of a case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.get_case( case_id=CASE_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the details of a case ``` # Get the details of a case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] p api_instance.get_case(CASE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the details of a case ``` // Get the details of a case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.GetCase(ctx, CaseID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.GetCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.GetCase`:\n%s\n", responseContent) } ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the details of a case ``` // Get the details of a case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); try { CaseResponse result = apiInstance.getCase(CASE_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#getCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the details of a case ``` // Get the details of a case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.get_case(case_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the details of a case ``` /** * Get the details of a case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiGetCaseRequest = { caseId: CASE_ID, }; apiInstance .getCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the details of a project](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-project) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-project-v2) GET https://api.ap1.datadoghq.com/api/v2/cases/projects/{project_id}https://api.ap2.datadoghq.com/api/v2/cases/projects/{project_id}https://api.datadoghq.eu/api/v2/cases/projects/{project_id}https://api.ddog-gov.com/api/v2/cases/projects/{project_id}https://api.datadoghq.com/api/v2/cases/projects/{project_id}https://api.us3.datadoghq.com/api/v2/cases/projects/{project_id}https://api.us5.datadoghq.com/api/v2/cases/projects/{project_id} ### Overview Get the details of a project by `project_id`. OAuth apps require the `cases_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description project_id [_required_] string Project UUID ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#GetProject-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#GetProject-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#GetProject-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#GetProject-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#GetProject-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#GetProject-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Project response Field Type Description data object A Project attributes [_required_] object Project attributes key string The project's key name string Project's name id [_required_] string The Project's identifier relationships object Project relationships member_team object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. member_user object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` type [_required_] enum Project resource type Allowed enum values: `project` default: `project` ``` { "data": { "attributes": { "key": "CASEM", "name": "string" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "member_team": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "member_user": { "data": [ { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } ] } }, "type": "project" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) ##### Get the details of a project Copy ``` # Path parameters export project_id="e555e290-ed65-49bd-ae18-8acbfcf18db7" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/projects/${project_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the details of a project ``` """ Get the details of a project returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.get_project( project_id="project_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the details of a project ``` # Get the details of a project returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new p api_instance.get_project("project_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the details of a project ``` /** * Get the details of a project returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); const params: v2.CaseManagementApiGetProjectRequest = { projectId: "project_id", }; apiInstance .getProject(params) .then((data: v2.ProjectResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` ##### Get the details of a project ``` // Get the details of a project returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.GetProject(ctx, "project_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.GetProject`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.GetProject`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the details of a project ``` // Get the details of a project returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.ProjectResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); try { ProjectResponse result = apiInstance.getProject("e555e290-ed65-49bd-ae18-8acbfcf18db7"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#getProject"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the details of a project ``` // Get the details of a project returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.get_project("project_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` * * * ## [Remove a project](https://docs.datadoghq.com/api/latest/case-management/#remove-a-project) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#remove-a-project-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cases/projects/{project_id} https://api.ap2.datadoghq.com/api/v2/cases/projects/{project_id} https://api.datadoghq.eu/api/v2/cases/projects/{project_id} https://api.ddog-gov.com/api/v2/cases/projects/{project_id} https://api.datadoghq.com/api/v2/cases/projects/{project_id} https://api.us3.datadoghq.com/api/v2/cases/projects/{project_id} https://api.us5.datadoghq.com/api/v2/cases/projects/{project_id} ### Overview Remove a project using the project’s `id`; a sketch for resolving that UUID from a project key follows the path parameters below. OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description project_id [_required_] string Project UUID
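Because `project_id` must be the project's UUID rather than its key, one way to obtain it is through "Get all projects" above. A hedged sketch with the Python client (`PROJECT_KEY` is a hypothetical stand-in; field access is assumed to follow the projects response model shown earlier):

```python
"""
Resolve a project's UUID from its key, then remove the project.
Sketch only: PROJECT_KEY is a hypothetical environment variable, and
attribute access is assumed to mirror the projects response model above.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_api import CaseManagementApi

PROJECT_KEY = environ["PROJECT_KEY"]  # e.g. "CASEM"

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = CaseManagementApi(api_client)
    project_id = next(
        p.id for p in api.get_projects().data if p.attributes.key == PROJECT_KEY
    )
    # DELETE /api/v2/cases/projects/{project_id}; success returns 204 No Content.
    api.delete_project(project_id=project_id)
```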
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Remove a project Copy ``` # Path parameters export project_id="e555e290-ed65-49bd-ae18-8acbfcf18db7" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/projects/${project_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a project ``` """ Remove a project returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) api_instance.delete_project( project_id="project_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a project ``` # Remove a project returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new api_instance.delete_project("project_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a project ``` // Remove a project returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) r, err := api.DeleteProject(ctx, "project_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.DeleteProject`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a project ``` // Remove a project returns "No Content" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); try { apiInstance.deleteProject("e555e290-ed65-49bd-ae18-8acbfcf18db7"); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#deleteProject"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a project ``` // Remove a project returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.delete_project("project_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a project ``` /** * Remove a project returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); const params: v2.CaseManagementApiDeleteProjectRequest = { projectId: "project_id", }; apiInstance .deleteProject(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case description](https://docs.datadoghq.com/api/latest/case-management/#update-case-description) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-description-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/descriptionhttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/descriptionhttps://api.datadoghq.eu/api/v2/cases/{case_id}/descriptionhttps://api.ddog-gov.com/api/v2/cases/{case_id}/descriptionhttps://api.datadoghq.com/api/v2/cases/{case_id}/descriptionhttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/descriptionhttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/description ### Overview Update case description OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case description update payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case update description attributes [_required_] object Case update description attributes description [_required_] string Case new description type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "description": "Seeing some weird memory increase... Updating the description" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseDescription-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case description returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/description" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Seeing some weird memory increase... Updating the description" }, "type": "case" } } EOF ``` ##### Update case description returns "OK" response ``` // Update case description returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseUpdateDescriptionRequest{ Data: datadogV2.CaseUpdateDescription{ Attributes: datadogV2.CaseUpdateDescriptionAttributes{ Description: "Seeing some weird memory increase...
Updating the description", }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdateCaseDescription(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdateCaseDescription`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdateCaseDescription`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case description returns "OK" response ``` // Update case description returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseUpdateDescription; import com.datadog.api.client.v2.model.CaseUpdateDescriptionAttributes; import com.datadog.api.client.v2.model.CaseUpdateDescriptionRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseUpdateDescriptionRequest body = new CaseUpdateDescriptionRequest() .data( new CaseUpdateDescription() .attributes( new CaseUpdateDescriptionAttributes() .description( "Seeing some weird memory increase... 
Updating the description")) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updateCaseDescription(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updateCaseDescription"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case description returns "OK" response ``` """ Update case description returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_update_description import CaseUpdateDescription from datadog_api_client.v2.model.case_update_description_attributes import CaseUpdateDescriptionAttributes from datadog_api_client.v2.model.case_update_description_request import CaseUpdateDescriptionRequest # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseUpdateDescriptionRequest( data=CaseUpdateDescription( attributes=CaseUpdateDescriptionAttributes( description="Seeing some weird memory increase... Updating the description", ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_case_description(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case description returns "OK" response ``` # Update case description returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseUpdateDescriptionRequest.new({ data: DatadogAPIClient::V2::CaseUpdateDescription.new({ attributes: DatadogAPIClient::V2::CaseUpdateDescriptionAttributes.new({ description: "Seeing some weird memory increase... 
Updating the description", }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_case_description(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case description returns "OK" response ``` // Update case description returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseUpdateDescription; use datadog_api_client::datadogV2::model::CaseUpdateDescriptionAttributes; use datadog_api_client::datadogV2::model::CaseUpdateDescriptionRequest; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseUpdateDescriptionRequest::new(CaseUpdateDescription::new( CaseUpdateDescriptionAttributes::new( "Seeing some weird memory increase... Updating the description".to_string(), ), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.update_case_description(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case description returns "OK" response ``` /** * Update case description returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUpdateCaseDescriptionRequest = { body: { data: { attributes: { description: "Seeing some weird memory increase... Updating the description", }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .updateCaseDescription(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case status](https://docs.datadoghq.com/api/latest/case-management/#update-case-status) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-status-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/statushttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/statushttps://api.datadoghq.eu/api/v2/cases/{case_id}/statushttps://api.ddog-gov.com/api/v2/cases/{case_id}/statushttps://api.datadoghq.com/api/v2/cases/{case_id}/statushttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/statushttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/status ### Overview Update case status OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case status update payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case update status attributes [_required_] object Case update status attributes status [_required_] enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "status": "IN_PROGRESS" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdateStatus-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case status returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/status" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "status": "IN_PROGRESS" }, "type": "case" } } EOF ``` ##### Update case status returns "OK" response ``` // Update case status returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseUpdateStatusRequest{ Data: datadogV2.CaseUpdateStatus{ Attributes: datadogV2.CaseUpdateStatusAttributes{ Status: datadogV2.CASESTATUS_IN_PROGRESS, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdateStatus(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdateStatus`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", "  ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdateStatus`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case status returns "OK" response ``` // Update case status returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseStatus; import com.datadog.api.client.v2.model.CaseUpdateStatus; import com.datadog.api.client.v2.model.CaseUpdateStatusAttributes; import com.datadog.api.client.v2.model.CaseUpdateStatusRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there
is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseUpdateStatusRequest body = new CaseUpdateStatusRequest() .data( new CaseUpdateStatus() .attributes(new CaseUpdateStatusAttributes().status(CaseStatus.IN_PROGRESS)) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updateStatus(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updateStatus"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case status returns "OK" response ``` """ Update case status returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_status import CaseStatus from datadog_api_client.v2.model.case_update_status import CaseUpdateStatus from datadog_api_client.v2.model.case_update_status_attributes import CaseUpdateStatusAttributes from datadog_api_client.v2.model.case_update_status_request import CaseUpdateStatusRequest # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseUpdateStatusRequest( data=CaseUpdateStatus( attributes=CaseUpdateStatusAttributes( status=CaseStatus.IN_PROGRESS, ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_status(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case status returns "OK" response ``` # Update case status returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseUpdateStatusRequest.new({ data: DatadogAPIClient::V2::CaseUpdateStatus.new({ attributes: DatadogAPIClient::V2::CaseUpdateStatusAttributes.new({ status: DatadogAPIClient::V2::CaseStatus::IN_PROGRESS, }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_status(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case status returns "OK" response ``` // Update case status returns 
"OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseStatus; use datadog_api_client::datadogV2::model::CaseUpdateStatus; use datadog_api_client::datadogV2::model::CaseUpdateStatusAttributes; use datadog_api_client::datadogV2::model::CaseUpdateStatusRequest; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseUpdateStatusRequest::new(CaseUpdateStatus::new( CaseUpdateStatusAttributes::new(CaseStatus::IN_PROGRESS), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.update_status(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case status returns "OK" response ``` /** * Update case status returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUpdateStatusRequest = { body: { data: { attributes: { status: "IN_PROGRESS", }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .updateStatus(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case title](https://docs.datadoghq.com/api/latest/case-management/#update-case-title) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-title-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/titlehttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/titlehttps://api.datadoghq.eu/api/v2/cases/{case_id}/titlehttps://api.ddog-gov.com/api/v2/cases/{case_id}/titlehttps://api.datadoghq.com/api/v2/cases/{case_id}/titlehttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/titlehttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/title ### Overview Update case title OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. 
### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case title update payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case update title attributes [_required_] object Case update title attributes title [_required_] string Case new title type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "title": "[UPDATED] Memory leak investigation on API" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseTitle-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case title returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/title" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "title": "[UPDATED] Memory leak investigation on API" }, "type": "case" } } EOF ``` ##### Update case title returns "OK" response ``` // Update case title returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseUpdateTitleRequest{ Data: datadogV2.CaseUpdateTitle{ Attributes: datadogV2.CaseUpdateTitleAttributes{ Title: "[UPDATED] Memory leak investigation on API", }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdateCaseTitle(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdateCaseTitle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", "  ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdateCaseTitle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case title returns "OK" response ``` // Update case title returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseUpdateTitle; import com.datadog.api.client.v2.model.CaseUpdateTitleAttributes; import com.datadog.api.client.v2.model.CaseUpdateTitleRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient();
CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseUpdateTitleRequest body = new CaseUpdateTitleRequest() .data( new CaseUpdateTitle() .attributes( new CaseUpdateTitleAttributes() .title("[UPDATED] Memory leak investigation on API")) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updateCaseTitle(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updateCaseTitle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case title returns "OK" response ``` """ Update case title returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_update_title import CaseUpdateTitle from datadog_api_client.v2.model.case_update_title_attributes import CaseUpdateTitleAttributes from datadog_api_client.v2.model.case_update_title_request import CaseUpdateTitleRequest # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseUpdateTitleRequest( data=CaseUpdateTitle( attributes=CaseUpdateTitleAttributes( title="[UPDATED] Memory leak investigation on API", ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_case_title(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case title returns "OK" response ``` # Update case title returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseUpdateTitleRequest.new({ data: DatadogAPIClient::V2::CaseUpdateTitle.new({ attributes: DatadogAPIClient::V2::CaseUpdateTitleAttributes.new({ title: "[UPDATED] Memory leak investigation on API", }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_case_title(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case title 
returns "OK" response ``` // Update case title returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseUpdateTitle; use datadog_api_client::datadogV2::model::CaseUpdateTitleAttributes; use datadog_api_client::datadogV2::model::CaseUpdateTitleRequest; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseUpdateTitleRequest::new(CaseUpdateTitle::new( CaseUpdateTitleAttributes::new("[UPDATED] Memory leak investigation on API".to_string()), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.update_case_title(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case title returns "OK" response ``` /** * Update case title returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUpdateCaseTitleRequest = { body: { data: { attributes: { title: "[UPDATED] Memory leak investigation on API", }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .updateCaseTitle(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case priority](https://docs.datadoghq.com/api/latest/case-management/#update-case-priority) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-priority-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/priorityhttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/priorityhttps://api.datadoghq.eu/api/v2/cases/{case_id}/priorityhttps://api.ddog-gov.com/api/v2/cases/{case_id}/priorityhttps://api.datadoghq.com/api/v2/cases/{case_id}/priorityhttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/priorityhttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/priority ### Overview Update case priority OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. 
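The priority payload mirrors the status payload above: `priority` takes one of the allowed enum values (`NOT_DEFINED`, `P1` through `P5`). Below is a minimal Python sketch for this endpoint, written by analogy with the `update_status` Python example above; the `CasePriority` model and the `update_priority` method follow the class names in the Go and Java examples later in this section and the Python client's usual naming, so verify them against your installed client version.

```
"""
Sketch (by analogy with the update_status example): set a case's priority to P3.
Model module paths and the update_priority method name are assumed from the
client's naming convention; check them against your datadog-api-client version.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_api import CaseManagementApi
from datadog_api_client.v2.model.case_priority import CasePriority
from datadog_api_client.v2.model.case_resource_type import CaseResourceType
from datadog_api_client.v2.model.case_update_priority import CaseUpdatePriority
from datadog_api_client.v2.model.case_update_priority_attributes import CaseUpdatePriorityAttributes
from datadog_api_client.v2.model.case_update_priority_request import CaseUpdatePriorityRequest

# there is a valid "case" in the system
CASE_ID = environ["CASE_ID"]

body = CaseUpdatePriorityRequest(
    data=CaseUpdatePriority(
        attributes=CaseUpdatePriorityAttributes(
            priority=CasePriority.P3,
        ),
        type=CaseResourceType.CASE,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CaseManagementApi(api_client)
    response = api_instance.update_priority(case_id=CASE_ID, body=body)
    print(response)
```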
### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case priority update payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case priority status attributes [_required_] object Case update priority attributes priority [_required_] enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "priority": "P3" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdatePriority-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. 
type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case priority returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/priority" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "priority": "P3" }, "type": "case" } } EOF ``` ##### Update case priority returns "OK" response ``` // Update case priority returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseUpdatePriorityRequest{ Data: datadogV2.CaseUpdatePriority{ Attributes: datadogV2.CaseUpdatePriorityAttributes{ Priority: datadogV2.CASEPRIORITY_P3, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdatePriority(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdatePriority`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdatePriority`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case priority returns "OK" response ``` // Update case priority returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CasePriority; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseUpdatePriority; import com.datadog.api.client.v2.model.CaseUpdatePriorityAttributes; import com.datadog.api.client.v2.model.CaseUpdatePriorityRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient =
ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseUpdatePriorityRequest body = new CaseUpdatePriorityRequest() .data( new CaseUpdatePriority() .attributes(new CaseUpdatePriorityAttributes().priority(CasePriority.P3)) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updatePriority(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updatePriority"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case priority returns "OK" response ``` """ Update case priority returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_priority import CasePriority from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_update_priority import CaseUpdatePriority from datadog_api_client.v2.model.case_update_priority_attributes import CaseUpdatePriorityAttributes from datadog_api_client.v2.model.case_update_priority_request import CaseUpdatePriorityRequest # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseUpdatePriorityRequest( data=CaseUpdatePriority( attributes=CaseUpdatePriorityAttributes( priority=CasePriority.P3, ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_priority(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case priority returns "OK" response ``` # Update case priority returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseUpdatePriorityRequest.new({ data: DatadogAPIClient::V2::CaseUpdatePriority.new({ attributes: DatadogAPIClient::V2::CaseUpdatePriorityAttributes.new({ priority: DatadogAPIClient::V2::CasePriority::P3, }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_priority(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case priority returns "OK" response ``` // Update case priority returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CasePriority; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseUpdatePriority; use datadog_api_client::datadogV2::model::CaseUpdatePriorityAttributes; use datadog_api_client::datadogV2::model::CaseUpdatePriorityRequest; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseUpdatePriorityRequest::new(CaseUpdatePriority::new( CaseUpdatePriorityAttributes::new(CasePriority::P3), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.update_priority(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case priority returns "OK" response ``` /** * Update case priority returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUpdatePriorityRequest = { body: { data: { attributes: { priority: "P3", }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .updatePriority(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Assign case](https://docs.datadoghq.com/api/latest/case-management/#assign-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#assign-case-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/assignhttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/assignhttps://api.datadoghq.eu/api/v2/cases/{case_id}/assignhttps://api.ddog-gov.com/api/v2/cases/{case_id}/assignhttps://api.datadoghq.com/api/v2/cases/{case_id}/assignhttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/assignhttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/assign ### Overview Assign case to a user OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. 
### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Assign case payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case assign attributes [_required_] object Case assign attributes assignee_id [_required_] string Assignee's UUID type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "assignee_id": "string" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#AssignCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. 
data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Assign case returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}/assign" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "assignee_id": "string" }, "type": "case" } } EOF ``` ##### Assign case returns "OK" response ``` // Assign case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.CaseAssignRequest{ Data: datadogV2.CaseAssign{ Attributes: datadogV2.CaseAssignAttributes{ AssigneeId: UserDataID, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.AssignCase(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.AssignCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.AssignCase`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Assign case returns "OK" response ``` // Assign case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseAssign; import com.datadog.api.client.v2.model.CaseAssignAttributes; import com.datadog.api.client.v2.model.CaseAssignRequest; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = 
System.getenv("CASE_ID"); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); CaseAssignRequest body = new CaseAssignRequest() .data( new CaseAssign() .attributes(new CaseAssignAttributes().assigneeId(USER_DATA_ID)) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.assignCase(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#assignCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Assign case returns "OK" response ``` """ Assign case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_assign import CaseAssign from datadog_api_client.v2.model.case_assign_attributes import CaseAssignAttributes from datadog_api_client.v2.model.case_assign_request import CaseAssignRequest from datadog_api_client.v2.model.case_resource_type import CaseResourceType # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = CaseAssignRequest( data=CaseAssign( attributes=CaseAssignAttributes( assignee_id=USER_DATA_ID, ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.assign_case(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Assign case returns "OK" response ``` # Assign case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::CaseAssignRequest.new({ data: DatadogAPIClient::V2::CaseAssign.new({ attributes: DatadogAPIClient::V2::CaseAssignAttributes.new({ assignee_id: USER_DATA_ID, }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.assign_case(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Assign case returns "OK" response ``` // Assign case returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseAssign; use datadog_api_client::datadogV2::model::CaseAssignAttributes; use datadog_api_client::datadogV2::model::CaseAssignRequest; use datadog_api_client::datadogV2::model::CaseResourceType; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = CaseAssignRequest::new(CaseAssign::new( CaseAssignAttributes::new(user_data_id.clone()), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.assign_case(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Assign case returns "OK" response ``` /** * Assign case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.CaseManagementApiAssignCaseRequest = { body: { data: { attributes: { assigneeId: USER_DATA_ID, }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .assignCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unassign case](https://docs.datadoghq.com/api/latest/case-management/#unassign-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#unassign-case-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/unassignhttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/unassignhttps://api.datadoghq.eu/api/v2/cases/{case_id}/unassignhttps://api.ddog-gov.com/api/v2/cases/{case_id}/unassignhttps://api.datadoghq.com/api/v2/cases/{case_id}/unassignhttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/unassignhttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/unassign ### Overview Unassign case OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint.
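Unassigning a case, like the archive and unarchive endpoints further down, takes an empty request body of the form `{ "data": { "type": "case" } }`. The TypeScript sketch below reuses that shared wrapper for the documented `unassignCase` and `archiveCase` calls in a small, hypothetical clean-up helper; chaining the two calls is only an example of how the common payload can be factored out.

```
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.CaseManagementApi(configuration);

// The unassign, archive, and unarchive endpoints all accept the same empty
// wrapper body: { data: { type: "case" } }.
const emptyCaseBody = { data: { type: "case" as const } };

// Hypothetical clean-up helper: drop the assignee, then archive the case,
// using the documented unassignCase and archiveCase calls.
async function unassignAndArchive(caseId: string): Promise<v2.CaseResponse> {
  await apiInstance.unassignCase({ caseId, body: emptyCaseBody });
  return apiInstance.archiveCase({ caseId, body: emptyCaseBody });
}

unassignAndArchive(process.env.CASE_ID as string)
  .then((resp) => console.log(JSON.stringify(resp)))
  .catch((err) => console.error(err));
```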
### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Unassign case payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case empty request data type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UnassignCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. 
type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Unassign case returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}/unassign" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "case" } } EOF ``` ##### Unassign case returns "OK" response ``` // Unassign case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseEmptyRequest{ Data: datadogV2.CaseEmpty{ Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UnassignCase(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UnassignCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UnassignCase`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Unassign case returns "OK" response ``` // Unassign case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseEmpty; import com.datadog.api.client.v2.model.CaseEmptyRequest; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseEmptyRequest body = new CaseEmptyRequest().data(new CaseEmpty().type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.unassignCase(CASE_ID, body); System.out.println(result); } catch (ApiException e) { 
System.err.println("Exception when calling CaseManagementApi#unassignCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Unassign case returns "OK" response ``` """ Unassign case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_empty import CaseEmpty from datadog_api_client.v2.model.case_empty_request import CaseEmptyRequest from datadog_api_client.v2.model.case_resource_type import CaseResourceType # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseEmptyRequest( data=CaseEmpty( type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.unassign_case(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Unassign case returns "OK" response ``` # Unassign case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseEmptyRequest.new({ data: DatadogAPIClient::V2::CaseEmpty.new({ type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.unassign_case(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Unassign case returns "OK" response ``` // Unassign case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseEmpty; use datadog_api_client::datadogV2::model::CaseEmptyRequest; use datadog_api_client::datadogV2::model::CaseResourceType; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseEmptyRequest::new(CaseEmpty::new(CaseResourceType::CASE)); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.unassign_case(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Unassign case returns "OK" response ``` /** * Unassign case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUnassignCaseRequest = { body: { data: { type: "case", }, }, caseId: CASE_ID, }; apiInstance .unassignCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Archive case](https://docs.datadoghq.com/api/latest/case-management/#archive-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#archive-case-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/archivehttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/archivehttps://api.datadoghq.eu/api/v2/cases/{case_id}/archivehttps://api.ddog-gov.com/api/v2/cases/{case_id}/archivehttps://api.datadoghq.com/api/v2/cases/{case_id}/archivehttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/archivehttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/archive ### Overview Archive case OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Archive case payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case empty request data type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#ArchiveCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object.
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Archive case returns "OK" response Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}/archive" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "case" } } EOF ``` ##### Archive case returns "OK" response ``` // Archive case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseEmptyRequest{ Data: datadogV2.CaseEmpty{ Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.ArchiveCase(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.ArchiveCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.ArchiveCase`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Archive case returns "OK" response ``` // Archive case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseEmpty; import com.datadog.api.client.v2.model.CaseEmptyRequest; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseEmptyRequest body = new CaseEmptyRequest().data(new CaseEmpty().type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.archiveCase(CASE_ID, body); System.out.println(result); } catch (ApiException e) { 
      System.err.println("Exception when calling CaseManagementApi#archiveCase");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Archive case returns "OK" response

```
"""
Archive case returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.case_management_api import CaseManagementApi
from datadog_api_client.v2.model.case_empty import CaseEmpty
from datadog_api_client.v2.model.case_empty_request import CaseEmptyRequest
from datadog_api_client.v2.model.case_resource_type import CaseResourceType

# there is a valid "case" in the system
CASE_ID = environ["CASE_ID"]

body = CaseEmptyRequest(
    data=CaseEmpty(
        type=CaseResourceType.CASE,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CaseManagementApi(api_client)
    response = api_instance.archive_case(case_id=CASE_ID, body=body)

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Archive case returns "OK" response

```
# Archive case returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CaseManagementAPI.new

# there is a valid "case" in the system
CASE_ID = ENV["CASE_ID"]

body = DatadogAPIClient::V2::CaseEmptyRequest.new({
  data: DatadogAPIClient::V2::CaseEmpty.new({
    type: DatadogAPIClient::V2::CaseResourceType::CASE,
  }),
})
p api_instance.archive_case(CASE_ID, body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Archive case returns "OK" response

```
// Archive case returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI;
use datadog_api_client::datadogV2::model::CaseEmpty;
use datadog_api_client::datadogV2::model::CaseEmptyRequest;
use datadog_api_client::datadogV2::model::CaseResourceType;

#[tokio::main]
async fn main() {
    // there is a valid "case" in the system
    let case_id = std::env::var("CASE_ID").unwrap();

    let body = CaseEmptyRequest::new(CaseEmpty::new(CaseResourceType::CASE));
    let configuration = datadog::Configuration::new();
    let api = CaseManagementAPI::with_config(configuration);
    let resp = api.archive_case(case_id.clone(), body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Archive case returns "OK" response ``` /** * Archive case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiArchiveCaseRequest = { body: { data: { type: "case", }, }, caseId: CASE_ID, }; apiInstance .archiveCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unarchive case](https://docs.datadoghq.com/api/latest/case-management/#unarchive-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#unarchive-case-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/unarchivehttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/unarchivehttps://api.datadoghq.eu/api/v2/cases/{case_id}/unarchivehttps://api.ddog-gov.com/api/v2/cases/{case_id}/unarchivehttps://api.datadoghq.com/api/v2/cases/{case_id}/unarchivehttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/unarchivehttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/unarchive ### Overview Unarchive case OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Unarchive case payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case empty request data type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UnarchiveCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Unarchive case returns "OK" response ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/unarchive" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "case" } } EOF ``` ##### Unarchive case returns "OK" response ``` // Unarchive case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseEmptyRequest{ Data: datadogV2.CaseEmpty{ Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UnarchiveCase(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UnarchiveCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UnarchiveCase`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Unarchive case returns "OK" response ``` // Unarchive case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseEmpty; import com.datadog.api.client.v2.model.CaseEmptyRequest; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseEmptyRequest body = new CaseEmptyRequest().data(new CaseEmpty().type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.unarchiveCase(CASE_ID, body); System.out.println(result); } catch (ApiException e) {
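// e.getResponseBody() carries the API error payload documented above, for example {"errors": ["Bad Request"]}.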
System.err.println("Exception when calling CaseManagementApi#unarchiveCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Unarchive case returns "OK" response ``` """ Unarchive case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_empty import CaseEmpty from datadog_api_client.v2.model.case_empty_request import CaseEmptyRequest from datadog_api_client.v2.model.case_resource_type import CaseResourceType # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseEmptyRequest( data=CaseEmpty( type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.unarchive_case(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Unarchive case returns "OK" response ``` # Unarchive case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseEmptyRequest.new({ data: DatadogAPIClient::V2::CaseEmpty.new({ type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.unarchive_case(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Unarchive case returns "OK" response ``` // Unarchive case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseEmpty; use datadog_api_client::datadogV2::model::CaseEmptyRequest; use datadog_api_client::datadogV2::model::CaseResourceType; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseEmptyRequest::new(CaseEmpty::new(CaseResourceType::CASE)); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.unarchive_case(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Unarchive case returns "OK" response ``` /** * Unarchive case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUnarchiveCaseRequest = { body: { data: { type: "case", }, }, caseId: CASE_ID, }; apiInstance .unarchiveCase(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case attributes](https://docs.datadoghq.com/api/latest/case-management/#update-case-attributes) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-attributes-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/attributeshttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/attributeshttps://api.datadoghq.eu/api/v2/cases/{case_id}/attributeshttps://api.ddog-gov.com/api/v2/cases/{case_id}/attributeshttps://api.datadoghq.com/api/v2/cases/{case_id}/attributeshttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/attributeshttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/attributes ### Overview Update case attributes OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case attributes update payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case update attributes attributes [_required_] object Case update attributes attributes attributes [_required_] object The definition of `CaseObjectAttributes` object. 
[string] type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "attributes": { "env": [ "test" ], "service": [ "web-store", "web-api" ], "team": [ "engineer" ] } }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdateAttributes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case attributes returns "OK" response ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/attributes" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "attributes": { "env": [ "test" ], "service": [ "web-store", "web-api" ], "team": [ "engineer" ] } }, "type": "case" } } EOF ``` ##### Update case attributes returns "OK" response ``` // Update case attributes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseUpdateAttributesRequest{ Data: datadogV2.CaseUpdateAttributes{ Attributes: datadogV2.CaseUpdateAttributesAttributes{ Attributes: map[string][]string{ "env": []string{ "test", }, "service": []string{ "web-store", "web-api", }, "team": []string{ "engineer", }, }, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdateAttributes(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdateAttributes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdateAttributes`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case attributes returns "OK" response ``` // Update case attributes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseUpdateAttributes; import com.datadog.api.client.v2.model.CaseUpdateAttributesAttributes; import com.datadog.api.client.v2.model.CaseUpdateAttributesRequest; import
java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseUpdateAttributesRequest body = new CaseUpdateAttributesRequest() .data( new CaseUpdateAttributes() .attributes( new CaseUpdateAttributesAttributes() .attributes( Map.ofEntries( Map.entry("env", Collections.singletonList("test")), Map.entry("service", Arrays.asList("web-store", "web-api")), Map.entry("team", Collections.singletonList("engineer"))))) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updateAttributes(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updateAttributes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case attributes returns "OK" response ``` """ Update case attributes returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_object_attributes import CaseObjectAttributes from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_update_attributes import CaseUpdateAttributes from datadog_api_client.v2.model.case_update_attributes_attributes import CaseUpdateAttributesAttributes from datadog_api_client.v2.model.case_update_attributes_request import CaseUpdateAttributesRequest # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseUpdateAttributesRequest( data=CaseUpdateAttributes( attributes=CaseUpdateAttributesAttributes( attributes=CaseObjectAttributes( env=[ "test", ], service=[ "web-store", "web-api", ], team=[ "engineer", ], ), ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_attributes(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case attributes returns "OK" response ``` # Update case attributes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseUpdateAttributesRequest.new({ data: DatadogAPIClient::V2::CaseUpdateAttributes.new({ attributes: 
DatadogAPIClient::V2::CaseUpdateAttributesAttributes.new({ attributes: { env: [ "test", ], service: [ "web-store", "web-api", ], team: [ "engineer", ], }, }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_attributes(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case attributes returns "OK" response ``` // Update case attributes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseUpdateAttributes; use datadog_api_client::datadogV2::model::CaseUpdateAttributesAttributes; use datadog_api_client::datadogV2::model::CaseUpdateAttributesRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseUpdateAttributesRequest::new(CaseUpdateAttributes::new( CaseUpdateAttributesAttributes::new(BTreeMap::from([ ("env".to_string(), vec!["test".to_string()]), ( "service".to_string(), vec!["web-store".to_string(), "web-api".to_string()], ), ("team".to_string(), vec!["engineer".to_string()]), ])), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.update_attributes(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case attributes returns "OK" response ``` /** * Update case attributes returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiUpdateAttributesRequest = { body: { data: { attributes: { attributes: { env: ["test"], service: ["web-store", "web-api"], team: ["engineer"], }, }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .updateAttributes(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Comment case](https://docs.datadoghq.com/api/latest/case-management/#comment-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#comment-case-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/commenthttps://api.ap2.datadoghq.com/api/v2/cases/{case_id}/commenthttps://api.datadoghq.eu/api/v2/cases/{case_id}/commenthttps://api.ddog-gov.com/api/v2/cases/{case_id}/commenthttps://api.datadoghq.com/api/v2/cases/{case_id}/commenthttps://api.us3.datadoghq.com/api/v2/cases/{case_id}/commenthttps://api.us5.datadoghq.com/api/v2/cases/{case_id}/comment ### Overview Comment case ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key ### Request #### Body Data (required) Case comment payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case comment attributes [_required_] object Case comment attributes comment [_required_] string The `CaseCommentAttributes` `message`. type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "comment": "Hello World !" }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#CommentCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Timeline response Field Type Description data [object] The `TimelineResponse` `data`. attributes [_required_] object timeline cell author author of the timeline cell Option 1 object timeline cell user author content object user author content. email string user email handle string user handle id string user UUID name string user name type enum user author type. 
Allowed enum values: `USER` cell_content timeline cell content Option 1 object comment content message string comment message created_at date-time Timestamp of when the cell was created deleted_at date-time Timestamp of when the cell was deleted modified_at date-time Timestamp of when the cell was last modified type enum Timeline cell content type Allowed enum values: `COMMENT` id [_required_] string Timeline cell's identifier type [_required_] enum Timeline cell JSON:API resource type Allowed enum values: `timeline_cell` default: `timeline_cell` ``` { "data": [ { "attributes": { "author": { "content": { "email": "string", "handle": "string", "id": "string", "name": "string" }, "type": "USER" }, "cell_content": { "message": "string" }, "created_at": "2019-09-19T10:00:00.000Z", "deleted_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "type": "COMMENT" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "timeline_cell" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Comment case returns "OK" response ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X POST "https://api.datadoghq.com/api/v2/cases/${case_id}/comment" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "comment": "Hello World !" }, "type": "case" } } EOF ``` ##### Comment case returns "OK" response ``` // Comment case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") body := datadogV2.CaseCommentRequest{ Data: datadogV2.CaseComment{ Attributes: datadogV2.CaseCommentAttributes{ Comment: "Hello World !", }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.CommentCase(ctx, CaseID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.CommentCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.CommentCase`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Comment case returns "OK" response ``` // Comment case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseComment; import com.datadog.api.client.v2.model.CaseCommentAttributes; import com.datadog.api.client.v2.model.CaseCommentRequest; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.TimelineResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); CaseCommentRequest body = new
CaseCommentRequest() .data( new CaseComment() .attributes(new CaseCommentAttributes().comment("Hello World !")) .type(CaseResourceType.CASE)); try { TimelineResponse result = apiInstance.commentCase(CASE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#commentCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Comment case returns "OK" response ``` """ Comment case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_comment import CaseComment from datadog_api_client.v2.model.case_comment_attributes import CaseCommentAttributes from datadog_api_client.v2.model.case_comment_request import CaseCommentRequest from datadog_api_client.v2.model.case_resource_type import CaseResourceType # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] body = CaseCommentRequest( data=CaseComment( attributes=CaseCommentAttributes( comment="Hello World !", ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.comment_case(case_id=CASE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Comment case returns "OK" response ``` # Comment case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] body = DatadogAPIClient::V2::CaseCommentRequest.new({ data: DatadogAPIClient::V2::CaseComment.new({ attributes: DatadogAPIClient::V2::CaseCommentAttributes.new({ comment: "Hello World !", }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.comment_case(CASE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Comment case returns "OK" response ``` // Comment case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseComment; use datadog_api_client::datadogV2::model::CaseCommentAttributes; use datadog_api_client::datadogV2::model::CaseCommentRequest; use 
datadog_api_client::datadogV2::model::CaseResourceType; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); let body = CaseCommentRequest::new(CaseComment::new( CaseCommentAttributes::new("Hello World !".to_string()), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api.comment_case(case_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Comment case returns "OK" response ``` /** * Comment case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; const params: v2.CaseManagementApiCommentCaseRequest = { body: { data: { attributes: { comment: "Hello World !", }, type: "case", }, }, caseId: CASE_ID, }; apiInstance .commentCase(params) .then((data: v2.TimelineResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete case comment](https://docs.datadoghq.com/api/latest/case-management/#delete-case-comment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#delete-case-comment-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/comment/{cell_id}https://api.ap2.datadoghq.com/api/v2/cases/{case_id}/comment/{cell_id}https://api.datadoghq.eu/api/v2/cases/{case_id}/comment/{cell_id}https://api.ddog-gov.com/api/v2/cases/{case_id}/comment/{cell_id}https://api.datadoghq.com/api/v2/cases/{case_id}/comment/{cell_id}https://api.us3.datadoghq.com/api/v2/cases/{case_id}/comment/{cell_id}https://api.us5.datadoghq.com/api/v2/cases/{case_id}/comment/{cell_id} ### Overview Delete case comment ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key cell_id [_required_] string Timeline cell’s UUID ### Response * [204](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-204-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseComment-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * 
[Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Delete case comment ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" export cell_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X DELETE "https://api.datadoghq.com/api/v2/cases/${case_id}/comment/${cell_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete case comment ``` """ Delete case comment returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi # there is a valid "case" in the system CASE_ID = environ["CASE_ID"] # there is a valid "comment" in the system COMMENT_ID = environ["COMMENT_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) api_instance.delete_case_comment( case_id=CASE_ID, cell_id=COMMENT_ID, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete case comment ``` # Delete case comment returns "No Content" response require "datadog_api_client"
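# API and application keys are not set in code: the client reads DD_API_KEY, DD_APP_KEY, and DD_SITE from the environment, as in the run instructions for this example.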
api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" in the system CASE_ID = ENV["CASE_ID"] # there is a valid "comment" in the system COMMENT_ID = ENV["COMMENT_ID"] api_instance.delete_case_comment(CASE_ID, COMMENT_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete case comment ``` // Delete case comment returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" in the system CaseID := os.Getenv("CASE_ID") // there is a valid "comment" in the system CommentID := os.Getenv("COMMENT_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) r, err := api.DeleteCaseComment(ctx, CaseID, CommentID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.DeleteCaseComment`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete case comment ``` // Delete case comment returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" in the system String CASE_ID = System.getenv("CASE_ID"); // there is a valid "comment" in the system String COMMENT_ID = System.getenv("COMMENT_ID"); try { apiInstance.deleteCaseComment(CASE_ID, COMMENT_ID); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#deleteCaseComment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete case comment ``` // Delete case comment returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { // there is a valid "case" in the system let case_id = std::env::var("CASE_ID").unwrap(); // there is a valid "comment" in the system let comment_id = 
std::env::var("COMMENT_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api .delete_case_comment(case_id.clone(), comment_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete case comment ``` /** * Delete case comment returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" in the system const CASE_ID = process.env.CASE_ID as string; // there is a valid "comment" in the system const COMMENT_ID = process.env.COMMENT_ID as string; const params: v2.CaseManagementApiDeleteCaseCommentRequest = { caseId: CASE_ID, cellId: COMMENT_ID, }; apiInstance .deleteCaseComment(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update case custom attribute](https://docs.datadoghq.com/api/latest/case-management/#update-case-custom-attribute) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#update-case-custom-attribute-v2) POST https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.ap2.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.datadoghq.eu/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.ddog-gov.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.us3.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.us5.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key} ### Overview Update case custom attribute OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. 
### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key custom_attribute_key [_required_] string Case Custom attribute’s key ### Request #### Body Data (required) Update case custom attribute payload * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Field Type Description data [_required_] object Case update custom attribute attributes [_required_] object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } }, "type": "case" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-200-v2) * [400](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-400-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#UpdateCaseCustomAttribute-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Update case custom attribute Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" export custom_attribute_key="aws_region" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}/custom_attributes/${custom_attribute_key}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "is_multi": false, "type": "NUMBER", "value": {} }, "type": "case" } } EOF ``` ##### Update case custom attribute ``` """ Update case custom attribute returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi from datadog_api_client.v2.model.case_resource_type import CaseResourceType from datadog_api_client.v2.model.case_update_custom_attribute import CaseUpdateCustomAttribute from datadog_api_client.v2.model.case_update_custom_attribute_request import CaseUpdateCustomAttributeRequest from datadog_api_client.v2.model.custom_attribute_type import CustomAttributeType from datadog_api_client.v2.model.custom_attribute_value import CustomAttributeValue # there is a valid "case" with a custom "case_type" in the system CASE_WITH_TYPE_ID = environ["CASE_WITH_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = environ["CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"] body = CaseUpdateCustomAttributeRequest( data=CaseUpdateCustomAttribute( attributes=CustomAttributeValue( type=CustomAttributeType.TEXT, is_multi=True, value=["Abba", "The Cure"], ), type=CaseResourceType.CASE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.update_case_custom_attribute( case_id=CASE_WITH_TYPE_ID, custom_attribute_key=CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update case custom attribute ``` # Update case custom attribute returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" with a custom "case_type" in the system CASE_WITH_TYPE_ID = ENV["CASE_WITH_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = 
ENV["CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"] body = DatadogAPIClient::V2::CaseUpdateCustomAttributeRequest.new({ data: DatadogAPIClient::V2::CaseUpdateCustomAttribute.new({ attributes: DatadogAPIClient::V2::CustomAttributeValue.new({ type: DatadogAPIClient::V2::CustomAttributeType::TEXT, is_multi: true, value: [ "Abba", "The Cure", ], }), type: DatadogAPIClient::V2::CaseResourceType::CASE, }), }) p api_instance.update_case_custom_attribute(CASE_WITH_TYPE_ID, CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update case custom attribute ``` // Update case custom attribute returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "case" with a custom "case_type" in the system CaseWithTypeID := os.Getenv("CASE_WITH_TYPE_ID") // there is a valid "custom_attribute" in the system CustomAttributeAttributesKey := os.Getenv("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY") body := datadogV2.CaseUpdateCustomAttributeRequest{ Data: datadogV2.CaseUpdateCustomAttribute{ Attributes: datadogV2.CustomAttributeValue{ Type: datadogV2.CUSTOMATTRIBUTETYPE_TEXT, IsMulti: true, Value: datadogV2.CustomAttributeValuesUnion{ CustomAttributeMultiStringValue: &[]string{ "Abba", "The Cure", }}, }, Type: datadogV2.CASERESOURCETYPE_CASE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.UpdateCaseCustomAttribute(ctx, CaseWithTypeID, CustomAttributeAttributesKey, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.UpdateCaseCustomAttribute`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.UpdateCaseCustomAttribute`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update case custom attribute ``` // Update case custom attribute returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResourceType; import com.datadog.api.client.v2.model.CaseResponse; import com.datadog.api.client.v2.model.CaseUpdateCustomAttribute; import com.datadog.api.client.v2.model.CaseUpdateCustomAttributeRequest; import com.datadog.api.client.v2.model.CustomAttributeType; import com.datadog.api.client.v2.model.CustomAttributeValue; import com.datadog.api.client.v2.model.CustomAttributeValuesUnion; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" with a custom "case_type" in the system String CASE_WITH_TYPE_ID = System.getenv("CASE_WITH_TYPE_ID"); // there is a valid "custom_attribute" in the system String CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = System.getenv("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"); CaseUpdateCustomAttributeRequest body = new CaseUpdateCustomAttributeRequest() .data( new CaseUpdateCustomAttribute() .attributes( new CustomAttributeValue() .type(CustomAttributeType.TEXT) .isMulti(true) .value( CustomAttributeValuesUnion.fromStringList( Arrays.asList("Abba", "The Cure")))) .type(CaseResourceType.CASE)); try { CaseResponse result = apiInstance.updateCaseCustomAttribute( CASE_WITH_TYPE_ID, CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#updateCaseCustomAttribute"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update case custom attribute ``` // Update case custom attribute returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; use datadog_api_client::datadogV2::model::CaseResourceType; use datadog_api_client::datadogV2::model::CaseUpdateCustomAttribute; use datadog_api_client::datadogV2::model::CaseUpdateCustomAttributeRequest; use datadog_api_client::datadogV2::model::CustomAttributeType; use datadog_api_client::datadogV2::model::CustomAttributeValue; use datadog_api_client::datadogV2::model::CustomAttributeValuesUnion; #[tokio::main] async fn main() { // there is a valid "case" with a custom "case_type" in the system let case_with_type_id = std::env::var("CASE_WITH_TYPE_ID").unwrap(); // there is a valid "custom_attribute" in the system let custom_attribute_attributes_key = std::env::var("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY").unwrap(); let body = CaseUpdateCustomAttributeRequest::new(CaseUpdateCustomAttribute::new( CustomAttributeValue::new( true, CustomAttributeType::TEXT, CustomAttributeValuesUnion::CustomAttributeMultiStringValue(vec![ "Abba".to_string(), "The Cure".to_string(), ]), ), CaseResourceType::CASE, )); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api .update_case_custom_attribute( case_with_type_id.clone(), custom_attribute_attributes_key.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update case custom attribute ``` /** * Update case custom attribute returns "OK" response 
*/ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" with a custom "case_type" in the system const CASE_WITH_TYPE_ID = process.env.CASE_WITH_TYPE_ID as string; // there is a valid "custom_attribute" in the system const CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = process.env .CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY as string; const params: v2.CaseManagementApiUpdateCaseCustomAttributeRequest = { body: { data: { attributes: { type: "TEXT", isMulti: true, value: ["Abba", "The Cure"], }, type: "case", }, }, caseId: CASE_WITH_TYPE_ID, customAttributeKey: CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, }; apiInstance .updateCaseCustomAttribute(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete custom attribute from case](https://docs.datadoghq.com/api/latest/case-management/#delete-custom-attribute-from-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/case-management/#delete-custom-attribute-from-case-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.ap2.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.datadoghq.eu/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.ddog-gov.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.us3.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key}https://api.us5.datadoghq.com/api/v2/cases/{case_id}/custom_attributes/{custom_attribute_key} ### Overview Delete custom attribute from case OAuth apps require the `cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#case-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Case’s UUID or key custom_attribute_key [_required_] string Case Custom attribute’s key ### Response * [200](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseCustomAttribute-200-v2) * [401](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseCustomAttribute-401-v2) * [403](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseCustomAttribute-403-v2) * [404](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseCustomAttribute-404-v2) * [429](https://docs.datadoghq.com/api/latest/case-management/#DeleteCaseCustomAttribute-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) Case response Field Type Description data object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. 
[string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` ``` { "data": { "attributes": { "archived_at": "2019-09-19T10:00:00.000Z", "attributes": { "": [] }, "closed_at": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "custom_attributes": { "": { "is_multi": false, "type": "NUMBER", "value": { "description": "", "type": "" } } }, "description": "string", "jira_issue": { "result": { "issue_id": "string", "issue_key": "string", "issue_url": "string", "project_key": "string" }, "status": "COMPLETED" }, "key": "CASEM-4523", "modified_at": "2019-09-19T10:00:00.000Z", "priority": "NOT_DEFINED", "service_now_ticket": { "result": { "sys_target_link": "string" }, "status": "COMPLETED" }, "status": "OPEN", "title": "Memory leak investigation on API", "type": "STANDARD", "type_id": "3b010bde-09ce-4449-b745-71dd5f861963" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/case-management/) * [Example](https://docs.datadoghq.com/api/latest/case-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/case-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/case-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/case-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/case-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/case-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/case-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/case-management/?code-lang=typescript) ##### Delete custom attribute from case Copy ``` # Path parameters export case_id="f98a5a5b-e0ff-45d4-b2f5-afe6e74de504" export custom_attribute_key="aws_region" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cases/${case_id}/custom_attributes/${custom_attribute_key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete custom attribute from case ``` """ Delete custom attribute from case returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.case_management_api import CaseManagementApi # there is a valid "case" with a custom "case_type" in the system CASE_WITH_TYPE_ID = environ["CASE_WITH_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = environ["CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CaseManagementApi(api_client) response = api_instance.delete_case_custom_attribute( case_id=CASE_WITH_TYPE_ID, custom_attribute_key=CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete custom attribute from case ``` # Delete custom attribute from case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CaseManagementAPI.new # there is a valid "case" with a custom "case_type" in the system CASE_WITH_TYPE_ID = ENV["CASE_WITH_TYPE_ID"] # there is a valid "custom_attribute" in the system CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = ENV["CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"] p api_instance.delete_case_custom_attribute(CASE_WITH_TYPE_ID, CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete custom attribute from case ``` // Delete custom attribute from case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there 
is a valid "case" with a custom "case_type" in the system CaseWithTypeID := os.Getenv("CASE_WITH_TYPE_ID") // there is a valid "custom_attribute" in the system CustomAttributeAttributesKey := os.Getenv("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCaseManagementApi(apiClient) resp, r, err := api.DeleteCaseCustomAttribute(ctx, CaseWithTypeID, CustomAttributeAttributesKey) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CaseManagementApi.DeleteCaseCustomAttribute`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CaseManagementApi.DeleteCaseCustomAttribute`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete custom attribute from case ``` // Delete custom attribute from case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CaseManagementApi; import com.datadog.api.client.v2.model.CaseResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CaseManagementApi apiInstance = new CaseManagementApi(defaultClient); // there is a valid "case" with a custom "case_type" in the system String CASE_WITH_TYPE_ID = System.getenv("CASE_WITH_TYPE_ID"); // there is a valid "custom_attribute" in the system String CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = System.getenv("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY"); try { CaseResponse result = apiInstance.deleteCaseCustomAttribute(CASE_WITH_TYPE_ID, CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CaseManagementApi#deleteCaseCustomAttribute"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete custom attribute from case ``` // Delete custom attribute from case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_case_management::CaseManagementAPI; #[tokio::main] async fn main() { // there is a valid "case" with a custom "case_type" in the system let case_with_type_id = std::env::var("CASE_WITH_TYPE_ID").unwrap(); // there is a valid "custom_attribute" in the system let custom_attribute_attributes_key = std::env::var("CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY").unwrap(); let configuration = datadog::Configuration::new(); let api = CaseManagementAPI::with_config(configuration); let resp = api .delete_case_custom_attribute( case_with_type_id.clone(), 
custom_attribute_attributes_key.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete custom attribute from case ``` /** * Delete custom attribute from case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CaseManagementApi(configuration); // there is a valid "case" with a custom "case_type" in the system const CASE_WITH_TYPE_ID = process.env.CASE_WITH_TYPE_ID as string; // there is a valid "custom_attribute" in the system const CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY = process.env .CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY as string; const params: v2.CaseManagementApiDeleteCaseCustomAttributeRequest = { caseId: CASE_WITH_TYPE_ID, customAttributeKey: CUSTOM_ATTRIBUTE_ATTRIBUTES_KEY, }; apiInstance .deleteCaseCustomAttribute(params) .then((data: v2.CaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/ci-visibility-pipelines # CI Visibility Pipelines Search or aggregate your CI Visibility pipeline events and send them to your Datadog site over HTTP. See the [CI Pipeline Visibility in Datadog page](https://docs.datadoghq.com/continuous_integration/pipelines/) for more information. 
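Before the full endpoint reference that follows, here is a minimal sketch of posting one finished pipeline event over HTTP with the generic Python `requests` library rather than a Datadog client. The endpoint path, required fields, and the `cipipeline_resource_request` type come from the schema documented below; the pipeline name, URLs, commit SHA, and author email are placeholders, and only the `DD-API-KEY` header is shown, so check the endpoint's own code examples for the exact headers your setup needs.

```
# Sketch: post one finished pipeline event over HTTP.
import os
import uuid
from datetime import datetime, timedelta, timezone

import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")  # pick your Datadog site
DD_API_KEY = os.environ["DD_API_KEY"]

end = datetime.now(timezone.utc)
start = end - timedelta(minutes=5)  # must stay within 18 hours of the current time

payload = {
    "data": {
        "type": "cipipeline_resource_request",
        "attributes": {
            "resource": {
                "level": "pipeline",
                "unique_id": str(uuid.uuid4()),  # unique across retries and pipelines
                "name": "example-pipeline",      # placeholder pipeline name
                "url": "https://ci.example.com/pipelines/1",  # placeholder provider URL
                "start": start.isoformat(),      # RFC3339 timestamps
                "end": end.isoformat(),
                "status": "success",
                "partial_retry": False,
                "git": {
                    "repository_url": "https://github.com/example/repo",  # placeholder
                    "sha": "0000000000000000000000000000000000000000",    # placeholder
                    "author_email": "dev@example.com",                    # placeholder
                },
            },
        },
    },
}

resp = requests.post(
    f"https://api.{DD_SITE}/api/v2/ci/pipeline",
    headers={"DD-API-KEY": DD_API_KEY, "Content-Type": "application/json"},
    json=payload,
)
print(resp.status_code, resp.text)
```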
## [Send pipeline event](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#send-pipeline-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#send-pipeline-event-v2) POST https://api.ap1.datadoghq.com/api/v2/ci/pipelinehttps://api.ap2.datadoghq.com/api/v2/ci/pipelinehttps://api.datadoghq.eu/api/v2/ci/pipelinehttps://api.ddog-gov.com/api/v2/ci/pipelinehttps://api.datadoghq.com/api/v2/ci/pipelinehttps://api.us3.datadoghq.com/api/v2/ci/pipelinehttps://api.us5.datadoghq.com/api/v2/ci/pipeline ### Overview Send your pipeline event to your Datadog platform over HTTP. For details about how pipeline executions are modeled and what execution types we support, see [Pipeline Data Model And Execution Types](https://docs.datadoghq.com/continuous_integration/guides/pipeline_data_model/). Multiple events can be sent in an array (up to 1000). Pipeline events can be submitted with a timestamp that is up to 18 hours in the past. The duration between the event start and end times cannot exceed 1 year. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Field Type Description data Data of the pipeline events to create. Option 1 object Data of the pipeline event to create. attributes object Attributes of the pipeline event to create. env string The Datadog environment. provider_name string The name of the CI provider. By default, this is "custom". resource [_required_] Details of the CI pipeline event. Option 1 Details of the top level pipeline, build, or workflow of your CI. Option 1 object Details of a finished pipeline. end [_required_] date-time Time when the pipeline run finished. It cannot be older than 18 hours in the past from the current time. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). is_manual boolean Whether or not the pipeline was triggered manually by the user. is_resumed boolean Whether or not the pipeline was resumed after being blocked. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `pipeline` default: `pipeline` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string Name of the pipeline. 
All pipeline runs for the builds should have the same name. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string parent_pipeline object If the pipeline is triggered as child of another pipeline, this should contain the details of the parent pipeline. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. partial_retry [_required_] boolean Whether or not the pipeline was a partial retry of a previous attempt. A partial retry is one which only runs a subset of the original jobs. pipeline_id string Any ID used in the provider to identify the pipeline run even if it is not unique across retries. If the `pipeline_id` is unique, then both `unique_id` and `pipeline_id` can be set to the same value. previous_attempt object If the pipeline is a retry, this should contain the details of the previous attempt. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the pipeline run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the pipeline. Allowed enum values: `success,error,canceled,skipped,blocked` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. unique_id [_required_] string UUID of the pipeline run. The ID has to be unique across retries and pipelines, including partial retries. url [_required_] string The URL to look at the pipeline in the CI provider UI. Option 2 object Details of a running pipeline. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). is_manual boolean Whether or not the pipeline was triggered manually by the user. is_resumed boolean Whether or not the pipeline was resumed after being blocked. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `pipeline` default: `pipeline` metrics [string] A list of user-defined metrics. 
The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string Name of the pipeline. All pipeline runs for the builds should have the same name. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string parent_pipeline object If the pipeline is triggered as child of another pipeline, this should contain the details of the parent pipeline. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. partial_retry [_required_] boolean Whether or not the pipeline was a partial retry of a previous attempt. A partial retry is one which only runs a subset of the original jobs. pipeline_id string Any ID used in the provider to identify the pipeline run even if it is not unique across retries. If the `pipeline_id` is unique, then both `unique_id` and `pipeline_id` can be set to the same value. previous_attempt object If the pipeline is a retry, this should contain the details of the previous attempt. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the pipeline run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The in progress status of the pipeline. Allowed enum values: `running` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. unique_id [_required_] string UUID of the pipeline run. The ID has to be the same as the finished pipeline. url [_required_] string The URL to look at the pipeline in the CI provider UI. Option 2 object Details of a CI stage. dependencies [string] A list of stage IDs that this stage depends on. end [_required_] date-time Time when the stage run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string UUID for the stage. It has to be unique at least in the pipeline scope. 
level [_required_] enum Used to distinguish between pipelines, stages, jobs and steps. Allowed enum values: `stage` default: `stage` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the stage. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the stage run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the stage. Allowed enum values: `success,error,canceled,skipped` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Option 3 object Details of a CI job. dependencies [string] A list of job IDs that this job depends on. end [_required_] date-time Time when the job run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string The UUID for the job. It has to be unique within each pipeline execution. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `job` default: `job` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the job. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. 
queue_time int64 The queue time in milliseconds, if applicable. stage_id string The parent stage UUID (if applicable). stage_name string The parent stage name (if applicable). start [_required_] date-time Time when the job run instance started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the job. Allowed enum values: `success,error,canceled,skipped` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. url [_required_] string The URL to look at the job in the CI provider UI. Option 4 object Details of a CI step. end [_required_] date-time Time when the step run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string UUID for the step. It has to be unique within each pipeline execution. job_id string The parent job UUID (if applicable). job_name string The parent job name (if applicable). level [_required_] enum Used to distinguish between pipelines, stages, jobs and steps. Allowed enum values: `step` default: `step` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the step. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. stage_id string The parent stage UUID (if applicable). stage_name string The parent stage name (if applicable). start [_required_] date-time Time when the step run started. The time format must be RFC3339. status [_required_] enum The final status of the step. Allowed enum values: `success,error` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. url string The URL to look at the step in the CI provider UI. service string If the CI provider is SaaS, use this to differentiate between instances. type enum Type of the event. 
Allowed enum values: `cipipeline_resource_request` default: `cipipeline_resource_request` Option 2 [object] Array of pipeline events to create in batch. attributes object Attributes of the pipeline event to create. env string The Datadog environment. provider_name string The name of the CI provider. By default, this is "custom". resource [_required_] Details of the CI pipeline event. Option 1 Details of the top level pipeline, build, or workflow of your CI. Option 1 object Details of a finished pipeline. end [_required_] date-time Time when the pipeline run finished. It cannot be older than 18 hours in the past from the current time. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). is_manual boolean Whether or not the pipeline was triggered manually by the user. is_resumed boolean Whether or not the pipeline was resumed after being blocked. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `pipeline` default: `pipeline` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string Name of the pipeline. All pipeline runs for the builds should have the same name. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string parent_pipeline object If the pipeline is triggered as child of another pipeline, this should contain the details of the parent pipeline. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. partial_retry [_required_] boolean Whether or not the pipeline was a partial retry of a previous attempt. A partial retry is one which only runs a subset of the original jobs. pipeline_id string Any ID used in the provider to identify the pipeline run even if it is not unique across retries. If the `pipeline_id` is unique, then both `unique_id` and `pipeline_id` can be set to the same value. previous_attempt object If the pipeline is a retry, this should contain the details of the previous attempt. 
id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the pipeline run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the pipeline. Allowed enum values: `success,error,canceled,skipped,blocked` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. unique_id [_required_] string UUID of the pipeline run. The ID has to be unique across retries and pipelines, including partial retries. url [_required_] string The URL to look at the pipeline in the CI provider UI. Option 2 object Details of a running pipeline. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). is_manual boolean Whether or not the pipeline was triggered manually by the user. is_resumed boolean Whether or not the pipeline was resumed after being blocked. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `pipeline` default: `pipeline` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string Name of the pipeline. All pipeline runs for the builds should have the same name. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string parent_pipeline object If the pipeline is triggered as child of another pipeline, this should contain the details of the parent pipeline. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. partial_retry [_required_] boolean Whether or not the pipeline was a partial retry of a previous attempt. A partial retry is one which only runs a subset of the original jobs. pipeline_id string Any ID used in the provider to identify the pipeline run even if it is not unique across retries. If the `pipeline_id` is unique, then both `unique_id` and `pipeline_id` can be set to the same value. 
previous_attempt object If the pipeline is a retry, this should contain the details of the previous attempt. id [_required_] string UUID of a pipeline. url string The URL to look at the pipeline in the CI provider UI. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the pipeline run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The in progress status of the pipeline. Allowed enum values: `running` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. unique_id [_required_] string UUID of the pipeline run. The ID has to be the same as the finished pipeline. url [_required_] string The URL to look at the pipeline in the CI provider UI. Option 2 object Details of a CI stage. dependencies [string] A list of stage IDs that this stage depends on. end [_required_] date-time Time when the stage run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string UUID for the stage. It has to be unique at least in the pipeline scope. level [_required_] enum Used to distinguish between pipelines, stages, jobs and steps. Allowed enum values: `stage` default: `stage` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the stage. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. queue_time int64 The queue time in milliseconds, if applicable. start [_required_] date-time Time when the stage run started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the stage. Allowed enum values: `success,error,canceled,skipped` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Option 3 object Details of a CI job. 
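As a complement to the pipeline-level examples further down, here is a hedged sketch of a job-level payload (all IDs, names, timestamps, and URLs are illustrative; the job fields are documented below, and stage and step events use the same request envelope with their own required fields):

```
{
  "data": {
    "attributes": {
      "resource": {
        "level": "job",
        "id": "b3c1d9e0-0000-0000-0000-000000000000",
        "name": "deploy-us-east-1",
        "pipeline_unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a",
        "pipeline_name": "Deploy to AWS",
        "start": "2021-11-11T11:09:15+00:00",
        "end": "2021-11-11T11:10:20+00:00",
        "status": "success",
        "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1/jobs/1",
        "metrics": ["artifact_size_mb:52"],
        "tags": ["team:backend"],
        "git": {
          "repository_url": "https://github.com/DataDog/datadog-agent",
          "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2",
          "author_email": "john.doe@email.com"
        }
      }
    },
    "type": "cipipeline_resource_request"
  }
}
```

Note that both `metrics` and `tags` follow the `key:value` pattern described in the field list, and `metrics` values must be numeric.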
dependencies [string] A list of job IDs that this job depends on. end [_required_] date-time Time when the job run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string The UUID for the job. It has to be unique within each pipeline execution. level [_required_] enum Used to distinguish between pipelines, stages, jobs, and steps. Allowed enum values: `job` default: `job` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the job. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. queue_time int64 The queue time in milliseconds, if applicable. stage_id string The parent stage UUID (if applicable). stage_name string The parent stage name (if applicable). start [_required_] date-time Time when the job run instance started (it should not include any queue time). The time format must be RFC3339. status [_required_] enum The final status of the job. Allowed enum values: `success,error,canceled,skipped` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. url [_required_] string The URL to look at the job in the CI provider UI. Option 4 object Details of a CI step. end [_required_] date-time Time when the step run finished. The time format must be RFC3339. error object Contains information of the CI error. domain enum Error category used to differentiate between issues related to the developer or provider environments. Allowed enum values: `provider,user,unknown` message string Error message. stack string The stack trace of the reported errors. type string Short description of the error type. git object If pipelines are triggered due to actions to a Git repository, then all payloads must contain this. Note that either `tag` or `branch` has to be provided, but not both. author_email [_required_] string The commit author email. 
author_name string The commit author name. author_time string The commit author timestamp in RFC3339 format. branch string The branch name (if a tag use the tag parameter). commit_time string The commit timestamp in RFC3339 format. committer_email string The committer email. committer_name string The committer name. default_branch string The Git repository's default branch. message string The commit message. repository_url [_required_] string The URL of the repository. sha [_required_] string The git commit SHA. tag string The tag name (if a branch use the branch parameter). id [_required_] string UUID for the step. It has to be unique within each pipeline execution. job_id string The parent job UUID (if applicable). job_name string The parent job name (if applicable). level [_required_] enum Used to distinguish between pipelines, stages, jobs and steps. Allowed enum values: `step` default: `step` metrics [string] A list of user-defined metrics. The metrics must follow the `key:value` pattern and the value must be numeric. name [_required_] string The name for the step. node object Contains information of the host running the pipeline, stage, job, or step. hostname string FQDN of the host. labels [string] A list of labels used to select or identify the node. name string Name for the host. workspace string The path where the code is checked out. parameters object A map of key-value parameters or environment variables that were defined for the pipeline. string pipeline_name [_required_] string The parent pipeline name. pipeline_unique_id [_required_] string The parent pipeline UUID. stage_id string The parent stage UUID (if applicable). stage_name string The parent stage name (if applicable). start [_required_] date-time Time when the step run started. The time format must be RFC3339. status [_required_] enum The final status of the step. Allowed enum values: `success,error` tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. url string The URL to look at the step in the CI provider UI. service string If the CI provider is SaaS, use this to differentiate between instances. type enum Type of the event. 
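Because the top-level `data` field also accepts an array (the batch option described above), several events can be reported in a single request. Below is a hedged, illustrative sketch that combines a finished pipeline and one of its jobs; each array element repeats the same `attributes`/`type` envelope, and the allowed `type` value is listed immediately after this example. Git information is omitted here for brevity, but should be included when the run was triggered from a Git repository, as noted in the field descriptions above.

```
{
  "data": [
    {
      "attributes": {
        "resource": {
          "level": "pipeline",
          "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a",
          "name": "Deploy to AWS",
          "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1",
          "start": "2021-11-11T11:09:11+00:00",
          "end": "2021-11-11T11:10:41+00:00",
          "status": "success",
          "partial_retry": false
        }
      },
      "type": "cipipeline_resource_request"
    },
    {
      "attributes": {
        "resource": {
          "level": "job",
          "id": "b3c1d9e0-0000-0000-0000-000000000000",
          "name": "deploy-us-east-1",
          "pipeline_unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a",
          "pipeline_name": "Deploy to AWS",
          "start": "2021-11-11T11:09:15+00:00",
          "end": "2021-11-11T11:10:20+00:00",
          "status": "success",
          "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1/jobs/1"
        }
      },
      "type": "cipipeline_resource_request"
    }
  ]
}
```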
Allowed enum values: `cipipeline_resource_request` default: `cipipeline_resource_request` ##### Send pipeline event returns "Request accepted for processing" response ``` { "data": { "attributes": { "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "end": "2021-11-11T11:10:41+00:00", "status": "success", "partial_retry": false, "git": { "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` { "data": { "attributes": { "provider_name": "example-provider", "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "end": "2021-11-11T11:10:41+00:00", "status": "success", "partial_retry": false, "git": { "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` { "data": { "attributes": { "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "status": "running", "partial_retry": false, "git": { "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-202-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-400-v2) * [401](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-401-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-403-v2) * [408](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-408-v2) * [413](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-413-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-429-v2) * [500](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-500-v2) * [503](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#CreateCIAppPipelineEvent-503-v2) Request accepted for processing * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Request Timeout * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Payload Too Large * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Service Unavailable * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=typescript) ##### Send pipeline event returns "Request accepted for processing" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipeline" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "end": "2021-11-11T11:10:41+00:00", "status": "success", "partial_retry": false, "git": { "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } EOF ``` ##### Send pipeline event with custom provider returns "Request accepted for processing" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipeline" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "provider_name": "example-provider", "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "end": "2021-11-11T11:10:41+00:00", "status": "success", "partial_retry": false, "git": { "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } EOF ``` ##### Send running pipeline event returns "Request accepted for processing" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipeline" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "resource": { "level": "pipeline", "unique_id": "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", "name": "Deploy to AWS", "url": "https://my-ci-provider.example/pipelines/my-pipeline/run/1", "start": "2021-11-11T11:09:11+00:00", "status": "running", "partial_retry": false, "git": 
{ "repository_url": "https://github.com/DataDog/datadog-agent", "sha": "7f263865994b76066c4612fd1965215e7dcb4cd2", "author_email": "john.doe@email.com" } } }, "type": "cipipeline_resource_request" } } EOF ``` ##### Send pipeline event returns "Request accepted for processing" response ``` // Send pipeline event returns "Request accepted for processing" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppCreatePipelineEventRequest{ Data: &datadogV2.CIAppCreatePipelineEventRequestDataSingleOrArray{ CIAppCreatePipelineEventRequestData: &datadogV2.CIAppCreatePipelineEventRequestData{ Attributes: &datadogV2.CIAppCreatePipelineEventRequestAttributes{ Resource: datadogV2.CIAppCreatePipelineEventRequestAttributesResource{ CIAppPipelineEventPipeline: &datadogV2.CIAppPipelineEventPipeline{ CIAppPipelineEventFinishedPipeline: &datadogV2.CIAppPipelineEventFinishedPipeline{ Level: datadogV2.CIAPPPIPELINEEVENTPIPELINELEVEL_PIPELINE, UniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", Name: "Deploy to AWS", Url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", Start: time.Now().Add(time.Second * -120), End: time.Now().Add(time.Second * -30), Status: datadogV2.CIAPPPIPELINEEVENTPIPELINESTATUS_SUCCESS, PartialRetry: false, Git: *datadogV2.NewNullableCIAppGitInfo(&datadogV2.CIAppGitInfo{ RepositoryUrl: "https://github.com/DataDog/datadog-agent", Sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", AuthorEmail: "john.doe@email.com", }), }}}, }, Type: datadogV2.CIAPPCREATEPIPELINEEVENTREQUESTDATATYPE_CIPIPELINE_RESOURCE_REQUEST.Ptr(), }}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, r, err := api.CreateCIAppPipelineEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`:\n%s\n", responseContent) } ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` // Send pipeline event with custom provider returns "Request accepted for processing" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppCreatePipelineEventRequest{ Data: &datadogV2.CIAppCreatePipelineEventRequestDataSingleOrArray{ CIAppCreatePipelineEventRequestData: &datadogV2.CIAppCreatePipelineEventRequestData{ Attributes: &datadogV2.CIAppCreatePipelineEventRequestAttributes{ ProviderName: datadog.PtrString("example-provider"), Resource: datadogV2.CIAppCreatePipelineEventRequestAttributesResource{ CIAppPipelineEventPipeline: &datadogV2.CIAppPipelineEventPipeline{ CIAppPipelineEventFinishedPipeline: &datadogV2.CIAppPipelineEventFinishedPipeline{ Level: datadogV2.CIAPPPIPELINEEVENTPIPELINELEVEL_PIPELINE, UniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", Name: "Deploy to AWS", Url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", Start: time.Now().Add(time.Second * -120), End: 
time.Now().Add(time.Second * -30), Status: datadogV2.CIAPPPIPELINEEVENTPIPELINESTATUS_SUCCESS, PartialRetry: false, Git: *datadogV2.NewNullableCIAppGitInfo(&datadogV2.CIAppGitInfo{ RepositoryUrl: "https://github.com/DataDog/datadog-agent", Sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", AuthorEmail: "john.doe@email.com", }), }}}, }, Type: datadogV2.CIAPPCREATEPIPELINEEVENTREQUESTDATATYPE_CIPIPELINE_RESOURCE_REQUEST.Ptr(), }}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, r, err := api.CreateCIAppPipelineEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`:\n%s\n", responseContent) } ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` // Send running pipeline event returns "Request accepted for processing" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppCreatePipelineEventRequest{ Data: &datadogV2.CIAppCreatePipelineEventRequestDataSingleOrArray{ CIAppCreatePipelineEventRequestData: &datadogV2.CIAppCreatePipelineEventRequestData{ Attributes: &datadogV2.CIAppCreatePipelineEventRequestAttributes{ Resource: datadogV2.CIAppCreatePipelineEventRequestAttributesResource{ CIAppPipelineEventPipeline: &datadogV2.CIAppPipelineEventPipeline{ CIAppPipelineEventInProgressPipeline: &datadogV2.CIAppPipelineEventInProgressPipeline{ Level: datadogV2.CIAPPPIPELINEEVENTPIPELINELEVEL_PIPELINE, UniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", Name: "Deploy to AWS", Url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", Start: time.Now().Add(time.Second * -120), Status: datadogV2.CIAPPPIPELINEEVENTPIPELINEINPROGRESSSTATUS_RUNNING, PartialRetry: false, Git: *datadogV2.NewNullableCIAppGitInfo(&datadogV2.CIAppGitInfo{ RepositoryUrl: "https://github.com/DataDog/datadog-agent", Sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", AuthorEmail: "john.doe@email.com", }), }}}, }, Type: datadogV2.CIAPPCREATEPIPELINEEVENTREQUESTDATATYPE_CIPIPELINE_RESOURCE_REQUEST.Ptr(), }}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, r, err := api.CreateCIAppPipelineEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.CreateCIAppPipelineEvent`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send 
pipeline event returns "Request accepted for processing" response ``` // Send pipeline event returns "Request accepted for processing" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequest; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributes; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributesResource; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestData; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataSingleOrArray; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataType; import com.datadog.api.client.v2.model.CIAppGitInfo; import com.datadog.api.client.v2.model.CIAppPipelineEventFinishedPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineLevel; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineStatus; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppCreatePipelineEventRequest body = new CIAppCreatePipelineEventRequest() .data( new CIAppCreatePipelineEventRequestDataSingleOrArray( new CIAppCreatePipelineEventRequestData() .attributes( new CIAppCreatePipelineEventRequestAttributes() .resource( new CIAppCreatePipelineEventRequestAttributesResource( new CIAppPipelineEventPipeline( new CIAppPipelineEventFinishedPipeline() .level(CIAppPipelineEventPipelineLevel.PIPELINE) .uniqueId("3eacb6f3-ff04-4e10-8a9c-46e6d054024a") .name("Deploy to AWS") .url( "https://my-ci-provider.example/pipelines/my-pipeline/run/1") .start(OffsetDateTime.now().plusSeconds(-120)) .end(OffsetDateTime.now().plusSeconds(-30)) .status(CIAppPipelineEventPipelineStatus.SUCCESS) .partialRetry(false) .git( new CIAppGitInfo() .repositoryUrl( "https://github.com/DataDog/datadog-agent") .sha( "7f263865994b76066c4612fd1965215e7dcb4cd2") .authorEmail("john.doe@email.com")))))) .type( CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST))); try { apiInstance.createCIAppPipelineEvent(body); } catch (ApiException e) { System.err.println( "Exception when calling CiVisibilityPipelinesApi#createCIAppPipelineEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` // Send pipeline event with custom provider returns "Request accepted for processing" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequest; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributes; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributesResource; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestData; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataSingleOrArray; import 
com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataType; import com.datadog.api.client.v2.model.CIAppGitInfo; import com.datadog.api.client.v2.model.CIAppPipelineEventFinishedPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineLevel; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineStatus; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppCreatePipelineEventRequest body = new CIAppCreatePipelineEventRequest() .data( new CIAppCreatePipelineEventRequestDataSingleOrArray( new CIAppCreatePipelineEventRequestData() .attributes( new CIAppCreatePipelineEventRequestAttributes() .providerName("example-provider") .resource( new CIAppCreatePipelineEventRequestAttributesResource( new CIAppPipelineEventPipeline( new CIAppPipelineEventFinishedPipeline() .level(CIAppPipelineEventPipelineLevel.PIPELINE) .uniqueId("3eacb6f3-ff04-4e10-8a9c-46e6d054024a") .name("Deploy to AWS") .url( "https://my-ci-provider.example/pipelines/my-pipeline/run/1") .start(OffsetDateTime.now().plusSeconds(-120)) .end(OffsetDateTime.now().plusSeconds(-30)) .status(CIAppPipelineEventPipelineStatus.SUCCESS) .partialRetry(false) .git( new CIAppGitInfo() .repositoryUrl( "https://github.com/DataDog/datadog-agent") .sha( "7f263865994b76066c4612fd1965215e7dcb4cd2") .authorEmail("john.doe@email.com")))))) .type( CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST))); try { apiInstance.createCIAppPipelineEvent(body); } catch (ApiException e) { System.err.println( "Exception when calling CiVisibilityPipelinesApi#createCIAppPipelineEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` // Send running pipeline event returns "Request accepted for processing" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequest; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributes; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestAttributesResource; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestData; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataSingleOrArray; import com.datadog.api.client.v2.model.CIAppCreatePipelineEventRequestDataType; import com.datadog.api.client.v2.model.CIAppGitInfo; import com.datadog.api.client.v2.model.CIAppPipelineEventInProgressPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipeline; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineInProgressStatus; import com.datadog.api.client.v2.model.CIAppPipelineEventPipelineLevel; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppCreatePipelineEventRequest body = new CIAppCreatePipelineEventRequest() .data( new 
CIAppCreatePipelineEventRequestDataSingleOrArray( new CIAppCreatePipelineEventRequestData() .attributes( new CIAppCreatePipelineEventRequestAttributes() .resource( new CIAppCreatePipelineEventRequestAttributesResource( new CIAppPipelineEventPipeline( new CIAppPipelineEventInProgressPipeline() .level(CIAppPipelineEventPipelineLevel.PIPELINE) .uniqueId("3eacb6f3-ff04-4e10-8a9c-46e6d054024a") .name("Deploy to AWS") .url( "https://my-ci-provider.example/pipelines/my-pipeline/run/1") .start(OffsetDateTime.now().plusSeconds(-120)) .status( CIAppPipelineEventPipelineInProgressStatus .RUNNING) .partialRetry(false) .git( new CIAppGitInfo() .repositoryUrl( "https://github.com/DataDog/datadog-agent") .sha( "7f263865994b76066c4612fd1965215e7dcb4cd2") .authorEmail("john.doe@email.com")))))) .type( CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST))); try { apiInstance.createCIAppPipelineEvent(body); } catch (ApiException e) { System.err.println( "Exception when calling CiVisibilityPipelinesApi#createCIAppPipelineEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send pipeline event returns "Request accepted for processing" response ``` """ Send pipeline event returns "Request accepted for processing" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_create_pipeline_event_request import CIAppCreatePipelineEventRequest from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_attributes import ( CIAppCreatePipelineEventRequestAttributes, ) from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data import CIAppCreatePipelineEventRequestData from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data_type import ( CIAppCreatePipelineEventRequestDataType, ) from datadog_api_client.v2.model.ci_app_git_info import CIAppGitInfo from datadog_api_client.v2.model.ci_app_pipeline_event_finished_pipeline import CIAppPipelineEventFinishedPipeline from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_level import CIAppPipelineEventPipelineLevel from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_status import CIAppPipelineEventPipelineStatus body = CIAppCreatePipelineEventRequest( data=CIAppCreatePipelineEventRequestData( attributes=CIAppCreatePipelineEventRequestAttributes( resource=CIAppPipelineEventFinishedPipeline( level=CIAppPipelineEventPipelineLevel.PIPELINE, unique_id="3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name="Deploy to AWS", url="https://my-ci-provider.example/pipelines/my-pipeline/run/1", start=(datetime.now() + relativedelta(seconds=-120)), end=(datetime.now() + relativedelta(seconds=-30)), status=CIAppPipelineEventPipelineStatus.SUCCESS, partial_retry=False, git=CIAppGitInfo( repository_url="https://github.com/DataDog/datadog-agent", sha="7f263865994b76066c4612fd1965215e7dcb4cd2", 
author_email="john.doe@email.com", ), ), ), type=CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.create_ci_app_pipeline_event(body=body) print(response) ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` """ Send pipeline event with custom provider returns "Request accepted for processing" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_create_pipeline_event_request import CIAppCreatePipelineEventRequest from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_attributes import ( CIAppCreatePipelineEventRequestAttributes, ) from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data import CIAppCreatePipelineEventRequestData from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data_type import ( CIAppCreatePipelineEventRequestDataType, ) from datadog_api_client.v2.model.ci_app_git_info import CIAppGitInfo from datadog_api_client.v2.model.ci_app_pipeline_event_finished_pipeline import CIAppPipelineEventFinishedPipeline from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_level import CIAppPipelineEventPipelineLevel from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_status import CIAppPipelineEventPipelineStatus body = CIAppCreatePipelineEventRequest( data=CIAppCreatePipelineEventRequestData( attributes=CIAppCreatePipelineEventRequestAttributes( provider_name="example-provider", resource=CIAppPipelineEventFinishedPipeline( level=CIAppPipelineEventPipelineLevel.PIPELINE, unique_id="3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name="Deploy to AWS", url="https://my-ci-provider.example/pipelines/my-pipeline/run/1", start=(datetime.now() + relativedelta(seconds=-120)), end=(datetime.now() + relativedelta(seconds=-30)), status=CIAppPipelineEventPipelineStatus.SUCCESS, partial_retry=False, git=CIAppGitInfo( repository_url="https://github.com/DataDog/datadog-agent", sha="7f263865994b76066c4612fd1965215e7dcb4cd2", author_email="john.doe@email.com", ), ), ), type=CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.create_ci_app_pipeline_event(body=body) print(response) ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` """ Send running pipeline event returns "Request accepted for processing" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_create_pipeline_event_request import CIAppCreatePipelineEventRequest from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_attributes import ( CIAppCreatePipelineEventRequestAttributes, ) from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data import CIAppCreatePipelineEventRequestData from datadog_api_client.v2.model.ci_app_create_pipeline_event_request_data_type 
import ( CIAppCreatePipelineEventRequestDataType, ) from datadog_api_client.v2.model.ci_app_git_info import CIAppGitInfo from datadog_api_client.v2.model.ci_app_pipeline_event_in_progress_pipeline import CIAppPipelineEventInProgressPipeline from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_in_progress_status import ( CIAppPipelineEventPipelineInProgressStatus, ) from datadog_api_client.v2.model.ci_app_pipeline_event_pipeline_level import CIAppPipelineEventPipelineLevel body = CIAppCreatePipelineEventRequest( data=CIAppCreatePipelineEventRequestData( attributes=CIAppCreatePipelineEventRequestAttributes( resource=CIAppPipelineEventInProgressPipeline( level=CIAppPipelineEventPipelineLevel.PIPELINE, unique_id="3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name="Deploy to AWS", url="https://my-ci-provider.example/pipelines/my-pipeline/run/1", start=(datetime.now() + relativedelta(seconds=-120)), status=CIAppPipelineEventPipelineInProgressStatus.RUNNING, partial_retry=False, git=CIAppGitInfo( repository_url="https://github.com/DataDog/datadog-agent", sha="7f263865994b76066c4612fd1965215e7dcb4cd2", author_email="john.doe@email.com", ), ), ), type=CIAppCreatePipelineEventRequestDataType.CIPIPELINE_RESOURCE_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.create_ci_app_pipeline_event(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send pipeline event returns "Request accepted for processing" response ``` # Send pipeline event returns "Request accepted for processing" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppCreatePipelineEventRequest.new({ data: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestData.new({ attributes: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestAttributes.new({ resource: DatadogAPIClient::V2::CIAppPipelineEventFinishedPipeline.new({ level: DatadogAPIClient::V2::CIAppPipelineEventPipelineLevel::PIPELINE, unique_id: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: (Time.now + -120), _end: (Time.now + -30), status: DatadogAPIClient::V2::CIAppPipelineEventPipelineStatus::SUCCESS, partial_retry: false, git: DatadogAPIClient::V2::CIAppGitInfo.new({ repository_url: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", author_email: "john.doe@email.com", }), }), }), type: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST, }), }) p api_instance.create_ci_app_pipeline_event(body) ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` # Send pipeline event with custom provider returns "Request accepted for processing" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppCreatePipelineEventRequest.new({ data: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestData.new({ attributes: 
DatadogAPIClient::V2::CIAppCreatePipelineEventRequestAttributes.new({ provider_name: "example-provider", resource: DatadogAPIClient::V2::CIAppPipelineEventFinishedPipeline.new({ level: DatadogAPIClient::V2::CIAppPipelineEventPipelineLevel::PIPELINE, unique_id: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: (Time.now + -120), _end: (Time.now + -30), status: DatadogAPIClient::V2::CIAppPipelineEventPipelineStatus::SUCCESS, partial_retry: false, git: DatadogAPIClient::V2::CIAppGitInfo.new({ repository_url: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", author_email: "john.doe@email.com", }), }), }), type: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST, }), }) p api_instance.create_ci_app_pipeline_event(body) ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` # Send running pipeline event returns "Request accepted for processing" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppCreatePipelineEventRequest.new({ data: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestData.new({ attributes: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestAttributes.new({ resource: DatadogAPIClient::V2::CIAppPipelineEventInProgressPipeline.new({ level: DatadogAPIClient::V2::CIAppPipelineEventPipelineLevel::PIPELINE, unique_id: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: (Time.now + -120), status: DatadogAPIClient::V2::CIAppPipelineEventPipelineInProgressStatus::RUNNING, partial_retry: false, git: DatadogAPIClient::V2::CIAppGitInfo.new({ repository_url: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", author_email: "john.doe@email.com", }), }), }), type: DatadogAPIClient::V2::CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST, }), }) p api_instance.create_ci_app_pipeline_event(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send pipeline event returns "Request accepted for processing" response ``` // Send pipeline event returns "Request accepted for processing" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequest; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributes; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributesResource; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestData; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataSingleOrArray; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataType; use datadog_api_client::datadogV2::model::CIAppGitInfo; use datadog_api_client::datadogV2::model::CIAppPipelineEventFinishedPipeline; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipeline; use 
datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineLevel; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineStatus; #[tokio::main] async fn main() { let body = CIAppCreatePipelineEventRequest ::new().data( CIAppCreatePipelineEventRequestDataSingleOrArray::CIAppCreatePipelineEventRequestData( Box::new( CIAppCreatePipelineEventRequestData::new() .attributes( CIAppCreatePipelineEventRequestAttributes::new( CIAppCreatePipelineEventRequestAttributesResource::CIAppPipelineEventPipeline( Box::new( CIAppPipelineEventPipeline::CIAppPipelineEventFinishedPipeline( Box::new( CIAppPipelineEventFinishedPipeline::new( DateTime::parse_from_rfc3339("2021-11-11T11:10:41+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), CIAppPipelineEventPipelineLevel::PIPELINE, "Deploy to AWS".to_string(), false, DateTime::parse_from_rfc3339("2021-11-11T11:09:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), CIAppPipelineEventPipelineStatus::SUCCESS, "3eacb6f3-ff04-4e10-8a9c-46e6d054024a".to_string(), "https://my-ci-provider.example/pipelines/my-pipeline/run/1".to_string(), ).git( Some( CIAppGitInfo::new( "john.doe@email.com".to_string(), "https://github.com/DataDog/datadog-agent".to_string(), "7f263865994b76066c4612fd1965215e7dcb4cd2".to_string(), ), ), ), ), ), ), ), ), ) .type_(CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST), ), ), ); let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let resp = api.create_ci_app_pipeline_event(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` // Send pipeline event with custom provider returns "Request accepted for // processing" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequest; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributes; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributesResource; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestData; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataSingleOrArray; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataType; use datadog_api_client::datadogV2::model::CIAppGitInfo; use datadog_api_client::datadogV2::model::CIAppPipelineEventFinishedPipeline; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipeline; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineLevel; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineStatus; #[tokio::main] async fn main() { let body = CIAppCreatePipelineEventRequest ::new().data( CIAppCreatePipelineEventRequestDataSingleOrArray::CIAppCreatePipelineEventRequestData( Box::new( CIAppCreatePipelineEventRequestData::new() .attributes( CIAppCreatePipelineEventRequestAttributes::new( CIAppCreatePipelineEventRequestAttributesResource::CIAppPipelineEventPipeline( Box::new( CIAppPipelineEventPipeline::CIAppPipelineEventFinishedPipeline( Box::new( CIAppPipelineEventFinishedPipeline::new( DateTime::parse_from_rfc3339("2021-11-11T11:10:41+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), 
CIAppPipelineEventPipelineLevel::PIPELINE, "Deploy to AWS".to_string(), false, DateTime::parse_from_rfc3339("2021-11-11T11:09:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), CIAppPipelineEventPipelineStatus::SUCCESS, "3eacb6f3-ff04-4e10-8a9c-46e6d054024a".to_string(), "https://my-ci-provider.example/pipelines/my-pipeline/run/1".to_string(), ).git( Some( CIAppGitInfo::new( "john.doe@email.com".to_string(), "https://github.com/DataDog/datadog-agent".to_string(), "7f263865994b76066c4612fd1965215e7dcb4cd2".to_string(), ), ), ), ), ), ), ), ).provider_name("example-provider".to_string()), ) .type_(CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST), ), ), ); let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let resp = api.create_ci_app_pipeline_event(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` // Send running pipeline event returns "Request accepted for processing" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequest; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributes; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestAttributesResource; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestData; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataSingleOrArray; use datadog_api_client::datadogV2::model::CIAppCreatePipelineEventRequestDataType; use datadog_api_client::datadogV2::model::CIAppGitInfo; use datadog_api_client::datadogV2::model::CIAppPipelineEventInProgressPipeline; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipeline; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineInProgressStatus; use datadog_api_client::datadogV2::model::CIAppPipelineEventPipelineLevel; #[tokio::main] async fn main() { let body = CIAppCreatePipelineEventRequest ::new().data( CIAppCreatePipelineEventRequestDataSingleOrArray::CIAppCreatePipelineEventRequestData( Box::new( CIAppCreatePipelineEventRequestData::new() .attributes( CIAppCreatePipelineEventRequestAttributes::new( CIAppCreatePipelineEventRequestAttributesResource::CIAppPipelineEventPipeline( Box::new( CIAppPipelineEventPipeline::CIAppPipelineEventInProgressPipeline( Box::new( CIAppPipelineEventInProgressPipeline::new( CIAppPipelineEventPipelineLevel::PIPELINE, "Deploy to AWS".to_string(), false, DateTime::parse_from_rfc3339("2021-11-11T11:09:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), CIAppPipelineEventPipelineInProgressStatus::RUNNING, "3eacb6f3-ff04-4e10-8a9c-46e6d054024a".to_string(), "https://my-ci-provider.example/pipelines/my-pipeline/run/1".to_string(), ).git( Some( CIAppGitInfo::new( "john.doe@email.com".to_string(), "https://github.com/DataDog/datadog-agent".to_string(), "7f263865994b76066c4612fd1965215e7dcb4cd2".to_string(), ), ), ), ), ), ), ), ), ) .type_(CIAppCreatePipelineEventRequestDataType::CIPIPELINE_RESOURCE_REQUEST), ), ), ); let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let resp = api.create_ci_app_pipeline_event(body).await; if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send pipeline event returns "Request accepted for processing" response ``` /** * Send pipeline event returns "Request accepted for processing" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiCreateCIAppPipelineEventRequest = { body: { data: { attributes: { resource: { level: "pipeline", uniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: new Date(new Date().getTime() + -120 * 1000), end: new Date(new Date().getTime() + -30 * 1000), status: "success", partialRetry: false, git: { repositoryUrl: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", authorEmail: "john.doe@email.com", }, }, }, type: "cipipeline_resource_request", }, }, }; apiInstance .createCIAppPipelineEvent(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send pipeline event with custom provider returns "Request accepted for processing" response ``` /** * Send pipeline event with custom provider returns "Request accepted for processing" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiCreateCIAppPipelineEventRequest = { body: { data: { attributes: { providerName: "example-provider", resource: { level: "pipeline", uniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: new Date(new Date().getTime() + -120 * 1000), end: new Date(new Date().getTime() + -30 * 1000), status: "success", partialRetry: false, git: { repositoryUrl: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", authorEmail: "john.doe@email.com", }, }, }, type: "cipipeline_resource_request", }, }, }; apiInstance .createCIAppPipelineEvent(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send running pipeline event returns "Request accepted for processing" response ``` /** * Send running pipeline event returns "Request accepted for processing" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiCreateCIAppPipelineEventRequest = { body: { data: { attributes: { resource: { level: "pipeline", uniqueId: "3eacb6f3-ff04-4e10-8a9c-46e6d054024a", name: "Deploy to AWS", url: "https://my-ci-provider.example/pipelines/my-pipeline/run/1", start: new Date(new Date().getTime() + -120 * 1000), status: "running", partialRetry: false, git: { repositoryUrl: "https://github.com/DataDog/datadog-agent", sha: "7f263865994b76066c4612fd1965215e7dcb4cd2", authorEmail: "john.doe@email.com", }, }, }, type: "cipipeline_resource_request", }, }, }; apiInstance .createCIAppPipelineEvent(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Get a list of pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#get-a-list-of-pipelines-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#get-a-list-of-pipelines-events-v2) GET https://api.ap1.datadoghq.com/api/v2/ci/pipelines/eventshttps://api.ap2.datadoghq.com/api/v2/ci/pipelines/eventshttps://api.datadoghq.eu/api/v2/ci/pipelines/eventshttps://api.ddog-gov.com/api/v2/ci/pipelines/eventshttps://api.datadoghq.com/api/v2/ci/pipelines/eventshttps://api.us3.datadoghq.com/api/v2/ci/pipelines/eventshttps://api.us5.datadoghq.com/api/v2/ci/pipelines/events ### Overview List endpoint returns CI Visibility pipeline events that match a [search query](https://docs.datadoghq.com/continuous_integration/explorer/search_syntax/). [Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to see your latest pipeline events. This endpoint requires the `ci_visibility_read` permission. OAuth apps require the `ci_visibility_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-pipelines) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following log syntax. filter[from] string Minimum timestamp for requested events. filter[to] string Maximum timestamp for requested events. sort enum Order of events in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of events in the response. 
### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#ListCIAppPipelineEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#ListCIAppPipelineEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#ListCIAppPipelineEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#ListCIAppPipelineEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Response object with all pipeline events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from CI Visibility pipeline events. ci_level enum Pipeline execution level. Allowed enum values: `pipeline,stage,job,step,custom` tags [string] Array of tags associated with your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `cipipeline` links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "ci_level": "pipeline", "tags": [ "team:A" ] }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "cipipeline" } ], "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=typescript) ##### Get a list of pipelines events Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipelines/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of pipelines events ``` """ Get a list of pipelines events returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.list_ci_app_pipeline_events( filter_query="@ci.provider.name:circleci", filter_from=(datetime.now() + relativedelta(minutes=-30)), filter_to=datetime.now(), page_limit=5, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of pipelines events ``` # Get a list of pipelines events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new opts = { filter_query: "@ci.provider.name:circleci", filter_from: (Time.now + -30 * 60), filter_to: Time.now, page_limit: 5, } p api_instance.list_ci_app_pipeline_events(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of pipelines events ``` // Get a list of pipelines events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, r, err := api.ListCIAppPipelineEvents(ctx, *datadogV2.NewListCIAppPipelineEventsOptionalParameters().WithFilterQuery("@ci.provider.name:circleci").WithFilterFrom(time.Now().Add(time.Minute * -30)).WithFilterTo(time.Now()).WithPageLimit(5)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.ListCIAppPipelineEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.ListCIAppPipelineEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of pipelines events ``` // Get a list of pipelines events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi.ListCIAppPipelineEventsOptionalParameters; import com.datadog.api.client.v2.model.CIAppPipelineEventsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); try { CIAppPipelineEventsResponse result = apiInstance.listCIAppPipelineEvents( new ListCIAppPipelineEventsOptionalParameters() .filterQuery("@ci.provider.name:circleci") .filterFrom(OffsetDateTime.now().plusMinutes(-30)) .filterTo(OffsetDateTime.now()) .pageLimit(5)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CiVisibilityPipelinesApi#listCIAppPipelineEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of pipelines events ``` // Get a list of pipelines events returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::ListCIAppPipelineEventsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let resp = api .list_ci_app_pipeline_events( ListCIAppPipelineEventsOptionalParams::default() .filter_query("@ci.provider.name:circleci".to_string()) .filter_from( DateTime::parse_from_rfc3339("2021-11-11T10:41:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) 
.filter_to( DateTime::parse_from_rfc3339("2021-11-11T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .page_limit(5), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of pipelines events ``` /** * Get a list of pipelines events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiListCIAppPipelineEventsRequest = { filterQuery: "@ci.provider.name:circleci", filterFrom: new Date(new Date().getTime() + -30 * 60 * 1000), filterTo: new Date(), pageLimit: 5, }; apiInstance .listCIAppPipelineEvents(params) .then((data: v2.CIAppPipelineEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#search-pipelines-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#search-pipelines-events-v2) POST https://api.ap1.datadoghq.com/api/v2/ci/pipelines/events/searchhttps://api.ap2.datadoghq.com/api/v2/ci/pipelines/events/searchhttps://api.datadoghq.eu/api/v2/ci/pipelines/events/searchhttps://api.ddog-gov.com/api/v2/ci/pipelines/events/searchhttps://api.datadoghq.com/api/v2/ci/pipelines/events/searchhttps://api.us3.datadoghq.com/api/v2/ci/pipelines/events/searchhttps://api.us5.datadoghq.com/api/v2/ci/pipelines/events/search ### Overview List endpoint returns CI Visibility pipeline events that match a [search query](https://docs.datadoghq.com/continuous_integration/explorer/search_syntax/). [Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to build complex events filtering and search. This endpoint requires the `ci_visibility_read` permission. OAuth apps require the `ci_visibility_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-pipelines) to access this endpoint. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Field Type Description filter object The search and filter query settings. from string The minimum time for the requested events; supports date, math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the CI Visibility Explorer search syntax. 
default: `*` to string The maximum time for the requested events, supports date, math, and regular timestamps (in milliseconds). default: `now` options object Global query options that are used during the query. Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing events. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of events in the response. default: `10` sort enum Sort parameters when querying events. Allowed enum values: `timestamp,-timestamp` ##### Search pipelines events returns "OK" response ``` { "filter": { "from": "now-15m", "query": "@ci.provider.name:github AND @ci.status:error", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 5 }, "sort": "timestamp" } ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` { "filter": { "from": "now-30s", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#SearchCIAppPipelineEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#SearchCIAppPipelineEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#SearchCIAppPipelineEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#SearchCIAppPipelineEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Response object with all pipeline events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from CI Visibility pipeline events. ci_level enum Pipeline execution level. Allowed enum values: `pipeline,stage,job,step,custom` tags [string] Array of tags associated with your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `cipipeline` links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. 
``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "ci_level": "pipeline", "tags": [ "team:A" ] }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "cipipeline" } ], "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=typescript) ##### Search pipelines events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipelines/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "query": "@ci.provider.name:github AND @ci.status:error", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 5 }, "sort": "timestamp" } EOF ``` ##### Search pipelines events returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/pipelines/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H 
"DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-30s", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### Search pipelines events returns "OK" response ``` // Search pipelines events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppPipelineEventsRequest{ Filter: &datadogV2.CIAppPipelinesQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@ci.provider.name:github AND @ci.status:error"), To: datadog.PtrString("now"), }, Options: &datadogV2.CIAppQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.CIAppQueryPageOptions{ Limit: datadog.PtrInt32(5), }, Sort: datadogV2.CIAPPSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, r, err := api.SearchCIAppPipelineEvents(ctx, *datadogV2.NewSearchCIAppPipelineEventsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.SearchCIAppPipelineEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.SearchCIAppPipelineEvents`:\n%s\n", responseContent) } ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` // Search pipelines events returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppPipelineEventsRequest{ Filter: &datadogV2.CIAppPipelinesQueryFilter{ From: datadog.PtrString("now-30s"), To: datadog.PtrString("now"), }, Options: &datadogV2.CIAppQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.CIAppQueryPageOptions{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.CIAPPSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityPipelinesApi(apiClient) resp, _ := api.SearchCIAppPipelineEventsWithPagination(ctx, *datadogV2.NewSearchCIAppPipelineEventsOptionalParameters().WithBody(body)) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.SearchCIAppPipelineEvents`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search pipelines events returns "OK" response ``` // Search pipelines events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi.SearchCIAppPipelineEventsOptionalParameters; import com.datadog.api.client.v2.model.CIAppPipelineEventsRequest; import com.datadog.api.client.v2.model.CIAppPipelineEventsResponse; import com.datadog.api.client.v2.model.CIAppPipelinesQueryFilter; import com.datadog.api.client.v2.model.CIAppQueryOptions; import com.datadog.api.client.v2.model.CIAppQueryPageOptions; import com.datadog.api.client.v2.model.CIAppSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppPipelineEventsRequest body = new CIAppPipelineEventsRequest() .filter( new CIAppPipelinesQueryFilter() .from("now-15m") .query("@ci.provider.name:github AND @ci.status:error") .to("now")) .options(new CIAppQueryOptions().timezone("GMT")) .page(new CIAppQueryPageOptions().limit(5)) .sort(CIAppSort.TIMESTAMP_ASCENDING); try { CIAppPipelineEventsResponse result = apiInstance.searchCIAppPipelineEvents( new SearchCIAppPipelineEventsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CiVisibilityPipelinesApi#searchCIAppPipelineEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` // Search pipelines events returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi.SearchCIAppPipelineEventsOptionalParameters; import com.datadog.api.client.v2.model.CIAppPipelineEvent; import com.datadog.api.client.v2.model.CIAppPipelineEventsRequest; import com.datadog.api.client.v2.model.CIAppPipelinesQueryFilter; import com.datadog.api.client.v2.model.CIAppQueryOptions; import com.datadog.api.client.v2.model.CIAppQueryPageOptions; import com.datadog.api.client.v2.model.CIAppSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppPipelineEventsRequest body = new CIAppPipelineEventsRequest() .filter(new CIAppPipelinesQueryFilter().from("now-30s").to("now")) .options(new CIAppQueryOptions().timezone("GMT")) .page(new CIAppQueryPageOptions().limit(2)) .sort(CIAppSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchCIAppPipelineEventsWithPagination( new SearchCIAppPipelineEventsOptionalParameters().body(body)); for (CIAppPipelineEvent item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println( "Exception when calling" + " CiVisibilityPipelinesApi#searchCIAppPipelineEventsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search pipelines events returns "OK" response ``` """ Search pipelines events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_pipeline_events_request import CIAppPipelineEventsRequest from datadog_api_client.v2.model.ci_app_pipelines_query_filter import CIAppPipelinesQueryFilter from datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions from datadog_api_client.v2.model.ci_app_query_page_options import CIAppQueryPageOptions from datadog_api_client.v2.model.ci_app_sort import CIAppSort body = CIAppPipelineEventsRequest( filter=CIAppPipelinesQueryFilter( _from="now-15m", query="@ci.provider.name:github AND @ci.status:error", to="now", ), options=CIAppQueryOptions( timezone="GMT", ), page=CIAppQueryPageOptions( limit=5, ), sort=CIAppSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.search_ci_app_pipeline_events(body=body) print(response) ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` """ Search pipelines events returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_pipeline_events_request import CIAppPipelineEventsRequest from datadog_api_client.v2.model.ci_app_pipelines_query_filter import CIAppPipelinesQueryFilter from datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions from datadog_api_client.v2.model.ci_app_query_page_options import CIAppQueryPageOptions from datadog_api_client.v2.model.ci_app_sort import CIAppSort body = CIAppPipelineEventsRequest( filter=CIAppPipelinesQueryFilter( _from="now-30s", to="now", ), options=CIAppQueryOptions( timezone="GMT", ), page=CIAppQueryPageOptions( limit=2, ), sort=CIAppSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) items = api_instance.search_ci_app_pipeline_events_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search pipelines events returns "OK" response ``` # Search pipelines events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppPipelineEventsRequest.new({ filter: DatadogAPIClient::V2::CIAppPipelinesQueryFilter.new({ from: "now-15m", query: "@ci.provider.name:github AND @ci.status:error", to: "now", }), options: DatadogAPIClient::V2::CIAppQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::CIAppQueryPageOptions.new({ limit: 5, }), sort: DatadogAPIClient::V2::CIAppSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } p 
api_instance.search_ci_app_pipeline_events(opts) ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` # Search pipelines events returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppPipelineEventsRequest.new({ filter: DatadogAPIClient::V2::CIAppPipelinesQueryFilter.new({ from: "now-30s", to: "now", }), options: DatadogAPIClient::V2::CIAppQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::CIAppQueryPageOptions.new({ limit: 2, }), sort: DatadogAPIClient::V2::CIAppSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } api_instance.search_ci_app_pipeline_events_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search pipelines events returns "OK" response ``` // Search pipelines events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::SearchCIAppPipelineEventsOptionalParams; use datadog_api_client::datadogV2::model::CIAppPipelineEventsRequest; use datadog_api_client::datadogV2::model::CIAppPipelinesQueryFilter; use datadog_api_client::datadogV2::model::CIAppQueryOptions; use datadog_api_client::datadogV2::model::CIAppQueryPageOptions; use datadog_api_client::datadogV2::model::CIAppSort; #[tokio::main] async fn main() { let body = CIAppPipelineEventsRequest::new() .filter( CIAppPipelinesQueryFilter::new() .from("now-15m".to_string()) .query("@ci.provider.name:github AND @ci.status:error".to_string()) .to("now".to_string()), ) .options(CIAppQueryOptions::new().timezone("GMT".to_string())) .page(CIAppQueryPageOptions::new().limit(5)) .sort(CIAppSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let resp = api .search_ci_app_pipeline_events( SearchCIAppPipelineEventsOptionalParams::default().body(body), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` // Search pipelines events returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::SearchCIAppPipelineEventsOptionalParams; use datadog_api_client::datadogV2::model::CIAppPipelineEventsRequest; use datadog_api_client::datadogV2::model::CIAppPipelinesQueryFilter; use datadog_api_client::datadogV2::model::CIAppQueryOptions; use datadog_api_client::datadogV2::model::CIAppQueryPageOptions; use datadog_api_client::datadogV2::model::CIAppSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = CIAppPipelineEventsRequest::new() .filter( CIAppPipelinesQueryFilter::new() .from("now-30s".to_string()) .to("now".to_string()), ) .options(CIAppQueryOptions::new().timezone("GMT".to_string())) .page(CIAppQueryPageOptions::new().limit(2)) 
.sort(CIAppSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = CIVisibilityPipelinesAPI::with_config(configuration); let response = api.search_ci_app_pipeline_events_with_pagination( SearchCIAppPipelineEventsOptionalParams::default().body(body), ); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search pipelines events returns "OK" response ``` /** * Search pipelines events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiSearchCIAppPipelineEventsRequest = { body: { filter: { from: "now-15m", query: "@ci.provider.name:github AND @ci.status:error", to: "now", }, options: { timezone: "GMT", }, page: { limit: 5, }, sort: "timestamp", }, }; apiInstance .searchCIAppPipelineEvents(params) .then((data: v2.CIAppPipelineEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search pipelines events returns "OK" response with pagination ``` /** * Search pipelines events returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityPipelinesApi(configuration); const params: v2.CIVisibilityPipelinesApiSearchCIAppPipelineEventsRequest = { body: { filter: { from: "now-30s", to: "now", }, options: { timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchCIAppPipelineEventsWithPagination( params )) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Aggregate pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#aggregate-pipelines-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#aggregate-pipelines-events-v2) POST https://api.ap1.datadoghq.com/api/v2/ci/pipelines/analytics/aggregatehttps://api.ap2.datadoghq.com/api/v2/ci/pipelines/analytics/aggregatehttps://api.datadoghq.eu/api/v2/ci/pipelines/analytics/aggregatehttps://api.ddog-gov.com/api/v2/ci/pipelines/analytics/aggregatehttps://api.datadoghq.com/api/v2/ci/pipelines/analytics/aggregatehttps://api.us3.datadoghq.com/api/v2/ci/pipelines/analytics/aggregatehttps://api.us5.datadoghq.com/api/v2/ci/pipelines/analytics/aggregate ### Overview Use this API endpoint to aggregate CI Visibility pipeline events into buckets of computed metrics and timeseries. 
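For example, a timeseries-style aggregation of pipeline durations, bucketed over time and grouped by status, could look like the following Python sketch. It mirrors the aggregate examples later in this section; the `type: timeseries` and `interval` compute options are described under Body Data below. The `CIAppComputeType.TIMESERIES` enum member and the Python method name `aggregate_ci_app_pipeline_events` are assumptions inferred from the other examples on this page, so verify them against your client version.

```
# Timeseries aggregation sketch (not an official example): pc90 of @duration
# in 5-minute buckets, grouped by @ci.status.
# Assumed names: CIAppComputeType.TIMESERIES and aggregate_ci_app_pipeline_events.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi
from datadog_api_client.v2.model.ci_app_aggregation_function import CIAppAggregationFunction
from datadog_api_client.v2.model.ci_app_compute import CIAppCompute
from datadog_api_client.v2.model.ci_app_compute_type import CIAppComputeType
from datadog_api_client.v2.model.ci_app_pipelines_aggregate_request import CIAppPipelinesAggregateRequest
from datadog_api_client.v2.model.ci_app_pipelines_group_by import CIAppPipelinesGroupBy
from datadog_api_client.v2.model.ci_app_pipelines_query_filter import CIAppPipelinesQueryFilter
from datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions

body = CIAppPipelinesAggregateRequest(
    compute=[
        CIAppCompute(
            aggregation=CIAppAggregationFunction.PERCENTILE_90,
            metric="@duration",
            type=CIAppComputeType.TIMESERIES,  # assumed enum member for "timeseries"
            interval="5m",  # bucket size; only used for timeseries computes
        ),
    ],
    filter=CIAppPipelinesQueryFilter(
        _from="now-1h",
        query="@ci.provider.name:(gitlab OR github)",
        to="now",
    ),
    group_by=[
        CIAppPipelinesGroupBy(
            facet="@ci.status",
            limit=10,
        ),
    ],
    options=CIAppQueryOptions(
        timezone="GMT",
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CIVisibilityPipelinesApi(api_client)
    print(api_instance.aggregate_ci_app_pipeline_events(body=body))
```

Each returned bucket then carries the group-by key under `by` and a list of `{time, value}` points under `computes`, as described in the response model below.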
This endpoint requires the `ci_visibility_read` permission. OAuth apps require the `ci_visibility_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-pipelines) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) Field Type Description compute [object] The list of metrics or timeseries to compute for the retrieved buckets. aggregation [_required_] enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median,latest,earliest,most_frequent,delta` interval string The time buckets' size (only used for type=timeseries) Defaults to a resolution of 150 points. metric string The metric to use. type enum The type of compute. Allowed enum values: `timeseries,total` default: `total` filter object The search and filter query settings. from string The minimum time for the requested events; supports date, math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the CI Visibility Explorer search syntax. default: `*` to string The maximum time for the requested events, supports date, math, and regular timestamps (in milliseconds). default: `now` group_by [object] The rules for the group-by. facet [_required_] string The name of the facet to use (required). histogram object Used to perform a histogram computation (only for measure facets). At most, 100 buckets are allowed, the number of buckets is `(max - min)/interval`. interval [_required_] double The bin size of the histogram buckets. max [_required_] double The maximum value for the measure used in the histogram (values greater than this one are filtered out). min [_required_] double The minimum value for the measure used in the histogram (values smaller than this one are filtered out). limit int64 The maximum buckets to return for this group-by. default: `10` missing The value to use for logs that don't have the facet used to group-by. Option 1 string The missing value to use if there is a string valued facet. Option 2 double The missing value to use if there is a number valued facet. sort object A sort rule. The `aggregation` field is required when `type` is `measure`. aggregation enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median,latest,earliest,most_frequent,delta` metric string The metric to sort by (only used for `type=measure`). order enum The order to use, ascending or descending. Allowed enum values: `asc,desc` type enum The type of sorting algorithm. Allowed enum values: `alphabetical,measure` default: `alphabetical` total A resulting object to put the given computes in over all the matching records. Option 1 boolean If set to true, creates an additional bucket labeled "$facet_total". Option 2 string A string to use as the key value for the total bucket. Option 3 double A number to use as the key value for the total bucket. options object Global query options that are used during the query. Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). 
default: `UTC` ``` { "compute": [ { "aggregation": "pc90", "metric": "@duration", "type": "total" } ], "filter": { "from": "now-15m", "query": "@ci.provider.name:(gitlab OR github)", "to": "now" }, "group_by": [ { "facet": "@ci.status", "limit": 10, "total": false } ], "options": { "timezone": "GMT" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#AggregateCIAppPipelineEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#AggregateCIAppPipelineEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#AggregateCIAppPipelineEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#AggregateCIAppPipelineEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) The response object for the pipeline events aggregate API endpoint. Field Type Description data object The query results. buckets [object] The list of matching buckets, one item per bucket. by object The key-value pairs for each group-by. The values for each group-by. computes object A map of the metric name to value for regular compute, or a list of values for a timeseries. A bucket value, can either be a timeseries or a single value. Option 1 string A single string value. Option 2 double A single number value. Option 3 [object] A timeseries array. time date-time The time value for this point. value double The value for this point. links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": { "buckets": [ { "by": { "": "undefined" }, "computes": { "": { "description": "undefined", "type": "undefined" } } } ] }, "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/)
* [Example](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/)

API error response.

Field | Type | Description
---|---|---
errors [_required_] | [string] | A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/?code-lang=typescript)

##### Aggregate pipelines events returns "OK" response

```
# Curl command
# Replace api.datadoghq.com with the API host for your Datadog site, for example
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com,
# api.ap2.datadoghq.com, or api.ddog-gov.com.
curl -X POST "https://api.datadoghq.com/api/v2/ci/pipelines/analytics/aggregate" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "compute": [
    {
      "aggregation": "pc90",
      "metric": "@duration",
      "type": "total"
    }
  ],
  "filter": {
    "from": "now-15m",
    "query": "@ci.provider.name:(gitlab OR github)",
    "to": "now"
  },
  "group_by": [
    {
      "facet": "@ci.status",
      "limit": 10,
      "total": false
    }
  ],
  "options": {
    "timezone": "GMT"
  }
}
EOF
```

##### Aggregate pipelines events returns "OK" response

```
// Aggregate pipelines events returns "OK" response
package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    body := datadogV2.CIAppPipelinesAggregateRequest{
        Compute: []datadogV2.CIAppCompute{
            {
                Aggregation: datadogV2.CIAPPAGGREGATIONFUNCTION_PERCENTILE_90,
                Metric:      datadog.PtrString("@duration"),
                Type:        datadogV2.CIAPPCOMPUTETYPE_TOTAL.Ptr(),
            },
        },
        Filter: &datadogV2.CIAppPipelinesQueryFilter{
            From:  datadog.PtrString("now-15m"),
            Query: datadog.PtrString("@ci.provider.name:(gitlab OR github)"),
            To:    datadog.PtrString("now"),
        },
        GroupBy: []datadogV2.CIAppPipelinesGroupBy{
            {
                Facet: "@ci.status",
                Limit: datadog.PtrInt64(10),
                Total: &datadogV2.CIAppGroupByTotal{CIAppGroupByTotalBoolean: datadog.PtrBool(false)},
            },
        },
        Options: &datadogV2.CIAppQueryOptions{
            Timezone: datadog.PtrString("GMT"),
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewCIVisibilityPipelinesApi(apiClient)
    resp, r, err := api.AggregateCIAppPipelineEvents(ctx, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityPipelinesApi.AggregateCIAppPipelineEvents`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `CIVisibilityPipelinesApi.AggregateCIAppPipelineEvents`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save
the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Aggregate pipelines events returns "OK" response ``` // Aggregate pipelines events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityPipelinesApi; import com.datadog.api.client.v2.model.CIAppAggregationFunction; import com.datadog.api.client.v2.model.CIAppCompute; import com.datadog.api.client.v2.model.CIAppComputeType; import com.datadog.api.client.v2.model.CIAppGroupByTotal; import com.datadog.api.client.v2.model.CIAppPipelinesAggregateRequest; import com.datadog.api.client.v2.model.CIAppPipelinesAnalyticsAggregateResponse; import com.datadog.api.client.v2.model.CIAppPipelinesGroupBy; import com.datadog.api.client.v2.model.CIAppPipelinesQueryFilter; import com.datadog.api.client.v2.model.CIAppQueryOptions; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityPipelinesApi apiInstance = new CiVisibilityPipelinesApi(defaultClient); CIAppPipelinesAggregateRequest body = new CIAppPipelinesAggregateRequest() .compute( Collections.singletonList( new CIAppCompute() .aggregation(CIAppAggregationFunction.PERCENTILE_90) .metric("@duration") .type(CIAppComputeType.TOTAL))) .filter( new CIAppPipelinesQueryFilter() .from("now-15m") .query("@ci.provider.name:(gitlab OR github)") .to("now")) .groupBy( Collections.singletonList( new CIAppPipelinesGroupBy() .facet("@ci.status") .limit(10L) .total(new CIAppGroupByTotal(false)))) .options(new CIAppQueryOptions().timezone("GMT")); try { CIAppPipelinesAnalyticsAggregateResponse result = apiInstance.aggregateCIAppPipelineEvents(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CiVisibilityPipelinesApi#aggregateCIAppPipelineEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Aggregate pipelines events returns "OK" response ``` """ Aggregate pipelines events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_pipelines_api import CIVisibilityPipelinesApi from datadog_api_client.v2.model.ci_app_aggregation_function import CIAppAggregationFunction from datadog_api_client.v2.model.ci_app_compute import CIAppCompute from datadog_api_client.v2.model.ci_app_compute_type import CIAppComputeType from datadog_api_client.v2.model.ci_app_pipelines_aggregate_request import CIAppPipelinesAggregateRequest from datadog_api_client.v2.model.ci_app_pipelines_group_by import CIAppPipelinesGroupBy from datadog_api_client.v2.model.ci_app_pipelines_query_filter import CIAppPipelinesQueryFilter from datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions body = 
CIAppPipelinesAggregateRequest( compute=[ CIAppCompute( aggregation=CIAppAggregationFunction.PERCENTILE_90, metric="@duration", type=CIAppComputeType.TOTAL, ), ], filter=CIAppPipelinesQueryFilter( _from="now-15m", query="@ci.provider.name:(gitlab OR github)", to="now", ), group_by=[ CIAppPipelinesGroupBy( facet="@ci.status", limit=10, total=False, ), ], options=CIAppQueryOptions( timezone="GMT", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityPipelinesApi(api_client) response = api_instance.aggregate_ci_app_pipeline_events(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Aggregate pipelines events returns "OK" response ``` # Aggregate pipelines events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityPipelinesAPI.new body = DatadogAPIClient::V2::CIAppPipelinesAggregateRequest.new({ compute: [ DatadogAPIClient::V2::CIAppCompute.new({ aggregation: DatadogAPIClient::V2::CIAppAggregationFunction::PERCENTILE_90, metric: "@duration", type: DatadogAPIClient::V2::CIAppComputeType::TOTAL, }), ], filter: DatadogAPIClient::V2::CIAppPipelinesQueryFilter.new({ from: "now-15m", query: "@ci.provider.name:(gitlab OR github)", to: "now", }), group_by: [ DatadogAPIClient::V2::CIAppPipelinesGroupBy.new({ facet: "@ci.status", limit: 10, total: false, }), ], options: DatadogAPIClient::V2::CIAppQueryOptions.new({ timezone: "GMT", }), }) p api_instance.aggregate_ci_app_pipeline_events(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Aggregate pipelines events returns "OK" response ``` // Aggregate pipelines events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_pipelines::CIVisibilityPipelinesAPI; use datadog_api_client::datadogV2::model::CIAppAggregationFunction; use datadog_api_client::datadogV2::model::CIAppCompute; use datadog_api_client::datadogV2::model::CIAppComputeType; use datadog_api_client::datadogV2::model::CIAppGroupByTotal; use datadog_api_client::datadogV2::model::CIAppPipelinesAggregateRequest; use datadog_api_client::datadogV2::model::CIAppPipelinesGroupBy; use datadog_api_client::datadogV2::model::CIAppPipelinesQueryFilter; use datadog_api_client::datadogV2::model::CIAppQueryOptions; #[tokio::main] async fn main() { let body = CIAppPipelinesAggregateRequest::new() .compute(vec![CIAppCompute::new( CIAppAggregationFunction::PERCENTILE_90, ) .metric("@duration".to_string()) .type_(CIAppComputeType::TOTAL)]) .filter( CIAppPipelinesQueryFilter::new() .from("now-15m".to_string()) .query("@ci.provider.name:(gitlab OR github)".to_string()) .to("now".to_string()), ) .group_by(vec![CIAppPipelinesGroupBy::new("@ci.status".to_string()) .limit(10) .total(CIAppGroupByTotal::CIAppGroupByTotalBoolean(false))]) 
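        // `total: false` means no additional "$facet_total" bucket is created for this group-by
        // (the boolean variant of CIAppGroupByTotal described in the request body fields above).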
        .options(CIAppQueryOptions::new().timezone("GMT".to_string()));
    let configuration = datadog::Configuration::new();
    let api = CIVisibilityPipelinesAPI::with_config(configuration);
    let resp = api.aggregate_ci_app_pipeline_events(body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Aggregate pipelines events returns "OK" response

```
/**
 * Aggregate pipelines events returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.CIVisibilityPipelinesApi(configuration);

const params: v2.CIVisibilityPipelinesApiAggregateCIAppPipelineEventsRequest = {
  body: {
    compute: [
      {
        aggregation: "pc90",
        metric: "@duration",
        type: "total",
      },
    ],
    filter: {
      from: "now-15m",
      query: "@ci.provider.name:(gitlab OR github)",
      to: "now",
    },
    groupBy: [
      {
        facet: "@ci.status",
        limit: 10,
        total: false,
      },
    ],
    options: {
      timezone: "GMT",
    },
  },
};

apiInstance
  .aggregateCIAppPipelineEvents(params)
  .then((data: v2.CIAppPipelinesAnalyticsAggregateResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site (see the list above).
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

---

# Source: https://docs.datadoghq.com/api/latest/ci-visibility-tests

# CI Visibility Tests

Search or aggregate your CI Visibility test events over HTTP.
See the [Test Visibility in Datadog page](https://docs.datadoghq.com/tests/) for more information. ## [Get a list of tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#get-a-list-of-tests-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#get-a-list-of-tests-events-v2) GET https://api.ap1.datadoghq.com/api/v2/ci/tests/eventshttps://api.ap2.datadoghq.com/api/v2/ci/tests/eventshttps://api.datadoghq.eu/api/v2/ci/tests/eventshttps://api.ddog-gov.com/api/v2/ci/tests/eventshttps://api.datadoghq.com/api/v2/ci/tests/eventshttps://api.us3.datadoghq.com/api/v2/ci/tests/eventshttps://api.us5.datadoghq.com/api/v2/ci/tests/events ### Overview List endpoint returns CI Visibility test events that match a [search query](https://docs.datadoghq.com/continuous_integration/explorer/search_syntax/). [Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to see your latest test events. This endpoint requires any of the following permissions: * `ci_visibility_read` * `test_optimization_read` OAuth apps require the `test_optimization_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-tests) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following log syntax. filter[from] string Minimum timestamp for requested events. filter[to] string Maximum timestamp for requested events. sort enum Order of events in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of events in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#ListCIAppTestEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#ListCIAppTestEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#ListCIAppTestEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#ListCIAppTestEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) Response object with all test events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from CI Visibility test events. tags [string] Array of tags associated with your event. test_level enum Test run level. Allowed enum values: `session,module,suite,test` id string Unique ID of the event. type enum Type of the event. Allowed enum values: `citest` links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. 
code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "tags": [ "team:A" ], "test_level": "test" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "citest" } ], "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=typescript) ##### Get a list of tests events Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/tests/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of tests events ``` """ Get a list of tests events returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_tests_api import CIVisibilityTestsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityTestsApi(api_client) response = api_instance.list_ci_app_test_events( filter_query="@test.service:web-ui-tests", filter_from=(datetime.now() + relativedelta(seconds=-30)), filter_to=datetime.now(), page_limit=5, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of tests events ``` # Get a list of tests events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityTestsAPI.new opts = { filter_query: "@test.service:web-ui-tests", filter_from: (Time.now + -30), filter_to: Time.now, page_limit: 5, } p api_instance.list_ci_app_test_events(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of tests events ``` // Get a list of tests events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityTestsApi(apiClient) resp, r, err := api.ListCIAppTestEvents(ctx, *datadogV2.NewListCIAppTestEventsOptionalParameters().WithFilterQuery("@test.service:web-ui-tests").WithFilterFrom(time.Now().Add(time.Second * -30)).WithFilterTo(time.Now()).WithPageLimit(5)) if err != nil { 
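        // err is non-nil when the request fails (for example, the documented 400, 403, or 429 responses);
        // r holds the raw HTTP response that is printed below for troubleshooting.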
fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityTestsApi.ListCIAppTestEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityTestsApi.ListCIAppTestEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of tests events ``` // Get a list of tests events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityTestsApi; import com.datadog.api.client.v2.api.CiVisibilityTestsApi.ListCIAppTestEventsOptionalParameters; import com.datadog.api.client.v2.model.CIAppTestEventsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityTestsApi apiInstance = new CiVisibilityTestsApi(defaultClient); try { CIAppTestEventsResponse result = apiInstance.listCIAppTestEvents( new ListCIAppTestEventsOptionalParameters() .filterQuery("@test.service:web-ui-tests") .filterFrom(OffsetDateTime.now().plusSeconds(-30)) .filterTo(OffsetDateTime.now()) .pageLimit(5)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CiVisibilityTestsApi#listCIAppTestEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of tests events ``` // Get a list of tests events returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_tests::CIVisibilityTestsAPI; use datadog_api_client::datadogV2::api_ci_visibility_tests::ListCIAppTestEventsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CIVisibilityTestsAPI::with_config(configuration); let resp = api .list_ci_app_test_events( ListCIAppTestEventsOptionalParams::default() .filter_query("@test.service:web-ui-tests".to_string()) .filter_from( DateTime::parse_from_rfc3339("2021-11-11T11:10:41+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .filter_to( DateTime::parse_from_rfc3339("2021-11-11T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .page_limit(5), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
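# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.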
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of tests events ``` /** * Get a list of tests events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityTestsApi(configuration); const params: v2.CIVisibilityTestsApiListCIAppTestEventsRequest = { filterQuery: "@test.service:web-ui-tests", filterFrom: new Date(new Date().getTime() + -30 * 1000), filterTo: new Date(), pageLimit: 5, }; apiInstance .listCIAppTestEvents(params) .then((data: v2.CIAppTestEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#search-tests-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#search-tests-events-v2) POST https://api.ap1.datadoghq.com/api/v2/ci/tests/events/searchhttps://api.ap2.datadoghq.com/api/v2/ci/tests/events/searchhttps://api.datadoghq.eu/api/v2/ci/tests/events/searchhttps://api.ddog-gov.com/api/v2/ci/tests/events/searchhttps://api.datadoghq.com/api/v2/ci/tests/events/searchhttps://api.us3.datadoghq.com/api/v2/ci/tests/events/searchhttps://api.us5.datadoghq.com/api/v2/ci/tests/events/search ### Overview List endpoint returns CI Visibility test events that match a [search query](https://docs.datadoghq.com/continuous_integration/explorer/search_syntax/). [Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to build complex events filtering and search. This endpoint requires any of the following permissions: * `ci_visibility_read` * `test_optimization_read` OAuth apps require the `test_optimization_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-tests) to access this endpoint. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) Field Type Description filter object The search and filter query settings. from string The minimum time for the requested events; supports date, math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the CI Visibility Explorer search syntax. default: `*` to string The maximum time for the requested events, supports date, math, and regular timestamps (in milliseconds). default: `now` options object Global query options that are used during the query. Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing events. 
cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of events in the response. default: `10` sort enum Sort parameters when querying events. Allowed enum values: `timestamp,-timestamp` ##### Search tests events returns "OK" response ``` { "filter": { "from": "now-15m", "query": "@test.service:web-ui-tests AND @test.status:skip", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } ``` Copy ##### Search tests events returns "OK" response with pagination ``` { "filter": { "from": "now-15m", "query": "@test.status:pass AND -@language:python", "to": "now" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#SearchCIAppTestEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#SearchCIAppTestEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#SearchCIAppTestEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#SearchCIAppTestEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) Response object with all test events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from CI Visibility test events. tags [string] Array of tags associated with your event. test_level enum Test run level. Allowed enum values: `session,module,suite,test` id string Unique ID of the event. type enum Type of the event. Allowed enum values: `citest` links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. 
``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "tags": [ "team:A" ], "test_level": "test" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "citest" } ], "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=typescript) ##### Search tests events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/tests/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "query": "@test.service:web-ui-tests AND @test.status:skip", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } EOF ``` ##### Search tests events returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/tests/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": 
"now-15m", "query": "@test.status:pass AND -@language:python", "to": "now" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### Search tests events returns "OK" response ``` // Search tests events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppTestEventsRequest{ Filter: &datadogV2.CIAppTestsQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@test.service:web-ui-tests AND @test.status:skip"), To: datadog.PtrString("now"), }, Options: &datadogV2.CIAppQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.CIAppQueryPageOptions{ Limit: datadog.PtrInt32(25), }, Sort: datadogV2.CIAPPSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityTestsApi(apiClient) resp, r, err := api.SearchCIAppTestEvents(ctx, *datadogV2.NewSearchCIAppTestEventsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityTestsApi.SearchCIAppTestEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityTestsApi.SearchCIAppTestEvents`:\n%s\n", responseContent) } ``` Copy ##### Search tests events returns "OK" response with pagination ``` // Search tests events returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppTestEventsRequest{ Filter: &datadogV2.CIAppTestsQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@test.status:pass AND -@language:python"), To: datadog.PtrString("now"), }, Page: &datadogV2.CIAppQueryPageOptions{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.CIAPPSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityTestsApi(apiClient) resp, _ := api.SearchCIAppTestEventsWithPagination(ctx, *datadogV2.NewSearchCIAppTestEventsOptionalParameters().WithBody(body)) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityTestsApi.SearchCIAppTestEvents`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search tests events returns "OK" response ``` // Search tests events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityTestsApi; import com.datadog.api.client.v2.api.CiVisibilityTestsApi.SearchCIAppTestEventsOptionalParameters; import 
com.datadog.api.client.v2.model.CIAppQueryOptions; import com.datadog.api.client.v2.model.CIAppQueryPageOptions; import com.datadog.api.client.v2.model.CIAppSort; import com.datadog.api.client.v2.model.CIAppTestEventsRequest; import com.datadog.api.client.v2.model.CIAppTestEventsResponse; import com.datadog.api.client.v2.model.CIAppTestsQueryFilter; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityTestsApi apiInstance = new CiVisibilityTestsApi(defaultClient); CIAppTestEventsRequest body = new CIAppTestEventsRequest() .filter( new CIAppTestsQueryFilter() .from("now-15m") .query("@test.service:web-ui-tests AND @test.status:skip") .to("now")) .options(new CIAppQueryOptions().timezone("GMT")) .page(new CIAppQueryPageOptions().limit(25)) .sort(CIAppSort.TIMESTAMP_ASCENDING); try { CIAppTestEventsResponse result = apiInstance.searchCIAppTestEvents( new SearchCIAppTestEventsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CiVisibilityTestsApi#searchCIAppTestEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search tests events returns "OK" response with pagination ``` // Search tests events returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.CiVisibilityTestsApi; import com.datadog.api.client.v2.api.CiVisibilityTestsApi.SearchCIAppTestEventsOptionalParameters; import com.datadog.api.client.v2.model.CIAppQueryPageOptions; import com.datadog.api.client.v2.model.CIAppSort; import com.datadog.api.client.v2.model.CIAppTestEvent; import com.datadog.api.client.v2.model.CIAppTestEventsRequest; import com.datadog.api.client.v2.model.CIAppTestsQueryFilter; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityTestsApi apiInstance = new CiVisibilityTestsApi(defaultClient); CIAppTestEventsRequest body = new CIAppTestEventsRequest() .filter( new CIAppTestsQueryFilter() .from("now-15m") .query("@test.status:pass AND -@language:python") .to("now")) .page(new CIAppQueryPageOptions().limit(2)) .sort(CIAppSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchCIAppTestEventsWithPagination( new SearchCIAppTestEventsOptionalParameters().body(body)); for (CIAppTestEvent item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println( "Exception when calling CiVisibilityTestsApi#searchCIAppTestEventsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search tests events returns "OK" response ``` """ Search tests events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_tests_api import CIVisibilityTestsApi from 
datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions from datadog_api_client.v2.model.ci_app_query_page_options import CIAppQueryPageOptions from datadog_api_client.v2.model.ci_app_sort import CIAppSort from datadog_api_client.v2.model.ci_app_test_events_request import CIAppTestEventsRequest from datadog_api_client.v2.model.ci_app_tests_query_filter import CIAppTestsQueryFilter body = CIAppTestEventsRequest( filter=CIAppTestsQueryFilter( _from="now-15m", query="@test.service:web-ui-tests AND @test.status:skip", to="now", ), options=CIAppQueryOptions( timezone="GMT", ), page=CIAppQueryPageOptions( limit=25, ), sort=CIAppSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityTestsApi(api_client) response = api_instance.search_ci_app_test_events(body=body) print(response) ``` Copy ##### Search tests events returns "OK" response with pagination ``` """ Search tests events returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_tests_api import CIVisibilityTestsApi from datadog_api_client.v2.model.ci_app_query_page_options import CIAppQueryPageOptions from datadog_api_client.v2.model.ci_app_sort import CIAppSort from datadog_api_client.v2.model.ci_app_test_events_request import CIAppTestEventsRequest from datadog_api_client.v2.model.ci_app_tests_query_filter import CIAppTestsQueryFilter body = CIAppTestEventsRequest( filter=CIAppTestsQueryFilter( _from="now-15m", query="@test.status:pass AND -@language:python", to="now", ), page=CIAppQueryPageOptions( limit=2, ), sort=CIAppSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityTestsApi(api_client) items = api_instance.search_ci_app_test_events_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search tests events returns "OK" response ``` # Search tests events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityTestsAPI.new body = DatadogAPIClient::V2::CIAppTestEventsRequest.new({ filter: DatadogAPIClient::V2::CIAppTestsQueryFilter.new({ from: "now-15m", query: "@test.service:web-ui-tests AND @test.status:skip", to: "now", }), options: DatadogAPIClient::V2::CIAppQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::CIAppQueryPageOptions.new({ limit: 25, }), sort: DatadogAPIClient::V2::CIAppSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } p api_instance.search_ci_app_test_events(opts) ``` Copy ##### Search tests events returns "OK" response with pagination ``` # Search tests events returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityTestsAPI.new body = DatadogAPIClient::V2::CIAppTestEventsRequest.new({ filter: DatadogAPIClient::V2::CIAppTestsQueryFilter.new({ from: "now-15m", query: "@test.status:pass AND -@language:python", to: "now", }), page: DatadogAPIClient::V2::CIAppQueryPageOptions.new({ limit: 2, }), sort: DatadogAPIClient::V2::CIAppSort::TIMESTAMP_ASCENDING, }) opts = { 
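  # The search request body built above is passed to the endpoint through the :body option.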
body: body, } api_instance.search_ci_app_test_events_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search tests events returns "OK" response ``` // Search tests events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_tests::CIVisibilityTestsAPI; use datadog_api_client::datadogV2::api_ci_visibility_tests::SearchCIAppTestEventsOptionalParams; use datadog_api_client::datadogV2::model::CIAppQueryOptions; use datadog_api_client::datadogV2::model::CIAppQueryPageOptions; use datadog_api_client::datadogV2::model::CIAppSort; use datadog_api_client::datadogV2::model::CIAppTestEventsRequest; use datadog_api_client::datadogV2::model::CIAppTestsQueryFilter; #[tokio::main] async fn main() { let body = CIAppTestEventsRequest::new() .filter( CIAppTestsQueryFilter::new() .from("now-15m".to_string()) .query("@test.service:web-ui-tests AND @test.status:skip".to_string()) .to("now".to_string()), ) .options(CIAppQueryOptions::new().timezone("GMT".to_string())) .page(CIAppQueryPageOptions::new().limit(25)) .sort(CIAppSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = CIVisibilityTestsAPI::with_config(configuration); let resp = api .search_ci_app_test_events(SearchCIAppTestEventsOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search tests events returns "OK" response with pagination ``` // Search tests events returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_tests::CIVisibilityTestsAPI; use datadog_api_client::datadogV2::api_ci_visibility_tests::SearchCIAppTestEventsOptionalParams; use datadog_api_client::datadogV2::model::CIAppQueryPageOptions; use datadog_api_client::datadogV2::model::CIAppSort; use datadog_api_client::datadogV2::model::CIAppTestEventsRequest; use datadog_api_client::datadogV2::model::CIAppTestsQueryFilter; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = CIAppTestEventsRequest::new() .filter( CIAppTestsQueryFilter::new() .from("now-15m".to_string()) .query("@test.status:pass AND -@language:python".to_string()) .to("now".to_string()), ) .page(CIAppQueryPageOptions::new().limit(2)) .sort(CIAppSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = CIVisibilityTestsAPI::with_config(configuration); let response = api.search_ci_app_test_events_with_pagination( SearchCIAppTestEventsOptionalParams::default().body(body), ); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search 
tests events returns "OK" response ``` /** * Search tests events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityTestsApi(configuration); const params: v2.CIVisibilityTestsApiSearchCIAppTestEventsRequest = { body: { filter: { from: "now-15m", query: "@test.service:web-ui-tests AND @test.status:skip", to: "now", }, options: { timezone: "GMT", }, page: { limit: 25, }, sort: "timestamp", }, }; apiInstance .searchCIAppTestEvents(params) .then((data: v2.CIAppTestEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search tests events returns "OK" response with pagination ``` /** * Search tests events returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityTestsApi(configuration); const params: v2.CIVisibilityTestsApiSearchCIAppTestEventsRequest = { body: { filter: { from: "now-15m", query: "@test.status:pass AND -@language:python", to: "now", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchCIAppTestEventsWithPagination( params )) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Aggregate tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#aggregate-tests-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#aggregate-tests-events-v2) POST https://api.ap1.datadoghq.com/api/v2/ci/tests/analytics/aggregatehttps://api.ap2.datadoghq.com/api/v2/ci/tests/analytics/aggregatehttps://api.datadoghq.eu/api/v2/ci/tests/analytics/aggregatehttps://api.ddog-gov.com/api/v2/ci/tests/analytics/aggregatehttps://api.datadoghq.com/api/v2/ci/tests/analytics/aggregatehttps://api.us3.datadoghq.com/api/v2/ci/tests/analytics/aggregatehttps://api.us5.datadoghq.com/api/v2/ci/tests/analytics/aggregate ### Overview The API endpoint to aggregate CI Visibility test events into buckets of computed metrics and timeseries. This endpoint requires any of the following permissions: * `ci_visibility_read` * `test_optimization_read` OAuth apps require the `test_optimization_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ci-visibility-tests) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) Field Type Description compute [object] The list of metrics or timeseries to compute for the retrieved buckets. aggregation [_required_] enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median,latest,earliest,most_frequent,delta` interval string The time buckets' size (only used for type=timeseries) Defaults to a resolution of 150 points. metric string The metric to use. type enum The type of compute. 
Allowed enum values: `timeseries,total` default: `total` filter object The search and filter query settings. from string The minimum time for the requested events; supports date, math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the CI Visibility Explorer search syntax. default: `*` to string The maximum time for the requested events, supports date, math, and regular timestamps (in milliseconds). default: `now` group_by [object] The rules for the group-by. facet [_required_] string The name of the facet to use (required). histogram object Used to perform a histogram computation (only for measure facets). At most, 100 buckets are allowed, the number of buckets is `(max - min)/interval`. interval [_required_] double The bin size of the histogram buckets. max [_required_] double The maximum value for the measure used in the histogram (values greater than this one are filtered out). min [_required_] double The minimum value for the measure used in the histogram (values smaller than this one are filtered out). limit int64 The maximum buckets to return for this group-by. default: `10` missing The value to use for logs that don't have the facet used to group-by. Option 1 string The missing value to use if there is a string valued facet. Option 2 double The missing value to use if there is a number valued facet. sort object A sort rule. The `aggregation` field is required when `type` is `measure`. aggregation enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median,latest,earliest,most_frequent,delta` metric string The metric to sort by (only used for `type=measure`). order enum The order to use, ascending or descending. Allowed enum values: `asc,desc` type enum The type of sorting algorithm. Allowed enum values: `alphabetical,measure` default: `alphabetical` total A resulting object to put the given computes in over all the matching records. Option 1 boolean If set to true, creates an additional bucket labeled "$facet_total". Option 2 string A string to use as the key value for the total bucket. Option 3 double A number to use as the key value for the total bucket. options object Global query options that are used during the query. Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` ``` { "compute": [ { "aggregation": "count", "metric": "@test.is_flaky", "type": "total" } ], "filter": { "from": "now-15m", "query": "@language:(python OR go)", "to": "now" }, "group_by": [ { "facet": "@git.branch", "limit": 10, "sort": { "order": "asc" }, "total": false } ], "options": { "timezone": "GMT" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#AggregateCIAppTestEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#AggregateCIAppTestEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#AggregateCIAppTestEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#AggregateCIAppTestEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) The response object for the test events aggregate API endpoint. 
Field Type Description data object The query results. buckets [object] The list of matching buckets, one item per bucket. by object The key-value pairs for each group-by. The values for each group-by. computes object A map of the metric name to value for regular compute, or a list of values for a timeseries. A bucket value, can either be a timeseries or a single value. Option 1 string A single string value. Option 2 double A single number value. Option 3 [object] A timeseries array. time date-time The time value for this point. value double The value for this point. links object Links attributes. next string Link for the next set of results. The request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": { "buckets": [ { "by": { "": "undefined" }, "computes": { "": { "description": "undefined", "type": "undefined" } } } ] }, "links": { "next": "https://app.datadoghq.com/api/v2/ci/tests/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Example](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ci-visibility-tests/?code-lang=typescript) ##### Aggregate tests events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ci/tests/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "compute": [ { "aggregation": "count", "metric": "@test.is_flaky", "type": "total" } ], "filter": { "from": "now-15m", "query": "@language:(python OR go)", "to": "now" }, "group_by": [ { "facet": "@git.branch", "limit": 10, "sort": { "order": "asc" }, "total": false } ], "options": { "timezone": "GMT" } } EOF ``` ##### Aggregate tests events returns "OK" response ``` // Aggregate tests events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CIAppTestsAggregateRequest{ Compute: []datadogV2.CIAppCompute{ { Aggregation: datadogV2.CIAPPAGGREGATIONFUNCTION_COUNT, Metric: datadog.PtrString("@test.is_flaky"), Type: datadogV2.CIAPPCOMPUTETYPE_TOTAL.Ptr(), }, }, Filter: &datadogV2.CIAppTestsQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@language:(python OR go)"), To: datadog.PtrString("now"), }, GroupBy: []datadogV2.CIAppTestsGroupBy{ { Facet: "@git.branch", Limit: datadog.PtrInt64(10), Sort: &datadogV2.CIAppAggregateSort{ Order: datadogV2.CIAPPSORTORDER_ASCENDING.Ptr(), }, Total: &datadogV2.CIAppGroupByTotal{ CIAppGroupByTotalBoolean: datadog.PtrBool(false)}, }, }, Options: &datadogV2.CIAppQueryOptions{ Timezone: datadog.PtrString("GMT"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCIVisibilityTestsApi(apiClient) resp, r, err := api.AggregateCIAppTestEvents(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CIVisibilityTestsApi.AggregateCIAppTestEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CIVisibilityTestsApi.AggregateCIAppTestEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Aggregate tests events returns "OK" response ``` // Aggregate tests 
events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CiVisibilityTestsApi; import com.datadog.api.client.v2.model.CIAppAggregateSort; import com.datadog.api.client.v2.model.CIAppAggregationFunction; import com.datadog.api.client.v2.model.CIAppCompute; import com.datadog.api.client.v2.model.CIAppComputeType; import com.datadog.api.client.v2.model.CIAppGroupByTotal; import com.datadog.api.client.v2.model.CIAppQueryOptions; import com.datadog.api.client.v2.model.CIAppSortOrder; import com.datadog.api.client.v2.model.CIAppTestsAggregateRequest; import com.datadog.api.client.v2.model.CIAppTestsAnalyticsAggregateResponse; import com.datadog.api.client.v2.model.CIAppTestsGroupBy; import com.datadog.api.client.v2.model.CIAppTestsQueryFilter; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CiVisibilityTestsApi apiInstance = new CiVisibilityTestsApi(defaultClient); CIAppTestsAggregateRequest body = new CIAppTestsAggregateRequest() .compute( Collections.singletonList( new CIAppCompute() .aggregation(CIAppAggregationFunction.COUNT) .metric("@test.is_flaky") .type(CIAppComputeType.TOTAL))) .filter( new CIAppTestsQueryFilter() .from("now-15m") .query("@language:(python OR go)") .to("now")) .groupBy( Collections.singletonList( new CIAppTestsGroupBy() .facet("@git.branch") .limit(10L) .sort(new CIAppAggregateSort().order(CIAppSortOrder.ASCENDING)) .total(new CIAppGroupByTotal(false)))) .options(new CIAppQueryOptions().timezone("GMT")); try { CIAppTestsAnalyticsAggregateResponse result = apiInstance.aggregateCIAppTestEvents(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CiVisibilityTestsApi#aggregateCIAppTestEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Aggregate tests events returns "OK" response ``` """ Aggregate tests events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ci_visibility_tests_api import CIVisibilityTestsApi from datadog_api_client.v2.model.ci_app_aggregate_sort import CIAppAggregateSort from datadog_api_client.v2.model.ci_app_aggregation_function import CIAppAggregationFunction from datadog_api_client.v2.model.ci_app_compute import CIAppCompute from datadog_api_client.v2.model.ci_app_compute_type import CIAppComputeType from datadog_api_client.v2.model.ci_app_query_options import CIAppQueryOptions from datadog_api_client.v2.model.ci_app_sort_order import CIAppSortOrder from datadog_api_client.v2.model.ci_app_tests_aggregate_request import CIAppTestsAggregateRequest from datadog_api_client.v2.model.ci_app_tests_group_by import CIAppTestsGroupBy from datadog_api_client.v2.model.ci_app_tests_query_filter import CIAppTestsQueryFilter body = CIAppTestsAggregateRequest( compute=[ CIAppCompute( aggregation=CIAppAggregationFunction.COUNT, 
metric="@test.is_flaky", type=CIAppComputeType.TOTAL, ), ], filter=CIAppTestsQueryFilter( _from="now-15m", query="@language:(python OR go)", to="now", ), group_by=[ CIAppTestsGroupBy( facet="@git.branch", limit=10, sort=CIAppAggregateSort( order=CIAppSortOrder.ASCENDING, ), total=False, ), ], options=CIAppQueryOptions( timezone="GMT", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CIVisibilityTestsApi(api_client) response = api_instance.aggregate_ci_app_test_events(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Aggregate tests events returns "OK" response ``` # Aggregate tests events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CIVisibilityTestsAPI.new body = DatadogAPIClient::V2::CIAppTestsAggregateRequest.new({ compute: [ DatadogAPIClient::V2::CIAppCompute.new({ aggregation: DatadogAPIClient::V2::CIAppAggregationFunction::COUNT, metric: "@test.is_flaky", type: DatadogAPIClient::V2::CIAppComputeType::TOTAL, }), ], filter: DatadogAPIClient::V2::CIAppTestsQueryFilter.new({ from: "now-15m", query: "@language:(python OR go)", to: "now", }), group_by: [ DatadogAPIClient::V2::CIAppTestsGroupBy.new({ facet: "@git.branch", limit: 10, sort: DatadogAPIClient::V2::CIAppAggregateSort.new({ order: DatadogAPIClient::V2::CIAppSortOrder::ASCENDING, }), total: false, }), ], options: DatadogAPIClient::V2::CIAppQueryOptions.new({ timezone: "GMT", }), }) p api_instance.aggregate_ci_app_test_events(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Aggregate tests events returns "OK" response ``` // Aggregate tests events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ci_visibility_tests::CIVisibilityTestsAPI; use datadog_api_client::datadogV2::model::CIAppAggregateSort; use datadog_api_client::datadogV2::model::CIAppAggregationFunction; use datadog_api_client::datadogV2::model::CIAppCompute; use datadog_api_client::datadogV2::model::CIAppComputeType; use datadog_api_client::datadogV2::model::CIAppGroupByTotal; use datadog_api_client::datadogV2::model::CIAppQueryOptions; use datadog_api_client::datadogV2::model::CIAppSortOrder; use datadog_api_client::datadogV2::model::CIAppTestsAggregateRequest; use datadog_api_client::datadogV2::model::CIAppTestsGroupBy; use datadog_api_client::datadogV2::model::CIAppTestsQueryFilter; #[tokio::main] async fn main() { let body = CIAppTestsAggregateRequest::new() .compute(vec![CIAppCompute::new(CIAppAggregationFunction::COUNT) .metric("@test.is_flaky".to_string()) .type_(CIAppComputeType::TOTAL)]) .filter( CIAppTestsQueryFilter::new() .from("now-15m".to_string()) .query("@language:(python OR go)".to_string()) .to("now".to_string()), ) .group_by(vec![CIAppTestsGroupBy::new("@git.branch".to_string()) .limit(10) 
.sort(CIAppAggregateSort::new().order(CIAppSortOrder::ASCENDING)) .total(CIAppGroupByTotal::CIAppGroupByTotalBoolean(false))]) .options(CIAppQueryOptions::new().timezone("GMT".to_string())); let configuration = datadog::Configuration::new(); let api = CIVisibilityTestsAPI::with_config(configuration); let resp = api.aggregate_ci_app_test_events(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` (set `DD_SITE` to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com) ##### Aggregate tests events returns "OK" response ``` /** * Aggregate tests events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CIVisibilityTestsApi(configuration); const params: v2.CIVisibilityTestsApiAggregateCIAppTestEventsRequest = { body: { compute: [ { aggregation: "count", metric: "@test.is_flaky", type: "total", }, ], filter: { from: "now-15m", query: "@language:(python OR go)", to: "now", }, groupBy: [ { facet: "@git.branch", limit: 10, sort: { order: "asc", }, total: false, }, ], options: { timezone: "GMT", }, }, }; apiInstance .aggregateCIAppTestEvents(params) .then((data: v2.CIAppTestsAnalyticsAggregateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` (set `DD_SITE` to your Datadog site, as above)
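All of the examples above request a single `total` compute. Going by the request model described earlier, a timeseries compute differs only in the compute `type` and the bucket `interval`. The TypeScript sketch below is a hedged illustration of such a request: it assumes the client's compute object exposes the `interval` field from the request model, and the `@duration` metric, `@test.service` facet, and query string are illustrative placeholders rather than values taken from the examples above.

```
/**
 * Sketch: aggregate test events into five-minute timeseries buckets.
 * The metric, facet, and query below are illustrative placeholders.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.CIVisibilityTestsApi(configuration);

const params: v2.CIVisibilityTestsApiAggregateCIAppTestEventsRequest = {
  body: {
    compute: [
      {
        aggregation: "pc95", // one of the allowed aggregation functions
        metric: "@duration", // illustrative measure facet
        type: "timeseries", // "total" is the default compute type
        interval: "5m", // bucket size; only used when type is "timeseries"
      },
    ],
    filter: {
      from: "now-1h",
      query: "@test.status:fail", // illustrative search query
      to: "now",
    },
    groupBy: [
      {
        facet: "@test.service", // illustrative facet to group buckets by
        limit: 10,
      },
    ],
    options: {
      timezone: "GMT",
    },
  },
};

apiInstance
  .aggregateCIAppTestEvents(params)
  .then((data: v2.CIAppTestsAnalyticsAggregateResponse) => {
    console.log(JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

With a timeseries compute, each returned bucket should carry an array of `{time, value}` points rather than a single number, matching the timeseries bucket value described in the response model above.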
* * * --- # Source: https://docs.datadoghq.com/api/latest/cloud-cost-management # Cloud Cost Management The Cloud Cost Management API allows you to set up, edit, and delete Cloud Cost Management accounts for AWS, Azure, and Google Cloud.
You can query your cost data by using the [Metrics endpoint](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-data-across-multiple-products) and the `cloud_cost` data source. For more information, see the [Cloud Cost Management documentation](https://docs.datadoghq.com/cloud_cost_management/). ## [List Cloud Cost Management AWS CUR configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-aws-cur-configs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-aws-cur-configs-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.ap2.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.datadoghq.eu/api/v2/cost/aws_cur_confighttps://api.ddog-gov.com/api/v2/cost/aws_cur_confighttps://api.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.us3.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.us5.datadoghq.com/api/v2/cost/aws_cur_config ### Overview List the AWS CUR configs. This endpoint requires the `cloud_cost_management_read` permission. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAWSCURConfigs-200-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAWSCURConfigs-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAWSCURConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) List of AWS CUR configs. Field Type Description data [_required_] [object] An AWS CUR config. attributes [_required_] object Attributes for An AWS CUR config. account_filters object The account filtering configuration. excluded_accounts [string] The AWS account IDs to be excluded from your billing dataset. This field is used when `include_new_accounts` is `true`. include_new_accounts boolean Whether or not to automatically include new member accounts by default in your billing dataset. included_accounts [string] The AWS account IDs to be included in your billing dataset. This field is used when `include_new_accounts` is `false`. account_id [_required_] string The AWS account ID. bucket_name [_required_] string The AWS bucket name used to store the Cost and Usage Report. bucket_region [_required_] string The region the bucket is located in. created_at string The timestamp when the AWS CUR config was created. error_messages [string] The error messages for the AWS CUR config. months int32 **DEPRECATED** : The number of months the report has been backfilled. report_name [_required_] string The name of the Cost and Usage Report. report_prefix [_required_] string The report prefix used for the Cost and Usage Report. status [_required_] string The status of the AWS CUR. status_updated_at string The timestamp when the AWS CUR config status was updated. updated_at string The timestamp when the AWS CUR config status was updated. id string The ID of the AWS CUR config. type [_required_] enum Type of AWS CUR config. 
Allowed enum values: `aws_cur_config` default: `aws_cur_config` ``` { "data": [ { "attributes": { "account_filters": { "excluded_accounts": [ "123456789123", "123456789143" ], "include_new_accounts": true, "included_accounts": [ "123456789123", "123456789143" ] }, "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "created_at": "string", "error_messages": [], "months": "integer", "report_name": "dd-report-name", "report_prefix": "dd-report-prefix", "status": "active", "status_updated_at": "string", "updated_at": "string" }, "id": "string", "type": "aws_cur_config" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### List Cloud Cost Management AWS CUR configs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Cloud Cost Management AWS CUR configs ``` """ List Cloud Cost Management AWS CUR configs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.list_cost_awscur_configs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Cloud Cost Management AWS CUR configs ``` # List Cloud Cost Management AWS CUR configs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.list_cost_awscur_configs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Cloud Cost Management AWS CUR configs ``` // List Cloud Cost Management AWS CUR configs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ListCostAWSCURConfigs(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListCostAWSCURConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListCostAWSCURConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Cloud Cost Management AWS CUR configs ``` // List Cloud Cost Management AWS CUR configs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AwsCURConfigsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { AwsCURConfigsResponse result = apiInstance.listCostAWSCURConfigs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listCostAWSCURConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Cloud Cost Management AWS CUR configs ``` // List Cloud Cost Management AWS CUR configs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_cost_awscur_configs().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
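# DD_SITE must be a single Datadog site, for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.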
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Cloud Cost Management AWS CUR configs ``` /** * List Cloud Cost Management AWS CUR configs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listCostAWSCURConfigs() .then((data: v2.AwsCURConfigsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-aws-cur-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-aws-cur-config-v2) POST https://api.ap1.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.ap2.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.datadoghq.eu/api/v2/cost/aws_cur_confighttps://api.ddog-gov.com/api/v2/cost/aws_cur_confighttps://api.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.us3.datadoghq.com/api/v2/cost/aws_cur_confighttps://api.us5.datadoghq.com/api/v2/cost/aws_cur_config ### Overview Create a Cloud Cost Management account for an AWS CUR config. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object AWS CUR config Post data. attributes object Attributes for AWS CUR config Post Request. account_filters object The account filtering configuration. excluded_accounts [string] The AWS account IDs to be excluded from your billing dataset. This field is used when `include_new_accounts` is `true`. include_new_accounts boolean Whether or not to automatically include new member accounts by default in your billing dataset. included_accounts [string] The AWS account IDs to be included in your billing dataset. This field is used when `include_new_accounts` is `false`. account_id [_required_] string The AWS account ID. bucket_name [_required_] string The AWS bucket name used to store the Cost and Usage Report. bucket_region string The region the bucket is located in. months int32 The month of the report. report_name [_required_] string The name of the Cost and Usage Report. report_prefix [_required_] string The report prefix used for the Cost and Usage Report. type [_required_] enum Type of AWS CUR config Post Request. 
Allowed enum values: `aws_cur_config_post_request` default: `aws_cur_config_post_request` ``` { "data": { "attributes": { "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "report_name": "dd-report-name", "report_prefix": "dd-report-prefix" }, "type": "aws_cur_config_post_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAWSCURConfig-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAWSCURConfig-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAWSCURConfig-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAWSCURConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `AwsCurConfigResponse` object. Field Type Description data object The definition of `AwsCurConfigResponseData` object. attributes object The definition of `AwsCurConfigResponseDataAttributes` object. account_filters object The definition of `AwsCurConfigResponseDataAttributesAccountFilters` object. excluded_accounts [string] The `account_filters` `excluded_accounts`. include_new_accounts boolean The `account_filters` `include_new_accounts`. included_accounts [string] The `account_filters` `included_accounts`. account_id string The `attributes` `account_id`. bucket_name string The `attributes` `bucket_name`. bucket_region string The `attributes` `bucket_region`. created_at string The `attributes` `created_at`. error_messages [string] The `attributes` `error_messages`. months int64 The `attributes` `months`. report_name string The `attributes` `report_name`. report_prefix string The `attributes` `report_prefix`. status string The `attributes` `status`. status_updated_at string The `attributes` `status_updated_at`. updated_at string The `attributes` `updated_at`. id string The `AwsCurConfigResponseData` `id`. type [_required_] enum AWS CUR config resource type. Allowed enum values: `aws_cur_config` default: `aws_cur_config` ``` { "data": { "attributes": { "account_filters": { "excluded_accounts": [ "123456789124", "123456789125" ], "include_new_accounts": true }, "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "created_at": "2023-01-01T12:00:00.000Z", "error_messages": [], "months": 36, "report_name": "dd-report-name", "report_prefix": "dd-report-prefix", "status": "active", "status_updated_at": "2023-01-01T12:00:00.000Z", "updated_at": "2023-01-01T12:00:00.000Z" }, "id": "123456789123", "type": "aws_cur_config" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create Cloud Cost Management AWS CUR config returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "report_name": "dd-report-name", "report_prefix": "dd-report-prefix" }, "type": "aws_cur_config_post_request" } } EOF ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` // Create Cloud Cost Management AWS CUR config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AwsCURConfigPostRequest{ Data: datadogV2.AwsCURConfigPostData{ Attributes: &datadogV2.AwsCURConfigPostRequestAttributes{ AccountId: "123456789123", BucketName: "dd-cost-bucket", BucketRegion: datadog.PtrString("us-east-1"), ReportName: "dd-report-name", ReportPrefix: "dd-report-prefix", }, Type: datadogV2.AWSCURCONFIGPOSTREQUESTTYPE_AWS_CUR_CONFIG_POST_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.CreateCostAWSCURConfig(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.CreateCostAWSCURConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.CreateCostAWSCURConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` // Create Cloud Cost Management AWS CUR config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AwsCURConfigPostData; import 
com.datadog.api.client.v2.model.AwsCURConfigPostRequest; import com.datadog.api.client.v2.model.AwsCURConfigPostRequestAttributes; import com.datadog.api.client.v2.model.AwsCURConfigPostRequestType; import com.datadog.api.client.v2.model.AwsCurConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); AwsCURConfigPostRequest body = new AwsCURConfigPostRequest() .data( new AwsCURConfigPostData() .attributes( new AwsCURConfigPostRequestAttributes() .accountId("123456789123") .bucketName("dd-cost-bucket") .bucketRegion("us-east-1") .reportName("dd-report-name") .reportPrefix("dd-report-prefix")) .type(AwsCURConfigPostRequestType.AWS_CUR_CONFIG_POST_REQUEST)); try { AwsCurConfigResponse result = apiInstance.createCostAWSCURConfig(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#createCostAWSCURConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` """ Create Cloud Cost Management AWS CUR config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.aws_cur_config_post_data import AwsCURConfigPostData from datadog_api_client.v2.model.aws_cur_config_post_request import AwsCURConfigPostRequest from datadog_api_client.v2.model.aws_cur_config_post_request_attributes import AwsCURConfigPostRequestAttributes from datadog_api_client.v2.model.aws_cur_config_post_request_type import AwsCURConfigPostRequestType body = AwsCURConfigPostRequest( data=AwsCURConfigPostData( attributes=AwsCURConfigPostRequestAttributes( account_id="123456789123", bucket_name="dd-cost-bucket", bucket_region="us-east-1", report_name="dd-report-name", report_prefix="dd-report-prefix", ), type=AwsCURConfigPostRequestType.AWS_CUR_CONFIG_POST_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.create_cost_awscur_config(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` # Create Cloud Cost Management AWS CUR config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::AwsCURConfigPostRequest.new({ data: 
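    # Per the request model above, account_id, bucket_name, report_name, and report_prefix
    # are required; bucket_region is optional.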
DatadogAPIClient::V2::AwsCURConfigPostData.new({ attributes: DatadogAPIClient::V2::AwsCURConfigPostRequestAttributes.new({ account_id: "123456789123", bucket_name: "dd-cost-bucket", bucket_region: "us-east-1", report_name: "dd-report-name", report_prefix: "dd-report-prefix", }), type: DatadogAPIClient::V2::AwsCURConfigPostRequestType::AWS_CUR_CONFIG_POST_REQUEST, }), }) p api_instance.create_cost_awscur_config(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` // Create Cloud Cost Management AWS CUR config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::AwsCURConfigPostData; use datadog_api_client::datadogV2::model::AwsCURConfigPostRequest; use datadog_api_client::datadogV2::model::AwsCURConfigPostRequestAttributes; use datadog_api_client::datadogV2::model::AwsCURConfigPostRequestType; #[tokio::main] async fn main() { let body = AwsCURConfigPostRequest::new( AwsCURConfigPostData::new(AwsCURConfigPostRequestType::AWS_CUR_CONFIG_POST_REQUEST) .attributes( AwsCURConfigPostRequestAttributes::new( "123456789123".to_string(), "dd-cost-bucket".to_string(), "dd-report-name".to_string(), "dd-report-prefix".to_string(), ) .bucket_region("us-east-1".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.create_cost_awscur_config(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Cloud Cost Management AWS CUR config returns "OK" response ``` /** * Create Cloud Cost Management AWS CUR config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiCreateCostAWSCURConfigRequest = { body: { data: { attributes: { accountId: "123456789123", bucketName: "dd-cost-bucket", bucketRegion: "us-east-1", reportName: "dd-report-name", reportPrefix: "dd-report-prefix", }, type: "aws_cur_config_post_request", }, }, }; apiInstance .createCostAWSCURConfig(params) .then((data: v2.AwsCurConfigResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-aws-cur-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-aws-cur-config-v2) PATCH https://api.ap1.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id} ### Overview Update the status (active/archived) and/or account filtering configuration of an AWS CUR config. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account id. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object AWS CUR config Patch data. attributes [_required_] object Attributes for AWS CUR config Patch Request. account_filters object The account filtering configuration. excluded_accounts [string] The AWS account IDs to be excluded from your billing dataset. This field is used when `include_new_accounts` is `true`. include_new_accounts boolean Whether or not to automatically include new member accounts by default in your billing dataset. included_accounts [string] The AWS account IDs to be included in your billing dataset. This field is used when `include_new_accounts` is `false`. is_enabled boolean Whether or not the Cloud Cost Management account is enabled. type [_required_] enum Type of AWS CUR config Patch Request. Allowed enum values: `aws_cur_config_patch_request` default: `aws_cur_config_patch_request` ``` { "data": { "attributes": { "is_enabled": true }, "type": "aws_cur_config_patch_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAWSCURConfig-200-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAWSCURConfig-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAWSCURConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAWSCURConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) List of AWS CUR configs. Field Type Description data [_required_] [object] An AWS CUR config. 
attributes [_required_] object Attributes for An AWS CUR config. account_filters object The account filtering configuration. excluded_accounts [string] The AWS account IDs to be excluded from your billing dataset. This field is used when `include_new_accounts` is `true`. include_new_accounts boolean Whether or not to automatically include new member accounts by default in your billing dataset. included_accounts [string] The AWS account IDs to be included in your billing dataset. This field is used when `include_new_accounts` is `false`. account_id [_required_] string The AWS account ID. bucket_name [_required_] string The AWS bucket name used to store the Cost and Usage Report. bucket_region [_required_] string The region the bucket is located in. created_at string The timestamp when the AWS CUR config was created. error_messages [string] The error messages for the AWS CUR config. months int32 **DEPRECATED** : The number of months the report has been backfilled. report_name [_required_] string The name of the Cost and Usage Report. report_prefix [_required_] string The report prefix used for the Cost and Usage Report. status [_required_] string The status of the AWS CUR. status_updated_at string The timestamp when the AWS CUR config status was updated. updated_at string The timestamp when the AWS CUR config status was updated. id string The ID of the AWS CUR config. type [_required_] enum Type of AWS CUR config. Allowed enum values: `aws_cur_config` default: `aws_cur_config` ``` { "data": [ { "attributes": { "account_filters": { "excluded_accounts": [ "123456789123", "123456789143" ], "include_new_accounts": true, "included_accounts": [ "123456789123", "123456789143" ] }, "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "created_at": "string", "error_messages": [], "months": "integer", "report_name": "dd-report-name", "report_prefix": "dd-report-prefix", "status": "active", "status_updated_at": "string", "updated_at": "string" }, "id": "string", "type": "aws_cur_config" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Update Cloud Cost Management AWS CUR config returns "OK" response Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config/${cloud_account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "is_enabled": true }, "type": "aws_cur_config_patch_request" } } EOF ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` // Update Cloud Cost Management AWS CUR config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AwsCURConfigPatchRequest{ Data: datadogV2.AwsCURConfigPatchData{ Attributes: datadogV2.AwsCURConfigPatchRequestAttributes{ IsEnabled: datadog.PtrBool(true), }, Type: datadogV2.AWSCURCONFIGPATCHREQUESTTYPE_AWS_CUR_CONFIG_PATCH_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpdateCostAWSCURConfig(ctx, 100, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpdateCostAWSCURConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpdateCostAWSCURConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` // Update Cloud Cost Management AWS CUR config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AwsCURConfigPatchData; import com.datadog.api.client.v2.model.AwsCURConfigPatchRequest; import com.datadog.api.client.v2.model.AwsCURConfigPatchRequestAttributes; import com.datadog.api.client.v2.model.AwsCURConfigPatchRequestType; import com.datadog.api.client.v2.model.AwsCURConfigsResponse; public class 
Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); AwsCURConfigPatchRequest body = new AwsCURConfigPatchRequest() .data( new AwsCURConfigPatchData() .attributes(new AwsCURConfigPatchRequestAttributes().isEnabled(true)) .type(AwsCURConfigPatchRequestType.AWS_CUR_CONFIG_PATCH_REQUEST)); try { AwsCURConfigsResponse result = apiInstance.updateCostAWSCURConfig(100L, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#updateCostAWSCURConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` """ Update Cloud Cost Management AWS CUR config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.aws_cur_config_patch_data import AwsCURConfigPatchData from datadog_api_client.v2.model.aws_cur_config_patch_request import AwsCURConfigPatchRequest from datadog_api_client.v2.model.aws_cur_config_patch_request_attributes import AwsCURConfigPatchRequestAttributes from datadog_api_client.v2.model.aws_cur_config_patch_request_type import AwsCURConfigPatchRequestType body = AwsCURConfigPatchRequest( data=AwsCURConfigPatchData( attributes=AwsCURConfigPatchRequestAttributes( is_enabled=True, ), type=AwsCURConfigPatchRequestType.AWS_CUR_CONFIG_PATCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.update_cost_awscur_config(cloud_account_id=100, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` # Update Cloud Cost Management AWS CUR config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::AwsCURConfigPatchRequest.new({ data: DatadogAPIClient::V2::AwsCURConfigPatchData.new({ attributes: DatadogAPIClient::V2::AwsCURConfigPatchRequestAttributes.new({ is_enabled: true, }), type: DatadogAPIClient::V2::AwsCURConfigPatchRequestType::AWS_CUR_CONFIG_PATCH_REQUEST, }), }) p api_instance.update_cost_awscur_config(100, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` // Update Cloud Cost Management AWS CUR config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::AwsCURConfigPatchData; use datadog_api_client::datadogV2::model::AwsCURConfigPatchRequest; use datadog_api_client::datadogV2::model::AwsCURConfigPatchRequestAttributes; use datadog_api_client::datadogV2::model::AwsCURConfigPatchRequestType; #[tokio::main] async fn main() { let body = AwsCURConfigPatchRequest::new(AwsCURConfigPatchData::new( AwsCURConfigPatchRequestAttributes::new().is_enabled(true), AwsCURConfigPatchRequestType::AWS_CUR_CONFIG_PATCH_REQUEST, )); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.update_cost_awscur_config(100, body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Cloud Cost Management AWS CUR config returns "OK" response ``` /** * Update Cloud Cost Management AWS CUR config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpdateCostAWSCURConfigRequest = { body: { data: { attributes: { isEnabled: true, }, type: "aws_cur_config_patch_request", }, }, cloudAccountId: 100, }; apiInstance .updateCostAWSCURConfig(params) .then((data: v2.AwsCURConfigsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-aws-cur-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-aws-cur-config-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id} ### Overview Archive a Cloud Cost Management Account. 
This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account ID. ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAWSCURConfig-204-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAWSCURConfig-400-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAWSCURConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAWSCURConfig-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete Cloud Cost Management AWS CUR config
```
# Path parameters
export cloud_account_id="CHANGE_ME"

# Curl command (use the API endpoint for your Datadog site)
curl -X DELETE "https://api.datadoghq.com/api/v2/cost/aws_cur_config/${cloud_account_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
##### Delete Cloud Cost Management AWS CUR config ``` """ Delete Cloud Cost Management AWS CUR config returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_cost_awscur_config( cloud_account_id=100, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Cloud Cost Management AWS CUR config ``` # Delete Cloud Cost Management AWS CUR config returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_cost_awscur_config(100) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Cloud Cost Management AWS CUR config ``` // Delete Cloud Cost Management AWS CUR config returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteCostAWSCURConfig(ctx, 100) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteCostAWSCURConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Cloud Cost Management AWS CUR config ``` // Delete Cloud Cost Management AWS CUR config returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteCostAWSCURConfig(100L); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#deleteCostAWSCURConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Cloud Cost Management AWS CUR config ``` // Delete Cloud Cost Management AWS CUR config returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = 
api.delete_cost_awscur_config(100).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Cloud Cost Management AWS CUR config ``` /** * Delete Cloud Cost Management AWS CUR config returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteCostAWSCURConfigRequest = { cloudAccountId: 100, }; apiInstance .deleteCostAWSCURConfig(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get cost AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-aws-cur-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-aws-cur-config-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/aws_cur_config/{cloud_account_id} ### Overview Get a specific AWS CUR config. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer The unique identifier of the cloud account ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostAWSCURConfig-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostAWSCURConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `AwsCurConfigResponse` object. Field Type Description data object The definition of `AwsCurConfigResponseData` object. attributes object The definition of `AwsCurConfigResponseDataAttributes` object. account_filters object The definition of `AwsCurConfigResponseDataAttributesAccountFilters` object. excluded_accounts [string] The `account_filters` `excluded_accounts`. include_new_accounts boolean The `account_filters` `include_new_accounts`. included_accounts [string] The `account_filters` `included_accounts`. account_id string The `attributes` `account_id`. 
bucket_name string The `attributes` `bucket_name`. bucket_region string The `attributes` `bucket_region`. created_at string The `attributes` `created_at`. error_messages [string] The `attributes` `error_messages`. months int64 The `attributes` `months`. report_name string The `attributes` `report_name`. report_prefix string The `attributes` `report_prefix`. status string The `attributes` `status`. status_updated_at string The `attributes` `status_updated_at`. updated_at string The `attributes` `updated_at`. id string The `AwsCurConfigResponseData` `id`. type [_required_] enum AWS CUR config resource type. Allowed enum values: `aws_cur_config` default: `aws_cur_config` ``` { "data": { "attributes": { "account_filters": { "excluded_accounts": [ "123456789124", "123456789125" ], "include_new_accounts": true }, "account_id": "123456789123", "bucket_name": "dd-cost-bucket", "bucket_region": "us-east-1", "created_at": "2023-01-01T12:00:00.000Z", "error_messages": [], "months": 36, "report_name": "dd-report-name", "report_prefix": "dd-report-prefix", "status": "active", "status_updated_at": "2023-01-01T12:00:00.000Z", "updated_at": "2023-01-01T12:00:00.000Z" }, "id": "123456789123", "type": "aws_cur_config" } } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get cost AWS CUR config
```
# Path parameters
export cloud_account_id="CHANGE_ME"

# Curl command (use the API endpoint for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v2/cost/aws_cur_config/${cloud_account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
##### Get cost AWS CUR config ``` """ Get cost AWS CUR config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_cost_awscur_config( cloud_account_id=123456, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get cost AWS CUR config ``` # Get cost AWS CUR config returns "OK"
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_cost_awscur_config(123456) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get cost AWS CUR config ``` // Get cost AWS CUR config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetCostAWSCURConfig(ctx, 123456) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetCostAWSCURConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetCostAWSCURConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get cost AWS CUR config ``` // Get cost AWS CUR config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AwsCurConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { AwsCurConfigResponse result = apiInstance.getCostAWSCURConfig(123456L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getCostAWSCURConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get cost AWS CUR config ``` // Get cost AWS CUR config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_cost_awscur_config(123456).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get cost AWS CUR config ``` /** * Get cost AWS CUR config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetCostAWSCURConfigRequest = { cloudAccountId: 123456, }; apiInstance .getCostAWSCURConfig(params) .then((data: v2.AwsCurConfigResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-azure-configs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-azure-configs-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.ap2.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.datadoghq.eu/api/v2/cost/azure_uc_confighttps://api.ddog-gov.com/api/v2/cost/azure_uc_confighttps://api.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.us3.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.us5.datadoghq.com/api/v2/cost/azure_uc_config ### Overview List the Azure configs. This endpoint requires the `cloud_cost_management_read` permission. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAzureUCConfigs-200-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAzureUCConfigs-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostAzureUCConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) List of Azure accounts with configs. Field Type Description data [_required_] [object] An Azure config pair. attributes [_required_] object Attributes for Azure config pair. configs [_required_] [object] An Azure config. account_id [_required_] string The tenant ID of the Azure account. client_id [_required_] string The client ID of the Azure account. created_at string The timestamp when the Azure config was created. dataset_type [_required_] string The dataset type of the Azure config. error_messages [string] The error messages for the Azure config. export_name [_required_] string The name of the configured Azure Export. export_path [_required_] string The path where the Azure Export is saved. id string The ID of the Azure config. months int32 **DEPRECATED** : The number of months the report has been backfilled. 
scope [_required_] string The scope of your observed subscription. status [_required_] string The status of the Azure config. status_updated_at string The timestamp when the Azure config status was last updated. storage_account [_required_] string The name of the storage account where the Azure Export is saved. storage_container [_required_] string The name of the storage container where the Azure Export is saved. updated_at string The timestamp when the Azure config was last updated. id string The ID of the Azure config pair. id string The ID of the Cloud Cost Management account. type [_required_] enum Type of Azure config pair. Allowed enum values: `azure_uc_configs` default: `azure_uc_configs` ``` { "data": [ { "attributes": { "configs": [ { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "created_at": "string", "dataset_type": "actual", "error_messages": [], "export_name": "dd-actual-export", "export_path": "dd-export-path", "id": "string", "months": "integer", "scope": "/subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", "status": "active", "status_updated_at": "string", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container", "updated_at": "string" } ], "id": "string" }, "id": "string", "type": "azure_uc_configs" } ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ```
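Each entry in `data` pairs a Cloud Cost Management account with its Azure configs, and each config carries its own `dataset_type`, `status`, and `error_messages`. As a minimal, illustrative sketch (not part of the Datadog API client), the Python snippet below walks a list response that has already been serialized to a plain dict shaped like the 200 example above and flags configs that report errors or a non-`active` status:

```python
# Illustrative only: summarizes configs from a list response serialized to a
# plain dict shaped like the 200 example above.
def report_azure_configs(response: dict) -> None:
    for pair in response.get("data", []):
        for config in pair.get("attributes", {}).get("configs", []):
            label = f"{config.get('export_name')} ({config.get('dataset_type')})"
            status = config.get("status")
            errors = config.get("error_messages") or []
            if errors or status != "active":
                print(f"needs attention: {label} status={status} errors={errors}")
            else:
                print(f"ok: {label}")


# Example usage with a dict shaped like the response example above.
report_azure_configs({
    "data": [{
        "attributes": {"configs": [{
            "export_name": "dd-actual-export",
            "dataset_type": "actual",
            "status": "active",
            "error_messages": [],
        }]},
    }],
})
```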
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### List Cloud Cost Management Azure configs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Cloud Cost Management Azure configs ``` """ List Cloud Cost Management Azure configs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.list_cost_azure_uc_configs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Cloud Cost Management Azure configs ``` # List Cloud Cost Management Azure configs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.list_cost_azure_uc_configs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Cloud Cost Management Azure configs ``` // List Cloud Cost Management Azure configs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ListCostAzureUCConfigs(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListCostAzureUCConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListCostAzureUCConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Cloud Cost Management Azure configs ``` // List Cloud Cost Management Azure configs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AzureUCConfigsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { AzureUCConfigsResponse result = apiInstance.listCostAzureUCConfigs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listCostAzureUCConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Cloud Cost Management Azure configs ``` // List Cloud Cost Management Azure configs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_cost_azure_uc_configs().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Cloud Cost Management Azure configs ``` /** * List Cloud Cost Management Azure configs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listCostAzureUCConfigs() .then((data: v2.AzureUCConfigsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-azure-configs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-azure-configs-v2) POST https://api.ap1.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.ap2.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.datadoghq.eu/api/v2/cost/azure_uc_confighttps://api.ddog-gov.com/api/v2/cost/azure_uc_confighttps://api.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.us3.datadoghq.com/api/v2/cost/azure_uc_confighttps://api.us5.datadoghq.com/api/v2/cost/azure_uc_config ### Overview Create a Cloud Cost Management account for an Azure config. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object Azure config Post data. attributes object Attributes for Azure config Post Request. account_id [_required_] string The tenant ID of the Azure account. actual_bill_config [_required_] object Bill config. export_name [_required_] string The name of the configured Azure Export. export_path [_required_] string The path where the Azure Export is saved. storage_account [_required_] string The name of the storage account where the Azure Export is saved. storage_container [_required_] string The name of the storage container where the Azure Export is saved. amortized_bill_config [_required_] object Bill config. export_name [_required_] string The name of the configured Azure Export. export_path [_required_] string The path where the Azure Export is saved. storage_account [_required_] string The name of the storage account where the Azure Export is saved. storage_container [_required_] string The name of the storage container where the Azure Export is saved. client_id [_required_] string The client ID of the Azure account. scope [_required_] string The scope of your observed subscription. type [_required_] enum Type of Azure config Post Request. 
Allowed enum values: `azure_uc_config_post_request` default: `azure_uc_config_post_request` ``` { "data": { "attributes": { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "actual_bill_config": { "export_name": "dd-actual-export", "export_path": "dd-export-path", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container" }, "amortized_bill_config": { "export_name": "dd-actual-export", "export_path": "dd-export-path", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container" }, "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "scope": "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234" }, "type": "azure_uc_config_post_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAzureUCConfigs-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAzureUCConfigs-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAzureUCConfigs-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostAzureUCConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response of Azure config pair. Field Type Description data object Azure config pair. attributes [_required_] object Attributes for Azure config pair. configs [_required_] [object] An Azure config. account_id [_required_] string The tenant ID of the Azure account. client_id [_required_] string The client ID of the Azure account. created_at string The timestamp when the Azure config was created. dataset_type [_required_] string The dataset type of the Azure config. error_messages [string] The error messages for the Azure config. export_name [_required_] string The name of the configured Azure Export. export_path [_required_] string The path where the Azure Export is saved. id string The ID of the Azure config. months int32 **DEPRECATED** : The number of months the report has been backfilled. scope [_required_] string The scope of your observed subscription. status [_required_] string The status of the Azure config. status_updated_at string The timestamp when the Azure config status was last updated. storage_account [_required_] string The name of the storage account where the Azure Export is saved. storage_container [_required_] string The name of the storage container where the Azure Export is saved. updated_at string The timestamp when the Azure config was last updated. id string The ID of the Azure config pair. id string The ID of Cloud Cost Management account. type [_required_] enum Type of Azure config pair. 
Allowed enum values: `azure_uc_configs` default: `azure_uc_configs` ``` { "data": { "attributes": { "configs": [ { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "created_at": "string", "dataset_type": "actual", "error_messages": [], "export_name": "dd-actual-export", "export_path": "dd-export-path", "id": "string", "months": "integer", "scope": "/subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", "status": "active", "status_updated_at": "string", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container", "updated_at": "string" } ], "id": "string" }, "id": "string", "type": "azure_uc_configs" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create Cloud Cost Management Azure configs returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "actual_bill_config": { "export_name": "dd-actual-export", "export_path": "dd-export-path", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container" }, "amortized_bill_config": { "export_name": "dd-actual-export", "export_path": "dd-export-path", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container" }, "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "scope": "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234" }, "type": "azure_uc_config_post_request" } } EOF ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` // Create Cloud Cost Management Azure configs returns "OK" response package main import ( "context" "encoding/json" "fmt" 
"os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AzureUCConfigPostRequest{ Data: datadogV2.AzureUCConfigPostData{ Attributes: &datadogV2.AzureUCConfigPostRequestAttributes{ AccountId: "1234abcd-1234-abcd-1234-1234abcd1234", ActualBillConfig: datadogV2.BillConfig{ ExportName: "dd-actual-export", ExportPath: "dd-export-path", StorageAccount: "dd-storage-account", StorageContainer: "dd-storage-container", }, AmortizedBillConfig: datadogV2.BillConfig{ ExportName: "dd-actual-export", ExportPath: "dd-export-path", StorageAccount: "dd-storage-account", StorageContainer: "dd-storage-container", }, ClientId: "1234abcd-1234-abcd-1234-1234abcd1234", Scope: "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", }, Type: datadogV2.AZUREUCCONFIGPOSTREQUESTTYPE_AZURE_UC_CONFIG_POST_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.CreateCostAzureUCConfigs(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.CreateCostAzureUCConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.CreateCostAzureUCConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` // Create Cloud Cost Management Azure configs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AzureUCConfigPairsResponse; import com.datadog.api.client.v2.model.AzureUCConfigPostData; import com.datadog.api.client.v2.model.AzureUCConfigPostRequest; import com.datadog.api.client.v2.model.AzureUCConfigPostRequestAttributes; import com.datadog.api.client.v2.model.AzureUCConfigPostRequestType; import com.datadog.api.client.v2.model.BillConfig; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); AzureUCConfigPostRequest body = new AzureUCConfigPostRequest() .data( new AzureUCConfigPostData() .attributes( new AzureUCConfigPostRequestAttributes() .accountId("1234abcd-1234-abcd-1234-1234abcd1234") .actualBillConfig( new BillConfig() .exportName("dd-actual-export") .exportPath("dd-export-path") .storageAccount("dd-storage-account") .storageContainer("dd-storage-container")) .amortizedBillConfig( new BillConfig() .exportName("dd-actual-export") .exportPath("dd-export-path") .storageAccount("dd-storage-account") .storageContainer("dd-storage-container")) .clientId("1234abcd-1234-abcd-1234-1234abcd1234") .scope("subscriptions/1234abcd-1234-abcd-1234-1234abcd1234")) .type(AzureUCConfigPostRequestType.AZURE_UC_CONFIG_POST_REQUEST)); try { AzureUCConfigPairsResponse result = 
apiInstance.createCostAzureUCConfigs(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#createCostAzureUCConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` """ Create Cloud Cost Management Azure configs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.azure_uc_config_post_data import AzureUCConfigPostData from datadog_api_client.v2.model.azure_uc_config_post_request import AzureUCConfigPostRequest from datadog_api_client.v2.model.azure_uc_config_post_request_attributes import AzureUCConfigPostRequestAttributes from datadog_api_client.v2.model.azure_uc_config_post_request_type import AzureUCConfigPostRequestType from datadog_api_client.v2.model.bill_config import BillConfig body = AzureUCConfigPostRequest( data=AzureUCConfigPostData( attributes=AzureUCConfigPostRequestAttributes( account_id="1234abcd-1234-abcd-1234-1234abcd1234", actual_bill_config=BillConfig( export_name="dd-actual-export", export_path="dd-export-path", storage_account="dd-storage-account", storage_container="dd-storage-container", ), amortized_bill_config=BillConfig( export_name="dd-actual-export", export_path="dd-export-path", storage_account="dd-storage-account", storage_container="dd-storage-container", ), client_id="1234abcd-1234-abcd-1234-1234abcd1234", scope="subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", ), type=AzureUCConfigPostRequestType.AZURE_UC_CONFIG_POST_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.create_cost_azure_uc_configs(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` # Create Cloud Cost Management Azure configs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::AzureUCConfigPostRequest.new({ data: DatadogAPIClient::V2::AzureUCConfigPostData.new({ attributes: DatadogAPIClient::V2::AzureUCConfigPostRequestAttributes.new({ account_id: "1234abcd-1234-abcd-1234-1234abcd1234", actual_bill_config: DatadogAPIClient::V2::BillConfig.new({ export_name: "dd-actual-export", export_path: "dd-export-path", storage_account: "dd-storage-account", storage_container: "dd-storage-container", }), amortized_bill_config: DatadogAPIClient::V2::BillConfig.new({ 
export_name: "dd-actual-export", export_path: "dd-export-path", storage_account: "dd-storage-account", storage_container: "dd-storage-container", }), client_id: "1234abcd-1234-abcd-1234-1234abcd1234", scope: "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", }), type: DatadogAPIClient::V2::AzureUCConfigPostRequestType::AZURE_UC_CONFIG_POST_REQUEST, }), }) p api_instance.create_cost_azure_uc_configs(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` // Create Cloud Cost Management Azure configs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::AzureUCConfigPostData; use datadog_api_client::datadogV2::model::AzureUCConfigPostRequest; use datadog_api_client::datadogV2::model::AzureUCConfigPostRequestAttributes; use datadog_api_client::datadogV2::model::AzureUCConfigPostRequestType; use datadog_api_client::datadogV2::model::BillConfig; #[tokio::main] async fn main() { let body = AzureUCConfigPostRequest::new( AzureUCConfigPostData::new(AzureUCConfigPostRequestType::AZURE_UC_CONFIG_POST_REQUEST) .attributes(AzureUCConfigPostRequestAttributes::new( "1234abcd-1234-abcd-1234-1234abcd1234".to_string(), BillConfig::new( "dd-actual-export".to_string(), "dd-export-path".to_string(), "dd-storage-account".to_string(), "dd-storage-container".to_string(), ), BillConfig::new( "dd-actual-export".to_string(), "dd-export-path".to_string(), "dd-storage-account".to_string(), "dd-storage-container".to_string(), ), "1234abcd-1234-abcd-1234-1234abcd1234".to_string(), "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234".to_string(), )), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.create_cost_azure_uc_configs(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Cloud Cost Management Azure configs returns "OK" response ``` /** * Create Cloud Cost Management Azure configs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiCreateCostAzureUCConfigsRequest = { body: { data: { attributes: { accountId: "1234abcd-1234-abcd-1234-1234abcd1234", actualBillConfig: { exportName: "dd-actual-export", exportPath: "dd-export-path", storageAccount: "dd-storage-account", storageContainer: "dd-storage-container", }, amortizedBillConfig: { exportName: "dd-actual-export", exportPath: "dd-export-path", storageAccount: "dd-storage-account", storageContainer: "dd-storage-container", }, clientId: 
"1234abcd-1234-abcd-1234-1234abcd1234", scope: "subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", }, type: "azure_uc_config_post_request", }, }, }; apiInstance .createCostAzureUCConfigs(params) .then((data: v2.AzureUCConfigPairsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-azure-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-azure-config-v2) PATCH https://api.ap1.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id} ### Overview Update the status of an Azure config (active/archived). This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account id. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object Azure config Patch data. attributes object Attributes for Azure config Patch Request. is_enabled [_required_] boolean Whether or not the Cloud Cost Management account is enabled. type [_required_] enum Type of Azure config Patch Request. Allowed enum values: `azure_uc_config_patch_request` default: `azure_uc_config_patch_request` ``` { "data": { "attributes": { "is_enabled": true }, "type": "azure_uc_config_patch_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAzureUCConfigs-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAzureUCConfigs-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAzureUCConfigs-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAzureUCConfigs-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostAzureUCConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response of Azure config pair. Field Type Description data object Azure config pair. attributes [_required_] object Attributes for Azure config pair. configs [_required_] [object] An Azure config. 
account_id [_required_] string The tenant ID of the Azure account. client_id [_required_] string The client ID of the Azure account. created_at string The timestamp when the Azure config was created. dataset_type [_required_] string The dataset type of the Azure config. error_messages [string] The error messages for the Azure config. export_name [_required_] string The name of the configured Azure Export. export_path [_required_] string The path where the Azure Export is saved. id string The ID of the Azure config. months int32 **DEPRECATED** : The number of months the report has been backfilled. scope [_required_] string The scope of your observed subscription. status [_required_] string The status of the Azure config. status_updated_at string The timestamp when the Azure config status was last updated. storage_account [_required_] string The name of the storage account where the Azure Export is saved. storage_container [_required_] string The name of the storage container where the Azure Export is saved. updated_at string The timestamp when the Azure config was last updated. id string The ID of the Azure config pair. id string The ID of Cloud Cost Management account. type [_required_] enum Type of Azure config pair. Allowed enum values: `azure_uc_configs` default: `azure_uc_configs` ``` { "data": { "attributes": { "configs": [ { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "created_at": "string", "dataset_type": "actual", "error_messages": [], "export_name": "dd-actual-export", "export_path": "dd-export-path", "id": "string", "months": "integer", "scope": "/subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", "status": "active", "status_updated_at": "string", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container", "updated_at": "string" } ], "id": "string" }, "id": "string", "type": "azure_uc_configs" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
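Every non-2xx response documented for this endpoint (400, 403, 404, and 429) uses the same `{"errors": [...]}` envelope shown in the examples that follow. If you are calling the API without one of the official clients, the same PATCH can be sent over plain HTTP, mirroring the curl example below; this sketch uses the third-party `requests` package, assumes `api.datadoghq.com` as the site, and treats the account ID as a placeholder.

```
# A sketch over plain HTTP, assuming the `requests` package and api.datadoghq.com;
# it mirrors the curl example below and prints the documented error envelope on failure.
import os

import requests

cloud_account_id = 100  # placeholder
url = f"https://api.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}"
headers = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
payload = {
    "data": {
        "attributes": {"is_enabled": True},
        "type": "azure_uc_config_patch_request",
    }
}

resp = requests.patch(url, json=payload, headers=headers, timeout=30)
if resp.ok:
    print(resp.json())
else:
    # 400, 403, 404, and 429 all return a body of the form {"errors": [...]}.
    print(f"HTTP {resp.status_code}:", resp.json().get("errors", []))
```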
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Update Cloud Cost Management Azure config returns "OK" response Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/${cloud_account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "is_enabled": true }, "type": "azure_uc_config_patch_request" } } EOF ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` // Update Cloud Cost Management Azure config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AzureUCConfigPatchRequest{ Data: datadogV2.AzureUCConfigPatchData{ Attributes: &datadogV2.AzureUCConfigPatchRequestAttributes{ IsEnabled: true, }, Type: datadogV2.AZUREUCCONFIGPATCHREQUESTTYPE_AZURE_UC_CONFIG_PATCH_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpdateCostAzureUCConfigs(ctx, 100, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpdateCostAzureUCConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpdateCostAzureUCConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` // Update Cloud Cost Management Azure config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.AzureUCConfigPairsResponse; import com.datadog.api.client.v2.model.AzureUCConfigPatchData; import com.datadog.api.client.v2.model.AzureUCConfigPatchRequest; import com.datadog.api.client.v2.model.AzureUCConfigPatchRequestAttributes; import com.datadog.api.client.v2.model.AzureUCConfigPatchRequestType; public class Example 
{ public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); AzureUCConfigPatchRequest body = new AzureUCConfigPatchRequest() .data( new AzureUCConfigPatchData() .attributes(new AzureUCConfigPatchRequestAttributes().isEnabled(true)) .type(AzureUCConfigPatchRequestType.AZURE_UC_CONFIG_PATCH_REQUEST)); try { AzureUCConfigPairsResponse result = apiInstance.updateCostAzureUCConfigs(100L, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#updateCostAzureUCConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` """ Update Cloud Cost Management Azure config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.azure_uc_config_patch_data import AzureUCConfigPatchData from datadog_api_client.v2.model.azure_uc_config_patch_request import AzureUCConfigPatchRequest from datadog_api_client.v2.model.azure_uc_config_patch_request_attributes import AzureUCConfigPatchRequestAttributes from datadog_api_client.v2.model.azure_uc_config_patch_request_type import AzureUCConfigPatchRequestType body = AzureUCConfigPatchRequest( data=AzureUCConfigPatchData( attributes=AzureUCConfigPatchRequestAttributes( is_enabled=True, ), type=AzureUCConfigPatchRequestType.AZURE_UC_CONFIG_PATCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.update_cost_azure_uc_configs(cloud_account_id=100, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` # Update Cloud Cost Management Azure config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::AzureUCConfigPatchRequest.new({ data: DatadogAPIClient::V2::AzureUCConfigPatchData.new({ attributes: DatadogAPIClient::V2::AzureUCConfigPatchRequestAttributes.new({ is_enabled: true, }), type: DatadogAPIClient::V2::AzureUCConfigPatchRequestType::AZURE_UC_CONFIG_PATCH_REQUEST, }), }) p api_instance.update_cost_azure_uc_configs(100, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` // Update Cloud Cost Management Azure config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::AzureUCConfigPatchData; use datadog_api_client::datadogV2::model::AzureUCConfigPatchRequest; use datadog_api_client::datadogV2::model::AzureUCConfigPatchRequestAttributes; use datadog_api_client::datadogV2::model::AzureUCConfigPatchRequestType; #[tokio::main] async fn main() { let body = AzureUCConfigPatchRequest::new( AzureUCConfigPatchData::new(AzureUCConfigPatchRequestType::AZURE_UC_CONFIG_PATCH_REQUEST) .attributes(AzureUCConfigPatchRequestAttributes::new(true)), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.update_cost_azure_uc_configs(100, body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Cloud Cost Management Azure config returns "OK" response ``` /** * Update Cloud Cost Management Azure config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpdateCostAzureUCConfigsRequest = { body: { data: { attributes: { isEnabled: true, }, type: "azure_uc_config_patch_request", }, }, cloudAccountId: 100, }; apiInstance .updateCostAzureUCConfigs(params) .then((data: v2.AzureUCConfigPairsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-azure-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-azure-config-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id} ### Overview Archive a Cloud Cost Management Account. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account id. ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAzureUCConfig-204-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAzureUCConfig-400-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAzureUCConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostAzureUCConfig-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete Cloud Cost Management Azure config Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/${cloud_account_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Cloud Cost Management Azure config ``` """ Delete Cloud Cost Management Azure config returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_cost_azure_uc_config( cloud_account_id=100, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Cloud Cost Management Azure config ``` # Delete Cloud Cost Management Azure config returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_cost_azure_uc_config(100) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Cloud Cost Management Azure config ``` // Delete Cloud Cost Management Azure config returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteCostAzureUCConfig(ctx, 100) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteCostAzureUCConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Cloud Cost Management Azure config ``` // Delete Cloud Cost Management Azure config returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteCostAzureUCConfig(100L); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#deleteCostAzureUCConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Cloud Cost Management Azure config ``` // Delete Cloud Cost Management Azure config returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.delete_cost_azure_uc_config(100).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Cloud Cost Management Azure config ``` /** * Delete Cloud Cost Management Azure config returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteCostAzureUCConfigRequest = { cloudAccountId: 100, }; apiInstance .deleteCostAzureUCConfig(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get cost Azure UC config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-azure-uc-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-azure-uc-config-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/{cloud_account_id} ### Overview Get a specific Azure config. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer The unique identifier of the cloud account ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostAzureUCConfig-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostAzureUCConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `UCConfigPair` object. Field Type Description data object The definition of `UCConfigPairData` object. attributes object The definition of `UCConfigPairDataAttributes` object. configs [object] The `attributes` `configs`. account_id string The `items` `account_id`. client_id string The `items` `client_id`. created_at string The `items` `created_at`. dataset_type string The `items` `dataset_type`. error_messages [string] The `items` `error_messages`. export_name string The `items` `export_name`. export_path string The `items` `export_path`. id string The `items` `id`. months int64 The `items` `months`. scope string The `items` `scope`. status string The `items` `status`. status_updated_at string The `items` `status_updated_at`. storage_account string The `items` `storage_account`. storage_container string The `items` `storage_container`. updated_at string The `items` `updated_at`. id string The `UCConfigPairData` `id`. type [_required_] enum Azure UC configs resource type. 
Allowed enum values: `azure_uc_configs` default: `azure_uc_configs` ``` { "data": { "attributes": { "configs": [ { "account_id": "1234abcd-1234-abcd-1234-1234abcd1234", "client_id": "1234abcd-1234-abcd-1234-1234abcd1234", "created_at": "2023-01-01T12:00:00.000Z", "dataset_type": "actual", "error_messages": [], "export_name": "dd-actual-export", "export_path": "dd-export-path", "id": "123456789123", "months": 36, "scope": "/subscriptions/1234abcd-1234-abcd-1234-1234abcd1234", "status": "active", "status_updated_at": "2023-01-01T12:00:00.000Z", "storage_account": "dd-storage-account", "storage_container": "dd-storage-container", "updated_at": "2023-01-01T12:00:00.000Z" } ] }, "id": "123456789123", "type": "azure_uc_configs" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get cost Azure UC config Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/azure_uc_config/${cloud_account_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get cost Azure UC config ``` """ Get cost Azure UC config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_cost_azure_uc_config( cloud_account_id=123456, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get cost Azure UC config ``` # Get cost Azure UC config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_cost_azure_uc_config(123456) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" rb "example.rb" ``` ##### Get cost Azure UC config ``` // Get cost Azure UC config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetCostAzureUCConfig(ctx, 123456) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetCostAzureUCConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetCostAzureUCConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get cost Azure UC config ``` // Get cost Azure UC config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.UCConfigPair; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { UCConfigPair result = apiInstance.getCostAzureUCConfig(123456L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getCostAzureUCConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get cost Azure UC config ``` // Get cost Azure UC config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_cost_azure_uc_config(123456).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get cost Azure UC config ``` /** * Get cost Azure UC config returns "OK" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetCostAzureUCConfigRequest = { cloudAccountId: 123456, }; apiInstance .getCostAzureUCConfig(params) .then((data: v2.UCConfigPair) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Google Cloud Usage Cost configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-google-cloud-usage-cost-configs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-google-cloud-usage-cost-configs-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.ap2.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.datadoghq.eu/api/v2/cost/gcp_uc_confighttps://api.ddog-gov.com/api/v2/cost/gcp_uc_confighttps://api.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.us3.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config ### Overview List the Google Cloud Usage Cost configs. This endpoint requires the `cloud_cost_management_read` permission. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostGCPUsageCostConfigs-200-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostGCPUsageCostConfigs-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCostGCPUsageCostConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) List of Google Cloud Usage Cost configs. Field Type Description data [_required_] [object] A Google Cloud Usage Cost config. attributes [_required_] object Attributes for a Google Cloud Usage Cost config. account_id [_required_] string The Google Cloud account ID. bucket_name [_required_] string The Google Cloud bucket name used to store the Usage Cost export. created_at string The timestamp when the Google Cloud Usage Cost config was created. dataset [_required_] string The export dataset name used for the Google Cloud Usage Cost Report. error_messages [string] The error messages for the Google Cloud Usage Cost config. export_prefix [_required_] string The export prefix used for the Google Cloud Usage Cost Report. export_project_name [_required_] string The name of the Google Cloud Usage Cost Report. months int32 **DEPRECATED** : The number of months the report has been backfilled. project_id string The `project_id` of the Google Cloud Usage Cost report. service_account [_required_] string The unique Google Cloud service account email. status [_required_] string The status of the Google Cloud Usage Cost config. status_updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. 
updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. id string The ID of the Google Cloud Usage Cost config. type [_required_] enum Type of Google Cloud Usage Cost config. Allowed enum values: `gcp_uc_config` default: `gcp_uc_config` ``` { "data": [ { "attributes": { "account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "created_at": "string", "dataset": "billing", "error_messages": [], "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "months": "integer", "project_id": "my-project-123", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", "status": "active", "status_updated_at": "string", "updated_at": "string" }, "id": "string", "type": "gcp_uc_config" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### List Google Cloud Usage Cost configs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Google Cloud Usage Cost configs ``` """ List Google Cloud Usage Cost configs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.list_cost_gcp_usage_cost_configs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Google Cloud Usage Cost configs ``` # List Google Cloud Usage Cost configs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p 
api_instance.list_cost_gcp_usage_cost_configs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Google Cloud Usage Cost configs ``` // List Google Cloud Usage Cost configs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ListCostGCPUsageCostConfigs(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListCostGCPUsageCostConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListCostGCPUsageCostConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Google Cloud Usage Cost configs ``` // List Google Cloud Usage Cost configs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.GCPUsageCostConfigsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { GCPUsageCostConfigsResponse result = apiInstance.listCostGCPUsageCostConfigs(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#listCostGCPUsageCostConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Google Cloud Usage Cost configs ``` // List Google Cloud Usage Cost configs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_cost_gcp_usage_cost_configs().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", 
resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Google Cloud Usage Cost configs ``` /** * List Google Cloud Usage Cost configs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listCostGCPUsageCostConfigs() .then((data: v2.GCPUsageCostConfigsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-google-cloud-usage-cost-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-google-cloud-usage-cost-config-v2) POST https://api.ap1.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.ap2.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.datadoghq.eu/api/v2/cost/gcp_uc_confighttps://api.ddog-gov.com/api/v2/cost/gcp_uc_confighttps://api.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.us3.datadoghq.com/api/v2/cost/gcp_uc_confighttps://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config ### Overview Create a Cloud Cost Management account for a Google Cloud Usage Cost config. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object Google Cloud Usage Cost config post data. attributes object Attributes for Google Cloud Usage Cost config post request. billing_account_id [_required_] string The Google Cloud account ID. bucket_name [_required_] string The Google Cloud bucket name used to store the Usage Cost export. export_dataset_name [_required_] string The export dataset name used for the Google Cloud Usage Cost report. export_prefix string The export prefix used for the Google Cloud Usage Cost report. export_project_name [_required_] string The name of the Google Cloud Usage Cost report. service_account [_required_] string The unique Google Cloud service account email. type [_required_] enum Type of Google Cloud Usage Cost config post request.
Allowed enum values: `gcp_uc_config_post_request` default: `gcp_uc_config_post_request` ``` { "data": { "attributes": { "billing_account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "export_dataset_name": "billing", "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com" }, "type": "gcp_uc_config_post_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostGCPUsageCostConfig-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostGCPUsageCostConfig-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostGCPUsageCostConfig-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCostGCPUsageCostConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response of Google Cloud Usage Cost config. Field Type Description data object Google Cloud Usage Cost config. attributes [_required_] object Attributes for a Google Cloud Usage Cost config. account_id [_required_] string The Google Cloud account ID. bucket_name [_required_] string The Google Cloud bucket name used to store the Usage Cost export. created_at string The timestamp when the Google Cloud Usage Cost config was created. dataset [_required_] string The export dataset name used for the Google Cloud Usage Cost Report. error_messages [string] The error messages for the Google Cloud Usage Cost config. export_prefix [_required_] string The export prefix used for the Google Cloud Usage Cost Report. export_project_name [_required_] string The name of the Google Cloud Usage Cost Report. months int32 **DEPRECATED** : The number of months the report has been backfilled. project_id string The `project_id` of the Google Cloud Usage Cost report. service_account [_required_] string The unique Google Cloud service account email. status [_required_] string The status of the Google Cloud Usage Cost config. status_updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. id string The ID of the Google Cloud Usage Cost config. type [_required_] enum Type of Google Cloud Usage Cost config. Allowed enum values: `gcp_uc_config` default: `gcp_uc_config` ``` { "data": { "attributes": { "account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "created_at": "string", "dataset": "billing", "error_messages": [], "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "months": "integer", "project_id": "my-project-123", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", "status": "active", "status_updated_at": "string", "updated_at": "string" }, "id": "string", "type": "gcp_uc_config" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
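The request attributes above map directly onto the Python model constructors used in the example further down; `export_prefix` is the only optional field. A small wrapper function built on those same classes, shown as a sketch: the function name and its default prefix (reusing the `datadog_cloud_cost_usage_export` value from the examples) are illustrative, not part of the API.

```
# A sketch: build and submit a Google Cloud Usage Cost config. The model classes
# and create_cost_gcp_usage_cost_config come from the Python example below; the
# wrapper function and its defaults are illustrative only.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi
from datadog_api_client.v2.model.gcp_usage_cost_config_post_data import GCPUsageCostConfigPostData
from datadog_api_client.v2.model.gcp_usage_cost_config_post_request import GCPUsageCostConfigPostRequest
from datadog_api_client.v2.model.gcp_usage_cost_config_post_request_attributes import (
    GCPUsageCostConfigPostRequestAttributes,
)
from datadog_api_client.v2.model.gcp_usage_cost_config_post_request_type import (
    GCPUsageCostConfigPostRequestType,
)


def create_gcp_usage_cost_config(
    billing_account_id: str,
    bucket_name: str,
    export_dataset_name: str,
    export_project_name: str,
    service_account: str,
    export_prefix: str = "datadog_cloud_cost_usage_export",  # value used in the examples
):
    body = GCPUsageCostConfigPostRequest(
        data=GCPUsageCostConfigPostData(
            attributes=GCPUsageCostConfigPostRequestAttributes(
                billing_account_id=billing_account_id,
                bucket_name=bucket_name,
                export_dataset_name=export_dataset_name,
                export_prefix=export_prefix,
                export_project_name=export_project_name,
                service_account=service_account,
            ),
            type=GCPUsageCostConfigPostRequestType.GCP_USAGE_COST_CONFIG_POST_REQUEST,
        ),
    )
    with ApiClient(Configuration()) as api_client:
        return CloudCostManagementApi(api_client).create_cost_gcp_usage_cost_config(body=body)


if __name__ == "__main__":
    print(
        create_gcp_usage_cost_config(
            billing_account_id="123456_A123BC_12AB34",
            bucket_name="dd-cost-bucket",
            export_dataset_name="billing",
            export_project_name="dd-cloud-cost-report",
            service_account="dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com",
        )
    )
```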
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create Google Cloud Usage Cost config returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "billing_account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "export_dataset_name": "billing", "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com" }, "type": "gcp_uc_config_post_request" } } EOF ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` // Create Google Cloud Usage Cost config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GCPUsageCostConfigPostRequest{ Data: datadogV2.GCPUsageCostConfigPostData{ Attributes: &datadogV2.GCPUsageCostConfigPostRequestAttributes{ BillingAccountId: "123456_A123BC_12AB34", BucketName: "dd-cost-bucket", ExportDatasetName: "billing", ExportPrefix: datadog.PtrString("datadog_cloud_cost_usage_export"), ExportProjectName: "dd-cloud-cost-report", ServiceAccount: "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", }, Type: datadogV2.GCPUSAGECOSTCONFIGPOSTREQUESTTYPE_GCP_USAGE_COST_CONFIG_POST_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.CreateCostGCPUsageCostConfig(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.CreateCostGCPUsageCostConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response 
from `CloudCostManagementApi.CreateCostGCPUsageCostConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` // Create Google Cloud Usage Cost config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.GCPUsageCostConfigPostData; import com.datadog.api.client.v2.model.GCPUsageCostConfigPostRequest; import com.datadog.api.client.v2.model.GCPUsageCostConfigPostRequestAttributes; import com.datadog.api.client.v2.model.GCPUsageCostConfigPostRequestType; import com.datadog.api.client.v2.model.GCPUsageCostConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); GCPUsageCostConfigPostRequest body = new GCPUsageCostConfigPostRequest() .data( new GCPUsageCostConfigPostData() .attributes( new GCPUsageCostConfigPostRequestAttributes() .billingAccountId("123456_A123BC_12AB34") .bucketName("dd-cost-bucket") .exportDatasetName("billing") .exportPrefix("datadog_cloud_cost_usage_export") .exportProjectName("dd-cloud-cost-report") .serviceAccount( "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com")) .type(GCPUsageCostConfigPostRequestType.GCP_USAGE_COST_CONFIG_POST_REQUEST)); try { GCPUsageCostConfigResponse result = apiInstance.createCostGCPUsageCostConfig(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#createCostGCPUsageCostConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` """ Create Google Cloud Usage Cost config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.gcp_usage_cost_config_post_data import GCPUsageCostConfigPostData from datadog_api_client.v2.model.gcp_usage_cost_config_post_request import GCPUsageCostConfigPostRequest from datadog_api_client.v2.model.gcp_usage_cost_config_post_request_attributes import ( GCPUsageCostConfigPostRequestAttributes, ) from datadog_api_client.v2.model.gcp_usage_cost_config_post_request_type import GCPUsageCostConfigPostRequestType body = GCPUsageCostConfigPostRequest( data=GCPUsageCostConfigPostData( attributes=GCPUsageCostConfigPostRequestAttributes( billing_account_id="123456_A123BC_12AB34", 
bucket_name="dd-cost-bucket", export_dataset_name="billing", export_prefix="datadog_cloud_cost_usage_export", export_project_name="dd-cloud-cost-report", service_account="dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", ), type=GCPUsageCostConfigPostRequestType.GCP_USAGE_COST_CONFIG_POST_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.create_cost_gcp_usage_cost_config(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` # Create Google Cloud Usage Cost config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::GCPUsageCostConfigPostRequest.new({ data: DatadogAPIClient::V2::GCPUsageCostConfigPostData.new({ attributes: DatadogAPIClient::V2::GCPUsageCostConfigPostRequestAttributes.new({ billing_account_id: "123456_A123BC_12AB34", bucket_name: "dd-cost-bucket", export_dataset_name: "billing", export_prefix: "datadog_cloud_cost_usage_export", export_project_name: "dd-cloud-cost-report", service_account: "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", }), type: DatadogAPIClient::V2::GCPUsageCostConfigPostRequestType::GCP_USAGE_COST_CONFIG_POST_REQUEST, }), }) p api_instance.create_cost_gcp_usage_cost_config(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` // Create Google Cloud Usage Cost config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPostData; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPostRequest; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPostRequestAttributes; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPostRequestType; #[tokio::main] async fn main() { let body = GCPUsageCostConfigPostRequest::new( GCPUsageCostConfigPostData::new( GCPUsageCostConfigPostRequestType::GCP_USAGE_COST_CONFIG_POST_REQUEST, ) .attributes( GCPUsageCostConfigPostRequestAttributes::new( "123456_A123BC_12AB34".to_string(), "dd-cost-bucket".to_string(), "billing".to_string(), "dd-cloud-cost-report".to_string(), "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com".to_string(), ) .export_prefix("datadog_cloud_cost_usage_export".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.create_cost_gcp_usage_cost_config(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Google Cloud Usage Cost config returns "OK" response ``` /** * Create Google Cloud Usage Cost config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiCreateCostGCPUsageCostConfigRequest = { body: { data: { attributes: { billingAccountId: "123456_A123BC_12AB34", bucketName: "dd-cost-bucket", exportDatasetName: "billing", exportPrefix: "datadog_cloud_cost_usage_export", exportProjectName: "dd-cloud-cost-report", serviceAccount: "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", }, type: "gcp_uc_config_post_request", }, }, }; apiInstance .createCostGCPUsageCostConfig(params) .then((data: v2.GCPUsageCostConfigResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-google-cloud-usage-cost-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-google-cloud-usage-cost-config-v2) PATCH https://api.ap1.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id} ### Overview Update the status of a Google Cloud Usage Cost config (active/archived). This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account id. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] object Google Cloud Usage Cost config patch data. attributes [_required_] object Attributes for Google Cloud Usage Cost config patch request. is_enabled [_required_] boolean Whether or not the Cloud Cost Management account is enabled. type [_required_] enum Type of Google Cloud Usage Cost config patch request.
Allowed enum values: `gcp_uc_config_patch_request` default: `gcp_uc_config_patch_request` ``` { "data": { "attributes": { "is_enabled": true }, "type": "gcp_uc_config_patch_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostGCPUsageCostConfig-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostGCPUsageCostConfig-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostGCPUsageCostConfig-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostGCPUsageCostConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCostGCPUsageCostConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response of Google Cloud Usage Cost config. Field Type Description data object Google Cloud Usage Cost config. attributes [_required_] object Attributes for a Google Cloud Usage Cost config. account_id [_required_] string The Google Cloud account ID. bucket_name [_required_] string The Google Cloud bucket name used to store the Usage Cost export. created_at string The timestamp when the Google Cloud Usage Cost config was created. dataset [_required_] string The export dataset name used for the Google Cloud Usage Cost Report. error_messages [string] The error messages for the Google Cloud Usage Cost config. export_prefix [_required_] string The export prefix used for the Google Cloud Usage Cost Report. export_project_name [_required_] string The name of the Google Cloud Usage Cost Report. months int32 **DEPRECATED** : The number of months the report has been backfilled. project_id string The `project_id` of the Google Cloud Usage Cost report. service_account [_required_] string The unique Google Cloud service account email. status [_required_] string The status of the Google Cloud Usage Cost config. status_updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. updated_at string The timestamp when the Google Cloud Usage Cost config status was updated. id string The ID of the Google Cloud Usage Cost config. type [_required_] enum Type of Google Cloud Usage Cost config. Allowed enum values: `gcp_uc_config` default: `gcp_uc_config` ``` { "data": { "attributes": { "account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "created_at": "string", "dataset": "billing", "error_messages": [], "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "months": "integer", "project_id": "my-project-123", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", "status": "active", "status_updated_at": "string", "updated_at": "string" }, "id": "string", "type": "gcp_uc_config" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Update Google Cloud Usage Cost config returns "OK" response Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config/${cloud_account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "is_enabled": true }, "type": "gcp_uc_config_patch_request" } } EOF ``` ##### Update Google Cloud Usage Cost config returns "OK" response ``` // Update Google Cloud Usage Cost config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GCPUsageCostConfigPatchRequest{ Data: datadogV2.GCPUsageCostConfigPatchData{ Attributes: datadogV2.GCPUsageCostConfigPatchRequestAttributes{ IsEnabled: true, }, Type: datadogV2.GCPUSAGECOSTCONFIGPATCHREQUESTTYPE_GCP_USAGE_COST_CONFIG_PATCH_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpdateCostGCPUsageCostConfig(ctx, 100, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpdateCostGCPUsageCostConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpdateCostGCPUsageCostConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update 
Google Cloud Usage Cost config returns "OK" response ``` // Update Google Cloud Usage Cost config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.GCPUsageCostConfigPatchData; import com.datadog.api.client.v2.model.GCPUsageCostConfigPatchRequest; import com.datadog.api.client.v2.model.GCPUsageCostConfigPatchRequestAttributes; import com.datadog.api.client.v2.model.GCPUsageCostConfigPatchRequestType; import com.datadog.api.client.v2.model.GCPUsageCostConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); GCPUsageCostConfigPatchRequest body = new GCPUsageCostConfigPatchRequest() .data( new GCPUsageCostConfigPatchData() .attributes(new GCPUsageCostConfigPatchRequestAttributes().isEnabled(true)) .type(GCPUsageCostConfigPatchRequestType.GCP_USAGE_COST_CONFIG_PATCH_REQUEST)); try { GCPUsageCostConfigResponse result = apiInstance.updateCostGCPUsageCostConfig(100L, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#updateCostGCPUsageCostConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Google Cloud Usage Cost config returns "OK" response ``` """ Update Google Cloud Usage Cost config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.gcp_usage_cost_config_patch_data import GCPUsageCostConfigPatchData from datadog_api_client.v2.model.gcp_usage_cost_config_patch_request import GCPUsageCostConfigPatchRequest from datadog_api_client.v2.model.gcp_usage_cost_config_patch_request_attributes import ( GCPUsageCostConfigPatchRequestAttributes, ) from datadog_api_client.v2.model.gcp_usage_cost_config_patch_request_type import GCPUsageCostConfigPatchRequestType body = GCPUsageCostConfigPatchRequest( data=GCPUsageCostConfigPatchData( attributes=GCPUsageCostConfigPatchRequestAttributes( is_enabled=True, ), type=GCPUsageCostConfigPatchRequestType.GCP_USAGE_COST_CONFIG_PATCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.update_cost_gcp_usage_cost_config(cloud_account_id=100, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Google Cloud Usage Cost config returns 
"OK" response ``` # Update Google Cloud Usage Cost config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::GCPUsageCostConfigPatchRequest.new({ data: DatadogAPIClient::V2::GCPUsageCostConfigPatchData.new({ attributes: DatadogAPIClient::V2::GCPUsageCostConfigPatchRequestAttributes.new({ is_enabled: true, }), type: DatadogAPIClient::V2::GCPUsageCostConfigPatchRequestType::GCP_USAGE_COST_CONFIG_PATCH_REQUEST, }), }) p api_instance.update_cost_gcp_usage_cost_config(100, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Google Cloud Usage Cost config returns "OK" response ``` // Update Google Cloud Usage Cost config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPatchData; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPatchRequest; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPatchRequestAttributes; use datadog_api_client::datadogV2::model::GCPUsageCostConfigPatchRequestType; #[tokio::main] async fn main() { let body = GCPUsageCostConfigPatchRequest::new(GCPUsageCostConfigPatchData::new( GCPUsageCostConfigPatchRequestAttributes::new(true), GCPUsageCostConfigPatchRequestType::GCP_USAGE_COST_CONFIG_PATCH_REQUEST, )); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.update_cost_gcp_usage_cost_config(100, body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Google Cloud Usage Cost config returns "OK" response ``` /** * Update Google Cloud Usage Cost config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpdateCostGCPUsageCostConfigRequest = { body: { data: { attributes: { isEnabled: true, }, type: "gcp_uc_config_patch_request", }, }, cloudAccountId: 100, }; apiInstance .updateCostGCPUsageCostConfig(params) .then((data: v2.GCPUsageCostConfigResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-google-cloud-usage-cost-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-google-cloud-usage-cost-config-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id} ### Overview Archive a Cloud Cost Management account. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer Cloud Account id. ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostGCPUsageCostConfig-204-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostGCPUsageCostConfig-400-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostGCPUsageCostConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCostGCPUsageCostConfig-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete Google Cloud Usage Cost config Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command (replace datadoghq.com with your Datadog site if needed) curl -X DELETE "https://api.datadoghq.com/api/v2/cost/gcp_uc_config/${cloud_account_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Google Cloud Usage Cost config ``` """ Delete Google Cloud Usage Cost config returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_cost_gcp_usage_cost_config( cloud_account_id=100, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Google Cloud Usage Cost config ``` # Delete Google Cloud Usage Cost config returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_cost_gcp_usage_cost_config(100) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete Google Cloud Usage Cost config ``` // Delete Google Cloud Usage Cost config returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteCostGCPUsageCostConfig(ctx, 100) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteCostGCPUsageCostConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ```
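# DD_SITE is your Datadog site, for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com.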
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Google Cloud Usage Cost config ``` // Delete Google Cloud Usage Cost config returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteCostGCPUsageCostConfig(100L); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#deleteCostGCPUsageCostConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Google Cloud Usage Cost config ``` // Delete Google Cloud Usage Cost config returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.delete_cost_gcp_usage_cost_config(100).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Google Cloud Usage Cost config ``` /** * Delete Google Cloud Usage Cost config returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteCostGCPUsageCostConfigRequest = { cloudAccountId: 100, }; apiInstance .deleteCostGCPUsageCostConfig(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-google-cloud-usage-cost-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-google-cloud-usage-cost-config-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ap2.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.eu/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.ddog-gov.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us3.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id}https://api.us5.datadoghq.com/api/v2/cost/gcp_uc_config/{cloud_account_id} ### Overview Get a specific Google Cloud Usage Cost config. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description cloud_account_id [_required_] integer The unique identifier of the cloud account ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostGCPUsageCostConfig-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCostGCPUsageCostConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `GcpUcConfigResponse` object. Field Type Description data object The definition of `GcpUcConfigResponseData` object. attributes object The definition of `GcpUcConfigResponseDataAttributes` object. account_id string The `attributes` `account_id`. bucket_name string The `attributes` `bucket_name`. created_at string The `attributes` `created_at`. dataset string The `attributes` `dataset`. error_messages [string] The `attributes` `error_messages`. export_prefix string The `attributes` `export_prefix`. export_project_name string The `attributes` `export_project_name`. months int64 The `attributes` `months`. project_id string The `attributes` `project_id`. service_account string The `attributes` `service_account`. status string The `attributes` `status`. status_updated_at string The `attributes` `status_updated_at`. updated_at string The `attributes` `updated_at`. id string The `GcpUcConfigResponseData` `id`. type [_required_] enum Google Cloud Usage Cost config resource type. 
Allowed enum values: `gcp_uc_config` default: `gcp_uc_config` ``` { "data": { "attributes": { "account_id": "123456_A123BC_12AB34", "bucket_name": "dd-cost-bucket", "created_at": "2023-01-01T12:00:00.000Z", "dataset": "billing", "error_messages": [], "export_prefix": "datadog_cloud_cost_usage_export", "export_project_name": "dd-cloud-cost-report", "months": 36, "project_id": "my-project-123", "service_account": "dd-ccm-gcp-integration@my-environment.iam.gserviceaccount.com", "status": "active", "status_updated_at": "2023-01-01T12:00:00.000Z", "updated_at": "2023-01-01T12:00:00.000Z" }, "id": "123456789123", "type": "gcp_uc_config" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get Google Cloud Usage Cost config Copy ``` # Path parameters export cloud_account_id="CHANGE_ME" # Curl command (replace datadoghq.com with your Datadog site if needed) curl -X GET "https://api.datadoghq.com/api/v2/cost/gcp_uc_config/${cloud_account_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Google Cloud Usage Cost config ``` """ Get Google Cloud Usage Cost config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_cost_gcp_usage_cost_config( cloud_account_id=123456, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Google Cloud Usage Cost config ``` # Get Google Cloud Usage Cost config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_cost_gcp_usage_cost_config(123456) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ```
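If you only need a quick look at a config's state from the shell, the GET request above can be piped through `jq` to pull out the `status` and `error_messages` fields described in the response model. This is a minimal sketch rather than an official example: it assumes `jq` is installed, `DD_API_KEY` and `DD_APP_KEY` are exported, and it uses `datadoghq.com` in place of your Datadog site.

```
# Print the config status followed by any error messages (sketch, not an official example)
export cloud_account_id="CHANGE_ME"

curl -s "https://api.datadoghq.com/api/v2/cost/gcp_uc_config/${cloud_account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  | jq -r '"status: \(.data.attributes.status)", (.data.attributes.error_messages // [])[]'
```

An empty `error_messages` array, as in the example response above, means no errors are currently reported for the config.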
##### Get Google Cloud Usage Cost config ``` // Get Google Cloud Usage Cost config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetCostGCPUsageCostConfig(ctx, 123456) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetCostGCPUsageCostConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetCostGCPUsageCostConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Google Cloud Usage Cost config ``` // Get Google Cloud Usage Cost config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.GcpUcConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { GcpUcConfigResponse result = apiInstance.getCostGCPUsageCostConfig(123456L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getCostGCPUsageCostConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Google Cloud Usage Cost config ``` // Get Google Cloud Usage Cost config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_cost_gcp_usage_cost_config(123456).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Google Cloud Usage Cost config ``` /** * Get 
Google Cloud Usage Cost config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetCostGCPUsageCostConfigRequest = { cloudAccountId: 123456, }; apiInstance .getCostGCPUsageCostConfig(params) .then((data: v2.GcpUcConfigResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List tag pipeline rulesets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-tag-pipeline-rulesets) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-tag-pipeline-rulesets-v2) GET https://api.ap1.datadoghq.com/api/v2/tags/enrichmenthttps://api.ap2.datadoghq.com/api/v2/tags/enrichmenthttps://api.datadoghq.eu/api/v2/tags/enrichmenthttps://api.ddog-gov.com/api/v2/tags/enrichmenthttps://api.datadoghq.com/api/v2/tags/enrichmenthttps://api.us3.datadoghq.com/api/v2/tags/enrichmenthttps://api.us5.datadoghq.com/api/v2/tags/enrichment ### Overview List all tag pipeline rulesets - Retrieve a list of all tag pipeline rulesets for the organization OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListTagPipelinesRulesets-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListTagPipelinesRulesets-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `RulesetRespArray` object. Field Type Description data [_required_] [object] The `RulesetRespArray` `data`. attributes object The definition of `RulesetRespDataAttributes` object. created [_required_] object The definition of `RulesetRespDataAttributesCreated` object. nanos int32 The `created` `nanos`. seconds int64 The `created` `seconds`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. modified [_required_] object The definition of `RulesetRespDataAttributesModified` object. nanos int32 The `modified` `nanos`. seconds int64 The `modified` `seconds`. name [_required_] string The `attributes` `name`. position [_required_] int32 The `attributes` `position`. processing_status string The `attributes` `processing_status`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `RulesetRespDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. 
query object The definition of `RulesetRespDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `RulesetRespDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. reference_table object The definition of `RulesetRespDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. version [_required_] int64 The `attributes` `version`. id string The `RulesetRespData` `id`. type [_required_] enum Ruleset resource type. Allowed enum values: `ruleset` default: `ruleset` ``` { "data": [ { "attributes": { "created": null, "enabled": true, "last_modified_user_uuid": "", "modified": null, "name": "Production Cost Allocation Rules", "position": 0, "rules": [ { "enabled": true, "mapping": null, "metadata": null, "name": "AWS Production Account Tagging", "query": { "addition": { "key": "environment", "value": "production" }, "case_insensitivity": false, "if_not_exists": true, "query": "billingcurrency:\"USD\" AND account_name:\"prod-account\"" }, "reference_table": null }, { "enabled": true, "mapping": { "destination_key": "team_owner", "if_not_exists": true, "source_keys": [ "account_name", "service" ] }, "metadata": null, "name": "Team Mapping Rule", "query": null, "reference_table": null }, { "enabled": true, "mapping": null, "metadata": null, "name": "New table rule with new UI", "query": null, "reference_table": { "case_insensitivity": true, "field_pairs": [ { "input_column": "status_type", "output_key": "status" }, { "input_column": "status_description", "output_key": "dess" } ], "if_not_exists": false, "source_keys": [ "http_status", "status_description" ], "table_name": "http_status_codes" } } ], "version": 2 }, "id": "55ef2385-9ae1-4410-90c4-5ac1b60fec10", "type": "ruleset" }, { "attributes": { "created": null, "enabled": true, "last_modified_user_uuid": "", "modified": null, "name": "Development Environment Rules", "position": 0, "rules": [ { "enabled": true, "mapping": null, "metadata": null, "name": "Dev Account Cost Center", "query": { "addition": { "key": "cost_center", "value": "engineering" }, "case_insensitivity": true, "if_not_exists": true, "query": "account_name:\"dev-*\"" }, "reference_table": null } ], "version": 1 }, "id": "a7b8c9d0-1234-5678-9abc-def012345678", "type": "ruleset" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### List tag pipeline rulesets Copy ``` # Curl command (replace datadoghq.com with your Datadog site if needed) curl -X GET "https://api.datadoghq.com/api/v2/tags/enrichment" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List tag pipeline rulesets ``` """ List tag pipeline rulesets returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.list_tag_pipelines_rulesets() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List tag pipeline rulesets ``` # List tag pipeline rulesets returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.list_tag_pipelines_rulesets() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List tag pipeline rulesets ``` // List tag pipeline rulesets returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ListTagPipelinesRulesets(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListTagPipelinesRulesets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListTagPipelinesRulesets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List tag pipeline rulesets ``` // List tag pipeline rulesets returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.RulesetRespArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { RulesetRespArray result = apiInstance.listTagPipelinesRulesets(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listTagPipelinesRulesets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List tag pipeline rulesets ``` // List tag pipeline rulesets returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_tag_pipelines_rulesets().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List tag pipeline rulesets ``` /** * List tag pipeline rulesets returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listTagPipelinesRulesets() .then((data: v2.RulesetRespArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-tag-pipeline-ruleset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-tag-pipeline-ruleset-v2) POST https://api.ap1.datadoghq.com/api/v2/tags/enrichmenthttps://api.ap2.datadoghq.com/api/v2/tags/enrichmenthttps://api.datadoghq.eu/api/v2/tags/enrichmenthttps://api.ddog-gov.com/api/v2/tags/enrichmenthttps://api.datadoghq.com/api/v2/tags/enrichmenthttps://api.us3.datadoghq.com/api/v2/tags/enrichmenthttps://api.us5.datadoghq.com/api/v2/tags/enrichment ### Overview Create a new tag pipeline ruleset with the specified rules and configuration OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object The definition of `CreateRulesetRequestData` object. attributes object The definition of `CreateRulesetRequestDataAttributes` object. enabled boolean The `attributes` `enabled`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `CreateRulesetRequestDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. query object The definition of `CreateRulesetRequestDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `CreateRulesetRequestDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. reference_table object The definition of `CreateRulesetRequestDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. id string The `CreateRulesetRequestData` `id`. type [_required_] enum Create ruleset resource type. 
Allowed enum values: `create_ruleset` default: `create_ruleset` ``` { "data": { "attributes": { "enabled": true, "rules": [ { "enabled": true, "mapping": null, "name": "Add Cost Center Tag", "query": { "addition": { "key": "cost_center", "value": "engineering" }, "case_insensitivity": false, "if_not_exists": true, "query": "account_id:\"123456789\" AND service:\"web-api\"" }, "reference_table": null } ] }, "id": "New Ruleset", "type": "create_ruleset" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateTagPipelinesRuleset-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateTagPipelinesRuleset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `RulesetResp` object. Field Type Description data object The definition of `RulesetRespData` object. attributes object The definition of `RulesetRespDataAttributes` object. created [_required_] object The definition of `RulesetRespDataAttributesCreated` object. nanos int32 The `created` `nanos`. seconds int64 The `created` `seconds`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. modified [_required_] object The definition of `RulesetRespDataAttributesModified` object. nanos int32 The `modified` `nanos`. seconds int64 The `modified` `seconds`. name [_required_] string The `attributes` `name`. position [_required_] int32 The `attributes` `position`. processing_status string The `attributes` `processing_status`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `RulesetRespDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. query object The definition of `RulesetRespDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `RulesetRespDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. reference_table object The definition of `RulesetRespDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. version [_required_] int64 The `attributes` `version`. id string The `RulesetRespData` `id`. type [_required_] enum Ruleset resource type. 
Allowed enum values: `ruleset` default: `ruleset` ``` { "data": { "attributes": { "created": null, "enabled": true, "last_modified_user_uuid": "", "modified": null, "name": "Example Ruleset", "position": 0, "rules": [ { "enabled": false, "mapping": null, "metadata": null, "name": "RC test rule edited1", "query": { "addition": { "key": "abc", "value": "ww" }, "case_insensitivity": false, "if_not_exists": true, "query": "billingcurrency:\"USD\" AND account_name:\"SZA96462\" AND billingcurrency:\"USD\"" }, "reference_table": null }, { "enabled": true, "mapping": { "destination_key": "h", "if_not_exists": true, "source_keys": [ "accountname", "accountownerid" ] }, "metadata": null, "name": "rule with empty source key", "query": null, "reference_table": null }, { "enabled": true, "mapping": null, "metadata": null, "name": "New table rule with new UI", "query": null, "reference_table": { "case_insensitivity": true, "field_pairs": [ { "input_column": "status_type", "output_key": "status" }, { "input_column": "status_description", "output_key": "dess" } ], "if_not_exists": false, "source_keys": [ "http_status", "status_description" ], "table_name": "http_status_codes" } } ], "version": 1 }, "id": "12345", "type": "ruleset" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create tag pipeline ruleset returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/tags/enrichment" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "rules": [ { "enabled": true, "mapping": null, "name": "Add Cost Center Tag", "query": { "addition": { "key": "cost_center", "value": "engineering" }, "case_insensitivity": false, "if_not_exists": true, "query": "account_id:\"123456789\" AND service:\"web-api\"" }, "reference_table": null } ] }, "id": "New Ruleset", "type": "create_ruleset" } } EOF ``` ##### Create tag pipeline ruleset returns "OK" response ``` // Create tag pipeline ruleset returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateRulesetRequest{ Data: &datadogV2.CreateRulesetRequestData{ Attributes: &datadogV2.CreateRulesetRequestDataAttributes{ Enabled: datadog.PtrBool(true), Rules: 
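// Note: in the documented examples, each rule populates exactly one of
// mapping, query, or reference_table and leaves the other two nil.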
[]datadogV2.CreateRulesetRequestDataAttributesRulesItems{ { Enabled: true, Mapping: *datadogV2.NewNullableCreateRulesetRequestDataAttributesRulesItemsMapping(nil), Name: "Add Cost Center Tag", Query: *datadogV2.NewNullableCreateRulesetRequestDataAttributesRulesItemsQuery(&datadogV2.CreateRulesetRequestDataAttributesRulesItemsQuery{ Addition: *datadogV2.NewNullableCreateRulesetRequestDataAttributesRulesItemsQueryAddition(&datadogV2.CreateRulesetRequestDataAttributesRulesItemsQueryAddition{ Key: "cost_center", Value: "engineering", }), CaseInsensitivity: datadog.PtrBool(false), IfNotExists: true, Query: `account_id:"123456789" AND service:"web-api"`, }), ReferenceTable: *datadogV2.NewNullableCreateRulesetRequestDataAttributesRulesItemsReferenceTable(nil), }, }, }, Id: datadog.PtrString("New Ruleset"), Type: datadogV2.CREATERULESETREQUESTDATATYPE_CREATE_RULESET, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.CreateTagPipelinesRuleset(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.CreateTagPipelinesRuleset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.CreateTagPipelinesRuleset`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create tag pipeline ruleset returns "OK" response ``` // Create tag pipeline ruleset returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.CreateRulesetRequest; import com.datadog.api.client.v2.model.CreateRulesetRequestData; import com.datadog.api.client.v2.model.CreateRulesetRequestDataAttributes; import com.datadog.api.client.v2.model.CreateRulesetRequestDataAttributesRulesItems; import com.datadog.api.client.v2.model.CreateRulesetRequestDataAttributesRulesItemsQuery; import com.datadog.api.client.v2.model.CreateRulesetRequestDataAttributesRulesItemsQueryAddition; import com.datadog.api.client.v2.model.CreateRulesetRequestDataType; import com.datadog.api.client.v2.model.RulesetResp; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); CreateRulesetRequest body = new CreateRulesetRequest() .data( new CreateRulesetRequestData() .attributes( new CreateRulesetRequestDataAttributes() .enabled(true) .rules( Collections.singletonList( new CreateRulesetRequestDataAttributesRulesItems() .enabled(true) .mapping(null) .name("Add Cost Center Tag") .query( new CreateRulesetRequestDataAttributesRulesItemsQuery() .addition( new CreateRulesetRequestDataAttributesRulesItemsQueryAddition() .key("cost_center") .value("engineering")) .caseInsensitivity(false) .ifNotExists(true) .query( """ account_id:"123456789" AND service:"web-api" """)) 
.referenceTable(null)))) .id("New Ruleset") .type(CreateRulesetRequestDataType.CREATE_RULESET)); try { RulesetResp result = apiInstance.createTagPipelinesRuleset(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#createTagPipelinesRuleset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create tag pipeline ruleset returns "OK" response ``` """ Create tag pipeline ruleset returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.create_ruleset_request import CreateRulesetRequest from datadog_api_client.v2.model.create_ruleset_request_data import CreateRulesetRequestData from datadog_api_client.v2.model.create_ruleset_request_data_attributes import CreateRulesetRequestDataAttributes from datadog_api_client.v2.model.create_ruleset_request_data_attributes_rules_items import ( CreateRulesetRequestDataAttributesRulesItems, ) from datadog_api_client.v2.model.create_ruleset_request_data_attributes_rules_items_query import ( CreateRulesetRequestDataAttributesRulesItemsQuery, ) from datadog_api_client.v2.model.create_ruleset_request_data_attributes_rules_items_query_addition import ( CreateRulesetRequestDataAttributesRulesItemsQueryAddition, ) from datadog_api_client.v2.model.create_ruleset_request_data_type import CreateRulesetRequestDataType body = CreateRulesetRequest( data=CreateRulesetRequestData( attributes=CreateRulesetRequestDataAttributes( enabled=True, rules=[ CreateRulesetRequestDataAttributesRulesItems( enabled=True, mapping=None, name="Add Cost Center Tag", query=CreateRulesetRequestDataAttributesRulesItemsQuery( addition=CreateRulesetRequestDataAttributesRulesItemsQueryAddition( key="cost_center", value="engineering", ), case_insensitivity=False, if_not_exists=True, query='account_id:"123456789" AND service:"web-api"', ), reference_table=None, ), ], ), id="New Ruleset", type=CreateRulesetRequestDataType.CREATE_RULESET, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.create_tag_pipelines_ruleset(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create tag pipeline ruleset returns "OK" response ``` # Create tag pipeline ruleset returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::CreateRulesetRequest.new({ data: DatadogAPIClient::V2::CreateRulesetRequestData.new({ attributes: 
DatadogAPIClient::V2::CreateRulesetRequestDataAttributes.new({ enabled: true, rules: [ DatadogAPIClient::V2::CreateRulesetRequestDataAttributesRulesItems.new({ enabled: true, mapping: nil, name: "Add Cost Center Tag", query: DatadogAPIClient::V2::CreateRulesetRequestDataAttributesRulesItemsQuery.new({ addition: DatadogAPIClient::V2::CreateRulesetRequestDataAttributesRulesItemsQueryAddition.new({ key: "cost_center", value: "engineering", }), case_insensitivity: false, if_not_exists: true, query: 'account_id:"123456789" AND service:"web-api"', }), reference_table: nil, }), ], }), id: "New Ruleset", type: DatadogAPIClient::V2::CreateRulesetRequestDataType::CREATE_RULESET, }), }) p api_instance.create_tag_pipelines_ruleset(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create tag pipeline ruleset returns "OK" response ``` // Create tag pipeline ruleset returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::CreateRulesetRequest; use datadog_api_client::datadogV2::model::CreateRulesetRequestData; use datadog_api_client::datadogV2::model::CreateRulesetRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateRulesetRequestDataAttributesRulesItems; use datadog_api_client::datadogV2::model::CreateRulesetRequestDataAttributesRulesItemsQuery; use datadog_api_client::datadogV2::model::CreateRulesetRequestDataAttributesRulesItemsQueryAddition; use datadog_api_client::datadogV2::model::CreateRulesetRequestDataType; #[tokio::main] async fn main() { let body = CreateRulesetRequest::new().data( CreateRulesetRequestData::new(CreateRulesetRequestDataType::CREATE_RULESET) .attributes( CreateRulesetRequestDataAttributes::new(vec![ CreateRulesetRequestDataAttributesRulesItems::new( true, "Add Cost Center Tag".to_string(), ) .mapping(None) .query(Some( CreateRulesetRequestDataAttributesRulesItemsQuery::new( Some( CreateRulesetRequestDataAttributesRulesItemsQueryAddition::new( "cost_center".to_string(), "engineering".to_string(), ), ), true, r#"account_id:"123456789" AND service:"web-api""#.to_string(), ) .case_insensitivity(false), )) .reference_table(None), ]) .enabled(true), ) .id("New Ruleset".to_string()), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.create_tag_pipelines_ruleset(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create tag pipeline ruleset returns "OK" response ``` /** * Create tag pipeline ruleset returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: 
v2.CloudCostManagementApiCreateTagPipelinesRulesetRequest = { body: { data: { attributes: { enabled: true, rules: [ { enabled: true, mapping: undefined, name: "Add Cost Center Tag", query: { addition: { key: "cost_center", value: "engineering", }, caseInsensitivity: false, ifNotExists: true, query: `account_id:"123456789" AND service:"web-api"`, }, referenceTable: undefined, }, ], }, id: "New Ruleset", type: "create_ruleset", }, }, }; apiInstance .createTagPipelinesRuleset(params) .then((data: v2.RulesetResp) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-tag-pipeline-ruleset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-tag-pipeline-ruleset-v2) PATCH https://api.ap1.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.ap2.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.eu/api/v2/tags/enrichment/{ruleset_id}https://api.ddog-gov.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us3.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us5.datadoghq.com/api/v2/tags/enrichment/{ruleset_id} ### Overview Update a tag pipeline ruleset - Update an existing tag pipeline ruleset with new rules and configuration OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description ruleset_id [_required_] string The unique identifier of the ruleset ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object The definition of `UpdateRulesetRequestData` object. attributes object The definition of `UpdateRulesetRequestDataAttributes` object. enabled [_required_] boolean The `attributes` `enabled`. last_version int64 The `attributes` `last_version`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `UpdateRulesetRequestDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. query object The definition of `UpdateRulesetRequestDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `UpdateRulesetRequestDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. 
reference_table object The definition of `UpdateRulesetRequestDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. id string The `UpdateRulesetRequestData` `id`. type [_required_] enum Update ruleset resource type. Allowed enum values: `update_ruleset` default: `update_ruleset` ``` { "data": { "attributes": { "enabled": true, "last_version": 3611102, "rules": [ { "enabled": true, "mapping": { "destination_key": "team_owner", "if_not_exists": true, "source_keys": [ "account_name", "account_id" ] }, "name": "Account Name Mapping", "query": null, "reference_table": null } ] }, "id": "New Ruleset", "type": "update_ruleset" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateTagPipelinesRuleset-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateTagPipelinesRuleset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `RulesetResp` object. Field Type Description data object The definition of `RulesetRespData` object. attributes object The definition of `RulesetRespDataAttributes` object. created [_required_] object The definition of `RulesetRespDataAttributesCreated` object. nanos int32 The `created` `nanos`. seconds int64 The `created` `seconds`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. modified [_required_] object The definition of `RulesetRespDataAttributesModified` object. nanos int32 The `modified` `nanos`. seconds int64 The `modified` `seconds`. name [_required_] string The `attributes` `name`. position [_required_] int32 The `attributes` `position`. processing_status string The `attributes` `processing_status`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `RulesetRespDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. query object The definition of `RulesetRespDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `RulesetRespDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. reference_table object The definition of `RulesetRespDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. 
output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. version [_required_] int64 The `attributes` `version`. id string The `RulesetRespData` `id`. type [_required_] enum Ruleset resource type. Allowed enum values: `ruleset` default: `ruleset` ``` { "data": { "attributes": { "created": null, "enabled": true, "last_modified_user_uuid": "", "modified": null, "name": "Example Ruleset", "position": 0, "rules": [ { "enabled": false, "mapping": null, "metadata": null, "name": "RC test rule edited1", "query": { "addition": { "key": "abc", "value": "ww" }, "case_insensitivity": false, "if_not_exists": true, "query": "billingcurrency:\"USD\" AND account_name:\"SZA96462\" AND billingcurrency:\"USD\"" }, "reference_table": null }, { "enabled": true, "mapping": { "destination_key": "h", "if_not_exists": true, "source_keys": [ "accountname", "accountownerid" ] }, "metadata": null, "name": "rule with empty source key", "query": null, "reference_table": null }, { "enabled": true, "mapping": null, "metadata": null, "name": "New table rule with new UI", "query": null, "reference_table": { "case_insensitivity": true, "field_pairs": [ { "input_column": "status_type", "output_key": "status" }, { "input_column": "status_description", "output_key": "dess" } ], "if_not_exists": false, "source_keys": [ "http_status", "status_description" ], "table_name": "http_status_codes" } } ], "version": 1 }, "id": "12345", "type": "ruleset" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Update tag pipeline ruleset returns "OK" response Copy ``` # Path parameters export ruleset_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/tags/enrichment/${ruleset_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "last_version": 3611102, "rules": [ { "enabled": true, "mapping": { "destination_key": "team_owner", "if_not_exists": true, "source_keys": [ "account_name", "account_id" ] }, "name": "Account Name Mapping", "query": null, "reference_table": null } ] }, "id": "New Ruleset", "type": "update_ruleset" } } EOF ``` ##### Update tag pipeline ruleset returns "OK" response ``` // Update tag pipeline ruleset returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpdateRulesetRequest{ Data: &datadogV2.UpdateRulesetRequestData{ Attributes: &datadogV2.UpdateRulesetRequestDataAttributes{ Enabled: true, LastVersion: datadog.PtrInt64(3611102), Rules: []datadogV2.UpdateRulesetRequestDataAttributesRulesItems{ { Enabled: true, Mapping: *datadogV2.NewNullableUpdateRulesetRequestDataAttributesRulesItemsMapping(&datadogV2.UpdateRulesetRequestDataAttributesRulesItemsMapping{ DestinationKey: "team_owner", IfNotExists: true, SourceKeys: []string{ "account_name", "account_id", }, }), Name: "Account Name Mapping", Query: *datadogV2.NewNullableUpdateRulesetRequestDataAttributesRulesItemsQuery(nil), ReferenceTable: *datadogV2.NewNullableUpdateRulesetRequestDataAttributesRulesItemsReferenceTable(nil), }, }, }, Id: datadog.PtrString("New Ruleset"), Type: datadogV2.UPDATERULESETREQUESTDATATYPE_UPDATE_RULESET, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpdateTagPipelinesRuleset(ctx, "ee10c3ff-312f-464c-b4f6-46adaa6d00a1", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpdateTagPipelinesRuleset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpdateTagPipelinesRuleset`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to 
`main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update tag pipeline ruleset returns "OK" response ``` // Update tag pipeline ruleset returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.RulesetResp; import com.datadog.api.client.v2.model.UpdateRulesetRequest; import com.datadog.api.client.v2.model.UpdateRulesetRequestData; import com.datadog.api.client.v2.model.UpdateRulesetRequestDataAttributes; import com.datadog.api.client.v2.model.UpdateRulesetRequestDataAttributesRulesItems; import com.datadog.api.client.v2.model.UpdateRulesetRequestDataAttributesRulesItemsMapping; import com.datadog.api.client.v2.model.UpdateRulesetRequestDataType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); UpdateRulesetRequest body = new UpdateRulesetRequest() .data( new UpdateRulesetRequestData() .attributes( new UpdateRulesetRequestDataAttributes() .enabled(true) .lastVersion(3611102L) .rules( Collections.singletonList( new UpdateRulesetRequestDataAttributesRulesItems() .enabled(true) .mapping( new UpdateRulesetRequestDataAttributesRulesItemsMapping() .destinationKey("team_owner") .ifNotExists(true) .sourceKeys( Arrays.asList("account_name", "account_id"))) .name("Account Name Mapping") .query(null) .referenceTable(null)))) .id("New Ruleset") .type(UpdateRulesetRequestDataType.UPDATE_RULESET)); try { RulesetResp result = apiInstance.updateTagPipelinesRuleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#updateTagPipelinesRuleset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update tag pipeline ruleset returns "OK" response ``` """ Update tag pipeline ruleset returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.update_ruleset_request import UpdateRulesetRequest from datadog_api_client.v2.model.update_ruleset_request_data import UpdateRulesetRequestData from datadog_api_client.v2.model.update_ruleset_request_data_attributes import UpdateRulesetRequestDataAttributes from datadog_api_client.v2.model.update_ruleset_request_data_attributes_rules_items import ( UpdateRulesetRequestDataAttributesRulesItems, ) from datadog_api_client.v2.model.update_ruleset_request_data_attributes_rules_items_mapping import ( UpdateRulesetRequestDataAttributesRulesItemsMapping, ) from 
datadog_api_client.v2.model.update_ruleset_request_data_type import UpdateRulesetRequestDataType body = UpdateRulesetRequest( data=UpdateRulesetRequestData( attributes=UpdateRulesetRequestDataAttributes( enabled=True, last_version=3611102, rules=[ UpdateRulesetRequestDataAttributesRulesItems( enabled=True, mapping=UpdateRulesetRequestDataAttributesRulesItemsMapping( destination_key="team_owner", if_not_exists=True, source_keys=[ "account_name", "account_id", ], ), name="Account Name Mapping", query=None, reference_table=None, ), ], ), id="New Ruleset", type=UpdateRulesetRequestDataType.UPDATE_RULESET, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.update_tag_pipelines_ruleset(ruleset_id="ee10c3ff-312f-464c-b4f6-46adaa6d00a1", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update tag pipeline ruleset returns "OK" response ``` # Update tag pipeline ruleset returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::UpdateRulesetRequest.new({ data: DatadogAPIClient::V2::UpdateRulesetRequestData.new({ attributes: DatadogAPIClient::V2::UpdateRulesetRequestDataAttributes.new({ enabled: true, last_version: 3611102, rules: [ DatadogAPIClient::V2::UpdateRulesetRequestDataAttributesRulesItems.new({ enabled: true, mapping: DatadogAPIClient::V2::UpdateRulesetRequestDataAttributesRulesItemsMapping.new({ destination_key: "team_owner", if_not_exists: true, source_keys: [ "account_name", "account_id", ], }), name: "Account Name Mapping", query: nil, reference_table: nil, }), ], }), id: "New Ruleset", type: DatadogAPIClient::V2::UpdateRulesetRequestDataType::UPDATE_RULESET, }), }) p api_instance.update_tag_pipelines_ruleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update tag pipeline ruleset returns "OK" response ``` // Update tag pipeline ruleset returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::UpdateRulesetRequest; use datadog_api_client::datadogV2::model::UpdateRulesetRequestData; use datadog_api_client::datadogV2::model::UpdateRulesetRequestDataAttributes; use datadog_api_client::datadogV2::model::UpdateRulesetRequestDataAttributesRulesItems; use datadog_api_client::datadogV2::model::UpdateRulesetRequestDataAttributesRulesItemsMapping; use datadog_api_client::datadogV2::model::UpdateRulesetRequestDataType; #[tokio::main] async fn main() { let body = UpdateRulesetRequest::new().data( UpdateRulesetRequestData::new(UpdateRulesetRequestDataType::UPDATE_RULESET) .attributes( UpdateRulesetRequestDataAttributes::new( true, 
vec![UpdateRulesetRequestDataAttributesRulesItems::new( true, "Account Name Mapping".to_string(), ) .mapping(Some( UpdateRulesetRequestDataAttributesRulesItemsMapping::new( "team_owner".to_string(), true, vec!["account_name".to_string(), "account_id".to_string()], ), )) .query(None) .reference_table(None)], ) .last_version(3611102), ) .id("New Ruleset".to_string()), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api .update_tag_pipelines_ruleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update tag pipeline ruleset returns "OK" response ``` /** * Update tag pipeline ruleset returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpdateTagPipelinesRulesetRequest = { body: { data: { attributes: { enabled: true, lastVersion: 3611102, rules: [ { enabled: true, mapping: { destinationKey: "team_owner", ifNotExists: true, sourceKeys: ["account_name", "account_id"], }, name: "Account Name Mapping", query: undefined, referenceTable: undefined, }, ], }, id: "New Ruleset", type: "update_ruleset", }, }, rulesetId: "ee10c3ff-312f-464c-b4f6-46adaa6d00a1", }; apiInstance .updateTagPipelinesRuleset(params) .then((data: v2.RulesetResp) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-tag-pipeline-ruleset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-tag-pipeline-ruleset-v2) DELETE https://api.ap1.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.ap2.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.eu/api/v2/tags/enrichment/{ruleset_id}https://api.ddog-gov.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us3.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us5.datadoghq.com/api/v2/tags/enrichment/{ruleset_id} ### Overview Delete a tag pipeline ruleset - Delete an existing tag pipeline ruleset by its ID OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. 
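Because a successful delete returns an empty 204 response, it can help to fetch the ruleset first and log what is about to be removed. The sketch below does this with the Python client used in the examples that follow; it reuses the sample ruleset ID from those examples, pairs the delete with the Get endpoint documented later on this page, and assumes the client surfaces HTTP errors as `datadog_api_client.exceptions.ApiException` with `status` and `reason` attributes (an assumption about the client library, not something stated on this page). Treat it as a minimal sketch rather than a hardened implementation.

```
# Minimal sketch: confirm a ruleset exists before deleting it.
# Assumes DD_API_KEY, DD_APP_KEY (and optionally DD_SITE) are set in the environment,
# and that the client raises datadog_api_client.exceptions.ApiException on HTTP errors.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi

RULESET_ID = "ee10c3ff-312f-464c-b4f6-46adaa6d00a1"  # sample ID from the examples below

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = CloudCostManagementApi(api_client)
    try:
        # Fetch first so there is something to log: the delete itself returns no body.
        ruleset = api.get_tag_pipelines_ruleset(ruleset_id=RULESET_ID)
        print(f"Deleting ruleset {RULESET_ID} ({ruleset.data.attributes.name})")
        api.delete_tag_pipelines_ruleset(ruleset_id=RULESET_ID)  # 204 No Content on success
    except ApiException as e:
        # For example, the 429 "Too many requests" response documented for this endpoint.
        print(f"Request failed: {e.status} {e.reason}")
```

The same pattern carries over to the other clients shown below; the only endpoint-specific pieces are the `get_tag_pipelines_ruleset` and `delete_tag_pipelines_ruleset` calls.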
### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| ruleset_id [_required_] | string | The unique identifier of the ruleset |

### Response

* [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteTagPipelinesRuleset-204-v2)
* [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteTagPipelinesRuleset-429-v2)

No Content

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/)
* [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript)

##### Delete tag pipeline ruleset

```
# Path parameters
export ruleset_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example
# api.datadoghq.eu, api.us3.datadoghq.com, api.us5.datadoghq.com, api.ap1.datadoghq.com,
# api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/tags/enrichment/${ruleset_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete tag pipeline ruleset

```
"""
Delete tag pipeline ruleset returns "No Content" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CloudCostManagementApi(api_client)
    api_instance.delete_tag_pipelines_ruleset(
        ruleset_id="ee10c3ff-312f-464c-b4f6-46adaa6d00a1",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete tag pipeline ruleset

```
# Delete tag pipeline ruleset returns "No Content" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new
api_instance.delete_tag_pipelines_ruleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete tag pipeline ruleset

```
// Delete tag pipeline ruleset returns "No Content" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewCloudCostManagementApi(apiClient)
	r, err := api.DeleteTagPipelinesRuleset(ctx, "ee10c3ff-312f-464c-b4f6-46adaa6d00a1")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteTagPipelinesRuleset`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete tag pipeline ruleset

```
// Delete tag pipeline ruleset returns "No Content" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.CloudCostManagementApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient);

    try {
      apiInstance.deleteTagPipelinesRuleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1");
    } catch (ApiException e) {
      System.err.println("Exception when calling CloudCostManagementApi#deleteTagPipelinesRuleset");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Delete tag pipeline ruleset

```
// Delete tag pipeline ruleset returns "No Content" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = CloudCostManagementAPI::with_config(configuration);
    let resp = api
        .delete_tag_pipelines_ruleset("ee10c3ff-312f-464c-b4f6-46adaa6d00a1".to_string())
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Delete tag pipeline ruleset

```
/**
 * Delete tag pipeline ruleset returns "No Content" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.CloudCostManagementApi(configuration);

const params: v2.CloudCostManagementApiDeleteTagPipelinesRulesetRequest = {
  rulesetId: "ee10c3ff-312f-464c-b4f6-46adaa6d00a1",
};

apiInstance
  .deleteTagPipelinesRuleset(params)
  .then((data: any) => {
    console.log(
      "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-tag-pipeline-ruleset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-tag-pipeline-ruleset-v2) GET https://api.ap1.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.ap2.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.eu/api/v2/tags/enrichment/{ruleset_id}https://api.ddog-gov.com/api/v2/tags/enrichment/{ruleset_id}https://api.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us3.datadoghq.com/api/v2/tags/enrichment/{ruleset_id}https://api.us5.datadoghq.com/api/v2/tags/enrichment/{ruleset_id} ### Overview Get a specific tag pipeline ruleset - Retrieve a specific tag pipeline ruleset by its ID OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description ruleset_id [_required_] string The unique identifier of the ruleset ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetTagPipelinesRuleset-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetTagPipelinesRuleset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `RulesetResp` object. Field Type Description data object The definition of `RulesetRespData` object. attributes object The definition of `RulesetRespDataAttributes` object. created [_required_] object The definition of `RulesetRespDataAttributesCreated` object. nanos int32 The `created` `nanos`. seconds int64 The `created` `seconds`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. modified [_required_] object The definition of `RulesetRespDataAttributesModified` object. nanos int32 The `modified` `nanos`. seconds int64 The `modified` `seconds`. name [_required_] string The `attributes` `name`. position [_required_] int32 The `attributes` `position`. processing_status string The `attributes` `processing_status`. rules [_required_] [object] The `attributes` `rules`. enabled [_required_] boolean The `items` `enabled`. mapping object The definition of `RulesetRespDataAttributesRulesItemsMapping` object. destination_key [_required_] string The `mapping` `destination_key`. if_not_exists [_required_] boolean The `mapping` `if_not_exists`. source_keys [_required_] [string] The `mapping` `source_keys`. metadata object The `items` `metadata`. string name [_required_] string The `items` `name`. query object The definition of `RulesetRespDataAttributesRulesItemsQuery` object. addition [_required_] object The definition of `RulesetRespDataAttributesRulesItemsQueryAddition` object. key [_required_] string The `addition` `key`. value [_required_] string The `addition` `value`. 
case_insensitivity boolean The `query` `case_insensitivity`. if_not_exists [_required_] boolean The `query` `if_not_exists`. query [_required_] string The `query` `query`. reference_table object The definition of `RulesetRespDataAttributesRulesItemsReferenceTable` object. case_insensitivity boolean The `reference_table` `case_insensitivity`. field_pairs [_required_] [object] The `reference_table` `field_pairs`. input_column [_required_] string The `items` `input_column`. output_key [_required_] string The `items` `output_key`. if_not_exists boolean The `reference_table` `if_not_exists`. source_keys [_required_] [string] The `reference_table` `source_keys`. table_name [_required_] string The `reference_table` `table_name`. version [_required_] int64 The `attributes` `version`. id string The `RulesetRespData` `id`. type [_required_] enum Ruleset resource type. Allowed enum values: `ruleset` default: `ruleset` ``` { "data": { "attributes": { "created": null, "enabled": true, "last_modified_user_uuid": "", "modified": null, "name": "Example Ruleset", "position": 0, "rules": [ { "enabled": false, "mapping": null, "metadata": null, "name": "RC test rule edited1", "query": { "addition": { "key": "abc", "value": "ww" }, "case_insensitivity": false, "if_not_exists": true, "query": "billingcurrency:\"USD\" AND account_name:\"SZA96462\" AND billingcurrency:\"USD\"" }, "reference_table": null }, { "enabled": true, "mapping": { "destination_key": "h", "if_not_exists": true, "source_keys": [ "accountname", "accountownerid" ] }, "metadata": null, "name": "rule with empty source key", "query": null, "reference_table": null }, { "enabled": true, "mapping": null, "metadata": null, "name": "New table rule with new UI", "query": null, "reference_table": { "case_insensitivity": true, "field_pairs": [ { "input_column": "status_type", "output_key": "status" }, { "input_column": "status_description", "output_key": "dess" } ], "if_not_exists": false, "source_keys": [ "http_status", "status_description" ], "table_name": "http_status_codes" } } ], "version": 1 }, "id": "12345", "type": "ruleset" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript)

##### Get a tag pipeline ruleset

```
# Path parameters
export ruleset_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example
# api.datadoghq.eu, api.us3.datadoghq.com, api.us5.datadoghq.com, api.ap1.datadoghq.com,
# api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/tags/enrichment/${ruleset_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a tag pipeline ruleset

```
"""
Get a tag pipeline ruleset returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CloudCostManagementApi(api_client)
    response = api_instance.get_tag_pipelines_ruleset(
        ruleset_id="a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de",
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a tag pipeline ruleset

```
# Get a tag pipeline ruleset returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new
p api_instance.get_tag_pipelines_ruleset("a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a tag pipeline ruleset

```
// Get a tag pipeline ruleset returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewCloudCostManagementApi(apiClient)
	resp, r, err := api.GetTagPipelinesRuleset(ctx, "a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetTagPipelinesRuleset`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetTagPipelinesRuleset`:\n%s\n", responseContent)
}
```
Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a tag pipeline ruleset ``` // Get a tag pipeline ruleset returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.RulesetResp; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { RulesetResp result = apiInstance.getTagPipelinesRuleset("a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getTagPipelinesRuleset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a tag pipeline ruleset ``` // Get a tag pipeline ruleset returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api .get_tag_pipelines_ruleset("a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a tag pipeline ruleset ``` /** * Get a tag pipeline ruleset returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetTagPipelinesRulesetRequest = { rulesetId: "a1e9de9b-b88e-41c6-a0cd-cc0ebd7092de", }; apiInstance .getTagPipelinesRuleset(params) .then((data: v2.RulesetResp) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Reorder tag pipeline rulesets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-tag-pipeline-rulesets) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-tag-pipeline-rulesets-v2) POST https://api.ap1.datadoghq.com/api/v2/tags/enrichment/reorderhttps://api.ap2.datadoghq.com/api/v2/tags/enrichment/reorderhttps://api.datadoghq.eu/api/v2/tags/enrichment/reorderhttps://api.ddog-gov.com/api/v2/tags/enrichment/reorderhttps://api.datadoghq.com/api/v2/tags/enrichment/reorderhttps://api.us3.datadoghq.com/api/v2/tags/enrichment/reorderhttps://api.us5.datadoghq.com/api/v2/tags/enrichment/reorder ### Overview Reorder tag pipeline rulesets - Change the execution order of tag pipeline rulesets OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] [object] The `ReorderRulesetResourceArray` `data`. id string The `ReorderRulesetResourceData` `id`. type [_required_] enum Ruleset resource type. Allowed enum values: `ruleset` default: `ruleset` ``` { "data": [ { "id": "string", "type": "ruleset" } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ReorderTagPipelinesRulesets-204-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ReorderTagPipelinesRulesets-429-v2) Successfully reordered rulesets Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript)

##### Reorder tag pipeline rulesets

```
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example
# api.datadoghq.eu, api.us3.datadoghq.com, api.us5.datadoghq.com, api.ap1.datadoghq.com,
# api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/tags/enrichment/reorder" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": [
    {
      "type": "ruleset"
    }
  ]
}
EOF
```

##### Reorder tag pipeline rulesets

```
"""
Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi
from datadog_api_client.v2.model.reorder_ruleset_resource_array import ReorderRulesetResourceArray
from datadog_api_client.v2.model.reorder_ruleset_resource_data import ReorderRulesetResourceData
from datadog_api_client.v2.model.reorder_ruleset_resource_data_type import ReorderRulesetResourceDataType

body = ReorderRulesetResourceArray(
    data=[
        ReorderRulesetResourceData(
            id="55ef2385-9ae1-4410-90c4-5ac1b60fec10",
            type=ReorderRulesetResourceDataType.RULESET,
        ),
        ReorderRulesetResourceData(
            id="a7b8c9d0-1234-5678-9abc-def012345678",
            type=ReorderRulesetResourceDataType.RULESET,
        ),
        ReorderRulesetResourceData(
            id="f1e2d3c4-b5a6-9780-1234-567890abcdef",
            type=ReorderRulesetResourceDataType.RULESET,
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CloudCostManagementApi(api_client)
    api_instance.reorder_tag_pipelines_rulesets(body=body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Reorder tag pipeline rulesets

```
# Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new

body = DatadogAPIClient::V2::ReorderRulesetResourceArray.new({
  data: [
    DatadogAPIClient::V2::ReorderRulesetResourceData.new({
      id: "55ef2385-9ae1-4410-90c4-5ac1b60fec10",
      type: DatadogAPIClient::V2::ReorderRulesetResourceDataType::RULESET,
    }),
    DatadogAPIClient::V2::ReorderRulesetResourceData.new({
      id: "a7b8c9d0-1234-5678-9abc-def012345678",
      type: DatadogAPIClient::V2::ReorderRulesetResourceDataType::RULESET,
    }),
    DatadogAPIClient::V2::ReorderRulesetResourceData.new({
      id: "f1e2d3c4-b5a6-9780-1234-567890abcdef",
      type: DatadogAPIClient::V2::ReorderRulesetResourceDataType::RULESET,
    }),
  ],
})
api_instance.reorder_tag_pipelines_rulesets(body)
```
#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Reorder tag pipeline rulesets

```
// Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.ReorderRulesetResourceArray{
		Data: []datadogV2.ReorderRulesetResourceData{
			{
				Id:   datadog.PtrString("55ef2385-9ae1-4410-90c4-5ac1b60fec10"),
				Type: datadogV2.REORDERRULESETRESOURCEDATATYPE_RULESET,
			},
			{
				Id:   datadog.PtrString("a7b8c9d0-1234-5678-9abc-def012345678"),
				Type: datadogV2.REORDERRULESETRESOURCEDATATYPE_RULESET,
			},
			{
				Id:   datadog.PtrString("f1e2d3c4-b5a6-9780-1234-567890abcdef"),
				Type: datadogV2.REORDERRULESETRESOURCEDATATYPE_RULESET,
			},
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewCloudCostManagementApi(apiClient)
	r, err := api.ReorderTagPipelinesRulesets(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ReorderTagPipelinesRulesets`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Reorder tag pipeline rulesets

```
// Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.CloudCostManagementApi;
import com.datadog.api.client.v2.model.ReorderRulesetResourceArray;
import com.datadog.api.client.v2.model.ReorderRulesetResourceData;
import com.datadog.api.client.v2.model.ReorderRulesetResourceDataType;
import java.util.Arrays;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient);

    ReorderRulesetResourceArray body =
        new ReorderRulesetResourceArray()
            .data(
                Arrays.asList(
                    new ReorderRulesetResourceData()
                        .id("55ef2385-9ae1-4410-90c4-5ac1b60fec10")
                        .type(ReorderRulesetResourceDataType.RULESET),
                    new ReorderRulesetResourceData()
                        .id("a7b8c9d0-1234-5678-9abc-def012345678")
                        .type(ReorderRulesetResourceDataType.RULESET),
                    new ReorderRulesetResourceData()
                        .id("f1e2d3c4-b5a6-9780-1234-567890abcdef")
                        .type(ReorderRulesetResourceDataType.RULESET)));

    try {
      apiInstance.reorderTagPipelinesRulesets(body);
    } catch (ApiException e) {
      System.err.println(
          "Exception when calling CloudCostManagementApi#reorderTagPipelinesRulesets");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Reorder tag pipeline rulesets ``` // Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::ReorderRulesetResourceArray; use datadog_api_client::datadogV2::model::ReorderRulesetResourceData; use datadog_api_client::datadogV2::model::ReorderRulesetResourceDataType; #[tokio::main] async fn main() { let body = ReorderRulesetResourceArray::new(vec![ ReorderRulesetResourceData::new(ReorderRulesetResourceDataType::RULESET) .id("55ef2385-9ae1-4410-90c4-5ac1b60fec10".to_string()), ReorderRulesetResourceData::new(ReorderRulesetResourceDataType::RULESET) .id("a7b8c9d0-1234-5678-9abc-def012345678".to_string()), ReorderRulesetResourceData::new(ReorderRulesetResourceDataType::RULESET) .id("f1e2d3c4-b5a6-9780-1234-567890abcdef".to_string()), ]); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.reorder_tag_pipelines_rulesets(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Reorder tag pipeline rulesets ``` /** * Reorder tag pipeline rulesets returns "Successfully reordered rulesets" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiReorderTagPipelinesRulesetsRequest = { body: { data: [ { id: "55ef2385-9ae1-4410-90c4-5ac1b60fec10", type: "ruleset", }, { id: "a7b8c9d0-1234-5678-9abc-def012345678", type: "ruleset", }, { id: "f1e2d3c4-b5a6-9780-1234-567890abcdef", type: "ruleset", }, ], }, }; apiInstance .reorderTagPipelinesRulesets(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate query](https://docs.datadoghq.com/api/latest/cloud-cost-management/#validate-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#validate-query-v2) POST https://api.ap1.datadoghq.com/api/v2/tags/enrichment/validate-queryhttps://api.ap2.datadoghq.com/api/v2/tags/enrichment/validate-queryhttps://api.datadoghq.eu/api/v2/tags/enrichment/validate-queryhttps://api.ddog-gov.com/api/v2/tags/enrichment/validate-queryhttps://api.datadoghq.com/api/v2/tags/enrichment/validate-queryhttps://api.us3.datadoghq.com/api/v2/tags/enrichment/validate-queryhttps://api.us5.datadoghq.com/api/v2/tags/enrichment/validate-query ### Overview Validate a tag pipeline query - Validate the syntax and structure of a tag pipeline query OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object The definition of `RulesValidateQueryRequestData` object. attributes object The definition of `RulesValidateQueryRequestDataAttributes` object. Query [_required_] string The `attributes` `Query`. id string The `RulesValidateQueryRequestData` `id`. type [_required_] enum Validate query resource type. Allowed enum values: `validate_query` default: `validate_query` ``` { "data": { "attributes": { "Query": "example:query AND test:true" }, "type": "validate_query" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ValidateQuery-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ValidateQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `RulesValidateQueryResponse` object. Field Type Description data object The definition of `RulesValidateQueryResponseData` object. attributes object The definition of `RulesValidateQueryResponseDataAttributes` object. Canonical [_required_] string The `attributes` `Canonical`. id string The `RulesValidateQueryResponseData` `id`. type [_required_] enum Validate response resource type. Allowed enum values: `validate_response` default: `validate_response` ``` { "data": { "attributes": { "Canonical": "canonical query representation" }, "type": "validate_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Validate query returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/tags/enrichment/validate-query" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "Query": "example:query AND test:true" }, "type": "validate_query" } } EOF ``` ##### Validate query returns "OK" response ``` // Validate query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RulesValidateQueryRequest{ Data: &datadogV2.RulesValidateQueryRequestData{ Attributes: &datadogV2.RulesValidateQueryRequestDataAttributes{ Query: "example:query AND test:true", }, Type: datadogV2.RULESVALIDATEQUERYREQUESTDATATYPE_VALIDATE_QUERY, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ValidateQuery(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ValidateQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ValidateQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate query returns "OK" response ``` // Validate query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.RulesValidateQueryRequest; import com.datadog.api.client.v2.model.RulesValidateQueryRequestData; import com.datadog.api.client.v2.model.RulesValidateQueryRequestDataAttributes; import com.datadog.api.client.v2.model.RulesValidateQueryRequestDataType; import com.datadog.api.client.v2.model.RulesValidateQueryResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); 
RulesValidateQueryRequest body = new RulesValidateQueryRequest() .data( new RulesValidateQueryRequestData() .attributes( new RulesValidateQueryRequestDataAttributes() .query("example:query AND test:true")) .type(RulesValidateQueryRequestDataType.VALIDATE_QUERY)); try { RulesValidateQueryResponse result = apiInstance.validateQuery(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#validateQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate query returns "OK" response ``` """ Validate query returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.rules_validate_query_request import RulesValidateQueryRequest from datadog_api_client.v2.model.rules_validate_query_request_data import RulesValidateQueryRequestData from datadog_api_client.v2.model.rules_validate_query_request_data_attributes import ( RulesValidateQueryRequestDataAttributes, ) from datadog_api_client.v2.model.rules_validate_query_request_data_type import RulesValidateQueryRequestDataType body = RulesValidateQueryRequest( data=RulesValidateQueryRequestData( attributes=RulesValidateQueryRequestDataAttributes( query="example:query AND test:true", ), type=RulesValidateQueryRequestDataType.VALIDATE_QUERY, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.validate_query(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate query returns "OK" response ``` # Validate query returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::RulesValidateQueryRequest.new({ data: DatadogAPIClient::V2::RulesValidateQueryRequestData.new({ attributes: DatadogAPIClient::V2::RulesValidateQueryRequestDataAttributes.new({ query: "example:query AND test:true", }), type: DatadogAPIClient::V2::RulesValidateQueryRequestDataType::VALIDATE_QUERY, }), }) p api_instance.validate_query(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate query returns "OK" response ``` // Validate query returns "OK" response use 
datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::RulesValidateQueryRequest; use datadog_api_client::datadogV2::model::RulesValidateQueryRequestData; use datadog_api_client::datadogV2::model::RulesValidateQueryRequestDataAttributes; use datadog_api_client::datadogV2::model::RulesValidateQueryRequestDataType; #[tokio::main] async fn main() { let body = RulesValidateQueryRequest::new().data( RulesValidateQueryRequestData::new(RulesValidateQueryRequestDataType::VALIDATE_QUERY) .attributes(RulesValidateQueryRequestDataAttributes::new( "example:query AND test:true".to_string(), )), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.validate_query(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate query returns "OK" response ``` /** * Validate query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiValidateQueryRequest = { body: { data: { attributes: { query: "example:query AND test:true", }, type: "validate_query", }, }, }; apiInstance .validateQuery(params) .then((data: v2.RulesValidateQueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List custom allocation rules](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-allocation-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-allocation-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.datadoghq.eu/api/v2/cost/arbitrary_rulehttps://api.ddog-gov.com/api/v2/cost/arbitrary_rulehttps://api.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.us3.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule ### Overview List all custom allocation rules - Retrieve a list of all custom allocation rules for the organization OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. 
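Each returned rule's `strategy` object is shaped by its `method`: `percent` rules carry fixed splits in `allocated_by`, `proportional` and `even` rules carry `allocated_by_tag_keys` (and optionally `based_on_costs`), and the timeseries methods rely on `based_on_timeseries`. Client code therefore usually branches on the method when reading the list. The following is a minimal, non-authoritative sketch using the TypeScript client shown in this endpoint's code examples; the camelCase response property names (`ruleName`, `orderId`, `allocatedBy`, `allocatedByTagKeys`) are assumed from the client's request models rather than confirmed by this page.

```
/**
 * Minimal sketch (not from the official docs): list custom allocation rules
 * and print a one-line summary per rule, branching on the strategy method.
 * Response property names are assumed to follow the client's camelCase convention.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.CloudCostManagementApi(configuration);

apiInstance
  .listCustomAllocationRules()
  .then((resp: v2.ArbitraryRuleResponseArray) => {
    for (const rule of resp.data ?? []) {
      // `attributes` is treated loosely here because the exact response typings
      // are an assumption in this sketch.
      const attrs: any = rule.attributes;
      if (!attrs) continue;

      const strategy = attrs.strategy;
      let summary: string;
      switch (strategy.method) {
        case "percent":
          // Fixed percentage splits live in `allocatedBy` (assumed name).
          summary = `${(strategy.allocatedBy ?? []).length} fixed percentage split(s)`;
          break;
        case "proportional":
        case "even":
          // Cost-based methods allocate across tag keys.
          summary = `allocated by tag keys: ${(strategy.allocatedByTagKeys ?? []).join(", ")}`;
          break;
        default:
          // proportional_timeseries, even_timeseries, and other methods.
          summary = `method: ${strategy.method}`;
      }

      console.log(
        `#${attrs.orderId} ${attrs.ruleName} (${attrs.enabled ? "enabled" : "disabled"}) - ${summary}`
      );
    }
  })
  .catch((error: any) => console.error(error));
```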
### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomAllocationRules-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomAllocationRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `ArbitraryRuleResponseArray` object. Field Type Description data [_required_] [object] The `ArbitraryRuleResponseArray` `data`. attributes object The definition of `ArbitraryRuleResponseDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. created [_required_] date-time The `attributes` `created`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. order_id [_required_] int64 The `attributes` `order_id`. processing_status string The `attributes` `processing_status`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryRuleResponseDataAttributesStrategy` object. allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The rule `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. type [_required_] string The `attributes` `type`. updated [_required_] date-time The `attributes` `updated`. version [_required_] int32 The `attributes` `version`. id string The `ArbitraryRuleResponseData` `id`. type [_required_] enum Arbitrary rule resource type. Allowed enum values: `arbitrary_rule` default: `arbitrary_rule` meta object The `ArbitraryRuleResponseArray` `meta`. total_count int64 The `meta` `total_count`. 
``` { "data": [ { "attributes": { "costs_to_allocate": [ { "condition": "like", "tag": "service", "value": "orgstore-csm*", "values": null } ], "created": "2024-11-20T03:44:37Z", "enabled": true, "last_modified_user_uuid": "user-example-uuid", "order_id": 1, "processing_status": "done", "provider": [ "gcp" ], "rule_name": "gcp-orgstore-csm-team-allocation", "strategy": { "allocated_by": [ { "allocated_tags": [ { "key": "team", "value": "csm-activation" } ], "percentage": 0.34 }, { "allocated_tags": [ { "key": "team", "value": "csm-agentless" } ], "percentage": 0.66 } ], "method": "percent" }, "type": "shared", "updated": "2025-09-02T21:28:32Z", "version": 1 }, "id": "19", "type": "arbitrary_rule" }, { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "env", "value": "staging", "values": null } ], "created": "2025-05-27T18:48:05Z", "enabled": true, "last_modified_user_uuid": "user-example-uuid-2", "order_id": 2, "processing_status": "done", "provider": [ "aws" ], "rule_name": "test-even-2", "strategy": { "allocated_by_tag_keys": [ "team" ], "based_on_costs": [ { "condition": "is", "tag": "aws_product", "value": "s3", "values": null } ], "granularity": "daily", "method": "even" }, "type": "shared", "updated": "2025-09-03T21:00:49Z", "version": 1 }, "id": "311", "type": "arbitrary_rule" }, { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "servicename", "value": "s3", "values": null } ], "created": "2025-03-21T20:42:40Z", "enabled": false, "last_modified_user_uuid": "user-example-uuid-3", "order_id": 3, "processing_status": "done", "provider": [ "aws" ], "rule_name": "test-s3-timeseries", "strategy": { "granularity": "daily", "method": "proportional_timeseries" }, "type": "shared", "updated": "2025-09-02T21:16:50Z", "version": 1 }, "id": "289", "type": "arbitrary_rule" }, { "attributes": { "costs_to_allocate": [ { "condition": "=", "tag": "aws_product", "value": "msk", "values": null }, { "condition": "is", "tag": "product", "value": "null", "values": null } ], "created": "2025-08-27T14:39:31Z", "enabled": true, "last_modified_user_uuid": "user-example-uuid-4", "order_id": 4, "processing_status": "done", "provider": [ "aws" ], "rule_name": "azure-unallocated-by-product-2", "strategy": { "allocated_by_tag_keys": [ "aws_product" ], "based_on_costs": [ { "condition": "=", "tag": "aws_product", "value": "msk", "values": null }, { "condition": "is not", "tag": "product", "value": "null", "values": null } ], "granularity": "daily", "method": "proportional" }, "type": "shared", "updated": "2025-09-02T21:28:32Z", "version": 1 }, "id": "523", "type": "arbitrary_rule" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript)

##### List custom allocation rules

```
# Curl command
# Replace api.datadoghq.com with your regional endpoint, for example api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X GET "https://api.datadoghq.com/api/v2/cost/arbitrary_rule" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### List custom allocation rules

```
"""
List custom allocation rules returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CloudCostManagementApi(api_client)
    response = api_instance.list_custom_allocation_rules()
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### List custom allocation rules

```
# List custom allocation rules returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new
p api_instance.list_custom_allocation_rules()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### List custom allocation rules

```
// List custom allocation rules returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewCloudCostManagementApi(apiClient)
	resp, r, err := api.ListCustomAllocationRules(ctx)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListCustomAllocationRules`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListCustomAllocationRules`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to
`main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List custom allocation rules ``` // List custom allocation rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.ArbitraryRuleResponseArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { ArbitraryRuleResponseArray result = apiInstance.listCustomAllocationRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listCustomAllocationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List custom allocation rules ``` // List custom allocation rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_custom_allocation_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List custom allocation rules ``` /** * List custom allocation rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listCustomAllocationRules() .then((data: v2.ArbitraryRuleResponseArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-custom-allocation-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-custom-allocation-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.datadoghq.eu/api/v2/cost/arbitrary_rulehttps://api.ddog-gov.com/api/v2/cost/arbitrary_rulehttps://api.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.us3.datadoghq.com/api/v2/cost/arbitrary_rulehttps://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule ### Overview Create a new custom allocation rule with the specified filters and allocation strategy. **Strategy Methods:** * **PROPORTIONAL/EVEN** : Allocates costs proportionally/evenly based on existing costs. Requires: granularity, allocated_by_tag_keys. Optional: based_on_costs, allocated_by_filters, evaluate_grouped_by_tag_keys, evaluate_grouped_by_filters. * **PROPORTIONAL_TIMESERIES/EVEN_TIMESERIES** : Allocates based on timeseries data. Requires: granularity, based_on_timeseries. Optional: evaluate_grouped_by_tag_keys. * **PERCENT** : Allocates fixed percentages to specific tags. Requires: allocated_by (array of percentage allocations). **Filter Conditions:** * Use **value** for single-value conditions: “is”, “is not”, “contains”, “does not contain”, “=”, “!=”, “like”, “not like”, “is all values”, “is untagged” * Use **values** for multi-value conditions: “in”, “not in” * Cannot use both value and values simultaneously. **Supported operators** : is, is not, is all values, is untagged, contains, does not contain, in, not in, =, !=, like, not like OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object The definition of `ArbitraryCostUpsertRequestData` object. attributes object The definition of `ArbitraryCostUpsertRequestDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. enabled boolean The `attributes` `enabled`. order_id int64 The `attributes` `order_id`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryCostUpsertRequestDataAttributesStrategy` object. allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. 
percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. type [_required_] string The `attributes` `type`. id string The `ArbitraryCostUpsertRequestData` `id`. type [_required_] enum Upsert arbitrary rule resource type. Allowed enum values: `upsert_arbitrary_rule` default: `upsert_arbitrary_rule` ``` { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789" }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "enabled": true, "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "example-arbitrary-cost-rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api" }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared" }, "type": "upsert_arbitrary_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCustomAllocationRule-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#CreateCustomAllocationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `ArbitraryRuleResponse` object. Field Type Description data object The definition of `ArbitraryRuleResponseData` object. attributes object The definition of `ArbitraryRuleResponseDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. created [_required_] date-time The `attributes` `created`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. order_id [_required_] int64 The `attributes` `order_id`. processing_status string The `attributes` `processing_status`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryRuleResponseDataAttributesStrategy` object. 
allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The rule `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. type [_required_] string The `attributes` `type`. updated [_required_] date-time The `attributes` `updated`. version [_required_] int32 The `attributes` `version`. id string The `ArbitraryRuleResponseData` `id`. type [_required_] enum Arbitrary rule resource type. Allowed enum values: `arbitrary_rule` default: `arbitrary_rule` ``` { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789", "values": null }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "created": "2023-01-01T12:00:00Z", "enabled": true, "last_modified_user_uuid": "user-123-uuid", "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "Example custom allocation rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api", "values": null }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared", "updated": "2023-01-01T12:00:00Z", "version": 1 }, "id": "123", "type": "arbitrary_rule" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create custom allocation rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789" }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "enabled": true, "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "example-arbitrary-cost-rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api" }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared" }, "type": "upsert_arbitrary_rule" } } EOF ``` ##### Create custom allocation rule returns "OK" response ``` // Create custom allocation rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ArbitraryCostUpsertRequest{ Data: &datadogV2.ArbitraryCostUpsertRequestData{ Attributes: &datadogV2.ArbitraryCostUpsertRequestDataAttributes{ CostsToAllocate: []datadogV2.ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems{ { Condition: "is", Tag: "account_id", Value: datadog.PtrString("123456789"), }, { Condition: "in", Tag: "environment", Value: datadog.PtrString(""), Values: *datadog.NewNullableList(&[]string{ "production", "staging", }), }, }, Enabled: datadog.PtrBool(true), OrderId: datadog.PtrInt64(1), Provider: []string{ "aws", "gcp", }, RuleName: "example-arbitrary-cost-rule", Strategy: datadogV2.ArbitraryCostUpsertRequestDataAttributesStrategy{ AllocatedByTagKeys: []string{ "team", "environment", }, BasedOnCosts: []datadogV2.ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems{ { Condition: "is", Tag: "service", Value: datadog.PtrString("web-api"), }, { Condition: "not in", Tag: "team", Value: datadog.PtrString(""), Values: *datadog.NewNullableList(&[]string{ "legacy", "deprecated", }), }, }, Granularity: datadog.PtrString("daily"), Method: "proportional", }, Type: "shared", }, Type: datadogV2.ARBITRARYCOSTUPSERTREQUESTDATATYPE_UPSERT_ARBITRARY_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.CreateCustomAllocationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.CreateCustomAllocationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.CreateCustomAllocationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create custom allocation rule returns "OK" response ``` // Create custom allocation rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequest; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestData; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributes; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesStrategy; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataType; import com.datadog.api.client.v2.model.ArbitraryRuleResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); ArbitraryCostUpsertRequest body = new ArbitraryCostUpsertRequest() .data( new ArbitraryCostUpsertRequestData() .attributes( new ArbitraryCostUpsertRequestDataAttributes() .costsToAllocate( Arrays.asList( new ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems() .condition("is") .tag("account_id") .value("123456789"), new ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems() .condition("in") .tag("environment") .value("") .values(Arrays.asList("production", "staging")))) .enabled(true) .orderId(1L) .provider(Arrays.asList("aws", "gcp")) .ruleName("example-arbitrary-cost-rule") .strategy( new ArbitraryCostUpsertRequestDataAttributesStrategy() .allocatedByTagKeys(Arrays.asList("team", "environment")) .basedOnCosts( Arrays.asList( new ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems() .condition("is") .tag("service") .value("web-api"), new ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems() .condition("not in") .tag("team") .value("") .values(Arrays.asList("legacy", "deprecated")))) .granularity("daily") .method("proportional")) .type("shared")) .type(ArbitraryCostUpsertRequestDataType.UPSERT_ARBITRARY_RULE)); try { ArbitraryRuleResponse result = apiInstance.createCustomAllocationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#createCustomAllocationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); 
e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create custom allocation rule returns "OK" response ``` """ Create custom allocation rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.arbitrary_cost_upsert_request import ArbitraryCostUpsertRequest from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data import ArbitraryCostUpsertRequestData from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes import ( ArbitraryCostUpsertRequestDataAttributes, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_costs_to_allocate_items import ( ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_strategy import ( ArbitraryCostUpsertRequestDataAttributesStrategy, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_strategy_based_on_costs_items import ( ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_type import ArbitraryCostUpsertRequestDataType body = ArbitraryCostUpsertRequest( data=ArbitraryCostUpsertRequestData( attributes=ArbitraryCostUpsertRequestDataAttributes( costs_to_allocate=[ ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems( condition="is", tag="account_id", value="123456789", ), ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems( condition="in", tag="environment", value="", values=[ "production", "staging", ], ), ], enabled=True, order_id=1, provider=[ "aws", "gcp", ], rule_name="example-arbitrary-cost-rule", strategy=ArbitraryCostUpsertRequestDataAttributesStrategy( allocated_by_tag_keys=[ "team", "environment", ], based_on_costs=[ ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems( condition="is", tag="service", value="web-api", ), ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems( condition="not in", tag="team", value="", values=[ "legacy", "deprecated", ], ), ], granularity="daily", method="proportional", ), type="shared", ), type=ArbitraryCostUpsertRequestDataType.UPSERT_ARBITRARY_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.create_custom_allocation_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create custom allocation rule returns "OK" response ``` # Create custom allocation rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::ArbitraryCostUpsertRequest.new({ data: 
DatadogAPIClient::V2::ArbitraryCostUpsertRequestData.new({ attributes: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributes.new({ costs_to_allocate: [ DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems.new({ condition: "is", tag: "account_id", value: "123456789", }), DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems.new({ condition: "in", tag: "environment", value: "", values: [ "production", "staging", ], }), ], enabled: true, order_id: 1, provider: [ "aws", "gcp", ], rule_name: "example-arbitrary-cost-rule", strategy: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategy.new({ allocated_by_tag_keys: [ "team", "environment", ], based_on_costs: [ DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems.new({ condition: "is", tag: "service", value: "web-api", }), DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems.new({ condition: "not in", tag: "team", value: "", values: [ "legacy", "deprecated", ], }), ], granularity: "daily", method: "proportional", }), type: "shared", }), type: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataType::UPSERT_ARBITRARY_RULE, }), }) p api_instance.create_custom_allocation_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create custom allocation rule returns "OK" response ``` // Create custom allocation rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequest; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestData; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributes; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesStrategy; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataType; #[tokio::main] async fn main() { let body = ArbitraryCostUpsertRequest::new().data( ArbitraryCostUpsertRequestData::new( ArbitraryCostUpsertRequestDataType::UPSERT_ARBITRARY_RULE, ) .attributes( ArbitraryCostUpsertRequestDataAttributes::new( vec![ ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems::new( "is".to_string(), "account_id".to_string(), ) .value("123456789".to_string()), ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems::new( "in".to_string(), "environment".to_string(), ) .value("".to_string()) .values(Some(vec!["production".to_string(), "staging".to_string()])), ], vec!["aws".to_string(), "gcp".to_string()], "example-arbitrary-cost-rule".to_string(), ArbitraryCostUpsertRequestDataAttributesStrategy::new("proportional".to_string()) .allocated_by_tag_keys(vec!["team".to_string(), "environment".to_string()]) .based_on_costs(vec![ ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems::new( "is".to_string(), "service".to_string(), ) .value("web-api".to_string()), 
ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems::new( "not in".to_string(), "team".to_string(), ) .value("".to_string()) .values(Some(vec!["legacy".to_string(), "deprecated".to_string()])), ]) .granularity("daily".to_string()), "shared".to_string(), ) .enabled(true) .order_id(1), ), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.create_custom_allocation_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create custom allocation rule returns "OK" response ``` /** * Create custom allocation rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiCreateCustomAllocationRuleRequest = { body: { data: { attributes: { costsToAllocate: [ { condition: "is", tag: "account_id", value: "123456789", }, { condition: "in", tag: "environment", value: "", values: ["production", "staging"], }, ], enabled: true, orderId: 1, provider: ["aws", "gcp"], ruleName: "example-arbitrary-cost-rule", strategy: { allocatedByTagKeys: ["team", "environment"], basedOnCosts: [ { condition: "is", tag: "service", value: "web-api", }, { condition: "not in", tag: "team", value: "", values: ["legacy", "deprecated"], }, ], granularity: "daily", method: "proportional", }, type: "shared", }, type: "upsert_arbitrary_rule", }, }, }; apiInstance .createCustomAllocationRule(params) .then((data: v2.ArbitraryRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-custom-allocation-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-custom-allocation-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.eu/api/v2/cost/arbitrary_rule/{rule_id}https://api.ddog-gov.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us3.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id} ### Overview Update an existing custom allocation rule with new filters and allocation strategy. **Strategy Methods:** * **PROPORTIONAL/EVEN** : Allocates costs proportionally/evenly based on existing costs. Requires: granularity, allocated_by_tag_keys. 
Optional: based_on_costs, allocated_by_filters, evaluate_grouped_by_tag_keys, evaluate_grouped_by_filters. * **PROPORTIONAL_TIMESERIES/EVEN_TIMESERIES** : Allocates based on timeseries data. Requires: granularity, based_on_timeseries. Optional: evaluate_grouped_by_tag_keys. * **PERCENT** : Allocates fixed percentages to specific tags. Requires: allocated_by (array of percentage allocations). * **USAGE_METRIC** : Allocates based on usage metrics (implementation varies). **Filter Conditions:** * Use **value** for single-value conditions: “is”, “is not”, “contains”, “does not contain”, “=”, “!=”, “like”, “not like”, “is all values”, “is untagged” * Use **values** for multi-value conditions: “in”, “not in” * Cannot use both value and values simultaneously. **Supported operators** : is, is not, is all values, is untagged, contains, does not contain, in, not in, =, !=, like, not like OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] integer The unique identifier of the custom allocation rule ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object The definition of `ArbitraryCostUpsertRequestData` object. attributes object The definition of `ArbitraryCostUpsertRequestDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. enabled boolean The `attributes` `enabled`. order_id int64 The `attributes` `order_id`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryCostUpsertRequestDataAttributesStrategy` object. allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. 
type [_required_] string The `attributes` `type`. id string The `ArbitraryCostUpsertRequestData` `id`. type [_required_] enum Upsert arbitrary rule resource type. Allowed enum values: `upsert_arbitrary_rule` default: `upsert_arbitrary_rule` ``` { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789", "values": [] }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "enabled": true, "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "example-arbitrary-cost-rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api", "values": [] }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared" }, "type": "upsert_arbitrary_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCustomAllocationRule-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpdateCustomAllocationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `ArbitraryRuleResponse` object. Field Type Description data object The definition of `ArbitraryRuleResponseData` object. attributes object The definition of `ArbitraryRuleResponseDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. created [_required_] date-time The `attributes` `created`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. order_id [_required_] int64 The `attributes` `order_id`. processing_status string The `attributes` `processing_status`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryRuleResponseDataAttributesStrategy` object. allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The rule `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. 
values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. type [_required_] string The `attributes` `type`. updated [_required_] date-time The `attributes` `updated`. version [_required_] int32 The `attributes` `version`. id string The `ArbitraryRuleResponseData` `id`. type [_required_] enum Arbitrary rule resource type. Allowed enum values: `arbitrary_rule` default: `arbitrary_rule` ``` { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789", "values": null }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "created": "2023-01-01T12:00:00Z", "enabled": true, "last_modified_user_uuid": "user-123-uuid", "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "Example custom allocation rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api", "values": null }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared", "updated": "2023-01-01T12:00:00Z", "version": 1 }, "id": "123", "type": "arbitrary_rule" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Update custom allocation rule returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789", "values": [] }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "enabled": true, "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "example-arbitrary-cost-rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api", "values": [] }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared" }, 
"type": "upsert_arbitrary_rule" } } EOF ``` ##### Update custom allocation rule returns "OK" response ``` // Update custom allocation rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ArbitraryCostUpsertRequest{ Data: &datadogV2.ArbitraryCostUpsertRequestData{ Attributes: &datadogV2.ArbitraryCostUpsertRequestDataAttributes{ CostsToAllocate: []datadogV2.ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems{ { Condition: "is", Tag: "account_id", Value: datadog.PtrString("123456789"), Values: *datadog.NewNullableList(&[]string{}), }, { Condition: "in", Tag: "environment", Value: datadog.PtrString(""), Values: *datadog.NewNullableList(&[]string{ "production", "staging", }), }, }, Enabled: datadog.PtrBool(true), OrderId: datadog.PtrInt64(1), Provider: []string{ "aws", "gcp", }, RuleName: "example-arbitrary-cost-rule", Strategy: datadogV2.ArbitraryCostUpsertRequestDataAttributesStrategy{ AllocatedByTagKeys: []string{ "team", "environment", }, BasedOnCosts: []datadogV2.ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems{ { Condition: "is", Tag: "service", Value: datadog.PtrString("web-api"), Values: *datadog.NewNullableList(&[]string{}), }, { Condition: "not in", Tag: "team", Value: datadog.PtrString(""), Values: *datadog.NewNullableList(&[]string{ "legacy", "deprecated", }), }, }, Granularity: datadog.PtrString("daily"), Method: "proportional", }, Type: "shared", }, Type: datadogV2.ARBITRARYCOSTUPSERTREQUESTDATATYPE_UPSERT_ARBITRARY_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpdateCustomAllocationRule(ctx, 683, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpdateCustomAllocationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpdateCustomAllocationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update custom allocation rule returns "OK" response ``` // Update custom allocation rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequest; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestData; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributes; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesStrategy; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems; import com.datadog.api.client.v2.model.ArbitraryCostUpsertRequestDataType; import 
com.datadog.api.client.v2.model.ArbitraryRuleResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); ArbitraryCostUpsertRequest body = new ArbitraryCostUpsertRequest() .data( new ArbitraryCostUpsertRequestData() .attributes( new ArbitraryCostUpsertRequestDataAttributes() .costsToAllocate( Arrays.asList( new ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems() .condition("is") .tag("account_id") .value("123456789"), new ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems() .condition("in") .tag("environment") .value("") .values(Arrays.asList("production", "staging")))) .enabled(true) .orderId(1L) .provider(Arrays.asList("aws", "gcp")) .ruleName("example-arbitrary-cost-rule") .strategy( new ArbitraryCostUpsertRequestDataAttributesStrategy() .allocatedByTagKeys(Arrays.asList("team", "environment")) .basedOnCosts( Arrays.asList( new ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems() .condition("is") .tag("service") .value("web-api"), new ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems() .condition("not in") .tag("team") .value("") .values(Arrays.asList("legacy", "deprecated")))) .granularity("daily") .method("proportional")) .type("shared")) .type(ArbitraryCostUpsertRequestDataType.UPSERT_ARBITRARY_RULE)); try { ArbitraryRuleResponse result = apiInstance.updateCustomAllocationRule(683L, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#updateCustomAllocationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update custom allocation rule returns "OK" response ``` """ Update custom allocation rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.arbitrary_cost_upsert_request import ArbitraryCostUpsertRequest from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data import ArbitraryCostUpsertRequestData from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes import ( ArbitraryCostUpsertRequestDataAttributes, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_costs_to_allocate_items import ( ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_strategy import ( ArbitraryCostUpsertRequestDataAttributesStrategy, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_attributes_strategy_based_on_costs_items import ( ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems, ) from datadog_api_client.v2.model.arbitrary_cost_upsert_request_data_type import ArbitraryCostUpsertRequestDataType body = ArbitraryCostUpsertRequest( 
data=ArbitraryCostUpsertRequestData( attributes=ArbitraryCostUpsertRequestDataAttributes( costs_to_allocate=[ ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems( condition="is", tag="account_id", value="123456789", values=[], ), ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems( condition="in", tag="environment", value="", values=[ "production", "staging", ], ), ], enabled=True, order_id=1, provider=[ "aws", "gcp", ], rule_name="example-arbitrary-cost-rule", strategy=ArbitraryCostUpsertRequestDataAttributesStrategy( allocated_by_tag_keys=[ "team", "environment", ], based_on_costs=[ ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems( condition="is", tag="service", value="web-api", values=[], ), ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems( condition="not in", tag="team", value="", values=[ "legacy", "deprecated", ], ), ], granularity="daily", method="proportional", ), type="shared", ), type=ArbitraryCostUpsertRequestDataType.UPSERT_ARBITRARY_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.update_custom_allocation_rule(rule_id=683, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update custom allocation rule returns "OK" response ``` # Update custom allocation rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::ArbitraryCostUpsertRequest.new({ data: DatadogAPIClient::V2::ArbitraryCostUpsertRequestData.new({ attributes: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributes.new({ costs_to_allocate: [ DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems.new({ condition: "is", tag: "account_id", value: "123456789", values: [], }), DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems.new({ condition: "in", tag: "environment", value: "", values: [ "production", "staging", ], }), ], enabled: true, order_id: 1, provider: [ "aws", "gcp", ], rule_name: "example-arbitrary-cost-rule", strategy: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategy.new({ allocated_by_tag_keys: [ "team", "environment", ], based_on_costs: [ DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems.new({ condition: "is", tag: "service", value: "web-api", values: [], }), DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems.new({ condition: "not in", tag: "team", value: "", values: [ "legacy", "deprecated", ], }), ], granularity: "daily", method: "proportional", }), type: "shared", }), type: DatadogAPIClient::V2::ArbitraryCostUpsertRequestDataType::UPSERT_ARBITRARY_RULE, }), }) p api_instance.update_custom_allocation_rule(683, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" rb "example.rb" ``` ##### Update custom allocation rule returns "OK" response ``` // Update custom allocation rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequest; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestData; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributes; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesStrategy; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems; use datadog_api_client::datadogV2::model::ArbitraryCostUpsertRequestDataType; #[tokio::main] async fn main() { let body = ArbitraryCostUpsertRequest::new().data( ArbitraryCostUpsertRequestData::new( ArbitraryCostUpsertRequestDataType::UPSERT_ARBITRARY_RULE, ) .attributes( ArbitraryCostUpsertRequestDataAttributes::new( vec![ ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems::new( "is".to_string(), "account_id".to_string(), ) .value("123456789".to_string()) .values(Some(vec![])), ArbitraryCostUpsertRequestDataAttributesCostsToAllocateItems::new( "in".to_string(), "environment".to_string(), ) .value("".to_string()) .values(Some(vec!["production".to_string(), "staging".to_string()])), ], vec!["aws".to_string(), "gcp".to_string()], "example-arbitrary-cost-rule".to_string(), ArbitraryCostUpsertRequestDataAttributesStrategy::new("proportional".to_string()) .allocated_by_tag_keys(vec!["team".to_string(), "environment".to_string()]) .based_on_costs(vec![ ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems::new( "is".to_string(), "service".to_string(), ) .value("web-api".to_string()) .values(Some(vec![])), ArbitraryCostUpsertRequestDataAttributesStrategyBasedOnCostsItems::new( "not in".to_string(), "team".to_string(), ) .value("".to_string()) .values(Some(vec!["legacy".to_string(), "deprecated".to_string()])), ]) .granularity("daily".to_string()), "shared".to_string(), ) .enabled(true) .order_id(1), ), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.update_custom_allocation_rule(683, body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update custom allocation rule returns "OK" response ``` /** * Update custom allocation rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpdateCustomAllocationRuleRequest = { body: { data: { attributes: { costsToAllocate: [ { condition: "is", tag: "account_id", value: "123456789", values: [], }, { condition: "in", tag: "environment", value: "", values: ["production", "staging"], }, ], enabled: true, orderId: 1, provider: ["aws", "gcp"], ruleName: 
"example-arbitrary-cost-rule", strategy: { allocatedByTagKeys: ["team", "environment"], basedOnCosts: [ { condition: "is", tag: "service", value: "web-api", values: [], }, { condition: "not in", tag: "team", value: "", values: ["legacy", "deprecated"], }, ], granularity: "daily", method: "proportional", }, type: "shared", }, type: "upsert_arbitrary_rule", }, }, ruleId: 683, }; apiInstance .updateCustomAllocationRule(params) .then((data: v2.ArbitraryRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-allocation-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-allocation-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.eu/api/v2/cost/arbitrary_rule/{rule_id}https://api.ddog-gov.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us3.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id} ### Overview Delete a custom allocation rule - Delete an existing custom allocation rule by its ID OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] integer The unique identifier of the custom allocation rule ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomAllocationRule-204-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomAllocationRule-429-v2) No Content Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete custom allocation rule Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v2/cost/arbitrary_rule/${rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete custom allocation rule ``` """ Delete custom allocation rule returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_custom_allocation_rule( rule_id=683, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete custom allocation rule ``` # Delete custom allocation rule returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_custom_allocation_rule(683) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete custom allocation rule ``` // Delete custom allocation rule returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteCustomAllocationRule(ctx, 683) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteCustomAllocationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete custom allocation rule ``` // Delete custom allocation rule returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteCustomAllocationRule(683L); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#deleteCustomAllocationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete custom allocation rule ``` // Delete custom allocation rule returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.delete_custom_allocation_rule(683).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete custom allocation rule ``` /** * Delete custom allocation rule returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteCustomAllocationRuleRequest = { ruleId: 683, }; apiInstance .deleteCustomAllocationRule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-allocation-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-allocation-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.eu/api/v2/cost/arbitrary_rule/{rule_id}https://api.ddog-gov.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us3.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id}https://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule/{rule_id} ### Overview Get a specific custom allocation rule - Retrieve a specific custom allocation rule by its ID OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] integer The unique identifier of the custom allocation rule ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomAllocationRule-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomAllocationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of `ArbitraryRuleResponse` object. Field Type Description data object The definition of `ArbitraryRuleResponseData` object. attributes object The definition of `ArbitraryRuleResponseDataAttributes` object. costs_to_allocate [_required_] [object] The `attributes` `costs_to_allocate`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. created [_required_] date-time The `attributes` `created`. enabled [_required_] boolean The `attributes` `enabled`. last_modified_user_uuid [_required_] string The `attributes` `last_modified_user_uuid`. order_id [_required_] int64 The `attributes` `order_id`. processing_status string The `attributes` `processing_status`. provider [_required_] [string] The `attributes` `provider`. rejected boolean The `attributes` `rejected`. rule_name [_required_] string The `attributes` `rule_name`. strategy [_required_] object The definition of `ArbitraryRuleResponseDataAttributesStrategy` object. allocated_by [object] The `strategy` `allocated_by`. allocated_tags [_required_] [object] The `items` `allocated_tags`. key [_required_] string The `items` `key`. value [_required_] string The `items` `value`. percentage [_required_] double The `items` `percentage`. The numeric value format should be a 32bit float value. allocated_by_filters [object] The `strategy` `allocated_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. 
allocated_by_tag_keys [string] The `strategy` `allocated_by_tag_keys`. based_on_costs [object] The `strategy` `based_on_costs`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. based_on_timeseries object The rule `strategy` `based_on_timeseries`. evaluate_grouped_by_filters [object] The `strategy` `evaluate_grouped_by_filters`. condition [_required_] string The `items` `condition`. tag [_required_] string The `items` `tag`. value string The `items` `value`. values [string] The `items` `values`. evaluate_grouped_by_tag_keys [string] The `strategy` `evaluate_grouped_by_tag_keys`. granularity string The `strategy` `granularity`. method [_required_] string The `strategy` `method`. type [_required_] string The `attributes` `type`. updated [_required_] date-time The `attributes` `updated`. version [_required_] int32 The `attributes` `version`. id string The `ArbitraryRuleResponseData` `id`. type [_required_] enum Arbitrary rule resource type. Allowed enum values: `arbitrary_rule` default: `arbitrary_rule` ``` { "data": { "attributes": { "costs_to_allocate": [ { "condition": "is", "tag": "account_id", "value": "123456789", "values": null }, { "condition": "in", "tag": "environment", "value": "", "values": [ "production", "staging" ] } ], "created": "2023-01-01T12:00:00Z", "enabled": true, "last_modified_user_uuid": "user-123-uuid", "order_id": 1, "provider": [ "aws", "gcp" ], "rule_name": "Example custom allocation rule", "strategy": { "allocated_by_tag_keys": [ "team", "environment" ], "based_on_costs": [ { "condition": "is", "tag": "service", "value": "web-api", "values": null }, { "condition": "not in", "tag": "team", "value": "", "values": [ "legacy", "deprecated" ] } ], "granularity": "daily", "method": "proportional" }, "type": "shared", "updated": "2023-01-01T12:00:00Z", "version": 1 }, "id": "123", "type": "arbitrary_rule" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get custom allocation rule Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/cost/arbitrary_rule/${rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get custom allocation rule ``` """ Get custom allocation rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_custom_allocation_rule( rule_id=683, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get custom allocation rule ``` # Get custom allocation rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_custom_allocation_rule(683) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get custom allocation rule ``` // Get custom allocation rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetCustomAllocationRule(ctx, 683) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetCustomAllocationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetCustomAllocationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get custom allocation rule ``` // Get custom allocation rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.ArbitraryRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { ArbitraryRuleResponse result = apiInstance.getCustomAllocationRule(683L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getCustomAllocationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get custom allocation rule ``` // Get custom allocation rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_custom_allocation_rule(683).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get custom allocation rule ``` /** * Get custom allocation rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetCustomAllocationRuleRequest = { ruleId: 683, }; apiInstance .getCustomAllocationRule(params) .then((data: v2.ArbitraryRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Reorder custom allocation rules](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-custom-allocation-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-custom-allocation-rules-v2) POST https://api.ap1.datadoghq.com/api/v2/cost/arbitrary_rule/reorderhttps://api.ap2.datadoghq.com/api/v2/cost/arbitrary_rule/reorderhttps://api.datadoghq.eu/api/v2/cost/arbitrary_rule/reorderhttps://api.ddog-gov.com/api/v2/cost/arbitrary_rule/reorderhttps://api.datadoghq.com/api/v2/cost/arbitrary_rule/reorderhttps://api.us3.datadoghq.com/api/v2/cost/arbitrary_rule/reorderhttps://api.us5.datadoghq.com/api/v2/cost/arbitrary_rule/reorder ### Overview Reorder custom allocation rules - Change the execution order of custom allocation rules. **Important** : You must provide the **complete list** of all rule IDs in the desired execution order. The API will reorder ALL rules according to the provided sequence. Rules are executed in the order specified, with lower indices (earlier in the array) having higher priority. **Example** : If you have rules with IDs [123, 456, 789] and want to change order from 123→456→789 to 456→123→789, send: [{“id”: “456”}, {“id”: “123”}, {“id”: “789”}] OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data [_required_] [object] The `ReorderRuleResourceArray` `data`. id string The `ReorderRuleResourceData` `id`. type [_required_] enum Arbitrary rule resource type. Allowed enum values: `arbitrary_rule` default: `arbitrary_rule` ``` { "data": [ { "id": "string", "type": "arbitrary_rule" } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ReorderCustomAllocationRules-204-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ReorderCustomAllocationRules-429-v2) Successfully reordered rules Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Reorder custom allocation rules Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/cost/arbitrary_rule/reorder" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "type": "arbitrary_rule" } ] } EOF ``` ##### Reorder custom allocation rules ``` """ Reorder custom allocation rules returns "Successfully reordered rules" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.reorder_rule_resource_array import ReorderRuleResourceArray from datadog_api_client.v2.model.reorder_rule_resource_data import ReorderRuleResourceData from datadog_api_client.v2.model.reorder_rule_resource_data_type import ReorderRuleResourceDataType body = ReorderRuleResourceArray( data=[ ReorderRuleResourceData( id="456", type=ReorderRuleResourceDataType.ARBITRARY_RULE, ), ReorderRuleResourceData( id="123", type=ReorderRuleResourceDataType.ARBITRARY_RULE, ), ReorderRuleResourceData( id="789", type=ReorderRuleResourceDataType.ARBITRARY_RULE, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.reorder_custom_allocation_rules(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Reorder custom allocation rules ``` # Reorder custom allocation rules returns "Successfully reordered rules" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::ReorderRuleResourceArray.new({ data: [ DatadogAPIClient::V2::ReorderRuleResourceData.new({ id: "456", type: DatadogAPIClient::V2::ReorderRuleResourceDataType::ARBITRARY_RULE, }), DatadogAPIClient::V2::ReorderRuleResourceData.new({ id: "123", type: DatadogAPIClient::V2::ReorderRuleResourceDataType::ARBITRARY_RULE, }), DatadogAPIClient::V2::ReorderRuleResourceData.new({ id: "789", type: DatadogAPIClient::V2::ReorderRuleResourceDataType::ARBITRARY_RULE, }), ], }) api_instance.reorder_custom_allocation_rules(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Reorder custom allocation rules ``` // Reorder custom allocation rules returns "Successfully reordered rules" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ReorderRuleResourceArray{ Data: []datadogV2.ReorderRuleResourceData{ { Id: datadog.PtrString("456"), Type: datadogV2.REORDERRULERESOURCEDATATYPE_ARBITRARY_RULE, }, { Id: datadog.PtrString("123"), Type: datadogV2.REORDERRULERESOURCEDATATYPE_ARBITRARY_RULE, }, { Id: datadog.PtrString("789"), Type: datadogV2.REORDERRULERESOURCEDATATYPE_ARBITRARY_RULE, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.ReorderCustomAllocationRules(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ReorderCustomAllocationRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Reorder custom allocation rules ``` // Reorder custom allocation rules returns "Successfully reordered rules" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.ReorderRuleResourceArray; import com.datadog.api.client.v2.model.ReorderRuleResourceData; import com.datadog.api.client.v2.model.ReorderRuleResourceDataType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); ReorderRuleResourceArray body = new ReorderRuleResourceArray() .data( Arrays.asList( new ReorderRuleResourceData() .id("456") .type(ReorderRuleResourceDataType.ARBITRARY_RULE), new ReorderRuleResourceData() .id("123") .type(ReorderRuleResourceDataType.ARBITRARY_RULE), new ReorderRuleResourceData() .id("789") .type(ReorderRuleResourceDataType.ARBITRARY_RULE))); try { apiInstance.reorderCustomAllocationRules(body); } catch (ApiException e) { System.err.println( "Exception when calling CloudCostManagementApi#reorderCustomAllocationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Reorder custom allocation rules ``` // Reorder custom allocation rules returns 
"Successfully reordered rules" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::ReorderRuleResourceArray; use datadog_api_client::datadogV2::model::ReorderRuleResourceData; use datadog_api_client::datadogV2::model::ReorderRuleResourceDataType; #[tokio::main] async fn main() { let body = ReorderRuleResourceArray::new(vec![ ReorderRuleResourceData::new(ReorderRuleResourceDataType::ARBITRARY_RULE) .id("456".to_string()), ReorderRuleResourceData::new(ReorderRuleResourceDataType::ARBITRARY_RULE) .id("123".to_string()), ReorderRuleResourceData::new(ReorderRuleResourceDataType::ARBITRARY_RULE) .id("789".to_string()), ]); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.reorder_custom_allocation_rules(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Reorder custom allocation rules ``` /** * Reorder custom allocation rules returns "Successfully reordered rules" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiReorderCustomAllocationRulesRequest = { body: { data: [ { id: "456", type: "arbitrary_rule", }, { id: "123", type: "arbitrary_rule", }, { id: "789", type: "arbitrary_rule", }, ], }, }; apiInstance .reorderCustomAllocationRules(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Custom Costs files](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-costs-files) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-costs-files-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/custom_costshttps://api.ap2.datadoghq.com/api/v2/cost/custom_costshttps://api.datadoghq.eu/api/v2/cost/custom_costshttps://api.ddog-gov.com/api/v2/cost/custom_costshttps://api.datadoghq.com/api/v2/cost/custom_costshttps://api.us3.datadoghq.com/api/v2/cost/custom_costshttps://api.us5.datadoghq.com/api/v2/cost/custom_costs ### Overview List the Custom Costs files. This endpoint requires the `cloud_cost_management_read` permission. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. 
### Arguments

#### Query Strings

Name Type Description
page[number] integer Page number for pagination.
page[size] integer Page size for pagination.
filter[status] string Filter by file status.
sort string Sort key, with an optional descending prefix.

### Response

* [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomCostsFiles-200-v2)
* [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomCostsFiles-400-v2)
* [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomCostsFiles-403-v2)
* [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListCustomCostsFiles-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/)
* [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/)

Response for List Custom Costs files.

Field Type Description
data [object] List of Custom Costs files.
  attributes object Schema of a Custom Costs file's metadata.
    billed_cost double Total cost in the cost file.
    billing_currency string Currency used in the Custom Costs file.
    charge_period object Usage charge period of a Custom Costs file.
      end double End of the usage of the Custom Costs file.
      start double Start of the usage of the Custom Costs file.
    name string Name of the Custom Costs file.
    provider_names [string] Providers contained in the Custom Costs file.
    status string Status of the Custom Costs file.
    uploaded_at double Timestamp, in milliseconds, of the upload time of the Custom Costs file.
    uploaded_by object Metadata of the user that has uploaded the Custom Costs file.
      email string Email of the user that uploaded the Custom Costs file.
      icon string Icon of the user that uploaded the Custom Costs file.
      name string Name of the user.
  id string ID of the Custom Costs metadata.
  type string Type of the Custom Costs file metadata.
meta object Meta for the response from the List Custom Costs endpoints.
  total_filtered_count int64 Number of Custom Costs files returned by the List Custom Costs endpoint.
  version string Version of the Custom Costs file.

```
{
  "data": [
    {
      "attributes": {
        "billed_cost": 100.5,
        "billing_currency": "USD",
        "charge_period": {
          "end": 1706745600000,
          "start": 1704067200000
        },
        "name": "my_file.json",
        "provider_names": [
          "my_provider"
        ],
        "status": "active",
        "uploaded_at": 1704067200000,
        "uploaded_by": {
          "email": "email.test@datadohq.com",
          "icon": "icon.png",
          "name": "Test User"
        }
      },
      "id": "string",
      "type": "string"
    }
  ],
  "meta": {
    "total_filtered_count": "integer",
    "version": "string"
  }
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/)
* [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/)
* [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/)
* [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```
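The generated examples below call the endpoint without any of the query strings documented above. They can be added as ordinary URL query parameters; a minimal curl sketch, with illustrative values (a page of 50 files restricted to status `active`):

```
# Illustrative values; a sort key with an optional descending prefix can be added the same way.
curl -G "https://api.datadoghq.com/api/v2/cost/custom_costs" \
  --data-urlencode "page[number]=0" \
  --data-urlencode "page[size]=50" \
  --data-urlencode "filter[status]=active" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The client libraries expose the same parameters as optional arguments, for example `NewListCustomCostsFilesOptionalParameters()` in the Go example and `ListCustomCostsFilesOptionalParams` in the Rust example below.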
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript)

##### List Custom Costs files

```
# Curl command (use the API host for your Datadog site, listed above)
curl -X GET "https://api.datadoghq.com/api/v2/cost/custom_costs" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### List Custom Costs files

```
"""
List Custom Costs files returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CloudCostManagementApi(api_client)
    response = api_instance.list_custom_costs_files()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### List Custom Costs files

```
# List Custom Costs files returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new
p api_instance.list_custom_costs_files()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### List Custom Costs files

```
// List Custom Costs files returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewCloudCostManagementApi(apiClient)
    resp, r, err := api.ListCustomCostsFiles(ctx, *datadogV2.NewListCustomCostsFilesOptionalParameters())

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListCustomCostsFiles`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListCustomCostsFiles`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Custom Costs files ``` // List Custom Costs files returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.CustomCostsFileListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { CustomCostsFileListResponse result = apiInstance.listCustomCostsFiles(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listCustomCostsFiles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Custom Costs files ``` // List Custom Costs files returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::api_cloud_cost_management::ListCustomCostsFilesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api .list_custom_costs_files(ListCustomCostsFilesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Custom Costs files ``` /** * List Custom Costs files returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listCustomCostsFiles() .then((data: v2.CustomCostsFileListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Upload Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#upload-custom-costs-file) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#upload-custom-costs-file-v2) PUT https://api.ap1.datadoghq.com/api/v2/cost/custom_costshttps://api.ap2.datadoghq.com/api/v2/cost/custom_costshttps://api.datadoghq.eu/api/v2/cost/custom_costshttps://api.ddog-gov.com/api/v2/cost/custom_costshttps://api.datadoghq.com/api/v2/cost/custom_costshttps://api.us3.datadoghq.com/api/v2/cost/custom_costshttps://api.us5.datadoghq.com/api/v2/cost/custom_costs ### Overview Upload a Custom Costs file. This endpoint requires the `cloud_cost_management_write` permission. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description BilledCost double Total cost in the cost file. BillingCurrency string Currency used in the Custom Costs file. ChargeDescription string Description for the line item cost. ChargePeriodEnd string End date of the usage charge. ChargePeriodStart string Start date of the usage charge. ProviderName string Name of the provider for the line item. Tags object Additional tags for the line item. string ``` [ { "ProviderName": "my_provider", "ChargePeriodStart": "2023-05-06", "ChargePeriodEnd": "2023-06-06", "ChargeDescription": "my_description", "BilledCost": 250, "BillingCurrency": "USD", "Tags": { "key": "value" } } ] ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UploadCustomCostsFile-202-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UploadCustomCostsFile-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UploadCustomCostsFile-403-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UploadCustomCostsFile-429-v2) Accepted * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response for Uploaded Custom Costs files. Field Type Description data object JSON API format for a Custom Costs file. attributes object Schema of a Custom Costs metadata. billed_cost double Total cost in the cost file. billing_currency string Currency used in the Custom Costs file. charge_period object Usage charge period of a Custom Costs file. end double End of the usage of the Custom Costs file. start double Start of the usage of the Custom Costs file. name string Name of the Custom Costs file. provider_names [string] Providers contained in the Custom Costs file. status string Status of the Custom Costs file. uploaded_at double Timestamp, in millisecond, of the upload time of the Custom Costs file. uploaded_by object Metadata of the user that has uploaded the Custom Costs file. 
email string The name of the Custom Costs file. icon string The name of the Custom Costs file. name string Name of the user. id string ID of the Custom Costs metadata. type string Type of the Custom Costs file metadata. meta object Meta for the response from the Upload Custom Costs endpoints. version string Version of Custom Costs file ``` { "data": { "attributes": { "billed_cost": 100.5, "billing_currency": "USD", "charge_period": { "end": 1706745600000, "start": 1704067200000 }, "name": "my_file.json", "provider_names": [ "my_provider" ], "status": "active", "uploaded_at": 1704067200000, "uploaded_by": { "email": "email.test@datadohq.com", "icon": "icon.png", "name": "Test User" } }, "id": "string", "type": "string" }, "meta": { "version": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Upload Custom Costs File returns "Accepted" response Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/custom_costs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF [ { "ProviderName": "my_provider", "ChargePeriodStart": "2023-05-06", "ChargePeriodEnd": "2023-06-06", "ChargeDescription": "my_description", "BilledCost": 250, "BillingCurrency": "USD", "Tags": { "key": "value" } } ] EOF ``` ##### Upload Custom Costs File returns "Accepted" response ``` // Upload Custom Costs File returns "Accepted" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := []datadogV2.CustomCostsFileLineItem{ { ProviderName: datadog.PtrString("my_provider"), ChargePeriodStart: datadog.PtrString("2023-05-06"), ChargePeriodEnd: datadog.PtrString("2023-06-06"), ChargeDescription: 
datadog.PtrString("my_description"), BilledCost: datadog.PtrFloat64(250), BillingCurrency: datadog.PtrString("USD"), Tags: map[string]string{ "key": "value", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UploadCustomCostsFile(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UploadCustomCostsFile`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UploadCustomCostsFile`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Upload Custom Costs File returns "Accepted" response ``` // Upload Custom Costs File returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.CustomCostsFileLineItem; import com.datadog.api.client.v2.model.CustomCostsFileUploadResponse; import java.util.Collections; import java.util.List; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); List body = Collections.singletonList( new CustomCostsFileLineItem() .providerName("my_provider") .chargePeriodStart("2023-05-06") .chargePeriodEnd("2023-06-06") .chargeDescription("my_description") .billedCost(250.0) .billingCurrency("USD") .tags(Map.ofEntries(Map.entry("key", "value")))); try { CustomCostsFileUploadResponse result = apiInstance.uploadCustomCostsFile(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#uploadCustomCostsFile"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Upload Custom Costs File returns "Accepted" response ``` """ Upload Custom Costs File returns "Accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.custom_costs_file_line_item import CustomCostsFileLineItem body = [ CustomCostsFileLineItem( provider_name="my_provider", charge_period_start="2023-05-06", charge_period_end="2023-06-06", charge_description="my_description", billed_cost=250.0, billing_currency="USD", tags=dict( key="value", ), ), ] configuration = Configuration() with 
ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.upload_custom_costs_file(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Upload Custom Costs File returns "Accepted" response ``` # Upload Custom Costs File returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = [ DatadogAPIClient::V2::CustomCostsFileLineItem.new({ provider_name: "my_provider", charge_period_start: "2023-05-06", charge_period_end: "2023-06-06", charge_description: "my_description", billed_cost: 250, billing_currency: "USD", tags: { key: "value", }, }), ] p api_instance.upload_custom_costs_file(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Upload Custom Costs File returns "Accepted" response ``` // Upload Custom Costs File returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::CustomCostsFileLineItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![CustomCostsFileLineItem::new() .billed_cost(250.0 as f64) .billing_currency("USD".to_string()) .charge_description("my_description".to_string()) .charge_period_end("2023-06-06".to_string()) .charge_period_start("2023-05-06".to_string()) .provider_name("my_provider".to_string()) .tags(BTreeMap::from([("key".to_string(), "value".to_string())]))]; let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.upload_custom_costs_file(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Upload Custom Costs File returns "Accepted" response ``` /** * Upload Custom Costs File returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUploadCustomCostsFileRequest = { body: [ { providerName: "my_provider", chargePeriodStart: "2023-05-06", chargePeriodEnd: "2023-06-06", chargeDescription: "my_description", billedCost: 250, billingCurrency: "USD", tags: { key: "value", }, }, ], }; apiInstance .uploadCustomCostsFile(params) .then((data: v2.CustomCostsFileUploadResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-costs-file) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-costs-file-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.ap2.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.datadoghq.eu/api/v2/cost/custom_costs/{file_id}https://api.ddog-gov.com/api/v2/cost/custom_costs/{file_id}https://api.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.us3.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.us5.datadoghq.com/api/v2/cost/custom_costs/{file_id} ### Overview Delete the specified Custom Costs file. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description file_id [_required_] string File ID. ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomCostsFile-204-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomCostsFile-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomCostsFile-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteCustomCostsFile-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete Custom Costs file Copy ``` # Path parameters export file_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/custom_costs/${file_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Custom Costs file ``` """ Delete Custom Costs file returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_custom_costs_file( file_id="file_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Custom Costs file ``` # Delete Custom Costs file returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_custom_costs_file("file_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Custom Costs file ``` // Delete Custom Costs file returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteCustomCostsFile(ctx, "file_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteCustomCostsFile`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" 
go run "main.go" ``` ##### Delete Custom Costs file ``` // Delete Custom Costs file returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteCustomCostsFile("file_id"); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#deleteCustomCostsFile"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Custom Costs file ``` // Delete Custom Costs file returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.delete_custom_costs_file("file_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Custom Costs file ``` /** * Delete Custom Costs file returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteCustomCostsFileRequest = { fileId: "file_id", }; apiInstance .deleteCustomCostsFile(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-costs-file) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-costs-file-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.ap2.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.datadoghq.eu/api/v2/cost/custom_costs/{file_id}https://api.ddog-gov.com/api/v2/cost/custom_costs/{file_id}https://api.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.us3.datadoghq.com/api/v2/cost/custom_costs/{file_id}https://api.us5.datadoghq.com/api/v2/cost/custom_costs/{file_id} ### Overview Fetch the specified Custom Costs file. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description file_id [_required_] string File ID. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomCostsFile-200-v2) * [403](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomCostsFile-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomCostsFile-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetCustomCostsFile-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Response for Get Custom Costs files. Field Type Description data object JSON API format of for a Custom Costs file with content. attributes object Schema of a cost file's metadata. billed_cost double Total cost in the cost file. billing_currency string Currency used in the Custom Costs file. charge_period object Usage charge period of a Custom Costs file. end double End of the usage of the Custom Costs file. start double Start of the usage of the Custom Costs file. content [object] Detail of the line items from the Custom Costs file. BilledCost double Total cost in the cost file. BillingCurrency string Currency used in the Custom Costs file. ChargeDescription string Description for the line item cost. ChargePeriodEnd string End date of the usage charge. ChargePeriodStart string Start date of the usage charge. ProviderName string Name of the provider for the line item. Tags object Additional tags for the line item. string name string Name of the Custom Costs file. provider_names [string] Providers contained in the Custom Costs file. status string Status of the Custom Costs file. uploaded_at double Timestamp in millisecond of the upload time of the Custom Costs file. uploaded_by object Metadata of the user that has uploaded the Custom Costs file. email string The name of the Custom Costs file. icon string The name of the Custom Costs file. name string Name of the user. id string ID of the Custom Costs metadata. type string Type of the Custom Costs file metadata. meta object Meta for the response from the Get Custom Costs endpoints. 
version string Version of Custom Costs file ``` { "data": { "attributes": { "billed_cost": 100.5, "billing_currency": "USD", "charge_period": { "end": 1706745600000, "start": 1704067200000 }, "content": [ { "BilledCost": 100.5, "BillingCurrency": "USD", "ChargeDescription": "Monthly usage charge for my service", "ChargePeriodEnd": "2023-02-28", "ChargePeriodStart": "2023-02-01", "ProviderName": "string", "Tags": { "": "string" } } ], "name": "my_file.json", "provider_names": [ "my_provider" ], "status": "active", "uploaded_at": 1704067200000, "uploaded_by": { "email": "email.test@datadohq.com", "icon": "icon.png", "name": "Test User" } }, "id": "string", "type": "string" }, "meta": { "version": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get Custom Costs file Copy ``` # Path parameters export file_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/custom_costs/${file_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Custom Costs file ``` """ Get Custom Costs file returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_custom_costs_file( file_id="file_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Custom 
Costs file ``` # Get Custom Costs file returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_custom_costs_file("file_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Custom Costs file ``` // Get Custom Costs file returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetCustomCostsFile(ctx, "file_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetCustomCostsFile`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetCustomCostsFile`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Custom Costs file ``` // Get Custom Costs file returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.CustomCostsFileGetResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { CustomCostsFileGetResponse result = apiInstance.getCustomCostsFile("file_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getCustomCostsFile"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Custom Costs file ``` // Get Custom Costs file returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_custom_costs_file("file_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", 
value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Custom Costs file ``` /** * Get Custom Costs file returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetCustomCostsFileRequest = { fileId: "file_id", }; apiInstance .getCustomCostsFile(params) .then((data: v2.CustomCostsFileGetResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List budgets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-budgets) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-budgets-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/budgetshttps://api.ap2.datadoghq.com/api/v2/cost/budgetshttps://api.datadoghq.eu/api/v2/cost/budgetshttps://api.ddog-gov.com/api/v2/cost/budgetshttps://api.datadoghq.com/api/v2/cost/budgetshttps://api.us3.datadoghq.com/api/v2/cost/budgetshttps://api.us5.datadoghq.com/api/v2/cost/budgets ### Overview List budgets. OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListBudgets-200-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#ListBudgets-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) An array of budgets. Field Type Description data [object] The `BudgetArray` `data`. attributes object The attributes of a budget. created_at int64 The timestamp when the budget was created. created_by string The id of the user that created the budget. end_month int64 The month when the budget ends. entries [object] The entries of the budget. amount double The `amount` of the budget entry. month int64 The `month` of the budget entry. tag_filters [object] The `tag_filters` of the budget entry. tag_key string The key of the tag. tag_value string The value of the tag. metrics_query string The cost query used to track against the budget. name string The name of the budget. org_id int64 The id of the org the budget belongs to. start_month int64 The month when the budget starts. total_amount double The sum of all budget entries' amounts. updated_at int64 The timestamp when the budget was last updated. updated_by string The id of the user that created the budget. id string The id of the budget. type string The type of the object, must be `budget`. 
``` { "data": [ { "attributes": { "created_at": 1741011342772, "created_by": "user1", "end_month": 202502, "metrics_query": "aws.cost.amortized{service:ec2} by {service}", "name": "my budget", "org_id": 123, "start_month": 202501, "total_amount": 1000, "updated_at": 1741011342772, "updated_by": "user2" }, "id": "00000000-0a0a-0a0a-aaa0-00000000000a", "type": "budget" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### List budgets Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/budgets" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List budgets ``` """ List budgets returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.list_budgets() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List budgets ``` # List budgets returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.list_budgets() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List budgets ``` // List budgets returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.ListBudgets(ctx) if err != nil { 
fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.ListBudgets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.ListBudgets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List budgets ``` // List budgets returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.BudgetArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { BudgetArray result = apiInstance.listBudgets(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#listBudgets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List budgets ``` // List budgets returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.list_budgets().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List budgets ``` /** * List budgets returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); apiInstance .listBudgets() .then((data: v2.BudgetArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create or update a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-or-update-a-budget) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-or-update-a-budget-v2) PUT https://api.ap1.datadoghq.com/api/v2/cost/budgethttps://api.ap2.datadoghq.com/api/v2/cost/budgethttps://api.datadoghq.eu/api/v2/cost/budgethttps://api.ddog-gov.com/api/v2/cost/budgethttps://api.datadoghq.com/api/v2/cost/budgethttps://api.us3.datadoghq.com/api/v2/cost/budgethttps://api.us5.datadoghq.com/api/v2/cost/budget ### Overview Create a new budget or update an existing one. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) Field Type Description data object A budget and all its entries. attributes object The attributes of a budget. created_at int64 The timestamp when the budget was created. created_by string The id of the user that created the budget. end_month int64 The month when the budget ends. entries [object] The entries of the budget. amount double The `amount` of the budget entry. month int64 The `month` of the budget entry. tag_filters [object] The `tag_filters` of the budget entry. tag_key string The key of the tag. tag_value string The value of the tag. metrics_query string The cost query used to track against the budget. name string The name of the budget. org_id int64 The id of the org the budget belongs to. start_month int64 The month when the budget starts. total_amount double The sum of all budget entries' amounts. updated_at int64 The timestamp when the budget was last updated. updated_by string The id of the user that created the budget. id string The `BudgetWithEntriesData` `id`. type string The type of the object, must be `budget`. 
``` { "data": { "attributes": { "created_at": 1738258683590, "created_by": "00000000-0a0a-0a0a-aaa0-00000000000a", "end_month": 202502, "entries": [ { "amount": 500, "month": 202501, "tag_filters": [ { "tag_key": "service", "tag_value": "ec2" } ] } ], "metrics_query": "aws.cost.amortized{service:ec2} by {service}", "name": "my budget", "org_id": 123, "start_month": 202501, "total_amount": 1000, "updated_at": 1738258683590, "updated_by": "00000000-0a0a-0a0a-aaa0-00000000000a" }, "id": "00000000-0a0a-0a0a-aaa0-00000000000a", "type": "string" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpsertBudget-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpsertBudget-400-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpsertBudget-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#UpsertBudget-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of the `BudgetWithEntries` object. Field Type Description data object A budget and all its entries. attributes object The attributes of a budget. created_at int64 The timestamp when the budget was created. created_by string The id of the user that created the budget. end_month int64 The month when the budget ends. entries [object] The entries of the budget. amount double The `amount` of the budget entry. month int64 The `month` of the budget entry. tag_filters [object] The `tag_filters` of the budget entry. tag_key string The key of the tag. tag_value string The value of the tag. metrics_query string The cost query used to track against the budget. name string The name of the budget. org_id int64 The id of the org the budget belongs to. start_month int64 The month when the budget starts. total_amount double The sum of all budget entries' amounts. updated_at int64 The timestamp when the budget was last updated. updated_by string The id of the user that created the budget. id string The `BudgetWithEntriesData` `id`. type string The type of the object, must be `budget`. ``` { "data": { "attributes": { "created_at": 1738258683590, "created_by": "00000000-0a0a-0a0a-aaa0-00000000000a", "end_month": 202502, "entries": [ { "amount": 500, "month": 202501, "tag_filters": [ { "tag_key": "service", "tag_value": "ec2" } ] } ], "metrics_query": "aws.cost.amortized{service:ec2} by {service}", "name": "my budget", "org_id": 123, "start_month": 202501, "total_amount": 1000, "updated_at": 1738258683590, "updated_by": "00000000-0a0a-0a0a-aaa0-00000000000a" }, "id": "00000000-0a0a-0a0a-aaa0-00000000000a", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Create or update a budget Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/budget" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create or update a budget ``` """ Create or update a budget returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi from datadog_api_client.v2.model.budget_attributes import BudgetAttributes from datadog_api_client.v2.model.budget_entry import BudgetEntry from datadog_api_client.v2.model.budget_with_entries import BudgetWithEntries from datadog_api_client.v2.model.budget_with_entries_data import BudgetWithEntriesData from datadog_api_client.v2.model.tag_filter import TagFilter body = BudgetWithEntries( data=BudgetWithEntriesData( attributes=BudgetAttributes( created_at=1738258683590, created_by="00000000-0a0a-0a0a-aaa0-00000000000a", end_month=202502, entries=[ BudgetEntry( amount=500.0, month=202501, tag_filters=[ TagFilter( tag_key="service", tag_value="ec2", ), ], ), ], metrics_query="aws.cost.amortized{service:ec2} by {service}", name="my budget", org_id=123, start_month=202501, total_amount=1000.0, updated_at=1738258683590, updated_by="00000000-0a0a-0a0a-aaa0-00000000000a", ), id="00000000-0a0a-0a0a-aaa0-00000000000a", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.upsert_budget(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create or update a budget ``` # Create or update a budget returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new body = DatadogAPIClient::V2::BudgetWithEntries.new({ data: DatadogAPIClient::V2::BudgetWithEntriesData.new({ attributes: DatadogAPIClient::V2::BudgetAttributes.new({ created_at: 1738258683590, created_by: "00000000-0a0a-0a0a-aaa0-00000000000a", end_month: 202502, entries: [ DatadogAPIClient::V2::BudgetEntry.new({ amount: 500, month: 202501, tag_filters: [ DatadogAPIClient::V2::TagFilter.new({ tag_key: "service", tag_value: 
"ec2", }), ], }), ], metrics_query: "aws.cost.amortized{service:ec2} by {service}", name: "my budget", org_id: 123, start_month: 202501, total_amount: 1000, updated_at: 1738258683590, updated_by: "00000000-0a0a-0a0a-aaa0-00000000000a", }), id: "00000000-0a0a-0a0a-aaa0-00000000000a", }), }) p api_instance.upsert_budget(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create or update a budget ``` // Create or update a budget returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.BudgetWithEntries{ Data: &datadogV2.BudgetWithEntriesData{ Attributes: &datadogV2.BudgetAttributes{ CreatedAt: datadog.PtrInt64(1738258683590), CreatedBy: datadog.PtrString("00000000-0a0a-0a0a-aaa0-00000000000a"), EndMonth: datadog.PtrInt64(202502), Entries: []datadogV2.BudgetEntry{ { Amount: datadog.PtrFloat64(500), Month: datadog.PtrInt64(202501), TagFilters: []datadogV2.TagFilter{ { TagKey: datadog.PtrString("service"), TagValue: datadog.PtrString("ec2"), }, }, }, }, MetricsQuery: datadog.PtrString("aws.cost.amortized{service:ec2} by {service}"), Name: datadog.PtrString("my budget"), OrgId: datadog.PtrInt64(123), StartMonth: datadog.PtrInt64(202501), TotalAmount: datadog.PtrFloat64(1000), UpdatedAt: datadog.PtrInt64(1738258683590), UpdatedBy: datadog.PtrString("00000000-0a0a-0a0a-aaa0-00000000000a"), }, Id: datadog.PtrString("00000000-0a0a-0a0a-aaa0-00000000000a"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.UpsertBudget(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.UpsertBudget`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.UpsertBudget`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create or update a budget ``` // Create or update a budget returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.BudgetAttributes; import com.datadog.api.client.v2.model.BudgetEntry; import com.datadog.api.client.v2.model.BudgetWithEntries; import com.datadog.api.client.v2.model.BudgetWithEntriesData; import com.datadog.api.client.v2.model.TagFilter; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new 
CloudCostManagementApi(defaultClient); BudgetWithEntries body = new BudgetWithEntries() .data( new BudgetWithEntriesData() .attributes( new BudgetAttributes() .createdAt(1738258683590L) .createdBy("00000000-0a0a-0a0a-aaa0-00000000000a") .endMonth(202502L) .entries( Collections.singletonList( new BudgetEntry() .amount(500.0) .month(202501L) .tagFilters( Collections.singletonList( new TagFilter() .tagKey("service") .tagValue("ec2"))))) .metricsQuery("aws.cost.amortized{service:ec2} by {service}") .name("my budget") .orgId(123L) .startMonth(202501L) .totalAmount(1000.0) .updatedAt(1738258683590L) .updatedBy("00000000-0a0a-0a0a-aaa0-00000000000a")) .id("00000000-0a0a-0a0a-aaa0-00000000000a")); try { BudgetWithEntries result = apiInstance.upsertBudget(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#upsertBudget"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create or update a budget ``` // Create or update a budget returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; use datadog_api_client::datadogV2::model::BudgetAttributes; use datadog_api_client::datadogV2::model::BudgetEntry; use datadog_api_client::datadogV2::model::BudgetWithEntries; use datadog_api_client::datadogV2::model::BudgetWithEntriesData; use datadog_api_client::datadogV2::model::TagFilter; #[tokio::main] async fn main() { let body = BudgetWithEntries::new().data( BudgetWithEntriesData::new() .attributes( BudgetAttributes::new() .created_at(1738258683590) .created_by("00000000-0a0a-0a0a-aaa0-00000000000a".to_string()) .end_month(202502) .entries(vec![BudgetEntry::new() .amount(500.0 as f64) .month(202501) .tag_filters(vec![TagFilter::new() .tag_key("service".to_string()) .tag_value("ec2".to_string())])]) .metrics_query("aws.cost.amortized{service:ec2} by {service}".to_string()) .name("my budget".to_string()) .org_id(123) .start_month(202501) .total_amount(1000.0 as f64) .updated_at(1738258683590) .updated_by("00000000-0a0a-0a0a-aaa0-00000000000a".to_string()), ) .id("00000000-0a0a-0a0a-aaa0-00000000000a".to_string()), ); let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.upsert_budget(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create or update a budget ``` /** * Create or update a budget returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const 
apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiUpsertBudgetRequest = { body: { data: { attributes: { createdAt: 1738258683590, createdBy: "00000000-0a0a-0a0a-aaa0-00000000000a", endMonth: 202502, entries: [ { amount: 500, month: 202501, tagFilters: [ { tagKey: "service", tagValue: "ec2", }, ], }, ], metricsQuery: "aws.cost.amortized{service:ec2} by {service}", name: "my budget", orgId: 123, startMonth: 202501, totalAmount: 1000, updatedAt: 1738258683590, updatedBy: "00000000-0a0a-0a0a-aaa0-00000000000a", }, id: "00000000-0a0a-0a0a-aaa0-00000000000a", }, }, }; apiInstance .upsertBudget(params) .then((data: v2.BudgetWithEntries) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-a-budget) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-a-budget-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.ap2.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.datadoghq.eu/api/v2/cost/budget/{budget_id}https://api.ddog-gov.com/api/v2/cost/budget/{budget_id}https://api.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.us3.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.us5.datadoghq.com/api/v2/cost/budget/{budget_id} ### Overview Delete a budget. OAuth apps require the `cloud_cost_management_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description budget_id [_required_] string Budget id. ### Response * [204](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteBudget-204-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteBudget-400-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#DeleteBudget-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Delete a budget Copy ``` # Path parameters export budget_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/budget/${budget_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a budget ``` """ Delete a budget returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) api_instance.delete_budget( budget_id="budget_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a budget ``` # Delete a budget returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new api_instance.delete_budget("budget_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a budget ``` // Delete a budget returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) r, err := api.DeleteBudget(ctx, "budget_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.DeleteBudget`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a budget ``` // Delete a budget returns "No Content" response 
import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { apiInstance.deleteBudget("budget_id"); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#deleteBudget"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a budget ``` // Delete a budget returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.delete_budget("budget_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a budget ``` /** * Delete a budget returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiDeleteBudgetRequest = { budgetId: "budget_id", }; apiInstance .deleteBudget(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-budget) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-budget-v2) GET https://api.ap1.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.ap2.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.datadoghq.eu/api/v2/cost/budget/{budget_id}https://api.ddog-gov.com/api/v2/cost/budget/{budget_id}https://api.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.us3.datadoghq.com/api/v2/cost/budget/{budget_id}https://api.us5.datadoghq.com/api/v2/cost/budget/{budget_id} ### Overview Get a budget. 
OAuth apps require the `cloud_cost_management_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#cloud-cost-management) to access this endpoint. ### Arguments #### Path Parameters Name Type Description budget_id [_required_] string Budget id. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetBudget-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetBudget-400-v2) * [404](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetBudget-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-cost-management/#GetBudget-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) The definition of the `BudgetWithEntries` object. Field Type Description data object A budget and all its entries. attributes object The attributes of a budget. created_at int64 The timestamp when the budget was created. created_by string The id of the user that created the budget. end_month int64 The month when the budget ends. entries [object] The entries of the budget. amount double The `amount` of the budget entry. month int64 The `month` of the budget entry. tag_filters [object] The `tag_filters` of the budget entry. tag_key string The key of the tag. tag_value string The value of the tag. metrics_query string The cost query used to track against the budget. name string The name of the budget. org_id int64 The id of the org the budget belongs to. start_month int64 The month when the budget starts. total_amount double The sum of all budget entries' amounts. updated_at int64 The timestamp when the budget was last updated. updated_by string The id of the user that last updated the budget. id string The `BudgetWithEntriesData` `id`. type string The type of the object, must be `budget`. ``` { "data": { "attributes": { "created_at": 1738258683590, "created_by": "00000000-0a0a-0a0a-aaa0-00000000000a", "end_month": 202502, "entries": [ { "amount": 500, "month": 202501, "tag_filters": [ { "tag_key": "service", "tag_value": "ec2" } ] } ], "metrics_query": "aws.cost.amortized{service:ec2} by {service}", "name": "my budget", "org_id": 123, "start_month": 202501, "total_amount": 1000, "updated_at": 1738258683590, "updated_by": "00000000-0a0a-0a0a-aaa0-00000000000a" }, "id": "00000000-0a0a-0a0a-aaa0-00000000000a", "type": "budget" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [Example](https://docs.datadoghq.com/api/latest/cloud-cost-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-cost-management/?code-lang=typescript) ##### Get a budget Copy ``` # Path parameters export budget_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost/budget/${budget_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a budget ``` """ Get a budget returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_cost_management_api import CloudCostManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudCostManagementApi(api_client) response = api_instance.get_budget( budget_id="budget_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a budget ``` # Get a budget returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudCostManagementAPI.new p api_instance.get_budget("budget_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a budget ``` // Get a budget returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudCostManagementApi(apiClient) resp, r, err := api.GetBudget(ctx, "budget_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudCostManagementApi.GetBudget`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudCostManagementApi.GetBudget`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a budget ``` // Get a budget returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudCostManagementApi; import com.datadog.api.client.v2.model.BudgetWithEntries; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudCostManagementApi apiInstance = new CloudCostManagementApi(defaultClient); try { BudgetWithEntries result = apiInstance.getBudget("budget_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudCostManagementApi#getBudget"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a budget ``` // Get a budget returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_cost_management::CloudCostManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudCostManagementAPI::with_config(configuration); let resp = api.get_budget("budget_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a budget ``` /** * Get a budget returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudCostManagementApi(configuration); const params: v2.CloudCostManagementApiGetBudgetRequest = { budgetId: "budget_id", }; apiInstance .getBudget(params) .then((data: v2.BudgetWithEntries) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6b51990a-4346-4c3b-8132-a763991b27f2&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=a5656d46-bb80-4525-b0ae-7600f33aab8f&pt=Cloud%20Cost%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloud-cost-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6b51990a-4346-4c3b-8132-a763991b27f2&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=a5656d46-bb80-4525-b0ae-7600f33aab8f&pt=Cloud%20Cost%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloud-cost-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=61da729c-b581-4553-997a-e67aa3f2e4aa&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Cloud%20Cost%20Management&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloud-cost-management%2F&r=<=41098&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=56861) --- # Source: https://docs.datadoghq.com/api/latest/cloud-network-monitoring/ # Cloud Network Monitoring The Cloud Network Monitoring API allows you to fetch aggregated connections and DNS traffic with their attributes. See the [Cloud Network Monitoring page](https://docs.datadoghq.com/network_monitoring/cloud_network_monitoring/) and [DNS Monitoring page](https://docs.datadoghq.com/network_monitoring/dns/) for more information. ## [Get all aggregated connections](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#get-all-aggregated-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#get-all-aggregated-connections-v2) GET https://api.ap1.datadoghq.com/api/v2/network/connections/aggregatehttps://api.ap2.datadoghq.com/api/v2/network/connections/aggregatehttps://api.datadoghq.eu/api/v2/network/connections/aggregatehttps://api.ddog-gov.com/api/v2/network/connections/aggregatehttps://api.datadoghq.com/api/v2/network/connections/aggregatehttps://api.us3.datadoghq.com/api/v2/network/connections/aggregatehttps://api.us5.datadoghq.com/api/v2/network/connections/aggregate ### Overview Get all aggregated connections. ### Arguments #### Query Strings Name Type Description from integer Unix timestamp (number of seconds since epoch) of the start of the query window. If not provided, the start of the query window is 15 minutes before the `to` timestamp. If neither `from` nor `to` are provided, the query window is `[now - 15m, now]`. to integer Unix timestamp (number of seconds since epoch) of the end of the query window. If not provided, the end of the query window is the current time. 
If neither `from` nor `to` are provided, the query window is `[now - 15m, now]`. group_by string Comma-separated list of fields to group connections by. The maximum number of group_by(s) is 10. tags string Comma-separated list of tags to filter connections by. limit integer The number of connections to be returned. The maximum value is 7500. The default is 100. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedConnections-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedConnections-400-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedConnections-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) List of aggregated connections. Field Type Description data [object] Array of aggregated connection objects. attributes object Attributes for an aggregated connection. bytes_sent_by_client int64 The total number of bytes sent by the client over the given period. bytes_sent_by_server int64 The total number of bytes sent by the server over the given period. group_bys object The key, value pairs for each group by. [string] The values for each group by. packets_sent_by_client int64 The total number of packets sent by the client over the given period. packets_sent_by_server int64 The total number of packets sent by the server over the given period. rtt_micro_seconds int64 Measured as TCP smoothed round trip time in microseconds (the time between a TCP frame being sent and acknowledged). tcp_closed_connections int64 The number of TCP connections in a closed state. Measured in connections per second from the client. tcp_established_connections int64 The number of TCP connections in an established state. Measured in connections per second from the client. tcp_refusals int64 The number of TCP connections that were refused by the server. Typically this indicates an attempt to connect to an IP/port that is not receiving connections, or a firewall/security misconfiguration. tcp_resets int64 The number of TCP connections that were reset by the server. tcp_retransmits int64 TCP Retransmits represent detected failures that are retransmitted to ensure delivery. Measured in count of retransmits from the client. tcp_timeouts int64 The number of TCP connections that timed out from the perspective of the operating system. This can indicate general connectivity and latency issues. id string A unique identifier for the aggregated connection based on the group by values. type enum Aggregated connection resource type. Allowed enum values: `aggregated_connection` default: `aggregated_connection` ``` { "data": [ { "attributes": { "bytes_sent_by_client": 100, "bytes_sent_by_server": 200, "group_bys": { "client_team": [ "networks" ], "server_service": [ "hucklebuck" ] }, "packets_sent_by_client": 10, "packets_sent_by_server": 20, "rtt_micro_seconds": 800, "tcp_closed_connections": 30, "tcp_established_connections": 40, "tcp_refusals": 7, "tcp_resets": 5, "tcp_retransmits": 30, "tcp_timeouts": 6 }, "id": "client_team:networks, server_service:hucklebuck", "type": "aggregated_connection" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
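The query strings documented above can be combined in a single request. The following is a hedged sketch, assuming the US1 site and exported `DD_API_KEY`/`DD_APP_KEY`; the one-hour window, group-bys, tag filter, and limit are illustrative values, not defaults:

```
# Illustrative sketch: aggregated connections for the last hour,
# grouped by client_team and server_service and filtered by an example tag.
# Assumes the US1 site (api.datadoghq.com) and exported DD_API_KEY / DD_APP_KEY.
to=$(date +%s)
from=$((to - 3600))
curl -G "https://api.datadoghq.com/api/v2/network/connections/aggregate" \
  --data-urlencode "from=${from}" \
  --data-urlencode "to=${to}" \
  --data-urlencode "group_by=client_team,server_service" \
  --data-urlencode "tags=env:prod" \
  --data-urlencode "limit=500" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```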
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=typescript) ##### Get all aggregated connections Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/network/connections/aggregate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all aggregated connections ``` """ Get all aggregated connections returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_network_monitoring_api import CloudNetworkMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudNetworkMonitoringApi(api_client) response = api_instance.get_aggregated_connections() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all aggregated connections ``` # Get all aggregated connections returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudNetworkMonitoringAPI.new p api_instance.get_aggregated_connections() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all aggregated connections ``` // Get all aggregated connections returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudNetworkMonitoringApi(apiClient) resp, r, err := api.GetAggregatedConnections(ctx, *datadogV2.NewGetAggregatedConnectionsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`CloudNetworkMonitoringApi.GetAggregatedConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudNetworkMonitoringApi.GetAggregatedConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all aggregated connections ``` // Get all aggregated connections returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudNetworkMonitoringApi; import com.datadog.api.client.v2.model.SingleAggregatedConnectionResponseArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudNetworkMonitoringApi apiInstance = new CloudNetworkMonitoringApi(defaultClient); try { SingleAggregatedConnectionResponseArray result = apiInstance.getAggregatedConnections(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CloudNetworkMonitoringApi#getAggregatedConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all aggregated connections ``` // Get all aggregated connections returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_network_monitoring::CloudNetworkMonitoringAPI; use datadog_api_client::datadogV2::api_cloud_network_monitoring::GetAggregatedConnectionsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudNetworkMonitoringAPI::with_config(configuration); let resp = api .get_aggregated_connections(GetAggregatedConnectionsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all aggregated connections ``` /** * Get all aggregated connections returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudNetworkMonitoringApi(configuration); apiInstance .getAggregatedConnections() .then((data: v2.SingleAggregatedConnectionResponseArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all aggregated DNS traffic](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#get-all-aggregated-dns-traffic) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#get-all-aggregated-dns-traffic-v2) GET https://api.ap1.datadoghq.com/api/v2/network/dns/aggregatehttps://api.ap2.datadoghq.com/api/v2/network/dns/aggregatehttps://api.datadoghq.eu/api/v2/network/dns/aggregatehttps://api.ddog-gov.com/api/v2/network/dns/aggregatehttps://api.datadoghq.com/api/v2/network/dns/aggregatehttps://api.us3.datadoghq.com/api/v2/network/dns/aggregatehttps://api.us5.datadoghq.com/api/v2/network/dns/aggregate ### Overview Get all aggregated DNS traffic. ### Arguments #### Query Strings Name Type Description from integer Unix timestamp (number of seconds since epoch) of the start of the query window. If not provided, the start of the query window is 15 minutes before the `to` timestamp. If neither `from` nor `to` are provided, the query window is `[now - 15m, now]`. to integer Unix timestamp (number of seconds since epoch) of the end of the query window. If not provided, the end of the query window is the current time. If neither `from` nor `to` are provided, the query window is `[now - 15m, now]`. group_by string Comma-separated list of fields to group DNS traffic by. The server side defaults to `network.dns_query` if unspecified. `server_ungrouped` may be used if groups are not desired. The maximum number of group_by(s) is 10. tags string Comma-separated list of tags to filter DNS traffic by. limit integer The number of aggregated DNS entries to be returned. The maximum value is 7500. The default is 100. ### Response * [200](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedDns-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedDns-400-v2) * [429](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/#GetAggregatedDns-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) List of aggregated DNS flows. Field Type Description data [object] Array of aggregated DNS objects. attributes object Attributes for an aggregated DNS flow. group_bys [object] The key, value pairs for each group by. key string The group by key. value string The group by value. metrics [object] Metrics associated with an aggregated DNS flow. key enum The metric key for DNS metrics. Allowed enum values: `dns_total_requests,dns_failures,dns_successful_responses,dns_failed_responses,dns_timeouts,dns_responses.nxdomain,dns_responses.servfail,dns_responses.other,dns_success_latency_percentile,dns_failure_latency_percentile` value int64 The metric value. id string A unique identifier for the aggregated DNS traffic based on the group by values. type enum Aggregated DNS resource type. 
Allowed enum values: `aggregated_dns` default: `aggregated_dns` ``` { "data": [ { "attributes": { "group_bys": [ { "key": "client_service", "value": "example-service" }, { "key": "network.dns_query", "value": "example.com" } ], "metrics": [ { "key": "dns_total_requests", "value": 100 }, { "key": "dns_failures", "value": 7 }, { "key": "dns_successful_responses", "value": 93 }, { "key": "dns_failed_responses", "value": 5 }, { "key": "dns_timeouts", "value": 2 }, { "key": "dns_responses.nxdomain", "value": 1 }, { "key": "dns_responses.servfail", "value": 1 }, { "key": "dns_responses.other", "value": 3 }, { "key": "dns_success_latency_percentile", "value": 50 }, { "key": "dns_failure_latency_percentile", "value": 75 } ] }, "id": "client_service:example-service,network.dns_query:example.com", "type": "aggregated_dns" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/?code-lang=typescript) ##### Get all aggregated DNS traffic Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/network/dns/aggregate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all aggregated DNS traffic ``` """ Get all aggregated DNS traffic returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloud_network_monitoring_api import CloudNetworkMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudNetworkMonitoringApi(api_client) response = api_instance.get_aggregated_dns() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all aggregated DNS traffic ``` # Get all aggregated DNS traffic returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudNetworkMonitoringAPI.new p 
api_instance.get_aggregated_dns() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all aggregated DNS traffic ``` // Get all aggregated DNS traffic returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudNetworkMonitoringApi(apiClient) resp, r, err := api.GetAggregatedDns(ctx, *datadogV2.NewGetAggregatedDnsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudNetworkMonitoringApi.GetAggregatedDns`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudNetworkMonitoringApi.GetAggregatedDns`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all aggregated DNS traffic ``` // Get all aggregated DNS traffic returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudNetworkMonitoringApi; import com.datadog.api.client.v2.model.SingleAggregatedDnsResponseArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudNetworkMonitoringApi apiInstance = new CloudNetworkMonitoringApi(defaultClient); try { SingleAggregatedDnsResponseArray result = apiInstance.getAggregatedDns(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudNetworkMonitoringApi#getAggregatedDns"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all aggregated DNS traffic ``` // Get all aggregated DNS traffic returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloud_network_monitoring::CloudNetworkMonitoringAPI; use datadog_api_client::datadogV2::api_cloud_network_monitoring::GetAggregatedDnsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudNetworkMonitoringAPI::with_config(configuration); let resp = api 
.get_aggregated_dns(GetAggregatedDnsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all aggregated DNS traffic ``` /** * Get all aggregated DNS traffic returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudNetworkMonitoringApi(configuration); apiInstance .getAggregatedDns() .then((data: v2.SingleAggregatedDnsResponseArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/cloudflare-integration/ # Cloudflare Integration Manage your Datadog Cloudflare integration directly through the Datadog API. See the [Cloudflare integration page](https://docs.datadoghq.com/integrations/cloudflare/) for more information.
## [List Cloudflare accounts](https://docs.datadoghq.com/api/latest/cloudflare-integration/#list-cloudflare-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloudflare-integration/#list-cloudflare-accounts-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.datadoghq.eu/api/v2/integrations/cloudflare/accountshttps://api.ddog-gov.com/api/v2/integrations/cloudflare/accountshttps://api.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.us3.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts ### Overview List Cloudflare accounts. This endpoint requires the `integrations_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/cloudflare-integration/#ListCloudflareAccounts-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloudflare-integration/#ListCloudflareAccounts-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloudflare-integration/#ListCloudflareAccounts-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloudflare-integration/#ListCloudflareAccounts-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloudflare-integration/#ListCloudflareAccounts-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) The expected response schema when getting Cloudflare accounts. Field Type Description data [object] The JSON:API data schema. attributes [_required_] object Attributes object of a Cloudflare account. email string The email associated with the Cloudflare account. name [_required_] string The name of the Cloudflare account. resources [string] An allowlist of resources, such as `web`, `dns`, `lb` (load balancer), `worker`, that restricts pulling metrics from those resources. zones [string] An allowlist of zones to restrict pulling metrics for. id [_required_] string The ID of the Cloudflare account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": [ { "attributes": { "email": "test-email@example.com", "name": "test-name", "resources": [ "web", "dns", "lb", "worker" ], "zones": [ "zone_id_1", "zone_id_2" ] }, "id": "c1a8e059bfd1e911cf10b626340c9a54", "type": "cloudflare-accounts" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=typescript) ##### List Cloudflare accounts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Cloudflare accounts ``` """ List Cloudflare accounts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloudflare_integration_api import CloudflareIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudflareIntegrationApi(api_client) response = api_instance.list_cloudflare_accounts() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Cloudflare accounts ``` # List Cloudflare accounts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudflareIntegrationAPI.new p api_instance.list_cloudflare_accounts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Cloudflare accounts ``` // List Cloudflare accounts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudflareIntegrationApi(apiClient) resp, r, err := api.ListCloudflareAccounts(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudflareIntegrationApi.ListCloudflareAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudflareIntegrationApi.ListCloudflareAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Cloudflare accounts ``` // List Cloudflare accounts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudflareIntegrationApi; import com.datadog.api.client.v2.model.CloudflareAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudflareIntegrationApi apiInstance = new CloudflareIntegrationApi(defaultClient); try { CloudflareAccountsResponse result = apiInstance.listCloudflareAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudflareIntegrationApi#listCloudflareAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Cloudflare accounts ``` // List Cloudflare accounts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloudflare_integration::CloudflareIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudflareIntegrationAPI::with_config(configuration); let resp = api.list_cloudflare_accounts().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Cloudflare accounts ``` /** * List Cloudflare accounts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudflareIntegrationApi(configuration); apiInstance .listCloudflareAccounts() .then((data: v2.CloudflareAccountsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Cloudflare account](https://docs.datadoghq.com/api/latest/cloudflare-integration/#add-cloudflare-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloudflare-integration/#add-cloudflare-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.datadoghq.eu/api/v2/integrations/cloudflare/accountshttps://api.ddog-gov.com/api/v2/integrations/cloudflare/accountshttps://api.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.us3.datadoghq.com/api/v2/integrations/cloudflare/accountshttps://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts ### Overview Create a Cloudflare account. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) Field Type Description data [_required_] object Data object for creating a Cloudflare account. attributes [_required_] object Attributes object for creating a Cloudflare account. api_key [_required_] string The API key (or token) for the Cloudflare account. email string The email associated with the Cloudflare account. If an API key is provided (and not a token), this field is also required. name [_required_] string The name of the Cloudflare account. resources [string] An allowlist of resources to restrict pulling metrics for including `'web', 'dns', 'lb' (load balancer), 'worker'`. zones [string] An allowlist of zones to restrict pulling metrics for. type [_required_] enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": { "attributes": { "api_key": "fakekey", "email": "dev@datadoghq.com", "name": "examplecloudflareintegration" }, "type": "cloudflare-accounts" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/cloudflare-integration/#CreateCloudflareAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/cloudflare-integration/#CreateCloudflareAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloudflare-integration/#CreateCloudflareAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloudflare-integration/#CreateCloudflareAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloudflare-integration/#CreateCloudflareAccount-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) The expected response schema when getting a Cloudflare account. Field Type Description data object Data object of a Cloudflare account. attributes [_required_] object Attributes object of a Cloudflare account. email string The email associated with the Cloudflare account. name [_required_] string The name of the Cloudflare account. 
resources [string] An allowlist of resources, such as `web`, `dns`, `lb` (load balancer), `worker`, that restricts pulling metrics from those resources. zones [string] An allowlist of zones to restrict pulling metrics for. id [_required_] string The ID of the Cloudflare account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": { "attributes": { "email": "test-email@example.com", "name": "test-name", "resources": [ "web", "dns", "lb", "worker" ], "zones": [ "zone_id_1", "zone_id_2" ] }, "id": "c1a8e059bfd1e911cf10b626340c9a54", "type": "cloudflare-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=typescript) ##### Add Cloudflare account returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "api_key": "fakekey", "email": "dev@datadoghq.com", "name": "examplecloudflareintegration" }, "type": "cloudflare-accounts" } } EOF ``` ##### Add Cloudflare account returns "CREATED" response ``` // Add Cloudflare account returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CloudflareAccountCreateRequest{ Data: datadogV2.CloudflareAccountCreateRequestData{ Attributes: datadogV2.CloudflareAccountCreateRequestAttributes{ ApiKey: "fakekey", Email: datadog.PtrString("dev@datadoghq.com"), Name: "examplecloudflareintegration", }, Type: datadogV2.CLOUDFLAREACCOUNTTYPE_CLOUDFLARE_ACCOUNTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudflareIntegrationApi(apiClient) resp, r, err := api.CreateCloudflareAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudflareIntegrationApi.CreateCloudflareAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudflareIntegrationApi.CreateCloudflareAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add Cloudflare account returns "CREATED" response ``` // Add Cloudflare account returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudflareIntegrationApi; import com.datadog.api.client.v2.model.CloudflareAccountCreateRequest; import com.datadog.api.client.v2.model.CloudflareAccountCreateRequestAttributes; import com.datadog.api.client.v2.model.CloudflareAccountCreateRequestData; import com.datadog.api.client.v2.model.CloudflareAccountResponse; import 
com.datadog.api.client.v2.model.CloudflareAccountType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudflareIntegrationApi apiInstance = new CloudflareIntegrationApi(defaultClient); CloudflareAccountCreateRequest body = new CloudflareAccountCreateRequest() .data( new CloudflareAccountCreateRequestData() .attributes( new CloudflareAccountCreateRequestAttributes() .apiKey("fakekey") .email("dev@datadoghq.com") .name("examplecloudflareintegration")) .type(CloudflareAccountType.CLOUDFLARE_ACCOUNTS)); try { CloudflareAccountResponse result = apiInstance.createCloudflareAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudflareIntegrationApi#createCloudflareAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add Cloudflare account returns "CREATED" response ``` """ Add Cloudflare account returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloudflare_integration_api import CloudflareIntegrationApi from datadog_api_client.v2.model.cloudflare_account_create_request import CloudflareAccountCreateRequest from datadog_api_client.v2.model.cloudflare_account_create_request_attributes import ( CloudflareAccountCreateRequestAttributes, ) from datadog_api_client.v2.model.cloudflare_account_create_request_data import CloudflareAccountCreateRequestData from datadog_api_client.v2.model.cloudflare_account_type import CloudflareAccountType body = CloudflareAccountCreateRequest( data=CloudflareAccountCreateRequestData( attributes=CloudflareAccountCreateRequestAttributes( api_key="fakekey", email="dev@datadoghq.com", name="examplecloudflareintegration", ), type=CloudflareAccountType.CLOUDFLARE_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudflareIntegrationApi(api_client) response = api_instance.create_cloudflare_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add Cloudflare account returns "CREATED" response ``` # Add Cloudflare account returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudflareIntegrationAPI.new body = DatadogAPIClient::V2::CloudflareAccountCreateRequest.new({ data: DatadogAPIClient::V2::CloudflareAccountCreateRequestData.new({ attributes: DatadogAPIClient::V2::CloudflareAccountCreateRequestAttributes.new({ api_key: "fakekey", email: "dev@datadoghq.com", name: "examplecloudflareintegration", }), type: DatadogAPIClient::V2::CloudflareAccountType::CLOUDFLARE_ACCOUNTS, }), }) p 
api_instance.create_cloudflare_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add Cloudflare account returns "CREATED" response ``` // Add Cloudflare account returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloudflare_integration::CloudflareIntegrationAPI; use datadog_api_client::datadogV2::model::CloudflareAccountCreateRequest; use datadog_api_client::datadogV2::model::CloudflareAccountCreateRequestAttributes; use datadog_api_client::datadogV2::model::CloudflareAccountCreateRequestData; use datadog_api_client::datadogV2::model::CloudflareAccountType; #[tokio::main] async fn main() { let body = CloudflareAccountCreateRequest::new(CloudflareAccountCreateRequestData::new( CloudflareAccountCreateRequestAttributes::new( "fakekey".to_string(), "examplecloudflareintegration".to_string(), ) .email("dev@datadoghq.com".to_string()), CloudflareAccountType::CLOUDFLARE_ACCOUNTS, )); let configuration = datadog::Configuration::new(); let api = CloudflareIntegrationAPI::with_config(configuration); let resp = api.create_cloudflare_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add Cloudflare account returns "CREATED" response ``` /** * Add Cloudflare account returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudflareIntegrationApi(configuration); const params: v2.CloudflareIntegrationApiCreateCloudflareAccountRequest = { body: { data: { attributes: { apiKey: "fakekey", email: "dev@datadoghq.com", name: "examplecloudflareintegration", }, type: "cloudflare-accounts", }, }, }; apiInstance .createCloudflareAccount(params) .then((data: v2.CloudflareAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Cloudflare account](https://docs.datadoghq.com/api/latest/cloudflare-integration/#get-cloudflare-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloudflare-integration/#get-cloudflare-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id} ### Overview Get a Cloudflare account. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/cloudflare-integration/#GetCloudflareAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloudflare-integration/#GetCloudflareAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloudflare-integration/#GetCloudflareAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloudflare-integration/#GetCloudflareAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloudflare-integration/#GetCloudflareAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) The expected response schema when getting a Cloudflare account. Field Type Description data object Data object of a Cloudflare account. attributes [_required_] object Attributes object of a Cloudflare account. email string The email associated with the Cloudflare account. name [_required_] string The name of the Cloudflare account. resources [string] An allowlist of resources, such as `web`, `dns`, `lb` (load balancer), `worker`, that restricts pulling metrics from those resources. zones [string] An allowlist of zones to restrict pulling metrics for. id [_required_] string The ID of the Cloudflare account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": { "attributes": { "email": "test-email@example.com", "name": "test-name", "resources": [ "web", "dns", "lb", "worker" ], "zones": [ "zone_id_1", "zone_id_2" ] }, "id": "c1a8e059bfd1e911cf10b626340c9a54", "type": "cloudflare-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=typescript) ##### Get Cloudflare account Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/${account_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Cloudflare account ``` """ Get Cloudflare account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloudflare_integration_api import CloudflareIntegrationApi # there is a valid "cloudflare_account" in the system CLOUDFLARE_ACCOUNT_DATA_ID = environ["CLOUDFLARE_ACCOUNT_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudflareIntegrationApi(api_client) response = api_instance.get_cloudflare_account( account_id=CLOUDFLARE_ACCOUNT_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Cloudflare account ``` # Get Cloudflare account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudflareIntegrationAPI.new # there is a valid "cloudflare_account" in the system CLOUDFLARE_ACCOUNT_DATA_ID = ENV["CLOUDFLARE_ACCOUNT_DATA_ID"] p api_instance.get_cloudflare_account(CLOUDFLARE_ACCOUNT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) 
and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Cloudflare account ``` // Get Cloudflare account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "cloudflare_account" in the system CloudflareAccountDataID := os.Getenv("CLOUDFLARE_ACCOUNT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudflareIntegrationApi(apiClient) resp, r, err := api.GetCloudflareAccount(ctx, CloudflareAccountDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudflareIntegrationApi.GetCloudflareAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudflareIntegrationApi.GetCloudflareAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Cloudflare account ``` // Get Cloudflare account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudflareIntegrationApi; import com.datadog.api.client.v2.model.CloudflareAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudflareIntegrationApi apiInstance = new CloudflareIntegrationApi(defaultClient); // there is a valid "cloudflare_account" in the system String CLOUDFLARE_ACCOUNT_DATA_ID = System.getenv("CLOUDFLARE_ACCOUNT_DATA_ID"); try { CloudflareAccountResponse result = apiInstance.getCloudflareAccount(CLOUDFLARE_ACCOUNT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudflareIntegrationApi#getCloudflareAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Cloudflare account ``` // Get Cloudflare account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloudflare_integration::CloudflareIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "cloudflare_account" in the system let cloudflare_account_data_id = std::env::var("CLOUDFLARE_ACCOUNT_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = 
CloudflareIntegrationAPI::with_config(configuration); let resp = api .get_cloudflare_account(cloudflare_account_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Cloudflare account ``` /** * Get Cloudflare account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudflareIntegrationApi(configuration); // there is a valid "cloudflare_account" in the system const CLOUDFLARE_ACCOUNT_DATA_ID = process.env .CLOUDFLARE_ACCOUNT_DATA_ID as string; const params: v2.CloudflareIntegrationApiGetCloudflareAccountRequest = { accountId: CLOUDFLARE_ACCOUNT_DATA_ID, }; apiInstance .getCloudflareAccount(params) .then((data: v2.CloudflareAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Cloudflare account](https://docs.datadoghq.com/api/latest/cloudflare-integration/#update-cloudflare-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloudflare-integration/#update-cloudflare-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id} ### Overview Update a Cloudflare account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) Field Type Description data [_required_] object Data object for updating a Cloudflare account. attributes object Attributes object for updating a Cloudflare account. api_key [_required_] string The API key of the Cloudflare account. email string The email associated with the Cloudflare account. If an API key is provided (and not a token), this field is also required. name string The name of the Cloudflare account. resources [string] An allowlist of resources to restrict pulling metrics for including `'web', 'dns', 'lb' (load balancer), 'worker'`. zones [string] An allowlist of zones to restrict pulling metrics for. 
type enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": { "attributes": { "api_key": "fakekey", "email": "dev@datadoghq.com", "zones": [ "zone-id-3" ] }, "type": "cloudflare-accounts" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/cloudflare-integration/#UpdateCloudflareAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/cloudflare-integration/#UpdateCloudflareAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloudflare-integration/#UpdateCloudflareAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloudflare-integration/#UpdateCloudflareAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloudflare-integration/#UpdateCloudflareAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) The expected response schema when getting a Cloudflare account. Field Type Description data object Data object of a Cloudflare account. attributes [_required_] object Attributes object of a Cloudflare account. email string The email associated with the Cloudflare account. name [_required_] string The name of the Cloudflare account. resources [string] An allowlist of resources, such as `web`, `dns`, `lb` (load balancer), `worker`, that restricts pulling metrics from those resources. zones [string] An allowlist of zones to restrict pulling metrics for. id [_required_] string The ID of the Cloudflare account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `cloudflare-accounts`. Allowed enum values: `cloudflare-accounts` default: `cloudflare-accounts` ``` { "data": { "attributes": { "email": "test-email@example.com", "name": "test-name", "resources": [ "web", "dns", "lb", "worker" ], "zones": [ "zone_id_1", "zone_id_2" ] }, "id": "c1a8e059bfd1e911cf10b626340c9a54", "type": "cloudflare-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=typescript) ##### Update Cloudflare account returns "OK" response Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "api_key": "fakekey", "email": "dev@datadoghq.com", "zones": [ "zone-id-3" ] }, "type": "cloudflare-accounts" } } EOF ``` ##### Update Cloudflare account returns "OK" response ``` // Update Cloudflare account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "cloudflare_account" in the system CloudflareAccountDataID := os.Getenv("CLOUDFLARE_ACCOUNT_DATA_ID") body := datadogV2.CloudflareAccountUpdateRequest{ Data: datadogV2.CloudflareAccountUpdateRequestData{ Attributes: &datadogV2.CloudflareAccountUpdateRequestAttributes{ ApiKey: "fakekey", Email: datadog.PtrString("dev@datadoghq.com"), Zones: []string{ "zone-id-3", }, }, Type: datadogV2.CLOUDFLAREACCOUNTTYPE_CLOUDFLARE_ACCOUNTS.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudflareIntegrationApi(apiClient) resp, r, err := api.UpdateCloudflareAccount(ctx, CloudflareAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudflareIntegrationApi.UpdateCloudflareAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CloudflareIntegrationApi.UpdateCloudflareAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Cloudflare account returns "OK" response ``` // Update Cloudflare account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudflareIntegrationApi; import com.datadog.api.client.v2.model.CloudflareAccountResponse; import com.datadog.api.client.v2.model.CloudflareAccountType; import 
com.datadog.api.client.v2.model.CloudflareAccountUpdateRequest; import com.datadog.api.client.v2.model.CloudflareAccountUpdateRequestAttributes; import com.datadog.api.client.v2.model.CloudflareAccountUpdateRequestData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudflareIntegrationApi apiInstance = new CloudflareIntegrationApi(defaultClient); // there is a valid "cloudflare_account" in the system String CLOUDFLARE_ACCOUNT_DATA_ID = System.getenv("CLOUDFLARE_ACCOUNT_DATA_ID"); CloudflareAccountUpdateRequest body = new CloudflareAccountUpdateRequest() .data( new CloudflareAccountUpdateRequestData() .attributes( new CloudflareAccountUpdateRequestAttributes() .apiKey("fakekey") .email("dev@datadoghq.com") .zones(Collections.singletonList("zone-id-3"))) .type(CloudflareAccountType.CLOUDFLARE_ACCOUNTS)); try { CloudflareAccountResponse result = apiInstance.updateCloudflareAccount(CLOUDFLARE_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CloudflareIntegrationApi#updateCloudflareAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Cloudflare account returns "OK" response ``` """ Update Cloudflare account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloudflare_integration_api import CloudflareIntegrationApi from datadog_api_client.v2.model.cloudflare_account_type import CloudflareAccountType from datadog_api_client.v2.model.cloudflare_account_update_request import CloudflareAccountUpdateRequest from datadog_api_client.v2.model.cloudflare_account_update_request_attributes import ( CloudflareAccountUpdateRequestAttributes, ) from datadog_api_client.v2.model.cloudflare_account_update_request_data import CloudflareAccountUpdateRequestData # there is a valid "cloudflare_account" in the system CLOUDFLARE_ACCOUNT_DATA_ID = environ["CLOUDFLARE_ACCOUNT_DATA_ID"] body = CloudflareAccountUpdateRequest( data=CloudflareAccountUpdateRequestData( attributes=CloudflareAccountUpdateRequestAttributes( api_key="fakekey", email="dev@datadoghq.com", zones=[ "zone-id-3", ], ), type=CloudflareAccountType.CLOUDFLARE_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudflareIntegrationApi(api_client) response = api_instance.update_cloudflare_account(account_id=CLOUDFLARE_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Cloudflare account returns "OK" response ``` # Update 
Cloudflare account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudflareIntegrationAPI.new # there is a valid "cloudflare_account" in the system CLOUDFLARE_ACCOUNT_DATA_ID = ENV["CLOUDFLARE_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::CloudflareAccountUpdateRequest.new({ data: DatadogAPIClient::V2::CloudflareAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::CloudflareAccountUpdateRequestAttributes.new({ api_key: "fakekey", email: "dev@datadoghq.com", zones: [ "zone-id-3", ], }), type: DatadogAPIClient::V2::CloudflareAccountType::CLOUDFLARE_ACCOUNTS, }), }) p api_instance.update_cloudflare_account(CLOUDFLARE_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Cloudflare account returns "OK" response ``` // Update Cloudflare account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloudflare_integration::CloudflareIntegrationAPI; use datadog_api_client::datadogV2::model::CloudflareAccountType; use datadog_api_client::datadogV2::model::CloudflareAccountUpdateRequest; use datadog_api_client::datadogV2::model::CloudflareAccountUpdateRequestAttributes; use datadog_api_client::datadogV2::model::CloudflareAccountUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "cloudflare_account" in the system let cloudflare_account_data_id = std::env::var("CLOUDFLARE_ACCOUNT_DATA_ID").unwrap(); let body = CloudflareAccountUpdateRequest::new( CloudflareAccountUpdateRequestData::new() .attributes( CloudflareAccountUpdateRequestAttributes::new("fakekey".to_string()) .email("dev@datadoghq.com".to_string()) .zones(vec!["zone-id-3".to_string()]), ) .type_(CloudflareAccountType::CLOUDFLARE_ACCOUNTS), ); let configuration = datadog::Configuration::new(); let api = CloudflareIntegrationAPI::with_config(configuration); let resp = api .update_cloudflare_account(cloudflare_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Cloudflare account returns "OK" response ``` /** * Update Cloudflare account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudflareIntegrationApi(configuration); // there is a valid "cloudflare_account" in the system const CLOUDFLARE_ACCOUNT_DATA_ID = process.env .CLOUDFLARE_ACCOUNT_DATA_ID as string; const params: v2.CloudflareIntegrationApiUpdateCloudflareAccountRequest = { body: { data: { attributes: { apiKey: "fakekey", email: "dev@datadoghq.com", zones: ["zone-id-3"], }, type: "cloudflare-accounts", }, }, accountId: CLOUDFLARE_ACCOUNT_DATA_ID, }; apiInstance .updateCloudflareAccount(params) .then((data: v2.CloudflareAccountResponse) => { console.log( 
"API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Cloudflare account](https://docs.datadoghq.com/api/latest/cloudflare-integration/#delete-cloudflare-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/cloudflare-integration/#delete-cloudflare-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/cloudflare/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/{account_id} ### Overview Delete a Cloudflare account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/cloudflare-integration/#DeleteCloudflareAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/cloudflare-integration/#DeleteCloudflareAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/cloudflare-integration/#DeleteCloudflareAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/cloudflare-integration/#DeleteCloudflareAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/cloudflare-integration/#DeleteCloudflareAccount-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [Example](https://docs.datadoghq.com/api/latest/cloudflare-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/cloudflare-integration/?code-lang=typescript) ##### Delete Cloudflare account Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/cloudflare/accounts/${account_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Cloudflare account ``` """ Delete Cloudflare account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.cloudflare_integration_api import CloudflareIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CloudflareIntegrationApi(api_client) api_instance.delete_cloudflare_account( account_id="account_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Cloudflare account ``` # Delete Cloudflare account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CloudflareIntegrationAPI.new api_instance.delete_cloudflare_account("account_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Cloudflare account ``` // Delete Cloudflare account returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCloudflareIntegrationApi(apiClient) r, err := api.DeleteCloudflareAccount(ctx, "account_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CloudflareIntegrationApi.DeleteCloudflareAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Cloudflare account ``` // Delete Cloudflare account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CloudflareIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CloudflareIntegrationApi apiInstance = new CloudflareIntegrationApi(defaultClient); try { apiInstance.deleteCloudflareAccount("account_id"); } catch (ApiException e) { System.err.println("Exception when calling CloudflareIntegrationApi#deleteCloudflareAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Cloudflare account ``` // Delete Cloudflare account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_cloudflare_integration::CloudflareIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CloudflareIntegrationAPI::with_config(configuration); let resp = api .delete_cloudflare_account("account_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Cloudflare account ``` /** * Delete Cloudflare account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CloudflareIntegrationApi(configuration); const params: v2.CloudflareIntegrationApiDeleteCloudflareAccountRequest = { accountId: "account_id", }; apiInstance .deleteCloudflareAccount(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=dceef51d-cccd-4f70-a4a0-7cf2258adb66&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=b6214877-75d3-4c26-b080-075f5abc4279&pt=Cloudflare%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloudflare-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=dceef51d-cccd-4f70-a4a0-7cf2258adb66&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=b6214877-75d3-4c26-b080-075f5abc4279&pt=Cloudflare%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloudflare-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=62193122-e17f-4bab-a847-d529f040019b&bo=2&sid=3dcb5360f0bf11f0b770d1a90854b039&vid=3dcb8270f0bf11f0b05aab124df80fee&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Cloudflare%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fcloudflare-integration%2F&r=<=1272&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=472858) --- # Source: https://docs.datadoghq.com/api/latest/confluent-cloud/ # Confluent Cloud Manage your Datadog Confluent Cloud integration accounts and account resources directly through the Datadog API. See the [Confluent Cloud page](https://docs.datadoghq.com/integrations/confluent_cloud/) for more information. ## [Update resource in Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#update-resource-in-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#update-resource-in-confluent-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id} ### Overview Update a Confluent resource with the provided resource id for the account associated with the provided account ID. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. resource_id [_required_] string Confluent Account Resource ID. 
### Request #### Body Data (required) Confluent payload * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Field Type Description data [_required_] object JSON:API request for updating a Confluent resource. attributes [_required_] object Attributes object for updating a Confluent resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with a Confluent resource. type [_required_] enum The JSON:API type for this request. Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": { "attributes": { "enable_custom_metrics": false, "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] }, "id": "resource-id-123", "type": "confluent-cloud-resources" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentResource-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentResource-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentResource-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentResource-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentResource-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Response schema when interacting with a Confluent resource. Field Type Description data object Confluent Cloud resource data. attributes [_required_] object Model representation of a Confluent Cloud resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with the Confluent resource. type [_required_] enum The JSON:API type for this request. Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": { "attributes": { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] }, "id": "resource_id_abc123", "type": "confluent-cloud-resources" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Update resource in Confluent account Copy ``` # Path parameters export account_id="CHANGE_ME" export resource_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources/${resource_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "resource_type": "kafka" }, "id": "resource-id-123", "type": "confluent-cloud-resources" } } EOF ``` ##### Update resource in Confluent account ``` """ Update resource in Confluent account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi from datadog_api_client.v2.model.confluent_resource_request import ConfluentResourceRequest from datadog_api_client.v2.model.confluent_resource_request_attributes import ConfluentResourceRequestAttributes from datadog_api_client.v2.model.confluent_resource_request_data import ConfluentResourceRequestData from datadog_api_client.v2.model.confluent_resource_type import ConfluentResourceType body = ConfluentResourceRequest( data=ConfluentResourceRequestData( attributes=ConfluentResourceRequestAttributes( enable_custom_metrics=False, resource_type="kafka", tags=[ "myTag", "myTag2:myValue", ], ), id="resource-id-123", type=ConfluentResourceType.CONFLUENT_CLOUD_RESOURCES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.update_confluent_resource(account_id="account_id", resource_id="resource_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update resource in Confluent account ``` # 
Update resource in Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new body = DatadogAPIClient::V2::ConfluentResourceRequest.new({ data: DatadogAPIClient::V2::ConfluentResourceRequestData.new({ attributes: DatadogAPIClient::V2::ConfluentResourceRequestAttributes.new({ enable_custom_metrics: false, resource_type: "kafka", tags: [ "myTag", "myTag2:myValue", ], }), id: "resource-id-123", type: DatadogAPIClient::V2::ConfluentResourceType::CONFLUENT_CLOUD_RESOURCES, }), }) p api_instance.update_confluent_resource("account_id", "resource_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update resource in Confluent account ``` // Update resource in Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ConfluentResourceRequest{ Data: datadogV2.ConfluentResourceRequestData{ Attributes: datadogV2.ConfluentResourceRequestAttributes{ EnableCustomMetrics: datadog.PtrBool(false), ResourceType: "kafka", Tags: []string{ "myTag", "myTag2:myValue", }, }, Id: "resource-id-123", Type: datadogV2.CONFLUENTRESOURCETYPE_CONFLUENT_CLOUD_RESOURCES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.UpdateConfluentResource(ctx, "account_id", "resource_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.UpdateConfluentResource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.UpdateConfluentResource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update resource in Confluent account ``` // Update resource in Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentResourceRequest; import com.datadog.api.client.v2.model.ConfluentResourceRequestAttributes; import com.datadog.api.client.v2.model.ConfluentResourceRequestData; import com.datadog.api.client.v2.model.ConfluentResourceResponse; import com.datadog.api.client.v2.model.ConfluentResourceType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); ConfluentResourceRequest body = new ConfluentResourceRequest() .data( new ConfluentResourceRequestData() 
.attributes( new ConfluentResourceRequestAttributes() .enableCustomMetrics(false) .resourceType("kafka") .tags(Arrays.asList("myTag", "myTag2:myValue"))) .id("resource-id-123") .type(ConfluentResourceType.CONFLUENT_CLOUD_RESOURCES)); try { ConfluentResourceResponse result = apiInstance.updateConfluentResource("account_id", "resource_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#updateConfluentResource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update resource in Confluent account ``` // Update resource in Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; use datadog_api_client::datadogV2::model::ConfluentResourceRequest; use datadog_api_client::datadogV2::model::ConfluentResourceRequestAttributes; use datadog_api_client::datadogV2::model::ConfluentResourceRequestData; use datadog_api_client::datadogV2::model::ConfluentResourceType; #[tokio::main] async fn main() { let body = ConfluentResourceRequest::new(ConfluentResourceRequestData::new( ConfluentResourceRequestAttributes::new("kafka".to_string()) .enable_custom_metrics(false) .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()]), "resource-id-123".to_string(), ConfluentResourceType::CONFLUENT_CLOUD_RESOURCES, )); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .update_confluent_resource("account_id".to_string(), "resource_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update resource in Confluent account ``` /** * Update resource in Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); const params: v2.ConfluentCloudApiUpdateConfluentResourceRequest = { body: { data: { attributes: { enableCustomMetrics: false, resourceType: "kafka", tags: ["myTag", "myTag2:myValue"], }, id: "resource-id-123", type: "confluent-cloud-resources", }, }, accountId: "account_id", resourceId: "resource_id", }; apiInstance .updateConfluentResource(params) .then((data: v2.ConfluentResourceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get resource from Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#get-resource-from-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#get-resource-from-confluent-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id} ### Overview Get a Confluent resource with the provided resource id for the account associated with the provided account ID. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. resource_id [_required_] string Confluent Account Resource ID. ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentResource-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentResource-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentResource-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentResource-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentResource-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Response schema when interacting with a Confluent resource. Field Type Description data object Confluent Cloud resource data. attributes [_required_] object Model representation of a Confluent Cloud resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with the Confluent resource. type [_required_] enum The JSON:API type for this request. 
Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": { "attributes": { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] }, "id": "resource_id_abc123", "type": "confluent-cloud-resources" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Get resource from Confluent account Copy ``` # Path parameters export account_id="CHANGE_ME" export resource_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources/${resource_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get resource from Confluent account ``` """ Get resource from Confluent account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.get_confluent_resource( account_id="account_id", resource_id="resource_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get resource from Confluent account ``` # Get resource 
from Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new p api_instance.get_confluent_resource("account_id", "resource_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get resource from Confluent account ``` // Get resource from Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.GetConfluentResource(ctx, "account_id", "resource_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.GetConfluentResource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.GetConfluentResource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get resource from Confluent account ``` // Get resource from Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentResourceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); try { ConfluentResourceResponse result = apiInstance.getConfluentResource("account_id", "resource_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#getConfluentResource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get resource from Confluent account ``` // Get resource from Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api 
.get_confluent_resource("account_id".to_string(), "resource_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get resource from Confluent account ``` /** * Get resource from Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); const params: v2.ConfluentCloudApiGetConfluentResourceRequest = { accountId: "account_id", resourceId: "resource_id", }; apiInstance .getConfluentResource(params) .then((data: v2.ConfluentResourceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete resource from Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#delete-resource-from-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#delete-resource-from-confluent-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources/{resource_id} ### Overview Delete a Confluent resource with the provided resource id for the account associated with the provided account ID. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. resource_id [_required_] string Confluent Account Resource ID. 
### Response * [204](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentResource-204-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentResource-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentResource-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentResource-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentResource-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Delete resource from Confluent account Copy ``` # Path parameters export account_id="CHANGE_ME" export resource_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources/${resource_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete resource from Confluent account ``` """ Delete resource from Confluent account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) api_instance.delete_confluent_resource( account_id="account_id", resource_id="resource_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete resource from Confluent account ``` # Delete resource from Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new api_instance.delete_confluent_resource("account_id", "resource_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete resource from Confluent account ``` // Delete resource from Confluent account returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) r, err := api.DeleteConfluentResource(ctx, "account_id", "resource_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.DeleteConfluentResource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete resource from Confluent account ``` // Delete resource from Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); try { apiInstance.deleteConfluentResource("account_id", "resource_id"); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#deleteConfluentResource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete resource from Confluent account ``` // Delete resource from Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .delete_confluent_resource("account_id".to_string(), 
"resource_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete resource from Confluent account ``` /** * Delete resource from Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); const params: v2.ConfluentCloudApiDeleteConfluentResourceRequest = { accountId: "account_id", resourceId: "resource_id", }; apiInstance .deleteConfluentResource(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add resource to Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#add-resource-to-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#add-resource-to-confluent-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources ### Overview Create a Confluent resource for the account associated with the provided ID. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. ### Request #### Body Data (required) Confluent payload * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Field Type Description data [_required_] object JSON:API request for updating a Confluent resource. attributes [_required_] object Attributes object for updating a Confluent resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with a Confluent resource. type [_required_] enum The JSON:API type for this request. 
Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": { "attributes": { "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ], "enable_custom_metrics": false }, "id": "exampleconfluentcloud", "type": "confluent-cloud-resources" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentResource-201-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentResource-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentResource-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentResource-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentResource-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Response schema when interacting with a Confluent resource. Field Type Description data object Confluent Cloud resource data. attributes [_required_] object Model representation of a Confluent Cloud resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with the Confluent resource. type [_required_] enum The JSON:API type for this request. Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": { "attributes": { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] }, "id": "resource_id_abc123", "type": "confluent-cloud-resources" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Add resource to Confluent account returns "OK" response Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ], "enable_custom_metrics": false }, "id": "exampleconfluentcloud", "type": "confluent-cloud-resources" } } EOF ``` ##### Add resource to Confluent account returns "OK" response ``` // Add resource to Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "confluent_account" in the system ConfluentAccountDataID := os.Getenv("CONFLUENT_ACCOUNT_DATA_ID") body := datadogV2.ConfluentResourceRequest{ Data: datadogV2.ConfluentResourceRequestData{ Attributes: datadogV2.ConfluentResourceRequestAttributes{ ResourceType: "kafka", Tags: []string{ "myTag", "myTag2:myValue", }, EnableCustomMetrics: datadog.PtrBool(false), }, Id: "exampleconfluentcloud", Type: datadogV2.CONFLUENTRESOURCETYPE_CONFLUENT_CLOUD_RESOURCES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.CreateConfluentResource(ctx, ConfluentAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.CreateConfluentResource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.CreateConfluentResource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add resource to Confluent account returns "OK" response ``` // Add resource to Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentResourceRequest; import 
com.datadog.api.client.v2.model.ConfluentResourceRequestAttributes; import com.datadog.api.client.v2.model.ConfluentResourceRequestData; import com.datadog.api.client.v2.model.ConfluentResourceResponse; import com.datadog.api.client.v2.model.ConfluentResourceType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); // there is a valid "confluent_account" in the system String CONFLUENT_ACCOUNT_DATA_ID = System.getenv("CONFLUENT_ACCOUNT_DATA_ID"); ConfluentResourceRequest body = new ConfluentResourceRequest() .data( new ConfluentResourceRequestData() .attributes( new ConfluentResourceRequestAttributes() .resourceType("kafka") .tags(Arrays.asList("myTag", "myTag2:myValue")) .enableCustomMetrics(false)) .id("exampleconfluentcloud") .type(ConfluentResourceType.CONFLUENT_CLOUD_RESOURCES)); try { ConfluentResourceResponse result = apiInstance.createConfluentResource(CONFLUENT_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#createConfluentResource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add resource to Confluent account returns "OK" response ``` """ Add resource to Confluent account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi from datadog_api_client.v2.model.confluent_resource_request import ConfluentResourceRequest from datadog_api_client.v2.model.confluent_resource_request_attributes import ConfluentResourceRequestAttributes from datadog_api_client.v2.model.confluent_resource_request_data import ConfluentResourceRequestData from datadog_api_client.v2.model.confluent_resource_type import ConfluentResourceType # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = environ["CONFLUENT_ACCOUNT_DATA_ID"] body = ConfluentResourceRequest( data=ConfluentResourceRequestData( attributes=ConfluentResourceRequestAttributes( resource_type="kafka", tags=[ "myTag", "myTag2:myValue", ], enable_custom_metrics=False, ), id="exampleconfluentcloud", type=ConfluentResourceType.CONFLUENT_CLOUD_RESOURCES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.create_confluent_resource(account_id=CONFLUENT_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add resource to Confluent account returns "OK" response 
``` # Add resource to Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = ENV["CONFLUENT_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::ConfluentResourceRequest.new({ data: DatadogAPIClient::V2::ConfluentResourceRequestData.new({ attributes: DatadogAPIClient::V2::ConfluentResourceRequestAttributes.new({ resource_type: "kafka", tags: [ "myTag", "myTag2:myValue", ], enable_custom_metrics: false, }), id: "exampleconfluentcloud", type: DatadogAPIClient::V2::ConfluentResourceType::CONFLUENT_CLOUD_RESOURCES, }), }) p api_instance.create_confluent_resource(CONFLUENT_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add resource to Confluent account returns "OK" response ``` // Add resource to Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; use datadog_api_client::datadogV2::model::ConfluentResourceRequest; use datadog_api_client::datadogV2::model::ConfluentResourceRequestAttributes; use datadog_api_client::datadogV2::model::ConfluentResourceRequestData; use datadog_api_client::datadogV2::model::ConfluentResourceType; #[tokio::main] async fn main() { // there is a valid "confluent_account" in the system let confluent_account_data_id = std::env::var("CONFLUENT_ACCOUNT_DATA_ID").unwrap(); let body = ConfluentResourceRequest::new(ConfluentResourceRequestData::new( ConfluentResourceRequestAttributes::new("kafka".to_string()) .enable_custom_metrics(false) .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()]), "exampleconfluentcloud".to_string(), ConfluentResourceType::CONFLUENT_CLOUD_RESOURCES, )); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .create_confluent_resource(confluent_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add resource to Confluent account returns "OK" response ``` /** * Add resource to Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); // there is a valid "confluent_account" in the system const CONFLUENT_ACCOUNT_DATA_ID = process.env .CONFLUENT_ACCOUNT_DATA_ID as string; const params: v2.ConfluentCloudApiCreateConfluentResourceRequest = { body: { data: { attributes: { resourceType: "kafka", tags: ["myTag", "myTag2:myValue"], enableCustomMetrics: false, }, id: "exampleconfluentcloud", type: "confluent-cloud-resources", }, }, accountId: CONFLUENT_ACCOUNT_DATA_ID, }; apiInstance 
.createConfluentResource(params) .then((data: v2.ConfluentResourceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
* * * ## [List Confluent Account resources](https://docs.datadoghq.com/api/latest/confluent-cloud/#list-confluent-account-resources) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#list-confluent-account-resources-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resourceshttps://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}/resources ### Overview List the Confluent resources for the account associated with the provided ID. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentResource-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentResource-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentResource-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentResource-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentResource-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Response schema when interacting with a list of Confluent resources. Field Type Description data [object] The JSON:API data attribute. attributes [_required_] object Model representation of a Confluent Cloud resource. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string The ID associated with the Confluent resource. type [_required_] enum The JSON:API type for this request. Allowed enum values: `confluent-cloud-resources` default: `confluent-cloud-resources` ``` { "data": [ { "attributes": { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] }, "id": "resource_id_abc123", "type": "confluent-cloud-resources" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### List Confluent Account resources
```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
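If you only need a couple of fields from that response, you can filter the JSON client-side. A minimal follow-up sketch, assuming `jq` is installed and reusing the `${account_id}` and key variables from the example above; the field paths follow the 200 response schema shown earlier:
```
# Print the ID and resource type of each Confluent resource on the account (convenience sketch, requires jq)
curl -s -X GET "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}/resources" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  | jq -r '.data[] | "\(.id) \(.attributes.resource_type)"'
```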
##### List Confluent Account resources ``` """ List Confluent Account resources returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.list_confluent_resource( account_id="account_id", ) print(response) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### List Confluent Account resources ``` # List Confluent Account resources returns "OK" response require "datadog_api_client" api_instance
= DatadogAPIClient::V2::ConfluentCloudAPI.new p api_instance.list_confluent_resource("account_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Confluent Account resources ``` // List Confluent Account resources returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.ListConfluentResource(ctx, "account_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.ListConfluentResource`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.ListConfluentResource`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Confluent Account resources ``` // List Confluent Account resources returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentResourcesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); try { ConfluentResourcesResponse result = apiInstance.listConfluentResource("account_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#listConfluentResource"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Confluent Account resources ``` // List Confluent Account resources returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api.list_confluent_resource("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Confluent Account resources ``` /** * List Confluent Account resources returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); const params: v2.ConfluentCloudApiListConfluentResourceRequest = { accountId: "account_id", }; apiInstance .listConfluentResource(params) .then((data: v2.ConfluentResourcesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#update-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#update-confluent-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id} ### Overview Update the Confluent account with the provided account ID. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. ### Request #### Body Data (required) Confluent payload * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Field Type Description data [_required_] object Data object for updating a Confluent account. attributes [_required_] object Attributes object for updating a Confluent account. api_key [_required_] string The API key associated with your Confluent account. api_secret [_required_] string The API secret associated with your Confluent account. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. 
Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": { "attributes": { "api_key": "TESTAPIKEY123", "api_secret": "update-secret", "tags": [ "updated_tag:val" ] }, "type": "confluent-cloud-accounts" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#UpdateConfluentAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) The expected response schema when getting a Confluent account. Field Type Description data object An API key and API secret pair that represents a Confluent account. attributes [_required_] object The attributes of a Confluent account. api_key [_required_] string The API key associated with your Confluent account. resources [object] A list of Confluent resources associated with the Confluent account. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string A randomly generated ID associated with a Confluent account. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": { "attributes": { "api_key": "TESTAPIKEY123", "resources": [ { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] } ], "tags": [ "myTag", "myTag2:myValue" ] }, "id": "account_id_abc123", "type": "confluent-cloud-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Update Confluent account returns "OK" response
```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "api_key": "TESTAPIKEY123",
      "api_secret": "update-secret",
      "tags": [
        "updated_tag:val"
      ]
    },
    "type": "confluent-cloud-accounts"
  }
}
EOF
```
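To confirm the update was applied, you can read the account back with the GET endpoint documented further down this page. A minimal follow-up sketch that reuses the same `${account_id}` and key variables (swap in your own site's endpoint):
```
# Fetch the account after the PATCH and check that the updated tags are returned
curl -X GET "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```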
##### Update Confluent account returns "OK" response ``` // Update Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "confluent_account" in the system ConfluentAccountDataAttributesAPIKey := os.Getenv("CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY") ConfluentAccountDataID := os.Getenv("CONFLUENT_ACCOUNT_DATA_ID") body := datadogV2.ConfluentAccountUpdateRequest{ Data: datadogV2.ConfluentAccountUpdateRequestData{ Attributes: datadogV2.ConfluentAccountUpdateRequestAttributes{ ApiKey: ConfluentAccountDataAttributesAPIKey, ApiSecret: "update-secret", Tags: []string{ "updated_tag:val", }, }, Type: datadogV2.CONFLUENTACCOUNTTYPE_CONFLUENT_CLOUD_ACCOUNTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.UpdateConfluentAccount(ctx, ConfluentAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.UpdateConfluentAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.UpdateConfluentAccount`:\n%s\n", responseContent) } ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Update Confluent account returns "OK" response ``` // Update Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentAccountResponse; import com.datadog.api.client.v2.model.ConfluentAccountType; import com.datadog.api.client.v2.model.ConfluentAccountUpdateRequest; import com.datadog.api.client.v2.model.ConfluentAccountUpdateRequestAttributes; import com.datadog.api.client.v2.model.ConfluentAccountUpdateRequestData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); // there is a valid "confluent_account" in the system String CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY = System.getenv("CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY"); String CONFLUENT_ACCOUNT_DATA_ID = System.getenv("CONFLUENT_ACCOUNT_DATA_ID"); ConfluentAccountUpdateRequest body = new ConfluentAccountUpdateRequest() .data( new ConfluentAccountUpdateRequestData() .attributes( new ConfluentAccountUpdateRequestAttributes() .apiKey(CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY) .apiSecret("update-secret") .tags(Collections.singletonList("updated_tag:val"))) .type(ConfluentAccountType.CONFLUENT_CLOUD_ACCOUNTS)); try { ConfluentAccountResponse result = apiInstance.updateConfluentAccount(CONFLUENT_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#updateConfluentAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
##### Update Confluent account returns "OK" response ``` """ Update Confluent account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi from datadog_api_client.v2.model.confluent_account_type import ConfluentAccountType from datadog_api_client.v2.model.confluent_account_update_request import ConfluentAccountUpdateRequest from datadog_api_client.v2.model.confluent_account_update_request_attributes import ( ConfluentAccountUpdateRequestAttributes, ) from datadog_api_client.v2.model.confluent_account_update_request_data import ConfluentAccountUpdateRequestData # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY = environ["CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY"] CONFLUENT_ACCOUNT_DATA_ID = environ["CONFLUENT_ACCOUNT_DATA_ID"] body = ConfluentAccountUpdateRequest( data=ConfluentAccountUpdateRequestData( attributes=ConfluentAccountUpdateRequestAttributes( api_key=CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY, api_secret="update-secret", tags=[ "updated_tag:val", ], ), type=ConfluentAccountType.CONFLUENT_CLOUD_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.update_confluent_account(account_id=CONFLUENT_ACCOUNT_DATA_ID, body=body) print(response) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py`
and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Confluent account returns "OK" response ``` # Update Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY = ENV["CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY"] CONFLUENT_ACCOUNT_DATA_ID = ENV["CONFLUENT_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::ConfluentAccountUpdateRequest.new({ data: DatadogAPIClient::V2::ConfluentAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::ConfluentAccountUpdateRequestAttributes.new({ api_key: CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY, api_secret: "update-secret", tags: [ "updated_tag:val", ], }), type: DatadogAPIClient::V2::ConfluentAccountType::CONFLUENT_CLOUD_ACCOUNTS, }), }) p api_instance.update_confluent_account(CONFLUENT_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Confluent account returns "OK" response ``` // Update Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; use datadog_api_client::datadogV2::model::ConfluentAccountType; use datadog_api_client::datadogV2::model::ConfluentAccountUpdateRequest; use datadog_api_client::datadogV2::model::ConfluentAccountUpdateRequestAttributes; use datadog_api_client::datadogV2::model::ConfluentAccountUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "confluent_account" in the system let confluent_account_data_attributes_api_key = std::env::var("CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY").unwrap(); let confluent_account_data_id = std::env::var("CONFLUENT_ACCOUNT_DATA_ID").unwrap(); let body = ConfluentAccountUpdateRequest::new(ConfluentAccountUpdateRequestData::new( ConfluentAccountUpdateRequestAttributes::new( confluent_account_data_attributes_api_key.clone(), "update-secret".to_string(), ) .tags(vec!["updated_tag:val".to_string()]), ConfluentAccountType::CONFLUENT_CLOUD_ACCOUNTS, )); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .update_confluent_account(confluent_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Confluent account returns "OK" response ``` /** * Update Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); // there is a valid "confluent_account" 
in the system const CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY = process.env .CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY as string; const CONFLUENT_ACCOUNT_DATA_ID = process.env .CONFLUENT_ACCOUNT_DATA_ID as string; const params: v2.ConfluentCloudApiUpdateConfluentAccountRequest = { body: { data: { attributes: { apiKey: CONFLUENT_ACCOUNT_DATA_ATTRIBUTES_API_KEY, apiSecret: "update-secret", tags: ["updated_tag:val"], }, type: "confluent-cloud-accounts", }, }, accountId: CONFLUENT_ACCOUNT_DATA_ID, }; apiInstance .updateConfluentAccount(params) .then((data: v2.ConfluentAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#get-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#get-confluent-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id} ### Overview Get the Confluent account with the provided account ID. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#GetConfluentAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) The expected response schema when getting a Confluent account. Field Type Description data object An API key and API secret pair that represents a Confluent account. attributes [_required_] object The attributes of a Confluent account. api_key [_required_] string The API key associated with your Confluent account. resources [object] A list of Confluent resources associated with the Confluent account. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. 
Can be a single key, or key-value pairs separated by a colon. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string A randomly generated ID associated with a Confluent account. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": { "attributes": { "api_key": "TESTAPIKEY123", "resources": [ { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] } ], "tags": [ "myTag", "myTag2:myValue" ] }, "id": "account_id_abc123", "type": "confluent-cloud-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Get Confluent account
```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
##### Get Confluent account ``` """ Get Confluent account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = environ["CONFLUENT_ACCOUNT_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.get_confluent_account( account_id=CONFLUENT_ACCOUNT_DATA_ID, ) print(response) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Get Confluent account ``` # Get Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = ENV["CONFLUENT_ACCOUNT_DATA_ID"] p api_instance.get_confluent_account(CONFLUENT_ACCOUNT_DATA_ID) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Get Confluent account ``` // Get Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "confluent_account" in the system ConfluentAccountDataID := os.Getenv("CONFLUENT_ACCOUNT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.GetConfluentAccount(ctx, ConfluentAccountDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.GetConfluentAccount`: %v\n", err)
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.GetConfluentAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Confluent account ``` // Get Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); // there is a valid "confluent_account" in the system String CONFLUENT_ACCOUNT_DATA_ID = System.getenv("CONFLUENT_ACCOUNT_DATA_ID"); try { ConfluentAccountResponse result = apiInstance.getConfluentAccount(CONFLUENT_ACCOUNT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#getConfluentAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Confluent account ``` // Get Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { // there is a valid "confluent_account" in the system let confluent_account_data_id = std::env::var("CONFLUENT_ACCOUNT_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .get_confluent_account(confluent_account_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Confluent account ``` /** * Get Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); // there is a valid "confluent_account" in the system const CONFLUENT_ACCOUNT_DATA_ID = process.env .CONFLUENT_ACCOUNT_DATA_ID as string; const params: v2.ConfluentCloudApiGetConfluentAccountRequest = { accountId: CONFLUENT_ACCOUNT_DATA_ID, }; apiInstance 
.getConfluentAccount(params) .then((data: v2.ConfluentAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#delete-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#delete-confluent-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/{account_id} ### Overview Delete a Confluent account with the provided account ID. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Confluent Account ID. ### Response * [204](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#DeleteConfluentAccount-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Delete Confluent account
```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
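A successful delete returns a `204` with an empty body, so there is nothing to parse. If you want a script to verify the result, a small sketch that prints only the HTTP status code (same variables as above, swap in your own site's endpoint):
```
# Expect 204 on success
curl -s -o /dev/null -w "%{http_code}\n" -X DELETE "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts/${account_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```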
##### Delete Confluent account ``` """ Delete Confluent account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = environ["CONFLUENT_ACCOUNT_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) api_instance.delete_confluent_account( account_id=CONFLUENT_ACCOUNT_DATA_ID, ) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Delete Confluent account ``` # Delete Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new # there is a valid "confluent_account" in the system CONFLUENT_ACCOUNT_DATA_ID = ENV["CONFLUENT_ACCOUNT_DATA_ID"] api_instance.delete_confluent_account(CONFLUENT_ACCOUNT_DATA_ID) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Delete Confluent account ``` // Delete Confluent account returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "confluent_account" in the system ConfluentAccountDataID := os.Getenv("CONFLUENT_ACCOUNT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) r, err := api.DeleteConfluentAccount(ctx, ConfluentAccountDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.DeleteConfluentAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Delete Confluent account ``` // Delete Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); // there is a valid "confluent_account" in the system String CONFLUENT_ACCOUNT_DATA_ID = System.getenv("CONFLUENT_ACCOUNT_DATA_ID"); try { apiInstance.deleteConfluentAccount(CONFLUENT_ACCOUNT_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#deleteConfluentAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
##### Delete Confluent account ``` // Delete Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { // there is a valid "confluent_account" in the system let confluent_account_data_id = std::env::var("CONFLUENT_ACCOUNT_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api .delete_confluent_account(confluent_account_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Delete Confluent account ``` /** * Delete Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); // there is a valid "confluent_account" in the system const CONFLUENT_ACCOUNT_DATA_ID = process.env .CONFLUENT_ACCOUNT_DATA_ID as string; const params: v2.ConfluentCloudApiDeleteConfluentAccountRequest = { accountId: CONFLUENT_ACCOUNT_DATA_ID, }; apiInstance .deleteConfluentAccount(params) .then((data: any) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Confluent account](https://docs.datadoghq.com/api/latest/confluent-cloud/#add-confluent-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#add-confluent-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accountshttps://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accountshttps://api.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts ### Overview Create a Confluent account. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Confluent payload * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Field Type Description data [_required_] object The data body for adding a Confluent account. attributes [_required_] object Attributes associated with the account creation request. api_key [_required_] string The API key associated with your Confluent account. api_secret [_required_] string The API secret associated with your Confluent account. resources [object] A list of Confluent resources associated with the Confluent account. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with a Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. 
Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": { "attributes": { "api_key": "TESTAPIKEY123", "api_secret": "test-api-secret-123", "resources": [ { "enable_custom_metrics": false, "id": "resource-id-123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] } ], "tags": [ "myTag", "myTag2:myValue" ] }, "type": "confluent-cloud-accounts" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#CreateConfluentAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) The expected response schema when getting a Confluent account. Field Type Description data object An API key and API secret pair that represents a Confluent account. attributes [_required_] object The attributes of a Confluent account. api_key [_required_] string The API key associated with your Confluent account. resources [object] A list of Confluent resources associated with the Confluent account. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string A randomly generated ID associated with a Confluent account. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": { "attributes": { "api_key": "TESTAPIKEY123", "resources": [ { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] } ], "tags": [ "myTag", "myTag2:myValue" ] }, "id": "account_id_abc123", "type": "confluent-cloud-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### Add Confluent account
```
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/integrations/confluent-cloud/accounts" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "api_key": "TESTAPIKEY123",
      "api_secret": "test-api-secret-123",
      "resources": [
        {
          "resource_type": "kafka"
        }
      ]
    },
    "type": "confluent-cloud-accounts"
  }
}
EOF
```
##### Add Confluent account ``` """ Add Confluent account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi from datadog_api_client.v2.model.confluent_account_create_request import ConfluentAccountCreateRequest from datadog_api_client.v2.model.confluent_account_create_request_attributes import ( ConfluentAccountCreateRequestAttributes, ) from datadog_api_client.v2.model.confluent_account_create_request_data import ConfluentAccountCreateRequestData from datadog_api_client.v2.model.confluent_account_resource_attributes import ConfluentAccountResourceAttributes from datadog_api_client.v2.model.confluent_account_type import ConfluentAccountType body = ConfluentAccountCreateRequest( data=ConfluentAccountCreateRequestData( attributes=ConfluentAccountCreateRequestAttributes( api_key="TESTAPIKEY123", api_secret="test-api-secret-123", resources=[ ConfluentAccountResourceAttributes( enable_custom_metrics=False, id="resource-id-123", resource_type="kafka", tags=[ "myTag", "myTag2:myValue", ], ), ], tags=[ "myTag", "myTag2:myValue", ], ), type=ConfluentAccountType.CONFLUENT_CLOUD_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.create_confluent_account(body=body) print(response) ```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Add Confluent account ``` # Add Confluent account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new body =
DatadogAPIClient::V2::ConfluentAccountCreateRequest.new({ data: DatadogAPIClient::V2::ConfluentAccountCreateRequestData.new({ attributes: DatadogAPIClient::V2::ConfluentAccountCreateRequestAttributes.new({ api_key: "TESTAPIKEY123", api_secret: "test-api-secret-123", resources: [ DatadogAPIClient::V2::ConfluentAccountResourceAttributes.new({ enable_custom_metrics: false, id: "resource-id-123", resource_type: "kafka", tags: [ "myTag", "myTag2:myValue", ], }), ], tags: [ "myTag", "myTag2:myValue", ], }), type: DatadogAPIClient::V2::ConfluentAccountType::CONFLUENT_CLOUD_ACCOUNTS, }), }) p api_instance.create_confluent_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add Confluent account ``` // Add Confluent account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ConfluentAccountCreateRequest{ Data: datadogV2.ConfluentAccountCreateRequestData{ Attributes: datadogV2.ConfluentAccountCreateRequestAttributes{ ApiKey: "TESTAPIKEY123", ApiSecret: "test-api-secret-123", Resources: []datadogV2.ConfluentAccountResourceAttributes{ { EnableCustomMetrics: datadog.PtrBool(false), Id: datadog.PtrString("resource-id-123"), ResourceType: "kafka", Tags: []string{ "myTag", "myTag2:myValue", }, }, }, Tags: []string{ "myTag", "myTag2:myValue", }, }, Type: datadogV2.CONFLUENTACCOUNTTYPE_CONFLUENT_CLOUD_ACCOUNTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.CreateConfluentAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.CreateConfluentAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.CreateConfluentAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add Confluent account ``` // Add Confluent account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentAccountCreateRequest; import com.datadog.api.client.v2.model.ConfluentAccountCreateRequestAttributes; import com.datadog.api.client.v2.model.ConfluentAccountCreateRequestData; import com.datadog.api.client.v2.model.ConfluentAccountResourceAttributes; import com.datadog.api.client.v2.model.ConfluentAccountResponse; import com.datadog.api.client.v2.model.ConfluentAccountType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { 
ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); ConfluentAccountCreateRequest body = new ConfluentAccountCreateRequest() .data( new ConfluentAccountCreateRequestData() .attributes( new ConfluentAccountCreateRequestAttributes() .apiKey("TESTAPIKEY123") .apiSecret("test-api-secret-123") .resources( Collections.singletonList( new ConfluentAccountResourceAttributes() .enableCustomMetrics(false) .id("resource-id-123") .resourceType("kafka") .tags(Arrays.asList("myTag", "myTag2:myValue")))) .tags(Arrays.asList("myTag", "myTag2:myValue"))) .type(ConfluentAccountType.CONFLUENT_CLOUD_ACCOUNTS)); try { ConfluentAccountResponse result = apiInstance.createConfluentAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#createConfluentAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add Confluent account ``` // Add Confluent account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; use datadog_api_client::datadogV2::model::ConfluentAccountCreateRequest; use datadog_api_client::datadogV2::model::ConfluentAccountCreateRequestAttributes; use datadog_api_client::datadogV2::model::ConfluentAccountCreateRequestData; use datadog_api_client::datadogV2::model::ConfluentAccountResourceAttributes; use datadog_api_client::datadogV2::model::ConfluentAccountType; #[tokio::main] async fn main() { let body = ConfluentAccountCreateRequest::new(ConfluentAccountCreateRequestData::new( ConfluentAccountCreateRequestAttributes::new( "TESTAPIKEY123".to_string(), "test-api-secret-123".to_string(), ) .resources(vec![ConfluentAccountResourceAttributes::new( "kafka".to_string(), ) .enable_custom_metrics(false) .id("resource-id-123".to_string()) .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()])]) .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()]), ConfluentAccountType::CONFLUENT_CLOUD_ACCOUNTS, )); let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api.create_confluent_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add Confluent account ``` /** * Add Confluent account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); const params: v2.ConfluentCloudApiCreateConfluentAccountRequest = { body: { 
data: { attributes: { apiKey: "TESTAPIKEY123", apiSecret: "test-api-secret-123", resources: [ { enableCustomMetrics: false, id: "resource-id-123", resourceType: "kafka", tags: ["myTag", "myTag2:myValue"], }, ], tags: ["myTag", "myTag2:myValue"], }, type: "confluent-cloud-accounts", }, }, }; apiInstance .createConfluentAccount(params) .then((data: v2.ConfluentAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Confluent accounts](https://docs.datadoghq.com/api/latest/confluent-cloud/#list-confluent-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/confluent-cloud/#list-confluent-accounts-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.datadoghq.eu/api/v2/integrations/confluent-cloud/accountshttps://api.ddog-gov.com/api/v2/integrations/confluent-cloud/accountshttps://api.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.us3.datadoghq.com/api/v2/integrations/confluent-cloud/accountshttps://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts ### Overview List Confluent accounts. This endpoint requires the `integrations_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/confluent-cloud/#ListConfluentAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) Confluent account returned by the API. Field Type Description data [object] The Confluent account. attributes [_required_] object The attributes of a Confluent account. api_key [_required_] string The API key associated with your Confluent account. resources [object] A list of Confluent resources associated with the Confluent account. enable_custom_metrics boolean Enable the `custom.consumer_lag_offset` metric, which contains extra metric tags. id string The ID associated with the Confluent resource. resource_type [_required_] string The resource type of the Resource. Can be `kafka`, `connector`, `ksql`, or `schema_registry`. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. tags [string] A list of strings representing tags. Can be a single key, or key-value pairs separated by a colon. id [_required_] string A randomly generated ID associated with a Confluent account. type [_required_] enum The JSON:API type for this API. Should always be `confluent-cloud-accounts`. 
Allowed enum values: `confluent-cloud-accounts` default: `confluent-cloud-accounts` ``` { "data": [ { "attributes": { "api_key": "TESTAPIKEY123", "resources": [ { "enable_custom_metrics": false, "id": "resource_id_abc123", "resource_type": "kafka", "tags": [ "myTag", "myTag2:myValue" ] } ], "tags": [ "myTag", "myTag2:myValue" ] }, "id": "account_id_abc123", "type": "confluent-cloud-accounts" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Example](https://docs.datadoghq.com/api/latest/confluent-cloud/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/confluent-cloud/?code-lang=typescript) ##### List Confluent accounts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/confluent-cloud/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Confluent accounts ``` """ List Confluent accounts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.confluent_cloud_api import ConfluentCloudApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ConfluentCloudApi(api_client) response = api_instance.list_confluent_account() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Confluent accounts ``` # List Confluent accounts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ConfluentCloudAPI.new p 
api_instance.list_confluent_account() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Confluent accounts ``` // List Confluent accounts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewConfluentCloudApi(apiClient) resp, r, err := api.ListConfluentAccount(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ConfluentCloudApi.ListConfluentAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ConfluentCloudApi.ListConfluentAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Confluent accounts ``` // List Confluent accounts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ConfluentCloudApi; import com.datadog.api.client.v2.model.ConfluentAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ConfluentCloudApi apiInstance = new ConfluentCloudApi(defaultClient); try { ConfluentAccountsResponse result = apiInstance.listConfluentAccount(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ConfluentCloudApi#listConfluentAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Confluent accounts ``` // List Confluent accounts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_confluent_cloud::ConfluentCloudAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ConfluentCloudAPI::with_config(configuration); let resp = api.list_confluent_account().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` 
and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Confluent accounts ``` /** * List Confluent accounts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ConfluentCloudApi(configuration); apiInstance .listConfluentAccount() .then((data: v2.ConfluentAccountsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/container-images/ # Container Images The Container Images API allows you to query Container Image data for your organization. See the [Container Images View page](https://docs.datadoghq.com/infrastructure/containers/container_images/) for more information. ## [Get all Container Images](https://docs.datadoghq.com/api/latest/container-images/#get-all-container-images) * [v2 (latest)](https://docs.datadoghq.com/api/latest/container-images/#get-all-container-images-v2) GET https://api.ap1.datadoghq.com/api/v2/container_imageshttps://api.ap2.datadoghq.com/api/v2/container_imageshttps://api.datadoghq.eu/api/v2/container_imageshttps://api.ddog-gov.com/api/v2/container_imageshttps://api.datadoghq.com/api/v2/container_imageshttps://api.us3.datadoghq.com/api/v2/container_imageshttps://api.us5.datadoghq.com/api/v2/container_images ### Overview Get all Container Images for your organization. **Note** : To enrich the data returned by this endpoint with security scans, see the new [api/v2/security/scanned-assets-metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#list-scanned-assets-metadata) endpoint.
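Before the per-language examples below, here is a minimal sketch of combining the query strings documented under Arguments (`filter[tags]`, `page[size]`, `page[cursor]`) with the Python client to page through all matching Container Images. This is a hedged illustration, not the canonical example: it assumes the client maps those query strings to the keyword arguments `filter_tags`, `page_size`, and `page_cursor`, and the tag value `short_image:nginx` is purely illustrative.

```
# Sketch: page through Container Images filtered by tag. Keyword names are
# assumed to mirror the documented query strings filter[tags], page[size],
# and page[cursor]; the tag filter is illustrative.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.container_images_api import ContainerImagesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ContainerImagesApi(api_client)

    cursor = None
    while True:
        kwargs = {"filter_tags": "short_image:nginx", "page_size": 100}
        if cursor:
            kwargs["page_cursor"] = cursor
        response = api_instance.list_container_images(**kwargs)

        for image in getattr(response, "data", None) or []:
            print(getattr(image, "id", None))

        # meta.pagination.next_cursor is provided while more pages remain
        pagination = getattr(getattr(response, "meta", None), "pagination", None)
        cursor = getattr(pagination, "next_cursor", None)
        if not cursor:
            break
```

Depending on your client version, a `list_container_images_with_pagination` helper may also be available and would replace the manual cursor loop.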
### Arguments #### Query Strings Name Type Description filter[tags] string Comma-separated list of tags to filter Container Images by. group_by string Comma-separated list of tags to group Container Images by. sort string Attribute to sort Container Images by. page[size] integer Maximum number of results returned. page[cursor] string String to query the next page of results. This key is provided with each valid response from the API in `meta.pagination.next_cursor`. ### Response * [200](https://docs.datadoghq.com/api/latest/container-images/#ListContainerImages-200-v2) * [400](https://docs.datadoghq.com/api/latest/container-images/#ListContainerImages-400-v2) * [403](https://docs.datadoghq.com/api/latest/container-images/#ListContainerImages-403-v2) * [429](https://docs.datadoghq.com/api/latest/container-images/#ListContainerImages-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/container-images/) * [Example](https://docs.datadoghq.com/api/latest/container-images/) List of Container Images. Field Type Description data [ ] Array of Container Image objects. Option 1 object Container Image object. attributes object Attributes for a Container Image. container_count int64 Number of containers running the image. image_flavors [object] List of platform-specific images associated with the image record. The list contains more than 1 entry for multi-architecture images. built_at string Time the platform-specific Container Image was built. os_architecture string Operating System architecture supported by the Container Image. os_name string Operating System name supported by the Container Image. os_version string Operating System version supported by the Container Image. size int64 Size of the platform-specific Container Image. image_tags [string] List of image tags associated with the Container Image. images_built_at [string] List of build times associated with the Container Image. The list contains more than 1 entry for multi-architecture images. name string Name of the Container Image. os_architectures [string] List of Operating System architectures supported by the Container Image. os_names [string] List of Operating System names supported by the Container Image. os_versions [string] List of Operating System versions supported by the Container Image. published_at string Time the image was pushed to the container registry. registry string Registry the Container Image was pushed to. repo_digest string Digest of the compressed image manifest. repository string Repository where the Container Image is stored in. short_image string Short version of the Container Image name. sizes [integer] List of size for each platform-specific image associated with the image record. The list contains more than 1 entry for multi-architecture images. sources [string] List of sources where the Container Image was collected from. tags [string] List of tags associated with the Container Image. vulnerability_count object Vulnerability counts associated with the Container Image. asset_id string ID of the Container Image. critical int64 Number of vulnerabilities with CVSS Critical severity. high int64 Number of vulnerabilities with CVSS High severity. low int64 Number of vulnerabilities with CVSS Low severity. medium int64 Number of vulnerabilities with CVSS Medium severity. none int64 Number of vulnerabilities with CVSS None severity. unknown int64 Number of vulnerabilities with an unknown CVSS severity. id string Container Image ID. type enum Type of Container Image. 
Allowed enum values: `container_image` default: `container_image` Option 2 object Container Image Group object. attributes object Attributes for a Container Image Group. count int64 Number of Container Images in the group. name string Name of the Container Image group. tags object Tags from the group name parsed in key/value format. id string Container Image Group ID. relationships object Relationships inside a Container Image Group. container_images object Relationships to Container Images inside a Container Image Group. data [string] Links data. links object Links attributes. related string Link to related Container Images. type enum Type of Container Image Group. Allowed enum values: `container_image_group` default: `container_image_group` links object Pagination links. first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to previous page. self string Link to current page. meta object Response metadata object. pagination object Paging attributes. cursor string The cursor used to get the current results, if any. limit int32 Number of results returned next_cursor string The cursor used to get the next results, if any. prev_cursor string The cursor used to get the previous results, if any. total int64 Total number of records that match the query. type enum Type of Container Image pagination. Allowed enum values: `cursor_limit` default: `cursor_limit` ``` { "data": [ { "attributes": { "container_count": "integer", "image_flavors": [ { "built_at": "string", "os_architecture": "string", "os_name": "string", "os_version": "string", "size": "integer" } ], "image_tags": [], "images_built_at": [], "name": "string", "os_architectures": [ "amd64" ], "os_names": [ "linux" ], "os_versions": [], "published_at": "string", "registry": "string", "repo_digest": "string", "repository": "string", "short_image": "string", "sizes": [], "sources": [], "tags": [], "vulnerability_count": { "asset_id": "string", "critical": "integer", "high": "integer", "low": "integer", "medium": "integer", "none": "integer", "unknown": "integer" } }, "id": "string", "type": "container_image" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "cursor": "string", "limit": "integer", "next_cursor": "string", "prev_cursor": "string", "total": "integer", "type": "cursor_limit" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/container-images/) * [Example](https://docs.datadoghq.com/api/latest/container-images/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/container-images/) * [Example](https://docs.datadoghq.com/api/latest/container-images/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/container-images/) * [Example](https://docs.datadoghq.com/api/latest/container-images/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/container-images/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/container-images/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/container-images/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/container-images/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/container-images/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/container-images/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/container-images/?code-lang=typescript) ##### Get all Container Images Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/container_images" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Container Images ``` """ Get all Container Images returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.container_images_api import ContainerImagesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ContainerImagesApi(api_client) response = api_instance.list_container_images() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Container Images ``` # Get all Container Images returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ContainerImagesAPI.new p api_instance.list_container_images() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Container Images ``` // Get all Container Images returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewContainerImagesApi(apiClient) resp, r, err := api.ListContainerImages(ctx, *datadogV2.NewListContainerImagesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ContainerImagesApi.ListContainerImages`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ContainerImagesApi.ListContainerImages`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Container Images ``` // Get all Container Images returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ContainerImagesApi; import com.datadog.api.client.v2.model.ContainerImagesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ContainerImagesApi apiInstance = new ContainerImagesApi(defaultClient); try { ContainerImagesResponse result = apiInstance.listContainerImages(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ContainerImagesApi#listContainerImages"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Container Images ``` // Get all Container Images returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_container_images::ContainerImagesAPI; use datadog_api_client::datadogV2::api_container_images::ListContainerImagesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ContainerImagesAPI::with_config(configuration); let resp = api .list_container_images(ListContainerImagesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Container Images ``` /** * Get all Container Images returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ContainerImagesApi(configuration); apiInstance .listContainerImages() .then((data: v2.ContainerImagesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/containers/ # Containers The Containers API allows you to query container data for your organization. See the [Container Monitoring page](https://docs.datadoghq.com/containers/) for more information. ## [Get All Containers](https://docs.datadoghq.com/api/latest/containers/#get-all-containers) * [v2 (latest)](https://docs.datadoghq.com/api/latest/containers/#get-all-containers-v2) GET https://api.ap1.datadoghq.com/api/v2/containershttps://api.ap2.datadoghq.com/api/v2/containershttps://api.datadoghq.eu/api/v2/containershttps://api.ddog-gov.com/api/v2/containershttps://api.datadoghq.com/api/v2/containershttps://api.us3.datadoghq.com/api/v2/containershttps://api.us5.datadoghq.com/api/v2/containers ### Overview Get all containers for your organization. ### Arguments #### Query Strings Name Type Description filter[tags] string Comma-separated list of tags to filter containers by. group_by string Comma-separated list of tags to group containers by. sort string Attribute to sort containers by. page[size] integer Maximum number of results returned. page[cursor] string String to query the next page of results. This key is provided with each valid response from the API in `meta.pagination.next_cursor`.
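As a quick illustration of how the query strings above combine, the following hedged sketch filters containers by an environment tag and groups the results by image using the Python client. The keyword arguments `filter_tags`, `group_by`, and `page_size` are assumed to mirror `filter[tags]`, `group_by`, and `page[size]`, and the tag values are illustrative.

```
# Sketch: list containers filtered by tag and grouped by image. Keyword names
# are assumed to mirror the documented query strings; tag values are illustrative.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.containers_api import ContainersApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ContainersApi(api_client)

    response = api_instance.list_containers(
        filter_tags="env:prod",   # filter[tags]
        group_by="short_image",   # group_by
        page_size=50,             # page[size]
    )

    # With group_by set, data contains container_group objects whose
    # attributes include the number of containers in each group.
    for group in getattr(response, "data", None) or []:
        attributes = getattr(group, "attributes", None)
        print(getattr(group, "id", None), getattr(attributes, "count", None))
```

Cursor pagination with `page[cursor]` and `meta.pagination.next_cursor` works the same way as in the Container Images sketch earlier.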
### Response * [200](https://docs.datadoghq.com/api/latest/containers/#ListContainers-200-v2) * [400](https://docs.datadoghq.com/api/latest/containers/#ListContainers-400-v2) * [403](https://docs.datadoghq.com/api/latest/containers/#ListContainers-403-v2) * [429](https://docs.datadoghq.com/api/latest/containers/#ListContainers-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/containers/) * [Example](https://docs.datadoghq.com/api/latest/containers/) List of containers. Field Type Description data [ ] Array of Container objects. Option 1 object Container object. attributes object Attributes for a container. container_id string The ID of the container. created_at string Time the container was created. host string Hostname of the host running the container. image_digest string Digest of the compressed image manifest. image_name string Name of the associated container image. image_tags [string] List of image tags associated with the container image. name string Name of the container. started_at string Time the container was started. state string State of the container. This depends on the container runtime. tags [string] List of tags associated with the container. id string Container ID. type enum Type of container. Allowed enum values: `container` default: `container` Option 2 object Container group object. attributes object Attributes for a container group. count int64 Number of containers in the group. tags object Tags from the group name parsed in key/value format. id string Container Group ID. relationships object Relationships to containers inside a container group. containers object Relationships to Containers inside a Container Group. data [string] Links data. links object Links attributes. related string Link to related containers. type enum Type of container group. Allowed enum values: `container_group` default: `container_group` links object Pagination links. first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to previous page. self string Link to current page. meta object Response metadata object. pagination object Paging attributes. cursor string The cursor used to get the current results, if any. limit int32 Number of results returned next_cursor string The cursor used to get the next results, if any. prev_cursor string The cursor used to get the previous results, if any. total int64 Total number of records that match the query. type enum Type of Container pagination. Allowed enum values: `cursor_limit` default: `cursor_limit` ``` { "data": [ { "attributes": { "container_id": "string", "created_at": "string", "host": "string", "image_digest": "string", "image_name": "string", "image_tags": [], "name": "string", "started_at": "string", "state": "string", "tags": [] }, "id": "string", "type": "container" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "cursor": "string", "limit": "integer", "next_cursor": "string", "prev_cursor": "string", "total": "integer", "type": "cursor_limit" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/containers/) * [Example](https://docs.datadoghq.com/api/latest/containers/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/containers/) * [Example](https://docs.datadoghq.com/api/latest/containers/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/containers/) * [Example](https://docs.datadoghq.com/api/latest/containers/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/containers/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/containers/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/containers/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/containers/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/containers/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/containers/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/containers/?code-lang=typescript) ##### Get All Containers Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/containers" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get All Containers ``` """ Get All Containers returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.containers_api import ContainersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ContainersApi(api_client) response = api_instance.list_containers() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get All Containers ``` # Get All Containers returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ContainersAPI.new p api_instance.list_containers() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get All Containers ``` // Get All Containers returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewContainersApi(apiClient) resp, r, err := api.ListContainers(ctx, *datadogV2.NewListContainersOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ContainersApi.ListContainers`: %v\n", err) fmt.Fprintf(os.Stderr, 
"Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ContainersApi.ListContainers`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get All Containers ``` // Get All Containers returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ContainersApi; import com.datadog.api.client.v2.model.ContainersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ContainersApi apiInstance = new ContainersApi(defaultClient); try { ContainersResponse result = apiInstance.listContainers(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ContainersApi#listContainers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get All Containers ``` // Get All Containers returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_containers::ContainersAPI; use datadog_api_client::datadogV2::api_containers::ListContainersOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ContainersAPI::with_config(configuration); let resp = api .list_containers(ListContainersOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get All Containers ``` /** * Get All Containers returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ContainersApi(configuration); apiInstance .listContainers() .then((data: v2.ContainersResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/csm-agents/ # CSM Agents Datadog Cloud Security Management (CSM) delivers real-time threat detection and continuous configuration audits across your entire cloud infrastructure, all in a unified view for seamless collaboration and faster remediation. See the Cloud Security Management documentation to learn more. ## [Get all CSM Agents](https://docs.datadoghq.com/api/latest/csm-agents/#get-all-csm-agents) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-agents/#get-all-csm-agents-v2) GET https://api.ap1.datadoghq.com/api/v2/csm/onboarding/agentshttps://api.ap2.datadoghq.com/api/v2/csm/onboarding/agentshttps://api.datadoghq.eu/api/v2/csm/onboarding/agentshttps://api.ddog-gov.com/api/v2/csm/onboarding/agentshttps://api.datadoghq.com/api/v2/csm/onboarding/agentshttps://api.us3.datadoghq.com/api/v2/csm/onboarding/agentshttps://api.us5.datadoghq.com/api/v2/csm/onboarding/agents ### Overview Get the list of all CSM Agents running on your hosts and containers. ### Arguments #### Query Strings Name Type Description page integer The page index for pagination (zero-based). size integer The number of items to include in a single page. query string A search query string to filter results (for example, `hostname:COMP-T2H4J27423`). order_direction enum The sort direction for results. Use `asc` for ascending or `desc` for descending.
Allowed enum values: `asc, desc` ### Response * [200](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMAgents-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMAgents-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMAgents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) Response object that includes a list of CSM Agents. Field Type Description data [object] A list of Agents. attributes object A CSM Agent returned by the API. agent_version string Version of the Datadog Agent. aws_fargate string AWS Fargate details. cluster_name [string] List of cluster names associated with the Agent. datadog_agent string Unique identifier for the Datadog Agent. ecs_fargate_task_arn string ARN of the ECS Fargate task. envs [string] List of environments associated with the Agent. host_id int64 ID of the host. hostname string Name of the host. install_method_installer_version string Version of the installer used for installing the Datadog Agent. install_method_tool string Tool used for installing the Datadog Agent. is_csm_vm_containers_enabled boolean Indicates if CSM VM Containers is enabled. is_csm_vm_hosts_enabled boolean Indicates if CSM VM Hosts is enabled. is_cspm_enabled boolean Indicates if CSPM is enabled. is_cws_enabled boolean Indicates if CWS is enabled. is_cws_remote_configuration_enabled boolean Indicates if CWS Remote Configuration is enabled. is_remote_configuration_enabled boolean Indicates if Remote Configuration is enabled. os string Operating system of the host. id string The ID of the Agent. type enum The type of the resource. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` meta object Metadata related to the paginated response. page_index int64 The index of the current page in the paginated results. page_size int64 The number of items per page in the paginated results. total_filtered int64 Total number of items that match the filter criteria. ``` { "data": [ { "attributes": { "agent_version": "string", "aws_fargate": "string", "cluster_name": [], "datadog_agent": "string", "ecs_fargate_task_arn": "string", "envs": [], "host_id": "integer", "hostname": "string", "install_method_installer_version": "string", "install_method_tool": "string", "is_csm_vm_containers_enabled": false, "is_csm_vm_hosts_enabled": false, "is_cspm_enabled": false, "is_cws_enabled": false, "is_cws_remote_configuration_enabled": false, "is_remote_configuration_enabled": false, "os": "string" }, "id": "fffffc5505f6a006fdf7cf5aae053653", "type": "datadog_agent" } ], "meta": { "page_index": 0, "page_size": 10, "total_filtered": 128697 } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=typescript) ##### Get all CSM Agents Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/csm/onboarding/agents" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all CSM Agents ``` """ Get all CSM Agents returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_agents_api import CSMAgentsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMAgentsApi(api_client) response = api_instance.list_all_csm_agents() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all CSM Agents ``` # Get all CSM Agents returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMAgentsAPI.new p api_instance.list_all_csm_agents() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all CSM Agents ``` // Get all CSM Agents returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMAgentsApi(apiClient) resp, r, err := api.ListAllCSMAgents(ctx, *datadogV2.NewListAllCSMAgentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMAgentsApi.ListAllCSMAgents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMAgentsApi.ListAllCSMAgents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get 
all CSM Agents ``` // Get all CSM Agents returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmAgentsApi; import com.datadog.api.client.v2.model.CsmAgentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmAgentsApi apiInstance = new CsmAgentsApi(defaultClient); try { CsmAgentsResponse result = apiInstance.listAllCSMAgents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmAgentsApi#listAllCSMAgents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all CSM Agents ``` // Get all CSM Agents returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_agents::CSMAgentsAPI; use datadog_api_client::datadogV2::api_csm_agents::ListAllCSMAgentsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMAgentsAPI::with_config(configuration); let resp = api .list_all_csm_agents(ListAllCSMAgentsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all CSM Agents ``` /** * Get all CSM Agents returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMAgentsApi(configuration); apiInstance .listAllCSMAgents() .then((data: v2.CsmAgentsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all CSM Serverless Agents](https://docs.datadoghq.com/api/latest/csm-agents/#get-all-csm-serverless-agents) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-agents/#get-all-csm-serverless-agents-v2) GET https://api.ap1.datadoghq.com/api/v2/csm/onboarding/serverless/agentshttps://api.ap2.datadoghq.com/api/v2/csm/onboarding/serverless/agentshttps://api.datadoghq.eu/api/v2/csm/onboarding/serverless/agentshttps://api.ddog-gov.com/api/v2/csm/onboarding/serverless/agentshttps://api.datadoghq.com/api/v2/csm/onboarding/serverless/agentshttps://api.us3.datadoghq.com/api/v2/csm/onboarding/serverless/agentshttps://api.us5.datadoghq.com/api/v2/csm/onboarding/serverless/agents ### Overview Get the list of all CSM Serverless Agents running on your hosts and containers. ### Arguments #### Query Strings Name Type Description page integer The page index for pagination (zero-based). size integer The number of items to include in a single page. query string A search query string to filter results (for example, `hostname:COMP-T2H4J27423`). order_direction enum The sort direction for results. Use `asc` for ascending or `desc` for descending. Allowed enum values: `asc, desc` ### Response * [200](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMServerlessAgents-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMServerlessAgents-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-agents/#ListAllCSMServerlessAgents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) Response object that includes a list of CSM Agents. Field Type Description data [object] A list of Agents. attributes object A CSM Agent returned by the API. agent_version string Version of the Datadog Agent. aws_fargate string AWS Fargate details. cluster_name [string] List of cluster names associated with the Agent. datadog_agent string Unique identifier for the Datadog Agent. ecs_fargate_task_arn string ARN of the ECS Fargate task. envs [string] List of environments associated with the Agent. host_id int64 ID of the host. hostname string Name of the host. install_method_installer_version string Version of the installer used for installing the Datadog Agent. install_method_tool string Tool used for installing the Datadog Agent. is_csm_vm_containers_enabled boolean Indicates if CSM VM Containers is enabled. is_csm_vm_hosts_enabled boolean Indicates if CSM VM Hosts is enabled. is_cspm_enabled boolean Indicates if CSPM is enabled. is_cws_enabled boolean Indicates if CWS is enabled. is_cws_remote_configuration_enabled boolean Indicates if CWS Remote Configuration is enabled. is_remote_configuration_enabled boolean Indicates if Remote Configuration is enabled. os string Operating system of the host. id string The ID of the Agent. type enum The type of the resource. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` meta object Metadata related to the paginated response. 
page_index int64 The index of the current page in the paginated results. page_size int64 The number of items per page in the paginated results. total_filtered int64 Total number of items that match the filter criteria. ``` { "data": [ { "attributes": { "agent_version": "string", "aws_fargate": "string", "cluster_name": [], "datadog_agent": "string", "ecs_fargate_task_arn": "string", "envs": [], "host_id": "integer", "hostname": "string", "install_method_installer_version": "string", "install_method_tool": "string", "is_csm_vm_containers_enabled": false, "is_csm_vm_hosts_enabled": false, "is_cspm_enabled": false, "is_cws_enabled": false, "is_cws_remote_configuration_enabled": false, "is_remote_configuration_enabled": false, "os": "string" }, "id": "fffffc5505f6a006fdf7cf5aae053653", "type": "datadog_agent" } ], "meta": { "page_index": 0, "page_size": 10, "total_filtered": 128697 } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-agents/) * [Example](https://docs.datadoghq.com/api/latest/csm-agents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-agents/?code-lang=typescript) ##### Get all CSM Serverless Agents Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/csm/onboarding/serverless/agents" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all CSM Serverless Agents ``` """ Get all CSM Serverless Agents returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_agents_api import CSMAgentsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMAgentsApi(api_client) response = api_instance.list_all_csm_serverless_agents() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all CSM Serverless Agents ``` # Get all CSM Serverless Agents returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMAgentsAPI.new p api_instance.list_all_csm_serverless_agents() ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all CSM Serverless Agents ``` // Get all CSM Serverless Agents returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMAgentsApi(apiClient) resp, r, err := api.ListAllCSMServerlessAgents(ctx, *datadogV2.NewListAllCSMServerlessAgentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMAgentsApi.ListAllCSMServerlessAgents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMAgentsApi.ListAllCSMServerlessAgents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all CSM Serverless Agents ``` // Get all CSM Serverless Agents returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmAgentsApi; import com.datadog.api.client.v2.model.CsmAgentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmAgentsApi apiInstance = new CsmAgentsApi(defaultClient); try { CsmAgentsResponse result = apiInstance.listAllCSMServerlessAgents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmAgentsApi#listAllCSMServerlessAgents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all CSM Serverless Agents ``` // Get all CSM Serverless Agents returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_agents::CSMAgentsAPI; use datadog_api_client::datadogV2::api_csm_agents::ListAllCSMServerlessAgentsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMAgentsAPI::with_config(configuration); let resp = api .list_all_csm_serverless_agents(ListAllCSMServerlessAgentsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and 
its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all CSM Serverless Agents ``` /** * Get all CSM Serverless Agents returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMAgentsApi(configuration); apiInstance .listAllCSMServerlessAgents() .then((data: v2.CsmAgentsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/csm-coverage-analysis/ # CSM Coverage Analysis Datadog Cloud Security Management (CSM) delivers real-time threat detection and continuous configuration audits across your entire cloud infrastructure, all in a unified view for seamless collaboration and faster remediation. Go to the Cloud Security Management documentation to learn more.
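The three Coverage Analysis endpoints below share the same response shape: each `*_coverage` object reports `configured_resources_count`, `partially_configured_resources_count`, `total_resources_count`, and a `coverage` fraction (in the documented examples, 8 of 10 fully configured resources yields a coverage of 0.8). As a rough, unofficial sketch, the snippet below pulls the `total_coverage` object from each of the three endpoints using the same `datadog_api_client` Python client as the per-endpoint examples; the attribute-style access on the response models is an assumption based on the documented fields.

```
# Unofficial sketch: print total CSM coverage for cloud accounts,
# hosts and containers, and serverless resources.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_coverage_analysis_api import CSMCoverageAnalysisApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = CSMCoverageAnalysisApi(api_client)

    reports = {
        "cloud_accounts": api.get_csm_cloud_accounts_coverage_analysis(),
        "hosts_and_containers": api.get_csm_hosts_and_containers_coverage_analysis(),
        "serverless": api.get_csm_serverless_coverage_analysis(),
    }

    for name, resp in reports.items():
        # total_coverage carries the documented counts and a 0-1 coverage fraction
        total = resp.data.attributes.total_coverage
        print(
            f"{name}: {total.configured_resources_count}/{total.total_resources_count} "
            f"fully configured resources ({total.coverage:.0%} coverage)"
        )
```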
## [Get the CSM Cloud Accounts Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-cloud-accounts-coverage-analysis) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-cloud-accounts-coverage-analysis-v2) GET https://api.ap1.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.ap2.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.datadoghq.eu/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.ddog-gov.com/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.us3.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accountshttps://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accounts ### Overview Get the CSM Coverage Analysis of your Cloud Accounts. This is calculated based on the number of your Cloud Accounts that are scanned for security issues. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMCloudAccountsCoverageAnalysis-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMCloudAccountsCoverageAnalysis-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMCloudAccountsCoverageAnalysis-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) CSM Cloud Accounts Coverage Analysis response. Field Type Description data object CSM Cloud Accounts Coverage Analysis data. attributes object CSM Cloud Accounts Coverage Analysis attributes. aws_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. azure_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. gcp_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. org_id int64 The ID of your organization. total_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. id string The ID of your organization. type string The type of the resource. The value should always be `get_cloud_accounts_coverage_analysis_response_public_v0`. 
default: `get_cloud_accounts_coverage_analysis_response_public_v0` ``` { "data": { "attributes": { "aws_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "azure_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "gcp_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "org_id": 123456, "total_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 } }, "id": "66b3c6b5-5c9a-457e-b1c3-f247ca23afa3", "type": "get_cloud_accounts_coverage_analysis_response_public_v0" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=typescript) ##### Get the CSM Cloud Accounts Coverage Analysis Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/cloud_accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` """ Get the CSM Cloud Accounts Coverage Analysis returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_coverage_analysis_api import CSMCoverageAnalysisApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMCoverageAnalysisApi(api_client) response = api_instance.get_csm_cloud_accounts_coverage_analysis() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` # Get the CSM Cloud Accounts Coverage Analysis returns "OK" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::CSMCoverageAnalysisAPI.new p api_instance.get_csm_cloud_accounts_coverage_analysis() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` // Get the CSM Cloud Accounts Coverage Analysis returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMCoverageAnalysisApi(apiClient) resp, r, err := api.GetCSMCloudAccountsCoverageAnalysis(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMCoverageAnalysisApi.GetCSMCloudAccountsCoverageAnalysis`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMCoverageAnalysisApi.GetCSMCloudAccountsCoverageAnalysis`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` // Get the CSM Cloud Accounts Coverage Analysis returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmCoverageAnalysisApi; import com.datadog.api.client.v2.model.CsmCloudAccountsCoverageAnalysisResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmCoverageAnalysisApi apiInstance = new CsmCoverageAnalysisApi(defaultClient); try { CsmCloudAccountsCoverageAnalysisResponse result = apiInstance.getCSMCloudAccountsCoverageAnalysis(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmCoverageAnalysisApi#getCSMCloudAccountsCoverageAnalysis"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` // Get the CSM Cloud Accounts Coverage Analysis returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_coverage_analysis::CSMCoverageAnalysisAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = 
CSMCoverageAnalysisAPI::with_config(configuration); let resp = api.get_csm_cloud_accounts_coverage_analysis().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the CSM Cloud Accounts Coverage Analysis ``` /** * Get the CSM Cloud Accounts Coverage Analysis returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMCoverageAnalysisApi(configuration); apiInstance .getCSMCloudAccountsCoverageAnalysis() .then((data: v2.CsmCloudAccountsCoverageAnalysisResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the CSM Hosts and Containers Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-hosts-and-containers-coverage-analysis) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-hosts-and-containers-coverage-analysis-v2) GET https://api.ap1.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.ap2.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.datadoghq.eu/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.ddog-gov.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.us3.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containershttps://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containers ### Overview Get the CSM Coverage Analysis of your Hosts and Containers. This is calculated based on the number of agents running on your Hosts and Containers with CSM feature(s) enabled. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMHostsAndContainersCoverageAnalysis-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMHostsAndContainersCoverageAnalysis-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMHostsAndContainersCoverageAnalysis-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) CSM Hosts and Containers Coverage Analysis response. Field Type Description data object CSM Hosts and Containers Coverage Analysis data. attributes object CSM Hosts and Containers Coverage Analysis attributes. cspm_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. 
partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. cws_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. org_id int64 The ID of your organization. total_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. vm_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. id string The ID of your organization. type string The type of the resource. The value should always be `get_hosts_and_containers_coverage_analysis_response_public_v0`. default: `get_hosts_and_containers_coverage_analysis_response_public_v0` ``` { "data": { "attributes": { "cspm_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "cws_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "org_id": 123456, "total_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "vm_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 } }, "id": "66b3c6b5-5c9a-457e-b1c3-f247ca23afa3", "type": "get_hosts_and_containers_coverage_analysis_response_public_v0" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=typescript) ##### Get the CSM Hosts and Containers Coverage Analysis Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/hosts_and_containers" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` """ Get the CSM Hosts and Containers Coverage Analysis returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_coverage_analysis_api import CSMCoverageAnalysisApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMCoverageAnalysisApi(api_client) response = api_instance.get_csm_hosts_and_containers_coverage_analysis() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` # Get the CSM Hosts and Containers Coverage Analysis returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMCoverageAnalysisAPI.new p api_instance.get_csm_hosts_and_containers_coverage_analysis() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` // Get the CSM Hosts and Containers Coverage Analysis returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMCoverageAnalysisApi(apiClient) resp, r, err := api.GetCSMHostsAndContainersCoverageAnalysis(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMCoverageAnalysisApi.GetCSMHostsAndContainersCoverageAnalysis`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, 
"Response from `CSMCoverageAnalysisApi.GetCSMHostsAndContainersCoverageAnalysis`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` // Get the CSM Hosts and Containers Coverage Analysis returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmCoverageAnalysisApi; import com.datadog.api.client.v2.model.CsmHostsAndContainersCoverageAnalysisResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmCoverageAnalysisApi apiInstance = new CsmCoverageAnalysisApi(defaultClient); try { CsmHostsAndContainersCoverageAnalysisResponse result = apiInstance.getCSMHostsAndContainersCoverageAnalysis(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmCoverageAnalysisApi#getCSMHostsAndContainersCoverageAnalysis"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` // Get the CSM Hosts and Containers Coverage Analysis returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_coverage_analysis::CSMCoverageAnalysisAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMCoverageAnalysisAPI::with_config(configuration); let resp = api.get_csm_hosts_and_containers_coverage_analysis().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the CSM Hosts and Containers Coverage Analysis ``` /** * Get the CSM Hosts and Containers Coverage Analysis returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMCoverageAnalysisApi(configuration); apiInstance .getCSMHostsAndContainersCoverageAnalysis() .then((data: v2.CsmHostsAndContainersCoverageAnalysisResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the CSM Serverless Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-serverless-coverage-analysis) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#get-the-csm-serverless-coverage-analysis-v2) GET https://api.ap1.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.ap2.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.datadoghq.eu/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.ddog-gov.com/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.us3.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverlesshttps://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverless ### Overview Get the CSM Coverage Analysis of your Serverless Resources. This is calculated based on the number of agents running on your Serverless Resources with CSM feature(s) enabled. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMServerlessCoverageAnalysis-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMServerlessCoverageAnalysis-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/#GetCSMServerlessCoverageAnalysis-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) CSM Serverless Resources Coverage Analysis response. Field Type Description data object CSM Serverless Resources Coverage Analysis data. attributes object CSM Serverless Resources Coverage Analysis attributes. cws_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. org_id int64 The ID of your organization. total_coverage object CSM Coverage Analysis. configured_resources_count int64 The number of fully configured resources. coverage double The coverage percentage. partially_configured_resources_count int64 The number of partially configured resources. total_resources_count int64 The total number of resources. id string The ID of your organization. type string The type of the resource. The value should always be `get_serverless_coverage_analysis_response_public_v0`. 
default: `get_serverless_coverage_analysis_response_public_v0` ``` { "data": { "attributes": { "cws_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 }, "org_id": 123456, "total_coverage": { "configured_resources_count": 8, "coverage": 0.8, "partially_configured_resources_count": 0, "total_resources_count": 10 } }, "id": "66b3c6b5-5c9a-457e-b1c3-f247ca23afa3", "type": "get_serverless_coverage_analysis_response_public_v0" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Example](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/?code-lang=typescript) ##### Get the CSM Serverless Coverage Analysis Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/csm/onboarding/coverage_analysis/serverless" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the CSM Serverless Coverage Analysis ``` """ Get the CSM Serverless Coverage Analysis returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_coverage_analysis_api import CSMCoverageAnalysisApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMCoverageAnalysisApi(api_client) response = api_instance.get_csm_serverless_coverage_analysis() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the CSM Serverless Coverage Analysis ``` # Get the CSM Serverless Coverage Analysis returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMCoverageAnalysisAPI.new p api_instance.get_csm_serverless_coverage_analysis() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the CSM Serverless Coverage Analysis ``` // Get the CSM Serverless Coverage Analysis returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMCoverageAnalysisApi(apiClient) resp, r, err := api.GetCSMServerlessCoverageAnalysis(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMCoverageAnalysisApi.GetCSMServerlessCoverageAnalysis`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMCoverageAnalysisApi.GetCSMServerlessCoverageAnalysis`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the CSM Serverless Coverage Analysis ``` // Get the CSM Serverless Coverage Analysis returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmCoverageAnalysisApi; import com.datadog.api.client.v2.model.CsmServerlessCoverageAnalysisResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmCoverageAnalysisApi apiInstance = new CsmCoverageAnalysisApi(defaultClient); try { CsmServerlessCoverageAnalysisResponse result = apiInstance.getCSMServerlessCoverageAnalysis(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmCoverageAnalysisApi#getCSMServerlessCoverageAnalysis"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the CSM Serverless Coverage Analysis ``` // Get the CSM Serverless Coverage Analysis returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_coverage_analysis::CSMCoverageAnalysisAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMCoverageAnalysisAPI::with_config(configuration); let resp = api.get_csm_serverless_coverage_analysis().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to 
`src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the CSM Serverless Coverage Analysis ``` /** * Get the CSM Serverless Coverage Analysis returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMCoverageAnalysisApi(configuration); apiInstance .getCSMServerlessCoverageAnalysis() .then((data: v2.CsmServerlessCoverageAnalysisResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
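If the serverless CWS coverage reported by this endpoint is below 100%, the `Get all CSM Serverless Agents` endpoint documented earlier can help identify which Agents are missing the feature. The sketch below is a rough, unofficial illustration: it assumes the documented query strings (`page`, `size`) are accepted as keyword arguments by the Python client's `list_all_csm_serverless_agents` method, and that the response models expose the documented fields as attributes.

```
# Unofficial sketch: list serverless Agents without CWS enabled when the
# serverless CWS coverage is incomplete. The page/size keyword arguments
# are assumed to map to the documented query strings.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_agents_api import CSMAgentsApi
from datadog_api_client.v2.api.csm_coverage_analysis_api import CSMCoverageAnalysisApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    coverage_api = CSMCoverageAnalysisApi(api_client)
    agents_api = CSMAgentsApi(api_client)

    # cws_coverage.coverage is a 0-1 fraction per the documented response model
    cws = coverage_api.get_csm_serverless_coverage_analysis().data.attributes.cws_coverage
    if cws.coverage < 1.0:
        page, size = 0, 100  # page index is zero-based, per the query string docs
        while True:
            resp = agents_api.list_all_csm_serverless_agents(page=page, size=size)
            for agent in resp.data:
                if not agent.attributes.is_cws_enabled:
                    print(f"CWS disabled on {agent.attributes.hostname} "
                          f"(Agent {agent.attributes.agent_version})")
            # meta.total_filtered is the total number of Agents matching the query
            if (page + 1) * size >= resp.meta.total_filtered:
                break
            page += 1
```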
--- # Source: https://docs.datadoghq.com/api/latest/csm-threats/
* [Azure Monitoring](https://www.datadoghq.com/solutions/azure/) * [Google Cloud Monitoring](https://www.datadoghq.com/solutions/googlecloud/) * [Oracle Cloud Monitoring](https://www.datadoghq.com/solutions/oci-monitoring/) * [Kubernetes Monitoring](https://www.datadoghq.com/solutions/kubernetes/) * [Red Hat OpenShift](https://www.datadoghq.com/solutions/openshift/) * [Pivotal Platform](https://www.datadoghq.com/solutions/pivotal-platform/) * [OpenAI](https://www.datadoghq.com/solutions/openai/) * [SAP Monitoring](https://www.datadoghq.com/solutions/sap-monitoring/) * [OpenTelemetry](https://www.datadoghq.com/solutions/opentelemetry/) Use-case * [Application Security](https://www.datadoghq.com/solutions/application-security/) * [Cloud Migration](https://www.datadoghq.com/solutions/cloud-migration/) * [Monitoring Consolidation](https://www.datadoghq.com/solutions/monitoring-consolidation/) * [Unified Commerce Monitoring](https://www.datadoghq.com/solutions/unified-commerce-monitoring/) * [SOAR](https://www.datadoghq.com/solutions/soar/) * [DevOps](https://www.datadoghq.com/solutions/devops/) * [FinOps](https://www.datadoghq.com/solutions/finops/) * [Shift-Left Testing](https://www.datadoghq.com/solutions/shift-left-testing/) * [Digital Experience Monitoring](https://www.datadoghq.com/solutions/digital-experience-monitoring/) * [Security Analytics](https://www.datadoghq.com/solutions/security-analytics/) * [Compliance for CIS Benchmarks](https://www.datadoghq.com/solutions/security/cis-benchmarks/aws/) * [Hybrid Cloud Monitoring](https://www.datadoghq.com/solutions/hybrid-cloud-monitoring/) * [IoT Monitoring](https://www.datadoghq.com/solutions/iot-monitoring/) * [Real-Time BI](https://www.datadoghq.com/solutions/real-time-business-intelligence/) * [On-Premises Monitoring](https://www.datadoghq.com/solutions/on-premises-monitoring/) * [Log Analysis & Correlation](https://www.datadoghq.com/solutions/log-analysis-and-correlation/) * [CNAPP](https://www.datadoghq.com/solutions/cnapp/) * [Docs](https://docs.datadoghq.com/) [![DataDog](https://datadog-docs.imgix.net/img/dd_logo_n_70x75.png?ch=Width,DPR&fit=max&auto=format&w=70&h=75) ![DataDog](https://datadog-docs.imgix.net/img/dd-logo-n-200.png?ch=Width,DPR&fit=max&auto=format&h=14&auto=format&w=807) White modal up arrow Looking for Datadog logos? You can find the logo assets on our press page. 
Download Media Assets ](https://docs.datadoghq.com/) * [About](https://www.datadoghq.com/about/leadership/) * [Contact](https://www.datadoghq.com/about/contact/) * [Partners](https://www.datadoghq.com/partner/network/) * [Latest News](https://www.datadoghq.com/about/latest-news/press-releases/) * [Events & Webinars](https://www.datadoghq.com/events-webinars/) * [Leadership](https://www.datadoghq.com/about/leadership/) * [Careers](https://careers.datadoghq.com/) * [Analyst Reports](https://www.datadoghq.com/about/analyst/) * [Investor Relations](https://investors.datadoghq.com/) * [ESG Report](https://www.datadoghq.com/esg-report/) * [Trust Hub](https://www.datadoghq.com/trust/) * [Blog](https://www.datadoghq.com/blog/) * [The Monitor](https://www.datadoghq.com/blog/) * [Engineering](https://www.datadoghq.com/blog/engineering/) * [AI](https://www.datadoghq.com/blog/ai/) * [Security Labs](https://securitylabs.datadoghq.com/) * [Login](https://app.datadoghq.com/) * [](https://docs.datadoghq.com/api/latest/csm-threats/) * [GET STARTED FREE FREE TRIAL](https://www.datadoghq.com/) [![Datadog Logo](https://datadog-docs.imgix.net/img/datadog_rbg_n_2x.png?fm=png&auto=format&lossless=1)](https://docs.datadoghq.com/) Toggle navigation [ Home ](https://www.datadoghq.com)[ Docs ](https://docs.datadoghq.com/)[ API](https://docs.datadoghq.com/api/) * [Essentials ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Getting Started](https://docs.datadoghq.com/getting_started/) * [Agent](https://docs.datadoghq.com/getting_started/agent/) * [API](https://docs.datadoghq.com/getting_started/api/) * [APM Tracing](https://docs.datadoghq.com/getting_started/tracing/) * [Containers](https://docs.datadoghq.com/getting_started/containers/) * [Autodiscovery](https://docs.datadoghq.com/getting_started/containers/autodiscovery) * [Datadog Operator](https://docs.datadoghq.com/getting_started/containers/datadog_operator) * [Dashboards](https://docs.datadoghq.com/getting_started/dashboards/) * [Database Monitoring](https://docs.datadoghq.com/getting_started/database_monitoring/) * [Datadog](https://docs.datadoghq.com/getting_started/application/) * [Datadog Site](https://docs.datadoghq.com/getting_started/site/) * [DevSecOps](https://docs.datadoghq.com/getting_started/devsecops) * [Incident Management](https://docs.datadoghq.com/getting_started/incident_management/) * [Integrations](https://docs.datadoghq.com/getting_started/integrations/) * [AWS](https://docs.datadoghq.com/getting_started/integrations/aws/) * [Azure](https://docs.datadoghq.com/getting_started/integrations/azure/) * [Google Cloud](https://docs.datadoghq.com/getting_started/integrations/google_cloud/) * [Terraform](https://docs.datadoghq.com/getting_started/integrations/terraform/) * [Internal Developer Portal](https://docs.datadoghq.com/getting_started/internal_developer_portal/) * [Logs](https://docs.datadoghq.com/getting_started/logs/) * [Monitors](https://docs.datadoghq.com/getting_started/monitors/) * [Notebooks](https://docs.datadoghq.com/getting_started/notebooks/) * [OpenTelemetry](https://docs.datadoghq.com/getting_started/opentelemetry/) * [Profiler](https://docs.datadoghq.com/getting_started/profiler/) * [Search](https://docs.datadoghq.com/getting_started/search/) * [Product-Specific Search](https://docs.datadoghq.com/getting_started/search/product_specific_reference) * [Session Replay](https://docs.datadoghq.com/getting_started/session_replay/) * [Security](https://docs.datadoghq.com/getting_started/security/) * [App and API 
Protection](https://docs.datadoghq.com/getting_started/security/application_security) * [Cloud Security](https://docs.datadoghq.com/getting_started/security/cloud_security_management/) * [Cloud SIEM](https://docs.datadoghq.com/getting_started/security/cloud_siem/) * [Code Security](https://docs.datadoghq.com/getting_started/code_security/) * [Serverless for AWS Lambda](https://docs.datadoghq.com/getting_started/serverless/) * [Software Delivery](https://docs.datadoghq.com/getting_started/software_delivery/) * [CI Visibility](https://docs.datadoghq.com/getting_started/ci_visibility/) * [Feature Flags](https://docs.datadoghq.com/getting_started/feature_flags/) * [Test Optimization](https://docs.datadoghq.com/getting_started/test_optimization/) * [Test Impact Analysis](https://docs.datadoghq.com/getting_started/test_impact_analysis/) * [Synthetic Monitoring and Testing](https://docs.datadoghq.com/getting_started/synthetics/) * [API Tests](https://docs.datadoghq.com/getting_started/synthetics/api_test) * [Browser Tests](https://docs.datadoghq.com/getting_started/synthetics/browser_test) * [Mobile App Tests](https://docs.datadoghq.com/getting_started/synthetics/mobile_app_testing) * [Continuous Testing](https://docs.datadoghq.com/getting_started/continuous_testing/) * [Private Locations](https://docs.datadoghq.com/getting_started/synthetics/private_location) * [Tags](https://docs.datadoghq.com/getting_started/tagging/) * [Assigning Tags](https://docs.datadoghq.com/getting_started/tagging/assigning_tags) * [Unified Service Tagging](https://docs.datadoghq.com/getting_started/tagging/unified_service_tagging) * [Using Tags](https://docs.datadoghq.com/getting_started/tagging/using_tags) * [Workflow Automation](https://docs.datadoghq.com/getting_started/workflow_automation/) * [Learning Center](https://docs.datadoghq.com/getting_started/learning_center/) * [Support](https://docs.datadoghq.com/getting_started/support/) * [Glossary](https://docs.datadoghq.com/glossary/) * [Standard Attributes](https://docs.datadoghq.com/standard-attributes) * [Guides](https://docs.datadoghq.com/all_guides/) * [Agent](https://docs.datadoghq.com/agent/) * [Architecture](https://docs.datadoghq.com/agent/architecture/) * [IoT](https://docs.datadoghq.com/agent/iot/) * [Supported Platforms](https://docs.datadoghq.com/agent/supported_platforms/) * [AIX](https://docs.datadoghq.com/agent/supported_platforms/aix/) * [Linux](https://docs.datadoghq.com/agent/supported_platforms/linux/) * [Ansible](https://docs.datadoghq.com/agent/supported_platforms/ansible/) * [Chef](https://docs.datadoghq.com/agent/supported_platforms/chef/) * [Heroku](https://docs.datadoghq.com/agent/supported_platforms/heroku/) * [MacOS](https://docs.datadoghq.com/agent/supported_platforms/osx/) * [Puppet](https://docs.datadoghq.com/agent/supported_platforms/puppet/) * [SaltStack](https://docs.datadoghq.com/agent/supported_platforms/saltstack/) * [SCCM](https://docs.datadoghq.com/agent/supported_platforms/sccm/) * [Windows](https://docs.datadoghq.com/agent/supported_platforms/windows/) * [From Source](https://docs.datadoghq.com/agent/supported_platforms/source/) * [Log Collection](https://docs.datadoghq.com/agent/logs/) * [Log Agent tags](https://docs.datadoghq.com/agent/logs/agent_tags/) * [Advanced Configurations](https://docs.datadoghq.com/agent/logs/advanced_log_collection) * [Proxy](https://docs.datadoghq.com/agent/logs/proxy) * [Transport](https://docs.datadoghq.com/agent/logs/log_transport) * [Multi-Line 
Detection](https://docs.datadoghq.com/agent/logs/auto_multiline_detection) * [Configuration](https://docs.datadoghq.com/agent/configuration) * [Commands](https://docs.datadoghq.com/agent/configuration/agent-commands/) * [Configuration Files](https://docs.datadoghq.com/agent/configuration/agent-configuration-files/) * [Log Files](https://docs.datadoghq.com/agent/configuration/agent-log-files/) * [Status Page](https://docs.datadoghq.com/agent/configuration/agent-status-page/) * [Network Traffic](https://docs.datadoghq.com/agent/configuration/network/) * [Proxy Configuration](https://docs.datadoghq.com/agent/configuration/proxy/) * [FIPS Compliance](https://docs.datadoghq.com/agent/configuration/fips-compliance/) * [Dual Shipping](https://docs.datadoghq.com/agent/configuration/dual-shipping/) * [Secrets Management](https://docs.datadoghq.com/agent/configuration/secrets-management/) * [Fleet Automation](https://docs.datadoghq.com/agent/fleet_automation) * [Remote Agent Management](https://docs.datadoghq.com/agent/fleet_automation/remote_management) * [Troubleshooting](https://docs.datadoghq.com/agent/troubleshooting/) * [Container Hostname Detection](https://docs.datadoghq.com/agent/troubleshooting/hostname_containers/) * [Debug Mode](https://docs.datadoghq.com/agent/troubleshooting/debug_mode/) * [Agent Flare](https://docs.datadoghq.com/agent/troubleshooting/send_a_flare/) * [Agent Check Status](https://docs.datadoghq.com/agent/troubleshooting/agent_check_status/) * [NTP Issues](https://docs.datadoghq.com/agent/troubleshooting/ntp/) * [Permission Issues](https://docs.datadoghq.com/agent/troubleshooting/permissions/) * [Integrations Issues](https://docs.datadoghq.com/agent/troubleshooting/integrations/) * [Site Issues](https://docs.datadoghq.com/agent/troubleshooting/site/) * [Autodiscovery Issues](https://docs.datadoghq.com/agent/troubleshooting/autodiscovery/) * [Windows Container Issues](https://docs.datadoghq.com/agent/troubleshooting/windows_containers) * [Agent Runtime Configuration](https://docs.datadoghq.com/agent/troubleshooting/config) * [High CPU or Memory Consumption](https://docs.datadoghq.com/agent/troubleshooting/high_memory_usage/) * [Guides](https://docs.datadoghq.com/agent/guide/) * [Data Security](https://docs.datadoghq.com/data_security/agent/) * [Integrations](https://docs.datadoghq.com/integrations/) * [Guides](https://docs.datadoghq.com/integrations/guide/) * [Developers](https://docs.datadoghq.com/developers/) * [Authorization](https://docs.datadoghq.com/developers/authorization/) * [OAuth2 in Datadog](https://docs.datadoghq.com/developers/authorization/oauth2_in_datadog/) * [Authorization Endpoints](https://docs.datadoghq.com/developers/authorization/oauth2_endpoints/) * [DogStatsD](https://docs.datadoghq.com/developers/dogstatsd/) * [Datagram Format](https://docs.datadoghq.com/developers/dogstatsd/datagram_shell) * [Unix Domain Socket](https://docs.datadoghq.com/developers/dogstatsd/unix_socket) * [High Throughput Data](https://docs.datadoghq.com/developers/dogstatsd/high_throughput/) * [Data Aggregation](https://docs.datadoghq.com/developers/dogstatsd/data_aggregation/) * [DogStatsD Mapper](https://docs.datadoghq.com/developers/dogstatsd/dogstatsd_mapper/) * [Custom Checks](https://docs.datadoghq.com/developers/custom_checks/) * [Writing a Custom Agent Check](https://docs.datadoghq.com/developers/custom_checks/write_agent_check/) * [Writing a Custom OpenMetrics Check](https://docs.datadoghq.com/developers/custom_checks/prometheus/) * 
[Integrations](https://docs.datadoghq.com/developers/integrations/) * [Build an Integration with Datadog](https://docs.datadoghq.com/developers/integrations/build_integration/) * [Create an Agent-based Integration](https://docs.datadoghq.com/developers/integrations/agent_integration/) * [Create an API-based Integration](https://docs.datadoghq.com/developers/integrations/api_integration/) * [Create a Log Pipeline](https://docs.datadoghq.com/developers/integrations/log_pipeline/) * [Integration Assets Reference](https://docs.datadoghq.com/developers/integrations/check_references/) * [Build a Marketplace Offering](https://docs.datadoghq.com/developers/integrations/marketplace_offering/) * [Create an Integration Dashboard](https://docs.datadoghq.com/developers/integrations/create-an-integration-dashboard) * [Create a Monitor Template](https://docs.datadoghq.com/developers/integrations/create-an-integration-monitor-template) * [Create a Cloud SIEM Detection Rule](https://docs.datadoghq.com/developers/integrations/create-a-cloud-siem-detection-rule) * [Install Agent Integration Developer Tool](https://docs.datadoghq.com/developers/integrations/python/) * [Service Checks](https://docs.datadoghq.com/developers/service_checks/) * [Submission - Agent Check](https://docs.datadoghq.com/developers/service_checks/agent_service_checks_submission/) * [Submission - DogStatsD](https://docs.datadoghq.com/developers/service_checks/dogstatsd_service_checks_submission/) * [Submission - API](https://docs.datadoghq.com/api/v1/service-checks/) * [IDE Plugins](https://docs.datadoghq.com/developers/ide_plugins/) * [JetBrains IDEs](https://docs.datadoghq.com/developers/ide_plugins/idea/) * [VS Code & Cursor](https://docs.datadoghq.com/developers/ide_plugins/vscode/) * [Community](https://docs.datadoghq.com/developers/community/) * [Libraries](https://docs.datadoghq.com/developers/community/libraries/) * [Guides](https://docs.datadoghq.com/developers/guide/) * [OpenTelemetry](https://docs.datadoghq.com/opentelemetry/) * [Getting Started](https://docs.datadoghq.com/opentelemetry/getting_started/) * [Datadog Example Application](https://docs.datadoghq.com/opentelemetry/getting_started/datadog_example) * [OpenTelemetry Demo Application](https://docs.datadoghq.com/opentelemetry/getting_started/otel_demo_to_datadog) * [Feature Compatibility](https://docs.datadoghq.com/opentelemetry/compatibility/) * [Instrument Your Applications](https://docs.datadoghq.com/opentelemetry/instrument/) * [OTel SDKs](https://docs.datadoghq.com/opentelemetry/instrument/otel_sdks/) * [OTel APIs with Datadog SDKs](https://docs.datadoghq.com/opentelemetry/instrument/api_support) * [OTel Instrumentation Libraries](https://docs.datadoghq.com/opentelemetry/instrument/instrumentation_libraries/) * [Configuration](https://docs.datadoghq.com/opentelemetry/config/environment_variable_support/) * [Send Data to Datadog](https://docs.datadoghq.com/opentelemetry/setup/) * [DDOT Collector (Recommended)](https://docs.datadoghq.com/opentelemetry/setup/ddot_collector) * [Other Setup Options](https://docs.datadoghq.com/opentelemetry/setup/) * [Semantic Mapping](https://docs.datadoghq.com/opentelemetry/mapping/) * [Resource Attribute Mapping](https://docs.datadoghq.com/opentelemetry/mapping/semantic_mapping/) * [Metrics Mapping](https://docs.datadoghq.com/opentelemetry/mapping/metrics_mapping/) * [Infrastructure Host Mapping](https://docs.datadoghq.com/opentelemetry/mapping/host_metadata/) * [Hostname 
Mapping](https://docs.datadoghq.com/opentelemetry/mapping/hostname/) * [Service-entry Spans Mapping](https://docs.datadoghq.com/opentelemetry/mapping/service_entry_spans/) * [Ingestion Sampling](https://docs.datadoghq.com/opentelemetry/ingestion_sampling) * [Correlate Data](https://docs.datadoghq.com/opentelemetry/correlate/) * [Logs and Traces](https://docs.datadoghq.com/opentelemetry/correlate/logs_and_traces/) * [Metrics and Traces](https://docs.datadoghq.com/opentelemetry/correlate/metrics_and_traces/) * [RUM and Traces](https://docs.datadoghq.com/opentelemetry/correlate/rum_and_traces/) * [DBM and Traces](https://docs.datadoghq.com/opentelemetry/correlate/dbm_and_traces/) * [Integrations](https://docs.datadoghq.com/opentelemetry/integrations/) * [Apache Metrics](https://docs.datadoghq.com/opentelemetry/integrations/apache_metrics/) * [Apache Spark Metrics](https://docs.datadoghq.com/opentelemetry/integrations/spark_metrics/) * [Collector Health Metrics](https://docs.datadoghq.com/opentelemetry/integrations/collector_health_metrics/) * [Datadog Extension](https://docs.datadoghq.com/opentelemetry/integrations/datadog_extension/) * [Docker Metrics](https://docs.datadoghq.com/opentelemetry/integrations/docker_metrics/) * [HAProxy Metrics](https://docs.datadoghq.com/opentelemetry/integrations/haproxy_metrics/) * [Host Metrics](https://docs.datadoghq.com/opentelemetry/integrations/host_metrics/) * [IIS Metrics](https://docs.datadoghq.com/opentelemetry/integrations/iis_metrics/) * [Kafka Metrics](https://docs.datadoghq.com/opentelemetry/integrations/kafka_metrics/) * [Kubernetes Metrics](https://docs.datadoghq.com/opentelemetry/integrations/kubernetes_metrics/) * [MySQL Metrics](https://docs.datadoghq.com/opentelemetry/integrations/mysql_metrics/) * [NGINX Metrics](https://docs.datadoghq.com/opentelemetry/integrations/nginx_metrics/) * [Podman Metrics](https://docs.datadoghq.com/opentelemetry/integrations/podman_metrics/) * [Runtime Metrics](https://docs.datadoghq.com/opentelemetry/integrations/runtime_metrics/) * [Trace Metrics](https://docs.datadoghq.com/opentelemetry/integrations/trace_metrics/) * [Troubleshooting](https://docs.datadoghq.com/opentelemetry/troubleshooting/) * [Guides and Resources](https://docs.datadoghq.com/opentelemetry/guide) * [Produce Delta Temporality Metrics](https://docs.datadoghq.com/opentelemetry/guide/otlp_delta_temporality) * [Visualize Histograms as Heatmaps](https://docs.datadoghq.com/opentelemetry/guide/otlp_histogram_heatmaps) * [Migration Guides](https://docs.datadoghq.com/opentelemetry/migrate/) * [Reference](https://docs.datadoghq.com/opentelemetry/reference) * [Terms and Concepts](https://docs.datadoghq.com/opentelemetry/reference/concepts) * [Trace Context Propagation](https://docs.datadoghq.com/opentelemetry/reference/trace_context_propagation) * [Trace IDs](https://docs.datadoghq.com/opentelemetry/reference/trace_ids) * [OTLP Metric Types](https://docs.datadoghq.com/opentelemetry/reference/otlp_metric_types) * [Administrator's Guide](https://docs.datadoghq.com/administrators_guide/) * [Getting Started](https://docs.datadoghq.com/administrators_guide/getting_started/) * [Plan](https://docs.datadoghq.com/administrators_guide/plan/) * [Build](https://docs.datadoghq.com/administrators_guide/build/) * [Run](https://docs.datadoghq.com/administrators_guide/run/) * [API](https://docs.datadoghq.com/api/) * [Partners](https://docs.datadoghq.com/partners/) * [Datadog Mobile App](https://docs.datadoghq.com/mobile/) * [Enterprise 
Configuration](https://docs.datadoghq.com/mobile/enterprise_configuration) * [Datadog for Intune](https://docs.datadoghq.com/mobile/datadog_for_intune) * [Shortcut Configurations](https://docs.datadoghq.com/mobile/shortcut_configurations) * [Push Notifications](https://docs.datadoghq.com/mobile/push_notification) * [Widgets](https://docs.datadoghq.com/mobile/widgets) * [Guides](https://docs.datadoghq.com/mobile/guide) * [DDSQL Reference](https://docs.datadoghq.com/ddsql_reference/) * [Data Directory](https://docs.datadoghq.com/ddsql_reference/data_directory/) * [CoScreen](https://docs.datadoghq.com/coscreen/) * [Troubleshooting](https://docs.datadoghq.com/coscreen/troubleshooting) * [CoTerm](https://docs.datadoghq.com/coterm/) * [Install](https://docs.datadoghq.com/coterm/install) * [Using CoTerm](https://docs.datadoghq.com/coterm/usage) * [Configuration Rules](https://docs.datadoghq.com/coterm/rules) * [Remote Configuration](https://docs.datadoghq.com/remote_configuration) * [Cloudcraft (Standalone)](https://docs.datadoghq.com/cloudcraft/) * [Getting Started](https://docs.datadoghq.com/cloudcraft/getting-started/) * [Account Management](https://docs.datadoghq.com/cloudcraft/account-management/) * [Components: Common](https://docs.datadoghq.com/cloudcraft/components-common/) * [Components: Azure](https://docs.datadoghq.com/cloudcraft/components-azure/) * [Components: AWS](https://docs.datadoghq.com/cloudcraft/components-aws/) * [Advanced](https://docs.datadoghq.com/cloudcraft/advanced/) * [FAQ](https://docs.datadoghq.com/cloudcraft/faq/) * [API](https://docs.datadoghq.com/cloudcraft/api) * [AWS Accounts](https://docs.datadoghq.com/cloudcraft/api/aws-accounts/) * [Azure Accounts](https://docs.datadoghq.com/cloudcraft/api/azure-accounts/) * [Blueprints](https://docs.datadoghq.com/cloudcraft/api/blueprints/) * [Budgets](https://docs.datadoghq.com/cloudcraft/api/budgets/) * [Teams](https://docs.datadoghq.com/cloudcraft/api/teams/) * [Users](https://docs.datadoghq.com/cloudcraft/api/users/) * [In The App ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Dashboards](https://docs.datadoghq.com/dashboards/) * [Configure](https://docs.datadoghq.com/dashboards/configure/) * [Dashboard List](https://docs.datadoghq.com/dashboards/list/) * [Widgets](https://docs.datadoghq.com/dashboards/widgets/) * [Configuration](https://docs.datadoghq.com/dashboards/widgets/configuration/) * [Widget Types](https://docs.datadoghq.com/dashboards/widgets/types/) * [Querying](https://docs.datadoghq.com/dashboards/querying/) * [Functions](https://docs.datadoghq.com/dashboards/functions/) * [Algorithms](https://docs.datadoghq.com/dashboards/functions/algorithms/) * [Arithmetic](https://docs.datadoghq.com/dashboards/functions/arithmetic/) * [Count](https://docs.datadoghq.com/dashboards/functions/count/) * [Exclusion](https://docs.datadoghq.com/dashboards/functions/exclusion/) * [Interpolation](https://docs.datadoghq.com/dashboards/functions/interpolation/) * [Rank](https://docs.datadoghq.com/dashboards/functions/rank/) * [Rate](https://docs.datadoghq.com/dashboards/functions/rate/) * [Regression](https://docs.datadoghq.com/dashboards/functions/regression/) * [Rollup](https://docs.datadoghq.com/dashboards/functions/rollup/) * [Smoothing](https://docs.datadoghq.com/dashboards/functions/smoothing/) * [Timeshift](https://docs.datadoghq.com/dashboards/functions/timeshift/) * [Beta](https://docs.datadoghq.com/dashboards/functions/beta/) * [Graph Insights](https://docs.datadoghq.com/dashboards/graph_insights) * [Metric 
Correlations](https://docs.datadoghq.com/dashboards/graph_insights/correlations/) * [Watchdog Explains](https://docs.datadoghq.com/dashboards/graph_insights/watchdog_explains/) * [Template Variables](https://docs.datadoghq.com/dashboards/template_variables/) * [Overlays](https://docs.datadoghq.com/dashboards/change_overlays/) * [Annotations](https://docs.datadoghq.com/dashboards/annotations/) * [Guides](https://docs.datadoghq.com/dashboards/guide/) * [Sharing](https://docs.datadoghq.com/dashboards/sharing/) * [Shared Dashboards](https://docs.datadoghq.com/dashboards/sharing/shared_dashboards) * [Share Graphs](https://docs.datadoghq.com/dashboards/sharing/graphs) * [Scheduled Reports](https://docs.datadoghq.com/dashboards/sharing/scheduled_reports) * [Notebooks](https://docs.datadoghq.com/notebooks/) * [Analysis Features](https://docs.datadoghq.com/notebooks/advanced_analysis/) * [Getting Started](https://docs.datadoghq.com/notebooks/advanced_analysis/getting_started/) * [Guides](https://docs.datadoghq.com/notebooks/guide) * [DDSQL Editor](https://docs.datadoghq.com/ddsql_editor/) * [Reference Tables](https://docs.datadoghq.com/reference_tables/) * [Sheets](https://docs.datadoghq.com/sheets/) * [Functions and Operators](https://docs.datadoghq.com/sheets/functions_operators/) * [Guides](https://docs.datadoghq.com/sheets/guide/) * [Monitors and Alerting](https://docs.datadoghq.com/monitors/) * [Draft Monitors](https://docs.datadoghq.com/monitors/draft/) * [Configure Monitors](https://docs.datadoghq.com/monitors/configuration/) * [Monitor Templates](https://docs.datadoghq.com/monitors/templates/) * [Monitor Types](https://docs.datadoghq.com/monitors/types/) * [Host](https://docs.datadoghq.com/monitors/types/host/) * [Metric](https://docs.datadoghq.com/monitors/types/metric/) * [Analysis](https://docs.datadoghq.com/monitors/types/analysis/) * [Anomaly](https://docs.datadoghq.com/monitors/types/anomaly/) * [APM](https://docs.datadoghq.com/monitors/types/apm/) * [Audit Trail](https://docs.datadoghq.com/monitors/types/audit_trail/) * [Change](https://docs.datadoghq.com/monitors/types/change-alert/) * [CI/CD & Test](https://docs.datadoghq.com/monitors/types/ci/) * [Cloud Cost](https://docs.datadoghq.com/monitors/types/cloud_cost/) * [Composite](https://docs.datadoghq.com/monitors/types/composite/) * [Database Monitoring](https://docs.datadoghq.com/monitors/types/database_monitoring/) * [Error Tracking](https://docs.datadoghq.com/monitors/types/error_tracking/) * [Event](https://docs.datadoghq.com/monitors/types/event/) * [Forecast](https://docs.datadoghq.com/monitors/types/forecasts/) * [Integration](https://docs.datadoghq.com/monitors/types/integration/) * [Live Process](https://docs.datadoghq.com/monitors/types/process/) * [Logs](https://docs.datadoghq.com/monitors/types/log/) * [Network](https://docs.datadoghq.com/monitors/types/network/) * [Cloud Network Monitoring](https://docs.datadoghq.com/monitors/types/cloud_network_monitoring/) * [NetFlow](https://docs.datadoghq.com/monitors/types/netflow/) * [Outlier](https://docs.datadoghq.com/monitors/types/outlier/) * [Process Check](https://docs.datadoghq.com/monitors/types/process_check/) * [Real User Monitoring](https://docs.datadoghq.com/monitors/types/real_user_monitoring/) * [Service Check](https://docs.datadoghq.com/monitors/types/service_check/) * [SLO Alerts](https://docs.datadoghq.com/monitors/types/slo/) * [Synthetic Monitoring](https://docs.datadoghq.com/monitors/types/synthetic_monitoring/) * 
[Watchdog](https://docs.datadoghq.com/monitors/types/watchdog/) * [Notifications](https://docs.datadoghq.com/monitors/notify/) * [Notification Rules](https://docs.datadoghq.com/monitors/notify/notification_rules/) * [Variables](https://docs.datadoghq.com/monitors/notify/variables/) * [Downtimes](https://docs.datadoghq.com/monitors/downtimes/) * [Examples](https://docs.datadoghq.com/monitors/downtimes/examples) * [Manage Monitors](https://docs.datadoghq.com/monitors/manage/) * [Search Monitors](https://docs.datadoghq.com/monitors/manage/search/) * [Check Summary](https://docs.datadoghq.com/monitors/manage/check_summary/) * [Monitor Status](https://docs.datadoghq.com/monitors/status/status_page) * [Status Graphs](https://docs.datadoghq.com/monitors/status/graphs) * [Status Events](https://docs.datadoghq.com/monitors/status/events) * [Monitor Settings](https://docs.datadoghq.com/monitors/settings/) * [Monitor Quality](https://docs.datadoghq.com/monitors/quality/) * [Guides](https://docs.datadoghq.com/monitors/guide/) * [Service Level Objectives](https://docs.datadoghq.com/service_level_objectives/) * [Monitor-based SLOs](https://docs.datadoghq.com/service_level_objectives/monitor/) * [Metric-based SLOs](https://docs.datadoghq.com/service_level_objectives/metric/) * [Time Slice SLOs](https://docs.datadoghq.com/service_level_objectives/time_slice/) * [Error Budget Alerts](https://docs.datadoghq.com/service_level_objectives/error_budget/) * [Burn Rate Alerts](https://docs.datadoghq.com/service_level_objectives/burn_rate/) * [Guides](https://docs.datadoghq.com/service_level_objectives/guide/) * [Metrics](https://docs.datadoghq.com/metrics/) * [Custom Metrics](https://docs.datadoghq.com/metrics/custom_metrics/) * [Metric Type Modifiers](https://docs.datadoghq.com/metrics/custom_metrics/type_modifiers/) * [Historical Metrics Ingestion](https://docs.datadoghq.com/metrics/custom_metrics/historical_metrics/) * [Submission - Agent Check](https://docs.datadoghq.com/metrics/custom_metrics/agent_metrics_submission/) * [Submission - DogStatsD](https://docs.datadoghq.com/metrics/custom_metrics/dogstatsd_metrics_submission/) * [Submission - Powershell](https://docs.datadoghq.com/metrics/custom_metrics/powershell_metrics_submission) * [Submission - API](https://docs.datadoghq.com/api/latest/metrics/#submit-metrics) * [OpenTelemetry Metrics](https://docs.datadoghq.com/metrics/open_telemetry/) * [OTLP Metric Types](https://docs.datadoghq.com/metrics/open_telemetry/otlp_metric_types) * [Query OpenTelemetry Metrics](https://docs.datadoghq.com/metrics/open_telemetry/query_metrics/) * [Metrics Types](https://docs.datadoghq.com/metrics/types/) * [Distributions](https://docs.datadoghq.com/metrics/distributions/) * [Overview](https://docs.datadoghq.com/metrics/overview/) * [Explorer](https://docs.datadoghq.com/metrics/explorer/) * [Metrics Units](https://docs.datadoghq.com/metrics/units/) * [Summary](https://docs.datadoghq.com/metrics/summary/) * [Volume](https://docs.datadoghq.com/metrics/volume/) * [Advanced Filtering](https://docs.datadoghq.com/metrics/advanced-filtering/) * [Nested Queries](https://docs.datadoghq.com/metrics/nested_queries/) * [Composite Metrics Queries](https://docs.datadoghq.com/metrics/composite_metrics_queries) * [Derived Metrics](https://docs.datadoghq.com/metrics/derived-metrics) * [Metrics Without Limits™](https://docs.datadoghq.com/metrics/metrics-without-limits/) * [Guides](https://docs.datadoghq.com/metrics/guide) * [Watchdog](https://docs.datadoghq.com/watchdog/) * 
[Alerts](https://docs.datadoghq.com/watchdog/alerts) * [Impact Analysis](https://docs.datadoghq.com/watchdog/impact_analysis/) * [RCA](https://docs.datadoghq.com/watchdog/rca/) * [Insights](https://docs.datadoghq.com/watchdog/insights) * [Faulty Deployment Detection](https://docs.datadoghq.com/watchdog/faulty_deployment_detection/) * [Faulty Cloud & SaaS API Detection](https://docs.datadoghq.com/watchdog/faulty_cloud_saas_api_detection) * [Bits AI](https://docs.datadoghq.com/bits_ai/) * [Bits AI SRE](https://docs.datadoghq.com/bits_ai/bits_ai_sre) * [Investigate issues](https://docs.datadoghq.com/bits_ai/bits_ai_sre/investigate_issues) * [Remediate issues](https://docs.datadoghq.com/bits_ai/bits_ai_sre/remediate_issues) * [Bits AI SRE integrations and settings](https://docs.datadoghq.com/bits_ai/bits_ai_sre/configure) * [Help Bits learn](https://docs.datadoghq.com/bits_ai/bits_ai_sre/help_bits_learn) * [Chat with Bits AI SRE](https://docs.datadoghq.com/bits_ai/bits_ai_sre/chat_bits_ai_sre) * [Bits AI Dev Agent](https://docs.datadoghq.com/bits_ai/bits_ai_dev_agent) * [Setup](https://docs.datadoghq.com/bits_ai/bits_ai_dev_agent/setup) * [Chat with Bits AI](https://docs.datadoghq.com/bits_ai/chat_with_bits_ai) * [MCP Server](https://docs.datadoghq.com/bits_ai/mcp_server) * [Internal Developer Portal](https://docs.datadoghq.com/internal_developer_portal/) * [Software Catalog](https://docs.datadoghq.com/internal_developer_portal/software_catalog/) * [Set Up](https://docs.datadoghq.com/internal_developer_portal/software_catalog/set_up) * [Entity Model](https://docs.datadoghq.com/internal_developer_portal/software_catalog/entity_model) * [Troubleshooting](https://docs.datadoghq.com/internal_developer_portal/software_catalog/troubleshooting) * [Scorecards](https://docs.datadoghq.com/internal_developer_portal/scorecards) * [Scorecard Configuration](https://docs.datadoghq.com/internal_developer_portal/scorecards/scorecard_configuration) * [Custom Rules](https://docs.datadoghq.com/internal_developer_portal/scorecards/custom_rules) * [Using Scorecards](https://docs.datadoghq.com/internal_developer_portal/scorecards/using_scorecards) * [Self-Service Actions](https://docs.datadoghq.com/internal_developer_portal/self_service_actions) * [Software Templates](https://docs.datadoghq.com/internal_developer_portal/self_service_actions/software_templates) * [Engineering Reports](https://docs.datadoghq.com/internal_developer_portal/eng_reports) * [Reliability Overview](https://docs.datadoghq.com/internal_developer_portal/eng_reports/reliability_overview) * [Scorecards Performance](https://docs.datadoghq.com/internal_developer_portal/eng_reports/scorecards_performance) * [DORA Metrics](https://docs.datadoghq.com/internal_developer_portal/eng_reports/dora_metrics) * [Custom Reports](https://docs.datadoghq.com/internal_developer_portal/eng_reports/custom_reports) * [Developer Homepage](https://docs.datadoghq.com/internal_developer_portal/developer_homepage) * [Campaigns](https://docs.datadoghq.com/internal_developer_portal/campaigns) * [External Provider Status](https://docs.datadoghq.com/internal_developer_portal/external_provider_status) * [Plugins](https://docs.datadoghq.com/internal_developer_portal/plugins) * [Integrations](https://docs.datadoghq.com/internal_developer_portal/integrations) * [Use Cases](https://docs.datadoghq.com/internal_developer_portal/use_cases) * [API Management](https://docs.datadoghq.com/internal_developer_portal/use_cases/api_management) * [Cloud Cost 
Management](https://docs.datadoghq.com/internal_developer_portal/use_cases/cloud_cost_management) * [App and API Protection](https://docs.datadoghq.com/internal_developer_portal/use_cases/appsec_management) * [Developer Onboarding](https://docs.datadoghq.com/internal_developer_portal/use_cases/dev_onboarding) * [Dependency Management](https://docs.datadoghq.com/internal_developer_portal/use_cases/dependency_management) * [Production Readiness](https://docs.datadoghq.com/internal_developer_portal/use_cases/production_readiness) * [Incident Response](https://docs.datadoghq.com/internal_developer_portal/use_cases/incident_response) * [CI Pipeline Visibility](https://docs.datadoghq.com/internal_developer_portal/use_cases/pipeline_visibility) * [Onboarding Guide](https://docs.datadoghq.com/internal_developer_portal/onboarding_guide) * [Error Tracking](https://docs.datadoghq.com/error_tracking/) * [Explorer](https://docs.datadoghq.com/error_tracking/explorer) * [Issue States](https://docs.datadoghq.com/error_tracking/issue_states) * [Regression Detection](https://docs.datadoghq.com/error_tracking/regression_detection) * [Suspected Causes](https://docs.datadoghq.com/error_tracking/suspected_causes) * [Error Grouping](https://docs.datadoghq.com/error_tracking/error_grouping) * [Bits AI Dev Agent](https://docs.datadoghq.com/bits_ai/bits_ai_dev_agent) * [Monitors](https://docs.datadoghq.com/error_tracking/monitors) * [Issue Correlation](https://docs.datadoghq.com/error_tracking/issue_correlation) * [Identify Suspect Commits](https://docs.datadoghq.com/error_tracking/suspect_commits) * [Auto Assign](https://docs.datadoghq.com/error_tracking/auto_assign) * [Issue Team Ownership](https://docs.datadoghq.com/error_tracking/issue_team_ownership) * [Track Browser and Mobile Errors](https://docs.datadoghq.com/error_tracking/frontend) * [Browser Error Tracking](https://docs.datadoghq.com/error_tracking/frontend/browser) * [Collecting Browser Errors](https://docs.datadoghq.com/error_tracking/frontend/collecting_browser_errors) * [Mobile Crash Tracking](https://docs.datadoghq.com/error_tracking/frontend/mobile) * [Replay Errors](https://docs.datadoghq.com/error_tracking/frontend/replay_errors) * [Real User Monitoring](https://docs.datadoghq.com/error_tracking/rum) * [Logs](https://docs.datadoghq.com/error_tracking/frontend/logs) * [Track Backend Errors](https://docs.datadoghq.com/error_tracking/backend) * [Getting Started](https://docs.datadoghq.com/error_tracking/backend/getting_started) * [Exception Replay](https://docs.datadoghq.com/error_tracking/backend/exception_replay) * [Capturing Handled Errors](https://docs.datadoghq.com/error_tracking/backend/capturing_handled_errors) * [APM](https://docs.datadoghq.com/error_tracking/apm) * [Logs](https://docs.datadoghq.com/error_tracking/backend/logs) * [Manage Data Collection](https://docs.datadoghq.com/error_tracking/manage_data_collection) * [Troubleshooting](https://docs.datadoghq.com/error_tracking/troubleshooting) * [Guides](https://docs.datadoghq.com/error_tracking/guides) * [Change Tracking](https://docs.datadoghq.com/change_tracking/) * [Feature Flags](https://docs.datadoghq.com/change_tracking/feature_flags/) * [Event Management](https://docs.datadoghq.com/events/) * [Ingest Events](https://docs.datadoghq.com/events/ingest/) * [Pipelines and Processors](https://docs.datadoghq.com/events/pipelines_and_processors/) * [Aggregation Key Processor](https://docs.datadoghq.com/events/pipelines_and_processors/aggregation_key) * [Arithmetic 
Processor](https://docs.datadoghq.com/events/pipelines_and_processors/arithmetic_processor) * [Date Remapper](https://docs.datadoghq.com/events/pipelines_and_processors/date_remapper) * [Category Processor](https://docs.datadoghq.com/events/pipelines_and_processors/category_processor) * [Grok Parser](https://docs.datadoghq.com/events/pipelines_and_processors/grok_parser) * [Lookup Processor](https://docs.datadoghq.com/events/pipelines_and_processors/lookup_processor) * [Remapper](https://docs.datadoghq.com/events/pipelines_and_processors/remapper) * [Service Remapper](https://docs.datadoghq.com/events/pipelines_and_processors/service_remapper) * [Status Remapper](https://docs.datadoghq.com/events/pipelines_and_processors/status_remapper) * [String Builder Processor](https://docs.datadoghq.com/events/pipelines_and_processors/string_builder_processor) * [Explorer](https://docs.datadoghq.com/events/explorer/) * [Searching](https://docs.datadoghq.com/events/explorer/searching) * [Navigate the Explorer](https://docs.datadoghq.com/events/explorer/navigate) * [Customization](https://docs.datadoghq.com/events/explorer/customization) * [Facets](https://docs.datadoghq.com/events/explorer/facets) * [Attributes](https://docs.datadoghq.com/events/explorer/attributes) * [Notifications](https://docs.datadoghq.com/events/explorer/notifications) * [Analytics](https://docs.datadoghq.com/events/explorer/analytics) * [Saved Views](https://docs.datadoghq.com/events/explorer/saved_views) * [Triage Inbox](https://docs.datadoghq.com/events/triage_inbox) * [Correlation](https://docs.datadoghq.com/events/correlation/) * [Configuration](https://docs.datadoghq.com/events/correlation/configuration) * [Triaging & Notifying](https://docs.datadoghq.com/events/correlation/triage_and_notify) * [Analytics](https://docs.datadoghq.com/events/correlation/analytics) * [Guides](https://docs.datadoghq.com/events/guides/) * [Incident Response ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Incident Management](https://docs.datadoghq.com/incident_response/incident_management/) * [Declare an Incident](https://docs.datadoghq.com/incident_response/incident_management/declare) * [Describe an Incident](https://docs.datadoghq.com/incident_response/incident_management/describe) * [Response Team](https://docs.datadoghq.com/incident_response/incident_management/response_team) * [Notification](https://docs.datadoghq.com/incident_response/incident_management/notification) * [Investigate an Incident](https://docs.datadoghq.com/incident_response/incident_management/investigate) * [Timeline](https://docs.datadoghq.com/incident_response/incident_management/investigate/timeline) * [Follow-ups](https://docs.datadoghq.com/incident_response/incident_management/follow-ups) * [Incident AI](https://docs.datadoghq.com/incident_response/incident_management/incident_ai) * [Incident Settings](https://docs.datadoghq.com/incident_response/incident_management/incident_settings) * [Information](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/information) * [Property Fields](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/property_fields) * [Responder Types](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/responder_types) * [Integrations](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/integrations) * [Notification 
Rules](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/notification_rules) * [Templates](https://docs.datadoghq.com/incident_response/incident_management/incident_settings/templates) * [Incident Analytics](https://docs.datadoghq.com/incident_response/incident_management/analytics) * [Integrations](https://docs.datadoghq.com/incident_response/incident_management/integrations) * [Slack](https://docs.datadoghq.com/incident_response/incident_management/integrations/slack) * [Microsoft Teams](https://docs.datadoghq.com/incident_response/incident_management/integrations/microsoft_teams) * [Jira](https://docs.datadoghq.com/incident_response/incident_management/integrations/jira) * [ServiceNow](https://docs.datadoghq.com/incident_response/incident_management/integrations/servicenow) * [Status Pages](https://docs.datadoghq.com/incident_response/incident_management/integrations/status_pages) * [Atlassian Statuspage](https://docs.datadoghq.com/incident_response/incident_management/integrations/statuspage) * [Datadog Clipboard](https://docs.datadoghq.com/incident_response/incident_management/datadog_clipboard) * [On-Call](https://docs.datadoghq.com/incident_response/on-call/) * [Onboard a Team](https://docs.datadoghq.com/incident_response/on-call/teams/) * [Trigger a Page](https://docs.datadoghq.com/incident_response/on-call/triggering_pages/) * [Live Call Routing](https://docs.datadoghq.com/incident_response/on-call/triggering_pages/live_call_routing) * [Routing Rules](https://docs.datadoghq.com/incident_response/on-call/routing_rules/) * [Escalation Policies](https://docs.datadoghq.com/incident_response/on-call/escalation_policies/) * [Schedules](https://docs.datadoghq.com/incident_response/on-call/schedules/) * [Automations](https://docs.datadoghq.com/incident_response/on-call/automations) * [Profile Settings](https://docs.datadoghq.com/incident_response/on-call/profile_settings/) * [Guides](https://docs.datadoghq.com/incident_response/on-call/guides/) * [Status Pages](https://docs.datadoghq.com/incident_response/status_pages/) * [Case Management](https://docs.datadoghq.com/incident_response/case_management/) * [Projects](https://docs.datadoghq.com/incident_response/case_management/projects) * [Settings](https://docs.datadoghq.com/incident_response/case_management/settings) * [Create a Case](https://docs.datadoghq.com/incident_response/case_management/create_case) * [Customization](https://docs.datadoghq.com/incident_response/case_management/customization) * [View and Manage Cases](https://docs.datadoghq.com/incident_response/case_management/view_and_manage) * [Notifications and Integrations](https://docs.datadoghq.com/incident_response/case_management/notifications_integrations) * [Case Automation Rules](https://docs.datadoghq.com/incident_response/case_management/automation_rules) * [Troubleshooting](https://docs.datadoghq.com/incident_response/case_management/troubleshooting) * [Actions & Remediations ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Agents](https://docs.datadoghq.com/actions/agents/) * [Workflow Automation](https://docs.datadoghq.com/actions/workflows/) * [Build Workflows](https://docs.datadoghq.com/actions/workflows/build/) * [Access and Authentication](https://docs.datadoghq.com/actions/workflows/access_and_auth/) * [Trigger Workflows](https://docs.datadoghq.com/actions/workflows/trigger/) * [Variables and parameters](https://docs.datadoghq.com/actions/workflows/variables/) * 
[Actions](https://docs.datadoghq.com/actions/workflows/actions/) * [Workflow Logic](https://docs.datadoghq.com/actions/workflows/actions/flow_control/) * [Save and Reuse Actions](https://docs.datadoghq.com/actions/workflows/saved_actions/) * [Test and Debug](https://docs.datadoghq.com/actions/workflows/test_and_debug/) * [JavaScript Expressions](https://docs.datadoghq.com/actions/workflows/expressions/) * [Track Workflows](https://docs.datadoghq.com/actions/workflows/track) * [Limits](https://docs.datadoghq.com/actions/workflows/limits/) * [App Builder](https://docs.datadoghq.com/actions/app_builder/) * [Build Apps](https://docs.datadoghq.com/actions/app_builder/build/) * [Access and Authentication](https://docs.datadoghq.com/actions/app_builder/access_and_auth/) * [Queries](https://docs.datadoghq.com/actions/app_builder/queries/) * [Variables](https://docs.datadoghq.com/actions/app_builder/variables/) * [Events](https://docs.datadoghq.com/actions/app_builder/events/) * [Components](https://docs.datadoghq.com/actions/app_builder/components/) * [Custom Charts](https://docs.datadoghq.com/actions/app_builder/components/custom_charts/) * [React Renderer](https://docs.datadoghq.com/actions/app_builder/components/react_renderer/) * [Tables](https://docs.datadoghq.com/actions/app_builder/components/tables/) * [Reusable Modules](https://docs.datadoghq.com/actions/app_builder/components/reusable_modules/) * [JavaScript Expressions](https://docs.datadoghq.com/actions/app_builder/expressions/) * [Embedded Apps](https://docs.datadoghq.com/actions/app_builder/embedded_apps/) * [Input Parameters](https://docs.datadoghq.com/actions/app_builder/embedded_apps/input_parameters/) * [Save and Reuse Actions](https://docs.datadoghq.com/actions/app_builder/saved_actions/) * [Datastores](https://docs.datadoghq.com/actions/datastores/) * [Create and Manage Datastores](https://docs.datadoghq.com/actions/datastores/create/) * [Use Datastores with Apps and Workflows](https://docs.datadoghq.com/actions/datastores/use) * [Automation Rules](https://docs.datadoghq.com/actions/datastores/trigger) * [Access and Authentication](https://docs.datadoghq.com/actions/datastores/auth) * [Forms](https://docs.datadoghq.com/actions/forms/) * [Action Catalog](https://docs.datadoghq.com/actions/actions_catalog/) * [Connections](https://docs.datadoghq.com/actions/connections/) * [AWS Integration](https://docs.datadoghq.com/actions/connections/aws_integration/) * [HTTP Request](https://docs.datadoghq.com/actions/connections/http/) * [Private Actions](https://docs.datadoghq.com/actions/private_actions/) * [Use Private Actions](https://docs.datadoghq.com/actions/private_actions/use_private_actions/) * [Run a Script](https://docs.datadoghq.com/actions/private_actions/run_script/) * [Update the Private Action Runner](https://docs.datadoghq.com/actions/private_actions/update_private_action_runner/) * [Private Action Credentials](https://docs.datadoghq.com/actions/private_actions/private_action_credentials/) * [Infrastructure ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Cloudcraft](https://docs.datadoghq.com/datadog_cloudcraft/) * [Overlays](https://docs.datadoghq.com/datadog_cloudcraft/overlays/) * [Infrastructure](https://docs.datadoghq.com/datadog_cloudcraft/overlays/infrastructure/) * [Observability](https://docs.datadoghq.com/datadog_cloudcraft/overlays/observability/) * [Security](https://docs.datadoghq.com/datadog_cloudcraft/overlays/security/) * [Cloud Cost 
[SentinelOne](https://docs.datadoghq.com/observability_pipelines/destinations/sentinelone) * [Socket](https://docs.datadoghq.com/observability_pipelines/destinations/socket) * [Splunk HEC](https://docs.datadoghq.com/observability_pipelines/destinations/splunk_hec) * [Sumo Logic Hosted Collector](https://docs.datadoghq.com/observability_pipelines/destinations/sumo_logic_hosted_collector) * [Syslog](https://docs.datadoghq.com/observability_pipelines/destinations/syslog) * [Packs](https://docs.datadoghq.com/observability_pipelines/packs/) * [Akamai CDN](https://docs.datadoghq.com/observability_pipelines/packs/akamai_cdn/) * [Amazon CloudFront](https://docs.datadoghq.com/observability_pipelines/packs/amazon_cloudfront/) * [Amazon VPC Flow Logs](https://docs.datadoghq.com/observability_pipelines/packs/amazon_vpc_flow_logs/) * [AWS CloudTrail](https://docs.datadoghq.com/observability_pipelines/packs/aws_cloudtrail/) * [Cisco ASA](https://docs.datadoghq.com/observability_pipelines/packs/cisco_asa/) * [Cloudflare](https://docs.datadoghq.com/observability_pipelines/packs/cloudflare/) * [F5](https://docs.datadoghq.com/observability_pipelines/packs/f5/) * [Fastly](https://docs.datadoghq.com/observability_pipelines/packs/fastly/) * [Fortinet Firewall](https://docs.datadoghq.com/observability_pipelines/packs/fortinet_firewall/) * [HAProxy Ingress](https://docs.datadoghq.com/observability_pipelines/packs/haproxy_ingress/) * [Istio Proxy](https://docs.datadoghq.com/observability_pipelines/packs/istio_proxy/) * [Netskope](https://docs.datadoghq.com/observability_pipelines/packs/netskope/) * [NGINX](https://docs.datadoghq.com/observability_pipelines/packs/nginx/) * [Okta](https://docs.datadoghq.com/observability_pipelines/packs/okta/) * [Palo Alto Firewall](https://docs.datadoghq.com/observability_pipelines/packs/palo_alto_firewall/) * [Windows XML](https://docs.datadoghq.com/observability_pipelines/packs/windows_xml/) * [ZScaler ZIA DNS](https://docs.datadoghq.com/observability_pipelines/packs/zscaler_zia_dns/) * [Zscaler ZIA Firewall](https://docs.datadoghq.com/observability_pipelines/packs/zscaler_zia_firewall/) * [Zscaler ZIA Tunnel](https://docs.datadoghq.com/observability_pipelines/packs/zscaler_zia_tunnel/) * [Zscaler ZIA Web Logs](https://docs.datadoghq.com/observability_pipelines/packs/zscaler_zia_web_logs/) * [Search Syntax](https://docs.datadoghq.com/observability_pipelines/search_syntax/logs/) * [Scaling and Performance](https://docs.datadoghq.com/observability_pipelines/scaling_and_performance/) * [Handling Load and Backpressure](https://docs.datadoghq.com/observability_pipelines/scaling_and_performance/handling_load_and_backpressure/) * [Scaling Best Practices](https://docs.datadoghq.com/observability_pipelines/scaling_and_performance/best_practices_for_scaling_observability_pipelines/) * [Monitoring and Troubleshooting](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/) * [Worker CLI Commands](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/worker_cli_commands/) * [Monitoring Pipelines](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/monitoring_pipelines/) * [Pipeline Usage Metrics](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/pipeline_usage_metrics/) * [Troubleshooting](https://docs.datadoghq.com/observability_pipelines/monitoring_and_troubleshooting/troubleshooting/) * [Guides and Resources](https://docs.datadoghq.com/observability_pipelines/guide/) * 
[Upgrade Worker Guide](https://docs.datadoghq.com/observability_pipelines/guide/upgrade_worker/) * [Log Management](https://docs.datadoghq.com/logs/) * [Log Collection & Integrations](https://docs.datadoghq.com/logs/log_collection/) * [Browser](https://docs.datadoghq.com/logs/log_collection/javascript/) * [Android](https://docs.datadoghq.com/logs/log_collection/android/) * [iOS](https://docs.datadoghq.com/logs/log_collection/ios/) * [Flutter](https://docs.datadoghq.com/logs/log_collection/flutter/) * [React Native](https://docs.datadoghq.com/logs/log_collection/reactnative/) * [Roku](https://docs.datadoghq.com/logs/log_collection/roku/) * [Kotlin Multiplatform](https://docs.datadoghq.com/logs/log_collection/kotlin_multiplatform/) * [C#](https://docs.datadoghq.com/logs/log_collection/csharp/) * [Go](https://docs.datadoghq.com/logs/log_collection/go/) * [Java](https://docs.datadoghq.com/logs/log_collection/java/) * [Node.js](https://docs.datadoghq.com/logs/log_collection/nodejs/) * [PHP](https://docs.datadoghq.com/logs/log_collection/php/) * [Python](https://docs.datadoghq.com/logs/log_collection/python/) * [Ruby](https://docs.datadoghq.com/logs/log_collection/ruby/) * [OpenTelemetry](https://docs.datadoghq.com/opentelemetry/otel_logs/) * [Agent Integrations](https://docs.datadoghq.com/logs/log_collection/agent_checks/) * [Other Integrations](https://docs.datadoghq.com/integrations/#cat-log-collection) * [Log Configuration](https://docs.datadoghq.com/logs/log_configuration/) * [Pipelines](https://docs.datadoghq.com/logs/log_configuration/pipelines/) * [Processors](https://docs.datadoghq.com/logs/log_configuration/processors/) * [Parsing](https://docs.datadoghq.com/logs/log_configuration/parsing/) * [Pipeline Scanner](https://docs.datadoghq.com/logs/log_configuration/pipeline_scanner/) * [Attributes and Aliasing](https://docs.datadoghq.com/logs/log_configuration/attributes_naming_convention/) * [Generate Metrics](https://docs.datadoghq.com/logs/log_configuration/logs_to_metrics/) * [Indexes](https://docs.datadoghq.com/logs/log_configuration/indexes) * [Flex Logs](https://docs.datadoghq.com/logs/log_configuration/flex_logs/) * [Archives](https://docs.datadoghq.com/logs/log_configuration/archives/) * [Rehydrate from Archives](https://docs.datadoghq.com/logs/log_configuration/rehydrating) * [Archive Search](https://docs.datadoghq.com/logs/log_configuration/archive_search) * [Forwarding](https://docs.datadoghq.com/logs/log_configuration/forwarding_custom_destinations/) * [Log Explorer](https://docs.datadoghq.com/logs/explorer/) * [Live Tail](https://docs.datadoghq.com/logs/explorer/live_tail/) * [Search Logs](https://docs.datadoghq.com/logs/explorer/search/) * [Search Syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/) * [Advanced Search](https://docs.datadoghq.com/logs/explorer/advanced_search) * [Facets](https://docs.datadoghq.com/logs/explorer/facets/) * [Calculated Fields](https://docs.datadoghq.com/logs/explorer/calculated_fields/) * [Analytics](https://docs.datadoghq.com/logs/explorer/analytics/) * [Patterns](https://docs.datadoghq.com/logs/explorer/analytics/patterns/) * [Transactions](https://docs.datadoghq.com/logs/explorer/analytics/transactions/) * [Visualize](https://docs.datadoghq.com/logs/explorer/visualize/) * [Log Side Panel](https://docs.datadoghq.com/logs/explorer/side_panel/) * [Export](https://docs.datadoghq.com/logs/explorer/export/) * [Watchdog Insights for Logs](https://docs.datadoghq.com/logs/explorer/watchdog_insights/) * [Saved 
Views](https://docs.datadoghq.com/logs/explorer/saved_views/) * [Error Tracking](https://docs.datadoghq.com/logs/error_tracking/) * [Error Tracking Explorer](https://docs.datadoghq.com/logs/error_tracking/explorer) * [Issue States](https://docs.datadoghq.com/logs/error_tracking/issue_states) * [Track Browser and Mobile Errors](https://docs.datadoghq.com/logs/error_tracking/browser_and_mobile) * [Track Backend Errors](https://docs.datadoghq.com/logs/error_tracking/backend) * [Error Grouping](https://docs.datadoghq.com/logs/error_tracking/error_grouping) * [Manage Data Collection](https://docs.datadoghq.com/logs/error_tracking/manage_data_collection) * [Dynamic Sampling](https://docs.datadoghq.com/logs/error_tracking/dynamic_sampling) * [Monitors](https://docs.datadoghq.com/logs/error_tracking/monitors) * [Identify Suspect Commits](https://docs.datadoghq.com/logs/error_tracking/suspect_commits) * [Troubleshooting](https://docs.datadoghq.com/error_tracking/troubleshooting) * [Reports](https://docs.datadoghq.com/logs/reports/) * [Guides](https://docs.datadoghq.com/logs/guide/) * [Data Security](https://docs.datadoghq.com/data_security/logs/) * [Troubleshooting](https://docs.datadoghq.com/logs/troubleshooting) * [Live Tail](https://docs.datadoghq.com/logs/troubleshooting/live_tail) * [CloudPrem](https://docs.datadoghq.com/cloudprem/) * [Quickstart](https://docs.datadoghq.com/cloudprem/quickstart) * [Architecture](https://docs.datadoghq.com/cloudprem/architecture) * [Installation](https://docs.datadoghq.com/cloudprem/install) * [AWS EKS](https://docs.datadoghq.com/cloudprem/install/aws_eks) * [Azure AKS](https://docs.datadoghq.com/cloudprem/install/azure_aks) * [Install Locally with Docker](https://docs.datadoghq.com/cloudprem/install/docker) * [Log Ingestion](https://docs.datadoghq.com/cloudprem/ingest_logs/) * [Datadog Agent](https://docs.datadoghq.com/cloudprem/ingest_logs/datadog_agent) * [Observability Pipelines](https://docs.datadoghq.com/cloudprem/ingest_logs/observability_pipelines) * [REST API](https://docs.datadoghq.com/cloudprem/ingest_logs/rest_api/) * [Configuration](https://docs.datadoghq.com/cloudprem/configure/) * [Datadog Account](https://docs.datadoghq.com/cloudprem/configure/datadog_account) * [AWS Configuration](https://docs.datadoghq.com/cloudprem/configure/aws_config) * [Azure Configuration](https://docs.datadoghq.com/cloudprem/configure/azure_config) * [Cluster Sizing](https://docs.datadoghq.com/cloudprem/configure/cluster_sizing) * [Ingress](https://docs.datadoghq.com/cloudprem/configure/ingress) * [Processing](https://docs.datadoghq.com/cloudprem/configure/processing) * [Reverse Connection](https://docs.datadoghq.com/cloudprem/configure/reverse_connection) * [Management](https://docs.datadoghq.com/cloudprem/manage/) * [Supported Features](https://docs.datadoghq.com/cloudprem/supported_features/) * [Troubleshooting](https://docs.datadoghq.com/cloudprem/troubleshooting/) * [Administration ](https://docs.datadoghq.com/api/latest/csm-threats/) * [Account Management](https://docs.datadoghq.com/account_management/) * [Switching Between Orgs](https://docs.datadoghq.com/account_management/org_switching/) * [Organization Settings](https://docs.datadoghq.com/account_management/org_settings/) * [User Management](https://docs.datadoghq.com/account_management/users/) * [Login Methods](https://docs.datadoghq.com/account_management/login_methods/) * [OAuth Apps](https://docs.datadoghq.com/account_management/org_settings/oauth_apps) * [Custom Organization Landing 
Page](https://docs.datadoghq.com/account_management/org_settings/custom_landing) * [Service Accounts](https://docs.datadoghq.com/account_management/org_settings/service_accounts) * [IP Allowlist](https://docs.datadoghq.com/account_management/org_settings/ip_allowlist) * [Domain Allowlist](https://docs.datadoghq.com/account_management/org_settings/domain_allowlist) * [Cross-Organization Visibility](https://docs.datadoghq.com/account_management/org_settings/cross_org_visibility) * [Access Control](https://docs.datadoghq.com/account_management/rbac/) * [Granular Access](https://docs.datadoghq.com/account_management/rbac/granular_access) * [Permissions](https://docs.datadoghq.com/account_management/rbac/permissions) * [Data Access](https://docs.datadoghq.com/account_management/rbac/data_access) * [SSO with SAML](https://docs.datadoghq.com/account_management/saml/) * [Configuring SAML](https://docs.datadoghq.com/account_management/saml/configuration/) * [User Group Mapping](https://docs.datadoghq.com/account_management/saml/mapping/) * [Active Directory](https://docs.datadoghq.com/account_management/saml/activedirectory/) * [Auth0](https://docs.datadoghq.com/account_management/saml/auth0/) * [Entra ID](https://docs.datadoghq.com/account_management/saml/entra/) * [Google](https://docs.datadoghq.com/account_management/saml/google/) * [LastPass](https://docs.datadoghq.com/account_management/saml/lastpass/) * [Okta](https://docs.datadoghq.com/account_management/saml/okta/) * [SafeNet](https://docs.datadoghq.com/account_management/saml/safenet/) * [Troubleshooting](https://docs.datadoghq.com/account_management/saml/troubleshooting/) * [SCIM](https://docs.datadoghq.com/account_management/scim/) * [Okta](https://docs.datadoghq.com/account_management/scim/okta) * [Microsoft Entra ID](https://docs.datadoghq.com/account_management/scim/entra) * [API and Application Keys](https://docs.datadoghq.com/account_management/api-app-keys/) * [Teams](https://docs.datadoghq.com/account_management/teams/) * [Team Management](https://docs.datadoghq.com/account_management/teams/manage/) * [Provision with GitHub](https://docs.datadoghq.com/account_management/teams/github/) * [Governance Console](https://docs.datadoghq.com/account_management/governance_console/) * [Multi-Factor Authentication](https://docs.datadoghq.com/account_management/multi-factor_authentication/) * [Audit Trail](https://docs.datadoghq.com/account_management/audit_trail/) * [Events](https://docs.datadoghq.com/account_management/audit_trail/events/) * [Forwarding](https://docs.datadoghq.com/account_management/audit_trail/forwarding_audit_events/) * [Guides](https://docs.datadoghq.com/account_management/audit_trail/guides/) * [Safety Center](https://docs.datadoghq.com/account_management/safety_center/) * [Plan and Usage](https://docs.datadoghq.com/account_management/plan_and_usage/) * [Cost Details](https://docs.datadoghq.com/account_management/plan_and_usage/cost_details/) * [Usage Details](https://docs.datadoghq.com/account_management/plan_and_usage/usage_details/) * [Billing](https://docs.datadoghq.com/account_management/billing/) * [Pricing](https://docs.datadoghq.com/account_management/billing/pricing) * [Credit Card](https://docs.datadoghq.com/account_management/billing/credit_card/) * [Product Allotments](https://docs.datadoghq.com/account_management/billing/product_allotments) * [Usage Metrics](https://docs.datadoghq.com/account_management/billing/usage_metrics/) * [Usage 
Attribution](https://docs.datadoghq.com/account_management/billing/usage_attribution/) * [Custom Metrics](https://docs.datadoghq.com/account_management/billing/custom_metrics/) * [Containers](https://docs.datadoghq.com/account_management/billing/containers) * [Log Management](https://docs.datadoghq.com/account_management/billing/log_management/) * [APM](https://docs.datadoghq.com/account_management/billing/apm_tracing_profiler/) * [Serverless](https://docs.datadoghq.com/account_management/billing/serverless/) * [Real User Monitoring](https://docs.datadoghq.com/account_management/billing/rum/) * [CI Visibility](https://docs.datadoghq.com/account_management/billing/ci_visibility/) * [Incident Response](https://docs.datadoghq.com/account_management/billing/incident_response/) * [AWS Integration](https://docs.datadoghq.com/account_management/billing/aws/) * [Azure Integration](https://docs.datadoghq.com/account_management/billing/azure/) * [Google Cloud Integration](https://docs.datadoghq.com/account_management/billing/google_cloud/) * [Alibaba Integration](https://docs.datadoghq.com/account_management/billing/alibaba/) * [vSphere Integration](https://docs.datadoghq.com/account_management/billing/vsphere/) * [Workflow Automation](https://docs.datadoghq.com/account_management/billing/workflow_automation/) * [Multi-org Accounts](https://docs.datadoghq.com/account_management/multi_organization/) * [Guides](https://docs.datadoghq.com/account_management/guide/) * [Cloud-based Authentication](https://docs.datadoghq.com/account_management/cloud_provider_authentication/) * [Data Security](https://docs.datadoghq.com/data_security/) * [Agent](https://docs.datadoghq.com/data_security/agent/) * [Cloud SIEM](https://docs.datadoghq.com/data_security/cloud_siem/) * [Kubernetes](https://docs.datadoghq.com/data_security/kubernetes) * [Log Management](https://docs.datadoghq.com/data_security/logs/) * [Real User Monitoring](https://docs.datadoghq.com/data_security/real_user_monitoring/) * [Synthetic Monitoring](https://docs.datadoghq.com/data_security/synthetics/) * [Tracing](https://docs.datadoghq.com/tracing/configure_data_security/) * [PCI Compliance](https://docs.datadoghq.com/data_security/pci_compliance/) * [HIPAA Compliance](https://docs.datadoghq.com/data_security/hipaa_compliance/) * [Data Retention Periods](https://docs.datadoghq.com/data_security/data_retention_periods/) * [Guides](https://docs.datadoghq.com/data_security/guide/) * [Help](https://docs.datadoghq.com/help/) [Datadog Docs API](https://docs.datadoghq.com/) search * [Overview](https://docs.datadoghq.com/api/latest/) * [Using the API](https://docs.datadoghq.com/api/latest/using-the-api/) * [Authorization Scopes](https://docs.datadoghq.com/api/latest/scopes/) * [Rate Limits](https://docs.datadoghq.com/api/latest/rate-limits/) * [Action Connection](https://docs.datadoghq.com/api/latest/action-connection/) * [Get an existing Action Connection](https://docs.datadoghq.com/api/latest/csm-threats/#get-an-existing-action-connection) * [Create a new Action Connection](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-new-action-connection) * [Update an existing Action Connection](https://docs.datadoghq.com/api/latest/csm-threats/#update-an-existing-action-connection) * [Delete an existing Action Connection](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-existing-action-connection) * [Register a new App Key](https://docs.datadoghq.com/api/latest/csm-threats/#register-a-new-app-key) * [List App Key 
Registrations](https://docs.datadoghq.com/api/latest/csm-threats/#list-app-key-registrations) * [Unregister an App Key](https://docs.datadoghq.com/api/latest/csm-threats/#unregister-an-app-key) * [Get an existing App Key Registration](https://docs.datadoghq.com/api/latest/csm-threats/#get-an-existing-app-key-registration) * [Actions Datastores](https://docs.datadoghq.com/api/latest/actions-datastores/) * [List datastores](https://docs.datadoghq.com/api/latest/csm-threats/#list-datastores) * [Create datastore](https://docs.datadoghq.com/api/latest/csm-threats/#create-datastore) * [Get datastore](https://docs.datadoghq.com/api/latest/csm-threats/#get-datastore) * [Update datastore](https://docs.datadoghq.com/api/latest/csm-threats/#update-datastore) * [Delete datastore](https://docs.datadoghq.com/api/latest/csm-threats/#delete-datastore) * [List datastore items](https://docs.datadoghq.com/api/latest/csm-threats/#list-datastore-items) * [Delete datastore item](https://docs.datadoghq.com/api/latest/csm-threats/#delete-datastore-item) * [Update datastore item](https://docs.datadoghq.com/api/latest/csm-threats/#update-datastore-item) * [Bulk write datastore items](https://docs.datadoghq.com/api/latest/csm-threats/#bulk-write-datastore-items) * [Bulk delete datastore items](https://docs.datadoghq.com/api/latest/csm-threats/#bulk-delete-datastore-items) * [Agentless Scanning](https://docs.datadoghq.com/api/latest/agentless-scanning/) * [List AWS scan options](https://docs.datadoghq.com/api/latest/csm-threats/#list-aws-scan-options) * [Create AWS scan options](https://docs.datadoghq.com/api/latest/csm-threats/#create-aws-scan-options) * [Get AWS scan options](https://docs.datadoghq.com/api/latest/csm-threats/#get-aws-scan-options) * [Update AWS scan options](https://docs.datadoghq.com/api/latest/csm-threats/#update-aws-scan-options) * [Delete AWS scan options](https://docs.datadoghq.com/api/latest/csm-threats/#delete-aws-scan-options) * [List Azure scan options](https://docs.datadoghq.com/api/latest/csm-threats/#list-azure-scan-options) * [Create Azure scan options](https://docs.datadoghq.com/api/latest/csm-threats/#create-azure-scan-options) * [Get Azure scan options](https://docs.datadoghq.com/api/latest/csm-threats/#get-azure-scan-options) * [Update Azure scan options](https://docs.datadoghq.com/api/latest/csm-threats/#update-azure-scan-options) * [Delete Azure scan options](https://docs.datadoghq.com/api/latest/csm-threats/#delete-azure-scan-options) * [List GCP scan options](https://docs.datadoghq.com/api/latest/csm-threats/#list-gcp-scan-options) * [Create GCP scan options](https://docs.datadoghq.com/api/latest/csm-threats/#create-gcp-scan-options) * [Get GCP scan options](https://docs.datadoghq.com/api/latest/csm-threats/#get-gcp-scan-options) * [Update GCP scan options](https://docs.datadoghq.com/api/latest/csm-threats/#update-gcp-scan-options) * [Delete GCP scan options](https://docs.datadoghq.com/api/latest/csm-threats/#delete-gcp-scan-options) * [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/csm-threats/#list-aws-on-demand-tasks) * [Create AWS on demand task](https://docs.datadoghq.com/api/latest/csm-threats/#create-aws-on-demand-task) * [Get AWS on demand task](https://docs.datadoghq.com/api/latest/csm-threats/#get-aws-on-demand-task) * [API Management](https://docs.datadoghq.com/api/latest/api-management/) * [Create a new API](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-new-api) * [Update an 
API](https://docs.datadoghq.com/api/latest/csm-threats/#update-an-api) * [Get an API](https://docs.datadoghq.com/api/latest/csm-threats/#get-an-api) * [List APIs](https://docs.datadoghq.com/api/latest/csm-threats/#list-apis) * [Delete an API](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-api) * [APM](https://docs.datadoghq.com/api/latest/apm/) * [Get service list](https://docs.datadoghq.com/api/latest/csm-threats/#get-service-list) * [APM Retention Filters](https://docs.datadoghq.com/api/latest/apm-retention-filters/) * [List all APM retention filters](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-apm-retention-filters) * [Create a retention filter](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-retention-filter) * [Get a given APM retention filter](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-given-apm-retention-filter) * [Update a retention filter](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-retention-filter) * [Delete a retention filter](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-retention-filter) * [Re-order retention filters](https://docs.datadoghq.com/api/latest/csm-threats/#re-order-retention-filters) * [App Builder](https://docs.datadoghq.com/api/latest/app-builder/) * [List Apps](https://docs.datadoghq.com/api/latest/csm-threats/#list-apps) * [Create App](https://docs.datadoghq.com/api/latest/csm-threats/#create-app) * [Delete Multiple Apps](https://docs.datadoghq.com/api/latest/csm-threats/#delete-multiple-apps) * [Get App](https://docs.datadoghq.com/api/latest/csm-threats/#get-app) * [Update App](https://docs.datadoghq.com/api/latest/csm-threats/#update-app) * [Delete App](https://docs.datadoghq.com/api/latest/csm-threats/#delete-app) * [Publish App](https://docs.datadoghq.com/api/latest/csm-threats/#publish-app) * [Unpublish App](https://docs.datadoghq.com/api/latest/csm-threats/#unpublish-app) * [Application Security](https://docs.datadoghq.com/api/latest/application-security/) * [Get a WAF exclusion filter](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-waf-exclusion-filter) * [Create a WAF exclusion filter](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-waf-exclusion-filter) * [List all WAF exclusion filters](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-waf-exclusion-filters) * [Update a WAF exclusion filter](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-waf-exclusion-filter) * [Delete a WAF exclusion filter](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-waf-exclusion-filter) * [Get a WAF custom rule](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-waf-custom-rule) * [Create a WAF custom rule](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-waf-custom-rule) * [List all WAF custom rules](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-waf-custom-rules) * [Update a WAF Custom Rule](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-waf-custom-rule) * [Delete a WAF Custom Rule](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-waf-custom-rule) * [Audit](https://docs.datadoghq.com/api/latest/audit/) * [Search Audit Logs events](https://docs.datadoghq.com/api/latest/csm-threats/#search-audit-logs-events) * [Get a list of Audit Logs events](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-list-of-audit-logs-events) * [Authentication](https://docs.datadoghq.com/api/latest/authentication/) * [Validate API 
key](https://docs.datadoghq.com/api/latest/csm-threats/#validate-api-key) * [AuthN Mappings](https://docs.datadoghq.com/api/latest/authn-mappings/) * [Get an AuthN Mapping by UUID](https://docs.datadoghq.com/api/latest/csm-threats/#get-an-authn-mapping-by-uuid) * [Edit an AuthN Mapping](https://docs.datadoghq.com/api/latest/csm-threats/#edit-an-authn-mapping) * [Delete an AuthN Mapping](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-authn-mapping) * [List all AuthN Mappings](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-authn-mappings) * [Create an AuthN Mapping](https://docs.datadoghq.com/api/latest/csm-threats/#create-an-authn-mapping) * [AWS Integration](https://docs.datadoghq.com/api/latest/aws-integration/) * [Get all AWS tag filters](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-aws-tag-filters) * [List available namespaces](https://docs.datadoghq.com/api/latest/csm-threats/#list-available-namespaces) * [Set an AWS tag filter](https://docs.datadoghq.com/api/latest/csm-threats/#set-an-aws-tag-filter) * [Delete a tag filtering entry](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-tag-filtering-entry) * [Get an AWS integration by config ID](https://docs.datadoghq.com/api/latest/csm-threats/#get-an-aws-integration-by-config-id) * [Generate a new external ID](https://docs.datadoghq.com/api/latest/csm-threats/#generate-a-new-external-id) * [List namespace rules](https://docs.datadoghq.com/api/latest/csm-threats/#list-namespace-rules) * [Get AWS integration IAM permissions](https://docs.datadoghq.com/api/latest/csm-threats/#get-aws-integration-iam-permissions) * [List all AWS integrations](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-aws-integrations) * [Delete an AWS integration](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-aws-integration) * [Get AWS integration standard IAM permissions](https://docs.datadoghq.com/api/latest/csm-threats/#get-aws-integration-standard-iam-permissions) * [Create an AWS integration](https://docs.datadoghq.com/api/latest/csm-threats/#create-an-aws-integration) * [Get resource collection IAM permissions](https://docs.datadoghq.com/api/latest/csm-threats/#get-resource-collection-iam-permissions) * [Update an AWS integration](https://docs.datadoghq.com/api/latest/csm-threats/#update-an-aws-integration) * [Get all Amazon EventBridge sources](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-amazon-eventbridge-sources) * [Create an Amazon EventBridge source](https://docs.datadoghq.com/api/latest/csm-threats/#create-an-amazon-eventbridge-source) * [Delete an Amazon EventBridge source](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-amazon-eventbridge-source) * [AWS Logs Integration](https://docs.datadoghq.com/api/latest/aws-logs-integration/) * [List all AWS Logs integrations](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-aws-logs-integrations) * [Add AWS Log Lambda ARN](https://docs.datadoghq.com/api/latest/csm-threats/#add-aws-log-lambda-arn) * [Delete an AWS Logs integration](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-aws-logs-integration) * [Get list of AWS log ready services](https://docs.datadoghq.com/api/latest/csm-threats/#get-list-of-aws-log-ready-services) * [Enable an AWS Logs integration](https://docs.datadoghq.com/api/latest/csm-threats/#enable-an-aws-logs-integration) * [Check permissions for log services](https://docs.datadoghq.com/api/latest/csm-threats/#check-permissions-for-log-services) * [Check that an AWS 
Lambda Function exists](https://docs.datadoghq.com/api/latest/csm-threats/#check-that-an-aws-lambda-function-exists) * [Azure Integration](https://docs.datadoghq.com/api/latest/azure-integration/) * [List all Azure integrations](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-azure-integrations) * [Create an Azure integration](https://docs.datadoghq.com/api/latest/csm-threats/#create-an-azure-integration) * [Delete an Azure integration](https://docs.datadoghq.com/api/latest/csm-threats/#delete-an-azure-integration) * [Update an Azure integration](https://docs.datadoghq.com/api/latest/csm-threats/#update-an-azure-integration) * [Update Azure integration host filters](https://docs.datadoghq.com/api/latest/csm-threats/#update-azure-integration-host-filters) * [Case Management](https://docs.datadoghq.com/api/latest/case-management/) * [Create a project](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-project) * [Search cases](https://docs.datadoghq.com/api/latest/csm-threats/#search-cases) * [Create a case](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-case) * [Get all projects](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-projects) * [Get the details of a case](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-details-of-a-case) * [Get the details of a project](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-details-of-a-project) * [Remove a project](https://docs.datadoghq.com/api/latest/csm-threats/#remove-a-project) * [Update case description](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-description) * [Update case status](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-status) * [Update case title](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-title) * [Update case priority](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-priority) * [Assign case](https://docs.datadoghq.com/api/latest/csm-threats/#assign-case) * [Unassign case](https://docs.datadoghq.com/api/latest/csm-threats/#unassign-case) * [Archive case](https://docs.datadoghq.com/api/latest/csm-threats/#archive-case) * [Unarchive case](https://docs.datadoghq.com/api/latest/csm-threats/#unarchive-case) * [Update case attributes](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-attributes) * [Comment case](https://docs.datadoghq.com/api/latest/csm-threats/#comment-case) * [Delete case comment](https://docs.datadoghq.com/api/latest/csm-threats/#delete-case-comment) * [Update case custom attribute](https://docs.datadoghq.com/api/latest/csm-threats/#update-case-custom-attribute) * [Delete custom attribute from case](https://docs.datadoghq.com/api/latest/csm-threats/#delete-custom-attribute-from-case) * [Case Management Attribute](https://docs.datadoghq.com/api/latest/case-management-attribute/) * [Get all custom attributes](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-custom-attributes) * [Get all custom attributes config of case type](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-custom-attributes-config-of-case-type) * [Create custom attribute config for a case type](https://docs.datadoghq.com/api/latest/csm-threats/#create-custom-attribute-config-for-a-case-type) * [Delete custom attributes config](https://docs.datadoghq.com/api/latest/csm-threats/#delete-custom-attributes-config) * [Case Management Type](https://docs.datadoghq.com/api/latest/case-management-type/) * [Get all case types](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-case-types) * 
[Create a case type](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-case-type) * [Delete a case type](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-case-type) * [CI Visibility Pipelines](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/) * [Send pipeline event](https://docs.datadoghq.com/api/latest/csm-threats/#send-pipeline-event) * [Get a list of pipelines events](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-list-of-pipelines-events) * [Search pipelines events](https://docs.datadoghq.com/api/latest/csm-threats/#search-pipelines-events) * [Aggregate pipelines events](https://docs.datadoghq.com/api/latest/csm-threats/#aggregate-pipelines-events) * [CI Visibility Tests](https://docs.datadoghq.com/api/latest/ci-visibility-tests/) * [Get a list of tests events](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-list-of-tests-events) * [Search tests events](https://docs.datadoghq.com/api/latest/csm-threats/#search-tests-events) * [Aggregate tests events](https://docs.datadoghq.com/api/latest/csm-threats/#aggregate-tests-events) * [Cloud Cost Management](https://docs.datadoghq.com/api/latest/cloud-cost-management/) * [List Cloud Cost Management AWS CUR configs](https://docs.datadoghq.com/api/latest/csm-threats/#list-cloud-cost-management-aws-cur-configs) * [Create Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/csm-threats/#create-cloud-cost-management-aws-cur-config) * [Update Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/csm-threats/#update-cloud-cost-management-aws-cur-config) * [Delete Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/csm-threats/#delete-cloud-cost-management-aws-cur-config) * [Get cost AWS CUR config](https://docs.datadoghq.com/api/latest/csm-threats/#get-cost-aws-cur-config) * [List Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/csm-threats/#list-cloud-cost-management-azure-configs) * [Create Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/csm-threats/#create-cloud-cost-management-azure-configs) * [Update Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/csm-threats/#update-cloud-cost-management-azure-config) * [Delete Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/csm-threats/#delete-cloud-cost-management-azure-config) * [Get cost Azure UC config](https://docs.datadoghq.com/api/latest/csm-threats/#get-cost-azure-uc-config) * [List Google Cloud Usage Cost configs](https://docs.datadoghq.com/api/latest/csm-threats/#list-google-cloud-usage-cost-configs) * [Create Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/csm-threats/#create-google-cloud-usage-cost-config) * [Update Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/csm-threats/#update-google-cloud-usage-cost-config) * [Delete Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/csm-threats/#delete-google-cloud-usage-cost-config) * [Get Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/csm-threats/#get-google-cloud-usage-cost-config) * [List tag pipeline rulesets](https://docs.datadoghq.com/api/latest/csm-threats/#list-tag-pipeline-rulesets) * [Create tag pipeline ruleset](https://docs.datadoghq.com/api/latest/csm-threats/#create-tag-pipeline-ruleset) * [Update tag pipeline ruleset](https://docs.datadoghq.com/api/latest/csm-threats/#update-tag-pipeline-ruleset) * [Delete tag pipeline 
ruleset](https://docs.datadoghq.com/api/latest/csm-threats/#delete-tag-pipeline-ruleset) * [Get a tag pipeline ruleset](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-tag-pipeline-ruleset) * [Reorder tag pipeline rulesets](https://docs.datadoghq.com/api/latest/csm-threats/#reorder-tag-pipeline-rulesets) * [Validate query](https://docs.datadoghq.com/api/latest/csm-threats/#validate-query) * [List custom allocation rules](https://docs.datadoghq.com/api/latest/csm-threats/#list-custom-allocation-rules) * [Create custom allocation rule](https://docs.datadoghq.com/api/latest/csm-threats/#create-custom-allocation-rule) * [Update custom allocation rule](https://docs.datadoghq.com/api/latest/csm-threats/#update-custom-allocation-rule) * [Delete custom allocation rule](https://docs.datadoghq.com/api/latest/csm-threats/#delete-custom-allocation-rule) * [Get custom allocation rule](https://docs.datadoghq.com/api/latest/csm-threats/#get-custom-allocation-rule) * [Reorder custom allocation rules](https://docs.datadoghq.com/api/latest/csm-threats/#reorder-custom-allocation-rules) * [List Custom Costs files](https://docs.datadoghq.com/api/latest/csm-threats/#list-custom-costs-files) * [Upload Custom Costs file](https://docs.datadoghq.com/api/latest/csm-threats/#upload-custom-costs-file) * [Delete Custom Costs file](https://docs.datadoghq.com/api/latest/csm-threats/#delete-custom-costs-file) * [Get Custom Costs file](https://docs.datadoghq.com/api/latest/csm-threats/#get-custom-costs-file) * [List budgets](https://docs.datadoghq.com/api/latest/csm-threats/#list-budgets) * [Create or update a budget](https://docs.datadoghq.com/api/latest/csm-threats/#create-or-update-a-budget) * [Delete a budget](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-budget) * [Get a budget](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-budget) * [Cloud Network Monitoring](https://docs.datadoghq.com/api/latest/cloud-network-monitoring/) * [Get all aggregated connections](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-aggregated-connections) * [Get all aggregated DNS traffic](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-aggregated-dns-traffic) * [Cloudflare Integration](https://docs.datadoghq.com/api/latest/cloudflare-integration/) * [List Cloudflare accounts](https://docs.datadoghq.com/api/latest/csm-threats/#list-cloudflare-accounts) * [Add Cloudflare account](https://docs.datadoghq.com/api/latest/csm-threats/#add-cloudflare-account) * [Get Cloudflare account](https://docs.datadoghq.com/api/latest/csm-threats/#get-cloudflare-account) * [Update Cloudflare account](https://docs.datadoghq.com/api/latest/csm-threats/#update-cloudflare-account) * [Delete Cloudflare account](https://docs.datadoghq.com/api/latest/csm-threats/#delete-cloudflare-account) * [Confluent Cloud](https://docs.datadoghq.com/api/latest/confluent-cloud/) * [Update resource in Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#update-resource-in-confluent-account) * [Get resource from Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#get-resource-from-confluent-account) * [Delete resource from Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#delete-resource-from-confluent-account) * [Add resource to Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#add-resource-to-confluent-account) * [List Confluent Account resources](https://docs.datadoghq.com/api/latest/csm-threats/#list-confluent-account-resources) * [Update Confluent 
account](https://docs.datadoghq.com/api/latest/csm-threats/#update-confluent-account) * [Get Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#get-confluent-account) * [Delete Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#delete-confluent-account) * [Add Confluent account](https://docs.datadoghq.com/api/latest/csm-threats/#add-confluent-account) * [List Confluent accounts](https://docs.datadoghq.com/api/latest/csm-threats/#list-confluent-accounts) * [Container Images](https://docs.datadoghq.com/api/latest/container-images/) * [Get all Container Images](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-container-images) * [Containers](https://docs.datadoghq.com/api/latest/containers/) * [Get All Containers](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-containers) * [CSM Agents](https://docs.datadoghq.com/api/latest/csm-agents/) * [Get all CSM Agents](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-csm-agents) * [Get all CSM Serverless Agents](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-csm-serverless-agents) * [CSM Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-coverage-analysis/) * [Get the CSM Cloud Accounts Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-csm-cloud-accounts-coverage-analysis) * [Get the CSM Hosts and Containers Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-csm-hosts-and-containers-coverage-analysis) * [Get the CSM Serverless Coverage Analysis](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-csm-serverless-coverage-analysis) * [CSM Threats](https://docs.datadoghq.com/api/latest/csm-threats/) * [Get all Workload Protection agent rules](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules) * [Get a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule) * [Create a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule) * [Update a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule) * [Delete a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule) * [Get all Workload Protection policies](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-policies) * [Get a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-policy) * [Create a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-policy) * [Update a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-policy) * [Delete a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-policy) * [Download the Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy) * [Get all Workload Protection agent rules (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules-us1-fed) * [Get a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule-us1-fed) * [Create a Workload Protection agent rule 
(US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule-us1-fed) * [Update a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule-us1-fed) * [Delete a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule-us1-fed) * [Download the Workload Protection policy (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy-us1-fed) * [Dashboard Lists](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Get all dashboard lists](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-dashboard-lists) * [Get items of a Dashboard List](https://docs.datadoghq.com/api/latest/csm-threats/#get-items-of-a-dashboard-list) * [Add Items to a Dashboard List](https://docs.datadoghq.com/api/latest/csm-threats/#add-items-to-a-dashboard-list) * [Create a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-dashboard-list) * [Get a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-dashboard-list) * [Update items of a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#update-items-of-a-dashboard-list) * [Delete items from a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#delete-items-from-a-dashboard-list) * [Update a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-dashboard-list) * [Delete a dashboard list](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-dashboard-list) * [Dashboards](https://docs.datadoghq.com/api/latest/dashboards/) * [Create a new dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-new-dashboard) * [Get a dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-dashboard) * [Get all dashboards](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-dashboards) * [Update a dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-dashboard) * [Delete a dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-dashboard) * [Delete dashboards](https://docs.datadoghq.com/api/latest/csm-threats/#delete-dashboards) * [Restore deleted dashboards](https://docs.datadoghq.com/api/latest/csm-threats/#restore-deleted-dashboards) * [Create a shared dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-shared-dashboard) * [Get a shared dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-shared-dashboard) * [Update a shared dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-shared-dashboard) * [Send shared dashboard invitation email](https://docs.datadoghq.com/api/latest/csm-threats/#send-shared-dashboard-invitation-email) * [Get all invitations for a shared dashboard](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-invitations-for-a-shared-dashboard) * [Revoke a shared dashboard URL](https://docs.datadoghq.com/api/latest/csm-threats/#revoke-a-shared-dashboard-url) * [Revoke shared dashboard invitations](https://docs.datadoghq.com/api/latest/csm-threats/#revoke-shared-dashboard-invitations) * [Datasets](https://docs.datadoghq.com/api/latest/datasets/) * [Create a dataset](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-dataset) * [Get a single dataset by ID](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-single-dataset-by-id) * [Get all 
endpoints](https://docs.datadoghq.com/api/latest/csm-threats/#get-billing-dimension-mapping-for-usage-endpoints) * [Get hourly usage by product family](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-by-product-family) * [Get hourly usage attribution](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-attribution) * [Get monthly usage attribution](https://docs.datadoghq.com/api/latest/csm-threats/#get-monthly-usage-attribution) * [Get active billing dimensions for cost attribution](https://docs.datadoghq.com/api/latest/csm-threats/#get-active-billing-dimensions-for-cost-attribution) * [Get billable usage across your account](https://docs.datadoghq.com/api/latest/csm-threats/#get-billable-usage-across-your-account) * [Get historical cost across your account](https://docs.datadoghq.com/api/latest/csm-threats/#get-historical-cost-across-your-account) * [Get Monthly Cost Attribution](https://docs.datadoghq.com/api/latest/csm-threats/#get-monthly-cost-attribution) * [Get estimated cost across your account](https://docs.datadoghq.com/api/latest/csm-threats/#get-estimated-cost-across-your-account) * [Get all custom metrics by hourly average](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-custom-metrics-by-hourly-average) * [Get projected cost across your account](https://docs.datadoghq.com/api/latest/csm-threats/#get-projected-cost-across-your-account) * [Get usage across your account](https://docs.datadoghq.com/api/latest/csm-threats/#get-usage-across-your-account) * [Get hourly usage for logs by index](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-logs-by-index) * [Get hourly logs usage by retention](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-logs-usage-by-retention) * [Get hourly usage for hosts and containers](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-hosts-and-containers) * [Get hourly usage for logs](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-logs) * [Get hourly usage for custom metrics](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-custom-metrics) * [Get hourly usage for indexed spans](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-indexed-spans) * [Get hourly usage for synthetics checks](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-synthetics-checks) * [Get hourly usage for synthetics API checks](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-synthetics-api-checks) * [Get hourly usage for synthetics browser checks](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-synthetics-browser-checks) * [Get hourly usage for Fargate](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-fargate) * [Get hourly usage for Lambda](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-lambda) * [Get hourly usage for RUM sessions](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-rum-sessions) * [Get hourly usage for network hosts](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-network-hosts) * [get hourly usage for network flows](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-network-flows) * [Get hourly usage for analyzed logs](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-analyzed-logs) * [Get hourly usage for SNMP 
devices](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-snmp-devices) * [Get hourly usage for ingested spans](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-ingested-spans) * [Get hourly usage for incident management](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-incident-management) * [Get hourly usage for IoT](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-iot) * [Get hourly usage for CSM Pro](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-csm-pro) * [Get hourly usage for cloud workload security](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-cloud-workload-security) * [Get hourly usage for database monitoring](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-database-monitoring) * [Get hourly usage for sensitive data scanner](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-sensitive-data-scanner) * [Get hourly usage for RUM units](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-rum-units) * [Get hourly usage for profiled hosts](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-profiled-hosts) * [Get hourly usage for CI visibility](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-ci-visibility) * [Get hourly usage for online archive](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-online-archive) * [Get hourly usage for Lambda traced invocations](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-lambda-traced-invocations) * [Get hourly usage for application security](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-application-security) * [Get hourly usage for observability pipelines](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-observability-pipelines) * [Get hourly usage for audit logs](https://docs.datadoghq.com/api/latest/csm-threats/#get-hourly-usage-for-audit-logs) * [Get the list of available daily custom reports](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-list-of-available-daily-custom-reports) * [Get specified daily custom reports](https://docs.datadoghq.com/api/latest/csm-threats/#get-specified-daily-custom-reports) * [Get the list of available monthly custom reports](https://docs.datadoghq.com/api/latest/csm-threats/#get-the-list-of-available-monthly-custom-reports) * [Get specified monthly custom reports](https://docs.datadoghq.com/api/latest/csm-threats/#get-specified-monthly-custom-reports) * [Get cost across multi-org account](https://docs.datadoghq.com/api/latest/csm-threats/#get-cost-across-multi-org-account) * [Users](https://docs.datadoghq.com/api/latest/users/) * [Create a user](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-user) * [List all users](https://docs.datadoghq.com/api/latest/csm-threats/#list-all-users) * [Get user details](https://docs.datadoghq.com/api/latest/csm-threats/#get-user-details) * [Update a user](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-user) * [Disable a user](https://docs.datadoghq.com/api/latest/csm-threats/#disable-a-user) * [Get a user organization](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-user-organization) * [Get a user permissions](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-user-permissions) * [Send invitation emails](https://docs.datadoghq.com/api/latest/csm-threats/#send-invitation-emails) * [Get a 
# CSM Threats

Workload Protection monitors file, network, and process activity across your environment to detect real-time threats to your infrastructure. See [Workload Protection](https://docs.datadoghq.com/security/workload_protection/) for more information on setting up Workload Protection.

**Note**: These endpoints are split based on whether you are using the US1-FED site or not. Please reference the specific resource for the site you are using.
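Every endpoint in this section is served from a site-specific API host (for example `api.datadoghq.com` for US1 or `api.datadoghq.eu` for EU), and the per-language examples below read the site and credentials from the `DD_SITE`, `DD_API_KEY`, and `DD_APP_KEY` environment variables. As a minimal sketch of how the pieces fit together, the following Python snippet lists the Workload Protection agent rules attached to a single Agent policy. It assumes those three environment variables are already exported, that the placeholder policy ID is replaced with one of your own, and that your version of the Python client exposes the documented `policy_id` query string as an optional keyword argument.

```
"""
Minimal sketch: list the Workload Protection agent rules for one Agent policy.

Assumes DD_SITE, DD_API_KEY, and DD_APP_KEY are exported in the environment
(for example DD_SITE="datadoghq.com" for US1), and that this version of the
Python client accepts the documented `policy_id` query string as a keyword
argument.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# Configuration() picks up DD_SITE, DD_API_KEY, and DD_APP_KEY from the environment.
configuration = Configuration()

with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)

    # Placeholder policy ID -- replace with the ID of one of your Agent policies.
    response = api_instance.list_csm_threats_agent_rules(
        policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255",
    )

    # Each item in `data` is an Agent rule with an `id` and an `attributes` object.
    for rule in response.data:
        print(rule.id, rule.attributes.name, rule.attributes.enabled)
```

The same three environment variables drive every per-language code example in the rest of this section.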
## [Get all Workload Protection agent rules](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.datadoghq.eu/api/v2/remote_config/products/cws/agent_ruleshttps://api.ddog-gov.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.us3.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules ### Overview Get the list of Workload Protection agent rules. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Arguments #### Query Strings Name Type Description policy_id string The ID of the Agent policy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes a list of Agent rule Expand All Field Type Description [object] A list of Agent rules objects object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": [ { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Get all Workload Protection agent rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Workload Protection agent rules ``` """ Get all Workload Protection agent rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.list_csm_threats_agent_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Workload Protection agent rules ``` # Get all Workload Protection agent rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new p api_instance.list_csm_threats_agent_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Workload Protection agent rules ``` // Get all Workload Protection agent rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.ListCSMThreatsAgentRules(ctx, *datadogV2.NewListCSMThreatsAgentRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.ListCSMThreatsAgentRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.ListCSMThreatsAgentRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the 
example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Workload Protection agent rules ``` // Get all Workload Protection agent rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRulesListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); try { CloudWorkloadSecurityAgentRulesListResponse result = apiInstance.listCSMThreatsAgentRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#listCSMThreatsAgentRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Workload Protection agent rules ``` // Get all Workload Protection agent rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::api_csm_threats::ListCSMThreatsAgentRulesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .list_csm_threats_agent_rules(ListCSMThreatsAgentRulesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Workload Protection agent rules ``` /** * Get all Workload Protection agent rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); apiInstance .listCSMThreatsAgentRules() .then((data: v2.CloudWorkloadSecurityAgentRulesListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id} ### Overview Get the details of a specific Workload Protection agent rule. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Arguments #### Path Parameters Name Type Description agent_rule_id [_required_] string The ID of the Agent rule #### Query Strings Name Type Description policy_id string The ID of the Agent policy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Get a Workload Protection agent rule Copy ``` # Path parameters export agent_rule_id="3b5-v82-ns6" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/${agent_rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a Workload Protection agent rule ``` """ Get a Workload Protection agent rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi # there is a valid "agent_rule_rc" in the system AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"] # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.get_csm_threats_agent_rule( agent_rule_id=AGENT_RULE_DATA_ID, policy_id=POLICY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a Workload Protection agent rule ``` # Get a Workload Protection agent rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "agent_rule_rc" in the system AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"] # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] opts = { policy_id: POLICY_DATA_ID, } p api_instance.get_csm_threats_agent_rule(AGENT_RULE_DATA_ID, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a Workload Protection agent rule ``` // Get a Workload Protection agent rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule_rc" in the system AgentRuleDataID := os.Getenv("AGENT_RULE_DATA_ID") // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.GetCSMThreatsAgentRule(ctx, AgentRuleDataID, *datadogV2.NewGetCSMThreatsAgentRuleOptionalParameters().WithPolicyId(PolicyDataID)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.GetCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.GetCSMThreatsAgentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a Workload Protection agent rule ``` // Get a Workload Protection agent rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.api.CsmThreatsApi.GetCSMThreatsAgentRuleOptionalParameters; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule_rc" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.getCSMThreatsAgentRule( AGENT_RULE_DATA_ID, new GetCSMThreatsAgentRuleOptionalParameters().policyId(POLICY_DATA_ID)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#getCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Workload Protection agent rule ``` // Get a Workload Protection agent rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::api_csm_threats::GetCSMThreatsAgentRuleOptionalParams; #[tokio::main] async fn main() { // there is a valid "agent_rule_rc" in the system let agent_rule_data_id = std::env::var("AGENT_RULE_DATA_ID").unwrap(); // there is a valid "policy_rc" in the system let policy_data_id = 
std::env::var("POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .get_csm_threats_agent_rule( agent_rule_data_id.clone(), GetCSMThreatsAgentRuleOptionalParams::default().policy_id(policy_data_id.clone()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a Workload Protection agent rule ``` /** * Get a Workload Protection agent rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule_rc" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiGetCSMThreatsAgentRuleRequest = { agentRuleId: AGENT_RULE_DATA_ID, policyId: POLICY_DATA_ID, }; apiInstance .getCSMThreatsAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.datadoghq.eu/api/v2/remote_config/products/cws/agent_ruleshttps://api.ddog-gov.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.us3.datadoghq.com/api/v2/remote_config/products/cws/agent_ruleshttps://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules ### Overview Create a new Workload Protection agent rule with the given parameters. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Request #### Body Data (required) The definition of the new agent rule * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent rule _required_] object Create a new Cloud Workload Security Agent rule. 
[object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. The value of the set action Option 1 string Option 2 integer Option 3 boolean agent_version string Constrain the rule to specific versions of the Datadog Agent. blocking [string] The blocking policies that the rule belongs to. description string The description of the Agent rule. disabled [string] The disabled policies that the rule belongs to. enabled boolean Whether the Agent rule is enabled. expression [_required_] string The SECL expression of the Agent rule. filters [string] The platforms the Agent rule is supported on. monitoring [string] The monitoring policies that the rule belongs to. name [_required_] string The name of the Agent rule. policy_id string The ID of the policy where the Agent rule is saved. product_tags [string] The list of product tags associated with the rule. silent boolean Whether the rule is silent. 
type [_required_] enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ##### Create a Workload Protection agent rule returns "OK" response ``` { "data": { "attributes": { "description": "My Agent rule", "enabled": true, "expression": "exec.file.name == \"sh\"", "agent_version": "> 7.60", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [] }, "type": "agent_rule" } } ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` { "data": { "attributes": { "description": "My Agent rule with set action", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [], "actions": [ { "set": { "name": "test_set", "value": "test_value", "scope": "process", "inherited": true } }, { "hash": { "field": "exec.file" } } ] }, "type": "agent_rule" } } ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` { "data": { "attributes": { "description": "My Agent rule with set action with expression", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [], "actions": [ { "set": { "name": "test_set", "expression": "exec.file.path", "default_value": "/dev/null", "scope": "process" } } ] }, "type": "agent_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentRule-403-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Create a Workload Protection agent rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "My Agent rule", "enabled": true, "expression": "exec.file.name == \"sh\"", "agent_version": "> 7.60", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [] }, "type": "agent_rule" } } EOF ``` ##### Create a Workload Protection agent rule with set action returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "My Agent rule with set action", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [], "actions": [ { "set": { "name": "test_set", "value": "test_value", "scope": "process", "inherited": true } }, { "hash": { "field": "exec.file" } } ] }, "type": "agent_rule" } } EOF ``` ##### Create a Workload Protection agent rule with set action with expression returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "My Agent rule with set action with expression", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat", "policy_id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "product_tags": [], 
"actions": [ { "set": { "name": "test_set", "expression": "exec.file.path", "default_value": "/dev/null", "scope": "process" } } ] }, "type": "agent_rule" } } EOF ``` ##### Create a Workload Protection agent rule returns "OK" response ``` // Create a Workload Protection agent rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentRuleCreateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleCreateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleCreateAttributes{ Description: datadog.PtrString("My Agent rule"), Enabled: datadog.PtrBool(true), Expression: `exec.file.name == "sh"`, AgentVersion: datadog.PtrString("> 7.60"), Filters: []string{}, Name: "examplecsmthreat", PolicyId: datadog.PtrString(PolicyDataID), ProductTags: []string{}, }, Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.CreateCSMThreatsAgentRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.CreateCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.CreateCSMThreatsAgentRule`:\n%s\n", responseContent) } ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` // Create a Workload Protection agent rule with set action returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentRuleCreateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleCreateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleCreateAttributes{ Description: datadog.PtrString("My Agent rule with set action"), Enabled: datadog.PtrBool(true), Expression: `exec.file.name == "sh"`, Filters: []string{}, Name: "examplecsmthreat", PolicyId: datadog.PtrString(PolicyDataID), ProductTags: []string{}, Actions: []datadogV2.CloudWorkloadSecurityAgentRuleAction{ { Set: &datadogV2.CloudWorkloadSecurityAgentRuleActionSet{ Name: datadog.PtrString("test_set"), Value: &datadogV2.CloudWorkloadSecurityAgentRuleActionSetValue{ String: datadog.PtrString("test_value")}, Scope: datadog.PtrString("process"), Inherited: datadog.PtrBool(true), }, }, { Hash: &datadogV2.CloudWorkloadSecurityAgentRuleActionHash{ Field: datadog.PtrString("exec.file"), }, }, }, }, Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.CreateCSMThreatsAgentRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.CreateCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.CreateCSMThreatsAgentRule`:\n%s\n", responseContent) } ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` // Create a Workload Protection agent rule with set action with expression returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentRuleCreateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleCreateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleCreateAttributes{ Description: datadog.PtrString("My Agent rule with set action with expression"), Enabled: datadog.PtrBool(true), Expression: `exec.file.name == "sh"`, Filters: []string{}, Name: "examplecsmthreat", PolicyId: datadog.PtrString(PolicyDataID), ProductTags: []string{}, Actions: []datadogV2.CloudWorkloadSecurityAgentRuleAction{ { Set: &datadogV2.CloudWorkloadSecurityAgentRuleActionSet{ Name: datadog.PtrString("test_set"), Expression: datadog.PtrString("exec.file.path"), DefaultValue: datadog.PtrString("/dev/null"), Scope: datadog.PtrString("process"), }, }, }, }, Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.CreateCSMThreatsAgentRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.CreateCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.CreateCSMThreatsAgentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Workload Protection agent rule returns "OK" response ``` // Create a Workload Protection agent rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateRequest; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); CloudWorkloadSecurityAgentRuleCreateRequest body = new CloudWorkloadSecurityAgentRuleCreateRequest() .data( new CloudWorkloadSecurityAgentRuleCreateData() 
.attributes( new CloudWorkloadSecurityAgentRuleCreateAttributes() .description("My Agent rule") .enabled(true) .expression(""" exec.file.name == "sh" """) .agentVersion("> 7.60") .name("examplecsmthreat") .policyId(POLICY_DATA_ID)) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.createCSMThreatsAgentRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#createCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` // Create a Workload Protection agent rule with set action returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleAction; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleActionHash; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleActionSet; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleActionSetValue; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateRequest; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); CloudWorkloadSecurityAgentRuleCreateRequest body = new CloudWorkloadSecurityAgentRuleCreateRequest() .data( new CloudWorkloadSecurityAgentRuleCreateData() .attributes( new CloudWorkloadSecurityAgentRuleCreateAttributes() .description("My Agent rule with set action") .enabled(true) .expression(""" exec.file.name == "sh" """) .name("examplecsmthreat") .policyId(POLICY_DATA_ID) .actions( Arrays.asList( new CloudWorkloadSecurityAgentRuleAction() .set( new CloudWorkloadSecurityAgentRuleActionSet() .name("test_set") .value( new CloudWorkloadSecurityAgentRuleActionSetValue( "test_value")) .scope("process") .inherited(true)), new CloudWorkloadSecurityAgentRuleAction() .hash( new CloudWorkloadSecurityAgentRuleActionHash() .field("exec.file"))))) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.createCSMThreatsAgentRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#createCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` // Create a Workload Protection agent rule with set action with expression returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleAction; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleActionSet; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateRequest; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); CloudWorkloadSecurityAgentRuleCreateRequest body = new CloudWorkloadSecurityAgentRuleCreateRequest() .data( new CloudWorkloadSecurityAgentRuleCreateData() .attributes( new CloudWorkloadSecurityAgentRuleCreateAttributes() .description("My Agent rule with set action with expression") .enabled(true) .expression(""" exec.file.name == "sh" """) .name("examplecsmthreat") .policyId(POLICY_DATA_ID) .actions( Collections.singletonList( new CloudWorkloadSecurityAgentRuleAction() .set( new CloudWorkloadSecurityAgentRuleActionSet() .name("test_set") .expression("exec.file.path") .defaultValue("/dev/null") .scope("process"))))) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.createCSMThreatsAgentRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#createCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Workload Protection agent rule returns "OK" response ``` """ Create a Workload Protection agent rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_attributes import ( CloudWorkloadSecurityAgentRuleCreateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_data import ( CloudWorkloadSecurityAgentRuleCreateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_request import ( CloudWorkloadSecurityAgentRuleCreateRequest, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] body = CloudWorkloadSecurityAgentRuleCreateRequest( data=CloudWorkloadSecurityAgentRuleCreateData( 
attributes=CloudWorkloadSecurityAgentRuleCreateAttributes( description="My Agent rule", enabled=True, expression='exec.file.name == "sh"', agent_version="> 7.60", filters=[], name="examplecsmthreat", policy_id=POLICY_DATA_ID, product_tags=[], ), type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.create_csm_threats_agent_rule(body=body) print(response) ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` """ Create a Workload Protection agent rule with set action returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_rule_action import CloudWorkloadSecurityAgentRuleAction from datadog_api_client.v2.model.cloud_workload_security_agent_rule_action_hash import ( CloudWorkloadSecurityAgentRuleActionHash, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_action_set import ( CloudWorkloadSecurityAgentRuleActionSet, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_attributes import ( CloudWorkloadSecurityAgentRuleCreateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_data import ( CloudWorkloadSecurityAgentRuleCreateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_request import ( CloudWorkloadSecurityAgentRuleCreateRequest, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] body = CloudWorkloadSecurityAgentRuleCreateRequest( data=CloudWorkloadSecurityAgentRuleCreateData( attributes=CloudWorkloadSecurityAgentRuleCreateAttributes( description="My Agent rule with set action", enabled=True, expression='exec.file.name == "sh"', filters=[], name="examplecsmthreat", policy_id=POLICY_DATA_ID, product_tags=[], actions=[ CloudWorkloadSecurityAgentRuleAction( set=CloudWorkloadSecurityAgentRuleActionSet( name="test_set", value="test_value", scope="process", inherited=True, ), ), CloudWorkloadSecurityAgentRuleAction( hash=CloudWorkloadSecurityAgentRuleActionHash( field="exec.file", ), ), ], ), type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.create_csm_threats_agent_rule(body=body) print(response) ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` """ Create a Workload Protection agent rule with set action with expression returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_rule_action import CloudWorkloadSecurityAgentRuleAction from datadog_api_client.v2.model.cloud_workload_security_agent_rule_action_set import ( CloudWorkloadSecurityAgentRuleActionSet, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_attributes import ( CloudWorkloadSecurityAgentRuleCreateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_data import 
( CloudWorkloadSecurityAgentRuleCreateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_request import ( CloudWorkloadSecurityAgentRuleCreateRequest, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] body = CloudWorkloadSecurityAgentRuleCreateRequest( data=CloudWorkloadSecurityAgentRuleCreateData( attributes=CloudWorkloadSecurityAgentRuleCreateAttributes( description="My Agent rule with set action with expression", enabled=True, expression='exec.file.name == "sh"', filters=[], name="examplecsmthreat", policy_id=POLICY_DATA_ID, product_tags=[], actions=[ CloudWorkloadSecurityAgentRuleAction( set=CloudWorkloadSecurityAgentRuleActionSet( name="test_set", expression="exec.file.path", default_value="/dev/null", scope="process", ), ), ], ), type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.create_csm_threats_agent_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Workload Protection agent rule returns "OK" response ``` # Create a Workload Protection agent rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateAttributes.new({ description: "My Agent rule", enabled: true, expression: 'exec.file.name == "sh"', agent_version: "> 7.60", filters: [], name: "examplecsmthreat", policy_id: POLICY_DATA_ID, product_tags: [], }), type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) p api_instance.create_csm_threats_agent_rule(body) ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` # Create a Workload Protection agent rule with set action returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateAttributes.new({ description: "My Agent rule with set action", enabled: true, expression: 'exec.file.name == "sh"', filters: [], name: "examplecsmthreat", policy_id: POLICY_DATA_ID, product_tags: [], actions: [ DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleAction.new({ set: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleActionSet.new({ name: "test_set", value: "test_value", scope: "process", inherited: true, }), }), DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleAction.new({ _hash: 
DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleActionHash.new({ field: "exec.file", }), }), ], }), type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) p api_instance.create_csm_threats_agent_rule(body) ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` # Create a Workload Protection agent rule with set action with expression returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateAttributes.new({ description: "My Agent rule with set action with expression", enabled: true, expression: 'exec.file.name == "sh"', filters: [], name: "examplecsmthreat", policy_id: POLICY_DATA_ID, product_tags: [], actions: [ DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleAction.new({ set: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleActionSet.new({ name: "test_set", expression: "exec.file.path", default_value: "/dev/null", scope: "process", }), }), ], }), type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) p api_instance.create_csm_threats_agent_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Workload Protection agent rule returns "OK" response ``` // Create a Workload Protection agent rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateRequest; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentRuleCreateRequest::new( CloudWorkloadSecurityAgentRuleCreateData::new( CloudWorkloadSecurityAgentRuleCreateAttributes::new( r#"exec.file.name == "sh""#.to_string(), "examplecsmthreat".to_string(), ) .agent_version("> 7.60".to_string()) .description("My Agent rule".to_string()) .enabled(true) .filters(vec![]) .policy_id(policy_data_id.clone()) .product_tags(vec![]), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.create_csm_threats_agent_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` // Create a Workload Protection agent rule with set action returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use 
datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleAction; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleActionHash; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleActionSet; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleActionSetValue; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateRequest; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentRuleCreateRequest::new( CloudWorkloadSecurityAgentRuleCreateData::new( CloudWorkloadSecurityAgentRuleCreateAttributes::new( r#"exec.file.name == "sh""#.to_string(), "examplecsmthreat".to_string(), ) .actions(Some(vec![ CloudWorkloadSecurityAgentRuleAction::new().set( CloudWorkloadSecurityAgentRuleActionSet::new() .inherited(true) .name("test_set".to_string()) .scope("process".to_string()) .value(CloudWorkloadSecurityAgentRuleActionSetValue::String( "test_value".to_string(), )), ), CloudWorkloadSecurityAgentRuleAction::new().hash( CloudWorkloadSecurityAgentRuleActionHash::new().field("exec.file".to_string()), ), ])) .description("My Agent rule with set action".to_string()) .enabled(true) .filters(vec![]) .policy_id(policy_data_id.clone()) .product_tags(vec![]), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.create_csm_threats_agent_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` // Create a Workload Protection agent rule with set action with expression returns // "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleAction; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleActionSet; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateRequest; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentRuleCreateRequest::new( CloudWorkloadSecurityAgentRuleCreateData::new( CloudWorkloadSecurityAgentRuleCreateAttributes::new( r#"exec.file.name == "sh""#.to_string(), "examplecsmthreat".to_string(), ) .actions(Some(vec![CloudWorkloadSecurityAgentRuleAction::new().set( CloudWorkloadSecurityAgentRuleActionSet::new() .default_value("/dev/null".to_string()) .expression("exec.file.path".to_string()) .name("test_set".to_string()) .scope("process".to_string()), )])) .description("My Agent rule with set action with expression".to_string()) .enabled(true) .filters(vec![]) .policy_id(policy_data_id.clone()) 
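                // The remaining optional attributes are supplied through builder methods; the
                // SECL expression and rule name were already passed to
                // CloudWorkloadSecurityAgentRuleCreateAttributes::new above.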
.product_tags(vec![]), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.create_csm_threats_agent_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Workload Protection agent rule returns "OK" response ``` /** * Create a Workload Protection agent rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiCreateCSMThreatsAgentRuleRequest = { body: { data: { attributes: { description: "My Agent rule", enabled: true, expression: `exec.file.name == "sh"`, agentVersion: "> 7.60", filters: [], name: "examplecsmthreat", policyId: POLICY_DATA_ID, productTags: [], }, type: "agent_rule", }, }, }; apiInstance .createCSMThreatsAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a Workload Protection agent rule with set action returns "OK" response ``` /** * Create a Workload Protection agent rule with set action returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiCreateCSMThreatsAgentRuleRequest = { body: { data: { attributes: { description: "My Agent rule with set action", enabled: true, expression: `exec.file.name == "sh"`, filters: [], name: "examplecsmthreat", policyId: POLICY_DATA_ID, productTags: [], actions: [ { set: { name: "test_set", value: "test_value", scope: "process", inherited: true, }, }, { hash: { field: "exec.file", }, }, ], }, type: "agent_rule", }, }, }; apiInstance .createCSMThreatsAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a Workload Protection agent rule with set action with expression returns "OK" response ``` /** * Create a Workload Protection agent rule with set action with expression returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiCreateCSMThreatsAgentRuleRequest = { body: { data: { attributes: { description: "My Agent rule with set action with expression", enabled: true, expression: `exec.file.name == "sh"`, filters: [], name: "examplecsmthreat", policyId: POLICY_DATA_ID, productTags: [], actions: [ { set: { name: "test_set", expression: "exec.file.path", defaultValue: "/dev/null", scope: "process", }, }, ], }, type: "agent_rule", }, }, }; apiInstance .createCSMThreatsAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id} ### Overview Update a specific Workload Protection Agent rule. Returns the agent rule object when the request is successful. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. 
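Before the full request schema below, a minimal sketch of an update call may help orient the reader. It is hypothetical: the IDs and the `api.datadoghq.com` host are placeholders, and the attribute names (`description`, `enabled`, `expression`) and the optional `policy_id` query string are taken from the Arguments and Request sections that follow.

```
# Hypothetical sketch with placeholder IDs; pick the host for your Datadog site
export agent_rule_id="3b5-v82-ns6"
export policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255"
curl -X PATCH "https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/${agent_rule_id}?policy_id=${policy_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "description": "My Agent rule",
      "enabled": true,
      "expression": "exec.file.name == \"sh\""
    },
    "id": "${agent_rule_id}",
    "type": "agent_rule"
  }
}
EOF
```

A successful request returns the updated agent rule object, as shown in the 200 response example further down; the language-specific examples under Code Example pass the same `policy_id` value through their optional parameters.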
### Arguments #### Path Parameters Name Type Description agent_rule_id [_required_] string The ID of the Agent rule #### Query Strings Name Type Description policy_id string The ID of the Agent policy ### Request #### Body Data (required) New definition of the agent rule * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent rule _required_] object Update an existing Cloud Workload Security Agent rule [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. The value of the set action Option 1 string Option 2 integer Option 3 boolean agent_version string Constrain the rule to specific versions of the Datadog Agent blocking [string] The blocking policies that the rule belongs to description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule monitoring [string] The monitoring policies that the rule belongs to policy_id string The ID of the policy where the Agent rule is saved product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. 
id string The ID of the Agent rule type [_required_] enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "description": "Test Agent rule", "enabled": true, "expression": "exec.file.name == \"sh\"" }, "type": "agent_rule", "id": "3dd-0uc-h1s" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-404-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. 
updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript)

##### Update a Workload Protection agent rule

```
# Path parameters
export agent_rule_id="3b5-v82-ns6"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, api.datadoghq.eu, api.ddog-gov.com,
# api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/${agent_rule_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {},
    "type": "agent_rule"
  }
}
EOF
```

##### Update a Workload Protection agent rule

```
"""
Update a Workload Protection agent rule returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi
from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType
from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_attributes import (
    CloudWorkloadSecurityAgentRuleUpdateAttributes,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_data import (
    CloudWorkloadSecurityAgentRuleUpdateData,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_request import (
    CloudWorkloadSecurityAgentRuleUpdateRequest,
)

# there is a valid "agent_rule_rc" in the system
AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"]

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = environ["POLICY_DATA_ID"]

body = CloudWorkloadSecurityAgentRuleUpdateRequest(
    data=CloudWorkloadSecurityAgentRuleUpdateData(
        attributes=CloudWorkloadSecurityAgentRuleUpdateAttributes(
            description="My Agent rule",
            enabled=True,
            expression='exec.file.name == "sh"',
            policy_id=POLICY_DATA_ID,
            product_tags=[],
        ),
        id=AGENT_RULE_DATA_ID,
        type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    response = api_instance.update_csm_threats_agent_rule(
        agent_rule_id=AGENT_RULE_DATA_ID, policy_id=POLICY_DATA_ID, body=body
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Update a Workload Protection agent rule

```
# Update a Workload Protection agent rule returns "OK" response
require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new

# there is a valid "agent_rule_rc" in the system
AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"]

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = ENV["POLICY_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateAttributes.new({ description: "My Agent rule", enabled: true, expression: 'exec.file.name == "sh"', policy_id: POLICY_DATA_ID, product_tags: [], }), id: AGENT_RULE_DATA_ID, type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) opts = { policy_id: POLICY_DATA_ID, } p api_instance.update_csm_threats_agent_rule(AGENT_RULE_DATA_ID, body, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a Workload Protection agent rule ``` // Update a Workload Protection agent rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule_rc" in the system AgentRuleDataID := os.Getenv("AGENT_RULE_DATA_ID") // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentRuleUpdateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleUpdateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleUpdateAttributes{ Description: datadog.PtrString("My Agent rule"), Enabled: datadog.PtrBool(true), Expression: datadog.PtrString(`exec.file.name == "sh"`), PolicyId: datadog.PtrString(PolicyDataID), ProductTags: []string{}, }, Id: datadog.PtrString(AgentRuleDataID), Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.UpdateCSMThreatsAgentRule(ctx, AgentRuleDataID, body, *datadogV2.NewUpdateCSMThreatsAgentRuleOptionalParameters().WithPolicyId(PolicyDataID)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.UpdateCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.UpdateCSMThreatsAgentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a Workload Protection agent rule ``` // Update a Workload Protection agent rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.api.CsmThreatsApi.UpdateCSMThreatsAgentRuleOptionalParameters; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import 
com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule_rc" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); CloudWorkloadSecurityAgentRuleUpdateRequest body = new CloudWorkloadSecurityAgentRuleUpdateRequest() .data( new CloudWorkloadSecurityAgentRuleUpdateData() .attributes( new CloudWorkloadSecurityAgentRuleUpdateAttributes() .description("My Agent rule") .enabled(true) .expression(""" exec.file.name == "sh" """) .policyId(POLICY_DATA_ID)) .id(AGENT_RULE_DATA_ID) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.updateCSMThreatsAgentRule( AGENT_RULE_DATA_ID, body, new UpdateCSMThreatsAgentRuleOptionalParameters().policyId(POLICY_DATA_ID)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#updateCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a Workload Protection agent rule ``` // Update a Workload Protection agent rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::api_csm_threats::UpdateCSMThreatsAgentRuleOptionalParams; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateRequest; #[tokio::main] async fn main() { // there is a valid "agent_rule_rc" in the system let agent_rule_data_id = std::env::var("AGENT_RULE_DATA_ID").unwrap(); // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentRuleUpdateRequest::new( CloudWorkloadSecurityAgentRuleUpdateData::new( CloudWorkloadSecurityAgentRuleUpdateAttributes::new() .description("My Agent rule".to_string()) .enabled(true) .expression(r#"exec.file.name == "sh""#.to_string()) .policy_id(policy_data_id.clone()) .product_tags(vec![]), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ) .id(agent_rule_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .update_csm_threats_agent_rule( agent_rule_data_id.clone(), 
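            // `body` carries the updated rule attributes; the optional parameters below add the
            // policy_id query string to the request.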
body, UpdateCSMThreatsAgentRuleOptionalParams::default().policy_id(policy_data_id.clone()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a Workload Protection agent rule ``` /** * Update a Workload Protection agent rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule_rc" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiUpdateCSMThreatsAgentRuleRequest = { body: { data: { attributes: { description: "My Agent rule", enabled: true, expression: `exec.file.name == "sh"`, policyId: POLICY_DATA_ID, productTags: [], }, id: AGENT_RULE_DATA_ID, type: "agent_rule", }, }, agentRuleId: AGENT_RULE_DATA_ID, policyId: POLICY_DATA_ID, }; apiInstance .updateCSMThreatsAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a Workload Protection agent rule](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/{agent_rule_id} ### Overview Delete a specific Workload Protection agent rule. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. 
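The Arguments section below documents an optional `policy_id` query string that the bundled curl example does not show. As an illustration (not an official example), a delete request scoped to a policy could look like the following sketch, where the IDs and the `api.datadoghq.com` host are placeholders.

```
# Hypothetical sketch with placeholder IDs; pick the host for your Datadog site
export agent_rule_id="3b5-v82-ns6"
export policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255"
curl -X DELETE "https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/${agent_rule_id}?policy_id=${policy_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

A successful call returns 204 with an empty body, per the Response section.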
### Arguments

#### Path Parameters

Name Type Description agent_rule_id [_required_] string The ID of the Agent rule

#### Query Strings

Name Type Description policy_id string The ID of the Agent policy

### Response

* [204](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentRule-429-v2)

OK

Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript)

##### Delete a Workload Protection agent rule

```
# Path parameters
export agent_rule_id="3b5-v82-ns6"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, api.datadoghq.eu, api.ddog-gov.com,
# api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/remote_config/products/cws/agent_rules/${agent_rule_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a Workload Protection agent rule

```
"""
Delete a Workload Protection agent rule returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# there is a valid "agent_rule_rc" in the system
AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"]

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = environ["POLICY_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    api_instance.delete_csm_threats_agent_rule(
        agent_rule_id=AGENT_RULE_DATA_ID,
        policy_id=POLICY_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete
a Workload Protection agent rule ``` # Delete a Workload Protection agent rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "agent_rule_rc" in the system AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"] # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] opts = { policy_id: POLICY_DATA_ID, } api_instance.delete_csm_threats_agent_rule(AGENT_RULE_DATA_ID, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a Workload Protection agent rule ``` // Delete a Workload Protection agent rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule_rc" in the system AgentRuleDataID := os.Getenv("AGENT_RULE_DATA_ID") // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) r, err := api.DeleteCSMThreatsAgentRule(ctx, AgentRuleDataID, *datadogV2.NewDeleteCSMThreatsAgentRuleOptionalParameters().WithPolicyId(PolicyDataID)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.DeleteCSMThreatsAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a Workload Protection agent rule ``` // Delete a Workload Protection agent rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.api.CsmThreatsApi.DeleteCSMThreatsAgentRuleOptionalParameters; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule_rc" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); try { apiInstance.deleteCSMThreatsAgentRule( AGENT_RULE_DATA_ID, new DeleteCSMThreatsAgentRuleOptionalParameters().policyId(POLICY_DATA_ID)); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#deleteCSMThreatsAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save 
the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a Workload Protection agent rule ``` // Delete a Workload Protection agent rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::api_csm_threats::DeleteCSMThreatsAgentRuleOptionalParams; #[tokio::main] async fn main() { // there is a valid "agent_rule_rc" in the system let agent_rule_data_id = std::env::var("AGENT_RULE_DATA_ID").unwrap(); // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .delete_csm_threats_agent_rule( agent_rule_data_id.clone(), DeleteCSMThreatsAgentRuleOptionalParams::default().policy_id(policy_data_id.clone()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a Workload Protection agent rule ``` /** * Delete a Workload Protection agent rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule_rc" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiDeleteCSMThreatsAgentRuleRequest = { agentRuleId: AGENT_RULE_DATA_ID, policyId: POLICY_DATA_ID, }; apiInstance .deleteCSMThreatsAgentRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all Workload Protection policies](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-policies) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-policies-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.datadoghq.eu/api/v2/remote_config/products/cws/policyhttps://api.ddog-gov.com/api/v2/remote_config/products/cws/policyhttps://api.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy ### Overview Get the list of Workload Protection policies. 
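Each policy in the list carries attributes such as `name` and `datadogManaged` (shown in the 200 response example below), which makes it easy to tell the Datadog-managed default policy apart from custom policies. A minimal sketch using the Python client from the examples below; the snake_case attribute names (for example `datadog_managed` for `datadogManaged`) follow the client's naming convention and, together with the printed fields, are assumptions rather than part of the endpoint definition:

```
"""
List Workload Protection policies and separate Datadog-managed ones from custom ones.
Sketch only: attribute names assume the Python client's snake_case convention.
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    response = api_instance.list_csm_threats_agent_policies()

    for policy in response.data:
        attrs = policy.attributes
        # datadogManaged is True for the Datadog-managed default policy
        origin = "datadog-managed" if getattr(attrs, "datadog_managed", False) else "custom"
        print(f"{policy.id}  {attrs.name}  ({origin})")
```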
**Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentPolicies-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentPolicies-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#ListCSMThreatsAgentPolicies-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes a list of Agent policies Expand All Field Type Description [object] A list of Agent policy objects object A Cloud Workload Security Agent policy returned by the API blockingRulesCount int32 The number of rules with the blocking feature in this policy datadogManaged boolean Whether the policy is managed by Datadog description string The description of the policy disabledRulesCount int32 The number of rules that are disabled in this policy enabled boolean Whether the Agent policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR monitoringRulesCount int32 The number of rules in the monitoring state in this policy name string The name of the policy pinned boolean Whether the policy is pinned policyType string The type of the policy policyVersion string The version of the policy priority int64 The priority of the policy ruleCount int32 The number of rules in this policy updateDate int64 Timestamp in milliseconds when the policy was last updated updatedAt int64 When the policy was last updated, timestamp in milliseconds object The attributes of the user who last updated the policy handle string The handle of the user name string The name of the user [object] The versions of the policy date string The date and time the version was created name string The version of the policy id string The ID of the Agent policy type enum The type of the resource, must always be `policy` Allowed enum values: `policy` default: `policy` ``` { "data": [ { "attributes": { "blockingRulesCount": 100, "datadogManaged": false, "description": "My agent policy", "disabledRulesCount": 100, "enabled": true, "hostTags": [], "hostTagsLists": [], "monitoringRulesCount": 100, "name": "my_agent_policy", "pinned": false, "policyType": "policy", "policyVersion": "1", "priority": 10, "ruleCount": 100, "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "versions": [ { "date": "string", "name": "1.47.0-rc2" } ] }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript)

##### Get all Workload Protection policies

```
# Curl command (use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/cws/policy" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get all Workload Protection policies

```
"""
Get all Workload Protection policies returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    response = api_instance.list_csm_threats_agent_policies()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get all Workload Protection policies

```
# Get all Workload Protection policies returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new
p api_instance.list_csm_threats_agent_policies()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all Workload Protection policies

```
// Get all Workload Protection policies returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewCSMThreatsApi(apiClient)
    resp, r, err := api.ListCSMThreatsAgentPolicies(ctx)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.ListCSMThreatsAgentPolicies`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.ListCSMThreatsAgentPolicies`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Workload Protection policies ``` // Get all Workload Protection policies returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPoliciesListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); try { CloudWorkloadSecurityAgentPoliciesListResponse result = apiInstance.listCSMThreatsAgentPolicies(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#listCSMThreatsAgentPolicies"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Workload Protection policies ``` // Get all Workload Protection policies returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.list_csm_threats_agent_policies().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Workload Protection policies ``` /** * Get all Workload Protection policies returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); apiInstance .listCSMThreatsAgentPolicies() .then((data: v2.CloudWorkloadSecurityAgentPoliciesListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-policy-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id} ### Overview Get the details of a specific Workload Protection policy. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string The ID of the Agent policy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentPolicy-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#GetCSMThreatsAgentPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent policy Expand All Field Type Description object Object for a single Agent policy object A Cloud Workload Security Agent policy returned by the API blockingRulesCount int32 The number of rules with the blocking feature in this policy datadogManaged boolean Whether the policy is managed by Datadog description string The description of the policy disabledRulesCount int32 The number of rules that are disabled in this policy enabled boolean Whether the Agent policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR monitoringRulesCount int32 The number of rules in the monitoring state in this policy name string The name of the policy pinned boolean Whether the policy is pinned policyType string The type of the policy policyVersion string The version of the policy priority int64 The priority of the policy ruleCount int32 The number of rules in this policy updateDate int64 Timestamp in milliseconds when the policy was last updated updatedAt int64 When the policy was last updated, timestamp in milliseconds object The attributes of the user who last updated the policy handle string The handle of the user name string The name of the user [object] The versions of the policy date string The date and time the version was created name 
string The version of the policy id string The ID of the Agent policy type enum The type of the resource, must always be `policy` Allowed enum values: `policy` default: `policy` ``` { "data": { "attributes": { "blockingRulesCount": 100, "datadogManaged": false, "description": "My agent policy", "disabledRulesCount": 100, "enabled": true, "hostTags": [], "hostTagsLists": [], "monitoringRulesCount": 100, "name": "my_agent_policy", "pinned": false, "policyType": "policy", "policyVersion": "1", "priority": 10, "ruleCount": 100, "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "versions": [ { "date": "string", "name": "1.47.0-rc2" } ] }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Get a Workload Protection policy Copy ``` # Path parameters export policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/${policy_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a Workload Protection policy ``` """ Get a Workload Protection policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.get_csm_threats_agent_policy( policy_id=POLICY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a Workload Protection policy

```
# Get a Workload Protection policy returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = ENV["POLICY_DATA_ID"]

p api_instance.get_csm_threats_agent_policy(POLICY_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a Workload Protection policy

```
// Get a Workload Protection policy returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "policy_rc" in the system
    PolicyDataID := os.Getenv("POLICY_DATA_ID")

    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewCSMThreatsApi(apiClient)
    resp, r, err := api.GetCSMThreatsAgentPolicy(ctx, PolicyDataID)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.GetCSMThreatsAgentPolicy`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.GetCSMThreatsAgentPolicy`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a Workload Protection policy

```
// Get a Workload Protection policy returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.CsmThreatsApi;
import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient);

    // there is a valid "policy_rc" in the system
    String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID");

    try {
      CloudWorkloadSecurityAgentPolicyResponse result = apiInstance.getCSMThreatsAgentPolicy(POLICY_DATA_ID);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling CsmThreatsApi#getCSMThreatsAgentPolicy");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Workload Protection policy ``` // Get a Workload Protection policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .get_csm_threats_agent_policy(policy_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a Workload Protection policy ``` /** * Get a Workload Protection policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiGetCSMThreatsAgentPolicyRequest = { policyId: POLICY_DATA_ID, }; apiInstance .getCSMThreatsAgentPolicy(params) .then((data: v2.CloudWorkloadSecurityAgentPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-policy-v2) POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.datadoghq.eu/api/v2/remote_config/products/cws/policyhttps://api.ddog-gov.com/api/v2/remote_config/products/cws/policyhttps://api.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policyhttps://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy ### Overview Create a new Workload Protection policy with the given parameters. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. 
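The `hostTagsLists` attribute in the request body controls where the policy is deployed: tags inside one inner list are combined with AND, and the inner lists themselves are combined with OR. A minimal sketch using the Python client from the examples below (the tag values, policy name, and description are illustrative):

```
"""
Create a Workload Protection policy scoped with hostTagsLists.

Targets hosts tagged (env:prod AND team:payments) OR (env:staging):
the inner lists are ANDed, the outer list is ORed. Values are illustrative.
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_attributes import (
    CloudWorkloadSecurityAgentPolicyCreateAttributes,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_data import (
    CloudWorkloadSecurityAgentPolicyCreateData,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_request import (
    CloudWorkloadSecurityAgentPolicyCreateRequest,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_type import CloudWorkloadSecurityAgentPolicyType

body = CloudWorkloadSecurityAgentPolicyCreateRequest(
    data=CloudWorkloadSecurityAgentPolicyCreateData(
        attributes=CloudWorkloadSecurityAgentPolicyCreateAttributes(
            description="Policy scoped by host tags",
            enabled=True,
            host_tags_lists=[
                ["env:prod", "team:payments"],  # env:prod AND team:payments
                ["env:staging"],                # OR env:staging
            ],
            name="scoped_agent_policy",
        ),
        type=CloudWorkloadSecurityAgentPolicyType.POLICY,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    print(api_instance.create_csm_threats_agent_policy(body=body))
```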
### Request #### Body Data (required) The definition of the new Agent policy * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent rule _required_] object Create a new Cloud Workload Security Agent policy description string The description of the policy enabled boolean Whether the policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR name [_required_] string The name of the policy type [_required_] enum The type of the resource, must always be `policy` Allowed enum values: `policy` default: `policy` ``` { "data": { "attributes": { "description": "My agent policy", "enabled": true, "hostTagsLists": [ [ "env:test" ] ], "name": "my_agent_policy_2" }, "type": "policy" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentPolicy-403-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentPolicy-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCSMThreatsAgentPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent policy Expand All Field Type Description object Object for a single Agent policy object A Cloud Workload Security Agent policy returned by the API blockingRulesCount int32 The number of rules with the blocking feature in this policy datadogManaged boolean Whether the policy is managed by Datadog description string The description of the policy disabledRulesCount int32 The number of rules that are disabled in this policy enabled boolean Whether the Agent policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR monitoringRulesCount int32 The number of rules in the monitoring state in this policy name string The name of the policy pinned boolean Whether the policy is pinned policyType string The type of the policy policyVersion string The version of the policy priority int64 The priority of the policy ruleCount int32 The number of rules in this policy updateDate int64 Timestamp in milliseconds when the policy was last updated updatedAt int64 When the policy was last updated, timestamp in milliseconds object The attributes of the user who last updated the policy handle string The handle of the user name string The name of the user [object] The versions of the policy date string The date and time the version was created name string The version of the policy id string The ID of the Agent policy type enum The type of the resource, must always be `policy` Allowed enum values: `policy` default: `policy` ``` { "data": { "attributes": { "blockingRulesCount": 100, "datadogManaged": false, "description": "My agent policy", "disabledRulesCount": 100, "enabled": true, "hostTags": [], "hostTagsLists": [], "monitoringRulesCount": 100, "name": "my_agent_policy", "pinned": false, 
"policyType": "policy", "policyVersion": "1", "priority": 10, "ruleCount": 100, "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "versions": [ { "date": "string", "name": "1.47.0-rc2" } ] }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Create a Workload Protection policy returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "My agent policy", "enabled": true, "hostTagsLists": [ [ "env:test" ] ], "name": "my_agent_policy_2" }, "type": "policy" } } EOF ``` ##### Create a Workload Protection policy returns "OK" response ``` // Create a Workload Protection policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CloudWorkloadSecurityAgentPolicyCreateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentPolicyCreateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentPolicyCreateAttributes{ Description: datadog.PtrString("My agent policy"), Enabled: datadog.PtrBool(true), HostTagsLists: [][]string{ { "env:test", }, }, Name: "my_agent_policy_2", }, Type: datadogV2.CLOUDWORKLOADSECURITYAGENTPOLICYTYPE_POLICY, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.CreateCSMThreatsAgentPolicy(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.CreateCSMThreatsAgentPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.CreateCSMThreatsAgentPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Workload Protection policy returns "OK" response ``` // Create a Workload Protection policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyCreateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyCreateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyCreateRequest; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); CloudWorkloadSecurityAgentPolicyCreateRequest body = new CloudWorkloadSecurityAgentPolicyCreateRequest() .data( new CloudWorkloadSecurityAgentPolicyCreateData() .attributes( new CloudWorkloadSecurityAgentPolicyCreateAttributes() .description("My agent policy") .enabled(true) .hostTagsLists( Collections.singletonList(Collections.singletonList("env:test"))) .name("my_agent_policy_2")) .type(CloudWorkloadSecurityAgentPolicyType.POLICY)); try { CloudWorkloadSecurityAgentPolicyResponse result = apiInstance.createCSMThreatsAgentPolicy(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#createCSMThreatsAgentPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Workload Protection policy returns "OK" response ``` """ Create a Workload Protection policy returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_attributes import ( CloudWorkloadSecurityAgentPolicyCreateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_data import ( 
CloudWorkloadSecurityAgentPolicyCreateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_policy_create_request import ( CloudWorkloadSecurityAgentPolicyCreateRequest, ) from datadog_api_client.v2.model.cloud_workload_security_agent_policy_type import CloudWorkloadSecurityAgentPolicyType body = CloudWorkloadSecurityAgentPolicyCreateRequest( data=CloudWorkloadSecurityAgentPolicyCreateData( attributes=CloudWorkloadSecurityAgentPolicyCreateAttributes( description="My agent policy", enabled=True, host_tags_lists=[ [ "env:test", ], ], name="my_agent_policy_2", ), type=CloudWorkloadSecurityAgentPolicyType.POLICY, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.create_csm_threats_agent_policy(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Workload Protection policy returns "OK" response ``` # Create a Workload Protection policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyCreateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyCreateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyCreateAttributes.new({ description: "My agent policy", enabled: true, host_tags_lists: [ [ "env:test", ], ], name: "my_agent_policy_2", }), type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyType::POLICY, }), }) p api_instance.create_csm_threats_agent_policy(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Workload Protection policy returns "OK" response ``` // Create a Workload Protection policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyCreateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyCreateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyCreateRequest; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyType; #[tokio::main] async fn main() { let body = CloudWorkloadSecurityAgentPolicyCreateRequest::new( CloudWorkloadSecurityAgentPolicyCreateData::new( CloudWorkloadSecurityAgentPolicyCreateAttributes::new("my_agent_policy_2".to_string()) .description("My agent policy".to_string()) .enabled(true) .host_tags_lists(vec![vec!["env:test".to_string()]]), CloudWorkloadSecurityAgentPolicyType::POLICY, ), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.create_csm_threats_agent_policy(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Workload Protection policy returns "OK" response ``` /** * Create a Workload Protection policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); const params: v2.CSMThreatsApiCreateCSMThreatsAgentPolicyRequest = { body: { data: { attributes: { description: "My agent policy", enabled: true, hostTagsLists: [["env:test"]], name: "my_agent_policy_2", }, type: "policy", }, }, }; apiInstance .createCSMThreatsAgentPolicy(params) .then((data: v2.CloudWorkloadSecurityAgentPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-policy-v2) PATCH https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id} ### Overview Update a specific Workload Protection policy. Returns the policy object when the request is successful. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. 
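An update can also fail with a 409 (Concurrent Modification) if the policy changed between your read and your write. A common way to handle this is to re-read the policy and retry the request; the sketch below uses the Python client from the examples that follow, and the retry loop, its single-retry bound, and the `ApiException` handling are illustrative assumptions rather than part of the endpoint contract:

```
"""
Update a Workload Protection policy, retrying once on 409 (Concurrent Modification).
Sketch only: the retry strategy is an assumption, not a prescribed pattern.
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_type import CloudWorkloadSecurityAgentPolicyType
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_attributes import (
    CloudWorkloadSecurityAgentPolicyUpdateAttributes,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_data import (
    CloudWorkloadSecurityAgentPolicyUpdateData,
)
from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_request import (
    CloudWorkloadSecurityAgentPolicyUpdateRequest,
)

POLICY_ID = environ["POLICY_DATA_ID"]

# Minimal change: disable the policy. The id in the body matches the path parameter,
# as in the full update examples below.
body = CloudWorkloadSecurityAgentPolicyUpdateRequest(
    data=CloudWorkloadSecurityAgentPolicyUpdateData(
        attributes=CloudWorkloadSecurityAgentPolicyUpdateAttributes(enabled=False),
        id=POLICY_ID,
        type=CloudWorkloadSecurityAgentPolicyType.POLICY,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    for attempt in range(2):  # at most one retry
        try:
            print(api_instance.update_csm_threats_agent_policy(policy_id=POLICY_ID, body=body))
            break
        except ApiException as e:
            if e.status != 409 or attempt == 1:
                raise
            # 409: the policy changed concurrently. Re-read the fresh state here
            # (a real implementation would rebuild `body` from it) and retry once.
            api_instance.get_csm_threats_agent_policy(policy_id=POLICY_ID)
```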
### Arguments #### Path Parameters Name Type Description policy_id [_required_] string The ID of the Agent policy ### Request #### Body Data (required) New definition of the Agent policy * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent policy _required_] object Update an existing Cloud Workload Security Agent policy description string The description of the policy enabled boolean Whether the policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR name string The name of the policy id string The ID of the Agent policy type [_required_] enum The type of the resource, must always be `policy` Allowed enum values: `policy` default: `policy` ``` { "data": { "attributes": { "description": "Updated agent policy", "enabled": true, "hostTagsLists": [ [ "env:test" ] ], "name": "updated_agent_policy" }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-404-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCSMThreatsAgentPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent policy Expand All Field Type Description object Object for a single Agent policy object A Cloud Workload Security Agent policy returned by the API blockingRulesCount int32 The number of rules with the blocking feature in this policy datadogManaged boolean Whether the policy is managed by Datadog description string The description of the policy disabledRulesCount int32 The number of rules that are disabled in this policy enabled boolean Whether the Agent policy is enabled hostTags [string] The host tags defining where this policy is deployed hostTagsLists [array] The host tags defining where this policy is deployed, the inner values are linked with AND, the outer values are linked with OR monitoringRulesCount int32 The number of rules in the monitoring state in this policy name string The name of the policy pinned boolean Whether the policy is pinned policyType string The type of the policy policyVersion string The version of the policy priority int64 The priority of the policy ruleCount int32 The number of rules in this policy updateDate int64 Timestamp in milliseconds when the policy was last updated updatedAt int64 When the policy was last updated, timestamp in milliseconds object The attributes of the user who last updated the policy handle string The handle of the user name string The name of the user [object] The versions of the policy date string The date and time the version was created name string The version of the policy id string The ID of the Agent policy type enum The type of the resource, must always be `policy` Allowed enum values: `policy` 
default: `policy` ``` { "data": { "attributes": { "blockingRulesCount": 100, "datadogManaged": false, "description": "My agent policy", "disabledRulesCount": 100, "enabled": true, "hostTags": [], "hostTagsLists": [], "monitoringRulesCount": 100, "name": "my_agent_policy", "pinned": false, "policyType": "policy", "policyVersion": "1", "priority": 10, "ruleCount": 100, "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "versions": [ { "date": "string", "name": "1.47.0-rc2" } ] }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Update a Workload Protection policy returns "OK" response Copy ``` # Path parameters export policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/${policy_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Updated agent policy", "enabled": true, "hostTagsLists": [ [ "env:test" ] ], "name": "updated_agent_policy" }, "id": "6517fcc1-cec7-4394-a655-8d6e9d085255", "type": "policy" } } EOF ``` ##### Update a Workload Protection policy returns "OK" response ``` // Update a Workload Protection policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "policy_rc" in the system PolicyDataID := os.Getenv("POLICY_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentPolicyUpdateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentPolicyUpdateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentPolicyUpdateAttributes{ Description: datadog.PtrString("Updated agent policy"), Enabled: datadog.PtrBool(true), HostTagsLists: [][]string{ { "env:test", }, }, Name: datadog.PtrString("updated_agent_policy"), }, Id: datadog.PtrString(PolicyDataID), Type: datadogV2.CLOUDWORKLOADSECURITYAGENTPOLICYTYPE_POLICY, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.UpdateCSMThreatsAgentPolicy(ctx, PolicyDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.UpdateCSMThreatsAgentPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.UpdateCSMThreatsAgentPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a Workload Protection policy returns "OK" response ``` // Update a Workload Protection policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyType; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyUpdateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyUpdateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentPolicyUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); CloudWorkloadSecurityAgentPolicyUpdateRequest body = new CloudWorkloadSecurityAgentPolicyUpdateRequest() .data( new CloudWorkloadSecurityAgentPolicyUpdateData() .attributes( new CloudWorkloadSecurityAgentPolicyUpdateAttributes() .description("Updated agent policy") .enabled(true) .hostTagsLists( Collections.singletonList(Collections.singletonList("env:test"))) .name("updated_agent_policy")) .id(POLICY_DATA_ID) .type(CloudWorkloadSecurityAgentPolicyType.POLICY)); try { CloudWorkloadSecurityAgentPolicyResponse result = apiInstance.updateCSMThreatsAgentPolicy(POLICY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#updateCSMThreatsAgentPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a Workload Protection policy returns "OK" response ``` """ Update a Workload Protection policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_policy_type import CloudWorkloadSecurityAgentPolicyType from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_attributes import ( CloudWorkloadSecurityAgentPolicyUpdateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_data import ( CloudWorkloadSecurityAgentPolicyUpdateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_policy_update_request import ( CloudWorkloadSecurityAgentPolicyUpdateRequest, ) # there is a valid "policy_rc" in the system POLICY_DATA_ID = environ["POLICY_DATA_ID"] body = CloudWorkloadSecurityAgentPolicyUpdateRequest( data=CloudWorkloadSecurityAgentPolicyUpdateData( attributes=CloudWorkloadSecurityAgentPolicyUpdateAttributes( description="Updated agent policy", enabled=True, host_tags_lists=[ [ "env:test", ], ], name="updated_agent_policy", ), id=POLICY_DATA_ID, type=CloudWorkloadSecurityAgentPolicyType.POLICY, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = 
api_instance.update_csm_threats_agent_policy(policy_id=POLICY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a Workload Protection policy returns "OK" response ``` # Update a Workload Protection policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "policy_rc" in the system POLICY_DATA_ID = ENV["POLICY_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyUpdateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyUpdateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyUpdateAttributes.new({ description: "Updated agent policy", enabled: true, host_tags_lists: [ [ "env:test", ], ], name: "updated_agent_policy", }), id: POLICY_DATA_ID, type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentPolicyType::POLICY, }), }) p api_instance.update_csm_threats_agent_policy(POLICY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a Workload Protection policy returns "OK" response ``` // Update a Workload Protection policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyType; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyUpdateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyUpdateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentPolicyUpdateRequest; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentPolicyUpdateRequest::new( CloudWorkloadSecurityAgentPolicyUpdateData::new( CloudWorkloadSecurityAgentPolicyUpdateAttributes::new() .description("Updated agent policy".to_string()) .enabled(true) .host_tags_lists(vec![vec!["env:test".to_string()]]) .name("updated_agent_policy".to_string()), CloudWorkloadSecurityAgentPolicyType::POLICY, ) .id(policy_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .update_csm_threats_agent_policy(policy_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a Workload Protection policy returns "OK" response ``` /** * Update a 
Workload Protection policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiUpdateCSMThreatsAgentPolicyRequest = { body: { data: { attributes: { description: "Updated agent policy", enabled: true, hostTagsLists: [["env:test"]], name: "updated_agent_policy", }, id: POLICY_DATA_ID, type: "policy", }, }, policyId: POLICY_DATA_ID, }; apiInstance .updateCSMThreatsAgentPolicy(params) .then((data: v2.CloudWorkloadSecurityAgentPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-policy-v2) DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.eu/api/v2/remote_config/products/cws/policy/{policy_id}https://api.ddog-gov.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/{policy_id} ### Overview Delete a specific Workload Protection policy. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string The ID of the Agent policy ### Response * [202](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentPolicy-202-v2) * [204](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentPolicy-204-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCSMThreatsAgentPolicy-429-v2) OK OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
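In practice, a 404 from this endpoint usually means the policy is already gone, so many callers treat it as a successful (idempotent) delete. A minimal sketch, assuming the Python client used in the code examples below; the `NotFoundException` import is an assumption about the client's exceptions module and is not taken from the official example:

```
"""
Sketch: delete a policy, treating a 404 as "already deleted".
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import NotFoundException  # assumed exceptions module
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = environ["POLICY_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    try:
        api_instance.delete_csm_threats_agent_policy(policy_id=POLICY_DATA_ID)
    except NotFoundException:
        # The policy was already removed; nothing left to do.
        pass
```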
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/csm-threats/)
* [Example](https://docs.datadoghq.com/api/latest/csm-threats/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript)

##### Delete a Workload Protection policy

```
# Path parameters
export policy_id="6517fcc1-cec7-4394-a655-8d6e9d085255"
# Curl command (use your site's endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com,
# api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/remote_config/products/cws/policy/${policy_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a Workload Protection policy

```
"""
Delete a Workload Protection policy returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = environ["POLICY_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    api_instance.delete_csm_threats_agent_policy(
        policy_id=POLICY_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete a Workload Protection policy

```
# Delete a Workload Protection policy returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new

# there is a valid "policy_rc" in the system
POLICY_DATA_ID = ENV["POLICY_DATA_ID"]
api_instance.delete_csm_threats_agent_policy(POLICY_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands (set `DD_SITE` to your site as above):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete a Workload Protection policy

```
// Delete a Workload Protection policy returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "policy_rc" in the system
	PolicyDataID := os.Getenv("POLICY_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api :=
datadogV2.NewCSMThreatsApi(apiClient) r, err := api.DeleteCSMThreatsAgentPolicy(ctx, PolicyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.DeleteCSMThreatsAgentPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a Workload Protection policy ``` // Delete a Workload Protection policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "policy_rc" in the system String POLICY_DATA_ID = System.getenv("POLICY_DATA_ID"); try { apiInstance.deleteCSMThreatsAgentPolicy(POLICY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#deleteCSMThreatsAgentPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a Workload Protection policy ``` // Delete a Workload Protection policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { // there is a valid "policy_rc" in the system let policy_data_id = std::env::var("POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .delete_csm_threats_agent_policy(policy_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a Workload Protection policy ``` /** * Delete a Workload Protection policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "policy_rc" in the system const POLICY_DATA_ID = process.env.POLICY_DATA_ID as string; const params: v2.CSMThreatsApiDeleteCSMThreatsAgentPolicyRequest = { policyId: POLICY_DATA_ID, }; apiInstance .deleteCSMThreatsAgentPolicy(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Download the Workload Protection policy](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy-v2) GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/cws/policy/downloadhttps://api.ap2.datadoghq.com/api/v2/remote_config/products/cws/policy/downloadhttps://api.datadoghq.eu/api/v2/remote_config/products/cws/policy/downloadhttps://api.ddog-gov.com/api/v2/remote_config/products/cws/policy/downloadhttps://api.datadoghq.com/api/v2/remote_config/products/cws/policy/downloadhttps://api.us3.datadoghq.com/api/v2/remote_config/products/cws/policy/downloadhttps://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/download ### Overview The download endpoint generates a Workload Protection policy file from your currently active Workload Protection agent rules, and downloads them as a `.policy` file. This file can then be deployed to your agents to update the policy running in your environment. **Note** : This endpoint is not available for the Government (US1-FED) site. Please reference the (US1-FED) specific resource below. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCSMThreatsPolicy-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCSMThreatsPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCSMThreatsPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description No response body ``` {} ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
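Aside from these error payloads, the successful 200 body is the raw `.policy` file rather than JSON, so a common pattern is to write it straight to disk before shipping it to your Agents. A minimal sketch, assuming the same Python client used in the code examples below; `response.read()` mirrors the official Python example, and the `default.policy` output path is illustrative:

```
"""
Sketch: save the downloaded Workload Protection policy to disk.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    # The endpoint returns a file-like object; the official example below
    # calls response.read() on it.
    response = api_instance.download_csm_threats_policy()
    # "default.policy" is an illustrative path; deploy the file wherever your
    # Agents load Workload Protection policies from.
    with open("default.policy", "wb") as policy_file:
        policy_file.write(response.read())
```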
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Download the Workload Protection policy Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/cws/policy/download" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Download the Workload Protection policy ``` """ Download the Workload Protection policy returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.download_csm_threats_policy() print(response.read()) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Download the Workload Protection policy ``` # Download the Workload Protection policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new p api_instance.download_csm_threats_policy() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Download the Workload Protection policy ``` // Download the Workload Protection policy returns "OK" response package main import ( "context" "fmt" "io/ioutil" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.DownloadCSMThreatsPolicy(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.DownloadCSMThreatsPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := ioutil.ReadAll(resp) fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.DownloadCSMThreatsPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Download the Workload Protection policy ``` // Download the Workload Protection policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import java.io.File; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); try { File result = apiInstance.downloadCSMThreatsPolicy(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#downloadCSMThreatsPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Download the Workload Protection policy ``` // Download the Workload Protection policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.download_csm_threats_policy().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Download the Workload Protection policy ``` /** * Download the Workload Protection policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); apiInstance .downloadCSMThreatsPolicy() .then((data: client.HttpFile) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all Workload Protection agent rules (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-all-workload-protection-agent-rules-us1-fed-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.datadoghq.eu/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.ddog-gov.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.us3.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules ### Overview Get the list of agent rules. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#ListCloudWorkloadSecurityAgentRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#ListCloudWorkloadSecurityAgentRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#ListCloudWorkloadSecurityAgentRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes a list of Agent rule Expand All Field Type Description [object] A list of Agent rules objects object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": [ { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
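Beyond these error payloads, a common task once the call succeeds is to scan the returned rules for their names and SECL expressions. A minimal sketch with the Python client from the code examples below; the `.data` and `.attributes` access follows the response model documented above and is illustrative rather than an official recipe:

```
"""
Sketch: print the name and SECL expression of each agent rule.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    response = api_instance.list_cloud_workload_security_agent_rules()
    # .data and .attributes mirror the response model documented above.
    for rule in response.data:
        print(f"{rule.id}: {rule.attributes.name} -> {rule.attributes.expression}")
```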
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Get all Workload Protection agent rules (US1-FED) Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Workload Protection agent rules (US1-FED) ``` """ Get all Workload Protection agent rules (US1-FED) returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.list_cloud_workload_security_agent_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Workload Protection agent rules (US1-FED) ``` # Get all Workload Protection agent rules (US1-FED) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new p api_instance.list_cloud_workload_security_agent_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Workload Protection agent rules (US1-FED) ``` // Get all Workload Protection agent rules (US1-FED) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.ListCloudWorkloadSecurityAgentRules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.ListCloudWorkloadSecurityAgentRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.ListCloudWorkloadSecurityAgentRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Workload Protection agent rules (US1-FED) ``` // Get all Workload Protection agent rules (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRulesListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); try { CloudWorkloadSecurityAgentRulesListResponse result = apiInstance.listCloudWorkloadSecurityAgentRules(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmThreatsApi#listCloudWorkloadSecurityAgentRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Workload Protection agent rules (US1-FED) ``` // Get all Workload Protection agent rules (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.list_cloud_workload_security_agent_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Workload Protection agent rules (US1-FED) ``` /** * Get all Workload Protection agent rules (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); apiInstance .listCloudWorkloadSecurityAgentRules() .then((data: v2.CloudWorkloadSecurityAgentRulesListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#get-a-workload-protection-agent-rule-us1-fed-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id} ### Overview Get the details of a specific agent rule. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_read` permission. ### Arguments #### Path Parameters Name Type Description agent_rule_id [_required_] string The ID of the Agent rule ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#GetCloudWorkloadSecurityAgentRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#GetCloudWorkloadSecurityAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#GetCloudWorkloadSecurityAgentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#GetCloudWorkloadSecurityAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. 
size int64 The size of the set action. ttl int64 The time to live of the set action. The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
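When the rule is found, the attributes documented above tell you whether it is enabled and which blocking, monitoring, or disabled policy lists it belongs to. A minimal sketch with the Python client from the code examples below; the attribute access is illustrative and assumes these optional fields are present in the response:

```
"""
Sketch: inspect whether an agent rule is enabled and which policy lists it is on.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# there is a valid "agent_rule" in the system
AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    response = api_instance.get_cloud_workload_security_agent_rule(
        agent_rule_id=AGENT_RULE_DATA_ID,
    )
    attrs = response.data.attributes
    # enabled, blocking, monitoring, and disabled mirror the fields documented above.
    print(f"enabled={attrs.enabled}")
    print(f"blocking={attrs.blocking} monitoring={attrs.monitoring} disabled={attrs.disabled}")
```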
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Get a Workload Protection agent rule (US1-FED) Copy ``` # Path parameters export agent_rule_id="3b5-v82-ns6" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/${agent_rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a Workload Protection agent rule (US1-FED) ``` """ Get a Workload Protection agent rule (US1-FED) returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.get_cloud_workload_security_agent_rule( agent_rule_id=AGENT_RULE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a Workload Protection agent rule (US1-FED) ``` # Get a Workload Protection agent rule (US1-FED) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"] p api_instance.get_cloud_workload_security_agent_rule(AGENT_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a Workload Protection agent rule (US1-FED) ``` // Get a Workload Protection agent rule (US1-FED) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule" in the system AgentRuleDataID := 
os.Getenv("AGENT_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.GetCloudWorkloadSecurityAgentRule(ctx, AgentRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.GetCloudWorkloadSecurityAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.GetCloudWorkloadSecurityAgentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a Workload Protection agent rule (US1-FED) ``` // Get a Workload Protection agent rule (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.getCloudWorkloadSecurityAgentRule(AGENT_RULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#getCloudWorkloadSecurityAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Workload Protection agent rule (US1-FED) ``` // Get a Workload Protection agent rule (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { // there is a valid "agent_rule" in the system let agent_rule_data_id = std::env::var("AGENT_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .get_cloud_workload_security_agent_rule(agent_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get 
a Workload Protection agent rule (US1-FED) ``` /** * Get a Workload Protection agent rule (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; const params: v2.CSMThreatsApiGetCloudWorkloadSecurityAgentRuleRequest = { agentRuleId: AGENT_RULE_DATA_ID, }; apiInstance .getCloudWorkloadSecurityAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#create-a-workload-protection-agent-rule-us1-fed-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.datadoghq.eu/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.ddog-gov.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.us3.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_ruleshttps://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules ### Overview Create a new agent rule with the given parameters. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_write` permission. ### Request #### Body Data (required) The definition of the new agent rule * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent rule _required_] object Create a new Cloud Workload Security Agent rule. [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. 
field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. The value of the set action Option 1 string Option 2 integer Option 3 boolean agent_version string Constrain the rule to specific versions of the Datadog Agent. blocking [string] The blocking policies that the rule belongs to. description string The description of the Agent rule. disabled [string] The disabled policies that the rule belongs to. enabled boolean Whether the Agent rule is enabled. expression [_required_] string The SECL expression of the Agent rule. filters [string] The platforms the Agent rule is supported on. monitoring [string] The monitoring policies that the rule belongs to. name [_required_] string The name of the Agent rule. policy_id string The ID of the policy where the Agent rule is saved. product_tags [string] The list of product tags associated with the rule. silent boolean Whether the rule is silent. type [_required_] enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "description": "My Agent rule", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat" }, "type": "agent_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCloudWorkloadSecurityAgentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCloudWorkloadSecurityAgentRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCloudWorkloadSecurityAgentRule-403-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCloudWorkloadSecurityAgentRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#CreateCloudWorkloadSecurityAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "My Agent rule", "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "name": "examplecsmthreat" }, "type": "agent_rule" } } EOF ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` // Create a Workload Protection agent rule (US1-FED) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CloudWorkloadSecurityAgentRuleCreateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleCreateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleCreateAttributes{ Description: datadog.PtrString("My Agent rule"), Enabled: datadog.PtrBool(true), Expression: `exec.file.name == "sh"`, Filters: []string{}, Name: "examplecsmthreat", }, Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.CreateCloudWorkloadSecurityAgentRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.CreateCloudWorkloadSecurityAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.CreateCloudWorkloadSecurityAgentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` // 
Create a Workload Protection agent rule (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleCreateRequest; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); CloudWorkloadSecurityAgentRuleCreateRequest body = new CloudWorkloadSecurityAgentRuleCreateRequest() .data( new CloudWorkloadSecurityAgentRuleCreateData() .attributes( new CloudWorkloadSecurityAgentRuleCreateAttributes() .description("My Agent rule") .enabled(true) .expression(""" exec.file.name == "sh" """) .name("examplecsmthreat")) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.createCloudWorkloadSecurityAgentRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmThreatsApi#createCloudWorkloadSecurityAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` """ Create a Workload Protection agent rule (US1-FED) returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_attributes import ( CloudWorkloadSecurityAgentRuleCreateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_data import ( CloudWorkloadSecurityAgentRuleCreateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_create_request import ( CloudWorkloadSecurityAgentRuleCreateRequest, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType body = CloudWorkloadSecurityAgentRuleCreateRequest( data=CloudWorkloadSecurityAgentRuleCreateData( attributes=CloudWorkloadSecurityAgentRuleCreateAttributes( description="My Agent rule", enabled=True, expression='exec.file.name == "sh"', filters=[], name="examplecsmthreat", ), type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.create_cloud_workload_security_agent_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to 
`example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` # Create a Workload Protection agent rule (US1-FED) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleCreateAttributes.new({ description: "My Agent rule", enabled: true, expression: 'exec.file.name == "sh"', filters: [], name: "examplecsmthreat", }), type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) p api_instance.create_cloud_workload_security_agent_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` // Create a Workload Protection agent rule (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleCreateRequest; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; #[tokio::main] async fn main() { let body = CloudWorkloadSecurityAgentRuleCreateRequest::new( CloudWorkloadSecurityAgentRuleCreateData::new( CloudWorkloadSecurityAgentRuleCreateAttributes::new( r#"exec.file.name == "sh""#.to_string(), "examplecsmthreat".to_string(), ) .description("My Agent rule".to_string()) .enabled(true) .filters(vec![]), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.create_cloud_workload_security_agent_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Workload Protection agent rule (US1-FED) returns "OK" response ``` /** * Create a Workload Protection agent rule (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); const params: v2.CSMThreatsApiCreateCloudWorkloadSecurityAgentRuleRequest = { body: { data: { attributes: { description: "My Agent rule", enabled: true, expression: `exec.file.name == "sh"`, filters: [], name: "examplecsmthreat", }, type: "agent_rule", 
}, }, }; apiInstance .createCloudWorkloadSecurityAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#update-a-workload-protection-agent-rule-us1-fed-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id} ### Overview Update a specific agent rule. Returns the agent rule object when the request is successful. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_write` permission. ### Arguments #### Path Parameters Name Type Description agent_rule_id [_required_] string The ID of the Agent rule ### Request #### Body Data (required) New definition of the agent rule * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description _required_] object Object for a single Agent rule _required_] object Update an existing Cloud Workload Security Agent rule [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agent_version string Constrain the rule to specific versions of the Datadog Agent blocking [string] The blocking policies that the rule belongs to description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule monitoring [string] The monitoring policies that the rule belongs to policy_id string The ID of the policy where the Agent rule is saved product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. id string The ID of the Agent rule type [_required_] enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "description": "Updated Agent rule", "expression": "exec.file.name == \"sh\"" }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-404-v2) * [409](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#UpdateCloudWorkloadSecurityAgentRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Response object that includes an Agent rule Expand All Field Type Description object Object for a single Agent rule object A Cloud Workload Security Agent rule returned by the API [object] The array of actions the rule can perform if triggered filter string SECL expression used to target the container to apply the action on object Hash file specified by the field attribute field string The field of the hash action object Kill system call applied on the container matching the rule signal string Supported signals for the kill system call object The metadata action applied on the scope matching the rule image_tag string The image tag of the metadata action service string The service of the metadata action short_image string The short image of the metadata action object The set action applied on the scope matching the rule append boolean Whether the value should be appended to the field. default_value string The default value of the set action expression string The expression of the set action. field string The field of the set action inherited boolean Whether the value should be inherited. name string The name of the set action scope string The scope of the set action. size int64 The size of the set action. ttl int64 The time to live of the set action. 
The value of the set action Option 1 string Option 2 integer Option 3 boolean agentConstraint string The version of the Agent blocking [string] The blocking policies that the rule belongs to category string The category of the Agent rule creationAuthorUuId string The ID of the user who created the rule creationDate int64 When the Agent rule was created, timestamp in milliseconds object The attributes of the user who created the Agent rule handle string The handle of the user name string The name of the user defaultRule boolean Whether the rule is included by default description string The description of the Agent rule disabled [string] The disabled policies that the rule belongs to enabled boolean Whether the Agent rule is enabled expression string The SECL expression of the Agent rule filters [string] The platforms the Agent rule is supported on monitoring [string] The monitoring policies that the rule belongs to name string The name of the Agent rule product_tags [string] The list of product tags associated with the rule silent boolean Whether the rule is silent. updateAuthorUuId string The ID of the user who updated the rule updateDate int64 Timestamp in milliseconds when the Agent rule was last updated updatedAt int64 When the Agent rule was last updated, timestamp in milliseconds object The attributes of the user who last updated the Agent rule handle string The handle of the user name string The name of the user version int64 The version of the Agent rule id string The ID of the Agent rule type enum The type of the resource, must always be `agent_rule` Allowed enum values: `agent_rule` default: `agent_rule` ``` { "data": { "attributes": { "actions": [ { "filter": "string", "hash": { "field": "string" }, "kill": { "signal": "string" }, "metadata": { "image_tag": "string", "service": "string", "short_image": "string" }, "set": { "append": false, "default_value": "string", "expression": "string", "field": "string", "inherited": false, "name": "string", "scope": "string", "size": "integer", "ttl": "integer", "value": { "type": "undefined" } } } ], "agentConstraint": "string", "blocking": [], "category": "Process Activity", "creationAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "creationDate": 1624366480320, "creator": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "defaultRule": false, "description": "My Agent rule", "disabled": [], "enabled": true, "expression": "exec.file.name == \"sh\"", "filters": [], "monitoring": [], "name": "my_agent_rule", "product_tags": [], "silent": false, "updateAuthorUuId": "e51c9744-d158-11ec-ad23-da7ad0900002", "updateDate": 1624366480320, "updatedAt": 1624366480320, "updater": { "handle": "datadog.user@example.com", "name": "Datadog User" }, "version": 23 }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response Copy ``` # Path parameters export agent_rule_id="3b5-v82-ns6" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/${agent_rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Updated Agent rule", "expression": "exec.file.name == \"sh\"" }, "id": "3dd-0uc-h1s", "type": "agent_rule" } } EOF ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` // Update a Workload Protection agent rule (US1-FED) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule" in the system AgentRuleDataID := os.Getenv("AGENT_RULE_DATA_ID") body := datadogV2.CloudWorkloadSecurityAgentRuleUpdateRequest{ Data: datadogV2.CloudWorkloadSecurityAgentRuleUpdateData{ Attributes: datadogV2.CloudWorkloadSecurityAgentRuleUpdateAttributes{ Description: datadog.PtrString("Updated Agent rule"), Expression: datadog.PtrString(`exec.file.name == "sh"`), }, Id: datadog.PtrString(AgentRuleDataID), Type: datadogV2.CLOUDWORKLOADSECURITYAGENTRULETYPE_AGENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.UpdateCloudWorkloadSecurityAgentRule(ctx, AgentRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.UpdateCloudWorkloadSecurityAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.UpdateCloudWorkloadSecurityAgentRule`:\n%s\n", responseContent) } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` // Update a Workload Protection agent rule (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleResponse; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleType; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateAttributes; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateData; import com.datadog.api.client.v2.model.CloudWorkloadSecurityAgentRuleUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); CloudWorkloadSecurityAgentRuleUpdateRequest body = new CloudWorkloadSecurityAgentRuleUpdateRequest() .data( new CloudWorkloadSecurityAgentRuleUpdateData() .attributes( new CloudWorkloadSecurityAgentRuleUpdateAttributes() .description("Updated Agent rule") .expression(""" exec.file.name == "sh" """)) .id(AGENT_RULE_DATA_ID) .type(CloudWorkloadSecurityAgentRuleType.AGENT_RULE)); try { CloudWorkloadSecurityAgentRuleResponse result = apiInstance.updateCloudWorkloadSecurityAgentRule(AGENT_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling CsmThreatsApi#updateCloudWorkloadSecurityAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` """ Update a Workload Protection agent rule (US1-FED) returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi from datadog_api_client.v2.model.cloud_workload_security_agent_rule_type import CloudWorkloadSecurityAgentRuleType from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_attributes import ( CloudWorkloadSecurityAgentRuleUpdateAttributes, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_data import ( CloudWorkloadSecurityAgentRuleUpdateData, ) from datadog_api_client.v2.model.cloud_workload_security_agent_rule_update_request import ( CloudWorkloadSecurityAgentRuleUpdateRequest, ) # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"] body = 
CloudWorkloadSecurityAgentRuleUpdateRequest( data=CloudWorkloadSecurityAgentRuleUpdateData( attributes=CloudWorkloadSecurityAgentRuleUpdateAttributes( description="Updated Agent rule", expression='exec.file.name == "sh"', ), id=AGENT_RULE_DATA_ID, type=CloudWorkloadSecurityAgentRuleType.AGENT_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.update_cloud_workload_security_agent_rule(agent_rule_id=AGENT_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` # Update a Workload Protection agent rule (US1-FED) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"] body = DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateRequest.new({ data: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateData.new({ attributes: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleUpdateAttributes.new({ description: "Updated Agent rule", expression: 'exec.file.name == "sh"', }), id: AGENT_RULE_DATA_ID, type: DatadogAPIClient::V2::CloudWorkloadSecurityAgentRuleType::AGENT_RULE, }), }) p api_instance.update_cloud_workload_security_agent_rule(AGENT_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` // Update a Workload Protection agent rule (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleType; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateAttributes; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateData; use datadog_api_client::datadogV2::model::CloudWorkloadSecurityAgentRuleUpdateRequest; #[tokio::main] async fn main() { // there is a valid "agent_rule" in the system let agent_rule_data_id = std::env::var("AGENT_RULE_DATA_ID").unwrap(); let body = CloudWorkloadSecurityAgentRuleUpdateRequest::new( CloudWorkloadSecurityAgentRuleUpdateData::new( CloudWorkloadSecurityAgentRuleUpdateAttributes::new() .description("Updated Agent rule".to_string()) .expression(r#"exec.file.name == "sh""#.to_string()), CloudWorkloadSecurityAgentRuleType::AGENT_RULE, ) .id(agent_rule_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .update_cloud_workload_security_agent_rule(agent_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a Workload Protection agent rule (US1-FED) returns "OK" response ``` /** * Update a Workload Protection agent rule (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; const params: v2.CSMThreatsApiUpdateCloudWorkloadSecurityAgentRuleRequest = { body: { data: { attributes: { description: "Updated Agent rule", expression: `exec.file.name == "sh"`, }, id: AGENT_RULE_DATA_ID, type: "agent_rule", }, }, agentRuleId: AGENT_RULE_DATA_ID, }; apiInstance .updateCloudWorkloadSecurityAgentRule(params) .then((data: v2.CloudWorkloadSecurityAgentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a Workload Protection agent rule (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#delete-a-workload-protection-agent-rule-us1-fed-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/{agent_rule_id} ### Overview Delete a specific agent rule. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_write` permission. 
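A successful delete returns a 204 with no response body, while deleting a rule that no longer exists returns a 404, so cleanup scripts often treat "not found" as success to keep the operation idempotent. The Python sketch below illustrates that pattern; it reuses the `AGENT_RULE_DATA_ID` environment variable from the surrounding examples and assumes your client version exposes `NotFoundException` (if it only raises a generic `ApiException`, catch that instead and check for a 404 status).

```python
"""
Idempotent cleanup sketch: delete an agent rule, treating 404 as already deleted.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import NotFoundException
from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi

# Assumes AGENT_RULE_DATA_ID identifies an agent rule that may or may not still exist
AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = CSMThreatsApi(api_client)
    try:
        # A successful delete returns HTTP 204 with no body
        api_instance.delete_cloud_workload_security_agent_rule(
            agent_rule_id=AGENT_RULE_DATA_ID,
        )
        print("Agent rule deleted")
    except NotFoundException:
        # HTTP 404: the rule is already gone, which is fine for cleanup purposes
        print("Agent rule not found; nothing to delete")
```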
### Arguments #### Path Parameters Name Type Description agent_rule_id [_required_] string The ID of the Agent rule ### Response * [204](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCloudWorkloadSecurityAgentRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCloudWorkloadSecurityAgentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCloudWorkloadSecurityAgentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#DeleteCloudWorkloadSecurityAgentRule-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Delete a Workload Protection agent rule (US1-FED) Copy ``` # Path parameters export agent_rule_id="3b5-v82-ns6" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/cloud_workload_security/agent_rules/${agent_rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` """ Delete a Workload Protection agent rule (US1-FED) returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = environ["AGENT_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) api_instance.delete_cloud_workload_security_agent_rule( agent_rule_id=AGENT_RULE_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` # Delete a Workload Protection agent rule (US1-FED) 
returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new # there is a valid "agent_rule" in the system AGENT_RULE_DATA_ID = ENV["AGENT_RULE_DATA_ID"] api_instance.delete_cloud_workload_security_agent_rule(AGENT_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` // Delete a Workload Protection agent rule (US1-FED) returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "agent_rule" in the system AgentRuleDataID := os.Getenv("AGENT_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) r, err := api.DeleteCloudWorkloadSecurityAgentRule(ctx, AgentRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.DeleteCloudWorkloadSecurityAgentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` // Delete a Workload Protection agent rule (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); // there is a valid "agent_rule" in the system String AGENT_RULE_DATA_ID = System.getenv("AGENT_RULE_DATA_ID"); try { apiInstance.deleteCloudWorkloadSecurityAgentRule(AGENT_RULE_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling CsmThreatsApi#deleteCloudWorkloadSecurityAgentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` // Delete a Workload Protection agent rule (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { // there is a valid "agent_rule" in the system let agent_rule_data_id = 
std::env::var("AGENT_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api .delete_cloud_workload_security_agent_rule(agent_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a Workload Protection agent rule (US1-FED) ``` /** * Delete a Workload Protection agent rule (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); // there is a valid "agent_rule" in the system const AGENT_RULE_DATA_ID = process.env.AGENT_RULE_DATA_ID as string; const params: v2.CSMThreatsApiDeleteCloudWorkloadSecurityAgentRuleRequest = { agentRuleId: AGENT_RULE_DATA_ID, }; apiInstance .deleteCloudWorkloadSecurityAgentRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Download the Workload Protection policy (US1-FED)](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy-us1-fed) * [v2 (latest)](https://docs.datadoghq.com/api/latest/csm-threats/#download-the-workload-protection-policy-us1-fed-v2) GET https://api.ap1.datadoghq.com/api/v2/security/cloud_workload/policy/downloadhttps://api.ap2.datadoghq.com/api/v2/security/cloud_workload/policy/downloadhttps://api.datadoghq.eu/api/v2/security/cloud_workload/policy/downloadhttps://api.ddog-gov.com/api/v2/security/cloud_workload/policy/downloadhttps://api.datadoghq.com/api/v2/security/cloud_workload/policy/downloadhttps://api.us3.datadoghq.com/api/v2/security/cloud_workload/policy/downloadhttps://api.us5.datadoghq.com/api/v2/security/cloud_workload/policy/download ### Overview The download endpoint generates a Workload Protection policy file from your currently active Workload Protection agent rules, and downloads them as a `.policy` file. This file can then be deployed to your agents to update the policy running in your environment. **Note** : This endpoint should only be used for the Government (US1-FED) site. This endpoint requires the `security_monitoring_cws_agent_rules_read` permission. 
### Response * [200](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCloudWorkloadPolicyFile-200-v2) * [403](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCloudWorkloadPolicyFile-403-v2) * [429](https://docs.datadoghq.com/api/latest/csm-threats/#DownloadCloudWorkloadPolicyFile-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) Expand All Field Type Description No response body ``` {} ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/csm-threats/) * [Example](https://docs.datadoghq.com/api/latest/csm-threats/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/csm-threats/?code-lang=typescript) ##### Download the Workload Protection policy (US1-FED) Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/cloud_workload/policy/download" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Download the Workload Protection policy (US1-FED) ``` """ Download the Workload Protection policy (US1-FED) returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.csm_threats_api import CSMThreatsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = CSMThreatsApi(api_client) response = api_instance.download_cloud_workload_policy_file() print(response.read()) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Download the Workload Protection policy (US1-FED) ``` # Download the Workload Protection policy (US1-FED) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::CSMThreatsAPI.new p api_instance.download_cloud_workload_policy_file() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" 
rb "example.rb" ``` ##### Download the Workload Protection policy (US1-FED) ``` // Download the Workload Protection policy (US1-FED) returns "OK" response package main import ( "context" "fmt" "io/ioutil" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewCSMThreatsApi(apiClient) resp, r, err := api.DownloadCloudWorkloadPolicyFile(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `CSMThreatsApi.DownloadCloudWorkloadPolicyFile`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := ioutil.ReadAll(resp) fmt.Fprintf(os.Stdout, "Response from `CSMThreatsApi.DownloadCloudWorkloadPolicyFile`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Download the Workload Protection policy (US1-FED) ``` // Download the Workload Protection policy (US1-FED) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.CsmThreatsApi; import java.io.File; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); CsmThreatsApi apiInstance = new CsmThreatsApi(defaultClient); try { File result = apiInstance.downloadCloudWorkloadPolicyFile(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling CsmThreatsApi#downloadCloudWorkloadPolicyFile"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Download the Workload Protection policy (US1-FED) ``` // Download the Workload Protection policy (US1-FED) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_csm_threats::CSMThreatsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = CSMThreatsAPI::with_config(configuration); let resp = api.download_cloud_workload_policy_file().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Download the Workload Protection policy (US1-FED) ``` /** * Download the Workload 
Protection policy (US1-FED) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.CSMThreatsApi(configuration); apiInstance .downloadCloudWorkloadPolicyFile() .then((data: client.HttpFile) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
--- # Source: https://docs.datadoghq.com/api/latest/dashboard-lists # Dashboard Lists Interact with your dashboard lists through the API to organize, find, and share all of your dashboards with your team and organization. ## [Get all dashboard lists](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-all-dashboard-lists) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-all-dashboard-lists-v1) GET
https://api.ap1.datadoghq.com/api/v1/dashboard/lists/manual
https://api.ap2.datadoghq.com/api/v1/dashboard/lists/manual
https://api.datadoghq.eu/api/v1/dashboard/lists/manual
https://api.ddog-gov.com/api/v1/dashboard/lists/manual
https://api.datadoghq.com/api/v1/dashboard/lists/manual
https://api.us3.datadoghq.com/api/v1/dashboard/lists/manual
https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual
### Overview Fetch all of your existing dashboard list definitions. This endpoint requires the `dashboards_read` permission.
OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#ListDashboardLists-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#ListDashboardLists-403-v1) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#ListDashboardLists-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Information on your dashboard lists.

| Field | Type | Description |
| --- | --- | --- |
| dashboard_lists | [object] | List of all your dashboard lists. |
| dashboard_lists[].author | object | Object describing the creator of the shared element. |
| dashboard_lists[].author.email | string | Email of the creator. |
| dashboard_lists[].author.handle | string | Handle of the creator. |
| dashboard_lists[].author.name | string | Name of the creator. |
| dashboard_lists[].created | date-time | Date of creation of the dashboard list. |
| dashboard_lists[].dashboard_count | int64 | The number of dashboards in the list. |
| dashboard_lists[].id | int64 | The ID of the dashboard list. |
| dashboard_lists[].is_favorite | boolean | Whether or not the list is in the favorites. |
| dashboard_lists[].modified | date-time | Date of last edition of the dashboard list. |
| dashboard_lists[].name [_required_] | string | The name of the dashboard list. |
| dashboard_lists[].type | string | The type of dashboard list. |

``` { "dashboard_lists": [ { "author": { "email": "string", "handle": "string", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_count": "integer", "id": "integer", "is_favorite": false, "modified": "2019-09-19T10:00:00.000Z", "name": "My Dashboard", "type": "manual_dashboard_list" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy
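To make the response model above concrete, here is a minimal Python sketch (separate from the official per-language examples in the Code Example section below) that fetches all dashboard lists and prints a few of the documented fields. It assumes, as the official Python example does, that the returned `DashboardListListResponse` exposes its fields as attributes.

```
"""
Minimal sketch: fetch all dashboard lists and print selected fields from the
200 response model documented above (name, id, dashboard_count).
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DashboardListsApi(api_client)
    response = api_instance.list_dashboard_lists()
    # dashboard_lists is the top-level array in the 200 response.
    for dashboard_list in response.dashboard_lists:
        print(f"{dashboard_list.name} (id={dashboard_list.id}, dashboards={dashboard_list.dashboard_count})")
```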
### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Get all dashboard lists Copy ```
# Curl command (use the API host for your Datadog site, for example api.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v1/dashboard/lists/manual" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
``` ##### Get all dashboard lists ``` """ Get all dashboard lists returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.list_dashboard_lists() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="<YOUR_DD_SITE>" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all dashboard lists ``` # Get all dashboard lists returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardListsAPI.new p api_instance.list_dashboard_lists() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="<YOUR_DD_SITE>" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all dashboard lists ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) result = dog.get_all_dashboard_lists() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="<YOUR_DD_SITE>" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all dashboard lists ``` // Get all dashboard lists returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardListsApi(apiClient) resp, r, err := api.ListDashboardLists(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.ListDashboardLists`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.ListDashboardLists`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all dashboard lists ``` // Get all dashboard lists returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardListsApi; import com.datadog.api.client.v1.model.DashboardListListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); try { DashboardListListResponse result = apiInstance.listDashboardLists(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#listDashboardLists"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all dashboard lists ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.DashboardList.get_all() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get all dashboard lists ``` // Get all dashboard lists returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboard_lists::DashboardListsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api.list_dashboard_lists().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all dashboard lists ``` /** * Get all dashboard lists returns 
"OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardListsApi(configuration); apiInstance .listDashboardLists() .then((data: v1.DashboardListListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get items of a Dashboard List](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-items-of-a-dashboard-list) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-items-of-a-dashboard-list-v2) GET https://api.ap1.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ap2.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.eu/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ddog-gov.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us3.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboards ### Overview Fetch the dashboard list’s dashboard definitions. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dashboard_list_id [_required_] integer ID of the dashboard list to get items from. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardListItems-200-v2) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardListItems-403-v2) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardListItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardListItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Dashboards within a list. Field Type Description dashboards [_required_] [object] List of dashboards in the dashboard list. author object Creator of the object. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. created date-time Date of creation of the dashboard. icon string URL to the icon of the dashboard. id [_required_] string ID of the dashboard. integration_id string The short name of the integration. is_favorite boolean Whether or not the dashboard is in the favorites. is_read_only boolean Whether or not the dashboard is read only. is_shared boolean Whether the dashboard is publicly shared or not. modified date-time Date of last edition of the dashboard. popularity int32 Popularity of the dashboard. tags [string] List of team names representing ownership of a dashboard. title string Title of the dashboard. type [_required_] enum The type of the dashboard. 
Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` url string URL path to the dashboard. total int64 Number of dashboards in the dashboard list. ``` { "dashboards": [ { "author": { "email": "string", "handle": "string", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "icon": "string", "id": "q5j-nti-fv6", "integration_id": "string", "is_favorite": false, "is_read_only": false, "is_shared": false, "modified": "2019-09-19T10:00:00.000Z", "popularity": "integer", "tags": [], "title": "string", "type": "host_timeboard", "url": "string" } ], "total": "integer" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Get items of a Dashboard List Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get items of a Dashboard List ``` """ Get items of a Dashboard List returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.get_dashboard_list_items( dashboard_list_id=int(DASHBOARD_LIST_ID), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the 
example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get items of a Dashboard List ``` # Get items of a Dashboard List returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] p api_instance.get_dashboard_list_items(DASHBOARD_LIST_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get items of a Dashboard List ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) result = dog.v2.get_items_of_dashboard_list(4741) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get items of a Dashboard List ``` // Get items of a Dashboard List returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.GetDashboardListItems(ctx, DashboardListID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.GetDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.GetDashboardListItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get items of a Dashboard List ``` // Get items of a Dashboard List returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import com.datadog.api.client.v2.model.DashboardListItems; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); try { DashboardListItems result = 
apiInstance.getDashboardListItems(DASHBOARD_LIST_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#getDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get items of a Dashboard List ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.DashboardList.v2.get_items(4741) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get items of a Dashboard List ``` // Get items of a Dashboard List returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .get_dashboard_list_items(dashboard_list_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get items of a Dashboard List ``` /** * Get items of a Dashboard List returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); const params: v2.DashboardListsApiGetDashboardListItemsRequest = { dashboardListId: DASHBOARD_LIST_ID, }; apiInstance .getDashboardListItems(params) .then((data: v2.DashboardListItems) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Items to a Dashboard List](https://docs.datadoghq.com/api/latest/dashboard-lists/#add-items-to-a-dashboard-list) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#add-items-to-a-dashboard-list-v2) POST https://api.ap1.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ap2.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.eu/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ddog-gov.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us3.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboards ### Overview Add dashboards to an existing dashboard list. ### Arguments #### Path Parameters Name Type Description dashboard_list_id [_required_] integer ID of the dashboard list to add items to. ### Request #### Body Data (required) Dashboards to add to the dashboard list. * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Field Type Description dashboards [object] List of dashboards to add the dashboard list. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response ``` { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` { "dashboards": [ { "id": "123-abc-456", "type": "custom_timeboard" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardListItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardListItems-400-v2) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardListItems-403-v2) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardListItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardListItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Response containing a list of added dashboards. Field Type Description added_dashboards_to_list [object] List of dashboards added to the dashboard list. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. 
Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ``` { "added_dashboards_to_list": [ { "id": "q5j-nti-fv6", "type": "host_timeboard" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } EOF ``` ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboards": [ { "id": "123-abc-456", "type": "custom_timeboard" } ] } EOF ``` ##### Add 
custom screenboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom screenboard dashboard to an existing dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) // there is a valid "screenboard_dashboard" in the system ScreenboardDashboardID := os.Getenv("SCREENBOARD_DASHBOARD_ID") body := datadogV2.DashboardListAddItemsRequest{ Dashboards: []datadogV2.DashboardListItemRequest{ { Id: ScreenboardDashboardID, Type: datadogV2.DASHBOARDTYPE_CUSTOM_SCREENBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.CreateDashboardListItems(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.CreateDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.CreateDashboardListItems`:\n%s\n", responseContent) } ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom timeboard dashboard to an existing dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV2.DashboardListAddItemsRequest{ Dashboards: []datadogV2.DashboardListItemRequest{ { Id: DashboardID, Type: datadogV2.DASHBOARDTYPE_CUSTOM_TIMEBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.CreateDashboardListItems(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.CreateDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.CreateDashboardListItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom screenboard dashboard to an existing dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import 
com.datadog.api.client.v2.model.DashboardListAddItemsRequest; import com.datadog.api.client.v2.model.DashboardListAddItemsResponse; import com.datadog.api.client.v2.model.DashboardListItemRequest; import com.datadog.api.client.v2.model.DashboardType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); // there is a valid "screenboard_dashboard" in the system String SCREENBOARD_DASHBOARD_ID = System.getenv("SCREENBOARD_DASHBOARD_ID"); DashboardListAddItemsRequest body = new DashboardListAddItemsRequest() .dashboards( Collections.singletonList( new DashboardListItemRequest() .id(SCREENBOARD_DASHBOARD_ID) .type(DashboardType.CUSTOM_SCREENBOARD))); try { DashboardListAddItemsResponse result = apiInstance.createDashboardListItems(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#createDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom timeboard dashboard to an existing dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import com.datadog.api.client.v2.model.DashboardListAddItemsRequest; import com.datadog.api.client.v2.model.DashboardListAddItemsResponse; import com.datadog.api.client.v2.model.DashboardListItemRequest; import com.datadog.api.client.v2.model.DashboardType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); DashboardListAddItemsRequest body = new DashboardListAddItemsRequest() .dashboards( Collections.singletonList( new DashboardListItemRequest() .id(DASHBOARD_ID) .type(DashboardType.CUSTOM_TIMEBOARD))); try { DashboardListAddItemsResponse result = apiInstance.createDashboardListItems(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#createDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" 
response ``` """ Add custom screenboard dashboard to an existing dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v2.model.dashboard_list_add_items_request import DashboardListAddItemsRequest from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest from datadog_api_client.v2.model.dashboard_type import DashboardType # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = environ["SCREENBOARD_DASHBOARD_ID"] body = DashboardListAddItemsRequest( dashboards=[ DashboardListItemRequest( id=SCREENBOARD_DASHBOARD_ID, type=DashboardType.CUSTOM_SCREENBOARD, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.create_dashboard_list_items(dashboard_list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` """ Add custom timeboard dashboard to an existing dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v2.model.dashboard_list_add_items_request import DashboardListAddItemsRequest from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest from datadog_api_client.v2.model.dashboard_type import DashboardType # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = DashboardListAddItemsRequest( dashboards=[ DashboardListItemRequest( id=DASHBOARD_ID, type=DashboardType.CUSTOM_TIMEBOARD, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.create_dashboard_list_items(dashboard_list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response ``` # Add custom screenboard dashboard to an existing dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = ENV["SCREENBOARD_DASHBOARD_ID"] body = DatadogAPIClient::V2::DashboardListAddItemsRequest.new({ dashboards: [ DatadogAPIClient::V2::DashboardListItemRequest.new({ id: SCREENBOARD_DASHBOARD_ID, type: DatadogAPIClient::V2::DashboardType::CUSTOM_SCREENBOARD, }), ], }) p api_instance.create_dashboard_list_items(DASHBOARD_LIST_ID.to_i, body) ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list 
returns "OK" response ``` # Add custom timeboard dashboard to an existing dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V2::DashboardListAddItemsRequest.new({ dashboards: [ DatadogAPIClient::V2::DashboardListItemRequest.new({ id: DASHBOARD_ID, type: DatadogAPIClient::V2::DashboardType::CUSTOM_TIMEBOARD, }), ], }) p api_instance.create_dashboard_list_items(DASHBOARD_LIST_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom screenboard dashboard to an existing dashboard list returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV2::model::DashboardListAddItemsRequest; use datadog_api_client::datadogV2::model::DashboardListItemRequest; use datadog_api_client::datadogV2::model::DashboardType; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); // there is a valid "screenboard_dashboard" in the system let screenboard_dashboard_id = std::env::var("SCREENBOARD_DASHBOARD_ID").unwrap(); let body = DashboardListAddItemsRequest::new().dashboards(vec![DashboardListItemRequest::new( screenboard_dashboard_id.clone(), DashboardType::CUSTOM_SCREENBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .create_dashboard_list_items(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` // Add custom timeboard dashboard to an existing dashboard list returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV2::model::DashboardListAddItemsRequest; use datadog_api_client::datadogV2::model::DashboardListItemRequest; use datadog_api_client::datadogV2::model::DashboardType; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = DashboardListAddItemsRequest::new().dashboards(vec![DashboardListItemRequest::new( dashboard_id.clone(), DashboardType::CUSTOM_TIMEBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .create_dashboard_list_items(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="<YOUR_DD_SITE>" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add custom screenboard dashboard to an existing dashboard list returns "OK" response ``` /** * Add custom screenboard dashboard to an existing dashboard list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); // there is a valid "screenboard_dashboard" in the system const SCREENBOARD_DASHBOARD_ID = process.env.SCREENBOARD_DASHBOARD_ID as string; const params: v2.DashboardListsApiCreateDashboardListItemsRequest = { body: { dashboards: [ { id: SCREENBOARD_DASHBOARD_ID, type: "custom_screenboard", }, ], }, dashboardListId: DASHBOARD_LIST_ID, }; apiInstance .createDashboardListItems(params) .then((data: v2.DashboardListAddItemsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Add custom timeboard dashboard to an existing dashboard list returns "OK" response ``` /** * Add custom timeboard dashboard to an existing dashboard list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v2.DashboardListsApiCreateDashboardListItemsRequest = { body: { dashboards: [ { id: DASHBOARD_ID, type: "custom_timeboard", }, ], }, dashboardListId: DASHBOARD_LIST_ID, }; apiInstance .createDashboardListItems(params) .then((data: v2.DashboardListAddItemsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="<YOUR_DD_SITE>" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#create-a-dashboard-list) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#create-a-dashboard-list-v1) POST
https://api.ap1.datadoghq.com/api/v1/dashboard/lists/manual
https://api.ap2.datadoghq.com/api/v1/dashboard/lists/manual
https://api.datadoghq.eu/api/v1/dashboard/lists/manual
https://api.ddog-gov.com/api/v1/dashboard/lists/manual
https://api.datadoghq.com/api/v1/dashboard/lists/manual
https://api.us3.datadoghq.com/api/v1/dashboard/lists/manual
https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual
### Overview Create an empty dashboard list. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint.
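Because this endpoint creates an empty list, a common follow-up is to populate it with the v2 [Add Items to a Dashboard List](https://docs.datadoghq.com/api/latest/dashboard-lists/#add-items-to-a-dashboard-list) endpoint documented above. The Python sketch below chains the two calls. It is illustrative rather than official: it assumes that `DASHBOARD_ID` refers to a dashboard that already exists in your organization and that the created list's `id` is readable as an attribute on the returned model, as in the official Python examples.

```
"""
Minimal sketch: create an empty dashboard list (v1), then add an existing
custom timeboard to it (v2). DASHBOARD_ID must reference an existing dashboard.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi as DashboardListsApiV1
from datadog_api_client.v1.model.dashboard_list import DashboardList
from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi as DashboardListsApiV2
from datadog_api_client.v2.model.dashboard_list_add_items_request import DashboardListAddItemsRequest
from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest
from datadog_api_client.v2.model.dashboard_type import DashboardType

DASHBOARD_ID = environ["DASHBOARD_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    # v1: create the (empty) dashboard list.
    created = DashboardListsApiV1(api_client).create_dashboard_list(
        body=DashboardList(name="Example-Dashboard-List")
    )
    # v2: add an existing custom timeboard to the list just created.
    added = DashboardListsApiV2(api_client).create_dashboard_list_items(
        dashboard_list_id=created.id,
        body=DashboardListAddItemsRequest(
            dashboards=[
                DashboardListItemRequest(
                    id=DASHBOARD_ID,
                    type=DashboardType.CUSTOM_TIMEBOARD,
                )
            ]
        ),
    )
    print(added)
```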
### Request #### Body Data (required) Create a dashboard list request body. * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Field Type Description author object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. created date-time Date of creation of the dashboard list. dashboard_count int64 The number of dashboards in the list. id int64 The ID of the dashboard list. is_favorite boolean Whether or not the list is in the favorites. modified date-time Date of last edition of the dashboard list. name [_required_] string The name of the dashboard list. type string The type of dashboard list. ``` { "name": "Example-Dashboard-List" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardList-200-v1) * [400](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardList-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardList-403-v1) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#CreateDashboardList-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Your Datadog Dashboards. Field Type Description author object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. created date-time Date of creation of the dashboard list. dashboard_count int64 The number of dashboards in the list. id int64 The ID of the dashboard list. is_favorite boolean Whether or not the list is in the favorites. modified date-time Date of last edition of the dashboard list. name [_required_] string The name of the dashboard list. type string The type of dashboard list. ``` { "author": { "email": "string", "handle": "string", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_count": "integer", "id": "integer", "is_favorite": false, "modified": "2019-09-19T10:00:00.000Z", "name": "My Dashboard", "type": "manual_dashboard_list" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) ##### Create a dashboard list returns "OK" response ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v1/dashboard/lists/manual" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Dashboard-List" } EOF ``` ##### Create a dashboard list returns "OK" response ``` // Create a dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.DashboardList{ Name: "Example-Dashboard-List", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardListsApi(apiClient) resp, r, err := api.CreateDashboardList(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.CreateDashboardList`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.CreateDashboardList`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a dashboard list returns "OK" response ``` // Create a dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardListsApi; import com.datadog.api.client.v1.model.DashboardList; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); DashboardList body = new DashboardList().name("Example-Dashboard-List"); try { DashboardList result = apiInstance.createDashboardList(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#createDashboardList"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody());
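// Response headers can help when debugging failed requests (for example, they may include rate limit details on 429 "Too many requests" errors).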
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a dashboard list returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) name = 'My Dashboard List' api.DashboardList.create(name=name) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Create a dashboard list returns "OK" response ``` """ Create a dashboard list returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v1.model.dashboard_list import DashboardList body = DashboardList( name="Example-Dashboard-List", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.create_dashboard_list(body=body) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a dashboard list returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) name = 'My Dashboard List' result = dog.create_dashboard_list(name) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a dashboard list returns "OK" response ``` # Create a dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardListsAPI.new body = DatadogAPIClient::V1::DashboardList.new({ name: "Example-Dashboard-List", }) p api_instance.create_dashboard_list(body) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a dashboard list returns "OK" response ``` // Create a dashboard list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboard_lists::DashboardListsAPI; use
datadog_api_client::datadogV1::model::DashboardList; #[tokio::main] async fn main() { let body = DashboardList::new("Example-Dashboard-List".to_string()); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api.create_dashboard_list(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a dashboard list returns "OK" response ``` /** * Create a dashboard list returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardListsApi(configuration); const params: v1.DashboardListsApiCreateDashboardListRequest = { body: { name: "Example-Dashboard-List", }, }; apiInstance .createDashboardList(params) .then((data: v1.DashboardList) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-a-dashboard-list) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-a-dashboard-list-v1) GET https://api.ap1.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.ap2.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.eu/api/v1/dashboard/lists/manual/{list_id}https://api.ddog-gov.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us3.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/{list_id} ### Overview Fetch an existing dashboard list’s definition. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint. ### Arguments #### Path Parameters Name Type Description list_id [_required_] integer ID of the dashboard list to fetch. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardList-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardList-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardList-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#GetDashboardList-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Your Datadog Dashboards. Field Type Description author object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. 
created date-time Date of creation of the dashboard list. dashboard_count int64 The number of dashboards in the list. id int64 The ID of the dashboard list. is_favorite boolean Whether or not the list is in the favorites. modified date-time Date of last edition of the dashboard list. name [_required_] string The name of the dashboard list. type string The type of dashboard list. ``` { "author": { "email": "string", "handle": "string", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_count": "integer", "id": "integer", "is_favorite": false, "modified": "2019-09-19T10:00:00.000Z", "name": "My Dashboard", "type": "manual_dashboard_list" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Get a dashboard list Copy ``` # Path parameters export list_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/${list_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a dashboard list ``` """ Get a dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.get_dashboard_list( list_id=int(DASHBOARD_LIST_ID), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then 
save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a dashboard list ``` # Get a dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] p api_instance.get_dashboard_list(DASHBOARD_LIST_ID.to_i) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a dashboard list ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) result = dog.get_dashboard_list(4741) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a dashboard list ``` // Get a dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardListsApi(apiClient) resp, r, err := api.GetDashboardList(ctx, DashboardListID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.GetDashboardList`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.GetDashboardList`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a dashboard list ``` // Get a dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardListsApi; import com.datadog.api.client.v1.model.DashboardList; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); try { DashboardList result = apiInstance.getDashboardList(DASHBOARD_LIST_ID); System.out.println(result); } catch (ApiException e) {
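// ApiException is thrown when the API returns an error response such as 403, 404, or 429; the status code, response body, and headers printed below come from that response.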
System.err.println("Exception when calling DashboardListsApi#getDashboardList"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a dashboard list ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.DashboardList.get(4741) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get a dashboard list ``` // Get a dashboard list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboard_lists::DashboardListsAPI; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api.get_dashboard_list(dashboard_list_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a dashboard list ``` /** * Get a dashboard list returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); const params: v1.DashboardListsApiGetDashboardListRequest = { listId: DASHBOARD_LIST_ID, }; apiInstance .getDashboardList(params) .then((data: v1.DashboardList) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update items of a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#update-items-of-a-dashboard-list) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#update-items-of-a-dashboard-list-v2) PUT https://api.ap1.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ap2.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.eu/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ddog-gov.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us3.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboards ### Overview Update dashboards of an existing dashboard list. ### Arguments #### Path Parameters Name Type Description dashboard_list_id [_required_] integer ID of the dashboard list to update items from. ### Request #### Body Data (required) New dashboards of the dashboard list. * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Field Type Description dashboards [object] List of dashboards to update the dashboard list to. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ``` { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardListItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardListItems-400-v2) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardListItems-403-v2) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardListItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardListItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Response containing a list of updated dashboards. Field Type Description dashboards [object] List of dashboards in the dashboard list. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ``` { "dashboards": [ { "id": "q5j-nti-fv6", "type": "host_timeboard" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) ##### Update items of a dashboard list returns "OK" response Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } EOF ``` ##### Update items of a dashboard list returns "OK" response ``` // Update items of a dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) // there is a valid "screenboard_dashboard" in the system ScreenboardDashboardID := os.Getenv("SCREENBOARD_DASHBOARD_ID") body := datadogV2.DashboardListUpdateItemsRequest{ Dashboards: []datadogV2.DashboardListItemRequest{ { Id: ScreenboardDashboardID, Type: datadogV2.DASHBOARDTYPE_CUSTOM_SCREENBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.UpdateDashboardListItems(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.UpdateDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.UpdateDashboardListItems`:\n%s\n", responseContent) } ``` Copy 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update items of a dashboard list returns "OK" response ``` // Update items of a dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import com.datadog.api.client.v2.model.DashboardListItemRequest; import com.datadog.api.client.v2.model.DashboardListUpdateItemsRequest; import com.datadog.api.client.v2.model.DashboardListUpdateItemsResponse; import com.datadog.api.client.v2.model.DashboardType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); // there is a valid "screenboard_dashboard" in the system String SCREENBOARD_DASHBOARD_ID = System.getenv("SCREENBOARD_DASHBOARD_ID"); DashboardListUpdateItemsRequest body = new DashboardListUpdateItemsRequest() .dashboards( Collections.singletonList( new DashboardListItemRequest() .id(SCREENBOARD_DASHBOARD_ID) .type(DashboardType.CUSTOM_SCREENBOARD))); try { DashboardListUpdateItemsResponse result = apiInstance.updateDashboardListItems(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#updateDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update items of a dashboard list returns "OK" response ``` """ Update items of a dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest from datadog_api_client.v2.model.dashboard_list_update_items_request import DashboardListUpdateItemsRequest from datadog_api_client.v2.model.dashboard_type import DashboardType # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = environ["SCREENBOARD_DASHBOARD_ID"] body = DashboardListUpdateItemsRequest( dashboards=[ DashboardListItemRequest( id=SCREENBOARD_DASHBOARD_ID, type=DashboardType.CUSTOM_SCREENBOARD, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = 
api_instance.update_dashboard_list_items(dashboard_list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update items of a dashboard list returns "OK" response ``` # Update items of a dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = ENV["SCREENBOARD_DASHBOARD_ID"] body = DatadogAPIClient::V2::DashboardListUpdateItemsRequest.new({ dashboards: [ DatadogAPIClient::V2::DashboardListItemRequest.new({ id: SCREENBOARD_DASHBOARD_ID, type: DatadogAPIClient::V2::DashboardType::CUSTOM_SCREENBOARD, }), ], }) p api_instance.update_dashboard_list_items(DASHBOARD_LIST_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update items of a dashboard list returns "OK" response ``` // Update items of a dashboard list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV2::model::DashboardListItemRequest; use datadog_api_client::datadogV2::model::DashboardListUpdateItemsRequest; use datadog_api_client::datadogV2::model::DashboardType; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); // there is a valid "screenboard_dashboard" in the system let screenboard_dashboard_id = std::env::var("SCREENBOARD_DASHBOARD_ID").unwrap(); let body = DashboardListUpdateItemsRequest::new().dashboards(vec![DashboardListItemRequest::new( screenboard_dashboard_id.clone(), DashboardType::CUSTOM_SCREENBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .update_dashboard_list_items(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update items of a dashboard list returns "OK" response ``` /** * Update items of a dashboard list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = 
parseInt(process.env.DASHBOARD_LIST_ID as string); // there is a valid "screenboard_dashboard" in the system const SCREENBOARD_DASHBOARD_ID = process.env.SCREENBOARD_DASHBOARD_ID as string; const params: v2.DashboardListsApiUpdateDashboardListItemsRequest = { body: { dashboards: [ { id: SCREENBOARD_DASHBOARD_ID, type: "custom_screenboard", }, ], }, dashboardListId: DASHBOARD_LIST_ID, }; apiInstance .updateDashboardListItems(params) .then((data: v2.DashboardListUpdateItemsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete items from a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#delete-items-from-a-dashboard-list) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#delete-items-from-a-dashboard-list-v2) DELETE https://api.ap1.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ap2.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.eu/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.ddog-gov.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us3.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboardshttps://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/{dashboard_list_id}/dashboards ### Overview Delete dashboards from an existing dashboard list. ### Arguments #### Path Parameters Name Type Description dashboard_list_id [_required_] integer ID of the dashboard list to delete items from. ### Request #### Body Data (required) Dashboards to delete from the dashboard list. * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Field Type Description dashboards [object] List of dashboards to delete from the dashboard list. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. 
Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` { "dashboards": [ { "id": "123-abc-456", "type": "custom_timeboard" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardListItems-200-v2) * [400](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardListItems-400-v2) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardListItems-403-v2) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardListItems-404-v2) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardListItems-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Response containing a list of deleted dashboards. Field Type Description deleted_dashboards_from_list [object] List of dashboards deleted from the dashboard list. id [_required_] string ID of the dashboard. type [_required_] enum The type of the dashboard. Allowed enum values: `custom_timeboard,custom_screenboard,integration_screenboard,integration_timeboard,host_timeboard` ``` { "deleted_dashboards_from_list": [ { "id": "q5j-nti-fv6", "type": "host_timeboard" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboards": [ { "id": "123-abc-456", "type": "custom_screenboard" } ] } EOF ``` ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response Copy ``` # Path parameters export dashboard_list_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dashboard/lists/manual/${dashboard_list_id}/dashboards" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboards": [ { "id": "123-abc-456", "type": "custom_timeboard" } ] } EOF ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom screenboard dashboard from an existing dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) // there is a valid "screenboard_dashboard" in the system ScreenboardDashboardID := os.Getenv("SCREENBOARD_DASHBOARD_ID") body := datadogV2.DashboardListDeleteItemsRequest{ Dashboards: []datadogV2.DashboardListItemRequest{ { Id: ScreenboardDashboardID, Type: datadogV2.DASHBOARDTYPE_CUSTOM_SCREENBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.DeleteDashboardListItems(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.DeleteDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
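// responseContent holds the indented JSON encoding of the delete-items response, printed below.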
fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.DeleteDashboardListItems`:\n%s\n", responseContent) } ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom timeboard dashboard from an existing dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV2.DashboardListDeleteItemsRequest{ Dashboards: []datadogV2.DashboardListItemRequest{ { Id: DashboardID, Type: datadogV2.DASHBOARDTYPE_CUSTOM_TIMEBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDashboardListsApi(apiClient) resp, r, err := api.DeleteDashboardListItems(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.DeleteDashboardListItems`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.DeleteDashboardListItems`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom screenboard dashboard from an existing dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import com.datadog.api.client.v2.model.DashboardListDeleteItemsRequest; import com.datadog.api.client.v2.model.DashboardListDeleteItemsResponse; import com.datadog.api.client.v2.model.DashboardListItemRequest; import com.datadog.api.client.v2.model.DashboardType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); // there is a valid "screenboard_dashboard" in the system String SCREENBOARD_DASHBOARD_ID = System.getenv("SCREENBOARD_DASHBOARD_ID"); DashboardListDeleteItemsRequest body = new DashboardListDeleteItemsRequest() .dashboards( Collections.singletonList( new DashboardListItemRequest() .id(SCREENBOARD_DASHBOARD_ID) .type(DashboardType.CUSTOM_SCREENBOARD))); try { DashboardListDeleteItemsResponse result = apiInstance.deleteDashboardListItems(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#deleteDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); 
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom timeboard dashboard from an existing dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DashboardListsApi; import com.datadog.api.client.v2.model.DashboardListDeleteItemsRequest; import com.datadog.api.client.v2.model.DashboardListDeleteItemsResponse; import com.datadog.api.client.v2.model.DashboardListItemRequest; import com.datadog.api.client.v2.model.DashboardType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); DashboardListDeleteItemsRequest body = new DashboardListDeleteItemsRequest() .dashboards( Collections.singletonList( new DashboardListItemRequest() .id(DASHBOARD_ID) .type(DashboardType.CUSTOM_TIMEBOARD))); try { DashboardListDeleteItemsResponse result = apiInstance.deleteDashboardListItems(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#deleteDashboardListItems"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` """ Delete custom screenboard dashboard from an existing dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v2.model.dashboard_list_delete_items_request import DashboardListDeleteItemsRequest from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest from datadog_api_client.v2.model.dashboard_type import DashboardType # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = environ["SCREENBOARD_DASHBOARD_ID"] body = DashboardListDeleteItemsRequest( dashboards=[ DashboardListItemRequest( id=SCREENBOARD_DASHBOARD_ID, type=DashboardType.CUSTOM_SCREENBOARD, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.delete_dashboard_list_items(dashboard_list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` """ Delete custom 
timeboard dashboard from an existing dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v2.model.dashboard_list_delete_items_request import DashboardListDeleteItemsRequest from datadog_api_client.v2.model.dashboard_list_item_request import DashboardListItemRequest from datadog_api_client.v2.model.dashboard_type import DashboardType # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = DashboardListDeleteItemsRequest( dashboards=[ DashboardListItemRequest( id=DASHBOARD_ID, type=DashboardType.CUSTOM_TIMEBOARD, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.delete_dashboard_list_items(dashboard_list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` # Delete custom screenboard dashboard from an existing dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] # there is a valid "screenboard_dashboard" in the system SCREENBOARD_DASHBOARD_ID = ENV["SCREENBOARD_DASHBOARD_ID"] body = DatadogAPIClient::V2::DashboardListDeleteItemsRequest.new({ dashboards: [ DatadogAPIClient::V2::DashboardListItemRequest.new({ id: SCREENBOARD_DASHBOARD_ID, type: DatadogAPIClient::V2::DashboardType::CUSTOM_SCREENBOARD, }), ], }) p api_instance.delete_dashboard_list_items(DASHBOARD_LIST_ID.to_i, body) ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` # Delete custom timeboard dashboard from an existing dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V2::DashboardListDeleteItemsRequest.new({ dashboards: [ DatadogAPIClient::V2::DashboardListItemRequest.new({ id: DASHBOARD_ID, type: DatadogAPIClient::V2::DashboardType::CUSTOM_TIMEBOARD, }), ], }) p api_instance.delete_dashboard_list_items(DASHBOARD_LIST_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom screenboard dashboard from an existing dashboard list returns // 
"OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV2::model::DashboardListDeleteItemsRequest; use datadog_api_client::datadogV2::model::DashboardListItemRequest; use datadog_api_client::datadogV2::model::DashboardType; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); // there is a valid "screenboard_dashboard" in the system let screenboard_dashboard_id = std::env::var("SCREENBOARD_DASHBOARD_ID").unwrap(); let body = DashboardListDeleteItemsRequest::new().dashboards(vec![DashboardListItemRequest::new( screenboard_dashboard_id.clone(), DashboardType::CUSTOM_SCREENBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .delete_dashboard_list_items(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` // Delete custom timeboard dashboard from an existing dashboard list returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV2::model::DashboardListDeleteItemsRequest; use datadog_api_client::datadogV2::model::DashboardListItemRequest; use datadog_api_client::datadogV2::model::DashboardType; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = DashboardListDeleteItemsRequest::new().dashboards(vec![DashboardListItemRequest::new( dashboard_id.clone(), DashboardType::CUSTOM_TIMEBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .delete_dashboard_list_items(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete custom screenboard dashboard from an existing dashboard list returns "OK" response ``` /** * Delete custom screenboard dashboard from an existing dashboard list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); // there is a valid "screenboard_dashboard" in the system const SCREENBOARD_DASHBOARD_ID = process.env.SCREENBOARD_DASHBOARD_ID as string; const params: v2.DashboardListsApiDeleteDashboardListItemsRequest = { body: { dashboards: [ { id: SCREENBOARD_DASHBOARD_ID, type: "custom_screenboard", }, ], }, dashboardListId: DASHBOARD_LIST_ID, }; 
apiInstance .deleteDashboardListItems(params) .then((data: v2.DashboardListDeleteItemsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Delete custom timeboard dashboard from an existing dashboard list returns "OK" response ``` /** * Delete custom timeboard dashboard from an existing dashboard list returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v2.DashboardListsApiDeleteDashboardListItemsRequest = { body: { dashboards: [ { id: DASHBOARD_ID, type: "custom_timeboard", }, ], }, dashboardListId: DASHBOARD_LIST_ID, }; apiInstance .deleteDashboardListItems(params) .then((data: v2.DashboardListDeleteItemsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#update-a-dashboard-list) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#update-a-dashboard-list-v1) PUT https://api.ap1.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.ap2.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.eu/api/v1/dashboard/lists/manual/{list_id}https://api.ddog-gov.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us3.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/{list_id} ### Overview Update the name of a dashboard list. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint. ### Arguments #### Path Parameters Name Type Description list_id [_required_] integer ID of the dashboard list to update. ### Request #### Body Data (required) Update a dashboard list request body. * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Field Type Description author object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. created date-time Date of creation of the dashboard list. dashboard_count int64 The number of dashboards in the list. id int64 The ID of the dashboard list. is_favorite boolean Whether or not the list is in the favorites. modified date-time Date of last edition of the dashboard list. name [_required_] string The name of the dashboard list. type string The type of dashboard list. 
``` { "name": "updated Example-Dashboard-List" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardList-200-v1) * [400](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardList-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardList-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardList-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#UpdateDashboardList-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Your Datadog Dashboards. Field Type Description author object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. created date-time Date of creation of the dashboard list. dashboard_count int64 The number of dashboards in the list. id int64 The ID of the dashboard list. is_favorite boolean Whether or not the list is in the favorites. modified date-time Date of last edition of the dashboard list. name [_required_] string The name of the dashboard list. type string The type of dashboard list. ``` { "author": { "email": "string", "handle": "string", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_count": "integer", "id": "integer", "is_favorite": false, "modified": "2019-09-19T10:00:00.000Z", "name": "My Dashboard", "type": "manual_dashboard_list" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) ##### Update a dashboard list returns "OK" response Copy ``` # Path parameters export list_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/${list_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "updated Example-Dashboard-List" } EOF ``` ##### Update a dashboard list returns "OK" response ``` // Update a dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) body := datadogV1.DashboardList{ Name: "updated Example-Dashboard-List", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardListsApi(apiClient) resp, r, err := api.UpdateDashboardList(ctx, DashboardListID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.UpdateDashboardList`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.UpdateDashboardList`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a dashboard list returns "OK" response ``` // Update a dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardListsApi; import com.datadog.api.client.v1.model.DashboardList; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); DashboardList body = new DashboardList().name("updated Example-Dashboard-List"); try { DashboardList result = apiInstance.updateDashboardList(DASHBOARD_LIST_ID, body); System.out.println(result); } catch (ApiException e) { 
System.err.println("Exception when calling DashboardListsApi#updateDashboardList"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a dashboard list returns "OK" response ``` """ Update a dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi from datadog_api_client.v1.model.dashboard_list import DashboardList # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] body = DashboardList( name="updated Example-Dashboard-List", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.update_dashboard_list(list_id=int(DASHBOARD_LIST_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a dashboard list returns "OK" response ``` # Update a dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] body = DatadogAPIClient::V1::DashboardList.new({ name: "updated Example-Dashboard-List", }) p api_instance.update_dashboard_list(DASHBOARD_LIST_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a dashboard list returns "OK" response ``` // Update a dashboard list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboard_lists::DashboardListsAPI; use datadog_api_client::datadogV1::model::DashboardList; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); let body = DashboardList::new("updated Example-Dashboard-List".to_string()); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api .update_dashboard_list(dashboard_list_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to 
`src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a dashboard list returns "OK" response ``` /** * Update a dashboard list returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); const params: v1.DashboardListsApiUpdateDashboardListRequest = { body: { name: "updated Example-Dashboard-List", }, listId: DASHBOARD_LIST_ID, }; apiInstance .updateDashboardList(params) .then((data: v1.DashboardList) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#delete-a-dashboard-list) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboard-lists/#delete-a-dashboard-list-v1) DELETE https://api.ap1.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.ap2.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.eu/api/v1/dashboard/lists/manual/{list_id}https://api.ddog-gov.com/api/v1/dashboard/lists/manual/{list_id}https://api.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us3.datadoghq.com/api/v1/dashboard/lists/manual/{list_id}https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/{list_id} ### Overview Delete a dashboard list. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboard-lists) to access this endpoint. ### Arguments #### Path Parameters Name Type Description list_id [_required_] integer ID of the dashboard list to delete. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardList-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardList-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardList-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboard-lists/#DeleteDashboardList-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Deleted dashboard details. Expand All Field Type Description deleted_dashboard_list_id int64 ID of the deleted dashboard list. ``` { "deleted_dashboard_list_id": "integer" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboard-lists/) * [Example](https://docs.datadoghq.com/api/latest/dashboard-lists/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboard-lists/?code-lang=python-legacy) ##### Delete a dashboard list Copy ``` # Path parameters export list_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/lists/manual/${list_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a dashboard list ``` """ Delete a dashboard list returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboard_lists_api import DashboardListsApi # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = environ["DASHBOARD_LIST_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardListsApi(api_client) response = api_instance.delete_dashboard_list( list_id=int(DASHBOARD_LIST_ID), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a dashboard list ``` # Delete a dashboard list returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardListsAPI.new # there is a valid "dashboard_list" in the system DASHBOARD_LIST_ID = ENV["DASHBOARD_LIST_ID"] p api_instance.delete_dashboard_list(DASHBOARD_LIST_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" 
``` ##### Delete a dashboard list ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) result = dog.delete_dashboard_list(4741) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a dashboard list ``` // Delete a dashboard list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard_list" in the system DashboardListID, _ := strconv.ParseInt(os.Getenv("DASHBOARD_LIST_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardListsApi(apiClient) resp, r, err := api.DeleteDashboardList(ctx, DashboardListID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardListsApi.DeleteDashboardList`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardListsApi.DeleteDashboardList`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a dashboard list ``` // Delete a dashboard list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardListsApi; import com.datadog.api.client.v1.model.DashboardListDeleteResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardListsApi apiInstance = new DashboardListsApi(defaultClient); // there is a valid "dashboard_list" in the system Long DASHBOARD_LIST_ID = Long.parseLong(System.getenv("DASHBOARD_LIST_ID")); try { DashboardListDeleteResponse result = apiInstance.deleteDashboardList(DASHBOARD_LIST_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardListsApi#deleteDashboardList"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a dashboard list ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.DashboardList.delete(4741) ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete a dashboard list ``` // Delete a dashboard list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboard_lists::DashboardListsAPI; #[tokio::main] async fn main() { // there is a valid "dashboard_list" in the system let dashboard_list_id: i64 = std::env::var("DASHBOARD_LIST_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardListsAPI::with_config(configuration); let resp = api.delete_dashboard_list(dashboard_list_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a dashboard list ``` /** * Delete a dashboard list returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardListsApi(configuration); // there is a valid "dashboard_list" in the system const DASHBOARD_LIST_ID = parseInt(process.env.DASHBOARD_LIST_ID as string); const params: v1.DashboardListsApiDeleteDashboardListRequest = { listId: DASHBOARD_LIST_ID, }; apiInstance .deleteDashboardList(params) .then((data: v1.DashboardListDeleteResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=dff4be51-f862-4abb-b14c-0ef8615697c6&bo=1&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Dashboard%20Lists&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdashboard-lists%2F&r=&evt=pageLoad&sv=2&cdb=AQAA&rn=261057) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=a0ee916e-9018-443c-80a7-14dda7823f52&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=4da4c407-18d9-4b01-9e64-abef1cd313f2&pt=Dashboard%20Lists&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdashboard-lists%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=a0ee916e-9018-443c-80a7-14dda7823f52&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=4da4c407-18d9-4b01-9e64-abef1cd313f2&pt=Dashboard%20Lists&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdashboard-lists%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) --- # Source: https://docs.datadoghq.com/api/latest/dashboards 
# Dashboards Manage all your dashboards, as well as access to your shared dashboards, through the API. See the [Dashboards page](https://docs.datadoghq.com/dashboards/) for more information.
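The endpoint reference that follows, starting with [Create a new dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-new-dashboard), documents every request field in detail. As a quick orientation, here is a minimal sketch of a create-dashboard call in the same curl style used elsewhere on this page. Only `title`, `layout_type`, and `widgets` are required; the timeseries widget, the `app.requests.count` metric, and the `.as_rate()` modifier are illustrative placeholders rather than part of the reference, so substitute a query that matches your own metrics (and append `.as_count()` or `.as_rate()` only where the overview below calls for it).

```
# Minimal create-dashboard sketch (illustrative only):
# - app.requests.count is a hypothetical count metric
# - replace api.datadoghq.com with your site's API host if needed
curl -X POST "https://api.datadoghq.com/api/v1/dashboard" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "title": "Example-Dashboard",
  "layout_type": "ordered",
  "widgets": [
    {
      "definition": {
        "type": "timeseries",
        "title": "Requests per second",
        "requests": [
          { "q": "sum:app.requests.count{*}.as_rate()" }
        ]
      }
    }
  ]
}
EOF
```

On success, the API returns the stored dashboard definition, including server-generated fields such as `id` and `url` described in the schema below.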
## [Create a new dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-new-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#create-a-new-dashboard-v1) POST https://api.ap1.datadoghq.com/api/v1/dashboardhttps://api.ap2.datadoghq.com/api/v1/dashboardhttps://api.datadoghq.eu/api/v1/dashboardhttps://api.ddog-gov.com/api/v1/dashboardhttps://api.datadoghq.com/api/v1/dashboardhttps://api.us3.datadoghq.com/api/v1/dashboardhttps://api.us5.datadoghq.com/api/v1/dashboard ### Overview Create a dashboard using the specified options. When defining queries in your widgets, take note of which queries should have the `as_count()` or `as_rate()` modifiers appended. Refer to the following [documentation](https://docs.datadoghq.com/developers/metrics/type_modifiers/?tab=count#in-application-modifiers) for more information on these modifiers. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Request #### Body Data (required) Create a dashboard request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Field Type Description author_handle string Identifier of the dashboard author. author_name string Name of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string ID of the dashboard. is_read_only boolean **DEPRECATED** : Whether this dashboard is read-only. If True, only the author and admins can make changes to it. This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type [_required_] enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. notify_list [string] List of handles of users to notify when changes are made to this dashboard. reflow_type enum Reflow type for a **new dashboard layout** dashboard. Set this only when layout type is 'ordered'. If set to 'fixed', the dashboard expects all widgets to have a layout, and if it's set to 'auto', widgets should not have layouts. Allowed enum values: `auto,fixed` restricted_roles [string] A list of role identifiers. Only the author and users associated with at least one of these roles can edit this dashboard. tags [string] List of team names representing ownership of a dashboard. template_variable_presets [object] Array of template variables saved views. name string The name of the variable. template_variables [object] List of variables. name string The name of the variable. value string **DEPRECATED** : (deprecated) The value of the template variable within the saved view. Cannot be used in conjunction with `values`. values [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. Cannot be used in conjunction with `value`. template_variables [object] List of template variables for this dashboard. available_values [string] The list of values that the template variable drop-down is limited to. default string **DEPRECATED** : (deprecated) The default value for the template variable on dashboard load. Cannot be used in conjunction with `defaults`. 
defaults [string] One or many default values for template variables on load. If more than one default is specified, they will be unioned together with `OR`. Cannot be used in conjunction with `default`. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). title [_required_] string Title of the dashboard. url string The URL of the dashboard. widgets [_required_] [object] List of widgets to display on the dashboard. definition [_required_] [Definition of the widget](https://docs.datadoghq.com/dashboards/widgets/). Option 1 object Alert graphs are timeseries graphs showing the current status of any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the alert graph widget. Allowed enum values: `alert_graph` default: `alert_graph` viz_type [_required_] enum Whether to display the Alert Graph as a timeseries or a top list. Allowed enum values: `timeseries,toplist` Option 2 object Alert values are query values showing the current value of the metric in any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. precision int64 Number of decimal to show. If not defined, will use the raw value. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of value in the widget. type [_required_] enum Type of the alert value widget. Allowed enum values: `alert_value` default: `alert_value` unit string Unit to display with the value. Option 3 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. 
link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. 
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. 
facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 4 object Check status shows the current status or number of results for any check performed. check [_required_] string Name of the check to use in the widget. group string Group reporting a single check. group_by [string] List of tag prefixes to group by in the case of a cluster check. grouping [_required_] enum The kind of grouping to use. Allowed enum values: `check,cluster` tags [string] List of tags used to filter the groups reporting a cluster check. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the check status widget. 
Allowed enum values: `check_status` default: `check_status` Option 5 object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. 
A guide to the available color palettes is available in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. 
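The `apm_query`, `log_query`, `network_query`, `rum_query`, `security_query`, and `profile_metrics_query` fields above all share the same log-query shape: an `index`, a `search` string, one of `compute` or `multi_compute`, and an optional `group_by` list. A minimal sketch of that shape follows; the facet name and the search string are placeholders, not values from this reference.

```json
{
  "log_query": {
    "index": "*",
    "search": { "query": "status:error" },
    "compute": { "aggregation": "count" },
    "group_by": [
      {
        "facet": "service",
        "limit": 10,
        "sort": { "aggregation": "count", "order": "desc" }
      }
    ]
  }
}
```

Swap `compute` for a `multi_compute` array when more than one aggregation is needed; as noted above, the two fields are mutually exclusive.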
queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. 
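As a rough illustration of how the `queries` options above combine with `formulas`, the sketch below pairs a metrics query (Option 1) with an events/logs query (Option 2) and references both by name in a formula. The metric name and search string are placeholders chosen for illustration.

```json
{
  "formulas": [ { "formula": "query2 / query1" } ],
  "queries": [
    {
      "data_source": "metrics",
      "name": "query1",
      "query": "avg:system.cpu.user{*}",
      "aggregator": "avg"
    },
    {
      "data_source": "logs",
      "name": "query2",
      "compute": { "aggregation": "count" },
      "search": { "query": "status:error" },
      "indexes": []
    }
  ],
  "response_format": "timeseries"
}
```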
name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum The SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. 
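For the SLO (Option 6) and Cloud Cost (Option 7) query types just listed, the sketch below shows how the two entries might appear in a `queries` array. The SLO ID and the cost query string are placeholders, not values from this reference.

```json
{
  "queries": [
    {
      "data_source": "slo",
      "name": "slo_burn",
      "slo_id": "<your-slo-id>",
      "measure": "burn_rate",
      "group_mode": "overall",
      "slo_query_type": "metric"
    },
    {
      "data_source": "cloud_cost",
      "name": "cost",
      "query": "sum:aws.cost.amortized{*} by {aws_product}",
      "aggregator": "sum"
    }
  ]
}
```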
query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. 
It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` Option 6 object The event stream is a widget version of the stream of events on the Event Stream view. Only available on FREE layout dashboards. event_size enum Size to use to display an event. Allowed enum values: `s,l` query [_required_] string Query to filter the event stream with. tags_execution string The execution method for multi-value filters. Can be either and or or. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event stream widget. Allowed enum values: `event_stream` default: `event_stream` Option 7 object The event timeline is a widget version of the timeline that appears at the top of the Event Stream view. Only available on FREE layout dashboards. query [_required_] string Query to filter the event timeline with. tags_execution string The execution method for multi-value filters. Can be either and or or. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event timeline widget. Allowed enum values: `event_timeline` default: `event_timeline` Option 8 object Free text is a widget that allows you to add headings to your screenboard. Commonly used to state the overall purpose of the dashboard. Only available on FREE layout dashboards. color string Color of the text. font_size string Size of the text. text [_required_] string Text to display. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the free text widget. Allowed enum values: `free_text` default: `free_text` Option 9 object The funnel visualization displays a funnel of user sessions that maps a sequence of view navigation and user interaction in your application. requests [_required_] [object] Request payload used to query items. query [_required_] object Updated funnel widget. data_source [_required_] enum Source from which to query items to display in the funnel. Allowed enum values: `rum` default: `rum` query_string [_required_] string The widget query. steps [_required_] [object] List of funnel steps. facet [_required_] string The facet of the step. value [_required_] string The value of the step. request_type [_required_] enum Widget request type. Allowed enum values: `funnel` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of funnel widget. Allowed enum values: `funnel` default: `funnel` Option 10 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. 
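Putting the funnel widget fields (Option 9) above together, here is a minimal sketch of a funnel widget definition; the RUM facet, query string, and step values are placeholders for whatever views you funnel on.

```json
{
  "type": "funnel",
  "title": "Checkout funnel",
  "requests": [
    {
      "request_type": "funnel",
      "query": {
        "data_source": "rum",
        "query_string": "@type:view",
        "steps": [
          { "facet": "@view.name", "value": "/cart" },
          { "facet": "@view.name", "value": "/checkout" }
        ]
      }
    }
  ],
  "time": { "live_span": "1d" }
}
```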
label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. 
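The `formulas` block described above accepts per-formula display options. A sketch with two threshold-style conditional formats and a result limit follows; the alias and threshold values are illustrative only.

```json
{
  "formulas": [
    {
      "formula": "query1",
      "alias": "Requests per country",
      "conditional_formats": [
        { "comparator": ">", "value": 1000, "palette": "white_on_red" },
        { "comparator": "<=", "value": 1000, "palette": "white_on_green" }
      ],
      "limit": { "count": 25, "order": "desc" }
    }
  ]
}
```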
limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. 
Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. 
`primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
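For the list stream query object described above ("Updated list stream widget"), a hedged sketch of a logs-stream request; the query string, index name, and sort column are placeholders.

```json
{
  "query": {
    "data_source": "logs_stream",
    "query_string": "status:error service:web",
    "indexes": ["main"],
    "event_size": "s",
    "sort": { "column": "timestamp", "order": "desc" }
  },
  "response_format": "event_list"
}
```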
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. 
Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 11 object The groups widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the group title. banner_img string URL of image to display as a banner for the group. layout_type [_required_] enum Layout type of the group. Allowed enum values: `ordered` show_title boolean Whether to show the title or not. default: `true` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the group widget. Allowed enum values: `group` default: `group` widgets [_required_] [object] List of widget groups. Option 12 object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. 
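Combining the geomap fields above (requests, the required `style` and `view` objects, and an optional `time` setting), the sketch below outlines a geomap widget definition. The metric query, the country-ISO-code grouping tag, and the palette name are assumptions chosen for illustration.

```json
{
  "title": "Sessions by country",
  "type": "geomap",
  "requests": [
    { "q": "sum:rum.sessions{*} by {geo.country_iso_code}" }
  ],
  "style": { "palette": "hostmap_blues", "palette_flip": false },
  "view": { "focus": "WORLD" },
  "time": { "live_span": "4h" }
}
```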
custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. 
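To show how the heat map widget's event overlay, legend size, and markers described above fit together, a small sketch follows; the event search string, marker label, and threshold value are placeholders.

```json
{
  "events": [
    { "q": "sources:alert status:error", "tags_execution": "and" }
  ],
  "legend_size": "auto",
  "markers": [
    { "display_type": "warning dashed", "label": "SLO threshold", "value": "y = 0.99" }
  ]
}
```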
image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 13 object The host map widget graphs any metric across your hosts using the same visualization available from the main Host Map page. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. 
group [string] List of tag prefixes to group by. no_group_hosts boolean Whether to show the hosts that don’t fit in a group. no_metric_hosts boolean Whether to show the hosts with no metrics. node_type enum Which type of node to use in the map. Allowed enum values: `host,container` notes string Notes on the title. requests [_required_] object List of definitions. fill object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. 
network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. size object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. scope [string] List of tags used to filter the map. style object The style to apply to the widget. fill_max string Max value to use to color the map. fill_min string Min value to use to color the map. palette string Color palette to apply to the widget. palette_flip boolean Whether to flip the palette tones. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the host map widget. Allowed enum values: `hostmap` default: `hostmap` Option 14 object The iframe widget allows you to embed a portion of any other web page on your dashboard. Only available on FREE layout dashboards. type [_required_] enum Type of the iframe widget. Allowed enum values: `iframe` default: `iframe` url [_required_] string URL of the iframe. Option 15 object The image widget allows you to embed an image on your dashboard. An image can be a PNG, JPG, or animated GIF. Only available on FREE layout dashboards. has_background boolean Whether to display a background or not. default: `true` has_border boolean Whether to display a border or not. default: `true` horizontal_align enum Horizontal alignment. Allowed enum values: `center,left,right` margin enum Size of the margins around the image. **Note** : `small` and `large` values are deprecated. Allowed enum values: `sm,md,lg,small,large` sizing enum How to size the image on the widget. The values are based on the image `object-fit` CSS properties. **Note** : `zoom`, `fit` and `center` values are deprecated. Allowed enum values: `fill,contain,cover,none,scale-down,zoom,fit,center` type [_required_] enum Type of the image widget. Allowed enum values: `image` default: `image` url [_required_] string URL of the image. url_dark_theme string URL of the image in dark mode. vertical_align enum Vertical alignment. 
Allowed enum values: `center,top,bottom` Option 16 object The list stream visualization displays a table of recent events in your application that match a search criteria using user-defined columns. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". requests [_required_] [object] Request payload used to query items. columns [_required_] [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` query [_required_] object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format [_required_] enum Widget response format. Allowed enum values: `event_list` show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. 
Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the list stream widget. Allowed enum values: `list_stream` default: `list_stream` Option 17 object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` Option 18 object The monitor summary widget displays a summary view of all your Datadog monitors, or a subset based on a query. Only available on FREE layout dashboards. color_preference enum Which color to use on the widget. Allowed enum values: `background,text` count int64 **DEPRECATED** : The number of monitors to display. display_format enum What to display on the widget. Allowed enum values: `counts,countsAndList,list` hide_zero_counts boolean Whether to show counts of 0 or not. query [_required_] string Query to filter the monitors with. show_last_triggered boolean Whether to show the time that has elapsed since the monitor/group triggered. show_priority boolean Whether to show the priorities column. sort enum Widget sorting methods. Allowed enum values: `name,group,status,tags,triggered,group,asc,group,desc,name,asc,name,desc,status,asc,status,desc,tags,asc,tags,desc,triggered,asc,triggered,desc,priority,asc,priority,desc` start int64 **DEPRECATED** : The start of the list. Typically 0. summary_type enum Which summary type should be used. Allowed enum values: `monitors,groups,combined` title string Title of the widget. 
title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the monitor summary widget. Allowed enum values: `manage_status` default: `manage_status` Option 19 object The notes and links widget is similar to free text widget, but allows for more formatting options. background_color string Background color of the note. content [_required_] string Content of the note. font_size string Size of the text. has_padding boolean Whether to add padding or not. default: `true` show_tick boolean Whether to show a tick or not. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` tick_edge enum Define how you want to align the text on the widget. Allowed enum values: `bottom,left,right,top` tick_pos string Where to position the tick on an edge. type [_required_] enum Type of the note widget. Allowed enum values: `note` default: `note` vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 20 object The powerpack widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the powerpack title. banner_img string URL of image to display as a banner for the powerpack. powerpack_id [_required_] string UUID of the associated powerpack. show_title boolean Whether to show the title or not. default: `true` template_variables object Powerpack template variables. controlled_by_powerpack [object] Template variables controlled at the powerpack level. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. controlled_externally [object] Template variables controlled by the external resource, such as the dashboard this powerpack is on. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. title string Title of the widget. type [_required_] enum Type of the powerpack widget. Allowed enum values: `powerpack` default: `powerpack` Option 21 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. 
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string TODO. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. 
semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. 
Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Whether the timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` Option 22 object The run workflow widget allows you to run a workflow from a dashboard.
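
The three `time` variants above (a predefined live span, the new-format live span, and a fixed span) can be written out as in the following sketch; the chosen durations and epoch timestamps are arbitrary examples.

```python
# Sketches of the three widget `time` variants described above.
# Values (durations, epoch timestamps) are arbitrary examples.

# Option 1: wrapper around a predefined live span.
time_live_legacy = {"live_span": "1h"}

# Option 2: arbitrary live span in the new format (for example, 17 minutes).
time_live_new = {"type": "live", "unit": "minute", "value": 17}

# Option 3: fixed span between two epoch timestamps (seconds since epoch).
time_fixed = {"type": "fixed", "from": 1_700_000_000, "to": 1_700_086_400}
```
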
custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. inputs [object] Array of workflow inputs to map to dashboard template variables. name [_required_] string Name of the workflow input. value [_required_] string Dashboard template variable. Can be suffixed with '.value' or '.key'. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the run workflow widget. Allowed enum values: `run_workflow` default: `run_workflow` workflow_id [_required_] string Workflow id. Option 23 object Use the SLO List widget to track your SLOs (Service Level Objectives) on dashboards. requests [_required_] [object] Array of one request object to display in the widget. query [_required_] object Updated SLO List widget. limit int64 Maximum number of results to display in the table. default: `100` query_string [_required_] string Widget query. sort [object] Options for sorting results. column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` request_type [_required_] enum Widget request type. Allowed enum values: `slo_list` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO List widget. Allowed enum values: `slo_list` default: `slo_list` Option 24 object Use the SLO and uptime widget to track your SLOs (Service Level Objectives) and uptime on screenboards and timeboards. additional_query_filters string Additional filters applied to the SLO query. global_time_target string Defined global time target. show_error_budget boolean Defined error budget. slo_id string ID of the SLO displayed. time_windows [string] Times being monitored. title string Title of the widget. 
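
Putting the run workflow fields above together, a widget definition might look like the following sketch. The workflow ID, title, and template variable names are placeholders, not values from this page.

```python
# Illustrative run workflow widget definition (Option 22 above).
# The workflow_id, title, and template variable names are placeholders.
run_workflow_widget = {
    "type": "run_workflow",
    "workflow_id": "00000000-0000-0000-0000-000000000000",  # placeholder workflow ID
    "title": "Restart service",
    "inputs": [
        # Map a workflow input to a dashboard template variable; the value can
        # be suffixed with '.value' or '.key' as noted above.
        {"name": "service", "value": "$service.value"},
    ],
    "time": {"live_span": "1d"},
}
```
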
title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO widget. Allowed enum values: `slo` default: `slo` view_mode enum Define how you want the SLO to be displayed. Allowed enum values: `overall,component,both` view_type [_required_] string Type of view displayed by the widget. default: `detail` Option 25 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
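
As a sketch of the scatterplot `table` request described above, the snippet below pairs two metrics queries with formulas that assign them to the `x` and `y` dimensions. The metric query strings are hypothetical.

```python
# Sketch of a scatterplot formulas-and-functions table request (Option 25 above).
# Metric query strings are hypothetical; field names follow the listing in this section.
scatterplot_table_request = {
    "formulas": [
        {"formula": "query1", "dimension": "x"},
        {"formula": "query2", "dimension": "y"},
    ],
    "queries": [
        {
            "data_source": "metrics",
            "name": "query1",
            "query": "avg:system.cpu.user{*} by {service}",  # hypothetical query
            "aggregator": "avg",
        },
        {
            "data_source": "metrics",
            "name": "query2",
            "query": "avg:system.mem.used{*} by {service}",  # hypothetical query
            "aggregator": "avg",
        },
    ],
}
```
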
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
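
The `apm_query`, `log_query`, `rum_query`, and `security_query` objects above all share the same log-query shape, and `process_query` has its own smaller shape. Both are sketched below with placeholder facets, metric names, and search terms.

```python
# Sketch of the shared log-query shape used by apm_query, log_query, rum_query,
# security_query, and similar fields above. Facets and the search term are placeholders.
log_query = {
    "index": "*",                       # "*" queries all indexes at once
    "compute": {
        "aggregation": "count",         # aggregation method
    },
    "search": {
        "query": "service:web-store",   # search value to apply (hypothetical)
    },
    "group_by": [
        {
            "facet": "host",            # facet name (hypothetical)
            "limit": 10,                # maximum number of items in the group
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    # multi_compute is mutually exclusive with compute, so it is omitted here.
}

# Sketch of the process_query shape described above; the metric and filters are placeholders.
process_query = {
    "metric": "process.stat.cpu.total_pct",  # hypothetical process metric
    "filter_by": ["java"],                   # list of processes (hypothetical)
    "limit": 10,                             # max number of items in the filter list
    "search_by": "web",                      # search term (hypothetical)
}
```
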
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. 
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. 
Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 26 object This widget displays a map of a service to all of the services that call it, and all of the services that it calls. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. filters [_required_] [string] Your environment and primary tag (or * if enabled for your account). service [_required_] string The ID of the service you want to map. title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service map widget. Allowed enum values: `servicemap` default: `servicemap` Option 27 object The service summary displays the graphs of a chosen service in your screenboard. Only available on FREE layout dashboards. display_format enum Number of columns to display. Allowed enum values: `one_column,two_column,three_column` env [_required_] string APM environment. service [_required_] string APM service. show_breakdown boolean Whether to show the latency breakdown or not. show_distribution boolean Whether to show the latency distribution or not. show_errors boolean Whether to show the error metrics or not. show_hits boolean Whether to show the hits metrics or not. show_latency boolean Whether to show the latency metrics or not. show_resource_list boolean Whether to show the resource list or not. size_format enum Size of the widget. Allowed enum values: `small,medium,large` span_name [_required_] string APM span name. time Time setting for the widget. 
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service summary widget. Allowed enum values: `trace_service` default: `trace_service` Option 28 object The split graph widget allows you to create repeating units of a graph, one for each value in a group (for example, one per service). has_uniform_y_axes boolean Normalize y axes across graphs. size [_required_] enum Size of the individual graphs in the split. Allowed enum values: `xs,sm,md,lg` source_widget_definition [_required_] The original widget we are splitting on. Option 1 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name.
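
The service summary widget above (Option 27, type `trace_service`) combines these fields as in the sketch below; the environment, service, and span name are placeholder values.

```python
# Sketch of a service summary widget definition (Option 27 above, type trace_service).
# The environment, service, and span name are placeholder values.
service_summary_widget = {
    "type": "trace_service",
    "env": "prod",                   # APM environment (hypothetical)
    "service": "web-store",          # APM service (hypothetical)
    "span_name": "web.request",      # APM span name (hypothetical)
    "display_format": "two_column",  # one_column, two_column, or three_column
    "size_format": "medium",         # small, medium, or large
    "show_hits": True,
    "show_errors": True,
    "show_latency": True,
    "show_breakdown": False,
    "show_distribution": False,
    "show_resource_list": False,
    "time": {"live_span": "4h"},
    "title": "Service summary: web-store",
}
```
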
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
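
The formula options above (conditional formats, number formats, cell display modes, and styling) combine as in this sketch; the thresholds, palette names, and unit names are illustrative placeholders.

```python
# Sketch of a widget formula with conditional formatting and number formatting,
# following the formula fields listed above. Thresholds, palette names, and
# unit names are illustrative placeholders.
formula = {
    "formula": "query1 / query2 * 100",
    "alias": "error rate (%)",
    "conditional_formats": [
        {"comparator": ">", "value": 5, "palette": "white_on_red"},
        {"comparator": "<=", "value": 5, "palette": "white_on_green"},
    ],
    "number_format": {
        "unit": {
            "type": "canonical_unit",
            "unit_name": "percent",   # hypothetical canonical unit name
        },
    },
    "cell_display_mode": "trend",
    "cell_display_mode_options": {"trend_type": "line", "y_scale": "shared"},
    "style": {"palette": "classic", "palette_index": 0},  # palette name is hypothetical
    "limit": {"count": 10, "order": "desc"},
}
```
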
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. 
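
The SLO data source option above (Option 6) maps onto a query object like the following sketch; the SLO ID is a placeholder.

```python
# Sketch of an SLO measures query (Option 6 above). The SLO ID is a placeholder.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456ghi789jkl012",  # hypothetical SLO ID
    "measure": "slo_status",               # e.g. good_events, burn_rate, ...
    "group_mode": "overall",               # or "components"
    "slo_query_type": "metric",            # metric, monitor, or time_slice
    "name": "slo_status_query",
}
```
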
query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. 
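
Similarly, the Cloud Cost data source above (Option 7) takes a query object along these lines; the cost query string itself is a placeholder.

```python
# Sketch of a Cloud Cost query (Option 7 above). The query string is a placeholder.
cloud_cost_query = {
    "data_source": "cloud_cost",
    "name": "monthly_aws_cost",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",  # hypothetical cost query
    "aggregator": "sum",
}
```
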
title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 2 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. 
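As a rough illustration of the events query shape (Option 2 above), the sketch below builds a JSON-ready dictionary in Python; the search string, facet, and query name are hypothetical placeholders, and only the field names and enum values come from the schema:

```python
# A formula-and-functions events query (Option 2 above), as a JSON-ready dict.
# The search string, facet, and name are placeholders for illustration.
events_query = {
    "data_source": "logs",                      # any event platform source listed above
    "name": "query1",                           # referenced from formulas
    "compute": {"aggregation": "count"},        # `interval` and `metric` are optional
    "search": {"query": "service:web status:error"},
    "group_by": [
        {
            "facet": "@http.status_code",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "indexes": [],                              # omit or [] to query all indexes at once
}
```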
group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. 
Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. 
Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 3 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. 
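Pulling the geomap fields from Option 2 above into one place, the sketch below shows what a minimal geomap widget definition might look like; the RUM search string, facet, title, and palette name are illustrative assumptions, and only the field names and allowed enum values are taken from the schema:

```python
# A minimal geomap widget definition (Option 2 above), as a JSON-ready dict.
# Search string, facet, title, and palette name are placeholders.
geomap_widget = {
    "type": "geomap",
    "title": "Sessions by country",
    "requests": [
        {
            "queries": [
                {
                    "data_source": "rum",
                    "name": "query1",
                    "compute": {"aggregation": "count"},
                    "search": {"query": "@type:session"},
                    # Region layer requests group by a tag whose value is a country ISO code.
                    "group_by": [{"facet": "@geo.country_iso_code", "limit": 250}],
                }
            ],
            "formulas": [{"formula": "query1"}],
            "response_format": "scalar",
        }
    ],
    "style": {"palette": "hostmap_blues", "palette_flip": False},
    "view": {"focus": "WORLD"},
}
```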
precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
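The `apm_query`, `audit_query`, `log_query`, `rum_query`, and `security_query` objects that repeat throughout these request definitions all share the same classic log-query shape (compute or multi_compute, group_by, search, index). A minimal sketch, with a hypothetical search string and facet:

```python
# The classic log-query shape shared by apm_query, audit_query, log_query, etc.
# Search string and facet are placeholders; index "*" queries all indexes at once.
classic_log_query = {
    "index": "*",
    "compute": {"aggregation": "count"},        # mutually exclusive with `multi_compute`
    "search": {"query": "status:error service:checkout"},
    "group_by": [
        {
            "facet": "service",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
}
```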
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string TODO. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. 
primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
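For the SLO measures query (Option 6 above), a minimal sketch of the corresponding dictionary; the `slo_id` is a hypothetical placeholder:

```python
# An SLO measures query (Option 6 above). The slo_id is a hypothetical placeholder.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456ghi789jkl012mno345pq",
    "measure": "slo_status",
    "group_mode": "overall",
    "slo_query_type": "metric",
    "name": "query1",
}
```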
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). 
default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` Option 4 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. 
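To show how the scatterplot `table` request above combines formulas and queries, here is a minimal sketch mapping two metric queries to the x and y dimensions; the metric queries and aliases are illustrative assumptions:

```python
# Scatterplot `table` request: two metric queries mapped to the x and y dimensions.
# The metric queries and aliases are illustrative; field names follow the schema above.
scatterplot_table_request = {
    "formulas": [
        {"formula": "query_x", "dimension": "x", "alias": "CPU"},
        {"formula": "query_y", "dimension": "y", "alias": "Memory"},
    ],
    "queries": [
        {"data_source": "metrics", "name": "query_x", "aggregator": "avg",
         "query": "avg:system.cpu.user{*} by {host}"},
        {"data_source": "metrics", "name": "query_y", "aggregator": "avg",
         "query": "avg:system.mem.used{*} by {host}"},
    ],
    "response_format": "scalar",
}
```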
slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
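The `process_query` object used by the scatterplot x and y requests takes only a metric, an optional filter list, and a search term. A minimal sketch, with a hypothetical metric and filter values:

```python
# A simple widget process query, per the fields above. Metric and filters are placeholders.
process_query = {
    "metric": "process.stat.cpu.total_pct",
    "filter_by": ["nginx"],
    "limit": 10,
    "search_by": "web",
}
```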
search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. 
include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 5 object Sunbursts highlight how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds.
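The `xaxis` and `yaxis` controls described above take string values for their bounds and scale. A hedged sketch with placeholder labels and bounds:

```python
# Illustrative axis controls for a scatterplot widget; labels and bounds are placeholders.
scatterplot_axes = {
    "xaxis": {
        "include_zero": True,    # set to True to include zero
        "label": "p50 latency",  # only usable on Scatterplot Widgets
        "min": "auto",           # defaults to "auto"
        "max": "500",            # numeric values are passed as strings
        "scale": "linear",       # "linear", "log", "sqrt", or "pow##" (for example "pow2")
    },
    "yaxis": {
        "include_zero": True,
        "label": "error rate",
        "min": "0",
        "max": "auto",
        "scale": "log",
    },
}
```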
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. 
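The formula-and-functions events query (Option 2 above) keeps the same shape regardless of the selected data source. A minimal sketch, with placeholder facet, search, and name values:

```python
# Illustrative formula-and-functions events query; names, facets, and the search string are placeholders.
events_query = {
    "data_source": "logs",               # one of the event platform data sources (logs, spans, rum, ...)
    "name": "query1",                    # name used to reference this query in formulas
    "compute": {
        "aggregation": "count",          # required aggregation method
        "interval": 60000,               # optional time interval in milliseconds
    },
    "group_by": [
        {
            "facet": "service",          # event facet to group by
            "limit": 10,                 # number of groups to return
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "indexes": [],                       # omit or use [] to query all indexes at once
    "search": {"query": "status:error"}, # events search string
}
```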
limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. 
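To make the Cloud Cost query option (Option 7 above) concrete, here is an illustrative sketch; the query string is a placeholder and not taken from this schema:

```python
# Illustrative formula-and-functions Cloud Cost query; the query string is a placeholder.
cloud_cost_query = {
    "data_source": "cloud_cost",   # required data source for Cloud Cost queries
    "name": "cost",                # name used to reference this query in formulas
    "query": "sum:aws.cost.amortized{*} by {aws_product}",  # placeholder Cloud Cost query
    "aggregator": "sum",           # one of avg, last, max, min, sum, percentile
}

# The enclosing widget request can also declare how results are returned.
request = {
    "queries": [cloud_cost_query],
    "response_format": "scalar",   # timeseries, scalar, or event_list
}
```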
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 6 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. 
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. 
aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. 
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 7 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
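The marker `display_type` and `value` fields described above are free-form strings: a severity plus a line type, and a single value or a range. An illustrative sketch with placeholder thresholds:

```python
# Illustrative timeseries widget markers; labels and thresholds are placeholders.
markers = [
    {
        "display_type": "error dashed",  # a severity (error, warning, ok, info) plus a line type
        "label": "Hard limit",           # label displayed over the marker
        "value": "y = 15",               # a single value ...
    },
    {
        "display_type": "warning solid",
        "label": "Degraded range",
        "value": "0 < y < 10",           # ... or a range of values
    },
]
```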
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. 
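Conditional formats, which appear on table and timeseries formulas throughout this schema, pair a comparator and threshold value with a palette. A minimal illustrative sketch (placeholder thresholds; palettes taken from the allowed values listed here):

```python
# Illustrative conditional formats for a widget formula; thresholds are placeholders.
conditional_formats = [
    {
        "comparator": ">",          # one of =, >, >=, <, <=
        "value": 100.0,             # threshold value for the comparator
        "palette": "white_on_red",  # color palette to apply
    },
    {
        "comparator": "<=",
        "value": 100.0,
        "palette": "white_on_green",
        "hide_value": False,        # True hides values
    },
]
```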
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. 
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
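The `log_query`, `rum_query`, `security_query`, and similar fields in this listing all share the same legacy log-query shape (`compute`, `search`, `group_by`, `index`). Below is a minimal sketch; the facet, search string, and index are placeholders.

```python
import json

# A minimal sketch of the legacy log-query object shared by `log_query`,
# `rum_query`, `security_query`, and related fields. Facet names, the search
# string, and the index are illustrative placeholders.
log_query = {
    "index": "*",                                   # query all indexes at once
    "compute": {
        "aggregation": "count",                     # required aggregation method
    },
    "search": {
        "query": "status:error service:web-store",  # search value to apply
    },
    "group_by": [
        {
            "facet": "service",                     # required facet name
            "limit": 10,                            # maximum number of items in the group
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
}

print(json.dumps(log_query, indent=2))
```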
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 8 object The top list visualization enables you to display a list of tag values, such as hostname or service, ranked by a metric value: for example, the highest consumers of CPU or the hosts with the least disk space. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link.
URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
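Putting the timeseries fields above together, a minimal widget definition might look like the following sketch. The metric, color palette, and title are placeholders; in a dashboard payload this object would be a widget's definition.

```python
import json

# A minimal sketch of a timeseries widget definition, assembled from the fields
# described above. The metric query, palette, and title are placeholders.
timeseries_definition = {
    "type": "timeseries",
    "title": "CPU usage by host",
    "title_align": "left",
    "show_legend": True,
    "time": {"live_span": "1h"},  # Option 1: wrapper for a live span
    "yaxis": {"include_zero": True, "min": "auto", "max": "auto", "scale": "linear"},
    "requests": [
        {
            "response_format": "timeseries",
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "q1",
                    "query": "avg:system.cpu.user{*} by {host}",
                }
            ],
            "formulas": [{"formula": "q1"}],
            "style": {
                "palette": "dog_classic",   # placeholder palette name
                "line_type": "solid",
                "line_width": "normal",
            },
        }
    ],
}

print(json.dumps(timeseries_definition, indent=2))
```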
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. 
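As a sketch of a single top list widget request, the following combines a metrics query, a formula, and one conditional format from the fields described above. The threshold, palette, and metric name are placeholders.

```python
import json

# A minimal sketch of one top list widget request: a metrics query, a formula
# with a result limit, and a conditional format. Values are placeholders.
toplist_request = {
    "response_format": "scalar",
    "queries": [
        {
            "data_source": "metrics",
            "name": "mem",
            "query": "avg:system.mem.used{*} by {host}",
            "aggregator": "max",
        }
    ],
    "formulas": [
        {
            "formula": "mem",
            "limit": {"count": 10, "order": "desc"},
        }
    ],
    "conditional_formats": [
        {
            "comparator": ">",           # required comparator
            "value": 8e9,                # required value for the comparator
            "palette": "white_on_red",   # required color palette
        }
    ],
}

print(json.dumps(toplist_request, indent=2))
```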
primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 9 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED**: The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED**: The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply.
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` split_config [_required_] object Encapsulates all user choices about how to split a graph. limit [_required_] int64 Maximum number of graphs to display in the widget. sort [_required_] object Controls the order in which graphs appear in the split. compute object Defines the metric and aggregation used as the sort value. aggregation [_required_] string How to aggregate the sort metric for the purposes of ordering. metric [_required_] string The metric to use for sorting graphs. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` split_dimensions [_required_] [object] The dimension(s) on which to split the graph one_graph_per [_required_] string The system interprets this attribute differently depending on the data source of the query being split. For metrics, it's a tag. For the events platform, it's an attribute or tag. 
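A minimal sketch of the `split_config` object described above, with placeholder metric and tag names. The widget `type` (`split_group`) and the remaining widget-level fields (`static_splits`, `time`, `title`) are covered just below; other parts of the split graph widget are omitted from this sketch.

```python
import json

# A minimal sketch of a split graph widget fragment built around `split_config`.
# The sort metric and split tag are illustrative placeholders.
split_graph_fragment = {
    "type": "split_group",
    "split_config": {
        "limit": 12,                    # maximum number of graphs to display
        "sort": {
            "order": "desc",
            "compute": {"aggregation": "avg", "metric": "system.cpu.user"},
        },
        "split_dimensions": [
            {"one_graph_per": "service"}  # tag or attribute to split the graph on
        ],
    },
}

print(json.dumps(split_graph_fragment, indent=2))
```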
static_splits [array] Manual selection of tags that makes the split graph widget static. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the split graph widget. Allowed enum values: `split_group` default: `split_group` Option 29 object The sunburst visualization highlights how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of a table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds.
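To tie the sunburst fields together, here is a minimal sketch of a sunburst widget definition that uses a formula with an events (logs) query; the request-level fields are described above and in the remainder of this listing. The title, search string, and facet are placeholders.

```python
import json

# A minimal sketch of a sunburst widget definition using a formula and an
# events (logs) query. Title, query, and facet values are placeholders.
sunburst_definition = {
    "type": "sunburst",
    "title": "Error logs by service",
    "hide_total": False,
    "legend": {"type": "inline", "hide_percent": False},  # Option 2: inline or automatic legend
    "requests": [
        {
            "response_format": "scalar",
            "queries": [
                {
                    "data_source": "logs",
                    "name": "errors",
                    "compute": {"aggregation": "count"},
                    "search": {"query": "status:error"},
                    "group_by": [
                        {
                            "facet": "service",
                            "limit": 10,
                            "sort": {"aggregation": "count", "order": "desc"},
                        }
                    ],
                }
            ],
            "formulas": [{"formula": "errors"}],
        }
    ],
}

print(json.dumps(sunburst_definition, indent=2))
```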
search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. 
query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 30 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. 
Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. 
index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 31 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). 
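As a reference point for the table (`query_table`) fields documented above, the following is a minimal sketch, not an example from this page, of how a request combining a formulas-and-functions metrics query, a formula with a conditional format, a sort block, and a live-span time setting might be assembled. The metric, grouping tag, title, and threshold are hypothetical placeholders; only the field layout follows the schema described above.

```python
import json

# Hypothetical sketch of a `query_table` widget definition using the fields
# documented above. Metric name, tag grouping, and threshold are placeholders.
query_table_widget = {
    "type": "query_table",
    "title": "CPU by service",
    "time": {"live_span": "1h"},  # Option 1: wrapper for a live span
    "requests": [
        {
            "response_format": "scalar",
            "queries": [
                {
                    "data_source": "metrics",   # formula-and-functions metrics query
                    "name": "q1",               # name referenced by the formula below
                    "query": "avg:system.cpu.user{*} by {service}",
                    "aggregator": "avg",
                }
            ],
            "formulas": [
                {
                    "formula": "q1",
                    "conditional_formats": [
                        {"comparator": ">", "value": 80, "palette": "white_on_red"}
                    ],
                }
            ],
            "sort": {
                "count": 10,
                "order_by": [{"type": "formula", "index": 0, "order": "desc"}],
            },
        }
    ],
}

print(json.dumps(query_table_widget, indent=2))
```

A fixed time window could be expressed with the Option 3 form instead, for example `{"type": "fixed", "from": <start_epoch_seconds>, "to": <end_epoch_seconds>}`, with the epoch values left to the caller.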
requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. 
expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. 
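To illustrate the formula-and-functions events query (Option 2 above) in context, here is a hedged sketch of how such a query might sit inside a timeseries widget request. The search string, facet, and limit are hypothetical placeholders; the field names mirror the schema described above.

```python
import json

# Hypothetical sketch of an event platform (logs) query used by a timeseries
# widget request. Search query, facet, and limit are placeholders only.
events_query_request = {
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "logs",                 # event platform data source
            "name": "errors",                      # name referenced by the formula
            "compute": {"aggregation": "count"},   # count of matching log events
            "search": {"query": "status:error service:web"},
            "indexes": [],                         # omit or use [] to query all indexes
            "group_by": [
                {
                    "facet": "service",
                    "limit": 5,
                    "sort": {"aggregation": "count", "order": "desc"},
                }
            ],
        }
    ],
    "formulas": [{"formula": "errors"}],
    "display_type": "bars",
}

print(json.dumps(events_query_request, indent=2))
```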
group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). 
default: `linear` Option 32 object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. 
Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. 
type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 33 object This widget displays a topology of nodes and edges for different data sources. It replaces the service map widget. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] One or more Topology requests. query object Query to service-based topology data sources like the service map or data streams. data_source enum Name of the data source Allowed enum values: `data_streams,service_map` filters [string] Your environment and primary tag (or * if enabled for your account). service string Name of the service request_type enum Widget request type. Allowed enum values: `topology` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the topology map widget. Allowed enum values: `topology_map` default: `topology_map` Option 34 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. 
Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. 
service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` id int64 ID of the widget. layout object The layout for a widget on a `free` or **new dashboard layout** dashboard. height [_required_] int64 The height of the widget. Should be a non-negative integer. is_column_break boolean Whether the widget should be the first one on the second column in high density or not. **Note** : Only for the **new dashboard layout** and only one widget in the dashboard should have this property set to `true`. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer.

##### Clients deserialize a dashboard with an empty time object

```
{
  "title": "Example-Dashboard",
  "widgets": [
    {
      "definition": {
        "title": "Example Cloud Cost Query",
        "title_size": "16",
        "title_align": "left",
        "type": "timeseries",
        "requests": [
          {
            "formulas": [
              { "formula": "query1" }
            ],
            "queries": [
              {
                "data_source": "cloud_cost",
                "name": "query1",
                "query": "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)"
              }
            ],
            "response_format": "timeseries",
            "style": {
              "palette": "dog_classic",
              "line_type": "solid",
              "line_width": "normal"
            },
            "display_type": "bars"
          }
        ],
        "time": {}
      }
    }
  ],
  "layout_type": "ordered"
}
```

##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query

```
{
  "title": "Example-Dashboard",
  "description": "",
  "widgets": [
    {
      "definition": {
        "title": "APM Stats - Request latency HOP",
        "title_size": "16",
        "title_align": "left",
        "show_legend": false,
        "type": "distribution",
        "xaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" },
        "yaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" },
        "requests": [
          {
            "query": {
              "primary_tag_value": "*",
              "stat": "latency_distribution",
              "data_source": "apm_resource_stats",
              "name": "query1",
              "service": "azure-bill-import",
              "group_by": [ "resource_name" ],
              "env": "staging",
              "primary_tag_name": "datacenter",
              "operation_name": "universal.http.client"
            },
            "request_type": "histogram",
            "style": { "palette": "dog_classic" }
          }
        ]
      },
      "layout": { "x": 8, "y": 0, "width": 4, "height": 2 }
    }
  ],
  "layout_type": "ordered"
}
```

##### Create a distribution widget using a histogram request containing a formulas and functions events query

```
{
  "title": "Example-Dashboard",
  "description": "Example-Dashboard",
  "widgets": [
    {
      "definition": {
        "title": "Events Platform - Request latency HOP",
        "title_size": "16",
        "title_align": "left",
        "show_legend": false,
        "type": "distribution",
        "xaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" },
        "yaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" },
        "requests": [
          {
            "query": {
              "search": { "query": "" },
              "data_source": "events",
              "compute": { "metric": "@duration", "aggregation": "min" },
              "name": "query1",
              "indexes": [ "*" ],
              "group_by": []
            },
            "request_type": "histogram"
          }
        ]
      },
      "layout": { "x": 0, "y": 0, "width": 4, "height": 2 }
    }
  ],
  "layout_type": "ordered"
}
```
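For comparison with the examples above, the following is a minimal sketch of a request body containing a top list widget, assembled only from the fields documented in this schema (a formulas and functions metrics query, a formula `limit`, `response_format`, `sort`, and the top list `style` options). The dashboard title, widget title, metric, and grouping tag are placeholder values rather than an official example.

```
{
  "title": "Example-Toplist-Dashboard",
  "layout_type": "ordered",
  "widgets": [
    {
      "definition": {
        "title": "Top CPU consumers by host",
        "type": "toplist",
        "requests": [
          {
            "formulas": [
              {
                "formula": "query1",
                "limit": { "count": 10, "order": "desc" }
              }
            ],
            "queries": [
              {
                "data_source": "metrics",
                "name": "query1",
                "query": "avg:system.cpu.user{*} by {host}",
                "aggregator": "avg"
              }
            ],
            "response_format": "scalar",
            "sort": {
              "count": 10,
              "order_by": [
                { "type": "formula", "index": 0, "order": "desc" }
              ]
            }
          }
        ],
        "style": {
          "display": { "type": "flat" },
          "palette": "dog_classic",
          "scaling": "absolute"
        }
      },
      "layout": { "x": 0, "y": 0, "width": 4, "height": 2 }
    }
  ]
}
```

In this sketch, the `sort` block keeps only the ten largest values of the formula, and the flat `display` option corresponds to the classic single-bar-per-row top list rendering; a stacked display would instead use `"display": { "type": "stacked", "legend": "automatic" }`.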
### Response

* [200](https://docs.datadoghq.com/api/latest/dashboards/#CreateDashboard-200-v1)
* [400](https://docs.datadoghq.com/api/latest/dashboards/#CreateDashboard-400-v1)
* [403](https://docs.datadoghq.com/api/latest/dashboards/#CreateDashboard-403-v1)
* [429](https://docs.datadoghq.com/api/latest/dashboards/#CreateDashboard-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/dashboards/)
* [Example](https://docs.datadoghq.com/api/latest/dashboards/)

A dashboard is Datadog’s tool for visually tracking, analyzing, and displaying key performance metrics, which enable you to monitor the health of your infrastructure. Field Type Description author_handle string Identifier of the dashboard author. author_name string Name of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string ID of the dashboard. is_read_only boolean **DEPRECATED** : Whether this dashboard is read-only. If True, only the author and admins can make changes to it. This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type [_required_] enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. notify_list [string] List of handles of users to notify when changes are made to this dashboard. reflow_type enum Reflow type for a **new dashboard layout** dashboard. Set this only when layout type is 'ordered'. If set to 'fixed', the dashboard expects all widgets to have a layout, and if it's set to 'auto', widgets should not have layouts. Allowed enum values: `auto,fixed` restricted_roles [string] A list of role identifiers. Only the author and users associated with at least one of these roles can edit this dashboard. tags [string] List of team names representing ownership of a dashboard. template_variable_presets [object] Array of template variables saved views. name string The name of the variable. template_variables [object] List of variables. name string The name of the variable. value string **DEPRECATED** : (deprecated) The value of the template variable within the saved view. Cannot be used in conjunction with `values`. values [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. Cannot be used in conjunction with `value`. template_variables [object] List of template variables for this dashboard. available_values [string] The list of values that the template variable drop-down is limited to. default string **DEPRECATED** : (deprecated) The default value for the template variable on dashboard load. Cannot be used in conjunction with `defaults`. defaults [string] One or many default values for template variables on load. If more than one default is specified, they will be unioned together with `OR`. Cannot be used in conjunction with `default`. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. type string The type of variable.
This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). title [_required_] string Title of the dashboard. url string The URL of the dashboard. widgets [_required_] [object] List of widgets to display on the dashboard. definition [_required_] [Definition of the widget](https://docs.datadoghq.com/dashboards/widgets/). Option 1 object Alert graphs are timeseries graphs showing the current status of any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the alert graph widget. Allowed enum values: `alert_graph` default: `alert_graph` viz_type [_required_] enum Whether to display the Alert Graph as a timeseries or a top list. Allowed enum values: `timeseries,toplist` Option 2 object Alert values are query values showing the current value of the metric in any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. precision int64 Number of decimal to show. If not defined, will use the raw value. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of value in the widget. type [_required_] enum Type of the alert value widget. Allowed enum values: `alert_value` default: `alert_value` unit string Unit to display with the value. Option 3 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. 
See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. 
palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. 
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 4 object Check status shows the current status or number of results for any check performed. check [_required_] string Name of the check to use in the widget. group string Group reporting a single check. group_by [string] List of tag prefixes to group by in the case of a cluster check. grouping [_required_] enum The kind of grouping to use. Allowed enum values: `check,cluster` tags [string] List of tags used to filter the groups reporting a cluster check. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the check status widget. Allowed enum values: `check_status` default: `check_status` Option 5 object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. 
label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED**: The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method.
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name.
limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries.
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. 
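As a rough illustration of how the query options above are referenced from formulas, the sketch below builds a `queries` array holding one metrics query (Option 1) and one events query (Option 2), then combines them in a `formulas` entry. Field names follow the schema above; the metric, search string, query names, and alias are invented for the example.

```python
# Hypothetical request fragment pairing formula-and-functions queries
# (Options 1 and 2 above) with a formula that references them by name.
# The metric, search string, names, and alias are invented.
request = {
    "queries": [
        {
            # Option 1: metrics query
            "data_source": "metrics",
            "name": "requests",
            "query": "sum:trace.http.request.hits{*}.as_count()",
        },
        {
            # Option 2: events (logs) query
            "data_source": "logs",
            "name": "errors",
            "compute": {"aggregation": "count"},
            "search": {"query": "status:error"},
        },
    ],
    # Formulas operate on the queries declared above, referenced by name.
    "formulas": [{"formula": "errors / requests * 100", "alias": "error_rate"}],
    "response_format": "timeseries",
}
print(request)
```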
stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name.
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED**: The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. The only possible value is `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero.
label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` Option 6 object The event stream is a widget version of the stream of events on the Event Stream view. Only available on FREE layout dashboards. event_size enum Size to use to display an event. Allowed enum values: `s,l` query [_required_] string Query to filter the event stream with. tags_execution string The execution method for multi-value filters. Can be either `and` or `or`. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event stream widget. Allowed enum values: `event_stream` default: `event_stream` Option 7 object The event timeline is a widget version of the timeline that appears at the top of the Event Stream view. Only available on FREE layout dashboards. query [_required_] string Query to filter the event timeline with. tags_execution string The execution method for multi-value filters. Can be either `and` or `or`. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch.
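To make the event stream option above concrete, here is a hedged sketch of a minimal `event_stream` widget definition. It uses only fields documented for Option 6; the query string and title are invented.

```python
# Hypothetical minimal event stream widget definition (Option 6 above).
# Only documented fields are used; the query string and title are invented.
event_stream_definition = {
    "type": "event_stream",
    "query": "tags:deployment env:production",
    "event_size": "l",            # "s" or "l"
    "tags_execution": "and",      # "and" or "or"
    "time": {"live_span": "1d"},
    "title": "Production deployments",
}
print(event_stream_definition)
```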
type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event timeline widget. Allowed enum values: `event_timeline` default: `event_timeline` Option 8 object Free text is a widget that allows you to add headings to your screenboard. Commonly used to state the overall purpose of the dashboard. Only available on FREE layout dashboards. color string Color of the text. font_size string Size of the text. text [_required_] string Text to display. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the free text widget. Allowed enum values: `free_text` default: `free_text` Option 9 object The funnel visualization displays a funnel of user sessions that maps a sequence of view navigation and user interaction in your application. requests [_required_] [object] Request payload used to query items. query [_required_] object Updated funnel widget. data_source [_required_] enum Source from which to query items to display in the funnel. Allowed enum values: `rum` default: `rum` query_string [_required_] string The widget query. steps [_required_] [object] List of funnel steps. facet [_required_] string The facet of the step. value [_required_] string The value of the step. request_type [_required_] enum Widget request type. Allowed enum values: `funnel` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of funnel widget. Allowed enum values: `funnel` default: `funnel` Option 10 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. 
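The free text widget (Option 8) and the funnel widget (Option 9) described above are simple enough to show end to end. The sketch below is illustrative: the text, title, and RUM funnel steps are invented, and only fields documented for those two options are used.

```python
# Hypothetical free text widget (Option 8) and funnel widget (Option 9)
# definitions. Text, title, and the RUM query/steps are invented.
free_text_definition = {
    "type": "free_text",
    "text": "Checkout service overview",
    "font_size": "24",
    "text_align": "left",
}

funnel_definition = {
    "type": "funnel",
    "title": "Checkout funnel",
    "requests": [
        {
            "request_type": "funnel",
            "query": {
                "data_source": "rum",
                "query_string": "@type:view",
                "steps": [
                    {"facet": "@view.name", "value": "/cart"},
                    {"facet": "@view.name", "value": "/checkout"},
                ],
            },
        }
    ],
    "time": {"live_span": "1w"},
}
print(free_text_definition)
print(funnel_definition)
```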
requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. 
Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries.
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced.
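As an illustration of the `sort` controls documented above, the sketch below combines a formula sort entry (Option 1) with a group sort entry (Option 2); the group name and count are invented.

```python
# Hypothetical "sort" block for a widget request, as documented above:
# one formula sort entry (Option 1) and one group sort entry (Option 2).
# The group name and count are invented.
sort = {
    "count": 25,
    "order_by": [
        {"type": "formula", "index": 0, "order": "desc"},  # sort by the first formula
        {"type": "group", "name": "country_iso_code", "order": "asc"},
    ],
}
print(sort)
```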
type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 11 object The groups widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the group title. banner_img string URL of image to display as a banner for the group. layout_type [_required_] enum Layout type of the group. Allowed enum values: `ordered` show_title boolean Whether to show the title or not. default: `true` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the group widget. Allowed enum values: `group` default: `group` widgets [_required_] [object] List of widget groups. Option 12 object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. 
q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either `and` or `or`. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator.
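For reference, a `conditional_formats` list following the schema above might look like the hedged sketch below; the thresholds are invented.

```python
# Hypothetical conditional_formats entries matching the schema above:
# values above 95 use the "green" palette, values below 80 use "red".
# The thresholds are invented.
conditional_formats = [
    {"comparator": ">", "value": 95, "palette": "green"},
    {"comparator": "<", "value": 80, "palette": "red"},
]
print(conditional_formats)
```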
formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query.
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. 
aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. 
palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 13 object The host map widget graphs any metric across your hosts using the same visualization available from the main Host Map page. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group [string] List of tag prefixes to group by. no_group_hosts boolean Whether to show the hosts that don’t fit in a group. no_metric_hosts boolean Whether to show the hosts with no metrics. node_type enum Which type of node to use in the map. Allowed enum values: `host,container` notes string Notes on the title. requests [_required_] object List of definitions. fill object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. size object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. 
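For the host map `requests` object described above, `fill` drives the node color and `size` the node size. A minimal sketch using plain metric queries (`q`) follows; the metric names are placeholders.

```json
{
  "requests": {
    "fill": { "q": "avg:system.cpu.user{*} by {host}" },
    "size": { "q": "avg:system.mem.used{*} by {host}" }
  }
}
```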
network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. scope [string] List of tags used to filter the map. style object The style to apply to the widget. fill_max string Max value to use to color the map. fill_min string Min value to use to color the map. palette string Color palette to apply to the widget. palette_flip boolean Whether to flip the palette tones. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the host map widget. Allowed enum values: `hostmap` default: `hostmap` Option 14 object The iframe widget allows you to embed a portion of any other web page on your dashboard. Only available on FREE layout dashboards. type [_required_] enum Type of the iframe widget. Allowed enum values: `iframe` default: `iframe` url [_required_] string URL of the iframe. Option 15 object The image widget allows you to embed an image on your dashboard. An image can be a PNG, JPG, or animated GIF. Only available on FREE layout dashboards. has_background boolean Whether to display a background or not. default: `true` has_border boolean Whether to display a border or not. default: `true` horizontal_align enum Horizontal alignment. Allowed enum values: `center,left,right` margin enum Size of the margins around the image. **Note** : `small` and `large` values are deprecated. Allowed enum values: `sm,md,lg,small,large` sizing enum How to size the image on the widget. The values are based on the image `object-fit` CSS properties. **Note** : `zoom`, `fit` and `center` values are deprecated. Allowed enum values: `fill,contain,cover,none,scale-down,zoom,fit,center` type [_required_] enum Type of the image widget. Allowed enum values: `image` default: `image` url [_required_] string URL of the image. url_dark_theme string URL of the image in dark mode. vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 16 object The list stream visualization displays a table of recent events in your application that match a search criteria using user-defined columns. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". requests [_required_] [object] Request payload used to query items. columns [_required_] [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` query [_required_] object Updated list stream widget. 
clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format [_required_] enum Widget response format. Allowed enum values: `event_list` show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the list stream widget. Allowed enum values: `list_stream` default: `list_stream` Option 17 object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. 
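Tying together the list stream fields described above, a minimal, illustrative `list_stream` widget definition might look like the following; the title, query string, and column fields are placeholders.

```json
{
  "definition": {
    "type": "list_stream",
    "title": "Recent error logs",
    "requests": [
      {
        "response_format": "event_list",
        "columns": [
          { "field": "timestamp", "width": "auto" },
          { "field": "message", "width": "full" }
        ],
        "query": {
          "data_source": "logs_stream",
          "query_string": "status:error",
          "indexes": []
        }
      }
    ]
  }
}
```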
show_date_column boolean Whether to show the date column or not. show_message_column boolean Whether to show the message column or not. sort object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` Option 18 object The monitor summary widget displays a summary view of all your Datadog monitors, or a subset based on a query. Only available on FREE layout dashboards. color_preference enum Which color to use on the widget. Allowed enum values: `background,text` count int64 **DEPRECATED** : The number of monitors to display. display_format enum What to display on the widget. Allowed enum values: `counts,countsAndList,list` hide_zero_counts boolean Whether to show counts of 0 or not. query [_required_] string Query to filter the monitors with. show_last_triggered boolean Whether to show the time that has elapsed since the monitor/group triggered. show_priority boolean Whether to show the priorities column. sort enum Widget sorting methods. Allowed enum values: `name,group,status,tags,triggered,group,asc,group,desc,name,asc,name,desc,status,asc,status,desc,tags,asc,tags,desc,triggered,asc,triggered,desc,priority,asc,priority,desc` start int64 **DEPRECATED** : The start of the list. Typically 0. summary_type enum Which summary type should be used. Allowed enum values: `monitors,groups,combined` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the monitor summary widget. Allowed enum values: `manage_status` default: `manage_status` Option 19 object The notes and links widget is similar to the free text widget, but allows for more formatting options. background_color string Background color of the note. content [_required_] string Content of the note. font_size string Size of the text. has_padding boolean Whether to add padding or not. default: `true` show_tick boolean Whether to show a tick or not. text_align enum How to align the text on the widget.
Allowed enum values: `center,left,right` tick_edge enum Define how you want to align the text on the widget. Allowed enum values: `bottom,left,right,top` tick_pos string Where to position the tick on an edge. type [_required_] enum Type of the note widget. Allowed enum values: `note` default: `note` vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 20 object The powerpack widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the powerpack title. banner_img string URL of image to display as a banner for the powerpack. powerpack_id [_required_] string UUID of the associated powerpack. show_title boolean Whether to show the title or not. default: `true` template_variables object Powerpack template variables. controlled_by_powerpack [object] Template variables controlled at the powerpack level. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. controlled_externally [object] Template variables controlled by the external resource, such as the dashboard this powerpack is on. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. title string Title of the widget. type [_required_] enum Type of the powerpack widget. Allowed enum values: `powerpack` default: `powerpack` Option 21 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. 
cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. 
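The `cell_display_mode` and `style` options above can be combined on a single formula entry, as in this illustrative sketch; the query name, palette, and option values are placeholders.

```json
{
  "formulas": [
    {
      "formula": "query1",
      "cell_display_mode": "trend",
      "cell_display_mode_options": { "trend_type": "area", "y_scale": "shared" },
      "style": { "palette": "dog_classic", "palette_index": 0 }
    }
  ]
}
```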
network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute.
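A minimal sketch of the formula and functions metrics query (Option 1 above); the metric, tag filter, and query name are placeholders.

```json
{
  "queries": [
    {
      "data_source": "metrics",
      "name": "query1",
      "query": "avg:system.cpu.user{env:prod}",
      "aggregator": "avg"
    }
  ]
}
```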
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. 
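For the events query (Option 2 above), a sketch with a `logs` data source might look like the following; the search string, facet, and query name are placeholders, and `"indexes": []` queries all indexes as noted above.

```json
{
  "queries": [
    {
      "data_source": "logs",
      "name": "errors",
      "compute": { "aggregation": "count" },
      "search": { "query": "status:error" },
      "indexes": [],
      "group_by": [
        {
          "facet": "service",
          "limit": 5,
          "sort": { "aggregation": "count", "order": "desc" }
        }
      ]
    }
  ]
}
```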
primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` Option 22 object The run workflow widget allows you to run a workflow from a dashboard. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. inputs [object] Array of workflow inputs to map to dashboard template variables. 
name [_required_] string Name of the workflow input. value [_required_] string Dashboard template variable. Can be suffixed with '.value' or '.key'. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the run workflow widget. Allowed enum values: `run_workflow` default: `run_workflow` workflow_id [_required_] string Workflow id. Option 23 object Use the SLO List widget to track your SLOs (Service Level Objectives) on dashboards. requests [_required_] [object] Array of one request object to display in the widget. query [_required_] object Updated SLO List widget. limit int64 Maximum number of results to display in the table. default: `100` query_string [_required_] string Widget query. sort [object] Options for sorting results. column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` request_type [_required_] enum Widget request type. Allowed enum values: `slo_list` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO List widget. Allowed enum values: `slo_list` default: `slo_list` Option 24 object Use the SLO and uptime widget to track your SLOs (Service Level Objectives) and uptime on screenboards and timeboards. additional_query_filters string Additional filters applied to the SLO query. global_time_target string Defined global time target. show_error_budget boolean Defined error budget. slo_id string ID of the SLO displayed. time_windows [string] Times being monitored. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO widget. Allowed enum values: `slo` default: `slo` view_mode enum Define how you want the SLO to be displayed. Allowed enum values: `overall,component,both` view_type [_required_] string Type of view displayed by the widget. default: `detail` Option 25 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. 
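As a worked example of the run workflow widget fields listed above (Option 22), a definition might be assembled as in the sketch below; the workflow ID and the template variable name are placeholders. The scatter plot widget listing (Option 25) continues after the sketch.

```python
# Hedged sketch of a run_workflow widget definition using the fields
# documented above. The workflow_id and template variable are placeholders.
run_workflow_widget = {
    "type": "run_workflow",
    "workflow_id": "00000000-0000-0000-0000-000000000000",  # placeholder workflow ID
    "inputs": [
        {
            "name": "environment",   # placeholder workflow input name
            "value": "$env.value",   # dashboard template variable, suffixed with .value or .key
        }
    ],
    "time": {"live_span": "1h"},     # time setting, Option 1: a live span
    "title": "Run remediation workflow",
}
```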
custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. 
Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
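The `apm_query`, `log_query`, `network_query`, `rum_query`, and `security_query` objects repeated throughout this schema all share one log-query shape (`compute` or `multi_compute`, `group_by`, `index`, and `search`). A minimal sketch with placeholder facet and search values:

```python
# Hedged sketch of the shared log-query shape used by apm_query, log_query,
# rum_query, security_query, and similar fields above. The facet and the
# search value are placeholders.
log_query = {
    "index": "*",                         # comma-separated index names; "*" queries all indexes
    "compute": {                          # mutually exclusive with multi_compute
        "aggregation": "count",
        "interval": 60,                   # seconds
    },
    "group_by": [
        {
            "facet": "host",              # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},  # placeholder search value
}
```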
search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. 
include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 26 object This widget displays a map of a service to all of the services that call it, and all of the services that it calls. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. filters [_required_] [string] Your environment and primary tag (or * if enabled for your account). service [_required_] string The ID of the service you want to map. title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service map widget. Allowed enum values: `servicemap` default: `servicemap` Option 27 object The service summary displays the graphs of a chosen service in your screenboard. Only available on FREE layout dashboards. display_format enum Number of columns to display. Allowed enum values: `one_column,two_column,three_column` env [_required_] string APM environment. service [_required_] string APM service. show_breakdown boolean Whether to show the latency breakdown or not. show_distribution boolean Whether to show the latency distribution or not. show_errors boolean Whether to show the error metrics or not. show_hits boolean Whether to show the hits metrics or not. show_latency boolean Whether to show the latency metrics or not. show_resource_list boolean Whether to show the resource list or not. size_format enum Size of the widget. Allowed enum values: `small,medium,large` span_name [_required_] string APM span name. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. 
Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service summary widget. Allowed enum values: `trace_service` default: `trace_service` Option 28 object The split graph widget allows you to create repeating units of a graph - one for each value in a group (for example: one per service). has_uniform_y_axes boolean Normalize y axes across graphs. size [_required_] enum Size of the individual graphs in the split. Allowed enum values: `xs,sm,md,lg` source_widget_definition [_required_] The original widget we are splitting on. Option 1 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
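Putting the split graph fields above (Option 28) together, a partial definition might look like the sketch below. Only the fields listed in this schema are shown, and the inner source widget request is a placeholder; the change widget request listing continues after the sketch.

```python
# Hedged, partial sketch of a split graph configuration using only the
# fields documented above (size, has_uniform_y_axes, source_widget_definition).
# The inner request query is a placeholder.
split_graph = {
    "size": "md",                      # xs, sm, md, or lg
    "has_uniform_y_axes": True,        # normalize y axes across graphs
    "source_widget_definition": {      # the original widget being split
        "type": "change",
        "requests": [
            {"q": "avg:system.cpu.user{*} by {service}"}  # placeholder query
        ],
    },
}
```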
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog dashboard documentation. palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. 
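A single change widget request combining the display options documented above might be sketched as follows; the metrics query string is a placeholder. The request field listing continues below.

```python
# Hedged sketch of one change-widget request built from the fields above.
# The metrics query string is a placeholder.
change_request = {
    "q": "avg:system.cpu.user{*} by {service}",  # placeholder metrics query
    "change_type": "absolute",                   # or "relative"
    "compare_to": "hour_before",                 # hour_before, day_before, week_before, month_before
    "increase_good": False,                      # whether an increase should be shown as good
    "order_by": "change",                        # change, name, present, or past
    "order_dir": "desc",
    "show_present": True,                        # also display the present value
}
```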
queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. 
name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
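The SLO (Option 6) and Cloud Cost (Option 7) query shapes above can be written out the same way. A minimal sketch follows; the SLO ID and the cost query string are placeholders. The field listing continues after the sketch.

```python
# Hedged sketch of the SLO measures and Cloud Cost query objects documented
# above. The slo_id and the cost query string are placeholders.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456",            # placeholder SLO ID
    "measure": "slo_status",             # good_events, burn_rate, error_budget_remaining, ...
    "group_mode": "overall",             # or "components"
    "slo_query_type": "metric",          # metric, monitor, or time_slice
    "name": "slo_status_query",
}

cloud_cost_query = {
    "data_source": "cloud_cost",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",  # placeholder cost query
    "aggregator": "sum",
    "name": "cloud_costs",
}
```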
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 2 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. 
URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. 
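As a concrete illustration of the `formulas`/`queries` request shape described above (here, a geomap region layer, which must group by a tag whose values are country ISO codes), a rough sketch with placeholder metric and tag names:

```python
# A rough sketch of a region-layer request built from the `formulas` and `queries`
# fields described above. The metric name and the country-code tag are placeholders.
region_layer_request = {
    "formulas": [
        {"formula": "query1", "limit": {"count": 250, "order": "desc"}},
    ],
    "queries": [
        {
            "data_source": "metrics",
            "name": "query1",
            # Grouped by a tag whose values are country ISO codes, as the
            # region layer requires.
            "query": "sum:app.requests{*} by {country_iso_code}",
        }
    ],
    "response_format": "scalar",
}
```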
limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. 
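For the SLO measures query (Option 6 above), a minimal sketch; the SLO ID is a placeholder, and the other values come from the enums listed above:

```python
# A minimal sketch of an SLO measures query (Option 6 above).
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456abc789ab",   # placeholder SLO ID
    "measure": "error_budget_remaining",
    "group_mode": "overall",
    "slo_query_type": "metric",
    "name": "slo_budget",
}
```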
limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. 
type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 3 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
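Putting the geomap pieces together, a rough sketch of a complete geomap widget definition built from the fields described above (the query, title, and palette name are placeholders, not defaults):

```python
# A rough end-to-end sketch of a geomap widget definition using the fields
# described above. The query, title, and palette name are placeholders.
geomap_definition = {
    "type": "geomap",
    "title": "Requests by country",
    "requests": [
        {
            "formulas": [{"formula": "query1"}],
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "query1",
                    "query": "sum:app.requests{*} by {country_iso_code}",
                }
            ],
            "response_format": "scalar",
        }
    ],
    "style": {"palette": "hostmap_blues", "palette_flip": False},
    "view": {"focus": "WORLD"},
}
```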
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries.
This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. 
Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
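For the APM resource stats query (Option 5 above), a short sketch with placeholder environment, service, and operation names:

```python
# A short sketch of an APM resource stats query (Option 5 above).
apm_resource_stats_query = {
    "data_source": "apm_resource_stats",
    "name": "query1",
    "env": "prod",                     # placeholder environment
    "service": "web-store",            # placeholder service name
    "operation_name": "rack.request",  # placeholder operation name
    "stat": "latency_p95",
    "group_by": ["resource_name"],
}
```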
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. 
Allowed enum values: `query_value` default: `query_value` Option 4 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. 
Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
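For comparison with the legacy `x`/`y` requests enumerated here, the formulas-and-functions `table` request for a scatterplot (described earlier in this option) can be sketched roughly as follows; the metric names are placeholders:

```python
# A rough sketch of the scatterplot `table` request: two metric queries mapped
# to the x and y dimensions through formulas. Metric names are placeholders.
scatterplot_table_request = {
    "formulas": [
        {"formula": "query_x", "dimension": "x"},
        {"formula": "query_y", "dimension": "y"},
    ],
    "queries": [
        {
            "data_source": "metrics",
            "name": "query_x",
            "query": "avg:system.cpu.user{*} by {host}",
            "aggregator": "avg",
        },
        {
            "data_source": "metrics",
            "name": "query_y",
            "query": "avg:system.mem.used{*} by {host}",
            "aggregator": "avg",
        },
    ],
    "response_format": "scalar",
}
```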
search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. 
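The widget `time` setting accepts three shapes, as listed above: a preset `live_span` wrapper, an arbitrary live span (`type: "live"` with `unit` and `value`), or a fixed span (`type: "fixed"` with epoch `from`/`to`). A sketch of each, with illustrative values:

```python
# Three alternative "time" settings for a widget, per the options above.
time_live_span = {"live_span": "1h"}      # Option 1: preset timeframe

time_arbitrary_live = {                   # Option 2: arbitrary live span
    "type": "live",
    "unit": "minute",
    "value": 17,                          # e.g. "the last 17 minutes"
}

time_fixed = {                            # Option 3: fixed span
    "type": "fixed",
    "from": 1709251200,                   # start, seconds since epoch
    "to": 1709769600,                     # end, seconds since epoch
}
```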
include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 5 object Sunbursts are well suited to highlighting how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds.
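Pulling the scatter plot pieces together from the fields above (`requests` with `q` and `aggregator`, `xaxis`, `yaxis`, `type: "scatterplot"`): a minimal sketch with illustrative metric queries. The `x` request is an assumption here, mirroring the documented `y` request.

```python
# Minimal scatter plot widget definition sketch; queries are placeholders.
scatterplot_definition = {
    "type": "scatterplot",
    "title": "Latency vs. load",
    "requests": {
        # "x" is assumed to take the same shape as the documented "y" request.
        "x": {"q": "avg:system.cpu.user{*} by {host}", "aggregator": "avg"},
        "y": {"q": "avg:system.load.1{*} by {host}", "aggregator": "max"},
    },
    "xaxis": {"include_zero": True, "scale": "linear", "min": "auto", "max": "auto"},
    "yaxis": {"include_zero": True, "scale": "log", "label": "load"},
}
```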
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
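A `conditional_formats` entry pairs a comparator and threshold value with a palette; the `custom_bg`/`custom_text` palettes take the corresponding custom color fields. A sketch using only the documented fields; thresholds and colors are illustrative.

```python
# One conditional format per threshold, applied to the request's values.
conditional_formats = [
    {"comparator": ">", "value": 500.0, "palette": "white_on_red"},
    {"comparator": "<=", "value": 100.0, "palette": "white_on_green"},
    {
        "comparator": ">=",
        "value": 100.0,
        "palette": "custom_bg",
        "custom_bg_color": "#f5a623",   # used because the palette is custom_bg
        "hide_value": False,
    },
]
```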
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
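The `number_format` block described at the top of this span chooses between a canonical unit and a custom unit label, optionally with a `unit_scale`. A sketch of both shapes; the unit names are illustrative.

```python
# Option 1: a canonical unit (for example, bytes per request).
number_format_canonical = {
    "unit": {
        "type": "canonical_unit",
        "unit_name": "byte",
        "per_unit_name": "request",
    },
    "unit_scale": {"type": "canonical_unit", "unit_name": "kibibyte"},
}

# Option 2: a free-form custom unit label.
number_format_custom = {
    "unit": {"type": "custom_unit_label", "label": "widgets"},
}
```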
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. 
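Entries in `queries` are named so that formulas can reference them. A sketch pairing a metrics query (Option 1) with an event platform query (Option 2) and a formula that combines them; the names, query strings, and facets are illustrative.

```python
# Named queries, referenced from the formulas list by name.
queries = [
    {   # Option 1: metrics query
        "data_source": "metrics",
        "name": "cpu",
        "query": "avg:system.cpu.user{*} by {service}",
        "aggregator": "avg",
    },
    {   # Option 2: event platform query (here, logs)
        "data_source": "logs",
        "name": "errors",
        "compute": {"aggregation": "count"},
        "search": {"query": "status:error"},
        "group_by": [{"facet": "service", "limit": 10,
                      "sort": {"aggregation": "count", "order": "desc"}}],
        "indexes": [],                     # empty list queries all indexes
    },
]

formulas = [
    {"formula": "errors / cpu", "alias": "errors per CPU unit",
     "limit": {"count": 10, "order": "desc"}},
]
```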
limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. 
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 6 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. 
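Assembling Option 5: a compact sunburst definition with a single formulas-and-queries request, an inline legend, and the total shown. Field names follow the schema above; the query, facets, and title are illustrative.

```python
# Minimal sunburst widget definition sketch based on the Option 5 fields.
sunburst_definition = {
    "type": "sunburst",
    "title": "Log volume by service and status",
    "hide_total": False,
    "legend": {"type": "inline", "hide_value": False, "hide_percent": False},
    "requests": [
        {
            "response_format": "scalar",
            "queries": [
                {
                    "data_source": "logs",
                    "name": "volume",
                    "compute": {"aggregation": "count"},
                    "group_by": [
                        {"facet": "service", "limit": 10,
                         "sort": {"aggregation": "count", "order": "desc"}},
                        {"facet": "status", "limit": 5,
                         "sort": {"aggregation": "count", "order": "desc"}},
                    ],
                }
            ],
            "formulas": [{"formula": "volume"}],
        }
    ],
}
```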
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. 
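Option 5 above (APM resource stats) targets a service and environment pair plus one stat, optionally grouped and filtered by the second primary tag. A sketch with placeholder service, environment, and operation names; `primary_tag_name` and `primary_tag_value` must be set together.

```python
# Illustrative APM resource stats query (Option 5); names are placeholders.
apm_resource_stats_query = {
    "data_source": "apm_resource_stats",
    "name": "checkout_latency",
    "env": "prod",
    "service": "checkout",
    "operation_name": "rack.request",
    "stat": "latency_p95",
    "group_by": ["resource_name"],
    # Optional second primary tag filter; both fields must be provided together.
    "primary_tag_name": "datacenter",
    "primary_tag_value": "us1.prod",
}
```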
aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. 
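The table widget's `sort` block caps the number of rows and orders them by formulas and/or groups, as sketched here; the formula index and group name are illustrative.

```python
# Sort controls for a query_table widget: limit rows, order by the first
# formula (descending), then by the "service" group (ascending).
table_sort = {
    "count": 50,
    "order_by": [
        {"type": "formula", "index": 0, "order": "desc"},
        {"type": "group", "name": "service", "order": "asc"},
    ],
}
```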
Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 7 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: a severity (`error`, `warning`, `ok`, or `info`) and a line type (`dashed`, `solid`, or `bold`). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value `y = 15` or a range of values `0 < y < 10`. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name.
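Markers overlay horizontal lines or bands on a timeseries widget, and `events` overlays matching events on the graph; `display_type` combines a severity with a line type. A sketch with illustrative query, values, and labels (the exact marker strings are assumptions based on the field descriptions above).

```python
# Event overlay and markers for a timeseries widget, using the fields above.
timeseries_events = [
    {"q": "sources:github tags:deploy", "tags_execution": "and"},
]

timeseries_markers = [
    {"display_type": "error dashed", "value": "y = 500", "label": "SLO limit"},
    {"display_type": "ok solid", "value": "0 < y < 100", "label": "normal range"},
]
```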
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. 
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. 
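A single timeseries request combines the fields above: a display type, an optional right-hand y-axis flag, named queries, and formulas. A minimal sketch; the metric query, names, and alias are illustrative.

```python
# One timeseries request: a named metrics query rendered as bars on the
# right-hand y-axis. Query string and alias are placeholders.
timeseries_request = {
    "display_type": "bars",
    "on_right_yaxis": True,
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "metrics",
            "name": "hits",
            "query": "sum:trace.rack.request.hits{env:prod}.as_count()",
        }
    ],
    "formulas": [{"formula": "hits", "alias": "requests"}],
}
```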
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 8 object The top list visualization enables you to display a list of tag values, like hostname or service, with the most or least of any metric value, such as the highest consumers of CPU or the hosts with the least disk space. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link.
URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. 
primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 9 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` split_config [_required_] object Encapsulates all user choices about how to split a graph. limit [_required_] int64 Maximum number of graphs to display in the widget. sort [_required_] object Controls the order in which graphs appear in the split. compute object Defines the metric and aggregation used as the sort value. aggregation [_required_] string How to aggregate the sort metric for the purposes of ordering. metric [_required_] string The metric to use for sorting graphs. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` split_dimensions [_required_] [object] The dimension(s) on which to split the graph one_graph_per [_required_] string The system interprets this attribute differently depending on the data source of the query being split. For metrics, it's a tag. For the events platform, it's an attribute or tag. 
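To show how the `split_config` fields above fit together, here is a minimal sketch written as a Python dictionary (the JSON shape sent to the API is the same). The aggregation, metric name, and the `service` tag used for `one_graph_per` are placeholder values chosen for illustration, not part of the schema; the optional `static_splits` field, described next, is omitted.

```python
# Illustrative only: a split_config object shaped like the schema above.
# "avg", "system.cpu.user", and "service" are placeholder values.
split_config = {
    "limit": 24,  # maximum number of graphs to display in the widget
    "sort": {
        "compute": {
            "aggregation": "avg",         # how to aggregate the sort metric
            "metric": "system.cpu.user",  # metric used to order the graphs
        },
        "order": "desc",                  # widget sorting method: asc or desc
    },
    # For metrics queries, one_graph_per is a tag; for the events
    # platform, it is an attribute or tag.
    "split_dimensions": [{"one_graph_per": "service"}],
}
```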
static_splits [array] Manual selection of tags that makes the split graph widget static. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the split graph widget. Allowed enum values: `split_group` default: `split_group` Option 29 object Sunbursts are well suited to highlighting how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds.
search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. 
query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 30 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. 
Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. 
index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 31 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string A combination of a severity (error, warning, ok, or info) and a line type (dashed, solid, or bold). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value (`y = 15`) or a range of values (`0 < y < 10`). For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90).
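The timeseries widget fields above (legend options, events, and markers) come together with one or more requests, whose fields (`q`, `display_type`, `style`, and so on) are detailed next. Below is a minimal sketch of such a widget as a plain Python dictionary mirroring the JSON shape; the metric query, marker threshold, palette name, legend columns, and the surrounding `definition` wrapper are illustrative assumptions rather than values taken from this reference.

```python
import json

# Sketch of a timeseries widget definition (Option 31) using the classic `q`
# request form. Metric, marker, palette, and legend values are examples only.
timeseries_widget = {
    "definition": {                     # assumed wrapper used when placing the widget on a dashboard
        "type": "timeseries",
        "title": "CPU usage by host",
        "title_align": "left",
        "legend_layout": "auto",
        "legend_columns": ["avg", "max", "value"],   # assumed column names
        "requests": [
            {
                "q": "avg:system.cpu.user{*} by {host}",   # illustrative widget query
                "display_type": "line",
                "style": {
                    "palette": "dog_classic",              # assumed palette name; any palette string is accepted
                    "line_type": "solid",
                    "line_width": "normal",
                },
            }
        ],
        "markers": [
            {
                "display_type": "error dashed",   # severity + line type
                "label": "Alert threshold",
                "value": "y = 90",                # single value, or a range such as "0 < y < 10"
            }
        ],
        "yaxis": {"include_zero": True, "scale": "linear", "min": "auto", "max": "auto"},
        "time": {"live_span": "4h"},
    }
}

print(json.dumps(timeseries_widget, indent=2))
```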
requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. 
expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. 
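The metrics query (Option 1) and event platform query (Option 2) shapes above can be referenced by name from a formula within a single widget request. A minimal sketch follows; the `system.cpu.user` metric, the `status:error` log search, and the query names `cpu` and `errors` are assumptions chosen only so the formulas have something to reference.

```python
import json

# Sketch of a formula-and-functions timeseries request that pairs a metrics
# query with a logs (event platform) query and exposes both through formulas.
formula_request = {
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "metrics",
            "name": "cpu",                                        # referenced by the first formula
            "query": "avg:system.cpu.user{service:web} by {host}",  # illustrative metrics query
        },
        {
            "data_source": "logs",
            "name": "errors",                                     # referenced by the second formula
            "compute": {"aggregation": "count"},
            "search": {"query": "status:error service:web"},      # illustrative log search
            "group_by": [
                {
                    "facet": "host",
                    "limit": 10,
                    "sort": {"aggregation": "count", "order": "desc"},
                }
            ],
        },
    ],
    "formulas": [
        {"formula": "cpu"},
        {"formula": "errors", "alias": "error logs"},
    ],
    "display_type": "line",
}

print(json.dumps(formula_request, indent=2))
```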
group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). 
default: `linear` Option 32 object The top list visualization enables you to display a list of tag values, such as hostname or service, with the most or least of any metric value (for example, the highest consumers of CPU, or the hosts with the least disk space). custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply.
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. 
Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. 
type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 33 object This widget displays a topology of nodes and edges for different data sources. It replaces the service map widget. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] One or more Topology requests. query object Query to service-based topology data sources like the service map or data streams. data_source enum Name of the data source Allowed enum values: `data_streams,service_map` filters [string] Your environment and primary tag (or * if enabled for your account). service string Name of the service request_type enum Widget request type. Allowed enum values: `topology` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the topology map widget. Allowed enum values: `topology_map` default: `topology_map` Option 34 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. 
Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. 
service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` id int64 ID of the widget. layout object The layout for a widget on a `free` or **new dashboard layout** dashboard. height [_required_] int64 The height of the widget. Should be a non-negative integer. is_column_break boolean Whether the widget should be the first one on the second column in high density or not. **Note** : Only for the **new dashboard layout** and only one widget in the dashboard should have this property set to `true`. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer.

```
{
  "author_handle": "test@datadoghq.com",
  "author_name": "John Doe",
  "created_at": "2019-09-19T10:00:00.000Z",
  "description": "string",
  "id": "123-abc-456",
  "is_read_only": false,
  "layout_type": "ordered",
  "modified_at": "2019-09-19T10:00:00.000Z",
  "notify_list": [],
  "reflow_type": "string",
  "restricted_roles": [],
  "tags": [],
  "template_variable_presets": [ { "name": "string", "template_variables": [ { "name": "string", "value": "string", "values": [] } ] } ],
  "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "default": "my-host", "defaults": [ "my-host-1", "my-host-2" ], "name": "host1", "prefix": "host", "type": "group" } ],
  "title": "",
  "url": "/dashboard/123-abc-456/example-dashboard-title",
  "widgets": [ { "definition": { "requests": { "fill": { "q": "avg:system.cpu.user{*}" } }, "type": "hostmap" }, "id": "integer", "layout": { "height": 0, "is_column_break": false, "width": 0, "x": 0, "y": 0 } } ]
}
```

Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```
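Before the full client examples below, the widget schemas above are easiest to read as concrete payloads. The following is a minimal sketch of a top list widget entry as it might appear in the `widgets` array of a dashboard create or update request body; the title, metric query, limit, and style values are illustrative placeholders, not API defaults.

```
{
  "definition": {
    "title": "Top CPU consumers by host",
    "type": "toplist",
    "requests": [
      {
        "formulas": [ { "formula": "query1", "limit": { "count": 10, "order": "desc" } } ],
        "queries": [
          {
            "data_source": "metrics",
            "name": "query1",
            "query": "avg:system.cpu.user{*} by {host}",
            "aggregator": "avg"
          }
        ],
        "response_format": "scalar"
      }
    ],
    "style": { "display": { "type": "flat" } }
  }
}
```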
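A topology map widget follows the same pattern but takes a `topology` request against one of the two documented data sources. The sketch below assumes a service named `web-store` and environment/primary-tag filters; substitute your own values.

```
{
  "definition": {
    "title": "Service topology",
    "type": "topology_map",
    "requests": [
      {
        "request_type": "topology",
        "query": {
          "data_source": "service_map",
          "service": "web-store",
          "filters": [ "env:prod", "datacenter:us1.prod" ]
        }
      }
    ]
  }
}
```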
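A treemap widget reuses the formulas-and-queries request shape; the deprecated `color_by`, `group_by`, and `size_by` attributes can simply be omitted. The title and metric below are illustrative only.

```
{
  "definition": {
    "title": "Free memory by host",
    "type": "treemap",
    "requests": [
      {
        "formulas": [ { "formula": "query1" } ],
        "queries": [
          {
            "data_source": "metrics",
            "name": "query1",
            "query": "avg:system.mem.free{*} by {host}"
          }
        ],
        "response_format": "scalar"
      }
    ]
  }
}
```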
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby-legacy) ##### Clients deserialize a dashboard with a empty time object Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "title": "Example-Dashboard", "widgets": [ { "definition": { "title": "Example Cloud Cost Query", "title_size": "16", "title_align": "left", "type": "timeseries", "requests": [ { "formulas": [ { "formula": "query1" } ], "queries": [ { "data_source": "cloud_cost", "name": "query1", "query": "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)" } ], "response_format": "timeseries", "style": { "palette": "dog_classic", "line_type": "solid", "line_width": "normal" }, "display_type": "bars" } ], "time": {} } } ], "layout_type": "ordered" } EOF ``` ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "title": "Example-Dashboard", "description": "", "widgets": [ { "definition": { "title": "APM Stats - Request latency HOP", "title_size": "16", "title_align": "left", "show_legend": false, "type": "distribution", "xaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" }, "yaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" }, "requests": [ { "query": { "primary_tag_value": "*", "stat": "latency_distribution", "data_source": "apm_resource_stats", "name": "query1", "service": "azure-bill-import", "group_by": [ "resource_name" ], "env": "staging", "primary_tag_name": "datacenter", "operation_name": "universal.http.client" }, "request_type": "histogram", "style": { "palette": "dog_classic" } } ] }, "layout": { "x": 8, "y": 0, "width": 4, "height": 2 } } ], "layout_type": "ordered" } EOF ``` ##### Create a distribution widget using a histogram request containing a formulas and functions events query Copy ``` # Curl command curl -X POST 
"https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "title": "Example-Dashboard", "description": "Example-Dashboard", "widgets": [ { "definition": { "title": "Events Platform - Request latency HOP", "title_size": "16", "title_align": "left", "show_legend": false, "type": "distribution", "xaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" }, "yaxis": { "max": "auto", "include_zero": true, "scale": "linear", "min": "auto" }, "requests": [ { "query": { "search": { "query": "" }, "data_source": "events", "compute": { "metric": "@duration", "aggregation": "min" }, "name": "query1", "indexes": [ "*" ], "group_by": [] }, "request_type": "histogram" } ] }, "layout": { "x": 0, "y": 0, "width": 4, "height": 2 } } ], "layout_type": "ordered" } EOF ``` ##### Clients deserialize a dashboard with a empty time object ``` // Clients deserialize a dashboard with a empty time object package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Dashboard{ Title: "Example-Dashboard", Widgets: []datadogV1.Widget{ { Definition: datadogV1.WidgetDefinition{ TimeseriesWidgetDefinition: &datadogV1.TimeseriesWidgetDefinition{ Title: datadog.PtrString("Example Cloud Cost Query"), TitleSize: datadog.PtrString("16"), TitleAlign: datadogV1.WIDGETTEXTALIGN_LEFT.Ptr(), Type: datadogV1.TIMESERIESWIDGETDEFINITIONTYPE_TIMESERIES, Requests: []datadogV1.TimeseriesWidgetRequest{ { Formulas: []datadogV1.WidgetFormula{ { Formula: "query1", }, }, Queries: []datadogV1.FormulaAndFunctionQueryDefinition{ datadogV1.FormulaAndFunctionQueryDefinition{ FormulaAndFunctionCloudCostQueryDefinition: &datadogV1.FormulaAndFunctionCloudCostQueryDefinition{ DataSource: datadogV1.FORMULAANDFUNCTIONCLOUDCOSTDATASOURCE_CLOUD_COST, Name: "query1", Query: "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)", }}, }, ResponseFormat: datadogV1.FORMULAANDFUNCTIONRESPONSEFORMAT_TIMESERIES.Ptr(), Style: &datadogV1.WidgetRequestStyle{ Palette: datadog.PtrString("dog_classic"), LineType: datadogV1.WIDGETLINETYPE_SOLID.Ptr(), LineWidth: datadogV1.WIDGETLINEWIDTH_NORMAL.Ptr(), }, DisplayType: datadogV1.WIDGETDISPLAYTYPE_BARS.Ptr(), }, }, Time: &datadogV1.WidgetTime{ WidgetLegacyLiveSpan: &datadogV1.WidgetLegacyLiveSpan{}}, }}, }, }, LayoutType: datadogV1.DASHBOARDLAYOUTTYPE_ORDERED, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.CreateDashboard(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.CreateDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.CreateDashboard`:\n%s\n", responseContent) } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query ``` // Create a distribution widget using a histogram request containing a formulas and functions APM Stats 
query package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Dashboard{ Title: "Example-Dashboard", Description: *datadog.NewNullableString(datadog.PtrString("")), Widgets: []datadogV1.Widget{ { Definition: datadogV1.WidgetDefinition{ DistributionWidgetDefinition: &datadogV1.DistributionWidgetDefinition{ Title: datadog.PtrString("APM Stats - Request latency HOP"), TitleSize: datadog.PtrString("16"), TitleAlign: datadogV1.WIDGETTEXTALIGN_LEFT.Ptr(), ShowLegend: datadog.PtrBool(false), Type: datadogV1.DISTRIBUTIONWIDGETDEFINITIONTYPE_DISTRIBUTION, Xaxis: &datadogV1.DistributionWidgetXAxis{ Max: datadog.PtrString("auto"), IncludeZero: datadog.PtrBool(true), Scale: datadog.PtrString("linear"), Min: datadog.PtrString("auto"), }, Yaxis: &datadogV1.DistributionWidgetYAxis{ Max: datadog.PtrString("auto"), IncludeZero: datadog.PtrBool(true), Scale: datadog.PtrString("linear"), Min: datadog.PtrString("auto"), }, Requests: []datadogV1.DistributionWidgetRequest{ { Query: &datadogV1.DistributionWidgetHistogramRequestQuery{ FormulaAndFunctionApmResourceStatsQueryDefinition: &datadogV1.FormulaAndFunctionApmResourceStatsQueryDefinition{ PrimaryTagValue: datadog.PtrString("*"), Stat: datadogV1.FORMULAANDFUNCTIONAPMRESOURCESTATNAME_LATENCY_DISTRIBUTION, DataSource: datadogV1.FORMULAANDFUNCTIONAPMRESOURCESTATSDATASOURCE_APM_RESOURCE_STATS, Name: "query1", Service: "azure-bill-import", GroupBy: []string{ "resource_name", }, Env: "staging", PrimaryTagName: datadog.PtrString("datacenter"), OperationName: datadog.PtrString("universal.http.client"), }}, RequestType: datadogV1.DISTRIBUTIONWIDGETHISTOGRAMREQUESTTYPE_HISTOGRAM.Ptr(), Style: &datadogV1.WidgetStyle{ Palette: datadog.PtrString("dog_classic"), }, }, }, }}, Layout: &datadogV1.WidgetLayout{ X: 8, Y: 0, Width: 4, Height: 2, }, }, }, LayoutType: datadogV1.DASHBOARDLAYOUTTYPE_ORDERED, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.CreateDashboard(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.CreateDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.CreateDashboard`:\n%s\n", responseContent) } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` // Create a distribution widget using a histogram request containing a formulas and functions events query package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Dashboard{ Title: "Example-Dashboard", Description: *datadog.NewNullableString(datadog.PtrString("Example-Dashboard")), Widgets: []datadogV1.Widget{ { Definition: datadogV1.WidgetDefinition{ DistributionWidgetDefinition: &datadogV1.DistributionWidgetDefinition{ Title: datadog.PtrString("Events Platform - Request latency HOP"), TitleSize: datadog.PtrString("16"), TitleAlign: datadogV1.WIDGETTEXTALIGN_LEFT.Ptr(), ShowLegend: datadog.PtrBool(false), Type: datadogV1.DISTRIBUTIONWIDGETDEFINITIONTYPE_DISTRIBUTION, Xaxis: 
&datadogV1.DistributionWidgetXAxis{ Max: datadog.PtrString("auto"), IncludeZero: datadog.PtrBool(true), Scale: datadog.PtrString("linear"), Min: datadog.PtrString("auto"), }, Yaxis: &datadogV1.DistributionWidgetYAxis{ Max: datadog.PtrString("auto"), IncludeZero: datadog.PtrBool(true), Scale: datadog.PtrString("linear"), Min: datadog.PtrString("auto"), }, Requests: []datadogV1.DistributionWidgetRequest{ { Query: &datadogV1.DistributionWidgetHistogramRequestQuery{ FormulaAndFunctionEventQueryDefinition: &datadogV1.FormulaAndFunctionEventQueryDefinition{ Search: &datadogV1.FormulaAndFunctionEventQueryDefinitionSearch{ Query: "", }, DataSource: datadogV1.FORMULAANDFUNCTIONEVENTSDATASOURCE_EVENTS, Compute: datadogV1.FormulaAndFunctionEventQueryDefinitionCompute{ Metric: datadog.PtrString("@duration"), Aggregation: datadogV1.FORMULAANDFUNCTIONEVENTAGGREGATION_MIN, }, Name: "query1", Indexes: []string{ "*", }, GroupBy: []datadogV1.FormulaAndFunctionEventQueryGroupBy{}, }}, RequestType: datadogV1.DISTRIBUTIONWIDGETHISTOGRAMREQUESTTYPE_HISTOGRAM.Ptr(), }, }, }}, Layout: &datadogV1.WidgetLayout{ X: 0, Y: 0, Width: 4, Height: 2, }, }, }, LayoutType: datadogV1.DASHBOARDLAYOUTTYPE_ORDERED, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.CreateDashboard(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.CreateDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.CreateDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Clients deserialize a dashboard with a empty time object ``` // Clients deserialize a dashboard with a empty time object import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; import com.datadog.api.client.v1.model.DashboardLayoutType; import com.datadog.api.client.v1.model.FormulaAndFunctionCloudCostDataSource; import com.datadog.api.client.v1.model.FormulaAndFunctionCloudCostQueryDefinition; import com.datadog.api.client.v1.model.FormulaAndFunctionQueryDefinition; import com.datadog.api.client.v1.model.FormulaAndFunctionResponseFormat; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinition; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinitionType; import com.datadog.api.client.v1.model.TimeseriesWidgetRequest; import com.datadog.api.client.v1.model.Widget; import com.datadog.api.client.v1.model.WidgetDefinition; import com.datadog.api.client.v1.model.WidgetDisplayType; import com.datadog.api.client.v1.model.WidgetFormula; import com.datadog.api.client.v1.model.WidgetLegacyLiveSpan; import com.datadog.api.client.v1.model.WidgetLineType; import com.datadog.api.client.v1.model.WidgetLineWidth; import com.datadog.api.client.v1.model.WidgetRequestStyle; import com.datadog.api.client.v1.model.WidgetTextAlign; import com.datadog.api.client.v1.model.WidgetTime; 
import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); Dashboard body = new Dashboard() .title("Example-Dashboard") .widgets( Collections.singletonList( new Widget() .definition( new WidgetDefinition( new TimeseriesWidgetDefinition() .title("Example Cloud Cost Query") .titleSize("16") .titleAlign(WidgetTextAlign.LEFT) .type(TimeseriesWidgetDefinitionType.TIMESERIES) .requests( Collections.singletonList( new TimeseriesWidgetRequest() .formulas( Collections.singletonList( new WidgetFormula().formula("query1"))) .queries( Collections.singletonList( new FormulaAndFunctionQueryDefinition( new FormulaAndFunctionCloudCostQueryDefinition() .dataSource( FormulaAndFunctionCloudCostDataSource .CLOUD_COST) .name("query1") .query( "sum:aws.cost.amortized{*} by" + " {aws_product}.rollup(sum," + " monthly)")))) .responseFormat( FormulaAndFunctionResponseFormat.TIMESERIES) .style( new WidgetRequestStyle() .palette("dog_classic") .lineType(WidgetLineType.SOLID) .lineWidth(WidgetLineWidth.NORMAL)) .displayType(WidgetDisplayType.BARS))) .time(new WidgetTime(new WidgetLegacyLiveSpan())))))) .layoutType(DashboardLayoutType.ORDERED); try { Dashboard result = apiInstance.createDashboard(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#createDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query ``` // Create a distribution widget using a histogram request containing a formulas and functions APM // Stats query import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; import com.datadog.api.client.v1.model.DashboardLayoutType; import com.datadog.api.client.v1.model.DistributionWidgetDefinition; import com.datadog.api.client.v1.model.DistributionWidgetDefinitionType; import com.datadog.api.client.v1.model.DistributionWidgetHistogramRequestQuery; import com.datadog.api.client.v1.model.DistributionWidgetHistogramRequestType; import com.datadog.api.client.v1.model.DistributionWidgetRequest; import com.datadog.api.client.v1.model.DistributionWidgetXAxis; import com.datadog.api.client.v1.model.DistributionWidgetYAxis; import com.datadog.api.client.v1.model.FormulaAndFunctionApmResourceStatName; import com.datadog.api.client.v1.model.FormulaAndFunctionApmResourceStatsDataSource; import com.datadog.api.client.v1.model.FormulaAndFunctionApmResourceStatsQueryDefinition; import com.datadog.api.client.v1.model.Widget; import com.datadog.api.client.v1.model.WidgetDefinition; import com.datadog.api.client.v1.model.WidgetLayout; import com.datadog.api.client.v1.model.WidgetStyle; import com.datadog.api.client.v1.model.WidgetTextAlign; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); Dashboard body = new Dashboard() .title("Example-Dashboard") .description("") .widgets( Collections.singletonList( new Widget() .definition( new 
WidgetDefinition( new DistributionWidgetDefinition() .title("APM Stats - Request latency HOP") .titleSize("16") .titleAlign(WidgetTextAlign.LEFT) .showLegend(false) .type(DistributionWidgetDefinitionType.DISTRIBUTION) .xaxis( new DistributionWidgetXAxis() .max("auto") .includeZero(true) .scale("linear") .min("auto")) .yaxis( new DistributionWidgetYAxis() .max("auto") .includeZero(true) .scale("linear") .min("auto")) .requests( Collections.singletonList( new DistributionWidgetRequest() .query( new DistributionWidgetHistogramRequestQuery( new FormulaAndFunctionApmResourceStatsQueryDefinition() .primaryTagValue("*") .stat( FormulaAndFunctionApmResourceStatName .LATENCY_DISTRIBUTION) .dataSource( FormulaAndFunctionApmResourceStatsDataSource .APM_RESOURCE_STATS) .name("query1") .service("azure-bill-import") .groupBy( Collections.singletonList( "resource_name")) .env("staging") .primaryTagName("datacenter") .operationName( "universal.http.client"))) .requestType( DistributionWidgetHistogramRequestType .HISTOGRAM) .style(new WidgetStyle().palette("dog_classic")))))) .layout(new WidgetLayout().x(8L).y(0L).width(4L).height(2L)))) .layoutType(DashboardLayoutType.ORDERED); try { Dashboard result = apiInstance.createDashboard(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#createDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` // Create a distribution widget using a histogram request containing a formulas and functions events // query import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; import com.datadog.api.client.v1.model.DashboardLayoutType; import com.datadog.api.client.v1.model.DistributionWidgetDefinition; import com.datadog.api.client.v1.model.DistributionWidgetDefinitionType; import com.datadog.api.client.v1.model.DistributionWidgetHistogramRequestQuery; import com.datadog.api.client.v1.model.DistributionWidgetHistogramRequestType; import com.datadog.api.client.v1.model.DistributionWidgetRequest; import com.datadog.api.client.v1.model.DistributionWidgetXAxis; import com.datadog.api.client.v1.model.DistributionWidgetYAxis; import com.datadog.api.client.v1.model.FormulaAndFunctionEventAggregation; import com.datadog.api.client.v1.model.FormulaAndFunctionEventQueryDefinition; import com.datadog.api.client.v1.model.FormulaAndFunctionEventQueryDefinitionCompute; import com.datadog.api.client.v1.model.FormulaAndFunctionEventQueryDefinitionSearch; import com.datadog.api.client.v1.model.FormulaAndFunctionEventsDataSource; import com.datadog.api.client.v1.model.Widget; import com.datadog.api.client.v1.model.WidgetDefinition; import com.datadog.api.client.v1.model.WidgetLayout; import com.datadog.api.client.v1.model.WidgetTextAlign; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); Dashboard body = new Dashboard() .title("Example-Dashboard") .description("Example-Dashboard") .widgets( Collections.singletonList( new Widget() .definition( new WidgetDefinition( 
new DistributionWidgetDefinition() .title("Events Platform - Request latency HOP") .titleSize("16") .titleAlign(WidgetTextAlign.LEFT) .showLegend(false) .type(DistributionWidgetDefinitionType.DISTRIBUTION) .xaxis( new DistributionWidgetXAxis() .max("auto") .includeZero(true) .scale("linear") .min("auto")) .yaxis( new DistributionWidgetYAxis() .max("auto") .includeZero(true) .scale("linear") .min("auto")) .requests( Collections.singletonList( new DistributionWidgetRequest() .query( new DistributionWidgetHistogramRequestQuery( new FormulaAndFunctionEventQueryDefinition() .search( new FormulaAndFunctionEventQueryDefinitionSearch() .query("")) .dataSource( FormulaAndFunctionEventsDataSource .EVENTS) .compute( new FormulaAndFunctionEventQueryDefinitionCompute() .metric("@duration") .aggregation( FormulaAndFunctionEventAggregation .MIN)) .name("query1") .indexes( Collections.singletonList("*")))) .requestType( DistributionWidgetHistogramRequestType .HISTOGRAM))))) .layout(new WidgetLayout().x(0L).y(0L).width(4L).height(2L)))) .layoutType(DashboardLayoutType.ORDERED); try { Dashboard result = apiInstance.createDashboard(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#createDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new dashboard returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) title = 'Average Memory Free Shell' widgets = [{ 'definition': { 'type': 'timeseries', 'requests': [ {'q': 'avg:system.mem.free{*}'} ], 'title': 'Average Memory Free' } }] layout_type = 'ordered' description = 'A dashboard with memory info.' 
is_read_only = True
notify_list = ['user@domain.com']
template_variables = [{
    'name': 'host1',
    'prefix': 'host',
    'default': 'my-host'
}]
saved_views = [{
    'name': 'Saved views for hostname 2',
    'template_variables': [{'name': 'host', 'value': ''}]
}]

api.Dashboard.create(title=title,
                     widgets=widgets,
                     layout_type=layout_type,
                     description=description,
                     is_read_only=is_read_only,
                     notify_list=notify_list,
                     template_variables=template_variables,
                     template_variable_presets=saved_views)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py"
```

##### Clients deserialize a dashboard with an empty time object

```
"""
Clients deserialize a dashboard with an empty time object
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboards_api import DashboardsApi
from datadog_api_client.v1.model.dashboard import Dashboard
from datadog_api_client.v1.model.dashboard_layout_type import DashboardLayoutType
from datadog_api_client.v1.model.formula_and_function_cloud_cost_data_source import (
    FormulaAndFunctionCloudCostDataSource,
)
from datadog_api_client.v1.model.formula_and_function_cloud_cost_query_definition import (
    FormulaAndFunctionCloudCostQueryDefinition,
)
from datadog_api_client.v1.model.formula_and_function_response_format import FormulaAndFunctionResponseFormat
from datadog_api_client.v1.model.timeseries_widget_definition import TimeseriesWidgetDefinition
from datadog_api_client.v1.model.timeseries_widget_definition_type import TimeseriesWidgetDefinitionType
from datadog_api_client.v1.model.timeseries_widget_request import TimeseriesWidgetRequest
from datadog_api_client.v1.model.widget import Widget
from datadog_api_client.v1.model.widget_display_type import WidgetDisplayType
from datadog_api_client.v1.model.widget_formula import WidgetFormula
from datadog_api_client.v1.model.widget_legacy_live_span import WidgetLegacyLiveSpan
from datadog_api_client.v1.model.widget_line_type import WidgetLineType
from datadog_api_client.v1.model.widget_line_width import WidgetLineWidth
from datadog_api_client.v1.model.widget_request_style import WidgetRequestStyle
from datadog_api_client.v1.model.widget_text_align import WidgetTextAlign

body = Dashboard(
    title="Example-Dashboard",
    widgets=[
        Widget(
            definition=TimeseriesWidgetDefinition(
                title="Example Cloud Cost Query",
                title_size="16",
                title_align=WidgetTextAlign.LEFT,
                type=TimeseriesWidgetDefinitionType.TIMESERIES,
                requests=[
                    TimeseriesWidgetRequest(
                        formulas=[
                            WidgetFormula(
                                formula="query1",
                            ),
                        ],
                        queries=[
                            FormulaAndFunctionCloudCostQueryDefinition(
                                data_source=FormulaAndFunctionCloudCostDataSource.CLOUD_COST,
                                name="query1",
                                query="sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)",
                            ),
                        ],
                        response_format=FormulaAndFunctionResponseFormat.TIMESERIES,
                        style=WidgetRequestStyle(
                            palette="dog_classic",
                            line_type=WidgetLineType.SOLID,
                            line_width=WidgetLineWidth.NORMAL,
                        ),
                        display_type=WidgetDisplayType.BARS,
                    ),
                ],
                time=WidgetLegacyLiveSpan(),
            ),
        ),
    ],
    layout_type=DashboardLayoutType.ORDERED,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DashboardsApi(api_client)
    response = api_instance.create_dashboard(body=body)

    print(response)
```

##### Create a
distribution widget using a histogram request containing a formulas and functions APM Stats query ``` """ Create a distribution widget using a histogram request containing a formulas and functions APM Stats query """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard import Dashboard from datadog_api_client.v1.model.dashboard_layout_type import DashboardLayoutType from datadog_api_client.v1.model.distribution_widget_definition import DistributionWidgetDefinition from datadog_api_client.v1.model.distribution_widget_definition_type import DistributionWidgetDefinitionType from datadog_api_client.v1.model.distribution_widget_histogram_request_type import ( DistributionWidgetHistogramRequestType, ) from datadog_api_client.v1.model.distribution_widget_request import DistributionWidgetRequest from datadog_api_client.v1.model.distribution_widget_x_axis import DistributionWidgetXAxis from datadog_api_client.v1.model.distribution_widget_y_axis import DistributionWidgetYAxis from datadog_api_client.v1.model.formula_and_function_apm_resource_stat_name import ( FormulaAndFunctionApmResourceStatName, ) from datadog_api_client.v1.model.formula_and_function_apm_resource_stats_data_source import ( FormulaAndFunctionApmResourceStatsDataSource, ) from datadog_api_client.v1.model.formula_and_function_apm_resource_stats_query_definition import ( FormulaAndFunctionApmResourceStatsQueryDefinition, ) from datadog_api_client.v1.model.widget import Widget from datadog_api_client.v1.model.widget_layout import WidgetLayout from datadog_api_client.v1.model.widget_style import WidgetStyle from datadog_api_client.v1.model.widget_text_align import WidgetTextAlign body = Dashboard( title="Example-Dashboard", description="", widgets=[ Widget( definition=DistributionWidgetDefinition( title="APM Stats - Request latency HOP", title_size="16", title_align=WidgetTextAlign.LEFT, show_legend=False, type=DistributionWidgetDefinitionType.DISTRIBUTION, xaxis=DistributionWidgetXAxis( max="auto", include_zero=True, scale="linear", min="auto", ), yaxis=DistributionWidgetYAxis( max="auto", include_zero=True, scale="linear", min="auto", ), requests=[ DistributionWidgetRequest( query=FormulaAndFunctionApmResourceStatsQueryDefinition( primary_tag_value="*", stat=FormulaAndFunctionApmResourceStatName.LATENCY_DISTRIBUTION, data_source=FormulaAndFunctionApmResourceStatsDataSource.APM_RESOURCE_STATS, name="query1", service="azure-bill-import", group_by=[ "resource_name", ], env="staging", primary_tag_name="datacenter", operation_name="universal.http.client", ), request_type=DistributionWidgetHistogramRequestType.HISTOGRAM, style=WidgetStyle( palette="dog_classic", ), ), ], ), layout=WidgetLayout( x=8, y=0, width=4, height=2, ), ), ], layout_type=DashboardLayoutType.ORDERED, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.create_dashboard(body=body) print(response) ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` """ Create a distribution widget using a histogram request containing a formulas and functions events query """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard import Dashboard from datadog_api_client.v1.model.dashboard_layout_type import 
DashboardLayoutType from datadog_api_client.v1.model.distribution_widget_definition import DistributionWidgetDefinition from datadog_api_client.v1.model.distribution_widget_definition_type import DistributionWidgetDefinitionType from datadog_api_client.v1.model.distribution_widget_histogram_request_type import ( DistributionWidgetHistogramRequestType, ) from datadog_api_client.v1.model.distribution_widget_request import DistributionWidgetRequest from datadog_api_client.v1.model.distribution_widget_x_axis import DistributionWidgetXAxis from datadog_api_client.v1.model.distribution_widget_y_axis import DistributionWidgetYAxis from datadog_api_client.v1.model.formula_and_function_event_aggregation import FormulaAndFunctionEventAggregation from datadog_api_client.v1.model.formula_and_function_event_query_definition import ( FormulaAndFunctionEventQueryDefinition, ) from datadog_api_client.v1.model.formula_and_function_event_query_definition_compute import ( FormulaAndFunctionEventQueryDefinitionCompute, ) from datadog_api_client.v1.model.formula_and_function_event_query_definition_search import ( FormulaAndFunctionEventQueryDefinitionSearch, ) from datadog_api_client.v1.model.formula_and_function_events_data_source import FormulaAndFunctionEventsDataSource from datadog_api_client.v1.model.widget import Widget from datadog_api_client.v1.model.widget_layout import WidgetLayout from datadog_api_client.v1.model.widget_text_align import WidgetTextAlign body = Dashboard( title="Example-Dashboard", description="Example-Dashboard", widgets=[ Widget( definition=DistributionWidgetDefinition( title="Events Platform - Request latency HOP", title_size="16", title_align=WidgetTextAlign.LEFT, show_legend=False, type=DistributionWidgetDefinitionType.DISTRIBUTION, xaxis=DistributionWidgetXAxis( max="auto", include_zero=True, scale="linear", min="auto", ), yaxis=DistributionWidgetYAxis( max="auto", include_zero=True, scale="linear", min="auto", ), requests=[ DistributionWidgetRequest( query=FormulaAndFunctionEventQueryDefinition( search=FormulaAndFunctionEventQueryDefinitionSearch( query="", ), data_source=FormulaAndFunctionEventsDataSource.EVENTS, compute=FormulaAndFunctionEventQueryDefinitionCompute( metric="@duration", aggregation=FormulaAndFunctionEventAggregation.MIN, ), name="query1", indexes=[ "*", ], group_by=[], ), request_type=DistributionWidgetHistogramRequestType.HISTOGRAM, ), ], ), layout=WidgetLayout( x=0, y=0, width=4, height=2, ), ), ], layout_type=DashboardLayoutType.ORDERED, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.create_dashboard(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new dashboard returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Create a dashboard. title = 'Average Memory Free Shell' widgets = [{ 'definition': { 'type' => 'timeseries', 'requests' => [ {'q' => 'avg:system.mem.free{*}'} ], 'title' => 'Average Memory Free' } }] layout_type = 'ordered' description = 'A dashboard with memory info.' 
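# The remaining options mark the board read-only, notify a user on changes,
# define a `host` template variable, and register a saved view that is passed
# to create_board as :template_variable_presets.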
is_read_only = true notify_list = ['user@domain.com'] template_variables = [{ 'name' => 'host', 'prefix' => 'host', 'default' => '' }] saved_view = [{ 'name': 'Saved views for hostname 2', 'template_variables': [{'name': 'host', 'value': ''}]} ] dog.create_board(title, widgets, layout_type, { :description => description, :is_read_only => is_read_only, :notify_list => notify_list, :template_variables => template_variables, :template_variable_presets => saved_view }) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Clients deserialize a dashboard with a empty time object ``` # Clients deserialize a dashboard with a empty time object require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new body = DatadogAPIClient::V1::Dashboard.new({ title: "Example-Dashboard", widgets: [ DatadogAPIClient::V1::Widget.new({ definition: DatadogAPIClient::V1::TimeseriesWidgetDefinition.new({ title: "Example Cloud Cost Query", title_size: "16", title_align: DatadogAPIClient::V1::WidgetTextAlign::LEFT, type: DatadogAPIClient::V1::TimeseriesWidgetDefinitionType::TIMESERIES, requests: [ DatadogAPIClient::V1::TimeseriesWidgetRequest.new({ formulas: [ DatadogAPIClient::V1::WidgetFormula.new({ formula: "query1", }), ], queries: [ DatadogAPIClient::V1::FormulaAndFunctionCloudCostQueryDefinition.new({ data_source: DatadogAPIClient::V1::FormulaAndFunctionCloudCostDataSource::CLOUD_COST, name: "query1", query: "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)", }), ], response_format: DatadogAPIClient::V1::FormulaAndFunctionResponseFormat::TIMESERIES, style: DatadogAPIClient::V1::WidgetRequestStyle.new({ palette: "dog_classic", line_type: DatadogAPIClient::V1::WidgetLineType::SOLID, line_width: DatadogAPIClient::V1::WidgetLineWidth::NORMAL, }), display_type: DatadogAPIClient::V1::WidgetDisplayType::BARS, }), ], time: DatadogAPIClient::V1::WidgetLegacyLiveSpan.new({}), }), }), ], layout_type: DatadogAPIClient::V1::DashboardLayoutType::ORDERED, }) p api_instance.create_dashboard(body) ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query ``` # Create a distribution widget using a histogram request containing a formulas and functions APM Stats query require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new body = DatadogAPIClient::V1::Dashboard.new({ title: "Example-Dashboard", description: "", widgets: [ DatadogAPIClient::V1::Widget.new({ definition: DatadogAPIClient::V1::DistributionWidgetDefinition.new({ title: "APM Stats - Request latency HOP", title_size: "16", title_align: DatadogAPIClient::V1::WidgetTextAlign::LEFT, show_legend: false, type: DatadogAPIClient::V1::DistributionWidgetDefinitionType::DISTRIBUTION, xaxis: DatadogAPIClient::V1::DistributionWidgetXAxis.new({ max: "auto", include_zero: true, scale: "linear", min: "auto", }), yaxis: DatadogAPIClient::V1::DistributionWidgetYAxis.new({ max: "auto", include_zero: true, scale: "linear", min: "auto", }), requests: [ DatadogAPIClient::V1::DistributionWidgetRequest.new({ query: DatadogAPIClient::V1::FormulaAndFunctionApmResourceStatsQueryDefinition.new({ primary_tag_value: "*", stat: 
DatadogAPIClient::V1::FormulaAndFunctionApmResourceStatName::LATENCY_DISTRIBUTION, data_source: DatadogAPIClient::V1::FormulaAndFunctionApmResourceStatsDataSource::APM_RESOURCE_STATS, name: "query1", service: "azure-bill-import", group_by: [ "resource_name", ], env: "staging", primary_tag_name: "datacenter", operation_name: "universal.http.client", }), request_type: DatadogAPIClient::V1::DistributionWidgetHistogramRequestType::HISTOGRAM, style: DatadogAPIClient::V1::WidgetStyle.new({ palette: "dog_classic", }), }), ], }), layout: DatadogAPIClient::V1::WidgetLayout.new({ x: 8, y: 0, width: 4, height: 2, }), }), ], layout_type: DatadogAPIClient::V1::DashboardLayoutType::ORDERED, }) p api_instance.create_dashboard(body) ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` # Create a distribution widget using a histogram request containing a formulas and functions events query require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new body = DatadogAPIClient::V1::Dashboard.new({ title: "Example-Dashboard", description: "Example-Dashboard", widgets: [ DatadogAPIClient::V1::Widget.new({ definition: DatadogAPIClient::V1::DistributionWidgetDefinition.new({ title: "Events Platform - Request latency HOP", title_size: "16", title_align: DatadogAPIClient::V1::WidgetTextAlign::LEFT, show_legend: false, type: DatadogAPIClient::V1::DistributionWidgetDefinitionType::DISTRIBUTION, xaxis: DatadogAPIClient::V1::DistributionWidgetXAxis.new({ max: "auto", include_zero: true, scale: "linear", min: "auto", }), yaxis: DatadogAPIClient::V1::DistributionWidgetYAxis.new({ max: "auto", include_zero: true, scale: "linear", min: "auto", }), requests: [ DatadogAPIClient::V1::DistributionWidgetRequest.new({ query: DatadogAPIClient::V1::FormulaAndFunctionEventQueryDefinition.new({ search: DatadogAPIClient::V1::FormulaAndFunctionEventQueryDefinitionSearch.new({ query: "", }), data_source: DatadogAPIClient::V1::FormulaAndFunctionEventsDataSource::EVENTS, compute: DatadogAPIClient::V1::FormulaAndFunctionEventQueryDefinitionCompute.new({ metric: "@duration", aggregation: DatadogAPIClient::V1::FormulaAndFunctionEventAggregation::MIN, }), name: "query1", indexes: [ "*", ], group_by: [], }), request_type: DatadogAPIClient::V1::DistributionWidgetHistogramRequestType::HISTOGRAM, }), ], }), layout: DatadogAPIClient::V1::WidgetLayout.new({ x: 0, y: 0, width: 4, height: 2, }), }), ], layout_type: DatadogAPIClient::V1::DashboardLayoutType::ORDERED, }) p api_instance.create_dashboard(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Clients deserialize a dashboard with a empty time object ``` // Clients deserialize a dashboard with a empty time object use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::Dashboard; use datadog_api_client::datadogV1::model::DashboardLayoutType; use datadog_api_client::datadogV1::model::FormulaAndFunctionCloudCostDataSource; use datadog_api_client::datadogV1::model::FormulaAndFunctionCloudCostQueryDefinition; use datadog_api_client::datadogV1::model::FormulaAndFunctionQueryDefinition; use 
datadog_api_client::datadogV1::model::FormulaAndFunctionResponseFormat; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinition; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinitionType; use datadog_api_client::datadogV1::model::TimeseriesWidgetRequest; use datadog_api_client::datadogV1::model::Widget; use datadog_api_client::datadogV1::model::WidgetDefinition; use datadog_api_client::datadogV1::model::WidgetDisplayType; use datadog_api_client::datadogV1::model::WidgetFormula; use datadog_api_client::datadogV1::model::WidgetLegacyLiveSpan; use datadog_api_client::datadogV1::model::WidgetLineType; use datadog_api_client::datadogV1::model::WidgetLineWidth; use datadog_api_client::datadogV1::model::WidgetRequestStyle; use datadog_api_client::datadogV1::model::WidgetTextAlign; use datadog_api_client::datadogV1::model::WidgetTime; #[tokio::main] async fn main() { let body = Dashboard::new( DashboardLayoutType::ORDERED, "Example-Dashboard".to_string(), vec![ Widget::new( WidgetDefinition::TimeseriesWidgetDefinition( Box::new( TimeseriesWidgetDefinition::new( vec![ TimeseriesWidgetRequest::new() .display_type(WidgetDisplayType::BARS) .formulas(vec![WidgetFormula::new("query1".to_string())]) .queries( vec![ FormulaAndFunctionQueryDefinition ::FormulaAndFunctionCloudCostQueryDefinition( Box::new( FormulaAndFunctionCloudCostQueryDefinition::new( FormulaAndFunctionCloudCostDataSource::CLOUD_COST, "query1".to_string(), "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)".to_string(), ), ), ) ], ) .response_format(FormulaAndFunctionResponseFormat::TIMESERIES) .style( WidgetRequestStyle::new() .line_type(WidgetLineType::SOLID) .line_width(WidgetLineWidth::NORMAL) .palette("dog_classic".to_string()), ) ], TimeseriesWidgetDefinitionType::TIMESERIES, ) .time(WidgetTime::WidgetLegacyLiveSpan(Box::new(WidgetLegacyLiveSpan::new()))) .title("Example Cloud Cost Query".to_string()) .title_align(WidgetTextAlign::LEFT) .title_size("16".to_string()), ), ), ) ], ); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.create_dashboard(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query ``` // Create a distribution widget using a histogram request containing a formulas // and functions APM Stats query use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::Dashboard; use datadog_api_client::datadogV1::model::DashboardLayoutType; use datadog_api_client::datadogV1::model::DistributionWidgetDefinition; use datadog_api_client::datadogV1::model::DistributionWidgetDefinitionType; use datadog_api_client::datadogV1::model::DistributionWidgetHistogramRequestQuery; use datadog_api_client::datadogV1::model::DistributionWidgetHistogramRequestType; use datadog_api_client::datadogV1::model::DistributionWidgetRequest; use datadog_api_client::datadogV1::model::DistributionWidgetXAxis; use datadog_api_client::datadogV1::model::DistributionWidgetYAxis; use datadog_api_client::datadogV1::model::FormulaAndFunctionApmResourceStatName; use datadog_api_client::datadogV1::model::FormulaAndFunctionApmResourceStatsDataSource; use datadog_api_client::datadogV1::model::FormulaAndFunctionApmResourceStatsQueryDefinition; use datadog_api_client::datadogV1::model::Widget; use 
datadog_api_client::datadogV1::model::WidgetDefinition; use datadog_api_client::datadogV1::model::WidgetLayout; use datadog_api_client::datadogV1::model::WidgetStyle; use datadog_api_client::datadogV1::model::WidgetTextAlign; #[tokio::main] async fn main() { let body = Dashboard::new( DashboardLayoutType::ORDERED, "Example-Dashboard".to_string(), vec![ Widget::new( WidgetDefinition::DistributionWidgetDefinition( Box::new( DistributionWidgetDefinition::new( vec![ DistributionWidgetRequest::new() .query( DistributionWidgetHistogramRequestQuery ::FormulaAndFunctionApmResourceStatsQueryDefinition( Box::new( FormulaAndFunctionApmResourceStatsQueryDefinition::new( FormulaAndFunctionApmResourceStatsDataSource ::APM_RESOURCE_STATS, "staging".to_string(), "query1".to_string(), "azure-bill-import".to_string(), FormulaAndFunctionApmResourceStatName::LATENCY_DISTRIBUTION, ) .group_by(vec!["resource_name".to_string()]) .operation_name("universal.http.client".to_string()) .primary_tag_name("datacenter".to_string()) .primary_tag_value("*".to_string()), ), ), ) .request_type(DistributionWidgetHistogramRequestType::HISTOGRAM) .style(WidgetStyle::new().palette("dog_classic".to_string())) ], DistributionWidgetDefinitionType::DISTRIBUTION, ) .show_legend(false) .title("APM Stats - Request latency HOP".to_string()) .title_align(WidgetTextAlign::LEFT) .title_size("16".to_string()) .xaxis( DistributionWidgetXAxis::new() .include_zero(true) .max("auto".to_string()) .min("auto".to_string()) .scale("linear".to_string()), ) .yaxis( DistributionWidgetYAxis::new() .include_zero(true) .max("auto".to_string()) .min("auto".to_string()) .scale("linear".to_string()), ), ), ), ).layout(WidgetLayout::new(2, 4, 8, 0)) ], ).description(Some("".to_string())); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.create_dashboard(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` // Create a distribution widget using a histogram request containing a formulas // and functions events query use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::Dashboard; use datadog_api_client::datadogV1::model::DashboardLayoutType; use datadog_api_client::datadogV1::model::DistributionWidgetDefinition; use datadog_api_client::datadogV1::model::DistributionWidgetDefinitionType; use datadog_api_client::datadogV1::model::DistributionWidgetHistogramRequestQuery; use datadog_api_client::datadogV1::model::DistributionWidgetHistogramRequestType; use datadog_api_client::datadogV1::model::DistributionWidgetRequest; use datadog_api_client::datadogV1::model::DistributionWidgetXAxis; use datadog_api_client::datadogV1::model::DistributionWidgetYAxis; use datadog_api_client::datadogV1::model::FormulaAndFunctionEventAggregation; use datadog_api_client::datadogV1::model::FormulaAndFunctionEventQueryDefinition; use datadog_api_client::datadogV1::model::FormulaAndFunctionEventQueryDefinitionCompute; use datadog_api_client::datadogV1::model::FormulaAndFunctionEventQueryDefinitionSearch; use datadog_api_client::datadogV1::model::FormulaAndFunctionEventsDataSource; use datadog_api_client::datadogV1::model::Widget; use datadog_api_client::datadogV1::model::WidgetDefinition; use datadog_api_client::datadogV1::model::WidgetLayout; use 
datadog_api_client::datadogV1::model::WidgetTextAlign; #[tokio::main] async fn main() { let body = Dashboard::new( DashboardLayoutType::ORDERED, "Example-Dashboard".to_string(), vec![ Widget::new( WidgetDefinition::DistributionWidgetDefinition( Box::new( DistributionWidgetDefinition::new( vec![ DistributionWidgetRequest::new() .query( DistributionWidgetHistogramRequestQuery ::FormulaAndFunctionEventQueryDefinition( Box::new( FormulaAndFunctionEventQueryDefinition::new( FormulaAndFunctionEventQueryDefinitionCompute::new( FormulaAndFunctionEventAggregation::MIN, ).metric("@duration".to_string()), FormulaAndFunctionEventsDataSource::EVENTS, "query1".to_string(), ) .group_by(vec![]) .indexes(vec!["*".to_string()]) .search( FormulaAndFunctionEventQueryDefinitionSearch::new( "".to_string(), ), ), ), ), ) .request_type(DistributionWidgetHistogramRequestType::HISTOGRAM) ], DistributionWidgetDefinitionType::DISTRIBUTION, ) .show_legend(false) .title("Events Platform - Request latency HOP".to_string()) .title_align(WidgetTextAlign::LEFT) .title_size("16".to_string()) .xaxis( DistributionWidgetXAxis::new() .include_zero(true) .max("auto".to_string()) .min("auto".to_string()) .scale("linear".to_string()), ) .yaxis( DistributionWidgetYAxis::new() .include_zero(true) .max("auto".to_string()) .min("auto".to_string()) .scale("linear".to_string()), ), ), ), ).layout(WidgetLayout::new(2, 4, 0, 0)) ], ).description(Some("Example-Dashboard".to_string())); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.create_dashboard(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Clients deserialize a dashboard with a empty time object ``` /** * Clients deserialize a dashboard with a empty time object */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); const params: v1.DashboardsApiCreateDashboardRequest = { body: { title: "Example-Dashboard", widgets: [ { definition: { title: "Example Cloud Cost Query", titleSize: "16", titleAlign: "left", type: "timeseries", requests: [ { formulas: [ { formula: "query1", }, ], queries: [ { dataSource: "cloud_cost", name: "query1", query: "sum:aws.cost.amortized{*} by {aws_product}.rollup(sum, monthly)", }, ], responseFormat: "timeseries", style: { palette: "dog_classic", lineType: "solid", lineWidth: "normal", }, displayType: "bars", }, ], time: {}, }, }, ], layoutType: "ordered", }, }; apiInstance .createDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions APM Stats query ``` /** * Create a distribution widget using a histogram request containing a formulas and functions APM Stats query */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); const params: v1.DashboardsApiCreateDashboardRequest = { body: { title: "Example-Dashboard", description: "", widgets: [ { definition: { title: "APM Stats - Request latency HOP", titleSize: "16", titleAlign: "left", showLegend: false, type: "distribution", xaxis: { max: "auto", includeZero: true, scale: "linear", min: "auto", }, yaxis: { max: "auto", includeZero: true, scale: "linear", min: "auto", }, requests: [ { query: { primaryTagValue: "*", stat: "latency_distribution", dataSource: "apm_resource_stats", name: "query1", service: "azure-bill-import", groupBy: ["resource_name"], env: "staging", primaryTagName: "datacenter", operationName: "universal.http.client", }, requestType: "histogram", style: { palette: "dog_classic", }, }, ], }, layout: { x: 8, y: 0, width: 4, height: 2, }, }, ], layoutType: "ordered", }, }; apiInstance .createDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a distribution widget using a histogram request containing a formulas and functions events query ``` /** * Create a distribution widget using a histogram request containing a formulas and functions events query */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); const params: v1.DashboardsApiCreateDashboardRequest = { body: { title: "Example-Dashboard", description: "Example-Dashboard", widgets: [ { definition: { title: "Events Platform - Request latency HOP", titleSize: "16", titleAlign: "left", showLegend: false, type: "distribution", xaxis: { max: "auto", includeZero: true, scale: "linear", min: "auto", }, yaxis: { max: "auto", includeZero: true, scale: "linear", min: "auto", }, requests: [ { query: { search: { query: "", }, dataSource: "events", compute: { metric: "@duration", aggregation: "min", }, name: "query1", indexes: ["*"], groupBy: [], }, requestType: "histogram", }, ], }, layout: { x: 0, y: 0, width: 4, height: 2, }, }, ], layoutType: "ordered", }, }; apiInstance .createDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-a-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#get-a-dashboard-v1) GET https://api.ap1.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.ap2.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.eu/api/v1/dashboard/{dashboard_id}https://api.ddog-gov.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us3.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us5.datadoghq.com/api/v1/dashboard/{dashboard_id} ### Overview Get a dashboard using the specified ID. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dashboard_id [_required_] string The ID of the dashboard. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#GetDashboard-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#GetDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#GetDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#GetDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) A dashboard is Datadog’s tool for visually tracking, analyzing, and displaying key performance metrics, which enable you to monitor the health of your infrastructure. Field Type Description author_handle string Identifier of the dashboard author. author_name string Name of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string ID of the dashboard. is_read_only boolean **DEPRECATED** : Whether this dashboard is read-only. If True, only the author and admins can make changes to it. This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type [_required_] enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. notify_list [string] List of handles of users to notify when changes are made to this dashboard. reflow_type enum Reflow type for a **new dashboard layout** dashboard. Set this only when layout type is 'ordered'. If set to 'fixed', the dashboard expects all widgets to have a layout, and if it's set to 'auto', widgets should not have layouts. Allowed enum values: `auto,fixed` restricted_roles [string] A list of role identifiers. Only the author and users associated with at least one of these roles can edit this dashboard. tags [string] List of team names representing ownership of a dashboard. template_variable_presets [object] Array of template variables saved views. name string The name of the variable. 
template_variables [object] List of variables. name string The name of the variable. value string **DEPRECATED** : (deprecated) The value of the template variable within the saved view. Cannot be used in conjunction with `values`. values [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. Cannot be used in conjunction with `value`. template_variables [object] List of template variables for this dashboard. available_values [string] The list of values that the template variable drop-down is limited to. default string **DEPRECATED** : (deprecated) The default value for the template variable on dashboard load. Cannot be used in conjunction with `defaults`. defaults [string] One or many default values for template variables on load. If more than one default is specified, they will be unioned together with `OR`. Cannot be used in conjunction with `default`. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). title [_required_] string Title of the dashboard. url string The URL of the dashboard. widgets [_required_] [object] List of widgets to display on the dashboard. definition [_required_] [Definition of the widget](https://docs.datadoghq.com/dashboards/widgets/). Option 1 object Alert graphs are timeseries graphs showing the current status of any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the alert graph widget. Allowed enum values: `alert_graph` default: `alert_graph` viz_type [_required_] enum Whether to display the Alert Graph as a timeseries or a top list. Allowed enum values: `timeseries,toplist` Option 2 object Alert values are query values showing the current value of the metric in any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. precision int64 Number of decimal to show. If not defined, will use the raw value. 
text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of value in the widget. type [_required_] enum Type of the alert value widget. Allowed enum values: `alert_value` default: `alert_value` unit string Unit to display with the value. Option 3 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. 
compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 4 object Check status shows the current status or number of results for any check performed. check [_required_] string Name of the check to use in the widget. group string Group reporting a single check. group_by [string] List of tag prefixes to group by in the case of a cluster check. grouping [_required_] enum The kind of grouping to use. Allowed enum values: `check,cluster` tags [string] List of tags used to filter the groups reporting a cluster check. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the check status widget. Allowed enum values: `check_status` default: `check_status` Option 5 object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. 
columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. 
Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. 
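The formula and functions metrics query (Option 1 above) is the simplest of these query types. Below is a small, hypothetical sketch of how such a query and an accompanying formula might appear inside a widget request's `queries` and `formulas` arrays; the metric and alias are placeholder values.

```python
# Hypothetical formula-and-functions metrics query (Option 1 above),
# paired with a formula that references it by name.
request_fragment = {
    "queries": [
        {
            "data_source": "metrics",                     # required
            "name": "query1",                             # required; referenced by formulas
            "query": "avg:system.cpu.user{*} by {host}",  # required metrics query
            "aggregator": "avg",                          # optional aggregation method
        }
    ],
    "formulas": [
        {"formula": "query1", "alias": "cpu_user"}        # expression built from query names
    ],
}
```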
sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of SLO query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas.
query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. 
Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. 
Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` Option 6 object The event stream is a widget version of the stream of events on the Event Stream view. Only available on FREE layout dashboards. event_size enum Size to use to display an event. Allowed enum values: `s,l` query [_required_] string Query to filter the event stream with. tags_execution string The execution method for multi-value filters. Can be either and or or. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event stream widget. Allowed enum values: `event_stream` default: `event_stream` Option 7 object The event timeline is a widget version of the timeline that appears at the top of the Event Stream view. Only available on FREE layout dashboards. query [_required_] string Query to filter the event timeline with. tags_execution string The execution method for multi-value filters. Can be either and or or. 
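To tie the distribution widget fields together (Option 5 above), here is a minimal, hypothetical sketch of a distribution widget definition; the query and axis settings are placeholder values chosen only to illustrate the shape of the object.

```python
# Hypothetical distribution widget definition (Option 5 above).
distribution_widget = {
    "definition": {
        "type": "distribution",                      # required widget type
        "title": "Load per host",
        "requests": [
            {"q": "avg:system.load.1{*} by {host}"}  # array of one request object
        ],
        "xaxis": {"include_zero": True, "max": "p90", "scale": "linear"},
        "yaxis": {"include_zero": True, "scale": "linear"},
    }
}
```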
time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event timeline widget. Allowed enum values: `event_timeline` default: `event_timeline` Option 8 object Free text is a widget that allows you to add headings to your screenboard. Commonly used to state the overall purpose of the dashboard. Only available on FREE layout dashboards. color string Color of the text. font_size string Size of the text. text [_required_] string Text to display. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the free text widget. Allowed enum values: `free_text` default: `free_text` Option 9 object The funnel visualization displays a funnel of user sessions that maps a sequence of view navigation and user interaction in your application. requests [_required_] [object] Request payload used to query items. query [_required_] object Updated funnel widget. data_source [_required_] enum Source from which to query items to display in the funnel. Allowed enum values: `rum` default: `rum` query_string [_required_] string The widget query. steps [_required_] [object] List of funnel steps. facet [_required_] string The facet of the step. value [_required_] string The value of the step. request_type [_required_] enum Widget request type. Allowed enum values: `funnel` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. 
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of funnel widget. Allowed enum values: `funnel` default: `funnel` Option 10 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. 
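For the funnel widget (Option 9 above), the request wraps a RUM query and an ordered list of steps. The sketch below is hypothetical; the facet and step values are placeholders.

```python
# Hypothetical funnel widget definition (Option 9 above).
funnel_widget = {
    "definition": {
        "type": "funnel",                          # required widget type
        "title": "Checkout funnel",
        "requests": [
            {
                "request_type": "funnel",          # required request type
                "query": {
                    "data_source": "rum",          # required source for funnel items
                    "query_string": "@type:view",  # required widget query
                    "steps": [                     # required ordered funnel steps
                        {"facet": "@view.name", "value": "/home"},
                        {"facet": "@view.name", "value": "/checkout"},
                    ],
                },
            }
        ],
    }
}
```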
hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. 
Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. 
query_string [_required_] string Widget query. sort object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for the points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color.
Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 11 object The groups widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the group title. banner_img string URL of image to display as a banner for the group. layout_type [_required_] enum Layout type of the group. Allowed enum values: `ordered` show_title boolean Whether to show the title or not. 
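Putting the geomap fields together (Option 10 above), a region-layer request typically pairs an events query grouped by a country ISO code facet with a formula, plus the required `style` and `view` objects. The following is a hypothetical sketch; the RUM facet, palette name, and title are placeholder values.

```python
# Hypothetical geomap widget definition (Option 10 above).
# The group-by facet must resolve to a country ISO code for the region layer.
geomap_widget = {
    "definition": {
        "type": "geomap",                            # required widget type
        "title": "Sessions by country",
        "requests": [
            {
                "queries": [
                    {
                        "data_source": "rum",        # events query (Option 2 above)
                        "name": "query1",
                        "search": {"query": ""},
                        "compute": {"aggregation": "count"},
                        "group_by": [
                            {
                                "facet": "@geo.country_iso_code",
                                "limit": 250,
                                "sort": {"aggregation": "count", "order": "desc"},
                            }
                        ],
                    }
                ],
                "formulas": [{"formula": "query1"}],
                "response_format": "scalar",
            }
        ],
        "style": {"palette": "hostmap_blues", "palette_flip": False},  # required
        "view": {"focus": "WORLD"},                                    # required
    }
}
```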
default: `true` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the group widget. Allowed enum values: `group` default: `group` widgets [_required_] [object] List of widgets in the group. Option 12 object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either `and` or `or`. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options.
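Since the group widget (Option 11 above) only wraps other widgets, its definition stays short. Here is a hypothetical sketch; the nested event stream widget and titles are placeholders.

```python
# Hypothetical group widget definition (Option 11 above) containing one nested widget.
group_widget = {
    "definition": {
        "type": "group",             # required widget type
        "layout_type": "ordered",    # required; only "ordered" is allowed
        "title": "Service health",
        "show_title": True,
        "widgets": [                 # required list of nested widgets
            {
                "definition": {
                    "type": "event_stream",
                    "query": "status:error",
                    "event_size": "s",
                }
            }
        ],
    }
}
```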
Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. 
Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. 
`primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum The SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear`
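To make the shape of the heat map definition above concrete, the sketch below assembles a minimal widget as a plain Python dictionary. The metric query, title, and timeframe are hypothetical placeholders rather than values taken from this reference.

```python
# Minimal sketch of a heat map widget definition, assuming a hypothetical
# CPU metric and title. Only a handful of the optional fields above are set.
heatmap_widget = {
    "definition": {
        "type": "heatmap",
        "title": "CPU distribution by host",           # hypothetical title
        "requests": [
            {"q": "avg:system.cpu.user{*} by {host}"}  # hypothetical metric query
        ],
        "yaxis": {"include_zero": True, "scale": "linear"},
        "time": {"live_span": "1h"},
        "show_legend": True,
    }
}
```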
Option 13 object The host map widget graphs any metric across your hosts using the same visualization available from the main Host Map page. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group [string] List of tag prefixes to group by. no_group_hosts boolean Whether to show the hosts that don’t fit in a group. no_metric_hosts boolean Whether to show the hosts with no metrics. node_type enum Which type of node to use in the map. Allowed enum values: `host,container` notes string Notes on the title. requests [_required_] object List of definitions. fill object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. size object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. scope [string] List of tags used to filter the map. style object The style to apply to the widget. fill_max string Max value to use to color the map. fill_min string Min value to use to color the map. palette string Color palette to apply to the widget. palette_flip boolean Whether to flip the palette tones. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the host map widget. Allowed enum values: `hostmap` default: `hostmap` Option 14 object The iframe widget allows you to embed a portion of any other web page on your dashboard. Only available on FREE layout dashboards. type [_required_] enum Type of the iframe widget. Allowed enum values: `iframe` default: `iframe` url [_required_] string URL of the iframe. Option 15 object The image widget allows you to embed an image on your dashboard. An image can be a PNG, JPG, or animated GIF. Only available on FREE layout dashboards. has_background boolean Whether to display a background or not. default: `true` has_border boolean Whether to display a border or not. default: `true` horizontal_align enum Horizontal alignment. Allowed enum values: `center,left,right` margin enum Size of the margins around the image. **Note** : `small` and `large` values are deprecated. 
Allowed enum values: `sm,md,lg,small,large` sizing enum How to size the image on the widget. The values are based on the image `object-fit` CSS properties. **Note** : `zoom`, `fit` and `center` values are deprecated. Allowed enum values: `fill,contain,cover,none,scale-down,zoom,fit,center` type [_required_] enum Type of the image widget. Allowed enum values: `image` default: `image` url [_required_] string URL of the image. url_dark_theme string URL of the image in dark mode. vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 16 object The list stream visualization displays a table of recent events in your application that match a search criteria using user-defined columns. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". requests [_required_] [object] Request payload used to query items. columns [_required_] [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` query [_required_] object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format [_required_] enum Widget response format. Allowed enum values: `event_list` show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the list stream widget. Allowed enum values: `list_stream` default: `list_stream` Option 17 object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Number of log lines to display. Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not. show_message_column boolean Whether to show the message column or not. sort object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
show_priority boolean Whether to show the priorities column. sort enum Widget sorting methods. Allowed enum values: `name`, `group`, `status`, `tags`, `triggered`, `group,asc`, `group,desc`, `name,asc`, `name,desc`, `status,asc`, `status,desc`, `tags,asc`, `tags,desc`, `triggered,asc`, `triggered,desc`, `priority,asc`, `priority,desc` (values such as `group,asc` combine a column and a sort direction). start int64 **DEPRECATED** : The start of the list. Typically 0. summary_type enum Which summary type should be used. Allowed enum values: `monitors,groups,combined` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the monitor summary widget. Allowed enum values: `manage_status` default: `manage_status` Option 19 object The notes and links widget is similar to the free text widget, but allows for more formatting options. background_color string Background color of the note. content [_required_] string Content of the note. font_size string Size of the text. has_padding boolean Whether to add padding or not. default: `true` show_tick boolean Whether to show a tick or not. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` tick_edge enum Define how you want to align the text on the widget. Allowed enum values: `bottom,left,right,top` tick_pos string Where to position the tick on an edge. type [_required_] enum Type of the note widget. Allowed enum values: `note` default: `note` vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 20 object The powerpack widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the powerpack title. banner_img string URL of image to display as a banner for the powerpack. powerpack_id [_required_] string UUID of the associated powerpack. show_title boolean Whether to show the title or not. default: `true` template_variables object Powerpack template variables. controlled_by_powerpack [object] Template variables controlled at the powerpack level. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. controlled_externally [object] Template variables controlled by the external resource, such as the dashboard this powerpack is on. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. title string Title of the widget. type [_required_] enum Type of the powerpack widget. Allowed enum values: `powerpack` default: `powerpack` Option 21 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. 
Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. palette_index int64 Index specifying which color to use within the palette. 
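As an illustrative aside, one entry of the `formulas` array described above could be shaped as in the following sketch; the formula expression, threshold, and unit are hypothetical examples, not values from this page.

```python
# Rough sketch of a single item in a request's "formulas" array, combining a
# formula expression with number formatting and a conditional format.
# The query names, threshold, and unit are hypothetical.
formula_entry = {
    "formula": "errors / hits * 100",   # refers to query names defined in "queries"
    "alias": "error rate",
    "number_format": {
        "unit": {"type": "canonical_unit", "unit_name": "percent"},
    },
    "conditional_formats": [
        {"comparator": ">", "value": 5, "palette": "white_on_red"},
    ],
}
```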
log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. 
aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. 
primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum The SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value`
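Pulling the query value fields together, a minimal definition using the formulas-and-queries request shape documented above might look like the sketch below; the metric query, title, and precision are hypothetical placeholders.

```python
# Minimal sketch of a query value widget definition. The metric query, title,
# and precision are hypothetical; field names follow the reference above.
query_value_widget = {
    "definition": {
        "type": "query_value",
        "title": "Average request latency",          # hypothetical title
        "autoscale": True,
        "precision": 2,
        "requests": [
            {
                "response_format": "scalar",
                "queries": [
                    {
                        "data_source": "metrics",
                        "name": "q1",
                        "query": "avg:trace.http.request.duration{*}",  # hypothetical metric
                        "aggregator": "avg",
                    }
                ],
                "formulas": [{"formula": "q1"}],
            }
        ],
        "time": {"live_span": "4h"},
    }
}
```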
default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` Option 22 object The run workflow widget allows you to run a workflow from a dashboard. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. inputs [object] Array of workflow inputs to map to dashboard template variables. name [_required_] string Name of the workflow input. value [_required_] string Dashboard template variable. Can be suffixed with '.value' or '.key'. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the run workflow widget. Allowed enum values: `run_workflow` default: `run_workflow` workflow_id [_required_] string Workflow id. Option 23 object Use the SLO List widget to track your SLOs (Service Level Objectives) on dashboards. requests [_required_] [object] Array of one request object to display in the widget. query [_required_] object Updated SLO List widget. limit int64 Maximum number of results to display in the table. default: `100` query_string [_required_] string Widget query. sort [object] Options for sorting results. column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` request_type [_required_] enum Widget request type. Allowed enum values: `slo_list` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO List widget. 
Allowed enum values: `slo_list` default: `slo_list` Option 24 object Use the SLO and uptime widget to track your SLOs (Service Level Objectives) and uptime on screenboards and timeboards. additional_query_filters string Additional filters applied to the SLO query. global_time_target string Defined global time target. show_error_budget boolean Defined error budget. slo_id string ID of the SLO displayed. time_windows [string] Times being monitored. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO widget. Allowed enum values: `slo` default: `slo` view_mode enum Define how you want the SLO to be displayed. Allowed enum values: `overall,component,both` view_type [_required_] string Type of view displayed by the widget. default: `detail` Option 25 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. 
limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. 
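To ground the scatter plot request schema described above, the following is a minimal, hypothetical `table` request built from the formula and query fields listed for Option 25. The metric names and group tag are placeholders for illustration only, not values taken from this reference.

```json
{
  "table": {
    "formulas": [
      { "formula": "query_hits", "dimension": "x" },
      { "formula": "query_latency", "dimension": "y" }
    ],
    "queries": [
      {
        "data_source": "metrics",
        "name": "query_hits",
        "query": "sum:example.requests.hits{*} by {service}",
        "aggregator": "sum"
      },
      {
        "data_source": "metrics",
        "name": "query_latency",
        "query": "avg:example.requests.duration{*} by {service}",
        "aggregator": "avg"
      }
    ]
  }
}
```

Each formula references a query by its `name` and is assigned to one scatterplot `dimension` (`x`, `y`, `radius`, or `color`).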
event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 26 object This widget displays a map of a service to all of the services that call it, and all of the services that it calls. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. filters [_required_] [string] Your environment and primary tag (or * if enabled for your account). service [_required_] string The ID of the service you want to map. title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service map widget. Allowed enum values: `servicemap` default: `servicemap` Option 27 object The service summary displays the graphs of a chosen service in your screenboard. Only available on FREE layout dashboards. display_format enum Number of columns to display. Allowed enum values: `one_column,two_column,three_column` env [_required_] string APM environment. service [_required_] string APM service. show_breakdown boolean Whether to show the latency breakdown or not. show_distribution boolean Whether to show the latency distribution or not. 
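As a brief aside, the service map widget (Option 26) described above is one of the simpler definitions. A hypothetical instance might look like the sketch below, where the service name and filters are placeholders rather than values from this reference.

```json
{
  "type": "servicemap",
  "service": "example-web-store",
  "filters": ["env:prod", "datacenter:us1.prod"],
  "title": "Service map: example-web-store",
  "title_align": "center",
  "title_size": "16"
}
```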
show_errors boolean Whether to show the error metrics or not. show_hits boolean Whether to show the hits metrics or not. show_latency boolean Whether to show the latency metrics or not. show_resource_list boolean Whether to show the resource list or not. size_format enum Size of the widget. Allowed enum values: `small,medium,large` span_name [_required_] string APM span name. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service summary widget. Allowed enum values: `trace_service` default: `trace_service` Option 28 object The split graph widget allows you to create repeating units of a graph - one for each value in a group (for example: one per service) has_uniform_y_axes boolean Normalize y axes across graphs size [_required_] enum Size of the individual graphs in the split. Allowed enum values: `xs,sm,md,lg` source_widget_definition [_required_] The original widget we are splitting on. Option 1 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. 
aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
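Because the `time` setting recurs throughout these widget definitions, the sketch below collects the three alternative shapes described in this section. The outer keys are labels for illustration only (a widget's `time` field takes exactly one of the three inner objects), and the timestamps are arbitrary placeholders.

```json
{
  "legacy_live_span": { "live_span": "1h" },
  "new_live_span": { "type": "live", "unit": "minute", "value": 17, "hide_incomplete_cost_data": false },
  "fixed_span": { "type": "fixed", "from": 1717200000, "to": 1717804800 }
}
```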
Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 2 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. 
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. 
stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. 
Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). 
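To make the geomap (Option 2) schema above more concrete, here is a minimal sketch of a geomap widget definition expressed as a Python dict, in the shape it would take inside a dashboard's `widgets` array. The RUM query, the country facet, the palette name, and the layout values are illustrative placeholders rather than values taken from this reference.

```python
import json

# Minimal sketch of a geomap widget definition (Option 2 above).
# Query, facet, palette, and layout values are illustrative placeholders.
geomap_widget = {
    "definition": {
        "type": "geomap",
        "title": "Sessions by country",
        "requests": [
            {
                # Formula-and-functions events query (RUM), grouped by a
                # country ISO code facet so it can drive the region layer.
                "queries": [
                    {
                        "data_source": "rum",
                        "name": "query1",
                        "compute": {"aggregation": "count"},
                        "search": {"query": ""},
                        "group_by": [
                            {
                                "facet": "@geo.country_iso_code",
                                "limit": 250,
                                "sort": {"aggregation": "count", "order": "desc"},
                            }
                        ],
                    }
                ],
                "formulas": [{"formula": "query1"}],
                "response_format": "scalar",
            }
        ],
        "style": {"palette": "hostmap_blues", "palette_flip": False},
        "view": {"focus": "WORLD"},
        "time": {"live_span": "1h"},
    },
    "layout": {"x": 0, "y": 0, "width": 4, "height": 3},
}

print(json.dumps(geomap_widget, indent=2))
```

The `view.focus` value `WORLD` and the `live_span` of `1h` come directly from the allowed values listed above; the query, facet, and layout should be adapted to your own data.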
Option 3 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. 
metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. 
Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. 
text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum The SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. 
Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. 
type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` Option 4 object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. table object Scatterplot request containing formulas and functions. formulas [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. 
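Circling back to the query value (Option 3) schema that closes just above, the sketch below shows a minimal definition combining a formula-and-functions metrics query with a canonical-unit number format and two conditional formats. The metric name, the thresholds, and the unit are assumptions chosen for illustration, not values taken from this reference.

```python
import json

# Minimal sketch of a query value widget (Option 3 above), using a
# formula-and-functions metrics query plus a conditional format and a
# canonical-unit number format. Metric names and thresholds are placeholders.
query_value_widget = {
    "definition": {
        "type": "query_value",
        "title": "p95 request latency",
        "autoscale": True,
        "precision": 2,
        "requests": [
            {
                "queries": [
                    {
                        "data_source": "metrics",
                        "name": "query1",
                        "query": "p95:trace.http.request.duration{env:prod}",
                        "aggregator": "last",
                    }
                ],
                "formulas": [
                    {
                        "formula": "query1",
                        "number_format": {
                            "unit": {
                                "type": "canonical_unit",
                                "unit_name": "second",
                            }
                        },
                    }
                ],
                "conditional_formats": [
                    {"comparator": ">", "value": 2, "palette": "white_on_red"},
                    {"comparator": "<=", "value": 2, "palette": "white_on_green"},
                ],
                "response_format": "scalar",
            }
        ],
        "time": {"live_span": "15m"},
    }
}

print(json.dumps(query_value_widget, indent=2))
```

The `aggregator`, `number_format`, and `conditional_formats` fields map directly onto the request schema described above; everything else should be adapted to your own metrics.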
facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum The SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` x object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. y object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` xaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 5 object Sunbursts are spot on to highlight how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. 
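The `apm_query`, `audit_query`, and `event_query` fields above (like the other `*_query` fields throughout this schema) share the same log-style query shape: a `compute` (or `multi_compute`), an optional `group_by`, and a `search`. The following is a minimal, hypothetical sketch of that shape; the facet and search string are invented, and the conditional-format fields continue below.

```python
# Hypothetical log-style query (facet and search string are invented).
log_style_query = {
    "index": "*",  # query all indexes at once
    "compute": {
        "aggregation": "count",  # the aggregation method
        "interval": 60,          # time interval in seconds
    },
    "group_by": [
        {
            "facet": "service",  # facet to group by
            "limit": 10,         # maximum number of items in the group
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},  # search value to apply
}
```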
metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. 
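The metrics-query option above (Option 1) is typically paired with a formula that references it by name. Below is a minimal, hypothetical sketch built only from fields listed in this schema; the metric and expression are invented. The remaining query options (events, process, APM, SLO, and Cloud Cost) continue below.

```python
# Hypothetical "queries" + "formulas" fragment (metric and expression are invented).
request_fragment = {
    "queries": [
        {
            "data_source": "metrics",           # Option 1: metrics query
            "name": "query1",                   # name referenced by the formula
            "query": "avg:system.cpu.user{*}",  # metrics query definition
            "aggregator": "avg",                # optional aggregation method
        }
    ],
    "formulas": [
        {
            "formula": "query1 * 100",                # expression built from query names
            "limit": {"count": 10, "order": "desc"},  # optional limit on results
        }
    ],
}
```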
name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. 
Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 6 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. 
Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. 
index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 7 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of a severity (`error`, `warning`, `ok`, or `info`) and a line type (`dashed`, `solid`, or `bold`). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value (for example, `y = 15`) or a range of values (for example, `0 < y < 10`). For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). 
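To tie the timeseries options above together, here is a hedged sketch of a timeseries widget definition using the `events`, `markers`, and legend fields just described. All values are invented, the `type` value `timeseries` is assumed, and the `requests` entries are reduced to a bare query; the full request fields are detailed below.

```python
# Hypothetical timeseries widget definition (all values are illustrative).
timeseries_widget = {
    "type": "timeseries",              # assumed widget type value
    "title": "CPU usage",
    "legend_layout": "auto",
    "legend_columns": ["avg", "max"],
    "events": [
        {"q": "tags:deployment", "tags_execution": "and"}  # overlay matching events
    ],
    "markers": [
        {
            "display_type": "error dashed",  # severity plus line type
            "label": "Alert threshold",
            "value": "y = 15",               # single value or a range such as 0 < y < 10
        }
    ],
    "requests": [                            # request fields are detailed below
        {"q": "avg:system.cpu.user{*}", "display_type": "line"}
    ],
}
```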
requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. 
expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. 
group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). 
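The `time` setting that recurs throughout these widget definitions accepts the three shapes listed above. A short, illustrative sketch of each variant follows; the specific spans chosen are placeholders.

```python
# Illustrative sketches of the three `time` variants described above.
# The specific spans are placeholders.
time_live_span = {"live_span": "1h"}      # Option 1: wrapper for a live span

time_new_live = {                         # Option 2: arbitrary live span ("past 17 minutes")
    "type": "live",
    "unit": "minute",
    "value": 17,
}

time_fixed = {                            # Option 3: fixed span with epoch bounds
    "type": "fixed",
    "from": 1709251200,                   # start time, seconds since epoch
    "to": 1709769600,                     # end time, seconds since epoch
}
```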
default: `linear` Option 8 object The top list visualization enables you to display a list of tag values, such as hostname or service, with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply.
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
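As a rough illustration of the conditional formatting and number formatting fields just listed, a single formula entry might pair a comparator-based format with a canonical unit. This is a sketch only; the threshold, palette, and unit name are placeholders.

```python
# Illustrative sketch: one formula entry combining `conditional_formats`,
# `number_format`, and `limit`, using only fields from the listing above.
# The threshold, palette, and unit name are placeholders.
formula_entry = {
    "formula": "query1",
    "conditional_formats": [
        {
            "comparator": ">",           # one of =, >, >=, <, <=
            "value": 100.0,              # value for the comparator
            "palette": "white_on_red",   # one of the allowed palette values
        }
    ],
    "number_format": {
        "unit": {
            "type": "canonical_unit",    # Option 1: canonical unit
            "unit_name": "request",      # placeholder unit name
        }
    },
    "limit": {"count": 10, "order": "desc"},
}
```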
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. 
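For the SLO and Cloud Cost variants just described (Options 6 and 7), the query object carries the data source plus a handful of source-specific fields. A hedged sketch follows, with a placeholder SLO ID and cost query.

```python
# Illustrative sketches of the SLO (Option 6) and Cloud Cost (Option 7)
# query variants described above. The SLO ID and cost query are placeholders.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456ghi789jkl012",   # placeholder SLO ID
    "measure": "slo_status",
    "group_mode": "overall",
    "slo_query_type": "metric",
    "name": "slo_status",
}

cloud_cost_query = {
    "data_source": "cloud_cost",
    "name": "cost",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",   # placeholder cost query
    "aggregator": "sum",
}
```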
Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. 
type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 9 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. 
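The top list widget's `sort` and `style` options described above can be combined as in the following sketch; the group name, item count, and palette are placeholders.

```python
# Illustrative sketch of the top list widget `sort` and `style` options
# described above. The group name, item count, and palette are placeholders.
toplist_options = {
    "sort": {
        "count": 10,                                          # limit the widget to 10 items
        "order_by": [
            {"type": "formula", "index": 0, "order": "desc"},      # Option 1: sort by the first formula
            {"type": "group", "name": "service", "order": "asc"},  # Option 2: then by a group
        ],
    },
    "style": {
        "display": {"type": "stacked", "legend": "automatic"},  # or {"type": "flat"}
        "palette": "classic",                                   # placeholder palette name
        "scaling": "relative",                                  # absolute or relative
    },
}
```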
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` split_config [_required_] object Encapsulates all user choices about how to split a graph. limit [_required_] int64 Maximum number of graphs to display in the widget. sort [_required_] object Controls the order in which graphs appear in the split. compute object Defines the metric and aggregation used as the sort value. aggregation [_required_] string How to aggregate the sort metric for the purposes of ordering. metric [_required_] string The metric to use for sorting graphs. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` split_dimensions [_required_] [object] The dimension(s) on which to split the graph. one_graph_per [_required_] string The system interprets this attribute differently depending on the data source of the query being split. For metrics, it's a tag. For the events platform, it's an attribute or tag. static_splits [array] Manual selection of tags that make the split graph widget static. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the split graph widget. Allowed enum values: `split_group` default: `split_group` Option 29 object Sunbursts are well suited to highlighting how groups contribute to the total of a query. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. legend Configuration of the legend. Option 1 object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` Option 2 object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names.
Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. 
palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
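The log-query shape that this listing repeats for `apm_query`, `audit_query`, `log_query`, `network_query`, `rum_query`, and `security_query` always combines `compute` (or `multi_compute`), `group_by`, `search`, and `index`. A minimal sketch follows, with placeholder facets and search text.

```python
# Illustrative sketch of the recurring log-query shape (compute, group_by,
# search, index) described above. Facets and the search value are placeholders.
log_query = {
    "index": "*",                                       # query all indexes at once
    "compute": {"aggregation": "count"},                # or e.g. {"aggregation": "avg", "facet": "@duration"}
    "group_by": [
        {
            "facet": "service",                         # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},                # placeholder search value
}
```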
search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. 
search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum The type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using.
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` Option 30 object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. 
Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. 
Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. 
Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. 
index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` Option 31 object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string A combination of a severity (`error`, `warning`, `ok`, or `info`) and a line type (`dashed`, `solid`, or `bold`). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value (for example, y = 15) or a range of values (for example, 0 < y < 10). For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90).
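To illustrate the `markers` fields described above, here is a minimal sketch in Python. The thresholds and labels are illustrative assumptions rather than values from this reference, and the space-separated `display_type` strings simply combine the documented severity and line type.

```python
# Sketch of timeseries widget markers built from the fields documented above.
# Thresholds and labels are illustrative assumptions.
markers = [
    {
        "display_type": "error dashed",   # severity + line type (assumed space-separated)
        "label": "Hard limit",            # text displayed over the marker
        "value": "y = 15",                # a single value
    },
    {
        "display_type": "warning solid",
        "label": "Watch zone",
        "value": "10 < y < 15",           # a range of values
    },
]
```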
requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. 
expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. 
group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear`
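Putting the timeseries fields above together, a minimal widget definition might look like the sketch below. The title, metric query string, and palette name are illustrative assumptions rather than values taken from this reference.

```python
# Minimal sketch of a timeseries widget definition (Option 31) assembled from
# the fields documented above. Title, query string, and palette are assumptions.
timeseries_widget = {
    "type": "timeseries",                        # required; defaults to "timeseries"
    "title": "CPU usage by host",
    "show_legend": True,
    "requests": [
        {
            "response_format": "timeseries",
            "display_type": "line",              # area, bars, line, or overlay
            "queries": [
                {
                    "data_source": "metrics",    # formula and functions metrics query
                    "name": "query1",
                    "query": "avg:system.cpu.user{*} by {host}",
                }
            ],
            "formulas": [{"formula": "query1"}],
            "style": {
                "palette": "dog_classic",        # assumed palette name
                "line_type": "solid",
                "line_width": "normal",
            },
        }
    ],
    "yaxis": {"include_zero": True, "scale": "linear"},
    "time": {"live_span": "1h"},                 # Option 1: wrapper for live span
}
```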
Option 32 object The top list visualization enables you to display a list of tag values, such as hostname or service, with the most or least of any metric value (for example, the highest consumers of CPU, or the hosts with the least disk space). custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply.
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets.
Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display.
type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` Option 33 object This widget displays a topology of nodes and edges for different data sources. It replaces the service map widget. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] One or more Topology requests. query object Query to service-based topology data sources like the service map or data streams. data_source enum Name of the data source. Allowed enum values: `data_streams,service_map` filters [string] Your environment and primary tag (or * if enabled for your account). service string Name of the service. request_type enum Widget request type. Allowed enum values: `topology` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the topology map widget. Allowed enum values: `topology_map` default: `topology_map` Option 34 object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL.
Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta.
data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource.
service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span.
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` id int64 ID of the widget. layout object The layout for a widget on a `free` or **new dashboard layout** dashboard. height [_required_] int64 The height of the widget. Should be a non-negative integer. is_column_break boolean Whether the widget should be the first one on the second column in high density or not. **Note** : Only for the **new dashboard layout** and only one widget in the dashboard should have this property set to `true`. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. ``` { "author_handle": "test@datadoghq.com", "author_name": "John Doe", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "id": "123-abc-456", "is_read_only": false, "layout_type": "ordered", "modified_at": "2019-09-19T10:00:00.000Z", "notify_list": [], "reflow_type": "string", "restricted_roles": [], "tags": [], "template_variable_presets": [ { "name": "string", "template_variables": [ { "name": "string", "value": "string", "values": [] } ] } ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "default": "my-host", "defaults": [ "my-host-1", "my-host-2" ], "name": "host1", "prefix": "host", "type": "group" } ], "title": "", "url": "/dashboard/123-abc-456/example-dashboard-title", "widgets": [ { "definition": { "requests": { "fill": { "q": "avg:system.cpu.user{*}" } }, "type": "hostmap" }, "id": "integer", "layout": { "height": 0, "is_column_break": false, "width": 0, "x": 0, "y": 0 } } ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
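The error responses above (authentication error, item not found, too many requests) all return the same `errors` envelope. The following is a minimal sketch, assuming the official `datadog_api_client` Python package used in the code examples below, `DD_API_KEY`/`DD_APP_KEY` set in the environment, and a placeholder `DASHBOARD_ID` environment variable; the `ApiException` import path is taken from the client's `exceptions` module and is stated here as an assumption rather than part of this reference.

```
"""
Sketch only: how the error envelope documented above may surface
through the Python client when a dashboard lookup fails.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v1.api.dashboards_api import DashboardsApi

configuration = Configuration()

with ApiClient(configuration) as api_client:
    api_instance = DashboardsApi(api_client)
    try:
        dashboard = api_instance.get_dashboard(dashboard_id=environ["DASHBOARD_ID"])
        print(dashboard.title)
    except ApiException as error:
        # Authentication errors, missing dashboards, and rate limits raise
        # ApiException (or a subclass) carrying the {"errors": [...]} body shown here.
        print(f"Dashboard request failed: {error}")
```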
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python-legacy) ##### Get a dashboard Copy ``` # Path parameters export dashboard_id="CHANGE_ME" # Curl command (use the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/dashboard/${dashboard_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a dashboard ``` """ Get a dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.get_dashboard( dashboard_id=DASHBOARD_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a dashboard ``` # Get a dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] p api_instance.get_dashboard(DASHBOARD_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a dashboard ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dashboard_id = '' dog.get_board(dashboard_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a dashboard ``` // Get a dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) 
func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.GetDashboard(ctx, DashboardID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.GetDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.GetDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a dashboard ``` // Get a dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); try { Dashboard result = apiInstance.getDashboard(DASHBOARD_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#getDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a dashboard ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) dashboard_id = '' api.Dashboard.get(dashboard_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get a dashboard ``` // Get a dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.get_dashboard(dashboard_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a dashboard ``` /** * Get a dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiGetDashboardRequest = { dashboardId: DASHBOARD_ID, }; apiInstance .getDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all dashboards](https://docs.datadoghq.com/api/latest/dashboards/#get-all-dashboards) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#get-all-dashboards-v1) GET https://api.ap1.datadoghq.com/api/v1/dashboardhttps://api.ap2.datadoghq.com/api/v1/dashboardhttps://api.datadoghq.eu/api/v1/dashboardhttps://api.ddog-gov.com/api/v1/dashboardhttps://api.datadoghq.com/api/v1/dashboardhttps://api.us3.datadoghq.com/api/v1/dashboardhttps://api.us5.datadoghq.com/api/v1/dashboard ### Overview Get all dashboards. **Note** : This query will only return custom-created or cloned dashboards. This query will not return preset dashboards. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[shared] boolean When `true`, this query only returns shared custom-created or cloned dashboards. filter[deleted] boolean When `true`, this query returns only deleted custom-created or cloned dashboards. This parameter is incompatible with `filter[shared]`. count integer The maximum number of dashboards returned in the list. start integer The specific offset to use as the beginning of the returned response. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#ListDashboards-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#ListDashboards-403-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#ListDashboards-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Dashboard summary response. Field Type Description dashboards [object] List of dashboard definitions. author_handle string Identifier of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string Dashboard identifier. is_read_only boolean **DEPRECATED** : Whether this dashboard is read-only. If True, only the author and admins can make changes to it. 
This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. title string Title of the dashboard. url string URL of the dashboard. ``` { "dashboards": [ { "author_handle": "string", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "id": "string", "is_read_only": false, "layout_type": "ordered", "modified_at": "2019-09-19T10:00:00.000Z", "title": "string", "url": "string" } ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python-legacy) ##### Get all dashboards Copy ``` # Curl command (use the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/dashboard" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all dashboards ``` """ Get all dashboards returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.list_dashboards( filter_shared=False, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all dashboards ``` # Get all dashboards returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new opts = { filter_shared: false, } p api_instance.list_dashboards(opts) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all dashboards ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.get_all_boards() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all dashboards ``` // Get all dashboards returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.ListDashboards(ctx, *datadogV1.NewListDashboardsOptionalParameters().WithFilterShared(false)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.ListDashboards`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.ListDashboards`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all dashboards ``` // Get all dashboards returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.api.DashboardsApi.ListDashboardsOptionalParameters; import com.datadog.api.client.v1.model.DashboardSummary; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); try { DashboardSummary result = apiInstance.listDashboards(new ListDashboardsOptionalParameters().filterShared(false)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#listDashboards"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all dashboards ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' 
} initialize(**options) api.Dashboard.get_all() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get all dashboards ``` // Get all dashboards returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::api_dashboards::ListDashboardsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .list_dashboards(ListDashboardsOptionalParams::default().filter_shared(false)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all dashboards ``` /** * Get all dashboards returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); const params: v1.DashboardsApiListDashboardsRequest = { filterShared: false, }; apiInstance .listDashboards(params) .then((data: v1.DashboardSummary) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#update-a-dashboard-v1) PUT https://api.ap1.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.ap2.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.eu/api/v1/dashboard/{dashboard_id}https://api.ddog-gov.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us3.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us5.datadoghq.com/api/v1/dashboard/{dashboard_id} ### Overview Update a dashboard using the specified ID. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dashboard_id [_required_] string The ID of the dashboard. ### Request #### Body Data (required) Update Dashboard request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Field Type Description author_handle string Identifier of the dashboard author. 
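Because `title`, `layout_type`, and `widgets` are required in the update body (the field list continues below), a common pattern is to fetch the current definition, change only what you need, and send the whole object back. The following is a minimal sketch using the official Python client; the `update_dashboard` method name follows the client's generated naming, and reusing the fetched `Dashboard` object as the request body plus the `DASHBOARD_ID` environment variable are assumptions for illustration.

```
"""
Sketch only: update a dashboard title by round-tripping the existing definition.
Assumes datadog_api_client with DD_API_KEY/DD_APP_KEY set and a placeholder
DASHBOARD_ID environment variable.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboards_api import DashboardsApi

configuration = Configuration()

with ApiClient(configuration) as api_client:
    api_instance = DashboardsApi(api_client)

    # Fetch the current definition so the required title, layout_type, and
    # widgets fields are already populated.
    dashboard = api_instance.get_dashboard(dashboard_id=environ["DASHBOARD_ID"])

    # Change only what is needed, then send the full definition back.
    dashboard.title = "Example dashboard title (updated)"
    response = api_instance.update_dashboard(
        dashboard_id=environ["DASHBOARD_ID"],
        body=dashboard,
    )
    print(response.url)
```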
author_name string Name of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string ID of the dashboard. is_read_only boolean **DEPRECATED** : Whether this dashboard is read-only. If True, only the author and admins can make changes to it. This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type [_required_] enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. notify_list [string] List of handles of users to notify when changes are made to this dashboard. reflow_type enum Reflow type for a **new dashboard layout** dashboard. Set this only when layout type is 'ordered'. If set to 'fixed', the dashboard expects all widgets to have a layout, and if it's set to 'auto', widgets should not have layouts. Allowed enum values: `auto,fixed` restricted_roles [string] A list of role identifiers. Only the author and users associated with at least one of these roles can edit this dashboard. tags [string] List of team names representing ownership of a dashboard. template_variable_presets [object] Array of template variables saved views. name string The name of the variable. template_variables [object] List of variables. name string The name of the variable. value string **DEPRECATED** : (deprecated) The value of the template variable within the saved view. Cannot be used in conjunction with `values`. values [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. Cannot be used in conjunction with `value`. template_variables [object] List of template variables for this dashboard. available_values [string] The list of values that the template variable drop-down is limited to. default string **DEPRECATED** : (deprecated) The default value for the template variable on dashboard load. Cannot be used in conjunction with `defaults`. defaults [string] One or many default values for template variables on load. If more than one default is specified, they will be unioned together with `OR`. Cannot be used in conjunction with `default`. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). title [_required_] string Title of the dashboard. url string The URL of the dashboard. widgets [_required_] [object] List of widgets to display on the dashboard. definition [_required_] [Definition of the widget](https://docs.datadoghq.com/dashboards/widgets/). Option 1 object Alert graphs are timeseries graphs showing the current status of any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. 
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the alert graph widget. Allowed enum values: `alert_graph` default: `alert_graph` viz_type [_required_] enum Whether to display the Alert Graph as a timeseries or a top list. Allowed enum values: `timeseries,toplist` Option 2 object Alert values are query values showing the current value of the metric in any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. precision int64 Number of decimals to show. If not defined, will use the raw value. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of value in the widget. type [_required_] enum Type of the alert value widget. Allowed enum values: `alert_value` default: `alert_value` unit string Unit to display with the value. Option 3 object The Change graph shows you the change in a value over the time period chosen. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. 
Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. 
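The `queries` and `formulas` options above follow the formula-and-functions pattern used throughout the dashboards API; the remaining Cloud Cost query fields continue below. As a minimal sketch only, with a placeholder metric and query name that are not part of the schema, a change widget request combining one metrics query with a formula might look like:

```json
{
  "change_type": "absolute",
  "compare_to": "hour_before",
  "order_by": "change",
  "order_dir": "desc",
  "queries": [
    {
      "data_source": "metrics",
      "name": "query1",
      "query": "avg:system.cpu.user{*} by {host}",
      "aggregator": "avg"
    }
  ],
  "formulas": [
    {
      "formula": "query1",
      "limit": { "count": 10, "order": "desc" }
    }
  ]
}
```

Here `query1` is referenced by name in the formula, and the `limit` object caps how many results the formula returns, as described above.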
query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. 
title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` Option 4 object Check status shows the current status or number of results for any check performed. check [_required_] string Name of the check to use in the widget. group string Group reporting a single check. group_by [string] List of tag prefixes to group by in the case of a cluster check. grouping [_required_] enum The kind of grouping to use. Allowed enum values: `check,cluster` tags [string] List of tags used to filter the groups reporting a cluster check. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the check status widget. Allowed enum values: `check_status` default: `check_status` Option 5 object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). 
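Before the distribution widget's `requests` array is described below, here is a minimal sketch of the check status widget (Option 4) defined above. The check name, tags, and title are illustrative placeholders, not values mandated by the schema:

```json
{
  "type": "check_status",
  "check": "datadog.agent.up",
  "grouping": "cluster",
  "group_by": ["host"],
  "tags": ["env:prod"],
  "title": "Agent check status",
  "time": { "live_span": "1h" }
}
```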
requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. 
Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. 
Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. 
`primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` Option 6 object The event stream is a widget version of the stream of events on the Event Stream view. Only available on FREE layout dashboards. event_size enum Size to use to display an event. Allowed enum values: `s,l` query [_required_] string Query to filter the event stream with. tags_execution string The execution method for multi-value filters. Can be either `and` or `or`. time Time setting for the widget. 
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event stream widget. Allowed enum values: `event_stream` default: `event_stream` Option 7 object The event timeline is a widget version of the timeline that appears at the top of the Event Stream view. Only available on FREE layout dashboards. query [_required_] string Query to filter the event timeline with. tags_execution string The execution method for multi-value filters. Can be either `and` or `or`. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event timeline widget. Allowed enum values: `event_timeline` default: `event_timeline` Option 8 object Free text is a widget that allows you to add headings to your screenboard. Commonly used to state the overall purpose of the dashboard. Only available on FREE layout dashboards. color string Color of the text. font_size string Size of the text. text [_required_] string Text to display. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the free text widget. 
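The event timeline widget (Option 7) above takes only a query, an optional `tags_execution` mode, and a time setting; the free text widget's allowed `type` value continues directly below. As an illustrative sketch only (the query string and title are placeholders):

```json
{
  "type": "event_timeline",
  "query": "tags:deployment",
  "tags_execution": "and",
  "title": "Deployments",
  "time": { "live_span": "1d" }
}
```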
Allowed enum values: `free_text` default: `free_text` Option 9 object The funnel visualization displays a funnel of user sessions that maps a sequence of view navigation and user interaction in your application. requests [_required_] [object] Request payload used to query items. query [_required_] object Updated funnel widget. data_source [_required_] enum Source from which to query items to display in the funnel. Allowed enum values: `rum` default: `rum` query_string [_required_] string The widget query. steps [_required_] [object] List of funnel steps. facet [_required_] string The facet of the step. value [_required_] string The value of the step. request_type [_required_] enum Widget request type. Allowed enum values: `funnel` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of funnel widget. Allowed enum values: `funnel` default: `funnel` Option 10 object This visualization displays a series of values by country on a world map. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. columns [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` conditional_formats [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. 
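The funnel widget (Option 9) above nests its query inside a single request object; the remaining geomap fields (Option 10) continue below. A minimal sketch, with placeholder RUM facets and step values that are not part of the schema:

```json
{
  "type": "funnel",
  "title": "Checkout funnel",
  "requests": [
    {
      "request_type": "funnel",
      "query": {
        "data_source": "rum",
        "query_string": "@type:view",
        "steps": [
          { "facet": "@view.name", "value": "/cart" },
          { "facet": "@view.name", "value": "/checkout" }
        ]
      }
    }
  ]
}
```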
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
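Option 2 above (the formula and functions events query) is the shape used for logs, spans, RUM, and the other event platform data sources; the remaining process query fields (Option 3) continue below. A minimal sketch, where the search string and facet are placeholders rather than schema requirements:

```json
{
  "data_source": "logs",
  "name": "query2",
  "compute": { "aggregation": "count" },
  "search": { "query": "status:error" },
  "group_by": [
    {
      "facet": "service",
      "limit": 10,
      "sort": { "aggregation": "count", "order": "desc" }
    }
  ],
  "indexes": []
}
```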
data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
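For illustration, the Cloud Cost query option (Option 7) above can be sketched as part of a widget request. This is a minimal, non-authoritative example expressed as a Python dictionary mirroring the JSON body; the cost query string, the query name, and the formula alias are placeholders.

```python
# A widget request carrying a single Cloud Cost query and a formula that
# references the query by its name. The cost query string is a placeholder.
cloud_cost_request = {
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "cloud_cost",   # required value for Cloud Cost queries
            "name": "cost",                # the name the formula refers to
            "query": "sum:aws.cost.amortized{*} by {aws_product}",  # placeholder query
            "aggregator": "sum",           # one of avg, last, max, min, sum, percentile
        }
    ],
    "formulas": [{"formula": "cost", "alias": "Amortized cost"}],
}
```

The `name` given to each query is what the `formula` expression refers to.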
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object The style to apply to the request for points layer. color_by string The category to color the points by. text_formats [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. match [_required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` replace Replace rule for the table widget text format. Option 1 object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. Option 2 object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
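The widget `sort` object described above accepts both formula-based and group-based ordering rules. A minimal sketch, with a placeholder group name:

```python
# A widget sort with one formula-based and one group-based ordering rule.
widget_sort = {
    "count": 50,                      # limit the widget to 50 items
    "order_by": [
        {"type": "formula", "index": 0, "order": "desc"},      # sort by the first formula
        {"type": "group", "name": "service", "order": "asc"},  # placeholder group name
    ],
}
```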
type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). Option 11 object The groups widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the group title. banner_img string URL of image to display as a banner for the group. layout_type [_required_] enum Layout type of the group. Allowed enum values: `ordered` show_title boolean Whether to show the title or not. default: `true` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the group widget. Allowed enum values: `group` default: `group` widgets [_required_] [object] List of widget groups. Option 12 object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. 
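As a concrete illustration of the group widget (Option 11) above, the following sketch shows a group holding a single child widget. It assumes the usual `definition` wrapper used for dashboard widgets; the title, background color, and note content are placeholders, and the note widget fields are documented further below.

```python
# A group widget with one nested note widget. Values are placeholders.
group_definition = {
    "type": "group",
    "layout_type": "ordered",           # only "ordered" is allowed
    "title": "Service overview",        # placeholder title
    "show_title": True,
    "background_color": "vivid_blue",   # placeholder background color
    "widgets": [
        # Assumes the `definition` wrapper used for widgets elsewhere in this API.
        {
            "definition": {
                "type": "note",
                "content": "Grouped widgets go here.",  # placeholder note content
            }
        }
    ],
}
```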
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. 
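A single formula entry from the `formulas` list above, combining a conditional format, a result limit, and a canonical display unit, could be sketched as follows; the expression, alias, and unit name are placeholders.

```python
# A formula entry with a conditional format, a limit, and a number format.
formula_entry = {
    "formula": "query1 / query2 * 100",   # placeholder expression over named queries
    "alias": "Error rate (%)",            # placeholder alias
    "limit": {"count": 10, "order": "desc"},
    "conditional_formats": [
        {"comparator": ">", "value": 5.0, "palette": "white_on_red"}
    ],
    "number_format": {
        "unit": {"type": "canonical_unit", "unit_name": "percent"}  # placeholder unit
    },
}
```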
style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog dashboards documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds.
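The `log_query`, `apm_query`, `network_query`, `rum_query`, and similar objects above all share the same shape: an index selector, a `compute` (or `multi_compute`), an optional `group_by` with its own sort, and a `search`. A minimal sketch of that shape, with placeholder facet and search values:

```python
# The shared log-query shape used throughout this page.
log_query = {
    "index": "*",                         # comma-separated index names, or "*" for all
    "compute": {"aggregation": "count"},  # mutually exclusive with multi_compute
    "group_by": [
        {
            "facet": "service",           # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},  # placeholder search value
}
```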
search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. 
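The query options listed above can be combined with formulas in a single request. For instance, an event platform query (Option 2) counting logs grouped by a facet might be sketched as below; the facet and search string are placeholders.

```python
# A formula-and-functions event platform query (Option 2).
events_query = {
    "data_source": "logs",                 # any listed event platform source
    "name": "errors",                      # name used in formulas
    "compute": {"aggregation": "count"},
    "indexes": [],                         # omit or leave empty to query all indexes
    "group_by": [
        {
            "facet": "service",            # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},   # placeholder events search string
}
```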
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measure to query. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Type of the SLO to query. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas.
query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` Option 13 object The host map widget graphs any metric across your hosts using the same visualization available from the main Host Map page. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group [string] List of tag prefixes to group by. no_group_hosts boolean Whether to show the hosts that don’t fit in a group. no_metric_hosts boolean Whether to show the hosts with no metrics. node_type enum Which type of node to use in the map. Allowed enum values: `host,container` notes string Notes on the title. requests [_required_] object List of definitions. fill object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. 
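Putting the heat map fields together, a minimal heat map widget (Option 12) might be sketched as follows; the metric query, title, and palette name are placeholders, and only a subset of the optional fields is shown.

```python
# A heat map widget definition using a single metrics query.
heatmap_definition = {
    "type": "heatmap",
    "title": "CPU distribution across hosts",   # placeholder title
    "show_legend": True,
    "requests": [
        {
            "response_format": "timeseries",
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "query1",
                    "query": "avg:system.cpu.user{*} by {host}",  # placeholder metric query
                }
            ],
            "formulas": [{"formula": "query1"}],
            "style": {"palette": "dog_classic"},  # placeholder palette name
        }
    ],
    "yaxis": {"include_zero": True, "min": "auto", "max": "auto", "scale": "linear"},
}
```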
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. size object Updated host map. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
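The `process_query` object that appears in several of the requests above can be sketched as below; the metric, search term, and filter list are placeholders.

```python
# The process_query shape used by widget requests on this page.
process_query = {
    "metric": "process.stat.cpu.total_pct",  # placeholder process metric
    "search_by": "nginx",                     # placeholder search term
    "filter_by": ["env:prod"],                # placeholder filter list
    "limit": 10,                              # max number of items in the filter list
}
```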
query [_required_] string Search value to apply. scope [string] List of tags used to filter the map. style object The style to apply to the widget. fill_max string Max value to use to color the map. fill_min string Min value to use to color the map. palette string Color palette to apply to the widget. palette_flip boolean Whether to flip the palette tones. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the host map widget. Allowed enum values: `hostmap` default: `hostmap` Option 14 object The iframe widget allows you to embed a portion of any other web page on your dashboard. Only available on FREE layout dashboards. type [_required_] enum Type of the iframe widget. Allowed enum values: `iframe` default: `iframe` url [_required_] string URL of the iframe. Option 15 object The image widget allows you to embed an image on your dashboard. An image can be a PNG, JPG, or animated GIF. Only available on FREE layout dashboards. has_background boolean Whether to display a background or not. default: `true` has_border boolean Whether to display a border or not. default: `true` horizontal_align enum Horizontal alignment. Allowed enum values: `center,left,right` margin enum Size of the margins around the image. **Note** : `small` and `large` values are deprecated. Allowed enum values: `sm,md,lg,small,large` sizing enum How to size the image on the widget. The values are based on the image `object-fit` CSS properties. **Note** : `zoom`, `fit` and `center` values are deprecated. Allowed enum values: `fill,contain,cover,none,scale-down,zoom,fit,center` type [_required_] enum Type of the image widget. Allowed enum values: `image` default: `image` url [_required_] string URL of the image. url_dark_theme string URL of the image in dark mode. vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 16 object The list stream visualization displays a table of recent events in your application that match a search criteria using user-defined columns. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". requests [_required_] [object] Request payload used to query items. columns [_required_] [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` query [_required_] object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. compute [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` group_by [object] Group by configuration for the List Stream Widget. 
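A host map widget (Option 13) combining the `fill` and `size` requests described above might be sketched as follows; the metric queries, grouping tag, scope, and palette are placeholders.

```python
# A host map widget definition with fill and size metric queries.
hostmap_definition = {
    "type": "hostmap",
    "title": "Hosts by CPU and memory",          # placeholder title
    "node_type": "host",                         # or "container"
    "requests": {
        "fill": {"q": "avg:system.cpu.user{*} by {host}"},  # placeholder fill metric
        "size": {"q": "avg:system.mem.used{*} by {host}"},  # placeholder size metric
    },
    "group": ["availability-zone"],              # placeholder tag prefix to group by
    "scope": ["env:prod"],                       # placeholder tag filter
    "no_metric_hosts": True,
    "no_group_hosts": True,
    "style": {
        "palette": "green_to_orange",            # placeholder palette name
        "palette_flip": False,
        "fill_min": "0",                         # placeholder min value for coloring
        "fill_max": "100",                       # placeholder max value for coloring
    },
}
```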
Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format [_required_] enum Widget response format. Allowed enum values: `event_list` show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the list stream widget. Allowed enum values: `list_stream` default: `list_stream` Option 17 object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` Option 18 object The monitor summary widget displays a summary view of all your Datadog monitors, or a subset based on a query. Only available on FREE layout dashboards. color_preference enum Which color to use on the widget. Allowed enum values: `background,text` count int64 **DEPRECATED** : The number of monitors to display. display_format enum What to display on the widget. Allowed enum values: `counts,countsAndList,list` hide_zero_counts boolean Whether to show counts of 0 or not. query [_required_] string Query to filter the monitors with. show_last_triggered boolean Whether to show the time that has elapsed since the monitor/group triggered. show_priority boolean Whether to show the priorities column. sort enum Widget sorting methods. Allowed enum values: `name,group,status,tags,triggered,group,asc,group,desc,name,asc,name,desc,status,asc,status,desc,tags,asc,tags,desc,triggered,asc,triggered,desc,priority,asc,priority,desc` start int64 **DEPRECATED** : The start of the list. Typically 0. summary_type enum Which summary type should be used. Allowed enum values: `monitors,groups,combined` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the monitor summary widget. Allowed enum values: `manage_status` default: `manage_status` Option 19 object The notes and links widget is similar to free text widget, but allows for more formatting options. background_color string Background color of the note. content [_required_] string Content of the note. font_size string Size of the text. has_padding boolean Whether to add padding or not. default: `true` show_tick boolean Whether to show a tick or not. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` tick_edge enum Define how you want to align the text on the widget. Allowed enum values: `bottom,left,right,top` tick_pos string Where to position the tick on an edge. type [_required_] enum Type of the note widget. Allowed enum values: `note` default: `note` vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` Option 20 object The powerpack widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the powerpack title. banner_img string URL of image to display as a banner for the powerpack. powerpack_id [_required_] string UUID of the associated powerpack. show_title boolean Whether to show the title or not. default: `true` template_variables object Powerpack template variables. controlled_by_powerpack [object] Template variables controlled at the powerpack level. 
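A monitor summary widget (Option 18) might be sketched as follows; the monitor filter query and title are placeholders.

```python
# A monitor summary (manage_status) widget definition.
manage_status_definition = {
    "type": "manage_status",
    "title": "Production monitors",    # placeholder title
    "query": 'tag:"env:prod"',         # placeholder monitor filter query
    "summary_type": "monitors",        # monitors, groups, or combined
    "display_format": "countsAndList",
    "color_preference": "text",
    "hide_zero_counts": True,
    "show_last_triggered": True,
    "sort": "status,asc",              # one of the documented sort values
}
```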
name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. controlled_externally [object] Template variables controlled by the external resource, such as the dashboard this powerpack is on. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. title string Title of the widget. type [_required_] enum Type of the powerpack widget. Allowed enum values: `powerpack` default: `powerpack` Option 21 object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
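The `apm_query`, `audit_query`, `event_query`, `log_query`, `rum_query`, and `security_query` request fields described in this schema all share the same log-query shape (`compute` or `multi_compute`, `group_by`, `search`, `index`). A minimal sketch of one such object, with an illustrative facet and search string:

```python
# Sketch of the shared log-query object used by apm_query, log_query,
# rum_query, and similar request fields. Facets and the search string
# are illustrative placeholders.
log_query = {
    "index": "*",                    # comma-separated list of index names; "*" queries all indexes
    "compute": {
        "aggregation": "count",      # the aggregation method
        # "facet": "@duration",      # optional facet to aggregate over
        # "interval": 60,            # optional time interval in seconds
    },
    "group_by": [
        {
            "facet": "service",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},   # the query being made on the logs
}
```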
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. Option 2 object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
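As an illustration of the `formulas` and `conditional_formats` entries above, the following sketch defines one formula with a limit and a single conditional format; the expression, threshold, and palette are arbitrary illustrative choices:

```python
# Sketch of a conditional_formats entry and a formula that uses it.
# The threshold, palette, and expression are illustrative placeholders.
conditional_format = {
    "comparator": ">",            # one of =, >, >=, <, <=
    "value": 99.5,                # value for the comparator
    "palette": "white_on_red",    # one of the allowed palette values
    "hide_value": False,
}

formula = {
    "formula": "query1 / query2 * 100",   # string expression built from query names
    "alias": "error rate (%)",
    "conditional_formats": [conditional_format],
    "limit": {"count": 10, "order": "desc"},
}
```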
query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string.
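Option 1 (metrics) and Option 2 (events) above are the two most common formula-and-function query shapes accepted in the `queries` array. A hedged sketch of one of each; the metric, filter, and facet names are placeholders:

```python
# Sketch of formula-and-function queries for the "queries" array.
# Metric names, filters, and facets are illustrative placeholders.
metrics_query = {
    "data_source": "metrics",
    "name": "query1",                                   # referenced from formulas
    "query": "avg:system.cpu.user{env:prod} by {host}",
    "aggregator": "avg",
}

events_query = {
    "data_source": "logs",                              # any event-platform data source
    "name": "query2",
    "compute": {"aggregation": "count"},
    "search": {"query": "status:error"},
    "indexes": ["*"],
    "group_by": [
        {
            "facet": "service",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
}
```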
storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
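Continuing with the query options above, here are minimal sketches of a process query (Option 3) and an APM resource stats query (Option 5); the service, environment, and metric names are placeholders:

```python
# Sketch of a process query (Option 3) and an APM resource stats query
# (Option 5). Service, environment, and metric names are placeholders.
process_query = {
    "data_source": "process",
    "name": "query1",
    "metric": "process.stat.cpu.total_pct",
    "text_filter": "nginx",
    "tag_filters": ["env:prod"],
    "limit": 10,
    "sort": "desc",
    "is_normalized_cpu": True,
}

apm_resource_stats_query = {
    "data_source": "apm_resource_stats",
    "name": "query2",
    "service": "web-store",
    "env": "prod",
    "stat": "latency_p95",
    "group_by": ["resource_name"],
}
```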
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
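For the SLO measures query (Option 6) and Cloud Cost query (Option 7) described above, minimal sketches follow; the SLO ID and the cost query string are placeholders:

```python
# Sketch of an SLO measures query (Option 6) and a Cloud Cost query
# (Option 7). The SLO ID and the cost query string are placeholders.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456abc789",        # placeholder SLO ID
    "measure": "slo_status",
    "group_mode": "overall",
    "name": "query1",
}

cloud_cost_query = {
    "data_source": "cloud_cost",
    "name": "query2",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",   # placeholder cost query
}
```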
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` timeseries_background object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` object The run workflow widget allows you to run a workflow from a dashboard. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] Array of workflow inputs to map to dashboard template variables. name [_required_] string Name of the workflow input. value [_required_] string Dashboard template variable. Can be suffixed with '.value' or '.key'. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch.
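Pulling the pieces above together, a minimal `query_value` widget definition built from one formula-and-function metrics query might look like the following sketch; the metric, title, and precision are placeholders:

```python
# Minimal sketch of a query_value widget definition combining the request,
# query, and formula shapes described above. Values are placeholders.
query_value_widget = {
    "definition": {
        "type": "query_value",
        "title": "CPU usage",
        "autoscale": True,
        "precision": 2,
        "requests": [
            {
                "response_format": "scalar",
                "queries": [
                    {
                        "data_source": "metrics",
                        "name": "query1",
                        "query": "avg:system.cpu.user{env:prod}",
                        "aggregator": "avg",
                    }
                ],
                "formulas": [{"formula": "query1"}],
            }
        ],
        "time": {"live_span": "1h"},   # the live-span wrapper form of the time setting
    }
}
```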
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the run workflow widget. Allowed enum values: `run_workflow` default: `run_workflow` workflow_id [_required_] string Workflow id. object Use the SLO List widget to track your SLOs (Service Level Objectives) on dashboards. _required_] [object] Array of one request object to display in the widget. _required_] object Updated SLO List widget. limit int64 Maximum number of results to display in the table. default: `100` query_string [_required_] string Widget query. [object] Options for sorting results. column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` request_type [_required_] enum Widget request type. Allowed enum values: `slo_list` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO List widget. Allowed enum values: `slo_list` default: `slo_list` object Use the SLO and uptime widget to track your SLOs (Service Level Objectives) and uptime on screenboards and timeboards. additional_query_filters string Additional filters applied to the SLO query. global_time_target string Defined global time target. show_error_budget boolean Defined error budget. slo_id string ID of the SLO displayed. time_windows [string] Times being monitored. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO widget. Allowed enum values: `slo` default: `slo` view_mode enum Define how you want the SLO to be displayed. Allowed enum values: `overall,component,both` view_type [_required_] string Type of view displayed by the widget. default: `detail` object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. _required_] object Widget definition. object Scatterplot request containing formulas and functions. [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. 
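For the run workflow and SLO List widgets described above, minimal sketches follow. The workflow ID, input name, and query string are placeholders, and the `inputs`, `requests`, and nested `query` field names are assumptions based on the public widget schema, since those names are not legible on this page:

```python
# Sketches of a run_workflow widget and an slo_list widget.
# IDs, input names, and the query string are placeholders; "inputs",
# "requests", and the nested "query" key names are assumed.
run_workflow_widget = {
    "definition": {
        "type": "run_workflow",
        "workflow_id": "00000000-0000-0000-0000-000000000000",   # placeholder UUID
        "inputs": [
            {"name": "environment", "value": "$env.value"},       # maps a dashboard template variable
        ],
        "title": "Restart service",
    }
}

slo_list_widget = {
    "definition": {
        "type": "slo_list",
        "title": "Team SLOs",
        "requests": [
            {
                "request_type": "slo_list",
                "query": {
                    "query_string": "env:prod",                    # widget query (placeholder)
                    "limit": 100,
                    "sort": [{"column": "status.sli", "order": "desc"}],   # placeholder facet path
                },
            }
        ],
    }
}
```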
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . 
primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. 
limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object This widget displays a map of a service to all of the services that call it, and all of the services that it calls. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. filters [_required_] [string] Your environment and primary tag (or * if enabled for your account). service [_required_] string The ID of the service you want to map. title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service map widget. Allowed enum values: `servicemap` default: `servicemap` object The service summary displays the graphs of a chosen service in your screenboard. Only available on FREE layout dashboards. display_format enum Number of columns to display. 
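As a concrete reference for the scatter plot and service map widgets described above, minimal sketches follow. The metric queries, service name, and filters are placeholders, and the `requests.table` nesting key is an assumption based on the public scatterplot schema, since that key name is not legible on this page:

```python
# Sketch of a scatterplot widget using the formulas-and-functions request
# with x and y dimensions, and a service map widget. Values are placeholders;
# the "table" nesting key is assumed.
scatterplot_widget = {
    "definition": {
        "type": "scatterplot",
        "title": "Memory vs. CPU by host",
        "requests": {
            "table": {
                "formulas": [
                    {"formula": "query1", "dimension": "x"},
                    {"formula": "query2", "dimension": "y"},
                ],
                "queries": [
                    {
                        "data_source": "metrics",
                        "name": "query1",
                        "query": "avg:system.cpu.user{*} by {host}",
                        "aggregator": "avg",
                    },
                    {
                        "data_source": "metrics",
                        "name": "query2",
                        "query": "avg:system.mem.used{*} by {host}",
                        "aggregator": "avg",
                    },
                ],
                "response_format": "scalar",
            }
        },
        "xaxis": {"include_zero": True, "scale": "linear"},
        "yaxis": {"include_zero": True, "scale": "linear"},
    }
}

servicemap_widget = {
    "definition": {
        "type": "servicemap",
        "service": "web-store",          # the service to map (placeholder)
        "filters": ["env:prod", "*"],    # environment and primary tag
        "title": "web-store dependencies",
    }
}
```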
Allowed enum values: `one_column,two_column,three_column` env [_required_] string APM environment. service [_required_] string APM service. show_breakdown boolean Whether to show the latency breakdown or not. show_distribution boolean Whether to show the latency distribution or not. show_errors boolean Whether to show the error metrics or not. show_hits boolean Whether to show the hits metrics or not. show_latency boolean Whether to show the latency metrics or not. show_resource_list boolean Whether to show the resource list or not. size_format enum Size of the widget. Allowed enum values: `small,medium,large` span_name [_required_] string APM span name. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service summary widget. Allowed enum values: `trace_service` default: `trace_service` object The split graph widget allows you to create repeating units of a graph - one for each value in a group (for example: one per service) has_uniform_y_axes boolean Normalize y axes across graphs size [_required_] enum Size of the individual graphs in the split. Allowed enum values: `xs,sm,md,lg` _required_] The original widget we are splitting on. object The Change graph shows you the change in a value over the time period chosen. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. _required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. 
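A minimal sketch of the service summary (`trace_service`) widget described above; the service, environment, and span name are placeholders:

```python
# Sketch of a service summary ("trace_service") widget definition.
# The service, environment, and span name are placeholders.
trace_service_widget = {
    "definition": {
        "type": "trace_service",
        "service": "web-store",
        "env": "prod",
        "span_name": "web.request",
        "show_hits": True,
        "show_errors": True,
        "show_latency": True,
        "show_breakdown": True,
        "show_distribution": True,
        "show_resource_list": False,
        "size_format": "medium",          # small | medium | large
        "display_format": "two_column",   # one_column | two_column | three_column
        "time": {"live_span": "1h"},
        "title": "web-store summary",
    }
}
```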
limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. 
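As an illustration of the Change graph request described above, a minimal sketch using one formula-and-function metrics query; the metric and grouping are placeholders, and `response_format` is included by analogy with the other formula-based requests in this schema:

```python
# Sketch of a change widget request built from a formula-and-function
# metrics query. The metric and grouping are placeholders.
change_request = {
    "response_format": "scalar",          # as on the other formula-based requests above
    "queries": [
        {
            "data_source": "metrics",
            "name": "query1",
            "query": "sum:trace.web.request.errors{env:prod} by {service}",
            "aggregator": "sum",
        }
    ],
    "formulas": [{"formula": "query1"}],
    "change_type": "relative",       # absolute | relative
    "compare_to": "day_before",      # hour_before | day_before | week_before | month_before
    "increase_good": False,          # an increase in errors is not good
    "order_by": "change",            # change | name | present | past
    "order_dir": "desc",
}
```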
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. 
name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. 
Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. 
Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` object This visualization displays a series of values by country on a world map. [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply.
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog dashboards documentation. palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute.
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. 
Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object The style to apply to the request for points layer. color_by string The category to color the points by. [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. _required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. 
Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` Replace rule for the table widget text format. object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. style [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` view [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` object The log query. object Define computation for a log query.
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog dashboards documentation. palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once.
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries.
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. 
name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] object Widget definition. object Scatterplot request containing formulas and functions.
[object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. 
primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
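The log query object recurs throughout this response model (compute, group-by with sort, index, and search). A hedged sketch of its shape, with hypothetical values and with the nested `sort` and `search` key names assumed rather than taken from the flattened listing above:

```python
# Illustrative log query; field names follow the schema, values are made up.
log_query = {
    "index": "main",                               # or "*" to query all indexes
    "compute": {"aggregation": "count", "interval": 60},  # interval in seconds
    "group_by": [
        {
            "facet": "service",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},           # search value to apply
}
```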
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Sunbursts are spot on to highlight how groups contribute to the total of a query. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. Configuration of the legend. object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. 
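The three time-setting shapes described above can be sketched as follows. All values are hypothetical; the field names (`live_span`, `type`, `unit`, `value`, `from`, `to`, `hide_incomplete_cost_data`) come from the schema.

```python
# Legacy wrapper: a single live_span enum value.
time_legacy = {"live_span": "1h"}

# Newer "live" span: an arbitrary duration built from unit and value (17 minutes here).
time_live = {
    "type": "live",
    "unit": "minute",
    "value": 17,
    "hide_incomplete_cost_data": False,
}

# Fixed span: explicit start and end in seconds since epoch
# (these example timestamps correspond roughly to March 1 and March 7, 2024 UTC).
time_fixed = {"type": "fixed", "from": 1_709_251_200, "to": 1_709_769_600}
```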
type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options.
Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. 
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. Time setting for the widget. 
object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] List of table widget requests. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The APM stats query for table and distributions widgets. [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell.
Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply.
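Combining the formula, conditional-format, number-format, and limit fields above, a hypothetical table formula might look like the following. The nested key names (`conditional_formats`, `number_format`, `limit`) are assumptions, since the flattened listing above does not show them, and the thresholds and unit are invented.

```python
# Hypothetical table widget formula with formatting options.
formula_with_formatting = {
    "formula": "query1 * 100",
    "alias": "Error rate",
    "cell_display_mode": "bar",
    "conditional_formats": [
        # Highlight cells above or below an invented threshold.
        {"comparator": ">", "value": 5.0, "palette": "white_on_red"},
        {"comparator": "<=", "value": 5.0, "palette": "white_on_green"},
    ],
    "number_format": {
        "unit": {"type": "canonical_unit", "unit_name": "percent"},
    },
    "limit": {"count": 10, "order": "desc"},
}
```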
order enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. 
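A hedged sketch of the formula-and-functions events query described above, here a logs-backed count grouped by a facet. Values are hypothetical, and the nested `compute`, `group_by`, `sort`, and `search` key names are assumed rather than shown in the flattened listing.

```python
# Illustrative formula-and-functions events query (logs data source).
events_query = {
    "data_source": "logs",                 # one of the event platform data sources
    "name": "query1",                      # referenced from formulas
    "compute": {"aggregation": "count"},   # interval (ms) and metric only needed for other aggregations
    "group_by": [
        {
            "facet": "service",
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "search": {"query": "status:error"},
    "indexes": [],                         # omit or use [] to query all indexes
}
```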
object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions SLO query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas.
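For the APM resource stats query described above, a hypothetical instance could look like this. Only the field names come from the schema; the environment, service, operation, and tag values are invented.

```python
# Illustrative APM resource stats query; all values are hypothetical.
apm_resource_stats_query = {
    "data_source": "apm_resource_stats",
    "name": "query1",
    "env": "prod",
    "service": "web-store",
    "operation_name": "rack.request",
    "stat": "latency_p95",
    "group_by": ["resource_name"],
    "primary_tag_name": "datacenter",   # only needed when filtering on a second primary tag
    "primary_tag_value": "us1.prod",
}
```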
slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. 
Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [object] List of markers. display_type string Combination of a severity (error, warning, ok, or info) and a line type (dashed, solid, or bold). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value, such as `y = 15`, or a range of values, such as `0 < y < 10`. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name.
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. 
metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
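Pulling together the formula fields described above, a single entry in a request's formulas list could be sketched roughly as follows. The query name, alias, and threshold are placeholders for illustration only.

```python
# Hypothetical widget formula combining the fields described above.
formula = {
    "formula": "query1 * 100",          # string expression built from queries, formulas, and functions
    "alias": "error_percentage",        # expression alias (placeholder)
    "limit": {
        "count": 10,                    # number of results to return
        "order": "desc",                # direction of sort (default: desc)
    },
    "conditional_formats": [
        {
            "comparator": ">",          # one of =, >, >=, <, <=
            "value": 5.0,               # value for the comparator (placeholder threshold)
            "palette": "white_on_red",  # color palette to apply
        }
    ],
    "style": {
        "palette": "classic",           # color palette for the formula (placeholder name)
        "palette_index": 0,             # index of the color within the palette
    },
}
```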
interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. 
query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. 
default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object The top list visualization enables you to display a list of tag values, like hostname or service, with the most or least of any metric value, such as the highest consumers of CPU or the hosts with the least disk space. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods.
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
[object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. 
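As with the other request types, each request can carry a queries list mixing the formula-and-functions query variants listed here. A minimal sketch of a metrics query and an events query, with placeholder metric and search strings, might look like this:

```python
# Hypothetical entries for a request's "queries" array.
metrics_query = {
    "data_source": "metrics",                     # data source for metrics queries
    "name": "query1",                             # name of the query for use in formulas
    "query": "avg:system.cpu.user{*} by {host}",  # metrics query definition (placeholder)
    "aggregator": "avg",                          # one of the available aggregation methods
}

events_query = {
    "data_source": "logs",                            # one of the event platform data sources
    "name": "query2",
    "compute": {"aggregation": "count"},              # aggregation method for event platform queries
    "search": {"query": "status:error"},              # events search string (placeholder)
    "indexes": [],                                    # omit or use [] to query all indexes at once
    "group_by": [{"facet": "service", "limit": 10}],  # group by options (placeholder facet)
}
```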
limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
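For reference, the SLO measures query and Cloud Cost query variants described above could be instantiated roughly as follows; the SLO ID and the cost query string are placeholders, not values from the documentation.

```python
# Hypothetical SLO measures query (data_source: slo).
slo_query = {
    "data_source": "slo",
    "slo_id": "00000000000000000000000000000000",  # placeholder SLO ID
    "measure": "slo_status",                       # one of the SLO measures listed above
    "group_mode": "overall",                       # overall or components
    "name": "query1",                              # optional name for use in formulas
}

# Hypothetical Cloud Cost query (data_source: cloud_cost).
cloud_cost_query = {
    "data_source": "cloud_cost",
    "name": "query2",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",  # placeholder cost query
    "aggregator": "sum",                                    # aggregator used for the request
}
```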
object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Style customization for a top list widget. Top list widget display options. object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. 
Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` _required_] [object] List of treemap widget requests. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. 
query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. 
Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
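The time setting appears on every widget in this schema and accepts either the legacy `live_span` shorthand, the new-format live span object, or a fixed span. A brief sketch of the three shapes, using arbitrary example values:

```python
# Legacy shorthand: a live_span keyword from the allowed enum values.
time_legacy = {"live_span": "1h"}

# New-format live span: the last 17 minutes (example values).
time_live = {
    "type": "live",      # "live" denotes a live span in the new format
    "unit": "minute",    # minute, hour, day, week, month, or year
    "value": 17,         # value of the time span
}

# Fixed span: explicit start and end times in seconds since epoch.
time_fixed = {
    "type": "fixed",
    "from": 1709251200,  # start time (placeholder epoch seconds)
    "to": 1709769600,    # end time (placeholder epoch seconds)
}
```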
object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` [_required_] object Encapsulates all user choices about how to split a graph. limit [_required_] int64 Maximum number of graphs to display in the widget. [_required_] object Controls the order in which graphs appear in the split. object Defines the metric and aggregation used as the sort value. aggregation [_required_] string How to aggregate the sort metric for the purposes of ordering. metric [_required_] string The metric to use for sorting graphs. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` [_required_] [object] The dimension(s) on which to split the graph. one_graph_per [_required_] string The system interprets this attribute differently depending on the data source of the query being split. For metrics, it's a tag. For the events platform, it's an attribute or tag. static_splits [array] Manual selection of tags that makes the split graph widget static. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the split graph widget. Allowed enum values: `split_group` default: `split_group` object Sunbursts are well suited to highlighting how groups contribute to the total of a query. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. Configuration of the legend. object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups.
hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` _required_] [object] List of sunburst widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). 
trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. Time setting for the widget. 
object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` _required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The APM stats query for table and distributions widgets. [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. 
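The formula, conditional format, and number format fields above combine into a single formula entry on a table widget request. A hypothetical sketch follows; field names are taken from the listing, while the wrapper key names (`conditional_formats`, `limit`, `number_format`, `style`) and all values are assumptions.

```python
# A hypothetical table-widget formula entry combining the conditional format
# and number format fields above. Wrapper key names and values are assumptions.
table_formula = {
    "formula": "query1 / query2 * 100",    # Expression built from queries, formulas, and functions
    "alias": "error rate",                 # Expression alias
    "cell_display_mode": "bar",            # number, bar, or trend
    "conditional_formats": [
        {
            "comparator": ">",             # =, >, >=, <, <=
            "value": 5.0,                  # Value for the comparator
            "palette": "white_on_red",     # Color palette to apply
            "hide_value": False,           # True hides values
        }
    ],
    "limit": {"count": 10, "order": "desc"},  # Options for limiting results returned
    "number_format": {
        "unit": {
            "type": "canonical_unit",      # Type of unit scale
            "unit_name": "percent",        # Name of the unit (example value)
        }
    },
    "style": {"palette": "classic", "palette_index": 0},  # Formula styling options
}
```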
order enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta.
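The metrics and event-platform query shapes described above are referenced by name from a formula within the same request. A minimal sketch, assuming the `queries`/`formulas`/`search`/`sort` wrapper keys and all example values:

```python
# A hypothetical formulas-and-functions request: one metrics query, one logs
# (event platform) query, and a formula combining them by name.
request = {
    "queries": [
        {
            "data_source": "metrics",              # Data source for metrics queries
            "name": "query1",                      # Name of the query for use in formulas
            "query": "avg:system.cpu.user{*} by {host}",
            "aggregator": "avg",                   # avg, min, max, sum, last, ...
        },
        {
            "data_source": "logs",                 # Event platform-based data source
            "name": "query2",
            "compute": {"aggregation": "count"},   # Aggregation method (interval/metric optional)
            "search": {"query": "status:error"},   # Events search string
            "group_by": [
                {
                    "facet": "host",               # Event facet
                    "limit": 10,                   # Number of groups to return
                    "sort": {"aggregation": "count", "order": "desc"},
                }
            ],
            "indexes": [],                         # Omit or [] to query all indexes at once
        },
    ],
    "formulas": [
        {"formula": "query2 / query1"}             # Expression built from the named queries
    ],
    "response_format": "scalar",                   # timeseries, scalar, or event_list
}
```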
object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. 
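The SLO and Cloud Cost query fields continue below. Before them, here is a hedged sketch of the process-backed and APM resource stats query shapes just listed; field names come from the listing and all values are illustrative assumptions.

```python
# Hypothetical process-backend and APM resource stats queries, as they could
# appear in a request's queries list. All values are examples only.
process_query = {
    "data_source": "process",                # process or container
    "name": "query1",                        # Name of query for use in formulas
    "metric": "process.stat.cpu.total_pct",  # Process metric name (example)
    "text_filter": "postgres",               # Text to use as filter
    "tag_filters": ["env:prod"],             # Tags to filter by
    "limit": 10,                             # Number of hits to return
    "sort": "desc",                          # asc or desc
    "is_normalized_cpu": True,               # Normalize the CPU percentages
}

apm_resource_stats_query = {
    "data_source": "apm_resource_stats",     # Data source for APM resource stats queries
    "name": "query2",
    "env": "prod",                           # APM environment
    "service": "web-store",                  # APM service name
    "stat": "latency_p95",                   # e.g. errors, hits, latency_avg, latency_p95
    "group_by": ["resource_name"],           # Fields to group results by
}
```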
slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. 
Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [object] List of markers. display_type string Combination of: * A severity: error, warning, ok, or info * A line type: dashed, solid, or bold. In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). [_required_] [object] List of timeseries widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name.
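Putting the table widget fields together, a minimal `query_table` definition could look like the following sketch; the remaining timeseries request fields continue after it. Field names come from the listing, while the wrapper keys (`requests`, `queries`, `formulas`, `sort`, `order_by`) and all values are assumptions.

```python
# A minimal, hypothetical query_table widget definition with one
# formulas-and-functions request and formula-based sorting. Values are examples only.
query_table_definition = {
    "type": "query_table",                 # Allowed enum value: query_table
    "title": "Top services by error count",
    "has_search_bar": "auto",              # always, never, or auto
    "requests": [
        {
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "query1",
                    "query": "sum:trace.http.request.errors{*} by {service}",
                    "aggregator": "sum",
                }
            ],
            "formulas": [{"formula": "query1", "alias": "errors"}],
            "response_format": "scalar",
            "sort": {
                "count": 50,               # Number of items to limit the widget to
                "order_by": [
                    {"type": "formula", "index": 0, "order": "desc"}
                ],
            },
        }
    ],
    "time": {"live_span": "4h"},
}
```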
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. 
metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
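The expression alias entries above (`alias_name`, `expression`) attach to a timeseries request alongside the classic `q` query string; the surrounding query fields continue below. A small sketch under the assumption that the aliases live in a `metadata` list and that all values are illustrative:

```python
# Hypothetical timeseries request fragment: a classic query string with a
# display type and expression aliases for the legend.
timeseries_request = {
    "q": "avg:system.load.1{*} by {host}",       # Widget query
    "display_type": "line",                      # area, bars, line, or overlay
    "metadata": [
        {
            "expression": "avg:system.load.1{*} by {host}",  # Expression name
            "alias_name": "1-minute load",                   # Expression alias
        }
    ],
}
```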
interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. 
query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. 
default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object The top list visualization enables you to display a list of tag values, such as hostname or service, with the most or least of any metric value, such as the highest consumers of CPU or the hosts with the least disk space. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] [object] List of top list widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods.
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
[object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. 
limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Style customization for a top list widget. Top list widget display options. object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` object This widget displays a topology of nodes and edges for different data sources. It replaces the service map widget. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. 
Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] One or more Topology requests. object Query to service-based topology data sources like the service map or data streams. data_source enum Name of the data source. Allowed enum values: `data_streams,service_map` filters [string] Your environment and primary tag (or * if enabled for your account). service string Name of the service. request_type enum Widget request type. Allowed enum values: `topology` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the topology map widget. Allowed enum values: `topology_map` default: `topology_map` object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED**: The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED**: The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` requests [_required_] [object] List of treemap widget requests. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget.
Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. 
metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. 
Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` id int64 ID of the widget. object The layout for a widget on a `free` or **new dashboard layout** dashboard. height [_required_] int64 The height of the widget. Should be a non-negative integer. is_column_break boolean Whether the widget should be the first one on the second column in high density or not. **Note** : Only for the **new dashboard layout** and only one widget in the dashboard should have this property set to `true`. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. 
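As a quick orientation for the widget layout fields above, the sketch below shows one plausible widget entry with an explicit layout on a **new dashboard layout** dashboard. It is illustrative only: the widget `id`, title, and metrics query are placeholder values, and it assumes the layout object is supplied under the widget's `layout` key next to `id` and `definition`.

```
{
  "id": 123456789,
  "definition": {
    "type": "timeseries",
    "title": "Example CPU usage",
    "requests": [
      {
        "q": "avg:system.cpu.user{*}",
        "display_type": "line"
      }
    ]
  },
  "layout": {
    "x": 0,
    "y": 0,
    "width": 4,
    "height": 2,
    "is_column_break": false
  }
}
```

All four position and size values are non-negative integers, and `is_column_break` is only meaningful on the **new dashboard layout**, where at most one widget in the dashboard should set it to `true`.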
##### Update a dashboard returns "OK" response

```
{
  "layout_type": "ordered",
  "title": "Example-Dashboard with list_stream widget",
  "description": "Updated description",
  "widgets": [
    {
      "definition": {
        "type": "list_stream",
        "requests": [
          {
            "columns": [
              {
                "width": "auto",
                "field": "timestamp"
              }
            ],
            "query": {
              "data_source": "apm_issue_stream",
              "query_string": ""
            },
            "response_format": "event_list"
          }
        ]
      }
    }
  ]
}
```

##### Update a dashboard with tags returns "OK" response

```
{
  "layout_type": "ordered",
  "title": "Example-Dashboard with list_stream widget",
  "description": "Updated description",
  "tags": [
    "team:foo",
    "team:bar"
  ],
  "widgets": [
    {
      "definition": {
        "type": "list_stream",
        "requests": [
          {
            "columns": [
              {
                "width": "auto",
                "field": "timestamp"
              }
            ],
            "query": {
              "data_source": "apm_issue_stream",
              "query_string": ""
            },
            "response_format": "event_list"
          }
        ]
      }
    }
  ]
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/dashboards/#UpdateDashboard-200-v1)
* [400](https://docs.datadoghq.com/api/latest/dashboards/#UpdateDashboard-400-v1)
* [403](https://docs.datadoghq.com/api/latest/dashboards/#UpdateDashboard-403-v1)
* [404](https://docs.datadoghq.com/api/latest/dashboards/#UpdateDashboard-404-v1)
* [429](https://docs.datadoghq.com/api/latest/dashboards/#UpdateDashboard-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/dashboards/)
* [Example](https://docs.datadoghq.com/api/latest/dashboards/)

A dashboard is Datadog’s tool for visually tracking, analyzing, and displaying key performance metrics, which enable you to monitor the health of your infrastructure. Field Type Description author_handle string Identifier of the dashboard author. author_name string Name of the dashboard author. created_at date-time Creation date of the dashboard. description string Description of the dashboard. id string ID of the dashboard. is_read_only boolean **DEPRECATED**: Whether this dashboard is read-only. If True, only the author and admins can make changes to it. This property is deprecated; please use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) instead to manage write authorization for individual dashboards. layout_type [_required_] enum Layout type of the dashboard. Allowed enum values: `ordered,free` modified_at date-time Modification date of the dashboard. notify_list [string] List of handles of users to notify when changes are made to this dashboard. reflow_type enum Reflow type for a **new dashboard layout** dashboard. Set this only when layout type is 'ordered'. If set to 'fixed', the dashboard expects all widgets to have a layout, and if it's set to 'auto', widgets should not have layouts. Allowed enum values: `auto,fixed` restricted_roles [string] A list of role identifiers. Only the author and users associated with at least one of these roles can edit this dashboard. tags [string] List of team names representing ownership of a dashboard. [object] Array of template variables saved views. name string The name of the variable. [object] List of variables. name string The name of the variable. value string **DEPRECATED**: The value of the template variable within the saved view. Cannot be used in conjunction with `values`. values [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. Cannot be used in conjunction with `value`. [object] List of template variables for this dashboard.
available_values [string] The list of values that the template variable drop-down is limited to. default string **DEPRECATED**: The default value for the template variable on dashboard load. Cannot be used in conjunction with `defaults`. defaults [string] One or many default values for template variables on load. If more than one default is specified, they will be unioned together with `OR`. Cannot be used in conjunction with `default`. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). title [_required_] string Title of the dashboard. url string The URL of the dashboard. widgets [_required_] [object] List of widgets to display on the dashboard. definition [_required_] [Definition of the widget](https://docs.datadoghq.com/dashboards/widgets/). object Alert graphs are timeseries graphs showing the current status of any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the alert graph widget. Allowed enum values: `alert_graph` default: `alert_graph` viz_type [_required_] enum Whether to display the Alert Graph as a timeseries or a top list. Allowed enum values: `timeseries,toplist` object Alert values are query values showing the current value of the metric in any monitor defined on your system. alert_id [_required_] string ID of the alert to use in the widget. precision int64 Number of decimals to show. If not defined, the raw value is used. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of value in the widget. type [_required_] enum Type of the alert value widget. Allowed enum values: `alert_value` default: `alert_value` unit string Unit to display with the value. object The Change graph shows you the change in a value over the time period chosen. [object] List of custom links.
is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. _required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. 
Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. 
Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. Time setting for the widget. 
object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` object Check status shows the current status or number of results for any check performed. check [_required_] string Name of the check to use in the widget. group string Group reporting a single check. group_by [string] List of tag prefixes to group by in the case of a cluster check. grouping [_required_] enum The kind of grouping to use. Allowed enum values: `check,cluster` tags [string] List of tags used to filter the groups reporting a cluster check. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the check status widget. Allowed enum values: `check_status` default: `check_status` object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. 
Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). _required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The APM stats query for table and distributions widgets. [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. 
Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. 
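To make the events query shape concrete, the sketch below expresses one such formula-and-functions events query as a plain Python dictionary. The keys mirror the fields listed above; the data source, facet, and search string are illustrative assumptions rather than values taken from this reference.

```python
# Hypothetical formula-and-functions events query, mirroring the fields above.
# Concrete values (data source, facet, search string) are illustrative only.
events_query = {
    "name": "errors_by_service",          # name referenced by formulas
    "data_source": "logs",                # e.g. logs, spans, rum, audit, ...
    "compute": {"aggregation": "count"},  # count, cardinality, pc95, avg, ...
    "group_by": [
        {
            "facet": "service",           # event facet to group by
            "limit": 10,                  # number of groups to return
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "indexes": [],                        # omit or use [] to query all indexes
    "search": {"query": "status:error"},  # events search string
}
```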
name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. Query definition for Distribution Widget Histogram Request object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. 
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` object The event stream is a widget version of the stream of events on the Event Stream view. Only available on FREE layout dashboards. event_size enum Size to use to display an event. Allowed enum values: `s,l` query [_required_] string Query to filter the event stream with. tags_execution string The execution method for multi-value filters. Can be either and or or. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
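As a rough illustration of how the distribution widget's axis controls and a live time span fit together, the following sketch assembles them as a plain dictionary. The top-level key names (`xaxis`, `yaxis`, `time`, `requests`) are assumed here because this reference elides them, and all values are illustrative.

```python
# Hypothetical fragment of a distribution widget definition, combining the
# axis controls and live-span time setting described above. Key names marked
# below are assumptions; the requests array is left empty for brevity.
distribution_widget = {
    "type": "distribution",
    "title": "Request latency distribution",
    "xaxis": {                 # key name assumed
        "include_zero": True,
        "min": "auto",
        "max": "p90",          # a number, a percentile (p90), or "auto"
        "num_buckets": 20,     # resolution of the value bins
        "scale": "linear",
    },
    "yaxis": {                 # key name assumed
        "include_zero": True,
        "min": "auto",
        "max": "auto",
        "scale": "log",        # "linear" or "log"
    },
    "time": {"type": "live", "unit": "hour", "value": 4},  # arbitrary live span
    "requests": [],            # see the request schema described above
}
```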
live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event stream widget. Allowed enum values: `event_stream` default: `event_stream` object The event timeline is a widget version of the timeline that appears at the top of the Event Stream view. Only available on FREE layout dashboards. query [_required_] string Query to filter the event timeline with. tags_execution string The execution method for multi-value filters. Can be either and or or. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the event timeline widget. Allowed enum values: `event_timeline` default: `event_timeline` object Free text is a widget that allows you to add headings to your screenboard. Commonly used to state the overall purpose of the dashboard. Only available on FREE layout dashboards. color string Color of the text. font_size string Size of the text. text [_required_] string Text to display. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the free text widget. 
Allowed enum values: `free_text` default: `free_text` object The funnel visualization displays a funnel of user sessions that maps a sequence of view navigation and user interaction in your application. [_required_] [object] Request payload used to query items. [_required_] object Updated funnel widget. data_source [_required_] enum Source from which to query items to display in the funnel. Allowed enum values: `rum` default: `rum` query_string [_required_] string The widget query. [_required_] [object] List of funnel steps. facet [_required_] string The facet of the step. value [_required_] string The value of the step. request_type [_required_] enum Widget request type. Allowed enum values: `funnel` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of funnel widget. Allowed enum values: `funnel` default: `funnel` object This visualization displays a series of values by country on a world map. [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette.
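The funnel widget request described above can be sketched as follows. The nested key names (`requests`, `query`, `steps`) are assumed, since this reference lists the fields without naming the containing objects, and the facets and values are illustrative.

```python
# Hypothetical funnel widget definition following the fields described above.
# The "requests", "query", and "steps" key names are assumptions; the facet
# and value pairs are illustrative RUM view names.
funnel_widget = {
    "type": "funnel",
    "title": "Checkout funnel",
    "requests": [
        {
            "request_type": "funnel",
            "query": {
                "data_source": "rum",            # only allowed value: rum
                "query_string": "@type:view",    # illustrative widget query
                "steps": [
                    {"facet": "@view.name", "value": "/cart"},
                    {"facet": "@view.name", "value": "/checkout"},
                ],
            },
        }
    ],
    "time": {"type": "live", "unit": "day", "value": 1},
}
```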
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. 
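Putting the pieces above together, a minimal formulas-and-queries request for a Geomap-style widget might look like the sketch below. The `formulas` and `queries` key names are assumed (this reference describes the lists without naming them), and the metric and group-by tag are illustrative.

```python
# Hypothetical geomap-style request pairing a formula with a metrics query,
# as described above. The metric name and group-by tag are illustrative; a
# region-layer request would group by a country ISO code tag.
geomap_request = {
    "response_format": "scalar",           # timeseries, scalar, or event_list
    "queries": [                           # key name assumed
        {
            "data_source": "metrics",
            "name": "query1",              # referenced by the formula below
            "query": "sum:app.requests{*} by {country_iso_code}",
            "aggregator": "sum",
        }
    ],
    "formulas": [                          # key name assumed
        {
            "formula": "query1",
            "limit": {"count": 250, "order": "desc"},
        }
    ],
}
```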
object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name.
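A minimal list stream query following the fields described above might look like the following sketch; the query string, index, and sort column are illustrative assumptions. Note the constraints stated above: `compute` applies only to the logs_transaction_stream source, and `group_by` only to logs_pattern_stream or logs_transaction_stream.

```python
# Hypothetical list stream query, mirroring the fields above. Values such as
# the query string, index name, and sort column are illustrative only.
list_stream_query = {
    "data_source": "logs_stream",
    "query_string": "service:web status:error",   # widget query
    "indexes": ["main"],                           # list of indexes to search
    "event_size": "s",                             # s or l
    "sort": {"column": "timestamp", "order": "desc"},
}
```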
interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object The style to apply to the request for points layer. color_by string The category to color the points by. [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. _required_] object Match rule for the table widget text format. type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` Replace rule for the table widget text format. object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. _required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. 
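The sort controls described above can be sketched as follows; the `order_by` key name is assumed, since this reference describes the array without naming it, and the formula index and group name are illustrative.

```python
# Hypothetical sort controls for a widget request, as described above: limit
# to 10 items, ordered first by the formula at index 0 (descending), then by
# a group named "country_iso_code" (ascending). Names are illustrative.
widget_sort = {
    "count": 10,
    "order_by": [                                   # key name assumed
        {"type": "formula", "index": 0, "order": "desc"},
        {"type": "group", "name": "country_iso_code", "order": "asc"},
    ],
}
```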
title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` _required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). object The groups widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the group title. banner_img string URL of image to display as a banner for the group. layout_type [_required_] enum Layout type of the group. Allowed enum values: `ordered` show_title boolean Whether to show the title or not. default: `true` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` type [_required_] enum Type of the group widget. Allowed enum values: `group` default: `group` widgets [_required_] [object] List of widget groups. object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). _required_] [object] List of widget types. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
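A single formula-and-functions metrics query and a formula that references it, using only the fields described above, might look like the following sketch; the metric, names, and alias are placeholders.

```python
# Sketch of a metrics query plus a formula referencing it by name.
metrics_query = {
    "data_source": "metrics",                         # required
    "name": "cpu",                                    # name used in formulas
    "query": "avg:system.cpu.user{*} by {host}",      # placeholder metrics query
    "aggregator": "avg",
}

formula = {
    "formula": "cpu * 100",   # string expression built from queries, formulas, and functions
    "alias": "CPU (%)",       # expression alias (placeholder)
}
```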
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. 
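An event platform query over logs, per the fields above, could be sketched as follows. The `compute`, `search`, and `group_by` key names are inferred from the flattened listing, and the search string and facet are placeholders.

```python
# Sketch of a formula-and-functions event platform query (logs data source).
events_query = {
    "data_source": "logs",
    "name": "errors",                          # name for use in formulas
    "compute": {"aggregation": "count"},
    "search": {"query": "status:error"},       # placeholder search string
    "indexes": [],                             # omit or [] to query all indexes
    "group_by": [
        {
            "facet": "service",                # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
}
```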
name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
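The SLO measures query above can be sketched as follows; the SLO ID is a hypothetical placeholder.

```python
# Sketch of an SLO measures query using the fields documented above.
slo_query = {
    "data_source": "slo",
    "slo_id": "abc123def456abc789",   # hypothetical SLO ID
    "measure": "slo_status",
    "group_mode": "overall",
    "slo_query_type": "metric",
    "name": "slo1",                   # name for use in formulas
}
```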
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. 
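The three time-setting shapes described above (the legacy live span wrapper, the newer arbitrary live span, and the fixed span) can be compared in the following sketch; the epoch timestamps are placeholders.

```python
# Legacy live span: a single enum value.
legacy_live = {"live_span": "1h"}

# Arbitrary live span in the new format, e.g. "last 15 minutes".
arbitrary_live = {
    "type": "live",
    "unit": "minute",
    "value": 15,
}

# Fixed span between two points in time (seconds since epoch, placeholders).
fixed_span = {
    "type": "fixed",
    "from": 1709251200,
    "to": 1709337600,
}
```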
Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object The host map widget graphs any metric across your hosts using the same visualization available from the main Host Map page. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group [string] List of tag prefixes to group by. no_group_hosts boolean Whether to show the hosts that don’t fit in a group. no_metric_hosts boolean Whether to show the hosts with no metrics. node_type enum Which type of node to use in the map. Allowed enum values: `host,container` notes string Notes on the title. _required_] object List of definitions. object Updated host map. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. 
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Updated host map. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. 
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. scope [string] List of tags used to filter the map. object The style to apply to the widget. fill_max string Max value to use to color the map. fill_min string Min value to use to color the map. palette string Color palette to apply to the widget. palette_flip boolean Whether to flip the palette tones. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the host map widget. Allowed enum values: `hostmap` default: `hostmap` object The iframe widget allows you to embed a portion of any other web page on your dashboard. Only available on FREE layout dashboards. type [_required_] enum Type of the iframe widget. Allowed enum values: `iframe` default: `iframe` url [_required_] string URL of the iframe. object The image widget allows you to embed an image on your dashboard. An image can be a PNG, JPG, or animated GIF. Only available on FREE layout dashboards. has_background boolean Whether to display a background or not. default: `true` has_border boolean Whether to display a border or not. default: `true` horizontal_align enum Horizontal alignment. Allowed enum values: `center,left,right` margin enum Size of the margins around the image. **Note** : `small` and `large` values are deprecated. Allowed enum values: `sm,md,lg,small,large` sizing enum How to size the image on the widget. The values are based on the image `object-fit` CSS properties. **Note** : `zoom`, `fit` and `center` values are deprecated. Allowed enum values: `fill,contain,cover,none,scale-down,zoom,fit,center` type [_required_] enum Type of the image widget. Allowed enum values: `image` default: `image` url [_required_] string URL of the image. url_dark_theme string URL of the image in dark mode. vertical_align enum Vertical alignment. 
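Minimal iframe and image widget definitions (both available only on FREE layout dashboards, as noted above) could look like the sketch below; the URLs are placeholders.

```python
# Sketch of an iframe widget and an image widget using the fields above.
iframe_widget = {
    "type": "iframe",
    "url": "https://example.com/status",          # placeholder URL
}

image_widget = {
    "type": "image",
    "url": "https://example.com/banner.png",      # placeholder URL
    "url_dark_theme": "https://example.com/banner-dark.png",
    "sizing": "cover",         # object-fit style sizing
    "margin": "md",
    "has_background": True,
    "has_border": False,
    "horizontal_align": "center",
    "vertical_align": "center",
}
```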
Allowed enum values: `center,top,bottom` object The list stream visualization displays a table of recent events in your application that match search criteria using user-defined columns. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [_required_] [object] Request payload used to query items. [_required_] [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` [_required_] object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format [_required_] enum Widget response format. Allowed enum values: `event_list` show_legend boolean Whether or not to display the legend on this widget. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the list stream widget. 
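A list stream widget request over the logs stream, built from the fields above, might look like the following sketch. The nested key names (`requests`, `columns`, `query`) are inferred from the flattened listing, and the column fields and query string are placeholders.

```python
# Sketch of a list stream widget querying the logs stream.
list_stream_request = {
    "response_format": "event_list",
    "columns": [
        {"field": "timestamp", "width": "auto"},   # placeholder column
        {"field": "message", "width": "full"},     # placeholder column
    ],
    "query": {
        "data_source": "logs_stream",
        "query_string": "status:error",            # placeholder query
        "indexes": [],
        "sort": {"column": "timestamp", "order": "desc"},
    },
}

list_stream_widget = {
    "type": "list_stream",
    "requests": [list_stream_request],
}
```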
Allowed enum values: `list_stream` default: `list_stream` object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` object The monitor summary widget displays a summary view of all your Datadog monitors, or a subset based on a query. Only available on FREE layout dashboards. color_preference enum Which color to use on the widget. Allowed enum values: `background,text` count int64 **DEPRECATED** : The number of monitors to display. display_format enum What to display on the widget. Allowed enum values: `counts,countsAndList,list` hide_zero_counts boolean Whether to show counts of 0 or not. query [_required_] string Query to filter the monitors with. show_last_triggered boolean Whether to show the time that has elapsed since the monitor/group triggered. show_priority boolean Whether to show the priorities column. sort enum Widget sorting methods. Allowed enum values: `name,group,status,tags,triggered,group,asc,group,desc,name,asc,name,desc,status,asc,status,desc,tags,asc,tags,desc,triggered,asc,triggered,desc,priority,asc,priority,desc` start int64 **DEPRECATED** : The start of the list. Typically 0. summary_type enum Which summary type should be used. Allowed enum values: `monitors,groups,combined` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the monitor summary widget. 
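A monitor summary widget using only the fields listed above could be sketched as follows; the monitor search query and title are placeholders.

```python
# Sketch of a monitor summary (manage_status) widget.
monitor_summary_widget = {
    "type": "manage_status",
    "query": 'tag:"team:payments"',     # placeholder monitor search
    "summary_type": "monitors",
    "display_format": "countsAndList",
    "color_preference": "text",
    "hide_zero_counts": True,
    "show_last_triggered": True,
    "sort": "status,asc",
    "title": "Payments monitors",       # placeholder title
}
```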
Allowed enum values: `manage_status` default: `manage_status` object The notes and links widget is similar to free text widget, but allows for more formatting options. background_color string Background color of the note. content [_required_] string Content of the note. font_size string Size of the text. has_padding boolean Whether to add padding or not. default: `true` show_tick boolean Whether to show a tick or not. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` tick_edge enum Define how you want to align the text on the widget. Allowed enum values: `bottom,left,right,top` tick_pos string Where to position the tick on an edge. type [_required_] enum Type of the note widget. Allowed enum values: `note` default: `note` vertical_align enum Vertical alignment. Allowed enum values: `center,top,bottom` object The powerpack widget allows you to keep similar graphs together on your timeboard. Each group has a custom header, can hold one to many graphs, and is collapsible. background_color string Background color of the powerpack title. banner_img string URL of image to display as a banner for the powerpack. powerpack_id [_required_] string UUID of the associated powerpack. show_title boolean Whether to show the title or not. default: `true` object Powerpack template variables. [object] Template variables controlled at the powerpack level. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. [object] Template variables controlled by the external resource, such as the dashboard this powerpack is on. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. values [_required_] [string] One or many template variable values within the saved view, which will be unioned together using `OR` if more than one is specified. title string Title of the widget. type [_required_] enum Type of the powerpack widget. Allowed enum values: `powerpack` default: `powerpack` object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show. If not defined, the widget uses the raw value. _required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
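The note widget fields above assemble into a sketch like the following; the content is a placeholder, and the `background_color` value is hypothetical since the allowed values are not listed in this schema.

```python
# Sketch of a note widget with a tick pointing at a neighboring graph.
note_widget = {
    "type": "note",
    "content": "Latency spikes here correlate with nightly batch jobs.",  # placeholder
    "background_color": "yellow",   # hypothetical value; allowed values not listed here
    "font_size": "14",
    "text_align": "left",
    "vertical_align": "top",
    "show_tick": True,
    "tick_edge": "left",
    "tick_pos": "50%",
    "has_padding": True,            # default: true
}
```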
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. 
alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
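A conditional format entry of the kind listed above, which colors a value once it crosses a threshold, could be sketched as follows; the list's JSON key (`conditional_formats`) is inferred from the flattened listing, and the threshold is a placeholder.

```python
# Sketch of one conditional format entry attached to a formula.
conditional_format = {
    "comparator": ">",            # required comparator
    "value": 5.0,                 # required threshold for the comparator (placeholder)
    "palette": "white_on_red",    # required color palette
    "hide_value": False,
}

formula_with_formatting = {
    "formula": "errs",                             # placeholder expression
    "conditional_formats": [conditional_format],   # key name inferred
}
```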
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string TODO. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. 
facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` object The run workflow widget allows you to run a workflow from a dashboard. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] Array of workflow inputs to map to dashboard template variables. name [_required_] string Name of the workflow input. value [_required_] string Dashboard template variable. Can be suffixed with '.value' or '.key'. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget.
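For orientation, the following is a minimal sketch of a `query_value` widget assembled from the fields documented above. The `requests`, `queries`, `formulas`, `time`, and `timeseries_background` key names and the example metric query are assumptions that are not spelled out in this extracted schema; the individual field names and enum values are taken from the documentation. The run workflow time-setting fields continue after the example.

```python
import json

# Minimal sketch of a `query_value` widget built from the fields above.
# The "requests"/"queries"/"formulas" keys and the metric query are assumptions.
query_value_widget = {
    "type": "query_value",              # allowed enum value: query_value (default)
    "title": "p95 request latency",     # title string
    "title_align": "left",              # enum: center, left, right
    "text_align": "center",             # enum: center, left, right
    "time": {"live_span": "1h"},        # live span wrapper, e.g. 1m ... 1y, alert
    "timeseries_background": {"type": "area"},  # background timeseries: bars or area
    "requests": [                       # assumed request array key
        {
            "response_format": "scalar",    # enum: timeseries, scalar, event_list
            "queries": [
                {
                    "data_source": "metrics",         # formula-and-functions metrics query
                    "name": "q1",                     # name used in formulas
                    "query": "avg:system.load.1{*}",  # placeholder metrics query
                    "aggregator": "avg",              # enum: avg, min, max, sum, last, ...
                }
            ],
            "formulas": [{"formula": "q1"}],
        }
    ],
}

print(json.dumps(query_value_widget, indent=2))
```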
type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the run workflow widget. Allowed enum values: `run_workflow` default: `run_workflow` workflow_id [_required_] string Workflow id. object Use the SLO List widget to track your SLOs (Service Level Objectives) on dashboards. [_required_] [object] Array of one request object to display in the widget. [_required_] object Updated SLO List widget. limit int64 Maximum number of results to display in the table. default: `100` query_string [_required_] string Widget query. [object] Options for sorting results. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` request_type [_required_] enum Widget request type. Allowed enum values: `slo_list` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO List widget. Allowed enum values: `slo_list` default: `slo_list` object Use the SLO and uptime widget to track your SLOs (Service Level Objectives) and uptime on screenboards and timeboards. additional_query_filters string Additional filters applied to the SLO query. global_time_target string Defined global time target. show_error_budget boolean Defined error budget. slo_id string ID of the SLO displayed. time_windows [string] Times being monitored. title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the SLO widget. Allowed enum values: `slo` default: `slo` view_mode enum Define how you want the SLO to be displayed. Allowed enum values: `overall,component,both` view_type [_required_] string Type of view displayed by the widget. default: `detail` object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] object Widget definition. object Scatterplot request containing formulas and functions. [object] List of Scatterplot formulas that operate on queries. alias string Expression alias.
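Before the scatterplot formula fields continue below, here is a minimal sketch of the `run_workflow` widget documented above. The `custom_links`, `inputs`, and `time` key names are assumptions (the extracted schema drops those names), and the workflow ID and template-variable reference are placeholders; the remaining fields come from the schema.

```python
import json

# Minimal sketch of a `run_workflow` widget built from the fields above.
# The "custom_links", "inputs", and "time" keys are assumed; the workflow ID
# and template variable reference are placeholders.
run_workflow_widget = {
    "type": "run_workflow",             # allowed enum value: run_workflow (default)
    "title": "Restart checkout service",
    "title_align": "center",
    "workflow_id": "00000000-0000-0000-0000-000000000000",  # placeholder workflow id
    "inputs": [                          # map workflow inputs to template variables
        {"name": "service", "value": "$service.value"},     # value may use '.value' or '.key'
    ],
    "custom_links": [
        {
            "label": "Runbook",                          # short, descriptive label
            "link": "https://example.com/runbook",       # must include http or https
        }
    ],
    "time": {"live_span": "4h"},
}

print(json.dumps(run_workflow_widget, indent=2))
```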
dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. 
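The formula-and-functions event platform query described above can be sketched as follows. The `compute`, `search`, `group_by`, and `sort` key names are inferred from the option descriptions (the extracted schema drops those names), and the facet and search string are placeholders. The APM dependency-stats fields continue after the example.

```python
import json

# Minimal sketch of a formula-and-functions event platform query (logs data
# source). The "compute", "search", "group_by", and "sort" key names are
# inferred from the option descriptions above; facet and search string are
# placeholders.
events_query = {
    "data_source": "logs",                   # enum: logs, spans, rum, audit, ...
    "name": "error_count",                   # name of the query for use in formulas
    "compute": {"aggregation": "count"},     # enum: count, cardinality, pc95, ...
    "search": {"query": "status:error"},     # events search string (placeholder)
    "group_by": [
        {
            "facet": "service",              # event facet (placeholder)
            "limit": 10,                     # number of groups to return
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "indexes": [],                           # omit or use [] to query all indexes
}

print(json.dumps(events_query, indent=2))
```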
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object This widget displays a map of a service to all of the services that call it, and all of the services that it calls. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. filters [_required_] [string] Your environment and primary tag (or * if enabled for your account). 
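Before the service-map fields continue below, here is a minimal sketch of a `scatterplot` widget using the formula-and-functions request described above. The `requests`/`table` nesting, the `queries` and `formulas` key names, and the example metric queries are assumptions; the dimensions, axis controls, and enum values come from the schema.

```python
import json

# Minimal sketch of a `scatterplot` widget using the formula-and-functions
# request above. The "requests"/"table" nesting and the metric queries are
# assumptions; dimensions, axis controls, and enums come from the schema.
scatterplot_widget = {
    "type": "scatterplot",                # allowed enum value: scatterplot (default)
    "title": "Memory vs CPU by host",
    "color_by_groups": ["host"],          # groups used for colors
    "xaxis": {"scale": "linear", "include_zero": True, "min": "auto", "max": "auto"},
    "yaxis": {"scale": "log", "include_zero": False},
    "time": {"live_span": "1d"},
    "requests": {                         # assumed container key
        "table": {                        # assumed key for the formulas/functions request
            "response_format": "scalar",  # enum: timeseries, scalar, event_list
            "queries": [
                {"data_source": "metrics", "name": "cpu",
                 "query": "avg:system.cpu.user{*} by {host}", "aggregator": "avg"},
                {"data_source": "metrics", "name": "mem",
                 "query": "avg:system.mem.used{*} by {host}", "aggregator": "avg"},
            ],
            "formulas": [
                {"dimension": "x", "formula": "cpu"},   # dimension enum: x, y, radius, color
                {"dimension": "y", "formula": "mem"},
            ],
        }
    },
}

print(json.dumps(scatterplot_widget, indent=2))
```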
service [_required_] string The ID of the service you want to map. title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service map widget. Allowed enum values: `servicemap` default: `servicemap` object The service summary displays the graphs of a chosen service in your screenboard. Only available on FREE layout dashboards. display_format enum Number of columns to display. Allowed enum values: `one_column,two_column,three_column` env [_required_] string APM environment. service [_required_] string APM service. show_breakdown boolean Whether to show the latency breakdown or not. show_distribution boolean Whether to show the latency distribution or not. show_errors boolean Whether to show the error metrics or not. show_hits boolean Whether to show the hits metrics or not. show_latency boolean Whether to show the latency metrics or not. show_resource_list boolean Whether to show the resource list or not. size_format enum Size of the widget. Allowed enum values: `small,medium,large` span_name [_required_] string APM span name. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the service summary widget. Allowed enum values: `trace_service` default: `trace_service` object The split graph widget allows you to create repeating units of a graph - one for each value in a group (for example: one per service). has_uniform_y_axes boolean Normalize y axes across graphs. size [_required_] enum Size of the individual graphs in the split. Allowed enum values: `xs,sm,md,lg` [_required_] The original widget we are splitting on. object The Change graph shows you the change in a value over the time period chosen. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`.
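The `servicemap` widget defined above is the simplest of this group; a minimal sketch follows. The service name and filters are placeholders. The change-graph request fields continue after the example.

```python
import json

# Minimal sketch of a `servicemap` widget; service and filters are placeholders.
servicemap_widget = {
    "type": "servicemap",                     # allowed enum value: servicemap (default)
    "title": "Checkout dependencies",
    "service": "checkout",                    # ID of the service to map (placeholder)
    "filters": ["env:prod", "datacenter:us1.prod"],  # environment and primary tag, or "*"
}

print(json.dumps(servicemap_widget, indent=2))
```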
_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. change_type enum Show the absolute or the relative change. Allowed enum values: `absolute,relative` compare_to enum Timeframe used for the change comparison. Allowed enum values: `hour_before,day_before,week_before,month_before` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. increase_good boolean Whether to show increase as good. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. order_by enum What to order by. 
Allowed enum values: `change,name,present,past` order_dir enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. 
Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. show_present boolean Whether to show the present value. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the change widget. Allowed enum values: `change` default: `change` object This visualization displays a series of values by country on a world map. [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. _required_] [object] Array of request objects to display in the widget. May include an optional request for the region layer and/or an optional request for the points layer. Region layer requests must contain a `group-by` tag whose value is a country ISO code. See the [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) for information about building the `REQUEST_SCHEMA`. [object] Widget columns. field [_required_] string Widget column field. width [_required_] enum Widget column width. Allowed enum values: `auto,compact,full` [object] Threshold (numeric) conditional formatting rules may be used by a regions layer. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. 
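Before the geomap conditional-format fields continue below, here is a minimal sketch of the `change` widget documented above. The `requests` key name and the example query are assumptions; `change_type`, `compare_to`, `order_by`, `order_dir`, `increase_good`, and `show_present` come from the schema.

```python
import json

# Minimal sketch of a `change` widget. The "requests" key and the metric
# query are assumptions; the remaining fields and enums come from the schema.
change_widget = {
    "type": "change",                     # allowed enum value: change (default)
    "title": "Error count, day over day",
    "time": {"live_span": "1d"},
    "requests": [                         # assumed request array key
        {
            "q": "sum:trace.servlet.request.errors{env:prod} by {service}",  # placeholder query
            "change_type": "absolute",    # enum: absolute, relative
            "compare_to": "day_before",   # enum: hour_before, day_before, week_before, month_before
            "increase_good": False,       # an increase is shown as bad
            "order_by": "change",         # enum: change, name, present, past
            "order_dir": "desc",          # enum: asc, desc
            "show_present": True,         # also show the present value
        }
    ],
}

print(json.dumps(change_widget, indent=2))
```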
comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. 
semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. object Updated list stream widget. clustering_pattern_field_path string Specifies the field for logs pattern clustering. Usable only with logs_pattern_stream. [object] Compute configuration for the List Stream Widget. Compute can be used only with the logs_transaction_stream (from 1 to 5 items) list stream source. aggregation [_required_] enum Aggregation value. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,earliest,latest,most_frequent` facet string Facet name. data_source [_required_] enum Source from which to query items to display in the stream. Allowed enum values: `logs_stream,audit_stream,ci_pipeline_stream,ci_test_stream,rum_issue_stream,apm_issue_stream,trace_stream,logs_issue_stream,logs_pattern_stream,logs_transaction_stream,event_stream,rum_stream,llm_observability_stream` default: `apm_issue_stream` event_size enum Size to use to display an event. Allowed enum values: `s,l` [object] Group by configuration for the List Stream Widget. Group by can be used only with logs_pattern_stream (up to 4 items) or logs_transaction_stream (one group by item is required) list stream source. facet [_required_] string Facet name. indexes [string] List of indexes. query_string [_required_] string Widget query. 
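As a rough illustration of the list stream query object described above, a logs-backed configuration might look like the following sketch; the index name and search string are placeholders, not part of the schema.

```json
{
  "data_source": "logs_stream",
  "query_string": "status:error service:web-store",
  "indexes": ["main"],
  "event_size": "l"
}
```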
object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` storage string Option for storage location. Feature in Private Beta. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object The style to apply to the request for points layer. color_by string The category to color the points by. [object] Text formatting rules may be used by a points layer. custom_bg_color string Hex representation of the custom background color. Used with custom background palette option. custom_fg_color string Hex representation of the custom text color. Used with custom text palette option. [_required_] object Match rule for the table widget text format.
type [_required_] enum Match or compare option. Allowed enum values: `is,is_not,contains,does_not_contain,starts_with,ends_with` value [_required_] string Table Widget Match String. palette enum Color-on-color palette to highlight replaced text. Allowed enum values: `white_on_red,white_on_yellow,white_on_green,black_on_light_red,black_on_light_yellow,black_on_light_green,red_on_white,yellow_on_white,green_on_white,custom_bg,custom_text` default: `white_on_green` Replace rule for the table widget text format. object Match All definition. type [_required_] enum Table widget text format replace all type. Allowed enum values: `all` with [_required_] string Replace All type. object Match Sub-string definition. substring [_required_] string Text that will be replaced. type [_required_] enum Table widget text format replace sub-string type. Allowed enum values: `substring` with [_required_] string Text that will replace original sub-string. [_required_] object The style to apply to the widget. palette [_required_] string The color palette to apply to the widget. palette_flip [_required_] boolean Whether to flip the palette tones. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string The title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string The size of the title. type [_required_] enum Type of the geomap widget. Allowed enum values: `geomap` default: `geomap` [_required_] object The view of the world that the map should render. focus [_required_] string The 2-letter ISO code of a country to focus the map on, or `WORLD` for global view, or a region (`EMEA`, `APAC`, `LATAM`), or a continent (`NORTH_AMERICA`, `SOUTH_AMERICA`, `EUROPE`, `AFRICA`, `ASIA`, `OCEANIA`). object Query values display the current value of a given metric, APM, or log query. autoscale boolean Whether to use auto-scaling or not. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. custom_unit string Display a unit of your choice on the widget. precision int64 Number of decimals to show.
If not defined, the widget uses the raw value. [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method.
facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string TODO. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. [_required_] object Compute options.
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. 
Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. text_align enum How to align the text on the widget. Allowed enum values: `center,left,right` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` object Set a timeseries on the widget background. type [_required_] enum Timeseries is made using an area or bars. Allowed enum values: `bars,area` default: `area` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the query value widget. Allowed enum values: `query_value` default: `query_value` object The scatter plot visualization allows you to graph a chosen scope over two different metrics with their respective aggregation. color_by_groups [string] List of groups used for colors. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. 
Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] object Widget definition. object Scatterplot request containing formulas and functions. [object] List of Scatterplot formulas that operate on queries. alias string Expression alias. dimension [_required_] enum Dimension of the Scatterplot. Allowed enum values: `x,y,radius,color` formula [_required_] string String expression built from queries, formulas, and functions. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment.
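For orientation, the scatterplot formulas described above pair each query with a dimension. A minimal, illustrative sketch follows; the metric strings and query names are placeholders, not part of the schema.

```json
{
  "response_format": "scalar",
  "queries": [
    {
      "data_source": "metrics",
      "name": "cpu",
      "query": "avg:system.cpu.user{*} by {host}",
      "aggregator": "avg"
    },
    {
      "data_source": "metrics",
      "name": "mem",
      "query": "avg:system.mem.used{*} by {host}",
      "aggregator": "avg"
    }
  ],
  "formulas": [
    { "formula": "cpu", "dimension": "x" },
    { "formula": "mem", "dimension": "y" }
  ]
}
```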
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. object Updated scatter plot. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the scatter plot widget. Allowed enum values: `scatterplot` default: `scatterplot` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object Sunburst widgets highlight how groups contribute to the total of a query. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Show the total value in this widget. Configuration of the legend. object Configuration of table-based legend.
type [_required_] enum Whether or not to show a table legend. Allowed enum values: `table,none` object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` requests [_required_] [object] List of sunburst widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias.
cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
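As a quick illustration of the formula fields just listed (alias, cell display mode, conditional formats, limit, number format, and style), here is a hedged sketch of a single entry from the formulas list, again as a Python dict. The nested key names follow the dashboards API layout; the expression, thresholds, units, and palette are placeholders.

```python
import json

# One formula entry: a string expression over named queries, plus optional
# display options (cell display mode, conditional formats, limit, number
# format, and style) as described above. Values are illustrative only.
formula = {
    "formula": "query2 / query1 * 100",
    "alias": "error rate %",
    "cell_display_mode": "trend",
    "cell_display_mode_options": {"trend_type": "line", "y_scale": "shared"},
    "conditional_formats": [
        {"comparator": ">", "value": 5.0, "palette": "white_on_red"},
        {"comparator": "<=", "value": 5.0, "palette": "white_on_green"},
    ],
    "limit": {"count": 10, "order": "desc"},
    "number_format": {
        "unit": {"type": "canonical_unit", "unit_name": "percent"},
    },
    "style": {"palette": "dog_classic", "palette_index": 0},
}

print(json.dumps(formula, indent=2))
```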
interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options.
facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
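The formula and functions query types above (metrics, event platform, process, APM, SLO, and Cloud Cost) all go into the same queries list and are referenced by `name` from the formulas. The sketch below pairs a metrics query with a logs (event platform) query in a single request; the `queries`/`formulas`/`response_format` key names follow the dashboards API layout, and the metric, search string, and facet are illustrative.

```python
import json

# One widget request combining a metrics query and a logs query.
# Each query's `name` is what formulas refer to.
request = {
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "metrics",
            "name": "query1",
            "query": "sum:trace.http.request.hits{env:prod} by {service}",
            "aggregator": "sum",
        },
        {
            "data_source": "logs",
            "name": "query2",
            "compute": {"aggregation": "count", "interval": 60000},  # interval in milliseconds
            "search": {"query": "status:error service:web-store"},
            "group_by": [
                {
                    "facet": "service",
                    "limit": 10,
                    "sort": {"aggregation": "count", "order": "desc"},
                }
            ],
            "indexes": [],  # empty list (or omit) queries all indexes
        },
    ],
    "formulas": [{"formula": "query2 / query1"}],
}

print(json.dumps(request, indent=2))
```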
object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` requests [_required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The APM stats query for table and distributions widgets.
[object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. 
image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. queries [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas.
object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [object] List of markers. display_type string Combination of a severity (error, warning, ok, or info) and a line type (dashed, solid, or bold). In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name.
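Pulling the query table fields above together, the following Python dict sketches one plausible `query_table` widget: a single scalar request with a formula and limit, a formula-based sort, and a wrapped live span. Key names follow the dashboards API layout; the query, limits, and title are illustrative.

```python
import json

# A query table widget: scalar request, formula with a bar cell display mode,
# and sort controls that order rows by the first formula, descending.
query_table = {
    "type": "query_table",
    "title": "Errors by service",
    "has_search_bar": "auto",
    "requests": [
        {
            "response_format": "scalar",
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "errors",
                    "query": "sum:trace.http.request.errors{env:prod} by {service}",
                    "aggregator": "sum",
                }
            ],
            "formulas": [
                {
                    "formula": "errors",
                    "alias": "error count",
                    "cell_display_mode": "bar",
                    "limit": {"count": 25, "order": "desc"},
                }
            ],
        }
    ],
    "sort": {
        "count": 25,
        "order_by": [{"type": "formula", "index": 0, "order": "desc"}],
    },
    "time": {"live_span": "1h"},
}

print(json.dumps(query_table, indent=2))
```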
limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. 
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort.
Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Axis controls for the widget. 
include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object The top list visualization displays a list of tag values, such as hostname or service, ranked by a metric value: for example, the highest consumers of CPU or the hosts with the least disk space. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] [object] List of top list widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name.
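The widget time setting described above accepts three shapes: the legacy wrapper with a predefined `live_span`, the newer `live` span with a unit and value, and a `fixed` span bounded by epoch timestamps. A minimal sketch of each, using only the fields listed above; the epoch values are placeholders.

```python
import json

# Legacy wrapper: a predefined live span.
legacy_live = {"live_span": "1h"}

# New-format live span: arbitrary durations such as 17 minutes or 6 hours.
new_live = {
    "type": "live",
    "unit": "minute",   # minute, hour, day, week, month, year
    "value": 17,
    "hide_incomplete_cost_data": False,
}

# Fixed span: start and end in seconds since epoch (placeholder values).
fixed = {
    "type": "fixed",
    "from": 1717200000,
    "to": 1717286400,
}

for label, time_setting in [("legacy", legacy_live), ("live", new_live), ("fixed", fixed)]:
    print(label, json.dumps(time_setting))
```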
limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. 
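As a reading aid for the formula fields above, here is a hedged sketch of one entry in a widget's list of formulas, combining a formula string with a conditional format, a result limit, a number format, and a style. Only fields described above are used; the container key names (`conditional_formats`, `number_format`, `unit`, `style`) are not visible in the flattened text and are assumptions, as are the query name `query1`, the threshold, and the palette values.

```python
import json

# Sketch of a single formula entry; "query1" must match a query name
# defined alongside it (illustrative, not from this page).
formula = {
    "formula": "query1 / 100",
    "alias": "error ratio",
    "cell_display_mode": "bar",          # number, bar, or trend
    "conditional_formats": [
        {
            "comparator": ">",           # =, >, >=, <, <=
            "value": 0.05,               # value for the comparator
            "palette": "white_on_red",   # one of the allowed palettes
        }
    ],
    "limit": {"count": 10, "order": "desc"},
    "number_format": {
        "unit": {
            "type": "canonical_unit",
            "unit_name": "request",
            "per_unit_name": "second",
        }
    },
    "style": {"palette": "classic", "palette_index": 0},  # palette is a placeholder
}

print(json.dumps(formula, indent=2))
```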
object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. 
Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. 
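Pulling together the event-platform query fields listed above (data source, compute, group by, indexes, and search), here is a minimal sketch of a logs query as it might appear in a widget's list of queries. The facet and search string are placeholders.

```python
import json

# Sketch of a formula-and-functions event platform query using the logs data source.
events_query = {
    "data_source": "logs",               # one of the allowed event platform data sources
    "name": "query1",                    # name referenced by formulas
    "compute": {
        "aggregation": "count",          # allowed aggregation method
        "interval": 60000,               # time interval in milliseconds
    },
    "group_by": [
        {
            "facet": "service",          # event facet (placeholder)
            "limit": 10,                 # number of groups to return
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
    "indexes": [],                       # omit or use [] to query all indexes at once
    "search": {"query": "status:error"}, # events search string (placeholder)
}

print(json.dumps(events_query, indent=2))
```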
resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. 
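For the SLO measures and Cloud Cost queries described above, a minimal hedged sketch; the SLO ID and the cost query string are placeholders.

```python
import json

# Sketch of an SLO measures query; the slo_id is a placeholder.
slo_query = {
    "data_source": "slo",
    "slo_id": "56789abcdef0123456789abcdef01234",
    "measure": "slo_status",        # e.g. good_events, burn_rate, slo_status, ...
    "group_mode": "overall",        # overall or components
    "slo_query_type": "metric",     # metric, monitor, or time_slice
    "name": "query1",
}

# Sketch of a Cloud Cost query; the query string is a placeholder.
cloud_cost_query = {
    "data_source": "cloud_cost",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",
    "name": "query2",
    "aggregator": "sum",
}

print(json.dumps([slo_query, cloud_cost_query], indent=2))
```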
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Style customization for a top list widget. Top list widget display options. object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` [object] List of custom links. 
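The top list widget's sort controls and style options described above can be combined as in the following sketch. The `sort`, `order_by`, `style`, and `display` key names are not visible in the flattened text and are assumptions, and the group name, palette, and counts are illustrative.

```python
import json

# Sketch of top list widget sort controls: a result count plus an ordered
# list of formula- and group-based sort rules.
toplist_sort = {
    "count": 10,
    "order_by": [
        {"type": "formula", "index": 0, "order": "desc"},
        {"type": "group", "name": "service", "order": "asc"},
    ],
}

# Sketch of top list widget style: stacked display with an automatic legend.
toplist_style = {
    "display": {"type": "stacked", "legend": "automatic"},
    "palette": "dog_classic",   # placeholder palette name
    "scaling": "absolute",      # absolute or relative
}

print(json.dumps({"sort": toplist_sort, "style": toplist_style}, indent=2))
```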
is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` _required_] [object] List of treemap widget requests. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. 
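Putting the treemap request pieces described above together, here is a minimal hedged sketch of a treemap widget definition with one formula over one metrics query. The `requests`, `formulas`, and `queries` key names are assumed from the descriptions, and the title and metric query string are placeholders.

```python
import json

# Sketch of a treemap widget definition using the request fields described above.
treemap_widget = {
    "type": "treemap",
    "title": "Memory usage by host",   # placeholder title
    "requests": [
        {
            "formulas": [{"formula": "query1"}],
            "queries": [
                {
                    "data_source": "metrics",
                    "name": "query1",
                    "query": "avg:system.mem.used{*} by {host}",  # placeholder metrics query
                }
            ],
            "response_format": "scalar",
        }
    ],
}

print(json.dumps(treemap_widget, indent=2))
```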
stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. 
Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` [_required_] object Encapsulates all user choices about how to split a graph. limit [_required_] int64 Maximum number of graphs to display in the widget. [_required_] object Controls the order in which graphs appear in the split. object Defines the metric and aggregation used as the sort value. aggregation [_required_] string How to aggregate the sort metric for the purposes of ordering. metric [_required_] string The metric to use for sorting graphs. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` [_required_] [object] The dimension(s) on which to split the graph. one_graph_per [_required_] string The system interprets this attribute differently depending on the data source of the query being split. For metrics, it's a tag. For the events platform, it's an attribute or tag. static_splits [array] Manual selection of tags that makes the split graph widget static. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the split graph widget. Allowed enum values: `split_group` default: `split_group` object The sunburst visualization highlights how groups contribute to the total of a query. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. hide_total boolean Whether to hide the total value in this widget. Configuration of the legend. object Configuration of table-based legend. type [_required_] enum Whether or not to show a table legend.
Allowed enum values: `table,none` object Configuration of inline or automatic legends. hide_percent boolean Whether to hide the percentages of the groups. hide_value boolean Whether to hide the values of the groups. type [_required_] enum Whether to show the legend inline or let it be automatically generated. Allowed enum values: `inline,automatic` _required_] [object] List of sunburst widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. 
Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
[object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. 
limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
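The sunburst requests described above pair with the legend options listed earlier (table-based or inline) and with the style, time, and `sunburst` type fields that follow below. A minimal hedged sketch of a sunburst widget definition; the `requests`, `legend`, `formulas`, and `queries` key names are assumptions, and the query facets, search string, and palette are placeholders.

```python
import json

# Sketch of a sunburst widget definition built from the fields described above.
sunburst_widget = {
    "type": "sunburst",
    "title": "Log volume by service and status",   # placeholder title
    "hide_total": False,
    "legend": {"type": "inline", "hide_percent": False, "hide_value": True},
    "requests": [
        {
            "formulas": [{"formula": "query1"}],
            "queries": [
                {
                    "data_source": "logs",
                    "name": "query1",
                    "compute": {"aggregation": "count"},
                    "group_by": [
                        {"facet": "service", "limit": 10,
                         "sort": {"aggregation": "count", "order": "desc"}},
                        {"facet": "status", "limit": 5,
                         "sort": {"aggregation": "count", "order": "desc"}},
                    ],
                    "search": {"query": "*"},
                }
            ],
            "response_format": "scalar",
        }
    ],
    "style": {"palette": "datadog16"},   # placeholder palette
}

print(json.dumps(sunburst_widget, indent=2))
```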
object The query being made on the logs. query [_required_] string Search value to apply. object Widget style definition. palette string Color palette to apply to the widget. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the Sunburst widget. Allowed enum values: `sunburst` default: `sunburst` object The table visualization is available on timeboards and screenboards. It displays columns of metrics grouped by tag key. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. has_search_bar enum Controls the display of the search bar. Allowed enum values: `always,never,auto` _required_] [object] Widget definition. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` alias string The column name (defaults to the metric name). object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The APM stats query for table and distributions widgets. 
[object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. cell_display_mode [string] A list of display modes for each table cell. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. 
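For the APM stats query fields listed at the start of this definition, here is a minimal illustrative sketch of how such a query might look inside a table widget request. The environment, service, operation, and primary tag values are placeholders, not a definitive example.

```python
# Illustrative sketch of an APM stats query for a query_table request.
# Environment, service, operation, and primary tag values are placeholders.
apm_stats_query = {
    "env": "prod",
    "service": "web-store",
    "name": "rack.request",               # operation name associated with the service
    "primary_tag": "datacenter:us1.prod",
    "row_type": "service",                # one of: service, resource, span
    "columns": [
        {
            "name": "avg_duration",       # placeholder column name
            "alias": "Avg latency",
            "cell_display_mode": "number",
            "order": "desc",
        },
    ],
}
```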
image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. limit int64 For metric queries, the number of lines to show in the table. Only one request should have this property. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. order enum Widget sorting methods. Allowed enum values: `asc,desc` object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Query definition. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. 
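A minimal sketch of the formula and functions metrics query described above, as it might appear alongside a formula in a request's `queries` and `formulas` arrays. The metric name, scope, and formula are illustrative placeholders.

```python
# Illustrative sketch: a metrics query referenced by a formula.
# The metric name, scope, and formula expression are placeholders.
metrics_formula_request = {
    "response_format": "timeseries",
    "queries": [
        {
            "data_source": "metrics",
            "name": "cpu",                                   # name used in formulas
            "query": "avg:system.cpu.user{*} by {host}",     # placeholder metrics query
            "aggregator": "avg",
        }
    ],
    "formulas": [
        {"formula": "cpu / 100", "alias": "CPU (fraction)"}  # placeholder expression
    ],
}
```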
object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. 
Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. 
order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` text_formats [array] List of text formats for columns produced by tags. Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the table widget. Allowed enum values: `query_table` default: `query_table` object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). _required_] [object] List of timeseries widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. 
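The widget time setting described earlier in this definition accepts either the legacy `live_span` shorthand, a live span in the new format, or a fixed span. The following is a rough sketch of the three shapes with placeholder values, not an authoritative example.

```python
# Illustrative sketches of the three widget time-setting shapes.
legacy_live = {"live_span": "1h"}        # legacy shorthand

new_format_live = {                      # "live" span in the new format
    "type": "live",
    "unit": "minute",
    "value": 17,                         # for example, an arbitrary 17-minute window
}

fixed_span = {                           # fixed span, seconds since epoch
    "type": "fixed",
    "from": 1709251200,                  # placeholder start timestamp
    "to": 1709769600,                    # placeholder end timestamp
}
```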
limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. 
custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
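A minimal, illustrative sketch of the APM resource stats query described just above; the environment, service, operation, and grouping values are placeholders.

```python
# Illustrative sketch of an APM resource stats query for use in formulas.
apm_resource_stats_query = {
    "data_source": "apm_resource_stats",
    "name": "resource_hits",              # name referenced in formulas
    "env": "prod",                        # placeholder environment
    "service": "web-store",               # placeholder service
    "operation_name": "rack.request",     # placeholder operation
    "stat": "hits",
    "group_by": ["resource_name"],
}
```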
data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Axis controls for the widget. 
include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. Time setting for the widget. object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` object The top list visualization enables you to display a list of tag values, such as hostname or service, with the most or least of any metric value (for example, the highest consumers of CPU or the hosts with the least disk space). [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. [_required_] [object] List of top list widget requests. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name.
limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. 
query [_required_] string Search value to apply. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. 
object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. 
Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. 
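As a rough sketch of the event platform query described earlier in this request definition, here is an illustrative logs query with a compute aggregation, a group by, and a search filter. The facet and search values are placeholders.

```python
# Illustrative sketch of a formula and functions event platform (logs) query.
events_query = {
    "data_source": "logs",
    "name": "error_logs",                  # name used in formulas
    "compute": {"aggregation": "count"},
    "search": {"query": "status:error"},   # placeholder search string
    "indexes": [],                         # omit or use [] to query all indexes
    "group_by": [
        {
            "facet": "service",            # placeholder facet
            "limit": 10,
            "sort": {"aggregation": "count", "order": "desc"},
        }
    ],
}
```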
resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The log query. object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. object The query being made on the logs. query [_required_] string Search value to apply. object The controls for sorting the widget. count int64 The number of items to limit the widget to. [ ] The array of items to sort the widget by in order. object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. object Style customization for a top list widget. Top list widget display options. object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` object This widget displays a topology of nodes and edges for different data sources. It replaces the service map widget. [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. 
A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. _required_] [object] One or more Topology requests. object Query to service-based topology data sources like the service map or data streams. data_source enum Name of the data source Allowed enum values: `data_streams,service_map` filters [string] Your environment and primary tag (or * if enabled for your account). service string Name of the service request_type enum Widget request type. Allowed enum values: `topology` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the topology map widget. Allowed enum values: `topology_map` default: `topology_map` object The treemap visualization enables you to display hierarchical and nested data. It is well suited for queries that describe part-whole relationships, such as resource usage by availability zone, data center, or team. color_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine color in the widget. Allowed enum values: `user` default: `user` [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. group_by enum **DEPRECATED** : (deprecated) The attribute formerly used to group elements in the widget. Allowed enum values: `user,family,process` _required_] [object] List of treemap widget requests. [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. object Options for limiting results returned. count int64 Number of results to return. 
order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` object Number format options for the widget. Number format unit. object Canonical unit. per_unit_name string The name of the unit per item. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Custom unit. label string The label for the custom unit. type enum The type of custom unit. Allowed enum values: `custom_unit_label` object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. q string The widget metrics query. [ ] List of queries that can be returned directly or used in formulas. object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` object A formula and functions events query. _required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. 
Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` size_by enum **DEPRECATED** : (deprecated) The attribute formerly used to determine size in the widget. Allowed enum values: `pct_cpu,pct_mem` Time setting for the widget. object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. type [_required_] enum Type of the treemap widget. Allowed enum values: `treemap` default: `treemap` id int64 ID of the widget. object The layout for a widget on a `free` or **new dashboard layout** dashboard. height [_required_] int64 The height of the widget. Should be a non-negative integer. is_column_break boolean Whether the widget should be the first one on the second column in high density or not. **Note** : Only for the **new dashboard layout** and only one widget in the dashboard should have this property set to `true`. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. 
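To make the request fields above more concrete, the following is a minimal sketch (not taken from the API reference) of a dashboard `PUT` payload containing a toplist widget built from a formulas-and-functions metrics query, a formula `limit`, a live time span, and a widget `layout` object. The dashboard ID, metric query, title, layout values, and the default US1 API host are illustrative assumptions, and the sketch uses the generic `requests` HTTP library rather than an official Datadog client.

```
# A minimal sketch, assuming a dashboard that already exists and the default
# US1 API host. The dashboard ID, metric query, title, and layout values are
# placeholders for illustration; this is not an official Datadog example.
import json
import os

import requests  # generic HTTP client used here instead of a Datadog SDK

DASHBOARD_ID = os.environ["DASHBOARD_ID"]  # an existing dashboard to overwrite

toplist_widget = {
    "definition": {
        "type": "toplist",
        "title": "Top hosts by CPU",
        "requests": [
            {
                # Formulas-and-functions metrics query, as described above.
                "queries": [
                    {
                        "data_source": "metrics",
                        "name": "query1",
                        "query": "avg:system.cpu.user{*} by {host}",
                        "aggregator": "avg",
                    }
                ],
                # The formula references the query by name; "limit" carries the
                # count/order options for limiting results returned.
                "formulas": [
                    {"formula": "query1", "limit": {"count": 10, "order": "desc"}}
                ],
                "response_format": "scalar",
            }
        ],
        # Live span time setting using one of the allowed timeframe values.
        "time": {"live_span": "1h"},
    },
    # Widget layout object (height/width/x/y) used with the new dashboard layout.
    "layout": {"x": 0, "y": 0, "width": 4, "height": 2},
}

body = {
    "layout_type": "ordered",
    "reflow_type": "fixed",  # fixed reflow so per-widget layouts are honored
    "title": "Example dashboard with a toplist widget",
    "widgets": [toplist_widget],
}

# PUT /api/v1/dashboard/{dashboard_id}, authenticated with the same DD-API-KEY
# and DD-APPLICATION-KEY headers shown in the curl examples in this section.
resp = requests.put(
    f"https://api.datadoghq.com/api/v1/dashboard/{DASHBOARD_ID}",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    data=json.dumps(body),
)
print(resp.status_code, resp.json())
```

The same payload shape maps onto the typed client models used in the per-language code examples later in this section.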
``` { "author_handle": "test@datadoghq.com", "author_name": "John Doe", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "id": "123-abc-456", "is_read_only": false, "layout_type": "ordered", "modified_at": "2019-09-19T10:00:00.000Z", "notify_list": [], "reflow_type": "string", "restricted_roles": [], "tags": [], "template_variable_presets": [ { "name": "string", "template_variables": [ { "name": "string", "value": "string", "values": [] } ] } ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "default": "my-host", "defaults": [ "my-host-1", "my-host-2" ], "name": "host1", "prefix": "host", "type": "group" } ], "title": "", "url": "/dashboard/123-abc-456/example-dashboard-title", "widgets": [ { "definition": { "requests": { "fill": { "q": "avg:system.cpu.user{*}" } }, "type": "hostmap" }, "id": "integer", "layout": { "height": 0, "is_column_break": false, "width": 0, "x": 0, "y": 0 } } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Update a dashboard returns "OK" response Copy ``` # Path parameters export dashboard_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/${dashboard_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "layout_type": "ordered", "title": "Example-Dashboard with list_stream widget", "description": "Updated description", "widgets": [ { "definition": { "type": "list_stream", "requests": [ { "columns": [ { "width": "auto", "field": "timestamp" } ], "query": { "data_source": "apm_issue_stream", "query_string": "" }, "response_format": "event_list" } ] } } ] } EOF ``` ##### Update a dashboard with tags returns "OK" response Copy ``` # Path parameters export dashboard_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/${dashboard_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "layout_type": "ordered", "title": "Example-Dashboard with list_stream widget", "description": "Updated description", "tags": [ "team:foo", "team:bar" ], "widgets": [ { "definition": { "type": "list_stream", "requests": [ { "columns": [ { "width": "auto", "field": "timestamp" } ], "query": { "data_source": "apm_issue_stream", "query_string": "" }, "response_format": "event_list" } ] } } ] } EOF ``` ##### Update a dashboard returns "OK" response ``` // Update a dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV1.Dashboard{ LayoutType: datadogV1.DASHBOARDLAYOUTTYPE_ORDERED, Title: "Example-Dashboard with list_stream widget", Description: *datadog.NewNullableString(datadog.PtrString("Updated description")), Widgets: []datadogV1.Widget{ { Definition: datadogV1.WidgetDefinition{ ListStreamWidgetDefinition: &datadogV1.ListStreamWidgetDefinition{ Type: datadogV1.LISTSTREAMWIDGETDEFINITIONTYPE_LIST_STREAM, Requests: []datadogV1.ListStreamWidgetRequest{ { Columns: []datadogV1.ListStreamColumn{ { Width: datadogV1.LISTSTREAMCOLUMNWIDTH_AUTO, Field: "timestamp", }, }, Query: datadogV1.ListStreamQuery{ DataSource: datadogV1.LISTSTREAMSOURCE_APM_ISSUE_STREAM, QueryString: "", }, ResponseFormat: datadogV1.LISTSTREAMRESPONSEFORMAT_EVENT_LIST, }, 
}, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.UpdateDashboard(ctx, DashboardID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.UpdateDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.UpdateDashboard`:\n%s\n", responseContent) } ``` Copy ##### Update a dashboard with tags returns "OK" response ``` // Update a dashboard with tags returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV1.Dashboard{ LayoutType: datadogV1.DASHBOARDLAYOUTTYPE_ORDERED, Title: "Example-Dashboard with list_stream widget", Description: *datadog.NewNullableString(datadog.PtrString("Updated description")), Tags: *datadog.NewNullableList(&[]string{ "team:foo", "team:bar", }), Widgets: []datadogV1.Widget{ { Definition: datadogV1.WidgetDefinition{ ListStreamWidgetDefinition: &datadogV1.ListStreamWidgetDefinition{ Type: datadogV1.LISTSTREAMWIDGETDEFINITIONTYPE_LIST_STREAM, Requests: []datadogV1.ListStreamWidgetRequest{ { Columns: []datadogV1.ListStreamColumn{ { Width: datadogV1.LISTSTREAMCOLUMNWIDTH_AUTO, Field: "timestamp", }, }, Query: datadogV1.ListStreamQuery{ DataSource: datadogV1.LISTSTREAMSOURCE_APM_ISSUE_STREAM, QueryString: "", }, ResponseFormat: datadogV1.LISTSTREAMRESPONSEFORMAT_EVENT_LIST, }, }, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.UpdateDashboard(ctx, DashboardID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.UpdateDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.UpdateDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a dashboard returns "OK" response ``` // Update a dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; import com.datadog.api.client.v1.model.DashboardLayoutType; import com.datadog.api.client.v1.model.ListStreamColumn; import com.datadog.api.client.v1.model.ListStreamColumnWidth; import com.datadog.api.client.v1.model.ListStreamQuery; import com.datadog.api.client.v1.model.ListStreamResponseFormat; import com.datadog.api.client.v1.model.ListStreamSource; import com.datadog.api.client.v1.model.ListStreamWidgetDefinition; import com.datadog.api.client.v1.model.ListStreamWidgetDefinitionType; import 
com.datadog.api.client.v1.model.ListStreamWidgetRequest; import com.datadog.api.client.v1.model.Widget; import com.datadog.api.client.v1.model.WidgetDefinition; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); Dashboard body = new Dashboard() .layoutType(DashboardLayoutType.ORDERED) .title("Example-Dashboard with list_stream widget") .description("Updated description") .widgets( Collections.singletonList( new Widget() .definition( new WidgetDefinition( new ListStreamWidgetDefinition() .type(ListStreamWidgetDefinitionType.LIST_STREAM) .requests( Collections.singletonList( new ListStreamWidgetRequest() .columns( Collections.singletonList( new ListStreamColumn() .width(ListStreamColumnWidth.AUTO) .field("timestamp"))) .query( new ListStreamQuery() .dataSource( ListStreamSource.APM_ISSUE_STREAM) .queryString("")) .responseFormat( ListStreamResponseFormat.EVENT_LIST))))))); try { Dashboard result = apiInstance.updateDashboard(DASHBOARD_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#updateDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a dashboard with tags returns "OK" response ``` // Update a dashboard with tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.Dashboard; import com.datadog.api.client.v1.model.DashboardLayoutType; import com.datadog.api.client.v1.model.ListStreamColumn; import com.datadog.api.client.v1.model.ListStreamColumnWidth; import com.datadog.api.client.v1.model.ListStreamQuery; import com.datadog.api.client.v1.model.ListStreamResponseFormat; import com.datadog.api.client.v1.model.ListStreamSource; import com.datadog.api.client.v1.model.ListStreamWidgetDefinition; import com.datadog.api.client.v1.model.ListStreamWidgetDefinitionType; import com.datadog.api.client.v1.model.ListStreamWidgetRequest; import com.datadog.api.client.v1.model.Widget; import com.datadog.api.client.v1.model.WidgetDefinition; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); Dashboard body = new Dashboard() .layoutType(DashboardLayoutType.ORDERED) .title("Example-Dashboard with list_stream widget") .description("Updated description") .tags(Arrays.asList("team:foo", "team:bar")) .widgets( Collections.singletonList( new Widget() .definition( new WidgetDefinition( new ListStreamWidgetDefinition() .type(ListStreamWidgetDefinitionType.LIST_STREAM) .requests( Collections.singletonList( new ListStreamWidgetRequest() .columns( Collections.singletonList( new ListStreamColumn() .width(ListStreamColumnWidth.AUTO) .field("timestamp"))) .query( new ListStreamQuery() .dataSource( ListStreamSource.APM_ISSUE_STREAM) .queryString("")) .responseFormat( 
ListStreamResponseFormat.EVENT_LIST))))))); try { Dashboard result = apiInstance.updateDashboard(DASHBOARD_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#updateDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a dashboard returns "OK" response ``` """ Update a dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard import Dashboard from datadog_api_client.v1.model.dashboard_layout_type import DashboardLayoutType from datadog_api_client.v1.model.list_stream_column import ListStreamColumn from datadog_api_client.v1.model.list_stream_column_width import ListStreamColumnWidth from datadog_api_client.v1.model.list_stream_query import ListStreamQuery from datadog_api_client.v1.model.list_stream_response_format import ListStreamResponseFormat from datadog_api_client.v1.model.list_stream_source import ListStreamSource from datadog_api_client.v1.model.list_stream_widget_definition import ListStreamWidgetDefinition from datadog_api_client.v1.model.list_stream_widget_definition_type import ListStreamWidgetDefinitionType from datadog_api_client.v1.model.list_stream_widget_request import ListStreamWidgetRequest from datadog_api_client.v1.model.widget import Widget # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = Dashboard( layout_type=DashboardLayoutType.ORDERED, title="Example-Dashboard with list_stream widget", description="Updated description", widgets=[ Widget( definition=ListStreamWidgetDefinition( type=ListStreamWidgetDefinitionType.LIST_STREAM, requests=[ ListStreamWidgetRequest( columns=[ ListStreamColumn( width=ListStreamColumnWidth.AUTO, field="timestamp", ), ], query=ListStreamQuery( data_source=ListStreamSource.APM_ISSUE_STREAM, query_string="", ), response_format=ListStreamResponseFormat.EVENT_LIST, ), ], ), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.update_dashboard(dashboard_id=DASHBOARD_ID, body=body) print(response) ``` Copy ##### Update a dashboard with tags returns "OK" response ``` """ Update a dashboard with tags returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard import Dashboard from datadog_api_client.v1.model.dashboard_layout_type import DashboardLayoutType from datadog_api_client.v1.model.list_stream_column import ListStreamColumn from datadog_api_client.v1.model.list_stream_column_width import ListStreamColumnWidth from datadog_api_client.v1.model.list_stream_query import ListStreamQuery from datadog_api_client.v1.model.list_stream_response_format import ListStreamResponseFormat from 
datadog_api_client.v1.model.list_stream_source import ListStreamSource from datadog_api_client.v1.model.list_stream_widget_definition import ListStreamWidgetDefinition from datadog_api_client.v1.model.list_stream_widget_definition_type import ListStreamWidgetDefinitionType from datadog_api_client.v1.model.list_stream_widget_request import ListStreamWidgetRequest from datadog_api_client.v1.model.widget import Widget # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = Dashboard( layout_type=DashboardLayoutType.ORDERED, title="Example-Dashboard with list_stream widget", description="Updated description", tags=[ "team:foo", "team:bar", ], widgets=[ Widget( definition=ListStreamWidgetDefinition( type=ListStreamWidgetDefinitionType.LIST_STREAM, requests=[ ListStreamWidgetRequest( columns=[ ListStreamColumn( width=ListStreamColumnWidth.AUTO, field="timestamp", ), ], query=ListStreamQuery( data_source=ListStreamSource.APM_ISSUE_STREAM, query_string="", ), response_format=ListStreamResponseFormat.EVENT_LIST, ), ], ), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.update_dashboard(dashboard_id=DASHBOARD_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a dashboard returns "OK" response ``` # Update a dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::Dashboard.new({ layout_type: DatadogAPIClient::V1::DashboardLayoutType::ORDERED, title: "Example-Dashboard with list_stream widget", description: "Updated description", widgets: [ DatadogAPIClient::V1::Widget.new({ definition: DatadogAPIClient::V1::ListStreamWidgetDefinition.new({ type: DatadogAPIClient::V1::ListStreamWidgetDefinitionType::LIST_STREAM, requests: [ DatadogAPIClient::V1::ListStreamWidgetRequest.new({ columns: [ DatadogAPIClient::V1::ListStreamColumn.new({ width: DatadogAPIClient::V1::ListStreamColumnWidth::AUTO, field: "timestamp", }), ], query: DatadogAPIClient::V1::ListStreamQuery.new({ data_source: DatadogAPIClient::V1::ListStreamSource::APM_ISSUE_STREAM, query_string: "", }), response_format: DatadogAPIClient::V1::ListStreamResponseFormat::EVENT_LIST, }), ], }), }), ], }) p api_instance.update_dashboard(DASHBOARD_ID, body) ``` Copy ##### Update a dashboard with tags returns "OK" response ``` # Update a dashboard with tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::Dashboard.new({ layout_type: DatadogAPIClient::V1::DashboardLayoutType::ORDERED, title: "Example-Dashboard with list_stream widget", description: "Updated description", tags: [ "team:foo", "team:bar", ], widgets: [ DatadogAPIClient::V1::Widget.new({ definition: DatadogAPIClient::V1::ListStreamWidgetDefinition.new({ type: DatadogAPIClient::V1::ListStreamWidgetDefinitionType::LIST_STREAM, requests: [ 
DatadogAPIClient::V1::ListStreamWidgetRequest.new({ columns: [ DatadogAPIClient::V1::ListStreamColumn.new({ width: DatadogAPIClient::V1::ListStreamColumnWidth::AUTO, field: "timestamp", }), ], query: DatadogAPIClient::V1::ListStreamQuery.new({ data_source: DatadogAPIClient::V1::ListStreamSource::APM_ISSUE_STREAM, query_string: "", }), response_format: DatadogAPIClient::V1::ListStreamResponseFormat::EVENT_LIST, }), ], }), }), ], }) p api_instance.update_dashboard(DASHBOARD_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a dashboard returns "OK" response ``` // Update a dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::Dashboard; use datadog_api_client::datadogV1::model::DashboardLayoutType; use datadog_api_client::datadogV1::model::ListStreamColumn; use datadog_api_client::datadogV1::model::ListStreamColumnWidth; use datadog_api_client::datadogV1::model::ListStreamQuery; use datadog_api_client::datadogV1::model::ListStreamResponseFormat; use datadog_api_client::datadogV1::model::ListStreamSource; use datadog_api_client::datadogV1::model::ListStreamWidgetDefinition; use datadog_api_client::datadogV1::model::ListStreamWidgetDefinitionType; use datadog_api_client::datadogV1::model::ListStreamWidgetRequest; use datadog_api_client::datadogV1::model::Widget; use datadog_api_client::datadogV1::model::WidgetDefinition; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = Dashboard::new( DashboardLayoutType::ORDERED, "Example-Dashboard with list_stream widget".to_string(), vec![Widget::new(WidgetDefinition::ListStreamWidgetDefinition( Box::new(ListStreamWidgetDefinition::new( vec![ListStreamWidgetRequest::new( vec![ListStreamColumn::new( "timestamp".to_string(), ListStreamColumnWidth::AUTO, )], ListStreamQuery::new(ListStreamSource::APM_ISSUE_STREAM, "".to_string()), ListStreamResponseFormat::EVENT_LIST, )], ListStreamWidgetDefinitionType::LIST_STREAM, )), ))], ) .description(Some("Updated description".to_string())); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.update_dashboard(dashboard_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a dashboard with tags returns "OK" response ``` // Update a dashboard with tags returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::Dashboard; use datadog_api_client::datadogV1::model::DashboardLayoutType; use datadog_api_client::datadogV1::model::ListStreamColumn; use datadog_api_client::datadogV1::model::ListStreamColumnWidth; use datadog_api_client::datadogV1::model::ListStreamQuery; use datadog_api_client::datadogV1::model::ListStreamResponseFormat; use datadog_api_client::datadogV1::model::ListStreamSource; use datadog_api_client::datadogV1::model::ListStreamWidgetDefinition; use 
datadog_api_client::datadogV1::model::ListStreamWidgetDefinitionType; use datadog_api_client::datadogV1::model::ListStreamWidgetRequest; use datadog_api_client::datadogV1::model::Widget; use datadog_api_client::datadogV1::model::WidgetDefinition; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = Dashboard::new( DashboardLayoutType::ORDERED, "Example-Dashboard with list_stream widget".to_string(), vec![Widget::new(WidgetDefinition::ListStreamWidgetDefinition( Box::new(ListStreamWidgetDefinition::new( vec![ListStreamWidgetRequest::new( vec![ListStreamColumn::new( "timestamp".to_string(), ListStreamColumnWidth::AUTO, )], ListStreamQuery::new(ListStreamSource::APM_ISSUE_STREAM, "".to_string()), ListStreamResponseFormat::EVENT_LIST, )], ListStreamWidgetDefinitionType::LIST_STREAM, )), ))], ) .description(Some("Updated description".to_string())) .tags(Some(vec!["team:foo".to_string(), "team:bar".to_string()])); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.update_dashboard(dashboard_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a dashboard returns "OK" response ``` /** * Update a dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiUpdateDashboardRequest = { body: { layoutType: "ordered", title: "Example-Dashboard with list_stream widget", description: "Updated description", widgets: [ { definition: { type: "list_stream", requests: [ { columns: [ { width: "auto", field: "timestamp", }, ], query: { dataSource: "apm_issue_stream", queryString: "", }, responseFormat: "event_list", }, ], }, }, ], }, dashboardId: DASHBOARD_ID, }; apiInstance .updateDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a dashboard with tags returns "OK" response ``` /** * Update a dashboard with tags returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiUpdateDashboardRequest = { body: { layoutType: "ordered", title: "Example-Dashboard with list_stream widget", description: "Updated description", tags: ["team:foo", "team:bar"], widgets: [ { definition: { type: "list_stream", requests: [ { columns: [ { width: "auto", field: "timestamp", }, ], query: { dataSource: "apm_issue_stream", queryString: "", }, responseFormat: "event_list", }, ], }, }, ], }, dashboardId: DASHBOARD_ID, }; apiInstance .updateDashboard(params) .then((data: v1.Dashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#delete-a-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#delete-a-dashboard-v1) DELETE https://api.ap1.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.ap2.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.eu/api/v1/dashboard/{dashboard_id}https://api.ddog-gov.com/api/v1/dashboard/{dashboard_id}https://api.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us3.datadoghq.com/api/v1/dashboard/{dashboard_id}https://api.us5.datadoghq.com/api/v1/dashboard/{dashboard_id} ### Overview Delete a dashboard using the specified ID. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dashboard_id [_required_] string The ID of the dashboard. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboard-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Response from the delete dashboard call. Expand All Field Type Description deleted_dashboard_id string ID of the deleted dashboard. ``` { "deleted_dashboard_id": "string" } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Dashboards Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python-legacy) ##### Delete a dashboard Copy ``` # Path parameters export dashboard_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/${dashboard_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a dashboard ``` """ Delete a dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.delete_dashboard( dashboard_id=DASHBOARD_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a dashboard ``` # Delete a dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] p api_instance.delete_dashboard(DASHBOARD_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a dashboard ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dashboard_id = '' 
dog.delete_board(dashboard_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a dashboard ``` // Delete a dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.DeleteDashboard(ctx, DashboardID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.DeleteDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.DeleteDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a dashboard ``` // Delete a dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardDeleteResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); try { DashboardDeleteResponse result = apiInstance.deleteDashboard(DASHBOARD_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#deleteDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a dashboard ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) dashboard_id = '' api.Dashboard.delete(dashboard_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete a dashboard ``` // Delete a dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.delete_dashboard(dashboard_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a dashboard ``` /** * Delete a dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiDeleteDashboardRequest = { dashboardId: DASHBOARD_ID, }; apiInstance .deleteDashboard(params) .then((data: v1.DashboardDeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete dashboards](https://docs.datadoghq.com/api/latest/dashboards/#delete-dashboards) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#delete-dashboards-v1) DELETE https://api.ap1.datadoghq.com/api/v1/dashboardhttps://api.ap2.datadoghq.com/api/v1/dashboardhttps://api.datadoghq.eu/api/v1/dashboardhttps://api.ddog-gov.com/api/v1/dashboardhttps://api.datadoghq.com/api/v1/dashboardhttps://api.us3.datadoghq.com/api/v1/dashboardhttps://api.us5.datadoghq.com/api/v1/dashboard ### Overview Delete dashboards using the specified IDs. If there are any failures, no dashboards will be deleted (partial success is not allowed). This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Request #### Body Data (required) Delete dashboards request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Expand All Field Type Description _required_] [object] List of dashboard bulk action request data objects. id [_required_] string Dashboard resource ID. type [_required_] enum Dashboard resource type. 
Allowed enum values: `dashboard` default: `dashboard` ``` { "data": [ { "id": "123-abc-456", "type": "dashboard" } ] } ``` ### Response * [204](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboards-204-v1) * [400](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboards-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboards-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboards-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#DeleteDashboards-429-v1) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Dashboards Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Delete dashboards returns "No Content" response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X DELETE "https://api.datadoghq.com/api/v1/dashboard" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "123-abc-456", "type": "dashboard" }, { "id": "789-def-101", "type": "dashboard" } ] } EOF ``` ##### Delete dashboards returns "No Content" response ``` // Delete dashboards returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV1.DashboardBulkDeleteRequest{ Data: []datadogV1.DashboardBulkActionData{ { Id: DashboardID, Type: datadogV1.DASHBOARDRESOURCETYPE_DASHBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) r, err := api.DeleteDashboards(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.DeleteDashboards`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete dashboards returns "No Content" response ``` // Delete dashboards returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardBulkActionData; import com.datadog.api.client.v1.model.DashboardBulkDeleteRequest; import com.datadog.api.client.v1.model.DashboardResourceType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); DashboardBulkDeleteRequest body = new DashboardBulkDeleteRequest() .data( Collections.singletonList( new DashboardBulkActionData() .id(DASHBOARD_ID) .type(DashboardResourceType.DASHBOARD))); try { apiInstance.deleteDashboards(body); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#deleteDashboards"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete dashboards returns "No Content" response ``` """ Delete dashboards returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_bulk_action_data import DashboardBulkActionData from datadog_api_client.v1.model.dashboard_bulk_action_data_list import DashboardBulkActionDataList from datadog_api_client.v1.model.dashboard_bulk_delete_request import DashboardBulkDeleteRequest from datadog_api_client.v1.model.dashboard_resource_type import DashboardResourceType # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = DashboardBulkDeleteRequest( data=DashboardBulkActionDataList( [ DashboardBulkActionData( id=DASHBOARD_ID, type=DashboardResourceType.DASHBOARD, ), ] ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) api_instance.delete_dashboards(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example 
to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete dashboards returns "No Content" response ``` # Delete dashboards returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::DashboardBulkDeleteRequest.new({ data: [ DatadogAPIClient::V1::DashboardBulkActionData.new({ id: DASHBOARD_ID, type: DatadogAPIClient::V1::DashboardResourceType::DASHBOARD, }), ], }) api_instance.delete_dashboards(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete dashboards returns "No Content" response ``` // Delete dashboards returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardBulkActionData; use datadog_api_client::datadogV1::model::DashboardBulkDeleteRequest; use datadog_api_client::datadogV1::model::DashboardResourceType; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = DashboardBulkDeleteRequest::new(vec![DashboardBulkActionData::new( dashboard_id.clone(), DashboardResourceType::DASHBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.delete_dashboards(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete dashboards returns "No Content" response ``` /** * Delete dashboards returns "No Content" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiDeleteDashboardsRequest = { body: { data: [ { id: DASHBOARD_ID, type: "dashboard", }, ], }, }; apiInstance .deleteDashboards(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Restore deleted dashboards](https://docs.datadoghq.com/api/latest/dashboards/#restore-deleted-dashboards) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#restore-deleted-dashboards-v1) PATCH https://api.ap1.datadoghq.com/api/v1/dashboardhttps://api.ap2.datadoghq.com/api/v1/dashboardhttps://api.datadoghq.eu/api/v1/dashboardhttps://api.ddog-gov.com/api/v1/dashboardhttps://api.datadoghq.com/api/v1/dashboardhttps://api.us3.datadoghq.com/api/v1/dashboardhttps://api.us5.datadoghq.com/api/v1/dashboard ### Overview Restore dashboards using the specified IDs. If there are any failures, no dashboards will be restored (partial success is not allowed). This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Request #### Body Data (required) Restore dashboards request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Expand All Field Type Description _required_] [object] List of dashboard bulk action request data objects. id [_required_] string Dashboard resource ID. type [_required_] enum Dashboard resource type. Allowed enum values: `dashboard` default: `dashboard` ``` { "data": [ { "id": "123-abc-456", "type": "dashboard" } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/dashboards/#RestoreDashboards-204-v1) * [400](https://docs.datadoghq.com/api/latest/dashboards/#RestoreDashboards-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#RestoreDashboards-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#RestoreDashboards-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#RestoreDashboards-429-v1) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Dashboards Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Restore deleted dashboards returns "No Content" response Copy ``` ## json-request-body # # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "123-abc-456", "type": "dashboard" }, { "id": "789-def-101", "type": "dashboard" } ] } EOF ``` ##### Restore deleted dashboards returns "No Content" response ``` // Restore deleted dashboards returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV1.DashboardRestoreRequest{ Data: []datadogV1.DashboardBulkActionData{ { Id: DashboardID, Type: datadogV1.DASHBOARDRESOURCETYPE_DASHBOARD, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) r, err := api.RestoreDashboards(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.RestoreDashboards`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Restore deleted dashboards returns "No Content" response ``` // Restore deleted dashboards returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardBulkActionData; import com.datadog.api.client.v1.model.DashboardResourceType; import com.datadog.api.client.v1.model.DashboardRestoreRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); DashboardRestoreRequest body = new DashboardRestoreRequest() .data( Collections.singletonList( new DashboardBulkActionData() .id(DASHBOARD_ID) .type(DashboardResourceType.DASHBOARD))); try { apiInstance.restoreDashboards(body); } catch (ApiException e) { System.err.println("Exception 
when calling DashboardsApi#restoreDashboards"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Restore deleted dashboards returns "No Content" response ``` """ Restore deleted dashboards returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_bulk_action_data import DashboardBulkActionData from datadog_api_client.v1.model.dashboard_bulk_action_data_list import DashboardBulkActionDataList from datadog_api_client.v1.model.dashboard_resource_type import DashboardResourceType from datadog_api_client.v1.model.dashboard_restore_request import DashboardRestoreRequest # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = DashboardRestoreRequest( data=DashboardBulkActionDataList( [ DashboardBulkActionData( id=DASHBOARD_ID, type=DashboardResourceType.DASHBOARD, ), ] ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) api_instance.restore_dashboards(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Restore deleted dashboards returns "No Content" response ``` # Restore deleted dashboards returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::DashboardRestoreRequest.new({ data: [ DatadogAPIClient::V1::DashboardBulkActionData.new({ id: DASHBOARD_ID, type: DatadogAPIClient::V1::DashboardResourceType::DASHBOARD, }), ], }) api_instance.restore_dashboards(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Restore deleted dashboards returns "No Content" response ``` // Restore deleted dashboards returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardBulkActionData; use datadog_api_client::datadogV1::model::DashboardResourceType; use datadog_api_client::datadogV1::model::DashboardRestoreRequest; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = 
DashboardRestoreRequest::new(vec![DashboardBulkActionData::new( dashboard_id.clone(), DashboardResourceType::DASHBOARD, )]); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.restore_dashboards(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Restore deleted dashboards returns "No Content" response ``` /** * Restore deleted dashboards returns "No Content" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiRestoreDashboardsRequest = { body: { data: [ { id: DASHBOARD_ID, type: "dashboard", }, ], }, }; apiInstance .restoreDashboards(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-shared-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#create-a-shared-dashboard-v1) POST https://api.ap1.datadoghq.com/api/v1/dashboard/publichttps://api.ap2.datadoghq.com/api/v1/dashboard/publichttps://api.datadoghq.eu/api/v1/dashboard/publichttps://api.ddog-gov.com/api/v1/dashboard/publichttps://api.datadoghq.com/api/v1/dashboard/publichttps://api.us3.datadoghq.com/api/v1/dashboard/publichttps://api.us5.datadoghq.com/api/v1/dashboard/public ### Overview Share a specified private dashboard, generating a URL at which it can be publicly viewed. This endpoint requires any of the following permissions: * `dashboards_public_share` * `dashboards_embed_share` * `dashboards_invite_share` OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Request #### Body Data (required) Create a shared dashboard request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Expand All Field Type Description object User who shared the dashboard. handle string Identifier of the user who shared the dashboard. name string Name of the user who shared the dashboard. created date-time Date the dashboard was shared. dashboard_id [_required_] string ID of the dashboard to share. dashboard_type [_required_] enum The type of the associated private dashboard. Allowed enum values: `custom_timeboard,custom_screenboard` embeddable_domains [string] The `SharedDashboard` `embeddable_domains`. 
expiration date-time The time when an OPEN shared dashboard becomes publicly unavailable. global_time object Object containing the live span selection for the dashboard. live_span enum Dashboard global time live_span selection Allowed enum values: `15m,1h,4h,1d,2d,1w,1mo,3mo` global_time_selectable_enabled boolean Whether to allow viewers to select a different global time setting for the shared dashboard. invitees [object] The `SharedDashboard` `invitees`. access_expiration date-time Time of the invitee expiration. Null means the invite will not expire. created_at date-time Time that the invitee was created. email [_required_] string Email of the invitee. last_accessed date-time The last time the shared dashboard was accessed. Null if never accessed. public_url string URL of the shared dashboard. selectable_template_vars [object] List of objects representing template variables on the shared dashboard which can have selectable values. default_value string The default value of the template variable. name string Name of the template variable. prefix string The tag/attribute key associated with the template variable. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). visible_tags [string] List of visible tag values on the shared dashboard. share_list [string] **DEPRECATED** : List of email addresses that can receive an invitation to access to the shared dashboard. share_type enum Type of sharing access (either open to anyone who has the public URL or invite-only). Allowed enum values: `open,invite,embed` status enum Active means the dashboard is publicly available. Paused means the dashboard is not publicly available. Allowed enum values: `active,paused` title string Title of the shared dashboard. token string A unique token assigned to the shared dashboard. viewing_preferences object The viewing preferences for a shared dashboard. high_density boolean Whether the widgets on the shared dashboard should be displayed with high density. theme enum The theme of the shared dashboard view. "system" follows your system's default viewing theme. Allowed enum values: `system,light,dark` ##### Create a shared dashboard returns "OK" response ``` { "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "share_type": "open", "global_time": { "live_span": "1h" } } ``` ##### Create a shared dashboard with a group template variable returns "OK" response ``` { "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "share_type": "open", "global_time": { "live_span": "1h" }, "selectable_template_vars": [ { "default_value": "*", "name": "group_by_var", "type": "group", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ] } ```
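The `token` returned in the response below is what subsequent calls use to get, update, or delete the share. The following is a minimal end-to-end sketch that creates an open share and immediately reads back its sharing metadata, combining the `create_public_dashboard` and `get_public_dashboard` calls shown in the Python examples on this page; it assumes the returned `SharedDashboard` exposes its `token` and `public_url` fields as attributes.

```
# Create an open shared dashboard, then fetch its sharing metadata by token.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboards_api import DashboardsApi
from datadog_api_client.v1.model.dashboard_global_time import DashboardGlobalTime
from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan
from datadog_api_client.v1.model.dashboard_share_type import DashboardShareType
from datadog_api_client.v1.model.dashboard_type import DashboardType
from datadog_api_client.v1.model.shared_dashboard import SharedDashboard

# there is a valid "dashboard" in the system
DASHBOARD_ID = environ["DASHBOARD_ID"]

body = SharedDashboard(
    dashboard_id=DASHBOARD_ID,
    dashboard_type=DashboardType.CUSTOM_TIMEBOARD,
    share_type=DashboardShareType.OPEN,
    global_time=DashboardGlobalTime(
        live_span=DashboardGlobalTimeLiveSpan.PAST_ONE_HOUR,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DashboardsApi(api_client)
    shared = api_instance.create_public_dashboard(body=body)
    # public_url is the address at which the dashboard can be viewed;
    # token identifies the share for later get/update/delete calls.
    print(shared.public_url)
    metadata = api_instance.get_public_dashboard(token=shared.token)
    print(metadata)
```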
### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#CreatePublicDashboard-200-v1) * [400](https://docs.datadoghq.com/api/latest/dashboards/#CreatePublicDashboard-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#CreatePublicDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#CreatePublicDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#CreatePublicDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) The metadata object associated with how a dashboard has been/will be shared. Field Type Description author object User who shared the dashboard. handle string Identifier of the user who shared the dashboard. name string Name of the user who shared the dashboard. created date-time Date the dashboard was shared. dashboard_id [_required_] string ID of the dashboard to share. dashboard_type [_required_] enum The type of the associated private dashboard. Allowed enum values: `custom_timeboard,custom_screenboard` embeddable_domains [string] The `SharedDashboard` `embeddable_domains`. expiration date-time The time when an OPEN shared dashboard becomes publicly unavailable. global_time object Object containing the live span selection for the dashboard. live_span enum Dashboard global time live_span selection Allowed enum values: `15m,1h,4h,1d,2d,1w,1mo,3mo` global_time_selectable_enabled boolean Whether to allow viewers to select a different global time setting for the shared dashboard. invitees [object] The `SharedDashboard` `invitees`. access_expiration date-time Time of the invitee expiration. Null means the invite will not expire. created_at date-time Time that the invitee was created. email [_required_] string Email of the invitee. last_accessed date-time The last time the shared dashboard was accessed. Null if never accessed. public_url string URL of the shared dashboard. selectable_template_vars [object] List of objects representing template variables on the shared dashboard which can have selectable values. default_value string The default value of the template variable. name string Name of the template variable. prefix string The tag/attribute key associated with the template variable. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). visible_tags [string] List of visible tag values on the shared dashboard. share_list [string] **DEPRECATED** : List of email addresses that can receive an invitation to access to the shared dashboard. share_type enum Type of sharing access (either open to anyone who has the public URL or invite-only). Allowed enum values: `open,invite,embed` status enum Active means the dashboard is publicly available. Paused means the dashboard is not publicly available. Allowed enum values: `active,paused` title string Title of the shared dashboard. token string A unique token assigned to the shared dashboard. viewing_preferences object The viewing preferences for a shared dashboard. high_density boolean Whether the widgets on the shared dashboard should be displayed with high density. theme enum The theme of the shared dashboard view. "system" follows your system's default viewing theme.
Allowed enum values: `system,light,dark` ``` { "author": { "handle": "test@datadoghq.com", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "embeddable_domains": [ "https://domain.atlassian.net/", "http://myserver.com/" ], "expiration": "2019-09-19T10:00:00.000Z", "global_time": { "live_span": "1h" }, "global_time_selectable_enabled": false, "invitees": [ { "access_expiration": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "email": "test@datadoghq.com" } ], "last_accessed": "2019-09-19T10:00:00.000Z", "public_url": "string", "selectable_template_vars": [ { "default_value": "*", "name": "exampleVar", "prefix": "test", "type": "string", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ], "share_list": [ "test@datadoghq.com", "test2@email.com" ], "share_type": "string", "status": "active", "title": "string", "token": "string", "viewing_preferences": { "high_density": false, "theme": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Dashboard Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Create a shared dashboard returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "share_type": "open" } EOF ``` ##### Create a shared dashboard with a group template variable returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "share_type": "open" } EOF ``` ##### Create a shared dashboard returns "OK" response ``` // Create a shared dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := os.Getenv("DASHBOARD_ID") body := datadogV1.SharedDashboard{ DashboardId: DashboardID, DashboardType: datadogV1.DASHBOARDTYPE_CUSTOM_TIMEBOARD, ShareType: *datadogV1.NewNullableDashboardShareType(datadogV1.DASHBOARDSHARETYPE_OPEN.Ptr()), GlobalTime: &datadogV1.DashboardGlobalTime{ LiveSpan: datadogV1.DASHBOARDGLOBALTIMELIVESPAN_PAST_ONE_HOUR.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.CreatePublicDashboard(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.CreatePublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.CreatePublicDashboard`:\n%s\n", responseContent) } ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` // Create a shared dashboard with a group template variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "dashboard" in the system DashboardID := 
os.Getenv("DASHBOARD_ID") body := datadogV1.SharedDashboard{ DashboardId: DashboardID, DashboardType: datadogV1.DASHBOARDTYPE_CUSTOM_TIMEBOARD, ShareType: *datadogV1.NewNullableDashboardShareType(datadogV1.DASHBOARDSHARETYPE_OPEN.Ptr()), GlobalTime: &datadogV1.DashboardGlobalTime{ LiveSpan: datadogV1.DASHBOARDGLOBALTIMELIVESPAN_PAST_ONE_HOUR.Ptr(), }, SelectableTemplateVars: []datadogV1.SelectableTemplateVariableItems{ { DefaultValue: datadog.PtrString("*"), Name: datadog.PtrString("group_by_var"), Type: *datadog.NewNullableString(datadog.PtrString("group")), VisibleTags: *datadog.NewNullableList(&[]string{ "selectableValue1", "selectableValue2", }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.CreatePublicDashboard(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.CreatePublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.CreatePublicDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a shared dashboard returns "OK" response ``` // Create a shared dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardGlobalTime; import com.datadog.api.client.v1.model.DashboardGlobalTimeLiveSpan; import com.datadog.api.client.v1.model.DashboardShareType; import com.datadog.api.client.v1.model.DashboardType; import com.datadog.api.client.v1.model.SharedDashboard; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); SharedDashboard body = new SharedDashboard() .dashboardId(DASHBOARD_ID) .dashboardType(DashboardType.CUSTOM_TIMEBOARD) .shareType(DashboardShareType.OPEN) .globalTime( new DashboardGlobalTime().liveSpan(DashboardGlobalTimeLiveSpan.PAST_ONE_HOUR)); try { SharedDashboard result = apiInstance.createPublicDashboard(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#createPublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` // Create a shared dashboard with a group template variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardGlobalTime; import com.datadog.api.client.v1.model.DashboardGlobalTimeLiveSpan; import 
com.datadog.api.client.v1.model.DashboardShareType; import com.datadog.api.client.v1.model.DashboardType; import com.datadog.api.client.v1.model.SelectableTemplateVariableItems; import com.datadog.api.client.v1.model.SharedDashboard; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "dashboard" in the system String DASHBOARD_ID = System.getenv("DASHBOARD_ID"); SharedDashboard body = new SharedDashboard() .dashboardId(DASHBOARD_ID) .dashboardType(DashboardType.CUSTOM_TIMEBOARD) .shareType(DashboardShareType.OPEN) .globalTime( new DashboardGlobalTime().liveSpan(DashboardGlobalTimeLiveSpan.PAST_ONE_HOUR)) .selectableTemplateVars( Collections.singletonList( new SelectableTemplateVariableItems() .defaultValue("*") .name("group_by_var") .type("group") .visibleTags(Arrays.asList("selectableValue1", "selectableValue2")))); try { SharedDashboard result = apiInstance.createPublicDashboard(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#createPublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a shared dashboard returns "OK" response ``` """ Create a shared dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_global_time import DashboardGlobalTime from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan from datadog_api_client.v1.model.dashboard_share_type import DashboardShareType from datadog_api_client.v1.model.dashboard_type import DashboardType from datadog_api_client.v1.model.shared_dashboard import SharedDashboard # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = SharedDashboard( dashboard_id=DASHBOARD_ID, dashboard_type=DashboardType.CUSTOM_TIMEBOARD, share_type=DashboardShareType.OPEN, global_time=DashboardGlobalTime( live_span=DashboardGlobalTimeLiveSpan.PAST_ONE_HOUR, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.create_public_dashboard(body=body) print(response) ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` """ Create a shared dashboard with a group template variable returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_global_time import DashboardGlobalTime from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan from datadog_api_client.v1.model.dashboard_share_type 
import DashboardShareType from datadog_api_client.v1.model.dashboard_type import DashboardType from datadog_api_client.v1.model.selectable_template_variable_items import SelectableTemplateVariableItems from datadog_api_client.v1.model.shared_dashboard import SharedDashboard # there is a valid "dashboard" in the system DASHBOARD_ID = environ["DASHBOARD_ID"] body = SharedDashboard( dashboard_id=DASHBOARD_ID, dashboard_type=DashboardType.CUSTOM_TIMEBOARD, share_type=DashboardShareType.OPEN, global_time=DashboardGlobalTime( live_span=DashboardGlobalTimeLiveSpan.PAST_ONE_HOUR, ), selectable_template_vars=[ SelectableTemplateVariableItems( default_value="*", name="group_by_var", type="group", visible_tags=[ "selectableValue1", "selectableValue2", ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.create_public_dashboard(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a shared dashboard returns "OK" response ``` # Create a shared dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::SharedDashboard.new({ dashboard_id: DASHBOARD_ID, dashboard_type: DatadogAPIClient::V1::DashboardType::CUSTOM_TIMEBOARD, share_type: DatadogAPIClient::V1::DashboardShareType::OPEN, global_time: DatadogAPIClient::V1::DashboardGlobalTime.new({ live_span: DatadogAPIClient::V1::DashboardGlobalTimeLiveSpan::PAST_ONE_HOUR, }), }) p api_instance.create_public_dashboard(body) ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` # Create a shared dashboard with a group template variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "dashboard" in the system DASHBOARD_ID = ENV["DASHBOARD_ID"] body = DatadogAPIClient::V1::SharedDashboard.new({ dashboard_id: DASHBOARD_ID, dashboard_type: DatadogAPIClient::V1::DashboardType::CUSTOM_TIMEBOARD, share_type: DatadogAPIClient::V1::DashboardShareType::OPEN, global_time: DatadogAPIClient::V1::DashboardGlobalTime.new({ live_span: DatadogAPIClient::V1::DashboardGlobalTimeLiveSpan::PAST_ONE_HOUR, }), selectable_template_vars: [ DatadogAPIClient::V1::SelectableTemplateVariableItems.new({ default_value: "*", name: "group_by_var", type: "group", visible_tags: [ "selectableValue1", "selectableValue2", ], }), ], }) p api_instance.create_public_dashboard(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a shared dashboard returns "OK" response ``` // Create a shared dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use 
datadog_api_client::datadogV1::model::DashboardGlobalTime; use datadog_api_client::datadogV1::model::DashboardGlobalTimeLiveSpan; use datadog_api_client::datadogV1::model::DashboardShareType; use datadog_api_client::datadogV1::model::DashboardType; use datadog_api_client::datadogV1::model::SharedDashboard; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = SharedDashboard::new(dashboard_id.clone(), DashboardType::CUSTOM_TIMEBOARD) .global_time( DashboardGlobalTime::new().live_span(DashboardGlobalTimeLiveSpan::PAST_ONE_HOUR), ) .share_type(Some(DashboardShareType::OPEN)); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.create_public_dashboard(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` // Create a shared dashboard with a group template variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardGlobalTime; use datadog_api_client::datadogV1::model::DashboardGlobalTimeLiveSpan; use datadog_api_client::datadogV1::model::DashboardShareType; use datadog_api_client::datadogV1::model::DashboardType; use datadog_api_client::datadogV1::model::SelectableTemplateVariableItems; use datadog_api_client::datadogV1::model::SharedDashboard; #[tokio::main] async fn main() { // there is a valid "dashboard" in the system let dashboard_id = std::env::var("DASHBOARD_ID").unwrap(); let body = SharedDashboard::new(dashboard_id.clone(), DashboardType::CUSTOM_TIMEBOARD) .global_time( DashboardGlobalTime::new().live_span(DashboardGlobalTimeLiveSpan::PAST_ONE_HOUR), ) .selectable_template_vars(Some(vec![SelectableTemplateVariableItems::new() .default_value("*".to_string()) .name("group_by_var".to_string()) .type_(Some("group".to_string())) .visible_tags(Some(vec![ "selectableValue1".to_string(), "selectableValue2".to_string(), ]))])) .share_type(Some(DashboardShareType::OPEN)); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.create_public_dashboard(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a shared dashboard returns "OK" response ``` /** * Create a shared dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiCreatePublicDashboardRequest = { body: { dashboardId: DASHBOARD_ID, dashboardType: "custom_timeboard", shareType: "open", globalTime: { liveSpan: "1h", }, }, }; apiInstance .createPublicDashboard(params) .then((data: v1.SharedDashboard) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a shared dashboard with a group template variable returns "OK" response ``` /** * Create a shared dashboard with a group template variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "dashboard" in the system const DASHBOARD_ID = process.env.DASHBOARD_ID as string; const params: v1.DashboardsApiCreatePublicDashboardRequest = { body: { dashboardId: DASHBOARD_ID, dashboardType: "custom_timeboard", shareType: "open", globalTime: { liveSpan: "1h", }, selectableTemplateVars: [ { defaultValue: "*", name: "group_by_var", type: "group", visibleTags: ["selectableValue1", "selectableValue2"], }, ], }, }; apiInstance .createPublicDashboard(params) .then((data: v1.SharedDashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-a-shared-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#get-a-shared-dashboard-v1) GET https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}https://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}https://api.datadoghq.eu/api/v1/dashboard/public/{token}https://api.ddog-gov.com/api/v1/dashboard/public/{token}https://api.datadoghq.com/api/v1/dashboard/public/{token}https://api.us3.datadoghq.com/api/v1/dashboard/public/{token}https://api.us5.datadoghq.com/api/v1/dashboard/public/{token} ### Overview Fetch an existing shared dashboard’s sharing metadata associated with the specified token. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string The token of the shared dashboard. Generated when a dashboard is shared. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboard-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) The metadata object associated with how a dashboard has been/will be shared. Expand All Field Type Description object User who shared the dashboard. handle string Identifier of the user who shared the dashboard. name string Name of the user who shared the dashboard. created date-time Date the dashboard was shared. dashboard_id [_required_] string ID of the dashboard to share. dashboard_type [_required_] enum The type of the associated private dashboard. 
Allowed enum values: `custom_timeboard,custom_screenboard` embeddable_domains [string] The `SharedDashboard` `embeddable_domains`. expiration date-time The time when an OPEN shared dashboard becomes publicly unavailable. object Object containing the live span selection for the dashboard. live_span enum Dashboard global time live_span selection Allowed enum values: `15m,1h,4h,1d,2d,1w,1mo,3mo` global_time_selectable_enabled boolean Whether to allow viewers to select a different global time setting for the shared dashboard. [object] The `SharedDashboard` `invitees`. access_expiration date-time Time of the invitee expiration. Null means the invite will not expire. created_at date-time Time that the invitee was created. email [_required_] string Email of the invitee. last_accessed date-time The last time the shared dashboard was accessed. Null if never accessed. public_url string URL of the shared dashboard. [object] List of objects representing template variables on the shared dashboard which can have selectable values. default_value string The default value of the template variable. name string Name of the template variable. prefix string The tag/attribute key associated with the template variable. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). visible_tags [string] List of visible tag values on the shared dashboard. share_list [string] **DEPRECATED** : List of email addresses that can receive an invitation to access to the shared dashboard. share_type enum Type of sharing access (either open to anyone who has the public URL or invite-only). Allowed enum values: `open,invite,embed` status enum Active means the dashboard is publicly available. Paused means the dashboard is not publicly available. Allowed enum values: `active,paused` title string Title of the shared dashboard. token string A unique token assigned to the shared dashboard. object The viewing preferences for a shared dashboard. high_density boolean Whether the widgets on the shared dashboard should be displayed with high density. theme enum The theme of the shared dashboard view. "system" follows your system's default viewing theme. Allowed enum values: `system,light,dark` ``` { "author": { "handle": "test@datadoghq.com", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "embeddable_domains": [ "https://domain.atlassian.net/", "http://myserver.com/" ], "expiration": "2019-09-19T10:00:00.000Z", "global_time": { "live_span": "1h" }, "global_time_selectable_enabled": false, "invitees": [ { "access_expiration": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "email": "test@datadoghq.com" } ], "last_accessed": "2019-09-19T10:00:00.000Z", "public_url": "string", "selectable_template_vars": [ { "default_value": "*", "name": "exampleVar", "prefix": "test", "type": "string", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ], "share_list": [ "test@datadoghq.com", "test2@email.com" ], "share_type": "string", "status": "active", "title": "string", "token": "string", "viewing_preferences": { "high_density": false, "theme": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Shared Dashboard Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Get a shared dashboard Copy ``` # Path parameters export token="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a shared dashboard ``` """ Get a shared dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.get_public_dashboard( token=SHARED_DASHBOARD_TOKEN, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a shared dashboard ``` # Get a shared dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = ENV["SHARED_DASHBOARD_TOKEN"] p api_instance.get_public_dashboard(SHARED_DASHBOARD_TOKEN) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a shared dashboard ``` // Get a shared dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "shared_dashboard" in the system SharedDashboardToken := os.Getenv("SHARED_DASHBOARD_TOKEN") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.GetPublicDashboard(ctx, SharedDashboardToken) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.GetPublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.GetPublicDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a shared dashboard ``` // Get a shared dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.SharedDashboard; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "shared_dashboard" in the system String SHARED_DASHBOARD_TOKEN = System.getenv("SHARED_DASHBOARD_TOKEN"); try { SharedDashboard result = apiInstance.getPublicDashboard(SHARED_DASHBOARD_TOKEN); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#getPublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a shared dashboard ``` // Get a shared dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; #[tokio::main] async fn main() { // there is a valid "shared_dashboard" in the system let shared_dashboard_token = std::env::var("SHARED_DASHBOARD_TOKEN").unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .get_public_dashboard(shared_dashboard_token.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a shared dashboard ``` /** * Get a 
shared dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "shared_dashboard" in the system const SHARED_DASHBOARD_TOKEN = process.env.SHARED_DASHBOARD_TOKEN as string; const params: v1.DashboardsApiGetPublicDashboardRequest = { token: SHARED_DASHBOARD_TOKEN, }; apiInstance .getPublicDashboard(params) .then((data: v1.SharedDashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-shared-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#update-a-shared-dashboard-v1) PUT https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}https://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}https://api.datadoghq.eu/api/v1/dashboard/public/{token}https://api.ddog-gov.com/api/v1/dashboard/public/{token}https://api.datadoghq.com/api/v1/dashboard/public/{token}https://api.us3.datadoghq.com/api/v1/dashboard/public/{token}https://api.us5.datadoghq.com/api/v1/dashboard/public/{token} ### Overview Update a shared dashboard associated with the specified token. This endpoint requires any of the following permissions: * `dashboards_public_share` * `dashboards_embed_share` * `dashboards_invite_share` OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string The token of the shared dashboard. ### Request #### Body Data (required) Update Dashboard request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Expand All Field Type Description embeddable_domains [string] The `SharedDashboard` `embeddable_domains`. expiration date-time The time when an OPEN shared dashboard becomes publicly unavailable. object Timeframe setting for the shared dashboard. live_span enum Dashboard global time live_span selection Allowed enum values: `15m,1h,4h,1d,2d,1w,1mo,3mo` global_time_selectable_enabled boolean Whether to allow viewers to select a different global time setting for the shared dashboard. [object] The `SharedDashboard` `invitees`. access_expiration date-time Time of the invitee expiration. Null means the invite will not expire. created_at date-time Time that the invitee was created. email [_required_] string Email of the invitee. [object] List of objects representing template variables on the shared dashboard which can have selectable values. default_value string The default value of the template variable. name string Name of the template variable. prefix string The tag/attribute key associated with the template variable. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). 
visible_tags [string] List of visible tag values on the shared dashboard. share_list [string] **DEPRECATED** : List of email addresses that can be given access to the shared dashboard. share_type enum Type of sharing access (either open to anyone who has the public URL or invite-only). Allowed enum values: `open,invite,embed` status enum Active means the dashboard is publicly available. Paused means the dashboard is not publicly available. Allowed enum values: `active,paused` title string Title of the shared dashboard. object The viewing preferences for a shared dashboard. high_density boolean Whether the widgets on the shared dashboard should be displayed with high density. theme enum The theme of the shared dashboard view. "system" follows your system's default viewing theme. Allowed enum values: `system,light,dark` ##### Update a shared dashboard returns "OK" response ``` { "global_time": { "live_span": "15m" }, "share_list": [], "share_type": "open" } ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` { "global_time": { "live_span": "15m" }, "share_list": [], "share_type": "open", "selectable_template_vars": [ { "default_value": "*", "name": "group_by_var", "type": "group", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#UpdatePublicDashboard-200-v1) * [400](https://docs.datadoghq.com/api/latest/dashboards/#UpdatePublicDashboard-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#UpdatePublicDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#UpdatePublicDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#UpdatePublicDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) The metadata object associated with how a dashboard has been/will be shared. Expand All Field Type Description object User who shared the dashboard. handle string Identifier of the user who shared the dashboard. name string Name of the user who shared the dashboard. created date-time Date the dashboard was shared. dashboard_id [_required_] string ID of the dashboard to share. dashboard_type [_required_] enum The type of the associated private dashboard. Allowed enum values: `custom_timeboard,custom_screenboard` embeddable_domains [string] The `SharedDashboard` `embeddable_domains`. expiration date-time The time when an OPEN shared dashboard becomes publicly unavailable. object Object containing the live span selection for the dashboard. live_span enum Dashboard global time live_span selection Allowed enum values: `15m,1h,4h,1d,2d,1w,1mo,3mo` global_time_selectable_enabled boolean Whether to allow viewers to select a different global time setting for the shared dashboard. [object] The `SharedDashboard` `invitees`. access_expiration date-time Time of the invitee expiration. Null means the invite will not expire. created_at date-time Time that the invitee was created. email [_required_] string Email of the invitee. last_accessed date-time The last time the shared dashboard was accessed. Null if never accessed. public_url string URL of the shared dashboard. [object] List of objects representing template variables on the shared dashboard which can have selectable values. default_value string The default value of the template variable. name string Name of the template variable. 
prefix string The tag/attribute key associated with the template variable. type string The type of variable. This is to differentiate between filter variables (interpolated in query) and group by variables (interpolated into group by). visible_tags [string] List of visible tag values on the shared dashboard. share_list [string] **DEPRECATED** : List of email addresses that can receive an invitation to access to the shared dashboard. share_type enum Type of sharing access (either open to anyone who has the public URL or invite-only). Allowed enum values: `open,invite,embed` status enum Active means the dashboard is publicly available. Paused means the dashboard is not publicly available. Allowed enum values: `active,paused` title string Title of the shared dashboard. token string A unique token assigned to the shared dashboard. object The viewing preferences for a shared dashboard. high_density boolean Whether the widgets on the shared dashboard should be displayed with high density. theme enum The theme of the shared dashboard view. "system" follows your system's default viewing theme. Allowed enum values: `system,light,dark` ``` { "author": { "handle": "test@datadoghq.com", "name": "string" }, "created": "2019-09-19T10:00:00.000Z", "dashboard_id": "123-abc-456", "dashboard_type": "custom_timeboard", "embeddable_domains": [ "https://domain.atlassian.net/", "http://myserver.com/" ], "expiration": "2019-09-19T10:00:00.000Z", "global_time": { "live_span": "1h" }, "global_time_selectable_enabled": false, "invitees": [ { "access_expiration": "2019-09-19T10:00:00.000Z", "created_at": "2019-09-19T10:00:00.000Z", "email": "test@datadoghq.com" } ], "last_accessed": "2019-09-19T10:00:00.000Z", "public_url": "string", "selectable_template_vars": [ { "default_value": "*", "name": "exampleVar", "prefix": "test", "type": "string", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ], "share_list": [ "test@datadoghq.com", "test2@email.com" ], "share_type": "string", "status": "active", "title": "string", "token": "string", "viewing_preferences": { "high_density": false, "theme": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Update a shared dashboard returns "OK" response Copy ``` ## json-request-body # # Path parameters export token="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "global_time": { "live_span": "1h" }, "selectable_template_vars": [ { "default_value": "*", "name": "exampleVar", "prefix": "test", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ], "share_list": [ "test@datadoghq.com", "test2@datadoghq.com" ], "share_type": "invite" } EOF ``` ##### Update a shared dashboard with selectable_template_vars returns "OK" response Copy ``` ## json-request-body # # Path parameters export token="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "global_time": { "live_span": "1h" }, "selectable_template_vars": [ { "default_value": "*", "name": "exampleVar", "prefix": "test", "visible_tags": [ "selectableValue1", "selectableValue2" ] } ], "share_list": [ "test@datadoghq.com", "test2@datadoghq.com" ], "share_type": "invite" } EOF ``` ##### Update a shared dashboard returns "OK" response ``` // Update a shared dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "shared_dashboard" in the system SharedDashboardToken := os.Getenv("SHARED_DASHBOARD_TOKEN") body := datadogV1.SharedDashboardUpdateRequest{ GlobalTime: *datadogV1.NewNullableSharedDashboardUpdateRequestGlobalTime(&datadogV1.SharedDashboardUpdateRequestGlobalTime{ LiveSpan: datadogV1.DASHBOARDGLOBALTIMELIVESPAN_PAST_FIFTEEN_MINUTES.Ptr(), }), ShareList: *datadog.NewNullableList(&[]string{}), ShareType: *datadogV1.NewNullableDashboardShareType(datadogV1.DASHBOARDSHARETYPE_OPEN.Ptr()), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.UpdatePublicDashboard(ctx, SharedDashboardToken, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.UpdatePublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) 
} responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.UpdatePublicDashboard`:\n%s\n", responseContent) } ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` // Update a shared dashboard with selectable_template_vars returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "shared_dashboard" in the system SharedDashboardToken := os.Getenv("SHARED_DASHBOARD_TOKEN") body := datadogV1.SharedDashboardUpdateRequest{ GlobalTime: *datadogV1.NewNullableSharedDashboardUpdateRequestGlobalTime(&datadogV1.SharedDashboardUpdateRequestGlobalTime{ LiveSpan: datadogV1.DASHBOARDGLOBALTIMELIVESPAN_PAST_FIFTEEN_MINUTES.Ptr(), }), ShareList: *datadog.NewNullableList(&[]string{}), ShareType: *datadogV1.NewNullableDashboardShareType(datadogV1.DASHBOARDSHARETYPE_OPEN.Ptr()), SelectableTemplateVars: []datadogV1.SelectableTemplateVariableItems{ { DefaultValue: datadog.PtrString("*"), Name: datadog.PtrString("group_by_var"), Type: *datadog.NewNullableString(datadog.PtrString("group")), VisibleTags: *datadog.NewNullableList(&[]string{ "selectableValue1", "selectableValue2", }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.UpdatePublicDashboard(ctx, SharedDashboardToken, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.UpdatePublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.UpdatePublicDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a shared dashboard returns "OK" response ``` // Update a shared dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardGlobalTimeLiveSpan; import com.datadog.api.client.v1.model.DashboardShareType; import com.datadog.api.client.v1.model.SharedDashboard; import com.datadog.api.client.v1.model.SharedDashboardUpdateRequest; import com.datadog.api.client.v1.model.SharedDashboardUpdateRequestGlobalTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "shared_dashboard" in the system String SHARED_DASHBOARD_TOKEN = System.getenv("SHARED_DASHBOARD_TOKEN"); SharedDashboardUpdateRequest body = new SharedDashboardUpdateRequest() .globalTime( new SharedDashboardUpdateRequestGlobalTime() .liveSpan(DashboardGlobalTimeLiveSpan.PAST_FIFTEEN_MINUTES)) .shareType(DashboardShareType.OPEN); try { SharedDashboard result = apiInstance.updatePublicDashboard(SHARED_DASHBOARD_TOKEN, body); System.out.println(result); } 
catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#updatePublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` // Update a shared dashboard with selectable_template_vars returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardGlobalTimeLiveSpan; import com.datadog.api.client.v1.model.DashboardShareType; import com.datadog.api.client.v1.model.SelectableTemplateVariableItems; import com.datadog.api.client.v1.model.SharedDashboard; import com.datadog.api.client.v1.model.SharedDashboardUpdateRequest; import com.datadog.api.client.v1.model.SharedDashboardUpdateRequestGlobalTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "shared_dashboard" in the system String SHARED_DASHBOARD_TOKEN = System.getenv("SHARED_DASHBOARD_TOKEN"); SharedDashboardUpdateRequest body = new SharedDashboardUpdateRequest() .globalTime( new SharedDashboardUpdateRequestGlobalTime() .liveSpan(DashboardGlobalTimeLiveSpan.PAST_FIFTEEN_MINUTES)) .shareType(DashboardShareType.OPEN) .selectableTemplateVars( Collections.singletonList( new SelectableTemplateVariableItems() .defaultValue("*") .name("group_by_var") .type("group") .visibleTags(Arrays.asList("selectableValue1", "selectableValue2")))); try { SharedDashboard result = apiInstance.updatePublicDashboard(SHARED_DASHBOARD_TOKEN, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#updatePublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a shared dashboard returns "OK" response ``` """ Update a shared dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan from datadog_api_client.v1.model.dashboard_share_type import DashboardShareType from datadog_api_client.v1.model.shared_dashboard_update_request import SharedDashboardUpdateRequest from datadog_api_client.v1.model.shared_dashboard_update_request_global_time import ( SharedDashboardUpdateRequestGlobalTime, ) # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"] body = SharedDashboardUpdateRequest( global_time=SharedDashboardUpdateRequestGlobalTime( 
live_span=DashboardGlobalTimeLiveSpan.PAST_FIFTEEN_MINUTES, ), share_list=[], share_type=DashboardShareType.OPEN, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.update_public_dashboard(token=SHARED_DASHBOARD_TOKEN, body=body) print(response) ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` """ Update a shared dashboard with selectable_template_vars returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan from datadog_api_client.v1.model.dashboard_share_type import DashboardShareType from datadog_api_client.v1.model.selectable_template_variable_items import SelectableTemplateVariableItems from datadog_api_client.v1.model.shared_dashboard_update_request import SharedDashboardUpdateRequest from datadog_api_client.v1.model.shared_dashboard_update_request_global_time import ( SharedDashboardUpdateRequestGlobalTime, ) # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"] body = SharedDashboardUpdateRequest( global_time=SharedDashboardUpdateRequestGlobalTime( live_span=DashboardGlobalTimeLiveSpan.PAST_FIFTEEN_MINUTES, ), share_list=[], share_type=DashboardShareType.OPEN, selectable_template_vars=[ SelectableTemplateVariableItems( default_value="*", name="group_by_var", type="group", visible_tags=[ "selectableValue1", "selectableValue2", ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.update_public_dashboard(token=SHARED_DASHBOARD_TOKEN, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a shared dashboard returns "OK" response ``` # Update a shared dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = ENV["SHARED_DASHBOARD_TOKEN"] body = DatadogAPIClient::V1::SharedDashboardUpdateRequest.new({ global_time: DatadogAPIClient::V1::SharedDashboardUpdateRequestGlobalTime.new({ live_span: DatadogAPIClient::V1::DashboardGlobalTimeLiveSpan::PAST_FIFTEEN_MINUTES, }), share_list: [], share_type: DatadogAPIClient::V1::DashboardShareType::OPEN, }) p api_instance.update_public_dashboard(SHARED_DASHBOARD_TOKEN, body) ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` # Update a shared dashboard with selectable_template_vars returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = ENV["SHARED_DASHBOARD_TOKEN"] body = DatadogAPIClient::V1::SharedDashboardUpdateRequest.new({ global_time: DatadogAPIClient::V1::SharedDashboardUpdateRequestGlobalTime.new({ live_span: DatadogAPIClient::V1::DashboardGlobalTimeLiveSpan::PAST_FIFTEEN_MINUTES, 
}), share_list: [], share_type: DatadogAPIClient::V1::DashboardShareType::OPEN, selectable_template_vars: [ DatadogAPIClient::V1::SelectableTemplateVariableItems.new({ default_value: "*", name: "group_by_var", type: "group", visible_tags: [ "selectableValue1", "selectableValue2", ], }), ], }) p api_instance.update_public_dashboard(SHARED_DASHBOARD_TOKEN, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a shared dashboard returns "OK" response ``` // Update a shared dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardGlobalTimeLiveSpan; use datadog_api_client::datadogV1::model::DashboardShareType; use datadog_api_client::datadogV1::model::SharedDashboardUpdateRequest; use datadog_api_client::datadogV1::model::SharedDashboardUpdateRequestGlobalTime; #[tokio::main] async fn main() { // there is a valid "shared_dashboard" in the system let shared_dashboard_token = std::env::var("SHARED_DASHBOARD_TOKEN").unwrap(); let body = SharedDashboardUpdateRequest::new() .global_time(Some( SharedDashboardUpdateRequestGlobalTime::new() .live_span(DashboardGlobalTimeLiveSpan::PAST_FIFTEEN_MINUTES), )) .share_list(Some(vec![])) .share_type(Some(DashboardShareType::OPEN)); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .update_public_dashboard(shared_dashboard_token.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` // Update a shared dashboard with selectable_template_vars returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardGlobalTimeLiveSpan; use datadog_api_client::datadogV1::model::DashboardShareType; use datadog_api_client::datadogV1::model::SelectableTemplateVariableItems; use datadog_api_client::datadogV1::model::SharedDashboardUpdateRequest; use datadog_api_client::datadogV1::model::SharedDashboardUpdateRequestGlobalTime; #[tokio::main] async fn main() { // there is a valid "shared_dashboard" in the system let shared_dashboard_token = std::env::var("SHARED_DASHBOARD_TOKEN").unwrap(); let body = SharedDashboardUpdateRequest::new() .global_time(Some( SharedDashboardUpdateRequestGlobalTime::new() .live_span(DashboardGlobalTimeLiveSpan::PAST_FIFTEEN_MINUTES), )) .selectable_template_vars(Some(vec![SelectableTemplateVariableItems::new() .default_value("*".to_string()) .name("group_by_var".to_string()) .type_(Some("group".to_string())) .visible_tags(Some(vec![ "selectableValue1".to_string(), "selectableValue2".to_string(), ]))])) .share_list(Some(vec![])) .share_type(Some(DashboardShareType::OPEN)); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .update_public_dashboard(shared_dashboard_token.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` 
Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a shared dashboard returns "OK" response ``` /** * Update a shared dashboard returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "shared_dashboard" in the system const SHARED_DASHBOARD_TOKEN = process.env.SHARED_DASHBOARD_TOKEN as string; const params: v1.DashboardsApiUpdatePublicDashboardRequest = { body: { globalTime: { liveSpan: "15m", }, shareList: [], shareType: "open", }, token: SHARED_DASHBOARD_TOKEN, }; apiInstance .updatePublicDashboard(params) .then((data: v1.SharedDashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a shared dashboard with selectable_template_vars returns "OK" response ``` /** * Update a shared dashboard with selectable_template_vars returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "shared_dashboard" in the system const SHARED_DASHBOARD_TOKEN = process.env.SHARED_DASHBOARD_TOKEN as string; const params: v1.DashboardsApiUpdatePublicDashboardRequest = { body: { globalTime: { liveSpan: "15m", }, shareList: [], shareType: "open", selectableTemplateVars: [ { defaultValue: "*", name: "group_by_var", type: "group", visibleTags: ["selectableValue1", "selectableValue2"], }, ], }, token: SHARED_DASHBOARD_TOKEN, }; apiInstance .updatePublicDashboard(params) .then((data: v1.SharedDashboard) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Send shared dashboard invitation email](https://docs.datadoghq.com/api/latest/dashboards/#send-shared-dashboard-invitation-email) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#send-shared-dashboard-invitation-email-v1) POST https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.eu/api/v1/dashboard/public/{token}/invitationhttps://api.ddog-gov.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us3.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us5.datadoghq.com/api/v1/dashboard/public/{token}/invitation ### Overview Send emails to specified email addresses containing links to access a given authenticated shared dashboard. Email addresses must already belong to the authenticated shared dashboard’s share_list. 
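Because the invitation endpoint only emails addresses that are already on the dashboard's `share_list`, a common sequence is to update the share settings first and then send the invite. The sketch below is not one of the official client samples; it chains the documented `update_public_dashboard` and `send_public_dashboard_invitation` calls with the Python client and assumes a `DashboardShareType.INVITE` member mirroring the documented `invite` enum value.

```python
"""
Add an address to a shared dashboard's share_list, then email it an invitation.
Sketch only: DashboardShareType.INVITE and the 15m live span are assumptions,
not values taken from this page's official examples.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.dashboards_api import DashboardsApi
from datadog_api_client.v1.model.dashboard_global_time_live_span import DashboardGlobalTimeLiveSpan
from datadog_api_client.v1.model.dashboard_invite_type import DashboardInviteType
from datadog_api_client.v1.model.dashboard_share_type import DashboardShareType
from datadog_api_client.v1.model.shared_dashboard_invites import SharedDashboardInvites
from datadog_api_client.v1.model.shared_dashboard_invites_data_object import SharedDashboardInvitesDataObject
from datadog_api_client.v1.model.shared_dashboard_invites_data_object_attributes import (
    SharedDashboardInvitesDataObjectAttributes,
)
from datadog_api_client.v1.model.shared_dashboard_update_request import SharedDashboardUpdateRequest
from datadog_api_client.v1.model.shared_dashboard_update_request_global_time import (
    SharedDashboardUpdateRequestGlobalTime,
)

SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"]
INVITEE = "test@datadoghq.com"  # hypothetical invitee address

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DashboardsApi(api_client)

    # 1. Put the address on the share_list and use invite-only sharing.
    api_instance.update_public_dashboard(
        token=SHARED_DASHBOARD_TOKEN,
        body=SharedDashboardUpdateRequest(
            global_time=SharedDashboardUpdateRequestGlobalTime(
                live_span=DashboardGlobalTimeLiveSpan.PAST_FIFTEEN_MINUTES,
            ),
            share_list=[INVITEE],
            share_type=DashboardShareType.INVITE,  # assumed member for the `invite` value
        ),
    )

    # 2. Send the invitation email to the now-authorized address.
    response = api_instance.send_public_dashboard_invitation(
        token=SHARED_DASHBOARD_TOKEN,
        body=SharedDashboardInvites(
            data=SharedDashboardInvitesDataObject(
                attributes=SharedDashboardInvitesDataObjectAttributes(email=INVITEE),
                type=DashboardInviteType.PUBLIC_DASHBOARD_INVITATION,
            ),
        ),
    )
    print(response)
```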
This endpoint requires the `dashboards_invite_share` permission. OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string The token of the shared dashboard. ### Request #### Body Data (required) Shared Dashboard Invitation request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Field Type Description data [_required_] An object or list of objects containing the information for an invitation to a shared dashboard. Option 1 object Object containing the information for an invitation to a shared dashboard. attributes [_required_] object Attributes of the shared dashboard invitation. created_at date-time When the invitation was sent. email string An email address that an invitation has been (or, if used in an invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. Null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` Option 2 [object] A list of objects containing the information for one or more invitations to a shared dashboard. attributes [_required_] object Attributes of the shared dashboard invitation. created_at date-time When the invitation was sent. email string An email address that an invitation has been (or, if used in an invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. Null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` meta object Pagination metadata returned by the API. page object Object containing the total count of invitations across all pages. total_count int64 The total number of invitations on this shared dashboard, across all pages. ``` { "data": { "attributes": { "email": "exampledashboard@datadoghq.com" }, "type": "public_dashboard_invitation" } } ``` ### Response * [201](https://docs.datadoghq.com/api/latest/dashboards/#SendPublicDashboardInvitation-201-v1) * [400](https://docs.datadoghq.com/api/latest/dashboards/#SendPublicDashboardInvitation-400-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#SendPublicDashboardInvitation-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#SendPublicDashboardInvitation-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#SendPublicDashboardInvitation-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Invitations data and metadata that exist for a shared dashboard, returned by the API. Field Type Description data [_required_] An object or list of objects containing the information for an invitation to a shared dashboard. Option 1 object Object containing the information for an invitation to a shared dashboard. attributes [_required_] object Attributes of the shared dashboard invitation. created_at date-time When the invitation was sent. email string An email address that an invitation has been (or, if used in an invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. Null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` Option 2 [object] A list of objects containing the information for one or more invitations to a shared dashboard. attributes [_required_] object Attributes of the shared dashboard invitation. created_at date-time When the invitation was sent. email string An email address that an invitation has been (or, if used in an invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. Null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` meta object Pagination metadata returned by the API. page object Object containing the total count of invitations across all pages. total_count int64 The total number of invitations on this shared dashboard, across all pages. ``` { "data": [ { "attributes": { "email": "test@datadoghq.com" }, "type": "public_dashboard_invitation" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Send shared dashboard invitation email returns "OK" response Copy ``` ## json-request-body # # Path parameters export token="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}/invitation" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "attributes": { "email": "test@datadoghq.com" }, "type": "public_dashboard_invitation" } ] } EOF ``` ##### Send shared dashboard invitation email returns "OK" response ``` // Send shared dashboard invitation email returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "shared_dashboard" in the system SharedDashboardToken := os.Getenv("SHARED_DASHBOARD_TOKEN") body := datadogV1.SharedDashboardInvites{ Data: datadogV1.SharedDashboardInvitesData{ SharedDashboardInvitesDataObject: &datadogV1.SharedDashboardInvitesDataObject{ Attributes: datadogV1.SharedDashboardInvitesDataObjectAttributes{ Email: datadog.PtrString("exampledashboard@datadoghq.com"), }, Type: datadogV1.DASHBOARDINVITETYPE_PUBLIC_DASHBOARD_INVITATION, }}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.SendPublicDashboardInvitation(ctx, SharedDashboardToken, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.SendPublicDashboardInvitation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.SendPublicDashboardInvitation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Send shared dashboard invitation email returns "OK" response ``` // Send shared dashboard invitation email returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardInviteType; import com.datadog.api.client.v1.model.SharedDashboardInvites; import com.datadog.api.client.v1.model.SharedDashboardInvitesData; import 
com.datadog.api.client.v1.model.SharedDashboardInvitesDataObject; import com.datadog.api.client.v1.model.SharedDashboardInvitesDataObjectAttributes; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "shared_dashboard" in the system String SHARED_DASHBOARD_TOKEN = System.getenv("SHARED_DASHBOARD_TOKEN"); SharedDashboardInvites body = new SharedDashboardInvites() .data( new SharedDashboardInvitesData( new SharedDashboardInvitesDataObject() .attributes( new SharedDashboardInvitesDataObjectAttributes() .email("exampledashboard@datadoghq.com")) .type(DashboardInviteType.PUBLIC_DASHBOARD_INVITATION))); try { SharedDashboardInvites result = apiInstance.sendPublicDashboardInvitation(SHARED_DASHBOARD_TOKEN, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#sendPublicDashboardInvitation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Send shared dashboard invitation email returns "OK" response ``` """ Send shared dashboard invitation email returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_invite_type import DashboardInviteType from datadog_api_client.v1.model.shared_dashboard_invites import SharedDashboardInvites from datadog_api_client.v1.model.shared_dashboard_invites_data_object import SharedDashboardInvitesDataObject from datadog_api_client.v1.model.shared_dashboard_invites_data_object_attributes import ( SharedDashboardInvitesDataObjectAttributes, ) # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"] body = SharedDashboardInvites( data=SharedDashboardInvitesDataObject( attributes=SharedDashboardInvitesDataObjectAttributes( email="exampledashboard@datadoghq.com", ), type=DashboardInviteType.PUBLIC_DASHBOARD_INVITATION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.send_public_dashboard_invitation(token=SHARED_DASHBOARD_TOKEN, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Send shared dashboard invitation email returns "OK" response ``` # Send shared dashboard invitation email returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = 
ENV["SHARED_DASHBOARD_TOKEN"] body = DatadogAPIClient::V1::SharedDashboardInvites.new({ data: DatadogAPIClient::V1::SharedDashboardInvitesDataObject.new({ attributes: DatadogAPIClient::V1::SharedDashboardInvitesDataObjectAttributes.new({ email: "exampledashboard@datadoghq.com", }), type: DatadogAPIClient::V1::DashboardInviteType::PUBLIC_DASHBOARD_INVITATION, }), }) p api_instance.send_public_dashboard_invitation(SHARED_DASHBOARD_TOKEN, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Send shared dashboard invitation email returns "OK" response ``` // Send shared dashboard invitation email returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardInviteType; use datadog_api_client::datadogV1::model::SharedDashboardInvites; use datadog_api_client::datadogV1::model::SharedDashboardInvitesData; use datadog_api_client::datadogV1::model::SharedDashboardInvitesDataObject; use datadog_api_client::datadogV1::model::SharedDashboardInvitesDataObjectAttributes; #[tokio::main] async fn main() { // there is a valid "shared_dashboard" in the system let shared_dashboard_token = std::env::var("SHARED_DASHBOARD_TOKEN").unwrap(); let body = SharedDashboardInvites::new( SharedDashboardInvitesData::SharedDashboardInvitesDataObject(Box::new( SharedDashboardInvitesDataObject::new( SharedDashboardInvitesDataObjectAttributes::new() .email("exampledashboard@datadoghq.com".to_string()), DashboardInviteType::PUBLIC_DASHBOARD_INVITATION, ), )), ); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .send_public_dashboard_invitation(shared_dashboard_token.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Send shared dashboard invitation email returns "OK" response ``` /** * Send shared dashboard invitation email returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "shared_dashboard" in the system const SHARED_DASHBOARD_TOKEN = process.env.SHARED_DASHBOARD_TOKEN as string; const params: v1.DashboardsApiSendPublicDashboardInvitationRequest = { body: { data: { attributes: { email: "exampledashboard@datadoghq.com", }, type: "public_dashboard_invitation", }, }, token: SHARED_DASHBOARD_TOKEN, }; apiInstance .sendPublicDashboardInvitation(params) .then((data: v1.SharedDashboardInvites) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all invitations for a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-all-invitations-for-a-shared-dashboard) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#get-all-invitations-for-a-shared-dashboard-v1) GET https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.eu/api/v1/dashboard/public/{token}/invitationhttps://api.ddog-gov.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us3.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us5.datadoghq.com/api/v1/dashboard/public/{token}/invitation ### Overview Describe the invitations that exist for the given shared dashboard (paginated). This endpoint requires the `dashboards_invite_share` permission. OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string Token of the shared dashboard for which to fetch invitations. #### Query Strings Name Type Description page_size integer The number of records to return in a single request. page_number integer The page to access (base 0). ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboardInvitations-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboardInvitations-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboardInvitations-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#GetPublicDashboardInvitations-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Invitations data and metadata that exists for a shared dashboard returned by the API. Expand All Field Type Description _required_] An object or list of objects containing the information for an invitation to a shared dashboard. object Object containing the information for an invitation to a shared dashboard. _required_] object Attributes of the shared dashboard invitation created_at date-time When the invitation was sent. email string An email address that an invitation has been (or if used in invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` [object] A list of objects containing the information for an invitation(s) to a shared dashboard. 
attributes [_required_] object Attributes of the shared dashboard invitation. created_at date-time When the invitation was sent. email string An email address that an invitation has been (or, if used in an invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. Null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` meta object Pagination metadata returned by the API. page object Object containing the total count of invitations across all pages. total_count int64 The total number of invitations on this shared dashboard, across all pages. ``` { "data": [ { "attributes": { "email": "test@datadoghq.com" }, "type": "public_dashboard_invitation" } ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Get all invitations for a shared dashboard Copy ``` # Path parameters export token="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}/invitation" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all invitations for a shared dashboard ``` """ Get all invitations for a shared dashboard returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = environ["SHARED_DASHBOARD_TOKEN"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.get_public_dashboard_invitations( token=SHARED_DASHBOARD_TOKEN, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all invitations for a shared dashboard ``` # Get all invitations for a shared dashboard returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new # there is a valid "shared_dashboard" in the system SHARED_DASHBOARD_TOKEN = ENV["SHARED_DASHBOARD_TOKEN"] p api_instance.get_public_dashboard_invitations(SHARED_DASHBOARD_TOKEN) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all invitations for a shared dashboard ``` // Get all invitations for a shared dashboard returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "shared_dashboard" in the system SharedDashboardToken := os.Getenv("SHARED_DASHBOARD_TOKEN") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.GetPublicDashboardInvitations(ctx, SharedDashboardToken, 
*datadogV1.NewGetPublicDashboardInvitationsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.GetPublicDashboardInvitations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DashboardsApi.GetPublicDashboardInvitations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all invitations for a shared dashboard ``` // Get all invitations for a shared dashboard returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.SharedDashboardInvites; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); // there is a valid "shared_dashboard" in the system String SHARED_DASHBOARD_TOKEN = System.getenv("SHARED_DASHBOARD_TOKEN"); try { SharedDashboardInvites result = apiInstance.getPublicDashboardInvitations(SHARED_DASHBOARD_TOKEN); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#getPublicDashboardInvitations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all invitations for a shared dashboard ``` // Get all invitations for a shared dashboard returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::api_dashboards::GetPublicDashboardInvitationsOptionalParams; #[tokio::main] async fn main() { // there is a valid "shared_dashboard" in the system let shared_dashboard_token = std::env::var("SHARED_DASHBOARD_TOKEN").unwrap(); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .get_public_dashboard_invitations( shared_dashboard_token.clone(), GetPublicDashboardInvitationsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all invitations for a shared dashboard ``` /** * Get all invitations for a shared dashboard returns "OK" response */ 
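// Sketch only (assumed field names): the endpoint's page_size and page_number query
// strings are expected to surface as optional camelCase fields on the request object,
// for example:
//   const pagedParams: v1.DashboardsApiGetPublicDashboardInvitationsRequest = {
//     token: SHARED_DASHBOARD_TOKEN,
//     pageSize: 50,
//     pageNumber: 0,
//   };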
import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); // there is a valid "shared_dashboard" in the system const SHARED_DASHBOARD_TOKEN = process.env.SHARED_DASHBOARD_TOKEN as string; const params: v1.DashboardsApiGetPublicDashboardInvitationsRequest = { token: SHARED_DASHBOARD_TOKEN, }; apiInstance .getPublicDashboardInvitations(params) .then((data: v1.SharedDashboardInvites) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Revoke a shared dashboard URL](https://docs.datadoghq.com/api/latest/dashboards/#revoke-a-shared-dashboard-url) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#revoke-a-shared-dashboard-url-v1) DELETE https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}https://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}https://api.datadoghq.eu/api/v1/dashboard/public/{token}https://api.ddog-gov.com/api/v1/dashboard/public/{token}https://api.datadoghq.com/api/v1/dashboard/public/{token}https://api.us3.datadoghq.com/api/v1/dashboard/public/{token}https://api.us5.datadoghq.com/api/v1/dashboard/public/{token} ### Overview Revoke the public URL for a dashboard (rendering it private) associated with the specified token. This endpoint requires any of the following permissions: * `dashboards_public_share` * `dashboards_embed_share` * `dashboards_invite_share` OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string The token of the shared dashboard. ### Response * [200](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboard-200-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboard-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboard-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboard-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Response containing token of deleted shared dashboard. Expand All Field Type Description deleted_public_dashboard_token string Token associated with the shared dashboard that was revoked. ``` { "deleted_public_dashboard_token": "string" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Shared Dashboard Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Revoke a shared dashboard URL Copy ``` # Path parameters export token="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Revoke a shared dashboard URL ``` """ Revoke a shared dashboard URL returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) response = api_instance.delete_public_dashboard( token="token", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Revoke a shared dashboard URL ``` # Revoke a shared dashboard URL returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new p api_instance.delete_public_dashboard("token") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Revoke a shared dashboard URL ``` // Revoke a shared dashboard URL returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) resp, r, err := api.DeletePublicDashboard(ctx, "token") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.DeletePublicDashboard`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response 
from `DashboardsApi.DeletePublicDashboard`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Revoke a shared dashboard URL ``` // Revoke a shared dashboard URL returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DeleteSharedDashboardResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); try { DeleteSharedDashboardResponse result = apiInstance.deletePublicDashboard("token"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#deletePublicDashboard"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Revoke a shared dashboard URL ``` // Revoke a shared dashboard URL returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api.delete_public_dashboard("token".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Revoke a shared dashboard URL ``` /** * Revoke a shared dashboard URL returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); const params: v1.DashboardsApiDeletePublicDashboardRequest = { token: "token", }; apiInstance .deletePublicDashboard(params) .then((data: v1.DeleteSharedDashboardResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Revoke shared dashboard invitations](https://docs.datadoghq.com/api/latest/dashboards/#revoke-shared-dashboard-invitations) * [v1 (latest)](https://docs.datadoghq.com/api/latest/dashboards/#revoke-shared-dashboard-invitations-v1) DELETE https://api.ap1.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.ap2.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.eu/api/v1/dashboard/public/{token}/invitationhttps://api.ddog-gov.com/api/v1/dashboard/public/{token}/invitationhttps://api.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us3.datadoghq.com/api/v1/dashboard/public/{token}/invitationhttps://api.us5.datadoghq.com/api/v1/dashboard/public/{token}/invitation ### Overview Revoke previously sent invitation emails and active sessions used to access a given shared dashboard for specific email addresses. This endpoint requires the `dashboards_invite_share` permission. OAuth apps require the `dashboards_invite_share` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#dashboards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description token [_required_] string The token of the shared dashboard. ### Request #### Body Data (required) Shared Dashboard Invitation deletion request body. * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Expand All Field Type Description _required_] An object or list of objects containing the information for an invitation to a shared dashboard. object Object containing the information for an invitation to a shared dashboard. _required_] object Attributes of the shared dashboard invitation created_at date-time When the invitation was sent. email string An email address that an invitation has been (or if used in invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. null if the invitation has no associated session. share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` [object] A list of objects containing the information for an invitation(s) to a shared dashboard. _required_] object Attributes of the shared dashboard invitation created_at date-time When the invitation was sent. email string An email address that an invitation has been (or if used in invitation request, will be) sent to. has_session boolean Indicates whether an active session exists for the invitation (produced when a user clicks the link in the email). invitation_expiry date-time When the invitation expires. session_expiry date-time When the invited user's session expires. null if the invitation has no associated session. 
share_token string The unique token of the shared dashboard that was (or is to be) shared. type [_required_] enum Type for shared dashboard invitation request body. Allowed enum values: `public_dashboard_invitation` object Pagination metadata returned by the API. object Object containing the total count of invitations across all pages total_count int64 The total number of invitations on this shared board, across all pages. ``` { "data": [ "undefined" ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboardInvitation-204-v1) * [403](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboardInvitation-403-v1) * [404](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboardInvitation-404-v1) * [429](https://docs.datadoghq.com/api/latest/dashboards/#DeletePublicDashboardInvitation-429-v1) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dashboards/) * [Example](https://docs.datadoghq.com/api/latest/dashboards/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dashboards/?code-lang=typescript) ##### Revoke shared dashboard invitations Copy ``` ## json-request-body # # Path parameters export token="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/dashboard/public/${token}/invitation" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "email": "test@datadoghq.com" }, "type": "public_dashboard_invitation" } } EOF ``` ##### Revoke shared dashboard invitations ``` """ Revoke shared dashboard invitations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.dashboards_api import DashboardsApi from datadog_api_client.v1.model.dashboard_invite_type import DashboardInviteType from datadog_api_client.v1.model.shared_dashboard_invites import SharedDashboardInvites from datadog_api_client.v1.model.shared_dashboard_invites_data_list import SharedDashboardInvitesDataList from datadog_api_client.v1.model.shared_dashboard_invites_data_object 
import SharedDashboardInvitesDataObject from datadog_api_client.v1.model.shared_dashboard_invites_data_object_attributes import ( SharedDashboardInvitesDataObjectAttributes, ) body = SharedDashboardInvites( data=SharedDashboardInvitesDataList( [ SharedDashboardInvitesDataObject( attributes=SharedDashboardInvitesDataObjectAttributes( email="test@datadoghq.com", ), type=DashboardInviteType.PUBLIC_DASHBOARD_INVITATION, ), ] ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DashboardsApi(api_client) api_instance.delete_public_dashboard_invitation(token="token", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Revoke shared dashboard invitations ``` # Revoke shared dashboard invitations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DashboardsAPI.new body = DatadogAPIClient::V1::SharedDashboardInvites.new({ data: [ DatadogAPIClient::V1::SharedDashboardInvitesDataObject.new({ attributes: DatadogAPIClient::V1::SharedDashboardInvitesDataObjectAttributes.new({ email: "test@datadoghq.com", }), type: DatadogAPIClient::V1::DashboardInviteType::PUBLIC_DASHBOARD_INVITATION, }), ], }) api_instance.delete_public_dashboard_invitation("token", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Revoke shared dashboard invitations ``` // Revoke shared dashboard invitations returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SharedDashboardInvites{ Data: datadogV1.SharedDashboardInvitesData{ SharedDashboardInvitesDataList: &[]datadogV1.SharedDashboardInvitesDataObject{ { Attributes: datadogV1.SharedDashboardInvitesDataObjectAttributes{ Email: datadog.PtrString("test@datadoghq.com"), }, Type: datadogV1.DASHBOARDINVITETYPE_PUBLIC_DASHBOARD_INVITATION, }, }}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDashboardsApi(apiClient) r, err := api.DeletePublicDashboardInvitation(ctx, "token", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DashboardsApi.DeletePublicDashboardInvitation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Revoke shared dashboard invitations ``` // Revoke shared dashboard invitations returns "OK" response import com.datadog.api.client.ApiClient; import 
com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DashboardsApi; import com.datadog.api.client.v1.model.DashboardInviteType; import com.datadog.api.client.v1.model.SharedDashboardInvites; import com.datadog.api.client.v1.model.SharedDashboardInvitesData; import com.datadog.api.client.v1.model.SharedDashboardInvitesDataObject; import com.datadog.api.client.v1.model.SharedDashboardInvitesDataObjectAttributes; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DashboardsApi apiInstance = new DashboardsApi(defaultClient); SharedDashboardInvites body = new SharedDashboardInvites() .data( new SharedDashboardInvitesData( Collections.singletonList( new SharedDashboardInvitesDataObject() .attributes( new SharedDashboardInvitesDataObjectAttributes() .email("test@datadoghq.com")) .type(DashboardInviteType.PUBLIC_DASHBOARD_INVITATION)))); try { apiInstance.deletePublicDashboardInvitation("token", body); } catch (ApiException e) { System.err.println("Exception when calling DashboardsApi#deletePublicDashboardInvitation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Revoke shared dashboard invitations ``` // Revoke shared dashboard invitations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_dashboards::DashboardsAPI; use datadog_api_client::datadogV1::model::DashboardInviteType; use datadog_api_client::datadogV1::model::SharedDashboardInvites; use datadog_api_client::datadogV1::model::SharedDashboardInvitesData; use datadog_api_client::datadogV1::model::SharedDashboardInvitesDataObject; use datadog_api_client::datadogV1::model::SharedDashboardInvitesDataObjectAttributes; #[tokio::main] async fn main() { let body = SharedDashboardInvites::new(SharedDashboardInvitesData::SharedDashboardInvitesDataList( vec![SharedDashboardInvitesDataObject::new( SharedDashboardInvitesDataObjectAttributes::new() .email("test@datadoghq.com".to_string()), DashboardInviteType::PUBLIC_DASHBOARD_INVITATION, )], )); let configuration = datadog::Configuration::new(); let api = DashboardsAPI::with_config(configuration); let resp = api .delete_public_dashboard_invitation("token".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Revoke shared dashboard invitations ``` /** * Revoke shared dashboard invitations returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DashboardsApi(configuration); 
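// Sketch: because the request body's data field accepts a list, several invitations
// can be revoked in one call by adding one entry per email address (the addresses
// below are placeholders), for example:
//   data: [
//     { attributes: { email: "first@example.com" }, type: "public_dashboard_invitation" },
//     { attributes: { email: "second@example.com" }, type: "public_dashboard_invitation" },
//   ],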
const params: v1.DashboardsApiDeletePublicDashboardInvitationRequest = { body: { data: [ { attributes: { email: "test@datadoghq.com", }, type: "public_dashboard_invitation", }, ], }, token: "token", }; apiInstance .deletePublicDashboardInvitation(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/datasets # Datasets Data Access Controls in Datadog is a feature that allows administrators and access managers to regulate access to sensitive data. By defining Restricted Datasets, you can ensure that only specific teams or roles can view certain types of telemetry (for example, logs, traces, metrics, and RUM data). ## [Create a dataset](https://docs.datadoghq.com/api/latest/datasets/#create-a-dataset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/datasets/#create-a-dataset-v2) **Note: Data Access is in preview.
If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** POST https://api.ap1.datadoghq.com/api/v2/datasetshttps://api.ap2.datadoghq.com/api/v2/datasetshttps://api.datadoghq.eu/api/v2/datasetshttps://api.ddog-gov.com/api/v2/datasetshttps://api.datadoghq.com/api/v2/datasetshttps://api.us3.datadoghq.com/api/v2/datasetshttps://api.us5.datadoghq.com/api/v2/datasets ### Overview Create a dataset with the configurations in the request. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#datasets) to access this endpoint. ### Request #### Body Data (required) Dataset payload * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Field Type Description data [_required_] object **Datasets Object Constraints** * **Tag limit per dataset** : * Each restricted dataset supports a maximum of 10 key:value pairs per product. * **Tag key rules per telemetry type** : * Only one tag key or attribute may be used to define access within a single telemetry type. * The same or different tag key may be used across different telemetry types. * **Tag value uniqueness** : * Tag values must be unique within a single dataset. * A tag value used in one dataset cannot be reused in another dataset of the same telemetry type. attributes [_required_] object Dataset metadata and configurations. name [_required_] string Name of the dataset. principals [_required_] [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [_required_] [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. type [_required_] enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": { "attributes": { "name": "Security Audit Dataset", "principals": [ "role:94172442-be03-11e9-a77a-3b7612558ac1" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "metrics" } ] }, "type": "dataset" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/datasets/#CreateDataset-200-v2) * [400](https://docs.datadoghq.com/api/latest/datasets/#CreateDataset-400-v2) * [403](https://docs.datadoghq.com/api/latest/datasets/#CreateDataset-403-v2) * [409](https://docs.datadoghq.com/api/latest/datasets/#CreateDataset-409-v2) * [429](https://docs.datadoghq.com/api/latest/datasets/#CreateDataset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Response containing a single dataset object. Field Type Description data object **Datasets Object Constraints** * **Tag Limit per Dataset** : * Each restricted dataset supports a maximum of 10 key:value pairs per product. * **Tag Key Rules per Telemetry Type** : * Only one tag key or attribute may be used to define access within a single telemetry type. 
* The same or different tag key may be used across different telemetry types. * **Tag Value Uniqueness** : * Tag values must be unique within a single dataset. * A tag value used in one dataset cannot be reused in another dataset of the same telemetry type. attributes object Dataset metadata and configuration(s). created_at date-time Timestamp when the dataset was created. created_by uuid Unique ID of the user who created the dataset. name string Name of the dataset. principals [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. id string Unique identifier for the dataset. type enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "created_by": "string", "name": "Security Audit Dataset", "principals": [ "role:86245fce-0a4e-11f0-92bd-da7ad0900002" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "logs" } ] }, "id": "123e4567-e89b-12d3-a456-426614174000", "type": "dataset" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/datasets/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/datasets/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/datasets/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/datasets/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/datasets/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/datasets/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/datasets/?code-lang=typescript) ##### Create a dataset returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/datasets" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Security Audit Dataset", "principals": [ "role:94172442-be03-11e9-a77a-3b7612558ac1" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "metrics" } ] }, "type": "dataset" } } EOF ``` ##### Create a dataset returns "OK" response ``` // Create a dataset returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DatasetCreateRequest{ Data: datadogV2.DatasetRequest{ Attributes: datadogV2.DatasetAttributesRequest{ Name: "Security Audit Dataset", Principals: []string{ "role:94172442-be03-11e9-a77a-3b7612558ac1", }, ProductFilters: []datadogV2.FiltersPerProduct{ { Filters: []string{ "@application.id:ABCD", }, Product: "metrics", }, }, }, Type: datadogV2.DATASETTYPE_DATASET, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateDataset", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDatasetsApi(apiClient) resp, r, err := api.CreateDataset(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DatasetsApi.CreateDataset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DatasetsApi.CreateDataset`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a dataset returns "OK" response ``` // Create a dataset returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DatasetsApi; import com.datadog.api.client.v2.model.DatasetAttributesRequest; import com.datadog.api.client.v2.model.DatasetCreateRequest; import com.datadog.api.client.v2.model.DatasetRequest; import com.datadog.api.client.v2.model.DatasetResponseSingle; import com.datadog.api.client.v2.model.DatasetType; import com.datadog.api.client.v2.model.FiltersPerProduct; import java.util.Collections; public class 
Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createDataset", true); DatasetsApi apiInstance = new DatasetsApi(defaultClient); DatasetCreateRequest body = new DatasetCreateRequest() .data( new DatasetRequest() .attributes( new DatasetAttributesRequest() .name("Security Audit Dataset") .principals( Collections.singletonList( "role:94172442-be03-11e9-a77a-3b7612558ac1")) .productFilters( Collections.singletonList( new FiltersPerProduct() .filters(Collections.singletonList("@application.id:ABCD")) .product("metrics")))) .type(DatasetType.DATASET)); try { DatasetResponseSingle result = apiInstance.createDataset(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DatasetsApi#createDataset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a dataset returns "OK" response ``` """ Create a dataset returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.datasets_api import DatasetsApi from datadog_api_client.v2.model.dataset_attributes_request import DatasetAttributesRequest from datadog_api_client.v2.model.dataset_create_request import DatasetCreateRequest from datadog_api_client.v2.model.dataset_request import DatasetRequest from datadog_api_client.v2.model.dataset_type import DatasetType from datadog_api_client.v2.model.filters_per_product import FiltersPerProduct body = DatasetCreateRequest( data=DatasetRequest( attributes=DatasetAttributesRequest( name="Security Audit Dataset", principals=[ "role:94172442-be03-11e9-a77a-3b7612558ac1", ], product_filters=[ FiltersPerProduct( filters=[ "@application.id:ABCD", ], product="metrics", ), ], ), type=DatasetType.DATASET, ), ) configuration = Configuration() configuration.unstable_operations["create_dataset"] = True with ApiClient(configuration) as api_client: api_instance = DatasetsApi(api_client) response = api_instance.create_dataset(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a dataset returns "OK" response ``` # Create a dataset returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_dataset".to_sym] = true end api_instance = DatadogAPIClient::V2::DatasetsAPI.new body = DatadogAPIClient::V2::DatasetCreateRequest.new({ data: DatadogAPIClient::V2::DatasetRequest.new({ attributes: DatadogAPIClient::V2::DatasetAttributesRequest.new({ name: "Security Audit Dataset", principals: [ "role:94172442-be03-11e9-a77a-3b7612558ac1", ], product_filters: [ 
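          # Sketch (hypothetical second filter): a dataset may define filters for more
          # than one product, up to 10 key:value pairs per product, with one tag key per
          # telemetry type. A logs entry could sit alongside the metrics entry below,
          # for example:
          #   DatadogAPIClient::V2::FiltersPerProduct.new({
          #     filters: ["@team:security-audit"],
          #     product: "logs",
          #   }),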
DatadogAPIClient::V2::FiltersPerProduct.new({ filters: [ "@application.id:ABCD", ], product: "metrics", }), ], }), type: DatadogAPIClient::V2::DatasetType::DATASET, }), }) p api_instance.create_dataset(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a dataset returns "OK" response ``` // Create a dataset returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_datasets::DatasetsAPI; use datadog_api_client::datadogV2::model::DatasetAttributesRequest; use datadog_api_client::datadogV2::model::DatasetCreateRequest; use datadog_api_client::datadogV2::model::DatasetRequest; use datadog_api_client::datadogV2::model::DatasetType; use datadog_api_client::datadogV2::model::FiltersPerProduct; #[tokio::main] async fn main() { let body = DatasetCreateRequest::new(DatasetRequest::new( DatasetAttributesRequest::new( "Security Audit Dataset".to_string(), vec!["role:94172442-be03-11e9-a77a-3b7612558ac1".to_string()], vec![FiltersPerProduct::new( vec!["@application.id:ABCD".to_string()], "metrics".to_string(), )], ), DatasetType::DATASET, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateDataset", true); let api = DatasetsAPI::with_config(configuration); let resp = api.create_dataset(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a dataset returns "OK" response ``` /** * Create a dataset returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createDataset"] = true; const apiInstance = new v2.DatasetsApi(configuration); const params: v2.DatasetsApiCreateDatasetRequest = { body: { data: { attributes: { name: "Security Audit Dataset", principals: ["role:94172442-be03-11e9-a77a-3b7612558ac1"], productFilters: [ { filters: ["@application.id:ABCD"], product: "metrics", }, ], }, type: "dataset", }, }, }; apiInstance .createDataset(params) .then((data: v2.DatasetResponseSingle) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a single dataset by ID](https://docs.datadoghq.com/api/latest/datasets/#get-a-single-dataset-by-id) * [v2 (latest)](https://docs.datadoghq.com/api/latest/datasets/#get-a-single-dataset-by-id-v2) **Note: Data Access is in preview. 
If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** GET https://api.ap1.datadoghq.com/api/v2/datasets/{dataset_id}https://api.ap2.datadoghq.com/api/v2/datasets/{dataset_id}https://api.datadoghq.eu/api/v2/datasets/{dataset_id}https://api.ddog-gov.com/api/v2/datasets/{dataset_id}https://api.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us3.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us5.datadoghq.com/api/v2/datasets/{dataset_id} ### Overview Retrieves the dataset associated with the ID. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#datasets) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dataset_id [_required_] string The ID of a defined dataset. ### Response * [200](https://docs.datadoghq.com/api/latest/datasets/#GetDataset-200-v2) * [400](https://docs.datadoghq.com/api/latest/datasets/#GetDataset-400-v2) * [403](https://docs.datadoghq.com/api/latest/datasets/#GetDataset-403-v2) * [404](https://docs.datadoghq.com/api/latest/datasets/#GetDataset-404-v2) * [429](https://docs.datadoghq.com/api/latest/datasets/#GetDataset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Response containing a single dataset object. Field Type Description data object **Datasets Object Constraints** * **Tag Limit per Dataset** : * Each restricted dataset supports a maximum of 10 key:value pairs per product. * **Tag Key Rules per Telemetry Type** : * Only one tag key or attribute may be used to define access within a single telemetry type. * The same or different tag key may be used across different telemetry types. * **Tag Value Uniqueness** : * Tag values must be unique within a single dataset. * A tag value used in one dataset cannot be reused in another dataset of the same telemetry type. attributes object Dataset metadata and configuration(s). created_at date-time Timestamp when the dataset was created. created_by uuid Unique ID of the user who created the dataset. name string Name of the dataset. principals [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. id string Unique identifier for the dataset. type enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "created_by": "string", "name": "Security Audit Dataset", "principals": [ "role:86245fce-0a4e-11f0-92bd-da7ad0900002" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "logs" } ] }, "id": "123e4567-e89b-12d3-a456-426614174000", "type": "dataset" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/datasets/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/datasets/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/datasets/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/datasets/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/datasets/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/datasets/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/datasets/?code-lang=typescript) ##### Get a single dataset by ID Copy ``` # Path parameters export dataset_id="0879ce27-29a1-481f-a12e-bc2a48ec9ae1" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/datasets/${dataset_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a single dataset by ID ``` """ Get a single dataset by ID returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.datasets_api import DatasetsApi # there is a valid "dataset" in the system DATASET_DATA_ID = environ["DATASET_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_dataset"] = True with ApiClient(configuration) as api_client: api_instance = DatasetsApi(api_client) response = api_instance.get_dataset( dataset_id=DATASET_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a single dataset by ID ``` # Get a single dataset by ID returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_dataset".to_sym] = true end api_instance = DatadogAPIClient::V2::DatasetsAPI.new # there is a valid "dataset" in the system DATASET_DATA_ID = ENV["DATASET_DATA_ID"] p api_instance.get_dataset(DATASET_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a single dataset by ID ``` // Get a single dataset by ID returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dataset" in the system DatasetDataID := os.Getenv("DATASET_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetDataset", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDatasetsApi(apiClient) resp, r, err := api.GetDataset(ctx, DatasetDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DatasetsApi.GetDataset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DatasetsApi.GetDataset`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a single dataset by ID ``` // Get a single dataset by ID returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DatasetsApi; import com.datadog.api.client.v2.model.DatasetResponseSingle; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getDataset", true); DatasetsApi apiInstance = new DatasetsApi(defaultClient); // there is a valid "dataset" in the system String DATASET_DATA_ID = System.getenv("DATASET_DATA_ID"); try { DatasetResponseSingle result = apiInstance.getDataset(DATASET_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DatasetsApi#getDataset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a single dataset by ID ``` // Get a single dataset by ID returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_datasets::DatasetsAPI; #[tokio::main] async fn main() { // there is a valid "dataset" in the system let dataset_data_id = std::env::var("DATASET_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetDataset", true); let api = DatasetsAPI::with_config(configuration); let resp = api.get_dataset(dataset_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a single dataset by ID ``` /** * Get a single dataset by ID returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getDataset"] = true; const apiInstance = new v2.DatasetsApi(configuration); // there is a valid "dataset" in the system const DATASET_DATA_ID = process.env.DATASET_DATA_ID as string; const params: v2.DatasetsApiGetDatasetRequest = { datasetId: DATASET_DATA_ID, }; apiInstance .getDataset(params) .then((data: v2.DatasetResponseSingle) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all datasets](https://docs.datadoghq.com/api/latest/datasets/#get-all-datasets) * [v2 (latest)](https://docs.datadoghq.com/api/latest/datasets/#get-all-datasets-v2) **Note: Data Access is in preview. If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** GET https://api.ap1.datadoghq.com/api/v2/datasetshttps://api.ap2.datadoghq.com/api/v2/datasetshttps://api.datadoghq.eu/api/v2/datasetshttps://api.ddog-gov.com/api/v2/datasetshttps://api.datadoghq.com/api/v2/datasetshttps://api.us3.datadoghq.com/api/v2/datasetshttps://api.us5.datadoghq.com/api/v2/datasets ### Overview Get all datasets that have been configured for an organization. This endpoint requires the `user_access_read` permission. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#datasets) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/datasets/#GetAllDatasets-200-v2) * [403](https://docs.datadoghq.com/api/latest/datasets/#GetAllDatasets-403-v2) * [429](https://docs.datadoghq.com/api/latest/datasets/#GetAllDatasets-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Response containing a list of datasets. Field Type Description data [object] The list of datasets returned in response. attributes object Dataset metadata and configuration(s). created_at date-time Timestamp when the dataset was created. created_by uuid Unique ID of the user who created the dataset. name string Name of the dataset. principals [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. 
Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. id string Unique identifier for the dataset. type enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "created_by": "string", "name": "Security Audit Dataset", "principals": [ "role:86245fce-0a4e-11f0-92bd-da7ad0900002" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "logs" } ] }, "id": "123e4567-e89b-12d3-a456-426614174000", "type": "dataset" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/datasets/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/datasets/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/datasets/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/datasets/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/datasets/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/datasets/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/datasets/?code-lang=typescript) ##### Get all datasets Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/datasets" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all datasets ``` """ Get all datasets returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.datasets_api import DatasetsApi configuration = Configuration() configuration.unstable_operations["get_all_datasets"] = True with ApiClient(configuration) as api_client: api_instance = DatasetsApi(api_client) response = api_instance.get_all_datasets() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all datasets ``` # Get all datasets returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_all_datasets".to_sym] = true end api_instance = DatadogAPIClient::V2::DatasetsAPI.new p api_instance.get_all_datasets() ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all datasets ``` // Get all datasets returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetAllDatasets", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDatasetsApi(apiClient) resp, r, err := api.GetAllDatasets(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DatasetsApi.GetAllDatasets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DatasetsApi.GetAllDatasets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all datasets ``` // Get all datasets returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DatasetsApi; import com.datadog.api.client.v2.model.DatasetResponseMulti; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getAllDatasets", true); DatasetsApi apiInstance = new DatasetsApi(defaultClient); try { DatasetResponseMulti result = apiInstance.getAllDatasets(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DatasetsApi#getAllDatasets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all datasets ``` // Get all datasets returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_datasets::DatasetsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetAllDatasets", true); let api = DatasetsAPI::with_config(configuration); let resp = api.get_all_datasets().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
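# Export DD_SITE, DD_API_KEY, and DD_APP_KEY before running; the client reads the site and credentials from these environment variables.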
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all datasets ``` /** * Get all datasets returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getAllDatasets"] = true; const apiInstance = new v2.DatasetsApi(configuration); apiInstance .getAllDatasets() .then((data: v2.DatasetResponseMulti) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit a dataset](https://docs.datadoghq.com/api/latest/datasets/#edit-a-dataset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/datasets/#edit-a-dataset-v2) **Note: Data Access is in preview. If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** PUT https://api.ap1.datadoghq.com/api/v2/datasets/{dataset_id}https://api.ap2.datadoghq.com/api/v2/datasets/{dataset_id}https://api.datadoghq.eu/api/v2/datasets/{dataset_id}https://api.ddog-gov.com/api/v2/datasets/{dataset_id}https://api.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us3.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us5.datadoghq.com/api/v2/datasets/{dataset_id} ### Overview Edits the dataset associated with the ID. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#datasets) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dataset_id [_required_] string The ID of a defined dataset. ### Request #### Body Data (required) Dataset payload * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Field Type Description data [_required_] object **Datasets Object Constraints** * **Tag limit per dataset** : * Each restricted dataset supports a maximum of 10 key:value pairs per product. * **Tag key rules per telemetry type** : * Only one tag key or attribute may be used to define access within a single telemetry type. * The same or different tag key may be used across different telemetry types. * **Tag value uniqueness** : * Tag values must be unique within a single dataset. * A tag value used in one dataset cannot be reused in another dataset of the same telemetry type. attributes [_required_] object Dataset metadata and configurations. name [_required_] string Name of the dataset. principals [_required_] [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [_required_] [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. 
product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. type [_required_] enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": { "attributes": { "name": "Security Audit Dataset", "principals": [ "role:94172442-be03-11e9-a77a-3b7612558ac1" ], "product_filters": [ { "filters": [ "@application.id:1234" ], "product": "metrics" } ] }, "type": "dataset" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/datasets/#UpdateDataset-200-v2) * [400](https://docs.datadoghq.com/api/latest/datasets/#UpdateDataset-400-v2) * [403](https://docs.datadoghq.com/api/latest/datasets/#UpdateDataset-403-v2) * [404](https://docs.datadoghq.com/api/latest/datasets/#UpdateDataset-404-v2) * [429](https://docs.datadoghq.com/api/latest/datasets/#UpdateDataset-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) Response containing a single dataset object. Field Type Description data object **Datasets Object Constraints** * **Tag Limit per Dataset** : * Each restricted dataset supports a maximum of 10 key:value pairs per product. * **Tag Key Rules per Telemetry Type** : * Only one tag key or attribute may be used to define access within a single telemetry type. * The same or different tag key may be used across different telemetry types. * **Tag Value Uniqueness** : * Tag values must be unique within a single dataset. * A tag value used in one dataset cannot be reused in another dataset of the same telemetry type. attributes object Dataset metadata and configuration(s). created_at date-time Timestamp when the dataset was created. created_by uuid Unique ID of the user who created the dataset. name string Name of the dataset. principals [string] List of access principals, formatted as `principal_type:id`. Principal can be 'team' or 'role'. product_filters [object] List of product-specific filters. filters [_required_] [string] Defines the list of tag-based filters used to restrict access to telemetry data for a specific product. These filters act as access control rules. Each filter must follow the tag query syntax used by Datadog (such as `@tag.key:value`), and only one tag or attribute may be used to define the access strategy per telemetry type. product [_required_] string Name of the product the dataset is for. Possible values are 'apm', 'rum', 'metrics', 'logs', 'error_tracking', and 'cloud_cost'. id string Unique identifier for the dataset. type enum Resource type, always set to `dataset`. Allowed enum values: `dataset` default: `dataset` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "created_by": "string", "name": "Security Audit Dataset", "principals": [ "role:86245fce-0a4e-11f0-92bd-da7ad0900002" ], "product_filters": [ { "filters": [ "@application.id:ABCD" ], "product": "logs" } ] }, "id": "123e4567-e89b-12d3-a456-426614174000", "type": "dataset" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/datasets/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/datasets/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/datasets/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/datasets/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/datasets/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/datasets/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/datasets/?code-lang=typescript) ##### Edit a dataset returns "OK" response Copy ``` # Path parameters export dataset_id="0879ce27-29a1-481f-a12e-bc2a48ec9ae1" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/datasets/${dataset_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Security Audit Dataset", "principals": [ "role:94172442-be03-11e9-a77a-3b7612558ac1" ], "product_filters": [ { "filters": [ "@application.id:1234" ], "product": "metrics" } ] }, "type": "dataset" } } EOF ``` ##### Edit a dataset returns "OK" response ``` // Edit a dataset returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dataset" in the system DatasetDataID := os.Getenv("DATASET_DATA_ID") body := datadogV2.DatasetUpdateRequest{ Data: datadogV2.DatasetRequest{ Attributes: datadogV2.DatasetAttributesRequest{ Name: "Security Audit Dataset", Principals: []string{ "role:94172442-be03-11e9-a77a-3b7612558ac1", }, ProductFilters: []datadogV2.FiltersPerProduct{ { Filters: []string{ "@application.id:1234", }, Product: "metrics", }, }, }, Type: datadogV2.DATASETTYPE_DATASET, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateDataset", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDatasetsApi(apiClient) resp, r, err := api.UpdateDataset(ctx, DatasetDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DatasetsApi.UpdateDataset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DatasetsApi.UpdateDataset`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to 
`main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a dataset returns "OK" response ``` // Edit a dataset returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DatasetsApi; import com.datadog.api.client.v2.model.DatasetAttributesRequest; import com.datadog.api.client.v2.model.DatasetRequest; import com.datadog.api.client.v2.model.DatasetResponseSingle; import com.datadog.api.client.v2.model.DatasetType; import com.datadog.api.client.v2.model.DatasetUpdateRequest; import com.datadog.api.client.v2.model.FiltersPerProduct; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateDataset", true); DatasetsApi apiInstance = new DatasetsApi(defaultClient); // there is a valid "dataset" in the system String DATASET_DATA_ID = System.getenv("DATASET_DATA_ID"); DatasetUpdateRequest body = new DatasetUpdateRequest() .data( new DatasetRequest() .attributes( new DatasetAttributesRequest() .name("Security Audit Dataset") .principals( Collections.singletonList( "role:94172442-be03-11e9-a77a-3b7612558ac1")) .productFilters( Collections.singletonList( new FiltersPerProduct() .filters(Collections.singletonList("@application.id:1234")) .product("metrics")))) .type(DatasetType.DATASET)); try { DatasetResponseSingle result = apiInstance.updateDataset(DATASET_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DatasetsApi#updateDataset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a dataset returns "OK" response ``` """ Edit a dataset returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.datasets_api import DatasetsApi from datadog_api_client.v2.model.dataset_attributes_request import DatasetAttributesRequest from datadog_api_client.v2.model.dataset_request import DatasetRequest from datadog_api_client.v2.model.dataset_type import DatasetType from datadog_api_client.v2.model.dataset_update_request import DatasetUpdateRequest from datadog_api_client.v2.model.filters_per_product import FiltersPerProduct # there is a valid "dataset" in the system DATASET_DATA_ID = environ["DATASET_DATA_ID"] body = DatasetUpdateRequest( data=DatasetRequest( attributes=DatasetAttributesRequest( name="Security Audit Dataset", principals=[ "role:94172442-be03-11e9-a77a-3b7612558ac1", ], product_filters=[ FiltersPerProduct( filters=[ "@application.id:1234", ], product="metrics", ), ], ), type=DatasetType.DATASET, ), ) configuration = Configuration() configuration.unstable_operations["update_dataset"] = True with ApiClient(configuration) as api_client: api_instance = 
DatasetsApi(api_client) response = api_instance.update_dataset(dataset_id=DATASET_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a dataset returns "OK" response ``` # Edit a dataset returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_dataset".to_sym] = true end api_instance = DatadogAPIClient::V2::DatasetsAPI.new # there is a valid "dataset" in the system DATASET_DATA_ID = ENV["DATASET_DATA_ID"] body = DatadogAPIClient::V2::DatasetUpdateRequest.new({ data: DatadogAPIClient::V2::DatasetRequest.new({ attributes: DatadogAPIClient::V2::DatasetAttributesRequest.new({ name: "Security Audit Dataset", principals: [ "role:94172442-be03-11e9-a77a-3b7612558ac1", ], product_filters: [ DatadogAPIClient::V2::FiltersPerProduct.new({ filters: [ "@application.id:1234", ], product: "metrics", }), ], }), type: DatadogAPIClient::V2::DatasetType::DATASET, }), }) p api_instance.update_dataset(DATASET_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a dataset returns "OK" response ``` // Edit a dataset returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_datasets::DatasetsAPI; use datadog_api_client::datadogV2::model::DatasetAttributesRequest; use datadog_api_client::datadogV2::model::DatasetRequest; use datadog_api_client::datadogV2::model::DatasetType; use datadog_api_client::datadogV2::model::DatasetUpdateRequest; use datadog_api_client::datadogV2::model::FiltersPerProduct; #[tokio::main] async fn main() { // there is a valid "dataset" in the system let dataset_data_id = std::env::var("DATASET_DATA_ID").unwrap(); let body = DatasetUpdateRequest::new(DatasetRequest::new( DatasetAttributesRequest::new( "Security Audit Dataset".to_string(), vec!["role:94172442-be03-11e9-a77a-3b7612558ac1".to_string()], vec![FiltersPerProduct::new( vec!["@application.id:1234".to_string()], "metrics".to_string(), )], ), DatasetType::DATASET, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateDataset", true); let api = DatasetsAPI::with_config(configuration); let resp = api.update_dataset(dataset_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a dataset returns "OK" response ``` /** * Edit a dataset returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const 
configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateDataset"] = true; const apiInstance = new v2.DatasetsApi(configuration); // there is a valid "dataset" in the system const DATASET_DATA_ID = process.env.DATASET_DATA_ID as string; const params: v2.DatasetsApiUpdateDatasetRequest = { body: { data: { attributes: { name: "Security Audit Dataset", principals: ["role:94172442-be03-11e9-a77a-3b7612558ac1"], productFilters: [ { filters: ["@application.id:1234"], product: "metrics", }, ], }, type: "dataset", }, }, datasetId: DATASET_DATA_ID, }; apiInstance .updateDataset(params) .then((data: v2.DatasetResponseSingle) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a dataset](https://docs.datadoghq.com/api/latest/datasets/#delete-a-dataset) * [v2 (latest)](https://docs.datadoghq.com/api/latest/datasets/#delete-a-dataset-v2) **Note: Data Access is in preview. If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** DELETE https://api.ap1.datadoghq.com/api/v2/datasets/{dataset_id}https://api.ap2.datadoghq.com/api/v2/datasets/{dataset_id}https://api.datadoghq.eu/api/v2/datasets/{dataset_id}https://api.ddog-gov.com/api/v2/datasets/{dataset_id}https://api.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us3.datadoghq.com/api/v2/datasets/{dataset_id}https://api.us5.datadoghq.com/api/v2/datasets/{dataset_id} ### Overview Deletes the dataset associated with the ID. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#datasets) to access this endpoint. ### Arguments #### Path Parameters Name Type Description dataset_id [_required_] string The ID of a defined dataset. ### Response * [204](https://docs.datadoghq.com/api/latest/datasets/#DeleteDataset-204-v2) * [400](https://docs.datadoghq.com/api/latest/datasets/#DeleteDataset-400-v2) * [403](https://docs.datadoghq.com/api/latest/datasets/#DeleteDataset-403-v2) * [404](https://docs.datadoghq.com/api/latest/datasets/#DeleteDataset-404-v2) * [429](https://docs.datadoghq.com/api/latest/datasets/#DeleteDataset-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/datasets/) * [Example](https://docs.datadoghq.com/api/latest/datasets/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/datasets/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/datasets/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/datasets/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/datasets/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/datasets/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/datasets/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/datasets/?code-lang=typescript) ##### Delete a dataset Copy ``` # Path parameters export dataset_id="0879ce27-29a1-481f-a12e-bc2a48ec9ae1" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/datasets/${dataset_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a dataset ``` """ Delete a dataset returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.datasets_api import DatasetsApi # there is a valid "dataset" in the system DATASET_DATA_ID = environ["DATASET_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_dataset"] = True with ApiClient(configuration) as api_client: api_instance = DatasetsApi(api_client) api_instance.delete_dataset( dataset_id=DATASET_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a dataset ``` # Delete a dataset returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_dataset".to_sym] = true end api_instance = DatadogAPIClient::V2::DatasetsAPI.new # there is a valid "dataset" in the system DATASET_DATA_ID = ENV["DATASET_DATA_ID"] api_instance.delete_dataset(DATASET_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a dataset ``` // Delete a dataset returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dataset" in the system DatasetDataID := os.Getenv("DATASET_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteDataset", true) apiClient := datadog.NewAPIClient(configuration) 
api := datadogV2.NewDatasetsApi(apiClient) r, err := api.DeleteDataset(ctx, DatasetDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DatasetsApi.DeleteDataset`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a dataset ``` // Delete a dataset returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DatasetsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteDataset", true); DatasetsApi apiInstance = new DatasetsApi(defaultClient); // there is a valid "dataset" in the system String DATASET_DATA_ID = System.getenv("DATASET_DATA_ID"); try { apiInstance.deleteDataset(DATASET_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling DatasetsApi#deleteDataset"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a dataset ``` // Delete a dataset returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_datasets::DatasetsAPI; #[tokio::main] async fn main() { // there is a valid "dataset" in the system let dataset_data_id = std::env::var("DATASET_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteDataset", true); let api = DatasetsAPI::with_config(configuration); let resp = api.delete_dataset(dataset_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a dataset ``` /** * Delete a dataset returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteDataset"] = true; const apiInstance = new v2.DatasetsApi(configuration); // there is a valid "dataset" in the system const DATASET_DATA_ID = process.env.DATASET_DATA_ID as string; const params: v2.DatasetsApiDeleteDatasetRequest = { datasetId: DATASET_DATA_ID, }; apiInstance .deleteDataset(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=6832aae6-6843-4a84-90db-a8b6df1d8976&bo=1&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Datasets&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdatasets%2F&r=&evt=pageLoad&sv=2&cdb=AQAA&rn=985995) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=13c71917-f22a-4a0e-aafa-4dddc146023c&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=c076f32d-a550-4a31-9216-5b51cbd905da&pt=Datasets&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdatasets%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=13c71917-f22a-4a0e-aafa-4dddc146023c&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=c076f32d-a550-4a31-9216-5b51cbd905da&pt=Datasets&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdatasets%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) --- # Source: https://docs.datadoghq.com/api/latest/deployment-gates/ # Deployment Gates Manage Deployment Gates using this API to reduce the likelihood and impact of incidents caused by deployments. See the [Deployment Gates documentation](https://docs.datadoghq.com/deployment_gates/) for more information. ## [Create deployment gate](https://docs.datadoghq.com/api/latest/deployment-gates/#create-deployment-gate) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#create-deployment-gate-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/deployment_gateshttps://api.ap2.datadoghq.com/api/v2/deployment_gateshttps://api.datadoghq.eu/api/v2/deployment_gateshttps://api.ddog-gov.com/api/v2/deployment_gateshttps://api.datadoghq.com/api/v2/deployment_gateshttps://api.us3.datadoghq.com/api/v2/deployment_gateshttps://api.us5.datadoghq.com/api/v2/deployment_gates ### Overview Endpoint to create a deployment gate. This endpoint requires the `deployment_gates_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Field Type Description data [_required_] object Parameters for creating a deployment gate. attributes [_required_] object Parameters for creating a deployment gate. dry_run boolean Whether this gate is run in dry-run mode. env [_required_] string The environment of the deployment gate. identifier string The identifier of the deployment gate. default: `default` service [_required_] string The service of the deployment gate. type [_required_] enum Deployment gate resource type. 
Allowed enum values: `deployment_gate` ``` { "data": { "attributes": { "dry_run": false, "env": "production", "identifier": "my-gate-1", "service": "my-service" }, "type": "deployment_gate" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-403-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentGate-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment gate. Field Type Description data object Data for a deployment gate. attributes [_required_] object Basic information about a deployment gate. created_at [_required_] date-time The timestamp when the deployment gate was created. created_by [_required_] object Information about the user who created the deployment gate. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this gate is run in dry-run mode. env [_required_] string The environment of the deployment gate. identifier [_required_] string The identifier of the deployment gate. service [_required_] string The service of the deployment gate. updated_at date-time The timestamp when the deployment gate was last updated. updated_by object Information about the user who updated the deployment gate. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment gate. type [_required_] enum Deployment gate resource type. Allowed enum values: `deployment_gate` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "env": "production", "identifier": "pre", "service": "my-service", "updated_at": "2021-01-01T00:00:00Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_gate" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Create deployment gate returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/deployment_gates" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "dry_run": false, "env": "production", "identifier": "my-gate-1", "service": "my-service" }, "type": "deployment_gate" } } EOF ``` ##### Create deployment gate returns "OK" response ``` // Create deployment gate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateDeploymentGateParams{ Data: datadogV2.CreateDeploymentGateParamsData{ Attributes: datadogV2.CreateDeploymentGateParamsDataAttributes{ DryRun: datadog.PtrBool(false), Env: "production", Identifier: datadog.PtrString("my-gate-1"), Service: "my-service", }, Type: datadogV2.DEPLOYMENTGATEDATATYPE_DEPLOYMENT_GATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateDeploymentGate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.CreateDeploymentGate(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.CreateDeploymentGate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`DeploymentGatesApi.CreateDeploymentGate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create deployment gate returns "OK" response ``` // Create deployment gate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.CreateDeploymentGateParams; import com.datadog.api.client.v2.model.CreateDeploymentGateParamsData; import com.datadog.api.client.v2.model.CreateDeploymentGateParamsDataAttributes; import com.datadog.api.client.v2.model.DeploymentGateDataType; import com.datadog.api.client.v2.model.DeploymentGateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createDeploymentGate", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); CreateDeploymentGateParams body = new CreateDeploymentGateParams() .data( new CreateDeploymentGateParamsData() .attributes( new CreateDeploymentGateParamsDataAttributes() .dryRun(false) .env("production") .identifier("my-gate-1") .service("my-service")) .type(DeploymentGateDataType.DEPLOYMENT_GATE)); try { DeploymentGateResponse result = apiInstance.createDeploymentGate(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#createDeploymentGate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create deployment gate returns "OK" response ``` """ Create deployment gate returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi from datadog_api_client.v2.model.create_deployment_gate_params import CreateDeploymentGateParams from datadog_api_client.v2.model.create_deployment_gate_params_data import CreateDeploymentGateParamsData from datadog_api_client.v2.model.create_deployment_gate_params_data_attributes import ( CreateDeploymentGateParamsDataAttributes, ) from datadog_api_client.v2.model.deployment_gate_data_type import DeploymentGateDataType body = CreateDeploymentGateParams( data=CreateDeploymentGateParamsData( attributes=CreateDeploymentGateParamsDataAttributes( dry_run=False, env="production", identifier="my-gate-1", service="my-service", ), type=DeploymentGateDataType.DEPLOYMENT_GATE, ), ) configuration = Configuration() configuration.unstable_operations["create_deployment_gate"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = 
api_instance.create_deployment_gate(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create deployment gate returns "OK" response ``` # Create deployment gate returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_deployment_gate".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new body = DatadogAPIClient::V2::CreateDeploymentGateParams.new({ data: DatadogAPIClient::V2::CreateDeploymentGateParamsData.new({ attributes: DatadogAPIClient::V2::CreateDeploymentGateParamsDataAttributes.new({ dry_run: false, env: "production", identifier: "my-gate-1", service: "my-service", }), type: DatadogAPIClient::V2::DeploymentGateDataType::DEPLOYMENT_GATE, }), }) p api_instance.create_deployment_gate(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create deployment gate returns "OK" response ``` // Create deployment gate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; use datadog_api_client::datadogV2::model::CreateDeploymentGateParams; use datadog_api_client::datadogV2::model::CreateDeploymentGateParamsData; use datadog_api_client::datadogV2::model::CreateDeploymentGateParamsDataAttributes; use datadog_api_client::datadogV2::model::DeploymentGateDataType; #[tokio::main] async fn main() { let body = CreateDeploymentGateParams::new(CreateDeploymentGateParamsData::new( CreateDeploymentGateParamsDataAttributes::new( "production".to_string(), "my-service".to_string(), ) .dry_run(false) .identifier("my-gate-1".to_string()), DeploymentGateDataType::DEPLOYMENT_GATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateDeploymentGate", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api.create_deployment_gate(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create deployment gate returns "OK" response ``` /** * Create deployment gate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createDeploymentGate"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); const params: v2.DeploymentGatesApiCreateDeploymentGateRequest = { body: { data: { attributes: { dryRun: false, env: "production", identifier: 
"my-gate-1", service: "my-service", }, type: "deployment_gate", }, }, }; apiInstance .createDeploymentGate(params) .then((data: v2.DeploymentGateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get deployment gate](https://docs.datadoghq.com/api/latest/deployment-gates/#get-deployment-gate) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#get-deployment-gate-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/deployment_gates/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{id}https://api.datadoghq.com/api/v2/deployment_gates/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{id} ### Overview Endpoint to get a deployment gate. This endpoint requires the `deployment_gates_read` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the deployment gate. ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGate-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment gate. Field Type Description data object Data for a deployment gate. attributes [_required_] object Basic information about a deployment gate. created_at [_required_] date-time The timestamp when the deployment gate was created. created_by [_required_] object Information about the user who created the deployment gate. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this gate is run in dry-run mode. env [_required_] string The environment of the deployment gate. identifier [_required_] string The identifier of the deployment gate. service [_required_] string The service of the deployment gate. updated_at date-time The timestamp when the deployment gate was last updated. updated_by object Information about the user who updated the deployment gate. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. 
name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment gate. type [_required_] enum Deployment gate resource type. Allowed enum values: `deployment_gate` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "env": "production", "identifier": "pre", "service": "my-service", "updated_at": "2021-01-01T00:00:00Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_gate" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment gate not found. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment gate not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Get deployment gate Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/deployment_gates/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get deployment gate ``` """ Get deployment gate returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_deployment_gate"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = api_instance.get_deployment_gate( id=DEPLOYMENT_GATE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get deployment gate ``` # Get deployment gate returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_deployment_gate".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] p api_instance.get_deployment_gate(DEPLOYMENT_GATE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get deployment gate ``` // Get deployment gate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetDeploymentGate", true) apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.GetDeploymentGate(ctx, DeploymentGateDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.GetDeploymentGate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.GetDeploymentGate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get deployment gate ``` // Get deployment gate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.DeploymentGateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getDeploymentGate", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); try { DeploymentGateResponse result = apiInstance.getDeploymentGate(DEPLOYMENT_GATE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#getDeploymentGate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get deployment gate ``` // Get deployment gate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetDeploymentGate", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .get_deployment_gate(deployment_gate_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get deployment gate ``` /** * Get deployment gate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const 
configuration = client.createConfiguration(); configuration.unstableOperations["v2.getDeploymentGate"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; const params: v2.DeploymentGatesApiGetDeploymentGateRequest = { id: DEPLOYMENT_GATE_DATA_ID, }; apiInstance .getDeploymentGate(params) .then((data: v2.DeploymentGateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update deployment gate](https://docs.datadoghq.com/api/latest/deployment-gates/#update-deployment-gate) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#update-deployment-gate-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/deployment_gates/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{id}https://api.datadoghq.com/api/v2/deployment_gates/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{id} ### Overview Endpoint to update a deployment gate. This endpoint requires the `deployment_gates_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the deployment gate. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Field Type Description data [_required_] object Parameters for updating a deployment gate. attributes [_required_] object Attributes for updating a deployment gate. dry_run [_required_] boolean Whether to run in dry-run mode. id [_required_] string Unique identifier of the deployment gate. type [_required_] enum Deployment gate resource type. Allowed enum values: `deployment_gate` ``` { "data": { "attributes": { "dry_run": false }, "id": "12345678-1234-1234-1234-123456789012", "type": "deployment_gate" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentGate-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment gate. 
Field Type Description data object Data for a deployment gate. attributes [_required_] object Basic information about a deployment gate. created_at [_required_] date-time The timestamp when the deployment gate was created. created_by [_required_] object Information about the user who created the deployment gate. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this gate is run in dry-run mode. env [_required_] string The environment of the deployment gate. identifier [_required_] string The identifier of the deployment gate. service [_required_] string The service of the deployment gate. updated_at date-time The timestamp when the deployment gate was last updated. updated_by object Information about the user who updated the deployment gate. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment gate. type [_required_] enum Deployment gate resource type. Allowed enum values: `deployment_gate` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "env": "production", "identifier": "pre", "service": "my-service", "updated_at": "2021-01-01T00:00:00Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_gate" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment gate not found. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment gate not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Update deployment gate returns "OK" response Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/deployment_gates/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "dry_run": false }, "id": "12345678-1234-1234-1234-123456789012", "type": "deployment_gate" } } EOF ``` ##### Update deployment gate returns "OK" response ``` // Update deployment gate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") body := datadogV2.UpdateDeploymentGateParams{ Data: datadogV2.UpdateDeploymentGateParamsData{ Attributes: datadogV2.UpdateDeploymentGateParamsDataAttributes{ DryRun: false, }, Id: "12345678-1234-1234-1234-123456789012", Type: datadogV2.DEPLOYMENTGATEDATATYPE_DEPLOYMENT_GATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateDeploymentGate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.UpdateDeploymentGate(ctx, DeploymentGateDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.UpdateDeploymentGate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.UpdateDeploymentGate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update deployment gate returns "OK" response ``` // Update deployment gate returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.DeploymentGateDataType; import com.datadog.api.client.v2.model.DeploymentGateResponse; import com.datadog.api.client.v2.model.UpdateDeploymentGateParams; import com.datadog.api.client.v2.model.UpdateDeploymentGateParamsData; import com.datadog.api.client.v2.model.UpdateDeploymentGateParamsDataAttributes; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateDeploymentGate", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); UpdateDeploymentGateParams body = new UpdateDeploymentGateParams() .data( new UpdateDeploymentGateParamsData() .attributes(new UpdateDeploymentGateParamsDataAttributes().dryRun(false)) .id("12345678-1234-1234-1234-123456789012") .type(DeploymentGateDataType.DEPLOYMENT_GATE)); try { DeploymentGateResponse result = apiInstance.updateDeploymentGate(DEPLOYMENT_GATE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#updateDeploymentGate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update deployment gate returns "OK" response ``` """ Update deployment gate returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi from datadog_api_client.v2.model.deployment_gate_data_type import DeploymentGateDataType from datadog_api_client.v2.model.update_deployment_gate_params import UpdateDeploymentGateParams from datadog_api_client.v2.model.update_deployment_gate_params_data import UpdateDeploymentGateParamsData from datadog_api_client.v2.model.update_deployment_gate_params_data_attributes import ( UpdateDeploymentGateParamsDataAttributes, ) # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] body = UpdateDeploymentGateParams( data=UpdateDeploymentGateParamsData( attributes=UpdateDeploymentGateParamsDataAttributes( dry_run=False, ), id="12345678-1234-1234-1234-123456789012", type=DeploymentGateDataType.DEPLOYMENT_GATE, ), ) configuration = Configuration() configuration.unstable_operations["update_deployment_gate"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = api_instance.update_deployment_gate(id=DEPLOYMENT_GATE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update deployment gate returns "OK" response ``` # Update deployment gate returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_deployment_gate".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] body = DatadogAPIClient::V2::UpdateDeploymentGateParams.new({ data: DatadogAPIClient::V2::UpdateDeploymentGateParamsData.new({ attributes: DatadogAPIClient::V2::UpdateDeploymentGateParamsDataAttributes.new({ dry_run: false, }), id: "12345678-1234-1234-1234-123456789012", type: DatadogAPIClient::V2::DeploymentGateDataType::DEPLOYMENT_GATE, }), }) p api_instance.update_deployment_gate(DEPLOYMENT_GATE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update deployment gate returns "OK" response ``` // Update deployment gate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; use datadog_api_client::datadogV2::model::DeploymentGateDataType; use datadog_api_client::datadogV2::model::UpdateDeploymentGateParams; use datadog_api_client::datadogV2::model::UpdateDeploymentGateParamsData; use datadog_api_client::datadogV2::model::UpdateDeploymentGateParamsDataAttributes; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); let body = UpdateDeploymentGateParams::new(UpdateDeploymentGateParamsData::new( UpdateDeploymentGateParamsDataAttributes::new(false), "12345678-1234-1234-1234-123456789012".to_string(), DeploymentGateDataType::DEPLOYMENT_GATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateDeploymentGate", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .update_deployment_gate(deployment_gate_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update deployment gate returns "OK" response ``` /** * Update deployment gate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateDeploymentGate"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; const params: 
v2.DeploymentGatesApiUpdateDeploymentGateRequest = { body: { data: { attributes: { dryRun: false, }, id: "12345678-1234-1234-1234-123456789012", type: "deployment_gate", }, }, id: DEPLOYMENT_GATE_DATA_ID, }; apiInstance .updateDeploymentGate(params) .then((data: v2.DeploymentGateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete deployment gate](https://docs.datadoghq.com/api/latest/deployment-gates/#delete-deployment-gate) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#delete-deployment-gate-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/deployment_gates/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{id}https://api.datadoghq.com/api/v2/deployment_gates/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{id} ### Overview Endpoint to delete a deployment gate. Rules associated with the gate are also deleted. This endpoint requires the `deployment_gates_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the deployment gate. ### Response * [204](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-204-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentGate-500-v2) No Content Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment gate not found. 
* [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment gate not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Delete deployment gate ``` # Path parameters export id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your site's API host if needed) curl -X DELETE "https://api.datadoghq.com/api/v2/deployment_gates/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete deployment gate ``` """ Delete deployment gate returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_deployment_gate"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) api_instance.delete_deployment_gate( id=DEPLOYMENT_GATE_DATA_ID, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete deployment gate ``` # Delete deployment gate returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_deployment_gate".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] api_instance.delete_deployment_gate(DEPLOYMENT_GATE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete deployment gate ``` // Delete deployment gate returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteDeploymentGate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) r, err := api.DeleteDeploymentGate(ctx, DeploymentGateDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.DeleteDeploymentGate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete deployment gate ``` // Delete deployment gate returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteDeploymentGate", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); try { apiInstance.deleteDeploymentGate(DEPLOYMENT_GATE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#deleteDeploymentGate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete deployment gate ``` // Delete deployment gate returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteDeploymentGate",
true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .delete_deployment_gate(deployment_gate_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete deployment gate ``` /** * Delete deployment gate returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteDeploymentGate"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; const params: v2.DeploymentGatesApiDeleteDeploymentGateRequest = { id: DEPLOYMENT_GATE_DATA_ID, }; apiInstance .deleteDeploymentGate(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create deployment rule](https://docs.datadoghq.com/api/latest/deployment-gates/#create-deployment-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#create-deployment-rule-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.ap2.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.datadoghq.eu/api/v2/deployment_gates/{gate_id}/ruleshttps://api.ddog-gov.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.us3.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.us5.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules ### Overview Endpoint to create a deployment rule. A gate for the rule must already exist. This endpoint requires the `deployment_gates_write` permission. ### Arguments #### Path Parameters Name Type Description gate_id [_required_] string The ID of the deployment gate. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Field Type Description data object Parameters for creating a deployment rule. attributes [_required_] object Parameters for creating a deployment rule. dry_run boolean Whether this rule is run in dry-run mode. name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. 
duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] string The type of the deployment rule (faulty_deployment_detection or monitor). type [_required_] enum Deployment rule resource type. Allowed enum values: `deployment_rule` ``` { "data": { "attributes": { "dry_run": false, "name": "My deployment rule", "options": { "excluded_resources": [] }, "type": "faulty_deployment_detection" }, "type": "deployment_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#CreateDeploymentRule-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment rule. Field Type Description data object Data for a deployment rule. attributes [_required_] object Basic information about a deployment rule. created_at [_required_] date-time The timestamp when the deployment rule was created. created_by [_required_] object Information about the user who created the deployment rule. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this rule is run in dry-run mode. gate_id [_required_] string The ID of the deployment gate. name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] enum The type of the deployment rule. Allowed enum values: `faulty_deployment_detection,monitor` updated_at date-time The timestamp when the deployment rule was last updated. updated_by object Information about the user who updated the deployment rule. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment rule. type [_required_] enum Deployment rule resource type. 
Allowed enum values: `deployment_rule` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "gate_id": "1111-2222-3333-4444-555566667777", "name": "My deployment rule", "options": { "duration": 3600, "excluded_resources": [ "resource1", "resource2" ] }, "type": "faulty_deployment_detection", "updated_at": "2019-09-19T10:00:00.000Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_rule" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Create deployment rule returns "OK" response Copy ``` # Path parameters export gate_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/deployment_gates/${gate_id}/rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "dry_run": false, "name": "My deployment rule", "options": { "excluded_resources": [] }, "type": "faulty_deployment_detection" }, "type": "deployment_rule" } } EOF ``` ##### Create deployment rule returns "OK" response ``` // Create deployment rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") body := datadogV2.CreateDeploymentRuleParams{ Data: &datadogV2.CreateDeploymentRuleParamsData{ Attributes: datadogV2.CreateDeploymentRuleParamsDataAttributes{ DryRun: datadog.PtrBool(false), Name: "My deployment rule", Options: datadogV2.DeploymentRulesOptions{ DeploymentRuleOptionsFaultyDeploymentDetection: &datadogV2.DeploymentRuleOptionsFaultyDeploymentDetection{ ExcludedResources: []string{}, }}, Type: "faulty_deployment_detection", }, Type: datadogV2.DEPLOYMENTRULEDATATYPE_DEPLOYMENT_RULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateDeploymentRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.CreateDeploymentRule(ctx, DeploymentGateDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.CreateDeploymentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.CreateDeploymentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create deployment rule returns "OK" response ``` // Create deployment rule returns "OK" response import com.datadog.api.client.ApiClient; import 
com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.CreateDeploymentRuleParams; import com.datadog.api.client.v2.model.CreateDeploymentRuleParamsData; import com.datadog.api.client.v2.model.CreateDeploymentRuleParamsDataAttributes; import com.datadog.api.client.v2.model.DeploymentRuleDataType; import com.datadog.api.client.v2.model.DeploymentRuleOptionsFaultyDeploymentDetection; import com.datadog.api.client.v2.model.DeploymentRuleResponse; import com.datadog.api.client.v2.model.DeploymentRulesOptions; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createDeploymentRule", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); CreateDeploymentRuleParams body = new CreateDeploymentRuleParams() .data( new CreateDeploymentRuleParamsData() .attributes( new CreateDeploymentRuleParamsDataAttributes() .dryRun(false) .name("My deployment rule") .options( new DeploymentRulesOptions( new DeploymentRuleOptionsFaultyDeploymentDetection())) .type("faulty_deployment_detection")) .type(DeploymentRuleDataType.DEPLOYMENT_RULE)); try { DeploymentRuleResponse result = apiInstance.createDeploymentRule(DEPLOYMENT_GATE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#createDeploymentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create deployment rule returns "OK" response ``` """ Create deployment rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi from datadog_api_client.v2.model.create_deployment_rule_params import CreateDeploymentRuleParams from datadog_api_client.v2.model.create_deployment_rule_params_data import CreateDeploymentRuleParamsData from datadog_api_client.v2.model.create_deployment_rule_params_data_attributes import ( CreateDeploymentRuleParamsDataAttributes, ) from datadog_api_client.v2.model.deployment_rule_data_type import DeploymentRuleDataType from datadog_api_client.v2.model.deployment_rule_options_faulty_deployment_detection import ( DeploymentRuleOptionsFaultyDeploymentDetection, ) # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] body = CreateDeploymentRuleParams( data=CreateDeploymentRuleParamsData( attributes=CreateDeploymentRuleParamsDataAttributes( dry_run=False, name="My deployment rule", options=DeploymentRuleOptionsFaultyDeploymentDetection( excluded_resources=[], ), type="faulty_deployment_detection", ), type=DeploymentRuleDataType.DEPLOYMENT_RULE, ), ) configuration = Configuration() 
configuration.unstable_operations["create_deployment_rule"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = api_instance.create_deployment_rule(gate_id=DEPLOYMENT_GATE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create deployment rule returns "OK" response ``` # Create deployment rule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_deployment_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] body = DatadogAPIClient::V2::CreateDeploymentRuleParams.new({ data: DatadogAPIClient::V2::CreateDeploymentRuleParamsData.new({ attributes: DatadogAPIClient::V2::CreateDeploymentRuleParamsDataAttributes.new({ dry_run: false, name: "My deployment rule", options: DatadogAPIClient::V2::DeploymentRuleOptionsFaultyDeploymentDetection.new({ excluded_resources: [], }), type: "faulty_deployment_detection", }), type: DatadogAPIClient::V2::DeploymentRuleDataType::DEPLOYMENT_RULE, }), }) p api_instance.create_deployment_rule(DEPLOYMENT_GATE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create deployment rule returns "OK" response ``` // Create deployment rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; use datadog_api_client::datadogV2::model::CreateDeploymentRuleParams; use datadog_api_client::datadogV2::model::CreateDeploymentRuleParamsData; use datadog_api_client::datadogV2::model::CreateDeploymentRuleParamsDataAttributes; use datadog_api_client::datadogV2::model::DeploymentRuleDataType; use datadog_api_client::datadogV2::model::DeploymentRuleOptionsFaultyDeploymentDetection; use datadog_api_client::datadogV2::model::DeploymentRulesOptions; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); let body = CreateDeploymentRuleParams::new().data(CreateDeploymentRuleParamsData::new( CreateDeploymentRuleParamsDataAttributes::new( "My deployment rule".to_string(), DeploymentRulesOptions::DeploymentRuleOptionsFaultyDeploymentDetection(Box::new( DeploymentRuleOptionsFaultyDeploymentDetection::new().excluded_resources(vec![]), )), "faulty_deployment_detection".to_string(), ) .dry_run(false), DeploymentRuleDataType::DEPLOYMENT_RULE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateDeploymentRule", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .create_deployment_rule(deployment_gate_data_id.clone(), body) .await; if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create deployment rule returns "OK" response ``` /** * Create deployment rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createDeploymentRule"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; const params: v2.DeploymentGatesApiCreateDeploymentRuleRequest = { body: { data: { attributes: { dryRun: false, name: "My deployment rule", options: { excludedResources: [], }, type: "faulty_deployment_detection", }, type: "deployment_rule", }, }, gateId: DEPLOYMENT_GATE_DATA_ID, }; apiInstance .createDeploymentRule(params) .then((data: v2.DeploymentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get deployment rule](https://docs.datadoghq.com/api/latest/deployment-gates/#get-deployment-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#get-deployment-rule-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id} ### Overview Endpoint to get a deployment rule. This endpoint requires the `deployment_gates_read` permission. ### Arguments #### Path Parameters Name Type Description gate_id [_required_] string The ID of the deployment gate. id [_required_] string The ID of the deployment rule. 
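The `options` field of a deployment rule is a union: depending on the rule's `type`, it holds either faulty-deployment-detection options (`duration`, `excluded_resources`) or monitor options (`duration`, `query`), as described in the Response model below. The following is a minimal, hypothetical sketch of branching on that union when working with the raw JSON response body; the helper name and output format are illustrative and not part of the API:

```
# Hypothetical helper: summarize a deployment rule's options based on its type.
# The JSON shape follows the Response model documented below.
def describe_rule_options(rule_json: dict) -> str:
    attrs = rule_json["data"]["attributes"]
    opts = attrs["options"]
    if attrs["type"] == "faulty_deployment_detection":
        # Faulty deployment detection: optional duration and excluded resources.
        excluded = ", ".join(opts.get("excluded_resources", [])) or "none"
        return f"faulty deployment detection (duration={opts.get('duration')}, excluded: {excluded})"
    # Otherwise the rule type is "monitor": options carry a monitor query and a duration in seconds.
    return f"monitor (query={opts['query']!r}, must stay OK for {opts.get('duration')}s)"
```

Applied to the 200 example in the Response section, this returns `faulty deployment detection (duration=3600, excluded: resource1, resource2)`.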
### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentRule-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment rule. Field Type Description data object Data for a deployment rule. attributes [_required_] object Basic information about a deployment rule. created_at [_required_] date-time The timestamp when the deployment rule was created. created_by [_required_] object Information about the user who created the deployment rule. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this rule is run in dry-run mode. gate_id [_required_] string The ID of the deployment gate. name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] enum The type of the deployment rule. Allowed enum values: `faulty_deployment_detection,monitor` updated_at date-time The timestamp when the deployment rule was last updated. updated_by object Information about the user who updated the deployment rule. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment rule. type [_required_] enum Deployment rule resource type. Allowed enum values: `deployment_rule` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "gate_id": "1111-2222-3333-4444-555566667777", "name": "My deployment rule", "options": { "duration": 3600, "excluded_resources": [ "resource1", "resource2" ] }, "type": "faulty_deployment_detection", "updated_at": "2019-09-19T10:00:00.000Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_rule" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. 
Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment rule not found. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment rule not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
```
{
  "errors": [
    {
      "detail": "Malformed payload",
      "status": "400",
      "title": "Bad Request"
    }
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript)

##### Get deployment rule

```
# Path parameters
export gate_id="CHANGE_ME"
export id="CHANGE_ME"
# Curl command
curl -X GET "https://api.datadoghq.com/api/v2/deployment_gates/${gate_id}/rules/${id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

Replace `api.datadoghq.com` with the API host for your Datadog site if needed.

##### Get deployment rule

```
"""
Get deployment rule returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi

# there is a valid "deployment_gate" in the system
DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"]

# there is a valid "deployment_rule" in the system
DEPLOYMENT_RULE_DATA_ID = environ["DEPLOYMENT_RULE_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["get_deployment_rule"] = True
with ApiClient(configuration) as api_client:
    api_instance = DeploymentGatesApi(api_client)
    response = api_instance.get_deployment_rule(
        gate_id=DEPLOYMENT_GATE_DATA_ID,
        id=DEPLOYMENT_RULE_DATA_ID,
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get deployment rule

```
# Get deployment rule returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_deployment_rule".to_sym] = true
end
api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new

# there is a valid "deployment_gate" in the system
DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"]

# there is a valid "deployment_rule" in the system
DEPLOYMENT_RULE_DATA_ID = ENV["DEPLOYMENT_RULE_DATA_ID"]
p api_instance.get_deployment_rule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get deployment rule

``` // Get deployment rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there
is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") // there is a valid "deployment_rule" in the system DeploymentRuleDataID := os.Getenv("DEPLOYMENT_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetDeploymentRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.GetDeploymentRule(ctx, DeploymentGateDataID, DeploymentRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.GetDeploymentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.GetDeploymentRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get deployment rule ``` // Get deployment rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.DeploymentRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getDeploymentRule", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); // there is a valid "deployment_rule" in the system String DEPLOYMENT_RULE_DATA_ID = System.getenv("DEPLOYMENT_RULE_DATA_ID"); try { DeploymentRuleResponse result = apiInstance.getDeploymentRule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#getDeploymentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get deployment rule ``` // Get deployment rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); // there is a valid "deployment_rule" in the system let deployment_rule_data_id = std::env::var("DEPLOYMENT_RULE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetDeploymentRule", true); let api = 
DeploymentGatesAPI::with_config(configuration); let resp = api .get_deployment_rule( deployment_gate_data_id.clone(), deployment_rule_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get deployment rule ``` /** * Get deployment rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getDeploymentRule"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; // there is a valid "deployment_rule" in the system const DEPLOYMENT_RULE_DATA_ID = process.env.DEPLOYMENT_RULE_DATA_ID as string; const params: v2.DeploymentGatesApiGetDeploymentRuleRequest = { gateId: DEPLOYMENT_GATE_DATA_ID, id: DEPLOYMENT_RULE_DATA_ID, }; apiInstance .getDeploymentRule(params) .then((data: v2.DeploymentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update deployment rule](https://docs.datadoghq.com/api/latest/deployment-gates/#update-deployment-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#update-deployment-rule-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id} ### Overview Endpoint to update a deployment rule. This endpoint requires the `deployment_gates_write` permission. ### Arguments #### Path Parameters Name Type Description gate_id [_required_] string The ID of the deployment gate. id [_required_] string The ID of the deployment rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Field Type Description data [_required_] object Parameters for updating a deployment rule. attributes [_required_] object Parameters for updating a deployment rule. dry_run [_required_] boolean Whether to run this rule in dry-run mode. 
name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] enum Deployment rule resource type. Allowed enum values: `deployment_rule` ``` { "data": { "attributes": { "dry_run": false, "name": "Updated deployment rule", "options": { "excluded_resources": [] } }, "type": "deployment_rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#UpdateDeploymentRule-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment rule. Field Type Description data object Data for a deployment rule. attributes [_required_] object Basic information about a deployment rule. created_at [_required_] date-time The timestamp when the deployment rule was created. created_by [_required_] object Information about the user who created the deployment rule. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this rule is run in dry-run mode. gate_id [_required_] string The ID of the deployment gate. name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] enum The type of the deployment rule. Allowed enum values: `faulty_deployment_detection,monitor` updated_at date-time The timestamp when the deployment rule was last updated. updated_by object Information about the user who updated the deployment rule. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment rule. 
type [_required_] enum Deployment rule resource type. Allowed enum values: `deployment_rule` ``` { "data": { "attributes": { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "gate_id": "1111-2222-3333-4444-555566667777", "name": "My deployment rule", "options": { "duration": 3600, "excluded_resources": [ "resource1", "resource2" ] }, "type": "faulty_deployment_detection", "updated_at": "2019-09-19T10:00:00.000Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } }, "id": "1111-2222-3333-4444-555566667777", "type": "deployment_rule" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment rule not found. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment rule not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
```
{
  "errors": [
    {
      "detail": "Malformed payload",
      "status": "400",
      "title": "Bad Request"
    }
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript)

##### Update deployment rule returns "OK" response

```
# Path parameters
export gate_id="CHANGE_ME"
export id="CHANGE_ME"
# Curl command
curl -X PUT "https://api.datadoghq.com/api/v2/deployment_gates/${gate_id}/rules/${id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "dry_run": false,
      "name": "Updated deployment rule",
      "options": {
        "excluded_resources": []
      }
    },
    "type": "deployment_rule"
  }
}
EOF
```

Replace `api.datadoghq.com` with the API host for your Datadog site if needed.

##### Update deployment rule returns "OK" response

```
// Update deployment rule returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "deployment_gate" in the system
    DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID")

    // there is a valid "deployment_rule" in the system
    DeploymentRuleDataID := os.Getenv("DEPLOYMENT_RULE_DATA_ID")

    body := datadogV2.UpdateDeploymentRuleParams{
        Data: datadogV2.UpdateDeploymentRuleParamsData{
            Attributes: datadogV2.UpdateDeploymentRuleParamsDataAttributes{
                DryRun: false,
                Name:   "Updated deployment rule",
                Options: datadogV2.DeploymentRulesOptions{
                    DeploymentRuleOptionsFaultyDeploymentDetection: &datadogV2.DeploymentRuleOptionsFaultyDeploymentDetection{
                        ExcludedResources: []string{},
                    }},
            },
            Type: datadogV2.DEPLOYMENTRULEDATATYPE_DEPLOYMENT_RULE,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    configuration.SetUnstableOperationEnabled("v2.UpdateDeploymentRule", true)
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewDeploymentGatesApi(apiClient)
    resp, r, err := api.UpdateDeploymentRule(ctx, DeploymentGateDataID, DeploymentRuleDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.UpdateDeploymentRule`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.UpdateDeploymentRule`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Update deployment rule returns "OK" response

``` // Update deployment rule
returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.DeploymentRuleDataType; import com.datadog.api.client.v2.model.DeploymentRuleOptionsFaultyDeploymentDetection; import com.datadog.api.client.v2.model.DeploymentRuleResponse; import com.datadog.api.client.v2.model.DeploymentRulesOptions; import com.datadog.api.client.v2.model.UpdateDeploymentRuleParams; import com.datadog.api.client.v2.model.UpdateDeploymentRuleParamsData; import com.datadog.api.client.v2.model.UpdateDeploymentRuleParamsDataAttributes; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateDeploymentRule", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); // there is a valid "deployment_rule" in the system String DEPLOYMENT_RULE_DATA_ID = System.getenv("DEPLOYMENT_RULE_DATA_ID"); UpdateDeploymentRuleParams body = new UpdateDeploymentRuleParams() .data( new UpdateDeploymentRuleParamsData() .attributes( new UpdateDeploymentRuleParamsDataAttributes() .dryRun(false) .name("Updated deployment rule") .options( new DeploymentRulesOptions( new DeploymentRuleOptionsFaultyDeploymentDetection()))) .type(DeploymentRuleDataType.DEPLOYMENT_RULE)); try { DeploymentRuleResponse result = apiInstance.updateDeploymentRule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#updateDeploymentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update deployment rule returns "OK" response ``` """ Update deployment rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi from datadog_api_client.v2.model.deployment_rule_data_type import DeploymentRuleDataType from datadog_api_client.v2.model.deployment_rule_options_faulty_deployment_detection import ( DeploymentRuleOptionsFaultyDeploymentDetection, ) from datadog_api_client.v2.model.update_deployment_rule_params import UpdateDeploymentRuleParams from datadog_api_client.v2.model.update_deployment_rule_params_data import UpdateDeploymentRuleParamsData from datadog_api_client.v2.model.update_deployment_rule_params_data_attributes import ( UpdateDeploymentRuleParamsDataAttributes, ) # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] # there is a valid "deployment_rule" in the system DEPLOYMENT_RULE_DATA_ID = environ["DEPLOYMENT_RULE_DATA_ID"] body = UpdateDeploymentRuleParams( data=UpdateDeploymentRuleParamsData( 
attributes=UpdateDeploymentRuleParamsDataAttributes( dry_run=False, name="Updated deployment rule", options=DeploymentRuleOptionsFaultyDeploymentDetection( excluded_resources=[], ), ), type=DeploymentRuleDataType.DEPLOYMENT_RULE, ), ) configuration = Configuration() configuration.unstable_operations["update_deployment_rule"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = api_instance.update_deployment_rule( gate_id=DEPLOYMENT_GATE_DATA_ID, id=DEPLOYMENT_RULE_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update deployment rule returns "OK" response ``` # Update deployment rule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_deployment_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] # there is a valid "deployment_rule" in the system DEPLOYMENT_RULE_DATA_ID = ENV["DEPLOYMENT_RULE_DATA_ID"] body = DatadogAPIClient::V2::UpdateDeploymentRuleParams.new({ data: DatadogAPIClient::V2::UpdateDeploymentRuleParamsData.new({ attributes: DatadogAPIClient::V2::UpdateDeploymentRuleParamsDataAttributes.new({ dry_run: false, name: "Updated deployment rule", options: DatadogAPIClient::V2::DeploymentRuleOptionsFaultyDeploymentDetection.new({ excluded_resources: [], }), }), type: DatadogAPIClient::V2::DeploymentRuleDataType::DEPLOYMENT_RULE, }), }) p api_instance.update_deployment_rule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update deployment rule returns "OK" response ``` // Update deployment rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; use datadog_api_client::datadogV2::model::DeploymentRuleDataType; use datadog_api_client::datadogV2::model::DeploymentRuleOptionsFaultyDeploymentDetection; use datadog_api_client::datadogV2::model::DeploymentRulesOptions; use datadog_api_client::datadogV2::model::UpdateDeploymentRuleParams; use datadog_api_client::datadogV2::model::UpdateDeploymentRuleParamsData; use datadog_api_client::datadogV2::model::UpdateDeploymentRuleParamsDataAttributes; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); // there is a valid "deployment_rule" in the system let deployment_rule_data_id = std::env::var("DEPLOYMENT_RULE_DATA_ID").unwrap(); let body = UpdateDeploymentRuleParams::new(UpdateDeploymentRuleParamsData::new( UpdateDeploymentRuleParamsDataAttributes::new( false, "Updated deployment rule".to_string(), 
DeploymentRulesOptions::DeploymentRuleOptionsFaultyDeploymentDetection(Box::new( DeploymentRuleOptionsFaultyDeploymentDetection::new().excluded_resources(vec![]), )), ), DeploymentRuleDataType::DEPLOYMENT_RULE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateDeploymentRule", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .update_deployment_rule( deployment_gate_data_id.clone(), deployment_rule_data_id.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update deployment rule returns "OK" response ``` /** * Update deployment rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateDeploymentRule"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; // there is a valid "deployment_rule" in the system const DEPLOYMENT_RULE_DATA_ID = process.env.DEPLOYMENT_RULE_DATA_ID as string; const params: v2.DeploymentGatesApiUpdateDeploymentRuleRequest = { body: { data: { attributes: { dryRun: false, name: "Updated deployment rule", options: { excludedResources: [], }, }, type: "deployment_rule", }, }, gateId: DEPLOYMENT_GATE_DATA_ID, id: DEPLOYMENT_RULE_DATA_ID, }; apiInstance .updateDeploymentRule(params) .then((data: v2.DeploymentRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete deployment rule](https://docs.datadoghq.com/api/latest/deployment-gates/#delete-deployment-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#delete-deployment-rule-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ap2.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.eu/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.ddog-gov.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us3.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id}https://api.us5.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules/{id} ### Overview Endpoint to delete a deployment rule. This endpoint requires the `deployment_gates_write` permission. 
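A successful delete returns 204 No Content, so there is no response body to parse; the case most worth handling separately is the 404 returned when the gate or rule does not exist. The sketch below builds on the Python client examples elsewhere on this page and treats the not-found case as already deleted; the `NotFoundException` import path is an assumption about the client's exception module rather than something documented here:

```
# Minimal sketch: delete a deployment rule and treat "already gone" as success.
# Assumes the datadog-api-client Python package; exception class locations may differ by version.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import NotFoundException  # assumed import path
from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi

configuration = Configuration()
configuration.unstable_operations["delete_deployment_rule"] = True

with ApiClient(configuration) as api_client:
    api = DeploymentGatesApi(api_client)
    try:
        # Returns nothing on success (204 No Content).
        api.delete_deployment_rule(
            gate_id=environ["DEPLOYMENT_GATE_DATA_ID"],
            id=environ["DEPLOYMENT_RULE_DATA_ID"],
        )
        print("Deployment rule deleted")
    except NotFoundException:
        # The gate or rule no longer exists, so there is nothing left to delete.
        print("Deployment rule not found")
```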
### Arguments #### Path Parameters Name Type Description gate_id [_required_] string The ID of the deployment gate. id [_required_] string The ID of the deployment rule. ### Response * [204](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-204-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#DeleteDeploymentRule-500-v2) No Content Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Deployment gate not found. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Deployment gate not found. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
```
{
  "errors": [
    {
      "detail": "Malformed payload",
      "status": "400",
      "title": "Bad Request"
    }
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript)

##### Delete deployment rule

```
# Path parameters
export gate_id="CHANGE_ME"
export id="CHANGE_ME"
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v2/deployment_gates/${gate_id}/rules/${id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

Replace `api.datadoghq.com` with the API host for your Datadog site if needed.

##### Delete deployment rule

```
"""
Delete deployment rule returns "No Content" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi

# there is a valid "deployment_gate" in the system
DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"]

# there is a valid "deployment_rule" in the system
DEPLOYMENT_RULE_DATA_ID = environ["DEPLOYMENT_RULE_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_deployment_rule"] = True
with ApiClient(configuration) as api_client:
    api_instance = DeploymentGatesApi(api_client)
    api_instance.delete_deployment_rule(
        gate_id=DEPLOYMENT_GATE_DATA_ID,
        id=DEPLOYMENT_RULE_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete deployment rule

```
# Delete deployment rule returns "No Content" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.delete_deployment_rule".to_sym] = true
end
api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new

# there is a valid "deployment_gate" in the system
DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"]

# there is a valid "deployment_rule" in the system
DEPLOYMENT_RULE_DATA_ID = ENV["DEPLOYMENT_RULE_DATA_ID"]
api_instance.delete_deployment_rule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete deployment rule

``` // Delete deployment rule returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid
"deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") // there is a valid "deployment_rule" in the system DeploymentRuleDataID := os.Getenv("DEPLOYMENT_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteDeploymentRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) r, err := api.DeleteDeploymentRule(ctx, DeploymentGateDataID, DeploymentRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.DeleteDeploymentRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete deployment rule ``` // Delete deployment rule returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteDeploymentRule", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); // there is a valid "deployment_rule" in the system String DEPLOYMENT_RULE_DATA_ID = System.getenv("DEPLOYMENT_RULE_DATA_ID"); try { apiInstance.deleteDeploymentRule(DEPLOYMENT_GATE_DATA_ID, DEPLOYMENT_RULE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#deleteDeploymentRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete deployment rule ``` // Delete deployment rule returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); // there is a valid "deployment_rule" in the system let deployment_rule_data_id = std::env::var("DEPLOYMENT_RULE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteDeploymentRule", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .delete_deployment_rule( deployment_gate_data_id.clone(), deployment_rule_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete deployment rule ``` /** * Delete deployment rule returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteDeploymentRule"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; // there is a valid "deployment_rule" in the system const DEPLOYMENT_RULE_DATA_ID = process.env.DEPLOYMENT_RULE_DATA_ID as string; const params: v2.DeploymentGatesApiDeleteDeploymentRuleRequest = { gateId: DEPLOYMENT_GATE_DATA_ID, id: DEPLOYMENT_RULE_DATA_ID, }; apiInstance .deleteDeploymentRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get rules for a deployment gate](https://docs.datadoghq.com/api/latest/deployment-gates/#get-rules-for-a-deployment-gate) * [v2 (latest)](https://docs.datadoghq.com/api/latest/deployment-gates/#get-rules-for-a-deployment-gate-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.ap2.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.datadoghq.eu/api/v2/deployment_gates/{gate_id}/ruleshttps://api.ddog-gov.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.us3.datadoghq.com/api/v2/deployment_gates/{gate_id}/ruleshttps://api.us5.datadoghq.com/api/v2/deployment_gates/{gate_id}/rules ### Overview Endpoint to get rules for a deployment gate. This endpoint requires the `deployment_gates_read` permission. ### Arguments #### Path Parameters Name Type Description gate_id [_required_] string The ID of the deployment gate. ### Response * [200](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-200-v2) * [400](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-400-v2) * [401](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-401-v2) * [403](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-429-v2) * [500](https://docs.datadoghq.com/api/latest/deployment-gates/#GetDeploymentGateRules-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Response for a deployment gate rules. 
Field Type Description data object Data for a list of deployment rules. attributes [_required_] object rules [object] created_at [_required_] date-time The timestamp when the deployment rule was created. created_by [_required_] object Information about the user who created the deployment rule. handle string The handle of the user who created the deployment rule. id [_required_] string The ID of the user who created the deployment rule. name string The name of the user who created the deployment rule. dry_run [_required_] boolean Whether this rule is run in dry-run mode. gate_id [_required_] string The ID of the deployment gate. name [_required_] string The name of the deployment rule. options [_required_] Options for deployment rule response representing either faulty deployment detection or monitor options. Option 1 object Faulty deployment detection options for deployment rules. duration int64 The duration for faulty deployment detection. excluded_resources [string] Resources to exclude from faulty deployment detection. Option 2 object Monitor options for deployment rules. duration int64 Seconds the monitor needs to stay in OK status for the rule to pass. query [_required_] string Monitors that match this query are evaluated. type [_required_] enum The type of the deployment rule. Allowed enum values: `faulty_deployment_detection,monitor` updated_at date-time The timestamp when the deployment rule was last updated. updated_by object Information about the user who updated the deployment rule. handle string The handle of the user who updated the deployment rule. id [_required_] string The ID of the user who updated the deployment rule. name string The name of the user who updated the deployment rule. id [_required_] string Unique identifier of the deployment rule. type [_required_] enum List deployment rule resource type. Allowed enum values: `list_deployment_rules` ``` { "data": { "attributes": { "rules": [ { "created_at": "2021-01-01T00:00:00Z", "created_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" }, "dry_run": false, "gate_id": "1111-2222-3333-4444-555566667777", "name": "My deployment rule", "options": { "duration": 3600, "excluded_resources": [ "resource1", "resource2" ] }, "type": "faulty_deployment_detection", "updated_at": "2019-09-19T10:00:00.000Z", "updated_by": { "handle": "test-user", "id": "1111-2222-3333-4444-555566667777", "name": "Test User" } } ] }, "id": "1111-2222-3333-4444-555566667777", "type": "list_deployment_rules" } } ``` Copy Bad request. * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Bad request. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/deployment-gates/) * [Example](https://docs.datadoghq.com/api/latest/deployment-gates/) Errors occurred. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/deployment-gates/?code-lang=typescript) ##### Get rules for a deployment gate Copy ``` # Path parameters export gate_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/deployment_gates/${gate_id}/rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get rules for a deployment gate ``` """ Get rules for a deployment gate returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.deployment_gates_api import DeploymentGatesApi # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = environ["DEPLOYMENT_GATE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_deployment_gate_rules"] = True with ApiClient(configuration) as api_client: api_instance = DeploymentGatesApi(api_client) response = api_instance.get_deployment_gate_rules( gate_id=DEPLOYMENT_GATE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get rules for a deployment gate ``` # Get rules for a deployment gate returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_deployment_gate_rules".to_sym] = true end api_instance = DatadogAPIClient::V2::DeploymentGatesAPI.new # there is a valid "deployment_gate" in the system DEPLOYMENT_GATE_DATA_ID = ENV["DEPLOYMENT_GATE_DATA_ID"] p api_instance.get_deployment_gate_rules(DEPLOYMENT_GATE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get rules for a deployment gate ``` // Get rules for a deployment gate returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "deployment_gate" in the system DeploymentGateDataID := os.Getenv("DEPLOYMENT_GATE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetDeploymentGateRules", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDeploymentGatesApi(apiClient) resp, r, err := api.GetDeploymentGateRules(ctx, DeploymentGateDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DeploymentGatesApi.GetDeploymentGateRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DeploymentGatesApi.GetDeploymentGateRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get rules for a deployment gate ``` // Get rules for a deployment gate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DeploymentGatesApi; import com.datadog.api.client.v2.model.DeploymentGateRulesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getDeploymentGateRules", true); DeploymentGatesApi apiInstance = new DeploymentGatesApi(defaultClient); // there is a valid "deployment_gate" in the system String DEPLOYMENT_GATE_DATA_ID = System.getenv("DEPLOYMENT_GATE_DATA_ID"); try { DeploymentGateRulesResponse result = apiInstance.getDeploymentGateRules(DEPLOYMENT_GATE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DeploymentGatesApi#getDeploymentGateRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get rules for a deployment gate ``` // Get rules for a deployment gate returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_deployment_gates::DeploymentGatesAPI; #[tokio::main] async fn main() { // there is a valid "deployment_gate" in the system let deployment_gate_data_id = std::env::var("DEPLOYMENT_GATE_DATA_ID").unwrap(); let mut configuration = 
datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetDeploymentGateRules", true); let api = DeploymentGatesAPI::with_config(configuration); let resp = api .get_deployment_gate_rules(deployment_gate_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get rules for a deployment gate ``` /** * Get rules for a deployment gate returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getDeploymentGateRules"] = true; const apiInstance = new v2.DeploymentGatesApi(configuration); // there is a valid "deployment_gate" in the system const DEPLOYMENT_GATE_DATA_ID = process.env.DEPLOYMENT_GATE_DATA_ID as string; const params: v2.DeploymentGatesApiGetDeploymentGateRulesRequest = { gateId: DEPLOYMENT_GATE_DATA_ID, }; apiInstance .getDeploymentGateRules(params) .then((data: v2.DeploymentGateRulesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/domain-allowlist # Domain Allowlist Configure your Datadog Email Domain Allowlist directly through the Datadog API. The Email Domain Allowlist controls the domains that certain Datadog emails can be sent to.
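As a quick illustration of how the allowlist is typically consumed, the sketch below reads the allowlist with the Python client's `get_domain_allowlist` call (shown in the code examples further down) and checks whether a given address's domain is allowed. This is a minimal sketch, not part of the reference: it assumes `DD_API_KEY` and `DD_APP_KEY` are set in the environment, that the client exposes the JSON:API payload as `data.attributes`, and that entries carry a leading `@`, as in the example payloads on this page; the email address is a made-up placeholder.

```
# Minimal sketch: check whether an email's domain is on the Email Domain Allowlist.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.domain_allowlist_api import DomainAllowlistApi


def domain_is_allowed(email: str) -> bool:
    configuration = Configuration()
    with ApiClient(configuration) as api_client:
        response = DomainAllowlistApi(api_client).get_domain_allowlist()

    attributes = response.data.attributes
    # When the allowlist is disabled, delivery is not restricted by domain.
    if not attributes.enabled:
        return True
    # Allowlist entries are stored with a leading "@" (for example "@example.com").
    return "@" + email.split("@", 1)[1] in attributes.domains


if __name__ == "__main__":
    print(domain_is_allowed("jane.doe@example.com"))
```

The same read-then-check pattern pairs naturally with the PATCH endpoint below when a domain needs to be added before sending reports.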
For more information, see the [Domain Allowlist docs page](https://docs.datadoghq.com/account_management/org_settings/domain_allowlist) ## [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) * [v2 (latest)](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist-v2) GET https://api.ap1.datadoghq.com/api/v2/domain_allowlisthttps://api.ap2.datadoghq.com/api/v2/domain_allowlisthttps://api.datadoghq.eu/api/v2/domain_allowlisthttps://api.ddog-gov.com/api/v2/domain_allowlisthttps://api.datadoghq.com/api/v2/domain_allowlisthttps://api.us3.datadoghq.com/api/v2/domain_allowlisthttps://api.us5.datadoghq.com/api/v2/domain_allowlist ### Overview Get the domain allowlist for an organization. This endpoint requires any of the following permissions: * `org_management` * `monitors_write` * `generate_dashboard_reports` * `generate_log_reports` * `manage_log_reports` OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#domain-allowlist) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/domain-allowlist/#GetDomainAllowlist-200-v2) * [429](https://docs.datadoghq.com/api/latest/domain-allowlist/#GetDomainAllowlist-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/domain-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/domain-allowlist/) Response containing information about the email domain allowlist. Field Type Description data object The email domain allowlist response for an org. attributes object The details of the email domain allowlist. domains [string] The list of domains in the email domain allowlist. enabled boolean Whether the email domain allowlist is enabled for the org. id string The unique identifier of the org. type [_required_] enum Email domain allowlist allowlist type. Allowed enum values: `domain_allowlist` default: `domain_allowlist` ``` { "data": { "attributes": { "domains": [], "enabled": false }, "id": "string", "type": "domain_allowlist" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/domain-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/domain-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=typescript) ##### Get Domain Allowlist Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/domain_allowlist" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Domain Allowlist ``` """ Get Domain Allowlist returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.domain_allowlist_api import DomainAllowlistApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DomainAllowlistApi(api_client) response = api_instance.get_domain_allowlist() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Domain Allowlist ``` # Get Domain Allowlist returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DomainAllowlistAPI.new p api_instance.get_domain_allowlist() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Domain Allowlist ``` // Get Domain Allowlist returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDomainAllowlistApi(apiClient) resp, r, err := api.GetDomainAllowlist(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DomainAllowlistApi.GetDomainAllowlist`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DomainAllowlistApi.GetDomainAllowlist`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Domain Allowlist ``` // Get Domain Allowlist returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DomainAllowlistApi; import com.datadog.api.client.v2.model.DomainAllowlistResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DomainAllowlistApi apiInstance = new DomainAllowlistApi(defaultClient); try { DomainAllowlistResponse result = apiInstance.getDomainAllowlist(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DomainAllowlistApi#getDomainAllowlist"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Domain Allowlist ``` // Get Domain Allowlist returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_domain_allowlist::DomainAllowlistAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DomainAllowlistAPI::with_config(configuration); let resp = api.get_domain_allowlist().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Domain Allowlist ``` /** * Get Domain Allowlist returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DomainAllowlistApi(configuration); apiInstance .getDomainAllowlist() .then((data: v2.DomainAllowlistResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) * [v2 (latest)](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist-v2) PATCH https://api.ap1.datadoghq.com/api/v2/domain_allowlisthttps://api.ap2.datadoghq.com/api/v2/domain_allowlisthttps://api.datadoghq.eu/api/v2/domain_allowlisthttps://api.ddog-gov.com/api/v2/domain_allowlisthttps://api.datadoghq.com/api/v2/domain_allowlisthttps://api.us3.datadoghq.com/api/v2/domain_allowlisthttps://api.us5.datadoghq.com/api/v2/domain_allowlist ### Overview Update the domain allowlist for an organization. This endpoint requires any of the following permissions: * `org_management` * `monitors_write` * `generate_dashboard_reports` * `generate_log_reports` * `manage_log_reports` OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#domain-allowlist) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/domain-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/domain-allowlist/) Field Type Description data [_required_] object The email domain allowlist for an org. attributes object The details of the email domain allowlist. domains [string] The list of domains in the email domain allowlist. enabled boolean Whether the email domain allowlist is enabled for the org. id string The unique identifier of the org. type [_required_] enum Email domain allowlist allowlist type. Allowed enum values: `domain_allowlist` default: `domain_allowlist` ``` { "data": { "attributes": { "domains": [ "@static-test-domain.test" ], "enabled": false }, "type": "domain_allowlist" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/domain-allowlist/#PatchDomainAllowlist-200-v2) * [429](https://docs.datadoghq.com/api/latest/domain-allowlist/#PatchDomainAllowlist-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/domain-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/domain-allowlist/) Response containing information about the email domain allowlist. Field Type Description data object The email domain allowlist response for an org. attributes object The details of the email domain allowlist. domains [string] The list of domains in the email domain allowlist. enabled boolean Whether the email domain allowlist is enabled for the org. id string The unique identifier of the org. type [_required_] enum Email domain allowlist allowlist type. Allowed enum values: `domain_allowlist` default: `domain_allowlist` ``` { "data": { "attributes": { "domains": [], "enabled": false }, "id": "string", "type": "domain_allowlist" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/domain-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/domain-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/domain-allowlist/?code-lang=typescript) ##### Sets Domain Allowlist returns "OK" response Copy ``` # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/domain_allowlist" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "domains": [ "@static-test-domain.test" ], "enabled": false }, "type": "domain_allowlist" } } EOF ``` ##### Sets Domain Allowlist returns "OK" response ``` // Sets Domain Allowlist returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DomainAllowlistRequest{ Data: datadogV2.DomainAllowlist{ Attributes: &datadogV2.DomainAllowlistAttributes{ Domains: []string{ "@static-test-domain.test", }, Enabled: datadog.PtrBool(false), }, Type: datadogV2.DOMAINALLOWLISTTYPE_DOMAIN_ALLOWLIST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDomainAllowlistApi(apiClient) resp, r, err := api.PatchDomainAllowlist(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DomainAllowlistApi.PatchDomainAllowlist`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DomainAllowlistApi.PatchDomainAllowlist`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Sets Domain Allowlist returns "OK" response ``` // Sets Domain Allowlist returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DomainAllowlistApi; import com.datadog.api.client.v2.model.DomainAllowlist; import com.datadog.api.client.v2.model.DomainAllowlistAttributes; import com.datadog.api.client.v2.model.DomainAllowlistRequest; import com.datadog.api.client.v2.model.DomainAllowlistResponse; import com.datadog.api.client.v2.model.DomainAllowlistType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DomainAllowlistApi apiInstance = new DomainAllowlistApi(defaultClient); DomainAllowlistRequest 
body = new DomainAllowlistRequest() .data( new DomainAllowlist() .attributes( new DomainAllowlistAttributes() .domains(Collections.singletonList("@static-test-domain.test")) .enabled(false)) .type(DomainAllowlistType.DOMAIN_ALLOWLIST)); try { DomainAllowlistResponse result = apiInstance.patchDomainAllowlist(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DomainAllowlistApi#patchDomainAllowlist"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Sets Domain Allowlist returns "OK" response ``` """ Sets Domain Allowlist returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.domain_allowlist_api import DomainAllowlistApi from datadog_api_client.v2.model.domain_allowlist import DomainAllowlist from datadog_api_client.v2.model.domain_allowlist_attributes import DomainAllowlistAttributes from datadog_api_client.v2.model.domain_allowlist_request import DomainAllowlistRequest from datadog_api_client.v2.model.domain_allowlist_type import DomainAllowlistType body = DomainAllowlistRequest( data=DomainAllowlist( attributes=DomainAllowlistAttributes( domains=[ "@static-test-domain.test", ], enabled=False, ), type=DomainAllowlistType.DOMAIN_ALLOWLIST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DomainAllowlistApi(api_client) response = api_instance.patch_domain_allowlist(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Sets Domain Allowlist returns "OK" response ``` # Sets Domain Allowlist returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DomainAllowlistAPI.new body = DatadogAPIClient::V2::DomainAllowlistRequest.new({ data: DatadogAPIClient::V2::DomainAllowlist.new({ attributes: DatadogAPIClient::V2::DomainAllowlistAttributes.new({ domains: [ "@static-test-domain.test", ], enabled: false, }), type: DatadogAPIClient::V2::DomainAllowlistType::DOMAIN_ALLOWLIST, }), }) p api_instance.patch_domain_allowlist(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Sets Domain Allowlist returns "OK" response ``` // Sets Domain Allowlist returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_domain_allowlist::DomainAllowlistAPI; use 
datadog_api_client::datadogV2::model::DomainAllowlist; use datadog_api_client::datadogV2::model::DomainAllowlistAttributes; use datadog_api_client::datadogV2::model::DomainAllowlistRequest; use datadog_api_client::datadogV2::model::DomainAllowlistType; #[tokio::main] async fn main() { let body = DomainAllowlistRequest::new( DomainAllowlist::new(DomainAllowlistType::DOMAIN_ALLOWLIST).attributes( DomainAllowlistAttributes::new() .domains(vec!["@static-test-domain.test".to_string()]) .enabled(false), ), ); let configuration = datadog::Configuration::new(); let api = DomainAllowlistAPI::with_config(configuration); let resp = api.patch_domain_allowlist(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Sets Domain Allowlist returns "OK" response ``` /** * Sets Domain Allowlist returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DomainAllowlistApi(configuration); const params: v2.DomainAllowlistApiPatchDomainAllowlistRequest = { body: { data: { attributes: { domains: ["@static-test-domain.test"], enabled: false, }, type: "domain_allowlist", }, }, }; apiInstance .patchDomainAllowlist(params) .then((data: v2.DomainAllowlistResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source:
https://docs.datadoghq.com/api/latest/dora-metrics/ # DORA Metrics Search, send, or delete events for DORA Metrics to measure and improve your software delivery performance. See the [DORA Metrics page](https://docs.datadoghq.com/dora_metrics/) for more information. **Note** : DORA Metrics are not available in the US1-FED site. ## [Send a deployment event](https://docs.datadoghq.com/api/latest/dora-metrics/#send-a-deployment-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#send-a-deployment-event-v2) POST https://api.ap1.datadoghq.com/api/v2/dora/deploymenthttps://api.ap2.datadoghq.com/api/v2/dora/deploymenthttps://api.datadoghq.eu/api/v2/dora/deploymenthttps://api.ddog-gov.com/api/v2/dora/deploymenthttps://api.datadoghq.com/api/v2/dora/deploymenthttps://api.us3.datadoghq.com/api/v2/dora/deploymenthttps://api.us5.datadoghq.com/api/v2/dora/deployment ### Overview Use this API endpoint to provide deployment data. This is necessary for: * Deployment Frequency * Change Lead Time * Change Failure Rate ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Field Type Description data [_required_] object The JSON:API data. attributes [_required_] object Attributes to create a DORA deployment event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name to where the service was deployed. finished_at [_required_] int64 Unix timestamp when the deployment finished. It must be in nanoseconds, milliseconds, or seconds. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL id string Deployment ID. Must be 16-128 characters and contain only alphanumeric characters, hyphens, underscores, periods, and colons (a-z, A-Z, 0-9, -, _, ., :). service [_required_] string Service name. started_at [_required_] int64 Unix timestamp when the deployment started. It must be in nanoseconds, milliseconds, or seconds. team string Name of the team owning the deployed service. If not provided, this is automatically populated with the team associated with the service in the Service Catalog. version string Version to correlate with [APM Deployment Tracking](https://docs.datadoghq.com/tracing/services/deployment_tracking/). ``` { "data": { "attributes": { "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "service": "shopist", "started_at": 1693491974000000000, "version": "v1.12.07" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORADeployment-200-v2) * [202](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORADeployment-202-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORADeployment-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORADeployment-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORADeployment-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA deployment event. Field Type Description data [_required_] object The JSON:API data. id [_required_] string The ID of the received DORA deployment event. 
type enum JSON:API type for DORA deployment events. Allowed enum values: `dora_deployment` default: `dora_deployment` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_deployment" } } ``` Copy OK - but delayed due to incident * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA deployment event. Field Type Description data [_required_] object The JSON:API data. id [_required_] string The ID of the received DORA deployment event. type enum JSON:API type for DORA deployment events. Allowed enum values: `dora_deployment` default: `dora_deployment` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_deployment" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Send a deployment event returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/deployment" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "service": "shopist", "started_at": 1693491974000000000, "version": "v1.12.07" } } } EOF ``` ##### Send a deployment event returns "OK" response ``` // Send a deployment event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DORADeploymentRequest{ Data: datadogV2.DORADeploymentRequestData{ Attributes: datadogV2.DORADeploymentRequestAttributes{ FinishedAt: 1693491984000000000, Git: &datadogV2.DORAGitInfo{ CommitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", RepositoryUrl: "https://github.com/organization/example-repository", }, Service: "shopist", StartedAt: 1693491974000000000, Version: datadog.PtrString("v1.12.07"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.CreateDORADeployment(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.CreateDORADeployment`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.CreateDORADeployment`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send a deployment event returns "OK" response ``` // Send a deployment event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORADeploymentRequest; import com.datadog.api.client.v2.model.DORADeploymentRequestAttributes; import com.datadog.api.client.v2.model.DORADeploymentRequestData; import com.datadog.api.client.v2.model.DORADeploymentResponse; import com.datadog.api.client.v2.model.DORAGitInfo; public 
class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); DORADeploymentRequest body = new DORADeploymentRequest() .data( new DORADeploymentRequestData() .attributes( new DORADeploymentRequestAttributes() .finishedAt(1693491984000000000L) .git( new DORAGitInfo() .commitSha("66adc9350f2cc9b250b69abddab733dd55e1a588") .repositoryUrl( "https://github.com/organization/example-repository")) .service("shopist") .startedAt(1693491974000000000L) .version("v1.12.07"))); try { DORADeploymentResponse result = apiInstance.createDORADeployment(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#createDORADeployment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send a deployment event returns "OK" response ``` """ Send a deployment event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi from datadog_api_client.v2.model.dora_deployment_request import DORADeploymentRequest from datadog_api_client.v2.model.dora_deployment_request_attributes import DORADeploymentRequestAttributes from datadog_api_client.v2.model.dora_deployment_request_data import DORADeploymentRequestData from datadog_api_client.v2.model.dora_git_info import DORAGitInfo body = DORADeploymentRequest( data=DORADeploymentRequestData( attributes=DORADeploymentRequestAttributes( finished_at=1693491984000000000, git=DORAGitInfo( commit_sha="66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url="https://github.com/organization/example-repository", ), service="shopist", started_at=1693491974000000000, version="v1.12.07", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.create_dora_deployment(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send a deployment event returns "OK" response ``` # Send a deployment event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new body = DatadogAPIClient::V2::DORADeploymentRequest.new({ data: DatadogAPIClient::V2::DORADeploymentRequestData.new({ attributes: DatadogAPIClient::V2::DORADeploymentRequestAttributes.new({ finished_at: 1693491984000000000, git: DatadogAPIClient::V2::DORAGitInfo.new({ commit_sha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url: "https://github.com/organization/example-repository", }), service: "shopist", started_at: 1693491974000000000, version: "v1.12.07", }), }), }) p 
api_instance.create_dora_deployment(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send a deployment event returns "OK" response ``` // Send a deployment event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; use datadog_api_client::datadogV2::model::DORADeploymentRequest; use datadog_api_client::datadogV2::model::DORADeploymentRequestAttributes; use datadog_api_client::datadogV2::model::DORADeploymentRequestData; use datadog_api_client::datadogV2::model::DORAGitInfo; #[tokio::main] async fn main() { let body = DORADeploymentRequest::new(DORADeploymentRequestData::new( DORADeploymentRequestAttributes::new( 1693491984000000000, "shopist".to_string(), 1693491974000000000, ) .git(DORAGitInfo::new( "66adc9350f2cc9b250b69abddab733dd55e1a588".to_string(), "https://github.com/organization/example-repository".to_string(), )) .version("v1.12.07".to_string()), )); let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.create_dora_deployment(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send a deployment event returns "OK" response ``` /** * Send a deployment event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiCreateDORADeploymentRequest = { body: { data: { attributes: { finishedAt: 1693491984000000000, git: { commitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repositoryUrl: "https://github.com/organization/example-repository", }, service: "shopist", startedAt: 1693491974000000000, version: "v1.12.07", }, }, }, }; apiInstance .createDORADeployment(params) .then((data: v2.DORADeploymentResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Send a failure event](https://docs.datadoghq.com/api/latest/dora-metrics/#send-a-failure-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#send-a-failure-event-v2) POST https://api.ap1.datadoghq.com/api/v2/dora/failurehttps://api.ap2.datadoghq.com/api/v2/dora/failurehttps://api.datadoghq.eu/api/v2/dora/failurehttps://api.ddog-gov.com/api/v2/dora/failurehttps://api.datadoghq.com/api/v2/dora/failurehttps://api.us3.datadoghq.com/api/v2/dora/failurehttps://api.us5.datadoghq.com/api/v2/dora/failure ### Overview Use this API endpoint to provide failure data. This is necessary for: * Change Failure Rate * Time to Restore ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Field Type Description data [_required_] object The JSON:API data. attributes [_required_] object Attributes to create a DORA failure event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name that was impacted by the failure. finished_at int64 Unix timestamp when the failure finished. It must be in nanoseconds, milliseconds, or seconds. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL id string Failure ID. Must be 16-128 characters and contain only alphanumeric characters, hyphens, underscores, periods, and colons (a-z, A-Z, 0-9, -, _, ., :). name string Failure name. services [string] Service names impacted by the failure. If possible, use names registered in the Service Catalog. Required when the team field is not provided. severity string Failure severity. started_at [_required_] int64 Unix timestamp when the failure started. It must be in nanoseconds, milliseconds, or seconds. team string Name of the team owning the services impacted. If possible, use team handles registered in Datadog. Required when the services field is not provided. version string Version to correlate with [APM Deployment Tracking](https://docs.datadoghq.com/tracing/services/deployment_tracking/). 
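One detail worth calling out before the example payload below: `started_at` and `finished_at` accept Unix timestamps in seconds, milliseconds, or nanoseconds. The following sketch is an illustration only, built on the same `DORAFailureRequest` models used in the Python code example for this endpoint; it uses `time.time_ns()` to produce a nanosecond timestamp and sends a minimal failure event (`started_at` plus `services`, since no `team` is provided). The service name `shopist` and the `High` severity are the placeholders used elsewhere on this page.

```
# Minimal sketch: send a DORA failure event with a nanosecond timestamp.
import time

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi
from datadog_api_client.v2.model.dora_failure_request import DORAFailureRequest
from datadog_api_client.v2.model.dora_failure_request_attributes import DORAFailureRequestAttributes
from datadog_api_client.v2.model.dora_failure_request_data import DORAFailureRequestData

body = DORAFailureRequest(
    data=DORAFailureRequestData(
        attributes=DORAFailureRequestAttributes(
            started_at=time.time_ns(),  # Unix time in nanoseconds; seconds or milliseconds also work
            services=["shopist"],  # required here because no team is provided
            severity="High",
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    response = DORAMetricsApi(api_client).create_dora_failure(body=body)
    print(response)
```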
``` { "data": { "attributes": { "custom_tags": [ "language:java", "department:engineering" ], "env": "staging", "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "id": "string", "name": "Webserver is down failing all requests.", "services": [ "shopist" ], "severity": "High", "started_at": 1693491974000000000, "team": "backend", "version": "v1.12.07" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAFailure-200-v2) * [202](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAFailure-202-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAFailure-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAFailure-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAFailure-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA failure event. Field Type Description data [_required_] object Response after receiving a DORA failure event. id [_required_] string The ID of the received DORA failure event. type enum JSON:API type for DORA failure events. Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_failure" } } ``` Copy OK - but delayed due to incident * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA failure event. Field Type Description data [_required_] object Response after receiving a DORA failure event. id [_required_] string The ID of the received DORA failure event. type enum JSON:API type for DORA failure events. Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_failure" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Send a failure event Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/failure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "started_at": 1693491974000000000 } } } EOF ``` ##### Send a failure event ``` """ Send a failure event returns "OK - but delayed due to incident" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi from datadog_api_client.v2.model.dora_failure_request import DORAFailureRequest from datadog_api_client.v2.model.dora_failure_request_attributes import DORAFailureRequestAttributes from datadog_api_client.v2.model.dora_failure_request_data import DORAFailureRequestData from datadog_api_client.v2.model.dora_git_info import DORAGitInfo body = DORAFailureRequest( data=DORAFailureRequestData( attributes=DORAFailureRequestAttributes( custom_tags=[ "language:java", "department:engineering", ], env="staging", finished_at=1693491984000000000, git=DORAGitInfo( commit_sha="66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url="https://github.com/organization/example-repository", ), name="Webserver is down failing all requests.", services=[ "shopist", ], severity="High", started_at=1693491974000000000, team="backend", version="v1.12.07", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.create_dora_failure(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send a failure event ``` # Send a failure event returns "OK - but delayed due to incident" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new body = DatadogAPIClient::V2::DORAFailureRequest.new({ data: DatadogAPIClient::V2::DORAFailureRequestData.new({ attributes: DatadogAPIClient::V2::DORAFailureRequestAttributes.new({ custom_tags: [ "language:java", "department:engineering", ], env: "staging", finished_at: 1693491984000000000, git: DatadogAPIClient::V2::DORAGitInfo.new({ commit_sha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url: 
"https://github.com/organization/example-repository", }), name: "Webserver is down failing all requests.", services: [ "shopist", ], severity: "High", started_at: 1693491974000000000, team: "backend", version: "v1.12.07", }), }), }) p api_instance.create_dora_failure(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send a failure event ``` // Send a failure event returns "OK - but delayed due to incident" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DORAFailureRequest{ Data: datadogV2.DORAFailureRequestData{ Attributes: datadogV2.DORAFailureRequestAttributes{ CustomTags: *datadog.NewNullableList(&[]string{ "language:java", "department:engineering", }), Env: datadog.PtrString("staging"), FinishedAt: datadog.PtrInt64(1693491984000000000), Git: &datadogV2.DORAGitInfo{ CommitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", RepositoryUrl: "https://github.com/organization/example-repository", }, Name: datadog.PtrString("Webserver is down failing all requests."), Services: []string{ "shopist", }, Severity: datadog.PtrString("High"), StartedAt: 1693491974000000000, Team: datadog.PtrString("backend"), Version: datadog.PtrString("v1.12.07"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.CreateDORAFailure(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.CreateDORAFailure`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.CreateDORAFailure`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send a failure event ``` // Send a failure event returns "OK - but delayed due to incident" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAFailureRequest; import com.datadog.api.client.v2.model.DORAFailureRequestAttributes; import com.datadog.api.client.v2.model.DORAFailureRequestData; import com.datadog.api.client.v2.model.DORAFailureResponse; import com.datadog.api.client.v2.model.DORAGitInfo; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); DORAFailureRequest body = new DORAFailureRequest() .data( new DORAFailureRequestData() .attributes( new DORAFailureRequestAttributes() .customTags(Arrays.asList("language:java", 
"department:engineering")) .env("staging") .finishedAt(1693491984000000000L) .git( new DORAGitInfo() .commitSha("66adc9350f2cc9b250b69abddab733dd55e1a588") .repositoryUrl( "https://github.com/organization/example-repository")) .name("Webserver is down failing all requests.") .services(Collections.singletonList("shopist")) .severity("High") .startedAt(1693491974000000000L) .team("backend") .version("v1.12.07"))); try { DORAFailureResponse result = apiInstance.createDORAFailure(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#createDORAFailure"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send a failure event ``` // Send a failure event returns "OK - but delayed due to incident" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; use datadog_api_client::datadogV2::model::DORAFailureRequest; use datadog_api_client::datadogV2::model::DORAFailureRequestAttributes; use datadog_api_client::datadogV2::model::DORAFailureRequestData; use datadog_api_client::datadogV2::model::DORAGitInfo; #[tokio::main] async fn main() { let body = DORAFailureRequest::new(DORAFailureRequestData::new( DORAFailureRequestAttributes::new(1693491974000000000) .custom_tags(Some(vec![ "language:java".to_string(), "department:engineering".to_string(), ])) .env("staging".to_string()) .finished_at(1693491984000000000) .git(DORAGitInfo::new( "66adc9350f2cc9b250b69abddab733dd55e1a588".to_string(), "https://github.com/organization/example-repository".to_string(), )) .name("Webserver is down failing all requests.".to_string()) .services(vec!["shopist".to_string()]) .severity("High".to_string()) .team("backend".to_string()) .version("v1.12.07".to_string()), )); let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.create_dora_failure(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send a failure event ``` /** * Send a failure event returns "OK - but delayed due to incident" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiCreateDORAFailureRequest = { body: { data: { attributes: { customTags: ["language:java", "department:engineering"], env: "staging", finishedAt: 1693491984000000000, git: { commitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repositoryUrl: "https://github.com/organization/example-repository", }, name: "Webserver is down failing all 
requests.", services: ["shopist"], severity: "High", startedAt: 1693491974000000000, team: "backend", version: "v1.12.07", }, }, }, }; apiInstance .createDORAFailure(params) .then((data: v2.DORAFailureResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Send an incident event](https://docs.datadoghq.com/api/latest/dora-metrics/#send-an-incident-event) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/dora-metrics/#send-an-incident-event-v2) POST https://api.ap1.datadoghq.com/api/v2/dora/incidenthttps://api.ap2.datadoghq.com/api/v2/dora/incidenthttps://api.datadoghq.eu/api/v2/dora/incidenthttps://api.ddog-gov.com/api/v2/dora/incidenthttps://api.datadoghq.com/api/v2/dora/incidenthttps://api.us3.datadoghq.com/api/v2/dora/incidenthttps://api.us5.datadoghq.com/api/v2/dora/incident ### Overview **Note** : This endpoint is deprecated. Please use `/api/v2/dora/failure` instead. Use this API endpoint to provide failure data. This is necessary for: * Change Failure Rate * Time to Restore ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Field Type Description data [_required_] object The JSON:API data. attributes [_required_] object Attributes to create a DORA failure event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name that was impacted by the failure. finished_at int64 Unix timestamp when the failure finished. It must be in nanoseconds, milliseconds, or seconds. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL id string Failure ID. Must be 16-128 characters and contain only alphanumeric characters, hyphens, underscores, periods, and colons (a-z, A-Z, 0-9, -, _, ., :). name string Failure name. services [string] Service names impacted by the failure. If possible, use names registered in the Service Catalog. Required when the team field is not provided. severity string Failure severity. started_at [_required_] int64 Unix timestamp when the failure started. It must be in nanoseconds, milliseconds, or seconds. team string Name of the team owning the services impacted. If possible, use team handles registered in Datadog. Required when the services field is not provided. version string Version to correlate with [APM Deployment Tracking](https://docs.datadoghq.com/tracing/services/deployment_tracking/). 
``` { "data": { "attributes": { "finished_at": 1707842944600000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "name": "Webserver is down failing all requests", "services": [ "shopist" ], "severity": "High", "started_at": 1707842944500000000, "team": "backend", "version": "v1.12.07" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAIncident-200-v2) * [202](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAIncident-202-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAIncident-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAIncident-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#CreateDORAIncident-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA failure event. Field Type Description data [_required_] object Response after receiving a DORA failure event. id [_required_] string The ID of the received DORA failure event. type enum JSON:API type for DORA failure events. Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_failure" } } ``` Copy OK - but delayed due to incident * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response after receiving a DORA failure event. Field Type Description data [_required_] object Response after receiving a DORA failure event. id [_required_] string The ID of the received DORA failure event. type enum JSON:API type for DORA failure events. Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": { "id": "4242fcdd31586083", "type": "dora_failure" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Send a failure event returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/incident" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "data": { "attributes": { "finished_at": 1707842944600000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "name": "Webserver is down failing all requests", "services": [ "shopist" ], "severity": "High", "started_at": 1707842944500000000, "team": "backend", "version": "v1.12.07" } } } EOF ``` ##### Send a failure event returns "OK" response ``` // Send a failure event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DORAFailureRequest{ Data: datadogV2.DORAFailureRequestData{ Attributes: datadogV2.DORAFailureRequestAttributes{ FinishedAt: datadog.PtrInt64(1707842944600000000), Git: &datadogV2.DORAGitInfo{ CommitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", RepositoryUrl: "https://github.com/organization/example-repository", }, Name: datadog.PtrString("Webserver is down failing all requests"), Services: []string{ "shopist", }, Severity: datadog.PtrString("High"), StartedAt: 1707842944500000000, Team: datadog.PtrString("backend"), Version: datadog.PtrString("v1.12.07"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.CreateDORAIncident(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.CreateDORAIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.CreateDORAIncident`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send a failure event returns "OK" response ``` // Send a failure event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAFailureRequest; import 
com.datadog.api.client.v2.model.DORAFailureRequestAttributes; import com.datadog.api.client.v2.model.DORAFailureRequestData; import com.datadog.api.client.v2.model.DORAFailureResponse; import com.datadog.api.client.v2.model.DORAGitInfo; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); DORAFailureRequest body = new DORAFailureRequest() .data( new DORAFailureRequestData() .attributes( new DORAFailureRequestAttributes() .finishedAt(1707842944600000000L) .git( new DORAGitInfo() .commitSha("66adc9350f2cc9b250b69abddab733dd55e1a588") .repositoryUrl( "https://github.com/organization/example-repository")) .name("Webserver is down failing all requests") .services(Collections.singletonList("shopist")) .severity("High") .startedAt(1707842944500000000L) .team("backend") .version("v1.12.07"))); try { DORAFailureResponse result = apiInstance.createDORAIncident(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#createDORAIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send a failure event returns "OK" response ``` """ Send a failure event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi from datadog_api_client.v2.model.dora_failure_request import DORAFailureRequest from datadog_api_client.v2.model.dora_failure_request_attributes import DORAFailureRequestAttributes from datadog_api_client.v2.model.dora_failure_request_data import DORAFailureRequestData from datadog_api_client.v2.model.dora_git_info import DORAGitInfo body = DORAFailureRequest( data=DORAFailureRequestData( attributes=DORAFailureRequestAttributes( finished_at=1707842944600000000, git=DORAGitInfo( commit_sha="66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url="https://github.com/organization/example-repository", ), name="Webserver is down failing all requests", services=[ "shopist", ], severity="High", started_at=1707842944500000000, team="backend", version="v1.12.07", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.create_dora_incident(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send a failure event returns "OK" response ``` # Send a failure event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new body = DatadogAPIClient::V2::DORAFailureRequest.new({ data: 
DatadogAPIClient::V2::DORAFailureRequestData.new({ attributes: DatadogAPIClient::V2::DORAFailureRequestAttributes.new({ finished_at: 1707842944600000000, git: DatadogAPIClient::V2::DORAGitInfo.new({ commit_sha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repository_url: "https://github.com/organization/example-repository", }), name: "Webserver is down failing all requests", services: [ "shopist", ], severity: "High", started_at: 1707842944500000000, team: "backend", version: "v1.12.07", }), }), }) p api_instance.create_dora_incident(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send a failure event returns "OK" response ``` // Send a failure event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; use datadog_api_client::datadogV2::model::DORAFailureRequest; use datadog_api_client::datadogV2::model::DORAFailureRequestAttributes; use datadog_api_client::datadogV2::model::DORAFailureRequestData; use datadog_api_client::datadogV2::model::DORAGitInfo; #[tokio::main] async fn main() { let body = DORAFailureRequest::new(DORAFailureRequestData::new( DORAFailureRequestAttributes::new(1707842944500000000) .finished_at(1707842944600000000) .git(DORAGitInfo::new( "66adc9350f2cc9b250b69abddab733dd55e1a588".to_string(), "https://github.com/organization/example-repository".to_string(), )) .name("Webserver is down failing all requests".to_string()) .services(vec!["shopist".to_string()]) .severity("High".to_string()) .team("backend".to_string()) .version("v1.12.07".to_string()), )); let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.create_dora_incident(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send a failure event returns "OK" response ``` /** * Send a failure event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiCreateDORAIncidentRequest = { body: { data: { attributes: { finishedAt: 1707842944600000000, git: { commitSha: "66adc9350f2cc9b250b69abddab733dd55e1a588", repositoryUrl: "https://github.com/organization/example-repository", }, name: "Webserver is down failing all requests", services: ["shopist"], severity: "High", startedAt: 1707842944500000000, team: "backend", version: "v1.12.07", }, }, }, }; apiInstance .createDORAIncident(params) .then((data: v2.DORAFailureResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Get a list of deployment events](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-list-of-deployment-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-list-of-deployment-events-v2) POST https://api.ap1.datadoghq.com/api/v2/dora/deploymentshttps://api.ap2.datadoghq.com/api/v2/dora/deploymentshttps://api.datadoghq.eu/api/v2/dora/deploymentshttps://api.ddog-gov.com/api/v2/dora/deploymentshttps://api.datadoghq.com/api/v2/dora/deploymentshttps://api.us3.datadoghq.com/api/v2/dora/deploymentshttps://api.us5.datadoghq.com/api/v2/dora/deployments ### Overview Use this API endpoint to get a list of deployment events. This endpoint requires the `dora_metrics_read` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Field Type Description data [_required_] object The JSON:API data. attributes [_required_] object Attributes to get a list of deployments. from date-time Minimum timestamp for requested events. limit int32 Maximum number of events in the response. default: `10` query string Search query with event platform syntax. sort string Sort order (prefixed with `-` for descending). to date-time Maximum timestamp for requested events. type enum The definition of `DORAListDeploymentsRequestDataType` object. Allowed enum values: `dora_deployments_list_request` default: `dora_deployments_list_request` ``` { "data": { "attributes": { "from": "2025-01-01T00:00:00Z", "limit": 500, "query": "service:(shopist OR api-service OR payment-service) env:(production OR staging) team:(backend OR platform)", "sort": "-started_at", "to": "2025-01-31T23:59:59Z" }, "type": "dora_deployments_list_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORADeployments-200-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORADeployments-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORADeployments-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORADeployments-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response for the list deployments endpoint. Field Type Description data [object] The list of DORA deployment events. attributes object The attributes of the deployment event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name to where the service was deployed. finished_at [_required_] int64 Unix timestamp when the deployment finished. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL service [_required_] string Service name. started_at [_required_] int64 Unix timestamp when the deployment started. team string Name of the team owning the deployed service. 
version string Version to correlate with APM Deployment Tracking. id string The ID of the deployment event. type enum JSON:API type for DORA deployment events. Allowed enum values: `dora_deployment` default: `dora_deployment` ``` { "data": [ { "attributes": { "custom_tags": [ "language:java", "department:engineering", "region:us-east-1" ], "env": "production", "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "service": "shopist", "started_at": 1693491974000000000, "team": "backend", "version": "v1.12.07" }, "id": "4242fcdd31586083", "type": "dora_deployment" }, { "attributes": { "custom_tags": [ "language:go", "department:platform" ], "env": "production", "finished_at": 1693492084000000000, "git": { "commit_sha": "77bdc9350f2cc9b250b69abddab733dd55e1a599", "repository_url": "https://github.com/organization/api-service" }, "service": "api-service", "started_at": 1693492074000000000, "team": "backend", "version": "v2.1.0" }, "id": "4242fcdd31586084", "type": "dora_deployment" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Get a list of deployment events Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/deployments" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": {} } } EOF ``` ##### Get a list of deployment events ``` """ Get a list of deployment events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi from datadog_api_client.v2.model.dora_list_deployments_request import DORAListDeploymentsRequest from datadog_api_client.v2.model.dora_list_deployments_request_attributes import DORAListDeploymentsRequestAttributes from datadog_api_client.v2.model.dora_list_deployments_request_data import DORAListDeploymentsRequestData from datadog_api_client.v2.model.dora_list_deployments_request_data_type import DORAListDeploymentsRequestDataType from datetime import datetime from dateutil.tz import tzutc body = DORAListDeploymentsRequest( data=DORAListDeploymentsRequestData( attributes=DORAListDeploymentsRequestAttributes( _from=datetime(2025, 3, 23, 0, 0, tzinfo=tzutc()), limit=1, to=datetime(2025, 3, 24, 0, 0, tzinfo=tzutc()), ), type=DORAListDeploymentsRequestDataType.DORA_DEPLOYMENTS_LIST_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.list_dora_deployments(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of deployment events ``` # Get a list of deployment events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new body = DatadogAPIClient::V2::DORAListDeploymentsRequest.new({ data: DatadogAPIClient::V2::DORAListDeploymentsRequestData.new({ attributes: DatadogAPIClient::V2::DORAListDeploymentsRequestAttributes.new({ from: "2025-03-23T00:00:00Z", limit: 1, to: "2025-03-24T00:00:00Z", }), type: DatadogAPIClient::V2::DORAListDeploymentsRequestDataType::DORA_DEPLOYMENTS_LIST_REQUEST, }), }) p api_instance.list_dora_deployments(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
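# Set DD_SITE to the single Datadog site you use, for example datadoghq.com,
# datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com,
# ap2.datadoghq.com, or ddog-gov.com.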
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of deployment events ``` // Get a list of deployment events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DORAListDeploymentsRequest{ Data: datadogV2.DORAListDeploymentsRequestData{ Attributes: datadogV2.DORAListDeploymentsRequestAttributes{ From: datadog.PtrTime(time.Date(2025, 3, 23, 0, 0, 0, 0, time.UTC)), Limit: datadog.PtrInt32(1), To: datadog.PtrTime(time.Date(2025, 3, 24, 0, 0, 0, 0, time.UTC)), }, Type: datadogV2.DORALISTDEPLOYMENTSREQUESTDATATYPE_DORA_DEPLOYMENTS_LIST_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.ListDORADeployments(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.ListDORADeployments`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.ListDORADeployments`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of deployment events ``` // Get a list of deployment events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAListDeploymentsRequest; import com.datadog.api.client.v2.model.DORAListDeploymentsRequestAttributes; import com.datadog.api.client.v2.model.DORAListDeploymentsRequestData; import com.datadog.api.client.v2.model.DORAListDeploymentsRequestDataType; import com.datadog.api.client.v2.model.DORAListResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); DORAListDeploymentsRequest body = new DORAListDeploymentsRequest() .data( new DORAListDeploymentsRequestData() .attributes( new DORAListDeploymentsRequestAttributes() .from(OffsetDateTime.parse("2025-03-23T00:00:00Z")) .limit(1) .to(OffsetDateTime.parse("2025-03-24T00:00:00Z"))) .type(DORAListDeploymentsRequestDataType.DORA_DEPLOYMENTS_LIST_REQUEST)); try { DORAListResponse result = apiInstance.listDORADeployments(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#listDORADeployments"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of deployment events ``` // Get a list of deployment events returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; use datadog_api_client::datadogV2::model::DORAListDeploymentsRequest; use datadog_api_client::datadogV2::model::DORAListDeploymentsRequestAttributes; use datadog_api_client::datadogV2::model::DORAListDeploymentsRequestData; use datadog_api_client::datadogV2::model::DORAListDeploymentsRequestDataType; #[tokio::main] async fn main() { let body = DORAListDeploymentsRequest::new( DORAListDeploymentsRequestData::new( DORAListDeploymentsRequestAttributes::new() .from( DateTime::parse_from_rfc3339("2025-03-23T00:00:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .limit(1) .to(DateTime::parse_from_rfc3339("2025-03-24T00:00:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc)), ) .type_(DORAListDeploymentsRequestDataType::DORA_DEPLOYMENTS_LIST_REQUEST), ); let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.list_dora_deployments(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of deployment events ``` /** * Get a list of deployment events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiListDORADeploymentsRequest = { body: { data: { attributes: { from: new Date(2025, 3, 23, 0, 0, 0, 0), limit: 1, to: new Date(2025, 3, 24, 0, 0, 0, 0), }, type: "dora_deployments_list_request", }, }, }; apiInstance .listDORADeployments(params) .then((data: v2.DORAListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of failure events](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-list-of-failure-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-list-of-failure-events-v2) POST https://api.ap1.datadoghq.com/api/v2/dora/failureshttps://api.ap2.datadoghq.com/api/v2/dora/failureshttps://api.datadoghq.eu/api/v2/dora/failureshttps://api.ddog-gov.com/api/v2/dora/failureshttps://api.datadoghq.com/api/v2/dora/failureshttps://api.us3.datadoghq.com/api/v2/dora/failureshttps://api.us5.datadoghq.com/api/v2/dora/failures ### Overview Use this API endpoint to get a list of failure events. This endpoint requires the `dora_metrics_read` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Field Type Description data [_required_] object The JSON:API data. attributes [_required_] object Attributes to get a list of failures. from date-time Minimum timestamp for requested events. limit int32 Maximum number of events in the response. default: `10` query string Search query with event platform syntax. sort string Sort order (prefixed with `-` for descending). to date-time Maximum timestamp for requested events. type enum The definition of `DORAListFailuresRequestDataType` object. Allowed enum values: `dora_failures_list_request` default: `dora_failures_list_request` ``` { "data": { "attributes": { "from": "2025-01-01T00:00:00Z", "limit": 500, "query": "severity:(SEV-1 OR SEV-2) env:(production OR staging) service:(shopist OR api-service OR payment-service) team:(backend OR platform OR payments)", "sort": "-started_at", "to": "2025-01-31T23:59:59Z" }, "type": "dora_failures_list_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORAFailures-200-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORAFailures-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORAFailures-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#ListDORAFailures-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response for the list failures endpoint. Field Type Description data [object] The list of DORA incident events. attributes object The attributes of the incident event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name that was impacted by the incident. finished_at int64 Unix timestamp when the incident finished. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL name string Incident name. services [string] Service names impacted by the incident. severity string Incident severity. started_at [_required_] int64 Unix timestamp when the incident started. team string Name of the team owning the services impacted. 
version string Version to correlate with APM Deployment Tracking. id string The ID of the incident event. type enum JSON:API type for DORA failure events. Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": [ { "attributes": { "custom_tags": [ "incident_type:database", "department:engineering" ], "env": "production", "finished_at": 1693492274000000000, "name": "Database outage", "services": [ "shopist" ], "severity": "SEV-1", "started_at": 1693492174000000000, "team": "backend" }, "id": "4242fcdd31586085", "type": "dora_incident" }, { "attributes": { "custom_tags": [ "incident_type:service_down", "department:platform" ], "env": "production", "finished_at": 1693492474000000000, "name": "API service outage", "services": [ "api-service", "payment-service" ], "severity": "SEV-2", "started_at": 1693492374000000000, "team": "backend" }, "id": "4242fcdd31586086", "type": "dora_incident" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Get a list of failure events Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/failures" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": {} } } EOF ``` ##### Get a list of failure events ``` """ Get a list of failure events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi from datadog_api_client.v2.model.dora_list_failures_request import DORAListFailuresRequest from datadog_api_client.v2.model.dora_list_failures_request_attributes import DORAListFailuresRequestAttributes from datadog_api_client.v2.model.dora_list_failures_request_data import DORAListFailuresRequestData from datadog_api_client.v2.model.dora_list_failures_request_data_type import DORAListFailuresRequestDataType from datetime import datetime from dateutil.tz import tzutc body = DORAListFailuresRequest( data=DORAListFailuresRequestData( attributes=DORAListFailuresRequestAttributes( _from=datetime(2025, 3, 23, 0, 0, tzinfo=tzutc()), limit=1, to=datetime(2025, 3, 24, 0, 0, tzinfo=tzutc()), ), type=DORAListFailuresRequestDataType.DORA_FAILURES_LIST_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.list_dora_failures(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of failure events ``` # Get a list of failure events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new body = DatadogAPIClient::V2::DORAListFailuresRequest.new({ data: DatadogAPIClient::V2::DORAListFailuresRequestData.new({ attributes: DatadogAPIClient::V2::DORAListFailuresRequestAttributes.new({ from: "2025-03-23T00:00:00Z", limit: 1, to: "2025-03-24T00:00:00Z", }), type: DatadogAPIClient::V2::DORAListFailuresRequestDataType::DORA_FAILURES_LIST_REQUEST, }), }) p api_instance.list_dora_failures(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
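# This list endpoint requires the dora_metrics_read permission; the client
# authenticates with both DD_API_KEY and an application key (DD_APP_KEY).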
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of failure events ``` // Get a list of failure events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DORAListFailuresRequest{ Data: datadogV2.DORAListFailuresRequestData{ Attributes: datadogV2.DORAListFailuresRequestAttributes{ From: datadog.PtrTime(time.Date(2025, 3, 23, 0, 0, 0, 0, time.UTC)), Limit: datadog.PtrInt32(1), To: datadog.PtrTime(time.Date(2025, 3, 24, 0, 0, 0, 0, time.UTC)), }, Type: datadogV2.DORALISTFAILURESREQUESTDATATYPE_DORA_FAILURES_LIST_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.ListDORAFailures(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.ListDORAFailures`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.ListDORAFailures`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of failure events ``` // Get a list of failure events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAListFailuresRequest; import com.datadog.api.client.v2.model.DORAListFailuresRequestAttributes; import com.datadog.api.client.v2.model.DORAListFailuresRequestData; import com.datadog.api.client.v2.model.DORAListFailuresRequestDataType; import com.datadog.api.client.v2.model.DORAListResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); DORAListFailuresRequest body = new DORAListFailuresRequest() .data( new DORAListFailuresRequestData() .attributes( new DORAListFailuresRequestAttributes() .from(OffsetDateTime.parse("2025-03-23T00:00:00Z")) .limit(1) .to(OffsetDateTime.parse("2025-03-24T00:00:00Z"))) .type(DORAListFailuresRequestDataType.DORA_FAILURES_LIST_REQUEST)); try { DORAListResponse result = apiInstance.listDORAFailures(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#listDORAFailures"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of failure events ``` // Get a list of failure events returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; use datadog_api_client::datadogV2::model::DORAListFailuresRequest; use datadog_api_client::datadogV2::model::DORAListFailuresRequestAttributes; use datadog_api_client::datadogV2::model::DORAListFailuresRequestData; use datadog_api_client::datadogV2::model::DORAListFailuresRequestDataType; #[tokio::main] async fn main() { let body = DORAListFailuresRequest::new( DORAListFailuresRequestData::new( DORAListFailuresRequestAttributes::new() .from( DateTime::parse_from_rfc3339("2025-03-23T00:00:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .limit(1) .to(DateTime::parse_from_rfc3339("2025-03-24T00:00:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc)), ) .type_(DORAListFailuresRequestDataType::DORA_FAILURES_LIST_REQUEST), ); let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.list_dora_failures(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of failure events ``` /** * Get a list of failure events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiListDORAFailuresRequest = { body: { data: { attributes: { from: new Date(2025, 3, 23, 0, 0, 0, 0), limit: 1, to: new Date(2025, 3, 24, 0, 0, 0, 0), }, type: "dora_failures_list_request", }, }, }; apiInstance .listDORAFailures(params) .then((data: v2.DORAListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a deployment event](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-deployment-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-deployment-event-v2) GET https://api.ap1.datadoghq.com/api/v2/dora/deployments/{deployment_id}https://api.ap2.datadoghq.com/api/v2/dora/deployments/{deployment_id}https://api.datadoghq.eu/api/v2/dora/deployments/{deployment_id}https://api.ddog-gov.com/api/v2/dora/deployments/{deployment_id}https://api.datadoghq.com/api/v2/dora/deployments/{deployment_id}https://api.us3.datadoghq.com/api/v2/dora/deployments/{deployment_id}https://api.us5.datadoghq.com/api/v2/dora/deployments/{deployment_id} ### Overview Use this API endpoint to get a deployment event. This endpoint requires the `dora_metrics_read` permission. ### Arguments #### Path Parameters Name Type Description deployment_id [_required_] string The ID of the deployment event. ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORADeployment-200-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORADeployment-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORADeployment-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORADeployment-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response for fetching a single deployment event. Field Type Description data object A DORA deployment event. attributes object The attributes of the deployment event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name to where the service was deployed. finished_at [_required_] int64 Unix timestamp when the deployment finished. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL service [_required_] string Service name. started_at [_required_] int64 Unix timestamp when the deployment started. team string Name of the team owning the deployed service. version string Version to correlate with APM Deployment Tracking. id string The ID of the deployment event. type enum JSON:API type for DORA deployment events. Allowed enum values: `dora_deployment` default: `dora_deployment` ``` { "data": { "attributes": { "custom_tags": [ "language:java", "department:engineering" ], "env": "production", "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "service": "shopist", "started_at": 1693491974000000000, "team": "backend", "version": "v1.12.07" }, "id": "string", "type": "dora_deployment" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Get a deployment event Copy ``` # Path parameters export deployment_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/deployments/${deployment_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a deployment event ``` """ Get a deployment event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.get_dora_deployment( deployment_id="deployment_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a deployment event ``` # Get a deployment event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new p api_instance.get_dora_deployment("deployment_id") ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a deployment event ``` // Get a deployment event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.GetDORADeployment(ctx, "deployment_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.GetDORADeployment`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.GetDORADeployment`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a deployment event ``` // Get a deployment event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAFetchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); try { DORAFetchResponse result = apiInstance.getDORADeployment("deployment_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#getDORADeployment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a deployment event ``` // Get a deployment event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.get_dora_deployment("deployment_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a deployment event ``` /** * Get a deployment event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiGetDORADeploymentRequest = { deploymentId: "deployment_id", }; apiInstance .getDORADeployment(params) .then((data: v2.DORAFetchResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a failure event](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-failure-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#get-a-failure-event-v2) GET https://api.ap1.datadoghq.com/api/v2/dora/failures/{failure_id}https://api.ap2.datadoghq.com/api/v2/dora/failures/{failure_id}https://api.datadoghq.eu/api/v2/dora/failures/{failure_id}https://api.ddog-gov.com/api/v2/dora/failures/{failure_id}https://api.datadoghq.com/api/v2/dora/failures/{failure_id}https://api.us3.datadoghq.com/api/v2/dora/failures/{failure_id}https://api.us5.datadoghq.com/api/v2/dora/failures/{failure_id} ### Overview Use this API endpoint to get a failure event. This endpoint requires the `dora_metrics_read` permission. ### Arguments #### Path Parameters Name Type Description failure_id [_required_] string The ID of the failure event. ### Response * [200](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORAFailure-200-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORAFailure-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORAFailure-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#GetDORAFailure-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) Response for fetching a single failure event. Field Type Description data object A DORA incident event. attributes object The attributes of the incident event. custom_tags [string] A list of user-defined tags. The tags must follow the `key:value` pattern. Up to 100 may be added per event. env string Environment name that was impacted by the incident. finished_at int64 Unix timestamp when the incident finished. git object Git info for DORA Metrics events. commit_sha [_required_] string Git Commit SHA. repository_url [_required_] string Git Repository URL name string Incident name. services [string] Service names impacted by the incident. severity string Incident severity. started_at [_required_] int64 Unix timestamp when the incident started. team string Name of the team owning the services impacted. version string Version to correlate with APM Deployment Tracking. id string The ID of the incident event. type enum JSON:API type for DORA failure events. 
Allowed enum values: `dora_failure` default: `dora_failure` ``` { "data": { "attributes": { "custom_tags": [ "language:java", "department:engineering" ], "env": "production", "finished_at": 1693491984000000000, "git": { "commit_sha": "66adc9350f2cc9b250b69abddab733dd55e1a588", "repository_url": "https://github.com/organization/example-repository" }, "name": "Database outage", "services": [ "shopist" ], "severity": "SEV-1", "started_at": 1693491974000000000, "team": "backend", "version": "v1.12.07" }, "id": "string", "type": "dora_failure" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Get a failure event Copy ``` # Path parameters export failure_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/failures/${failure_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a failure event ``` """ Get a failure event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) response = api_instance.get_dora_failure( failure_id="failure_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a failure event ``` # Get a failure event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new p api_instance.get_dora_failure("failure_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a failure event ``` // Get a failure event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) resp, r, err := api.GetDORAFailure(ctx, "failure_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.GetDORAFailure`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DORAMetricsApi.GetDORAFailure`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a failure event ``` // Get a failure event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; import com.datadog.api.client.v2.model.DORAFetchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); try { DORAFetchResponse result = apiInstance.getDORAFailure("failure_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#getDORAFailure"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a failure event ``` // Get a failure event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.get_dora_failure("failure_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a failure event ``` /** * Get a failure event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiGetDORAFailureRequest = { failureId: "failure_id", }; apiInstance .getDORAFailure(params) .then((data: v2.DORAFetchResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a failure event](https://docs.datadoghq.com/api/latest/dora-metrics/#delete-a-failure-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#delete-a-failure-event-v2) DELETE https://api.ap1.datadoghq.com/api/v2/dora/failure/{failure_id}https://api.ap2.datadoghq.com/api/v2/dora/failure/{failure_id}https://api.datadoghq.eu/api/v2/dora/failure/{failure_id}https://api.ddog-gov.com/api/v2/dora/failure/{failure_id}https://api.datadoghq.com/api/v2/dora/failure/{failure_id}https://api.us3.datadoghq.com/api/v2/dora/failure/{failure_id}https://api.us5.datadoghq.com/api/v2/dora/failure/{failure_id} ### Overview Use this API endpoint to delete a failure event. This endpoint requires the `dora_metrics_write` permission. ### Arguments #### Path Parameters Name Type Description failure_id [_required_] string The ID of the failure event to delete. ### Response * [202](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORAFailure-202-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORAFailure-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORAFailure-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORAFailure-429-v2) Accepted Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Delete a failure event Copy ``` # Path parameters export failure_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/failure/${failure_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a failure event ``` """ Delete a failure event returns "Accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) api_instance.delete_dora_failure( failure_id="NO_VALUE", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a failure event ``` # Delete a failure event returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new p api_instance.delete_dora_failure("NO_VALUE") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a failure event ``` // Delete a failure event returns "Accepted" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) r, err := api.DeleteDORAFailure(ctx, "NO_VALUE") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.DeleteDORAFailure`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a failure event ``` // Delete a failure event returns "Accepted" response 
import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); try { apiInstance.deleteDORAFailure("NO_VALUE"); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#deleteDORAFailure"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a failure event ``` // Delete a failure event returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.delete_dora_failure("NO_VALUE".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a failure event ``` /** * Delete a failure event returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiDeleteDORAFailureRequest = { failureId: "NO_VALUE", }; apiInstance .deleteDORAFailure(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a deployment event](https://docs.datadoghq.com/api/latest/dora-metrics/#delete-a-deployment-event) * [v2 (latest)](https://docs.datadoghq.com/api/latest/dora-metrics/#delete-a-deployment-event-v2) DELETE https://api.ap1.datadoghq.com/api/v2/dora/deployment/{deployment_id}https://api.ap2.datadoghq.com/api/v2/dora/deployment/{deployment_id}https://api.datadoghq.eu/api/v2/dora/deployment/{deployment_id}https://api.ddog-gov.com/api/v2/dora/deployment/{deployment_id}https://api.datadoghq.com/api/v2/dora/deployment/{deployment_id}https://api.us3.datadoghq.com/api/v2/dora/deployment/{deployment_id}https://api.us5.datadoghq.com/api/v2/dora/deployment/{deployment_id} ### Overview Use this API endpoint to delete a deployment event. This endpoint requires the `dora_metrics_write` permission. ### Arguments #### Path Parameters Name Type Description deployment_id [_required_] string The ID of the deployment event to delete. ### Response * [202](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORADeployment-202-v2) * [400](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORADeployment-400-v2) * [403](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORADeployment-403-v2) * [429](https://docs.datadoghq.com/api/latest/dora-metrics/#DeleteDORADeployment-429-v2) Accepted Bad Request * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/dora-metrics/) * [Example](https://docs.datadoghq.com/api/latest/dora-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/dora-metrics/?code-lang=typescript) ##### Delete a deployment event Copy ``` # Path parameters export deployment_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/dora/deployment/${deployment_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a deployment event ``` """ Delete a deployment event returns "Accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.dora_metrics_api import DORAMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DORAMetricsApi(api_client) api_instance.delete_dora_deployment( deployment_id="NO_VALUE", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a deployment event ``` # Delete a deployment event returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DORAMetricsAPI.new p api_instance.delete_dora_deployment("NO_VALUE") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a deployment event ``` // Delete a deployment event returns "Accepted" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDORAMetricsApi(apiClient) r, err := api.DeleteDORADeployment(ctx, "NO_VALUE") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DORAMetricsApi.DeleteDORADeployment`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a deployment event ``` // 
Delete a deployment event returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DoraMetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DoraMetricsApi apiInstance = new DoraMetricsApi(defaultClient); try { apiInstance.deleteDORADeployment("NO_VALUE"); } catch (ApiException e) { System.err.println("Exception when calling DoraMetricsApi#deleteDORADeployment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a deployment event ``` // Delete a deployment event returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_dora_metrics::DORAMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DORAMetricsAPI::with_config(configuration); let resp = api.delete_dora_deployment("NO_VALUE".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a deployment event ``` /** * Delete a deployment event returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DORAMetricsApi(configuration); const params: v2.DORAMetricsApiDeleteDORADeploymentRequest = { deploymentId: "NO_VALUE", }; apiInstance .deleteDORADeployment(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=07b74464-a1f6-465c-891d-728892261d90&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=1a019bde-1452-4d90-8b07-4b24923cc14f&pt=DORA%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdora-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=07b74464-a1f6-465c-891d-728892261d90&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=1a019bde-1452-4d90-8b07-4b24923cc14f&pt=DORA%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdora-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=e399b277-ad13-4b52-a5a3-2066b702457d&bo=2&sid=a619e810f0bf11f09cfddf8eecd6685c&vid=a61a4b10f0bf11f0bf527f4b7489ac07&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=DORA%20Metrics&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fdora-metrics%2F&r=<=2206&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=86032) --- # Source: https://docs.datadoghq.com/api/latest/downtimes # Downtimes [Downtiming](https://docs.datadoghq.com/monitors/notify/downtimes) gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. **Note:** `curl` commands require [url encoding](https://curl.se/docs/url-syntax.html). ## [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes-v2) GET https://api.ap1.datadoghq.com/api/v1/downtimehttps://api.ap2.datadoghq.com/api/v1/downtimehttps://api.datadoghq.eu/api/v1/downtimehttps://api.ddog-gov.com/api/v1/downtimehttps://api.datadoghq.com/api/v1/downtimehttps://api.us3.datadoghq.com/api/v1/downtimehttps://api.us5.datadoghq.com/api/v1/downtime ### Overview Get all scheduled downtimes. **Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Query Strings Name Type Description current_only boolean Only return downtimes that are active when the request is made. with_creator boolean Return creator information. 
### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-200-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-403-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. 
until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. 
Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. ``` { "active": true, "active_child": { "active": true, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1626, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 }, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1625, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python-legacy) ##### Get all downtimes Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all downtimes ``` """ Get all downtimes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.list_downtimes( with_creator=True, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all downtimes ``` # Get all downtimes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new opts = { with_creator: true, } p api_instance.list_downtimes(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all downtimes ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Get all downtimes dog.get_all_downtimes ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all downtimes ``` // Get all downtimes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::api_downtimes::ListDowntimesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api .list_downtimes(ListDowntimesOptionalParams::default().with_creator(true)) .await; if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all downtimes ``` // Get all downtimes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.ListDowntimes(ctx, *datadogV1.NewListDowntimesOptionalParameters().WithWithCreator(true)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.ListDowntimes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.ListDowntimes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all downtimes ``` // Get all downtimes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.api.DowntimesApi.ListDowntimesOptionalParameters; import com.datadog.api.client.v1.model.Downtime; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); try { List result = apiInstance.listDowntimes(new ListDowntimesOptionalParameters().withCreator(true)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#listDowntimes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all downtimes ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Get all downtimes print(api.Downtime.get_all()) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python "example.py" ``` ##### Get all downtimes ``` /** * Get all downtimes returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); const params: v1.DowntimesApiListDowntimesRequest = { withCreator: true, }; apiInstance .listDowntimes(params) .then((data: v1.Downtime[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/downtimehttps://api.ap2.datadoghq.com/api/v2/downtimehttps://api.datadoghq.eu/api/v2/downtimehttps://api.ddog-gov.com/api/v2/downtimehttps://api.datadoghq.com/api/v2/downtimehttps://api.us3.datadoghq.com/api/v2/downtimehttps://api.us5.datadoghq.com/api/v2/downtime ### Overview Get all scheduled downtimes. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Query Strings Name Type Description current_only boolean Only return downtimes that are active when the request is made. include string Comma-separated list of resource paths for related resources to include in the response. Supported resource paths are `created_by` and `monitor`. page[offset] integer Specific offset to use as the beginning of the returned page. page[limit] integer Maximum number of downtimes in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-200-v2) * [403](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-403-v2) * [429](https://docs.datadoghq.com/api/latest/downtimes/#ListDowntimes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Response for retrieving all downtimes. Field Type Description data [object] An array of downtimes. attributes object Downtime details. canceled date-time Time that the downtime was canceled. created date-time Creation time of the downtime. display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC` message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. modified date-time Time that the downtime was last modified. monitor_identifier Monitor identifier for the downtime. Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. 
Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule The schedule that defines when the monitor starts, stops, and recurs. There are two types of schedules: one-time and recurring. Recurring schedules may have up to five RRULE-based recurrences. If no schedules are provided, the downtime will begin immediately and never end. Option 1 object A recurring downtime schedule definition. current_downtime object The most recent actual start and end dates for a recurring downtime. For a canceled downtime, this is the previously occurring downtime. For active downtimes, this is the ongoing downtime, and for scheduled downtimes it is the upcoming downtime. end date-time The end of the current downtime. start date-time The start of the current downtime. recurrences [_required_] [object] A list of downtime recurrences. duration string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule string The `RRULE` standard for defining recurring events. For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. This affects recurring start and end dates. Must match `display_timezone`. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. start [_required_] date-time ISO-8601 Datetime to start the downtime. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). status enum The current status of the downtime. Allowed enum values: `active,canceled,ended,scheduled` id string The downtime ID. relationships object All relationships associated with downtime. created_by object The user who created the downtime. data object Data for the user who created the downtime. id string User ID of the downtime creator. type enum Users resource type. Allowed enum values: `users` default: `users` monitor object The monitor identified by the downtime. data object Data for the monitor. id string Monitor ID of the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` type enum Downtime resource type. Allowed enum values: `downtime` default: `downtime` included [ ] Array of objects related to the downtimes. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. 
email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Information about the monitor identified by the downtime. attributes object Attributes of the monitor identified by the downtime. name string The name of the monitor identified by the downtime. id int64 ID of the monitor identified by the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` meta object Pagination metadata returned by the API. page object Object containing the total filtered count. total_filtered_count int64 Total count of elements matched by the filter. 
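When `created_by` is requested through the `include` query string, the creator's user object is returned in the top-level `included` array and referenced from each downtime's `relationships.created_by.data.id`, as in the example below. The following is a minimal sketch in plain Python over the parsed JSON body; the `response` variable and the `creators_by_downtime` helper are illustrative names, not part of the client library.

```
# Sketch: pair each downtime in `data` with its creator from `included`,
# assuming the request was made with `include=created_by`.
# `response` is the parsed JSON body (a dict) shown in the example below.

def creators_by_downtime(response: dict) -> dict:
    # Index the included user objects by ID (the `included` array can also
    # contain monitor objects, so filter on type == "users").
    users = {
        obj["id"]: obj
        for obj in response.get("included", [])
        if obj.get("type") == "users"
    }
    result = {}
    for downtime in response.get("data", []):
        creator_ref = (
            downtime.get("relationships", {})
            .get("created_by", {})
            .get("data")
        )
        if creator_ref and creator_ref.get("type") == "users":
            # Map downtime ID -> full user object (or None if not included).
            result[downtime["id"]] = users.get(creator_ref["id"])
    return result
```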
``` { "data": [ { "attributes": { "canceled": "2020-01-02T03:04:05.282979+0000", "created": "2020-01-02T03:04:05.282979+0000", "display_timezone": "America/New_York", "message": "Message about the downtime", "modified": "2020-01-02T03:04:05.282979+0000", "monitor_identifier": { "monitor_id": 123 }, "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "canceled", "expired" ], "schedule": { "current_downtime": { "end": "2020-01-02T03:04:00.000Z", "start": "2020-01-02T03:04:00.000Z" }, "recurrences": [ { "duration": "123d", "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "start": "2020-01-02T03:04" } ], "timezone": "America/New_York" }, "scope": "env:(staging OR prod) AND datacenter:us-east-1", "status": "active" }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } }, "monitor": { "data": { "id": "12345", "type": "monitors" } } }, "type": "downtime" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "page": { "total_filtered_count": "integer" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
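The code examples in the next section call `list_downtimes()` with default arguments. To exercise the optional query strings documented under Arguments (`current_only`, `include`, `page[offset]`, and `page[limit]`), the hedged Python sketch below can be used instead; the keyword argument names `current_only`, `include`, `page_offset`, and `page_limit` are assumptions about the generated v2 client and should be verified against the installed version.

```
"""
Sketch: Get all downtimes with the optional query strings applied.
Keyword names are assumed to map to current_only, include,
page[offset], and page[limit].
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.downtimes_api import DowntimesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DowntimesApi(api_client)
    response = api_instance.list_downtimes(
        current_only=True,      # only downtimes active when the request is made
        include="created_by",   # also return creator user objects under `included`
        page_offset=0,          # offset to use as the beginning of the returned page
        page_limit=50,          # maximum number of downtimes in the response
    )
    print(response)
```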
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Get all downtimes Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/downtime" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all downtimes ``` """ Get all downtimes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.downtimes_api import DowntimesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.list_downtimes() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all downtimes ``` # Get all downtimes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DowntimesAPI.new p api_instance.list_downtimes() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all downtimes ``` // Get all downtimes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDowntimesApi(apiClient) resp, r, err := api.ListDowntimes(ctx, *datadogV2.NewListDowntimesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.ListDowntimes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.ListDowntimes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all downtimes ``` // Get all downtimes returns 
"OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; import com.datadog.api.client.v2.model.ListDowntimesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); try { ListDowntimesResponse result = apiInstance.listDowntimes(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#listDowntimes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all downtimes ``` // Get all downtimes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV2::api_downtimes::ListDowntimesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api .list_downtimes(ListDowntimesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all downtimes ``` /** * Get all downtimes returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); apiInstance .listDowntimes() .then((data: v2.ListDowntimesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime-v2) POST https://api.ap1.datadoghq.com/api/v1/downtimehttps://api.ap2.datadoghq.com/api/v1/downtimehttps://api.datadoghq.eu/api/v1/downtimehttps://api.ddog-gov.com/api/v1/downtimehttps://api.datadoghq.com/api/v1/downtimehttps://api.us3.datadoghq.com/api/v1/downtimehttps://api.us5.datadoghq.com/api/v1/downtime ### Overview Schedule a downtime. 
**Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Request #### Body Data (required) Schedule a downtime request body. * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. 
`until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). 
More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. ##### Schedule a downtime once a year ``` { "message": "Example-Downtime", "recurrence": { "period": 1, "type": "years" }, "scope": [ "*" ], "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "mute_first_recovery_notification": true, "monitor_tags": [ "tag0" ], "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "expired" ] } ``` Copy ##### Schedule a downtime returns "OK" response ``` { "message": "Example-Downtime", "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "scope": [ "test:exampledowntime" ], "recurrence": { "type": "weeks", "period": 1, "week_days": [ "Mon", "Tue", "Wed", "Thu", "Fri" ], "until_date": 1638443471 }, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ] } ``` Copy ##### Schedule a downtime until date ``` { "message": "Example-Downtime", "recurrence": { "period": 1, "type": "weeks", "until_date": 1638443471, "week_days": [ "Mon", "Tue", "Wed", "Thu", "Fri" ] }, "scope": [ "*" ], "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "mute_first_recovery_notification": true, "monitor_tags": [ "tag0" ], "notify_end_states": [ "alert" ], "notify_end_types": [ "canceled" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-200-v1) * [400](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-400-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-403-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. 
creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. 
timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. 
The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. ``` { "active": true, "active_child": { "active": true, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1626, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 }, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1625, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
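The v1 request bodies above use POSIX timestamps (seconds since the epoch) for `start`, `end`, and `recurrence.until_date`; the request examples hard-code values such as `1636629071`. A minimal sketch, using only the Python standard library, for deriving equivalent values relative to the current time (a one-hour downtime recurring on weekdays until 21 days from now):

```
from datetime import datetime, timedelta, timezone

# POSIX timestamps (seconds since the epoch, UTC) for a one-hour downtime
# starting now, with a weekly recurrence that ends 21 days from now.
now = datetime.now(timezone.utc)
start = int(now.timestamp())                              # "start"
end = int((now + timedelta(hours=1)).timestamp())         # "end"
until_date = int((now + timedelta(days=21)).timestamp())  # "recurrence.until_date"

body = {
    "message": "Example-Downtime",
    "scope": ["test:exampledowntime"],
    "start": start,
    "end": end,
    "timezone": "Etc/UTC",
    "recurrence": {
        "type": "weeks",
        "period": 1,
        "week_days": ["Mon", "Tue", "Wed", "Thu", "Fri"],
        "until_date": until_date,
    },
}
print(body)
```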
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby-legacy) ##### Schedule a downtime once a year Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "Example-Downtime", "recurrence": { "period": 1, "type": "years" }, "scope": [ "*" ], "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "mute_first_recovery_notification": true, "monitor_tags": [ "tag0" ], "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "expired" ] } EOF ``` ##### Schedule a downtime returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "Example-Downtime", "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "scope": [ "test:exampledowntime" ], "recurrence": { "type": "weeks", "period": 1, "week_days": [ "Mon", "Tue", "Wed", "Thu", "Fri" ], "until_date": 1638443471 }, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ] } EOF ``` ##### Schedule a downtime until date Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "Example-Downtime", "recurrence": { "period": 1, "type": "weeks", "until_date": 1638443471, "week_days": [ "Mon", "Tue", "Wed", "Thu", "Fri" ] }, "scope": [ "*" ], "start": 1636629071, "end": 1636632671, "timezone": "Etc/UTC", "mute_first_recovery_notification": true, "monitor_tags": [ "tag0" ], "notify_end_states": [ "alert" ], "notify_end_types": [ "canceled" ] } EOF ``` ##### Schedule a downtime once a year ``` // Schedule a downtime once a year package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Downtime{ Message: 
*datadog.NewNullableString(datadog.PtrString("Example-Downtime")), Recurrence: *datadogV1.NewNullableDowntimeRecurrence(&datadogV1.DowntimeRecurrence{ Period: datadog.PtrInt32(1), Type: datadog.PtrString("years"), }), Scope: []string{ "*", }, Start: datadog.PtrInt64(time.Now().Unix()), End: *datadog.NewNullableInt64(datadog.PtrInt64(time.Now().Add(time.Hour * 1).Unix())), Timezone: datadog.PtrString("Etc/UTC"), MuteFirstRecoveryNotification: datadog.PtrBool(true), MonitorTags: []string{ "tag0", }, NotifyEndStates: []datadogV1.NotifyEndState{ datadogV1.NOTIFYENDSTATE_ALERT, datadogV1.NOTIFYENDSTATE_WARN, }, NotifyEndTypes: []datadogV1.NotifyEndType{ datadogV1.NOTIFYENDTYPE_EXPIRED, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.CreateDowntime(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CreateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.CreateDowntime`:\n%s\n", responseContent) } ``` Copy ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Downtime{ Message: *datadog.NewNullableString(datadog.PtrString("Example-Downtime")), Start: datadog.PtrInt64(time.Now().Unix()), End: *datadog.NewNullableInt64(datadog.PtrInt64(time.Now().Add(time.Hour * 1).Unix())), Timezone: datadog.PtrString("Etc/UTC"), Scope: []string{ "test:exampledowntime", }, Recurrence: *datadogV1.NewNullableDowntimeRecurrence(&datadogV1.DowntimeRecurrence{ Type: datadog.PtrString("weeks"), Period: datadog.PtrInt32(1), WeekDays: *datadog.NewNullableList(&[]string{ "Mon", "Tue", "Wed", "Thu", "Fri", }), UntilDate: *datadog.NewNullableInt64(datadog.PtrInt64(time.Now().AddDate(0, 0, 21).Unix())), }), NotifyEndStates: []datadogV1.NotifyEndState{ datadogV1.NOTIFYENDSTATE_ALERT, datadogV1.NOTIFYENDSTATE_NO_DATA, datadogV1.NOTIFYENDSTATE_WARN, }, NotifyEndTypes: []datadogV1.NotifyEndType{ datadogV1.NOTIFYENDTYPE_CANCELED, datadogV1.NOTIFYENDTYPE_EXPIRED, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.CreateDowntime(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CreateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.CreateDowntime`:\n%s\n", responseContent) } ``` Copy ##### Schedule a downtime until date ``` // Schedule a downtime until date package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Downtime{ Message: *datadog.NewNullableString(datadog.PtrString("Example-Downtime")), Recurrence: *datadogV1.NewNullableDowntimeRecurrence(&datadogV1.DowntimeRecurrence{ Period: datadog.PtrInt32(1), Type: datadog.PtrString("weeks"), UntilDate: 
*datadog.NewNullableInt64(datadog.PtrInt64(time.Now().AddDate(0, 0, 21).Unix())), WeekDays: *datadog.NewNullableList(&[]string{ "Mon", "Tue", "Wed", "Thu", "Fri", }), }), Scope: []string{ "*", }, Start: datadog.PtrInt64(time.Now().Unix()), End: *datadog.NewNullableInt64(datadog.PtrInt64(time.Now().Add(time.Hour * 1).Unix())), Timezone: datadog.PtrString("Etc/UTC"), MuteFirstRecoveryNotification: datadog.PtrBool(true), MonitorTags: []string{ "tag0", }, NotifyEndStates: []datadogV1.NotifyEndState{ datadogV1.NOTIFYENDSTATE_ALERT, }, NotifyEndTypes: []datadogV1.NotifyEndType{ datadogV1.NOTIFYENDTYPE_CANCELED, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.CreateDowntime(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CreateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.CreateDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Schedule a downtime once a year ``` // Schedule a downtime once a year import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; import com.datadog.api.client.v1.model.DowntimeRecurrence; import com.datadog.api.client.v1.model.NotifyEndState; import com.datadog.api.client.v1.model.NotifyEndType; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); Downtime body = new Downtime() .message("Example-Downtime") .recurrence(new DowntimeRecurrence().period(1).type("years")) .scope(Collections.singletonList("*")) .start(OffsetDateTime.now().toInstant().getEpochSecond()) .end(OffsetDateTime.now().plusHours(1).toInstant().getEpochSecond()) .timezone("Etc/UTC") .muteFirstRecoveryNotification(true) .monitorTags(Collections.singletonList("tag0")) .notifyEndStates(Arrays.asList(NotifyEndState.ALERT, NotifyEndState.WARN)) .notifyEndTypes(Collections.singletonList(NotifyEndType.EXPIRED)); try { Downtime result = apiInstance.createDowntime(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#createDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; import com.datadog.api.client.v1.model.DowntimeRecurrence; import 
com.datadog.api.client.v1.model.NotifyEndState; import com.datadog.api.client.v1.model.NotifyEndType; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); Downtime body = new Downtime() .message("Example-Downtime") .start(OffsetDateTime.now().toInstant().getEpochSecond()) .end(OffsetDateTime.now().plusHours(1).toInstant().getEpochSecond()) .timezone("Etc/UTC") .scope(Collections.singletonList("test:exampledowntime")) .recurrence( new DowntimeRecurrence() .type("weeks") .period(1) .weekDays(Arrays.asList("Mon", "Tue", "Wed", "Thu", "Fri")) .untilDate(OffsetDateTime.now().plusDays(21).toInstant().getEpochSecond())) .notifyEndStates( Arrays.asList(NotifyEndState.ALERT, NotifyEndState.NO_DATA, NotifyEndState.WARN)) .notifyEndTypes(Arrays.asList(NotifyEndType.CANCELED, NotifyEndType.EXPIRED)); try { Downtime result = apiInstance.createDowntime(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#createDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Schedule a downtime until date ``` // Schedule a downtime until date import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; import com.datadog.api.client.v1.model.DowntimeRecurrence; import com.datadog.api.client.v1.model.NotifyEndState; import com.datadog.api.client.v1.model.NotifyEndType; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); Downtime body = new Downtime() .message("Example-Downtime") .recurrence( new DowntimeRecurrence() .period(1) .type("weeks") .untilDate(OffsetDateTime.now().plusDays(21).toInstant().getEpochSecond()) .weekDays(Arrays.asList("Mon", "Tue", "Wed", "Thu", "Fri"))) .scope(Collections.singletonList("*")) .start(OffsetDateTime.now().toInstant().getEpochSecond()) .end(OffsetDateTime.now().plusHours(1).toInstant().getEpochSecond()) .timezone("Etc/UTC") .muteFirstRecoveryNotification(true) .monitorTags(Collections.singletonList("tag0")) .notifyEndStates(Collections.singletonList(NotifyEndState.ALERT)) .notifyEndTypes(Collections.singletonList(NotifyEndType.CANCELED)); try { Downtime result = apiInstance.createDowntime(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#createDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Schedule 
a downtime returns "OK" response ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } initialize(**options) # Repeat for 3 hours (starting now) on every week day for 4 weeks. start_ts = int(time.time()) end_ts = start_ts + (3 * 60 * 60) end_reccurrence_ts = start_ts + (4 * 7 * 24 * 60 * 60) # 4 weeks from now recurrence = { 'type': 'weeks', 'period': 1, 'week_days': ['Mon', 'Tue', 'Wed', 'Thu', 'Fri'], 'until_date': end_reccurrence_ts } # Schedule downtime api.Downtime.create( scope='env:staging', start=start_ts, end=end_ts, recurrence=recurrence ) # OR use RRULE reccurence rrule_recurrence = { 'type': 'rrule', 'rrule': 'FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1', } # Schedule downtime api.Downtime.create( scope='env:staging', start=start_ts, end=end_ts, recurrence=rrule_recurrence ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Schedule a downtime once a year ``` """ Schedule a downtime once a year """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi from datadog_api_client.v1.model.downtime import Downtime from datadog_api_client.v1.model.downtime_recurrence import DowntimeRecurrence from datadog_api_client.v1.model.notify_end_state import NotifyEndState from datadog_api_client.v1.model.notify_end_type import NotifyEndType body = Downtime( message="Example-Downtime", recurrence=DowntimeRecurrence( period=1, type="years", ), scope=[ "*", ], start=int(datetime.now().timestamp()), end=int((datetime.now() + relativedelta(hours=1)).timestamp()), timezone="Etc/UTC", mute_first_recovery_notification=True, monitor_tags=[ "tag0", ], notify_end_states=[ NotifyEndState.ALERT, NotifyEndState.WARN, ], notify_end_types=[ NotifyEndType.EXPIRED, ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.create_downtime(body=body) print(response) ``` Copy ##### Schedule a downtime returns "OK" response ``` """ Schedule a downtime returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi from datadog_api_client.v1.model.downtime import Downtime from datadog_api_client.v1.model.downtime_recurrence import DowntimeRecurrence from datadog_api_client.v1.model.notify_end_state import NotifyEndState from datadog_api_client.v1.model.notify_end_type import NotifyEndType body = Downtime( message="Example-Downtime", start=int(datetime.now().timestamp()), end=int((datetime.now() + relativedelta(hours=1)).timestamp()), timezone="Etc/UTC", scope=[ "test:exampledowntime", ], recurrence=DowntimeRecurrence( type="weeks", period=1, week_days=[ "Mon", "Tue", "Wed", "Thu", "Fri", ], until_date=int((datetime.now() + relativedelta(days=21)).timestamp()), ), notify_end_states=[ NotifyEndState.ALERT, NotifyEndState.NO_DATA, NotifyEndState.WARN, ], notify_end_types=[ NotifyEndType.CANCELED, NotifyEndType.EXPIRED, ], ) configuration = Configuration() 
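# Note: Configuration() reads the API and application keys (and the site) from the
# DD_API_KEY, DD_APP_KEY, and DD_SITE environment variables set in the Instructions below.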
with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.create_downtime(body=body) print(response) ``` Copy ##### Schedule a downtime until date ``` """ Schedule a downtime until date """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi from datadog_api_client.v1.model.downtime import Downtime from datadog_api_client.v1.model.downtime_recurrence import DowntimeRecurrence from datadog_api_client.v1.model.notify_end_state import NotifyEndState from datadog_api_client.v1.model.notify_end_type import NotifyEndType body = Downtime( message="Example-Downtime", recurrence=DowntimeRecurrence( period=1, type="weeks", until_date=int((datetime.now() + relativedelta(days=21)).timestamp()), week_days=[ "Mon", "Tue", "Wed", "Thu", "Fri", ], ), scope=[ "*", ], start=int(datetime.now().timestamp()), end=int((datetime.now() + relativedelta(hours=1)).timestamp()), timezone="Etc/UTC", mute_first_recovery_notification=True, monitor_tags=[ "tag0", ], notify_end_states=[ NotifyEndState.ALERT, ], notify_end_types=[ NotifyEndType.CANCELED, ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.create_downtime(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Schedule a downtime returns "OK" response ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Repeat for 3 hours (starting now) on every week day for 4 weeks. 
start_ts = Time.now.to_i end_ts = start_ts + (3 * 60 * 60) end_reccurrence_ts = start_ts + (4 * 7 * 24 * 60 * 60) # 4 weeks from now recurrence = { 'type' => 'weeks', 'period' => 1, 'week_days' => %w[Mon Tue Wed Thu Fri], 'until_date' => end_reccurrence_ts } # Schedule downtime dog.Downtime.create( 'env:staging', start: start_ts, end: end_ts, recurrence: recurrence ) # OR use RRULE reccurence rrule_recurrence = { 'type' => 'rrule', 'rrule' => 'FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1', } # Schedule downtime dog.Downtime.create( 'env:staging', start: start_ts, end: end_ts, recurrence: rrule_recurrence ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Schedule a downtime once a year ``` # Schedule a downtime once a year require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new body = DatadogAPIClient::V1::Downtime.new({ message: "Example-Downtime", recurrence: DatadogAPIClient::V1::DowntimeRecurrence.new({ period: 1, type: "years", }), scope: [ "*", ], start: Time.now.to_i, _end: (Time.now + 1 * 3600).to_i, timezone: "Etc/UTC", mute_first_recovery_notification: true, monitor_tags: [ "tag0", ], notify_end_states: [ DatadogAPIClient::V1::NotifyEndState::ALERT, DatadogAPIClient::V1::NotifyEndState::WARN, ], notify_end_types: [ DatadogAPIClient::V1::NotifyEndType::EXPIRED, ], }) p api_instance.create_downtime(body) ``` Copy ##### Schedule a downtime returns "OK" response ``` # Schedule a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new body = DatadogAPIClient::V1::Downtime.new({ message: "Example-Downtime", start: Time.now.to_i, _end: (Time.now + 1 * 3600).to_i, timezone: "Etc/UTC", scope: [ "test:exampledowntime", ], recurrence: DatadogAPIClient::V1::DowntimeRecurrence.new({ type: "weeks", period: 1, week_days: [ "Mon", "Tue", "Wed", "Thu", "Fri", ], until_date: (Time.now + 21 * 86400).to_i, }), notify_end_states: [ DatadogAPIClient::V1::NotifyEndState::ALERT, DatadogAPIClient::V1::NotifyEndState::NO_DATA, DatadogAPIClient::V1::NotifyEndState::WARN, ], notify_end_types: [ DatadogAPIClient::V1::NotifyEndType::CANCELED, DatadogAPIClient::V1::NotifyEndType::EXPIRED, ], }) p api_instance.create_downtime(body) ``` Copy ##### Schedule a downtime until date ``` # Schedule a downtime until date require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new body = DatadogAPIClient::V1::Downtime.new({ message: "Example-Downtime", recurrence: DatadogAPIClient::V1::DowntimeRecurrence.new({ period: 1, type: "weeks", until_date: (Time.now + 21 * 86400).to_i, week_days: [ "Mon", "Tue", "Wed", "Thu", "Fri", ], }), scope: [ "*", ], start: Time.now.to_i, _end: (Time.now + 1 * 3600).to_i, timezone: "Etc/UTC", mute_first_recovery_notification: true, monitor_tags: [ "tag0", ], notify_end_states: [ DatadogAPIClient::V1::NotifyEndState::ALERT, ], notify_end_types: [ DatadogAPIClient::V1::NotifyEndType::CANCELED, ], }) p api_instance.create_downtime(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
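# DD_SITE selects the Datadog site to target (for example datadoghq.com or
# datadoghq.eu); DD_API_KEY and DD_APP_KEY hold the API key and application
# key the client uses to authenticate.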
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Schedule a downtime once a year ``` // Schedule a downtime once a year use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::model::Downtime; use datadog_api_client::datadogV1::model::DowntimeRecurrence; use datadog_api_client::datadogV1::model::NotifyEndState; use datadog_api_client::datadogV1::model::NotifyEndType; #[tokio::main] async fn main() { let body = Downtime::new() .end(Some(1636632671)) .message(Some("Example-Downtime".to_string())) .monitor_tags(vec!["tag0".to_string()]) .mute_first_recovery_notification(true) .notify_end_states(vec![NotifyEndState::ALERT, NotifyEndState::WARN]) .notify_end_types(vec![NotifyEndType::EXPIRED]) .recurrence(Some( DowntimeRecurrence::new() .period(1) .type_("years".to_string()), )) .scope(vec!["*".to_string()]) .start(1636629071) .timezone("Etc/UTC".to_string()); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.create_downtime(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::model::Downtime; use datadog_api_client::datadogV1::model::DowntimeRecurrence; use datadog_api_client::datadogV1::model::NotifyEndState; use datadog_api_client::datadogV1::model::NotifyEndType; #[tokio::main] async fn main() { let body = Downtime::new() .end(Some(1636632671)) .message(Some("Example-Downtime".to_string())) .notify_end_states(vec![ NotifyEndState::ALERT, NotifyEndState::NO_DATA, NotifyEndState::WARN, ]) .notify_end_types(vec![NotifyEndType::CANCELED, NotifyEndType::EXPIRED]) .recurrence(Some( DowntimeRecurrence::new() .period(1) .type_("weeks".to_string()) .until_date(Some(1638443471)) .week_days(Some(vec![ "Mon".to_string(), "Tue".to_string(), "Wed".to_string(), "Thu".to_string(), "Fri".to_string(), ])), )) .scope(vec!["test:exampledowntime".to_string()]) .start(1636629071) .timezone("Etc/UTC".to_string()); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.create_downtime(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Schedule a downtime until date ``` // Schedule a downtime until date use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::model::Downtime; use datadog_api_client::datadogV1::model::DowntimeRecurrence; use datadog_api_client::datadogV1::model::NotifyEndState; use datadog_api_client::datadogV1::model::NotifyEndType; #[tokio::main] async fn main() { let body = Downtime::new() .end(Some(1636632671)) .message(Some("Example-Downtime".to_string())) .monitor_tags(vec!["tag0".to_string()]) .mute_first_recovery_notification(true) .notify_end_states(vec![NotifyEndState::ALERT]) .notify_end_types(vec![NotifyEndType::CANCELED]) .recurrence(Some( DowntimeRecurrence::new() .period(1) .type_("weeks".to_string()) .until_date(Some(1638443471)) .week_days(Some(vec![ "Mon".to_string(), "Tue".to_string(), "Wed".to_string(), 
"Thu".to_string(), "Fri".to_string(), ])), )) .scope(vec!["*".to_string()]) .start(1636629071) .timezone("Etc/UTC".to_string()); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.create_downtime(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Schedule a downtime once a year ``` /** * Schedule a downtime once a year */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); const params: v1.DowntimesApiCreateDowntimeRequest = { body: { message: "Example-Downtime", recurrence: { period: 1, type: "years", }, scope: ["*"], start: Math.round(new Date().getTime() / 1000), end: Math.round( new Date(new Date().getTime() + 1 * 3600 * 1000).getTime() / 1000 ), timezone: "Etc/UTC", muteFirstRecoveryNotification: true, monitorTags: ["tag0"], notifyEndStates: ["alert", "warn"], notifyEndTypes: ["expired"], }, }; apiInstance .createDowntime(params) .then((data: v1.Downtime) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Schedule a downtime returns "OK" response ``` /** * Schedule a downtime returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); const params: v1.DowntimesApiCreateDowntimeRequest = { body: { message: "Example-Downtime", start: Math.round(new Date().getTime() / 1000), end: Math.round( new Date(new Date().getTime() + 1 * 3600 * 1000).getTime() / 1000 ), timezone: "Etc/UTC", scope: ["test:exampledowntime"], recurrence: { type: "weeks", period: 1, weekDays: ["Mon", "Tue", "Wed", "Thu", "Fri"], untilDate: Math.round( new Date(new Date().getTime() + 21 * 86400 * 1000).getTime() / 1000 ), }, notifyEndStates: ["alert", "no data", "warn"], notifyEndTypes: ["canceled", "expired"], }, }; apiInstance .createDowntime(params) .then((data: v1.Downtime) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Schedule a downtime until date ``` /** * Schedule a downtime until date */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); const params: v1.DowntimesApiCreateDowntimeRequest = { body: { message: "Example-Downtime", recurrence: { period: 1, type: "weeks", untilDate: Math.round( new Date(new Date().getTime() + 21 * 86400 * 1000).getTime() / 1000 ), weekDays: ["Mon", "Tue", "Wed", "Thu", "Fri"], }, scope: ["*"], start: Math.round(new Date().getTime() / 1000), end: Math.round( new Date(new Date().getTime() + 1 * 3600 * 1000).getTime() / 1000 ), timezone: "Etc/UTC", muteFirstRecoveryNotification: true, monitorTags: ["tag0"], notifyEndStates: ["alert"], notifyEndTypes: ["canceled"], }, }; apiInstance .createDowntime(params) .then((data: v1.Downtime) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/downtimehttps://api.ap2.datadoghq.com/api/v2/downtimehttps://api.datadoghq.eu/api/v2/downtimehttps://api.ddog-gov.com/api/v2/downtimehttps://api.datadoghq.com/api/v2/downtimehttps://api.us3.datadoghq.com/api/v2/downtimehttps://api.us5.datadoghq.com/api/v2/downtime ### Overview Schedule a downtime. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Request #### Body Data (required) Schedule a downtime request body. * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description data [_required_] object Object to create a downtime. attributes [_required_] object Downtime details. display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC` message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_identifier [_required_] Monitor identifier for the downtime. Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. 
notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule Schedule for the downtime. Option 1 object A recurring downtime schedule definition. recurrences [_required_] [object] A list of downtime recurrences. duration [_required_] string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule [_required_] string The `RRULE` standard for defining recurring events. For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. Must include a UTC offset of zero. If not provided, the downtime continues forever. start date-time ISO-8601 Datetime to start the downtime. Must include a UTC offset of zero. If not provided, the downtime starts the moment it is created. scope [_required_] string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). type [_required_] enum Downtime resource type. Allowed enum values: `downtime` default: `downtime` ``` { "data": { "attributes": { "message": "dark forest", "monitor_identifier": { "monitor_tags": [ "cat:hat" ] }, "scope": "test:exampledowntime", "schedule": { "start": null } }, "type": "downtime" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-200-v2) * [400](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-400-v2) * [403](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-403-v2) * [429](https://docs.datadoghq.com/api/latest/downtimes/#CreateDowntime-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. Field Type Description data object Downtime data. attributes object Downtime details. canceled date-time Time that the downtime was canceled. created date-time Creation time of the downtime. display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC` message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. modified date-time Time that the downtime was last modified. monitor_identifier Monitor identifier for the downtime. 
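The request's `schedule` can also recur: each recurrence pairs a `duration` with an `rrule`, plus an optional local `start` and a schedule-level `timezone`, as described in the request schema above. The sketch below is a hedged illustration only: the two recurrence model imports follow the Python client's usual name-per-schema pattern but are assumptions, and the rule, duration, and scope values are placeholders.

```
"""
Recurring downtime: every Monday for two hours (illustrative sketch).
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.downtimes_api import DowntimesApi
from datadog_api_client.v2.model.downtime_create_request import DowntimeCreateRequest
from datadog_api_client.v2.model.downtime_create_request_attributes import DowntimeCreateRequestAttributes
from datadog_api_client.v2.model.downtime_create_request_data import DowntimeCreateRequestData
from datadog_api_client.v2.model.downtime_monitor_identifier_tags import DowntimeMonitorIdentifierTags
from datadog_api_client.v2.model.downtime_resource_type import DowntimeResourceType

# Assumed model names for the recurring schedule described in the schema above
# (schedule.recurrences[].duration / rrule / start, plus a schedule-level timezone).
from datadog_api_client.v2.model.downtime_schedule_recurrence_create_update_request import (
    DowntimeScheduleRecurrenceCreateUpdateRequest,
)
from datadog_api_client.v2.model.downtime_schedule_recurrences_create_request import (
    DowntimeScheduleRecurrencesCreateRequest,
)

body = DowntimeCreateRequest(
    data=DowntimeCreateRequestData(
        attributes=DowntimeCreateRequestAttributes(
            message="Weekly maintenance window",  # illustrative message
            monitor_identifier=DowntimeMonitorIdentifierTags(monitor_tags=["*"]),
            scope="env:staging",  # illustrative scope
            schedule=DowntimeScheduleRecurrencesCreateRequest(
                recurrences=[
                    DowntimeScheduleRecurrenceCreateUpdateRequest(
                        # Length of each occurrence; duration strings end in m, h, d, or w.
                        duration="2h",
                        # The RRULE must not carry DTSTART/DTEND/DURATION attributes.
                        rrule="FREQ=WEEKLY;INTERVAL=1;BYDAY=MO",
                        # Local start time with no UTC offset, interpreted in `timezone`.
                        start="2024-01-01T09:00:00",
                    ),
                ],
                timezone="Etc/UTC",
            ),
        ),
        type=DowntimeResourceType.DOWNTIME,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DowntimesApi(api_client)
    print(api_instance.create_downtime(body=body))
```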
Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule The schedule that defines when the monitor starts, stops, and recurs. There are two types of schedules: one-time and recurring. Recurring schedules may have up to five RRULE-based recurrences. If no schedules are provided, the downtime will begin immediately and never end. Option 1 object A recurring downtime schedule definition. current_downtime object The most recent actual start and end dates for a recurring downtime. For a canceled downtime, this is the previously occurring downtime. For active downtimes, this is the ongoing downtime, and for scheduled downtimes it is the upcoming downtime. end date-time The end of the current downtime. start date-time The start of the current downtime. recurrences [_required_] [object] A list of downtime recurrences. duration string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule string The `RRULE` standard for defining recurring events. For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. This affects recurring start and end dates. Must match `display_timezone`. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. start [_required_] date-time ISO-8601 Datetime to start the downtime. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). status enum The current status of the downtime. Allowed enum values: `active,canceled,ended,scheduled` id string The downtime ID. relationships object All relationships associated with downtime. created_by object The user who created the downtime. data object Data for the user who created the downtime. id string User ID of the downtime creator. type enum Users resource type. Allowed enum values: `users` default: `users` monitor object The monitor identified by the downtime. data object Data for the monitor. 
id string Monitor ID of the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` type enum Downtime resource type. Allowed enum values: `downtime` default: `downtime` included [ ] Array of objects related to the downtime that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Information about the monitor identified by the downtime. attributes object Attributes of the monitor identified by the downtime. name string The name of the monitor identified by the downtime. id int64 ID of the monitor identified by the downtime. type enum Monitor resource type. 
Allowed enum values: `monitors` default: `monitors` ``` { "data": { "attributes": { "canceled": "2020-01-02T03:04:05.282979+0000", "created": "2020-01-02T03:04:05.282979+0000", "display_timezone": "America/New_York", "message": "Message about the downtime", "modified": "2020-01-02T03:04:05.282979+0000", "monitor_identifier": { "monitor_id": 123 }, "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "canceled", "expired" ], "schedule": { "current_downtime": { "end": "2020-01-02T03:04:00.000Z", "start": "2020-01-02T03:04:00.000Z" }, "recurrences": [ { "duration": "123d", "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "start": "2020-01-02T03:04" } ], "timezone": "America/New_York" }, "scope": "env:(staging OR prod) AND datacenter:us-east-1", "status": "active" }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } }, "monitor": { "data": { "id": "12345", "type": "monitors" } } }, "type": "downtime" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Schedule a downtime returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/downtime" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "message": "dark forest", "monitor_identifier": { "monitor_tags": [ "cat:hat" ] }, "scope": "test:exampledowntime", "schedule": { "start": null } }, "type": "downtime" } } EOF ``` ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DowntimeCreateRequest{ Data: datadogV2.DowntimeCreateRequestData{ Attributes: datadogV2.DowntimeCreateRequestAttributes{ Message: *datadog.NewNullableString(datadog.PtrString("dark forest")), MonitorIdentifier: datadogV2.DowntimeMonitorIdentifier{ DowntimeMonitorIdentifierTags: &datadogV2.DowntimeMonitorIdentifierTags{ MonitorTags: []string{ "cat:hat", }, }}, Scope: "test:exampledowntime", Schedule: &datadogV2.DowntimeScheduleCreateRequest{ DowntimeScheduleOneTimeCreateUpdateRequest: &datadogV2.DowntimeScheduleOneTimeCreateUpdateRequest{ Start: *datadog.NewNullableTime(nil), }}, }, Type: datadogV2.DOWNTIMERESOURCETYPE_DOWNTIME, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDowntimesApi(apiClient) resp, r, err := api.CreateDowntime(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CreateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.CreateDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; import com.datadog.api.client.v2.model.DowntimeCreateRequest; import com.datadog.api.client.v2.model.DowntimeCreateRequestAttributes; import com.datadog.api.client.v2.model.DowntimeCreateRequestData; 
import com.datadog.api.client.v2.model.DowntimeMonitorIdentifier; import com.datadog.api.client.v2.model.DowntimeMonitorIdentifierTags; import com.datadog.api.client.v2.model.DowntimeResourceType; import com.datadog.api.client.v2.model.DowntimeResponse; import com.datadog.api.client.v2.model.DowntimeScheduleCreateRequest; import com.datadog.api.client.v2.model.DowntimeScheduleOneTimeCreateUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); DowntimeCreateRequest body = new DowntimeCreateRequest() .data( new DowntimeCreateRequestData() .attributes( new DowntimeCreateRequestAttributes() .message("dark forest") .monitorIdentifier( new DowntimeMonitorIdentifier( new DowntimeMonitorIdentifierTags() .monitorTags(Collections.singletonList("cat:hat")))) .scope("test:exampledowntime") .schedule( new DowntimeScheduleCreateRequest( new DowntimeScheduleOneTimeCreateUpdateRequest().start(null)))) .type(DowntimeResourceType.DOWNTIME)); try { DowntimeResponse result = apiInstance.createDowntime(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#createDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Schedule a downtime returns "OK" response ``` """ Schedule a downtime returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.downtimes_api import DowntimesApi from datadog_api_client.v2.model.downtime_create_request import DowntimeCreateRequest from datadog_api_client.v2.model.downtime_create_request_attributes import DowntimeCreateRequestAttributes from datadog_api_client.v2.model.downtime_create_request_data import DowntimeCreateRequestData from datadog_api_client.v2.model.downtime_monitor_identifier_tags import DowntimeMonitorIdentifierTags from datadog_api_client.v2.model.downtime_resource_type import DowntimeResourceType from datadog_api_client.v2.model.downtime_schedule_one_time_create_update_request import ( DowntimeScheduleOneTimeCreateUpdateRequest, ) body = DowntimeCreateRequest( data=DowntimeCreateRequestData( attributes=DowntimeCreateRequestAttributes( message="dark forest", monitor_identifier=DowntimeMonitorIdentifierTags( monitor_tags=[ "cat:hat", ], ), scope="test:exampledowntime", schedule=DowntimeScheduleOneTimeCreateUpdateRequest( start=None, ), ), type=DowntimeResourceType.DOWNTIME, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.create_downtime(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Schedule a downtime returns "OK" response ``` # Schedule a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DowntimesAPI.new body = DatadogAPIClient::V2::DowntimeCreateRequest.new({ data: DatadogAPIClient::V2::DowntimeCreateRequestData.new({ attributes: DatadogAPIClient::V2::DowntimeCreateRequestAttributes.new({ message: "dark forest", monitor_identifier: DatadogAPIClient::V2::DowntimeMonitorIdentifierTags.new({ monitor_tags: [ "cat:hat", ], }), scope: "test:exampledowntime", schedule: DatadogAPIClient::V2::DowntimeScheduleOneTimeCreateUpdateRequest.new({ start: nil, }), }), type: DatadogAPIClient::V2::DowntimeResourceType::DOWNTIME, }), }) p api_instance.create_downtime(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Schedule a downtime returns "OK" response ``` // Schedule a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV2::model::DowntimeCreateRequest; use datadog_api_client::datadogV2::model::DowntimeCreateRequestAttributes; use datadog_api_client::datadogV2::model::DowntimeCreateRequestData; use datadog_api_client::datadogV2::model::DowntimeMonitorIdentifier; use datadog_api_client::datadogV2::model::DowntimeMonitorIdentifierTags; use datadog_api_client::datadogV2::model::DowntimeResourceType; use datadog_api_client::datadogV2::model::DowntimeScheduleCreateRequest; use datadog_api_client::datadogV2::model::DowntimeScheduleOneTimeCreateUpdateRequest; #[tokio::main] async fn main() { let body = DowntimeCreateRequest::new(DowntimeCreateRequestData::new( DowntimeCreateRequestAttributes::new( DowntimeMonitorIdentifier::DowntimeMonitorIdentifierTags(Box::new( DowntimeMonitorIdentifierTags::new(vec!["cat:hat".to_string()]), )), "test:exampledowntime".to_string(), ) .message(Some("dark forest".to_string())) .schedule( DowntimeScheduleCreateRequest::DowntimeScheduleOneTimeCreateUpdateRequest(Box::new( DowntimeScheduleOneTimeCreateUpdateRequest::new().start(None), )), ), DowntimeResourceType::DOWNTIME, )); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.create_downtime(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Schedule a downtime returns "OK" response ``` /** * Schedule a downtime returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); const params: v2.DowntimesApiCreateDowntimeRequest = { body: { 
data: { attributes: { message: "dark forest", monitorIdentifier: { monitorTags: ["cat:hat"], }, scope: "test:exampledowntime", schedule: { start: undefined, }, }, type: "downtime", }, }, }; apiInstance .createDowntime(params) .then((data: v2.DowntimeResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Cancel downtimes by scope](https://docs.datadoghq.com/api/latest/downtimes/#cancel-downtimes-by-scope) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/downtimes/#cancel-downtimes-by-scope-v1) POST https://api.ap1.datadoghq.com/api/v1/downtime/cancel/by_scopehttps://api.ap2.datadoghq.com/api/v1/downtime/cancel/by_scopehttps://api.datadoghq.eu/api/v1/downtime/cancel/by_scopehttps://api.ddog-gov.com/api/v1/downtime/cancel/by_scopehttps://api.datadoghq.com/api/v1/downtime/cancel/by_scopehttps://api.us3.datadoghq.com/api/v1/downtime/cancel/by_scopehttps://api.us5.datadoghq.com/api/v1/downtime/cancel/by_scope ### Overview Delete all downtimes that match the scope of `X`. **Note:** This only interacts with Downtimes created using v1 endpoints. This endpoint has been deprecated and will not be replaced. Please use v2 endpoints to find and cancel downtimes. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Request #### Body Data (required) Scope to cancel downtimes for. * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Expand All Field Type Description scope [_required_] string The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). ``` { "scope": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntimesByScope-200-v1) * [400](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntimesByScope-400-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntimesByScope-403-v1) * [404](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntimesByScope-404-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntimesByScope-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Object containing array of IDs of canceled downtimes. Expand All Field Type Description cancelled_ids [integer] ID of downtimes that were canceled. ``` { "cancelled_ids": [ 123456789, 123456790 ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Downtimes not found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby-legacy) ##### Cancel downtimes by scope returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime/cancel/by_scope" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "scope": "string" } EOF ``` ##### Cancel downtimes by scope returns "OK" response ``` // Cancel downtimes by scope returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "downtime" in the system DowntimeScope0 := os.Getenv("DOWNTIME_SCOPE_0") body := datadogV1.CancelDowntimesByScopeRequest{ Scope: DowntimeScope0, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.CancelDowntimesByScope(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CancelDowntimesByScope`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.CancelDowntimesByScope`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
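# The Go example above reads DOWNTIME_SCOPE_0 from the environment; export it
# to the scope of an existing downtime first, either a single scope such as
# env:staging or a comma-separated list such as env:dev,env:prod.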
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Cancel downtimes by scope returns "OK" response ``` // Cancel downtimes by scope returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.CancelDowntimesByScopeRequest; import com.datadog.api.client.v1.model.CanceledDowntimesIds; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime" in the system String DOWNTIME_SCOPE_0 = System.getenv("DOWNTIME_SCOPE_0"); CancelDowntimesByScopeRequest body = new CancelDowntimesByScopeRequest().scope(DOWNTIME_SCOPE_0); try { CanceledDowntimesIds result = apiInstance.cancelDowntimesByScope(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#cancelDowntimesByScope"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Cancel downtimes by scope returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Cancel all downtimes with scope api.Downtime.cancel_downtime_by_scope('env:testing') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Cancel downtimes by scope returns "OK" response ``` """ Cancel downtimes by scope returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi from datadog_api_client.v1.model.cancel_downtimes_by_scope_request import CancelDowntimesByScopeRequest # there is a valid "downtime" in the system DOWNTIME_SCOPE_0 = environ["DOWNTIME_SCOPE_0"] body = CancelDowntimesByScopeRequest( scope=DOWNTIME_SCOPE_0, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.cancel_downtimes_by_scope(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Cancel downtimes by scope returns "OK" response ``` require 'dogapi' api_key = '' app_key = '' dog = 
Dogapi::Client.new(api_key, app_key) # Cancel all downtimes with the given scope dog.cancel_downtime_by_scope('env:testing') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Cancel downtimes by scope returns "OK" response ``` # Cancel downtimes by scope returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new # there is a valid "downtime" in the system DOWNTIME_SCOPE_0 = ENV["DOWNTIME_SCOPE_0"] body = DatadogAPIClient::V1::CancelDowntimesByScopeRequest.new({ scope: DOWNTIME_SCOPE_0, }) p api_instance.cancel_downtimes_by_scope(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Cancel downtimes by scope returns "OK" response ``` // Cancel downtimes by scope returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::model::CancelDowntimesByScopeRequest; #[tokio::main] async fn main() { // there is a valid "downtime" in the system let downtime_scope_0 = std::env::var("DOWNTIME_SCOPE_0").unwrap(); let body = CancelDowntimesByScopeRequest::new(downtime_scope_0.clone()); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.cancel_downtimes_by_scope(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Cancel downtimes by scope returns "OK" response ``` /** * Cancel downtimes by scope returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); // there is a valid "downtime" in the system const DOWNTIME_SCOPE_0 = process.env.DOWNTIME_SCOPE_0 as string; const params: v1.DowntimesApiCancelDowntimesByScopeRequest = { body: { scope: DOWNTIME_SCOPE_0, }, }; apiInstance .cancelDowntimesByScope(params) .then((data: v1.CanceledDowntimesIds) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime-v2) DELETE https://api.ap1.datadoghq.com/api/v1/downtime/{downtime_id}https://api.ap2.datadoghq.com/api/v1/downtime/{downtime_id}https://api.datadoghq.eu/api/v1/downtime/{downtime_id}https://api.ddog-gov.com/api/v1/downtime/{downtime_id}https://api.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us3.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us5.datadoghq.com/api/v1/downtime/{downtime_id} ### Overview Cancel a downtime. **Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Path Parameters Name Type Description downtime_id [_required_] integer ID of the downtime to cancel. ### Response * [204](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-204-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-403-v1) * [404](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-404-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-429-v1) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Downtime not found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Cancel a downtime Copy ``` # Path parameters export downtime_id="123456" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime/${downtime_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Cancel a downtime ``` """ Cancel a downtime returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi # there is a valid "downtime" in the system DOWNTIME_ID = environ["DOWNTIME_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) api_instance.cancel_downtime( downtime_id=int(DOWNTIME_ID), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Cancel a downtime ``` # Cancel a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new # there is a valid "downtime" in the system DOWNTIME_ID = ENV["DOWNTIME_ID"] api_instance.cancel_downtime(DOWNTIME_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Cancel a downtime ``` // Cancel a downtime returns "OK" response package main import ( "context" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "downtime" in the system DowntimeID, _ := strconv.ParseInt(os.Getenv("DOWNTIME_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) r, err := api.CancelDowntime(ctx, DowntimeID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CancelDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
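# The Go example above reads DOWNTIME_ID from the environment; export it to the
# numeric ID of an existing (v1) downtime before running.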
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Cancel a downtime ``` // Cancel a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime" in the system Long DOWNTIME_ID = Long.parseLong(System.getenv("DOWNTIME_ID")); try { apiInstance.cancelDowntime(DOWNTIME_ID); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#cancelDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Cancel a downtime ``` // Cancel a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; #[tokio::main] async fn main() { // there is a valid "downtime" in the system let downtime_id: i64 = std::env::var("DOWNTIME_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.cancel_downtime(downtime_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Cancel a downtime ``` /** * Cancel a downtime returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); // there is a valid "downtime" in the system const DOWNTIME_ID = parseInt(process.env.DOWNTIME_ID as string); const params: v1.DowntimesApiCancelDowntimeRequest = { downtimeId: DOWNTIME_ID, }; apiInstance .cancelDowntime(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="<your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com>" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

DELETE https://api.datadoghq.com/api/v2/downtime/{downtime_id}

Use the endpoint for your Datadog site: `api.datadoghq.com`, `api.us3.datadoghq.com`, `api.us5.datadoghq.com`, `api.datadoghq.eu`, `api.ap1.datadoghq.com`, `api.ap2.datadoghq.com`, or `api.ddog-gov.com`.

### Overview

Cancel a downtime.

**Note**: Downtimes canceled through the API are no longer active, but are retained for approximately two days before being permanently removed. The downtime may still appear in search results until it is permanently removed.

This endpoint requires the `monitors_downtime` permission.

OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description
downtime_id [_required_] string ID of the downtime to cancel.

### Response

* [204](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-204-v2)
* [403](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-403-v2)
* [404](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-404-v2)
* [429](https://docs.datadoghq.com/api/latest/downtimes/#CancelDowntime-429-v2)

OK

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/downtimes/)
* [Example](https://docs.datadoghq.com/api/latest/downtimes/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Downtime not found

* [Model](https://docs.datadoghq.com/api/latest/downtimes/)
* [Example](https://docs.datadoghq.com/api/latest/downtimes/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/downtimes/)
* [Example](https://docs.datadoghq.com/api/latest/downtimes/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.
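Before the generated per-language examples that follow, here is a minimal Python sketch (not part of the generated reference) of how the 403, 404, and 429 error bodies documented above surface in the client. It assumes the exception classes exported by `datadog_api_client.exceptions` (`NotFoundException`, `ForbiddenException`, `ApiException`); adjust the names if your installed client version differs. The example error payload for the 429 response appears right after this sketch.

```
"""
Sketch: cancel a v2 downtime and handle the documented error responses.
Exception class names are assumed from datadog_api_client.exceptions.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException, ForbiddenException, NotFoundException
from datadog_api_client.v2.api.downtimes_api import DowntimesApi

# for example "00000000-0000-1234-0000-000000000000"
DOWNTIME_ID = environ["DOWNTIME_V2_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = DowntimesApi(api_client)
    try:
        # A successful cancellation returns 204 with no body.
        api.cancel_downtime(downtime_id=DOWNTIME_ID)
        print("Downtime canceled")
    except NotFoundException:
        # 404: the downtime does not exist or has already been permanently removed.
        print("Downtime not found")
    except ForbiddenException:
        # 403: the key pair lacks the monitors_downtime permission or OAuth scope.
        print("Forbidden: check your API/application keys and permissions")
    except ApiException as e:
        # Other errors (including 429 rate limiting) carry the `errors` array shown in the examples.
        print(f"API error {e.status}: {e.body}")
```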
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript)

##### Cancel a downtime

```
# Path parameters
export downtime_id="00000000-0000-1234-0000-000000000000"
# Curl command (use the endpoint for your Datadog site: api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/downtime/${downtime_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Cancel a downtime

```
"""
Cancel a downtime returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.downtimes_api import DowntimesApi

# there is a valid "downtime_v2" in the system
DOWNTIME_V2_DATA_ID = environ["DOWNTIME_V2_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = DowntimesApi(api_client)
    api_instance.cancel_downtime(
        downtime_id=DOWNTIME_V2_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="<your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com>" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Cancel a downtime

```
# Cancel a downtime returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::DowntimesAPI.new

# there is a valid "downtime_v2" in the system
DOWNTIME_V2_DATA_ID = ENV["DOWNTIME_V2_DATA_ID"]
api_instance.cancel_downtime(DOWNTIME_V2_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="<your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com>" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Cancel a downtime

```
// Cancel a downtime returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "downtime_v2" in the system
	DowntimeV2DataID := os.Getenv("DOWNTIME_V2_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewDowntimesApi(apiClient)
	r, err := api.CancelDowntime(ctx, DowntimeV2DataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.CancelDowntime`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Cancel a downtime ``` // Cancel a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime_v2" in the system String DOWNTIME_V2_DATA_ID = System.getenv("DOWNTIME_V2_DATA_ID"); try { apiInstance.cancelDowntime(DOWNTIME_V2_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#cancelDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Cancel a downtime ``` // Cancel a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; #[tokio::main] async fn main() { // there is a valid "downtime_v2" in the system let downtime_v2_data_id = std::env::var("DOWNTIME_V2_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.cancel_downtime(downtime_v2_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Cancel a downtime ``` /** * Cancel a downtime returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); // there is a valid "downtime_v2" in the system const DOWNTIME_V2_DATA_ID = process.env.DOWNTIME_V2_DATA_ID as string; const params: v2.DowntimesApiCancelDowntimeRequest = { downtimeId: DOWNTIME_V2_DATA_ID, }; apiInstance .cancelDowntime(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime-v2) GET https://api.ap1.datadoghq.com/api/v1/downtime/{downtime_id}https://api.ap2.datadoghq.com/api/v1/downtime/{downtime_id}https://api.datadoghq.eu/api/v1/downtime/{downtime_id}https://api.ddog-gov.com/api/v1/downtime/{downtime_id}https://api.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us3.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us5.datadoghq.com/api/v1/downtime/{downtime_id} ### Overview Get downtime detail by `downtime_id`. **Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Path Parameters Name Type Description downtime_id [_required_] integer ID of the downtime to fetch. ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-200-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-403-v1) * [404](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-404-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. 
The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. 
monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. 
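Before the full example payload below, here is a minimal Python sketch (not part of the generated reference) of how the recurrence fields described above (`type`, `period`, `week_days`, `until_date`/`until_occurrences`, `rrule`) map onto the official client's v1 models. The import paths for `Downtime` and `DowntimeRecurrence` are assumptions based on the client's v1 package layout; verify them against your installed version.

```
"""
Sketch: build the v1 recurrence object described in the table above.
Model import paths (datadog_api_client.v1.model.*) are assumed.
"""
from datadog_api_client.v1.model.downtime import Downtime
from datadog_api_client.v1.model.downtime_recurrence import DowntimeRecurrence

# Repeat every week on Monday and Tuesday, for two occurrences in total.
recurrence = DowntimeRecurrence(
    type="weeks",               # one of: days, weeks, months, years, rrule
    period=1,                   # repeat every <period> <type>
    week_days=["Mon", "Tue"],   # only applicable when type is "weeks"
    until_occurrences=2,        # mutually exclusive with until_date
)

# Attach the recurrence to a downtime body, together with the scope and
# message fields documented above.
body = Downtime(
    scope=["env:staging"],
    message="Message on the downtime",
    recurrence=recurrence,
)
print(body.to_dict())
```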
``` { "active": true, "active_child": { "active": true, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1626, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 }, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1625, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Downtime not found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python-legacy) ##### Get a downtime Copy ``` # Path parameters export downtime_id="123456" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime/${downtime_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a downtime ``` """ Get a downtime returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi # there is a valid "downtime" in the system DOWNTIME_ID = environ["DOWNTIME_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.get_downtime( downtime_id=int(DOWNTIME_ID), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a downtime ``` # Get a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new # there is a valid "downtime" in the system DOWNTIME_ID = ENV["DOWNTIME_ID"] p api_instance.get_downtime(DOWNTIME_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a downtime ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Get a downtime object downtime_id = 1625 dog.get_downtime(downtime_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func 
main() { // there is a valid "downtime" in the system DowntimeID, _ := strconv.ParseInt(os.Getenv("DOWNTIME_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.GetDowntime(ctx, DowntimeID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.GetDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.GetDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime" in the system Long DOWNTIME_ID = Long.parseLong(System.getenv("DOWNTIME_ID")); try { Downtime result = apiInstance.getDowntime(DOWNTIME_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#getDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a downtime ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Get a downtime api.Downtime.get(2910) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; #[tokio::main] async fn main() { // there is a valid "downtime" in the system let downtime_id: i64 = std::env::var("DOWNTIME_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.get_downtime(downtime_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="<your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com>" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get a downtime

```
/**
 * Get a downtime returns "OK" response
 */
import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.DowntimesApi(configuration);

// there is a valid "downtime" in the system
const DOWNTIME_ID = parseInt(process.env.DOWNTIME_ID as string);

const params: v1.DowntimesApiGetDowntimeRequest = {
  downtimeId: DOWNTIME_ID,
};

apiInstance
  .getDowntime(params)
  .then((data: v1.Downtime) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="<your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com>" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

GET https://api.datadoghq.com/api/v2/downtime/{downtime_id}

Use the endpoint for your Datadog site: `api.datadoghq.com`, `api.us3.datadoghq.com`, `api.us5.datadoghq.com`, `api.datadoghq.eu`, `api.ap1.datadoghq.com`, `api.ap2.datadoghq.com`, or `api.ddog-gov.com`.

### Overview

Get downtime detail by `downtime_id`.

This endpoint requires the `monitors_downtime` permission.

OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description
downtime_id [_required_] string ID of the downtime to fetch.

#### Query Strings

Name Type Description
include string Comma-separated list of resource paths for related resources to include in the response. Supported resource paths are `created_by` and `monitor` (see the Python sketch after the response models below).

### Response

* [200](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-200-v2)
* [400](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-400-v2)
* [403](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-403-v2)
* [404](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-404-v2)
* [429](https://docs.datadoghq.com/api/latest/downtimes/#GetDowntime-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/downtimes/)
* [Example](https://docs.datadoghq.com/api/latest/downtimes/)

Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags.

Field Type Description
data object Downtime data.
attributes object Downtime details.
canceled date-time Time that the downtime was canceled.
created date-time Creation time of the downtime.
display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC`
message string A message to include with notifications for this downtime.
Email notifications can be sent to specific users by using the same `@username` notation as events. modified date-time Time that the downtime was last modified. monitor_identifier Monitor identifier for the downtime. Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule The schedule that defines when the monitor starts, stops, and recurs. There are two types of schedules: one-time and recurring. Recurring schedules may have up to five RRULE-based recurrences. If no schedules are provided, the downtime will begin immediately and never end. Option 1 object A recurring downtime schedule definition. current_downtime object The most recent actual start and end dates for a recurring downtime. For a canceled downtime, this is the previously occurring downtime. For active downtimes, this is the ongoing downtime, and for scheduled downtimes it is the upcoming downtime. end date-time The end of the current downtime. start date-time The start of the current downtime. recurrences [_required_] [object] A list of downtime recurrences. duration string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule string The `RRULE` standard for defining recurring events. For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. This affects recurring start and end dates. Must match `display_timezone`. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. start [_required_] date-time ISO-8601 Datetime to start the downtime. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). status enum The current status of the downtime. Allowed enum values: `active,canceled,ended,scheduled` id string The downtime ID. relationships object All relationships associated with downtime. created_by object The user who created the downtime. data object Data for the user who created the downtime. 
id string User ID of the downtime creator. type enum Users resource type. Allowed enum values: `users` default: `users` monitor object The monitor identified by the downtime. data object Data for the monitor. id string Monitor ID of the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` type enum Downtime resource type. Allowed enum values: `downtime` default: `downtime` included [ ] Array of objects related to the downtime that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Information about the monitor identified by the downtime. attributes object Attributes of the monitor identified by the downtime. name string The name of the monitor identified by the downtime. id int64 ID of the monitor identified by the downtime. type enum Monitor resource type. 
Allowed enum values: `monitors` default: `monitors` ``` { "data": { "attributes": { "canceled": "2020-01-02T03:04:05.282979+0000", "created": "2020-01-02T03:04:05.282979+0000", "display_timezone": "America/New_York", "message": "Message about the downtime", "modified": "2020-01-02T03:04:05.282979+0000", "monitor_identifier": { "monitor_id": 123 }, "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "canceled", "expired" ], "schedule": { "current_downtime": { "end": "2020-01-02T03:04:00.000Z", "start": "2020-01-02T03:04:00.000Z" }, "recurrences": [ { "duration": "123d", "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "start": "2020-01-02T03:04" } ], "timezone": "America/New_York" }, "scope": "env:(staging OR prod) AND datacenter:us-east-1", "status": "active" }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } }, "monitor": { "data": { "id": "12345", "type": "monitors" } } }, "type": "downtime" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
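In addition to the generated examples that follow (the example error payload for the 429 response appears right after this sketch), here is a minimal Python sketch of the `include` query string described in the Overview above, which pulls the `created_by` user and the `monitor` into the top-level `included` array. It assumes the v2 Python client exposes `include` as an optional keyword argument on `get_downtime`, mirroring the optional parameters used in the Go example below; verify against your installed client version.

```
"""
Sketch: fetch a v2 downtime together with its related resources.
The `include` keyword argument is assumed to be supported by get_downtime().
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.downtimes_api import DowntimesApi

DOWNTIME_ID = environ["DOWNTIME_V2_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = DowntimesApi(api_client)
    # Supported resource paths for `include` are `created_by` and `monitor`.
    response = api.get_downtime(downtime_id=DOWNTIME_ID, include="created_by,monitor")

    # Core downtime fields live under data.attributes.
    print(response.data.attributes.status)
    # Related objects requested via `include` arrive in the `included` array.
    for item in getattr(response, "included", None) or []:
        print(item.to_dict())
```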
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Get a downtime Copy ``` # Path parameters export downtime_id="00000000-0000-1234-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/downtime/${downtime_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a downtime ``` """ Get a downtime returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.downtimes_api import DowntimesApi # there is a valid "downtime_v2" in the system DOWNTIME_V2_DATA_ID = environ["DOWNTIME_V2_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.get_downtime( downtime_id=DOWNTIME_V2_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a downtime ``` # Get a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DowntimesAPI.new # there is a valid "downtime_v2" in the system DOWNTIME_V2_DATA_ID = ENV["DOWNTIME_V2_DATA_ID"] p api_instance.get_downtime(DOWNTIME_V2_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "downtime_v2" in the system DowntimeV2DataID := os.Getenv("DOWNTIME_V2_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDowntimesApi(apiClient) resp, r, err := api.GetDowntime(ctx, DowntimeV2DataID, *datadogV2.NewGetDowntimeOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.GetDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`DowntimesApi.GetDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; import com.datadog.api.client.v2.model.DowntimeResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime_v2" in the system String DOWNTIME_V2_DATA_ID = System.getenv("DOWNTIME_V2_DATA_ID"); try { DowntimeResponse result = apiInstance.getDowntime(DOWNTIME_V2_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#getDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a downtime ``` // Get a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV2::api_downtimes::GetDowntimeOptionalParams; #[tokio::main] async fn main() { // there is a valid "downtime_v2" in the system let downtime_v2_data_id = std::env::var("DOWNTIME_V2_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api .get_downtime( downtime_v2_data_id.clone(), GetDowntimeOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a downtime ``` /** * Get a downtime returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); // there is a valid "downtime_v2" in the system const DOWNTIME_V2_DATA_ID = process.env.DOWNTIME_V2_DATA_ID as string; const params: v2.DowntimesApiGetDowntimeRequest = { downtimeId: DOWNTIME_V2_DATA_ID, }; apiInstance .getDowntime(params) .then((data: v2.DowntimeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime-v2) PUT https://api.ap1.datadoghq.com/api/v1/downtime/{downtime_id}https://api.ap2.datadoghq.com/api/v1/downtime/{downtime_id}https://api.datadoghq.eu/api/v1/downtime/{downtime_id}https://api.ddog-gov.com/api/v1/downtime/{downtime_id}https://api.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us3.datadoghq.com/api/v1/downtime/{downtime_id}https://api.us5.datadoghq.com/api/v1/downtime/{downtime_id} ### Overview Update a single downtime by `downtime_id`. **Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Path Parameters Name Type Description downtime_id [_required_] integer ID of the downtime to update. ### Request #### Body Data (required) Update a downtime request body. * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. 
default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. 
mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. ``` { "message": "Example-Downtime-updated", "mute_first_recovery_notification": true, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-200-v1) * [400](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-400-v1) * [403](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-403-v1) * [404](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-404-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. 
Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. 
First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. 
`until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. ``` { "active": true, "active_child": { "active": true, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1626, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 }, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1625, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Downtime not found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Update a downtime returns "OK" response Copy ``` # Path parameters export downtime_id="123456" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/downtime/${downtime_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "Example-Downtime-updated", "mute_first_recovery_notification": true, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ] } EOF ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "downtime" in the system DowntimeID, _ := strconv.ParseInt(os.Getenv("DOWNTIME_ID"), 10, 64) body := datadogV1.Downtime{ Message: *datadog.NewNullableString(datadog.PtrString("Example-Downtime-updated")), MuteFirstRecoveryNotification: datadog.PtrBool(true), NotifyEndStates: []datadogV1.NotifyEndState{ datadogV1.NOTIFYENDSTATE_ALERT, datadogV1.NOTIFYENDSTATE_NO_DATA, datadogV1.NOTIFYENDSTATE_WARN, }, NotifyEndTypes: []datadogV1.NotifyEndType{ datadogV1.NOTIFYENDTYPE_CANCELED, datadogV1.NOTIFYENDTYPE_EXPIRED, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.UpdateDowntime(ctx, DowntimeID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.UpdateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.UpdateDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; import com.datadog.api.client.v1.model.NotifyEndState; import com.datadog.api.client.v1.model.NotifyEndType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient 
defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime" in the system Long DOWNTIME_ID = Long.parseLong(System.getenv("DOWNTIME_ID")); Downtime body = new Downtime() .message("Example-Downtime-updated") .muteFirstRecoveryNotification(true) .notifyEndStates( Arrays.asList(NotifyEndState.ALERT, NotifyEndState.NO_DATA, NotifyEndState.WARN)) .notifyEndTypes(Arrays.asList(NotifyEndType.CANCELED, NotifyEndType.EXPIRED)); try { Downtime result = apiInstance.updateDowntime(DOWNTIME_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#updateDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a downtime returns "OK" response ``` """ Update a downtime returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi from datadog_api_client.v1.model.downtime import Downtime from datadog_api_client.v1.model.notify_end_state import NotifyEndState from datadog_api_client.v1.model.notify_end_type import NotifyEndType # there is a valid "downtime" in the system DOWNTIME_ID = environ["DOWNTIME_ID"] body = Downtime( message="Example-Downtime-updated", mute_first_recovery_notification=True, notify_end_states=[ NotifyEndState.ALERT, NotifyEndState.NO_DATA, NotifyEndState.WARN, ], notify_end_types=[ NotifyEndType.CANCELED, NotifyEndType.EXPIRED, ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.update_downtime(downtime_id=int(DOWNTIME_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a downtime returns "OK" response ``` # Update a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new # there is a valid "downtime" in the system DOWNTIME_ID = ENV["DOWNTIME_ID"] body = DatadogAPIClient::V1::Downtime.new({ message: "Example-Downtime-updated", mute_first_recovery_notification: true, notify_end_states: [ DatadogAPIClient::V1::NotifyEndState::ALERT, DatadogAPIClient::V1::NotifyEndState::NO_DATA, DatadogAPIClient::V1::NotifyEndState::WARN, ], notify_end_types: [ DatadogAPIClient::V1::NotifyEndType::CANCELED, DatadogAPIClient::V1::NotifyEndType::EXPIRED, ], }) p api_instance.update_downtime(DOWNTIME_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV1::model::Downtime; use datadog_api_client::datadogV1::model::NotifyEndState; use datadog_api_client::datadogV1::model::NotifyEndType; #[tokio::main] async fn main() { // there is a valid "downtime" in the system let downtime_id: i64 = std::env::var("DOWNTIME_ID").unwrap().parse().unwrap(); let body = Downtime::new() .message(Some("Example-Downtime-updated".to_string())) .mute_first_recovery_notification(true) .notify_end_states(vec![ NotifyEndState::ALERT, NotifyEndState::NO_DATA, NotifyEndState::WARN, ]) .notify_end_types(vec![NotifyEndType::CANCELED, NotifyEndType::EXPIRED]); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.update_downtime(downtime_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a downtime returns "OK" response ``` /** * Update a downtime returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); // there is a valid "downtime" in the system const DOWNTIME_ID = parseInt(process.env.DOWNTIME_ID as string); const params: v1.DowntimesApiUpdateDowntimeRequest = { body: { message: "Example-Downtime-updated", muteFirstRecoveryNotification: true, notifyEndStates: ["alert", "no data", "warn"], notifyEndTypes: ["canceled", "expired"], }, downtimeId: DOWNTIME_ID, }; apiInstance .updateDowntime(params) .then((data: v1.Downtime) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/downtime/{downtime_id}https://api.ap2.datadoghq.com/api/v2/downtime/{downtime_id}https://api.datadoghq.eu/api/v2/downtime/{downtime_id}https://api.ddog-gov.com/api/v2/downtime/{downtime_id}https://api.datadoghq.com/api/v2/downtime/{downtime_id}https://api.us3.datadoghq.com/api/v2/downtime/{downtime_id}https://api.us5.datadoghq.com/api/v2/downtime/{downtime_id} ### Overview Update a downtime by `downtime_id`. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. 
### Arguments #### Path Parameters Name Type Description downtime_id [_required_] string ID of the downtime to update. ### Request #### Body Data (required) Update a downtime request body. * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description data [_required_] object Object to update a downtime. attributes [_required_] object Attributes of the downtime to update. display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC` message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_identifier Monitor identifier for the downtime. Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule Schedule for the downtime. Option 1 object A recurring downtime schedule definition. recurrences [object] A list of downtime recurrences. duration [_required_] string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule [_required_] string The `RRULE` standard for defining recurring events. For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. Must include a UTC offset of zero. If not provided, the downtime continues forever. start date-time ISO-8601 Datetime to start the downtime. Must include a UTC offset of zero. If not provided, the downtime starts the moment it is created. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). id [_required_] string ID of this downtime. type [_required_] enum Downtime resource type. 
Allowed enum values: `downtime` default: `downtime` ``` { "data": { "attributes": { "message": "light speed" }, "id": "00000000-0000-1234-0000-000000000000", "type": "downtime" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-200-v2) * [400](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-400-v2) * [403](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-403-v2) * [404](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-404-v2) * [429](https://docs.datadoghq.com/api/latest/downtimes/#UpdateDowntime-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Downtiming gives you greater control over monitor notifications by allowing you to globally exclude scopes from alerting. Downtime settings, which can be scheduled with start and end times, prevent all alerting related to specified Datadog tags. Field Type Description data object Downtime data. attributes object Downtime details. canceled date-time Time that the downtime was canceled. created date-time Creation time of the downtime. display_timezone string The timezone in which to display the downtime's start and end times in Datadog applications. This is not used as an offset for scheduling. default: `UTC` message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. modified date-time Time that the downtime was last modified. monitor_identifier Monitor identifier for the downtime. Option 1 object Object of the monitor identifier. monitor_id [_required_] int64 ID of the monitor to prevent notifications. Option 2 object Object of the monitor tags. monitor_tags [_required_] [string] A list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match **all** provided monitor tags. Setting `monitor_tags` to `[*]` configures the downtime to mute all monitors for the given scope. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States that will trigger a monitor notification when the `notify_end_types` action occurs. notify_end_types [string] Actions that will trigger a monitor notification if the downtime is in the `notify_end_types` state. schedule The schedule that defines when the monitor starts, stops, and recurs. There are two types of schedules: one-time and recurring. Recurring schedules may have up to five RRULE-based recurrences. If no schedules are provided, the downtime will begin immediately and never end. Option 1 object A recurring downtime schedule definition. current_downtime object The most recent actual start and end dates for a recurring downtime. For a canceled downtime, this is the previously occurring downtime. For active downtimes, this is the ongoing downtime, and for scheduled downtimes it is the upcoming downtime. end date-time The end of the current downtime. start date-time The start of the current downtime. recurrences [_required_] [object] A list of downtime recurrences. duration string The length of the downtime. Must begin with an integer and end with one of 'm', 'h', d', or 'w'. rrule string The `RRULE` standard for defining recurring events. 
For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api). start string ISO-8601 Datetime to start the downtime. Must not include a UTC offset. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to schedule the downtime. This affects recurring start and end dates. Must match `display_timezone`. default: `UTC` Option 2 object A one-time downtime definition. end date-time ISO-8601 Datetime to end the downtime. start [_required_] date-time ISO-8601 Datetime to start the downtime. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). status enum The current status of the downtime. Allowed enum values: `active,canceled,ended,scheduled` id string The downtime ID. relationships object All relationships associated with downtime. created_by object The user who created the downtime. data object Data for the user who created the downtime. id string User ID of the downtime creator. type enum Users resource type. Allowed enum values: `users` default: `users` monitor object The monitor identified by the downtime. data object Data for the monitor. id string Monitor ID of the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` type enum Downtime resource type. Allowed enum values: `downtime` default: `downtime` included [ ] Array of objects related to the downtime that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. 
data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Information about the monitor identified by the downtime. attributes object Attributes of the monitor identified by the downtime. name string The name of the monitor identified by the downtime. id int64 ID of the monitor identified by the downtime. type enum Monitor resource type. Allowed enum values: `monitors` default: `monitors` ``` { "data": { "attributes": { "canceled": "2020-01-02T03:04:05.282979+0000", "created": "2020-01-02T03:04:05.282979+0000", "display_timezone": "America/New_York", "message": "Message about the downtime", "modified": "2020-01-02T03:04:05.282979+0000", "monitor_identifier": { "monitor_id": 123 }, "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "warn" ], "notify_end_types": [ "canceled", "expired" ], "schedule": { "current_downtime": { "end": "2020-01-02T03:04:00.000Z", "start": "2020-01-02T03:04:00.000Z" }, "recurrences": [ { "duration": "123d", "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "start": "2020-01-02T03:04" } ], "timezone": "America/New_York" }, "scope": "env:(staging OR prod) AND datacenter:us-east-1", "status": "active" }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } }, "monitor": { "data": { "id": "12345", "type": "monitors" } } }, "type": "downtime" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Downtime not found * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Update a downtime returns "OK" response Copy ``` # Path parameters export downtime_id="00e000000-0000-1234-0000-000000000000" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/downtime/${downtime_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "message": "light speed" }, "id": "00000000-0000-1234-0000-000000000000", "type": "downtime" } } EOF ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "downtime_v2" in the system DowntimeV2DataID := os.Getenv("DOWNTIME_V2_DATA_ID") body := datadogV2.DowntimeUpdateRequest{ Data: datadogV2.DowntimeUpdateRequestData{ Attributes: datadogV2.DowntimeUpdateRequestAttributes{ Message: *datadog.NewNullableString(datadog.PtrString("light speed")), }, Id: DowntimeV2DataID, Type: datadogV2.DOWNTIMERESOURCETYPE_DOWNTIME, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDowntimesApi(apiClient) resp, r, err := api.UpdateDowntime(ctx, DowntimeV2DataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.UpdateDowntime`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.UpdateDowntime`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; import com.datadog.api.client.v2.model.DowntimeResourceType; import com.datadog.api.client.v2.model.DowntimeResponse; import com.datadog.api.client.v2.model.DowntimeUpdateRequest; import com.datadog.api.client.v2.model.DowntimeUpdateRequestAttributes; import com.datadog.api.client.v2.model.DowntimeUpdateRequestData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); 
DowntimesApi apiInstance = new DowntimesApi(defaultClient); // there is a valid "downtime_v2" in the system String DOWNTIME_V2_DATA_ID = System.getenv("DOWNTIME_V2_DATA_ID"); DowntimeUpdateRequest body = new DowntimeUpdateRequest() .data( new DowntimeUpdateRequestData() .attributes(new DowntimeUpdateRequestAttributes().message("light speed")) .id(DOWNTIME_V2_DATA_ID) .type(DowntimeResourceType.DOWNTIME)); try { DowntimeResponse result = apiInstance.updateDowntime(DOWNTIME_V2_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#updateDowntime"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a downtime returns "OK" response ``` """ Update a downtime returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.downtimes_api import DowntimesApi from datadog_api_client.v2.model.downtime_resource_type import DowntimeResourceType from datadog_api_client.v2.model.downtime_update_request import DowntimeUpdateRequest from datadog_api_client.v2.model.downtime_update_request_attributes import DowntimeUpdateRequestAttributes from datadog_api_client.v2.model.downtime_update_request_data import DowntimeUpdateRequestData # there is a valid "downtime_v2" in the system DOWNTIME_V2_DATA_ID = environ["DOWNTIME_V2_DATA_ID"] body = DowntimeUpdateRequest( data=DowntimeUpdateRequestData( attributes=DowntimeUpdateRequestAttributes( message="light speed", ), id=DOWNTIME_V2_DATA_ID, type=DowntimeResourceType.DOWNTIME, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.update_downtime(downtime_id=DOWNTIME_V2_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a downtime returns "OK" response ``` # Update a downtime returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DowntimesAPI.new # there is a valid "downtime_v2" in the system DOWNTIME_V2_DATA_ID = ENV["DOWNTIME_V2_DATA_ID"] body = DatadogAPIClient::V2::DowntimeUpdateRequest.new({ data: DatadogAPIClient::V2::DowntimeUpdateRequestData.new({ attributes: DatadogAPIClient::V2::DowntimeUpdateRequestAttributes.new({ message: "light speed", }), id: DOWNTIME_V2_DATA_ID, type: DatadogAPIClient::V2::DowntimeResourceType::DOWNTIME, }), }) p api_instance.update_downtime(DOWNTIME_V2_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a downtime returns "OK" response ``` // Update a downtime returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV2::model::DowntimeResourceType; use datadog_api_client::datadogV2::model::DowntimeUpdateRequest; use datadog_api_client::datadogV2::model::DowntimeUpdateRequestAttributes; use datadog_api_client::datadogV2::model::DowntimeUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "downtime_v2" in the system let downtime_v2_data_id = std::env::var("DOWNTIME_V2_DATA_ID").unwrap(); let body = DowntimeUpdateRequest::new(DowntimeUpdateRequestData::new( DowntimeUpdateRequestAttributes::new().message(Some("light speed".to_string())), downtime_v2_data_id.clone(), DowntimeResourceType::DOWNTIME, )); let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.update_downtime(downtime_v2_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a downtime returns "OK" response ``` /** * Update a downtime returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); // there is a valid "downtime_v2" in the system const DOWNTIME_V2_DATA_ID = process.env.DOWNTIME_V2_DATA_ID as string; const params: v2.DowntimesApiUpdateDowntimeRequest = { body: { data: { attributes: { message: "light speed", }, id: DOWNTIME_V2_DATA_ID, type: "downtime", }, }, downtimeId: DOWNTIME_V2_DATA_ID, }; apiInstance .updateDowntime(params) .then((data: v2.DowntimeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) * [v1](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor-v2) GET https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}/downtimeshttps://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}/downtimeshttps://api.datadoghq.eu/api/v1/monitor/{monitor_id}/downtimeshttps://api.ddog-gov.com/api/v1/monitor/{monitor_id}/downtimeshttps://api.datadoghq.com/api/v1/monitor/{monitor_id}/downtimeshttps://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}/downtimeshttps://api.us5.datadoghq.com/api/v1/monitor/{monitor_id}/downtimes ### Overview Get all active v1 downtimes for the specified monitor. **Note:** This endpoint has been deprecated. Please use v2 endpoints. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The id of the monitor ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-200-v1) * [400](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-400-v1) * [404](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-404-v1) * [429](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Field Type Description active boolean If a scheduled downtime currently exists. active_child object The downtime object definition of the active child for the original parent recurring downtime. This field will only exist on recurring downtimes. active boolean If a scheduled downtime currently exists. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. 
The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. canceled int64 If a scheduled downtime is canceled. creator_id int32 User ID of the downtime creator. disabled boolean If a downtime has been disabled. downtime_type int32 `0` for a downtime applied on `*` or all, `1` when the downtime is only scoped to hosts, or `2` when the downtime is scoped to anything but hosts. end int64 POSIX timestamp to end the downtime. If not provided, the downtime is in effect indefinitely until you cancel it. id int64 The downtime ID. message string A message to include with notifications for this downtime. Email notifications can be sent to specific users by using the same `@username` notation as events. monitor_id int64 A single monitor to which the downtime applies. If not provided, the downtime applies to all monitors. 
monitor_tags [string] A comma-separated list of monitor tags. For example, tags that are applied directly to monitors, not tags that are used in monitor queries (which are filtered by the scope parameter), to which the downtime applies. The resulting downtime applies to monitors that match ALL provided monitor tags. For example, `service:postgres` **AND** `team:frontend`. mute_first_recovery_notification boolean If the first recovery notification during a downtime should be muted. notify_end_states [string] States for which `notify_end_types` sends out notifications for. default: `alert,no data,warn` notify_end_types [string] If set, notifies if a monitor is in an alert-worthy state (`ALERT`, `WARNING`, or `NO DATA`) when this downtime expires or is canceled. Applied to monitors that change states during the downtime (such as from `OK` to `ALERT`, `WARNING`, or `NO DATA`), and to monitors that already have an alert-worthy state when downtime begins. default: `expired` parent_id int64 ID of the parent Downtime. recurrence object An object defining the recurrence of the downtime. period int32 How often to repeat as an integer. For example, to repeat every 3 days, select a type of `days` and a period of `3`. rrule string The `RRULE` standard for defining recurring events (**requires to set "type" to rrule**) For example, to have a recurring event on the first day of each month, set the type to `rrule` and set the `FREQ` to `MONTHLY` and `BYMONTHDAY` to `1`. Most common `rrule` options from the [iCalendar Spec](https://tools.ietf.org/html/rfc5545) are supported. **Note** : Attributes specifying the duration in `RRULE` are not supported (for example, `DTSTART`, `DTEND`, `DURATION`). More examples available in this [downtime guide](https://docs.datadoghq.com/monitors/guide/suppress-alert-with-downtimes/?tab=api) type string The type of recurrence. Choose from `days`, `weeks`, `months`, `years`, `rrule`. until_date int64 The date at which the recurrence should end as a POSIX timestamp. `until_occurences` and `until_date` are mutually exclusive. until_occurrences int32 How many times the downtime is rescheduled. `until_occurences` and `until_date` are mutually exclusive. week_days [string] A list of week days to repeat on. Choose from `Mon`, `Tue`, `Wed`, `Thu`, `Fri`, `Sat` or `Sun`. Only applicable when type is weeks. First letter must be capitalized. scope [string] The scope(s) to which the downtime applies and must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. If not provided, the downtime starts the moment it is created. timezone string The timezone in which to display the downtime's start and end times in Datadog applications. updater_id int32 ID of the last user that updated the downtime. 
``` { "active": true, "active_child": { "active": true, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1626, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 }, "canceled": 1412799983, "creator_id": 123456, "disabled": false, "downtime_type": 2, "end": 1412793983, "id": 1625, "message": "Message on the downtime", "monitor_id": 123456, "monitor_tags": [ "*" ], "mute_first_recovery_notification": false, "notify_end_states": [ "alert", "no data", "warn" ], "notify_end_types": [ "canceled", "expired" ], "parent_id": 123, "recurrence": { "period": 1, "rrule": "FREQ=MONTHLY;BYSETPOS=3;BYDAY=WE;INTERVAL=1", "type": "weeks", "until_date": 1447786293, "until_occurrences": 2, "week_days": [ "Mon", "Tue" ] }, "scope": [ "env:staging" ], "start": 1412792983, "timezone": "America/New_York", "updater_id": 123456 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Monitor Not Found error * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Get active downtimes for a monitor Copy ``` # Path parameters export monitor_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/${monitor_id}/downtimes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get active downtimes for a monitor ``` """ Get active downtimes for a monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.downtimes_api import DowntimesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.list_monitor_downtimes( monitor_id=9223372036854775807, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get active downtimes for a monitor ``` # Get active downtimes for a monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::DowntimesAPI.new p api_instance.list_monitor_downtimes(9223372036854775807) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewDowntimesApi(apiClient) resp, r, err := api.ListMonitorDowntimes(ctx, 9223372036854775807) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.ListMonitorDowntimes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.ListMonitorDowntimes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.DowntimesApi; import com.datadog.api.client.v1.model.Downtime; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); try { List result = apiInstance.listMonitorDowntimes(9223372036854775807L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#listMonitorDowntimes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_downtimes::DowntimesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api.list_monitor_downtimes(9223372036854775807).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get active downtimes for a monitor ``` /** * Get active downtimes for a monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.DowntimesApi(configuration); const params: v1.DowntimesApiListMonitorDowntimesRequest = { monitorId: 9223372036854775807, }; apiInstance .listMonitorDowntimes(params) .then((data: v1.Downtime[]) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.ap2.datadoghq.com/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.datadoghq.eu/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.ddog-gov.com/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.datadoghq.com/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.us3.datadoghq.com/api/v2/monitor/{monitor_id}/downtime_matcheshttps://api.us5.datadoghq.com/api/v2/monitor/{monitor_id}/downtime_matches ### Overview Get all active downtimes for the specified monitor. This endpoint requires the `monitors_downtime` permission. OAuth apps require the `monitors_downtime` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#downtimes) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The id of the monitor. #### Query Strings Name Type Description page[offset] integer Specific offset to use as the beginning of the returned page. page[limit] integer Maximum number of downtimes in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-200-v2) * [404](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-404-v2) * [429](https://docs.datadoghq.com/api/latest/downtimes/#ListMonitorDowntimes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) Response for retrieving all downtime matches for a monitor. Field Type Description data [object] An array of downtime matches. attributes object Downtime match details. end date-time The end of the downtime. groups [string] An array of groups associated with the downtime. scope string The scope to which the downtime applies. Must follow the [common search syntax](https://docs.datadoghq.com/logs/explorer/search_syntax/). start date-time The start of the downtime. id string The downtime ID. type enum Monitor Downtime Match resource type. Allowed enum values: `downtime_match` default: `downtime_match` meta object Pagination metadata returned by the API. page object Object containing the total filtered count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "end": "2020-01-02T03:04:00.000Z", "groups": [ "service:postgres", "team:frontend" ], "scope": "env:(staging OR prod) AND datacenter:us-east-1", "start": "2020-01-02T03:04:00.000Z" }, "id": "00000000-0000-1234-0000-000000000000", "type": "downtime_match" } ], "meta": { "page": { "total_filtered_count": "integer" } } } ``` Copy Monitor Not Found error * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/downtimes/) * [Example](https://docs.datadoghq.com/api/latest/downtimes/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/downtimes/?code-lang=typescript) ##### Get active downtimes for a monitor Copy ``` # Path parameters export monitor_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/monitor/${monitor_id}/downtime_matches" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get active downtimes for a monitor ``` """ Get active downtimes for a monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.downtimes_api import DowntimesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = DowntimesApi(api_client) response = api_instance.list_monitor_downtimes( monitor_id=35534610, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get active downtimes for a monitor ``` # Get active downtimes for a monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::DowntimesAPI.new p api_instance.list_monitor_downtimes(35534610) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewDowntimesApi(apiClient) resp, r, err := api.ListMonitorDowntimes(ctx, 35534610, *datadogV2.NewListMonitorDowntimesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `DowntimesApi.ListMonitorDowntimes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `DowntimesApi.ListMonitorDowntimes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.DowntimesApi; import com.datadog.api.client.v2.model.MonitorDowntimeMatchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); DowntimesApi apiInstance = new DowntimesApi(defaultClient); try { MonitorDowntimeMatchResponse result = apiInstance.listMonitorDowntimes(35534610L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling DowntimesApi#listMonitorDowntimes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get active downtimes for a monitor ``` // Get active downtimes for a monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_downtimes::DowntimesAPI; use datadog_api_client::datadogV2::api_downtimes::ListMonitorDowntimesOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = DowntimesAPI::with_config(configuration); let resp = api .list_monitor_downtimes(35534610, ListMonitorDowntimesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get active downtimes for a monitor ``` /** * Get active downtimes for a monitor returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.DowntimesApi(configuration); const params: v2.DowntimesApiListMonitorDowntimesRequest = { monitorId: 35534610, }; apiInstance .listMonitorDowntimes(params) .then((data: v2.MonitorDowntimeMatchResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
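The SDK examples above call this endpoint without pagination. As a minimal sketch of the documented `page[offset]` and `page[limit]` query strings (plain HTTP with the `requests` library rather than an official client example; the monitor ID and page size are placeholders):

```
# Sketch: page through /api/v2/monitor/{monitor_id}/downtime_matches using
# the documented page[offset]/page[limit] query strings.
import os
import requests

MONITOR_ID = 35534610                    # placeholder, as in the examples above
BASE_URL = "https://api.datadoghq.com"   # adjust for your Datadog site
HEADERS = {
    "Accept": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

offset, limit, matches = 0, 10, []
while True:
    resp = requests.get(
        f"{BASE_URL}/api/v2/monitor/{MONITOR_ID}/downtime_matches",
        headers=HEADERS,
        params={"page[offset]": offset, "page[limit]": limit},
    )
    resp.raise_for_status()
    page = resp.json().get("data", [])
    matches.extend(page)
    if len(page) < limit:  # a short page means there is nothing left to fetch
        break
    offset += limit

print(f"{len(matches)} active downtime matches for monitor {MONITOR_ID}")
```

The `meta.page.total_filtered_count` field in the response can be used instead of the short-page check when an exact total is needed.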
* * * --- # Source: https://docs.datadoghq.com/api/latest/embeddable-graphs/ # Embeddable Graphs Manage embeddable graphs through the API. See [Embeddable Graphs with Template Variables](https://docs.datadoghq.com/dashboards/guide/embeddable-graphs-with-template-variables/) for more information. ## [Revoke embed](https://docs.datadoghq.com/api/latest/embeddable-graphs/#revoke-embed) * [v1 (latest)](https://docs.datadoghq.com/api/latest/embeddable-graphs/#revoke-embed-v1) GET https://api.ap1.datadoghq.com/api/v1/graph/embed/{embed_id}/revokehttps://api.ap2.datadoghq.com/api/v1/graph/embed/{embed_id}/revokehttps://api.datadoghq.eu/api/v1/graph/embed/{embed_id}/revokehttps://api.ddog-gov.com/api/v1/graph/embed/{embed_id}/revokehttps://api.datadoghq.com/api/v1/graph/embed/{embed_id}/revokehttps://api.us3.datadoghq.com/api/v1/graph/embed/{embed_id}/revokehttps://api.us5.datadoghq.com/api/v1/graph/embed/{embed_id}/revoke ### Overview Revoke a specified embed. This endpoint requires the `embeddable_graphs_share` permission. ### Arguments #### Path Parameters Name Type Description embed_id [_required_] string ID of the embed. ### Response * [200](https://docs.datadoghq.com/api/latest/embeddable-graphs/#RevokeEmbeddableGraph-200-v1) * [403](https://docs.datadoghq.com/api/latest/embeddable-graphs/#RevokeEmbeddableGraph-403-v1) * [404](https://docs.datadoghq.com/api/latest/embeddable-graphs/#RevokeEmbeddableGraph-404-v1) * [429](https://docs.datadoghq.com/api/latest/embeddable-graphs/#RevokeEmbeddableGraph-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) A JSON object containing the success message Expand All Field Type Description success string Message. ``` { "success": "Embed 00000000000 successfully enabled." } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/embeddable-graphs/?code-lang=curl) ##### Revoke embed Copy ``` # Path parameters export embed_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/graph/embed/${embed_id}/revoke" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Enable embed](https://docs.datadoghq.com/api/latest/embeddable-graphs/#enable-embed) * [v1 (latest)](https://docs.datadoghq.com/api/latest/embeddable-graphs/#enable-embed-v1) GET https://api.ap1.datadoghq.com/api/v1/graph/embed/{embed_id}/enablehttps://api.ap2.datadoghq.com/api/v1/graph/embed/{embed_id}/enablehttps://api.datadoghq.eu/api/v1/graph/embed/{embed_id}/enablehttps://api.ddog-gov.com/api/v1/graph/embed/{embed_id}/enablehttps://api.datadoghq.com/api/v1/graph/embed/{embed_id}/enablehttps://api.us3.datadoghq.com/api/v1/graph/embed/{embed_id}/enablehttps://api.us5.datadoghq.com/api/v1/graph/embed/{embed_id}/enable ### Overview Enable a specified embed. This endpoint requires the `embeddable_graphs_share` permission. ### Arguments #### Path Parameters Name Type Description embed_id [_required_] string ID of the embed. ### Response * [200](https://docs.datadoghq.com/api/latest/embeddable-graphs/#EnableEmbeddableGraph-200-v1) * [403](https://docs.datadoghq.com/api/latest/embeddable-graphs/#EnableEmbeddableGraph-403-v1) * [404](https://docs.datadoghq.com/api/latest/embeddable-graphs/#EnableEmbeddableGraph-404-v1) * [429](https://docs.datadoghq.com/api/latest/embeddable-graphs/#EnableEmbeddableGraph-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) A JSON object containing the success message Expand All Field Type Description success string Message. ``` { "success": "Embed 00000000000 successfully enabled." } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/embeddable-graphs/?code-lang=curl) ##### Enable embed Copy ``` # Path parameters export embed_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/graph/embed/${embed_id}/enable" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Get specific embed](https://docs.datadoghq.com/api/latest/embeddable-graphs/#get-specific-embed) * [v1 (latest)](https://docs.datadoghq.com/api/latest/embeddable-graphs/#get-specific-embed-v1) GET https://api.ap1.datadoghq.com/api/v1/graph/embed/{embed_id}https://api.ap2.datadoghq.com/api/v1/graph/embed/{embed_id}https://api.datadoghq.eu/api/v1/graph/embed/{embed_id}https://api.ddog-gov.com/api/v1/graph/embed/{embed_id}https://api.datadoghq.com/api/v1/graph/embed/{embed_id}https://api.us3.datadoghq.com/api/v1/graph/embed/{embed_id}https://api.us5.datadoghq.com/api/v1/graph/embed/{embed_id} ### Overview Get the HTML fragment for a previously generated embed with `embed_id`. ### Arguments #### Path Parameters Name Type Description embed_id [_required_] string Token of the embed. ### Response * [200](https://docs.datadoghq.com/api/latest/embeddable-graphs/#GetEmbeddableGraph-200-v1) * [403](https://docs.datadoghq.com/api/latest/embeddable-graphs/#GetEmbeddableGraph-403-v1) * [404](https://docs.datadoghq.com/api/latest/embeddable-graphs/#GetEmbeddableGraph-404-v1) * [429](https://docs.datadoghq.com/api/latest/embeddable-graphs/#GetEmbeddableGraph-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Embeddable graph. Expand All Field Type Description dash_name string Name of the dashboard the graph is on (null if none). dash_url string URL of the dashboard the graph is on (null if none). embed_id string ID of the embed. graph_title string Title of the graph. html string HTML fragment for the embed (iframe). revoked boolean Boolean flag for whether or not the embed is revoked. shared_by int64 ID of the user who shared the embed. ``` { "dash_name": "string", "dash_url": "string", "embed_id": "string", "graph_title": "string", "html": "string", "revoked": false, "shared_by": "integer" } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/embeddable-graphs/?code-lang=curl) ##### Get specific embed Copy ``` # Path parameters export embed_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/graph/embed/${embed_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ```
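The `html` field returned by this endpoint is a ready-made iframe fragment. As a hedged sketch (plain HTTP with the `requests` library, not an official client example; the embed ID and output file are placeholders), the fragment can be fetched and dropped into a static page:

```
# Sketch: fetch a previously generated embed and write its iframe fragment
# into a local HTML file. Adjust the host for your Datadog site.
import os
import requests

EMBED_ID = "CHANGE_ME"  # token of the embed, as in the curl example above

resp = requests.get(
    f"https://api.datadoghq.com/api/v1/graph/embed/{EMBED_ID}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()
embed = resp.json()

# embed["html"] holds the documented iframe fragment; embed["revoked"]
# indicates whether the embed can still be rendered.
with open("embed.html", "w") as f:
    f.write(f"<h1>{embed['graph_title']}</h1>\n{embed['html']}\n")
print("Wrote embed.html")
```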
* * * ## [Create embed](https://docs.datadoghq.com/api/latest/embeddable-graphs/#create-embed) * [v1 (latest)](https://docs.datadoghq.com/api/latest/embeddable-graphs/#create-embed-v1) POST https://api.ap1.datadoghq.com/api/v1/graph/embedhttps://api.ap2.datadoghq.com/api/v1/graph/embedhttps://api.datadoghq.eu/api/v1/graph/embedhttps://api.ddog-gov.com/api/v1/graph/embedhttps://api.datadoghq.com/api/v1/graph/embedhttps://api.us3.datadoghq.com/api/v1/graph/embedhttps://api.us5.datadoghq.com/api/v1/graph/embed ### Overview Creates a new embeddable graph. Note: If an embed already exists for the exact same query in a given organization, the older embed is returned instead of creating a new embed. If you are interested in using template variables, see [Embeddable Graphs with Template Variables](https://docs.datadoghq.com/dashboards/faq/embeddable-graphs-with-template-variables). This endpoint requires the `embeddable_graphs_share` permission. ### Request #### Body Data (required) Embeddable Graph body * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Expand All Field Type Description graph_json [_required_] string The graph definition in JSON. Same format that is available on the JSON tab of the graph editor. legend enum The flag determining if the graph includes a legend. Allowed enum values: `yes,no` default: `no` size enum The size of the graph. Allowed enum values: `small,medium,large,xlarge` default: `medium` timeframe enum The timeframe for the graph. Allowed enum values: `1_hour,4_hours,1_day,2_days,1_week` default: `1_hour` title string Determines graph title. Must be at least 1 character. default: `Embed created through API` ``` { "graph_json": "", "legend": "string", "size": "string", "timeframe": "string", "title": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/embeddable-graphs/#CreateEmbeddableGraph-200-v1) * [400](https://docs.datadoghq.com/api/latest/embeddable-graphs/#CreateEmbeddableGraph-400-v1) * [403](https://docs.datadoghq.com/api/latest/embeddable-graphs/#CreateEmbeddableGraph-403-v1) * [429](https://docs.datadoghq.com/api/latest/embeddable-graphs/#CreateEmbeddableGraph-429-v1) Payload accepted * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Embeddable graph. Expand All Field Type Description dash_name string Name of the dashboard the graph is on (null if none). dash_url string URL of the dashboard the graph is on (null if none). embed_id string ID of the embed. graph_title string Title of the graph. html string HTML fragment for the embed (iframe). revoked boolean Boolean flag for whether or not the embed is revoked. shared_by int64 ID of the user who shared the embed. ``` { "dash_name": "string", "dash_url": "string", "embed_id": "string", "graph_title": "string", "html": "string", "revoked": false, "shared_by": "integer" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/embeddable-graphs/?code-lang=curl) ##### Create embed Copy ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v1/graph/embed" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "graph_json": "" } EOF ```
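The curl example above sends an empty `graph_json`. As a hedged sketch of a populated request (plain HTTP with the `requests` library, not an official client example; the timeseries definition below is only an assumed illustration of the graph editor's JSON format, so copy the real definition from the JSON tab of your graph editor):

```
# Sketch: create an embed with a populated graph_json. The graph definition
# is an assumed example; graph_json must be passed as a JSON string.
import json
import os
import requests

graph = {
    "viz": "timeseries",                            # assumed example definition
    "requests": [{"q": "avg:system.cpu.user{*}"}],
}

resp = requests.post(
    "https://api.datadoghq.com/api/v1/graph/embed",  # adjust for your Datadog site
    headers={
        "Accept": "application/json",
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json={
        "graph_json": json.dumps(graph),  # the graph_json field is a string
        "timeframe": "4_hours",           # documented enum values
        "size": "medium",
        "legend": "yes",
        "title": "CPU usage (embed)",
    },
)
resp.raise_for_status()
created = resp.json()
print(created["embed_id"], created["html"])
```

Because `graph_json` is a string field, the graph definition is serialized with `json.dumps` before being placed in the request body.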
* * * ## [Get all embeds](https://docs.datadoghq.com/api/latest/embeddable-graphs/#get-all-embeds) * [v1 (latest)](https://docs.datadoghq.com/api/latest/embeddable-graphs/#get-all-embeds-v1) GET https://api.ap1.datadoghq.com/api/v1/graph/embedhttps://api.ap2.datadoghq.com/api/v1/graph/embedhttps://api.datadoghq.eu/api/v1/graph/embedhttps://api.ddog-gov.com/api/v1/graph/embedhttps://api.datadoghq.com/api/v1/graph/embedhttps://api.us3.datadoghq.com/api/v1/graph/embedhttps://api.us5.datadoghq.com/api/v1/graph/embed ### Overview Gets a list of previously created embeddable graphs. ### Response * [200](https://docs.datadoghq.com/api/latest/embeddable-graphs/#ListEmbeddableGraphs-200-v1) * [403](https://docs.datadoghq.com/api/latest/embeddable-graphs/#ListEmbeddableGraphs-403-v1) * [429](https://docs.datadoghq.com/api/latest/embeddable-graphs/#ListEmbeddableGraphs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Response with embeddable graphs. Field Type Description embedded_graphs [object] List of embeddable graphs. dash_name string Name of the dashboard the graph is on (null if none). dash_url string URL of the dashboard the graph is on (null if none). embed_id string ID of the embed. graph_title string Title of the graph. html string HTML fragment for the embed (iframe). revoked boolean Boolean flag for whether or not the embed is revoked. shared_by int64 ID of the user who shared the embed. ``` { "embedded_graphs": [ { "dash_name": "string", "dash_url": "string", "embed_id": "string", "graph_title": "string", "html": "string", "revoked": false, "shared_by": "integer" } ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/embeddable-graphs/) * [Example](https://docs.datadoghq.com/api/latest/embeddable-graphs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/embeddable-graphs/?code-lang=curl) ##### Get all embeds Copy ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/graph/embed" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * *
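As a closing sketch for this page (plain HTTP with the `requests` library, not an official client example), the list endpoint above can be combined with the documented `revoked` flag to see which embeds are still active:

```
# Sketch: list previously created embeds and show which ones are revoked,
# using the documented embedded_graphs and revoked fields.
import os
import requests

resp = requests.get(
    "https://api.datadoghq.com/api/v1/graph/embed",  # adjust for your Datadog site
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()

for embed in resp.json().get("embedded_graphs", []):
    status = "revoked" if embed["revoked"] else "active"
    print(f"{embed['embed_id']}  {status:8}  {embed['graph_title']}")
```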
--- # Source: https://docs.datadoghq.com/api/latest/error-tracking # Error Tracking View and manage issues within Error Tracking. See the [Error Tracking page](https://docs.datadoghq.com/error_tracking/) for more information. ## [Search error tracking issues](https://docs.datadoghq.com/api/latest/error-tracking/#search-error-tracking-issues) * [v2 (latest)](https://docs.datadoghq.com/api/latest/error-tracking/#search-error-tracking-issues-v2) POST https://api.ap1.datadoghq.com/api/v2/error-tracking/issues/searchhttps://api.ap2.datadoghq.com/api/v2/error-tracking/issues/searchhttps://api.datadoghq.eu/api/v2/error-tracking/issues/searchhttps://api.ddog-gov.com/api/v2/error-tracking/issues/searchhttps://api.datadoghq.com/api/v2/error-tracking/issues/searchhttps://api.us3.datadoghq.com/api/v2/error-tracking/issues/searchhttps://api.us5.datadoghq.com/api/v2/error-tracking/issues/search ### Overview The Search issues endpoint allows you to programmatically search for issues within your organization. This endpoint returns a list of issues that match a given search query, following the event search syntax. The search results are limited to a maximum of 100 issues per request. OAuth apps require the `error_tracking_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#error-tracking) to access this endpoint. ### Arguments #### Query Strings Name Type Description include array Comma-separated list of relationship objects that should be included in the response. Possible values are `issue`, `issue.assignee`, `issue.case`, and `issue.team_owners`. ### Request #### Body Data (required) Search issues request payload. * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Field Type Description data [_required_] object Search issues request. attributes [_required_] object Object describing a search issue request. from [_required_] int64 Start date (inclusive) of the query in milliseconds since the Unix epoch. order_by enum The attribute to sort the search results by. Allowed enum values: `TOTAL_COUNT,FIRST_SEEN,IMPACTED_SESSIONS,PRIORITY` persona enum Persona for the search. Either track(s) or persona(s) must be specified. Allowed enum values: `ALL,BROWSER,MOBILE,BACKEND` query [_required_] string Search query following the event search syntax. to [_required_] int64 End date (exclusive) of the query in milliseconds since the Unix epoch. track enum Track of the events to query. Either track(s) or persona(s) must be specified. Allowed enum values: `trace,logs,rum` type [_required_] enum Type of the object.
Allowed enum values: `search_request` ``` { "data": { "attributes": { "query": "service:orders-* AND @language:go", "from": 1671612804000, "to": 1671620004000, "track": "trace" }, "type": "search_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/error-tracking/#SearchIssues-200-v2) * [400](https://docs.datadoghq.com/api/latest/error-tracking/#SearchIssues-400-v2) * [401](https://docs.datadoghq.com/api/latest/error-tracking/#SearchIssues-401-v2) * [403](https://docs.datadoghq.com/api/latest/error-tracking/#SearchIssues-403-v2) * [429](https://docs.datadoghq.com/api/latest/error-tracking/#SearchIssues-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Search issues response payload. Field Type Description data [object] Array of results matching the search query. attributes [_required_] object Object containing the information of a search result. impacted_sessions int64 Count of sessions impacted by the issue over the queried time window. impacted_users int64 Count of users impacted by the issue over the queried time window. total_count int64 Total count of errors that match the issue over the queried time window. id [_required_] string Search result identifier (matches the nested issue's identifier). relationships object Relationships between the search result and other resources. issue object Relationship between the search result and the corresponding issue. data [_required_] object The issue the search result corresponds to. id [_required_] string Issue identifier. type [_required_] enum Type of the object. Allowed enum values: `issue` type [_required_] enum Type of the object. Allowed enum values: `error_tracking_search_result` included [ ] Array of resources related to the search results. Option 1 object The issue matching the request. attributes [_required_] object Object containing the information of an issue. error_message string Error message associated with the issue. error_type string Type of the error that matches the issue. file_path string Path of the file where the issue occurred. first_seen int64 Timestamp of the first seen error in milliseconds since the Unix epoch. first_seen_version string The application version (for example, git commit hash) where the issue was first observed. function_name string Name of the function where the issue occurred. is_crash boolean Error is a crash. languages [string] Array of programming languages associated with the issue. last_seen int64 Timestamp of the last seen error in milliseconds since the Unix epoch. last_seen_version string The application version (for example, git commit hash) where the issue was last observed. platform enum Platform associated with the issue. Allowed enum values: `ANDROID,BACKEND,BROWSER,FLUTTER,IOS,REACT_NATIVE,ROKU,UNKNOWN` service string Service name. state enum State of the issue Allowed enum values: `OPEN,ACKNOWLEDGED,RESOLVED,IGNORED,EXCLUDED` id [_required_] string Issue identifier. relationships object Relationship between the issue and an assignee, case and/or teams. assignee object Relationship between the issue and assignee. data [_required_] object The user the issue is assigned to. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` case object Relationship between the issue and case. data [_required_] object The case the issue is attached to. id [_required_] string Case identifier. 
type [_required_] enum Type of the object. Allowed enum values: `case` team_owners object Relationship between the issue and teams. data [_required_] [object] Array of teams that are owners of the issue. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` type [_required_] enum Type of the object. Allowed enum values: `issue` Option 2 object A case attributes [_required_] object Case resource attributes archived_at date-time Timestamp of when the case was archived attributes object The definition of `CaseObjectAttributes` object. [string] closed_at date-time Timestamp of when the case was closed created_at date-time Timestamp of when the case was created custom_attributes object Case custom attributes object Custom attribute values is_multi [_required_] boolean If true, value must be an array type [_required_] enum Custom attributes type Allowed enum values: `URL,TEXT,NUMBER,SELECT` value [_required_] Union of supported value for a custom attribute Option 1 string Value of TEXT/URL/NUMBER/SELECT custom attribute Option 2 [string] Value of multi TEXT/URL/NUMBER/SELECT custom attribute Option 3 double Value of NUMBER custom attribute Option 4 [number] Values of multi NUMBER custom attribute description string Description jira_issue object Jira issue attached to case result object Jira issue information issue_id string Jira issue ID issue_key string Jira issue key issue_url string Jira issue URL project_key string Jira project key status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` key string Key modified_at date-time Timestamp of when the case was last modified priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` service_now_ticket object ServiceNow ticket attached to case result object ServiceNow ticket information sys_target_link string Link to the Incident created on ServiceNow status enum Case status Allowed enum values: `IN_PROGRESS,COMPLETED,FAILED` default: `IN_PROGRESS` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title type enum **DEPRECATED** : Case type Allowed enum values: `STANDARD` type_id string Case type UUID id [_required_] string Case's identifier relationships object Resources related to a case assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Case resource type Allowed enum values: `case` default: `case` Option 3 object The user to whom the issue is assigned. 
attributes [_required_] object Object containing the information of a user. email string Email of the user. handle string Handle of the user. name string Name of the user. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` Option 4 object A team that owns an issue. attributes [_required_] object Object containing the information of a team. handle string The team's identifier. name string The name of the team. summary string A brief summary of the team, derived from its description. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` ``` { "data": [ { "attributes": { "impacted_sessions": 12, "impacted_users": 4, "total_count": 82 }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "relationships": { "issue": { "data": { "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "type": "issue" } } }, "type": "error_tracking_search_result" } ], "included": [ { "attributes": { "error_message": "object of type 'NoneType' has no len()", "error_type": "builtins.TypeError", "file_path": "/django-email/conduit/apps/core/utils.py", "first_seen": 1671612804001, "first_seen_version": "aaf65cd0", "function_name": "filter_forbidden_tags", "is_crash": false, "languages": [ "PYTHON", "GO" ], "last_seen": 1671620003100, "last_seen_version": "b6199f80", "platform": "BACKEND", "service": "email-api-py", "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "relationships": { "assignee": { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "user" } }, "case": { "data": { "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "case" } }, "team_owners": { "data": [ { "id": "221b0179-6447-4d03-91c3-3ca98bf60e8a", "type": "team" } ] } }, "type": "issue" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=typescript) ##### Search error tracking issues returns "OK" response Copy ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/error-tracking/issues/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "query": "service:orders-* AND @language:go", "from": 1671612804000, "to": 1671620004000, "track": "trace" }, "type": "search_request" } } EOF ``` ##### Search error tracking issues returns "OK" response ``` // Search error tracking issues returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.IssuesSearchRequest{ Data: datadogV2.IssuesSearchRequestData{ Attributes: datadogV2.IssuesSearchRequestDataAttributes{ Query: "service:orders-* AND @language:go", From: 1671612804000, To: 1671620004000, Track: datadogV2.ISSUESSEARCHREQUESTDATAATTRIBUTESTRACK_TRACE.Ptr(), }, Type: datadogV2.ISSUESSEARCHREQUESTDATATYPE_SEARCH_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewErrorTrackingApi(apiClient) resp, r, err := api.SearchIssues(ctx, body, *datadogV2.NewSearchIssuesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ErrorTrackingApi.SearchIssues`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ErrorTrackingApi.SearchIssues`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search error tracking issues returns "OK" response ``` // Search error tracking issues returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ErrorTrackingApi; import com.datadog.api.client.v2.model.IssuesSearchRequest; import com.datadog.api.client.v2.model.IssuesSearchRequestData; import com.datadog.api.client.v2.model.IssuesSearchRequestDataAttributes; import com.datadog.api.client.v2.model.IssuesSearchRequestDataAttributesTrack; import com.datadog.api.client.v2.model.IssuesSearchRequestDataType; import
com.datadog.api.client.v2.model.IssuesSearchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ErrorTrackingApi apiInstance = new ErrorTrackingApi(defaultClient); IssuesSearchRequest body = new IssuesSearchRequest() .data( new IssuesSearchRequestData() .attributes( new IssuesSearchRequestDataAttributes() .query("service:orders-* AND @language:go") .from(1671612804000L) .to(1671620004000L) .track(IssuesSearchRequestDataAttributesTrack.TRACE)) .type(IssuesSearchRequestDataType.SEARCH_REQUEST)); try { IssuesSearchResponse result = apiInstance.searchIssues(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ErrorTrackingApi#searchIssues"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search error tracking issues returns "OK" response ``` """ Search error tracking issues returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.error_tracking_api import ErrorTrackingApi from datadog_api_client.v2.model.issues_search_request import IssuesSearchRequest from datadog_api_client.v2.model.issues_search_request_data import IssuesSearchRequestData from datadog_api_client.v2.model.issues_search_request_data_attributes import IssuesSearchRequestDataAttributes from datadog_api_client.v2.model.issues_search_request_data_attributes_track import ( IssuesSearchRequestDataAttributesTrack, ) from datadog_api_client.v2.model.issues_search_request_data_type import IssuesSearchRequestDataType body = IssuesSearchRequest( data=IssuesSearchRequestData( attributes=IssuesSearchRequestDataAttributes( query="service:orders-* AND @language:go", _from=1671612804000, to=1671620004000, track=IssuesSearchRequestDataAttributesTrack.TRACE, ), type=IssuesSearchRequestDataType.SEARCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ErrorTrackingApi(api_client) response = api_instance.search_issues(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search error tracking issues returns "OK" response ``` # Search error tracking issues returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ErrorTrackingAPI.new body = DatadogAPIClient::V2::IssuesSearchRequest.new({ data: DatadogAPIClient::V2::IssuesSearchRequestData.new({ attributes: DatadogAPIClient::V2::IssuesSearchRequestDataAttributes.new({ query: "service:orders-* AND @language:go", from: 1671612804000, to: 1671620004000, track: DatadogAPIClient::V2::IssuesSearchRequestDataAttributesTrack::TRACE, }), type: 
DatadogAPIClient::V2::IssuesSearchRequestDataType::SEARCH_REQUEST, }), }) p api_instance.search_issues(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search error tracking issues returns "OK" response ``` // Search error tracking issues returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_error_tracking::ErrorTrackingAPI; use datadog_api_client::datadogV2::api_error_tracking::SearchIssuesOptionalParams; use datadog_api_client::datadogV2::model::IssuesSearchRequest; use datadog_api_client::datadogV2::model::IssuesSearchRequestData; use datadog_api_client::datadogV2::model::IssuesSearchRequestDataAttributes; use datadog_api_client::datadogV2::model::IssuesSearchRequestDataAttributesTrack; use datadog_api_client::datadogV2::model::IssuesSearchRequestDataType; #[tokio::main] async fn main() { let body = IssuesSearchRequest::new(IssuesSearchRequestData::new( IssuesSearchRequestDataAttributes::new( 1671612804000, "service:orders-* AND @language:go".to_string(), 1671620004000, ) .track(IssuesSearchRequestDataAttributesTrack::TRACE), IssuesSearchRequestDataType::SEARCH_REQUEST, )); let configuration = datadog::Configuration::new(); let api = ErrorTrackingAPI::with_config(configuration); let resp = api .search_issues(body, SearchIssuesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search error tracking issues returns "OK" response ``` /** * Search error tracking issues returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ErrorTrackingApi(configuration); const params: v2.ErrorTrackingApiSearchIssuesRequest = { body: { data: { attributes: { query: "service:orders-* AND @language:go", from: 1671612804000, to: 1671620004000, track: "trace", }, type: "search_request", }, }, }; apiInstance .searchIssues(params) .then((data: v2.IssuesSearchResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the details of an error tracking issue](https://docs.datadoghq.com/api/latest/error-tracking/#get-the-details-of-an-error-tracking-issue) * [v2 (latest)](https://docs.datadoghq.com/api/latest/error-tracking/#get-the-details-of-an-error-tracking-issue-v2) GET https://api.ap1.datadoghq.com/api/v2/error-tracking/issues/{issue_id}https://api.ap2.datadoghq.com/api/v2/error-tracking/issues/{issue_id}https://api.datadoghq.eu/api/v2/error-tracking/issues/{issue_id}https://api.ddog-gov.com/api/v2/error-tracking/issues/{issue_id}https://api.datadoghq.com/api/v2/error-tracking/issues/{issue_id}https://api.us3.datadoghq.com/api/v2/error-tracking/issues/{issue_id}https://api.us5.datadoghq.com/api/v2/error-tracking/issues/{issue_id} ### Overview Retrieve the full details for a specific error tracking issue, including attributes and relationships. OAuth apps require the `error_tracking_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#error-tracking) to access this endpoint. ### Arguments #### Path Parameters Name Type Description issue_id [_required_] string The identifier of the issue. #### Query Strings Name Type Description include array Comma-separated list of relationship objects that should be included in the response. Possible values are `assignee`, `case`, and `team_owners`. ### Response * [200](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-200-v2) * [400](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-400-v2) * [401](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-401-v2) * [403](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-403-v2) * [404](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-404-v2) * [429](https://docs.datadoghq.com/api/latest/error-tracking/#GetIssue-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Response containing error tracking issue data. Field Type Description data object The issue matching the request. attributes [_required_] object Object containing the information of an issue. error_message string Error message associated with the issue. error_type string Type of the error that matches the issue. file_path string Path of the file where the issue occurred. first_seen int64 Timestamp of the first seen error in milliseconds since the Unix epoch. first_seen_version string The application version (for example, git commit hash) where the issue was first observed. function_name string Name of the function where the issue occurred. is_crash boolean Error is a crash. languages [string] Array of programming languages associated with the issue. last_seen int64 Timestamp of the last seen error in milliseconds since the Unix epoch. last_seen_version string The application version (for example, git commit hash) where the issue was last observed. platform enum Platform associated with the issue. Allowed enum values: `ANDROID,BACKEND,BROWSER,FLUTTER,IOS,REACT_NATIVE,ROKU,UNKNOWN` service string Service name. 
state enum State of the issue Allowed enum values: `OPEN,ACKNOWLEDGED,RESOLVED,IGNORED,EXCLUDED` id [_required_] string Issue identifier. relationships object Relationship between the issue and an assignee, case and/or teams. assignee object Relationship between the issue and assignee. data [_required_] object The user the issue is assigned to. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` case object Relationship between the issue and case. data [_required_] object The case the issue is attached to. id [_required_] string Case identifier. type [_required_] enum Type of the object. Allowed enum values: `case` team_owners object Relationship between the issue and teams. data [_required_] [object] Array of teams that are owners of the issue. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` type [_required_] enum Type of the object. Allowed enum values: `issue` included [ ] Array of resources related to the issue. Option 1 object The case attached to the issue. attributes [_required_] object Object containing the information of a case. archived_at date-time Timestamp of when the case was archived. closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. resource_id string Insight identifier. type string Type of the insight. jira_issue object Jira issue of the case. result object Contains the identifiers and URL for a successfully created Jira issue. issue_id string Jira issue identifier. issue_key string Jira issue key. issue_url string Jira issue URL. project_key string Jira project key. status string Creation status of the Jira issue. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title of the case. type string Type of the case. id [_required_] string Case identifier. relationships object Resources related to a case. assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Type of the object. Allowed enum values: `case` Option 2 object The user to whom the issue is assigned. 
attributes [_required_] object Object containing the information of a user. email string Email of the user. handle string Handle of the user. name string Name of the user. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` Option 3 object A team that owns an issue. attributes [_required_] object Object containing the information of a team. handle string The team's identifier. name string The name of the team. summary string A brief summary of the team, derived from its description. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` ``` { "data": { "attributes": { "error_message": "object of type 'NoneType' has no len()", "error_type": "builtins.TypeError", "file_path": "/django-email/conduit/apps/core/utils.py", "first_seen": 1671612804001, "first_seen_version": "aaf65cd0", "function_name": "filter_forbidden_tags", "is_crash": false, "languages": [ "PYTHON", "GO" ], "last_seen": 1671620003100, "last_seen_version": "b6199f80", "platform": "BACKEND", "service": "email-api-py", "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "relationships": { "assignee": { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "user" } }, "case": { "data": { "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "case" } }, "team_owners": { "data": [ { "id": "221b0179-6447-4d03-91c3-3ca98bf60e8a", "type": "team" } ] } }, "type": "issue" }, "included": [ { "attributes": { "archived_at": "2025-01-01T00:00:00Z", "closed_at": "2025-01-01T00:00:00Z", "created_at": "2025-01-01T00:00:00Z", "creation_source": "ERROR_TRACKING", "description": "string", "due_date": "2025-01-01", "insights": [ { "ref": "/error-tracking?issueId=2841440d-e780-4fe2-96cd-6a8c1d194da5", "resource_id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "ERROR_TRACKING" } ], "jira_issue": { "result": { "issue_id": "1904866", "issue_key": "ET-123", "issue_url": "https://your-jira-instance.atlassian.net/browse/ET-123", "project_key": "ET" }, "status": "COMPLETED" }, "key": "ET-123", "modified_at": "2025-01-01T00:00:00Z", "priority": "NOT_DEFINED", "status": "OPEN", "title": "Error: HTTP error", "type": "ERROR_TRACKING_ISSUE" }, "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=typescript) ##### Get the details of an error tracking issue Copy ``` # Path parameters export issue_id="c1726a66-1f64-11ee-b338-da7ad0900002" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/error-tracking/issues/${issue_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the details of an error tracking issue ``` """ Get the details of an error tracking issue returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.error_tracking_api import ErrorTrackingApi # there is a valid "issue" in the system ISSUE_ID = environ["ISSUE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ErrorTrackingApi(api_client) response = api_instance.get_issue( issue_id=ISSUE_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the details of an error tracking issue ``` # Get the details of an error tracking issue returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ErrorTrackingAPI.new # there is a valid "issue" in the system ISSUE_ID = ENV["ISSUE_ID"] p api_instance.get_issue(ISSUE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the details of an error tracking issue ``` // Get the details of an error tracking issue returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "issue" in the system IssueID := os.Getenv("ISSUE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewErrorTrackingApi(apiClient) resp, r, err := api.GetIssue(ctx, IssueID, *datadogV2.NewGetIssueOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ErrorTrackingApi.GetIssue`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ErrorTrackingApi.GetIssue`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the details of an error tracking issue ``` // Get the details of an error tracking issue returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ErrorTrackingApi; import com.datadog.api.client.v2.model.IssueResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ErrorTrackingApi apiInstance = new ErrorTrackingApi(defaultClient); // there is a valid "issue" in the system String ISSUE_ID = System.getenv("ISSUE_ID"); try { IssueResponse result = apiInstance.getIssue(ISSUE_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ErrorTrackingApi#getIssue"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the details of an error tracking issue ``` // Get the details of an error tracking issue returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_error_tracking::ErrorTrackingAPI; use datadog_api_client::datadogV2::api_error_tracking::GetIssueOptionalParams; #[tokio::main] async fn main() { // there is a valid "issue" in the system let issue_id = std::env::var("ISSUE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ErrorTrackingAPI::with_config(configuration); let resp = api .get_issue(issue_id.clone(), GetIssueOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` 
##### Get the details of an error tracking issue

```
/**
 * Get the details of an error tracking issue returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.ErrorTrackingApi(configuration);

// there is a valid "issue" in the system
const ISSUE_ID = process.env.ISSUE_ID as string;

const params: v2.ErrorTrackingApiGetIssueRequest = {
  issueId: ISSUE_ID,
};

apiInstance
  .getIssue(params)
  .then((data: v2.IssueResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Update the state of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-state-of-an-issue)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-state-of-an-issue-v2)

PUT https://api.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/state (use the endpoint for your Datadog site: api.ap1.datadoghq.com, api.ap2.datadoghq.com, api.datadoghq.eu, api.ddog-gov.com, api.us3.datadoghq.com, or api.us5.datadoghq.com)

### Overview

Update the state of an issue by `issue_id`. Use this endpoint to move an issue between states such as `OPEN`, `RESOLVED`, or `IGNORED`.

OAuth apps require the `error_tracking_read` and `error_tracking_write` authorization [scopes](https://docs.datadoghq.com/api/latest/scopes/#error-tracking) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description issue_id [_required_] string The identifier of the issue.

### Request

#### Body Data (required)

Update issue state request payload.

* [Model](https://docs.datadoghq.com/api/latest/error-tracking/)
* [Example](https://docs.datadoghq.com/api/latest/error-tracking/)

Field Type Description data [_required_] object Update issue state request. attributes [_required_] object Object describing an issue state update request. state [_required_] enum State of the issue Allowed enum values: `OPEN,ACKNOWLEDGED,RESOLVED,IGNORED,EXCLUDED` id [_required_] string Issue identifier. type [_required_] enum Type of the object.
Allowed enum values: `error_tracking_issue` ``` { "data": { "attributes": { "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "type": "error_tracking_issue" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-200-v2) * [400](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-400-v2) * [401](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-401-v2) * [403](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-403-v2) * [404](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-404-v2) * [429](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueState-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Response containing error tracking issue data. Field Type Description data object The issue matching the request. attributes [_required_] object Object containing the information of an issue. error_message string Error message associated with the issue. error_type string Type of the error that matches the issue. file_path string Path of the file where the issue occurred. first_seen int64 Timestamp of the first seen error in milliseconds since the Unix epoch. first_seen_version string The application version (for example, git commit hash) where the issue was first observed. function_name string Name of the function where the issue occurred. is_crash boolean Error is a crash. languages [string] Array of programming languages associated with the issue. last_seen int64 Timestamp of the last seen error in milliseconds since the Unix epoch. last_seen_version string The application version (for example, git commit hash) where the issue was last observed. platform enum Platform associated with the issue. Allowed enum values: `ANDROID,BACKEND,BROWSER,FLUTTER,IOS,REACT_NATIVE,ROKU,UNKNOWN` service string Service name. state enum State of the issue Allowed enum values: `OPEN,ACKNOWLEDGED,RESOLVED,IGNORED,EXCLUDED` id [_required_] string Issue identifier. relationships object Relationship between the issue and an assignee, case and/or teams. assignee object Relationship between the issue and assignee. data [_required_] object The user the issue is assigned to. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` case object Relationship between the issue and case. data [_required_] object The case the issue is attached to. id [_required_] string Case identifier. type [_required_] enum Type of the object. Allowed enum values: `case` team_owners object Relationship between the issue and teams. data [_required_] [object] Array of teams that are owners of the issue. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` type [_required_] enum Type of the object. Allowed enum values: `issue` included [ ] Array of resources related to the issue. Option 1 object The case attached to the issue. attributes [_required_] object Object containing the information of a case. archived_at date-time Timestamp of when the case was archived. closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. 
resource_id string Insight identifier. type string Type of the insight. jira_issue object Jira issue of the case. result object Contains the identifiers and URL for a successfully created Jira issue. issue_id string Jira issue identifier. issue_key string Jira issue key. issue_url string Jira issue URL. project_key string Jira project key. status string Creation status of the Jira issue. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title of the case. type string Type of the case. id [_required_] string Case identifier. relationships object Resources related to a case. assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Type of the object. Allowed enum values: `case` Option 2 object The user to whom the issue is assigned. attributes [_required_] object Object containing the information of a user. email string Email of the user. handle string Handle of the user. name string Name of the user. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` Option 3 object A team that owns an issue. attributes [_required_] object Object containing the information of a team. handle string The team's identifier. name string The name of the team. summary string A brief summary of the team, derived from its description. id [_required_] string Team identifier. type [_required_] enum Type of the object. 
Allowed enum values: `team` ``` { "data": { "attributes": { "error_message": "object of type 'NoneType' has no len()", "error_type": "builtins.TypeError", "file_path": "/django-email/conduit/apps/core/utils.py", "first_seen": 1671612804001, "first_seen_version": "aaf65cd0", "function_name": "filter_forbidden_tags", "is_crash": false, "languages": [ "PYTHON", "GO" ], "last_seen": 1671620003100, "last_seen_version": "b6199f80", "platform": "BACKEND", "service": "email-api-py", "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "relationships": { "assignee": { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "user" } }, "case": { "data": { "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "case" } }, "team_owners": { "data": [ { "id": "221b0179-6447-4d03-91c3-3ca98bf60e8a", "type": "team" } ] } }, "type": "issue" }, "included": [ { "attributes": { "archived_at": "2025-01-01T00:00:00Z", "closed_at": "2025-01-01T00:00:00Z", "created_at": "2025-01-01T00:00:00Z", "creation_source": "ERROR_TRACKING", "description": "string", "due_date": "2025-01-01", "insights": [ { "ref": "/error-tracking?issueId=2841440d-e780-4fe2-96cd-6a8c1d194da5", "resource_id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "ERROR_TRACKING" } ], "jira_issue": { "result": { "issue_id": "1904866", "issue_key": "ET-123", "issue_url": "https://your-jira-instance.atlassian.net/browse/ET-123", "project_key": "ET" }, "status": "COMPLETED" }, "key": "ET-123", "modified_at": "2025-01-01T00:00:00Z", "priority": "NOT_DEFINED", "status": "OPEN", "title": "Error: HTTP error", "type": "ERROR_TRACKING_ISSUE" }, "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=typescript) ##### Update the state of an issue returns "OK" response Copy ``` # Path parameters export issue_id="c1726a66-1f64-11ee-b338-da7ad0900002" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/error-tracking/issues/${issue_id}/state" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "type": "error_tracking_issue" } } EOF ``` ##### Update the state of an issue returns "OK" response ``` // Update the state of an issue returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "issue" in the system IssueID := os.Getenv("ISSUE_ID") body := datadogV2.IssueUpdateStateRequest{ Data: datadogV2.IssueUpdateStateRequestData{ Attributes: datadogV2.IssueUpdateStateRequestDataAttributes{ State: datadogV2.ISSUESTATE_RESOLVED, }, Id: IssueID, Type: datadogV2.ISSUEUPDATESTATEREQUESTDATATYPE_ERROR_TRACKING_ISSUE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewErrorTrackingApi(apiClient) resp, r, err := api.UpdateIssueState(ctx, IssueID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ErrorTrackingApi.UpdateIssueState`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ErrorTrackingApi.UpdateIssueState`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update the state of an issue returns "OK" response ``` // Update the state of an issue returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ErrorTrackingApi; import com.datadog.api.client.v2.model.IssueResponse; import com.datadog.api.client.v2.model.IssueState; import com.datadog.api.client.v2.model.IssueUpdateStateRequest; import com.datadog.api.client.v2.model.IssueUpdateStateRequestData; import com.datadog.api.client.v2.model.IssueUpdateStateRequestDataAttributes; import 
com.datadog.api.client.v2.model.IssueUpdateStateRequestDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ErrorTrackingApi apiInstance = new ErrorTrackingApi(defaultClient); // there is a valid "issue" in the system String ISSUE_ID = System.getenv("ISSUE_ID"); IssueUpdateStateRequest body = new IssueUpdateStateRequest() .data( new IssueUpdateStateRequestData() .attributes( new IssueUpdateStateRequestDataAttributes().state(IssueState.RESOLVED)) .id(ISSUE_ID) .type(IssueUpdateStateRequestDataType.ERROR_TRACKING_ISSUE)); try { IssueResponse result = apiInstance.updateIssueState(ISSUE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ErrorTrackingApi#updateIssueState"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update the state of an issue returns "OK" response ``` """ Update the state of an issue returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.error_tracking_api import ErrorTrackingApi from datadog_api_client.v2.model.issue_state import IssueState from datadog_api_client.v2.model.issue_update_state_request import IssueUpdateStateRequest from datadog_api_client.v2.model.issue_update_state_request_data import IssueUpdateStateRequestData from datadog_api_client.v2.model.issue_update_state_request_data_attributes import IssueUpdateStateRequestDataAttributes from datadog_api_client.v2.model.issue_update_state_request_data_type import IssueUpdateStateRequestDataType # there is a valid "issue" in the system ISSUE_ID = environ["ISSUE_ID"] body = IssueUpdateStateRequest( data=IssueUpdateStateRequestData( attributes=IssueUpdateStateRequestDataAttributes( state=IssueState.RESOLVED, ), id=ISSUE_ID, type=IssueUpdateStateRequestDataType.ERROR_TRACKING_ISSUE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ErrorTrackingApi(api_client) response = api_instance.update_issue_state(issue_id=ISSUE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update the state of an issue returns "OK" response ``` # Update the state of an issue returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ErrorTrackingAPI.new # there is a valid "issue" in the system ISSUE_ID = ENV["ISSUE_ID"] body = DatadogAPIClient::V2::IssueUpdateStateRequest.new({ data: DatadogAPIClient::V2::IssueUpdateStateRequestData.new({ attributes: DatadogAPIClient::V2::IssueUpdateStateRequestDataAttributes.new({ state: DatadogAPIClient::V2::IssueState::RESOLVED, }), id: 
ISSUE_ID, type: DatadogAPIClient::V2::IssueUpdateStateRequestDataType::ERROR_TRACKING_ISSUE, }), }) p api_instance.update_issue_state(ISSUE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update the state of an issue returns "OK" response ``` // Update the state of an issue returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_error_tracking::ErrorTrackingAPI; use datadog_api_client::datadogV2::model::IssueState; use datadog_api_client::datadogV2::model::IssueUpdateStateRequest; use datadog_api_client::datadogV2::model::IssueUpdateStateRequestData; use datadog_api_client::datadogV2::model::IssueUpdateStateRequestDataAttributes; use datadog_api_client::datadogV2::model::IssueUpdateStateRequestDataType; #[tokio::main] async fn main() { // there is a valid "issue" in the system let issue_id = std::env::var("ISSUE_ID").unwrap(); let body = IssueUpdateStateRequest::new(IssueUpdateStateRequestData::new( IssueUpdateStateRequestDataAttributes::new(IssueState::RESOLVED), issue_id.clone(), IssueUpdateStateRequestDataType::ERROR_TRACKING_ISSUE, )); let configuration = datadog::Configuration::new(); let api = ErrorTrackingAPI::with_config(configuration); let resp = api.update_issue_state(issue_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update the state of an issue returns "OK" response ``` /** * Update the state of an issue returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ErrorTrackingApi(configuration); // there is a valid "issue" in the system const ISSUE_ID = process.env.ISSUE_ID as string; const params: v2.ErrorTrackingApiUpdateIssueStateRequest = { body: { data: { attributes: { state: "RESOLVED", }, id: ISSUE_ID, type: "error_tracking_issue", }, }, issueId: ISSUE_ID, }; apiInstance .updateIssueState(params) .then((data: v2.IssueResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue) * [v2 (latest)](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue-v2) PUT https://api.ap1.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.ap2.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.datadoghq.eu/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.ddog-gov.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.us3.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.us5.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assignee ### Overview Update the assignee of an issue by `issue_id`. OAuth apps require the `error_tracking_read, error_tracking_write, cases_read, cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#error-tracking) to access this endpoint. ### Arguments #### Path Parameters Name Type Description issue_id [_required_] string The identifier of the issue. ### Request #### Body Data (required) Update issue assignee request payload. * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Field Type Description data [_required_] object Update issue assignee request. id [_required_] string User identifier. type [_required_] enum Type of the object. Allowed enum values: `assignee` ``` { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "assignee" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-200-v2) * [400](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-400-v2) * [401](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-401-v2) * [403](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-403-v2) * [404](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-404-v2) * [429](https://docs.datadoghq.com/api/latest/error-tracking/#UpdateIssueAssignee-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) Response containing error tracking issue data. Field Type Description data object The issue matching the request. attributes [_required_] object Object containing the information of an issue. error_message string Error message associated with the issue. error_type string Type of the error that matches the issue. file_path string Path of the file where the issue occurred. first_seen int64 Timestamp of the first seen error in milliseconds since the Unix epoch. first_seen_version string The application version (for example, git commit hash) where the issue was first observed. function_name string Name of the function where the issue occurred. is_crash boolean Error is a crash. 
languages [string] Array of programming languages associated with the issue. last_seen int64 Timestamp of the last seen error in milliseconds since the Unix epoch. last_seen_version string The application version (for example, git commit hash) where the issue was last observed. platform enum Platform associated with the issue. Allowed enum values: `ANDROID,BACKEND,BROWSER,FLUTTER,IOS,REACT_NATIVE,ROKU,UNKNOWN` service string Service name. state enum State of the issue Allowed enum values: `OPEN,ACKNOWLEDGED,RESOLVED,IGNORED,EXCLUDED` id [_required_] string Issue identifier. relationships object Relationship between the issue and an assignee, case and/or teams. assignee object Relationship between the issue and assignee. data [_required_] object The user the issue is assigned to. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` case object Relationship between the issue and case. data [_required_] object The case the issue is attached to. id [_required_] string Case identifier. type [_required_] enum Type of the object. Allowed enum values: `case` team_owners object Relationship between the issue and teams. data [_required_] [object] Array of teams that are owners of the issue. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` type [_required_] enum Type of the object. Allowed enum values: `issue` included [ ] Array of resources related to the issue. Option 1 object The case attached to the issue. attributes [_required_] object Object containing the information of a case. archived_at date-time Timestamp of when the case was archived. closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. resource_id string Insight identifier. type string Type of the insight. jira_issue object Jira issue of the case. result object Contains the identifiers and URL for a successfully created Jira issue. issue_id string Jira issue identifier. issue_key string Jira issue key. issue_url string Jira issue URL. project_key string Jira project key. status string Creation status of the Jira issue. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority enum Case priority Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` status enum Case status Allowed enum values: `OPEN,IN_PROGRESS,CLOSED` title string Title of the case. type string Type of the case. id [_required_] string Case identifier. relationships object Resources related to a case. assignee object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. Allowed enum values: `user` default: `user` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum User resource type. 
Allowed enum values: `user` default: `user` project object Relationship to project data [_required_] object Relationship to project object id [_required_] string A unique identifier that represents the project type [_required_] enum Project resource type Allowed enum values: `project` default: `project` type [_required_] enum Type of the object. Allowed enum values: `case` Option 2 object The user to whom the issue is assigned. attributes [_required_] object Object containing the information of a user. email string Email of the user. handle string Handle of the user. name string Name of the user. id [_required_] string User identifier. type [_required_] enum Type of the object Allowed enum values: `user` Option 3 object A team that owns an issue. attributes [_required_] object Object containing the information of a team. handle string The team's identifier. name string The name of the team. summary string A brief summary of the team, derived from its description. id [_required_] string Team identifier. type [_required_] enum Type of the object. Allowed enum values: `team` ``` { "data": { "attributes": { "error_message": "object of type 'NoneType' has no len()", "error_type": "builtins.TypeError", "file_path": "/django-email/conduit/apps/core/utils.py", "first_seen": 1671612804001, "first_seen_version": "aaf65cd0", "function_name": "filter_forbidden_tags", "is_crash": false, "languages": [ "PYTHON", "GO" ], "last_seen": 1671620003100, "last_seen_version": "b6199f80", "platform": "BACKEND", "service": "email-api-py", "state": "RESOLVED" }, "id": "c1726a66-1f64-11ee-b338-da7ad0900002", "relationships": { "assignee": { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "user" } }, "case": { "data": { "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "case" } }, "team_owners": { "data": [ { "id": "221b0179-6447-4d03-91c3-3ca98bf60e8a", "type": "team" } ] } }, "type": "issue" }, "included": [ { "attributes": { "archived_at": "2025-01-01T00:00:00Z", "closed_at": "2025-01-01T00:00:00Z", "created_at": "2025-01-01T00:00:00Z", "creation_source": "ERROR_TRACKING", "description": "string", "due_date": "2025-01-01", "insights": [ { "ref": "/error-tracking?issueId=2841440d-e780-4fe2-96cd-6a8c1d194da5", "resource_id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "type": "ERROR_TRACKING" } ], "jira_issue": { "result": { "issue_id": "1904866", "issue_key": "ET-123", "issue_url": "https://your-jira-instance.atlassian.net/browse/ET-123", "project_key": "ET" }, "status": "COMPLETED" }, "key": "ET-123", "modified_at": "2025-01-01T00:00:00Z", "priority": "NOT_DEFINED", "status": "OPEN", "title": "Error: HTTP error", "type": "ERROR_TRACKING_ISSUE" }, "id": "2841440d-e780-4fe2-96cd-6a8c1d194da5", "relationships": { "assignee": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "created_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "user" } }, "project": { "data": { "id": "e555e290-ed65-49bd-ae18-8acbfcf18db7", "type": "project" } } }, "type": "case" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=typescript) ##### Update the assignee of an issue returns "OK" response Copy ``` # Path parameters export issue_id="c1726a66-1f64-11ee-b338-da7ad0900002" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/error-tracking/issues/${issue_id}/assignee" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "87cb11a0-278c-440a-99fe-701223c80296", "type": "assignee" } } EOF ``` ##### Update the assignee of an issue returns "OK" response ``` // Update the assignee of an issue returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "issue" in the system IssueID := os.Getenv("ISSUE_ID") body := datadogV2.IssueUpdateAssigneeRequest{ Data: datadogV2.IssueUpdateAssigneeRequestData{ Id: "87cb11a0-278c-440a-99fe-701223c80296", Type: datadogV2.ISSUEUPDATEASSIGNEEREQUESTDATATYPE_ASSIGNEE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewErrorTrackingApi(apiClient) resp, r, err := api.UpdateIssueAssignee(ctx, IssueID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ErrorTrackingApi.UpdateIssueAssignee`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, 
"Response from `ErrorTrackingApi.UpdateIssueAssignee`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update the assignee of an issue returns "OK" response ``` // Update the assignee of an issue returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ErrorTrackingApi; import com.datadog.api.client.v2.model.IssueResponse; import com.datadog.api.client.v2.model.IssueUpdateAssigneeRequest; import com.datadog.api.client.v2.model.IssueUpdateAssigneeRequestData; import com.datadog.api.client.v2.model.IssueUpdateAssigneeRequestDataType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ErrorTrackingApi apiInstance = new ErrorTrackingApi(defaultClient); // there is a valid "issue" in the system String ISSUE_ID = System.getenv("ISSUE_ID"); IssueUpdateAssigneeRequest body = new IssueUpdateAssigneeRequest() .data( new IssueUpdateAssigneeRequestData() .id("87cb11a0-278c-440a-99fe-701223c80296") .type(IssueUpdateAssigneeRequestDataType.ASSIGNEE)); try { IssueResponse result = apiInstance.updateIssueAssignee(ISSUE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ErrorTrackingApi#updateIssueAssignee"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update the assignee of an issue returns "OK" response ``` """ Update the assignee of an issue returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.error_tracking_api import ErrorTrackingApi from datadog_api_client.v2.model.issue_update_assignee_request import IssueUpdateAssigneeRequest from datadog_api_client.v2.model.issue_update_assignee_request_data import IssueUpdateAssigneeRequestData from datadog_api_client.v2.model.issue_update_assignee_request_data_type import IssueUpdateAssigneeRequestDataType # there is a valid "issue" in the system ISSUE_ID = environ["ISSUE_ID"] body = IssueUpdateAssigneeRequest( data=IssueUpdateAssigneeRequestData( id="87cb11a0-278c-440a-99fe-701223c80296", type=IssueUpdateAssigneeRequestDataType.ASSIGNEE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ErrorTrackingApi(api_client) response = api_instance.update_issue_assignee(issue_id=ISSUE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update the assignee of an issue returns "OK" response ``` # Update the assignee of an issue returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ErrorTrackingAPI.new # there is a valid "issue" in the system ISSUE_ID = ENV["ISSUE_ID"] body = DatadogAPIClient::V2::IssueUpdateAssigneeRequest.new({ data: DatadogAPIClient::V2::IssueUpdateAssigneeRequestData.new({ id: "87cb11a0-278c-440a-99fe-701223c80296", type: DatadogAPIClient::V2::IssueUpdateAssigneeRequestDataType::ASSIGNEE, }), }) p api_instance.update_issue_assignee(ISSUE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update the assignee of an issue returns "OK" response ``` // Update the assignee of an issue returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_error_tracking::ErrorTrackingAPI; use datadog_api_client::datadogV2::model::IssueUpdateAssigneeRequest; use datadog_api_client::datadogV2::model::IssueUpdateAssigneeRequestData; use datadog_api_client::datadogV2::model::IssueUpdateAssigneeRequestDataType; #[tokio::main] async fn main() { // there is a valid "issue" in the system let issue_id = std::env::var("ISSUE_ID").unwrap(); let body = IssueUpdateAssigneeRequest::new(IssueUpdateAssigneeRequestData::new( "87cb11a0-278c-440a-99fe-701223c80296".to_string(), IssueUpdateAssigneeRequestDataType::ASSIGNEE, )); let configuration = datadog::Configuration::new(); let api = ErrorTrackingAPI::with_config(configuration); let resp = api.update_issue_assignee(issue_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update the assignee of an issue returns "OK" response ``` /** * Update the assignee of an issue returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ErrorTrackingApi(configuration); // there is a valid "issue" in the system const ISSUE_ID = process.env.ISSUE_ID as string; const params: v2.ErrorTrackingApiUpdateIssueAssigneeRequest = { body: { data: { id: "87cb11a0-278c-440a-99fe-701223c80296", type: "assignee", }, }, issueId: ISSUE_ID, }; apiInstance .updateIssueAssignee(params) .then((data: v2.IssueResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue) * [v2 (latest)](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue-v2) DELETE https://api.ap1.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.ap2.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.datadoghq.eu/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.ddog-gov.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.us3.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assigneehttps://api.us5.datadoghq.com/api/v2/error-tracking/issues/{issue_id}/assignee ### Overview Remove the assignee of an issue by `issue_id`. OAuth apps require the `error_tracking_read, error_tracking_write, cases_read, cases_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#error-tracking) to access this endpoint. ### Arguments #### Path Parameters Name Type Description issue_id [_required_] string The identifier of the issue. ### Response * [204](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-204-v2) * [400](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-400-v2) * [401](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-401-v2) * [403](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-403-v2) * [404](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-404-v2) * [429](https://docs.datadoghq.com/api/latest/error-tracking/#DeleteIssueAssignee-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/error-tracking/) * [Example](https://docs.datadoghq.com/api/latest/error-tracking/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/error-tracking/?code-lang=typescript) ##### Remove the assignee of an issue Copy ``` # Path parameters export issue_id="c1726a66-1f64-11ee-b338-da7ad0900002" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/error-tracking/issues/${issue_id}/assignee" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove the assignee of an issue ``` """ Remove the assignee of an issue returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.error_tracking_api import ErrorTrackingApi # there is a valid "issue" in the system ISSUE_ID = environ["ISSUE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ErrorTrackingApi(api_client) api_instance.delete_issue_assignee( issue_id=ISSUE_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove the assignee of an issue ``` # Remove the assignee of an issue returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ErrorTrackingAPI.new # there is a valid "issue" in the system ISSUE_ID = ENV["ISSUE_ID"] api_instance.delete_issue_assignee(ISSUE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove the assignee of an issue ``` // Remove the assignee of an issue returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "issue" in the system IssueID := os.Getenv("ISSUE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewErrorTrackingApi(apiClient) r, err := api.DeleteIssueAssignee(ctx, IssueID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ErrorTrackingApi.DeleteIssueAssignee`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove the assignee of an issue ``` // Remove the assignee of an issue returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ErrorTrackingApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ErrorTrackingApi apiInstance = new ErrorTrackingApi(defaultClient); // there is a valid "issue" in the system String ISSUE_ID = System.getenv("ISSUE_ID"); try { apiInstance.deleteIssueAssignee(ISSUE_ID); } catch (ApiException e) { System.err.println("Exception when calling ErrorTrackingApi#deleteIssueAssignee"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove the assignee of an issue ``` // Remove the assignee of an issue returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_error_tracking::ErrorTrackingAPI; #[tokio::main] async fn main() { // there is a valid "issue" in the system let issue_id = std::env::var("ISSUE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ErrorTrackingAPI::with_config(configuration); let resp = api.delete_issue_assignee(issue_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove the assignee of an issue ``` /** * Remove the assignee of an issue returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ErrorTrackingApi(configuration); // there is a valid "issue" in the system const ISSUE_ID = process.env.ISSUE_ID as string; const params: v2.ErrorTrackingApiDeleteIssueAssigneeRequest = { issueId: ISSUE_ID, }; apiInstance .deleteIssueAssignee(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=24191c32-4081-4083-b64d-8d878ee8887d&bo=1&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Error%20Tracking&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ferror-tracking%2F&r=&evt=pageLoad&sv=2&cdb=AQAQ&rn=737228) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=4679a05f-48fa-4138-9d01-2260df6a9e44&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=41574f96-6ed5-4cab-add6-f21ffd605ba8&pt=Error%20Tracking&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ferror-tracking%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=4679a05f-48fa-4138-9d01-2260df6a9e44&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=41574f96-6ed5-4cab-add6-f21ffd605ba8&pt=Error%20Tracking&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ferror-tracking%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) --- # Source: https://docs.datadoghq.com/api/latest/events # Events The Event Management API allows you to programmatically post events to the Events Explorer and fetch events from the Events Explorer. See the [Event Management page](https://docs.datadoghq.com/service_management/events/) for more information. **Update to Datadog monitor events`aggregation_key` starting March 1, 2025:** The Datadog monitor events `aggregation_key` is unique to each Monitor ID. Starting March 1st, this key will also include Monitor Group, making it unique per _Monitor ID and Monitor Group_. If you’re using monitor events `aggregation_key` in dashboard queries or the Event API, you must migrate to use `@monitor.id`. Reach out to [support](https://www.datadoghq.com/support/) if you have any question. ## [Get a list of events](https://docs.datadoghq.com/api/latest/events/#get-a-list-of-events) * [v1](https://docs.datadoghq.com/api/latest/events/#get-a-list-of-events-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/events/#get-a-list-of-events-v2) GET https://api.ap1.datadoghq.com/api/v1/eventshttps://api.ap2.datadoghq.com/api/v1/eventshttps://api.datadoghq.eu/api/v1/eventshttps://api.ddog-gov.com/api/v1/eventshttps://api.datadoghq.com/api/v1/eventshttps://api.us3.datadoghq.com/api/v1/eventshttps://api.us5.datadoghq.com/api/v1/events ### Overview The event stream can be queried and filtered by time, priority, sources and tags. **Notes** : * If the event you’re querying contains markdown formatting of any kind, you may see characters such as `%`,`\`,`n` in your output. * This endpoint returns a maximum of `1000` most recent results. 
To return additional results, identify the last timestamp of the last result and set that as the `end` query time to paginate the results. You can also use the page parameter to specify which set of `1000` results to return. This endpoint requires the `events_read` permission. OAuth apps require the `events_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#events) to access this endpoint. ### Arguments #### Query Strings Name Type Description start [_required_] integer POSIX timestamp. end [_required_] integer POSIX timestamp. priority enum Priority of your events, either `low` or `normal`. Allowed enum values: `normal, low` sources string A comma separated string of sources. tags string A comma separated list indicating what tags, if any, should be used to filter the list of events. unaggregated boolean Set unaggregated to `true` to return all events within the specified [`start`,`end`] timeframe. Otherwise if an event is aggregated to a parent event with a timestamp outside of the timeframe, it won’t be available in the output. Aggregated events with `is_aggregate=true` in the response will still be returned unless exclude_aggregate is set to `true.` exclude_aggregate boolean Set `exclude_aggregate` to `true` to only return unaggregated events where `is_aggregate=false` in the response. If the `exclude_aggregate` parameter is set to `true`, then the unaggregated parameter is ignored and will be `true` by default. page integer By default 1000 results are returned per request. Set page to the number of the page to return with `0` being the first page. The page parameter can only be used when either unaggregated or exclude_aggregate is set to `true.` ### Response * [200](https://docs.datadoghq.com/api/latest/events/#ListEvents-200-v1) * [400](https://docs.datadoghq.com/api/latest/events/#ListEvents-400-v1) * [403](https://docs.datadoghq.com/api/latest/events/#ListEvents-403-v1) * [429](https://docs.datadoghq.com/api/latest/events/#ListEvents-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) An event list response. Field Type Description events [object] An array of events. alert_type enum If an alert event is enabled, set its type. For example, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, and `snapshot`. Allowed enum values: `error,warning,info,success,user_update,recommendation,snapshot` date_happened int64 POSIX timestamp of the event. Must be sent as an integer (that is no quotes). Limited to events up to 18 hours in the past and two hours in the future. device_name string A device name. host string Host name to associate with the event. Any tags associated with the host are also applied to this event. id int64 Integer ID of the event. id_str string Handling IDs as large 64-bit numbers can cause loss of accuracy issues with some programming languages. Instead, use the string representation of the Event ID to avoid losing accuracy. payload string Payload of the event. priority enum The priority of the event. For example, `normal` or `low`. Allowed enum values: `normal,low` source_type_name string The type of event being posted. Option examples include nagios, hudson, jenkins, my_apps, chef, puppet, git, bitbucket, etc. The list of standard source attribute values [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). tags [string] A list of tags to apply to the event. text string The body of the event. 
Limited to 4000 characters. The text supports markdown. To use markdown in the event text, start the text block with `%%% \n` and end the text block with `\n %%%`. Use `msg_text` with the Datadog Ruby library. title string The event title. url string URL of the event. status string A status. ``` { "events": [ { "alert_type": "info", "date_happened": "integer", "device_name": "string", "host": "string", "id": "integer", "id_str": "string", "payload": "{}", "priority": "normal", "source_type_name": "string", "tags": [ "environment:test" ], "text": "Oh boy!", "title": "Did you hear the news today?", "url": "string" } ], "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=python-legacy) ##### Get a list of events Copy ``` # Required query arguments export start="CHANGE_ME" export end="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/events?start=${start}&end=${end}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of events ``` """ Get a list of events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.events_api import EventsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.list_events( start=9223372036854775807, end=9223372036854775807, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a 
list of events ``` # Get a list of events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::EventsAPI.new p api_instance.list_events(9223372036854775807, 9223372036854775807) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of events ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) end_time = Time.now.to_i start_time = end_time - 100 dog.stream(start_time, end_time, :priority => "normal", :tags => ["-env:dev,application:web"], :unaggregated => true) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewEventsApi(apiClient) resp, r, err := api.ListEvents(ctx, 9223372036854775807, 9223372036854775807, *datadogV1.NewListEventsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.ListEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.ListEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.EventsApi; import com.datadog.api.client.v1.model.EventListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); try { EventListResponse result = apiInstance.listEvents(9223372036854775807L, 9223372036854775807L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#listEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of events ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } initialize(**options) end_time = time.time() start_time = end_time - 100 api.Event.query( start=start_time, end=end_time, priority="normal", tags=["-env:dev,application:web"], unaggregated=True ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_events::EventsAPI; use datadog_api_client::datadogV1::api_events::ListEventsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api .list_events( 9223372036854775807, 9223372036854775807, ListEventsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of events ``` /** * Get a list of events returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.EventsApi(configuration); const params: v1.EventsApiListEventsRequest = { start: 9223372036854775807, end: 9223372036854775807, }; apiInstance .listEvents(params) .then((data: v1.EventListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/eventshttps://api.ap2.datadoghq.com/api/v2/eventshttps://api.datadoghq.eu/api/v2/eventshttps://api.ddog-gov.com/api/v2/eventshttps://api.datadoghq.com/api/v2/eventshttps://api.us3.datadoghq.com/api/v2/eventshttps://api.us5.datadoghq.com/api/v2/events ### Overview List endpoint returns events that match an events search query. [Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to see your latest events. This endpoint requires the `events_read` permission. OAuth apps require the `events_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#events) to access this endpoint. 
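The cursor flow for this endpoint can also be exercised with plain HTTP, which makes the pagination mechanics easier to see than the single-page client examples further down. The sketch below is a minimal illustration, not an official client example: it assumes the third-party `requests` package, the `DD_API_KEY` and `DD_APP_KEY` environment variables, and the `datadoghq.com` site, and the example query value is purely illustrative. It resubmits the `meta.page.after` value from each response as `page[cursor]` (both fields are documented in the response model below) until no cursor is returned.

```
# Minimal pagination sketch for GET /api/v2/events (not an official client example).
# Assumes the third-party `requests` package, the DD_API_KEY / DD_APP_KEY environment
# variables, and the datadoghq.com site; adjust the host for other sites.
import os
import requests

URL = "https://api.datadoghq.com/api/v2/events"
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}


def iter_events(query, limit=100):
    """Yield events page by page, feeding meta.page.after back as page[cursor]."""
    params = {"filter[query]": query, "page[limit]": limit}
    while True:
        resp = requests.get(URL, headers=HEADERS, params=params)
        resp.raise_for_status()
        body = resp.json()
        for event in body.get("data", []):
            yield event
        cursor = body.get("meta", {}).get("page", {}).get("after")
        if not cursor:
            break  # no further pages
        params["page[cursor]"] = cursor


if __name__ == "__main__":
    # "source:my_apps" is an illustrative events search query.
    for event in iter_events("source:my_apps", limit=50):
        print(event["id"], event["attributes"].get("message"))
```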
### Arguments #### Query Strings Name Type Description filter[query] string Search query following events syntax. filter[from] string Minimum timestamp for requested events, in milliseconds. filter[to] string Maximum timestamp for requested events, in milliseconds. sort enum Order of events in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of events in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/events/#ListEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/events/#ListEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/events/#ListEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/events/#ListEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) The response object with all events matching the request and pagination information. Field Type Description data [object] An array of events matching the request. attributes object The object description of an event response attribute. attributes object Object description of attributes from your event. aggregation_key string Aggregation key of the event. date_happened int64 POSIX timestamp of the event. Must be sent as an integer (no quotation marks). Limited to events no older than 18 hours. device_name string A device name. duration int64 The duration between the triggering of the event and its recovery in nanoseconds. event_object string The event title. evt object The metadata associated with a request. id string Event ID. name string The event name. source_id int64 Event source ID. type string Event type. hostname string Host name to associate with the event. Any tags associated with the host are also applied to this event. monitor object Attributes from the monitor that triggered the event. created_at int64 The POSIX timestamp of the monitor's creation in nanoseconds. group_status int32 Monitor group status used when there is no `result_groups`. groups [string] Groups to which the monitor belongs. id int64 The monitor ID. message string The monitor message. modified int64 The monitor's last-modified timestamp. name string The monitor name. query string The query that triggers the alert. tags [string] A list of tags attached to the monitor. templated_name string The templated name of the monitor before resolving any template variables. type string The monitor type. monitor_groups [string] List of groups referred to in the event. monitor_id int64 ID of the monitor that triggered the event. When an event isn't related to a monitor, this field is empty. priority enum The priority of the event's monitor. For example, `normal` or `low`. Allowed enum values: `normal,low` related_event_id int64 Related event ID. service string Service that triggered the event. source_type_name string The type of event being posted. For example, `nagios`, `hudson`, `jenkins`, `my_apps`, `chef`, `puppet`, `git` or `bitbucket`. The list of standard source attribute values is [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). sourcecategory string Identifier for the source of the event, such as a monitor alert, an externally-submitted event, or an integration. status enum If an alert event is enabled, its status is one of the following: `failure`, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, or `snapshot`. 
Allowed enum values: `failure,error,warning,info,success,user_update,recommendation,snapshot` tags [string] A list of tags to apply to the event. timestamp int64 POSIX timestamp of your event in milliseconds. title string The event title. message string The message of the event. tags [string] An array of tags associated with the event. timestamp date-time The timestamp of the event. id string the unique ID of the event. type enum Type of the event. Allowed enum values: `event` default: `event` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Pagination attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request. status string The request status. warnings [object] A list of warnings (non-fatal errors) encountered. Partial results might be returned if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "aggregation_key": "string", "date_happened": "integer", "device_name": "string", "duration": "integer", "event_object": "Did you hear the news today?", "evt": { "id": "6509751066204996294", "name": "string", "source_id": 36, "type": "error_tracking_alert" }, "hostname": "string", "monitor": { "created_at": 1646318692000, "group_status": "integer", "groups": [], "id": "integer", "message": "string", "modified": "integer", "name": "string", "query": "string", "tags": [ "environment:test" ], "templated_name": "string", "type": "string" }, "monitor_groups": [], "monitor_id": "integer", "priority": "normal", "related_event_id": "integer", "service": "datadog-api", "source_type_name": "string", "sourcecategory": "string", "status": "info", "tags": [ "environment:test" ], "timestamp": 1652274265000, "title": "Oh boy!" }, "message": "string", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "event" } ], "links": { "next": "https://app.datadoghq.com/api/v2/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid. Results hold data from the other indexes." } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) ##### Get a list of events Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of events ``` """ Get a list of events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.events_api import EventsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.list_events() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of events ``` # Get a list of events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::EventsAPI.new p api_instance.list_events() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewEventsApi(apiClient) resp, r, err := api.ListEvents(ctx, *datadogV2.NewListEventsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.ListEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.ListEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.EventsApi; import com.datadog.api.client.v2.model.EventsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); try { EventsListResponse result = apiInstance.listEvents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#listEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of events ``` // Get a list of events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_events::EventsAPI; use datadog_api_client::datadogV2::api_events::ListEventsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.list_events(ListEventsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of events ``` /** * Get a list of events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.EventsApi(configuration); apiInstance .listEvents() .then((data: v2.EventsListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Post an event](https://docs.datadoghq.com/api/latest/events/#post-an-event) * [v1](https://docs.datadoghq.com/api/latest/events/#post-an-event-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/events/#post-an-event-v2) POST https://api.ap1.datadoghq.com/api/v1/eventshttps://api.ap2.datadoghq.com/api/v1/eventshttps://api.datadoghq.eu/api/v1/eventshttps://api.ddog-gov.com/api/v1/eventshttps://api.datadoghq.com/api/v1/eventshttps://api.us3.datadoghq.com/api/v1/eventshttps://api.us5.datadoghq.com/api/v1/events ### Overview This endpoint allows you to post events to the stream. Tag them, set priority and event aggregate them with other events. ### Request #### Body Data (required) Event request object * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Expand All Field Type Description aggregation_key string An arbitrary string to use for aggregation. Limited to 100 characters. If you specify a key, all events using that key are grouped together in the Event Stream. alert_type enum If an alert event is enabled, set its type. For example, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, and `snapshot`. Allowed enum values: `error,warning,info,success,user_update,recommendation,snapshot` date_happened int64 POSIX timestamp of the event. Must be sent as an integer (that is no quotes). Limited to events no older than 18 hours device_name string A device name. host string Host name to associate with the event. Any tags associated with the host are also applied to this event. priority enum The priority of the event. For example, `normal` or `low`. Allowed enum values: `normal,low` related_event_id int64 ID of the parent event. Must be sent as an integer (that is no quotes). source_type_name string The type of event being posted. Option examples include nagios, hudson, jenkins, my_apps, chef, puppet, git, bitbucket, etc. A complete list of source attribute values [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). tags [string] A list of tags to apply to the event. text [_required_] string The body of the event. Limited to 4000 characters. The text supports markdown. To use markdown in the event text, start the text block with `%%% \n` and end the text block with `\n %%%`. Use `msg_text` with the Datadog Ruby library. title [_required_] string The event title. 
##### Post an event returns "OK" response ``` { "title": "Example-Event", "text": "A text message.", "tags": [ "test:ExampleEvent" ] } ``` Copy ##### Post an event with a long title returns "OK" response ``` { "title": "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", "text": "A text message.", "tags": [ "test:ExampleEvent" ] } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/events/#CreateEvent-202-v1) * [400](https://docs.datadoghq.com/api/latest/events/#CreateEvent-400-v1) * [429](https://docs.datadoghq.com/api/latest/events/#CreateEvent-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Object containing an event response. Field Type Description event object Object representing an event. alert_type enum If an alert event is enabled, set its type. For example, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, and `snapshot`. Allowed enum values: `error,warning,info,success,user_update,recommendation,snapshot` date_happened int64 POSIX timestamp of the event. Must be sent as an integer (that is no quotes). Limited to events up to 18 hours in the past and two hours in the future. device_name string A device name. host string Host name to associate with the event. Any tags associated with the host are also applied to this event. id int64 Integer ID of the event. id_str string Handling IDs as large 64-bit numbers can cause loss of accuracy issues with some programming languages. Instead, use the string representation of the Event ID to avoid losing accuracy. payload string Payload of the event. priority enum The priority of the event. For example, `normal` or `low`. Allowed enum values: `normal,low` source_type_name string The type of event being posted. Option examples include nagios, hudson, jenkins, my_apps, chef, puppet, git, bitbucket, etc. The list of standard source attribute values [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). tags [string] A list of tags to apply to the event. text string The body of the event. Limited to 4000 characters. The text supports markdown. To use markdown in the event text, start the text block with `%%% \n` and end the text block with `\n %%%`. Use `msg_text` with the Datadog Ruby library. title string The event title. url string URL of the event. status string A status. ``` { "event": { "alert_type": "info", "date_happened": "integer", "device_name": "string", "host": "string", "id": "integer", "id_str": "string", "payload": "{}", "priority": "normal", "source_type_name": "string", "tags": [ "environment:test" ], "text": "Oh boy!", "title": "Did you hear the news today?", "url": "string" }, "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby-legacy) ##### Post an event returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/events" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "title": "Example-Event", "text": "A text message.", "tags": [ "test:ExampleEvent" ] } EOF ``` ##### Post an event with a long title returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/events" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "title": "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", "text": "A text message.", "tags": [ "test:ExampleEvent" ] } EOF ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.EventCreateRequest{ Title: "Example-Event", Text: "A text message.", Tags: []string{ "test:ExampleEvent", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewEventsApi(apiClient) resp, r, err := api.CreateEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.CreateEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.CreateEvent`:\n%s\n", responseContent) } ``` Copy ##### Post an event with a long title returns "OK" response ``` // Post an event with a long title returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.EventCreateRequest{ Title: "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", Text: "A text message.", Tags: []string{ "test:ExampleEvent", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV1.NewEventsApi(apiClient) resp, r, err := api.CreateEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.CreateEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.CreateEvent`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.EventsApi; import com.datadog.api.client.v1.model.EventCreateRequest; import com.datadog.api.client.v1.model.EventCreateResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); EventCreateRequest body = new EventCreateRequest() .title("Example-Event") .text("A text message.") .tags(Collections.singletonList("test:ExampleEvent")); try { EventCreateResponse result = apiInstance.createEvent(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#createEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Post an event with a long title returns "OK" response ``` // Post an event with a long title returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.EventsApi; import com.datadog.api.client.v1.model.EventCreateRequest; import com.datadog.api.client.v1.model.EventCreateResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); EventCreateRequest body = new EventCreateRequest() .title( "Example-Event very very very looooooooong looooooooooooong" + " loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+" + " characters") .text("A text message.") .tags(Collections.singletonList("test:ExampleEvent")); try { EventCreateResponse result = apiInstance.createEvent(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#createEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Post an event returns "OK" response ``` from datadog import 
initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) title = "Something big happened!" text = 'And let me tell you all about it here!' tags = ['version:1', 'application:web'] api.Event.create(title=title, text=text, tags=tags) # If you are programmatically adding a comment to this new event # you might want to insert a pause of .5 - 1 second to allow the # event to be available. ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python "example.py" ``` ##### Post an event returns "OK" response ``` """ Post an event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.events_api import EventsApi from datadog_api_client.v1.model.event_create_request import EventCreateRequest body = EventCreateRequest( title="Example-Event", text="A text message.", tags=[ "test:ExampleEvent", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.create_event(body=body) print(response) ``` Copy ##### Post an event with a long title returns "OK" response ``` """ Post an event with a long title returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.events_api import EventsApi from datadog_api_client.v1.model.event_create_request import EventCreateRequest body = EventCreateRequest( title="Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", text="A text message.", tags=[ "test:ExampleEvent", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.create_event(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Post an event returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # submitting events doesn 't require an application_key, # so we don't bother setting it dog = Dogapi::Client.new(api_key) dog.emit_event(Dogapi::Event.new('msg_text', :msg_title => 'Title')) # If you are programmatically adding a comment to this new event # you might want to insert a pause of.5 - 1 second to allow the # event to be available. 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Post an event returns "OK" response ``` # Post an event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::EventsAPI.new body = DatadogAPIClient::V1::EventCreateRequest.new({ title: "Example-Event", text: "A text message.", tags: [ "test:ExampleEvent", ], }) p api_instance.create_event(body) ``` Copy ##### Post an event with a long title returns "OK" response ``` # Post an event with a long title returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::EventsAPI.new body = DatadogAPIClient::V1::EventCreateRequest.new({ title: "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", text: "A text message.", tags: [ "test:ExampleEvent", ], }) p api_instance.create_event(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_events::EventsAPI; use datadog_api_client::datadogV1::model::EventCreateRequest; #[tokio::main] async fn main() { let body = EventCreateRequest::new("A text message.".to_string(), "Example-Event".to_string()) .tags(vec!["test:ExampleEvent".to_string()]); let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.create_event(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Post an event with a long title returns "OK" response ``` // Post an event with a long title returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_events::EventsAPI; use datadog_api_client::datadogV1::model::EventCreateRequest; #[tokio::main] async fn main() { let body = EventCreateRequest::new( "A text message.".to_string(), "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters".to_string(), ).tags(vec!["test:ExampleEvent".to_string()]); let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.create_event(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Post an event returns "OK" response ``` /** * Post an event returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; 
const configuration = client.createConfiguration(); const apiInstance = new v1.EventsApi(configuration); const params: v1.EventsApiCreateEventRequest = { body: { title: "Example-Event", text: "A text message.", tags: ["test:ExampleEvent"], }, }; apiInstance .createEvent(params) .then((data: v1.EventCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Post an event with a long title returns "OK" response ``` /** * Post an event with a long title returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.EventsApi(configuration); const params: v1.EventsApiCreateEventRequest = { body: { title: "Example-Event very very very looooooooong looooooooooooong loooooooooooooooooooooong looooooooooooooooooooooooooong title with 100+ characters", text: "A text message.", tags: ["test:ExampleEvent"], }, }; apiInstance .createEvent(params) .then((data: v1.EventCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` POST https://event-management-intake.ap1.datadoghq.com/api/v2/eventshttps://event-management-intake.ap2.datadoghq.com/api/v2/eventshttps://event-management-intake.datadoghq.eu/api/v2/eventshttps://event-management-intake.ddog-gov.com/api/v2/eventshttps://event-management-intake.datadoghq.com/api/v2/eventshttps://event-management-intake.us3.datadoghq.com/api/v2/eventshttps://event-management-intake.us5.datadoghq.com/api/v2/events ### Overview This endpoint allows you to publish events. **Note:** To utilize this endpoint with our client libraries, please ensure you are using the latest version released on or after July 1, 2025. Earlier versions do not support this functionality. **Important:** Upgrade to the latest client library version to use the updated endpoint at `https://event-management-intake.{site}/api/v2/events`. Older client library versions of the Post an event (v2) API send requests to a deprecated endpoint (`https://api.{site}/api/v2/events`). ✅ **Only events with the`change` or `alert` category** are in General Availability. For change events, see [Change Tracking](https://docs.datadoghq.com/change_tracking) for more details. ❌ For use cases involving other event categories, use the V1 endpoint or reach out to [support](https://www.datadoghq.com/support/). ❌ Notifications are not yet supported for events sent to this endpoint. Use the V1 endpoint for notification functionality. ### Request #### Body Data (required) Event creation request payload. * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Field Type Description data [_required_] object An event object. attributes [_required_] object Event attributes. aggregation_key string A string used for aggregation when [correlating](https://docs.datadoghq.com/service_management/events/correlation/) events. If you specify a key, events are deduplicated to alerts based on this key. Limited to 100 characters. 
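All of the bundled examples for this endpoint use the `change` category, so the following is a minimal sketch of what an `alert`-category payload can look like, built only from the field reference that continues below (`status` is required; `links` and `custom` are optional). It posts the payload with the Python standard library; the US1 intake host, the environment variable names, and all field values are illustrative, so adjust them for your site and use case.

```
# Sketch: post an alert-category event to the v2 intake endpoint (illustrative values).
import json
import os
import urllib.request

payload = {
    "data": {
        "type": "event",
        "attributes": {
            "category": "alert",
            "title": "payments_api error rate above threshold",  # hypothetical title
            "aggregation_key": "payments_api_error_rate",         # dedupe related alerts
            "message": "Error rate exceeded 5% over the last 5 minutes.",
            "tags": ["env:api_client_test"],
            "attributes": {
                "status": "error",  # required: warn, error, or ok
                "links": [
                    {
                        "category": "runbook",
                        "url": "https://example.com/runbooks/payments",  # illustrative URL
                        "title": "Runbook",
                    }
                ],
                "custom": {"error_rate": "5.4%"},
            },
        },
    }
}

req = urllib.request.Request(
    url="https://event-management-intake.datadoghq.com/api/v2/events",  # pick the host for your site
    data=json.dumps(payload).encode("utf-8"),
    method="POST",
    headers={
        "Accept": "application/json",
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))

# On a 202 the response is expected to carry a unique event identifier under
# data.attributes.attributes.evt.uid (see the response model later in this section).
print(body.get("data", {}).get("attributes", {}).get("attributes", {}).get("evt", {}).get("uid"))
```

The category-specific request fields (`change` and `alert`) are described next.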
attributes [_required_] JSON object for category-specific attributes. Schema is different per event category. Option 1 object Change event attributes. author object The entity that made the change. Optional, if provided it must include `type` and `name`. name [_required_] string The name of the user or system that made the change. Limited to 128 characters. type [_required_] enum Author's type. Allowed enum values: `user,system,api,automation` change_metadata object Free form JSON object with information related to the `change` event. Supports up to 100 properties per object and a maximum nesting depth of 10 levels. changed_resource [_required_] object A uniquely identified resource. name [_required_] string The name of the resource that was changed. Limited to 128 characters. type [_required_] enum The type of the resource that was changed. Allowed enum values: `feature_flag,configuration` impacted_resources [object] A list of resources impacted by this change. It is recommended to provide an impacted resource to display the change event at the correct location. Only resources of type `service` are supported. Maximum of 100 impacted resources allowed. name [_required_] string The name of the impacted resource. Limited to 128 characters. type [_required_] enum The type of the impacted resource. Allowed enum values: `service` new_value object Free form JSON object representing the new state of the changed resource. prev_value object Free form JSON object representing the previous state of the changed resource. Option 2 object Alert event attributes. custom object Free form JSON object for arbitrary data. Supports up to 100 properties per object and a maximum nesting depth of 10 levels. links [object] The links related to the event. Maximum of 20 links allowed. category [_required_] enum The category of the link. Allowed enum values: `runbook,documentation,dashboard` title string The display text of the link. Limited to 300 characters. url [_required_] string The URL of the link. Limited to 2048 characters. priority enum The priority of the alert. Allowed enum values: `1,2,3,4,5` default: `5` status [_required_] enum The status of the alert. Allowed enum values: `warn,error,ok` category [_required_] enum Event category identifying the type of event. Allowed enum values: `change,alert` host string Host name to associate with the event. Any tags associated with the host are also applied to this event. Limited to 255 characters. integration_id enum Integration ID sourced from integration manifests. Allowed enum values: `custom-events` message string Free formed text associated with the event. It's suggested to use `data.attributes.attributes.custom` for well-structured attributes. Limited to 4000 characters. tags [string] A list of tags associated with the event. Maximum of 100 tags allowed. Refer to [Tags docs](https://docs.datadoghq.com/getting_started/tagging/). timestamp string Timestamp when the event occurred. Must follow [ISO 8601](https://www.iso.org/iso-8601-date-and-time-format.html) format. For example `"2017-01-15T01:30:15.010000Z"`. Defaults to the timestamp of receipt. Limited to values no older than 18 hours. title [_required_] string The title of the event. Limited to 500 characters. type [_required_] enum Entity type. 
Allowed enum values: `event` ``` { "data": { "attributes": { "aggregation_key": "aggregation_key_123", "attributes": { "author": { "name": "example@datadog.com", "type": "user" }, "change_metadata": { "dd": { "team": "datadog_team", "user_email": "datadog@datadog.com", "user_id": "datadog_user_id", "user_name": "datadog_username" }, "resource_link": "datadog.com/feature/fallback_payments_test" }, "changed_resource": { "name": "fallback_payments_test", "type": "feature_flag" }, "impacted_resources": [ { "name": "payments_api", "type": "service" } ], "new_value": { "enabled": true, "percentage": "50%", "rule": { "datacenter": "devcycle.us1.prod" } }, "prev_value": { "enabled": true, "percentage": "10%", "rule": { "datacenter": "devcycle.us1.prod" } } }, "category": "change", "integration_id": "custom-events", "host": "test-host", "message": "payment_processed feature flag has been enabled", "tags": [ "env:api_client_test" ], "title": "payment_processed feature flag updated" }, "type": "event" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/events/#CreateEvent-202-v2) * [400](https://docs.datadoghq.com/api/latest/events/#CreateEvent-400-v2) * [403](https://docs.datadoghq.com/api/latest/events/#CreateEvent-403-v2) * [429](https://docs.datadoghq.com/api/latest/events/#CreateEvent-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Event creation response. Field Type Description data object Event object. attributes object Event attributes. attributes object JSON object for category-specific attributes. evt object JSON object of event system attributes. id string **DEPRECATED** : Event identifier. This field is deprecated and will be removed in a future version. Use the `uid` field instead. uid string A unique identifier for the event. You can use this identifier to query or reference the event. type string Entity type. links object Links to the event. self string The URL of the event. This link is only functional when using the default subdomain. ``` { "data": { "attributes": { "attributes": { "evt": { "id": "string", "uid": "string" } } }, "type": "event" }, "links": { "self": "string" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
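The 400 and 403 responses above (and the equivalent errors on the other endpoints in this section) share the same `errors` array of objects, while the 429 response reduces each entry to a plain string. As a small standard-library sketch, assuming you already hold the raw response body, the useful fields can be surfaced like this; the helper name is illustrative.

```
# Sketch: print the documented fields of an API error response body.
import json

def describe_api_errors(response_body: str) -> None:
    for error in json.loads(response_body).get("errors", []):
        if isinstance(error, str):  # 429 responses return plain strings
            print(error)
            continue
        source = error.get("source", {})
        where = source.get("pointer") or source.get("parameter") or source.get("header") or "n/a"
        print(f"{error.get('status', '?')} {error.get('title', '')}: {error.get('detail', '')} (source: {where})")

describe_api_errors(
    '{"errors": [{"detail": "Missing required attribute in body", '
    '"source": {"pointer": "/data/attributes/title"}, "status": "400", "title": "Bad Request"}]}'
)
```

The remaining fields of the error object continue in the reference below.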
meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) ##### Post an event returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://event-management-intake.ap1.datadoghq.com"https://event-management-intake.ap2.datadoghq.com"https://event-management-intake.datadoghq.eu"https://event-management-intake.ddog-gov.com"https://event-management-intake.datadoghq.com"https://event-management-intake.us3.datadoghq.com"https://event-management-intake.us5.datadoghq.com/api/v2/events" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "aggregation_key": "aggregation_key_123", "attributes": { "author": { "name": "example@datadog.com", "type": "user" }, "change_metadata": { "dd": { "team": "datadog_team", "user_email": "datadog@datadog.com", "user_id": "datadog_user_id", "user_name": "datadog_username" }, "resource_link": "datadog.com/feature/fallback_payments_test" }, "changed_resource": { "name": "fallback_payments_test", "type": "feature_flag" }, "impacted_resources": [ { "name": "payments_api", "type": "service" } ], "new_value": { "enabled": true, "percentage": "50%", "rule": { "datacenter": "devcycle.us1.prod" } }, "prev_value": { "enabled": true, "percentage": "10%", "rule": { "datacenter": "devcycle.us1.prod" } } }, "category": "change", "host": "hostname", "integration_id": "custom-events", "message": "payment_processed feature flag has been enabled", "tags": [ "env:api_client_test" ], "timestamp": "2020-01-01T01:30:15.010000Z", "title": "payment_processed feature flag updated" }, "type": "event" } } EOF ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.EventCreateRequestPayload{ Data: datadogV2.EventCreateRequest{ Attributes: datadogV2.EventPayload{ 
AggregationKey: datadog.PtrString("aggregation_key_123"), Attributes: datadogV2.EventPayloadAttributes{ ChangeEventCustomAttributes: &datadogV2.ChangeEventCustomAttributes{ Author: &datadogV2.ChangeEventCustomAttributesAuthor{ Name: "example@datadog.com", Type: datadogV2.CHANGEEVENTCUSTOMATTRIBUTESAUTHORTYPE_USER, }, ChangeMetadata: map[string]interface{}{ "dd": "{'team': 'datadog_team', 'user_email': 'datadog@datadog.com', 'user_id': 'datadog_user_id', 'user_name': 'datadog_username'}", "resource_link": "datadog.com/feature/fallback_payments_test", }, ChangedResource: datadogV2.ChangeEventCustomAttributesChangedResource{ Name: "fallback_payments_test", Type: datadogV2.CHANGEEVENTCUSTOMATTRIBUTESCHANGEDRESOURCETYPE_FEATURE_FLAG, }, ImpactedResources: []datadogV2.ChangeEventCustomAttributesImpactedResourcesItems{ { Name: "payments_api", Type: datadogV2.CHANGEEVENTCUSTOMATTRIBUTESIMPACTEDRESOURCESITEMSTYPE_SERVICE, }, }, NewValue: map[string]interface{}{ "enabled": "True", "percentage": "50%", "rule": "{'datacenter': 'devcycle.us1.prod'}", }, PrevValue: map[string]interface{}{ "enabled": "True", "percentage": "10%", "rule": "{'datacenter': 'devcycle.us1.prod'}", }, }}, Category: datadogV2.EVENTCATEGORY_CHANGE, IntegrationId: datadogV2.EVENTPAYLOADINTEGRATIONID_CUSTOM_EVENTS.Ptr(), Host: datadog.PtrString("test-host"), Message: datadog.PtrString("payment_processed feature flag has been enabled"), Tags: []string{ "env:api_client_test", }, Title: "payment_processed feature flag updated", }, Type: datadogV2.EVENTCREATEREQUESTTYPE_EVENT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewEventsApi(apiClient) resp, r, err := api.CreateEvent(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.CreateEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.CreateEvent`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.EventsApi; import com.datadog.api.client.v2.model.ChangeEventCustomAttributes; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesAuthor; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesAuthorType; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesChangedResource; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesChangedResourceType; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesImpactedResourcesItems; import com.datadog.api.client.v2.model.ChangeEventCustomAttributesImpactedResourcesItemsType; import com.datadog.api.client.v2.model.EventCategory; import com.datadog.api.client.v2.model.EventCreateRequest; import com.datadog.api.client.v2.model.EventCreateRequestPayload; import com.datadog.api.client.v2.model.EventCreateRequestType; import 
com.datadog.api.client.v2.model.EventCreateResponsePayload; import com.datadog.api.client.v2.model.EventPayload; import com.datadog.api.client.v2.model.EventPayloadAttributes; import com.datadog.api.client.v2.model.EventPayloadIntegrationId; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); EventCreateRequestPayload body = new EventCreateRequestPayload() .data( new EventCreateRequest() .attributes( new EventPayload() .aggregationKey("aggregation_key_123") .attributes( new EventPayloadAttributes( new ChangeEventCustomAttributes() .author( new ChangeEventCustomAttributesAuthor() .name("example@datadog.com") .type(ChangeEventCustomAttributesAuthorType.USER)) .changeMetadata( Map.ofEntries( Map.entry( "dd", "{'team': 'datadog_team', 'user_email':" + " 'datadog@datadog.com', 'user_id':" + " 'datadog_user_id', 'user_name':" + " 'datadog_username'}"), Map.entry( "resource_link", "datadog.com/feature/fallback_payments_test"))) .changedResource( new ChangeEventCustomAttributesChangedResource() .name("fallback_payments_test") .type( ChangeEventCustomAttributesChangedResourceType .FEATURE_FLAG)) .impactedResources( Collections.singletonList( new ChangeEventCustomAttributesImpactedResourcesItems() .name("payments_api") .type( ChangeEventCustomAttributesImpactedResourcesItemsType .SERVICE))) .newValue( Map.ofEntries( Map.entry("enabled", "True"), Map.entry("percentage", "50%"), Map.entry( "rule", "{'datacenter': 'devcycle.us1.prod'}"))) .prevValue( Map.ofEntries( Map.entry("enabled", "True"), Map.entry("percentage", "10%"), Map.entry( "rule", "{'datacenter': 'devcycle.us1.prod'}"))))) .category(EventCategory.CHANGE) .integrationId(EventPayloadIntegrationId.CUSTOM_EVENTS) .host("test-host") .message("payment_processed feature flag has been enabled") .tags(Collections.singletonList("env:api_client_test")) .title("payment_processed feature flag updated")) .type(EventCreateRequestType.EVENT)); try { EventCreateResponsePayload result = apiInstance.createEvent(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#createEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Post an event returns "OK" response ``` """ Post an event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.events_api import EventsApi from datadog_api_client.v2.model.change_event_custom_attributes import ChangeEventCustomAttributes from datadog_api_client.v2.model.change_event_custom_attributes_author import ChangeEventCustomAttributesAuthor from datadog_api_client.v2.model.change_event_custom_attributes_author_type import ChangeEventCustomAttributesAuthorType from datadog_api_client.v2.model.change_event_custom_attributes_changed_resource import ( ChangeEventCustomAttributesChangedResource, ) from 
datadog_api_client.v2.model.change_event_custom_attributes_changed_resource_type import ( ChangeEventCustomAttributesChangedResourceType, ) from datadog_api_client.v2.model.change_event_custom_attributes_impacted_resources_items import ( ChangeEventCustomAttributesImpactedResourcesItems, ) from datadog_api_client.v2.model.change_event_custom_attributes_impacted_resources_items_type import ( ChangeEventCustomAttributesImpactedResourcesItemsType, ) from datadog_api_client.v2.model.event_category import EventCategory from datadog_api_client.v2.model.event_create_request import EventCreateRequest from datadog_api_client.v2.model.event_create_request_payload import EventCreateRequestPayload from datadog_api_client.v2.model.event_create_request_type import EventCreateRequestType from datadog_api_client.v2.model.event_payload import EventPayload from datadog_api_client.v2.model.event_payload_integration_id import EventPayloadIntegrationId body = EventCreateRequestPayload( data=EventCreateRequest( attributes=EventPayload( aggregation_key="aggregation_key_123", attributes=ChangeEventCustomAttributes( author=ChangeEventCustomAttributesAuthor( name="example@datadog.com", type=ChangeEventCustomAttributesAuthorType.USER, ), change_metadata=dict( [ ( "dd", "{'team': 'datadog_team', 'user_email': 'datadog@datadog.com', 'user_id': 'datadog_user_id', 'user_name': 'datadog_username'}", ), ("resource_link", "datadog.com/feature/fallback_payments_test"), ] ), changed_resource=ChangeEventCustomAttributesChangedResource( name="fallback_payments_test", type=ChangeEventCustomAttributesChangedResourceType.FEATURE_FLAG, ), impacted_resources=[ ChangeEventCustomAttributesImpactedResourcesItems( name="payments_api", type=ChangeEventCustomAttributesImpactedResourcesItemsType.SERVICE, ), ], new_value=dict( [("enabled", "True"), ("percentage", "50%"), ("rule", "{'datacenter': 'devcycle.us1.prod'}")] ), prev_value=dict( [("enabled", "True"), ("percentage", "10%"), ("rule", "{'datacenter': 'devcycle.us1.prod'}")] ), ), category=EventCategory.CHANGE, integration_id=EventPayloadIntegrationId.CUSTOM_EVENTS, host="test-host", message="payment_processed feature flag has been enabled", tags=[ "env:api_client_test", ], title="payment_processed feature flag updated", ), type=EventCreateRequestType.EVENT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.create_event(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Post an event returns "OK" response ``` # Post an event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::EventsAPI.new body = DatadogAPIClient::V2::EventCreateRequestPayload.new({ data: DatadogAPIClient::V2::EventCreateRequest.new({ attributes: DatadogAPIClient::V2::EventPayload.new({ aggregation_key: "aggregation_key_123", attributes: DatadogAPIClient::V2::ChangeEventCustomAttributes.new({ author: DatadogAPIClient::V2::ChangeEventCustomAttributesAuthor.new({ name: "example@datadog.com", type: DatadogAPIClient::V2::ChangeEventCustomAttributesAuthorType::USER, }), change_metadata: { "dd": "{'team': 'datadog_team', 'user_email': 
'datadog@datadog.com', 'user_id': 'datadog_user_id', 'user_name': 'datadog_username'}", "resource_link": "datadog.com/feature/fallback_payments_test", }, changed_resource: DatadogAPIClient::V2::ChangeEventCustomAttributesChangedResource.new({ name: "fallback_payments_test", type: DatadogAPIClient::V2::ChangeEventCustomAttributesChangedResourceType::FEATURE_FLAG, }), impacted_resources: [ DatadogAPIClient::V2::ChangeEventCustomAttributesImpactedResourcesItems.new({ name: "payments_api", type: DatadogAPIClient::V2::ChangeEventCustomAttributesImpactedResourcesItemsType::SERVICE, }), ], new_value: { "enabled": "True", "percentage": "50%", "rule": "{'datacenter': 'devcycle.us1.prod'}", }, prev_value: { "enabled": "True", "percentage": "10%", "rule": "{'datacenter': 'devcycle.us1.prod'}", }, }), category: DatadogAPIClient::V2::EventCategory::CHANGE, integration_id: DatadogAPIClient::V2::EventPayloadIntegrationId::CUSTOM_EVENTS, host: "test-host", message: "payment_processed feature flag has been enabled", tags: [ "env:api_client_test", ], title: "payment_processed feature flag updated", }), type: DatadogAPIClient::V2::EventCreateRequestType::EVENT, }), }) p api_instance.create_event(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Post an event returns "OK" response ``` // Post an event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_events::EventsAPI; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributes; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesAuthor; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesAuthorType; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesChangedResource; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesChangedResourceType; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesImpactedResourcesItems; use datadog_api_client::datadogV2::model::ChangeEventCustomAttributesImpactedResourcesItemsType; use datadog_api_client::datadogV2::model::EventCategory; use datadog_api_client::datadogV2::model::EventCreateRequest; use datadog_api_client::datadogV2::model::EventCreateRequestPayload; use datadog_api_client::datadogV2::model::EventCreateRequestType; use datadog_api_client::datadogV2::model::EventPayload; use datadog_api_client::datadogV2::model::EventPayloadAttributes; use datadog_api_client::datadogV2::model::EventPayloadIntegrationId; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = EventCreateRequestPayload::new(EventCreateRequest::new( EventPayload::new( EventPayloadAttributes::ChangeEventCustomAttributes(Box::new( ChangeEventCustomAttributes::new(ChangeEventCustomAttributesChangedResource::new( "fallback_payments_test".to_string(), ChangeEventCustomAttributesChangedResourceType::FEATURE_FLAG, )) .author(ChangeEventCustomAttributesAuthor::new( "example@datadog.com".to_string(), ChangeEventCustomAttributesAuthorType::USER, )) .change_metadata(BTreeMap::from([( "resource_link".to_string(), Value::from("datadog.com/feature/fallback_payments_test"), )])) .impacted_resources(vec![ ChangeEventCustomAttributesImpactedResourcesItems::new( 
"payments_api".to_string(), ChangeEventCustomAttributesImpactedResourcesItemsType::SERVICE, ), ]) .new_value(BTreeMap::from([ ("enabled".to_string(), Value::from("True")), ("percentage".to_string(), Value::from("50%")), ])) .prev_value(BTreeMap::from([ ("enabled".to_string(), Value::from("True")), ("percentage".to_string(), Value::from("10%")), ])), )), EventCategory::CHANGE, "payment_processed feature flag updated".to_string(), ) .aggregation_key("aggregation_key_123".to_string()) .host("test-host".to_string()) .integration_id(EventPayloadIntegrationId::CUSTOM_EVENTS) .message("payment_processed feature flag has been enabled".to_string()) .tags(vec!["env:api_client_test".to_string()]), EventCreateRequestType::EVENT, )); let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.create_event(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Post an event returns "OK" response ``` /** * Post an event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.EventsApi(configuration); const params: v2.EventsApiCreateEventRequest = { body: { data: { attributes: { aggregationKey: "aggregation_key_123", attributes: { author: { name: "example@datadog.com", type: "user", }, changeMetadata: { dd: "{'team': 'datadog_team', 'user_email': 'datadog@datadog.com', 'user_id': 'datadog_user_id', 'user_name': 'datadog_username'}", resource_link: "datadog.com/feature/fallback_payments_test", }, changedResource: { name: "fallback_payments_test", type: "feature_flag", }, impactedResources: [ { name: "payments_api", type: "service", }, ], newValue: { enabled: "True", percentage: "50%", rule: "{'datacenter': 'devcycle.us1.prod'}", }, prevValue: { enabled: "True", percentage: "10%", rule: "{'datacenter': 'devcycle.us1.prod'}", }, }, category: "change", integrationId: "custom-events", host: "test-host", message: "payment_processed feature flag has been enabled", tags: ["env:api_client_test"], title: "payment_processed feature flag updated", }, type: "event", }, }, }; apiInstance .createEvent(params) .then((data: v2.EventCreateResponsePayload) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an event](https://docs.datadoghq.com/api/latest/events/#get-an-event) * [v1](https://docs.datadoghq.com/api/latest/events/#get-an-event-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/events/#get-an-event-v2) GET https://api.ap1.datadoghq.com/api/v1/events/{event_id}https://api.ap2.datadoghq.com/api/v1/events/{event_id}https://api.datadoghq.eu/api/v1/events/{event_id}https://api.ddog-gov.com/api/v1/events/{event_id}https://api.datadoghq.com/api/v1/events/{event_id}https://api.us3.datadoghq.com/api/v1/events/{event_id}https://api.us5.datadoghq.com/api/v1/events/{event_id} ### Overview This endpoint allows you to query for event details. **Note** : If the event you’re querying contains markdown formatting of any kind, you may see characters such as `%`,`\`,`n` in your output. This endpoint requires the `events_read` permission. OAuth apps require the `events_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#events) to access this endpoint. ### Arguments #### Path Parameters Name Type Description event_id [_required_] integer The ID of the event. ### Response * [200](https://docs.datadoghq.com/api/latest/events/#GetEvent-200-v1) * [403](https://docs.datadoghq.com/api/latest/events/#GetEvent-403-v1) * [404](https://docs.datadoghq.com/api/latest/events/#GetEvent-404-v1) * [429](https://docs.datadoghq.com/api/latest/events/#GetEvent-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Object containing an event response. Field Type Description event object Object representing an event. alert_type enum If an alert event is enabled, set its type. For example, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, and `snapshot`. Allowed enum values: `error,warning,info,success,user_update,recommendation,snapshot` date_happened int64 POSIX timestamp of the event. Must be sent as an integer (that is no quotes). Limited to events up to 18 hours in the past and two hours in the future. device_name string A device name. host string Host name to associate with the event. Any tags associated with the host are also applied to this event. id int64 Integer ID of the event. id_str string Handling IDs as large 64-bit numbers can cause loss of accuracy issues with some programming languages. Instead, use the string representation of the Event ID to avoid losing accuracy. payload string Payload of the event. priority enum The priority of the event. For example, `normal` or `low`. Allowed enum values: `normal,low` source_type_name string The type of event being posted. Option examples include nagios, hudson, jenkins, my_apps, chef, puppet, git, bitbucket, etc. The list of standard source attribute values [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). tags [string] A list of tags to apply to the event. text string The body of the event. Limited to 4000 characters. The text supports markdown. 
To use markdown in the event text, start the text block with `%%% \n` and end the text block with `\n %%%`. Use `msg_text` with the Datadog Ruby library. title string The event title. url string URL of the event. status string A status. ``` { "event": { "alert_type": "info", "date_happened": "integer", "device_name": "string", "host": "string", "id": "integer", "id_str": "string", "payload": "{}", "priority": "normal", "source_type_name": "string", "tags": [ "environment:test" ], "text": "Oh boy!", "title": "Did you hear the news today?", "url": "string" }, "status": "string" } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/events/?code-lang=python-legacy) ##### Get an event Copy ``` # Path parameters export event_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/events/${event_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an event ``` """ Get an event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.events_api import EventsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.get_event( event_id=9223372036854775807, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an event ``` # Get an event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::EventsAPI.new p 
api_instance.get_event(9223372036854775807) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an event ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) event_id = '1375909614428331251' dog.get_event(event_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an event ``` // Get an event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewEventsApi(apiClient) resp, r, err := api.GetEvent(ctx, 9223372036854775807) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.GetEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.GetEvent`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an event ``` // Get an event returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.EventsApi; import com.datadog.api.client.v1.model.EventResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); try { EventResponse result = apiInstance.getEvent(9223372036854775807L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#getEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an event ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.Event.get(2603387619536318140) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get an event ``` // Get an event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_events::EventsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.get_event(9223372036854775807).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an event ``` /** * Get an event returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.EventsApi(configuration); const params: v1.EventsApiGetEventRequest = { eventId: 9223372036854775807, }; apiInstance .getEvent(params) .then((data: v1.EventResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/events/{event_id}https://api.ap2.datadoghq.com/api/v2/events/{event_id}https://api.datadoghq.eu/api/v2/events/{event_id}https://api.ddog-gov.com/api/v2/events/{event_id}https://api.datadoghq.com/api/v2/events/{event_id}https://api.us3.datadoghq.com/api/v2/events/{event_id}https://api.us5.datadoghq.com/api/v2/events/{event_id} ### Overview Get the details of an event by `event_id`. This endpoint requires the `events_read` permission. OAuth apps require the `events_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#events) to access this endpoint. ### Arguments #### Path Parameters Name Type Description event_id [_required_] string The UID of the event. ### Response * [200](https://docs.datadoghq.com/api/latest/events/#GetEvent-200-v2) * [400](https://docs.datadoghq.com/api/latest/events/#GetEvent-400-v2) * [401](https://docs.datadoghq.com/api/latest/events/#GetEvent-401-v2) * [403](https://docs.datadoghq.com/api/latest/events/#GetEvent-403-v2) * [404](https://docs.datadoghq.com/api/latest/events/#GetEvent-404-v2) * [429](https://docs.datadoghq.com/api/latest/events/#GetEvent-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Get an event response. Field Type Description data object An event object. attributes object Event attributes. attributes JSON object for category-specific attributes. Option 1 object Change event attributes. aggregation_key string Aggregation key of the event. 
author object The entity that made the change. name string The name of the user or system that made the change. type enum The type of the author. Allowed enum values: `user,system,api,automation` change_metadata object JSON object of change metadata. changed_resource object A uniquely identified resource. name string The name of the changed resource. type enum The type of the changed resource. Allowed enum values: `feature_flag,configuration` evt object JSON object of event system attributes. category enum Event category identifying the type of event. Allowed enum values: `change,alert` id string Event identifier. This field is deprecated and will be removed in a future version. Use the `uid` field instead. integration_id enum Integration ID sourced from integration manifests. Allowed enum values: `custom-events` source_id int64 The source type ID of the event. uid string A unique identifier for the event. You can use this identifier to query or reference the event. impacted_resources [object] A list of resources impacted by this change. name string The name of the impacted resource. type enum The type of the impacted resource. Allowed enum values: `service` new_value object The new state of the changed resource. prev_value object The previous state of the changed resource. service string Service that triggered the event. timestamp int64 POSIX timestamp of the event. title string The title of the event. Option 2 object Alert event attributes. aggregation_key string Aggregation key of the event. custom object JSON object of custom attributes. evt object JSON object of event system attributes. category enum Event category identifying the type of event. Allowed enum values: `change,alert` id string Event identifier. This field is deprecated and will be removed in a future version. Use the `uid` field instead. integration_id enum Integration ID sourced from integration manifests. Allowed enum values: `custom-events` source_id int64 The source type ID of the event. uid string A unique identifier for the event. You can use this identifier to query or reference the event. links [object] The links related to the event. category enum The category of the link. Allowed enum values: `runbook,documentation,dashboard` title string The display text of the link. url string The URL of the link. priority enum The priority of the alert. Allowed enum values: `1,2,3,4,5` service string Service that triggered the event. status enum The status of the alert. Allowed enum values: `warn,error,ok` timestamp int64 POSIX timestamp of the event. title string The title of the event. message string Free-form text associated with the event. tags [string] A list of tags associated with the event. timestamp string Timestamp when the event occurred. id string The event's ID. type string Entity type. 
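Tying the model above to the example response body shown just below: a short sketch, assuming the Python client used elsewhere on this page and that its response models expose a `to_dict()` helper (verify against your client version), that pulls out the stable `uid`, the category, and the title of a fetched event. The event ID value is the one used in the code examples later in this section.

```
# Sketch: fetch an event by UID and read a few documented response fields.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.events_api import EventsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = EventsApi(api_client)
    response = api_instance.get_event(event_id="AZeF-nTCAABzkAgGXzYPtgAA")

event = response.to_dict()["data"]                  # assumes the generated to_dict() helper
category_attrs = event["attributes"]["attributes"]  # change- or alert-specific attributes
print("uid:     ", category_attrs["evt"]["uid"])    # prefer uid over the deprecated id
print("category:", category_attrs["evt"]["category"])
print("title:   ", category_attrs["title"])
print("tags:    ", event["attributes"].get("tags", []))
```

The full example response body follows.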
``` { "data": { "attributes": { "attributes": { "aggregation_key": "aggregation-key", "author": { "name": "example@datadog.com", "type": "user" }, "change_metadata": { "dd": { "team": "datadog_team", "user_email": "datadog@datadog.com", "user_id": "datadog_user_id", "user_name": "datadog_username" } }, "changed_resource": { "name": "string", "type": "feature_flag" }, "evt": { "category": "change", "id": "string", "integration_id": "custom-events", "source_id": "integer", "uid": "string" }, "impacted_resources": [ { "name": "service-name", "type": "service" } ], "new_value": { "enabled": true, "percentage": "50%", "rule": { "datacenter": "devcycle.us1.prod" } }, "prev_value": { "enabled": true, "percentage": "10%", "rule": { "datacenter": "devcycle.us1.prod" } }, "service": "service-name", "timestamp": 175019386627, "title": "The event title" }, "message": "The event message", "tags": [ "env:api_client_test" ], "timestamp": "2017-01-15T01:30:15.010000Z" }, "id": "", "type": "event" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
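Rate-limited (429) responses can be retried. Below is a minimal sketch using only the standard library and the documented path and headers; the backoff policy, the helper name, and the example UID are illustrative, and the US1 API host is assumed.

```
# Sketch: GET /api/v2/events/{event_id} with a simple backoff on 429 responses.
import json
import os
import time
import urllib.error
import urllib.request

def get_event_with_retry(event_uid: str, max_attempts: int = 5) -> dict:
    url = f"https://api.datadoghq.com/api/v2/events/{event_uid}"  # pick the host for your site
    headers = {
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    }
    for attempt in range(max_attempts):
        try:
            with urllib.request.urlopen(urllib.request.Request(url, headers=headers)) as resp:
                return json.loads(resp.read().decode("utf-8"))
        except urllib.error.HTTPError as err:
            if err.code == 429 and attempt < max_attempts - 1:
                time.sleep(2 ** attempt)  # simple exponential backoff
                continue
            raise
    raise RuntimeError("retries exhausted")

print(get_event_with_retry("AZeF-nTCAABzkAgGXzYPtgAA"))
```

The example body for this error response follows.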
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) ##### Get an event Copy ``` # Path parameters export event_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/events/${event_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an event ``` """ Get an event returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.events_api import EventsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.get_event( event_id="AZeF-nTCAABzkAgGXzYPtgAA", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an event ``` # Get an event returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::EventsAPI.new p api_instance.get_event("AZeF-nTCAABzkAgGXzYPtgAA") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an event ``` // Get an event returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewEventsApi(apiClient) resp, r, err := api.GetEvent(ctx, "AZeF-nTCAABzkAgGXzYPtgAA") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.GetEvent`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.GetEvent`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an event ``` // Get an event returns "OK" response 
import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.EventsApi; import com.datadog.api.client.v2.model.V2EventResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); try { V2EventResponse result = apiInstance.getEvent("AZeF-nTCAABzkAgGXzYPtgAA"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#getEvent"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an event ``` // Get an event returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_events::EventsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api.get_event("AZeF-nTCAABzkAgGXzYPtgAA".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an event ``` /** * Get an event returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.EventsApi(configuration); const params: v2.EventsApiGetEventRequest = { eventId: "AZeF-nTCAABzkAgGXzYPtgAA", }; apiInstance .getEvent(params) .then((data: v2.V2EventResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search events](https://docs.datadoghq.com/api/latest/events/#search-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/events/#search-events-v2) POST https://api.ap1.datadoghq.com/api/v2/events/searchhttps://api.ap2.datadoghq.com/api/v2/events/searchhttps://api.datadoghq.eu/api/v2/events/searchhttps://api.ddog-gov.com/api/v2/events/searchhttps://api.datadoghq.com/api/v2/events/searchhttps://api.us3.datadoghq.com/api/v2/events/searchhttps://api.us5.datadoghq.com/api/v2/events/search ### Overview List endpoint returns events that match an events search query. 
[Results are paginated similarly to logs](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to build complex events filtering and search. This endpoint requires the `events_read` permission. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) Field Type Description filter object The search and filter query settings. from string The minimum time for the requested events. Supports date math and regular timestamps in milliseconds. default: `now-15m` query string The search query following the event search syntax. default: `*` to string The maximum time for the requested events. Supports date math and regular timestamps in milliseconds. default: `now` options object The global query options that are used. Either provide a timezone or a time offset but not both, otherwise the query fails. timeOffset int64 The time offset to apply to the query in seconds. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Pagination settings. cursor string The returned paging point to use to get the next results. limit int32 The maximum number of logs in the response. default: `10` sort enum The sort parameters when querying events. Allowed enum values: `timestamp,-timestamp` ##### Search events returns "OK" response ``` { "filter": { "query": "datadog-agent", "from": "2020-09-17T11:48:36+01:00", "to": "2020-09-17T12:48:36+01:00" }, "sort": "timestamp", "page": { "limit": 5 } } ``` Copy ##### Search events returns "OK" response with pagination ``` { "filter": { "from": "now-15m", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/events/#SearchEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/events/#SearchEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/events/#SearchEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/events/#SearchEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) The response object with all events matching the request and pagination information. Field Type Description data [object] An array of events matching the request. attributes object The object description of an event response attribute. attributes object Object description of attributes from your event. aggregation_key string Aggregation key of the event. date_happened int64 POSIX timestamp of the event. Must be sent as an integer (no quotation marks). Limited to events no older than 18 hours. device_name string A device name. duration int64 The duration between the triggering of the event and its recovery in nanoseconds. event_object string The event title. evt object The metadata associated with a request. id string Event ID. name string The event name. source_id int64 Event source ID. type string Event type. hostname string Host name to associate with the event. Any tags associated with the host are also applied to this event. monitor object Attributes from the monitor that triggered the event. created_at int64 The POSIX timestamp of the monitor's creation in nanoseconds. group_status int32 Monitor group status used when there is no `result_groups`. groups [string] Groups to which the monitor belongs. id int64 The monitor ID. 
message string The monitor message. modified int64 The monitor's last-modified timestamp. name string The monitor name. query string The query that triggers the alert. tags [string] A list of tags attached to the monitor. templated_name string The templated name of the monitor before resolving any template variables. type string The monitor type. monitor_groups [string] List of groups referred to in the event. monitor_id int64 ID of the monitor that triggered the event. When an event isn't related to a monitor, this field is empty. priority enum The priority of the event's monitor. For example, `normal` or `low`. Allowed enum values: `normal,low` related_event_id int64 Related event ID. service string Service that triggered the event. source_type_name string The type of event being posted. For example, `nagios`, `hudson`, `jenkins`, `my_apps`, `chef`, `puppet`, `git` or `bitbucket`. The list of standard source attribute values is [available here](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). sourcecategory string Identifier for the source of the event, such as a monitor alert, an externally-submitted event, or an integration. status enum If an alert event is enabled, its status is one of the following: `failure`, `error`, `warning`, `info`, `success`, `user_update`, `recommendation`, or `snapshot`. Allowed enum values: `failure,error,warning,info,success,user_update,recommendation,snapshot` tags [string] A list of tags to apply to the event. timestamp int64 POSIX timestamp of your event in milliseconds. title string The event title. message string The message of the event. tags [string] An array of tags associated with the event. timestamp date-time The timestamp of the event. id string the unique ID of the event. type enum Type of the event. Allowed enum values: `event` default: `event` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Pagination attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request. status string The request status. warnings [object] A list of warnings (non-fatal errors) encountered. Partial results might be returned if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. 
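The `page[cursor]` mechanics described above can also be driven by hand: take `meta.page.after` from each response and send it back as `page.cursor` on the next request until no cursor is returned. Below is a minimal Python sketch of that loop, using the same `datadog-api-client` package as the examples in the Code Example section; the `to_dict()` conversion is an assumption used here only for convenient dict-style access to the response, not part of the documented model.

```python
"""
Search events with manual cursor pagination (sketch, not an official example).
Each response's meta.page.after is fed back as page.cursor until it is absent.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.events_api import EventsApi
from datadog_api_client.v2.model.events_list_request import EventsListRequest
from datadog_api_client.v2.model.events_query_filter import EventsQueryFilter
from datadog_api_client.v2.model.events_request_page import EventsRequestPage
from datadog_api_client.v2.model.events_sort import EventsSort

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = EventsApi(api_client)
    cursor = None
    while True:
        page_args = {"limit": 100}
        if cursor:
            # Feed the previous response's meta.page.after back in as page.cursor.
            page_args["cursor"] = cursor
        body = EventsListRequest(
            filter=EventsQueryFilter(query="*", _from="now-15m", to="now"),
            sort=EventsSort.TIMESTAMP_ASCENDING,
            page=EventsRequestPage(**page_args),
        )
        # to_dict() is used purely for dict-style access to the response fields.
        resp = api_instance.search_events(body=body).to_dict()
        for event in resp.get("data", []):
            print(event.get("id"), event.get("attributes", {}).get("message"))
        cursor = resp.get("meta", {}).get("page", {}).get("after")
        if not cursor:
            # No cursor in the response means there are no further pages.
            break
```

The client libraries also provide `search_events_with_pagination` helpers (shown in the Code Example section below) that drive this same loop automatically.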
``` { "data": [ { "attributes": { "attributes": { "aggregation_key": "string", "date_happened": "integer", "device_name": "string", "duration": "integer", "event_object": "Did you hear the news today?", "evt": { "id": "6509751066204996294", "name": "string", "source_id": 36, "type": "error_tracking_alert" }, "hostname": "string", "monitor": { "created_at": 1646318692000, "group_status": "integer", "groups": [], "id": "integer", "message": "string", "modified": "integer", "name": "string", "query": "string", "tags": [ "environment:test" ], "templated_name": "string", "type": "string" }, "monitor_groups": [], "monitor_id": "integer", "priority": "normal", "related_event_id": "integer", "service": "datadog-api", "source_type_name": "string", "sourcecategory": "string", "status": "info", "tags": [ "environment:test" ], "timestamp": 1652274265000, "title": "Oh boy!" }, "message": "string", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "event" } ], "links": { "next": "https://app.datadoghq.com/api/v2/events?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid. Results hold data from the other indexes." } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/events/) * [Example](https://docs.datadoghq.com/api/latest/events/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/events/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/events/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/events/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/events/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/events/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/events/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/events/?code-lang=typescript) ##### Search events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "query": "datadog-agent", "from": "2020-09-17T11:48:36+01:00", "to": "2020-09-17T12:48:36+01:00" }, "sort": "timestamp", "page": { "limit": 5 } } EOF ``` ##### Search events returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### Search events returns "OK" response ``` // Search events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.EventsListRequest{ Filter: &datadogV2.EventsQueryFilter{ Query: datadog.PtrString("datadog-agent"), From: datadog.PtrString("2020-09-17T11:48:36+01:00"), To: datadog.PtrString("2020-09-17T12:48:36+01:00"), }, Sort: datadogV2.EVENTSSORT_TIMESTAMP_ASCENDING.Ptr(), Page: &datadogV2.EventsRequestPage{ Limit: datadog.PtrInt32(5), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewEventsApi(apiClient) resp, r, err := api.SearchEvents(ctx, *datadogV2.NewSearchEventsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.SearchEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `EventsApi.SearchEvents`:\n%s\n", responseContent) } ``` Copy ##### Search events returns "OK" response with pagination ``` // Search events returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.EventsListRequest{ Filter: &datadogV2.EventsQueryFilter{ From: datadog.PtrString("now-15m"), To: datadog.PtrString("now"), }, Options: &datadogV2.EventsQueryOptions{ Timezone: 
datadog.PtrString("GMT"), }, Page: &datadogV2.EventsRequestPage{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.EVENTSSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewEventsApi(apiClient) resp, _ := api.SearchEventsWithPagination(ctx, *datadogV2.NewSearchEventsOptionalParameters().WithBody(body)) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `EventsApi.SearchEvents`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search events returns "OK" response ``` // Search events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.EventsApi; import com.datadog.api.client.v2.api.EventsApi.SearchEventsOptionalParameters; import com.datadog.api.client.v2.model.EventsListRequest; import com.datadog.api.client.v2.model.EventsListResponse; import com.datadog.api.client.v2.model.EventsQueryFilter; import com.datadog.api.client.v2.model.EventsRequestPage; import com.datadog.api.client.v2.model.EventsSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); EventsListRequest body = new EventsListRequest() .filter( new EventsQueryFilter() .query("datadog-agent") .from("2020-09-17T11:48:36+01:00") .to("2020-09-17T12:48:36+01:00")) .sort(EventsSort.TIMESTAMP_ASCENDING) .page(new EventsRequestPage().limit(5)); try { EventsListResponse result = apiInstance.searchEvents(new SearchEventsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling EventsApi#searchEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search events returns "OK" response with pagination ``` // Search events returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.EventsApi; import com.datadog.api.client.v2.api.EventsApi.SearchEventsOptionalParameters; import com.datadog.api.client.v2.model.EventResponse; import com.datadog.api.client.v2.model.EventsListRequest; import com.datadog.api.client.v2.model.EventsQueryFilter; import com.datadog.api.client.v2.model.EventsQueryOptions; import com.datadog.api.client.v2.model.EventsRequestPage; import com.datadog.api.client.v2.model.EventsSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); EventsApi apiInstance = new EventsApi(defaultClient); EventsListRequest body = new EventsListRequest() .filter(new EventsQueryFilter().from("now-15m").to("now")) .options(new 
EventsQueryOptions().timezone("GMT")) .page(new EventsRequestPage().limit(2)) .sort(EventsSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchEventsWithPagination(new SearchEventsOptionalParameters().body(body)); for (EventResponse item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println("Exception when calling EventsApi#searchEventsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search events returns "OK" response ``` """ Search events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.events_api import EventsApi from datadog_api_client.v2.model.events_list_request import EventsListRequest from datadog_api_client.v2.model.events_query_filter import EventsQueryFilter from datadog_api_client.v2.model.events_request_page import EventsRequestPage from datadog_api_client.v2.model.events_sort import EventsSort body = EventsListRequest( filter=EventsQueryFilter( query="datadog-agent", _from="2020-09-17T11:48:36+01:00", to="2020-09-17T12:48:36+01:00", ), sort=EventsSort.TIMESTAMP_ASCENDING, page=EventsRequestPage( limit=5, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) response = api_instance.search_events(body=body) print(response) ``` Copy ##### Search events returns "OK" response with pagination ``` """ Search events returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.events_api import EventsApi from datadog_api_client.v2.model.events_list_request import EventsListRequest from datadog_api_client.v2.model.events_query_filter import EventsQueryFilter from datadog_api_client.v2.model.events_query_options import EventsQueryOptions from datadog_api_client.v2.model.events_request_page import EventsRequestPage from datadog_api_client.v2.model.events_sort import EventsSort body = EventsListRequest( filter=EventsQueryFilter( _from="now-15m", to="now", ), options=EventsQueryOptions( timezone="GMT", ), page=EventsRequestPage( limit=2, ), sort=EventsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = EventsApi(api_client) items = api_instance.search_events_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search events returns "OK" response ``` # Search events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::EventsAPI.new body = DatadogAPIClient::V2::EventsListRequest.new({ filter: DatadogAPIClient::V2::EventsQueryFilter.new({ query: "datadog-agent", from: "2020-09-17T11:48:36+01:00", to: "2020-09-17T12:48:36+01:00", }), 
sort: DatadogAPIClient::V2::EventsSort::TIMESTAMP_ASCENDING, page: DatadogAPIClient::V2::EventsRequestPage.new({ limit: 5, }), }) opts = { body: body, } p api_instance.search_events(opts) ``` Copy ##### Search events returns "OK" response with pagination ``` # Search events returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::EventsAPI.new body = DatadogAPIClient::V2::EventsListRequest.new({ filter: DatadogAPIClient::V2::EventsQueryFilter.new({ from: "now-15m", to: "now", }), options: DatadogAPIClient::V2::EventsQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::EventsRequestPage.new({ limit: 2, }), sort: DatadogAPIClient::V2::EventsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } api_instance.search_events_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search events returns "OK" response ``` // Search events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_events::EventsAPI; use datadog_api_client::datadogV2::api_events::SearchEventsOptionalParams; use datadog_api_client::datadogV2::model::EventsListRequest; use datadog_api_client::datadogV2::model::EventsQueryFilter; use datadog_api_client::datadogV2::model::EventsRequestPage; use datadog_api_client::datadogV2::model::EventsSort; #[tokio::main] async fn main() { let body = EventsListRequest::new() .filter( EventsQueryFilter::new() .from("2020-09-17T11:48:36+01:00".to_string()) .query("datadog-agent".to_string()) .to("2020-09-17T12:48:36+01:00".to_string()), ) .page(EventsRequestPage::new().limit(5)) .sort(EventsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let resp = api .search_events(SearchEventsOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search events returns "OK" response with pagination ``` // Search events returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_events::EventsAPI; use datadog_api_client::datadogV2::api_events::SearchEventsOptionalParams; use datadog_api_client::datadogV2::model::EventsListRequest; use datadog_api_client::datadogV2::model::EventsQueryFilter; use datadog_api_client::datadogV2::model::EventsQueryOptions; use datadog_api_client::datadogV2::model::EventsRequestPage; use datadog_api_client::datadogV2::model::EventsSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = EventsListRequest::new() .filter( EventsQueryFilter::new() .from("now-15m".to_string()) .to("now".to_string()), ) .options(EventsQueryOptions::new().timezone("GMT".to_string())) .page(EventsRequestPage::new().limit(2)) .sort(EventsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = EventsAPI::with_config(configuration); let response = api.search_events_with_pagination(SearchEventsOptionalParams::default().body(body)); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search events returns "OK" response ``` /** * Search events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.EventsApi(configuration); const params: v2.EventsApiSearchEventsRequest = { body: { filter: { query: "datadog-agent", from: "2020-09-17T11:48:36+01:00", to: "2020-09-17T12:48:36+01:00", }, sort: "timestamp", page: { limit: 5, }, }, }; apiInstance .searchEvents(params) .then((data: v2.EventsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search events returns "OK" response with pagination ``` /** * Search events returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.EventsApi(configuration); const params: v2.EventsApiSearchEventsRequest = { body: { filter: { from: "now-15m", to: "now", }, options: { timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchEventsWithPagination(params)) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=7c139f4a-682b-4b25-8c60-0db2e3102762&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Events&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fevents%2F&r=&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=493494) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=86abcc51-2b4f-4cd5-ad31-687cb609b671&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=28637b3d-65d5-4a3a-b8fc-44d718e872c8&pt=Events&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fevents%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=86abcc51-2b4f-4cd5-ad31-687cb609b671&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=28637b3d-65d5-4a3a-b8fc-44d718e872c8&pt=Events&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fevents%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) --- # Source: https://docs.datadoghq.com/api/latest/fastly-integration/ # Fastly Integration Manage your Datadog Fastly integration accounts and services 
directly through the Datadog API. See the [Fastly integration page](https://docs.datadoghq.com/integrations/fastly/) for more information. ## [List Fastly accounts](https://docs.datadoghq.com/api/latest/fastly-integration/#list-fastly-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#list-fastly-accounts-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.datadoghq.eu/api/v2/integrations/fastly/accountshttps://api.ddog-gov.com/api/v2/integrations/fastly/accountshttps://api.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.us3.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts ### Overview List Fastly accounts. This endpoint requires the `integrations_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyAccounts-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyAccounts-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyAccounts-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyAccounts-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyAccounts-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting Fastly accounts. Field Type Description data [object] The JSON:API data schema. attributes [_required_] object Attributes object of a Fastly account. name [_required_] string The name of the Fastly account. services [object] A list of services belonging to the parent account. id [_required_] string The ID of the Fastly service tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `fastly-accounts`. Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": [ { "attributes": { "name": "test-name", "services": [ { "id": "6abc7de6893AbcDe9fghIj", "tags": [ "myTag", "myTag2:myValue" ] } ] }, "id": "abc123", "type": "fastly-accounts" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### List Fastly accounts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Fastly accounts ``` """ List Fastly accounts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = api_instance.list_fastly_accounts() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Fastly accounts ``` # List Fastly accounts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new p api_instance.list_fastly_accounts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Fastly accounts ``` // List Fastly accounts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.ListFastlyAccounts(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.ListFastlyAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.ListFastlyAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Fastly accounts ``` // List Fastly accounts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); try { FastlyAccountsResponse result = apiInstance.listFastlyAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#listFastlyAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Fastly accounts ``` // List Fastly accounts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api.list_fastly_accounts().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Fastly accounts ``` /** * List Fastly accounts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); apiInstance .listFastlyAccounts() .then((data: v2.FastlyAccountsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Fastly account](https://docs.datadoghq.com/api/latest/fastly-integration/#add-fastly-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#add-fastly-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.datadoghq.eu/api/v2/integrations/fastly/accountshttps://api.ddog-gov.com/api/v2/integrations/fastly/accountshttps://api.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.us3.datadoghq.com/api/v2/integrations/fastly/accountshttps://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts ### Overview Create a Fastly account. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) Field Type Description data [_required_] object Data object for creating a Fastly account. attributes [_required_] object Attributes object for creating a Fastly account. api_key [_required_] string The API key for the Fastly account. name [_required_] string The name of the Fastly account. services [object] A list of services belonging to the parent account. id [_required_] string The ID of the Fastly service tags [string] A list of tags for the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-accounts`. Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": { "attributes": { "api_key": "ExampleFastlyIntegration", "name": "Example-Fastly-Integration", "services": [] }, "type": "fastly-accounts" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyAccount-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly account. Field Type Description data object Data object of a Fastly account. attributes [_required_] object Attributes object of a Fastly account. name [_required_] string The name of the Fastly account. services [object] A list of services belonging to the parent account. id [_required_] string The ID of the Fastly service tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `fastly-accounts`. 
Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": { "attributes": { "name": "test-name", "services": [ { "id": "6abc7de6893AbcDe9fghIj", "tags": [ "myTag", "myTag2:myValue" ] } ] }, "id": "abc123", "type": "fastly-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### Add Fastly account returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "api_key": "ExampleFastlyIntegration", "name": "Example-Fastly-Integration", "services": [] }, "type": "fastly-accounts" } } EOF ``` ##### Add Fastly account returns "CREATED" response ``` // Add Fastly account returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FastlyAccountCreateRequest{ Data: datadogV2.FastlyAccountCreateRequestData{ Attributes: datadogV2.FastlyAccountCreateRequestAttributes{ ApiKey: "ExampleFastlyIntegration", Name: "Example-Fastly-Integration", Services: []datadogV2.FastlyService{}, }, Type: datadogV2.FASTLYACCOUNTTYPE_FASTLY_ACCOUNTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := 
api.CreateFastlyAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.CreateFastlyAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.CreateFastlyAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add Fastly account returns "CREATED" response ``` // Add Fastly account returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyAccountCreateRequest; import com.datadog.api.client.v2.model.FastlyAccountCreateRequestAttributes; import com.datadog.api.client.v2.model.FastlyAccountCreateRequestData; import com.datadog.api.client.v2.model.FastlyAccountResponse; import com.datadog.api.client.v2.model.FastlyAccountType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); FastlyAccountCreateRequest body = new FastlyAccountCreateRequest() .data( new FastlyAccountCreateRequestData() .attributes( new FastlyAccountCreateRequestAttributes() .apiKey("ExampleFastlyIntegration") .name("Example-Fastly-Integration")) .type(FastlyAccountType.FASTLY_ACCOUNTS)); try { FastlyAccountResponse result = apiInstance.createFastlyAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#createFastlyAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add Fastly account returns "CREATED" response ``` """ Add Fastly account returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi from datadog_api_client.v2.model.fastly_account_create_request import FastlyAccountCreateRequest from datadog_api_client.v2.model.fastly_account_create_request_attributes import FastlyAccountCreateRequestAttributes from datadog_api_client.v2.model.fastly_account_create_request_data import FastlyAccountCreateRequestData from datadog_api_client.v2.model.fastly_account_type import FastlyAccountType body = FastlyAccountCreateRequest( data=FastlyAccountCreateRequestData( attributes=FastlyAccountCreateRequestAttributes( api_key="ExampleFastlyIntegration", name="Example-Fastly-Integration", services=[], ), type=FastlyAccountType.FASTLY_ACCOUNTS, ), ) configuration = Configuration() with 
ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = api_instance.create_fastly_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add Fastly account returns "CREATED" response ``` # Add Fastly account returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new body = DatadogAPIClient::V2::FastlyAccountCreateRequest.new({ data: DatadogAPIClient::V2::FastlyAccountCreateRequestData.new({ attributes: DatadogAPIClient::V2::FastlyAccountCreateRequestAttributes.new({ api_key: "ExampleFastlyIntegration", name: "Example-Fastly-Integration", services: [], }), type: DatadogAPIClient::V2::FastlyAccountType::FASTLY_ACCOUNTS, }), }) p api_instance.create_fastly_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add Fastly account returns "CREATED" response ``` // Add Fastly account returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; use datadog_api_client::datadogV2::model::FastlyAccountCreateRequest; use datadog_api_client::datadogV2::model::FastlyAccountCreateRequestAttributes; use datadog_api_client::datadogV2::model::FastlyAccountCreateRequestData; use datadog_api_client::datadogV2::model::FastlyAccountType; #[tokio::main] async fn main() { let body = FastlyAccountCreateRequest::new(FastlyAccountCreateRequestData::new( FastlyAccountCreateRequestAttributes::new( "ExampleFastlyIntegration".to_string(), "Example-Fastly-Integration".to_string(), ) .services(vec![]), FastlyAccountType::FASTLY_ACCOUNTS, )); let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api.create_fastly_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add Fastly account returns "CREATED" response ``` /** * Add Fastly account returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiCreateFastlyAccountRequest = { body: { data: { attributes: { apiKey: "ExampleFastlyIntegration", name: "Example-Fastly-Integration", services: [], }, type: "fastly-accounts", }, }, }; apiInstance .createFastlyAccount(params) .then((data: v2.FastlyAccountResponse) => { console.log( "API 
called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Fastly account](https://docs.datadoghq.com/api/latest/fastly-integration/#get-fastly-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#get-fastly-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id} ### Overview Get a Fastly account. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly account. Field Type Description data object Data object of a Fastly account. attributes [_required_] object Attributes object of a Fastly account. name [_required_] string The name of the Fastly account. services [object] A list of services belonging to the parent account. id [_required_] string The ID of the Fastly service tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `fastly-accounts`. Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": { "attributes": { "name": "test-name", "services": [ { "id": "6abc7de6893AbcDe9fghIj", "tags": [ "myTag", "myTag2:myValue" ] } ] }, "id": "abc123", "type": "fastly-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript)

##### Get Fastly account

```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site,
# for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get Fastly account

```
"""
Get Fastly account returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi

# there is a valid "fastly_account" in the system
FASTLY_ACCOUNT_DATA_ID = environ["FASTLY_ACCOUNT_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = FastlyIntegrationApi(api_client)
    response = api_instance.get_fastly_account(
        account_id=FASTLY_ACCOUNT_DATA_ID,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get Fastly account

```
# Get Fastly account returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new

# there is a valid "fastly_account" in the system
FASTLY_ACCOUNT_DATA_ID = ENV["FASTLY_ACCOUNT_DATA_ID"]
p api_instance.get_fastly_account(FASTLY_ACCOUNT_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get Fastly account

```
// Get Fastly account returns "OK" response
package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "fastly_account" in the system FastlyAccountDataID := os.Getenv("FASTLY_ACCOUNT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.GetFastlyAccount(ctx, FastlyAccountDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.GetFastlyAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.GetFastlyAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Fastly account ``` // Get Fastly account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); // there is a valid "fastly_account" in the system String FASTLY_ACCOUNT_DATA_ID = System.getenv("FASTLY_ACCOUNT_DATA_ID"); try { FastlyAccountResponse result = apiInstance.getFastlyAccount(FASTLY_ACCOUNT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#getFastlyAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Fastly account ``` // Get Fastly account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "fastly_account" in the system let fastly_account_data_id = std::env::var("FASTLY_ACCOUNT_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api.get_fastly_account(fastly_account_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` 
##### Get Fastly account ``` /** * Get Fastly account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); // there is a valid "fastly_account" in the system const FASTLY_ACCOUNT_DATA_ID = process.env.FASTLY_ACCOUNT_DATA_ID as string; const params: v2.FastlyIntegrationApiGetFastlyAccountRequest = { accountId: FASTLY_ACCOUNT_DATA_ID, }; apiInstance .getFastlyAccount(params) .then((data: v2.FastlyAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Fastly account](https://docs.datadoghq.com/api/latest/fastly-integration/#update-fastly-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#update-fastly-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id} ### Overview Update a Fastly account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) Field Type Description data [_required_] object Data object for updating a Fastly account. attributes object Attributes object for updating a Fastly account. api_key string The API key of the Fastly account. name string The name of the Fastly account. type enum The JSON:API type for this API. Should always be `fastly-accounts`. Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": { "attributes": { "api_key": "update-secret" }, "type": "fastly-accounts" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly account. Field Type Description data object Data object of a Fastly account. attributes [_required_] object Attributes object of a Fastly account. 
name [_required_] string The name of the Fastly account. services [object] A list of services belonging to the parent account. id [_required_] string The ID of the Fastly service tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly account, a hash of the account name. type [_required_] enum The JSON:API type for this API. Should always be `fastly-accounts`. Allowed enum values: `fastly-accounts` default: `fastly-accounts` ``` { "data": { "attributes": { "name": "test-name", "services": [ { "id": "6abc7de6893AbcDe9fghIj", "tags": [ "myTag", "myTag2:myValue" ] } ] }, "id": "abc123", "type": "fastly-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### Update Fastly account returns "OK" response Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "api_key": "update-secret" }, "type": "fastly-accounts" } } EOF ``` ##### Update Fastly account returns "OK" response ``` // Update Fastly account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "fastly_account" in the system FastlyAccountDataID := os.Getenv("FASTLY_ACCOUNT_DATA_ID") body := datadogV2.FastlyAccountUpdateRequest{ Data: datadogV2.FastlyAccountUpdateRequestData{ Attributes: &datadogV2.FastlyAccountUpdateRequestAttributes{ ApiKey: datadog.PtrString("update-secret"), }, Type: datadogV2.FASTLYACCOUNTTYPE_FASTLY_ACCOUNTS.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.UpdateFastlyAccount(ctx, FastlyAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.UpdateFastlyAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.UpdateFastlyAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Fastly account returns "OK" response ``` // Update Fastly account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyAccountResponse; import com.datadog.api.client.v2.model.FastlyAccountType; import com.datadog.api.client.v2.model.FastlyAccountUpdateRequest; import com.datadog.api.client.v2.model.FastlyAccountUpdateRequestAttributes; import com.datadog.api.client.v2.model.FastlyAccountUpdateRequestData; public class Example { public static 
void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); // there is a valid "fastly_account" in the system String FASTLY_ACCOUNT_DATA_ID = System.getenv("FASTLY_ACCOUNT_DATA_ID"); FastlyAccountUpdateRequest body = new FastlyAccountUpdateRequest() .data( new FastlyAccountUpdateRequestData() .attributes(new FastlyAccountUpdateRequestAttributes().apiKey("update-secret")) .type(FastlyAccountType.FASTLY_ACCOUNTS)); try { FastlyAccountResponse result = apiInstance.updateFastlyAccount(FASTLY_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#updateFastlyAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Fastly account returns "OK" response ``` """ Update Fastly account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi from datadog_api_client.v2.model.fastly_account_type import FastlyAccountType from datadog_api_client.v2.model.fastly_account_update_request import FastlyAccountUpdateRequest from datadog_api_client.v2.model.fastly_account_update_request_attributes import FastlyAccountUpdateRequestAttributes from datadog_api_client.v2.model.fastly_account_update_request_data import FastlyAccountUpdateRequestData # there is a valid "fastly_account" in the system FASTLY_ACCOUNT_DATA_ID = environ["FASTLY_ACCOUNT_DATA_ID"] body = FastlyAccountUpdateRequest( data=FastlyAccountUpdateRequestData( attributes=FastlyAccountUpdateRequestAttributes( api_key="update-secret", ), type=FastlyAccountType.FASTLY_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = api_instance.update_fastly_account(account_id=FASTLY_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Fastly account returns "OK" response ``` # Update Fastly account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new # there is a valid "fastly_account" in the system FASTLY_ACCOUNT_DATA_ID = ENV["FASTLY_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::FastlyAccountUpdateRequest.new({ data: DatadogAPIClient::V2::FastlyAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::FastlyAccountUpdateRequestAttributes.new({ api_key: "update-secret", }), type: DatadogAPIClient::V2::FastlyAccountType::FASTLY_ACCOUNTS, }), }) p 
api_instance.update_fastly_account(FASTLY_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Fastly account returns "OK" response ``` // Update Fastly account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; use datadog_api_client::datadogV2::model::FastlyAccountType; use datadog_api_client::datadogV2::model::FastlyAccountUpdateRequest; use datadog_api_client::datadogV2::model::FastlyAccountUpdateRequestAttributes; use datadog_api_client::datadogV2::model::FastlyAccountUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "fastly_account" in the system let fastly_account_data_id = std::env::var("FASTLY_ACCOUNT_DATA_ID").unwrap(); let body = FastlyAccountUpdateRequest::new( FastlyAccountUpdateRequestData::new() .attributes( FastlyAccountUpdateRequestAttributes::new().api_key("update-secret".to_string()), ) .type_(FastlyAccountType::FASTLY_ACCOUNTS), ); let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api .update_fastly_account(fastly_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Fastly account returns "OK" response ``` /** * Update Fastly account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); // there is a valid "fastly_account" in the system const FASTLY_ACCOUNT_DATA_ID = process.env.FASTLY_ACCOUNT_DATA_ID as string; const params: v2.FastlyIntegrationApiUpdateFastlyAccountRequest = { body: { data: { attributes: { apiKey: "update-secret", }, type: "fastly-accounts", }, }, accountId: FASTLY_ACCOUNT_DATA_ID, }; apiInstance .updateFastlyAccount(params) .then((data: v2.FastlyAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Fastly account](https://docs.datadoghq.com/api/latest/fastly-integration/#delete-fastly-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#delete-fastly-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id} ### Overview Delete a Fastly account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. ### Response * [204](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyAccount-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript)

##### Delete Fastly account

```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site,
# for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete Fastly account

```
"""
Delete Fastly account returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = FastlyIntegrationApi(api_client)
    api_instance.delete_fastly_account(
        account_id="account_id",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete Fastly account

```
# Delete Fastly account returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new
api_instance.delete_fastly_account("account_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete Fastly account

```
// Delete Fastly account returns "OK" response
package main

import (
    "context"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewFastlyIntegrationApi(apiClient)
    r, err := api.DeleteFastlyAccount(ctx, "account_id")

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.DeleteFastlyAccount`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following command, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete Fastly account

```
//
Delete Fastly account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); try { apiInstance.deleteFastlyAccount("account_id"); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#deleteFastlyAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Fastly account ``` // Delete Fastly account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api.delete_fastly_account("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Fastly account ``` /** * Delete Fastly account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiDeleteFastlyAccountRequest = { accountId: "account_id", }; apiInstance .deleteFastlyAccount(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Fastly services](https://docs.datadoghq.com/api/latest/fastly-integration/#list-fastly-services) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#list-fastly-services-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services ### Overview List Fastly services for an account. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyServices-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyServices-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyServices-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyServices-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#ListFastlyServices-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting Fastly services. Field Type Description data [object] The JSON:API data schema. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. Allowed enum values: `fastly-services` default: `fastly-services` ``` { "data": [ { "attributes": { "tags": [ "myTag", "myTag2:myValue" ] }, "id": "abc123", "type": "fastly-services" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### List Fastly services Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Fastly services ``` """ List Fastly services returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = api_instance.list_fastly_services( account_id="account_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Fastly services ``` # List Fastly services returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new p api_instance.list_fastly_services("account_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Fastly services ``` // List Fastly services returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.ListFastlyServices(ctx, "account_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.ListFastlyServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.ListFastlyServices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Fastly services ``` // List Fastly services returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyServicesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); try { FastlyServicesResponse result = apiInstance.listFastlyServices("account_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#listFastlyServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Fastly services ``` // List Fastly services returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api.list_fastly_services("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Fastly services ``` /** * List Fastly services returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiListFastlyServicesRequest = { accountId: "account_id", }; apiInstance .listFastlyServices(params) .then((data: v2.FastlyServicesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Fastly service](https://docs.datadoghq.com/api/latest/fastly-integration/#add-fastly-service) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#add-fastly-service-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/serviceshttps://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services ### Overview Create a Fastly service for an account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) Field Type Description data [_required_] object Data object for Fastly service requests. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. Allowed enum values: `fastly-services` default: `fastly-services` ``` { "data": { "attributes": { "tags": [ "myTag", "myTag2:myValue" ] }, "id": "abc123", "type": "fastly-services" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyService-201-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyService-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyService-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyService-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#CreateFastlyService-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly service. Field Type Description data object Data object for Fastly service requests. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. 
Allowed enum values: `fastly-services` default: `fastly-services`

```
{
  "data": {
    "attributes": {
      "tags": [
        "myTag",
        "myTag2:myValue"
      ]
    },
    "id": "abc123",
    "type": "fastly-services"
  }
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/fastly-integration/)
* [Example](https://docs.datadoghq.com/api/latest/fastly-integration/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript)

##### Add Fastly service

```
# Path parameters
export account_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site,
# for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}/services" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "abc123",
    "type": "fastly-services"
  }
}
EOF
```

##### Add Fastly service

```
"""
Add Fastly service returns "CREATED" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi
from datadog_api_client.v2.model.fastly_service_attributes import FastlyServiceAttributes
from datadog_api_client.v2.model.fastly_service_data import FastlyServiceData
from datadog_api_client.v2.model.fastly_service_request import FastlyServiceRequest
from datadog_api_client.v2.model.fastly_service_type import FastlyServiceType

body = FastlyServiceRequest(
    data=FastlyServiceData(
        attributes=FastlyServiceAttributes(
            tags=[
                "myTag",
                "myTag2:myValue",
            ],
        ),
        id="abc123",
        type=FastlyServiceType.FASTLY_SERVICES,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = FastlyIntegrationApi(api_client)
    response = api_instance.create_fastly_service(account_id="account_id", body=body)
    print(response)
```
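To confirm that the service registered by the example above is now attached to the account, one option is to follow up with the `list_fastly_services` call documented earlier on this page. The sketch below reuses the placeholder values from this section (`account_id` and service ID `abc123`) and assumes the generated response models expose `data` and each service's `id` as attributes; it can be run the same way as the Python example above.

```
# Verify that the service created above appears on the account (illustrative sketch).
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = FastlyIntegrationApi(api_client)
    response = api_instance.list_fastly_services(account_id="account_id")
    # `data` is the documented list of service objects for the account.
    service_ids = [service.id for service in (response.data or [])]
    print("abc123 attached:", "abc123" in service_ids)
```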
Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add Fastly service ``` # Add Fastly service returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new body = DatadogAPIClient::V2::FastlyServiceRequest.new({ data: DatadogAPIClient::V2::FastlyServiceData.new({ attributes: DatadogAPIClient::V2::FastlyServiceAttributes.new({ tags: [ "myTag", "myTag2:myValue", ], }), id: "abc123", type: DatadogAPIClient::V2::FastlyServiceType::FASTLY_SERVICES, }), }) p api_instance.create_fastly_service("account_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add Fastly service ``` // Add Fastly service returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FastlyServiceRequest{ Data: datadogV2.FastlyServiceData{ Attributes: &datadogV2.FastlyServiceAttributes{ Tags: []string{ "myTag", "myTag2:myValue", }, }, Id: "abc123", Type: datadogV2.FASTLYSERVICETYPE_FASTLY_SERVICES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.CreateFastlyService(ctx, "account_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.CreateFastlyService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.CreateFastlyService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add Fastly service ``` // Add Fastly service returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyServiceAttributes; import com.datadog.api.client.v2.model.FastlyServiceData; import com.datadog.api.client.v2.model.FastlyServiceRequest; import com.datadog.api.client.v2.model.FastlyServiceResponse; import com.datadog.api.client.v2.model.FastlyServiceType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); FastlyServiceRequest body = new FastlyServiceRequest() .data( 
new FastlyServiceData() .attributes( new FastlyServiceAttributes() .tags(Arrays.asList("myTag", "myTag2:myValue"))) .id("abc123") .type(FastlyServiceType.FASTLY_SERVICES)); try { FastlyServiceResponse result = apiInstance.createFastlyService("account_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#createFastlyService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add Fastly service ``` // Add Fastly service returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; use datadog_api_client::datadogV2::model::FastlyServiceAttributes; use datadog_api_client::datadogV2::model::FastlyServiceData; use datadog_api_client::datadogV2::model::FastlyServiceRequest; use datadog_api_client::datadogV2::model::FastlyServiceType; #[tokio::main] async fn main() { let body = FastlyServiceRequest::new( FastlyServiceData::new("abc123".to_string(), FastlyServiceType::FASTLY_SERVICES) .attributes( FastlyServiceAttributes::new() .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api .create_fastly_service("account_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add Fastly service ``` /** * Add Fastly service returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiCreateFastlyServiceRequest = { body: { data: { attributes: { tags: ["myTag", "myTag2:myValue"], }, id: "abc123", type: "fastly-services", }, }, accountId: "account_id", }; apiInstance .createFastlyService(params) .then((data: v2.FastlyServiceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Fastly service](https://docs.datadoghq.com/api/latest/fastly-integration/#get-fastly-service) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#get-fastly-service-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id} ### Overview Get a Fastly service for an account. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. service_id [_required_] string Fastly Service ID. ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyService-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyService-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyService-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyService-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#GetFastlyService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly service. Field Type Description data object Data object for Fastly service requests. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. Allowed enum values: `fastly-services` default: `fastly-services` ``` { "data": { "attributes": { "tags": [ "myTag", "myTag2:myValue" ] }, "id": "abc123", "type": "fastly-services" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### Get Fastly service Copy ``` # Path parameters export account_id="CHANGE_ME" export service_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}/services/${service_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Fastly service ``` """ Get Fastly service returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = api_instance.get_fastly_service( account_id="account_id", service_id="service_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Fastly service ``` # Get Fastly service returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new p api_instance.get_fastly_service("account_id", "service_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Fastly service ``` // Get Fastly service returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.GetFastlyService(ctx, "account_id", "service_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.GetFastlyService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.GetFastlyService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Fastly service ``` // Get Fastly service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyServiceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); try { FastlyServiceResponse result = apiInstance.getFastlyService("account_id", "service_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#getFastlyService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Fastly service ``` // Get Fastly service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api .get_fastly_service("account_id".to_string(), "service_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Fastly service ``` /** * Get Fastly service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiGetFastlyServiceRequest = { accountId: "account_id", serviceId: "service_id", }; apiInstance .getFastlyService(params) .then((data: v2.FastlyServiceResponse) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Fastly service](https://docs.datadoghq.com/api/latest/fastly-integration/#update-fastly-service) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#update-fastly-service-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id} ### Overview Update a Fastly service for an account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. service_id [_required_] string Fastly Service ID. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) Field Type Description data [_required_] object Data object for Fastly service requests. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. Allowed enum values: `fastly-services` default: `fastly-services` ``` { "data": { "attributes": { "tags": [ "myTag", "myTag2:myValue" ] }, "id": "abc123", "type": "fastly-services" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyService-200-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyService-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyService-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyService-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#UpdateFastlyService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) The expected response schema when getting a Fastly service. Field Type Description data object Data object for Fastly service requests. attributes object Attributes object for Fastly service requests. tags [string] A list of tags for the Fastly service. id [_required_] string The ID of the Fastly service. type [_required_] enum The JSON:API type for this API. Should always be `fastly-services`. 
Allowed enum values: `fastly-services` default: `fastly-services` ``` { "data": { "attributes": { "tags": [ "myTag", "myTag2:myValue" ] }, "id": "abc123", "type": "fastly-services" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### Update Fastly service Copy ``` # Path parameters export account_id="CHANGE_ME" export service_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}/services/${service_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "abc123", "type": "fastly-services" } } EOF ``` ##### Update Fastly service ``` """ Update Fastly service returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi from datadog_api_client.v2.model.fastly_service_attributes import FastlyServiceAttributes from datadog_api_client.v2.model.fastly_service_data import FastlyServiceData from datadog_api_client.v2.model.fastly_service_request import FastlyServiceRequest from datadog_api_client.v2.model.fastly_service_type import FastlyServiceType body = FastlyServiceRequest( data=FastlyServiceData( attributes=FastlyServiceAttributes( tags=[ "myTag", "myTag2:myValue", ], ), id="abc123", type=FastlyServiceType.FASTLY_SERVICES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) response = 
api_instance.update_fastly_service(account_id="account_id", service_id="service_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Fastly service ``` # Update Fastly service returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new body = DatadogAPIClient::V2::FastlyServiceRequest.new({ data: DatadogAPIClient::V2::FastlyServiceData.new({ attributes: DatadogAPIClient::V2::FastlyServiceAttributes.new({ tags: [ "myTag", "myTag2:myValue", ], }), id: "abc123", type: DatadogAPIClient::V2::FastlyServiceType::FASTLY_SERVICES, }), }) p api_instance.update_fastly_service("account_id", "service_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Fastly service ``` // Update Fastly service returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FastlyServiceRequest{ Data: datadogV2.FastlyServiceData{ Attributes: &datadogV2.FastlyServiceAttributes{ Tags: []string{ "myTag", "myTag2:myValue", }, }, Id: "abc123", Type: datadogV2.FASTLYSERVICETYPE_FASTLY_SERVICES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) resp, r, err := api.UpdateFastlyService(ctx, "account_id", "service_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.UpdateFastlyService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FastlyIntegrationApi.UpdateFastlyService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Fastly service ``` // Update Fastly service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; import com.datadog.api.client.v2.model.FastlyServiceAttributes; import com.datadog.api.client.v2.model.FastlyServiceData; import com.datadog.api.client.v2.model.FastlyServiceRequest; import com.datadog.api.client.v2.model.FastlyServiceResponse; import com.datadog.api.client.v2.model.FastlyServiceType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); FastlyServiceRequest body = new FastlyServiceRequest() .data( new FastlyServiceData() .attributes( new FastlyServiceAttributes() .tags(Arrays.asList("myTag", "myTag2:myValue"))) .id("abc123") .type(FastlyServiceType.FASTLY_SERVICES)); try { FastlyServiceResponse result = apiInstance.updateFastlyService("account_id", "service_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#updateFastlyService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Fastly service ``` // Update Fastly service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; use datadog_api_client::datadogV2::model::FastlyServiceAttributes; use datadog_api_client::datadogV2::model::FastlyServiceData; use datadog_api_client::datadogV2::model::FastlyServiceRequest; use datadog_api_client::datadogV2::model::FastlyServiceType; #[tokio::main] async fn main() { let body = FastlyServiceRequest::new( FastlyServiceData::new("abc123".to_string(), FastlyServiceType::FASTLY_SERVICES) .attributes( FastlyServiceAttributes::new() .tags(vec!["myTag".to_string(), "myTag2:myValue".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api .update_fastly_service("account_id".to_string(), "service_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Fastly service ``` /** * Update Fastly service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiUpdateFastlyServiceRequest = { body: { data: { attributes: { tags: ["myTag", "myTag2:myValue"], }, id: "abc123", type: "fastly-services", }, }, accountId: "account_id", serviceId: "service_id", }; apiInstance .updateFastlyService(params) .then((data: v2.FastlyServiceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Fastly service](https://docs.datadoghq.com/api/latest/fastly-integration/#delete-fastly-service) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fastly-integration/#delete-fastly-service-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ap2.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.eu/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.ddog-gov.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us3.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id}https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/{account_id}/services/{service_id} ### Overview Delete a Fastly service for an account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Fastly Account id. service_id [_required_] string Fastly Service ID. ### Response * [204](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyService-204-v2) * [400](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyService-400-v2) * [403](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyService-403-v2) * [404](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyService-404-v2) * [429](https://docs.datadoghq.com/api/latest/fastly-integration/#DeleteFastlyService-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fastly-integration/) * [Example](https://docs.datadoghq.com/api/latest/fastly-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fastly-integration/?code-lang=typescript) ##### Delete Fastly service Copy ``` # Path parameters export account_id="CHANGE_ME" export service_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/fastly/accounts/${account_id}/services/${service_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Fastly service ``` """ Delete Fastly service returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fastly_integration_api import FastlyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = FastlyIntegrationApi(api_client) api_instance.delete_fastly_service( account_id="account_id", service_id="service_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Fastly service ``` # Delete Fastly service returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::FastlyIntegrationAPI.new api_instance.delete_fastly_service("account_id", "service_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Fastly service ``` // Delete Fastly service returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFastlyIntegrationApi(apiClient) r, err := api.DeleteFastlyService(ctx, "account_id", "service_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FastlyIntegrationApi.DeleteFastlyService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Fastly service ``` // Delete Fastly service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FastlyIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); FastlyIntegrationApi apiInstance = new FastlyIntegrationApi(defaultClient); try { apiInstance.deleteFastlyService("account_id", "service_id"); } catch (ApiException e) { System.err.println("Exception when calling FastlyIntegrationApi#deleteFastlyService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Fastly service ``` // Delete Fastly service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fastly_integration::FastlyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = FastlyIntegrationAPI::with_config(configuration); let resp = api .delete_fastly_service("account_id".to_string(), "service_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Fastly service ``` /** * Delete Fastly service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.FastlyIntegrationApi(configuration); const params: v2.FastlyIntegrationApiDeleteFastlyServiceRequest = { accountId: "account_id", serviceId: "service_id", }; apiInstance .deleteFastlyService(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=33a58cbf-7740-4118-8bbf-5abfa26bba28&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=56d7ce7d-34ea-41f6-a644-5bfeaf37e94c&pt=Fastly%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffastly-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=33a58cbf-7740-4118-8bbf-5abfa26bba28&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=56d7ce7d-34ea-41f6-a644-5bfeaf37e94c&pt=Fastly%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffastly-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) Feedback ## Was this page helpful? Yes 🎉 No 👎 Next ![](https://survey-images.hotjar.com/surveys/logo/90f40352a7464c849f5ce82ccd0e758d) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=505e5784-5e3d-4029-9e0b-cc42c7cdd0f0&bo=2&sid=ae838290f0bf11f082a211cb2590164a&vid=ae83a1b0f0bf11f0ac21fb24fefcc951&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Fastly%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffastly-integration%2F&r=<=1516&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=281323) --- # Source: https://docs.datadoghq.com/api/latest/fleet-automation/ 
--- # Source: https://docs.datadoghq.com/api/latest/fleet-automation/ # Fleet Automation Manage automated deployments across your fleet of hosts.
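The Fleet Automation endpoints on this page are in Preview, so the official client libraries require you to opt in to the corresponding unstable operations before calling them. A minimal sketch in Python, mirroring the per-endpoint examples later on this page:

```
"""
Enable the Preview (unstable) Fleet Automation operations before use.
A minimal sketch mirroring the per-endpoint examples on this page.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi

configuration = Configuration()
# Fleet Automation operations are in Preview and must be enabled explicitly.
configuration.unstable_operations["list_fleet_agent_versions"] = True
configuration.unstable_operations["list_fleet_agents"] = True

with ApiClient(configuration) as api_client:
    api_instance = FleetAutomationApi(api_client)
    # List the Agent versions available for fleet deployments.
    print(api_instance.list_fleet_agent_versions())
    # List the Datadog Agents in the fleet.
    print(api_instance.list_fleet_agents())
```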
Fleet Automation provides two types of deployments: Configuration Deployments (`/configure`): * Apply configuration file changes to target hosts * Support merge-patch operations to update specific configuration fields * Support delete operations to remove configuration files * Useful for updating Datadog Agent settings, integration configs, and more Package Upgrade Deployments (`/upgrade`): * Upgrade the Datadog Agent to specific versions ## [List all available Agent versions](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-available-agent-versions) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-available-agent-versions-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/agent_versionshttps://api.ap2.datadoghq.com/api/unstable/fleet/agent_versionshttps://api.datadoghq.eu/api/unstable/fleet/agent_versionshttps://api.ddog-gov.com/api/unstable/fleet/agent_versionshttps://api.datadoghq.com/api/unstable/fleet/agent_versionshttps://api.us3.datadoghq.com/api/unstable/fleet/agent_versionshttps://api.us5.datadoghq.com/api/unstable/fleet/agent_versions ### Overview Retrieve a list of all available Datadog Agent versions. This endpoint returns the available Agent versions that can be deployed to your fleet. These versions are used when creating deployments or configuring schedules for automated Agent upgrades. This endpoint requires the `hosts_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgentVersions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a list of available Agent versions. Field Type Description data [_required_] [object] Array of available Agent versions. attributes object version string The Agent version string. id [_required_] string Unique identifier for the Agent version (same as version). type [_required_] enum The type of Agent version resource. Allowed enum values: `agent_version` default: `agent_version` ``` { "data": [ { "attributes": { "version": "7.50.0" }, "id": "7.50.0", "type": "agent_version" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### List all available Agent versions Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/agent_versions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all available Agent versions ``` """ List all available Agent versions returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["list_fleet_agent_versions"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.list_fleet_agent_versions() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all available Agent versions ``` # List all available Agent versions returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_fleet_agent_versions".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.list_fleet_agent_versions() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List 
all available Agent versions ``` // List all available Agent versions returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListFleetAgentVersions", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.ListFleetAgentVersions(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.ListFleetAgentVersions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.ListFleetAgentVersions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all available Agent versions ``` // List all available Agent versions returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetAgentVersionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listFleetAgentVersions", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetAgentVersionsResponse result = apiInstance.listFleetAgentVersions(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#listFleetAgentVersions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all available Agent versions ``` // List all available Agent versions returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListFleetAgentVersions", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.list_fleet_agent_versions().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all available Agent versions ``` /** * List all available Agent versions returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listFleetAgentVersions"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); apiInstance .listFleetAgentVersions() .then((data: v2.FleetAgentVersionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all Datadog Agents](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-datadog-agents) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-datadog-agents-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/agentshttps://api.ap2.datadoghq.com/api/unstable/fleet/agentshttps://api.datadoghq.eu/api/unstable/fleet/agentshttps://api.ddog-gov.com/api/unstable/fleet/agentshttps://api.datadoghq.com/api/unstable/fleet/agentshttps://api.us3.datadoghq.com/api/unstable/fleet/agentshttps://api.us5.datadoghq.com/api/unstable/fleet/agents ### Overview Retrieve a paginated list of all Datadog Agents. This endpoint returns a paginated list of all Datadog Agents with support for pagination, sorting, and filtering. Use the `page_number` and `page_size` query parameters to paginate through results. This endpoint requires the `hosts_read` permission. ### Arguments #### Query Strings Name Type Description page_number integer Page number for pagination (starts at 0). page_size integer Number of results per page (must be greater than 0 and less than or equal to 100). sort_attribute string Attribute to sort by. sort_descending boolean Sort order (true for descending, false for ascending). tags string Comma-separated list of tags to filter agents. filter string Filter string for narrowing down agent results. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetAgents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a paginated list of Datadog Agents. Field Type Description data [_required_] object The response data containing status and agents array. attributes [_required_] object agents [object] Array of agents matching the query criteria. 
agent_version string The Datadog Agent version. api_key_name string The API key name (if available and not redacted). api_key_uuid string The API key UUID. cloud_provider string The cloud provider where the agent is running. cluster_name string Kubernetes cluster name (if applicable). datadog_agent_key string The unique agent key identifier. enabled_products [string] Datadog products enabled on the agent. envs [string] Environments the agent is reporting from. first_seen_at int64 Timestamp when the agent was first seen. hostname string The hostname of the agent. ip_addresses [string] IP addresses of the agent. is_single_step_instrumentation_enabled boolean Whether single-step instrumentation is enabled. last_restart_at int64 Timestamp of the last agent restart. os string The operating system. otel_collector_version string OpenTelemetry collector version (if applicable). otel_collector_versions [string] List of OpenTelemetry collector versions (if applicable). pod_name string Kubernetes pod name (if applicable). remote_agent_management string Remote agent management status. remote_config_status string Remote configuration status. services [string] Services running on the agent. tags [object] Tags associated with the agent. key string value string team string Team associated with the agent. id [_required_] string Status identifier. type [_required_] string Resource type. meta object Metadata for the list of agents response. total_filtered_count int64 Total number of agents matching the filter criteria across all pages. ``` { "data": { "attributes": { "agents": [ { "agent_version": "7.50.0", "api_key_name": "Production API Key", "api_key_uuid": "a1b2c3d4-e5f6-4321-a123-123456789abc", "cloud_provider": "aws", "cluster_name": "string", "datadog_agent_key": "my-agent-hostname", "enabled_products": [], "envs": [], "first_seen_at": "integer", "hostname": "my-hostname", "ip_addresses": [], "is_single_step_instrumentation_enabled": false, "last_restart_at": "integer", "os": "linux", "otel_collector_version": "string", "otel_collector_versions": [], "pod_name": "string", "remote_agent_management": "enabled", "remote_config_status": "connected", "services": [], "tags": [ { "key": "string", "value": "string" } ], "team": "string" } ] }, "id": "done", "type": "status" }, "meta": { "total_filtered_count": 150 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### List all Datadog Agents Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/agents" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all Datadog Agents ``` """ List all Datadog Agents returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["list_fleet_agents"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.list_fleet_agents() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all Datadog Agents ``` # List all Datadog Agents returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_fleet_agents".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.list_fleet_agents() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all Datadog Agents ``` // List all Datadog Agents returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListFleetAgents", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.ListFleetAgents(ctx, *datadogV2.NewListFleetAgentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when 
calling `FleetAutomationApi.ListFleetAgents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.ListFleetAgents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all Datadog Agents ``` // List all Datadog Agents returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetAgentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listFleetAgents", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetAgentsResponse result = apiInstance.listFleetAgents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#listFleetAgents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all Datadog Agents ``` // List all Datadog Agents returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::api_fleet_automation::ListFleetAgentsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListFleetAgents", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api .list_fleet_agents(ListFleetAgentsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all Datadog Agents ``` /** * List all Datadog Agents returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listFleetAgents"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); apiInstance .listFleetAgents() .then((data: v2.FleetAgentsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get detailed information about an agent](https://docs.datadoghq.com/api/latest/fleet-automation/#get-detailed-information-about-an-agent) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#get-detailed-information-about-an-agent-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/agents/{agent_key}https://api.ap2.datadoghq.com/api/unstable/fleet/agents/{agent_key}https://api.datadoghq.eu/api/unstable/fleet/agents/{agent_key}https://api.ddog-gov.com/api/unstable/fleet/agents/{agent_key}https://api.datadoghq.com/api/unstable/fleet/agents/{agent_key}https://api.us3.datadoghq.com/api/unstable/fleet/agents/{agent_key}https://api.us5.datadoghq.com/api/unstable/fleet/agents/{agent_key} ### Overview Retrieve detailed information about a specific Datadog Agent. This endpoint returns comprehensive information about an agent including: * Agent details and metadata * Configured integrations organized by status (working, warning, error, missing) * Detected integrations * Configuration files and layers This endpoint requires the `hosts_read` permission. ### Arguments #### Path Parameters Name Type Description agent_key [_required_] string The unique identifier (agent key) for the Datadog Agent. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetAgentInfo-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing detailed information about a specific agent. Field Type Description data [_required_] object Represents detailed information about a specific Datadog Agent. attributes [_required_] object Attributes for agent information. agent_infos object Detailed information about a Datadog Agent. agent_version string The Datadog Agent version. api_key_name string The API key name (if available and not redacted). api_key_uuid string The API key UUID. cloud_provider string The cloud provider where the agent is running. cluster_name string Kubernetes cluster name (if applicable). datadog_agent_key string The unique agent key identifier. enabled_products [string] Datadog products enabled on the agent. env [string] Environments the agent is reporting from. first_seen_at int64 Timestamp when the agent was first seen. hostname string The hostname of the agent. hostname_aliases [string] Alternative hostname list for the agent. 
install_method_installer_version string The version of the installer used. install_method_tool string The tool used to install the agent. ip_addresses [string] IP addresses of the agent. is_single_step_instrumentation_enabled boolean Whether single-step instrumentation is enabled. last_restart_at int64 Timestamp of the last agent restart. os string The operating system. os_version string The operating system version. otel_collector_version string OpenTelemetry collector version (if applicable). otel_collector_versions [string] List of OpenTelemetry collector versions (if applicable). otel_collectors [object] OpenTelemetry collectors associated with the agent (if applicable). pod_name string Kubernetes pod name (if applicable). python_version string The Python version used by the agent. region [string] Regions where the agent is running. remote_agent_management string Remote agent management status. remote_config_status string Remote configuration status. services [string] Services running on the agent. tags [string] Tags associated with the agent. team string Team associated with the agent. configuration_files object Configuration information organized by layers. compiled_configuration string The final compiled configuration. env_configuration string Configuration from environment variables. file_configuration string Configuration from files. parsed_configuration string Parsed configuration output. remote_configuration string Remote configuration settings. runtime_configuration string Runtime configuration. detected_integrations [object] List of detected integrations. escaped_name string Escaped integration name. prefix string Integration prefix identifier. integrations object Integrations organized by their status. configuration_files [object] Configuration files for integrations. file_content string The raw content of the configuration file. file_path string Path to the configuration file. filename string Name of the configuration file. datadog_agent_key string The unique agent key identifier. error_integrations [object] Integrations with errors. data_type string Type of data collected (metrics, logs). error_messages [string] Error messages if the integration has issues. init_config string Initialization configuration (YAML format). instance_config string Instance-specific configuration (YAML format). is_custom_check boolean Whether this is a custom integration. log_config string Log collection configuration (YAML format). name string Name of the integration instance. source_index int64 Index in the configuration file. source_path string Path to the configuration file. type string Integration type. missing_integrations [object] Detected but not configured integrations. escaped_name string Escaped integration name. prefix string Integration prefix identifier. warning_integrations [object] Integrations with warnings. data_type string Type of data collected (metrics, logs). error_messages [string] Error messages if the integration has issues. init_config string Initialization configuration (YAML format). instance_config string Instance-specific configuration (YAML format). is_custom_check boolean Whether this is a custom integration. log_config string Log collection configuration (YAML format). name string Name of the integration instance. source_index int64 Index in the configuration file. source_path string Path to the configuration file. type string Integration type. working_integrations [object] Integrations that are working correctly. 
data_type string Type of data collected (metrics, logs). error_messages [string] Error messages if the integration has issues. init_config string Initialization configuration (YAML format). instance_config string Instance-specific configuration (YAML format). is_custom_check boolean Whether this is a custom integration. log_config string Log collection configuration (YAML format). name string Name of the integration instance. source_index int64 Index in the configuration file. source_path string Path to the configuration file. type string Integration type. id [_required_] string The unique agent key identifier. type [_required_] enum The type of Agent info resource. Allowed enum values: `datadog_agent_key` default: `datadog_agent_key` ``` { "data": { "attributes": { "agent_infos": { "agent_version": "7.50.0", "api_key_name": "Production API Key", "api_key_uuid": "a1b2c3d4-e5f6-4321-a123-123456789abc", "cloud_provider": "aws", "cluster_name": "string", "datadog_agent_key": "my-agent-hostname", "enabled_products": [], "env": [], "first_seen_at": "integer", "hostname": "my-hostname", "hostname_aliases": [], "install_method_installer_version": "1.2.3", "install_method_tool": "chef", "ip_addresses": [], "is_single_step_instrumentation_enabled": false, "last_restart_at": "integer", "os": "linux", "os_version": "Ubuntu 20.04", "otel_collector_version": "string", "otel_collector_versions": [], "otel_collectors": [], "pod_name": "string", "python_version": "3.9.5", "region": [], "remote_agent_management": "enabled", "remote_config_status": "connected", "services": [], "tags": [], "team": "string" }, "configuration_files": { "compiled_configuration": "string", "env_configuration": "string", "file_configuration": "string", "parsed_configuration": "string", "remote_configuration": "string", "runtime_configuration": "string" }, "detected_integrations": [ { "escaped_name": "postgresql", "prefix": "postgres" } ], "integrations": { "configuration_files": [ { "file_content": "string", "file_path": "/conf.d/postgres.d/postgres.yaml", "filename": "postgres.yaml" } ], "datadog_agent_key": "string", "error_integrations": [ { "data_type": "metrics", "error_messages": [], "init_config": "string", "instance_config": "string", "is_custom_check": false, "log_config": "string", "name": "string", "source_index": "integer", "source_path": "string", "type": "postgres" } ], "missing_integrations": [ { "escaped_name": "postgresql", "prefix": "postgres" } ], "warning_integrations": [ { "data_type": "metrics", "error_messages": [], "init_config": "string", "instance_config": "string", "is_custom_check": false, "log_config": "string", "name": "string", "source_index": "integer", "source_path": "string", "type": "postgres" } ], "working_integrations": [ { "data_type": "metrics", "error_messages": [], "init_config": "string", "instance_config": "string", "is_custom_check": false, "log_config": "string", "name": "string", "source_index": "integer", "source_path": "string", "type": "postgres" } ] } }, "id": "my-agent-hostname", "type": "datadog_agent_key" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Get detailed information about an agent ``` # Path parameters export agent_key="CHANGE_ME" # Curl command (use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/unstable/fleet/agents/${agent_key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get detailed information about an agent ``` """ Get detailed information about an agent returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["get_fleet_agent_info"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.get_fleet_agent_info( agent_key="agent_key", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command, setting `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get detailed information about an agent ``` # Get detailed information about an agent returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_fleet_agent_info".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.get_fleet_agent_info("agent_key") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command, setting `DD_SITE` to your Datadog site: ```
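# Use a single DD_SITE value and run the script with the ruby interpreter, for example: DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"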
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get detailed information about an agent ``` // Get detailed information about an agent returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetFleetAgentInfo", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.GetFleetAgentInfo(ctx, "agent_key") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.GetFleetAgentInfo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.GetFleetAgentInfo`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get detailed information about an agent ``` // Get detailed information about an agent returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetAgentInfoResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getFleetAgentInfo", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetAgentInfoResponse result = apiInstance.getFleetAgentInfo("agent_key"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#getFleetAgentInfo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get detailed information about an agent ``` // Get detailed information about an agent returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetFleetAgentInfo", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.get_fleet_agent_info("agent_key".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get detailed information about an agent ``` /** * Get detailed information about an agent returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getFleetAgentInfo"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiGetFleetAgentInfoRequest = { agentKey: "agent_key", }; apiInstance .getFleetAgentInfo(params) .then((data: v2.FleetAgentInfoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all deployments](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-deployments) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-deployments-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/deploymentshttps://api.ap2.datadoghq.com/api/unstable/fleet/deploymentshttps://api.datadoghq.eu/api/unstable/fleet/deploymentshttps://api.ddog-gov.com/api/unstable/fleet/deploymentshttps://api.datadoghq.com/api/unstable/fleet/deploymentshttps://api.us3.datadoghq.com/api/unstable/fleet/deploymentshttps://api.us5.datadoghq.com/api/unstable/fleet/deployments ### Overview Retrieve a list of all deployments for fleet automation. Use the `page_size` and `page_offset` parameters to paginate results. This endpoint requires the `hosts_read` permission. ### Arguments #### Query Strings Name Type Description page_size integer Number of deployments to return per page. Maximum value is 100. page_offset integer Index of the first deployment to return. Use this with `page_size` to paginate through results. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetDeployments-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetDeployments-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetDeployments-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetDeployments-403-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetDeployments-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a paginated list of deployments. Field Type Description data [_required_] [object] Array of deployments matching the query criteria. attributes [_required_] object Attributes of a deployment in the response. config_operations [object] Ordered list of configuration file operations to perform on the target hosts. 
file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. estimated_end_time_unix int64 Estimated completion time of the deployment as a Unix timestamp (seconds since epoch). filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. high_level_status string Current high-level status of the deployment (for example, "pending", "running", "completed", "failed"). hosts [object] Paginated list of hosts in this deployment with their individual statuses. Only included when fetching a single deployment by ID. Use the `limit` and `page` query parameters to navigate through pages. Pagination metadata is included in the response `meta.hosts` field. error string Error message if the deployment failed on this host. hostname string The hostname of the agent. status string Current deployment status for this specific host. versions [object] List of packages and their versions currently installed on this host. current_version string The current version of the package on the host. initial_version string The initial version of the package on the host before the deployment started. package_name string The name of the package. target_version string The target version that the deployment is attempting to install. packages [object] List of packages to deploy to target hosts. Present only for package upgrade deployments. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. total_hosts int64 Total number of hosts targeted by this deployment. id [_required_] string Unique identifier for the deployment. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` meta object Metadata for the list of deployments, including pagination information. page object Pagination details for the list of deployments. total_count int64 Total number of deployments available across all pages. 
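The `config_operations` described in this model use merge-patch semantics: the `patch` object is combined with the configuration file already on the host, so keys the patch does not mention are left in place. The sketch below is an illustration only, not part of the Datadog client: the `existing` contents and the `merge_patch` helper are hypothetical, key-by-key merging of nested objects is assumed from the field descriptions above, and the patch values are taken from the deployment example that follows.

```
# Illustration of the documented merge-patch behavior; not part of the Datadog API client.
# Hypothetical existing /datadog.yaml contents, expressed as a dict:
existing = {
    "api_key": "***",
    "log_level": "warn",
    "apm_config": {"enabled": False, "apm_dd_url": "https://trace.agent.datadoghq.com"},
}

# Patch taken from the deployment example below:
patch = {
    "apm_config": {"enabled": True},
    "log_level": "debug",
    "logs_enabled": True,
}

def merge_patch(base, patch):
    """Overlay patch onto base, keeping any keys the patch does not mention."""
    merged = dict(base)
    for key, value in patch.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_patch(merged[key], value)
        else:
            merged[key] = value
    return merged

print(merge_patch(existing, patch))
# {'api_key': '***', 'log_level': 'debug',
#  'apm_config': {'enabled': True, 'apm_dd_url': 'https://trace.agent.datadoghq.com'},
#  'logs_enabled': True}
```

A `delete` operation, by contrast, removes the entire target file rather than individual keys.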
``` { "data": [ { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "estimated_end_time_unix": 1699999999, "filter_query": "env:prod AND service:web", "high_level_status": "pending", "hosts": [ { "error": "", "hostname": "web-server-01.example.com", "status": "succeeded", "versions": [ { "current_version": "7.51.0", "initial_version": "7.51.0", "package_name": "datadog-agent", "target_version": "7.52.0" } ] } ], "packages": [ { "name": "datadog-agent", "version": "7.52.0" } ], "total_hosts": 42 }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "deployment" } ], "meta": { "page": { "total_count": 25 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### List all deployments Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all deployments ``` """ List all deployments returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["list_fleet_deployments"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.list_fleet_deployments() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all deployments ``` # List all deployments returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_fleet_deployments".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.list_fleet_deployments() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all deployments ``` // List all deployments returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListFleetDeployments", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.ListFleetDeployments(ctx, *datadogV2.NewListFleetDeploymentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.ListFleetDeployments`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.ListFleetDeployments`:\n%s\n", responseContent) 
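// The page_size and page_offset query parameters described above are supplied through the ListFleetDeploymentsOptionalParameters value built here; the exact setter names are generated by the client (likely WithPageSize and WithPageOffset), so confirm them in the datadogV2 package before relying on them.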
} ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all deployments ``` // List all deployments returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetDeploymentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listFleetDeployments", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetDeploymentsResponse result = apiInstance.listFleetDeployments(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#listFleetDeployments"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all deployments ``` // List all deployments returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::api_fleet_automation::ListFleetDeploymentsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListFleetDeployments", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api .list_fleet_deployments(ListFleetDeploymentsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all deployments ``` /** * List all deployments returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listFleetDeployments"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); apiInstance .listFleetDeployments() .then((data: v2.FleetDeploymentsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a configuration deployment](https://docs.datadoghq.com/api/latest/fleet-automation/#create-a-configuration-deployment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#create-a-configuration-deployment-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/unstable/fleet/deployments/configurehttps://api.ap2.datadoghq.com/api/unstable/fleet/deployments/configurehttps://api.datadoghq.eu/api/unstable/fleet/deployments/configurehttps://api.ddog-gov.com/api/unstable/fleet/deployments/configurehttps://api.datadoghq.com/api/unstable/fleet/deployments/configurehttps://api.us3.datadoghq.com/api/unstable/fleet/deployments/configurehttps://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure ### Overview Create a new deployment to apply configuration changes to a fleet of hosts matching the specified filter query. This endpoint supports two types of configuration operations: * `merge-patch`: Merges the provided patch data with the existing configuration file, creating the file if it doesn’t exist * `delete`: Removes the specified configuration file from the target hosts The deployment is created and started automatically. You can specify multiple configuration operations that will be executed in order on each target host. Use the filter query to target specific hosts using the Datadog query syntax. This endpoint requires all of the following permissions: * `agent_upgrade_write` * `fleet_policies_write` ### Request #### Body Data (required) Request payload containing the deployment details. * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Field Type Description data [_required_] object Data for creating a new configuration deployment. attributes [_required_] object Attributes for creating a new configuration deployment. config_operations [_required_] [object] Ordered list of configuration file operations to perform on the target hosts. file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. filter_query string Query used to filter and select target hosts for the deployment. 
Uses the Datadog query syntax. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` ``` { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "filter_query": "env:prod AND service:web" }, "type": "deployment" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentConfigure-201-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentConfigure-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentConfigure-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentConfigure-403-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentConfigure-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single deployment. Field Type Description data object A deployment that defines automated configuration changes for a fleet of hosts. attributes [_required_] object Attributes of a deployment in the response. config_operations [object] Ordered list of configuration file operations to perform on the target hosts. file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. estimated_end_time_unix int64 Estimated completion time of the deployment as a Unix timestamp (seconds since epoch). filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. high_level_status string Current high-level status of the deployment (for example, "pending", "running", "completed", "failed"). hosts [object] Paginated list of hosts in this deployment with their individual statuses. Only included when fetching a single deployment by ID. Use the `limit` and `page` query parameters to navigate through pages. Pagination metadata is included in the response `meta.hosts` field. error string Error message if the deployment failed on this host. hostname string The hostname of the agent. status string Current deployment status for this specific host. versions [object] List of packages and their versions currently installed on this host. current_version string The current version of the package on the host. initial_version string The initial version of the package on the host before the deployment started. package_name string The name of the package. target_version string The target version that the deployment is attempting to install. packages [object] List of packages to deploy to target hosts. 
Present only for package upgrade deployments. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. total_hosts int64 Total number of hosts targeted by this deployment. id [_required_] string Unique identifier for the deployment. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` meta object Metadata for a single deployment response, including pagination information for hosts. hosts object Pagination details for the list of hosts in a deployment. current_page int64 Current page index (zero-based). page_size int64 Number of hosts returned per page. total_hosts int64 Total number of hosts in this deployment. total_pages int64 Total number of pages available. ``` { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "estimated_end_time_unix": 1699999999, "filter_query": "env:prod AND service:web", "high_level_status": "pending", "hosts": [ { "error": "", "hostname": "web-server-01.example.com", "status": "succeeded", "versions": [ { "current_version": "7.51.0", "initial_version": "7.51.0", "package_name": "datadog-agent", "target_version": "7.52.0" } ] } ], "packages": [ { "name": "datadog-agent", "version": "7.52.0" } ], "total_hosts": 42 }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "deployment" }, "meta": { "hosts": { "current_page": 0, "page_size": 50, "total_hosts": 150, "total_pages": 3 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Create a configuration deployment Copy ``` ## Add log integrations for multiple services # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/conf.d/postgres.d/logs.yaml", "patch": { "logs": [ { "path": "/var/log/postgres.log", "service": "postgres1", "source": "postgres", "type": "file" } ] } }, { "file_op": "merge-patch", "file_path": "/conf.d/kafka.d/logs.yaml", "patch": { "logs": [ { "path": "/var/log/kafka.log", "service": "kafka1", "source": "kafka", "type": "file" } ] } } ], "filter_query": "env:prod" }, "type": "deployment" } } EOF ## Comprehensive example with multiple configuration file types # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "apm_dd_url": "https://trace.agent.datadoghq.com", "enabled": true }, "log_level": "info", "logs_enabled": true, "process_config": { "enabled": true } } }, { "file_op": "merge-patch", "file_path": "/security-agent.yaml", "patch": { "runtime_security_config": { "enabled": true, "fim_enabled": true } } }, { "file_op": "merge-patch", "file_path": "/system-probe.yaml", "patch": { "network_config": { "enabled": true }, "service_monitoring_config": { "enabled": true } } }, { "file_op": "merge-patch", "file_path": "/application_monitoring.yaml", "patch": { "enabled": true, "server": { "host": "0.0.0.0", "port": 8126 } } }, { "file_op": "merge-patch", "file_path": "/conf.d/logs.d/custom-app.yaml", "patch": { "logs": [ { "path": "/var/log/custom-app/*.log", "service": "custom-app", "source": "custom", "type": "file" } ] } }, { "file_op": "merge-patch", "file_path": "/conf.d/docker.d/docker-logs.yaml", "patch": { "logs": [ { "service": "docker", "source": "docker", "type": "docker" } ] } }, { "file_op": "merge-patch", "file_path": "/conf.d/snmp.d/network-devices.yaml", "patch": { "init_config": { "loader": "core" }, "instances": [ { "community_string": "public", "ip_address": "192.168.1.1" } ] } }, { "file_op": "merge-patch", "file_path": 
"/conf.d/postgres.d/conf.yaml", "patch": { "instances": [ { "dbname": "postgres", "host": "localhost", "port": 5432, "username": "datadog" } ] } }, { "file_op": "delete", "file_path": "/conf.d/deprecated-integration.yaml" } ], "filter_query": "env:prod AND datacenter:us-east-1" }, "type": "deployment" } } EOF ## Delete a configuration file # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config_operations": [ { "file_op": "delete", "file_path": "/conf.d/old-integration.yaml" } ], "filter_query": "env:dev" }, "type": "deployment" } } EOF ## Enable APM and Logs products # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "filter_query": "env:prod AND service:web" }, "type": "deployment" } } EOF ## Set log level to info # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/configure" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "log_level": "info" } } ], "filter_query": "env:staging" }, "type": "deployment" } } EOF ``` ##### Create a configuration deployment ``` """ Create a configuration deployment returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi from datadog_api_client.v2.model.fleet_deployment_configure_attributes import FleetDeploymentConfigureAttributes from datadog_api_client.v2.model.fleet_deployment_configure_create import FleetDeploymentConfigureCreate from datadog_api_client.v2.model.fleet_deployment_configure_create_request import FleetDeploymentConfigureCreateRequest from datadog_api_client.v2.model.fleet_deployment_file_op import FleetDeploymentFileOp from datadog_api_client.v2.model.fleet_deployment_operation import FleetDeploymentOperation from datadog_api_client.v2.model.fleet_deployment_resource_type import FleetDeploymentResourceType body = FleetDeploymentConfigureCreateRequest( data=FleetDeploymentConfigureCreate( attributes=FleetDeploymentConfigureAttributes( config_operations=[ FleetDeploymentOperation( file_op=FleetDeploymentFileOp.MERGE_PATCH, file_path="/datadog.yaml", patch=dict([("apm_config", "{'enabled': True}"), ("log_level", "debug"), 
("logs_enabled", "True")]), ), ], filter_query="env:prod AND service:web", ), type=FleetDeploymentResourceType.DEPLOYMENT, ), ) configuration = Configuration() configuration.unstable_operations["create_fleet_deployment_configure"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.create_fleet_deployment_configure(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a configuration deployment ``` # Create a configuration deployment returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_fleet_deployment_configure".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new body = DatadogAPIClient::V2::FleetDeploymentConfigureCreateRequest.new({ data: DatadogAPIClient::V2::FleetDeploymentConfigureCreate.new({ attributes: DatadogAPIClient::V2::FleetDeploymentConfigureAttributes.new({ config_operations: [ DatadogAPIClient::V2::FleetDeploymentOperation.new({ file_op: DatadogAPIClient::V2::FleetDeploymentFileOp::MERGE_PATCH, file_path: "/datadog.yaml", patch: { "apm_config": "{'enabled': True}", "log_level": "debug", "logs_enabled": "True", }, }), ], filter_query: "env:prod AND service:web", }), type: DatadogAPIClient::V2::FleetDeploymentResourceType::DEPLOYMENT, }), }) p api_instance.create_fleet_deployment_configure(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a configuration deployment ``` // Create a configuration deployment returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FleetDeploymentConfigureCreateRequest{ Data: datadogV2.FleetDeploymentConfigureCreate{ Attributes: datadogV2.FleetDeploymentConfigureAttributes{ ConfigOperations: []datadogV2.FleetDeploymentOperation{ { FileOp: datadogV2.FLEETDEPLOYMENTFILEOP_MERGE_PATCH, FilePath: "/datadog.yaml", Patch: map[string]interface{}{ "apm_config": "{'enabled': True}", "log_level": "debug", "logs_enabled": "True", }, }, }, FilterQuery: datadog.PtrString("env:prod AND service:web"), }, Type: datadogV2.FLEETDEPLOYMENTRESOURCETYPE_DEPLOYMENT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateFleetDeploymentConfigure", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.CreateFleetDeploymentConfigure(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.CreateFleetDeploymentConfigure`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.CreateFleetDeploymentConfigure`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a configuration deployment ``` // Create a configuration deployment returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetDeploymentConfigureAttributes; import com.datadog.api.client.v2.model.FleetDeploymentConfigureCreate; import com.datadog.api.client.v2.model.FleetDeploymentConfigureCreateRequest; import com.datadog.api.client.v2.model.FleetDeploymentFileOp; import com.datadog.api.client.v2.model.FleetDeploymentOperation; import com.datadog.api.client.v2.model.FleetDeploymentResourceType; import com.datadog.api.client.v2.model.FleetDeploymentResponse; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createFleetDeploymentConfigure", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); FleetDeploymentConfigureCreateRequest body = new FleetDeploymentConfigureCreateRequest() .data( new FleetDeploymentConfigureCreate() .attributes( new FleetDeploymentConfigureAttributes() .configOperations( Collections.singletonList( new FleetDeploymentOperation() .fileOp(FleetDeploymentFileOp.MERGE_PATCH) .filePath("/datadog.yaml") .patch( Map.ofEntries( Map.entry("apm_config", "{'enabled': True}"), Map.entry("log_level", "debug"), Map.entry("logs_enabled", "True"))))) .filterQuery("env:prod AND service:web")) .type(FleetDeploymentResourceType.DEPLOYMENT)); try { FleetDeploymentResponse result = apiInstance.createFleetDeploymentConfigure(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling FleetAutomationApi#createFleetDeploymentConfigure"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a configuration deployment ``` // Create a configuration deployment returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::model::FleetDeploymentConfigureAttributes; use datadog_api_client::datadogV2::model::FleetDeploymentConfigureCreate; use datadog_api_client::datadogV2::model::FleetDeploymentConfigureCreateRequest; use datadog_api_client::datadogV2::model::FleetDeploymentFileOp; use datadog_api_client::datadogV2::model::FleetDeploymentOperation; use 
datadog_api_client::datadogV2::model::FleetDeploymentResourceType; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = FleetDeploymentConfigureCreateRequest::new(FleetDeploymentConfigureCreate::new( FleetDeploymentConfigureAttributes::new(vec![FleetDeploymentOperation::new( FleetDeploymentFileOp::MERGE_PATCH, "/datadog.yaml".to_string(), ) .patch(BTreeMap::from([ ("log_level".to_string(), Value::from("debug")), ("logs_enabled".to_string(), Value::from("True")), ]))]) .filter_query("env:prod AND service:web".to_string()), FleetDeploymentResourceType::DEPLOYMENT, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateFleetDeploymentConfigure", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.create_fleet_deployment_configure(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a configuration deployment ``` /** * Create a configuration deployment returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createFleetDeploymentConfigure"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiCreateFleetDeploymentConfigureRequest = { body: { data: { attributes: { configOperations: [ { fileOp: "merge-patch", filePath: "/datadog.yaml", patch: { apm_config: "{'enabled': True}", log_level: "debug", logs_enabled: "True", }, }, ], filterQuery: "env:prod AND service:web", }, type: "deployment", }, }, }; apiInstance .createFleetDeploymentConfigure(params) .then((data: v2.FleetDeploymentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Upgrade hosts](https://docs.datadoghq.com/api/latest/fleet-automation/#upgrade-hosts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#upgrade-hosts-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
POST https://api.ap1.datadoghq.com/api/unstable/fleet/deployments/upgradehttps://api.ap2.datadoghq.com/api/unstable/fleet/deployments/upgradehttps://api.datadoghq.eu/api/unstable/fleet/deployments/upgradehttps://api.ddog-gov.com/api/unstable/fleet/deployments/upgradehttps://api.datadoghq.com/api/unstable/fleet/deployments/upgradehttps://api.us3.datadoghq.com/api/unstable/fleet/deployments/upgradehttps://api.us5.datadoghq.com/api/unstable/fleet/deployments/upgrade ### Overview Create and immediately start a new package upgrade on hosts matching the specified filter query. This endpoint allows you to upgrade the Datadog Agent to a specific version on hosts matching the specified filter query. The deployment is created and started automatically. The system will: 1. Identify all hosts matching the filter query 2. Validate that the specified version is available 3. Begin rolling out the package upgrade to the target hosts This endpoint requires all of the following permissions: * `agent_upgrade_write` * `fleet_policies_write` ### Request #### Body Data (required) Request payload containing the package upgrade details. * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Field Type Description data [_required_] object Data for creating a new package upgrade deployment. attributes [_required_] object Attributes for creating a new package upgrade deployment. filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. target_packages [_required_] [object] List of packages and their target versions to deploy to the selected hosts. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` ``` { "data": { "attributes": { "filter_query": "env:prod AND service:web", "target_packages": [ { "name": "datadog-agent", "version": "7.52.0" } ] }, "type": "deployment" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-201-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetDeploymentUpgrade-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single deployment. Field Type Description data object A deployment that defines automated configuration changes for a fleet of hosts. attributes [_required_] object Attributes of a deployment in the response. config_operations [object] Ordered list of configuration file operations to perform on the target hosts. file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. 
Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. estimated_end_time_unix int64 Estimated completion time of the deployment as a Unix timestamp (seconds since epoch). filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. high_level_status string Current high-level status of the deployment (for example, "pending", "running", "completed", "failed"). hosts [object] Paginated list of hosts in this deployment with their individual statuses. Only included when fetching a single deployment by ID. Use the `limit` and `page` query parameters to navigate through pages. Pagination metadata is included in the response `meta.hosts` field. error string Error message if the deployment failed on this host. hostname string The hostname of the agent. status string Current deployment status for this specific host. versions [object] List of packages and their versions currently installed on this host. current_version string The current version of the package on the host. initial_version string The initial version of the package on the host before the deployment started. package_name string The name of the package. target_version string The target version that the deployment is attempting to install. packages [object] List of packages to deploy to target hosts. Present only for package upgrade deployments. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. total_hosts int64 Total number of hosts targeted by this deployment. id [_required_] string Unique identifier for the deployment. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` meta object Metadata for a single deployment response, including pagination information for hosts. hosts object Pagination details for the list of hosts in a deployment. current_page int64 Current page index (zero-based). page_size int64 Number of hosts returned per page. total_hosts int64 Total number of hosts in this deployment. total_pages int64 Total number of pages available. 
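The `id` returned in this response can be passed to the Get a configuration deployment by ID endpoint (documented further down this page) to follow the rollout. The sketch below is a minimal, unofficial illustration of that pattern using the Python client shown in the code examples on this page; the 30-second interval and the assumption that any status other than `pending` or `running` is terminal are illustrative choices, not documented behavior.

```
# Polling sketch (assumption, not an official example): given the ID returned
# by a create call, poll the deployment until it leaves the pending/running
# states. Operation name and response fields follow this page; the interval
# and the non-terminal status set are assumptions.
import time

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi

configuration = Configuration()
configuration.unstable_operations["get_fleet_deployment"] = True

# ID taken from the example response on this page; use the ID from your own
# create response in practice.
deployment_id = "aeadc05e-98a8-11ec-ac2c-da7ad0900001"

with ApiClient(configuration) as api_client:
    api = FleetAutomationApi(api_client)
    while True:
        deployment = api.get_fleet_deployment(deployment_id=deployment_id)
        status = deployment.data.attributes.high_level_status
        print(f"deployment {deployment_id}: {status}")
        if status not in ("pending", "running"):
            break
        time.sleep(30)  # polling interval is an arbitrary choice
```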
``` { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "estimated_end_time_unix": 1699999999, "filter_query": "env:prod AND service:web", "high_level_status": "pending", "hosts": [ { "error": "", "hostname": "web-server-01.example.com", "status": "succeeded", "versions": [ { "current_version": "7.51.0", "initial_version": "7.51.0", "package_name": "datadog-agent", "target_version": "7.52.0" } ] } ], "packages": [ { "name": "datadog-agent", "version": "7.52.0" } ], "total_hosts": 42 }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "deployment" }, "meta": { "hosts": { "current_page": 0, "page_size": 50, "total_hosts": 150, "total_pages": 3 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Upgrade hosts Copy ``` ## Upgrade Datadog Agent to version 7.52.0 # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/upgrade" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter_query": "env:prod AND service:web", "target_packages": [ { "name": "datadog-agent", "version": "7.52.0" } ] }, "type": "deployment" } } EOF ## Upgrade multiple packages # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/upgrade" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter_query": "env:staging", "target_packages": [ { "name": "datadog-agent", "version": "7.52.0-1" }, { "name": "datadog-apm-inject", "version": "0.10.0" } ] }, "type": "deployment" } } EOF ``` ##### Upgrade hosts ``` """ Upgrade hosts returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi from datadog_api_client.v2.model.fleet_deployment_package import FleetDeploymentPackage from datadog_api_client.v2.model.fleet_deployment_package_upgrade_attributes import ( FleetDeploymentPackageUpgradeAttributes, ) from datadog_api_client.v2.model.fleet_deployment_package_upgrade_create import FleetDeploymentPackageUpgradeCreate from datadog_api_client.v2.model.fleet_deployment_package_upgrade_create_request import ( FleetDeploymentPackageUpgradeCreateRequest, ) from datadog_api_client.v2.model.fleet_deployment_resource_type import FleetDeploymentResourceType body = FleetDeploymentPackageUpgradeCreateRequest( data=FleetDeploymentPackageUpgradeCreate( attributes=FleetDeploymentPackageUpgradeAttributes( filter_query="env:prod AND service:web", target_packages=[ FleetDeploymentPackage( name="datadog-agent", version="7.52.0", ), ], ), type=FleetDeploymentResourceType.DEPLOYMENT, ), ) configuration = Configuration() configuration.unstable_operations["create_fleet_deployment_upgrade"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.create_fleet_deployment_upgrade(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` 
and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Upgrade hosts ``` # Upgrade hosts returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_fleet_deployment_upgrade".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new body = DatadogAPIClient::V2::FleetDeploymentPackageUpgradeCreateRequest.new({ data: DatadogAPIClient::V2::FleetDeploymentPackageUpgradeCreate.new({ attributes: DatadogAPIClient::V2::FleetDeploymentPackageUpgradeAttributes.new({ filter_query: "env:prod AND service:web", target_packages: [ DatadogAPIClient::V2::FleetDeploymentPackage.new({ name: "datadog-agent", version: "7.52.0", }), ], }), type: DatadogAPIClient::V2::FleetDeploymentResourceType::DEPLOYMENT, }), }) p api_instance.create_fleet_deployment_upgrade(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Upgrade hosts ``` // Upgrade hosts returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FleetDeploymentPackageUpgradeCreateRequest{ Data: datadogV2.FleetDeploymentPackageUpgradeCreate{ Attributes: datadogV2.FleetDeploymentPackageUpgradeAttributes{ FilterQuery: datadog.PtrString("env:prod AND service:web"), TargetPackages: []datadogV2.FleetDeploymentPackage{ { Name: "datadog-agent", Version: "7.52.0", }, }, }, Type: datadogV2.FLEETDEPLOYMENTRESOURCETYPE_DEPLOYMENT, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateFleetDeploymentUpgrade", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.CreateFleetDeploymentUpgrade(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.CreateFleetDeploymentUpgrade`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.CreateFleetDeploymentUpgrade`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Upgrade hosts ``` // Upgrade hosts returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetDeploymentPackage; import com.datadog.api.client.v2.model.FleetDeploymentPackageUpgradeAttributes; import com.datadog.api.client.v2.model.FleetDeploymentPackageUpgradeCreate; import 
com.datadog.api.client.v2.model.FleetDeploymentPackageUpgradeCreateRequest; import com.datadog.api.client.v2.model.FleetDeploymentResourceType; import com.datadog.api.client.v2.model.FleetDeploymentResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createFleetDeploymentUpgrade", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); FleetDeploymentPackageUpgradeCreateRequest body = new FleetDeploymentPackageUpgradeCreateRequest() .data( new FleetDeploymentPackageUpgradeCreate() .attributes( new FleetDeploymentPackageUpgradeAttributes() .filterQuery("env:prod AND service:web") .targetPackages( Collections.singletonList( new FleetDeploymentPackage() .name("datadog-agent") .version("7.52.0")))) .type(FleetDeploymentResourceType.DEPLOYMENT)); try { FleetDeploymentResponse result = apiInstance.createFleetDeploymentUpgrade(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#createFleetDeploymentUpgrade"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Upgrade hosts ``` // Upgrade hosts returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::model::FleetDeploymentPackage; use datadog_api_client::datadogV2::model::FleetDeploymentPackageUpgradeAttributes; use datadog_api_client::datadogV2::model::FleetDeploymentPackageUpgradeCreate; use datadog_api_client::datadogV2::model::FleetDeploymentPackageUpgradeCreateRequest; use datadog_api_client::datadogV2::model::FleetDeploymentResourceType; #[tokio::main] async fn main() { let body = FleetDeploymentPackageUpgradeCreateRequest::new(FleetDeploymentPackageUpgradeCreate::new( FleetDeploymentPackageUpgradeAttributes::new(vec![FleetDeploymentPackage::new( "datadog-agent".to_string(), "7.52.0".to_string(), )]) .filter_query("env:prod AND service:web".to_string()), FleetDeploymentResourceType::DEPLOYMENT, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateFleetDeploymentUpgrade", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.create_fleet_deployment_upgrade(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Upgrade hosts ``` /** * Upgrade hosts returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.createFleetDeploymentUpgrade"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiCreateFleetDeploymentUpgradeRequest = { body: { data: { attributes: { filterQuery: "env:prod AND service:web", targetPackages: [ { name: "datadog-agent", version: "7.52.0", }, ], }, type: "deployment", }, }, }; apiInstance .createFleetDeploymentUpgrade(params) .then((data: v2.FleetDeploymentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a configuration deployment by ID](https://docs.datadoghq.com/api/latest/fleet-automation/#get-a-configuration-deployment-by-id) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#get-a-configuration-deployment-by-id-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}https://api.ap2.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}https://api.datadoghq.eu/api/unstable/fleet/deployments/{deployment_id}https://api.ddog-gov.com/api/unstable/fleet/deployments/{deployment_id}https://api.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}https://api.us3.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}https://api.us5.datadoghq.com/api/unstable/fleet/deployments/{deployment_id} ### Overview Retrieve detailed information about a specific deployment using its unique identifier. This endpoint returns comprehensive information about a deployment, including: * Deployment metadata (ID, type, filter query) * Total number of target hosts * Current high-level status (pending, running, succeeded, failed) * Estimated completion time * Configuration operations that were or are being applied * Detailed host list: A paginated array of hosts included in this deployment with individual host status, current package versions, and any errors The host list provides visibility into the per-host execution status, allowing you to: * Monitor which hosts have completed successfully * Identify hosts that are still in progress * Investigate failures on specific hosts * View current package versions installed on each host (including initial, target, and current versions for each package) Pagination: Use the `limit` and `page` query parameters to paginate through hosts. The response includes pagination metadata in the `meta.hosts` field with information about the current page, total pages, and total host count. The default page size is 50 hosts, with a maximum of 100. This endpoint requires the `hosts_read` permission. ### Arguments #### Path Parameters Name Type Description deployment_id [_required_] string The unique identifier of the deployment to retrieve. #### Query Strings Name Type Description limit integer Maximum number of hosts to return per page. Default is 50, maximum is 100. page integer Page index for pagination (zero-based). 
Use this to retrieve subsequent pages of hosts. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetDeployment-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single deployment. Field Type Description data object A deployment that defines automated configuration changes for a fleet of hosts. attributes [_required_] object Attributes of a deployment in the response. config_operations [object] Ordered list of configuration file operations to perform on the target hosts. file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. estimated_end_time_unix int64 Estimated completion time of the deployment as a Unix timestamp (seconds since epoch). filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. high_level_status string Current high-level status of the deployment (for example, "pending", "running", "completed", "failed"). hosts [object] Paginated list of hosts in this deployment with their individual statuses. Only included when fetching a single deployment by ID. Use the `limit` and `page` query parameters to navigate through pages. Pagination metadata is included in the response `meta.hosts` field. error string Error message if the deployment failed on this host. hostname string The hostname of the agent. status string Current deployment status for this specific host. versions [object] List of packages and their versions currently installed on this host. current_version string The current version of the package on the host. initial_version string The initial version of the package on the host before the deployment started. package_name string The name of the package. target_version string The target version that the deployment is attempting to install. packages [object] List of packages to deploy to target hosts. Present only for package upgrade deployments. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. total_hosts int64 Total number of hosts targeted by this deployment. id [_required_] string Unique identifier for the deployment. 
type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` meta object Metadata for a single deployment response, including pagination information for hosts. hosts object Pagination details for the list of hosts in a deployment. current_page int64 Current page index (zero-based). page_size int64 Number of hosts returned per page. total_hosts int64 Total number of hosts in this deployment. total_pages int64 Total number of pages available. ``` { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "estimated_end_time_unix": 1699999999, "filter_query": "env:prod AND service:web", "high_level_status": "pending", "hosts": [ { "error": "", "hostname": "web-server-01.example.com", "status": "succeeded", "versions": [ { "current_version": "7.51.0", "initial_version": "7.51.0", "package_name": "datadog-agent", "target_version": "7.52.0" } ] } ], "packages": [ { "name": "datadog-agent", "version": "7.52.0" } ], "total_hosts": 42 }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "deployment" }, "meta": { "hosts": { "current_page": 0, "page_size": 50, "total_hosts": 150, "total_pages": 3 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Get a configuration deployment by ID Copy ``` # Path parameters export deployment_id="abc-def-ghi" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/${deployment_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a configuration deployment by ID ``` """ Get a configuration deployment by ID returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["get_fleet_deployment"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.get_fleet_deployment( deployment_id="deployment_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a configuration deployment by ID ``` # Get a configuration deployment by ID returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_fleet_deployment".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.get_fleet_deployment("deployment_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a configuration deployment by ID ``` // Get a configuration deployment by ID returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetFleetDeployment", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.GetFleetDeployment(ctx, "deployment_id", *datadogV2.NewGetFleetDeploymentOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.GetFleetDeployment`: 
%v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.GetFleetDeployment`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a configuration deployment by ID ``` // Get a configuration deployment by ID returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetDeploymentResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getFleetDeployment", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetDeploymentResponse result = apiInstance.getFleetDeployment("abc-def-ghi"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#getFleetDeployment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a configuration deployment by ID ``` // Get a configuration deployment by ID returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::api_fleet_automation::GetFleetDeploymentOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetFleetDeployment", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api .get_fleet_deployment( "deployment_id".to_string(), GetFleetDeploymentOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a configuration deployment by ID ``` /** * Get a configuration deployment by ID returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getFleetDeployment"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiGetFleetDeploymentRequest = { deploymentId: "deployment_id", }; 
apiInstance .getFleetDeployment(params) .then((data: v2.FleetDeploymentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Cancel a deployment](https://docs.datadoghq.com/api/latest/fleet-automation/#cancel-a-deployment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#cancel-a-deployment-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.ap2.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.datadoghq.eu/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.ddog-gov.com/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.us3.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}/cancelhttps://api.us5.datadoghq.com/api/unstable/fleet/deployments/{deployment_id}/cancel ### Overview Cancel an active deployment and stop all pending operations. When you cancel a deployment: * All pending operations on hosts that haven’t started yet are stopped * Operations currently in progress on hosts may complete or be interrupted, depending on their current state * Configuration changes or package upgrades already applied to hosts are not rolled back After cancellation, you can view the final state of the deployment using the GET endpoint to see which hosts were successfully updated before the cancellation. This endpoint requires all of the following permissions: * `agent_upgrade_write` * `fleet_policies_write` ### Arguments #### Path Parameters Name Type Description deployment_id [_required_] string The unique identifier of the deployment to cancel. ### Response * [204](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-204-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#CancelFleetDeployment-429-v2) Deployment successfully canceled. Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Cancel a deployment Copy ``` # Path parameters export deployment_id="abc-def-ghi" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/deployments/${deployment_id}/cancel" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Cancel a deployment ``` """ Cancel a deployment returns "Deployment successfully canceled." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["cancel_fleet_deployment"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) api_instance.cancel_fleet_deployment( deployment_id="deployment_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Cancel a deployment ``` # Cancel a deployment returns "Deployment successfully canceled." 
response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.cancel_fleet_deployment".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new api_instance.cancel_fleet_deployment("deployment_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Cancel a deployment ``` // Cancel a deployment returns "Deployment successfully canceled." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CancelFleetDeployment", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) r, err := api.CancelFleetDeployment(ctx, "deployment_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.CancelFleetDeployment`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Cancel a deployment ``` // Cancel a deployment returns "Deployment successfully canceled." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.cancelFleetDeployment", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { apiInstance.cancelFleetDeployment("abc-def-ghi"); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#cancelFleetDeployment"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Cancel a deployment ``` // Cancel a deployment returns "Deployment successfully canceled." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CancelFleetDeployment", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api .cancel_fleet_deployment("deployment_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Cancel a deployment ``` /** * Cancel a deployment returns "Deployment successfully canceled." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.cancelFleetDeployment"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiCancelFleetDeploymentRequest = { deploymentId: "deployment_id", }; apiInstance .cancelFleetDeployment(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all schedules](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-schedules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#list-all-schedules-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/scheduleshttps://api.ap2.datadoghq.com/api/unstable/fleet/scheduleshttps://api.datadoghq.eu/api/unstable/fleet/scheduleshttps://api.ddog-gov.com/api/unstable/fleet/scheduleshttps://api.datadoghq.com/api/unstable/fleet/scheduleshttps://api.us3.datadoghq.com/api/unstable/fleet/scheduleshttps://api.us5.datadoghq.com/api/unstable/fleet/schedules ### Overview Retrieve a list of all schedules for automated fleet deployments. Schedules allow you to automate package upgrades by defining maintenance windows and recurrence rules. Each schedule automatically creates deployments based on its configuration. This endpoint requires the `hosts_read` permission. 
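The recurrence rule fields described in the response model below (`days_of_week`, `start_maintenance_window`, `maintenance_window_duration`, and `timezone`) fully determine when a schedule can trigger. As a rough illustration of those semantics, the following standalone Python sketch checks whether a given instant falls inside a schedule's maintenance window; it is an interpretation of the documented fields rather than Datadog library code, and its handling of midnight-spanning windows and DST edge cases is an assumption.

```
# Illustrative sketch only: evaluate a schedule's recurrence rule locally.
# Field names mirror the documented rule object; midnight-spanning and DST
# handling here are assumptions, not documented behavior.
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

DAY_NAMES = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]  # Monday == 0


def in_maintenance_window(rule: dict, instant: datetime) -> bool:
    """Return True if the timezone-aware `instant` is inside the rule's window."""
    local = instant.astimezone(ZoneInfo(rule["timezone"]))
    hour, minute = map(int, rule["start_maintenance_window"].split(":"))
    duration = timedelta(minutes=rule["maintenance_window_duration"])
    # Consider a window starting today and one starting yesterday, in case
    # the window crosses midnight.
    for day_offset in (0, 1):
        start_date = (local - timedelta(days=day_offset)).date()
        if DAY_NAMES[start_date.weekday()] not in rule["days_of_week"]:
            continue
        start = datetime(start_date.year, start_date.month, start_date.day,
                         hour, minute, tzinfo=local.tzinfo)
        if start <= local < start + duration:
            return True
    return False


# Rule taken from the example response below: Mon/Wed/Fri at 02:00
# America/New_York, for 1200 minutes (20 hours).
rule = {
    "days_of_week": ["Mon", "Wed", "Fri"],
    "maintenance_window_duration": 1200,
    "start_maintenance_window": "02:00",
    "timezone": "America/New_York",
}
wednesday_morning = datetime(2024, 1, 3, 8, 0, tzinfo=ZoneInfo("America/New_York"))
print(in_maintenance_window(rule, wednesday_morning))  # True: Wed 02:00-22:00
```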
### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetSchedules-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetSchedules-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetSchedules-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetSchedules-403-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#ListFleetSchedules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a list of schedules. Field Type Description data [_required_] [object] Array of schedules. attributes [_required_] object Attributes of a schedule in the response. created_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was created. created_by string User handle of the person who created the schedule. name string Human-readable name for the schedule. query string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. * `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. Allowed enum values: `active,inactive` updated_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was last updated. updated_by string User handle of the person who last updated the schedule. version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. id [_required_] string Unique identifier for the schedule. type [_required_] enum The type of schedule resource. Allowed enum values: `schedule` default: `schedule` ``` { "data": [ { "attributes": { "created_at_unix": 1699999999, "created_by": "user@example.com", "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "updated_at_unix": 1699999999, "updated_by": "user@example.com", "version_to_latest": 0 }, "id": "abc-def-ghi-123", "type": "schedule" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### List all schedules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all schedules ``` """ List all schedules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["list_fleet_schedules"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.list_fleet_schedules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all schedules ``` # List all schedules returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_fleet_schedules".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.list_fleet_schedules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all schedules ``` // List all schedules returns "OK" response package main import ( "context" 
"encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListFleetSchedules", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.ListFleetSchedules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.ListFleetSchedules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.ListFleetSchedules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all schedules ``` // List all schedules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetSchedulesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listFleetSchedules", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetSchedulesResponse result = apiInstance.listFleetSchedules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#listFleetSchedules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all schedules ``` // List all schedules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListFleetSchedules", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.list_fleet_schedules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all schedules ``` /** * List all schedules returns "OK" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listFleetSchedules"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); apiInstance .listFleetSchedules() .then((data: v2.FleetSchedulesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a schedule](https://docs.datadoghq.com/api/latest/fleet-automation/#create-a-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#create-a-schedule-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/unstable/fleet/scheduleshttps://api.ap2.datadoghq.com/api/unstable/fleet/scheduleshttps://api.datadoghq.eu/api/unstable/fleet/scheduleshttps://api.ddog-gov.com/api/unstable/fleet/scheduleshttps://api.datadoghq.com/api/unstable/fleet/scheduleshttps://api.us3.datadoghq.com/api/unstable/fleet/scheduleshttps://api.us5.datadoghq.com/api/unstable/fleet/schedules ### Overview Create a new schedule for automated package upgrades. Schedules define when and how often to automatically deploy package upgrades to a fleet of hosts. Each schedule includes: * A filter query to select target hosts * A recurrence rule defining maintenance windows * A version strategy (e.g., always latest, or N versions behind latest) When the schedule triggers during a maintenance window, it automatically creates a deployment that upgrades the Datadog Agent to the specified version on all matching hosts. This endpoint requires the `agent_upgrade_write` permission. ### Request #### Body Data (required) Request payload containing the schedule details. * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Field Type Description data [_required_] object Data for creating a new schedule. attributes [_required_] object Attributes for creating a new schedule. name [_required_] string Human-readable name for the schedule. query [_required_] string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule [_required_] object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. 
* `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. Allowed enum values: `active,inactive` version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version (default) * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. type [_required_] enum The type of schedule resource. Allowed enum values: `schedule` default: `schedule` ``` { "data": { "attributes": { "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "version_to_latest": 0 }, "type": "schedule" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetSchedule-201-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetSchedule-403-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#CreateFleetSchedule-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single schedule. Field Type Description data object A schedule that automatically creates deployments based on a recurrence rule. attributes [_required_] object Attributes of a schedule in the response. created_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was created. created_by string User handle of the person who created the schedule. name string Human-readable name for the schedule. query string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. * `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. Allowed enum values: `active,inactive` updated_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was last updated. updated_by string User handle of the person who last updated the schedule. version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. id [_required_] string Unique identifier for the schedule. 
type [_required_] enum The type of schedule resource. Allowed enum values: `schedule` default: `schedule` ``` { "data": { "attributes": { "created_at_unix": 1699999999, "created_by": "user@example.com", "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "updated_at_unix": 1699999999, "updated_by": "user@example.com", "version_to_latest": 0 }, "id": "abc-def-ghi-123", "type": "schedule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Create a schedule Copy ``` ## Conservative staging updates (N-1 version) # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Staging Environment - Conservative Updates", "query": "env:staging", "rule": { "days_of_week": [ "Fri" ], "maintenance_window_duration": 240, "start_maintenance_window": "22:00", "timezone": "UTC" }, "status": "active", "version_to_latest": 1 }, "type": "schedule" } } EOF ## Weekly production agent updates # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" 
\ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Weekly Production Agent Updates", "query": "env:prod", "rule": { "days_of_week": [ "Mon", "Wed" ], "maintenance_window_duration": 180, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "version_to_latest": 0 }, "type": "schedule" } } EOF ``` ##### Create a schedule ``` """ Create a schedule returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi from datadog_api_client.v2.model.fleet_schedule_create import FleetScheduleCreate from datadog_api_client.v2.model.fleet_schedule_create_attributes import FleetScheduleCreateAttributes from datadog_api_client.v2.model.fleet_schedule_create_request import FleetScheduleCreateRequest from datadog_api_client.v2.model.fleet_schedule_recurrence_rule import FleetScheduleRecurrenceRule from datadog_api_client.v2.model.fleet_schedule_resource_type import FleetScheduleResourceType from datadog_api_client.v2.model.fleet_schedule_status import FleetScheduleStatus body = FleetScheduleCreateRequest( data=FleetScheduleCreate( attributes=FleetScheduleCreateAttributes( name="Weekly Production Agent Updates", query="env:prod AND service:web", rule=FleetScheduleRecurrenceRule( days_of_week=[ "Mon", "Wed", "Fri", ], maintenance_window_duration=1200, start_maintenance_window="02:00", timezone="America/New_York", ), status=FleetScheduleStatus.ACTIVE, version_to_latest=0, ), type=FleetScheduleResourceType.SCHEDULE, ), ) configuration = Configuration() configuration.unstable_operations["create_fleet_schedule"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.create_fleet_schedule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a schedule ``` # Create a schedule returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_fleet_schedule".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new body = DatadogAPIClient::V2::FleetScheduleCreateRequest.new({ data: DatadogAPIClient::V2::FleetScheduleCreate.new({ attributes: DatadogAPIClient::V2::FleetScheduleCreateAttributes.new({ name: "Weekly Production Agent Updates", query: "env:prod AND service:web", rule: DatadogAPIClient::V2::FleetScheduleRecurrenceRule.new({ days_of_week: [ "Mon", "Wed", "Fri", ], maintenance_window_duration: 1200, start_maintenance_window: "02:00", timezone: "America/New_York", }), status: DatadogAPIClient::V2::FleetScheduleStatus::ACTIVE, version_to_latest: 0, }), type: DatadogAPIClient::V2::FleetScheduleResourceType::SCHEDULE, }), }) p api_instance.create_fleet_schedule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" rb "example.rb" ``` ##### Create a schedule ``` // Create a schedule returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FleetScheduleCreateRequest{ Data: datadogV2.FleetScheduleCreate{ Attributes: datadogV2.FleetScheduleCreateAttributes{ Name: "Weekly Production Agent Updates", Query: "env:prod AND service:web", Rule: datadogV2.FleetScheduleRecurrenceRule{ DaysOfWeek: []string{ "Mon", "Wed", "Fri", }, MaintenanceWindowDuration: 1200, StartMaintenanceWindow: "02:00", Timezone: "America/New_York", }, Status: datadogV2.FLEETSCHEDULESTATUS_ACTIVE.Ptr(), VersionToLatest: datadog.PtrInt64(0), }, Type: datadogV2.FLEETSCHEDULERESOURCETYPE_SCHEDULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateFleetSchedule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.CreateFleetSchedule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.CreateFleetSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.CreateFleetSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a schedule ``` // Create a schedule returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetScheduleCreate; import com.datadog.api.client.v2.model.FleetScheduleCreateAttributes; import com.datadog.api.client.v2.model.FleetScheduleCreateRequest; import com.datadog.api.client.v2.model.FleetScheduleRecurrenceRule; import com.datadog.api.client.v2.model.FleetScheduleResourceType; import com.datadog.api.client.v2.model.FleetScheduleResponse; import com.datadog.api.client.v2.model.FleetScheduleStatus; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createFleetSchedule", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); FleetScheduleCreateRequest body = new FleetScheduleCreateRequest() .data( new FleetScheduleCreate() .attributes( new FleetScheduleCreateAttributes() .name("Weekly Production Agent Updates") .query("env:prod AND service:web") .rule( new FleetScheduleRecurrenceRule() .daysOfWeek(Arrays.asList("Mon", "Wed", "Fri")) .maintenanceWindowDuration(1200L) .startMaintenanceWindow("02:00") .timezone("America/New_York")) .status(FleetScheduleStatus.ACTIVE) .versionToLatest(0L)) .type(FleetScheduleResourceType.SCHEDULE)); try { FleetScheduleResponse result = apiInstance.createFleetSchedule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling 
FleetAutomationApi#createFleetSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a schedule ``` // Create a schedule returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::model::FleetScheduleCreate; use datadog_api_client::datadogV2::model::FleetScheduleCreateAttributes; use datadog_api_client::datadogV2::model::FleetScheduleCreateRequest; use datadog_api_client::datadogV2::model::FleetScheduleRecurrenceRule; use datadog_api_client::datadogV2::model::FleetScheduleResourceType; use datadog_api_client::datadogV2::model::FleetScheduleStatus; #[tokio::main] async fn main() { let body = FleetScheduleCreateRequest::new(FleetScheduleCreate::new( FleetScheduleCreateAttributes::new( "Weekly Production Agent Updates".to_string(), "env:prod AND service:web".to_string(), FleetScheduleRecurrenceRule::new( vec!["Mon".to_string(), "Wed".to_string(), "Fri".to_string()], 1200, "02:00".to_string(), "America/New_York".to_string(), ), ) .status(FleetScheduleStatus::ACTIVE) .version_to_latest(0), FleetScheduleResourceType::SCHEDULE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateFleetSchedule", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.create_fleet_schedule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a schedule ``` /** * Create a schedule returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createFleetSchedule"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiCreateFleetScheduleRequest = { body: { data: { attributes: { name: "Weekly Production Agent Updates", query: "env:prod AND service:web", rule: { daysOfWeek: ["Mon", "Wed", "Fri"], maintenanceWindowDuration: 1200, startMaintenanceWindow: "02:00", timezone: "America/New_York", }, status: "active", versionToLatest: 0, }, type: "schedule", }, }, }; apiInstance .createFleetSchedule(params) .then((data: v2.FleetScheduleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a schedule by ID](https://docs.datadoghq.com/api/latest/fleet-automation/#get-a-schedule-by-id) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#get-a-schedule-by-id-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.ap2.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.eu/api/unstable/fleet/schedules/{id}https://api.ddog-gov.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us3.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us5.datadoghq.com/api/unstable/fleet/schedules/{id} ### Overview Retrieve detailed information about a specific schedule using its unique identifier. This endpoint returns comprehensive information about a schedule, including: * Schedule metadata (ID, name, creation/update timestamps) * Filter query for selecting target hosts * Recurrence rule defining when deployments are triggered * Version strategy for package upgrades * Current status (active or inactive) This endpoint requires the `hosts_read` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The unique identifier of the schedule to retrieve. ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#GetFleetSchedule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single schedule. Field Type Description data object A schedule that automatically creates deployments based on a recurrence rule. attributes [_required_] object Attributes of a schedule in the response. created_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was created. created_by string User handle of the person who created the schedule. name string Human-readable name for the schedule. query string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. 
start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. * `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. Allowed enum values: `active,inactive` updated_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was last updated. updated_by string User handle of the person who last updated the schedule. version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. id [_required_] string Unique identifier for the schedule. type [_required_] enum The type of schedule resource. Allowed enum values: `schedule` default: `schedule` ``` { "data": { "attributes": { "created_at_unix": 1699999999, "created_by": "user@example.com", "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "updated_at_unix": 1699999999, "updated_by": "user@example.com", "version_to_latest": 0 }, "id": "abc-def-ghi-123", "type": "schedule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Get a schedule by ID Copy ``` # Path parameters export id="abc-def-ghi-123" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a schedule by ID ``` """ Get a schedule by ID returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["get_fleet_schedule"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.get_fleet_schedule( id="id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a schedule by ID ``` # Get a schedule by ID returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_fleet_schedule".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.get_fleet_schedule("id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a schedule by ID ``` // Get a schedule by ID returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetFleetSchedule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.GetFleetSchedule(ctx, "id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.GetFleetSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.GetFleetSchedule`:\n%s\n", responseContent) } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a schedule by ID ``` // Get a schedule by ID returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetScheduleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getFleetSchedule", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetScheduleResponse result = apiInstance.getFleetSchedule("abc-def-ghi-123"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#getFleetSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a schedule by ID ``` // Get a schedule by ID returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetFleetSchedule", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.get_fleet_schedule("id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a schedule by ID ``` /** * Get a schedule by ID returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getFleetSchedule"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiGetFleetScheduleRequest = { id: "id", }; apiInstance .getFleetSchedule(params) .then((data: v2.FleetScheduleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a schedule](https://docs.datadoghq.com/api/latest/fleet-automation/#update-a-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#update-a-schedule-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.ap2.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.eu/api/unstable/fleet/schedules/{id}https://api.ddog-gov.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us3.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us5.datadoghq.com/api/unstable/fleet/schedules/{id} ### Overview Partially update a schedule by providing only the fields you want to change. This endpoint allows you to modify specific attributes of a schedule without affecting other fields. Common use cases include: * Changing the schedule status between active and inactive * Updating the maintenance window times * Modifying the filter query to target different hosts * Adjusting the version strategy Only include the fields you want to update in the request body. All fields are optional in a PATCH request. This endpoint requires the `agent_upgrade_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The unique identifier of the schedule to update. ### Request #### Body Data (required) Request payload containing the fields to update. * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Field Type Description data [_required_] object Data for partially updating a schedule. attributes object Attributes for partially updating a schedule. All fields are optional. name string Human-readable name for the schedule. query string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. * `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. 
Allowed enum values: `active,inactive` version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. type [_required_] enum The type of schedule resource. Allowed enum values: `schedule` default: `schedule` ``` { "data": { "attributes": { "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "version_to_latest": 0 }, "type": "schedule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-200-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#UpdateFleetSchedule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single schedule. Field Type Description data object A schedule that automatically creates deployments based on a recurrence rule. attributes [_required_] object Attributes of a schedule in the response. created_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was created. created_by string User handle of the person who created the schedule. name string Human-readable name for the schedule. query string Query used to filter and select target hosts for scheduled deployments. Uses the Datadog query syntax. rule object Defines the recurrence pattern for the schedule. Specifies when deployments should be automatically triggered based on maintenance windows. days_of_week [_required_] [string] List of days of the week when the schedule should trigger. Valid values are: "Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun". maintenance_window_duration [_required_] int64 Duration of the maintenance window in minutes. start_maintenance_window [_required_] string Start time of the maintenance window in 24-hour clock format (HH:MM). Deployments will be triggered at this time on the specified days. timezone [_required_] string Timezone for the schedule in IANA Time Zone Database format (e.g., "America/New_York", "UTC"). status enum The status of the schedule. * `active`: The schedule is active and will create deployments according to its recurrence rule. * `inactive`: The schedule is inactive and will not create any deployments. Allowed enum values: `active,inactive` updated_at_unix int64 Unix timestamp (seconds since epoch) when the schedule was last updated. updated_by string User handle of the person who last updated the schedule. version_to_latest int64 Number of major versions behind the latest to target for upgrades. * 0: Always upgrade to the latest version * 1: Upgrade to latest minus 1 major version * 2: Upgrade to latest minus 2 major versions Maximum value is 2. id [_required_] string Unique identifier for the schedule. type [_required_] enum The type of schedule resource. 
Allowed enum values: `schedule` default: `schedule` ``` { "data": { "attributes": { "created_at_unix": 1699999999, "created_by": "user@example.com", "name": "Weekly Production Agent Updates", "query": "env:prod AND service:web", "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 1200, "start_maintenance_window": "02:00", "timezone": "America/New_York" }, "status": "active", "updated_at_unix": 1699999999, "updated_by": "user@example.com", "version_to_latest": 0 }, "id": "abc-def-ghi-123", "type": "schedule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Update a schedule Copy ``` ## Change maintenance window time # # Path parameters export id="abc-def-ghi-123" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "rule": { "days_of_week": [ "Mon", "Wed", "Fri" ], "maintenance_window_duration": 240, "start_maintenance_window": "03:00", "timezone": "America/New_York" } }, "type": "schedule" } } EOF ## Pause a schedule # # Path parameters export id="abc-def-ghi-123" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "status": "inactive" }, "type": "schedule" } } EOF ## Update target hosts query # # Path parameters export id="abc-def-ghi-123" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "query": "env:prod AND service:api" }, "type": "schedule" } } EOF ``` ##### Update a schedule ``` """ Update a schedule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi from datadog_api_client.v2.model.fleet_schedule_patch import FleetSchedulePatch from datadog_api_client.v2.model.fleet_schedule_patch_attributes import FleetSchedulePatchAttributes from datadog_api_client.v2.model.fleet_schedule_patch_request import FleetSchedulePatchRequest from datadog_api_client.v2.model.fleet_schedule_recurrence_rule import FleetScheduleRecurrenceRule from datadog_api_client.v2.model.fleet_schedule_resource_type import FleetScheduleResourceType from datadog_api_client.v2.model.fleet_schedule_status import FleetScheduleStatus body = FleetSchedulePatchRequest( data=FleetSchedulePatch( attributes=FleetSchedulePatchAttributes( name="Weekly Production Agent Updates", query="env:prod AND service:web", rule=FleetScheduleRecurrenceRule( days_of_week=[ "Mon", "Wed", 
"Fri", ], maintenance_window_duration=1200, start_maintenance_window="02:00", timezone="America/New_York", ), status=FleetScheduleStatus.ACTIVE, version_to_latest=0, ), type=FleetScheduleResourceType.SCHEDULE, ), ) configuration = Configuration() configuration.unstable_operations["update_fleet_schedule"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.update_fleet_schedule(id="id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a schedule ``` # Update a schedule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_fleet_schedule".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new body = DatadogAPIClient::V2::FleetSchedulePatchRequest.new({ data: DatadogAPIClient::V2::FleetSchedulePatch.new({ attributes: DatadogAPIClient::V2::FleetSchedulePatchAttributes.new({ name: "Weekly Production Agent Updates", query: "env:prod AND service:web", rule: DatadogAPIClient::V2::FleetScheduleRecurrenceRule.new({ days_of_week: [ "Mon", "Wed", "Fri", ], maintenance_window_duration: 1200, start_maintenance_window: "02:00", timezone: "America/New_York", }), status: DatadogAPIClient::V2::FleetScheduleStatus::ACTIVE, version_to_latest: 0, }), type: DatadogAPIClient::V2::FleetScheduleResourceType::SCHEDULE, }), }) p api_instance.update_fleet_schedule("id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a schedule ``` // Update a schedule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FleetSchedulePatchRequest{ Data: datadogV2.FleetSchedulePatch{ Attributes: &datadogV2.FleetSchedulePatchAttributes{ Name: datadog.PtrString("Weekly Production Agent Updates"), Query: datadog.PtrString("env:prod AND service:web"), Rule: &datadogV2.FleetScheduleRecurrenceRule{ DaysOfWeek: []string{ "Mon", "Wed", "Fri", }, MaintenanceWindowDuration: 1200, StartMaintenanceWindow: "02:00", Timezone: "America/New_York", }, Status: datadogV2.FLEETSCHEDULESTATUS_ACTIVE.Ptr(), VersionToLatest: datadog.PtrInt64(0), }, Type: datadogV2.FLEETSCHEDULERESOURCETYPE_SCHEDULE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateFleetSchedule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.UpdateFleetSchedule(ctx, "id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.UpdateFleetSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.UpdateFleetSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a schedule ``` // Update a schedule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetSchedulePatch; import com.datadog.api.client.v2.model.FleetSchedulePatchAttributes; import com.datadog.api.client.v2.model.FleetSchedulePatchRequest; import com.datadog.api.client.v2.model.FleetScheduleRecurrenceRule; import com.datadog.api.client.v2.model.FleetScheduleResourceType; import com.datadog.api.client.v2.model.FleetScheduleResponse; import com.datadog.api.client.v2.model.FleetScheduleStatus; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateFleetSchedule", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); FleetSchedulePatchRequest body = new FleetSchedulePatchRequest() .data( new FleetSchedulePatch() .attributes( new FleetSchedulePatchAttributes() .name("Weekly Production Agent Updates") .query("env:prod AND service:web") .rule( new FleetScheduleRecurrenceRule() .daysOfWeek(Arrays.asList("Mon", "Wed", "Fri")) .maintenanceWindowDuration(1200L) .startMaintenanceWindow("02:00") .timezone("America/New_York")) .status(FleetScheduleStatus.ACTIVE) .versionToLatest(0L)) .type(FleetScheduleResourceType.SCHEDULE)); try { FleetScheduleResponse result = apiInstance.updateFleetSchedule("abc-def-ghi-123", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#updateFleetSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a schedule ``` // Update a schedule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; use datadog_api_client::datadogV2::model::FleetSchedulePatch; use datadog_api_client::datadogV2::model::FleetSchedulePatchAttributes; use datadog_api_client::datadogV2::model::FleetSchedulePatchRequest; use datadog_api_client::datadogV2::model::FleetScheduleRecurrenceRule; use datadog_api_client::datadogV2::model::FleetScheduleResourceType; use datadog_api_client::datadogV2::model::FleetScheduleStatus; #[tokio::main] async fn main() { let body = FleetSchedulePatchRequest::new( FleetSchedulePatch::new(FleetScheduleResourceType::SCHEDULE).attributes( FleetSchedulePatchAttributes::new() 
.name("Weekly Production Agent Updates".to_string()) .query("env:prod AND service:web".to_string()) .rule(FleetScheduleRecurrenceRule::new( vec!["Mon".to_string(), "Wed".to_string(), "Fri".to_string()], 1200, "02:00".to_string(), "America/New_York".to_string(), )) .status(FleetScheduleStatus::ACTIVE) .version_to_latest(0), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateFleetSchedule", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.update_fleet_schedule("id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a schedule ``` /** * Update a schedule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateFleetSchedule"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiUpdateFleetScheduleRequest = { body: { data: { attributes: { name: "Weekly Production Agent Updates", query: "env:prod AND service:web", rule: { daysOfWeek: ["Mon", "Wed", "Fri"], maintenanceWindowDuration: 1200, startMaintenanceWindow: "02:00", timezone: "America/New_York", }, status: "active", versionToLatest: 0, }, type: "schedule", }, }, id: "id", }; apiInstance .updateFleetSchedule(params) .then((data: v2.FleetScheduleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a schedule](https://docs.datadoghq.com/api/latest/fleet-automation/#delete-a-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#delete-a-schedule-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.ap2.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.eu/api/unstable/fleet/schedules/{id}https://api.ddog-gov.com/api/unstable/fleet/schedules/{id}https://api.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us3.datadoghq.com/api/unstable/fleet/schedules/{id}https://api.us5.datadoghq.com/api/unstable/fleet/schedules/{id} ### Overview Delete a schedule permanently. When you delete a schedule: * The schedule is permanently removed and will no longer create deployments * Any deployments already created by this schedule are not affected * This action cannot be undone If you want to temporarily stop a schedule from creating deployments, consider updating its status to “inactive” instead of deleting it. 
This endpoint requires the `agent_upgrade_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The unique identifier of the schedule to delete. ### Response * [204](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-204-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#DeleteFleetSchedule-429-v2) Schedule successfully deleted. Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Delete a schedule Copy ``` # Path parameters export id="abc-def-ghi-123" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a schedule ``` """ Delete a schedule returns "Schedule successfully deleted." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["delete_fleet_schedule"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) api_instance.delete_fleet_schedule( id="id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a schedule ``` # Delete a schedule returns "Schedule successfully deleted." response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_fleet_schedule".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new api_instance.delete_fleet_schedule("id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a schedule ``` // Delete a schedule returns "Schedule successfully deleted." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteFleetSchedule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) r, err := api.DeleteFleetSchedule(ctx, "id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.DeleteFleetSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a schedule ``` // Delete a schedule returns "Schedule successfully deleted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteFleetSchedule", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { apiInstance.deleteFleetSchedule("abc-def-ghi-123"); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#deleteFleetSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a schedule ``` // Delete a schedule returns "Schedule successfully deleted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteFleetSchedule", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.delete_fleet_schedule("id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a schedule ``` /** * Delete a schedule returns "Schedule successfully deleted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteFleetSchedule"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiDeleteFleetScheduleRequest = { id: "id", }; apiInstance .deleteFleetSchedule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Trigger a schedule deployment](https://docs.datadoghq.com/api/latest/fleet-automation/#trigger-a-schedule-deployment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/fleet-automation/#trigger-a-schedule-deployment-v2) This endpoint is in Preview and may introduce breaking changes. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
POST https://api.ap1.datadoghq.com/api/unstable/fleet/schedules/{id}/triggerhttps://api.ap2.datadoghq.com/api/unstable/fleet/schedules/{id}/triggerhttps://api.datadoghq.eu/api/unstable/fleet/schedules/{id}/triggerhttps://api.ddog-gov.com/api/unstable/fleet/schedules/{id}/triggerhttps://api.datadoghq.com/api/unstable/fleet/schedules/{id}/triggerhttps://api.us3.datadoghq.com/api/unstable/fleet/schedules/{id}/triggerhttps://api.us5.datadoghq.com/api/unstable/fleet/schedules/{id}/trigger ### Overview Manually trigger a schedule to immediately create and start a deployment. This endpoint allows you to manually initiate a deployment using the schedule’s configuration, without waiting for the next scheduled maintenance window. This is useful for: * Testing a schedule before it runs automatically * Performing an emergency update outside the regular maintenance window * Creating an ad-hoc deployment with the same settings as a schedule The deployment is created immediately with: * The same filter query as the schedule * The package version determined by the schedule’s version strategy * All matching hosts as targets The manually triggered deployment is independent of the schedule and does not affect the schedule’s normal recurrence pattern. This endpoint requires the `agent_upgrade_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string The unique identifier of the schedule to trigger. ### Response * [201](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-201-v2) * [400](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/fleet-automation/#TriggerFleetSchedule-429-v2) CREATED - Deployment successfully created and started. * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) Response containing a single deployment. Field Type Description data object A deployment that defines automated configuration changes for a fleet of hosts. attributes [_required_] object Attributes of a deployment in the response. config_operations [object] Ordered list of configuration file operations to perform on the target hosts. file_op [_required_] enum Type of file operation to perform on the target configuration file. * `merge-patch`: Merges the provided patch data with the existing configuration file. Creates the file if it doesn't exist. * `delete`: Removes the specified configuration file from the target hosts. Allowed enum values: `merge-patch,delete` file_path [_required_] string Absolute path to the target configuration file on the host. patch object Patch data in JSON format to apply to the configuration file. When using `merge-patch`, this object is merged with the existing configuration, allowing you to add, update, or override specific fields without replacing the entire file. The structure must match the target configuration file format (for example, YAML structure for Datadog Agent config). Not applicable when using the `delete` operation. estimated_end_time_unix int64 Estimated completion time of the deployment as a Unix timestamp (seconds since epoch). 
filter_query string Query used to filter and select target hosts for the deployment. Uses the Datadog query syntax. high_level_status string Current high-level status of the deployment (for example, "pending", "running", "completed", "failed"). hosts [object] Paginated list of hosts in this deployment with their individual statuses. Only included when fetching a single deployment by ID. Use the `limit` and `page` query parameters to navigate through pages. Pagination metadata is included in the response `meta.hosts` field. error string Error message if the deployment failed on this host. hostname string The hostname of the agent. status string Current deployment status for this specific host. versions [object] List of packages and their versions currently installed on this host. current_version string The current version of the package on the host. initial_version string The initial version of the package on the host before the deployment started. package_name string The name of the package. target_version string The target version that the deployment is attempting to install. packages [object] List of packages to deploy to target hosts. Present only for package upgrade deployments. name [_required_] string The name of the package to deploy. version [_required_] string The target version of the package to deploy. total_hosts int64 Total number of hosts targeted by this deployment. id [_required_] string Unique identifier for the deployment. type [_required_] enum The type of deployment resource. Allowed enum values: `deployment` default: `deployment` meta object Metadata for a single deployment response, including pagination information for hosts. hosts object Pagination details for the list of hosts in a deployment. current_page int64 Current page index (zero-based). page_size int64 Number of hosts returned per page. total_hosts int64 Total number of hosts in this deployment. total_pages int64 Total number of pages available. ``` { "data": { "attributes": { "config_operations": [ { "file_op": "merge-patch", "file_path": "/datadog.yaml", "patch": { "apm_config": { "enabled": true }, "log_level": "debug", "logs_enabled": true } } ], "estimated_end_time_unix": 1699999999, "filter_query": "env:prod AND service:web", "high_level_status": "pending", "hosts": [ { "error": "", "hostname": "web-server-01.example.com", "status": "succeeded", "versions": [ { "current_version": "7.51.0", "initial_version": "7.51.0", "package_name": "datadog-agent", "target_version": "7.52.0" } ] } ], "packages": [ { "name": "datadog-agent", "version": "7.52.0" } ], "total_hosts": 42 }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "deployment" }, "meta": { "hosts": { "current_page": 0, "page_size": 50, "total_hosts": 150, "total_pages": 3 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/fleet-automation/) * [Example](https://docs.datadoghq.com/api/latest/fleet-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/fleet-automation/?code-lang=typescript) ##### Trigger a schedule deployment Copy ``` # Path parameters export id="abc-def-ghi-123" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/unstable/fleet/schedules/${id}/trigger" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Trigger a schedule deployment ``` """ Trigger a schedule deployment returns "CREATED - Deployment successfully created and started." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.fleet_automation_api import FleetAutomationApi configuration = Configuration() configuration.unstable_operations["trigger_fleet_schedule"] = True with ApiClient(configuration) as api_client: api_instance = FleetAutomationApi(api_client) response = api_instance.trigger_fleet_schedule( id="id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Trigger a schedule deployment ``` # Trigger a schedule deployment returns "CREATED - Deployment successfully created and started." 
response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.trigger_fleet_schedule".to_sym] = true end api_instance = DatadogAPIClient::V2::FleetAutomationAPI.new p api_instance.trigger_fleet_schedule("id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Trigger a schedule deployment ``` // Trigger a schedule deployment returns "CREATED - Deployment successfully created and started." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.TriggerFleetSchedule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewFleetAutomationApi(apiClient) resp, r, err := api.TriggerFleetSchedule(ctx, "id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `FleetAutomationApi.TriggerFleetSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `FleetAutomationApi.TriggerFleetSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Trigger a schedule deployment ``` // Trigger a schedule deployment returns "CREATED - Deployment successfully created and started." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.FleetAutomationApi; import com.datadog.api.client.v2.model.FleetDeploymentResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.triggerFleetSchedule", true); FleetAutomationApi apiInstance = new FleetAutomationApi(defaultClient); try { FleetDeploymentResponse result = apiInstance.triggerFleetSchedule("abc-def-ghi-123"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling FleetAutomationApi#triggerFleetSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Trigger a schedule deployment ``` // Trigger a schedule deployment returns "CREATED - Deployment successfully // created and started." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_fleet_automation::FleetAutomationAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.TriggerFleetSchedule", true); let api = FleetAutomationAPI::with_config(configuration); let resp = api.trigger_fleet_schedule("id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Trigger a schedule deployment ``` /** * Trigger a schedule deployment returns "CREATED - Deployment successfully created and started." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.triggerFleetSchedule"] = true; const apiInstance = new v2.FleetAutomationApi(configuration); const params: v2.FleetAutomationApiTriggerFleetScheduleRequest = { id: "id", }; apiInstance .triggerFleetSchedule(params) .then((data: v2.FleetDeploymentResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=8f1e5205-ccd2-4194-a659-99f3e5184cc3&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=e7a50935-9bc0-46ae-9eab-4902afa4dd25&pt=Fleet%20Automation&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffleet-automation%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=8f1e5205-ccd2-4194-a659-99f3e5184cc3&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=e7a50935-9bc0-46ae-9eab-4902afa4dd25&pt=Fleet%20Automation&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffleet-automation%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=6647292c-3833-48db-a3a8-af367a1ed904&bo=2&sid=b28d9090f0bf11f0b30273253c7db78b&vid=b28de6c0f0bf11f0933f9f2238d9b757&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Fleet%20Automation&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ffleet-automation%2F&r=<=3576&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=478081) --- # Source: https://docs.datadoghq.com/api/latest/gcp-integration/ # GCP Integration Configure your Datadog-Google Cloud Platform (GCP) integration directly through the Datadog API. Read more about the [Datadog-Google Cloud Platform integration](https://docs.datadoghq.com/integrations/google_cloud_platform). ## [List all GCP integrations](https://docs.datadoghq.com/api/latest/gcp-integration/#list-all-gcp-integrations) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/gcp-integration/#list-all-gcp-integrations-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/gcphttps://api.ap2.datadoghq.com/api/v1/integration/gcphttps://api.datadoghq.eu/api/v1/integration/gcphttps://api.ddog-gov.com/api/v1/integration/gcphttps://api.datadoghq.com/api/v1/integration/gcphttps://api.us3.datadoghq.com/api/v1/integration/gcphttps://api.us5.datadoghq.com/api/v1/integration/gcp ### Overview This endpoint is deprecated – use the V2 endpoints instead. List all Datadog-GCP integrations configured in your Datadog account. This endpoint requires the `gcp_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Array of GCP account responses. 
Field Type Description auth_provider_x509_cert_url string Should be `https://www.googleapis.com/oauth2/v1/certs`. auth_uri string Should be `https://accounts.google.com/o/oauth2/auth`. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your email found in your JSON service account key. client_id string Your ID found in your JSON service account key. client_x509_cert_url string Should be `https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL` where `$CLIENT_EMAIL` is the email found in your JSON service account key. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` errors [string] An array of errors. host_filters string **DEPRECATED** : A comma-separated list of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` private_key string Your private key name found in your JSON service account key. private_key_id string Your private key ID found in your JSON service account key. project_id string Your Google Cloud project ID found in your JSON service account key. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. token_uri string Should be `https://accounts.google.com/o/oauth2/token`. type string The value for service_account found in your JSON service account key. 
``` { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "automute": false, "client_email": "api-dev@datadog-sandbox.iam.gserviceaccount.com", "client_id": "123456712345671234567", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "cloud_run_revision_filters": [ "$KEY:$VALUE" ], "errors": [ "*" ], "host_filters": "$KEY1:$VALUE1,$KEY2:$VALUE2", "is_cspm_enabled": true, "is_resource_change_collection_enabled": true, "is_security_command_center_enabled": true, "monitored_resource_configs": [ { "filters": [ "$KEY:$VALUE" ], "type": "gce_instance" } ], "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python-legacy) ##### List all GCP integrations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/gcp" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all GCP integrations ``` """ List all GCP integrations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.list_gcp_integration() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all GCP integrations ``` # List all GCP integrations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::GCPIntegrationAPI.new p api_instance.list_gcp_integration() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all GCP integrations ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.gcp_integration_list ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all GCP integrations ``` // List all GCP integrations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV1.NewGCPIntegrationApi(apiClient) resp, r, err := api.ListGCPIntegration(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.ListGCPIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.ListGCPIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all GCP integrations ``` // List all GCP integrations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.GcpIntegrationApi; import com.datadog.api.client.v1.model.GCPAccount; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); try { List result = apiInstance.listGCPIntegration(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#listGCPIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all GCP integrations ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.GcpIntegration.list() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### List all GCP integrations ``` // List all GCP integrations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_gcp_integration::GCPIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.list_gcp_integration().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all GCP integrations ``` /** * List all GCP integrations returns 
"OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.GCPIntegrationApi(configuration); apiInstance .listGCPIntegration() .then((data: v1.GCPAccount[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all GCP STS-enabled service accounts](https://docs.datadoghq.com/api/latest/gcp-integration/#list-all-gcp-sts-enabled-service-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#list-all-gcp-sts-enabled-service-accounts-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/gcp/accountshttps://api.ap2.datadoghq.com/api/v2/integration/gcp/accountshttps://api.datadoghq.eu/api/v2/integration/gcp/accountshttps://api.ddog-gov.com/api/v2/integration/gcp/accountshttps://api.datadoghq.com/api/v2/integration/gcp/accountshttps://api.us3.datadoghq.com/api/v2/integration/gcp/accountshttps://api.us5.datadoghq.com/api/v2/integration/gcp/accounts ### Overview List all GCP STS-enabled service accounts configured in your Datadog account. This endpoint requires the `gcp_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPSTSAccounts-200-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPSTSAccounts-403-v2) * [404](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPSTSAccounts-404-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#ListGCPSTSAccounts-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Object containing all your STS enabled accounts. Field Type Description data [object] Array of GCP STS enabled service accounts. attributes object Attributes associated with your service account. account_tags [string] Tags to be associated with GCP metrics and service checks from your account. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your service account email address. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` host_filters [string] **DEPRECATED** : List of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. 
is_global_location_enabled boolean When enabled, Datadog collects metrics where location is explicitly stated as "global" or where location information cannot be deduced from GCP labels. default: `true` is_per_project_quota_enabled boolean When enabled, Datadog applies the `X-Goog-User-Project` header, attributing Google Cloud billing and quota usage to the project being monitored rather than the default service account project. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. metric_namespace_configs [object] Configurations for GCP metric namespaces. disabled boolean When disabled, Datadog does not collect metrics that are related to this GCP metric namespace. filters [string] When enabled, Datadog applies these additional filters to limit metric collection. A metric is collected only if it does not match all exclusion filters and matches at least one allow filter. id string The id of the GCP metric namespace. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` region_filter_configs [string] Configurations for GCP location filtering, such as region, multi-region, or zone. Only monitored resources that match the specified regions are imported into Datadog. By default, Datadog collects from all locations. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. id string Your service account's unique ID. meta object Additional information related to your service account. accessible_projects [string] The current list of projects accessible from your service account. type enum The type of account. Allowed enum values: `gcp_service_account` default: `gcp_service_account` ``` { "data": [ { "attributes": { "account_tags": [], "automute": false, "client_email": "datadog-service-account@test-project.iam.gserviceaccount.com", "cloud_run_revision_filters": [ "$KEY:$VALUE" ], "host_filters": [ "$KEY:$VALUE" ], "is_cspm_enabled": false, "is_global_location_enabled": true, "is_per_project_quota_enabled": true, "is_resource_change_collection_enabled": true, "is_security_command_center_enabled": true, "metric_namespace_configs": [ { "disabled": true, "filters": [ "snapshot.*", "!*_by_region" ], "id": "pubsub" } ], "monitored_resource_configs": [ { "filters": [ "$KEY:$VALUE" ], "type": "gce_instance" } ], "region_filter_configs": [ "nam4", "europe-north1" ], "resource_collection_enabled": false }, "id": "d291291f-12c2-22g4-j290-123456678897", "meta": { "accessible_projects": [] }, "type": "gcp_service_account" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### List all GCP STS-enabled service accounts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all GCP STS-enabled service accounts ``` """ List all GCP STS-enabled service accounts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.list_gcpsts_accounts() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all GCP STS-enabled service accounts ``` # List all GCP STS-enabled service accounts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new p api_instance.list_gcpsts_accounts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all GCP STS-enabled service accounts ``` // List all GCP STS-enabled service accounts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.ListGCPSTSAccounts(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.ListGCPSTSAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.ListGCPSTSAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all GCP STS-enabled service accounts ``` // List all GCP STS-enabled service accounts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); try { GCPSTSServiceAccountsResponse result = apiInstance.listGCPSTSAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#listGCPSTSAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all GCP STS-enabled service accounts ``` // List all GCP STS-enabled service accounts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.list_gcpsts_accounts().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all GCP STS-enabled service accounts ``` /** * List all GCP STS-enabled service accounts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); apiInstance .listGCPSTSAccounts() .then((data: v2.GCPSTSServiceAccountsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a GCP integration](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-gcp-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-gcp-integration-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/gcphttps://api.ap2.datadoghq.com/api/v1/integration/gcphttps://api.datadoghq.eu/api/v1/integration/gcphttps://api.ddog-gov.com/api/v1/integration/gcphttps://api.datadoghq.com/api/v1/integration/gcphttps://api.us3.datadoghq.com/api/v1/integration/gcphttps://api.us5.datadoghq.com/api/v1/integration/gcp ### Overview This endpoint is deprecated – use the V2 endpoints instead. Create a Datadog-GCP integration. This endpoint requires the `gcp_configurations_manage` permission. ### Request #### Body Data (required) Create a Datadog-GCP integration. * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Field Type Description auth_provider_x509_cert_url string Should be `https://www.googleapis.com/oauth2/v1/certs`. auth_uri string Should be `https://accounts.google.com/o/oauth2/auth`. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your email found in your JSON service account key. client_id string Your ID found in your JSON service account key. client_x509_cert_url string Should be `https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL` where `$CLIENT_EMAIL` is the email found in your JSON service account key. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` errors [string] An array of errors. host_filters string **DEPRECATED** : A comma-separated list of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. 
Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` private_key string Your private key name found in your JSON service account key. private_key_id string Your private key ID found in your JSON service account key. project_id string Your Google Cloud project ID found in your JSON service account key. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. token_uri string Should be `https://accounts.google.com/o/oauth2/token`. type string The value for service_account found in your JSON service account key. ``` { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "host_filters": "key:value,filter:example", "cloud_run_revision_filters": [ "dr:dre" ], "is_cspm_enabled": true, "is_security_command_center_enabled": true, "is_resource_change_collection_enabled": true, "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python-legacy)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby-legacy)

##### Create a GCP integration returns "OK" response

```
# Curl command (host shown for the default datadoghq.com site; use the api.* host for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v1/integration/gcp" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "client_email": "252bf553ef04b351@example.com",
  "client_id": "163662907116366290710",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL",
  "host_filters": "key:value,filter:example",
  "cloud_run_revision_filters": [
    "dr:dre"
  ],
  "is_cspm_enabled": true,
  "is_security_command_center_enabled": true,
  "is_resource_change_collection_enabled": true,
  "private_key": "private_key",
  "private_key_id": "123456789abcdefghi123456789abcdefghijklm",
  "project_id": "datadog-apitest",
  "resource_collection_enabled": true,
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "type": "service_account"
}
EOF
```

##### Create a GCP integration returns "OK" response

```
// Create a GCP integration returns "OK" response
package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    body := datadogV1.GCPAccount{
        AuthProviderX509CertUrl: datadog.PtrString("https://www.googleapis.com/oauth2/v1/certs"),
        AuthUri: datadog.PtrString("https://accounts.google.com/o/oauth2/auth"),
        ClientEmail: datadog.PtrString("252bf553ef04b351@example.com"),
        ClientId: datadog.PtrString("163662907116366290710"),
        ClientX509CertUrl: datadog.PtrString("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL"),
        HostFilters: datadog.PtrString("key:value,filter:example"),
        CloudRunRevisionFilters: []string{
            "dr:dre",
        },
        IsCspmEnabled: datadog.PtrBool(true),
        IsSecurityCommandCenterEnabled: datadog.PtrBool(true),
        IsResourceChangeCollectionEnabled: datadog.PtrBool(true),
        PrivateKey: datadog.PtrString("private_key"),
        PrivateKeyId: datadog.PtrString("123456789abcdefghi123456789abcdefghijklm"),
        ProjectId: datadog.PtrString("datadog-apitest"),
        ResourceCollectionEnabled: datadog.PtrBool(true),
        TokenUri: datadog.PtrString("https://accounts.google.com/o/oauth2/token"),
        Type: datadog.PtrString("service_account"),
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api :=
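    // CreateGCPIntegration calls the deprecated v1 endpoint POST /api/v1/integration/gcp
    // with the service account key fields assembled in `body` above.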
datadogV1.NewGCPIntegrationApi(apiClient) resp, r, err := api.CreateGCPIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.CreateGCPIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.CreateGCPIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a GCP integration returns "OK" response ``` // Create a GCP integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.GcpIntegrationApi; import com.datadog.api.client.v1.model.GCPAccount; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPAccount body = new GCPAccount() .authProviderX509CertUrl("https://www.googleapis.com/oauth2/v1/certs") .authUri("https://accounts.google.com/o/oauth2/auth") .clientEmail("252bf553ef04b351@example.com") .clientId("163662907116366290710") .clientX509CertUrl("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL") .hostFilters("key:value,filter:example") .cloudRunRevisionFilters(Collections.singletonList("dr:dre")) .isCspmEnabled(true) .isSecurityCommandCenterEnabled(true) .isResourceChangeCollectionEnabled(true) .privateKey("private_key") .privateKeyId("123456789abcdefghi123456789abcdefghijklm") .projectId("datadog-apitest") .resourceCollectionEnabled(true) .tokenUri("https://accounts.google.com/o/oauth2/token") .type("service_account"); try { apiInstance.createGCPIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#createGCPIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a GCP integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.GcpIntegration.create( type="service_account", project_id="", private_key_id="", private_key="", client_email="", client_id="", auth_uri="", token_uri="", auth_provider_x509_cert_url="", client_x509_cert_url="", host_filters=":,:" ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python "example.py" ``` ##### Create a GCP integration returns "OK" response ``` """ Create a GCP integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v1.model.gcp_account import GCPAccount body = GCPAccount( auth_provider_x509_cert_url="https://www.googleapis.com/oauth2/v1/certs", auth_uri="https://accounts.google.com/o/oauth2/auth", client_email="252bf553ef04b351@example.com", client_id="163662907116366290710", client_x509_cert_url="https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters="key:value,filter:example", cloud_run_revision_filters=[ "dr:dre", ], is_cspm_enabled=True, is_security_command_center_enabled=True, is_resource_change_collection_enabled=True, private_key="private_key", private_key_id="123456789abcdefghi123456789abcdefghijklm", project_id="datadog-apitest", resource_collection_enabled=True, token_uri="https://accounts.google.com/o/oauth2/token", type="service_account", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.create_gcp_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a GCP integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' config= { "type": "service_account", "project_id": "", "private_key_id": "", "private_key": "", "client_email": "", "client_id": "", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "token_uri": "https://accounts.google.com/o/oauth2/token", "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/", "host_filters": ":,:" } dog = Dogapi::Client.new(api_key, app_key) dog.gcp_integration_create(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a GCP integration returns "OK" response ``` # Create a GCP integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::GCPIntegrationAPI.new body = DatadogAPIClient::V1::GCPAccount.new({ auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs", auth_uri: "https://accounts.google.com/o/oauth2/auth", client_email: "252bf553ef04b351@example.com", client_id: "163662907116366290710", client_x509_cert_url: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters: "key:value,filter:example", cloud_run_revision_filters: [ "dr:dre", ], is_cspm_enabled: true, is_security_command_center_enabled: true, is_resource_change_collection_enabled: true, private_key: "private_key", private_key_id: "123456789abcdefghi123456789abcdefghijklm", project_id: "datadog-apitest", resource_collection_enabled: true, 
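  # Note: per the field descriptions above, is_cspm_enabled requires resource_collection_enabled to be true.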
token_uri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }) p api_instance.create_gcp_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a GCP integration returns "OK" response ``` // Create a GCP integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV1::model::GCPAccount; #[tokio::main] async fn main() { let body = GCPAccount::new() .auth_provider_x509_cert_url("https://www.googleapis.com/oauth2/v1/certs".to_string()) .auth_uri("https://accounts.google.com/o/oauth2/auth".to_string()) .client_email("252bf553ef04b351@example.com".to_string()) .client_id("163662907116366290710".to_string()) .client_x509_cert_url( "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL".to_string(), ) .cloud_run_revision_filters(vec!["dr:dre".to_string()]) .host_filters("key:value,filter:example".to_string()) .is_cspm_enabled(true) .is_resource_change_collection_enabled(true) .is_security_command_center_enabled(true) .private_key("private_key".to_string()) .private_key_id("123456789abcdefghi123456789abcdefghijklm".to_string()) .project_id("datadog-apitest".to_string()) .resource_collection_enabled(true) .token_uri("https://accounts.google.com/o/oauth2/token".to_string()) .type_("service_account".to_string()); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.create_gcp_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a GCP integration returns "OK" response ``` /** * Create a GCP integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.GCPIntegrationApi(configuration); const params: v1.GCPIntegrationApiCreateGCPIntegrationRequest = { body: { authProviderX509CertUrl: "https://www.googleapis.com/oauth2/v1/certs", authUri: "https://accounts.google.com/o/oauth2/auth", clientEmail: "252bf553ef04b351@example.com", clientId: "163662907116366290710", clientX509CertUrl: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", hostFilters: "key:value,filter:example", cloudRunRevisionFilters: ["dr:dre"], isCspmEnabled: true, isSecurityCommandCenterEnabled: true, isResourceChangeCollectionEnabled: true, privateKey: "private_key", privateKeyId: "123456789abcdefghi123456789abcdefghijklm", projectId: "datadog-apitest", resourceCollectionEnabled: true, tokenUri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }, }; apiInstance .createGCPIntegration(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new entry for your service account](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-new-entry-for-your-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-new-entry-for-your-service-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integration/gcp/accountshttps://api.ap2.datadoghq.com/api/v2/integration/gcp/accountshttps://api.datadoghq.eu/api/v2/integration/gcp/accountshttps://api.ddog-gov.com/api/v2/integration/gcp/accountshttps://api.datadoghq.com/api/v2/integration/gcp/accountshttps://api.us3.datadoghq.com/api/v2/integration/gcp/accountshttps://api.us5.datadoghq.com/api/v2/integration/gcp/accounts ### Overview Create a new entry within Datadog for your STS enabled service account. This endpoint requires the `gcp_configurations_manage` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Field Type Description data object Additional metadata on your generated service account. attributes object Attributes associated with your service account. account_tags [string] Tags to be associated with GCP metrics and service checks from your account. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your service account email address. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` host_filters [string] **DEPRECATED** : List of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_global_location_enabled boolean When enabled, Datadog collects metrics where location is explicitly stated as "global" or where location information cannot be deduced from GCP labels. default: `true` is_per_project_quota_enabled boolean When enabled, Datadog applies the `X-Goog-User-Project` header, attributing Google Cloud billing and quota usage to the project being monitored rather than the default service account project. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. 
metric_namespace_configs [object] Configurations for GCP metric namespaces. disabled boolean When disabled, Datadog does not collect metrics that are related to this GCP metric namespace. filters [string] When enabled, Datadog applies these additional filters to limit metric collection. A metric is collected only if it does not match all exclusion filters and matches at least one allow filter. id string The id of the GCP metric namespace. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` region_filter_configs [string] Configurations for GCP location filtering, such as region, multi-region, or zone. Only monitored resources that match the specified regions are imported into Datadog. By default, Datadog collects from all locations. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. type enum The type of account. Allowed enum values: `gcp_service_account` default: `gcp_service_account` ##### Create a new entry for your service account returns "OK" response ``` { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` { "data": { "attributes": { "account_tags": [ "lorem", "ipsum" ], "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` { "data": { "attributes": { "cloud_run_revision_filters": [ "meh:bleh" ], "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-400-v2) * [401](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-401-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-403-v2) * [409](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-409-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#CreateGCPSTSAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) The account creation response. Field Type Description data object Info on your service account. attributes object Attributes associated with your service account. account_tags [string] Tags to be associated with GCP metrics and service checks from your account. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your service account email address. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. 
**Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` host_filters [string] **DEPRECATED** : List of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_global_location_enabled boolean When enabled, Datadog collects metrics where location is explicitly stated as "global" or where location information cannot be deduced from GCP labels. default: `true` is_per_project_quota_enabled boolean When enabled, Datadog applies the `X-Goog-User-Project` header, attributing Google Cloud billing and quota usage to the project being monitored rather than the default service account project. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. metric_namespace_configs [object] Configurations for GCP metric namespaces. disabled boolean When disabled, Datadog does not collect metrics that are related to this GCP metric namespace. filters [string] When enabled, Datadog applies these additional filters to limit metric collection. A metric is collected only if it does not match all exclusion filters and matches at least one allow filter. id string The id of the GCP metric namespace. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` region_filter_configs [string] Configurations for GCP location filtering, such as region, multi-region, or zone. Only monitored resources that match the specified regions are imported into Datadog. By default, Datadog collects from all locations. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. id string Your service account's unique ID. meta object Additional information related to your service account. accessible_projects [string] The current list of projects accessible from your service account. type enum The type of account. 
Allowed enum values: `gcp_service_account` default: `gcp_service_account` ``` { "data": { "attributes": { "account_tags": [], "automute": false, "client_email": "datadog-service-account@test-project.iam.gserviceaccount.com", "cloud_run_revision_filters": [ "$KEY:$VALUE" ], "host_filters": [ "$KEY:$VALUE" ], "is_cspm_enabled": false, "is_global_location_enabled": true, "is_per_project_quota_enabled": true, "is_resource_change_collection_enabled": true, "is_security_command_center_enabled": true, "metric_namespace_configs": [ { "disabled": true, "filters": [ "snapshot.*", "!*_by_region" ], "id": "pubsub" } ], "monitored_resource_configs": [ { "filters": [ "$KEY:$VALUE" ], "type": "gce_instance" } ], "region_filter_configs": [ "nam4", "europe-north1" ], "resource_collection_enabled": false }, "id": "d291291f-12c2-22g4-j290-123456678897", "meta": { "accessible_projects": [] }, "type": "gcp_service_account" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### Create a new entry for your service account returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } EOF ``` ##### Create a new entry for your service account with account_tags returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "account_tags": [ "lorem", "ipsum" ], "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } EOF ``` ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "cloud_run_revision_filters": [ "meh:bleh" ], "client_email": "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", "host_filters": [] }, "type": "gcp_service_account" } } EOF ``` ##### Create a new entry for your service account returns "OK" response ``` // Create a new entry for your service account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GCPSTSServiceAccountCreateRequest{ Data: &datadogV2.GCPSTSServiceAccountData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ ClientEmail: datadog.PtrString("Test-252bf553ef04b351@test-project.iam.gserviceaccount.com"), HostFilters: []string{}, }, Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := 
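// API and application keys are supplied through the DD_API_KEY and DD_APP_KEY
// environment variables referenced in the run instructions below.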
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.CreateGCPSTSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.CreateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.CreateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` // Create a new entry for your service account with account_tags returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GCPSTSServiceAccountCreateRequest{ Data: &datadogV2.GCPSTSServiceAccountData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ AccountTags: []string{ "lorem", "ipsum", }, ClientEmail: datadog.PtrString("Test-252bf553ef04b351@test-project.iam.gserviceaccount.com"), HostFilters: []string{}, }, Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.CreateGCPSTSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.CreateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.CreateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` // Create a new entry for your service account with cloud run revision filters enabled returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GCPSTSServiceAccountCreateRequest{ Data: &datadogV2.GCPSTSServiceAccountData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ CloudRunRevisionFilters: []string{ "meh:bleh", }, ClientEmail: datadog.PtrString("Test-252bf553ef04b351@test-project.iam.gserviceaccount.com"), HostFilters: []string{}, }, Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.CreateGCPSTSAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.CreateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.CreateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
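# Example invocation with a single site (API/app key values are placeholders):
# DD_SITE="datadoghq.com" DD_API_KEY="<your API key>" DD_APP_KEY="<your application key>" go run "main.go"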
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new entry for your service account returns "OK" response ``` // Create a new entry for your service account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountCreateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountData; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPServiceAccountType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPSTSServiceAccountCreateRequest body = new GCPSTSServiceAccountCreateRequest() .data( new GCPSTSServiceAccountData() .attributes( new GCPSTSServiceAccountAttributes() .clientEmail( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com")) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.createGCPSTSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#createGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` // Create a new entry for your service account with account_tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountCreateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountData; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPServiceAccountType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPSTSServiceAccountCreateRequest body = new GCPSTSServiceAccountCreateRequest() .data( new GCPSTSServiceAccountData() .attributes( new GCPSTSServiceAccountAttributes() .accountTags(Arrays.asList("lorem", "ipsum")) .clientEmail( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com")) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.createGCPSTSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#createGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` // Create a new entry for your service account with cloud run revision 
filters enabled returns "OK" // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountCreateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountData; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPServiceAccountType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPSTSServiceAccountCreateRequest body = new GCPSTSServiceAccountCreateRequest() .data( new GCPSTSServiceAccountData() .attributes( new GCPSTSServiceAccountAttributes() .cloudRunRevisionFilters(Collections.singletonList("meh:bleh")) .clientEmail( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com")) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.createGCPSTSAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#createGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new entry for your service account returns "OK" response ``` """ Create a new entry for your service account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_create_request import GCPSTSServiceAccountCreateRequest from datadog_api_client.v2.model.gcpsts_service_account_data import GCPSTSServiceAccountData body = GCPSTSServiceAccountCreateRequest( data=GCPSTSServiceAccountData( attributes=GCPSTSServiceAccountAttributes( client_email="Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters=[], ), type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.create_gcpsts_account(body=body) print(response) ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` """ Create a new entry for your service account with account_tags returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import 
GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_create_request import GCPSTSServiceAccountCreateRequest from datadog_api_client.v2.model.gcpsts_service_account_data import GCPSTSServiceAccountData body = GCPSTSServiceAccountCreateRequest( data=GCPSTSServiceAccountData( attributes=GCPSTSServiceAccountAttributes( account_tags=[ "lorem", "ipsum", ], client_email="Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters=[], ), type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.create_gcpsts_account(body=body) print(response) ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` """ Create a new entry for your service account with cloud run revision filters enabled returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_create_request import GCPSTSServiceAccountCreateRequest from datadog_api_client.v2.model.gcpsts_service_account_data import GCPSTSServiceAccountData body = GCPSTSServiceAccountCreateRequest( data=GCPSTSServiceAccountData( attributes=GCPSTSServiceAccountAttributes( cloud_run_revision_filters=[ "meh:bleh", ], client_email="Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters=[], ), type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.create_gcpsts_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new entry for your service account returns "OK" response ``` # Create a new entry for your service account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new body = DatadogAPIClient::V2::GCPSTSServiceAccountCreateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ client_email: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters: [], }), type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.create_gcpsts_account(body) ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` # Create a new entry for your service account with account_tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new body = DatadogAPIClient::V2::GCPSTSServiceAccountCreateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ account_tags: [ "lorem", "ipsum", ], 
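    # account_tags are associated with the GCP metrics and service checks collected through this account.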
client_email: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters: [], }), type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.create_gcpsts_account(body) ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` # Create a new entry for your service account with cloud run revision filters enabled returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new body = DatadogAPIClient::V2::GCPSTSServiceAccountCreateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ cloud_run_revision_filters: [ "meh:bleh", ], client_email: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", host_filters: [], }), type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.create_gcpsts_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new entry for your service account returns "OK" response ``` // Create a new entry for your service account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountCreateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { let body = GCPSTSServiceAccountCreateRequest::new().data( GCPSTSServiceAccountData::new() .attributes( GCPSTSServiceAccountAttributes::new() .client_email( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com".to_string(), ) .host_filters(vec![]), ) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.create_gcpsts_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` // Create a new entry for your service account with account_tags returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountCreateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { let body = GCPSTSServiceAccountCreateRequest::new().data( GCPSTSServiceAccountData::new() .attributes( GCPSTSServiceAccountAttributes::new() .account_tags(vec!["lorem".to_string(), "ipsum".to_string()]) .client_email( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com".to_string(), ) .host_filters(vec![]), ) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = 
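    // Build the default client configuration, then call create_gcpsts_account below,
    // which sends POST /api/v2/integration/gcp/accounts.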
datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.create_gcpsts_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` // Create a new entry for your service account with cloud run revision filters // enabled returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountCreateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { let body = GCPSTSServiceAccountCreateRequest::new().data( GCPSTSServiceAccountData::new() .attributes( GCPSTSServiceAccountAttributes::new() .client_email( "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com".to_string(), ) .cloud_run_revision_filters(vec!["meh:bleh".to_string()]) .host_filters(vec![]), ) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.create_gcpsts_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new entry for your service account returns "OK" response ``` /** * Create a new entry for your service account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); const params: v2.GCPIntegrationApiCreateGCPSTSAccountRequest = { body: { data: { attributes: { clientEmail: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", hostFilters: [], }, type: "gcp_service_account", }, }, }; apiInstance .createGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a new entry for your service account with account_tags returns "OK" response ``` /** * Create a new entry for your service account with account_tags returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); const params: v2.GCPIntegrationApiCreateGCPSTSAccountRequest = { body: { data: { attributes: { accountTags: ["lorem", "ipsum"], clientEmail: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", hostFilters: [], }, type: "gcp_service_account", }, }, }; apiInstance .createGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a new entry for your service account with cloud run revision filters enabled returns "OK" response ``` /** * Create a new entry for your service account with cloud run revision filters enabled returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); const params: v2.GCPIntegrationApiCreateGCPSTSAccountRequest = { body: { data: { attributes: { cloudRunRevisionFilters: ["meh:bleh"], clientEmail: "Test-252bf553ef04b351@test-project.iam.gserviceaccount.com", hostFilters: [], }, type: "gcp_service_account", }, }, }; apiInstance .createGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a GCP integration](https://docs.datadoghq.com/api/latest/gcp-integration/#delete-a-gcp-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/gcp-integration/#delete-a-gcp-integration-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/gcphttps://api.ap2.datadoghq.com/api/v1/integration/gcphttps://api.datadoghq.eu/api/v1/integration/gcphttps://api.ddog-gov.com/api/v1/integration/gcphttps://api.datadoghq.com/api/v1/integration/gcphttps://api.us3.datadoghq.com/api/v1/integration/gcphttps://api.us5.datadoghq.com/api/v1/integration/gcp ### Overview This endpoint is deprecated – use the V2 endpoints instead. Delete a given Datadog-GCP integration. This endpoint requires the `gcp_configurations_manage` permission. ### Request #### Body Data (required) Delete a given Datadog-GCP integration. * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Field Type Description auth_provider_x509_cert_url string Should be `https://www.googleapis.com/oauth2/v1/certs`. auth_uri string Should be `https://accounts.google.com/o/oauth2/auth`. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your email found in your JSON service account key. client_id string Your ID found in your JSON service account key. client_x509_cert_url string Should be `https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL` where `$CLIENT_EMAIL` is the email found in your JSON service account key. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` errors [string] An array of errors. host_filters string **DEPRECATED** : A comma-separated list of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. 
**Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` private_key string Your private key name found in your JSON service account key. private_key_id string Your private key ID found in your JSON service account key. project_id string Your Google Cloud project ID found in your JSON service account key. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. token_uri string Should be `https://accounts.google.com/o/oauth2/token`. type string The value for service_account found in your JSON service account key. ``` { "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "project_id": "datadog-apitest" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby-legacy) ##### Delete a GCP integration returns "OK" response Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/gcp" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "project_id": "datadog-apitest" } EOF ``` ##### Delete a GCP integration returns "OK" response ``` // Delete a GCP integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.GCPAccount{ ClientEmail: datadog.PtrString("252bf553ef04b351@example.com"), ClientId: datadog.PtrString("163662907116366290710"), ProjectId: datadog.PtrString("datadog-apitest"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewGCPIntegrationApi(apiClient) resp, r, err := api.DeleteGCPIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.DeleteGCPIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.DeleteGCPIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a GCP integration returns "OK" response ``` // Delete a GCP integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.GcpIntegrationApi; import com.datadog.api.client.v1.model.GCPAccount; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPAccount body = new GCPAccount() .clientEmail("252bf553ef04b351@example.com") .clientId("163662907116366290710") .projectId("datadog-apitest"); try { 
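      // Per the response table above, a successful delete returns HTTP 200 with an
      // empty body ({}), so there is nothing to print after this call.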
apiInstance.deleteGCPIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#deleteGCPIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a GCP integration returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.GcpIntegration.delete( project_id="", client_email="" ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete a GCP integration returns "OK" response ``` """ Delete a GCP integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v1.model.gcp_account import GCPAccount body = GCPAccount( client_email="252bf553ef04b351@example.com", client_id="163662907116366290710", project_id="datadog-apitest", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.delete_gcp_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a GCP integration returns "OK" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) config = { "project_id": "", "client_email": "" } dog.gcp_integration_delete(config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a GCP integration returns "OK" response ``` # Delete a GCP integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::GCPIntegrationAPI.new body = DatadogAPIClient::V1::GCPAccount.new({ client_email: "252bf553ef04b351@example.com", client_id: "163662907116366290710", project_id: "datadog-apitest", }) p api_instance.delete_gcp_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a GCP integration returns "OK" response ``` // Delete a GCP integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV1::model::GCPAccount; #[tokio::main] async fn main() { let body = GCPAccount::new() .client_email("252bf553ef04b351@example.com".to_string()) .client_id("163662907116366290710".to_string()) .project_id("datadog-apitest".to_string()); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.delete_gcp_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a GCP integration returns "OK" response ``` /** * Delete a GCP integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.GCPIntegrationApi(configuration); const params: v1.GCPIntegrationApiDeleteGCPIntegrationRequest = { body: { clientEmail: "252bf553ef04b351@example.com", clientId: "163662907116366290710", projectId: "datadog-apitest", }, }; apiInstance .deleteGCPIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an STS enabled GCP Account](https://docs.datadoghq.com/api/latest/gcp-integration/#delete-an-sts-enabled-gcp-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#delete-an-sts-enabled-gcp-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.datadoghq.eu/api/v2/integration/gcp/accounts/{account_id}https://api.ddog-gov.com/api/v2/integration/gcp/accounts/{account_id}https://api.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/{account_id} ### Overview Delete an STS enabled GCP account from within Datadog. This endpoint requires the `gcp_configurations_manage` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Your GCP STS enabled service account’s unique ID. 
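The `account_id` here is the unique ID Datadog assigns to the STS enabled service account (a UUID-style value, as in the update examples further down), not the service account's email address. If you do not have it on hand, one way to find it is to list your STS enabled accounts and read the `id` field of each entry. The sketch below assumes the v2 "List all GCP STS-enabled service accounts" endpoint (`GET /api/v2/integration/gcp/accounts`) and the `api.datadoghq.com` site; adjust the domain for your region.

```
# Look up the unique ID of the STS enabled service account you want to delete.
# Each entry in the response's "data" array carries an "id" to use as account_id.
curl -X GET "https://api.datadoghq.com/api/v2/integration/gcp/accounts" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```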
### Response * [204](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPSTSAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPSTSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPSTSAccount-403-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#DeleteGCPSTSAccount-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### Delete an STS enabled GCP Account Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/${account_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an STS enabled GCP Account ``` """ Delete an STS enabled GCP Account returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) api_instance.delete_gcpsts_account( account_id="account_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an STS enabled GCP Account ``` # Delete an STS enabled GCP Account returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new api_instance.delete_gcpsts_account("account_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an STS enabled GCP Account ``` // Delete an STS enabled GCP Account returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) r, err := api.DeleteGCPSTSAccount(ctx, "account_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.DeleteGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an STS enabled GCP Account ``` // Delete an STS enabled GCP Account returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); try { apiInstance.deleteGCPSTSAccount("account_id"); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#deleteGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an STS enabled GCP Account ``` // Delete an STS enabled GCP Account returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.delete_gcpsts_account("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an STS enabled GCP Account ``` /** * Delete an STS enabled GCP Account returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); const params: v2.GCPIntegrationApiDeleteGCPSTSAccountRequest = { accountId: "account_id", }; apiInstance .deleteGCPSTSAccount(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a GCP integration](https://docs.datadoghq.com/api/latest/gcp-integration/#update-a-gcp-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/gcp-integration/#update-a-gcp-integration-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/gcphttps://api.ap2.datadoghq.com/api/v1/integration/gcphttps://api.datadoghq.eu/api/v1/integration/gcphttps://api.ddog-gov.com/api/v1/integration/gcphttps://api.datadoghq.com/api/v1/integration/gcphttps://api.us3.datadoghq.com/api/v1/integration/gcphttps://api.us5.datadoghq.com/api/v1/integration/gcp ### Overview This endpoint is deprecated – use the V2 endpoints instead. Update a Datadog-GCP integration's `host_filters` and/or `automute` settings. A `project_id` and `client_email` are required, but these fields cannot be updated. If you need to change them, delete the integration and re-create it with the create (`POST`) endpoint. Unspecified fields keep their original values. This endpoint requires the `gcp_configuration_edit` permission. ### Request #### Body Data (required) Update a Datadog-GCP integration. * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Field Type Description auth_provider_x509_cert_url string Should be `https://www.googleapis.com/oauth2/v1/certs`. auth_uri string Should be `https://accounts.google.com/o/oauth2/auth`. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your email found in your JSON service account key. client_id string Your ID found in your JSON service account key. client_x509_cert_url string Should be `https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL` where `$CLIENT_EMAIL` is the email found in your JSON service account key. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` errors [string] An array of errors. host_filters string **DEPRECATED** : A comma-separated list of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true.
is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` private_key string Your private key name found in your JSON service account key. private_key_id string Your private key ID found in your JSON service account key. project_id string Your Google Cloud project ID found in your JSON service account key. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. token_uri string Should be `https://accounts.google.com/o/oauth2/token`. type string The value for service_account found in your JSON service account key. ##### Update a GCP integration cloud run revision filters returns "OK" response ``` { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "host_filters": "key:value,filter:example", "cloud_run_revision_filters": [ "merp:derp" ], "is_cspm_enabled": true, "is_security_command_center_enabled": true, "is_resource_change_collection_enabled": true, "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } ``` Copy ##### Update a GCP integration returns "OK" response ``` { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "host_filters": "key:value,filter:example", "is_cspm_enabled": true, "is_security_command_center_enabled": true, "is_resource_change_collection_enabled": true, "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * 
[Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### Update a GCP integration cloud run revision filters returns "OK" response Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/gcp" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "host_filters": "key:value,filter:example", "cloud_run_revision_filters": [ "merp:derp" ], "is_cspm_enabled": true, "is_security_command_center_enabled": true, "is_resource_change_collection_enabled": true, "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } EOF ``` ##### Update a GCP integration returns "OK" response Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/gcp" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "client_email": "252bf553ef04b351@example.com", "client_id": "163662907116366290710", "client_x509_cert_url": 
"https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", "host_filters": "key:value,filter:example", "is_cspm_enabled": true, "is_security_command_center_enabled": true, "is_resource_change_collection_enabled": true, "private_key": "private_key", "private_key_id": "123456789abcdefghi123456789abcdefghijklm", "project_id": "datadog-apitest", "resource_collection_enabled": true, "token_uri": "https://accounts.google.com/o/oauth2/token", "type": "service_account" } EOF ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` // Update a GCP integration cloud run revision filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.GCPAccount{ AuthProviderX509CertUrl: datadog.PtrString("https://www.googleapis.com/oauth2/v1/certs"), AuthUri: datadog.PtrString("https://accounts.google.com/o/oauth2/auth"), ClientEmail: datadog.PtrString("252bf553ef04b351@example.com"), ClientId: datadog.PtrString("163662907116366290710"), ClientX509CertUrl: datadog.PtrString("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL"), HostFilters: datadog.PtrString("key:value,filter:example"), CloudRunRevisionFilters: []string{ "merp:derp", }, IsCspmEnabled: datadog.PtrBool(true), IsSecurityCommandCenterEnabled: datadog.PtrBool(true), IsResourceChangeCollectionEnabled: datadog.PtrBool(true), PrivateKey: datadog.PtrString("private_key"), PrivateKeyId: datadog.PtrString("123456789abcdefghi123456789abcdefghijklm"), ProjectId: datadog.PtrString("datadog-apitest"), ResourceCollectionEnabled: datadog.PtrBool(true), TokenUri: datadog.PtrString("https://accounts.google.com/o/oauth2/token"), Type: datadog.PtrString("service_account"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewGCPIntegrationApi(apiClient) resp, r, err := api.UpdateGCPIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.UpdateGCPIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.UpdateGCPIntegration`:\n%s\n", responseContent) } ``` Copy ##### Update a GCP integration returns "OK" response ``` // Update a GCP integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.GCPAccount{ AuthProviderX509CertUrl: datadog.PtrString("https://www.googleapis.com/oauth2/v1/certs"), AuthUri: datadog.PtrString("https://accounts.google.com/o/oauth2/auth"), ClientEmail: datadog.PtrString("252bf553ef04b351@example.com"), ClientId: datadog.PtrString("163662907116366290710"), ClientX509CertUrl: datadog.PtrString("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL"), HostFilters: datadog.PtrString("key:value,filter:example"), IsCspmEnabled: datadog.PtrBool(true), IsSecurityCommandCenterEnabled: datadog.PtrBool(true), IsResourceChangeCollectionEnabled: datadog.PtrBool(true), PrivateKey: datadog.PtrString("private_key"), PrivateKeyId: datadog.PtrString("123456789abcdefghi123456789abcdefghijklm"), ProjectId: datadog.PtrString("datadog-apitest"), 
ResourceCollectionEnabled: datadog.PtrBool(true), TokenUri: datadog.PtrString("https://accounts.google.com/o/oauth2/token"), Type: datadog.PtrString("service_account"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewGCPIntegrationApi(apiClient) resp, r, err := api.UpdateGCPIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.UpdateGCPIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.UpdateGCPIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` // Update a GCP integration cloud run revision filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.GcpIntegrationApi; import com.datadog.api.client.v1.model.GCPAccount; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPAccount body = new GCPAccount() .authProviderX509CertUrl("https://www.googleapis.com/oauth2/v1/certs") .authUri("https://accounts.google.com/o/oauth2/auth") .clientEmail("252bf553ef04b351@example.com") .clientId("163662907116366290710") .clientX509CertUrl("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL") .hostFilters("key:value,filter:example") .cloudRunRevisionFilters(Collections.singletonList("merp:derp")) .isCspmEnabled(true) .isSecurityCommandCenterEnabled(true) .isResourceChangeCollectionEnabled(true) .privateKey("private_key") .privateKeyId("123456789abcdefghi123456789abcdefghijklm") .projectId("datadog-apitest") .resourceCollectionEnabled(true) .tokenUri("https://accounts.google.com/o/oauth2/token") .type("service_account"); try { apiInstance.updateGCPIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#updateGCPIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a GCP integration returns "OK" response ``` // Update a GCP integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.GcpIntegrationApi; import com.datadog.api.client.v1.model.GCPAccount; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); GCPAccount body = new GCPAccount() .authProviderX509CertUrl("https://www.googleapis.com/oauth2/v1/certs") .authUri("https://accounts.google.com/o/oauth2/auth") .clientEmail("252bf553ef04b351@example.com") .clientId("163662907116366290710") 
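            // Note: isCspmEnabled requires resourceCollectionEnabled to be true, and
            // hostFilters is deprecated in favor of monitored_resource_configs with
            // type=gce_instance (kept here to mirror the documented request body).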
.clientX509CertUrl("https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL") .hostFilters("key:value,filter:example") .isCspmEnabled(true) .isSecurityCommandCenterEnabled(true) .isResourceChangeCollectionEnabled(true) .privateKey("private_key") .privateKeyId("123456789abcdefghi123456789abcdefghijklm") .projectId("datadog-apitest") .resourceCollectionEnabled(true) .tokenUri("https://accounts.google.com/o/oauth2/token") .type("service_account"); try { apiInstance.updateGCPIntegration(body); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#updateGCPIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` """ Update a GCP integration cloud run revision filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v1.model.gcp_account import GCPAccount body = GCPAccount( auth_provider_x509_cert_url="https://www.googleapis.com/oauth2/v1/certs", auth_uri="https://accounts.google.com/o/oauth2/auth", client_email="252bf553ef04b351@example.com", client_id="163662907116366290710", client_x509_cert_url="https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters="key:value,filter:example", cloud_run_revision_filters=[ "merp:derp", ], is_cspm_enabled=True, is_security_command_center_enabled=True, is_resource_change_collection_enabled=True, private_key="private_key", private_key_id="123456789abcdefghi123456789abcdefghijklm", project_id="datadog-apitest", resource_collection_enabled=True, token_uri="https://accounts.google.com/o/oauth2/token", type="service_account", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.update_gcp_integration(body=body) print(response) ``` Copy ##### Update a GCP integration returns "OK" response ``` """ Update a GCP integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v1.model.gcp_account import GCPAccount body = GCPAccount( auth_provider_x509_cert_url="https://www.googleapis.com/oauth2/v1/certs", auth_uri="https://accounts.google.com/o/oauth2/auth", client_email="252bf553ef04b351@example.com", client_id="163662907116366290710", client_x509_cert_url="https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters="key:value,filter:example", is_cspm_enabled=True, is_security_command_center_enabled=True, is_resource_change_collection_enabled=True, private_key="private_key", private_key_id="123456789abcdefghi123456789abcdefghijklm", project_id="datadog-apitest", resource_collection_enabled=True, token_uri="https://accounts.google.com/o/oauth2/token", type="service_account", ) configuration = Configuration() with 
ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.update_gcp_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` # Update a GCP integration cloud run revision filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::GCPIntegrationAPI.new body = DatadogAPIClient::V1::GCPAccount.new({ auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs", auth_uri: "https://accounts.google.com/o/oauth2/auth", client_email: "252bf553ef04b351@example.com", client_id: "163662907116366290710", client_x509_cert_url: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters: "key:value,filter:example", cloud_run_revision_filters: [ "merp:derp", ], is_cspm_enabled: true, is_security_command_center_enabled: true, is_resource_change_collection_enabled: true, private_key: "private_key", private_key_id: "123456789abcdefghi123456789abcdefghijklm", project_id: "datadog-apitest", resource_collection_enabled: true, token_uri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }) p api_instance.update_gcp_integration(body) ``` Copy ##### Update a GCP integration returns "OK" response ``` # Update a GCP integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::GCPIntegrationAPI.new body = DatadogAPIClient::V1::GCPAccount.new({ auth_provider_x509_cert_url: "https://www.googleapis.com/oauth2/v1/certs", auth_uri: "https://accounts.google.com/o/oauth2/auth", client_email: "252bf553ef04b351@example.com", client_id: "163662907116366290710", client_x509_cert_url: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", host_filters: "key:value,filter:example", is_cspm_enabled: true, is_security_command_center_enabled: true, is_resource_change_collection_enabled: true, private_key: "private_key", private_key_id: "123456789abcdefghi123456789abcdefghijklm", project_id: "datadog-apitest", resource_collection_enabled: true, token_uri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }) p api_instance.update_gcp_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` // Update a GCP integration cloud run revision filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV1::model::GCPAccount; #[tokio::main] async fn main() { let body = GCPAccount::new() .auth_provider_x509_cert_url("https://www.googleapis.com/oauth2/v1/certs".to_string()) .auth_uri("https://accounts.google.com/o/oauth2/auth".to_string()) 
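        // The auth_uri and auth_provider_x509_cert_url values (and token_uri below)
        // are the Google defaults copied from the JSON service account key.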
.client_email("252bf553ef04b351@example.com".to_string()) .client_id("163662907116366290710".to_string()) .client_x509_cert_url( "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL".to_string(), ) .cloud_run_revision_filters(vec!["merp:derp".to_string()]) .host_filters("key:value,filter:example".to_string()) .is_cspm_enabled(true) .is_resource_change_collection_enabled(true) .is_security_command_center_enabled(true) .private_key("private_key".to_string()) .private_key_id("123456789abcdefghi123456789abcdefghijklm".to_string()) .project_id("datadog-apitest".to_string()) .resource_collection_enabled(true) .token_uri("https://accounts.google.com/o/oauth2/token".to_string()) .type_("service_account".to_string()); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.update_gcp_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a GCP integration returns "OK" response ``` // Update a GCP integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV1::model::GCPAccount; #[tokio::main] async fn main() { let body = GCPAccount::new() .auth_provider_x509_cert_url("https://www.googleapis.com/oauth2/v1/certs".to_string()) .auth_uri("https://accounts.google.com/o/oauth2/auth".to_string()) .client_email("252bf553ef04b351@example.com".to_string()) .client_id("163662907116366290710".to_string()) .client_x509_cert_url( "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL".to_string(), ) .host_filters("key:value,filter:example".to_string()) .is_cspm_enabled(true) .is_resource_change_collection_enabled(true) .is_security_command_center_enabled(true) .private_key("private_key".to_string()) .private_key_id("123456789abcdefghi123456789abcdefghijklm".to_string()) .project_id("datadog-apitest".to_string()) .resource_collection_enabled(true) .token_uri("https://accounts.google.com/o/oauth2/token".to_string()) .type_("service_account".to_string()); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.update_gcp_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a GCP integration cloud run revision filters returns "OK" response ``` /** * Update a GCP integration cloud run revision filters returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.GCPIntegrationApi(configuration); const params: v1.GCPIntegrationApiUpdateGCPIntegrationRequest = { body: { authProviderX509CertUrl: "https://www.googleapis.com/oauth2/v1/certs", authUri: "https://accounts.google.com/o/oauth2/auth", clientEmail: "252bf553ef04b351@example.com", clientId: "163662907116366290710", clientX509CertUrl: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", hostFilters: "key:value,filter:example", cloudRunRevisionFilters: 
["merp:derp"], isCspmEnabled: true, isSecurityCommandCenterEnabled: true, isResourceChangeCollectionEnabled: true, privateKey: "private_key", privateKeyId: "123456789abcdefghi123456789abcdefghijklm", projectId: "datadog-apitest", resourceCollectionEnabled: true, tokenUri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }, }; apiInstance .updateGCPIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a GCP integration returns "OK" response ``` /** * Update a GCP integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.GCPIntegrationApi(configuration); const params: v1.GCPIntegrationApiUpdateGCPIntegrationRequest = { body: { authProviderX509CertUrl: "https://www.googleapis.com/oauth2/v1/certs", authUri: "https://accounts.google.com/o/oauth2/auth", clientEmail: "252bf553ef04b351@example.com", clientId: "163662907116366290710", clientX509CertUrl: "https://www.googleapis.com/robot/v1/metadata/x509/$CLIENT_EMAIL", hostFilters: "key:value,filter:example", isCspmEnabled: true, isSecurityCommandCenterEnabled: true, isResourceChangeCollectionEnabled: true, privateKey: "private_key", privateKeyId: "123456789abcdefghi123456789abcdefghijklm", projectId: "datadog-apitest", resourceCollectionEnabled: true, tokenUri: "https://accounts.google.com/o/oauth2/token", type: "service_account", }, }; apiInstance .updateGCPIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update STS Service Account](https://docs.datadoghq.com/api/latest/gcp-integration/#update-sts-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#update-sts-service-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.datadoghq.eu/api/v2/integration/gcp/accounts/{account_id}https://api.ddog-gov.com/api/v2/integration/gcp/accounts/{account_id}https://api.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integration/gcp/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/{account_id} ### Overview Update an STS enabled service account. This endpoint requires the `gcp_configuration_edit` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string Your GCP STS enabled service account’s unique ID. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Field Type Description data object Data on your service account. attributes object Attributes associated with your service account. account_tags [string] Tags to be associated with GCP metrics and service checks from your account. 
automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your service account email address. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` host_filters [string] **DEPRECATED** : List of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_global_location_enabled boolean When enabled, Datadog collects metrics where location is explicitly stated as "global" or where location information cannot be deduced from GCP labels. default: `true` is_per_project_quota_enabled boolean When enabled, Datadog applies the `X-Goog-User-Project` header, attributing Google Cloud billing and quota usage to the project being monitored rather than the default service account project. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. Note: This requires additional permissions on the service account. metric_namespace_configs [object] Configurations for GCP metric namespaces. disabled boolean When disabled, Datadog does not collect metrics that are related to this GCP metric namespace. filters [string] When enabled, Datadog applies these additional filters to limit metric collection. A metric is collected only if it does not match all exclusion filters and matches at least one allow filter. id string The id of the GCP metric namespace. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` region_filter_configs [string] Configurations for GCP location filtering, such as region, multi-region, or zone. Only monitored resources that match the specified regions are imported into Datadog. By default, Datadog collects from all locations. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. id string Your service account's unique ID. type enum The type of account. 
Allowed enum values: `gcp_service_account` default: `gcp_service_account` ##### Update STS Service Account returns "OK" response ``` { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "host_filters": [ "foo:bar" ] }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "cloud_run_revision_filters": [ "merp:derp" ] }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "resource_collection_enabled": true }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPSTSAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPSTSAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPSTSAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPSTSAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#UpdateGCPSTSAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) The account creation response. Field Type Description data object Info on your service account. attributes object Attributes associated with your service account. account_tags [string] Tags to be associated with GCP metrics and service checks from your account. automute boolean Silence monitors for expected GCE instance shutdowns. client_email string Your service account email address. cloud_run_revision_filters [string] **DEPRECATED** : List of filters to limit the Cloud Run revisions that are pulled into Datadog by using tags. Only Cloud Run revision resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=cloud_run_revision` host_filters [string] **DEPRECATED** : List of filters to limit the VM instances that are pulled into Datadog by using tags. Only VM instance resources that apply to specified filters are imported into Datadog. **Note:** This field is deprecated. Instead, use `monitored_resource_configs` with `type=gce_instance` is_cspm_enabled boolean When enabled, Datadog will activate the Cloud Security Monitoring product for this service account. Note: This requires resource_collection_enabled to be set to true. is_global_location_enabled boolean When enabled, Datadog collects metrics where location is explicitly stated as "global" or where location information cannot be deduced from GCP labels. default: `true` is_per_project_quota_enabled boolean When enabled, Datadog applies the `X-Goog-User-Project` header, attributing Google Cloud billing and quota usage to the project being monitored rather than the default service account project. is_resource_change_collection_enabled boolean When enabled, Datadog scans for all resource change data in your Google Cloud environment. is_security_command_center_enabled boolean When enabled, Datadog will attempt to collect Security Command Center Findings. 
Note: This requires additional permissions on the service account. metric_namespace_configs [object] Configurations for GCP metric namespaces. disabled boolean When disabled, Datadog does not collect metrics that are related to this GCP metric namespace. filters [string] When enabled, Datadog applies these additional filters to limit metric collection. A metric is collected only if it does not match all exclusion filters and matches at least one allow filter. id string The id of the GCP metric namespace. monitored_resource_configs [object] Configurations for GCP monitored resources. filters [string] List of filters to limit the monitored resources that are pulled into Datadog by using tags. Only monitored resources that apply to specified filters are imported into Datadog. type enum The GCP monitored resource type. Only a subset of resource types are supported. Allowed enum values: `cloud_function,cloud_run_revision,gce_instance` region_filter_configs [string] Configurations for GCP location filtering, such as region, multi-region, or zone. Only monitored resources that match the specified regions are imported into Datadog. By default, Datadog collects from all locations. resource_collection_enabled boolean When enabled, Datadog scans for all resources in your GCP environment. id string Your service account's unique ID. meta object Additional information related to your service account. accessible_projects [string] The current list of projects accessible from your service account. type enum The type of account. Allowed enum values: `gcp_service_account` default: `gcp_service_account` ``` { "data": { "attributes": { "account_tags": [], "automute": false, "client_email": "datadog-service-account@test-project.iam.gserviceaccount.com", "cloud_run_revision_filters": [ "$KEY:$VALUE" ], "host_filters": [ "$KEY:$VALUE" ], "is_cspm_enabled": false, "is_global_location_enabled": true, "is_per_project_quota_enabled": true, "is_resource_change_collection_enabled": true, "is_security_command_center_enabled": true, "metric_namespace_configs": [ { "disabled": true, "filters": [ "snapshot.*", "!*_by_region" ], "id": "pubsub" } ], "monitored_resource_configs": [ { "filters": [ "$KEY:$VALUE" ], "type": "gce_instance" } ], "region_filter_configs": [ "nam4", "europe-north1" ], "resource_collection_enabled": false }, "id": "d291291f-12c2-22g4-j290-123456678897", "meta": { "accessible_projects": [] }, "type": "gcp_service_account" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### Update STS Service Account returns "OK" response Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "host_filters": [ "foo:bar" ] }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } EOF ``` ##### Update STS Service Account returns "OK" response with cloud run revision filters Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "cloud_run_revision_filters": [ "merp:derp" ] }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } EOF ``` ##### Update STS Service Account returns "OK" response with enable resource collection turned on Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "client_email": "Test-252bf553ef04b351@example.com", "resource_collection_enabled": true }, "id": "d291291f-12c2-22g4-j290-123456678897", "type": "gcp_service_account" } } EOF ``` ##### Update STS Service Account returns "OK" response ``` // Update STS Service Account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "gcp_sts_account" in the system GcpStsAccountDataID := os.Getenv("GCP_STS_ACCOUNT_DATA_ID") body := datadogV2.GCPSTSServiceAccountUpdateRequest{ Data: 
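// Note: the request schema above marks host_filters as deprecated; the documented
// replacement is monitored_resource_configs with type=gce_instance. HostFilters is
// used here to match the example request body shown earlier on this page.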
&datadogV2.GCPSTSServiceAccountUpdateRequestData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ ClientEmail: datadog.PtrString("Test-252bf553ef04b351@example.com"), HostFilters: []string{ "foo:bar", }, }, Id: datadog.PtrString(GcpStsAccountDataID), Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.UpdateGCPSTSAccount(ctx, GcpStsAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.UpdateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.UpdateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` // Update STS Service Account returns "OK" response with cloud run revision filters package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "gcp_sts_account" in the system GcpStsAccountDataID := os.Getenv("GCP_STS_ACCOUNT_DATA_ID") body := datadogV2.GCPSTSServiceAccountUpdateRequest{ Data: &datadogV2.GCPSTSServiceAccountUpdateRequestData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ ClientEmail: datadog.PtrString("Test-252bf553ef04b351@example.com"), CloudRunRevisionFilters: []string{ "merp:derp", }, }, Id: datadog.PtrString(GcpStsAccountDataID), Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.UpdateGCPSTSAccount(ctx, GcpStsAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.UpdateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.UpdateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` // Update STS Service Account returns "OK" response with enable resource collection turned on package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "gcp_sts_account" in the system GcpStsAccountDataID := os.Getenv("GCP_STS_ACCOUNT_DATA_ID") body := datadogV2.GCPSTSServiceAccountUpdateRequest{ Data: &datadogV2.GCPSTSServiceAccountUpdateRequestData{ Attributes: &datadogV2.GCPSTSServiceAccountAttributes{ ClientEmail: datadog.PtrString("Test-252bf553ef04b351@example.com"), ResourceCollectionEnabled: datadog.PtrBool(true), }, Id: datadog.PtrString(GcpStsAccountDataID), Type: datadogV2.GCPSERVICEACCOUNTTYPE_GCP_SERVICE_ACCOUNT.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := 
api.UpdateGCPSTSAccount(ctx, GcpStsAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.UpdateGCPSTSAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.UpdateGCPSTSAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update STS Service Account returns "OK" response ``` // Update STS Service Account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequestData; import com.datadog.api.client.v2.model.GCPServiceAccountType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); // there is a valid "gcp_sts_account" in the system String GCP_STS_ACCOUNT_DATA_ID = System.getenv("GCP_STS_ACCOUNT_DATA_ID"); GCPSTSServiceAccountUpdateRequest body = new GCPSTSServiceAccountUpdateRequest() .data( new GCPSTSServiceAccountUpdateRequestData() .attributes( new GCPSTSServiceAccountAttributes() .clientEmail("Test-252bf553ef04b351@example.com") .hostFilters(Collections.singletonList("foo:bar"))) .id(GCP_STS_ACCOUNT_DATA_ID) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.updateGCPSTSAccount(GCP_STS_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#updateGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` // Update STS Service Account returns "OK" response with cloud run revision filters import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequestData; import com.datadog.api.client.v2.model.GCPServiceAccountType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); // there is a valid "gcp_sts_account" in the system String GCP_STS_ACCOUNT_DATA_ID = System.getenv("GCP_STS_ACCOUNT_DATA_ID"); 
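// Note: the request schema above marks cloud_run_revision_filters as deprecated; the
// documented replacement is monitored_resource_configs with type=cloud_run_revision.
// cloudRunRevisionFilters is used here to match the example request body shown earlier on this page.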
GCPSTSServiceAccountUpdateRequest body = new GCPSTSServiceAccountUpdateRequest() .data( new GCPSTSServiceAccountUpdateRequestData() .attributes( new GCPSTSServiceAccountAttributes() .clientEmail("Test-252bf553ef04b351@example.com") .cloudRunRevisionFilters(Collections.singletonList("merp:derp"))) .id(GCP_STS_ACCOUNT_DATA_ID) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.updateGCPSTSAccount(GCP_STS_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#updateGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` // Update STS Service Account returns "OK" response with enable resource collection turned on import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSServiceAccountAttributes; import com.datadog.api.client.v2.model.GCPSTSServiceAccountResponse; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequest; import com.datadog.api.client.v2.model.GCPSTSServiceAccountUpdateRequestData; import com.datadog.api.client.v2.model.GCPServiceAccountType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); // there is a valid "gcp_sts_account" in the system String GCP_STS_ACCOUNT_DATA_ID = System.getenv("GCP_STS_ACCOUNT_DATA_ID"); GCPSTSServiceAccountUpdateRequest body = new GCPSTSServiceAccountUpdateRequest() .data( new GCPSTSServiceAccountUpdateRequestData() .attributes( new GCPSTSServiceAccountAttributes() .clientEmail("Test-252bf553ef04b351@example.com") .resourceCollectionEnabled(true)) .id(GCP_STS_ACCOUNT_DATA_ID) .type(GCPServiceAccountType.GCP_SERVICE_ACCOUNT)); try { GCPSTSServiceAccountResponse result = apiInstance.updateGCPSTSAccount(GCP_STS_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#updateGCPSTSAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update STS Service Account returns "OK" response ``` """ Update STS Service Account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_update_request import 
GCPSTSServiceAccountUpdateRequest from datadog_api_client.v2.model.gcpsts_service_account_update_request_data import GCPSTSServiceAccountUpdateRequestData # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = environ["GCP_STS_ACCOUNT_DATA_ID"] body = GCPSTSServiceAccountUpdateRequest( data=GCPSTSServiceAccountUpdateRequestData( attributes=GCPSTSServiceAccountAttributes( client_email="Test-252bf553ef04b351@example.com", host_filters=[ "foo:bar", ], ), id=GCP_STS_ACCOUNT_DATA_ID, type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.update_gcpsts_account(account_id=GCP_STS_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` """ Update STS Service Account returns "OK" response with cloud run revision filters """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_update_request import GCPSTSServiceAccountUpdateRequest from datadog_api_client.v2.model.gcpsts_service_account_update_request_data import GCPSTSServiceAccountUpdateRequestData # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = environ["GCP_STS_ACCOUNT_DATA_ID"] body = GCPSTSServiceAccountUpdateRequest( data=GCPSTSServiceAccountUpdateRequestData( attributes=GCPSTSServiceAccountAttributes( client_email="Test-252bf553ef04b351@example.com", cloud_run_revision_filters=[ "merp:derp", ], ), id=GCP_STS_ACCOUNT_DATA_ID, type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.update_gcpsts_account(account_id=GCP_STS_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` """ Update STS Service Account returns "OK" response with enable resource collection turned on """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi from datadog_api_client.v2.model.gcp_service_account_type import GCPServiceAccountType from datadog_api_client.v2.model.gcpsts_service_account_attributes import GCPSTSServiceAccountAttributes from datadog_api_client.v2.model.gcpsts_service_account_update_request import GCPSTSServiceAccountUpdateRequest from datadog_api_client.v2.model.gcpsts_service_account_update_request_data import GCPSTSServiceAccountUpdateRequestData # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = environ["GCP_STS_ACCOUNT_DATA_ID"] body = GCPSTSServiceAccountUpdateRequest( data=GCPSTSServiceAccountUpdateRequestData( attributes=GCPSTSServiceAccountAttributes( client_email="Test-252bf553ef04b351@example.com", resource_collection_enabled=True, ), id=GCP_STS_ACCOUNT_DATA_ID, type=GCPServiceAccountType.GCP_SERVICE_ACCOUNT, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = 
api_instance.update_gcpsts_account(account_id=GCP_STS_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update STS Service Account returns "OK" response ``` # Update STS Service Account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = ENV["GCP_STS_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ client_email: "Test-252bf553ef04b351@example.com", host_filters: [ "foo:bar", ], }), id: GCP_STS_ACCOUNT_DATA_ID, type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.update_gcpsts_account(GCP_STS_ACCOUNT_DATA_ID, body) ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` # Update STS Service Account returns "OK" response with cloud run revision filters require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = ENV["GCP_STS_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ client_email: "Test-252bf553ef04b351@example.com", cloud_run_revision_filters: [ "merp:derp", ], }), id: GCP_STS_ACCOUNT_DATA_ID, type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.update_gcpsts_account(GCP_STS_ACCOUNT_DATA_ID, body) ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` # Update STS Service Account returns "OK" response with enable resource collection turned on require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new # there is a valid "gcp_sts_account" in the system GCP_STS_ACCOUNT_DATA_ID = ENV["GCP_STS_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequest.new({ data: DatadogAPIClient::V2::GCPSTSServiceAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::GCPSTSServiceAccountAttributes.new({ client_email: "Test-252bf553ef04b351@example.com", resource_collection_enabled: true, }), id: GCP_STS_ACCOUNT_DATA_ID, type: DatadogAPIClient::V2::GCPServiceAccountType::GCP_SERVICE_ACCOUNT, }), }) p api_instance.update_gcpsts_account(GCP_STS_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update STS Service Account returns "OK" response ``` // Update STS Service Account returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequestData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { // there is a valid "gcp_sts_account" in the system let gcp_sts_account_data_id = std::env::var("GCP_STS_ACCOUNT_DATA_ID").unwrap(); let body = GCPSTSServiceAccountUpdateRequest::new().data( GCPSTSServiceAccountUpdateRequestData::new() .attributes( GCPSTSServiceAccountAttributes::new() .client_email("Test-252bf553ef04b351@example.com".to_string()) .host_filters(vec!["foo:bar".to_string()]), ) .id(gcp_sts_account_data_id.clone()) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api .update_gcpsts_account(gcp_sts_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` // Update STS Service Account returns "OK" response with cloud run revision filters use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequestData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { // there is a valid "gcp_sts_account" in the system let gcp_sts_account_data_id = std::env::var("GCP_STS_ACCOUNT_DATA_ID").unwrap(); let body = GCPSTSServiceAccountUpdateRequest::new().data( GCPSTSServiceAccountUpdateRequestData::new() .attributes( GCPSTSServiceAccountAttributes::new() .client_email("Test-252bf553ef04b351@example.com".to_string()) .cloud_run_revision_filters(vec!["merp:derp".to_string()]), ) .id(gcp_sts_account_data_id.clone()) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api .update_gcpsts_account(gcp_sts_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` // Update STS Service Account returns "OK" response with enable resource // collection turned on use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountAttributes; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequest; use datadog_api_client::datadogV2::model::GCPSTSServiceAccountUpdateRequestData; use datadog_api_client::datadogV2::model::GCPServiceAccountType; #[tokio::main] async fn main() { // there is a valid "gcp_sts_account" in the system let gcp_sts_account_data_id = std::env::var("GCP_STS_ACCOUNT_DATA_ID").unwrap(); let body = GCPSTSServiceAccountUpdateRequest::new().data( GCPSTSServiceAccountUpdateRequestData::new() .attributes( GCPSTSServiceAccountAttributes::new() 
.client_email("Test-252bf553ef04b351@example.com".to_string()) .resource_collection_enabled(true), ) .id(gcp_sts_account_data_id.clone()) .type_(GCPServiceAccountType::GCP_SERVICE_ACCOUNT), ); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api .update_gcpsts_account(gcp_sts_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update STS Service Account returns "OK" response ``` /** * Update STS Service Account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); // there is a valid "gcp_sts_account" in the system const GCP_STS_ACCOUNT_DATA_ID = process.env.GCP_STS_ACCOUNT_DATA_ID as string; const params: v2.GCPIntegrationApiUpdateGCPSTSAccountRequest = { body: { data: { attributes: { clientEmail: "Test-252bf553ef04b351@example.com", hostFilters: ["foo:bar"], }, id: GCP_STS_ACCOUNT_DATA_ID, type: "gcp_service_account", }, }, accountId: GCP_STS_ACCOUNT_DATA_ID, }; apiInstance .updateGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update STS Service Account returns "OK" response with cloud run revision filters ``` /** * Update STS Service Account returns "OK" response with cloud run revision filters */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); // there is a valid "gcp_sts_account" in the system const GCP_STS_ACCOUNT_DATA_ID = process.env.GCP_STS_ACCOUNT_DATA_ID as string; const params: v2.GCPIntegrationApiUpdateGCPSTSAccountRequest = { body: { data: { attributes: { clientEmail: "Test-252bf553ef04b351@example.com", cloudRunRevisionFilters: ["merp:derp"], }, id: GCP_STS_ACCOUNT_DATA_ID, type: "gcp_service_account", }, }, accountId: GCP_STS_ACCOUNT_DATA_ID, }; apiInstance .updateGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update STS Service Account returns "OK" response with enable resource collection turned on ``` /** * Update STS Service Account returns "OK" response with enable resource collection turned on */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); // there is a valid "gcp_sts_account" in the system const GCP_STS_ACCOUNT_DATA_ID = process.env.GCP_STS_ACCOUNT_DATA_ID as string; const params: v2.GCPIntegrationApiUpdateGCPSTSAccountRequest = { body: { data: { attributes: { clientEmail: "Test-252bf553ef04b351@example.com", resourceCollectionEnabled: true, }, id: GCP_STS_ACCOUNT_DATA_ID, type: "gcp_service_account", }, }, accountId: GCP_STS_ACCOUNT_DATA_ID, }; apiInstance .updateGCPSTSAccount(params) .then((data: v2.GCPSTSServiceAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Datadog GCP principal](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-datadog-gcp-principal) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#create-a-datadog-gcp-principal-v2) POST https://api.ap1.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.ap2.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.datadoghq.eu/api/v2/integration/gcp/sts_delegatehttps://api.ddog-gov.com/api/v2/integration/gcp/sts_delegatehttps://api.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.us3.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.us5.datadoghq.com/api/v2/integration/gcp/sts_delegate ### Overview Create a Datadog GCP principal. This endpoint requires the `gcp_configuration_edit` permission. ### Request #### Body Data Create a delegate service account within Datadog. * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Expand All Field Type Description No request body ``` {} ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#MakeGCPSTSDelegate-200-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#MakeGCPSTSDelegate-403-v2) * [409](https://docs.datadoghq.com/api/latest/gcp-integration/#MakeGCPSTSDelegate-409-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#MakeGCPSTSDelegate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Your delegate service account response data. Field Type Description data object Datadog principal service account info. attributes object Your delegate account attributes. delegate_account_email string Your organization's Datadog principal email address. id string The ID of the delegate service account. type enum The type of account. 
Allowed enum values: `gcp_sts_delegate` default: `gcp_sts_delegate` ``` { "data": { "attributes": { "delegate_account_email": "ddgci-1a19n28hb1a812221893@datadog-gci-sts-us5-prod.iam.gserviceaccount.com" }, "id": "ddgci-1a19n28hb1a812221893@datadog-gci-sts-us5-prod.iam.gserviceaccount.com", "type": "gcp_sts_delegate" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### Create a Datadog GCP principal with empty body returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/sts_delegate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` // Create a Datadog GCP principal with empty body returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := new(interface{}) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.MakeGCPSTSDelegate(ctx, *datadogV2.NewMakeGCPSTSDelegateOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.MakeGCPSTSDelegate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.MakeGCPSTSDelegate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` // Create a Datadog GCP principal with empty body returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.api.GcpIntegrationApi.MakeGCPSTSDelegateOptionalParameters; import com.datadog.api.client.v2.model.GCPSTSDelegateAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); Object body = new Object(); try { GCPSTSDelegateAccountResponse result = apiInstance.makeGCPSTSDelegate(new MakeGCPSTSDelegateOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#makeGCPSTSDelegate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` """ Create a Datadog GCP principal with empty body returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi body = dict() configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.make_gcpsts_delegate(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` # Create a Datadog GCP principal with empty body returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new body = {} opts = { body: body, } p api_instance.make_gcpsts_delegate(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` // Create a Datadog GCP principal with empty body returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; use 
datadog_api_client::datadogV2::api_gcp_integration::MakeGCPSTSDelegateOptionalParams; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = BTreeMap::new(); let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api .make_gcpsts_delegate(MakeGCPSTSDelegateOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Datadog GCP principal with empty body returns "OK" response ``` /** * Create a Datadog GCP principal with empty body returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); const params: v2.GCPIntegrationApiMakeGCPSTSDelegateRequest = { body: {}, }; apiInstance .makeGCPSTSDelegate(params) .then((data: v2.GCPSTSDelegateAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List delegate account](https://docs.datadoghq.com/api/latest/gcp-integration/#list-delegate-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/gcp-integration/#list-delegate-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.ap2.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.datadoghq.eu/api/v2/integration/gcp/sts_delegatehttps://api.ddog-gov.com/api/v2/integration/gcp/sts_delegatehttps://api.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.us3.datadoghq.com/api/v2/integration/gcp/sts_delegatehttps://api.us5.datadoghq.com/api/v2/integration/gcp/sts_delegate ### Overview List your Datadog-GCP STS delegate account configured in your Datadog account. This endpoint requires the `gcp_configuration_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/gcp-integration/#GetGCPSTSDelegate-200-v2) * [403](https://docs.datadoghq.com/api/latest/gcp-integration/#GetGCPSTSDelegate-403-v2) * [429](https://docs.datadoghq.com/api/latest/gcp-integration/#GetGCPSTSDelegate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) Your delegate service account response data. Field Type Description data object Datadog principal service account info. attributes object Your delegate account attributes. delegate_account_email string Your organization's Datadog principal email address. id string The ID of the delegate service account. type enum The type of account. 
Allowed enum values: `gcp_sts_delegate` default: `gcp_sts_delegate` ``` { "data": { "attributes": { "delegate_account_email": "ddgci-1a19n28hb1a812221893@datadog-gci-sts-us5-prod.iam.gserviceaccount.com" }, "id": "ddgci-1a19n28hb1a812221893@datadog-gci-sts-us5-prod.iam.gserviceaccount.com", "type": "gcp_sts_delegate" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/gcp-integration/) * [Example](https://docs.datadoghq.com/api/latest/gcp-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/gcp-integration/?code-lang=typescript) ##### List delegate account Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/gcp/sts_delegate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List delegate account ``` """ List delegate account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.gcp_integration_api import GCPIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = GCPIntegrationApi(api_client) response = api_instance.get_gcpsts_delegate() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List delegate account ``` # List delegate account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::GCPIntegrationAPI.new p api_instance.get_gcpsts_delegate() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List delegate account ``` // List delegate account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewGCPIntegrationApi(apiClient) resp, r, err := api.GetGCPSTSDelegate(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `GCPIntegrationApi.GetGCPSTSDelegate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `GCPIntegrationApi.GetGCPSTSDelegate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List delegate account ``` // List delegate account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.GcpIntegrationApi; import com.datadog.api.client.v2.model.GCPSTSDelegateAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); GcpIntegrationApi apiInstance = new GcpIntegrationApi(defaultClient); try { GCPSTSDelegateAccountResponse result = apiInstance.getGCPSTSDelegate(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling GcpIntegrationApi#getGCPSTSDelegate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List delegate account ``` // List delegate account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_gcp_integration::GCPIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = GCPIntegrationAPI::with_config(configuration); let resp = api.get_gcpsts_delegate().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List delegate account ``` /** * List delegate account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.GCPIntegrationApi(configuration); apiInstance .getGCPSTSDelegate() .then((data: v2.GCPSTSDelegateAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=badfaab2-ff12-4dd0-91e8-24898e3e64e5&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=ddd7bc7d-531f-40a1-a03f-bcc567f6462b&pt=GCP%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fgcp-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=badfaab2-ff12-4dd0-91e8-24898e3e64e5&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=ddd7bc7d-531f-40a1-a03f-bcc567f6462b&pt=GCP%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fgcp-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=e0cc25ea-5c1f-4896-9231-d6936e2237d1&bo=2&sid=b77a5fe0f0bf11f0bab0291dd4b5f9fd&vid=b77a7130f0bf11f0b69eb7008fe8e23b&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=GCP%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fgcp-integration%2F&r=<=2515&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=547360) Feedback ## Was this page helpful? Yes 🎉 No 👎 Next ![](https://survey-images.hotjar.com/surveys/logo/90f40352a7464c849f5ce82ccd0e758d) --- # Source: https://docs.datadoghq.com/api/latest/hosts # Hosts Get information about your infrastructure hosts in Datadog, and mute or unmute any notifications from your hosts. See the [Infrastructure page](https://docs.datadoghq.com/infrastructure/) for more information. ## [Get all hosts for your organization](https://docs.datadoghq.com/api/latest/hosts/#get-all-hosts-for-your-organization) * [v1 (latest)](https://docs.datadoghq.com/api/latest/hosts/#get-all-hosts-for-your-organization-v1) GET https://api.ap1.datadoghq.com/api/v1/hostshttps://api.ap2.datadoghq.com/api/v1/hostshttps://api.datadoghq.eu/api/v1/hostshttps://api.ddog-gov.com/api/v1/hostshttps://api.datadoghq.com/api/v1/hostshttps://api.us3.datadoghq.com/api/v1/hostshttps://api.us5.datadoghq.com/api/v1/hosts ### Overview This endpoint allows searching for hosts by name, alias, or tag. Hosts live within the past 3 hours are included by default. Retention is 7 days. Results are paginated with a max of 1000 results at a time. **Note:** If the host is an Amazon EC2 instance, `id` is replaced with `aws_id` in the response. **Note** : To enrich the data returned by this endpoint with security scans, see the new [api/v2/security/scanned-assets-metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#list-scanned-assets-metadata) endpoint. This endpoint requires the `hosts_read` permission. OAuth apps require the `hosts_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#hosts) to access this endpoint. 
### Arguments #### Query Strings Name Type Description filter string String to filter search results. sort_field string Sort hosts by this field. sort_dir string Direction of sort. Options include `asc` and `desc`. start integer Specify the starting point for the host search results. For example, if you set `count` to 100 and the first 100 results have already been returned, you can set `start` to `101` to get the next 100 results. count integer Number of hosts to return. Max 1000. from integer Number of seconds since UNIX epoch from which you want to search your hosts. include_muted_hosts_data boolean Include information on the muted status of hosts and when the mute expires. include_hosts_metadata boolean Include additional metadata about the hosts (agent_version, machine, platform, processor, etc.). ### Response * [200](https://docs.datadoghq.com/api/latest/hosts/#ListHosts-200-v1) * [400](https://docs.datadoghq.com/api/latest/hosts/#ListHosts-400-v1) * [403](https://docs.datadoghq.com/api/latest/hosts/#ListHosts-403-v1) * [429](https://docs.datadoghq.com/api/latest/hosts/#ListHosts-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Response with Host information from Datadog. Field Type Description host_list [object] Array of hosts. aliases [string] Host aliases collected by Datadog. apps [string] The Datadog integrations reporting metrics for the host. aws_name string AWS name of your host. host_name string The host name. id int64 The host ID. is_muted boolean If a host is muted or unmuted. last_reported_time int64 Last time the host reported a metric data point. meta object Metadata associated with your host. agent_checks [array] A list of Agent checks running on the host. agent_version string The Datadog Agent version. cpuCores int64 The number of cores. fbsdV [] An array of FreeBSD versions. gohai string JSON string containing system information. install_method object Agent install method. installer_version string The installer version. tool string Tool used to install the agent. tool_version string The tool version. macV [] An array of Mac versions. machine string The machine architecture. nixV [] Array of Unix versions. platform string The OS platform. processor string The processor. pythonV string The Python version. socket-fqdn string The socket fqdn. socket-hostname string The socket hostname. winV [] An array of Windows versions. metrics object Host Metrics collected. cpu double The percent of CPU used (everything but idle). iowait double The percent of CPU spent waiting on the IO (not reported for all platforms). load double The system load over the last 15 minutes. mute_timeout int64 Timeout of the mute applied to your host. name string The host name. sources [string] Source or cloud provider associated with your host. tags_by_source object List of tags for each source (AWS, Datadog Agent, Chef, etc.). [string] Array of tags for a single source. up boolean Displays UP when the expected metrics are received and displays `???` if no metrics are received. total_matching int64 Number of hosts matching the query. total_returned int64 Number of hosts returned.
``` { "host_list": [ { "aliases": [ "mycoolhost-1" ], "apps": [ "agent" ], "aws_name": "mycoolhost-1", "host_name": "i-deadbeef", "id": 123456, "is_muted": false, "last_reported_time": 1565000000, "meta": { "agent_checks": [ "ntp", "ntp", "ntp:d884b5186b651429", "OK", "", "" ], "agent_version": "7.32.3", "cpuCores": 1, "fbsdV": [ "FreeBSD" ], "gohai": "{\"cpu\":{\"cache_size\":\"8192 KB\",\"cpu_cores\":\"1\",\"cpu_logical_processors\":\"1\",\"family\":\"6\",\"mhz\":\"2712.000\",\"model\":\"142\",\"model_name\":\"Intel(R) Core(TM) i7-8559U CPU @ 2.70GHz\",\"stepping\":\"10\",\"vendor_id\":\"GenuineIntel\"},\"filesystem\":[{\"kb_size\":\"3966896\",\"mounted_on\":\"/dev\",\"name\":\"udev\"},{\"kb_size\":\"797396\",\"mounted_on\":\"/run\",\"name\":\"tmpfs\"},{\"kb_size\":\"64800356\",\"mounted_on\":\"/\",\"name\":\"/dev/mapper/vagrant--vg-root\"},{\"kb_size\":\"3986972\",\"mounted_on\":\"/dev/shm\",\"name\":\"tmpfs\"},{\"kb_size\":\"5120\",\"mounted_on\":\"/run/lock\",\"name\":\"tmpfs\"},{\"kb_size\":\"3986972\",\"mounted_on\":\"/sys/fs/cgroup\",\"name\":\"tmpfs\"},{\"kb_size\":\"488245288\",\"mounted_on\":\"/vagrant\",\"name\":\"vagrant\"},{\"kb_size\":\"797392\",\"mounted_on\":\"/run/user/1000\",\"name\":\"tmpfs\"}],\"memory\":{\"swap_total\":\"1003516kB\",\"total\":\"7973944kB\"},\"network\":{\"interfaces\":[{\"ipv4\":\"10.0.2.15\",\"ipv4-network\":\"10.0.2.0/24\",\"ipv6\":\"fe80::a00:27ff:fec2:be11\",\"ipv6-network\":\"fe80::/64\",\"macaddress\":\"08:00:27:c2:be:11\",\"name\":\"eth0\"},{\"ipv4\":\"192.168.122.1\",\"ipv4-network\":\"192.168.122.0/24\",\"macaddress\":\"52:54:00:6f:1c:bf\",\"name\":\"virbr0\"}],\"ipaddress\":\"10.0.2.15\",\"ipaddressv6\":\"fe80::a00:27ff:fec2:be11\",\"macaddress\":\"08:00:27:c2:be:11\"},\"platform\":{\"GOOARCH\":\"amd64\",\"GOOS\":\"linux\",\"goV\":\"1.16.7\",\"hardware_platform\":\"x86_64\",\"hostname\":\"vagrant\",\"kernel_name\":\"Linux\",\"kernel_release\":\"4.15.0-29-generic\",\"kernel_version\":\"#31-Ubuntu SMP Tue Jul 17 15:39:52 UTC 2018\",\"machine\":\"x86_64\",\"os\":\"GNU/Linux\",\"processor\":\"x86_64\",\"pythonV\":\"2.7.15rc1\"}}", "install_method": { "installer_version": "install_script-1.7.1", "tool": "install_script", "tool_version": "install_script" }, "macV": [ "Mac" ], "machine": "amd64", "nixV": [ "Ubuntu" ], "platform": "linux", "processor": "Intel(R) Core(TM) i7-8559U CPU @ 2.70GHz", "pythonV": "3.8.11", "socket-fqdn": "vagrant.vm.", "socket-hostname": "vagrant", "winV": [ "Windows" ] }, "metrics": { "cpu": 99, "iowait": 3.2, "load": 0.5 }, "mute_timeout": "integer", "name": "i-hostname", "sources": [ "aws" ], "tags_by_source": { "": [ "test.example.com.host" ] }, "up": true } ], "total_matching": 1, "total_returned": 1 } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/hosts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/hosts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/hosts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/hosts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/hosts/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python-legacy) ##### Get all hosts for your organization Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/hosts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all hosts for your organization ``` """ Get all hosts for your organization returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.hosts_api import HostsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = HostsApi(api_client) response = api_instance.list_hosts( filter="env:ci", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all hosts for your organization ``` # Get all hosts for your organization returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::HostsAPI.new opts = { filter: "env:ci", } p api_instance.list_hosts(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all hosts for your organization ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.search_hosts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all hosts for your organization ``` // Get all hosts for your organization returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration 
:= datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewHostsApi(apiClient) resp, r, err := api.ListHosts(ctx, *datadogV1.NewListHostsOptionalParameters().WithFilter("env:ci")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `HostsApi.ListHosts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `HostsApi.ListHosts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all hosts for your organization ``` // Get all hosts for your organization returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.HostsApi; import com.datadog.api.client.v1.api.HostsApi.ListHostsOptionalParameters; import com.datadog.api.client.v1.model.HostListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); HostsApi apiInstance = new HostsApi(defaultClient); try { HostListResponse result = apiInstance.listHosts(new ListHostsOptionalParameters().filter("env:ci")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling HostsApi#listHosts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all hosts for your organization ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.Hosts.search() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get all hosts for your organization ``` // Get all hosts for your organization returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_hosts::HostsAPI; use datadog_api_client::datadogV1::api_hosts::ListHostsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = HostsAPI::with_config(configuration); let resp = api .list_hosts(ListHostsOptionalParams::default().filter("env:ci".to_string())) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all hosts for your organization ``` /** * Get all hosts for your organization returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.HostsApi(configuration); const params: v1.HostsApiListHostsRequest = { filter: "env:ci", }; apiInstance .listHosts(params) .then((data: v1.HostListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the total number of active hosts](https://docs.datadoghq.com/api/latest/hosts/#get-the-total-number-of-active-hosts) * [v1 (latest)](https://docs.datadoghq.com/api/latest/hosts/#get-the-total-number-of-active-hosts-v1) GET https://api.ap1.datadoghq.com/api/v1/hosts/totalshttps://api.ap2.datadoghq.com/api/v1/hosts/totalshttps://api.datadoghq.eu/api/v1/hosts/totalshttps://api.ddog-gov.com/api/v1/hosts/totalshttps://api.datadoghq.com/api/v1/hosts/totalshttps://api.us3.datadoghq.com/api/v1/hosts/totalshttps://api.us5.datadoghq.com/api/v1/hosts/totals ### Overview This endpoint returns the total number of active and up hosts in your Datadog account. Active means the host has reported in the past hour, and up means it has reported in the past two hours. This endpoint requires the `hosts_read` permission. OAuth apps require the `hosts_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#hosts) to access this endpoint. ### Arguments #### Query Strings Name Type Description from integer Number of seconds from which you want to get total number of active hosts. ### Response * [200](https://docs.datadoghq.com/api/latest/hosts/#GetHostTotals-200-v1) * [400](https://docs.datadoghq.com/api/latest/hosts/#GetHostTotals-400-v1) * [403](https://docs.datadoghq.com/api/latest/hosts/#GetHostTotals-403-v1) * [429](https://docs.datadoghq.com/api/latest/hosts/#GetHostTotals-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Total number of host currently monitored by Datadog. Expand All Field Type Description total_active int64 Total number of active host (UP and ???) reporting to Datadog. total_up int64 Number of host that are UP and reporting to Datadog. ``` { "total_active": "integer", "total_up": "integer" } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/hosts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/hosts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/hosts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/hosts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/hosts/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python-legacy) ##### Get the total number of active hosts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/hosts/totals" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the total number of active hosts ``` """ Get the total number of active hosts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.hosts_api import HostsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = HostsApi(api_client) response = api_instance.get_host_totals() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the total number of active hosts ``` # Get the total number of active hosts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::HostsAPI.new p api_instance.get_host_totals() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the total number of active hosts ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.host_totals() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the total number of active hosts ``` // Get the total number of active hosts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewHostsApi(apiClient) resp, r, err := api.GetHostTotals(ctx, *datadogV1.NewGetHostTotalsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `HostsApi.GetHostTotals`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `HostsApi.GetHostTotals`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the total number of active hosts ``` // Get the total number of active hosts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.HostsApi; import com.datadog.api.client.v1.model.HostTotals; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); HostsApi apiInstance = new HostsApi(defaultClient); try { HostTotals result = apiInstance.getHostTotals(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling HostsApi#getHostTotals"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the total number of active hosts ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.Hosts.totals() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get the total number of active hosts ``` // Get the total number of active hosts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_hosts::GetHostTotalsOptionalParams; use datadog_api_client::datadogV1::api_hosts::HostsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = HostsAPI::with_config(configuration); let resp = api .get_host_totals(GetHostTotalsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the total number of active hosts ``` /** * Get the total number of active hosts returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.HostsApi(configuration); apiInstance .getHostTotals() .then((data: v1.HostTotals) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Mute a host](https://docs.datadoghq.com/api/latest/hosts/#mute-a-host) * [v1 (latest)](https://docs.datadoghq.com/api/latest/hosts/#mute-a-host-v1) POST https://api.ap1.datadoghq.com/api/v1/host/{host_name}/mutehttps://api.ap2.datadoghq.com/api/v1/host/{host_name}/mutehttps://api.datadoghq.eu/api/v1/host/{host_name}/mutehttps://api.ddog-gov.com/api/v1/host/{host_name}/mutehttps://api.datadoghq.com/api/v1/host/{host_name}/mutehttps://api.us3.datadoghq.com/api/v1/host/{host_name}/mutehttps://api.us5.datadoghq.com/api/v1/host/{host_name}/mute ### Overview Mute a host. **Note:** This creates a [Downtime V2](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) for the host. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string Name of the host to mute. ### Request #### Body Data (required) Mute a host request body. * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Expand All Field Type Description end int64 POSIX timestamp in seconds when the host is unmuted. If omitted, the host remains muted until explicitly unmuted. message string Message to associate with the muting of this host. override boolean If true and the host is already muted, replaces existing host mute settings. ``` { "end": 1579098130, "message": "Muting this host for a test!", "override": false } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/hosts/#MuteHost-200-v1) * [400](https://docs.datadoghq.com/api/latest/hosts/#MuteHost-400-v1) * [403](https://docs.datadoghq.com/api/latest/hosts/#MuteHost-403-v1) * [429](https://docs.datadoghq.com/api/latest/hosts/#MuteHost-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Response with the list of muted host for your organization. Expand All Field Type Description action string Action applied to the hosts. end int64 POSIX timestamp in seconds when the host is unmuted. hostname string The host name. message string Message associated with the mute. ``` { "action": "Muted", "end": 1579098130, "hostname": "test.host", "message": "Muting this host for a test!" } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/hosts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/hosts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/hosts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/hosts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/hosts/?code-lang=typescript) ##### Mute a host Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/host/${host_name}/mute" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Mute a host ``` """ Mute a host returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.hosts_api import HostsApi from datadog_api_client.v1.model.host_mute_settings import HostMuteSettings body = HostMuteSettings( end=1579098130, message="Muting this host for a test!", override=False, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = HostsApi(api_client) response = api_instance.mute_host(host_name="host_name", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Mute a host ``` # Mute a host returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::HostsAPI.new body = DatadogAPIClient::V1::HostMuteSettings.new({ _end: 1579098130, message: "Muting this host for a test!", override: false, }) p api_instance.mute_host("host_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Mute a host ``` // Mute a host returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.HostMuteSettings{ End: datadog.PtrInt64(1579098130), Message: datadog.PtrString("Muting this host for a test!"), Override: datadog.PtrBool(false), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewHostsApi(apiClient) resp, r, err := api.MuteHost(ctx, "host_name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `HostsApi.MuteHost`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `HostsApi.MuteHost`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Mute a host ``` // Mute a host returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.HostsApi; import com.datadog.api.client.v1.model.HostMuteResponse; import com.datadog.api.client.v1.model.HostMuteSettings; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); HostsApi apiInstance = new HostsApi(defaultClient); HostMuteSettings body = new HostMuteSettings() .end(1579098130L) .message("Muting this host for a test!") .override(false); try { HostMuteResponse result = apiInstance.muteHost("host_name", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling HostsApi#muteHost"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Mute a host ``` // Mute a host returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_hosts::HostsAPI; use datadog_api_client::datadogV1::model::HostMuteSettings; #[tokio::main] async fn main() { let body = HostMuteSettings::new() .end(1579098130) .message("Muting this host for a test!".to_string()) .override_(false); let configuration = datadog::Configuration::new(); let api = HostsAPI::with_config(configuration); let resp = api.mute_host("host_name".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Mute a host ``` /** * Mute a host returns 
"OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.HostsApi(configuration); const params: v1.HostsApiMuteHostRequest = { body: { end: 1579098130, message: "Muting this host for a test!", override: false, }, hostName: "host_name", }; apiInstance .muteHost(params) .then((data: v1.HostMuteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unmute a host](https://docs.datadoghq.com/api/latest/hosts/#unmute-a-host) * [v1 (latest)](https://docs.datadoghq.com/api/latest/hosts/#unmute-a-host-v1) POST https://api.ap1.datadoghq.com/api/v1/host/{host_name}/unmutehttps://api.ap2.datadoghq.com/api/v1/host/{host_name}/unmutehttps://api.datadoghq.eu/api/v1/host/{host_name}/unmutehttps://api.ddog-gov.com/api/v1/host/{host_name}/unmutehttps://api.datadoghq.com/api/v1/host/{host_name}/unmutehttps://api.us3.datadoghq.com/api/v1/host/{host_name}/unmutehttps://api.us5.datadoghq.com/api/v1/host/{host_name}/unmute ### Overview Unmutes a host. This endpoint takes no JSON arguments. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string Name of the host to unmute. ### Response * [200](https://docs.datadoghq.com/api/latest/hosts/#UnmuteHost-200-v1) * [400](https://docs.datadoghq.com/api/latest/hosts/#UnmuteHost-400-v1) * [403](https://docs.datadoghq.com/api/latest/hosts/#UnmuteHost-403-v1) * [429](https://docs.datadoghq.com/api/latest/hosts/#UnmuteHost-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Response with the list of muted host for your organization. Expand All Field Type Description action string Action applied to the hosts. end int64 POSIX timestamp in seconds when the host is unmuted. hostname string The host name. message string Message associated with the mute. ``` { "action": "Muted", "end": 1579098130, "hostname": "test.host", "message": "Muting this host for a test!" } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/hosts/) * [Example](https://docs.datadoghq.com/api/latest/hosts/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/hosts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/hosts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/hosts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/hosts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/hosts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/hosts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/hosts/?code-lang=typescript) ##### Unmute a host Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/host/${host_name}/unmute" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Unmute a host ``` """ Unmute a host returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.hosts_api import HostsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = HostsApi(api_client) response = api_instance.unmute_host( host_name="host_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Unmute a host ``` # Unmute a host returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::HostsAPI.new p api_instance.unmute_host("host_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Unmute a host ``` // Unmute a host returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewHostsApi(apiClient) resp, r, err := api.UnmuteHost(ctx, "host_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `HostsApi.UnmuteHost`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `HostsApi.UnmuteHost`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Unmute a host ``` // Unmute a host returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.HostsApi; import com.datadog.api.client.v1.model.HostMuteResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); HostsApi apiInstance = new HostsApi(defaultClient); try { HostMuteResponse result = apiInstance.unmuteHost("host_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling HostsApi#unmuteHost"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Unmute a host ``` // Unmute a host returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_hosts::HostsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = HostsAPI::with_config(configuration); let resp = api.unmute_host("host_name".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Unmute a host ``` /** * Unmute a host returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.HostsApi(configuration); const params: v1.HostsApiUnmuteHostRequest = { hostName: "host_name", }; apiInstance .unmuteHost(params) .then((data: v1.HostMuteResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=0ad00c34-7baa-49c4-9189-f56e1e5b67cc&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=a7b8ed01-8ba2-4ae8-9d2e-178e6d027ddf&pt=Hosts&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fhosts%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=0ad00c34-7baa-49c4-9189-f56e1e5b67cc&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=a7b8ed01-8ba2-4ae8-9d2e-178e6d027ddf&pt=Hosts&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fhosts%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=a0b328a9-4b82-4c9c-b0e9-63cebcadf5cf&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Hosts&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fhosts%2F&r=&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=317137) --- # Source: https://docs.datadoghq.com/api/latest/incident-services # Incident Services Create, update, delete, and retrieve services which can be associated with incidents. See the [Incident Management page](https://docs.datadoghq.com/service_management/incident_management/) for more information. ## [Get details of an incident service](https://docs.datadoghq.com/api/latest/incident-services/#get-details-of-an-incident-service) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-services/#get-details-of-an-incident-service-v2) **Note** : This endpoint is deprecated. GET https://api.ap1.datadoghq.com/api/v2/services/{service_id}https://api.ap2.datadoghq.com/api/v2/services/{service_id}https://api.datadoghq.eu/api/v2/services/{service_id}https://api.ddog-gov.com/api/v2/services/{service_id}https://api.datadoghq.com/api/v2/services/{service_id}https://api.us3.datadoghq.com/api/v2/services/{service_id}https://api.us5.datadoghq.com/api/v2/services/{service_id} ### Overview Get details of an incident service. If the `include[users]` query parameter is provided, the included attribute will contain the users related to these incident services. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-services) to access this endpoint. ### Arguments #### Path Parameters Name Type Description service_id [_required_] string The ID of the incident service. #### Query Strings Name Type Description include enum Specifies which types of related objects should be included in the response. 
Allowed enum values: `users, attachments` ### Response * [200](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-services/#GetIncidentService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Response with an incident service payload. Field Type Description data [_required_] object Incident Service data from responses. attributes object The incident service's attributes from a response. created date-time Timestamp of when the incident service was created. modified date-time Timestamp of when the incident service was modified. name string Name of the incident service. id [_required_] string The incident service's ID. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. Allowed enum values: `services` default: `services` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. 
data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "service name" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "services" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=typescript) ##### Get details of an incident service Copy ``` # Path parameters export service_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/${service_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get details of an incident service ``` """ Get details of an incident service returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi # there is a valid "service" in the system SERVICE_DATA_ID = environ["SERVICE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_incident_service"] = True with ApiClient(configuration) as api_client: api_instance = IncidentServicesApi(api_client) response = api_instance.get_incident_service( service_id=SERVICE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get details of an incident service ``` # Get details of an incident service returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_incident_service".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentServicesAPI.new # there is a valid "service" in the system SERVICE_DATA_ID = ENV["SERVICE_DATA_ID"] p api_instance.get_incident_service(SERVICE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get details of an incident service ``` // Get details of an incident service returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service" in the system ServiceDataID := os.Getenv("SERVICE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentService", true) apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewIncidentServicesApi(apiClient) resp, r, err := api.GetIncidentService(ctx, ServiceDataID, *datadogV2.NewGetIncidentServiceOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentServicesApi.GetIncidentService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentServicesApi.GetIncidentService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get details of an incident service ``` // Get details of an incident service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentServicesApi; import com.datadog.api.client.v2.model.IncidentServiceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentService", true); IncidentServicesApi apiInstance = new IncidentServicesApi(defaultClient); // there is a valid "service" in the system String SERVICE_DATA_ID = System.getenv("SERVICE_DATA_ID"); try { IncidentServiceResponse result = apiInstance.getIncidentService(SERVICE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentServicesApi#getIncidentService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get details of an incident service ``` // Get details of an incident service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_services::GetIncidentServiceOptionalParams; use datadog_api_client::datadogV2::api_incident_services::IncidentServicesAPI; #[tokio::main] async fn main() { // there is a valid "service" in the system let service_data_id = std::env::var("SERVICE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentService", true); let api = IncidentServicesAPI::with_config(configuration); let resp = api .get_incident_service( service_data_id.clone(), GetIncidentServiceOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" 
cargo run ``` ##### Get details of an incident service ``` /** * Get details of an incident service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentService"] = true; const apiInstance = new v2.IncidentServicesApi(configuration); // there is a valid "service" in the system const SERVICE_DATA_ID = process.env.SERVICE_DATA_ID as string; const params: v2.IncidentServicesApiGetIncidentServiceRequest = { serviceId: SERVICE_DATA_ID, }; apiInstance .getIncidentService(params) .then((data: v2.IncidentServiceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing incident service](https://docs.datadoghq.com/api/latest/incident-services/#delete-an-existing-incident-service) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-services/#delete-an-existing-incident-service-v2) **Note**: This endpoint is deprecated. DELETE https://api.ap1.datadoghq.com/api/v2/services/{service_id} https://api.ap2.datadoghq.com/api/v2/services/{service_id} https://api.datadoghq.eu/api/v2/services/{service_id} https://api.ddog-gov.com/api/v2/services/{service_id} https://api.datadoghq.com/api/v2/services/{service_id} https://api.us3.datadoghq.com/api/v2/services/{service_id} https://api.us5.datadoghq.com/api/v2/services/{service_id} ### Overview Deletes an existing incident service. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-services) to access this endpoint. ### Arguments #### Path Parameters Name Type Description service_id [_required_] string The ID of the incident service. ### Response * [204](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-204-v2) * [400](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-services/#DeleteIncidentService-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=typescript) ##### Delete an existing incident service Copy ``` # Path parameters export service_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/${service_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing incident service ``` """ Delete an existing incident service returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi # there is a valid "service" in the system SERVICE_DATA_ID = environ["SERVICE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_incident_service"] = True with ApiClient(configuration) as api_client: api_instance = IncidentServicesApi(api_client) api_instance.delete_incident_service( service_id=SERVICE_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing incident service ``` # Delete an existing incident service returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_service".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentServicesAPI.new # there is a valid "service" in the system SERVICE_DATA_ID = ENV["SERVICE_DATA_ID"] api_instance.delete_incident_service(SERVICE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and 
then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete an existing incident service ``` // Delete an existing incident service returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service" in the system ServiceDataID := os.Getenv("SERVICE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentService", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentServicesApi(apiClient) r, err := api.DeleteIncidentService(ctx, ServiceDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentServicesApi.DeleteIncidentService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing incident service ``` // Delete an existing incident service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentServicesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentService", true); IncidentServicesApi apiInstance = new IncidentServicesApi(defaultClient); // there is a valid "service" in the system String SERVICE_DATA_ID = System.getenv("SERVICE_DATA_ID"); try { apiInstance.deleteIncidentService(SERVICE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentServicesApi#deleteIncidentService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing incident service ``` // Delete an existing incident service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_services::IncidentServicesAPI; #[tokio::main] async fn main() { // there is a valid "service" in the system let service_data_id = std::env::var("SERVICE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentService", true); let api = IncidentServicesAPI::with_config(configuration); let resp = api.delete_incident_service(service_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
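// The delete call failed; print the error returned by the API.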
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing incident service ``` /** * Delete an existing incident service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentService"] = true; const apiInstance = new v2.IncidentServicesApi(configuration); // there is a valid "service" in the system const SERVICE_DATA_ID = process.env.SERVICE_DATA_ID as string; const params: v2.IncidentServicesApiDeleteIncidentServiceRequest = { serviceId: SERVICE_DATA_ID, }; apiInstance .deleteIncidentService(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing incident service](https://docs.datadoghq.com/api/latest/incident-services/#update-an-existing-incident-service) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-services/#update-an-existing-incident-service-v2) **Note** : This endpoint is deprecated. PATCH https://api.ap1.datadoghq.com/api/v2/services/{service_id}https://api.ap2.datadoghq.com/api/v2/services/{service_id}https://api.datadoghq.eu/api/v2/services/{service_id}https://api.ddog-gov.com/api/v2/services/{service_id}https://api.datadoghq.com/api/v2/services/{service_id}https://api.us3.datadoghq.com/api/v2/services/{service_id}https://api.us5.datadoghq.com/api/v2/services/{service_id} ### Overview Updates an existing incident service. Only provide the attributes which should be updated as this request is a partial update. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-services) to access this endpoint. ### Arguments #### Path Parameters Name Type Description service_id [_required_] string The ID of the incident service. ### Request #### Body Data (required) Incident Service Payload. * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Field Type Description data [_required_] object Incident Service payload for update requests. attributes object The incident service's attributes for an update request. name [_required_] string Name of the incident service. id string The incident service's ID. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. 
data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. Allowed enum values: `services` default: `services` ``` { "data": { "type": "services", "attributes": { "name": "service name-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-services/#UpdateIncidentService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Response with an incident service payload. Field Type Description data [_required_] object Incident Service data from responses. attributes object The incident service's attributes from a response. created date-time Timestamp of when the incident service was created. modified date-time Timestamp of when the incident service was modified. name string Name of the incident service. id [_required_] string The incident service's ID. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. Allowed enum values: `services` default: `services` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. 
Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "service name" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "services" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=typescript) ##### Update an existing incident service returns "OK" response Copy ``` # Path parameters export service_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/${service_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "services", "attributes": { "name": "service name-updated" } } } EOF ``` ##### Update an existing incident service returns "OK" response ``` // Update an existing incident service returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service" in the system ServiceDataID := os.Getenv("SERVICE_DATA_ID") body := datadogV2.IncidentServiceUpdateRequest{ Data: datadogV2.IncidentServiceUpdateData{ Type: datadogV2.INCIDENTSERVICETYPE_SERVICES, Attributes: &datadogV2.IncidentServiceUpdateAttributes{ Name: "service name-updated", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentService", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentServicesApi(apiClient) resp, r, err := api.UpdateIncidentService(ctx, ServiceDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentServicesApi.UpdateIncidentService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentServicesApi.UpdateIncidentService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing incident service returns "OK" response ``` // Update an existing incident service returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentServicesApi; import com.datadog.api.client.v2.model.IncidentServiceResponse; import com.datadog.api.client.v2.model.IncidentServiceType; import com.datadog.api.client.v2.model.IncidentServiceUpdateAttributes; import com.datadog.api.client.v2.model.IncidentServiceUpdateData; import 
com.datadog.api.client.v2.model.IncidentServiceUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentService", true); IncidentServicesApi apiInstance = new IncidentServicesApi(defaultClient); // there is a valid "service" in the system String SERVICE_DATA_ATTRIBUTES_NAME = System.getenv("SERVICE_DATA_ATTRIBUTES_NAME"); String SERVICE_DATA_ID = System.getenv("SERVICE_DATA_ID"); IncidentServiceUpdateRequest body = new IncidentServiceUpdateRequest() .data( new IncidentServiceUpdateData() .type(IncidentServiceType.SERVICES) .attributes( new IncidentServiceUpdateAttributes().name("service name-updated"))); try { IncidentServiceResponse result = apiInstance.updateIncidentService(SERVICE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentServicesApi#updateIncidentService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing incident service returns "OK" response ``` """ Update an existing incident service returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi from datadog_api_client.v2.model.incident_service_type import IncidentServiceType from datadog_api_client.v2.model.incident_service_update_attributes import IncidentServiceUpdateAttributes from datadog_api_client.v2.model.incident_service_update_data import IncidentServiceUpdateData from datadog_api_client.v2.model.incident_service_update_request import IncidentServiceUpdateRequest # there is a valid "service" in the system SERVICE_DATA_ATTRIBUTES_NAME = environ["SERVICE_DATA_ATTRIBUTES_NAME"] SERVICE_DATA_ID = environ["SERVICE_DATA_ID"] body = IncidentServiceUpdateRequest( data=IncidentServiceUpdateData( type=IncidentServiceType.SERVICES, attributes=IncidentServiceUpdateAttributes( name="service name-updated", ), ), ) configuration = Configuration() configuration.unstable_operations["update_incident_service"] = True with ApiClient(configuration) as api_client: api_instance = IncidentServicesApi(api_client) response = api_instance.update_incident_service(service_id=SERVICE_DATA_ID, body=body) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing incident service returns "OK" response ``` # Update an existing incident service returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_service".to_sym] = true end api_instance = 
DatadogAPIClient::V2::IncidentServicesAPI.new # there is a valid "service" in the system SERVICE_DATA_ATTRIBUTES_NAME = ENV["SERVICE_DATA_ATTRIBUTES_NAME"] SERVICE_DATA_ID = ENV["SERVICE_DATA_ID"] body = DatadogAPIClient::V2::IncidentServiceUpdateRequest.new({ data: DatadogAPIClient::V2::IncidentServiceUpdateData.new({ type: DatadogAPIClient::V2::IncidentServiceType::SERVICES, attributes: DatadogAPIClient::V2::IncidentServiceUpdateAttributes.new({ name: "service name-updated", }), }), }) p api_instance.update_incident_service(SERVICE_DATA_ID, body) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Update an existing incident service returns "OK" response ``` // Update an existing incident service returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_services::IncidentServicesAPI; use datadog_api_client::datadogV2::model::IncidentServiceType; use datadog_api_client::datadogV2::model::IncidentServiceUpdateAttributes; use datadog_api_client::datadogV2::model::IncidentServiceUpdateData; use datadog_api_client::datadogV2::model::IncidentServiceUpdateRequest; #[tokio::main] async fn main() { // there is a valid "service" in the system let service_data_id = std::env::var("SERVICE_DATA_ID").unwrap(); let body = IncidentServiceUpdateRequest::new( IncidentServiceUpdateData::new(IncidentServiceType::SERVICES).attributes( IncidentServiceUpdateAttributes::new("service name-updated".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentService", true); let api = IncidentServicesAPI::with_config(configuration); let resp = api .update_incident_service(service_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing incident service returns "OK" response ``` /** * Update an existing incident service returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentService"] = true; const apiInstance = new v2.IncidentServicesApi(configuration); // there is a valid "service" in the system const SERVICE_DATA_ID = process.env.SERVICE_DATA_ID as string; const params: v2.IncidentServicesApiUpdateIncidentServiceRequest = { body: { data: { type: "services", attributes: { name: "service name-updated", }, }, }, serviceId: SERVICE_DATA_ID, }; apiInstance .updateIncidentService(params) .then((data: v2.IncidentServiceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of all incident services](https://docs.datadoghq.com/api/latest/incident-services/#get-a-list-of-all-incident-services) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-services/#get-a-list-of-all-incident-services-v2) **Note** : This endpoint is deprecated. GET https://api.ap1.datadoghq.com/api/v2/serviceshttps://api.ap2.datadoghq.com/api/v2/serviceshttps://api.datadoghq.eu/api/v2/serviceshttps://api.ddog-gov.com/api/v2/serviceshttps://api.datadoghq.com/api/v2/serviceshttps://api.us3.datadoghq.com/api/v2/serviceshttps://api.us5.datadoghq.com/api/v2/services ### Overview Get all incident services uploaded for the requesting user’s organization. If the `include[users]` query parameter is provided, the included attribute will contain the users related to these incident services. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-services) to access this endpoint. ### Arguments #### Query Strings Name Type Description include enum Specifies which types of related objects should be included in the response. Allowed enum values: `users, attachments` page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. filter string A search query that filters services by name. ### Response * [200](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-services/#ListIncidentServices-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Response with a list of incident service payloads. Field Type Description data [_required_] [object] An array of incident services. attributes object The incident service's attributes from a response. created date-time Timestamp of when the incident service was created. modified date-time Timestamp of when the incident service was modified. name string Name of the incident service. id [_required_] string The incident service's ID. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. 
type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. Allowed enum values: `services` default: `services` included [ ] Included related resources which the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
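The `page[size]` and `page[offset]` query strings described above, together with the `meta.pagination` object, are what you use to walk through more than one page of services. The sketch below is a minimal, hypothetical Python loop; the keyword names `page_size` and `page_offset` and the `response.meta.pagination` attribute path are assumptions based on the parameter and field descriptions in this section, so verify them against your client version before relying on it.

```python
"""
Minimal paging sketch for listing incident services.
Assumes the generated `page_size`/`page_offset` keyword arguments and
the `meta.pagination` response field described above.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi

configuration = Configuration()
# The incident services API is unstable, so enable the operation explicitly.
configuration.unstable_operations["list_incident_services"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentServicesApi(api_client)
    offset = 0
    while True:
        # Request one page of services (the maximum allowed page size is 100).
        response = api_instance.list_incident_services(page_size=100, page_offset=offset)
        for service in response.data:
            print(service.id, service.attributes.name)
        # A short page means there are no further results.
        if len(response.data) < 100:
            break
        # Otherwise continue from the offset reported by the API.
        offset = response.meta.pagination.next_offset
```

With curl, the equivalent request would pass the same values directly as `page[size]` and `page[offset]` query parameters on `/api/v2/services`.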
``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "service name" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "services" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=typescript) ##### Get a list of all incident services Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of all incident services ``` """ Get a list of all incident services returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi # there is a valid "service" in the system SERVICE_DATA_ATTRIBUTES_NAME = environ["SERVICE_DATA_ATTRIBUTES_NAME"] configuration = Configuration() configuration.unstable_operations["list_incident_services"] = True with ApiClient(configuration) as api_client: api_instance = IncidentServicesApi(api_client) response = api_instance.list_incident_services( filter=SERVICE_DATA_ATTRIBUTES_NAME, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of all incident services ``` # Get a list of all incident services returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_services".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentServicesAPI.new # there is a valid "service" in the system SERVICE_DATA_ATTRIBUTES_NAME = ENV["SERVICE_DATA_ATTRIBUTES_NAME"] opts = { filter: SERVICE_DATA_ATTRIBUTES_NAME, } p api_instance.list_incident_services(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of all incident services ``` // Get a list of all incident services returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service" in the system ServiceDataAttributesName := os.Getenv("SERVICE_DATA_ATTRIBUTES_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
configuration.SetUnstableOperationEnabled("v2.ListIncidentServices", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentServicesApi(apiClient) resp, r, err := api.ListIncidentServices(ctx, *datadogV2.NewListIncidentServicesOptionalParameters().WithFilter(ServiceDataAttributesName)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentServicesApi.ListIncidentServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentServicesApi.ListIncidentServices`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of all incident services ``` // Get a list of all incident services returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentServicesApi; import com.datadog.api.client.v2.api.IncidentServicesApi.ListIncidentServicesOptionalParameters; import com.datadog.api.client.v2.model.IncidentServicesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentServices", true); IncidentServicesApi apiInstance = new IncidentServicesApi(defaultClient); // there is a valid "service" in the system String SERVICE_DATA_ATTRIBUTES_NAME = System.getenv("SERVICE_DATA_ATTRIBUTES_NAME"); try { IncidentServicesResponse result = apiInstance.listIncidentServices( new ListIncidentServicesOptionalParameters().filter(SERVICE_DATA_ATTRIBUTES_NAME)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentServicesApi#listIncidentServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of all incident services ``` // Get a list of all incident services returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_services::IncidentServicesAPI; use datadog_api_client::datadogV2::api_incident_services::ListIncidentServicesOptionalParams; #[tokio::main] async fn main() { // there is a valid "service" in the system let service_data_attributes_name = std::env::var("SERVICE_DATA_ATTRIBUTES_NAME").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentServices", true); let api = IncidentServicesAPI::with_config(configuration); let resp = api .list_incident_services( ListIncidentServicesOptionalParams::default() .filter(service_data_attributes_name.clone()), ) .await; if let Ok(value) = resp { 
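// Pretty-print the list of incident services returned for the filter.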
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of all incident services ``` /** * Get a list of all incident services returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentServices"] = true; const apiInstance = new v2.IncidentServicesApi(configuration); // there is a valid "service" in the system const SERVICE_DATA_ATTRIBUTES_NAME = process.env .SERVICE_DATA_ATTRIBUTES_NAME as string; const params: v2.IncidentServicesApiListIncidentServicesRequest = { filter: SERVICE_DATA_ATTRIBUTES_NAME, }; apiInstance .listIncidentServices(params) .then((data: v2.IncidentServicesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new incident service](https://docs.datadoghq.com/api/latest/incident-services/#create-a-new-incident-service) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-services/#create-a-new-incident-service-v2) **Note** : This endpoint is deprecated. POST https://api.ap1.datadoghq.com/api/v2/serviceshttps://api.ap2.datadoghq.com/api/v2/serviceshttps://api.datadoghq.eu/api/v2/serviceshttps://api.ddog-gov.com/api/v2/serviceshttps://api.datadoghq.com/api/v2/serviceshttps://api.us3.datadoghq.com/api/v2/serviceshttps://api.us5.datadoghq.com/api/v2/services ### Overview Creates a new incident service. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-services) to access this endpoint. ### Request #### Body Data (required) Incident Service Payload. * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Field Type Description data [_required_] object Incident Service payload for create requests. attributes object The incident service's attributes for a create request. name [_required_] string Name of the incident service. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. 
Allowed enum values: `services` default: `services` ``` { "data": { "type": "services", "attributes": { "name": "Example-Incident-Service" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-201-v2) * [400](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-services/#CreateIncidentService-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) Response with an incident service payload. Field Type Description data [_required_] object Incident Service data from responses. attributes object The incident service's attributes from a response. created date-time Timestamp of when the incident service was created. modified date-time Timestamp of when the incident service was modified. name string Name of the incident service. id [_required_] string The incident service's ID. relationships object The incident service's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident service resource type. Allowed enum values: `services` default: `services` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "service name" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "services" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-services/) * [Example](https://docs.datadoghq.com/api/latest/incident-services/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-services/?code-lang=typescript) ##### Create a new incident service returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "services", "attributes": { "name": "Example-Incident-Service" } } } EOF ``` ##### Create a new incident service returns "CREATED" response ``` // Create a new incident service returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.IncidentServiceCreateRequest{ Data: datadogV2.IncidentServiceCreateData{ Type: datadogV2.INCIDENTSERVICETYPE_SERVICES, Attributes: &datadogV2.IncidentServiceCreateAttributes{ Name: "Example-Incident-Service", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentService", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentServicesApi(apiClient) resp, r, err := api.CreateIncidentService(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentServicesApi.CreateIncidentService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentServicesApi.CreateIncidentService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new incident service returns "CREATED" response ``` // Create a new incident service returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentServicesApi; import com.datadog.api.client.v2.model.IncidentServiceCreateAttributes; import com.datadog.api.client.v2.model.IncidentServiceCreateData; import com.datadog.api.client.v2.model.IncidentServiceCreateRequest; import com.datadog.api.client.v2.model.IncidentServiceResponse; import com.datadog.api.client.v2.model.IncidentServiceType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); 
defaultClient.setUnstableOperationEnabled("v2.createIncidentService", true); IncidentServicesApi apiInstance = new IncidentServicesApi(defaultClient); IncidentServiceCreateRequest body = new IncidentServiceCreateRequest() .data( new IncidentServiceCreateData() .type(IncidentServiceType.SERVICES) .attributes( new IncidentServiceCreateAttributes().name("Example-Incident-Service"))); try { IncidentServiceResponse result = apiInstance.createIncidentService(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentServicesApi#createIncidentService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new incident service returns "CREATED" response ``` """ Create a new incident service returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_services_api import IncidentServicesApi from datadog_api_client.v2.model.incident_service_create_attributes import IncidentServiceCreateAttributes from datadog_api_client.v2.model.incident_service_create_data import IncidentServiceCreateData from datadog_api_client.v2.model.incident_service_create_request import IncidentServiceCreateRequest from datadog_api_client.v2.model.incident_service_type import IncidentServiceType body = IncidentServiceCreateRequest( data=IncidentServiceCreateData( type=IncidentServiceType.SERVICES, attributes=IncidentServiceCreateAttributes( name="Example-Incident-Service", ), ), ) configuration = Configuration() configuration.unstable_operations["create_incident_service"] = True with ApiClient(configuration) as api_client: api_instance = IncidentServicesApi(api_client) response = api_instance.create_incident_service(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new incident service returns "CREATED" response ``` # Create a new incident service returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_service".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentServicesAPI.new body = DatadogAPIClient::V2::IncidentServiceCreateRequest.new({ data: DatadogAPIClient::V2::IncidentServiceCreateData.new({ type: DatadogAPIClient::V2::IncidentServiceType::SERVICES, attributes: DatadogAPIClient::V2::IncidentServiceCreateAttributes.new({ name: "Example-Incident-Service", }), }), }) p api_instance.create_incident_service(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new incident service returns "CREATED" response ``` // Create a new incident service returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_services::IncidentServicesAPI; use datadog_api_client::datadogV2::model::IncidentServiceCreateAttributes; use datadog_api_client::datadogV2::model::IncidentServiceCreateData; use datadog_api_client::datadogV2::model::IncidentServiceCreateRequest; use datadog_api_client::datadogV2::model::IncidentServiceType; #[tokio::main] async fn main() { let body = IncidentServiceCreateRequest::new( IncidentServiceCreateData::new(IncidentServiceType::SERVICES).attributes( IncidentServiceCreateAttributes::new("Example-Incident-Service".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentService", true); let api = IncidentServicesAPI::with_config(configuration); let resp = api.create_incident_service(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new incident service returns "CREATED" response ``` /** * Create a new incident service returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentService"] = true; const apiInstance = new v2.IncidentServicesApi(configuration); const params: v2.IncidentServicesApiCreateIncidentServiceRequest = { body: { data: { type: "services", attributes: { name: "Example-Incident-Service", }, }, }, }; apiInstance .createIncidentService(params) .then((data: v2.IncidentServiceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=11a93fdd-d5df-4b09-80e9-09692d74e354&bo=1&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Incident%20Services&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fincident-services%2F&r=&evt=pageLoad&sv=2&cdb=AQAA&rn=704461) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=4f475cf0-5167-4580-a5d0-3120856fabff&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=33c24ce5-93b7-406f-b176-f86b17c1f37a&pt=Incident%20Services&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fincident-services%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=4f475cf0-5167-4580-a5d0-3120856fabff&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=33c24ce5-93b7-406f-b176-f86b17c1f37a&pt=Incident%20Services&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fincident-services%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) --- # Source: https://docs.datadoghq.com/api/latest/incident-teams # Incident Teams The Incident Teams endpoints are deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/) to create, update, delete, and retrieve teams which can be associated with incidents. ## [Get details of an incident team](https://docs.datadoghq.com/api/latest/incident-teams/#get-details-of-an-incident-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-teams/#get-details-of-an-incident-team-v2) **Note** : This endpoint is deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/). GET https://api.ap1.datadoghq.com/api/v2/teams/{team_id}https://api.ap2.datadoghq.com/api/v2/teams/{team_id}https://api.datadoghq.eu/api/v2/teams/{team_id}https://api.ddog-gov.com/api/v2/teams/{team_id}https://api.datadoghq.com/api/v2/teams/{team_id}https://api.us3.datadoghq.com/api/v2/teams/{team_id}https://api.us5.datadoghq.com/api/v2/teams/{team_id} ### Overview Get details of an incident team. If the `include[users]` query parameter is provided, the included attribute will contain the users related to these incident teams. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The ID of the incident team. #### Query Strings Name Type Description include enum Specifies which types of related objects should be included in the response. 
Allowed enum values: `users, attachments` ### Response * [200](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-teams/#GetIncidentTeam-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Response with an incident team payload. Field Type Description data [_required_] object Incident Team data from a response. attributes object The incident team's attributes from a response. created date-time Timestamp of when the incident team was created. modified date-time Timestamp of when the incident team was modified. name string Name of the incident team. id string The incident team's ID. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Incident Team resource type. Allowed enum values: `teams` default: `teams` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. 
Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "team name" }, "id": "00000000-7ea3-0000-000a-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "teams" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=typescript) ##### Get details of an incident team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/teams/${team_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get details of an incident team ``` """ Get details of an incident team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi # there is a valid "team" in the system TEAM_DATA_ID = environ["TEAM_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_incident_team"] = True with ApiClient(configuration) as api_client: api_instance = IncidentTeamsApi(api_client) response = api_instance.get_incident_team( team_id=TEAM_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get details of an incident team ``` # Get details of an incident team returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_incident_team".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentTeamsAPI.new # there is a valid "team" in the system TEAM_DATA_ID = ENV["TEAM_DATA_ID"] p api_instance.get_incident_team(TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get details of an incident team ``` // Get details of an incident team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team" in the system TeamDataID := os.Getenv("TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentTeamsApi(apiClient) resp, r, err := api.GetIncidentTeam(ctx, TeamDataID, 
*datadogV2.NewGetIncidentTeamOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentTeamsApi.GetIncidentTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentTeamsApi.GetIncidentTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get details of an incident team ``` // Get details of an incident team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentTeamsApi; import com.datadog.api.client.v2.model.IncidentTeamResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentTeam", true); IncidentTeamsApi apiInstance = new IncidentTeamsApi(defaultClient); // there is a valid "team" in the system String TEAM_DATA_ID = System.getenv("TEAM_DATA_ID"); try { IncidentTeamResponse result = apiInstance.getIncidentTeam(TEAM_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentTeamsApi#getIncidentTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get details of an incident team ``` // Get details of an incident team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_teams::GetIncidentTeamOptionalParams; use datadog_api_client::datadogV2::api_incident_teams::IncidentTeamsAPI; #[tokio::main] async fn main() { // there is a valid "team" in the system let team_data_id = std::env::var("TEAM_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentTeam", true); let api = IncidentTeamsAPI::with_config(configuration); let resp = api .get_incident_team( team_data_id.clone(), GetIncidentTeamOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get details of an incident team ``` /** * Get details of an incident team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.getIncidentTeam"] = true; const apiInstance = new v2.IncidentTeamsApi(configuration); // there is a valid "team" in the system const TEAM_DATA_ID = process.env.TEAM_DATA_ID as string; const params: v2.IncidentTeamsApiGetIncidentTeamRequest = { teamId: TEAM_DATA_ID, }; apiInstance .getIncidentTeam(params) .then((data: v2.IncidentTeamResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing incident team](https://docs.datadoghq.com/api/latest/incident-teams/#delete-an-existing-incident-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-teams/#delete-an-existing-incident-team-v2) **Note** : This endpoint is deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/). DELETE https://api.ap1.datadoghq.com/api/v2/teams/{team_id}https://api.ap2.datadoghq.com/api/v2/teams/{team_id}https://api.datadoghq.eu/api/v2/teams/{team_id}https://api.ddog-gov.com/api/v2/teams/{team_id}https://api.datadoghq.com/api/v2/teams/{team_id}https://api.us3.datadoghq.com/api/v2/teams/{team_id}https://api.us5.datadoghq.com/api/v2/teams/{team_id} ### Overview Deletes an existing incident team. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The ID of the incident team. ### Response * [204](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-204-v2) * [400](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-teams/#DeleteIncidentTeam-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=typescript) ##### Delete an existing incident team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/teams/${team_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing incident team ``` """ Delete an existing incident team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi # there is a valid "team" in the system TEAM_DATA_ID = environ["TEAM_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_incident_team"] = True with ApiClient(configuration) as api_client: api_instance = IncidentTeamsApi(api_client) api_instance.delete_incident_team( team_id=TEAM_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing incident team ``` # Delete an existing incident team returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_team".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentTeamsAPI.new # there is a valid "team" in the system TEAM_DATA_ID = ENV["TEAM_DATA_ID"] api_instance.delete_incident_team(TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an existing incident team ``` // Delete an existing incident team returns "OK" response package main import ( "context" "fmt" "os" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team" in the system TeamDataID := os.Getenv("TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentTeamsApi(apiClient) r, err := api.DeleteIncidentTeam(ctx, TeamDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentTeamsApi.DeleteIncidentTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing incident team ``` // Delete an existing incident team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentTeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentTeam", true); IncidentTeamsApi apiInstance = new IncidentTeamsApi(defaultClient); // there is a valid "team" in the system String TEAM_DATA_ID = System.getenv("TEAM_DATA_ID"); try { apiInstance.deleteIncidentTeam(TEAM_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentTeamsApi#deleteIncidentTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing incident team ``` // Delete an existing incident team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_teams::IncidentTeamsAPI; #[tokio::main] async fn main() { // there is a valid "team" in the system let team_data_id = std::env::var("TEAM_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentTeam", true); let api = IncidentTeamsAPI::with_config(configuration); let resp = api.delete_incident_team(team_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing incident team ``` /** * Delete an existing incident team returns 
"OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentTeam"] = true; const apiInstance = new v2.IncidentTeamsApi(configuration); // there is a valid "team" in the system const TEAM_DATA_ID = process.env.TEAM_DATA_ID as string; const params: v2.IncidentTeamsApiDeleteIncidentTeamRequest = { teamId: TEAM_DATA_ID, }; apiInstance .deleteIncidentTeam(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing incident team](https://docs.datadoghq.com/api/latest/incident-teams/#update-an-existing-incident-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-teams/#update-an-existing-incident-team-v2) **Note** : This endpoint is deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/). PATCH https://api.ap1.datadoghq.com/api/v2/teams/{team_id}https://api.ap2.datadoghq.com/api/v2/teams/{team_id}https://api.datadoghq.eu/api/v2/teams/{team_id}https://api.ddog-gov.com/api/v2/teams/{team_id}https://api.datadoghq.com/api/v2/teams/{team_id}https://api.us3.datadoghq.com/api/v2/teams/{team_id}https://api.us5.datadoghq.com/api/v2/teams/{team_id} ### Overview Updates an existing incident team. Only provide the attributes which should be updated as this request is a partial update. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The ID of the incident team. ### Request #### Body Data (required) Incident Team Payload. * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Field Type Description data [_required_] object Incident Team data for an update request. attributes object The incident team's attributes for an update request. name [_required_] string Name of the incident team. id string The incident team's ID. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident Team resource type. 
Allowed enum values: `teams` default: `teams` ``` { "data": { "type": "teams", "attributes": { "name": "team name-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-teams/#UpdateIncidentTeam-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Response with an incident team payload. Field Type Description data [_required_] object Incident Team data from a response. attributes object The incident team's attributes from a response. created date-time Timestamp of when the incident team was created. modified date-time Timestamp of when the incident team was modified. name string Name of the incident team. id string The incident team's ID. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Incident Team resource type. Allowed enum values: `teams` default: `teams` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. 
data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "team name" }, "id": "00000000-7ea3-0000-000a-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "teams" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=typescript) ##### Update an existing incident team returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/teams/${team_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "teams", "attributes": { "name": "team name-updated" } } } EOF ``` ##### Update an existing incident team returns "OK" response ``` // Update an existing incident team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team" in the system TeamDataID := os.Getenv("TEAM_DATA_ID") body := datadogV2.IncidentTeamUpdateRequest{ Data: datadogV2.IncidentTeamUpdateData{ Type: datadogV2.INCIDENTTEAMTYPE_TEAMS, Attributes: &datadogV2.IncidentTeamUpdateAttributes{ Name: "team name-updated", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentTeamsApi(apiClient) resp, r, err := api.UpdateIncidentTeam(ctx, TeamDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentTeamsApi.UpdateIncidentTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentTeamsApi.UpdateIncidentTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing incident team returns "OK" response ``` // Update an existing incident team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentTeamsApi; import com.datadog.api.client.v2.model.IncidentTeamResponse; import com.datadog.api.client.v2.model.IncidentTeamType; import com.datadog.api.client.v2.model.IncidentTeamUpdateAttributes; import com.datadog.api.client.v2.model.IncidentTeamUpdateData; import com.datadog.api.client.v2.model.IncidentTeamUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentTeam", true); IncidentTeamsApi apiInstance = new IncidentTeamsApi(defaultClient); // there is a valid "team" in the system String TEAM_DATA_ATTRIBUTES_NAME = System.getenv("TEAM_DATA_ATTRIBUTES_NAME"); String TEAM_DATA_ID = System.getenv("TEAM_DATA_ID"); IncidentTeamUpdateRequest body = new IncidentTeamUpdateRequest() .data( new IncidentTeamUpdateData() .type(IncidentTeamType.TEAMS) .attributes(new IncidentTeamUpdateAttributes().name("team name-updated"))); try { IncidentTeamResponse result = apiInstance.updateIncidentTeam(TEAM_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentTeamsApi#updateIncidentTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing incident team returns "OK" response ``` """ Update an existing incident team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi from datadog_api_client.v2.model.incident_team_type import IncidentTeamType from datadog_api_client.v2.model.incident_team_update_attributes import IncidentTeamUpdateAttributes from datadog_api_client.v2.model.incident_team_update_data import IncidentTeamUpdateData from datadog_api_client.v2.model.incident_team_update_request import IncidentTeamUpdateRequest # there is a valid "team" in the system TEAM_DATA_ATTRIBUTES_NAME = environ["TEAM_DATA_ATTRIBUTES_NAME"] TEAM_DATA_ID = environ["TEAM_DATA_ID"] body = IncidentTeamUpdateRequest( data=IncidentTeamUpdateData( type=IncidentTeamType.TEAMS, attributes=IncidentTeamUpdateAttributes( name="team name-updated", ), ), ) configuration = Configuration() configuration.unstable_operations["update_incident_team"] = True with ApiClient(configuration) as api_client: api_instance = IncidentTeamsApi(api_client) response = api_instance.update_incident_team(team_id=TEAM_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing incident team returns "OK" response ``` # Update an existing incident team returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_team".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentTeamsAPI.new # there is a valid "team" in the system TEAM_DATA_ATTRIBUTES_NAME = ENV["TEAM_DATA_ATTRIBUTES_NAME"] TEAM_DATA_ID = ENV["TEAM_DATA_ID"] body = DatadogAPIClient::V2::IncidentTeamUpdateRequest.new({ data: DatadogAPIClient::V2::IncidentTeamUpdateData.new({ type: 
DatadogAPIClient::V2::IncidentTeamType::TEAMS, attributes: DatadogAPIClient::V2::IncidentTeamUpdateAttributes.new({ name: "team name-updated", }), }), }) p api_instance.update_incident_team(TEAM_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an existing incident team returns "OK" response ``` // Update an existing incident team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_teams::IncidentTeamsAPI; use datadog_api_client::datadogV2::model::IncidentTeamType; use datadog_api_client::datadogV2::model::IncidentTeamUpdateAttributes; use datadog_api_client::datadogV2::model::IncidentTeamUpdateData; use datadog_api_client::datadogV2::model::IncidentTeamUpdateRequest; #[tokio::main] async fn main() { // there is a valid "team" in the system let team_data_id = std::env::var("TEAM_DATA_ID").unwrap(); let body = IncidentTeamUpdateRequest::new( IncidentTeamUpdateData::new(IncidentTeamType::TEAMS).attributes( IncidentTeamUpdateAttributes::new("team name-updated".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentTeam", true); let api = IncidentTeamsAPI::with_config(configuration); let resp = api.update_incident_team(team_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing incident team returns "OK" response ``` /** * Update an existing incident team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentTeam"] = true; const apiInstance = new v2.IncidentTeamsApi(configuration); // there is a valid "team" in the system const TEAM_DATA_ID = process.env.TEAM_DATA_ID as string; const params: v2.IncidentTeamsApiUpdateIncidentTeamRequest = { body: { data: { type: "teams", attributes: { name: "team name-updated", }, }, }, teamId: TEAM_DATA_ID, }; apiInstance .updateIncidentTeam(params) .then((data: v2.IncidentTeamResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of all incident teams](https://docs.datadoghq.com/api/latest/incident-teams/#get-a-list-of-all-incident-teams) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-teams/#get-a-list-of-all-incident-teams-v2) **Note** : This endpoint is deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/). GET https://api.ap1.datadoghq.com/api/v2/teamshttps://api.ap2.datadoghq.com/api/v2/teamshttps://api.datadoghq.eu/api/v2/teamshttps://api.ddog-gov.com/api/v2/teamshttps://api.datadoghq.com/api/v2/teamshttps://api.us3.datadoghq.com/api/v2/teamshttps://api.us5.datadoghq.com/api/v2/teams ### Overview Get all incident teams for the requesting user’s organization. If the `include[users]` query parameter is provided, the included attribute will contain the users related to these incident teams. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-teams) to access this endpoint. ### Arguments #### Query Strings Name Type Description include enum Specifies which types of related objects should be included in the response. Allowed enum values: `users, attachments` page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. filter string A search query that filters teams by name. ### Response * [200](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-200-v2) * [400](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-teams/#ListIncidentTeams-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Response with a list of incident team payloads. Field Type Description data [_required_] [object] An array of incident teams. attributes object The incident team's attributes from a response. created date-time Timestamp of when the incident team was created. modified date-time Timestamp of when the incident team was modified. name string Name of the incident team. id string The incident team's ID. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` type enum Incident Team resource type. Allowed enum values: `teams` default: `teams` included [ ] Included related resources which the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
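The `meta.pagination` fields above pair with the `page[size]` and `page[offset]` query strings described under Arguments. Before the example response below, here is a minimal sketch of paging through every incident team with the Python client; the `page_size` and `page_offset` keyword names are assumed to follow the client's snake_case mapping of those query strings rather than taken from this page.

```
# Sketch: page through all incident teams using page[size] / page[offset].
# The page_size / page_offset keyword names are assumed from the Python
# client's snake_case convention for these query strings.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi

PAGE_SIZE = 50  # the documented maximum page[size] is 100

configuration = Configuration()
configuration.unstable_operations["list_incident_teams"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentTeamsApi(api_client)
    offset = 0
    while True:
        response = api_instance.list_incident_teams(
            page_size=PAGE_SIZE,
            page_offset=offset,
        )
        teams = response.data or []
        for team in teams:
            print(team.id, team.attributes.name)
        if len(teams) < PAGE_SIZE:
            break
        # meta.pagination.next_offset equals the current offset plus the page size
        offset += PAGE_SIZE
```

Stopping when a page comes back short avoids one extra empty request; when the `meta` object is present, `response.meta.pagination.next_offset` can be used directly instead of incrementing the offset by hand.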
``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "team name" }, "id": "00000000-7ea3-0000-000a-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "teams" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
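Each of the error statuses above (400, 401, 403, 404, and the 429 shown next) returns the same `errors` array. A minimal sketch of surfacing that payload with the Python client follows; the `ApiException` class and its `status`/`body` attributes are assumed from the client's exception conventions and are not documented on this page.

```
# Sketch: surface the documented `errors` array when a request fails.
# ApiException and its status/body attributes are assumed from the Python
# client's exception conventions; they are not documented on this page.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi

configuration = Configuration()
configuration.unstable_operations["list_incident_teams"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentTeamsApi(api_client)
    try:
        print(api_instance.list_incident_teams(filter="team name"))
    except ApiException as e:
        # e.status is the HTTP code (for example 429); e.body carries the errors list
        print("Request failed with status", e.status)
        print("Error payload:", e.body)
```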
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=typescript) ##### Get a list of all incident teams Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/teams" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of all incident teams ``` """ Get a list of all incident teams returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi # there is a valid "team" in the system TEAM_DATA_ATTRIBUTES_NAME = environ["TEAM_DATA_ATTRIBUTES_NAME"] configuration = Configuration() configuration.unstable_operations["list_incident_teams"] = True with ApiClient(configuration) as api_client: api_instance = IncidentTeamsApi(api_client) response = api_instance.list_incident_teams( filter=TEAM_DATA_ATTRIBUTES_NAME, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of all incident teams ``` # Get a list of all incident teams returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_teams".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentTeamsAPI.new # there is a valid "team" in the system TEAM_DATA_ATTRIBUTES_NAME = ENV["TEAM_DATA_ATTRIBUTES_NAME"] opts = { filter: TEAM_DATA_ATTRIBUTES_NAME, } p api_instance.list_incident_teams(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of all incident teams ``` // Get a list of all incident teams returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team" in the system TeamDataAttributesName := os.Getenv("TEAM_DATA_ATTRIBUTES_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentTeams", true) apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewIncidentTeamsApi(apiClient) resp, r, err := api.ListIncidentTeams(ctx, *datadogV2.NewListIncidentTeamsOptionalParameters().WithFilter(TeamDataAttributesName)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentTeamsApi.ListIncidentTeams`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentTeamsApi.ListIncidentTeams`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of all incident teams ``` // Get a list of all incident teams returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentTeamsApi; import com.datadog.api.client.v2.api.IncidentTeamsApi.ListIncidentTeamsOptionalParameters; import com.datadog.api.client.v2.model.IncidentTeamsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentTeams", true); IncidentTeamsApi apiInstance = new IncidentTeamsApi(defaultClient); // there is a valid "team" in the system String TEAM_DATA_ATTRIBUTES_NAME = System.getenv("TEAM_DATA_ATTRIBUTES_NAME"); try { IncidentTeamsResponse result = apiInstance.listIncidentTeams( new ListIncidentTeamsOptionalParameters().filter(TEAM_DATA_ATTRIBUTES_NAME)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentTeamsApi#listIncidentTeams"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of all incident teams ``` // Get a list of all incident teams returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_teams::IncidentTeamsAPI; use datadog_api_client::datadogV2::api_incident_teams::ListIncidentTeamsOptionalParams; #[tokio::main] async fn main() { // there is a valid "team" in the system let team_data_attributes_name = std::env::var("TEAM_DATA_ATTRIBUTES_NAME").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentTeams", true); let api = IncidentTeamsAPI::with_config(configuration); let resp = api .list_incident_teams( ListIncidentTeamsOptionalParams::default().filter(team_data_attributes_name.clone()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of all incident teams ``` /** * Get a list of all incident teams returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentTeams"] = true; const apiInstance = new v2.IncidentTeamsApi(configuration); // there is a valid "team" in the system const TEAM_DATA_ATTRIBUTES_NAME = process.env .TEAM_DATA_ATTRIBUTES_NAME as string; const params: v2.IncidentTeamsApiListIncidentTeamsRequest = { filter: TEAM_DATA_ATTRIBUTES_NAME, }; apiInstance .listIncidentTeams(params) .then((data: v2.IncidentTeamsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new incident team](https://docs.datadoghq.com/api/latest/incident-teams/#create-a-new-incident-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/incident-teams/#create-a-new-incident-team-v2) **Note** : This endpoint is deprecated. See the [Teams API endpoints](https://docs.datadoghq.com/api/latest/teams/). POST https://api.ap1.datadoghq.com/api/v2/teamshttps://api.ap2.datadoghq.com/api/v2/teamshttps://api.datadoghq.eu/api/v2/teamshttps://api.ddog-gov.com/api/v2/teamshttps://api.datadoghq.com/api/v2/teamshttps://api.us3.datadoghq.com/api/v2/teamshttps://api.us5.datadoghq.com/api/v2/teams ### Overview Creates a new incident team. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incident-teams) to access this endpoint. ### Request #### Body Data (required) Incident Team Payload. * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Field Type Description data [_required_] object Incident Team data for a create request. attributes object The incident team's attributes for a create request. name [_required_] string Name of the incident team. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident Team resource type. 
Allowed enum values: `teams` default: `teams` ``` { "data": { "type": "teams", "attributes": { "name": "Example-Incident-Team" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-201-v2) * [400](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-400-v2) * [401](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-401-v2) * [403](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/incident-teams/#CreateIncidentTeam-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) Response with an incident team payload. Field Type Description data [_required_] object Incident Team data from a response. attributes object The incident team's attributes from a response. created date-time Timestamp of when the incident team was created. modified date-time Timestamp of when the incident team was modified. name string Name of the incident team. id string The incident team's ID. relationships object The incident team's relationships. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Incident Team resource type. Allowed enum values: `teams` default: `teams` included [ ] Included objects from relationships. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. 
data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "modified": "2019-09-19T10:00:00.000Z", "name": "team name" }, "id": "00000000-7ea3-0000-000a-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "teams" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incident-teams/) * [Example](https://docs.datadoghq.com/api/latest/incident-teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
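As the 201 model above indicates, the created team's identifier comes back in `data.id`. The sketch below mirrors the Python example in the Code Example section that follows and simply captures that ID for follow-up calls; the `response.data.id` attribute access is assumed to match the response model rather than taken from a documented example.

```
# Sketch: create an incident team and keep its ID (data.id in the 201 model)
# for follow-up calls. Mirrors the Python Code Example below; the
# response.data.id attribute access is assumed to match the response model.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi
from datadog_api_client.v2.model.incident_team_create_attributes import IncidentTeamCreateAttributes
from datadog_api_client.v2.model.incident_team_create_data import IncidentTeamCreateData
from datadog_api_client.v2.model.incident_team_create_request import IncidentTeamCreateRequest
from datadog_api_client.v2.model.incident_team_type import IncidentTeamType

body = IncidentTeamCreateRequest(
    data=IncidentTeamCreateData(
        type=IncidentTeamType.TEAMS,
        attributes=IncidentTeamCreateAttributes(name="Example-Incident-Team"),
    ),
)

configuration = Configuration()
configuration.unstable_operations["create_incident_team"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentTeamsApi(api_client)
    response = api_instance.create_incident_team(body=body)
    team_id = response.data.id
    print("Created incident team:", team_id)
```

This is the same ID that the update endpoint shown earlier in this page takes as its path parameter.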
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incident-teams/?code-lang=typescript) ##### Create a new incident team returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/teams" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "teams", "attributes": { "name": "Example-Incident-Team" } } } EOF ``` ##### Create a new incident team returns "CREATED" response ``` // Create a new incident team returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.IncidentTeamCreateRequest{ Data: datadogV2.IncidentTeamCreateData{ Type: datadogV2.INCIDENTTEAMTYPE_TEAMS, Attributes: &datadogV2.IncidentTeamCreateAttributes{ Name: "Example-Incident-Team", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentTeamsApi(apiClient) resp, r, err := api.CreateIncidentTeam(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentTeamsApi.CreateIncidentTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentTeamsApi.CreateIncidentTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new incident team returns "CREATED" response ``` // Create a new incident team returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentTeamsApi; import com.datadog.api.client.v2.model.IncidentTeamCreateAttributes; import com.datadog.api.client.v2.model.IncidentTeamCreateData; import com.datadog.api.client.v2.model.IncidentTeamCreateRequest; import com.datadog.api.client.v2.model.IncidentTeamResponse; import com.datadog.api.client.v2.model.IncidentTeamType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentTeam", true); IncidentTeamsApi apiInstance = new 
IncidentTeamsApi(defaultClient); IncidentTeamCreateRequest body = new IncidentTeamCreateRequest() .data( new IncidentTeamCreateData() .type(IncidentTeamType.TEAMS) .attributes(new IncidentTeamCreateAttributes().name("Example-Incident-Team"))); try { IncidentTeamResponse result = apiInstance.createIncidentTeam(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentTeamsApi#createIncidentTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new incident team returns "CREATED" response ``` """ Create a new incident team returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incident_teams_api import IncidentTeamsApi from datadog_api_client.v2.model.incident_team_create_attributes import IncidentTeamCreateAttributes from datadog_api_client.v2.model.incident_team_create_data import IncidentTeamCreateData from datadog_api_client.v2.model.incident_team_create_request import IncidentTeamCreateRequest from datadog_api_client.v2.model.incident_team_type import IncidentTeamType body = IncidentTeamCreateRequest( data=IncidentTeamCreateData( type=IncidentTeamType.TEAMS, attributes=IncidentTeamCreateAttributes( name="Example-Incident-Team", ), ), ) configuration = Configuration() configuration.unstable_operations["create_incident_team"] = True with ApiClient(configuration) as api_client: api_instance = IncidentTeamsApi(api_client) response = api_instance.create_incident_team(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new incident team returns "CREATED" response ``` # Create a new incident team returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_team".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentTeamsAPI.new body = DatadogAPIClient::V2::IncidentTeamCreateRequest.new({ data: DatadogAPIClient::V2::IncidentTeamCreateData.new({ type: DatadogAPIClient::V2::IncidentTeamType::TEAMS, attributes: DatadogAPIClient::V2::IncidentTeamCreateAttributes.new({ name: "Example-Incident-Team", }), }), }) p api_instance.create_incident_team(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new incident team returns "CREATED" response ``` // Create a new incident 
team returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incident_teams::IncidentTeamsAPI; use datadog_api_client::datadogV2::model::IncidentTeamCreateAttributes; use datadog_api_client::datadogV2::model::IncidentTeamCreateData; use datadog_api_client::datadogV2::model::IncidentTeamCreateRequest; use datadog_api_client::datadogV2::model::IncidentTeamType; #[tokio::main] async fn main() { let body = IncidentTeamCreateRequest::new( IncidentTeamCreateData::new(IncidentTeamType::TEAMS).attributes( IncidentTeamCreateAttributes::new("Example-Incident-Team".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentTeam", true); let api = IncidentTeamsAPI::with_config(configuration); let resp = api.create_incident_team(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new incident team returns "CREATED" response ``` /** * Create a new incident team returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentTeam"] = true; const apiInstance = new v2.IncidentTeamsApi(configuration); const params: v2.IncidentTeamsApiCreateIncidentTeamRequest = { body: { data: { type: "teams", attributes: { name: "Example-Incident-Team", }, }, }, }; apiInstance .createIncidentTeam(params) .then((data: v2.IncidentTeamResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/incidents # Incidents Manage incident response, as well as associated attachments, metadata, and todos. See the [Incident Management page](https://docs.datadoghq.com/service_management/incident_management/) for more information. ## [Create an incident](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidentshttps://api.ap2.datadoghq.com/api/v2/incidentshttps://api.datadoghq.eu/api/v2/incidentshttps://api.ddog-gov.com/api/v2/incidentshttps://api.datadoghq.com/api/v2/incidentshttps://api.us3.datadoghq.com/api/v2/incidentshttps://api.us5.datadoghq.com/api/v2/incidents ### Overview Create an incident. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Request #### Body Data (required) Incident payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident data for a create request. attributes [_required_] object The incident's attributes for a create request. customer_impact_scope string Required if `customer_impacted:"true"`. A summary of the impact customers experienced during the incident. customer_impacted [_required_] boolean A flag indicating whether the incident caused customer impact. 
fields object A condensed view of the user-defined fields for which to create initial selections. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. The default incident type will be used if this property is not provided. initial_cells [ ] An array of initial timeline cells to be placed at the beginning of the incident timeline. Option 1 object Timeline cell data for Markdown timeline cells for a create request. cell_type [_required_] enum Type of the Markdown timeline cell. Allowed enum values: `markdown` default: `markdown` content [_required_] object The Markdown timeline cell contents. content string The Markdown content of the cell. important boolean A flag indicating whether the timeline cell is important and should be highlighted. is_test boolean A flag indicating whether the incident is a test incident. notification_handles [object] Notification handles that will be notified of the incident at creation. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. title [_required_] string The title of the incident, which summarizes what happened. relationships object The relationships the incident will have with other resources once created. commander_user [_required_] object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` ``` { "data": { "type": "incidents", "attributes": { "title": "Example-Incident", "customer_impacted": false, "fields": { "state": { "type": "dropdown", "value": "resolved" } } }, "relationships": { "commander_user": { "data": { "type": "users", "id": "string" } } } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncident-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident. Field Type Description data [_required_] object Incident data from a response. attributes object The incident's attributes from a response. archived date-time Timestamp of when the incident was archived. case_id int64 The incident case id. created date-time Timestamp when the incident was created. 
customer_impact_duration int64 Length of the incident's customer impact in seconds. Equals the difference between `customer_impact_start` and `customer_impact_end`. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. declared date-time Timestamp when the incident was declared. declared_by object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. declared_by_uuid string UUID of the user who declared the incident. detected date-time Timestamp when the incident was detected. fields object A condensed view of the user-defined fields attached to incidents. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. is_test boolean A flag indicating whether the incident is a test incident. modified date-time Timestamp when the incident was last modified. non_datadog_creator object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. public_id int64 The monotonically increasing integer ID for the incident. resolved date-time Timestamp when the incident's state was last changed from active or stable to resolved or completed. severity enum The incident severity. Allowed enum values: `UNKNOWN,SEV-0,SEV-1,SEV-2,SEV-3,SEV-4,SEV-5` state string The state incident. time_to_detect int64 The amount of time in seconds to detect the incident. Equals the difference between `customer_impact_start` and `detected`. time_to_internal_response int64 The amount of time in seconds to call incident after detection. Equals the difference of `detected` and `created`. time_to_repair int64 The amount of time in seconds to resolve customer impact after detecting the issue. Equals the difference between `customer_impact_end` and `detected`. time_to_resolve int64 The amount of time in seconds to resolve the incident after it was created. Equals the difference between `created` and `resolved`. title [_required_] string The title of the incident, which summarizes what happened. visibility string The incident visibility status. id [_required_] string The incident's ID. relationships object The incident's relationships from a response. attachments object A relationship reference for attachments. data [_required_] [object] An array of incident attachments. 
id [_required_] string A unique identifier that represents the attachment. type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` declared_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` impacts object Relationship to impacts. data [_required_] [object] An array of incident impacts. id [_required_] string A unique identifier that represents the impact. type [_required_] enum The incident impacts type. Allowed enum values: `incident_impacts` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` responders object Relationship to incident responders. data [_required_] [object] An array of incident responders. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident responders type. Allowed enum values: `incident_responders` user_defined_fields object Relationship to incident user defined fields. data [_required_] [object] An array of user defined fields. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident user defined fields type. Allowed enum values: `user_defined_field` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. 
Allowed enum values: `incident_attachments` default: `incident_attachments` ``` { "data": { "attributes": { "archived": "2019-09-19T10:00:00.000Z", "case_id": "integer", "created": "2019-09-19T10:00:00.000Z", "customer_impact_duration": "integer", "customer_impact_end": "2019-09-19T10:00:00.000Z", "customer_impact_scope": "An example customer impact scope", "customer_impact_start": "2019-09-19T10:00:00.000Z", "customer_impacted": false, "declared": "2019-09-19T10:00:00.000Z", "declared_by": { "image_48_px": "string", "name": "string" }, "declared_by_uuid": "string", "detected": "2019-09-19T10:00:00.000Z", "fields": { "": "undefined" }, "incident_type_uuid": "00000000-0000-0000-0000-000000000000", "is_test": false, "modified": "2019-09-19T10:00:00.000Z", "non_datadog_creator": { "image_48_px": "string", "name": "string" }, "notification_handles": [ { "display_name": "Jane Doe", "handle": "@test.user@test.com" } ], "public_id": 1, "resolved": "2019-09-19T10:00:00.000Z", "severity": "UNKNOWN", "state": "string", "time_to_detect": "integer", "time_to_internal_response": "integer", "time_to_repair": "integer", "time_to_resolve": "integer", "title": "A test incident title", "visibility": "string" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "attachments": { "data": [ { "id": "00000000-0000-abcd-1000-000000000000", "type": "incident_attachments" } ] }, "commander_user": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } }, "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "declared_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "impacts": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_impacts" } ] }, "integrations": { "data": [ { "id": "00000000-abcd-0001-0000-000000000000", "type": "incident_integrations" } ] }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "responders": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_responders" } ] }, "user_defined_fields": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "user_defined_field" } ] } }, "type": "incidents" }, "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Create an incident returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "incidents", "attributes": { "title": "Example-Incident", "customer_impacted": false, "fields": { "state": { "type": "dropdown", "value": "resolved" } } }, "relationships": { "commander_user": { "data": { "type": "users", "id": "string" } } } } } EOF ``` ##### Create an incident returns "CREATED" response ``` // Create an incident returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.IncidentCreateRequest{ Data: datadogV2.IncidentCreateData{ Type: datadogV2.INCIDENTTYPE_INCIDENTS, Attributes: datadogV2.IncidentCreateAttributes{ Title: "Example-Incident", CustomerImpacted: false, Fields: map[string]datadogV2.IncidentFieldAttributes{ "state": datadogV2.IncidentFieldAttributes{ IncidentFieldAttributesSingleValue: &datadogV2.IncidentFieldAttributesSingleValue{ Type: datadogV2.INCIDENTFIELDATTRIBUTESSINGLEVALUETYPE_DROPDOWN.Ptr(), Value: *datadog.NewNullableString(datadog.PtrString("resolved")), }}, }, }, Relationships: &datadogV2.IncidentCreateRelationships{ CommanderUser: *datadogV2.NewNullableNullableRelationshipToUser(&datadogV2.NullableRelationshipToUser{ Data: *datadogV2.NewNullableNullableRelationshipToUserData(&datadogV2.NullableRelationshipToUserData{ Type: datadogV2.USERSTYPE_USERS, Id: UserDataID, }), }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncident(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncident`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an incident returns "CREATED" response ``` // Create an incident returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentCreateAttributes; import com.datadog.api.client.v2.model.IncidentCreateData; import com.datadog.api.client.v2.model.IncidentCreateRelationships; import com.datadog.api.client.v2.model.IncidentCreateRequest; import com.datadog.api.client.v2.model.IncidentFieldAttributes; import com.datadog.api.client.v2.model.IncidentFieldAttributesSingleValue; import com.datadog.api.client.v2.model.IncidentFieldAttributesSingleValueType; import com.datadog.api.client.v2.model.IncidentResponse; import com.datadog.api.client.v2.model.IncidentType; import com.datadog.api.client.v2.model.NullableRelationshipToUser; import com.datadog.api.client.v2.model.NullableRelationshipToUserData; import com.datadog.api.client.v2.model.UsersType; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); IncidentCreateRequest body = new IncidentCreateRequest() .data( new IncidentCreateData() .type(IncidentType.INCIDENTS) .attributes( new IncidentCreateAttributes() .title("Example-Incident") .customerImpacted(false) .fields( Map.ofEntries( Map.entry( "state", new IncidentFieldAttributes( new IncidentFieldAttributesSingleValue() .type( IncidentFieldAttributesSingleValueType.DROPDOWN) .value("resolved")))))) .relationships( new IncidentCreateRelationships() .commanderUser( new NullableRelationshipToUser() .data( new NullableRelationshipToUserData() .type(UsersType.USERS) .id(USER_DATA_ID))))); try { IncidentResponse result = apiInstance.createIncident(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an incident returns "CREATED" response ``` """ Create an incident returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_create_attributes import IncidentCreateAttributes from datadog_api_client.v2.model.incident_create_data import IncidentCreateData from datadog_api_client.v2.model.incident_create_relationships import 
IncidentCreateRelationships from datadog_api_client.v2.model.incident_create_request import IncidentCreateRequest from datadog_api_client.v2.model.incident_field_attributes_single_value import IncidentFieldAttributesSingleValue from datadog_api_client.v2.model.incident_field_attributes_single_value_type import ( IncidentFieldAttributesSingleValueType, ) from datadog_api_client.v2.model.incident_type import IncidentType from datadog_api_client.v2.model.nullable_relationship_to_user import NullableRelationshipToUser from datadog_api_client.v2.model.nullable_relationship_to_user_data import NullableRelationshipToUserData from datadog_api_client.v2.model.users_type import UsersType # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = IncidentCreateRequest( data=IncidentCreateData( type=IncidentType.INCIDENTS, attributes=IncidentCreateAttributes( title="Example-Incident", customer_impacted=False, fields=dict( state=IncidentFieldAttributesSingleValue( type=IncidentFieldAttributesSingleValueType.DROPDOWN, value="resolved", ), ), ), relationships=IncidentCreateRelationships( commander_user=NullableRelationshipToUser( data=NullableRelationshipToUserData( type=UsersType.USERS, id=USER_DATA_ID, ), ), ), ), ) configuration = Configuration() configuration.unstable_operations["create_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an incident returns "CREATED" response ``` # Create an incident returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::IncidentCreateRequest.new({ data: DatadogAPIClient::V2::IncidentCreateData.new({ type: DatadogAPIClient::V2::IncidentType::INCIDENTS, attributes: DatadogAPIClient::V2::IncidentCreateAttributes.new({ title: "Example-Incident", customer_impacted: false, fields: { state: DatadogAPIClient::V2::IncidentFieldAttributesSingleValue.new({ type: DatadogAPIClient::V2::IncidentFieldAttributesSingleValueType::DROPDOWN, value: "resolved", }), }, }), relationships: DatadogAPIClient::V2::IncidentCreateRelationships.new({ commander_user: DatadogAPIClient::V2::NullableRelationshipToUser.new({ data: DatadogAPIClient::V2::NullableRelationshipToUserData.new({ type: DatadogAPIClient::V2::UsersType::USERS, id: USER_DATA_ID, }), }), }), }), }) p api_instance.create_incident(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an incident returns "CREATED" response ``` // Create an incident returns "CREATED" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentCreateAttributes; use datadog_api_client::datadogV2::model::IncidentCreateData; use datadog_api_client::datadogV2::model::IncidentCreateRelationships; use datadog_api_client::datadogV2::model::IncidentCreateRequest; use datadog_api_client::datadogV2::model::IncidentFieldAttributes; use datadog_api_client::datadogV2::model::IncidentFieldAttributesSingleValue; use datadog_api_client::datadogV2::model::IncidentFieldAttributesSingleValueType; use datadog_api_client::datadogV2::model::IncidentType; use datadog_api_client::datadogV2::model::NullableRelationshipToUser; use datadog_api_client::datadogV2::model::NullableRelationshipToUserData; use datadog_api_client::datadogV2::model::UsersType; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = IncidentCreateRequest::new( IncidentCreateData::new( IncidentCreateAttributes::new(false, "Example-Incident".to_string()).fields( BTreeMap::from([( "state".to_string(), IncidentFieldAttributes::IncidentFieldAttributesSingleValue(Box::new( IncidentFieldAttributesSingleValue::new() .type_(IncidentFieldAttributesSingleValueType::DROPDOWN) .value(Some("resolved".to_string())), )), )]), ), IncidentType::INCIDENTS, ) .relationships(IncidentCreateRelationships::new(Some( NullableRelationshipToUser::new(Some(NullableRelationshipToUserData::new( user_data_id.clone(), UsersType::USERS, ))), ))), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api.create_incident(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an incident returns "CREATED" response ``` /** * Create an incident returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.IncidentsApiCreateIncidentRequest = { body: { data: { type: "incidents", attributes: { title: "Example-Incident", customerImpacted: false, fields: { state: { type: "dropdown", value: "resolved", }, }, }, relationships: { commanderUser: { data: { type: "users", id: USER_DATA_ID, }, }, }, }, }, }; apiInstance .createIncident(params) .then((data: v2.IncidentResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the details of an incident](https://docs.datadoghq.com/api/latest/incidents/#get-the-details-of-an-incident) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-the-details-of-an-incident-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id} ### Overview Get the details of an incident by `incident_id`. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description include array Specifies which types of related objects should be included in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncident-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident. Field Type Description data [_required_] object Incident data from a response. attributes object The incident's attributes from a response. archived date-time Timestamp of when the incident was archived. case_id int64 The incident case id. created date-time Timestamp when the incident was created. customer_impact_duration int64 Length of the incident's customer impact in seconds. Equals the difference between `customer_impact_start` and `customer_impact_end`. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. declared date-time Timestamp when the incident was declared. declared_by object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. declared_by_uuid string UUID of the user who declared the incident. detected date-time Timestamp when the incident was detected. 
fields object A condensed view of the user-defined fields attached to incidents. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. is_test boolean A flag indicating whether the incident is a test incident. modified date-time Timestamp when the incident was last modified. non_datadog_creator object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. public_id int64 The monotonically increasing integer ID for the incident. resolved date-time Timestamp when the incident's state was last changed from active or stable to resolved or completed. severity enum The incident severity. Allowed enum values: `UNKNOWN,SEV-0,SEV-1,SEV-2,SEV-3,SEV-4,SEV-5` state string The state incident. time_to_detect int64 The amount of time in seconds to detect the incident. Equals the difference between `customer_impact_start` and `detected`. time_to_internal_response int64 The amount of time in seconds to call incident after detection. Equals the difference of `detected` and `created`. time_to_repair int64 The amount of time in seconds to resolve customer impact after detecting the issue. Equals the difference between `customer_impact_end` and `detected`. time_to_resolve int64 The amount of time in seconds to resolve the incident after it was created. Equals the difference between `created` and `resolved`. title [_required_] string The title of the incident, which summarizes what happened. visibility string The incident visibility status. id [_required_] string The incident's ID. relationships object The incident's relationships from a response. attachments object A relationship reference for attachments. data [_required_] [object] An array of incident attachments. id [_required_] string A unique identifier that represents the attachment. type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` declared_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` impacts object Relationship to impacts. data [_required_] [object] An array of incident impacts. id [_required_] string A unique identifier that represents the impact. type [_required_] enum The incident impacts type. Allowed enum values: `incident_impacts` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` responders object Relationship to incident responders. data [_required_] [object] An array of incident responders. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident responders type. Allowed enum values: `incident_responders` user_defined_fields object Relationship to incident user defined fields. data [_required_] [object] An array of user defined fields. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident user defined fields type. Allowed enum values: `user_defined_field` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. 
Allowed enum values: `incident_attachments` default: `incident_attachments` ``` { "data": { "attributes": { "archived": "2019-09-19T10:00:00.000Z", "case_id": "integer", "created": "2019-09-19T10:00:00.000Z", "customer_impact_duration": "integer", "customer_impact_end": "2019-09-19T10:00:00.000Z", "customer_impact_scope": "An example customer impact scope", "customer_impact_start": "2019-09-19T10:00:00.000Z", "customer_impacted": false, "declared": "2019-09-19T10:00:00.000Z", "declared_by": { "image_48_px": "string", "name": "string" }, "declared_by_uuid": "string", "detected": "2019-09-19T10:00:00.000Z", "fields": { "": "undefined" }, "incident_type_uuid": "00000000-0000-0000-0000-000000000000", "is_test": false, "modified": "2019-09-19T10:00:00.000Z", "non_datadog_creator": { "image_48_px": "string", "name": "string" }, "notification_handles": [ { "display_name": "Jane Doe", "handle": "@test.user@test.com" } ], "public_id": 1, "resolved": "2019-09-19T10:00:00.000Z", "severity": "UNKNOWN", "state": "string", "time_to_detect": "integer", "time_to_internal_response": "integer", "time_to_repair": "integer", "time_to_resolve": "integer", "title": "A test incident title", "visibility": "string" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "attachments": { "data": [ { "id": "00000000-0000-abcd-1000-000000000000", "type": "incident_attachments" } ] }, "commander_user": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } }, "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "declared_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "impacts": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_impacts" } ] }, "integrations": { "data": [ { "id": "00000000-abcd-0001-0000-000000000000", "type": "incident_integrations" } ] }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "responders": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_responders" } ] }, "user_defined_fields": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "user_defined_field" } ] } }, "type": "incidents" }, "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Get the details of an incident Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the details of an incident ``` """ Get the details of an incident returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.get_incident( incident_id=INCIDENT_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the details of an incident ``` # Get the details of an incident returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] p api_instance.get_incident(INCIDENT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the details of an incident ``` // Get the details of an incident returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncident(ctx, IncidentDataID, *datadogV2.NewGetIncidentOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncident`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the details of an incident ``` // Get the details of an incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); try { IncidentResponse result = apiInstance.getIncident(INCIDENT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#getIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the details of an incident ``` // Get the details of an incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::GetIncidentOptionalParams; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api .get_incident( incident_data_id.clone(), GetIncidentOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the 
details of an incident ``` /** * Get the details of an incident returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiGetIncidentRequest = { incidentId: INCIDENT_DATA_ID, }; apiInstance .getIncident(params) .then((data: v2.IncidentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing incident](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id} ### Overview Updates an incident. Provide only the attributes that should be updated as this request is a partial update. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description include array Specifies which types of related objects should be included in the response. ### Request #### Body Data (required) Incident Payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident data for an update request. attributes object The incident's attributes for an update request. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. detected date-time Timestamp when the incident was detected. fields object A condensed view of the user-defined fields for which to update selections. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. 
Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. title string The title of the incident, which summarizes what happened. id [_required_] string The incident's ID. relationships object The incident's relationships for an update request. commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` postmortem object A relationship reference for postmortems. data [_required_] object The postmortem relationship data. id [_required_] string A unique identifier that represents the postmortem. type [_required_] enum Incident postmortem resource type. Allowed enum values: `incident_postmortems` default: `incident_postmortems` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` ##### Add commander to an incident returns "OK" response ``` { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "relationships": { "commander_user": { "data": { "id": "string", "type": "users" } } } } } ``` Copy ##### Remove commander from an incident returns "OK" response ``` { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "relationships": { "commander_user": { "data": null } } } } ``` Copy ##### Update an existing incident returns "OK" response ``` { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "attributes": { "fields": { "state": { "type": "dropdown", "value": "resolved" } }, "title": "A test incident title-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncident-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident. Field Type Description data [_required_] object Incident data from a response. attributes object The incident's attributes from a response. archived date-time Timestamp of when the incident was archived. case_id int64 The incident case id. 
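The request examples above update only a single-value (`dropdown`) field. As a hedged sketch, assuming a multi-value field named `teams` is defined in your incident settings, a PATCH that updates both field shapes in one partial update might look like this; the response attribute reference continues below.

```
# Sketch only: update a single-value and a multi-value field in one partial update.
# The "teams" field name is an assumption; use a field defined for your incidents.
export incident_id="CHANGE_ME"

curl -X PATCH "https://api.datadoghq.com/api/v2/incidents/${incident_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "${incident_id}",
    "type": "incidents",
    "attributes": {
      "fields": {
        "state": { "type": "dropdown", "value": "resolved" },
        "teams": { "type": "multiselect", "value": ["team-a", "team-b"] }
      }
    }
  }
}
EOF
```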
created date-time Timestamp when the incident was created. customer_impact_duration int64 Length of the incident's customer impact in seconds. Equals the difference between `customer_impact_start` and `customer_impact_end`. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. declared date-time Timestamp when the incident was declared. declared_by object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. declared_by_uuid string UUID of the user who declared the incident. detected date-time Timestamp when the incident was detected. fields object A condensed view of the user-defined fields attached to incidents. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. is_test boolean A flag indicating whether the incident is a test incident. modified date-time Timestamp when the incident was last modified. non_datadog_creator object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. public_id int64 The monotonically increasing integer ID for the incident. resolved date-time Timestamp when the incident's state was last changed from active or stable to resolved or completed. severity enum The incident severity. Allowed enum values: `UNKNOWN,SEV-0,SEV-1,SEV-2,SEV-3,SEV-4,SEV-5` state string The state incident. time_to_detect int64 The amount of time in seconds to detect the incident. Equals the difference between `customer_impact_start` and `detected`. time_to_internal_response int64 The amount of time in seconds to call incident after detection. Equals the difference of `detected` and `created`. time_to_repair int64 The amount of time in seconds to resolve customer impact after detecting the issue. Equals the difference between `customer_impact_end` and `detected`. time_to_resolve int64 The amount of time in seconds to resolve the incident after it was created. Equals the difference between `created` and `resolved`. title [_required_] string The title of the incident, which summarizes what happened. visibility string The incident visibility status. id [_required_] string The incident's ID. relationships object The incident's relationships from a response. attachments object A relationship reference for attachments. data [_required_] [object] An array of incident attachments. 
id [_required_] string A unique identifier that represents the attachment. type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` declared_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` impacts object Relationship to impacts. data [_required_] [object] An array of incident impacts. id [_required_] string A unique identifier that represents the impact. type [_required_] enum The incident impacts type. Allowed enum values: `incident_impacts` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` responders object Relationship to incident responders. data [_required_] [object] An array of incident responders. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident responders type. Allowed enum values: `incident_responders` user_defined_fields object Relationship to incident user defined fields. data [_required_] [object] An array of user defined fields. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident user defined fields type. Allowed enum values: `user_defined_field` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. 
Allowed enum values: `incident_attachments` default: `incident_attachments` ``` { "data": { "attributes": { "archived": "2019-09-19T10:00:00.000Z", "case_id": "integer", "created": "2019-09-19T10:00:00.000Z", "customer_impact_duration": "integer", "customer_impact_end": "2019-09-19T10:00:00.000Z", "customer_impact_scope": "An example customer impact scope", "customer_impact_start": "2019-09-19T10:00:00.000Z", "customer_impacted": false, "declared": "2019-09-19T10:00:00.000Z", "declared_by": { "image_48_px": "string", "name": "string" }, "declared_by_uuid": "string", "detected": "2019-09-19T10:00:00.000Z", "fields": { "": "undefined" }, "incident_type_uuid": "00000000-0000-0000-0000-000000000000", "is_test": false, "modified": "2019-09-19T10:00:00.000Z", "non_datadog_creator": { "image_48_px": "string", "name": "string" }, "notification_handles": [ { "display_name": "Jane Doe", "handle": "@test.user@test.com" } ], "public_id": 1, "resolved": "2019-09-19T10:00:00.000Z", "severity": "UNKNOWN", "state": "string", "time_to_detect": "integer", "time_to_internal_response": "integer", "time_to_repair": "integer", "time_to_resolve": "integer", "title": "A test incident title", "visibility": "string" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "attachments": { "data": [ { "id": "00000000-0000-abcd-1000-000000000000", "type": "incident_attachments" } ] }, "commander_user": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } }, "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "declared_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "impacts": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_impacts" } ] }, "integrations": { "data": [ { "id": "00000000-abcd-0001-0000-000000000000", "type": "incident_integrations" } ] }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "responders": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_responders" } ] }, "user_defined_fields": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "user_defined_field" } ] } }, "type": "incidents" }, "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Add commander to an incident returns "OK" response Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "relationships": { "commander_user": { "data": { "id": "string", "type": "users" } } } } } EOF ``` ##### Remove commander from an incident returns "OK" response Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "relationships": { "commander_user": { "data": null } } } } EOF ``` ##### Update an existing incident returns "OK" response Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents", "attributes": { "fields": { "state": { "type": "dropdown", "value": "resolved" } }, "title": "A test incident title-updated" } } } EOF ``` ##### Add commander to an incident returns "OK" response ``` // Add commander to an incident returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.IncidentUpdateRequest{ Data: 
datadogV2.IncidentUpdateData{ Id: IncidentDataID, Type: datadogV2.INCIDENTTYPE_INCIDENTS, Relationships: &datadogV2.IncidentUpdateRelationships{ CommanderUser: *datadogV2.NewNullableNullableRelationshipToUser(&datadogV2.NullableRelationshipToUser{ Data: *datadogV2.NewNullableNullableRelationshipToUserData(&datadogV2.NullableRelationshipToUserData{ Id: UserDataID, Type: datadogV2.USERSTYPE_USERS, }), }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncident(ctx, IncidentDataID, body, *datadogV2.NewUpdateIncidentOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncident`:\n%s\n", responseContent) } ``` Copy ##### Remove commander from an incident returns "OK" response ``` // Remove commander from an incident returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") body := datadogV2.IncidentUpdateRequest{ Data: datadogV2.IncidentUpdateData{ Id: IncidentDataID, Type: datadogV2.INCIDENTTYPE_INCIDENTS, Relationships: &datadogV2.IncidentUpdateRelationships{ CommanderUser: *datadogV2.NewNullableNullableRelationshipToUser(&datadogV2.NullableRelationshipToUser{ Data: *datadogV2.NewNullableNullableRelationshipToUserData(nil), }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncident(ctx, IncidentDataID, body, *datadogV2.NewUpdateIncidentOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncident`:\n%s\n", responseContent) } ``` Copy ##### Update an existing incident returns "OK" response ``` // Update an existing incident returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") body := datadogV2.IncidentUpdateRequest{ Data: datadogV2.IncidentUpdateData{ Id: IncidentDataID, Type: datadogV2.INCIDENTTYPE_INCIDENTS, Attributes: &datadogV2.IncidentUpdateAttributes{ Fields: map[string]datadogV2.IncidentFieldAttributes{ "state": datadogV2.IncidentFieldAttributes{ IncidentFieldAttributesSingleValue: &datadogV2.IncidentFieldAttributesSingleValue{ Type: datadogV2.INCIDENTFIELDATTRIBUTESSINGLEVALUETYPE_DROPDOWN.Ptr(), Value: *datadog.NewNullableString(datadog.PtrString("resolved")), }}, }, Title: datadog.PtrString("A test incident title-updated"), }, 
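// PATCH semantics: only the attributes provided above (the "state" field and the title) are changed; attributes omitted from the payload keep their current values.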
}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncident(ctx, IncidentDataID, body, *datadogV2.NewUpdateIncidentOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncident`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add commander to an incident returns "OK" response ``` // Add commander to an incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentResponse; import com.datadog.api.client.v2.model.IncidentType; import com.datadog.api.client.v2.model.IncidentUpdateData; import com.datadog.api.client.v2.model.IncidentUpdateRelationships; import com.datadog.api.client.v2.model.IncidentUpdateRequest; import com.datadog.api.client.v2.model.NullableRelationshipToUser; import com.datadog.api.client.v2.model.NullableRelationshipToUserData; import com.datadog.api.client.v2.model.UsersType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); IncidentUpdateRequest body = new IncidentUpdateRequest() .data( new IncidentUpdateData() .id(INCIDENT_DATA_ID) .type(IncidentType.INCIDENTS) .relationships( new IncidentUpdateRelationships() .commanderUser( new NullableRelationshipToUser() .data( new NullableRelationshipToUserData() .id(USER_DATA_ID) .type(UsersType.USERS))))); try { IncidentResponse result = apiInstance.updateIncident(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Remove commander from an incident returns "OK" response ``` // Remove commander from an incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentResponse; import com.datadog.api.client.v2.model.IncidentType; import com.datadog.api.client.v2.model.IncidentUpdateData; import com.datadog.api.client.v2.model.IncidentUpdateRelationships; import com.datadog.api.client.v2.model.IncidentUpdateRequest; 
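// Passing a NullableRelationshipToUser whose data is null clears the incident commander, matching the "commander_user": { "data": null } request body shown earlier.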
import com.datadog.api.client.v2.model.NullableRelationshipToUser; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); IncidentUpdateRequest body = new IncidentUpdateRequest() .data( new IncidentUpdateData() .id(INCIDENT_DATA_ID) .type(IncidentType.INCIDENTS) .relationships( new IncidentUpdateRelationships() .commanderUser(new NullableRelationshipToUser().data(null)))); try { IncidentResponse result = apiInstance.updateIncident(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update an existing incident returns "OK" response ``` // Update an existing incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentFieldAttributes; import com.datadog.api.client.v2.model.IncidentFieldAttributesSingleValue; import com.datadog.api.client.v2.model.IncidentFieldAttributesSingleValueType; import com.datadog.api.client.v2.model.IncidentResponse; import com.datadog.api.client.v2.model.IncidentType; import com.datadog.api.client.v2.model.IncidentUpdateAttributes; import com.datadog.api.client.v2.model.IncidentUpdateData; import com.datadog.api.client.v2.model.IncidentUpdateRequest; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ATTRIBUTES_TITLE = System.getenv("INCIDENT_DATA_ATTRIBUTES_TITLE"); String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); IncidentUpdateRequest body = new IncidentUpdateRequest() .data( new IncidentUpdateData() .id(INCIDENT_DATA_ID) .type(IncidentType.INCIDENTS) .attributes( new IncidentUpdateAttributes() .fields( Map.ofEntries( Map.entry( "state", new IncidentFieldAttributes( new IncidentFieldAttributesSingleValue() .type( IncidentFieldAttributesSingleValueType.DROPDOWN) .value("resolved"))))) .title("A test incident title-updated"))); try { IncidentResponse result = apiInstance.updateIncident(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add 
commander to an incident returns "OK" response ``` """ Add commander to an incident returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_type import IncidentType from datadog_api_client.v2.model.incident_update_data import IncidentUpdateData from datadog_api_client.v2.model.incident_update_relationships import IncidentUpdateRelationships from datadog_api_client.v2.model.incident_update_request import IncidentUpdateRequest from datadog_api_client.v2.model.nullable_relationship_to_user import NullableRelationshipToUser from datadog_api_client.v2.model.nullable_relationship_to_user_data import NullableRelationshipToUserData from datadog_api_client.v2.model.users_type import UsersType # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = IncidentUpdateRequest( data=IncidentUpdateData( id=INCIDENT_DATA_ID, type=IncidentType.INCIDENTS, relationships=IncidentUpdateRelationships( commander_user=NullableRelationshipToUser( data=NullableRelationshipToUserData( id=USER_DATA_ID, type=UsersType.USERS, ), ), ), ), ) configuration = Configuration() configuration.unstable_operations["update_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy ##### Remove commander from an incident returns "OK" response ``` """ Remove commander from an incident returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_type import IncidentType from datadog_api_client.v2.model.incident_update_data import IncidentUpdateData from datadog_api_client.v2.model.incident_update_relationships import IncidentUpdateRelationships from datadog_api_client.v2.model.incident_update_request import IncidentUpdateRequest from datadog_api_client.v2.model.nullable_relationship_to_user import NullableRelationshipToUser # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] body = IncidentUpdateRequest( data=IncidentUpdateData( id=INCIDENT_DATA_ID, type=IncidentType.INCIDENTS, relationships=IncidentUpdateRelationships( commander_user=NullableRelationshipToUser( data=None, ), ), ), ) configuration = Configuration() configuration.unstable_operations["update_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy ##### Update an existing incident returns "OK" response ``` """ Update an existing incident returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_field_attributes_single_value import IncidentFieldAttributesSingleValue from datadog_api_client.v2.model.incident_field_attributes_single_value_type import ( IncidentFieldAttributesSingleValueType, ) from datadog_api_client.v2.model.incident_type import IncidentType from datadog_api_client.v2.model.incident_update_attributes import IncidentUpdateAttributes from 
datadog_api_client.v2.model.incident_update_data import IncidentUpdateData from datadog_api_client.v2.model.incident_update_request import IncidentUpdateRequest # there is a valid "incident" in the system INCIDENT_DATA_ATTRIBUTES_TITLE = environ["INCIDENT_DATA_ATTRIBUTES_TITLE"] INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] body = IncidentUpdateRequest( data=IncidentUpdateData( id=INCIDENT_DATA_ID, type=IncidentType.INCIDENTS, attributes=IncidentUpdateAttributes( fields=dict( state=IncidentFieldAttributesSingleValue( type=IncidentFieldAttributesSingleValueType.DROPDOWN, value="resolved", ), ), title="A test incident title-updated", ), ), ) configuration = Configuration() configuration.unstable_operations["update_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add commander to an incident returns "OK" response ``` # Add commander to an incident returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::IncidentUpdateRequest.new({ data: DatadogAPIClient::V2::IncidentUpdateData.new({ id: INCIDENT_DATA_ID, type: DatadogAPIClient::V2::IncidentType::INCIDENTS, relationships: DatadogAPIClient::V2::IncidentUpdateRelationships.new({ commander_user: DatadogAPIClient::V2::NullableRelationshipToUser.new({ data: DatadogAPIClient::V2::NullableRelationshipToUserData.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::UsersType::USERS, }), }), }), }), }) p api_instance.update_incident(INCIDENT_DATA_ID, body) ``` Copy ##### Remove commander from an incident returns "OK" response ``` # Remove commander from an incident returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] body = DatadogAPIClient::V2::IncidentUpdateRequest.new({ data: DatadogAPIClient::V2::IncidentUpdateData.new({ id: INCIDENT_DATA_ID, type: DatadogAPIClient::V2::IncidentType::INCIDENTS, relationships: DatadogAPIClient::V2::IncidentUpdateRelationships.new({ commander_user: DatadogAPIClient::V2::NullableRelationshipToUser.new({ data: nil, }), }), }), }) p api_instance.update_incident(INCIDENT_DATA_ID, body) ``` Copy ##### Update an existing incident returns "OK" response ``` # Update an existing incident returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ATTRIBUTES_TITLE = ENV["INCIDENT_DATA_ATTRIBUTES_TITLE"] INCIDENT_DATA_ID = 
ENV["INCIDENT_DATA_ID"] body = DatadogAPIClient::V2::IncidentUpdateRequest.new({ data: DatadogAPIClient::V2::IncidentUpdateData.new({ id: INCIDENT_DATA_ID, type: DatadogAPIClient::V2::IncidentType::INCIDENTS, attributes: DatadogAPIClient::V2::IncidentUpdateAttributes.new({ fields: { state: DatadogAPIClient::V2::IncidentFieldAttributesSingleValue.new({ type: DatadogAPIClient::V2::IncidentFieldAttributesSingleValueType::DROPDOWN, value: "resolved", }), }, title: "A test incident title-updated", }), }), }) p api_instance.update_incident(INCIDENT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add commander to an incident returns "OK" response ``` // Add commander to an incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::UpdateIncidentOptionalParams; use datadog_api_client::datadogV2::model::IncidentType; use datadog_api_client::datadogV2::model::IncidentUpdateData; use datadog_api_client::datadogV2::model::IncidentUpdateRelationships; use datadog_api_client::datadogV2::model::IncidentUpdateRequest; use datadog_api_client::datadogV2::model::NullableRelationshipToUser; use datadog_api_client::datadogV2::model::NullableRelationshipToUserData; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = IncidentUpdateRequest::new( IncidentUpdateData::new(incident_data_id.clone(), IncidentType::INCIDENTS).relationships( IncidentUpdateRelationships::new().commander_user(Some( NullableRelationshipToUser::new(Some(NullableRelationshipToUserData::new( user_data_id.clone(), UsersType::USERS, ))), )), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident( incident_data_id.clone(), body, UpdateIncidentOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Remove commander from an incident returns "OK" response ``` // Remove commander from an incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::UpdateIncidentOptionalParams; use datadog_api_client::datadogV2::model::IncidentType; use datadog_api_client::datadogV2::model::IncidentUpdateData; use datadog_api_client::datadogV2::model::IncidentUpdateRelationships; use datadog_api_client::datadogV2::model::IncidentUpdateRequest; use datadog_api_client::datadogV2::model::NullableRelationshipToUser; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let body = IncidentUpdateRequest::new( IncidentUpdateData::new(incident_data_id.clone(), IncidentType::INCIDENTS).relationships( 
IncidentUpdateRelationships::new() .commander_user(Some(NullableRelationshipToUser::new(None))), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident( incident_data_id.clone(), body, UpdateIncidentOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update an existing incident returns "OK" response ``` // Update an existing incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::UpdateIncidentOptionalParams; use datadog_api_client::datadogV2::model::IncidentFieldAttributes; use datadog_api_client::datadogV2::model::IncidentFieldAttributesSingleValue; use datadog_api_client::datadogV2::model::IncidentFieldAttributesSingleValueType; use datadog_api_client::datadogV2::model::IncidentType; use datadog_api_client::datadogV2::model::IncidentUpdateAttributes; use datadog_api_client::datadogV2::model::IncidentUpdateData; use datadog_api_client::datadogV2::model::IncidentUpdateRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let body = IncidentUpdateRequest::new( IncidentUpdateData::new(incident_data_id.clone(), IncidentType::INCIDENTS).attributes( IncidentUpdateAttributes::new() .fields(BTreeMap::from([( "state".to_string(), IncidentFieldAttributes::IncidentFieldAttributesSingleValue(Box::new( IncidentFieldAttributesSingleValue::new() .type_(IncidentFieldAttributesSingleValueType::DROPDOWN) .value(Some("resolved".to_string())), )), )])) .title("A test incident title-updated".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident( incident_data_id.clone(), body, UpdateIncidentOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add commander to an incident returns "OK" response ``` /** * Add commander to an incident returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentRequest = { body: { data: { id: INCIDENT_DATA_ID, type: "incidents", relationships: { commanderUser: { data: { id: USER_DATA_ID, type: "users", }, }, }, }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .updateIncident(params) .then((data: 
v2.IncidentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Remove commander from an incident returns "OK" response ``` /** * Remove commander from an incident returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentRequest = { body: { data: { id: INCIDENT_DATA_ID, type: "incidents", relationships: { commanderUser: { data: null, }, }, }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .updateIncident(params) .then((data: v2.IncidentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update an existing incident returns "OK" response ``` /** * Update an existing incident returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentRequest = { body: { data: { id: INCIDENT_DATA_ID, type: "incidents", attributes: { fields: { state: { type: "dropdown", value: "resolved", }, }, title: "A test incident title-updated", }, }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .updateIncident(params) .then((data: v2.IncidentResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing incident](https://docs.datadoghq.com/api/latest/incidents/#delete-an-existing-incident) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-existing-incident-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id} ### Overview Deletes an existing incident from the users organization. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. 
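A successful delete returns a 204 status with no body; the statuses listed under Response below surface as errors in the client libraries (the Java example catches `ApiException`, the Go example checks `err`). A minimal Python sketch of deleting an incident and tolerating a missing ID, assuming the Python client raises an `ApiException` with a `status` attribute from `datadog_api_client.exceptions`:

```
"""
Delete an incident and handle a missing ID (sketch).

Assumes the HTTP error statuses below are raised as `ApiException`
(with a `status` attribute) from `datadog_api_client.exceptions`.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException  # assumed location
from datadog_api_client.v2.api.incidents_api import IncidentsApi

INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_incident"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    try:
        # A successful delete returns 204 with no response body.
        api_instance.delete_incident(incident_id=INCIDENT_DATA_ID)
    except ApiException as e:
        if e.status == 404:
            print("Incident {} not found".format(INCIDENT_DATA_ID))
        else:
            raise
```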
### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncident-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Delete an existing incident Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing incident ``` """ Delete an existing incident returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_incident"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) api_instance.delete_incident( incident_id=INCIDENT_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing incident ``` # Delete an existing incident returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] api_instance.delete_incident(INCIDENT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an existing incident ``` // Delete an existing incident returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncident", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncident(ctx, IncidentDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full 
HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing incident ``` // Delete an existing incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncident", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); try { apiInstance.deleteIncident(INCIDENT_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing incident ``` // Delete an existing incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncident", true); let api = IncidentsAPI::with_config(configuration); let resp = api.delete_incident(incident_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing incident ``` /** * Delete an existing incident returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncident"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiDeleteIncidentRequest = { incidentId: INCIDENT_DATA_ID, }; apiInstance .deleteIncident(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of incidents](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incidents) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incidents-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidentshttps://api.ap2.datadoghq.com/api/v2/incidentshttps://api.datadoghq.eu/api/v2/incidentshttps://api.ddog-gov.com/api/v2/incidentshttps://api.datadoghq.com/api/v2/incidentshttps://api.us3.datadoghq.com/api/v2/incidentshttps://api.us5.datadoghq.com/api/v2/incidents ### Overview Get all incidents for the user’s organization. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Query Strings Name Type Description include array Specifies which types of related objects should be included in the response. page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a list of incidents. Field Type Description data [_required_] [object] An array of incidents. attributes object The incident's attributes from a response. archived date-time Timestamp of when the incident was archived. case_id int64 The incident case id. created date-time Timestamp when the incident was created. customer_impact_duration int64 Length of the incident's customer impact in seconds. Equals the difference between `customer_impact_start` and `customer_impact_end`. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. declared date-time Timestamp when the incident was declared. declared_by object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. declared_by_uuid string UUID of the user who declared the incident. detected date-time Timestamp when the incident was detected. 
fields object A condensed view of the user-defined fields attached to incidents. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. is_test boolean A flag indicating whether the incident is a test incident. modified date-time Timestamp when the incident was last modified. non_datadog_creator object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. public_id int64 The monotonically increasing integer ID for the incident. resolved date-time Timestamp when the incident's state was last changed from active or stable to resolved or completed. severity enum The incident severity. Allowed enum values: `UNKNOWN,SEV-0,SEV-1,SEV-2,SEV-3,SEV-4,SEV-5` state string The state incident. time_to_detect int64 The amount of time in seconds to detect the incident. Equals the difference between `customer_impact_start` and `detected`. time_to_internal_response int64 The amount of time in seconds to call incident after detection. Equals the difference of `detected` and `created`. time_to_repair int64 The amount of time in seconds to resolve customer impact after detecting the issue. Equals the difference between `customer_impact_end` and `detected`. time_to_resolve int64 The amount of time in seconds to resolve the incident after it was created. Equals the difference between `created` and `resolved`. title [_required_] string The title of the incident, which summarizes what happened. visibility string The incident visibility status. id [_required_] string The incident's ID. relationships object The incident's relationships from a response. attachments object A relationship reference for attachments. data [_required_] [object] An array of incident attachments. id [_required_] string A unique identifier that represents the attachment. type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` declared_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` impacts object Relationship to impacts. data [_required_] [object] An array of incident impacts. id [_required_] string A unique identifier that represents the impact. type [_required_] enum The incident impacts type. Allowed enum values: `incident_impacts` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` responders object Relationship to incident responders. data [_required_] [object] An array of incident responders. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident responders type. Allowed enum values: `incident_responders` user_defined_fields object Relationship to incident user defined fields. data [_required_] [object] An array of user defined fields. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident user defined fields type. Allowed enum values: `user_defined_field` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
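`meta.pagination` describes how to walk the full list: `next_offset` is the current offset plus the page size, and `page[size]` is capped at 100. A minimal Python sketch of paging through every incident, assuming the client maps the `page[size]` and `page[offset]` query strings to `page_size` and `page_offset` keyword arguments:

```
"""
Paginate through all incidents (sketch).

Assumes `page[size]` and `page[offset]` are exposed as the
`page_size` and `page_offset` keyword arguments.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

PAGE_SIZE = 100  # maximum allowed value for page[size]

configuration = Configuration()
configuration.unstable_operations["list_incidents"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    offset = 0
    incidents = []
    while True:
        response = api_instance.list_incidents(page_size=PAGE_SIZE, page_offset=offset)
        page = response.data
        incidents.extend(page)
        if len(page) < PAGE_SIZE:
            break  # short page: no further results
        offset += PAGE_SIZE  # equivalent to following meta.pagination.next_offset
    print(len(incidents))
```

Stopping on a short page avoids one extra empty request; an equivalent loop can instead read `meta.pagination.next_offset` from each response to compute the next offset.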
``` { "data": [ { "attributes": { "archived": "2019-09-19T10:00:00.000Z", "case_id": "integer", "created": "2019-09-19T10:00:00.000Z", "customer_impact_duration": "integer", "customer_impact_end": "2019-09-19T10:00:00.000Z", "customer_impact_scope": "An example customer impact scope", "customer_impact_start": "2019-09-19T10:00:00.000Z", "customer_impacted": false, "declared": "2019-09-19T10:00:00.000Z", "declared_by": { "image_48_px": "string", "name": "string" }, "declared_by_uuid": "string", "detected": "2019-09-19T10:00:00.000Z", "fields": { "": "undefined" }, "incident_type_uuid": "00000000-0000-0000-0000-000000000000", "is_test": false, "modified": "2019-09-19T10:00:00.000Z", "non_datadog_creator": { "image_48_px": "string", "name": "string" }, "notification_handles": [ { "display_name": "Jane Doe", "handle": "@test.user@test.com" } ], "public_id": 1, "resolved": "2019-09-19T10:00:00.000Z", "severity": "UNKNOWN", "state": "string", "time_to_detect": "integer", "time_to_internal_response": "integer", "time_to_repair": "integer", "time_to_resolve": "integer", "title": "A test incident title", "visibility": "string" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "attachments": { "data": [ { "id": "00000000-0000-abcd-1000-000000000000", "type": "incident_attachments" } ] }, "commander_user": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } }, "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "declared_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "impacts": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_impacts" } ] }, "integrations": { "data": [ { "id": "00000000-abcd-0001-0000-000000000000", "type": "incident_integrations" } ] }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "responders": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_responders" } ] }, "user_defined_fields": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "user_defined_field" } ] } }, "type": "incidents" } ], "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Get a list of incidents Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of incidents ``` """ Get a list of incidents returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi configuration = Configuration() configuration.unstable_operations["list_incidents"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.list_incidents() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of incidents ``` # Get a list of incidents returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incidents".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.list_incidents() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of incidents ``` // Get a list of incidents returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidents", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidents(ctx, *datadogV2.NewListIncidentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of incidents ``` // Get a list of incidents returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidents", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentsResponse result = apiInstance.listIncidents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of incidents ``` // Get a list of incidents returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidents", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incidents(ListIncidentsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of incidents ``` /** * Get a list of incidents returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidents"] = true; const apiInstance = new v2.IncidentsApi(configuration); apiInstance .listIncidents() .then((data: v2.IncidentsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search for incidents](https://docs.datadoghq.com/api/latest/incidents/#search-for-incidents) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#search-for-incidents-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/searchhttps://api.ap2.datadoghq.com/api/v2/incidents/searchhttps://api.datadoghq.eu/api/v2/incidents/searchhttps://api.ddog-gov.com/api/v2/incidents/searchhttps://api.datadoghq.com/api/v2/incidents/searchhttps://api.us3.datadoghq.com/api/v2/incidents/searchhttps://api.us5.datadoghq.com/api/v2/incidents/search ### Overview Search for incidents matching a certain query. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Query Strings Name Type Description include enum Specifies which types of related objects should be included in the response. Allowed enum values: `users, attachments` query [_required_] string Specifies which incidents should be returned. The query can contain any number of incident facets joined by `ANDs`, along with multiple values for each of those facets joined by `OR`s. For example: `state:active AND severity:(SEV-2 OR SEV-1)`. sort enum Specifies the order of returned incidents. Allowed enum values: `created, -created` page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#SearchIncidents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with incidents and facets. Field Type Description data [_required_] object Data returned by an incident search. attributes object Attributes returned by an incident search. facets [_required_] object Facet data for incidents returned by a search query. commander [object] Facet data for incident commander users. count int32 Count of the facet value appearing in search results. email string Email of the user. handle string Handle of the user. name string Name of the user. uuid string ID of the user. created_by [object] Facet data for incident creator users. count int32 Count of the facet value appearing in search results. email string Email of the user. handle string Handle of the user. name string Name of the user. uuid string ID of the user. 
fields [object] Facet data for incident property fields. aggregates object Aggregate information for numeric incident data. max double Maximum value of the numeric aggregates. min double Minimum value of the numeric aggregates. facets [_required_] [object] Facet data for the property field of an incident. count int32 Count of the facet value appearing in search results. name string The facet value appearing in search results. name [_required_] string Name of the incident property field. impact [object] Facet data for incident impact attributes. count int32 Count of the facet value appearing in search results. name string The facet value appearing in search results. last_modified_by [object] Facet data for incident last modified by users. count int32 Count of the facet value appearing in search results. email string Email of the user. handle string Handle of the user. name string Name of the user. uuid string ID of the user. postmortem [object] Facet data for incident postmortem existence. count int32 Count of the facet value appearing in search results. name string The facet value appearing in search results. responder [object] Facet data for incident responder users. count int32 Count of the facet value appearing in search results. email string Email of the user. handle string Handle of the user. name string Name of the user. uuid string ID of the user. severity [object] Facet data for incident severity attributes. count int32 Count of the facet value appearing in search results. name string The facet value appearing in search results. state [object] Facet data for incident state attributes. count int32 Count of the facet value appearing in search results. name string The facet value appearing in search results. time_to_repair [object] Facet data for incident time to repair metrics. aggregates [_required_] object Aggregate information for numeric incident data. max double Maximum value of the numeric aggregates. min double Minimum value of the numeric aggregates. name [_required_] string Name of the incident property field. time_to_resolve [object] Facet data for incident time to resolve metrics. aggregates [_required_] object Aggregate information for numeric incident data. max double Maximum value of the numeric aggregates. min double Minimum value of the numeric aggregates. name [_required_] string Name of the incident property field. incidents [_required_] [object] Incidents returned by the search. data [_required_] object Incident data from a response. attributes object The incident's attributes from a response. archived date-time Timestamp of when the incident was archived. case_id int64 The incident case id. created date-time Timestamp when the incident was created. customer_impact_duration int64 Length of the incident's customer impact in seconds. Equals the difference between `customer_impact_start` and `customer_impact_end`. customer_impact_end date-time Timestamp when customers were no longer impacted by the incident. customer_impact_scope string A summary of the impact customers experienced during the incident. customer_impact_start date-time Timestamp when customers began being impacted by the incident. customer_impacted boolean A flag indicating whether the incident caused customer impact. declared date-time Timestamp when the incident was declared. declared_by object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. declared_by_uuid string UUID of the user who declared the incident. 
detected date-time Timestamp when the incident was detected. fields object A condensed view of the user-defined fields attached to incidents. Dynamic fields for which selections can be made, with field names as keys. Option 1 object A field with a single value selected. type enum Type of the single value field definitions. Allowed enum values: `dropdown,textbox` default: `dropdown` value string The single value selected for this field. Option 2 object A field with potentially multiple values selected. type enum Type of the multiple value field definitions. Allowed enum values: `multiselect,textarray,metrictag,autocomplete` default: `multiselect` value [string] The multiple values selected for this field. incident_type_uuid string A unique identifier that represents an incident type. is_test boolean A flag indicating whether the incident is a test incident. modified date-time Timestamp when the incident was last modified. non_datadog_creator object Incident's non Datadog creator. image_48_px string Non Datadog creator `48px` image. name string Non Datadog creator name. notification_handles [object] Notification handles that will be notified of the incident during update. display_name string The name of the notified handle. handle string The handle used for the notification. This includes an email address, Slack channel, or workflow. public_id int64 The monotonically increasing integer ID for the incident. resolved date-time Timestamp when the incident's state was last changed from active or stable to resolved or completed. severity enum The incident severity. Allowed enum values: `UNKNOWN,SEV-0,SEV-1,SEV-2,SEV-3,SEV-4,SEV-5` state string The state incident. time_to_detect int64 The amount of time in seconds to detect the incident. Equals the difference between `customer_impact_start` and `detected`. time_to_internal_response int64 The amount of time in seconds to call incident after detection. Equals the difference of `detected` and `created`. time_to_repair int64 The amount of time in seconds to resolve customer impact after detecting the issue. Equals the difference between `customer_impact_end` and `detected`. time_to_resolve int64 The amount of time in seconds to resolve the incident after it was created. Equals the difference between `created` and `resolved`. title [_required_] string The title of the incident, which summarizes what happened. visibility string The incident visibility status. id [_required_] string The incident's ID. relationships object The incident's relationships from a response. attachments object A relationship reference for attachments. data [_required_] [object] An array of incident attachments. id [_required_] string A unique identifier that represents the attachment. type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` commander_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` declared_by_user object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` impacts object Relationship to impacts. data [_required_] [object] An array of incident impacts. id [_required_] string A unique identifier that represents the impact. type [_required_] enum The incident impacts type. Allowed enum values: `incident_impacts` integrations object A relationship reference for multiple integration metadata objects. data [_required_] [object] Integration metadata relationship array id [_required_] string A unique identifier that represents the integration metadata. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` responders object Relationship to incident responders. data [_required_] [object] An array of incident responders. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident responders type. Allowed enum values: `incident_responders` user_defined_fields object Relationship to incident user defined fields. data [_required_] [object] An array of user defined fields. id [_required_] string A unique identifier that represents the responder. type [_required_] enum The incident user defined fields type. Allowed enum values: `user_defined_field` type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` total [_required_] int32 Number of incidents returned by the search. type enum Incident search result type. Allowed enum values: `incidents_search_results` default: `incidents_search_results` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
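Because `query` is a plain string of facets joined by `AND`, with `OR` groups per facet, calling the endpoint from a client is mostly a matter of building that string. A minimal Python sketch, assuming the operation is exposed as `search_incidents` and gated behind the `search_incidents` unstable-operation flag, in line with the list and update examples above:

```
"""
Search for incidents matching a facet query (sketch).

Assumes the operation is exposed as `search_incidents` with a
required `query` argument.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

configuration = Configuration()
configuration.unstable_operations["search_incidents"] = True

with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    # Active SEV-1 or SEV-2 incidents, as in the query example above.
    response = api_instance.search_incidents(
        query="state:active AND severity:(SEV-2 OR SEV-1)",
    )
    # data.attributes.incidents holds the matches;
    # data.attributes.facets holds the aggregated facet counts.
    print(response)
```

Results can be ordered with the `sort` query string (`created` or `-created`) and paged with `page[size]`/`page[offset]`, using the same pagination pattern shown for the list endpoint.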
``` { "data": { "attributes": { "facets": { "commander": [ { "count": 5, "email": "datadog.user@example.com", "handle": "@datadog.user@example.com", "name": "Datadog User", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } ], "created_by": [ { "count": 5, "email": "datadog.user@example.com", "handle": "@datadog.user@example.com", "name": "Datadog User", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } ], "fields": [ { "aggregates": { "max": 1234, "min": 20 }, "facets": [ { "count": 5, "name": "SEV-2" } ], "name": "Severity" } ], "impact": [ { "count": 5, "name": "SEV-2" } ], "last_modified_by": [ { "count": 5, "email": "datadog.user@example.com", "handle": "@datadog.user@example.com", "name": "Datadog User", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } ], "postmortem": [ { "count": 5, "name": "SEV-2" } ], "responder": [ { "count": 5, "email": "datadog.user@example.com", "handle": "@datadog.user@example.com", "name": "Datadog User", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } ], "severity": [ { "count": 5, "name": "SEV-2" } ], "state": [ { "count": 5, "name": "SEV-2" } ], "time_to_repair": [ { "aggregates": { "max": 1234, "min": 20 }, "name": "time_to_repair" } ], "time_to_resolve": [ { "aggregates": { "max": 1234, "min": 20 }, "name": "time_to_repair" } ] }, "incidents": [ { "data": { "attributes": { "archived": "2019-09-19T10:00:00.000Z", "case_id": "integer", "created": "2019-09-19T10:00:00.000Z", "customer_impact_duration": "integer", "customer_impact_end": "2019-09-19T10:00:00.000Z", "customer_impact_scope": "An example customer impact scope", "customer_impact_start": "2019-09-19T10:00:00.000Z", "customer_impacted": false, "declared": "2019-09-19T10:00:00.000Z", "declared_by": { "image_48_px": "string", "name": "string" }, "declared_by_uuid": "string", "detected": "2019-09-19T10:00:00.000Z", "fields": { "": "undefined" }, "incident_type_uuid": "00000000-0000-0000-0000-000000000000", "is_test": false, "modified": "2019-09-19T10:00:00.000Z", "non_datadog_creator": { "image_48_px": "string", "name": "string" }, "notification_handles": [ { "display_name": "Jane Doe", "handle": "@test.user@test.com" } ], "public_id": 1, "resolved": "2019-09-19T10:00:00.000Z", "severity": "UNKNOWN", "state": "string", "time_to_detect": "integer", "time_to_internal_response": "integer", "time_to_repair": "integer", "time_to_resolve": "integer", "title": "A test incident title", "visibility": "string" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "attachments": { "data": [ { "id": "00000000-0000-abcd-1000-000000000000", "type": "incident_attachments" } ] }, "commander_user": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } }, "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "declared_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "impacts": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_impacts" } ] }, "integrations": { "data": [ { "id": "00000000-abcd-0001-0000-000000000000", "type": "incident_integrations" } ] }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "responders": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "incident_responders" } ] }, "user_defined_fields": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "user_defined_field" } ] } }, "type": "incidents" } } ], "total": 10 }, "type": "incidents_search_results" }, "included": 
[ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Search for incidents Copy ``` # Required query arguments export query="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/search?query=${query}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search for incidents ``` """ Search for incidents returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi configuration = Configuration() configuration.unstable_operations["search_incidents"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.search_incidents( query="state:(active OR stable OR resolved)", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search for incidents ``` # 
Search for incidents returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.search_incidents".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.search_incidents("state:(active OR stable OR resolved)") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search for incidents ``` // Search for incidents returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.SearchIncidents", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.SearchIncidents(ctx, "state:(active OR stable OR resolved)", *datadogV2.NewSearchIncidentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.SearchIncidents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.SearchIncidents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search for incidents ``` // Search for incidents returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentSearchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.searchIncidents", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentSearchResponse result = apiInstance.searchIncidents("state:(active OR stable OR resolved)"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#searchIncidents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search for incidents ``` // Search for incidents returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use 
datadog_api_client::datadogV2::api_incidents::SearchIncidentsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.SearchIncidents", true); let api = IncidentsAPI::with_config(configuration); let resp = api .search_incidents( "state:(active OR stable OR resolved)".to_string(), SearchIncidentsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search for incidents ``` /** * Search for incidents returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.searchIncidents"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiSearchIncidentsRequest = { query: "state:(active OR stable OR resolved)", }; apiInstance .searchIncidents(params) .then((data: v2.IncidentSearchResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List an incident's impacts](https://docs.datadoghq.com/api/latest/incidents/#list-an-incidents-impacts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#list-an-incidents-impacts-v2) GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/impactshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/impactshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/impacts ### Overview Get all impacts for an incident. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description include array Specifies which related resources should be included in the response. 
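None of the generated client examples below pass the `include` query string. As a rough sketch of the raw request with it, assuming the US1 site (`api.datadoghq.com`); the relationship name given to `include` is only an illustration and is not confirmed by this page:

```
import os

import requests

# Hypothetical environment variable mirroring the client examples below.
incident_id = os.environ["INCIDENT_DATA_ID"]

resp = requests.get(
    f"https://api.datadoghq.com/api/v2/incidents/{incident_id}/impacts",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    # Assumption: relationship names such as this one are accepted by `include`.
    params={"include": "created_by_user"},
)
resp.raise_for_status()
for impact in resp.json()["data"]:
    print(impact["id"], impact["attributes"]["description"])
```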
### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentImpacts-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a list of incident impacts. Field Type Description data [_required_] [object] An array of incident impacts. attributes object The incident impact's attributes. created date-time Timestamp when the impact was created. description [_required_] string Description of the impact. end_at date-time Timestamp when the impact ended. fields object An object mapping impact field names to field values. impact_type string The type of impact. modified date-time Timestamp when the impact was last modified. start_at [_required_] date-time Timestamp representing when the impact started. id [_required_] string The incident impact's ID. relationships object The incident impact's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident object Relationship to incident. data [_required_] object Relationship to incident object. id [_required_] string A unique identifier that represents the incident. type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident impact resource type. Allowed enum values: `incident_impacts` default: `incident_impacts` included [object] Included related resources that the user requested. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": [ { "attributes": { "created": "2025-08-29T13:17:00Z", "description": "Service was unavailable for external users", "end_at": "2025-08-29T13:17:00Z", "fields": { "customers_impacted": "all", "products_impacted": [ "shopping", "marketing" ] }, "impact_type": "customer", "modified": "2025-08-29T13:17:00Z", "start_at": "2025-08-28T13:17:00Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident": { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_impacts" } ], "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### List an incident's impacts Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/impacts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List an incident's impacts ``` """ List an incident's impacts returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] configuration = Configuration() configuration.unstable_operations["list_incident_impacts"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.list_incident_impacts( incident_id=INCIDENT_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List an incident's impacts ``` # List an incident's impacts returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_impacts".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] p api_instance.list_incident_impacts(INCIDENT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List an incident's impacts ``` // List an incident's impacts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentImpacts", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentImpacts(ctx, IncidentDataID, 
*datadogV2.NewListIncidentImpactsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentImpacts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentImpacts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List an incident's impacts ``` // List an incident's impacts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentImpactsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentImpacts", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); try { IncidentImpactsResponse result = apiInstance.listIncidentImpacts(INCIDENT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentImpacts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List an incident's impacts ``` // List an incident's impacts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentImpactsOptionalParams; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentImpacts", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_impacts( incident_data_id.clone(), ListIncidentImpactsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List an incident's impacts ``` /** * List an incident's impacts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.listIncidentImpacts"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiListIncidentImpactsRequest = { incidentId: INCIDENT_DATA_ID, }; apiInstance .listIncidentImpacts(params) .then((data: v2.IncidentImpactsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an incident impact](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-impact) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-impact-v2) POST https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/impactshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/impactshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/impactshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/impacts ### Overview Create an impact for an incident. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description include array Specifies which related resources should be included in the response. ### Request #### Body Data (required) Incident impact payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident impact data for a create request. attributes [_required_] object The incident impact's attributes for a create request. description [_required_] string Description of the impact. end_at date-time Timestamp when the impact ended. fields object An object mapping impact field names to field values. start_at [_required_] date-time Timestamp when the impact started. type [_required_] enum Incident impact resource type. 
Allowed enum values: `incident_impacts` default: `incident_impacts` ``` { "data": { "attributes": { "description": "Service was unavailable for external users", "end_at": "2025-08-29T13:17:00Z", "fields": { "customers_impacted": "all", "products_impacted": [ "shopping", "marketing" ] }, "start_at": "2025-08-28T13:17:00Z" }, "type": "incident_impacts" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentImpact-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident impact. Field Type Description data [_required_] object Incident impact data from a response. attributes object The incident impact's attributes. created date-time Timestamp when the impact was created. description [_required_] string Description of the impact. end_at date-time Timestamp when the impact ended. fields object An object mapping impact field names to field values. impact_type string The type of impact. modified date-time Timestamp when the impact was last modified. start_at [_required_] date-time Timestamp representing when the impact started. id [_required_] string The incident impact's ID. relationships object The incident impact's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident object Relationship to incident. data [_required_] object Relationship to incident object. id [_required_] string A unique identifier that represents the incident. type [_required_] enum Incident resource type. Allowed enum values: `incidents` default: `incidents` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Incident impact resource type. Allowed enum values: `incident_impacts` default: `incident_impacts` included [object] Included related resources that the user requested. attributes object Attributes of user object returned by the API. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. uuid string UUID of the user. id string ID of the user. type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2025-08-29T13:17:00Z", "description": "Service was unavailable for external users", "end_at": "2025-08-29T13:17:00Z", "fields": { "customers_impacted": "all", "products_impacted": [ "shopping", "marketing" ] }, "impact_type": "customer", "modified": "2025-08-29T13:17:00Z", "start_at": "2025-08-28T13:17:00Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident": { "data": { "id": "00000000-0000-0000-1234-000000000000", "type": "incidents" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_impacts" }, "included": [ { "attributes": { "email": "string", "handle": "string", "icon": "string", "name": "string", "uuid": "string" }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Create an incident impact Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/impacts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Service was unavailable for external users", "start_at": "2025-08-28T13:17:00Z" }, "type": "incident_impacts" } } EOF ``` ##### Create an incident impact ``` """ Create an incident impact returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_impact_create_attributes import IncidentImpactCreateAttributes from datadog_api_client.v2.model.incident_impact_create_data import IncidentImpactCreateData from datadog_api_client.v2.model.incident_impact_create_request import IncidentImpactCreateRequest from datadog_api_client.v2.model.incident_impact_type import IncidentImpactType from datetime import datetime from dateutil.tz import tzutc # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] body = IncidentImpactCreateRequest( data=IncidentImpactCreateData( type=IncidentImpactType.INCIDENT_IMPACTS, attributes=IncidentImpactCreateAttributes( start_at=datetime(2025, 9, 12, 13, 50, tzinfo=tzutc()), end_at=datetime(2025, 9, 12, 14, 50, tzinfo=tzutc()), description="Outage in the us-east-1 region", ), ), ) configuration = Configuration() configuration.unstable_operations["create_incident_impact"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_impact(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an incident impact ``` # Create an incident impact returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_impact".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] body = DatadogAPIClient::V2::IncidentImpactCreateRequest.new({ data: DatadogAPIClient::V2::IncidentImpactCreateData.new({ type: 
DatadogAPIClient::V2::IncidentImpactType::INCIDENT_IMPACTS, attributes: DatadogAPIClient::V2::IncidentImpactCreateAttributes.new({ start_at: "2025-09-12T13:50:00.000Z", end_at: "2025-09-12T14:50:00.000Z", description: "Outage in the us-east-1 region", }), }), }) p api_instance.create_incident_impact(INCIDENT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an incident impact ``` // Create an incident impact returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") body := datadogV2.IncidentImpactCreateRequest{ Data: datadogV2.IncidentImpactCreateData{ Type: datadogV2.INCIDENTIMPACTTYPE_INCIDENT_IMPACTS, Attributes: datadogV2.IncidentImpactCreateAttributes{ StartAt: time.Date(2025, 9, 12, 13, 50, 0, 0, time.UTC), EndAt: *datadog.NewNullableTime(datadog.PtrTime(time.Date(2025, 9, 12, 14, 50, 0, 0, time.UTC))), Description: "Outage in the us-east-1 region", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentImpact", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncidentImpact(ctx, IncidentDataID, body, *datadogV2.NewCreateIncidentImpactOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentImpact`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentImpact`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an incident impact ``` // Create an incident impact returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentImpactCreateAttributes; import com.datadog.api.client.v2.model.IncidentImpactCreateData; import com.datadog.api.client.v2.model.IncidentImpactCreateRequest; import com.datadog.api.client.v2.model.IncidentImpactResponse; import com.datadog.api.client.v2.model.IncidentImpactType; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentImpact", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); IncidentImpactCreateRequest body = new 
IncidentImpactCreateRequest() .data( new IncidentImpactCreateData() .type(IncidentImpactType.INCIDENT_IMPACTS) .attributes( new IncidentImpactCreateAttributes() .startAt(OffsetDateTime.parse("2025-09-12T13:50:00.000Z")) .endAt(OffsetDateTime.parse("2025-09-12T14:50:00.000Z")) .description("Outage in the us-east-1 region"))); try { IncidentImpactResponse result = apiInstance.createIncidentImpact(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncidentImpact"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an incident impact ``` // Create an incident impact returns "CREATED" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::CreateIncidentImpactOptionalParams; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentImpactCreateAttributes; use datadog_api_client::datadogV2::model::IncidentImpactCreateData; use datadog_api_client::datadogV2::model::IncidentImpactCreateRequest; use datadog_api_client::datadogV2::model::IncidentImpactType; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let body = IncidentImpactCreateRequest::new(IncidentImpactCreateData::new( IncidentImpactCreateAttributes::new( "Outage in the us-east-1 region".to_string(), DateTime::parse_from_rfc3339("2025-09-12T13:50:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .end_at(Some( DateTime::parse_from_rfc3339("2025-09-12T14:50:00+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), )), IncidentImpactType::INCIDENT_IMPACTS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentImpact", true); let api = IncidentsAPI::with_config(configuration); let resp = api .create_incident_impact( incident_data_id.clone(), body, CreateIncidentImpactOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an incident impact ``` /** * Create an incident impact returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentImpact"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: 
v2.IncidentsApiCreateIncidentImpactRequest = { body: { data: { type: "incident_impacts", attributes: { startAt: new Date(2025, 9, 12, 13, 50, 0, 0), endAt: new Date(2025, 9, 12, 14, 50, 0, 0), description: "Outage in the us-east-1 region", }, }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .createIncidentImpact(params) .then((data: v2.IncidentImpactResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an incident impact](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-impact) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-impact-v2) DELETE https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/impacts/{impact_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/impacts/{impact_id} ### Overview Delete an incident impact. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. impact_id [_required_] string The UUID of the incident impact. ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentImpact-204-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentImpact-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentImpact-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentImpact-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentImpact-429-v2) No Content Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Delete an incident impact Copy ``` # Path parameters export incident_id="CHANGE_ME" export impact_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/impacts/${impact_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an incident impact ``` """ Delete an incident impact returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # the "incident" has an "incident_impact" INCIDENT_IMPACT_DATA_ID = environ["INCIDENT_IMPACT_DATA_ID"] INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID = environ["INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_incident_impact"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) api_instance.delete_incident_impact( incident_id=INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID, impact_id=INCIDENT_IMPACT_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an incident impact ``` # Delete an incident impact returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_impact".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # the "incident" has an "incident_impact" INCIDENT_IMPACT_DATA_ID = ENV["INCIDENT_IMPACT_DATA_ID"] INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID = ENV["INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID"] api_instance.delete_incident_impact(INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID, INCIDENT_IMPACT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an incident impact ``` // Delete an incident impact returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" 
) func main() { // the "incident" has an "incident_impact" IncidentImpactDataID := os.Getenv("INCIDENT_IMPACT_DATA_ID") IncidentImpactDataRelationshipsIncidentDataID := os.Getenv("INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentImpact", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncidentImpact(ctx, IncidentImpactDataRelationshipsIncidentDataID, IncidentImpactDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentImpact`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an incident impact ``` // Delete an incident impact returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentImpact", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // the "incident" has an "incident_impact" String INCIDENT_IMPACT_DATA_ID = System.getenv("INCIDENT_IMPACT_DATA_ID"); String INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID = System.getenv("INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID"); try { apiInstance.deleteIncidentImpact( INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID, INCIDENT_IMPACT_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentImpact"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an incident impact ``` // Delete an incident impact returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // the "incident" has an "incident_impact" let incident_impact_data_id = std::env::var("INCIDENT_IMPACT_DATA_ID").unwrap(); let incident_impact_data_relationships_incident_data_id = std::env::var("INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentImpact", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_impact( incident_impact_data_relationships_incident_data_id.clone(), incident_impact_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else 
{ println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an incident impact ``` /** * Delete an incident impact returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentImpact"] = true; const apiInstance = new v2.IncidentsApi(configuration); // the "incident" has an "incident_impact" const INCIDENT_IMPACT_DATA_ID = process.env.INCIDENT_IMPACT_DATA_ID as string; const INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID = process.env .INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID as string; const params: v2.IncidentsApiDeleteIncidentImpactRequest = { incidentId: INCIDENT_IMPACT_DATA_RELATIONSHIPS_INCIDENT_DATA_ID, impactId: INCIDENT_IMPACT_DATA_ID, }; apiInstance .deleteIncidentImpact(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of an incident's integration metadata](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-integration-metadata) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-integration-metadata-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations ### Overview Get all integration metadata for an incident. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. 
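Before the response model, a raw-HTTP sketch of this call (US1 site assumed; `INCIDENT_DATA_ID` is an illustrative environment variable, mirroring the other examples on this page):

```
import os

import requests

incident_id = os.environ["INCIDENT_DATA_ID"]  # illustrative; any incident UUID works

resp = requests.get(
    f"https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()
for metadata in resp.json()["data"]:
    attrs = metadata.get("attributes", {})
    print(metadata["id"], attrs.get("integration_type"), attrs.get("status"))
```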
### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentIntegrations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a list of incident integration metadata. Field Type Description data [_required_] [object] An array of incident integration metadata. attributes object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. channel_id [_required_] string Slack channel ID. channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. id [_required_] string The incident integration metadata's ID. relationships object The incident's integration relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Integration metadata resource type. 
Allowed enum values: `incident_integrations` default: `incident_integrations` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
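Because `integration_type` and `status` come back as plain integers, it can help to translate them with the mappings documented above (1 for Slack, 8 for Jira; statuses 0 through 5). The sketch below is illustrative only: it assumes the response has already been parsed into a dict shaped like the JSON example that follows, and the lookup tables and `summarize` helper are not part of the API reference. Jira (`issues`) and Microsoft Teams (`teams`) metadata can be walked the same way using the fields listed above.

```
# Illustrative only: interpret the numeric codes in a parsed list-integrations response.
# The label maps mirror the field descriptions above; other values may exist.
INTEGRATION_TYPES = {1: "Slack", 8: "Jira"}
STATUSES = {
    0: "unknown",
    1: "pending",
    2: "complete",
    3: "manually created",
    4: "manually updated",
    5: "failed",
}


def summarize(response: dict) -> None:
    for item in response.get("data", []):
        attrs = item["attributes"]
        kind = INTEGRATION_TYPES.get(attrs["integration_type"], "other")
        state = STATUSES.get(attrs.get("status"), "unknown")
        print(f"{item['id']}: {kind} integration, status={state}")
        # Slack metadata carries a list of channels with redirect URLs;
        # Jira uses "issues" and Microsoft Teams uses "teams" instead.
        for channel in attrs["metadata"].get("channels", []):
            print(f"  {channel['channel_name']} -> {channel['redirect_url']}")

    # Pagination: next_offset equals the current offset plus the page size.
    pagination = response.get("meta", {}).get("pagination", {})
    if pagination:
        print("next page starts at offset", pagination.get("next_offset"))
```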
``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "incident_id": "00000000-aaaa-0000-0000-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#example-channel-name", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789\u0026team=T01234567", "team_id": "T01234567" } ] }, "modified": "2019-09-19T10:00:00.000Z", "status": "integer" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_integrations" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get a list of an incident's integration metadata

```
# Path parameters
export incident_id="CHANGE_ME"
# Curl command
curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/relationships/integrations" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a list of an incident's integration metadata

```
"""
Get a list of an incident's integration metadata returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["list_incident_integrations"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.list_incident_integrations(
        incident_id=INCIDENT_DATA_ID,
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a list of an incident's integration metadata

```
# Get a list of an incident's integration metadata returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.list_incident_integrations".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident" in the system
INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"]

p api_instance.list_incident_integrations(INCIDENT_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a list of an incident's integration metadata

```
// Get a list of an incident's integration metadata returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "incident" in the system
	IncidentDataID := os.Getenv("INCIDENT_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
configuration.SetUnstableOperationEnabled("v2.ListIncidentIntegrations", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentIntegrations(ctx, IncidentDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentIntegrations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentIntegrations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of an incident's integration metadata ``` // Get a list of an incident's integration metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentIntegrations", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); try { IncidentIntegrationMetadataListResponse result = apiInstance.listIncidentIntegrations(INCIDENT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentIntegrations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of an incident's integration metadata ``` // Get a list of an incident's integration metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentIntegrations", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_integrations(incident_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of an incident's integration metadata ``` /** * Get a list of an incident's integration metadata returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentIntegrations"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiListIncidentIntegrationsRequest = { incidentId: INCIDENT_DATA_ID, }; apiInstance .listIncidentIntegrations(params) .then((data: v2.IncidentIntegrationMetadataListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-integration-metadata) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-integration-metadata-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrationshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations ### Overview Create an incident integration metadata. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. ### Request #### Body Data (required) Incident integration metadata payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident integration metadata data for a create request. attributes [_required_] object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. 
channel_id [_required_] string Slack channel ID. channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` ``` { "data": { "attributes": { "incident_id": "00000000-0000-0000-1234-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#new-channel", "team_id": "T01234567", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789&team=T01234567" } ] } }, "type": "incident_integrations" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentIntegration-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident integration metadata. Field Type Description data [_required_] object Incident integration metadata from a response. attributes object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. channel_id [_required_] string Slack channel ID. channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. 
Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. id [_required_] string The incident integration metadata's ID. relationships object The incident's integration relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "incident_id": "00000000-aaaa-0000-0000-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#example-channel-name", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789\u0026team=T01234567", "team_id": "T01234567" } ] }, "modified": "2019-09-19T10:00:00.000Z", "status": "integer" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_integrations" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Create an incident integration metadata returns "CREATED" response Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/relationships/integrations" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "incident_id": "00000000-0000-0000-1234-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#new-channel", "team_id": "T01234567", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789&team=T01234567" } ] } }, "type": "incident_integrations" } } EOF ``` ##### Create an incident integration metadata returns "CREATED" response ``` // Create an incident integration metadata returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") body := datadogV2.IncidentIntegrationMetadataCreateRequest{ Data: datadogV2.IncidentIntegrationMetadataCreateData{ Attributes: datadogV2.IncidentIntegrationMetadataAttributes{ IncidentId: datadog.PtrString(IncidentDataID), IntegrationType: 1, Metadata: datadogV2.IncidentIntegrationMetadataMetadata{ SlackIntegrationMetadata: &datadogV2.SlackIntegrationMetadata{ Channels: []datadogV2.SlackIntegrationMetadataChannelItem{ { ChannelId: "C0123456789", ChannelName: "#new-channel", TeamId: datadog.PtrString("T01234567"), RedirectUrl: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }, }, }}, }, Type: datadogV2.INCIDENTINTEGRATIONMETADATATYPE_INCIDENT_INTEGRATIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentIntegration", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncidentIntegration(ctx, IncidentDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an incident integration metadata returns "CREATED" response ``` // Create an incident integration metadata returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataAttributes; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataCreateData; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataCreateRequest; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataMetadata; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataResponse; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataType; import com.datadog.api.client.v2.model.SlackIntegrationMetadata; import com.datadog.api.client.v2.model.SlackIntegrationMetadataChannelItem; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentIntegration", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); IncidentIntegrationMetadataCreateRequest body = new IncidentIntegrationMetadataCreateRequest() .data( new IncidentIntegrationMetadataCreateData() .attributes( new IncidentIntegrationMetadataAttributes() .incidentId(INCIDENT_DATA_ID) .integrationType(1) .metadata( new IncidentIntegrationMetadataMetadata( new SlackIntegrationMetadata() .channels( Collections.singletonList( new SlackIntegrationMetadataChannelItem() .channelId("C0123456789") .channelName("#new-channel") .teamId("T01234567") .redirectUrl( "https://slack.com/app_redirect?channel=C0123456789&team=T01234567")))))) .type(IncidentIntegrationMetadataType.INCIDENT_INTEGRATIONS)); try { IncidentIntegrationMetadataResponse result = apiInstance.createIncidentIntegration(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncidentIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an incident integration metadata returns "CREATED" response ``` """ Create an incident integration metadata returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_integration_metadata_attributes import IncidentIntegrationMetadataAttributes from datadog_api_client.v2.model.incident_integration_metadata_create_data import IncidentIntegrationMetadataCreateData from datadog_api_client.v2.model.incident_integration_metadata_create_request import ( 
IncidentIntegrationMetadataCreateRequest, ) from datadog_api_client.v2.model.incident_integration_metadata_type import IncidentIntegrationMetadataType from datadog_api_client.v2.model.slack_integration_metadata import SlackIntegrationMetadata from datadog_api_client.v2.model.slack_integration_metadata_channel_item import SlackIntegrationMetadataChannelItem # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] body = IncidentIntegrationMetadataCreateRequest( data=IncidentIntegrationMetadataCreateData( attributes=IncidentIntegrationMetadataAttributes( incident_id=INCIDENT_DATA_ID, integration_type=1, metadata=SlackIntegrationMetadata( channels=[ SlackIntegrationMetadataChannelItem( channel_id="C0123456789", channel_name="#new-channel", team_id="T01234567", redirect_url="https://slack.com/app_redirect?channel=C0123456789&team=T01234567", ), ], ), ), type=IncidentIntegrationMetadataType.INCIDENT_INTEGRATIONS, ), ) configuration = Configuration() configuration.unstable_operations["create_incident_integration"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_integration(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an incident integration metadata returns "CREATED" response ``` # Create an incident integration metadata returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_integration".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] body = DatadogAPIClient::V2::IncidentIntegrationMetadataCreateRequest.new({ data: DatadogAPIClient::V2::IncidentIntegrationMetadataCreateData.new({ attributes: DatadogAPIClient::V2::IncidentIntegrationMetadataAttributes.new({ incident_id: INCIDENT_DATA_ID, integration_type: 1, metadata: DatadogAPIClient::V2::SlackIntegrationMetadata.new({ channels: [ DatadogAPIClient::V2::SlackIntegrationMetadataChannelItem.new({ channel_id: "C0123456789", channel_name: "#new-channel", team_id: "T01234567", redirect_url: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }), ], }), }), type: DatadogAPIClient::V2::IncidentIntegrationMetadataType::INCIDENT_INTEGRATIONS, }), }) p api_instance.create_incident_integration(INCIDENT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an incident integration metadata returns "CREATED" response ``` // Create an incident integration metadata returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataAttributes; use 
datadog_api_client::datadogV2::model::IncidentIntegrationMetadataCreateData; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataCreateRequest; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataMetadata; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataType; use datadog_api_client::datadogV2::model::SlackIntegrationMetadata; use datadog_api_client::datadogV2::model::SlackIntegrationMetadataChannelItem; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let body = IncidentIntegrationMetadataCreateRequest::new(IncidentIntegrationMetadataCreateData::new( IncidentIntegrationMetadataAttributes::new( 1, IncidentIntegrationMetadataMetadata::SlackIntegrationMetadata(Box::new( SlackIntegrationMetadata::new(vec![SlackIntegrationMetadataChannelItem::new( "C0123456789".to_string(), "#new-channel".to_string(), "https://slack.com/app_redirect?channel=C0123456789&team=T01234567" .to_string(), ) .team_id("T01234567".to_string())]), )), ) .incident_id(incident_data_id.clone()), IncidentIntegrationMetadataType::INCIDENT_INTEGRATIONS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentIntegration", true); let api = IncidentsAPI::with_config(configuration); let resp = api .create_incident_integration(incident_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an incident integration metadata returns "CREATED" response ``` /** * Create an incident integration metadata returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentIntegration"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiCreateIncidentIntegrationRequest = { body: { data: { attributes: { incidentId: INCIDENT_DATA_ID, integrationType: 1, metadata: { channels: [ { channelId: "C0123456789", channelName: "#new-channel", teamId: "T01234567", redirectUrl: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }, ], }, }, type: "incident_integrations", }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .createIncidentIntegration(params) .then((data: v2.IncidentIntegrationMetadataResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get incident integration metadata details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-integration-metadata-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-incident-integration-metadata-details-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id} ### Overview Get incident integration metadata details. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. integration_metadata_id [_required_] string The UUID of the incident integration metadata. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentIntegration-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident integration metadata. Field Type Description data [_required_] object Incident integration metadata from a response. attributes object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. channel_id [_required_] string Slack channel ID. 
channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. id [_required_] string The incident integration metadata's ID. relationships object The incident's integration relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "incident_id": "00000000-aaaa-0000-0000-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#example-channel-name", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789\u0026team=T01234567", "team_id": "T01234567" } ] }, "modified": "2019-09-19T10:00:00.000Z", "status": "integer" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_integrations" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get incident integration metadata details

```
# Path parameters
export incident_id="CHANGE_ME"
export integration_metadata_id="CHANGE_ME"
# Curl command
curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/relationships/integrations/${integration_metadata_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get incident integration metadata details

```
"""
Get incident integration metadata details returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

# the "incident" has an "incident_integration_metadata"
INCIDENT_INTEGRATION_METADATA_DATA_ID = environ["INCIDENT_INTEGRATION_METADATA_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["get_incident_integration"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.get_incident_integration(
        incident_id=INCIDENT_DATA_ID,
        integration_metadata_id=INCIDENT_INTEGRATION_METADATA_DATA_ID,
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get incident integration metadata details

```
# Get incident integration metadata details returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_incident_integration".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident" in the system
INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"]

# the "incident" has an "incident_integration_metadata"
INCIDENT_INTEGRATION_METADATA_DATA_ID = ENV["INCIDENT_INTEGRATION_METADATA_DATA_ID"]

p api_instance.get_incident_integration(INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get incident integration metadata details

```
// Get incident integration metadata details returns "OK" response

package main

import (
	"context"
"encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_integration_metadata" IncidentIntegrationMetadataDataID := os.Getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentIntegration", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncidentIntegration(ctx, IncidentDataID, IncidentIntegrationMetadataDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncidentIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncidentIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get incident integration metadata details ``` // Get incident integration metadata details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentIntegration", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_integration_metadata" String INCIDENT_INTEGRATION_METADATA_DATA_ID = System.getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID"); try { IncidentIntegrationMetadataResponse result = apiInstance.getIncidentIntegration( INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#getIncidentIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get incident integration metadata details ``` // Get incident integration metadata details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has 
an "incident_integration_metadata" let incident_integration_metadata_data_id = std::env::var("INCIDENT_INTEGRATION_METADATA_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentIntegration", true); let api = IncidentsAPI::with_config(configuration); let resp = api .get_incident_integration( incident_data_id.clone(), incident_integration_metadata_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get incident integration metadata details ``` /** * Get incident integration metadata details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentIntegration"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // the "incident" has an "incident_integration_metadata" const INCIDENT_INTEGRATION_METADATA_DATA_ID = process.env .INCIDENT_INTEGRATION_METADATA_DATA_ID as string; const params: v2.IncidentsApiGetIncidentIntegrationRequest = { incidentId: INCIDENT_DATA_ID, integrationMetadataId: INCIDENT_INTEGRATION_METADATA_DATA_ID, }; apiInstance .getIncidentIntegration(params) .then((data: v2.IncidentIntegrationMetadataResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident-integration-metadata) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident-integration-metadata-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
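All of the request and code examples for this endpoint use the Slack metadata shape (`integration_type` 1). For a Jira-backed incident (`integration_type` 8), the `metadata` field instead carries the `issues` array described in the request model below. The Python sketch that follows is illustrative only: the Jira account URL, project key, and issue key are placeholders, and it assumes the client's generated Jira models (`JiraIntegrationMetadata`, `JiraIntegrationMetadataIssuesItem`) mirror the Slack models used in the official examples further down.

```
"""
Sketch: update incident integration metadata with a Jira-shaped payload (illustrative only).
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi
from datadog_api_client.v2.model.incident_integration_metadata_attributes import IncidentIntegrationMetadataAttributes
from datadog_api_client.v2.model.incident_integration_metadata_patch_data import IncidentIntegrationMetadataPatchData
from datadog_api_client.v2.model.incident_integration_metadata_patch_request import (
    IncidentIntegrationMetadataPatchRequest,
)
from datadog_api_client.v2.model.incident_integration_metadata_type import IncidentIntegrationMetadataType
# Assumption: the generated Jira models follow the same naming pattern as the Slack ones.
from datadog_api_client.v2.model.jira_integration_metadata import JiraIntegrationMetadata
from datadog_api_client.v2.model.jira_integration_metadata_issues_item import JiraIntegrationMetadataIssuesItem

INCIDENT_ID = environ["INCIDENT_DATA_ID"]
INTEGRATION_METADATA_ID = environ["INCIDENT_INTEGRATION_METADATA_DATA_ID"]

body = IncidentIntegrationMetadataPatchRequest(
    data=IncidentIntegrationMetadataPatchData(
        attributes=IncidentIntegrationMetadataAttributes(
            incident_id=INCIDENT_ID,
            integration_type=8,  # 8 indicates Jira; 1 indicates Slack
            metadata=JiraIntegrationMetadata(
                issues=[
                    JiraIntegrationMetadataIssuesItem(
                        account="https://example.atlassian.net",  # placeholder Jira account URL
                        project_key="EXAMPLE",                    # placeholder project key
                        issue_key="EXAMPLE-123",                  # placeholder issue key
                    ),
                ],
            ),
        ),
        type=IncidentIntegrationMetadataType.INCIDENT_INTEGRATIONS,
    ),
)

configuration = Configuration()
configuration.unstable_operations["update_incident_integration"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.update_incident_integration(
        incident_id=INCIDENT_ID,
        integration_metadata_id=INTEGRATION_METADATA_ID,
        body=body,
    )

    print(response)
```

The rest of the call is identical to the Slack example below; only the `integration_type` value and the `metadata` payload change.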
PATCH https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id} ### Overview Update an existing incident integration metadata. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. integration_metadata_id [_required_] string The UUID of the incident integration metadata. ### Request #### Body Data (required) Incident integration metadata payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident integration metadata data for a patch request. attributes [_required_] object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. channel_id [_required_] string Slack channel ID. channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. type [_required_] enum Integration metadata resource type. 
Allowed enum values: `incident_integrations` default: `incident_integrations` ``` { "data": { "attributes": { "incident_id": "00000000-0000-0000-1234-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#updated-channel-name", "team_id": "T01234567", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789&team=T01234567" } ] } }, "type": "incident_integrations" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentIntegration-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident integration metadata. Field Type Description data [_required_] object Incident integration metadata from a response. attributes object Incident integration metadata's attributes for a create request. created date-time Timestamp when the incident todo was created. incident_id string UUID of the incident this integration metadata is connected to. integration_type [_required_] int32 A number indicating the type of integration this metadata is for. 1 indicates Slack; 8 indicates Jira. metadata [_required_] Incident integration metadata's metadata attribute. Option 1 object Incident integration metadata for the Slack integration. channels [_required_] [object] Array of Slack channels in this integration metadata. channel_id [_required_] string Slack channel ID. channel_name [_required_] string Name of the Slack channel. redirect_url [_required_] string URL redirecting to the Slack channel. team_id string Slack team ID. Option 2 object Incident integration metadata for the Jira integration. issues [_required_] [object] Array of Jira issues in this integration metadata. account [_required_] string URL of issue's Jira account. issue_key string Jira issue's issue key. issuetype_id string Jira issue's issue type. project_key [_required_] string Jira issue's project keys. redirect_url string URL redirecting to the Jira issue. Option 3 object Incident integration metadata for the Microsoft Teams integration. teams [_required_] [object] Array of Microsoft Teams in this integration metadata. ms_channel_id [_required_] string Microsoft Teams channel ID. ms_channel_name [_required_] string Microsoft Teams channel name. ms_tenant_id [_required_] string Microsoft Teams tenant ID. redirect_url [_required_] string URL redirecting to the Microsoft Teams channel. modified date-time Timestamp when the incident todo was last modified. status int32 A number indicating the status of this integration metadata. 0 indicates unknown; 1 indicates pending; 2 indicates complete; 3 indicates manually created; 4 indicates manually updated; 5 indicates failed. id [_required_] string The incident integration metadata's ID. relationships object The incident's integration relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. 
type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Integration metadata resource type. Allowed enum values: `incident_integrations` default: `incident_integrations` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "created": "2019-09-19T10:00:00.000Z", "incident_id": "00000000-aaaa-0000-0000-000000000000", "integration_type": 1, "metadata": { "channels": [ { "channel_id": "C0123456789", "channel_name": "#example-channel-name", "redirect_url": "https://slack.com/app_redirect?channel=C0123456789\u0026team=T01234567", "team_id": "T01234567" } ] }, "modified": "2019-09-19T10:00:00.000Z", "status": "integer" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_integrations" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Update an existing incident integration metadata returns "OK" response

```
# Path parameters
export incident_id="CHANGE_ME"
export integration_metadata_id="CHANGE_ME"
# Curl command (substitute your site's endpoint, for example api.us5.datadoghq.com or api.datadoghq.eu, if you are not on api.datadoghq.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/integrations/${integration_metadata_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "incident_id": "00000000-0000-0000-1234-000000000000",
      "integration_type": 1,
      "metadata": {
        "channels": [
          {
            "channel_id": "C0123456789",
            "channel_name": "#updated-channel-name",
            "team_id": "T01234567",
            "redirect_url": "https://slack.com/app_redirect?channel=C0123456789&team=T01234567"
          }
        ]
      }
    },
    "type": "incident_integrations"
  }
}
EOF
```

##### Update an existing incident integration metadata returns "OK" response
``` // Update an existing incident integration metadata returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_integration_metadata" IncidentIntegrationMetadataDataID := os.Getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID") body := datadogV2.IncidentIntegrationMetadataPatchRequest{ Data: datadogV2.IncidentIntegrationMetadataPatchData{ Attributes: datadogV2.IncidentIntegrationMetadataAttributes{ IncidentId: datadog.PtrString(IncidentDataID), IntegrationType: 1, Metadata: datadogV2.IncidentIntegrationMetadataMetadata{ SlackIntegrationMetadata: &datadogV2.SlackIntegrationMetadata{ Channels: []datadogV2.SlackIntegrationMetadataChannelItem{ { ChannelId: "C0123456789", ChannelName: "#updated-channel-name", TeamId: datadog.PtrString("T01234567"), RedirectUrl: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }, }, }}, }, Type: datadogV2.INCIDENTINTEGRATIONMETADATATYPE_INCIDENT_INTEGRATIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentIntegration", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncidentIntegration(ctx, IncidentDataID, IncidentIntegrationMetadataDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncidentIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from
`IncidentsApi.UpdateIncidentIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing incident integration metadata returns "OK" response ``` // Update an existing incident integration metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataAttributes; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataMetadata; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataPatchData; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataPatchRequest; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataResponse; import com.datadog.api.client.v2.model.IncidentIntegrationMetadataType; import com.datadog.api.client.v2.model.SlackIntegrationMetadata; import com.datadog.api.client.v2.model.SlackIntegrationMetadataChannelItem; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentIntegration", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_integration_metadata" String INCIDENT_INTEGRATION_METADATA_DATA_ID = System.getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID"); IncidentIntegrationMetadataPatchRequest body = new IncidentIntegrationMetadataPatchRequest() .data( new IncidentIntegrationMetadataPatchData() .attributes( new IncidentIntegrationMetadataAttributes() .incidentId(INCIDENT_DATA_ID) .integrationType(1) .metadata( new IncidentIntegrationMetadataMetadata( new SlackIntegrationMetadata() .channels( Collections.singletonList( new SlackIntegrationMetadataChannelItem() .channelId("C0123456789") .channelName("#updated-channel-name") .teamId("T01234567") .redirectUrl( "https://slack.com/app_redirect?channel=C0123456789&team=T01234567")))))) .type(IncidentIntegrationMetadataType.INCIDENT_INTEGRATIONS)); try { IncidentIntegrationMetadataResponse result = apiInstance.updateIncidentIntegration( INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncidentIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing incident integration metadata returns "OK" response ``` """ Update an existing incident integration metadata returns "OK" 
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_integration_metadata_attributes import IncidentIntegrationMetadataAttributes from datadog_api_client.v2.model.incident_integration_metadata_patch_data import IncidentIntegrationMetadataPatchData from datadog_api_client.v2.model.incident_integration_metadata_patch_request import ( IncidentIntegrationMetadataPatchRequest, ) from datadog_api_client.v2.model.incident_integration_metadata_type import IncidentIntegrationMetadataType from datadog_api_client.v2.model.slack_integration_metadata import SlackIntegrationMetadata from datadog_api_client.v2.model.slack_integration_metadata_channel_item import SlackIntegrationMetadataChannelItem # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] # the "incident" has an "incident_integration_metadata" INCIDENT_INTEGRATION_METADATA_DATA_ID = environ["INCIDENT_INTEGRATION_METADATA_DATA_ID"] body = IncidentIntegrationMetadataPatchRequest( data=IncidentIntegrationMetadataPatchData( attributes=IncidentIntegrationMetadataAttributes( incident_id=INCIDENT_DATA_ID, integration_type=1, metadata=SlackIntegrationMetadata( channels=[ SlackIntegrationMetadataChannelItem( channel_id="C0123456789", channel_name="#updated-channel-name", team_id="T01234567", redirect_url="https://slack.com/app_redirect?channel=C0123456789&team=T01234567", ), ], ), ), type=IncidentIntegrationMetadataType.INCIDENT_INTEGRATIONS, ), ) configuration = Configuration() configuration.unstable_operations["update_incident_integration"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident_integration( incident_id=INCIDENT_DATA_ID, integration_metadata_id=INCIDENT_INTEGRATION_METADATA_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing incident integration metadata returns "OK" response ``` # Update an existing incident integration metadata returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_integration".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] # the "incident" has an "incident_integration_metadata" INCIDENT_INTEGRATION_METADATA_DATA_ID = ENV["INCIDENT_INTEGRATION_METADATA_DATA_ID"] body = DatadogAPIClient::V2::IncidentIntegrationMetadataPatchRequest.new({ data: DatadogAPIClient::V2::IncidentIntegrationMetadataPatchData.new({ attributes: DatadogAPIClient::V2::IncidentIntegrationMetadataAttributes.new({ incident_id: INCIDENT_DATA_ID, integration_type: 1, metadata: DatadogAPIClient::V2::SlackIntegrationMetadata.new({ channels: [ DatadogAPIClient::V2::SlackIntegrationMetadataChannelItem.new({ channel_id: "C0123456789", channel_name: "#updated-channel-name", team_id: "T01234567", redirect_url: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }), ], }), }), type: 
DatadogAPIClient::V2::IncidentIntegrationMetadataType::INCIDENT_INTEGRATIONS, }), }) p api_instance.update_incident_integration(INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an existing incident integration metadata returns "OK" response ``` // Update an existing incident integration metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataAttributes; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataMetadata; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataPatchData; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataPatchRequest; use datadog_api_client::datadogV2::model::IncidentIntegrationMetadataType; use datadog_api_client::datadogV2::model::SlackIntegrationMetadata; use datadog_api_client::datadogV2::model::SlackIntegrationMetadataChannelItem; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has an "incident_integration_metadata" let incident_integration_metadata_data_id = std::env::var("INCIDENT_INTEGRATION_METADATA_DATA_ID").unwrap(); let body = IncidentIntegrationMetadataPatchRequest::new(IncidentIntegrationMetadataPatchData::new( IncidentIntegrationMetadataAttributes::new( 1, IncidentIntegrationMetadataMetadata::SlackIntegrationMetadata(Box::new( SlackIntegrationMetadata::new(vec![SlackIntegrationMetadataChannelItem::new( "C0123456789".to_string(), "#updated-channel-name".to_string(), "https://slack.com/app_redirect?channel=C0123456789&team=T01234567" .to_string(), ) .team_id("T01234567".to_string())]), )), ) .incident_id(incident_data_id.clone()), IncidentIntegrationMetadataType::INCIDENT_INTEGRATIONS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentIntegration", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident_integration( incident_data_id.clone(), incident_integration_metadata_data_id.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing incident integration metadata returns "OK" response ``` /** * Update an existing incident integration metadata returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentIntegration"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as 
string; // the "incident" has an "incident_integration_metadata" const INCIDENT_INTEGRATION_METADATA_DATA_ID = process.env .INCIDENT_INTEGRATION_METADATA_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentIntegrationRequest = { body: { data: { attributes: { incidentId: INCIDENT_DATA_ID, integrationType: 1, metadata: { channels: [ { channelId: "C0123456789", channelName: "#updated-channel-name", teamId: "T01234567", redirectUrl: "https://slack.com/app_redirect?channel=C0123456789&team=T01234567", }, ], }, }, type: "incident_integrations", }, }, incidentId: INCIDENT_DATA_ID, integrationMetadataId: INCIDENT_INTEGRATION_METADATA_DATA_ID, }; apiInstance .updateIncidentIntegration(params) .then((data: v2.IncidentIntegrationMetadataResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-integration-metadata) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-integration-metadata-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/integrations/{integration_metadata_id} ### Overview Delete an incident integration metadata. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. integration_metadata_id [_required_] string The UUID of the incident integration metadata. ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentIntegration-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Delete an incident integration metadata

```
# Path parameters
export incident_id="CHANGE_ME"
export integration_metadata_id="CHANGE_ME"
# Curl command (substitute your site's endpoint, for example api.us5.datadoghq.com or api.datadoghq.eu, if you are not on api.datadoghq.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/integrations/${integration_metadata_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an incident integration metadata

```
"""
Delete an incident integration metadata returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

# the "incident" has an "incident_integration_metadata"
INCIDENT_INTEGRATION_METADATA_DATA_ID = environ["INCIDENT_INTEGRATION_METADATA_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_incident_integration"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    api_instance.delete_incident_integration(
        incident_id=INCIDENT_DATA_ID,
        integration_metadata_id=INCIDENT_INTEGRATION_METADATA_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete an
incident integration metadata ``` # Delete an incident integration metadata returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_integration".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] # the "incident" has an "incident_integration_metadata" INCIDENT_INTEGRATION_METADATA_DATA_ID = ENV["INCIDENT_INTEGRATION_METADATA_DATA_ID"] api_instance.delete_incident_integration(INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an incident integration metadata ``` // Delete an incident integration metadata returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_integration_metadata" IncidentIntegrationMetadataDataID := os.Getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentIntegration", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncidentIntegration(ctx, IncidentDataID, IncidentIntegrationMetadataDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an incident integration metadata ``` // Delete an incident integration metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentIntegration", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_integration_metadata" String INCIDENT_INTEGRATION_METADATA_DATA_ID = System.getenv("INCIDENT_INTEGRATION_METADATA_DATA_ID"); try { apiInstance.deleteIncidentIntegration( INCIDENT_DATA_ID, INCIDENT_INTEGRATION_METADATA_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); 
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an incident integration metadata ``` // Delete an incident integration metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has an "incident_integration_metadata" let incident_integration_metadata_data_id = std::env::var("INCIDENT_INTEGRATION_METADATA_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentIntegration", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_integration( incident_data_id.clone(), incident_integration_metadata_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an incident integration metadata ``` /** * Delete an incident integration metadata returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentIntegration"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // the "incident" has an "incident_integration_metadata" const INCIDENT_INTEGRATION_METADATA_DATA_ID = process.env .INCIDENT_INTEGRATION_METADATA_DATA_ID as string; const params: v2.IncidentsApiDeleteIncidentIntegrationRequest = { incidentId: INCIDENT_DATA_ID, integrationMetadataId: INCIDENT_INTEGRATION_METADATA_DATA_ID, }; apiInstance .deleteIncidentIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of an incident's todos](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-todos) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-todos-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
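In the response, each element of `data` carries the todo's `content`, `due_date`, and a `completed` timestamp that remains unset while the task is open (see the response model below). The following Python sketch lists an incident's open todos; it is a minimal illustration that reuses the `INCIDENT_DATA_ID` placeholder from the official examples and assumes the generated response models expose `to_dict()`.

```
"""
Sketch: list an incident's todos and print the ones that are still open (illustrative only).
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["list_incident_todos"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.list_incident_todos(incident_id=INCIDENT_DATA_ID)

# A todo is open while its `completed` timestamp is unset.
for todo in response.data:
    attrs = todo.to_dict().get("attributes", {})  # assumption: generated models expose to_dict()
    if not attrs.get("completed"):
        print(f"OPEN: {attrs.get('content')} (due: {attrs.get('due_date') or 'none'})")
```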
GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/todoshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos ### Overview Get all todos for an incident. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTodos-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a list of incident todos. Field Type Description data [_required_] [object] An array of incident todos. attributes object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. id [_required_] string The incident todo's ID. relationships object The incident's relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. 
handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` meta object The metadata object containing pagination metadata. pagination object Pagination properties. next_offset int64 The index of the first element in the next page of results. Equal to page size added to the current offset. offset int64 The index of the first element in the results. size int64 Maximum size of pages to return. 
``` { "data": [ { "attributes": { "assignees": [ { "description": "@test.user@test.com", "example": "@test.user@test.com", "type": "@test.user@test.com" } ], "completed": "2023-03-06T22:00:00.000000+00:00", "content": "Restore lost data.", "created": "2019-09-19T10:00:00.000Z", "due_date": "2023-07-10T05:00:00.000000+00:00", "incident_id": "00000000-aaaa-0000-0000-000000000000", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_todos" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "pagination": { "next_offset": 1000, "offset": 10, "size": 1000 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get a list of an incident's todos

```
# Path parameters
export incident_id="CHANGE_ME"
# Curl command (substitute your site's endpoint, for example api.us5.datadoghq.com or api.datadoghq.eu, if you are not on api.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/todos" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a list of an incident's todos

```
"""
Get a list of an incident's todos returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["list_incident_todos"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.list_incident_todos(
        incident_id=INCIDENT_DATA_ID,
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a list of an incident's todos

```
# Get a list of an incident's todos returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.list_incident_todos".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident" in the system
INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"]

p api_instance.list_incident_todos(INCIDENT_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a list of an incident's todos
``` // Get a list of an incident's todos returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentTodos", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err :=
api.ListIncidentTodos(ctx, IncidentDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentTodos`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentTodos`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of an incident's todos ``` // Get a list of an incident's todos returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTodoListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentTodos", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); try { IncidentTodoListResponse result = apiInstance.listIncidentTodos(INCIDENT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentTodos"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of an incident's todos ``` // Get a list of an incident's todos returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentTodos", true); let api = IncidentsAPI::with_config(configuration); let resp = api.list_incident_todos(incident_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of an incident's todos ``` /** * Get a list of an incident's todos returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentTodos"] = true; const apiInstance = 
new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiListIncidentTodosRequest = { incidentId: INCIDENT_DATA_ID, }; apiInstance .listIncidentTodos(params) .then((data: v2.IncidentTodoListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an incident todo](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-todo) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-todo-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/todoshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todoshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos ### Overview Create an incident todo. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. ### Request #### Body Data (required) Incident todo payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident todo data for a create request. attributes [_required_] object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` ``` { "data": { "attributes": { "assignees": [ "@test.user@test.com" ], "content": "Restore lost data." 
}, "type": "incident_todos" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentTodo-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident todo. Field Type Description data [_required_] object Incident todo response data. attributes object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. id [_required_] string The incident todo's ID. relationships object The incident's relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. 
data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "assignees": [ { "description": "@test.user@test.com", "example": "@test.user@test.com", "type": "@test.user@test.com" } ], "completed": "2023-03-06T22:00:00.000000+00:00", "content": "Restore lost data.", "created": "2019-09-19T10:00:00.000Z", "due_date": "2023-07-10T05:00:00.000000+00:00", "incident_id": "00000000-aaaa-0000-0000-000000000000", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_todos" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/incidents/)
* [Example](https://docs.datadoghq.com/api/latest/incidents/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Create an incident todo returns "CREATED" response

```
# Path parameters
export incident_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your site:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/todos" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "assignees": [
        "@test.user@test.com"
      ],
      "content": "Restore lost data."
    },
    "type": "incident_todos"
  }
}
EOF
```

##### Create an incident todo returns "CREATED" response

```
// Create an incident todo returns "CREATED" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "incident" in the system
    IncidentDataID := os.Getenv("INCIDENT_DATA_ID")

    body := datadogV2.IncidentTodoCreateRequest{
        Data: datadogV2.IncidentTodoCreateData{
            Attributes: datadogV2.IncidentTodoAttributes{
                Assignees: []datadogV2.IncidentTodoAssignee{
                    datadogV2.IncidentTodoAssignee{
                        IncidentTodoAssigneeHandle: datadog.PtrString("@test.user@test.com")},
                },
                Content: "Restore lost data.",
            },
            Type: datadogV2.INCIDENTTODOTYPE_INCIDENT_TODOS,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    configuration.SetUnstableOperationEnabled("v2.CreateIncidentTodo", true)
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewIncidentsApi(apiClient)
    resp, r, err := api.CreateIncidentTodo(ctx, IncidentDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentTodo`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentTodo`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Create an incident todo returns "CREATED" response

``` // Create an incident todo returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTodoAssignee; import com.datadog.api.client.v2.model.IncidentTodoAttributes; import com.datadog.api.client.v2.model.IncidentTodoCreateData; import com.datadog.api.client.v2.model.IncidentTodoCreateRequest; import com.datadog.api.client.v2.model.IncidentTodoResponse; import com.datadog.api.client.v2.model.IncidentTodoType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentTodo", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); IncidentTodoCreateRequest body = new IncidentTodoCreateRequest() .data( new IncidentTodoCreateData() .attributes( new IncidentTodoAttributes() .assignees( Collections.singletonList( new IncidentTodoAssignee("@test.user@test.com"))) .content("Restore lost data.")) .type(IncidentTodoType.INCIDENT_TODOS)); try { IncidentTodoResponse result = apiInstance.createIncidentTodo(INCIDENT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncidentTodo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an incident todo returns "CREATED" response ``` """ Create an incident todo returns "CREATED" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_todo_assignee_array import IncidentTodoAssigneeArray from datadog_api_client.v2.model.incident_todo_attributes import IncidentTodoAttributes from datadog_api_client.v2.model.incident_todo_create_data import IncidentTodoCreateData from datadog_api_client.v2.model.incident_todo_create_request import IncidentTodoCreateRequest from datadog_api_client.v2.model.incident_todo_type import IncidentTodoType # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] body = IncidentTodoCreateRequest( data=IncidentTodoCreateData( attributes=IncidentTodoAttributes( assignees=IncidentTodoAssigneeArray( [ "@test.user@test.com", ] ), content="Restore lost data.", ), type=IncidentTodoType.INCIDENT_TODOS, ), ) configuration = Configuration() configuration.unstable_operations["create_incident_todo"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_todo(incident_id=INCIDENT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
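# Note: set DD_SITE to your Datadog site, one of: datadoghq.com, us3.datadoghq.com,
# us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.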
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an incident todo returns "CREATED" response ``` # Create an incident todo returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_todo".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] body = DatadogAPIClient::V2::IncidentTodoCreateRequest.new({ data: DatadogAPIClient::V2::IncidentTodoCreateData.new({ attributes: DatadogAPIClient::V2::IncidentTodoAttributes.new({ assignees: [ "@test.user@test.com", ], content: "Restore lost data.", }), type: DatadogAPIClient::V2::IncidentTodoType::INCIDENT_TODOS, }), }) p api_instance.create_incident_todo(INCIDENT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an incident todo returns "CREATED" response ``` // Create an incident todo returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentTodoAssignee; use datadog_api_client::datadogV2::model::IncidentTodoAttributes; use datadog_api_client::datadogV2::model::IncidentTodoCreateData; use datadog_api_client::datadogV2::model::IncidentTodoCreateRequest; use datadog_api_client::datadogV2::model::IncidentTodoType; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); let body = IncidentTodoCreateRequest::new(IncidentTodoCreateData::new( IncidentTodoAttributes::new( vec![IncidentTodoAssignee::IncidentTodoAssigneeHandle( "@test.user@test.com".to_string(), )], "Restore lost data.".to_string(), ), IncidentTodoType::INCIDENT_TODOS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentTodo", true); let api = IncidentsAPI::with_config(configuration); let resp = api .create_incident_todo(incident_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an incident todo returns "CREATED" response ``` /** * Create an incident todo returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentTodo"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; const params: v2.IncidentsApiCreateIncidentTodoRequest = { body: { data: { attributes: { 
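// Per the request model above, each entry in `assignees` can be either an
// assignee @-handle string (as below) or an anonymous assignee object with
// `icon`, `id`, `name`, and `source` ("slack" or "microsoft_teams") fields,
// for example (illustrative values only):
// { icon: "https://example.com/avatar.png", id: "abc123", name: "Jane Doe", source: "slack" },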
assignees: ["@test.user@test.com"], content: "Restore lost data.", }, type: "incident_todos", }, }, incidentId: INCIDENT_DATA_ID, }; apiInstance .createIncidentTodo(params) .then((data: v2.IncidentTodoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get incident todo details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-todo-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-incident-todo-details-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id} ### Overview Get incident todo details. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. todo_id [_required_] string The UUID of the incident todo. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentTodo-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident todo. Field Type Description data [_required_] object Incident todo response data. attributes object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. 
incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. id [_required_] string The incident todo's ID. relationships object The incident's relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "assignees": [ { "description": "@test.user@test.com", "example": "@test.user@test.com", "type": "@test.user@test.com" } ], "completed": "2023-03-06T22:00:00.000000+00:00", "content": "Restore lost data.", "created": "2019-09-19T10:00:00.000Z", "due_date": "2023-07-10T05:00:00.000000+00:00", "incident_id": "00000000-aaaa-0000-0000-000000000000", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_todos" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get incident todo details

```
# Path parameters
export incident_id="CHANGE_ME"
export todo_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your site:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/todos/${todo_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get incident todo details

```
"""
Get incident todo details returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

# the "incident" has an "incident_todo"
INCIDENT_TODO_DATA_ID = environ["INCIDENT_TODO_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["get_incident_todo"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.get_incident_todo(
        incident_id=INCIDENT_DATA_ID,
        todo_id=INCIDENT_TODO_DATA_ID,
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get incident todo details

```
# Get incident todo details returns "OK" response

require "datadog_api_client"
DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_incident_todo".to_sym] = true
end
api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident" in the system
INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"]

# the "incident" has an "incident_todo"
INCIDENT_TODO_DATA_ID = ENV["INCIDENT_TODO_DATA_ID"]
p api_instance.get_incident_todo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get incident todo details

``` // Get incident todo details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_todo" IncidentTodoDataID :=
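// INCIDENT_TODO_DATA_ID is the UUID of the todo (the {todo_id} path parameter),
// for example the id returned when the todo was created or listed above.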
os.Getenv("INCIDENT_TODO_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentTodo", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncidentTodo(ctx, IncidentDataID, IncidentTodoDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncidentTodo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncidentTodo`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get incident todo details ``` // Get incident todo details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTodoResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentTodo", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_todo" String INCIDENT_TODO_DATA_ID = System.getenv("INCIDENT_TODO_DATA_ID"); try { IncidentTodoResponse result = apiInstance.getIncidentTodo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#getIncidentTodo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get incident todo details ``` // Get incident todo details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has an "incident_todo" let incident_todo_data_id = std::env::var("INCIDENT_TODO_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentTodo", true); let api = IncidentsAPI::with_config(configuration); let resp = api .get_incident_todo(incident_data_id.clone(), incident_todo_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get incident todo details ``` /** * Get incident todo details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentTodo"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // the "incident" has an "incident_todo" const INCIDENT_TODO_DATA_ID = process.env.INCIDENT_TODO_DATA_ID as string; const params: v2.IncidentsApiGetIncidentTodoRequest = { incidentId: INCIDENT_DATA_ID, todoId: INCIDENT_TODO_DATA_ID, }; apiInstance .getIncidentTodo(params) .then((data: v2.IncidentTodoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an incident todo](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-todo) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-todo-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id} ### Overview Update an incident todo. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. todo_id [_required_] string The UUID of the incident todo. ### Request #### Body Data (required) Incident todo payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident todo data for a patch request. attributes [_required_] object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. 
source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` ``` { "data": { "attributes": { "assignees": [ "@test.user@test.com" ], "content": "Restore lost data.", "completed": "2023-03-06T22:00:00.000000+00:00", "due_date": "2023-07-10T05:00:00.000000+00:00" }, "type": "incident_todos" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentTodo-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with an incident todo. Field Type Description data [_required_] object Incident todo response data. attributes object Incident todo's attributes. assignees [_required_] [ ] Array of todo assignees. Option 1 string Assignee's @-handle. Option 2 object Anonymous assignee entity. icon [_required_] string URL for assignee's icon. id [_required_] string Anonymous assignee's ID. name [_required_] string Assignee's name. source [_required_] enum The source of the anonymous assignee. Allowed enum values: `slack,microsoft_teams` default: `slack` completed string Timestamp when the todo was completed. content [_required_] string The follow-up task's content. created date-time Timestamp when the incident todo was created. due_date string Timestamp when the todo should be completed by. incident_id string UUID of the incident this todo is connected to. modified date-time Timestamp when the incident todo was last modified. id [_required_] string The incident todo's ID. relationships object The incident's relationships from a response. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Todo resource type. Allowed enum values: `incident_todos` default: `incident_todos` included [ ] Included related resources that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. 
last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "assignees": [ { "description": "@test.user@test.com", "example": "@test.user@test.com", "type": "@test.user@test.com" } ], "completed": "2023-03-06T22:00:00.000000+00:00", "content": "Restore lost data.", "created": "2019-09-19T10:00:00.000Z", "due_date": "2023-07-10T05:00:00.000000+00:00", "incident_id": "00000000-aaaa-0000-0000-000000000000", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-0000-0000-1234-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "incident_todos" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Update an incident todo returns "OK" response Copy ``` # Path parameters export incident_id="CHANGE_ME" export todo_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/relationships/todos/${todo_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "assignees": [ "@test.user@test.com" ], "content": "Restore lost data.", "completed": "2023-03-06T22:00:00.000000+00:00", "due_date": "2023-07-10T05:00:00.000000+00:00" }, "type": "incident_todos" } } EOF ``` ##### Update an incident todo returns "OK" response ``` // Update an incident todo returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_todo" IncidentTodoDataID := os.Getenv("INCIDENT_TODO_DATA_ID") body := datadogV2.IncidentTodoPatchRequest{ Data: datadogV2.IncidentTodoPatchData{ Attributes: datadogV2.IncidentTodoAttributes{ Assignees: []datadogV2.IncidentTodoAssignee{ datadogV2.IncidentTodoAssignee{ IncidentTodoAssigneeHandle: datadog.PtrString("@test.user@test.com")}, }, Content: "Restore lost data.", Completed: *datadog.NewNullableString(datadog.PtrString("2023-03-06T22:00:00.000000+00:00")), DueDate: *datadog.NewNullableString(datadog.PtrString("2023-07-10T05:00:00.000000+00:00")), }, Type: datadogV2.INCIDENTTODOTYPE_INCIDENT_TODOS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentTodo", true) 
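// Incident todo endpoints are in public beta, so the client treats
// UpdateIncidentTodo as an unstable operation; the SetUnstableOperationEnabled
// call above is required before invoking it below.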
apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncidentTodo(ctx, IncidentDataID, IncidentTodoDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncidentTodo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncidentTodo`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an incident todo returns "OK" response ``` // Update an incident todo returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTodoAssignee; import com.datadog.api.client.v2.model.IncidentTodoAttributes; import com.datadog.api.client.v2.model.IncidentTodoPatchData; import com.datadog.api.client.v2.model.IncidentTodoPatchRequest; import com.datadog.api.client.v2.model.IncidentTodoResponse; import com.datadog.api.client.v2.model.IncidentTodoType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentTodo", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_todo" String INCIDENT_TODO_DATA_ID = System.getenv("INCIDENT_TODO_DATA_ID"); IncidentTodoPatchRequest body = new IncidentTodoPatchRequest() .data( new IncidentTodoPatchData() .attributes( new IncidentTodoAttributes() .assignees( Collections.singletonList( new IncidentTodoAssignee("@test.user@test.com"))) .content("Restore lost data.") .completed("2023-03-06T22:00:00.000000+00:00") .dueDate("2023-07-10T05:00:00.000000+00:00")) .type(IncidentTodoType.INCIDENT_TODOS)); try { IncidentTodoResponse result = apiInstance.updateIncidentTodo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncidentTodo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an incident todo returns "OK" response ``` """ Update an incident todo returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_todo_assignee_array import IncidentTodoAssigneeArray from 
datadog_api_client.v2.model.incident_todo_attributes import IncidentTodoAttributes from datadog_api_client.v2.model.incident_todo_patch_data import IncidentTodoPatchData from datadog_api_client.v2.model.incident_todo_patch_request import IncidentTodoPatchRequest from datadog_api_client.v2.model.incident_todo_type import IncidentTodoType # there is a valid "incident" in the system INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"] # the "incident" has an "incident_todo" INCIDENT_TODO_DATA_ID = environ["INCIDENT_TODO_DATA_ID"] body = IncidentTodoPatchRequest( data=IncidentTodoPatchData( attributes=IncidentTodoAttributes( assignees=IncidentTodoAssigneeArray( [ "@test.user@test.com", ] ), content="Restore lost data.", completed="2023-03-06T22:00:00.000000+00:00", due_date="2023-07-10T05:00:00.000000+00:00", ), type=IncidentTodoType.INCIDENT_TODOS, ), ) configuration = Configuration() configuration.unstable_operations["update_incident_todo"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident_todo(incident_id=INCIDENT_DATA_ID, todo_id=INCIDENT_TODO_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an incident todo returns "OK" response ``` # Update an incident todo returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_todo".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident" in the system INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"] # the "incident" has an "incident_todo" INCIDENT_TODO_DATA_ID = ENV["INCIDENT_TODO_DATA_ID"] body = DatadogAPIClient::V2::IncidentTodoPatchRequest.new({ data: DatadogAPIClient::V2::IncidentTodoPatchData.new({ attributes: DatadogAPIClient::V2::IncidentTodoAttributes.new({ assignees: [ "@test.user@test.com", ], content: "Restore lost data.", completed: "2023-03-06T22:00:00.000000+00:00", due_date: "2023-07-10T05:00:00.000000+00:00", }), type: DatadogAPIClient::V2::IncidentTodoType::INCIDENT_TODOS, }), }) p api_instance.update_incident_todo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an incident todo returns "OK" response ``` // Update an incident todo returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentTodoAssignee; use datadog_api_client::datadogV2::model::IncidentTodoAttributes; use datadog_api_client::datadogV2::model::IncidentTodoPatchData; use datadog_api_client::datadogV2::model::IncidentTodoPatchRequest; use datadog_api_client::datadogV2::model::IncidentTodoType; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = 
std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has an "incident_todo" let incident_todo_data_id = std::env::var("INCIDENT_TODO_DATA_ID").unwrap(); let body = IncidentTodoPatchRequest::new(IncidentTodoPatchData::new( IncidentTodoAttributes::new( vec![IncidentTodoAssignee::IncidentTodoAssigneeHandle( "@test.user@test.com".to_string(), )], "Restore lost data.".to_string(), ) .completed(Some("2023-03-06T22:00:00.000000+00:00".to_string())) .due_date(Some("2023-07-10T05:00:00.000000+00:00".to_string())), IncidentTodoType::INCIDENT_TODOS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentTodo", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident_todo( incident_data_id.clone(), incident_todo_data_id.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an incident todo returns "OK" response ``` /** * Update an incident todo returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentTodo"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // the "incident" has an "incident_todo" const INCIDENT_TODO_DATA_ID = process.env.INCIDENT_TODO_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentTodoRequest = { body: { data: { attributes: { assignees: ["@test.user@test.com"], content: "Restore lost data.", completed: "2023-03-06T22:00:00.000000+00:00", dueDate: "2023-07-10T05:00:00.000000+00:00", }, type: "incident_todos", }, }, incidentId: INCIDENT_DATA_ID, todoId: INCIDENT_TODO_DATA_ID, }; apiInstance .updateIncidentTodo(params) .then((data: v2.IncidentTodoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an incident todo](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-todo) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-todo-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
DELETE https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/relationships/todos/{todo_id} ### Overview Delete an incident todo. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. todo_id [_required_] string The UUID of the incident todo. ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentTodo-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Delete an incident todo

```
# Path parameters
export incident_id="CHANGE_ME"
export todo_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with your regional endpoint if needed:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/incidents/${incident_id}/relationships/todos/${todo_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an incident todo

```
"""
Delete an incident todo returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

# there is a valid "incident" in the system
INCIDENT_DATA_ID = environ["INCIDENT_DATA_ID"]

# the "incident" has an "incident_todo"
INCIDENT_TODO_DATA_ID = environ["INCIDENT_TODO_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_incident_todo"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    api_instance.delete_incident_todo(
        incident_id=INCIDENT_DATA_ID,
        todo_id=INCIDENT_TODO_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete an incident todo

```
# Delete an incident todo returns "OK" response
require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.delete_incident_todo".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident" in the system
INCIDENT_DATA_ID = ENV["INCIDENT_DATA_ID"]

# the "incident" has an "incident_todo"
INCIDENT_TODO_DATA_ID = ENV["INCIDENT_TODO_DATA_ID"]

api_instance.delete_incident_todo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete an incident todo

```
// Delete an incident todo returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident" in the system IncidentDataID := os.Getenv("INCIDENT_DATA_ID") // the "incident" has an "incident_todo" IncidentTodoDataID := os.Getenv("INCIDENT_TODO_DATA_ID") ctx :=
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentTodo", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncidentTodo(ctx, IncidentDataID, IncidentTodoDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentTodo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an incident todo ``` // Delete an incident todo returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentTodo", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident" in the system String INCIDENT_DATA_ID = System.getenv("INCIDENT_DATA_ID"); // the "incident" has an "incident_todo" String INCIDENT_TODO_DATA_ID = System.getenv("INCIDENT_TODO_DATA_ID"); try { apiInstance.deleteIncidentTodo(INCIDENT_DATA_ID, INCIDENT_TODO_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentTodo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an incident todo ``` // Delete an incident todo returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident" in the system let incident_data_id = std::env::var("INCIDENT_DATA_ID").unwrap(); // the "incident" has an "incident_todo" let incident_todo_data_id = std::env::var("INCIDENT_TODO_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentTodo", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_todo(incident_data_id.clone(), incident_todo_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an incident todo ``` /** * 
Delete an incident todo returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentTodo"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident" in the system const INCIDENT_DATA_ID = process.env.INCIDENT_DATA_ID as string; // the "incident" has an "incident_todo" const INCIDENT_TODO_DATA_ID = process.env.INCIDENT_TODO_DATA_ID as string; const params: v2.IncidentsApiDeleteIncidentTodoRequest = { incidentId: INCIDENT_DATA_ID, todoId: INCIDENT_TODO_DATA_ID, }; apiInstance .deleteIncidentTodo(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an incident type](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-type-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/config/typeshttps://api.ap2.datadoghq.com/api/v2/incidents/config/typeshttps://api.datadoghq.eu/api/v2/incidents/config/typeshttps://api.ddog-gov.com/api/v2/incidents/config/typeshttps://api.datadoghq.com/api/v2/incidents/config/typeshttps://api.us3.datadoghq.com/api/v2/incidents/config/typeshttps://api.us5.datadoghq.com/api/v2/incidents/config/types ### Overview Create an incident type. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Request #### Body Data (required) Incident type payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident type data for a create request. attributes [_required_] object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. type [_required_] enum Incident type resource type. 
Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", "is_default": false, "name": "Security Incident" }, "type": "incident_types" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentType-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Incident type response data. Field Type Description data [_required_] object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. 
Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "createdBy": "00000000-0000-0000-0000-000000000000", "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", "is_default": false, "lastModifiedBy": "00000000-0000-0000-0000-000000000000", "modifiedAt": "2019-09-19T10:00:00.000Z", "name": "Security Incident", "prefix": "IR" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "google_meet_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "google_meet_configurations" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "microsoft_teams_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "microsoft_teams_configurations" } }, "zoom_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "zoom_configurations" } } }, "type": "incident_types" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Create an incident type returns "CREATED" response

```
# Curl command (replace api.datadoghq.com with your regional endpoint if needed:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/incidents/config/types" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.",
      "is_default": false,
      "name": "Security Incident"
    },
    "type": "incident_types"
  }
}
EOF
```

##### Create an incident type returns "CREATED" response

```
// Create an incident type returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.IncidentTypeCreateRequest{ Data: datadogV2.IncidentTypeCreateData{ Attributes: datadogV2.IncidentTypeAttributes{ Description: datadog.PtrString("Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data."), IsDefault: datadog.PtrBool(false), Name: "Security Incident", }, Type: datadogV2.INCIDENTTYPETYPE_INCIDENT_TYPES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentType", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncidentType(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentType`:\n%s\n", responseContent) }
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Create an incident type returns "CREATED" response

```
// Create an incident type returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTypeAttributes; import com.datadog.api.client.v2.model.IncidentTypeCreateData; import com.datadog.api.client.v2.model.IncidentTypeCreateRequest; import com.datadog.api.client.v2.model.IncidentTypeResponse; import
com.datadog.api.client.v2.model.IncidentTypeType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentType", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); IncidentTypeCreateRequest body = new IncidentTypeCreateRequest() .data( new IncidentTypeCreateData() .attributes( new IncidentTypeAttributes() .description( "Any incidents that harm (or have the potential to) the" + " confidentiality, integrity, or availability of our data.") .isDefault(false) .name("Security Incident")) .type(IncidentTypeType.INCIDENT_TYPES)); try { IncidentTypeResponse result = apiInstance.createIncidentType(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncidentType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an incident type returns "CREATED" response ``` """ Create an incident type returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_type_attributes import IncidentTypeAttributes from datadog_api_client.v2.model.incident_type_create_data import IncidentTypeCreateData from datadog_api_client.v2.model.incident_type_create_request import IncidentTypeCreateRequest from datadog_api_client.v2.model.incident_type_type import IncidentTypeType body = IncidentTypeCreateRequest( data=IncidentTypeCreateData( attributes=IncidentTypeAttributes( description="Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", is_default=False, name="Security Incident", ), type=IncidentTypeType.INCIDENT_TYPES, ), ) configuration = Configuration() configuration.unstable_operations["create_incident_type"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_type(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an incident type returns "CREATED" response ``` # Create an incident type returns "CREATED" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_type".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new body = DatadogAPIClient::V2::IncidentTypeCreateRequest.new({ data: DatadogAPIClient::V2::IncidentTypeCreateData.new({ attributes: DatadogAPIClient::V2::IncidentTypeAttributes.new({ description: "Any incidents that harm (or have the potential to) the 
confidentiality, integrity, or availability of our data.", is_default: false, name: "Security Incident", }), type: DatadogAPIClient::V2::IncidentTypeType::INCIDENT_TYPES, }), }) p api_instance.create_incident_type(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an incident type returns "CREATED" response ``` // Create an incident type returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentTypeAttributes; use datadog_api_client::datadogV2::model::IncidentTypeCreateData; use datadog_api_client::datadogV2::model::IncidentTypeCreateRequest; use datadog_api_client::datadogV2::model::IncidentTypeType; #[tokio::main] async fn main() { let body = IncidentTypeCreateRequest::new( IncidentTypeCreateData::new( IncidentTypeAttributes::new("Security Incident".to_string()) .description( "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.".to_string(), ) .is_default(false), IncidentTypeType::INCIDENT_TYPES, ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentType", true); let api = IncidentsAPI::with_config(configuration); let resp = api.create_incident_type(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an incident type returns "CREATED" response ``` /** * Create an incident type returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentType"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiCreateIncidentTypeRequest = { body: { data: { attributes: { description: "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", isDefault: false, name: "Security Incident", }, type: "incident_types", }, }, }; apiInstance .createIncidentType(params) .then((data: v2.IncidentTypeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of incident types](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incident-types) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incident-types-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/config/typeshttps://api.ap2.datadoghq.com/api/v2/incidents/config/typeshttps://api.datadoghq.eu/api/v2/incidents/config/typeshttps://api.ddog-gov.com/api/v2/incidents/config/typeshttps://api.datadoghq.com/api/v2/incidents/config/typeshttps://api.us3.datadoghq.com/api/v2/incidents/config/typeshttps://api.us5.datadoghq.com/api/v2/incidents/config/types ### Overview Get all incident types. This endpoint requires any of the following permissions: * `incident_settings_read` * `incident_read` OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Query Strings Name Type Description include_deleted boolean Include deleted incident types in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTypes-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTypes-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTypes-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTypes-403-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentTypes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a list of incident types. Field Type Description data [_required_] [object] An array of incident type objects. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. 
data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` ``` { "data": [ { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "createdBy": "00000000-0000-0000-0000-000000000000", "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", "is_default": false, "lastModifiedBy": "00000000-0000-0000-0000-000000000000", "modifiedAt": "2019-09-19T10:00:00.000Z", "name": "Security Incident", "prefix": "IR" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "google_meet_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "google_meet_configurations" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "microsoft_teams_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "microsoft_teams_configurations" } }, "zoom_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "zoom_configurations" } } }, "type": "incident_types" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get a list of incident types

```
# Curl command (replace api.datadoghq.com with your regional endpoint if needed:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/incidents/config/types" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a list of incident types

```
"""
Get a list of incident types returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

configuration = Configuration()
configuration.unstable_operations["list_incident_types"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.list_incident_types()
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a list of incident types

```
# Get a list of incident types returns "OK" response
require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.list_incident_types".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new
p api_instance.list_incident_types()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a list of incident types

```
// Get a list of incident types returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentTypes", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentTypes(ctx, *datadogV2.NewListIncidentTypesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentTypes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentTypes`:\n%s\n", responseContent) }
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of incident types ``` // Get a list of incident types returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTypeListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentTypes", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentTypeListResponse result = apiInstance.listIncidentTypes(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentTypes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of incident types ``` // Get a list of incident types returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentTypesOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentTypes", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_types(ListIncidentTypesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of incident types ``` /** * Get a list of incident types returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentTypes"] = true; const apiInstance = new v2.IncidentsApi(configuration); apiInstance .listIncidentTypes() .then((data: v2.IncidentTypeListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get incident type details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-type-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-incident-type-details-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.ap2.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.eu/api/v2/incidents/config/types/{incident_type_id}https://api.ddog-gov.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us3.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us5.datadoghq.com/api/v2/incidents/config/types/{incident_type_id} ### Overview Get incident type details. This endpoint requires the `incident_read` permission. OAuth apps require the `incident_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_type_id [_required_] string The UUID of the incident type. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentType-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Incident type response data. Field Type Description data [_required_] object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "createdBy": "00000000-0000-0000-0000-000000000000", "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", "is_default": false, "lastModifiedBy": "00000000-0000-0000-0000-000000000000", "modifiedAt": "2019-09-19T10:00:00.000Z", "name": "Security Incident", "prefix": "IR" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "google_meet_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "google_meet_configurations" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "microsoft_teams_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "microsoft_teams_configurations" } }, "zoom_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "zoom_configurations" } } }, "type": "incident_types" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```
Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript)

##### Get incident type details

```
# Path parameters
export incident_type_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with your regional endpoint if needed:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/incidents/config/types/${incident_type_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get incident type details

```
"""
Get incident type details returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.incidents_api import IncidentsApi

configuration = Configuration()
configuration.unstable_operations["get_incident_type"] = True
with ApiClient(configuration) as api_client:
    api_instance = IncidentsApi(api_client)
    response = api_instance.get_incident_type(
        incident_type_id="incident_type_id",
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get incident type details

```
# Get incident type details returns "OK" response
require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_incident_type".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new
p api_instance.get_incident_type("incident_type_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get incident type details

```
// Get incident type details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentType", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncidentType(ctx, "incident_type_id") if err != nil {
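// On failure (for example, a 404 Not Found when the incident type UUID does not exist), log the error and the raw HTTP response for debugging.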
fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncidentType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncidentType`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get incident type details ``` // Get incident type details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTypeResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentType", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentTypeResponse result = apiInstance.getIncidentType("incident_type_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#getIncidentType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get incident type details ``` // Get incident type details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentType", true); let api = IncidentsAPI::with_config(configuration); let resp = api.get_incident_type("incident_type_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get incident type details ``` /** * Get incident type details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentType"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiGetIncidentTypeRequest = { incidentTypeId: "incident_type_id", }; apiInstance .getIncidentType(params) .then((data: v2.IncidentTypeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an incident type](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-type-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.ap2.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.eu/api/v2/incidents/config/types/{incident_type_id}https://api.ddog-gov.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us3.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us5.datadoghq.com/api/v2/incidents/config/types/{incident_type_id} ### Overview Update an incident type. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_type_id [_required_] string The UUID of the incident type. ### Request #### Body Data (required) Incident type payload. * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Incident type data for a patch request. attributes [_required_] object Incident type's attributes for updates. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean When true, this incident type will be used as the default type when an incident type is not specified. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. 
Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "id": "00000000-0000-0000-0000-000000000000", "attributes": { "name": "Security Incident-updated" }, "type": "incident_types" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentType-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Incident type response data. Field Type Description data [_required_] object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. 
Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "createdBy": "00000000-0000-0000-0000-000000000000", "description": "Any incidents that harm (or have the potential to) the confidentiality, integrity, or availability of our data.", "is_default": false, "lastModifiedBy": "00000000-0000-0000-0000-000000000000", "modifiedAt": "2019-09-19T10:00:00.000Z", "name": "Security Incident", "prefix": "IR" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "google_meet_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "google_meet_configurations" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "microsoft_teams_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "microsoft_teams_configurations" } }, "zoom_configuration": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "zoom_configurations" } } }, "type": "incident_types" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Update an incident type returns "OK" response ``` # Path parameters export incident_type_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X PATCH "https://api.datadoghq.com/api/v2/incidents/config/types/${incident_type_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "00000000-0000-0000-0000-000000000000", "attributes": { "name": "Security Incident-updated" }, "type": "incident_types" } } EOF ``` ##### Update an incident type returns "OK" response ``` // Update an incident type returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident_type" in the system IncidentTypeDataID := os.Getenv("INCIDENT_TYPE_DATA_ID") body := datadogV2.IncidentTypePatchRequest{ Data: datadogV2.IncidentTypePatchData{ Id: IncidentTypeDataID, Attributes: datadogV2.IncidentTypeUpdateAttributes{ Name: datadog.PtrString("Security Incident-updated"), }, Type: datadogV2.INCIDENTTYPETYPE_INCIDENT_TYPES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentType", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncidentType(ctx, IncidentTypeDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncidentType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncidentType`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an incident type returns "OK" response ``` // Update an incident type returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentTypePatchData; import com.datadog.api.client.v2.model.IncidentTypePatchRequest; import com.datadog.api.client.v2.model.IncidentTypeResponse; import com.datadog.api.client.v2.model.IncidentTypeType; import com.datadog.api.client.v2.model.IncidentTypeUpdateAttributes; public class
Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentType", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident_type" in the system String INCIDENT_TYPE_DATA_ATTRIBUTES_NAME = System.getenv("INCIDENT_TYPE_DATA_ATTRIBUTES_NAME"); String INCIDENT_TYPE_DATA_ID = System.getenv("INCIDENT_TYPE_DATA_ID"); IncidentTypePatchRequest body = new IncidentTypePatchRequest() .data( new IncidentTypePatchData() .id(INCIDENT_TYPE_DATA_ID) .attributes( new IncidentTypeUpdateAttributes().name("Security Incident-updated")) .type(IncidentTypeType.INCIDENT_TYPES)); try { IncidentTypeResponse result = apiInstance.updateIncidentType(INCIDENT_TYPE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncidentType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an incident type returns "OK" response ``` """ Update an incident type returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_type_patch_data import IncidentTypePatchData from datadog_api_client.v2.model.incident_type_patch_request import IncidentTypePatchRequest from datadog_api_client.v2.model.incident_type_type import IncidentTypeType from datadog_api_client.v2.model.incident_type_update_attributes import IncidentTypeUpdateAttributes # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ATTRIBUTES_NAME = environ["INCIDENT_TYPE_DATA_ATTRIBUTES_NAME"] INCIDENT_TYPE_DATA_ID = environ["INCIDENT_TYPE_DATA_ID"] body = IncidentTypePatchRequest( data=IncidentTypePatchData( id=INCIDENT_TYPE_DATA_ID, attributes=IncidentTypeUpdateAttributes( name="Security Incident-updated", ), type=IncidentTypeType.INCIDENT_TYPES, ), ) configuration = Configuration() configuration.unstable_operations["update_incident_type"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident_type(incident_type_id=INCIDENT_TYPE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an incident type returns "OK" response ``` # Update an incident type returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_type".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident_type" in the system 
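# Export INCIDENT_TYPE_DATA_ID with the UUID of the incident type to update before running; INCIDENT_TYPE_DATA_ATTRIBUTES_NAME is read below but the request body hardcodes the new name.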
INCIDENT_TYPE_DATA_ATTRIBUTES_NAME = ENV["INCIDENT_TYPE_DATA_ATTRIBUTES_NAME"] INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"] body = DatadogAPIClient::V2::IncidentTypePatchRequest.new({ data: DatadogAPIClient::V2::IncidentTypePatchData.new({ id: INCIDENT_TYPE_DATA_ID, attributes: DatadogAPIClient::V2::IncidentTypeUpdateAttributes.new({ name: "Security Incident-updated", }), type: DatadogAPIClient::V2::IncidentTypeType::INCIDENT_TYPES, }), }) p api_instance.update_incident_type(INCIDENT_TYPE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an incident type returns "OK" response ``` // Update an incident type returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::IncidentTypePatchData; use datadog_api_client::datadogV2::model::IncidentTypePatchRequest; use datadog_api_client::datadogV2::model::IncidentTypeType; use datadog_api_client::datadogV2::model::IncidentTypeUpdateAttributes; #[tokio::main] async fn main() { // there is a valid "incident_type" in the system let incident_type_data_id = std::env::var("INCIDENT_TYPE_DATA_ID").unwrap(); let body = IncidentTypePatchRequest::new(IncidentTypePatchData::new( IncidentTypeUpdateAttributes::new().name("Security Incident-updated".to_string()), incident_type_data_id.clone(), IncidentTypeType::INCIDENT_TYPES, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentType", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident_type(incident_type_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an incident type returns "OK" response ``` /** * Update an incident type returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentType"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident_type" in the system const INCIDENT_TYPE_DATA_ID = process.env.INCIDENT_TYPE_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentTypeRequest = { body: { data: { id: INCIDENT_TYPE_DATA_ID, attributes: { name: "Security Incident-updated", }, type: "incident_types", }, }, incidentTypeId: INCIDENT_TYPE_DATA_ID, }; apiInstance .updateIncidentType(params) .then((data: v2.IncidentTypeResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an incident type](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-type) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-type-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.ap2.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.eu/api/v2/incidents/config/types/{incident_type_id}https://api.ddog-gov.com/api/v2/incidents/config/types/{incident_type_id}https://api.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us3.datadoghq.com/api/v2/incidents/config/types/{incident_type_id}https://api.us5.datadoghq.com/api/v2/incidents/config/types/{incident_type_id} ### Overview Delete an incident type. This endpoint requires the `incident_settings_write` permission. OAuth apps require the `incident_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_type_id [_required_] string The UUID of the incident type. ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentType-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Delete an incident type ``` # Path parameters export incident_type_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v2/incidents/config/types/${incident_type_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an incident type ``` """ Delete an incident type returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = environ["INCIDENT_TYPE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_incident_type"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) api_instance.delete_incident_type( incident_type_id=INCIDENT_TYPE_DATA_ID, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an incident type ``` # Delete an incident type returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_type".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"] api_instance.delete_incident_type(INCIDENT_TYPE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete an incident type ``` // Delete an incident type returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident_type" in the system IncidentTypeDataID := os.Getenv("INCIDENT_TYPE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentType", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err :=
api.DeleteIncidentType(ctx, IncidentTypeDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentType`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an incident type ``` // Delete an incident type returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentType", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident_type" in the system String INCIDENT_TYPE_DATA_ID = System.getenv("INCIDENT_TYPE_DATA_ID"); try { apiInstance.deleteIncidentType(INCIDENT_TYPE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentType"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an incident type ``` // Delete an incident type returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "incident_type" in the system let incident_type_data_id = std::env::var("INCIDENT_TYPE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentType", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_type(incident_type_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an incident type ``` /** * Delete an incident type returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentType"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident_type" in the system const INCIDENT_TYPE_DATA_ID = process.env.INCIDENT_TYPE_DATA_ID as string; const params: v2.IncidentsApiDeleteIncidentTypeRequest = { incidentTypeId: INCIDENT_TYPE_DATA_ID, }; apiInstance 
.deleteIncidentType(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List incident notification templates](https://docs.datadoghq.com/api/latest/incidents/#list-incident-notification-templates) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#list-incident-notification-templates-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.ap2.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.datadoghq.eu/api/v2/incidents/config/notification-templateshttps://api.ddog-gov.com/api/v2/incidents/config/notification-templateshttps://api.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.us3.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates ### Overview Lists all notification templates. Optionally filter by incident type. This endpoint requires the `incident_notification_settings_read` permission. ### Arguments #### Query Strings Name Type Description filter[incident-type] string Optional incident type ID filter. include string Comma-separated list of relationships to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type` ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationTemplates-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with notification templates. Field Type Description data [_required_] [object] The `NotificationTemplateArray` `data`. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. 
id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` meta object Response metadata. page object Pagination metadata. total_count int64 Total number of notification templates. total_filtered_count int64 Total number of notification templates matching the filter. 
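The `filter[incident-type]` and `include` query strings described under Arguments can be combined on a single request. As a minimal curl sketch (the UUID is a placeholder, the two `include` values are examples of the supported relationships listed above, and the host should match your Datadog site):

```
# Optional filter value
export incident_type_id="CHANGE_ME"

# Curl command; -g (--globoff) keeps curl from treating the [] in filter[incident-type] as a glob pattern
curl -g -X GET "https://api.datadoghq.com/api/v2/incidents/config/notification-templates?filter[incident-type]=${incident_type_id}&include=created_by_user,incident_type" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The example below shows the shape of the corresponding 200 response.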
``` { "data": [ { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: {{incident.title}}\nSeverity: {{incident.severity}}\nAffected Services: {{incident.services}}\nStatus: {{incident.state}}\n\nPlease join the incident channel for updates.", "created": "2025-01-15T10:30:00Z", "modified": "2025-01-15T14:45:00Z", "name": "Incident Alert Template", "subject": "{{incident.severity}} Incident: {{incident.title}}" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "notification_templates" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "page": { "total_count": 42, "total_filtered_count": 15 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### List incident notification templates ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/incidents/config/notification-templates" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List incident notification templates ``` """ List incident notification templates returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi configuration = Configuration() configuration.unstable_operations["list_incident_notification_templates"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.list_incident_notification_templates() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List incident notification templates ``` # List incident notification templates returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_notification_templates".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.list_incident_notification_templates() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List incident notification templates ``` // List incident notification templates returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentNotificationTemplates", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentNotificationTemplates(ctx, *datadogV2.NewListIncidentNotificationTemplatesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentNotificationTemplates`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) }
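// The filter[incident-type] and include query parameters documented above are supplied through the optional-parameters value passed to the call; the exact setter names are generated with the client, so check the ListIncidentNotificationTemplatesOptionalParameters type in your version of the Go client rather than relying on names quoted here.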
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentNotificationTemplates`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List incident notification templates ``` // List incident notification templates returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentNotificationTemplateArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentNotificationTemplates", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentNotificationTemplateArray result = apiInstance.listIncidentNotificationTemplates(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentNotificationTemplates"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List incident notification templates ``` // List incident notification templates returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentNotificationTemplatesOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentNotificationTemplates", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_notification_templates( ListIncidentNotificationTemplatesOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List incident notification templates ``` /** * List incident notification templates returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentNotificationTemplates"] = true; const apiInstance = new v2.IncidentsApi(configuration); apiInstance .listIncidentNotificationTemplates() .then((data: v2.IncidentNotificationTemplateArray) => { console.log( 
"API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create incident notification template](https://docs.datadoghq.com/api/latest/incidents/#create-incident-notification-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-incident-notification-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.ap2.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.datadoghq.eu/api/v2/incidents/config/notification-templateshttps://api.ddog-gov.com/api/v2/incidents/config/notification-templateshttps://api.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.us3.datadoghq.com/api/v2/incidents/config/notification-templateshttps://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates ### Overview Creates a new notification template. This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Notification template data for a create request. attributes [_required_] object The attributes for creating a notification template. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. relationships object The definition of `NotificationTemplateCreateDataRelationships` object. incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` type [_required_] enum Notification templates resource type. 
Allowed enum values: `notification_templates` ``` { "data": { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: Sample Incident Title\nSeverity: SEV-2\nAffected Services: web-service, database-service\nStatus: active\n\nPlease join the incident channel for updates.", "name": "Example-Incident", "subject": "SEV-2 Incident: Sample Incident Title" }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "type": "notification_templates" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationTemplate-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification template. Field Type Description data [_required_] object Notification template data from a response. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. 
name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. 
data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: {{incident.title}}\nSeverity: {{incident.severity}}\nAffected Services: {{incident.services}}\nStatus: {{incident.state}}\n\nPlease join the incident channel for updates.", "created": "2025-01-15T10:30:00Z", "modified": "2025-01-15T14:45:00Z", "name": "Incident Alert Template", "subject": "{{incident.severity}} Incident: {{incident.title}}" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "notification_templates" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Create incident notification template returns "Created" response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/incidents/config/notification-templates" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: Sample Incident Title\nSeverity: SEV-2\nAffected Services: web-service, database-service\nStatus: active\n\nPlease join the incident channel for updates.", "name": "Example-Incident", "subject": "SEV-2 Incident: Sample Incident Title" }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "type": "notification_templates" } } EOF ``` ##### Create incident notification template returns "Created" response ``` // Create incident notification template returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident_type" in the system IncidentTypeDataID := os.Getenv("INCIDENT_TYPE_DATA_ID") body := datadogV2.CreateIncidentNotificationTemplateRequest{ Data: datadogV2.IncidentNotificationTemplateCreateData{ Attributes: datadogV2.IncidentNotificationTemplateCreateAttributes{ Category: "alert", Content: `An incident has been declared.
Title: Sample Incident Title Severity: SEV-2 Affected Services: web-service, database-service Status: active Please join the incident channel for updates.`, Name: "Example-Incident", Subject: "SEV-2 Incident: Sample Incident Title", }, Relationships: &datadogV2.IncidentNotificationTemplateCreateDataRelationships{ IncidentType: &datadogV2.RelationshipToIncidentType{ Data: datadogV2.RelationshipToIncidentTypeData{ Id: IncidentTypeDataID, Type: datadogV2.INCIDENTTYPETYPE_INCIDENT_TYPES, }, }, }, Type: datadogV2.INCIDENTNOTIFICATIONTEMPLATETYPE_NOTIFICATION_TEMPLATES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentNotificationTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncidentNotificationTemplate(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentNotificationTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentNotificationTemplate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create incident notification template returns "Created" response ``` // Create incident notification template returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.CreateIncidentNotificationTemplateRequest; import com.datadog.api.client.v2.model.IncidentNotificationTemplate; import com.datadog.api.client.v2.model.IncidentNotificationTemplateCreateAttributes; import com.datadog.api.client.v2.model.IncidentNotificationTemplateCreateData; import com.datadog.api.client.v2.model.IncidentNotificationTemplateCreateDataRelationships; import com.datadog.api.client.v2.model.IncidentNotificationTemplateType; import com.datadog.api.client.v2.model.IncidentTypeType; import com.datadog.api.client.v2.model.RelationshipToIncidentType; import com.datadog.api.client.v2.model.RelationshipToIncidentTypeData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createIncidentNotificationTemplate", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "incident_type" in the system String INCIDENT_TYPE_DATA_ID = System.getenv("INCIDENT_TYPE_DATA_ID"); CreateIncidentNotificationTemplateRequest body = new CreateIncidentNotificationTemplateRequest() .data( new IncidentNotificationTemplateCreateData() .attributes( new IncidentNotificationTemplateCreateAttributes() .category("alert") .content( """ An incident has been declared. Title: Sample Incident Title Severity: SEV-2 Affected Services: web-service, database-service Status: active Please join the incident channel for updates. 
""") .name("Example-Incident") .subject("SEV-2 Incident: Sample Incident Title")) .relationships( new IncidentNotificationTemplateCreateDataRelationships() .incidentType( new RelationshipToIncidentType() .data( new RelationshipToIncidentTypeData() .id(INCIDENT_TYPE_DATA_ID) .type(IncidentTypeType.INCIDENT_TYPES)))) .type(IncidentNotificationTemplateType.NOTIFICATION_TEMPLATES)); try { IncidentNotificationTemplate result = apiInstance.createIncidentNotificationTemplate(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#createIncidentNotificationTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create incident notification template returns "Created" response ``` """ Create incident notification template returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.create_incident_notification_template_request import ( CreateIncidentNotificationTemplateRequest, ) from datadog_api_client.v2.model.incident_notification_template_create_attributes import ( IncidentNotificationTemplateCreateAttributes, ) from datadog_api_client.v2.model.incident_notification_template_create_data import ( IncidentNotificationTemplateCreateData, ) from datadog_api_client.v2.model.incident_notification_template_create_data_relationships import ( IncidentNotificationTemplateCreateDataRelationships, ) from datadog_api_client.v2.model.incident_notification_template_type import IncidentNotificationTemplateType from datadog_api_client.v2.model.incident_type_type import IncidentTypeType from datadog_api_client.v2.model.relationship_to_incident_type import RelationshipToIncidentType from datadog_api_client.v2.model.relationship_to_incident_type_data import RelationshipToIncidentTypeData # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = environ["INCIDENT_TYPE_DATA_ID"] body = CreateIncidentNotificationTemplateRequest( data=IncidentNotificationTemplateCreateData( attributes=IncidentNotificationTemplateCreateAttributes( category="alert", content="An incident has been declared.\n\nTitle: Sample Incident Title\nSeverity: SEV-2\nAffected Services: web-service, database-service\nStatus: active\n\nPlease join the incident channel for updates.", name="Example-Incident", subject="SEV-2 Incident: Sample Incident Title", ), relationships=IncidentNotificationTemplateCreateDataRelationships( incident_type=RelationshipToIncidentType( data=RelationshipToIncidentTypeData( id=INCIDENT_TYPE_DATA_ID, type=IncidentTypeType.INCIDENT_TYPES, ), ), ), type=IncidentNotificationTemplateType.NOTIFICATION_TEMPLATES, ), ) configuration = Configuration() configuration.unstable_operations["create_incident_notification_template"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_notification_template(body=body) 
print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create incident notification template returns "Created" response ``` # Create incident notification template returns "Created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_notification_template".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"] body = DatadogAPIClient::V2::CreateIncidentNotificationTemplateRequest.new({ data: DatadogAPIClient::V2::IncidentNotificationTemplateCreateData.new({ attributes: DatadogAPIClient::V2::IncidentNotificationTemplateCreateAttributes.new({ category: "alert", content: 'An incident has been declared.\n\nTitle: Sample Incident Title\nSeverity: SEV-2\nAffected Services: web-service, database-service\nStatus: active\n\nPlease join the incident channel for updates.', name: "Example-Incident", subject: "SEV-2 Incident: Sample Incident Title", }), relationships: DatadogAPIClient::V2::IncidentNotificationTemplateCreateDataRelationships.new({ incident_type: DatadogAPIClient::V2::RelationshipToIncidentType.new({ data: DatadogAPIClient::V2::RelationshipToIncidentTypeData.new({ id: INCIDENT_TYPE_DATA_ID, type: DatadogAPIClient::V2::IncidentTypeType::INCIDENT_TYPES, }), }), }), type: DatadogAPIClient::V2::IncidentNotificationTemplateType::NOTIFICATION_TEMPLATES, }), }) p api_instance.create_incident_notification_template(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create incident notification template returns "Created" response ``` // Create incident notification template returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::model::CreateIncidentNotificationTemplateRequest; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateCreateAttributes; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateCreateData; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateCreateDataRelationships; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateType; use datadog_api_client::datadogV2::model::IncidentTypeType; use datadog_api_client::datadogV2::model::RelationshipToIncidentType; use datadog_api_client::datadogV2::model::RelationshipToIncidentTypeData; #[tokio::main] async fn main() { // there is a valid "incident_type" in the system let incident_type_data_id = std::env::var("INCIDENT_TYPE_DATA_ID").unwrap(); let body = CreateIncidentNotificationTemplateRequest::new( IncidentNotificationTemplateCreateData::new( IncidentNotificationTemplateCreateAttributes::new( "alert".to_string(), r#"An incident has been declared. 
Title: Sample Incident Title Severity: SEV-2 Affected Services: web-service, database-service Status: active Please join the incident channel for updates."# .to_string(), "Example-Incident".to_string(), "SEV-2 Incident: Sample Incident Title".to_string(), ), IncidentNotificationTemplateType::NOTIFICATION_TEMPLATES, ) .relationships( IncidentNotificationTemplateCreateDataRelationships::new().incident_type( RelationshipToIncidentType::new(RelationshipToIncidentTypeData::new( incident_type_data_id.clone(), IncidentTypeType::INCIDENT_TYPES, )), ), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateIncidentNotificationTemplate", true); let api = IncidentsAPI::with_config(configuration); let resp = api.create_incident_notification_template(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create incident notification template returns "Created" response ``` /** * Create incident notification template returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createIncidentNotificationTemplate"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "incident_type" in the system const INCIDENT_TYPE_DATA_ID = process.env.INCIDENT_TYPE_DATA_ID as string; const params: v2.IncidentsApiCreateIncidentNotificationTemplateRequest = { body: { data: { attributes: { category: "alert", content: "An incident has been declared.\n\nTitle: Sample Incident Title\nSeverity: SEV-2\nAffected Services: web-service, database-service\nStatus: active\n\nPlease join the incident channel for updates.", name: "Example-Incident", subject: "SEV-2 Incident: Sample Incident Title", }, relationships: { incidentType: { data: { id: INCIDENT_TYPE_DATA_ID, type: "incident_types", }, }, }, type: "notification_templates", }, }, }; apiInstance .createIncidentNotificationTemplate(params) .then((data: v2.IncidentNotificationTemplate) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get incident notification template](https://docs.datadoghq.com/api/latest/incidents/#get-incident-notification-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-incident-notification-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
GET https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-templates/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates/{id} ### Overview Retrieves a specific notification template by its ID. This endpoint requires any of the following permissions: * `incident_settings_read` * `incident_write` * `incident_read` OAuth apps require the `incident_read, incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the notification template. #### Query Strings Name Type Description include string Comma-separated list of relationships to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type` ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationTemplate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification template. Field Type Description data [_required_] object Notification template data from a response. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. 
last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: {{incident.title}}\nSeverity: {{incident.severity}}\nAffected Services: {{incident.services}}\nStatus: {{incident.state}}\n\nPlease join the incident channel for updates.", "created": "2025-01-15T10:30:00Z", "modified": "2025-01-15T14:45:00Z", "name": "Incident Alert Template", "subject": "{{incident.severity}} Incident: {{incident.title}}" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "notification_templates" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Get incident notification template ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/incidents/config/notification-templates/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get incident notification template ``` """ Get incident notification template returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi # there is a valid "notification_template" in the system NOTIFICATION_TEMPLATE_DATA_ID = environ["NOTIFICATION_TEMPLATE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_incident_notification_template"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.get_incident_notification_template( id=NOTIFICATION_TEMPLATE_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get incident notification template ``` # Get incident notification template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_incident_notification_template".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "notification_template" in the system NOTIFICATION_TEMPLATE_DATA_ID = ENV["NOTIFICATION_TEMPLATE_DATA_ID"] p api_instance.get_incident_notification_template(NOTIFICATION_TEMPLATE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ```
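# Set DD_SITE to a single Datadog site value, for example datadoghq.com or datadoghq.eu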
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get incident notification template ``` // Get incident notification template returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "notification_template" in the system NotificationTemplateDataID := uuid.MustParse(os.Getenv("NOTIFICATION_TEMPLATE_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentNotificationTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncidentNotificationTemplate(ctx, NotificationTemplateDataID, *datadogV2.NewGetIncidentNotificationTemplateOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncidentNotificationTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncidentNotificationTemplate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get incident notification template ``` // Get incident notification template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentNotificationTemplate; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getIncidentNotificationTemplate", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "notification_template" in the system UUID NOTIFICATION_TEMPLATE_DATA_ID = null; try { NOTIFICATION_TEMPLATE_DATA_ID = UUID.fromString(System.getenv("NOTIFICATION_TEMPLATE_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { IncidentNotificationTemplate result = apiInstance.getIncidentNotificationTemplate(NOTIFICATION_TEMPLATE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#getIncidentNotificationTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get incident notification 
template ``` // Get incident notification template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::GetIncidentNotificationTemplateOptionalParams; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; #[tokio::main] async fn main() { // there is a valid "notification_template" in the system let notification_template_data_id = uuid::Uuid::parse_str(&std::env::var("NOTIFICATION_TEMPLATE_DATA_ID").unwrap()) .expect("Invalid UUID"); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetIncidentNotificationTemplate", true); let api = IncidentsAPI::with_config(configuration); let resp = api .get_incident_notification_template( notification_template_data_id.clone(), GetIncidentNotificationTemplateOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get incident notification template ``` /** * Get incident notification template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentNotificationTemplate"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "notification_template" in the system const NOTIFICATION_TEMPLATE_DATA_ID = process.env .NOTIFICATION_TEMPLATE_DATA_ID as string; const params: v2.IncidentsApiGetIncidentNotificationTemplateRequest = { id: NOTIFICATION_TEMPLATE_DATA_ID, }; apiInstance .getIncidentNotificationTemplate(params) .then((data: v2.IncidentNotificationTemplate) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update incident notification template](https://docs.datadoghq.com/api/latest/incidents/#update-incident-notification-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-incident-notification-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-templates/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates/{id} ### Overview Updates an existing notification template’s attributes. 
This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the notification template. #### Query Strings Name Type Description include string Comma-separated list of relationships to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Notification template data for an update request. attributes object The attributes to update on a notification template. category string The category of the notification template. content string The content body of the notification template. name string The name of the notification template. subject string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` ``` { "data": { "attributes": { "category": "update", "content": "Incident Status Update:\n\nTitle: Sample Incident Title\nNew Status: resolved\nSeverity: SEV-2\nServices: web-service, database-service\nCommander: John Doe\n\nFor more details, visit the incident page.", "name": "Example-Incident", "subject": "Incident Update: Sample Incident Title - resolved" }, "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationTemplate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification template. Field Type Description data [_required_] object Notification template data from a response. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. 
created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` ``` { "data": { "attributes": { "category": "alert", "content": "An incident has been declared.\n\nTitle: {{incident.title}}\nSeverity: {{incident.severity}}\nAffected Services: {{incident.services}}\nStatus: {{incident.state}}\n\nPlease join the incident channel for updates.", "created": "2025-01-15T10:30:00Z", "modified": "2025-01-15T14:45:00Z", "name": "Incident Alert Template", "subject": "{{incident.severity}} Incident: {{incident.title}}" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "notification_templates" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Update incident notification template returns "OK" response ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X PATCH "https://api.datadoghq.com/api/v2/incidents/config/notification-templates/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "update", "content": "Incident Status Update:\n\nTitle: Sample Incident Title\nNew Status: resolved\nSeverity: SEV-2\nServices: web-service, database-service\nCommander: John Doe\n\nFor more details, visit the incident page.", "name": "Example-Incident", "subject": "Incident Update: Sample Incident Title - resolved" }, "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } EOF ``` ##### Update incident notification template returns "OK" response ``` // Update incident notification template returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "notification_template" in the system NotificationTemplateDataID := uuid.MustParse(os.Getenv("NOTIFICATION_TEMPLATE_DATA_ID")) body := datadogV2.PatchIncidentNotificationTemplateRequest{ Data: datadogV2.IncidentNotificationTemplateUpdateData{ Attributes: &datadogV2.IncidentNotificationTemplateUpdateAttributes{ Category: datadog.PtrString("update"), Content: datadog.PtrString(`Incident Status Update: Title: Sample Incident Title New
##### Update incident notification template returns "OK" response ``` // Update incident notification template returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "notification_template" in the system NotificationTemplateDataID := uuid.MustParse(os.Getenv("NOTIFICATION_TEMPLATE_DATA_ID")) body := datadogV2.PatchIncidentNotificationTemplateRequest{ Data: datadogV2.IncidentNotificationTemplateUpdateData{ Attributes: &datadogV2.IncidentNotificationTemplateUpdateAttributes{ Category: datadog.PtrString("update"), Content: datadog.PtrString(`Incident Status Update: Title: Sample Incident Title New Status: resolved Severity: SEV-2 Services: web-service, database-service Commander: John Doe For more details, visit the incident page.`), Name: datadog.PtrString("Example-Incident"), Subject: datadog.PtrString("Incident Update: Sample Incident Title - resolved"), }, Id: NotificationTemplateDataID, Type: datadogV2.INCIDENTNOTIFICATIONTEMPLATETYPE_NOTIFICATION_TEMPLATES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentNotificationTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncidentNotificationTemplate(ctx, NotificationTemplateDataID, body, *datadogV2.NewUpdateIncidentNotificationTemplateOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.UpdateIncidentNotificationTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncidentNotificationTemplate`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update incident notification template returns "OK" response ``` // Update incident notification template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentNotificationTemplate; import com.datadog.api.client.v2.model.IncidentNotificationTemplateType; import com.datadog.api.client.v2.model.IncidentNotificationTemplateUpdateAttributes; import com.datadog.api.client.v2.model.IncidentNotificationTemplateUpdateData; import com.datadog.api.client.v2.model.PatchIncidentNotificationTemplateRequest; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentNotificationTemplate", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "notification_template" in the system UUID NOTIFICATION_TEMPLATE_DATA_ID = null; try { NOTIFICATION_TEMPLATE_DATA_ID = UUID.fromString(System.getenv("NOTIFICATION_TEMPLATE_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } PatchIncidentNotificationTemplateRequest body = new PatchIncidentNotificationTemplateRequest() .data( new IncidentNotificationTemplateUpdateData() .attributes( new IncidentNotificationTemplateUpdateAttributes() .category("update") .content( """ Incident Status Update: Title: Sample Incident Title New Status: resolved Severity: SEV-2 Services: web-service, database-service Commander: John Doe For more details, visit the incident page.
""") .name("Example-Incident") .subject("Incident Update: Sample Incident Title - resolved")) .id(NOTIFICATION_TEMPLATE_DATA_ID) .type(IncidentNotificationTemplateType.NOTIFICATION_TEMPLATES)); try { IncidentNotificationTemplate result = apiInstance.updateIncidentNotificationTemplate(NOTIFICATION_TEMPLATE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#updateIncidentNotificationTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update incident notification template returns "OK" response ``` """ Update incident notification template returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_notification_template_type import IncidentNotificationTemplateType from datadog_api_client.v2.model.incident_notification_template_update_attributes import ( IncidentNotificationTemplateUpdateAttributes, ) from datadog_api_client.v2.model.incident_notification_template_update_data import ( IncidentNotificationTemplateUpdateData, ) from datadog_api_client.v2.model.patch_incident_notification_template_request import ( PatchIncidentNotificationTemplateRequest, ) # there is a valid "notification_template" in the system NOTIFICATION_TEMPLATE_DATA_ID = environ["NOTIFICATION_TEMPLATE_DATA_ID"] body = PatchIncidentNotificationTemplateRequest( data=IncidentNotificationTemplateUpdateData( attributes=IncidentNotificationTemplateUpdateAttributes( category="update", content="Incident Status Update:\n\nTitle: Sample Incident Title\nNew Status: resolved\nSeverity: SEV-2\nServices: web-service, database-service\nCommander: John Doe\n\nFor more details, visit the incident page.", name="Example-Incident", subject="Incident Update: Sample Incident Title - resolved", ), id=NOTIFICATION_TEMPLATE_DATA_ID, type=IncidentNotificationTemplateType.NOTIFICATION_TEMPLATES, ), ) configuration = Configuration() configuration.unstable_operations["update_incident_notification_template"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident_notification_template(id=NOTIFICATION_TEMPLATE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update incident notification template returns "OK" response ``` # Update incident notification template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_notification_template".to_sym] = true end api_instance = 
DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "notification_template" in the system NOTIFICATION_TEMPLATE_DATA_ID = ENV["NOTIFICATION_TEMPLATE_DATA_ID"] body = DatadogAPIClient::V2::PatchIncidentNotificationTemplateRequest.new({ data: DatadogAPIClient::V2::IncidentNotificationTemplateUpdateData.new({ attributes: DatadogAPIClient::V2::IncidentNotificationTemplateUpdateAttributes.new({ category: "update", content: "Incident Status Update:\n\nTitle: Sample Incident Title\nNew Status: resolved\nSeverity: SEV-2\nServices: web-service, database-service\nCommander: John Doe\n\nFor more details, visit the incident page.", name: "Example-Incident", subject: "Incident Update: Sample Incident Title - resolved", }), id: NOTIFICATION_TEMPLATE_DATA_ID, type: DatadogAPIClient::V2::IncidentNotificationTemplateType::NOTIFICATION_TEMPLATES, }), }) p api_instance.update_incident_notification_template(NOTIFICATION_TEMPLATE_DATA_ID, body) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Update incident notification template returns "OK" response ``` // Update incident notification template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::UpdateIncidentNotificationTemplateOptionalParams; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateType; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateUpdateAttributes; use datadog_api_client::datadogV2::model::IncidentNotificationTemplateUpdateData; use datadog_api_client::datadogV2::model::PatchIncidentNotificationTemplateRequest; #[tokio::main] async fn main() { // there is a valid "notification_template" in the system let notification_template_data_id = uuid::Uuid::parse_str(&std::env::var("NOTIFICATION_TEMPLATE_DATA_ID").unwrap()) .expect("Invalid UUID"); let body = PatchIncidentNotificationTemplateRequest::new( IncidentNotificationTemplateUpdateData::new( notification_template_data_id.clone(), IncidentNotificationTemplateType::NOTIFICATION_TEMPLATES, ) .attributes( IncidentNotificationTemplateUpdateAttributes::new() .category("update".to_string()) .content( r#"Incident Status Update: Title: Sample Incident Title New Status: resolved Severity: SEV-2 Services: web-service, database-service Commander: John Doe For more details, visit the incident page."# .to_string(), ) .name("Example-Incident".to_string()) .subject("Incident Update: Sample Incident Title - resolved".to_string()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentNotificationTemplate", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident_notification_template( notification_template_data_id.clone(), body, UpdateIncidentNotificationTemplateOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ```
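# Set DD_SITE to your Datadog site, for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.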
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update incident notification template returns "OK" response ``` /** * Update incident notification template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentNotificationTemplate"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "notification_template" in the system const NOTIFICATION_TEMPLATE_DATA_ID = process.env .NOTIFICATION_TEMPLATE_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentNotificationTemplateRequest = { body: { data: { attributes: { category: "update", content: "Incident Status Update:\n\nTitle: Sample Incident Title\nNew Status: resolved\nSeverity: SEV-2\nServices: web-service, database-service\nCommander: John Doe\n\nFor more details, visit the incident page.", name: "Example-Incident", subject: "Incident Update: Sample Incident Title - resolved", }, id: NOTIFICATION_TEMPLATE_DATA_ID, type: "notification_templates", }, }, id: NOTIFICATION_TEMPLATE_DATA_ID, }; apiInstance .updateIncidentNotificationTemplate(params) .then((data: v2.IncidentNotificationTemplate) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a notification template](https://docs.datadoghq.com/api/latest/incidents/#delete-a-notification-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-a-notification-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-templates/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-templates/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-templates/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates/{id} ### Overview Deletes a notification template by its ID. This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the notification template. #### Query Strings Name Type Description include string Comma-separated list of relationships to include. 
Supported values: `created_by_user`, `last_modified_by_user`, `incident_type` ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationTemplate-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ```
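A successful deletion returns 204 with an empty body, so the clearest confirmation from a shell is the HTTP status code itself. The following is a minimal sketch, assuming the same `id`, `DD_API_KEY`, and `DD_APP_KEY` variables used in the examples below:

```
# Print only the HTTP status code: expect 204 on success, 404 if no template has that ID.
curl -s -o /dev/null -w "%{http_code}\n" -X DELETE \
  "https://api.datadoghq.com/api/v2/incidents/config/notification-templates/${id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```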
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Delete a notification template Copy ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/config/notification-templates/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a notification template ``` """ Delete a notification template returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from uuid import UUID configuration = Configuration() configuration.unstable_operations["delete_incident_notification_template"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) api_instance.delete_incident_notification_template( id=UUID("00000000-0000-0000-0000-000000000001"), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a notification template ``` # Delete a notification template returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_notification_template".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new api_instance.delete_incident_notification_template("00000000-0000-0000-0000-000000000001") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a notification template ``` // Delete a notification template returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentNotificationTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncidentNotificationTemplate(ctx, uuid.MustParse("00000000-0000-0000-0000-000000000001"), *datadogV2.NewDeleteIncidentNotificationTemplateOptionalParameters()) if err != nil { 
fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentNotificationTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a notification template ``` // Delete a notification template returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentNotificationTemplate", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { apiInstance.deleteIncidentNotificationTemplate( UUID.fromString("00000000-0000-0000-0000-000000000001")); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentNotificationTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a notification template ``` // Delete a notification template returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::DeleteIncidentNotificationTemplateOptionalParams; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use uuid::Uuid; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentNotificationTemplate", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_notification_template( Uuid::parse_str("00000000-0000-0000-0000-000000000001").expect("invalid UUID"), DeleteIncidentNotificationTemplateOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a notification template ``` /** * Delete a notification template returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentNotificationTemplate"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiDeleteIncidentNotificationTemplateRequest = { id: 
"00000000-0000-0000-0000-000000000001", }; apiInstance .deleteIncidentNotificationTemplate(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List incident notification rules](https://docs.datadoghq.com/api/latest/incidents/#list-incident-notification-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#list-incident-notification-rules-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.datadoghq.eu/api/v2/incidents/config/notification-ruleshttps://api.ddog-gov.com/api/v2/incidents/config/notification-ruleshttps://api.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.us3.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules ### Overview Lists all notification rules for the organization. Optionally filter by incident type. This endpoint requires the `incident_notification_settings_read` permission. ### Arguments #### Query Strings Name Type Description include string Comma-separated list of resources to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type`, `notification_template` ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentNotificationRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with notification rules. Field Type Description data [_required_] [object] The `NotificationRuleArray` `data`. attributes object The notification rule's attributes. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. created [_required_] date-time Timestamp when the notification rule was created. enabled [_required_] boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. modified [_required_] date-time Timestamp when the notification rule was last modified. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility [_required_] enum The visibility of the notification rule. 
Allowed enum values: `all,organization,private` id [_required_] uuid The unique identifier of the notification rule. relationships object The notification rule's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. Allowed enum values: `incident_notification_rules` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. 
createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` Option 3 object A notification template object for inclusion in other resources. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. 
Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` meta object Response metadata. pagination object Pagination metadata. next_offset int64 The offset for the next page of results. offset int64 The current offset in the results. size int64 The number of results returned per page. ``` { "data": [ { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "created": "2025-01-15T10:30:00Z", "enabled": true, "handles": [ "@team-email@company.com", "@slack-channel" ], "modified": "2025-01-15T14:45:00Z", "renotify_on": [ "status", "severity" ], "trigger": "incident_created_trigger", "visibility": "organization" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "notification_template": { "data": { "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } }, "type": "incident_notification_rules" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "pagination": { "next_offset": 15, "offset": 0, "size": 15 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### List incident notification rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List incident notification rules ``` """ List incident notification rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi configuration = Configuration() configuration.unstable_operations["list_incident_notification_rules"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.list_incident_notification_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List incident notification rules ``` # List incident notification rules returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_notification_rules".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.list_incident_notification_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List incident notification rules ``` // List incident notification rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentNotificationRules", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentNotificationRules(ctx, 
*datadogV2.NewListIncidentNotificationRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentNotificationRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentNotificationRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List incident notification rules ``` // List incident notification rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentNotificationRuleArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentNotificationRules", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentNotificationRuleArray result = apiInstance.listIncidentNotificationRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentNotificationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List incident notification rules ``` // List incident notification rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentNotificationRulesOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentNotificationRules", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_notification_rules(ListIncidentNotificationRulesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List incident notification rules ``` /** * List incident notification rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentNotificationRules"] = true; const 
apiInstance = new v2.IncidentsApi(configuration); apiInstance .listIncidentNotificationRules() .then((data: v2.IncidentNotificationRuleArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-notification-rule-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.datadoghq.eu/api/v2/incidents/config/notification-ruleshttps://api.ddog-gov.com/api/v2/incidents/config/notification-ruleshttps://api.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.us3.datadoghq.com/api/v2/incidents/config/notification-ruleshttps://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules ### Overview Creates a new notification rule. This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Notification rule data for a create request. attributes [_required_] object The attributes for creating a notification rule. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. enabled boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility enum The visibility of the notification rule. Allowed enum values: `all,organization,private` relationships object The definition of `NotificationRuleCreateDataRelationships` object. incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. 
Allowed enum values: `incident_notification_rules` ``` { "data": { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "handles": [ "@test-email@company.com" ], "visibility": "organization", "trigger": "incident_created_trigger", "enabled": true }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "type": "incident_notification_rules" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentNotificationRule-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification rule. Field Type Description data [_required_] object Notification rule data from a response. attributes object The notification rule's attributes. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. created [_required_] date-time Timestamp when the notification rule was created. enabled [_required_] boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. modified [_required_] date-time Timestamp when the notification rule was last modified. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility [_required_] enum The visibility of the notification rule. Allowed enum values: `all,organization,private` id [_required_] uuid The unique identifier of the notification rule. relationships object The notification rule's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. 
Allowed enum values: `incident_notification_rules` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` Option 3 object A notification template object for inclusion in other resources. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. 
Allowed enum values: `notification_templates` ``` { "data": { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "created": "2025-01-15T10:30:00Z", "enabled": true, "handles": [ "@team-email@company.com", "@slack-channel" ], "modified": "2025-01-15T14:45:00Z", "renotify_on": [ "status", "severity" ], "trigger": "incident_created_trigger", "visibility": "organization" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "notification_template": { "data": { "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } }, "type": "incident_notification_rules" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Create incident notification rule returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "handles": [ "@test-email@company.com" ], "visibility": "organization", "trigger": "incident_created_trigger", "enabled": true }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "type": "incident_notification_rules" } } EOF ``` ##### Create incident notification rule returns "Created" response ``` // Create incident notification rule returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "incident_type" in the system IncidentTypeDataID := os.Getenv("INCIDENT_TYPE_DATA_ID") body := datadogV2.CreateIncidentNotificationRuleRequest{ Data: datadogV2.IncidentNotificationRuleCreateData{ Attributes: datadogV2.IncidentNotificationRuleCreateAttributes{ Conditions: []datadogV2.IncidentNotificationRuleConditionsItems{ { Field: "severity", Values: []string{ "SEV-1", "SEV-2", }, }, }, Handles: []string{ "@test-email@company.com", }, Visibility: datadogV2.INCIDENTNOTIFICATIONRULECREATEATTRIBUTESVISIBILITY_ORGANIZATION.Ptr(), Trigger: "incident_created_trigger", Enabled: datadog.PtrBool(true), }, Relationships: &datadogV2.IncidentNotificationRuleCreateDataRelationships{ IncidentType: &datadogV2.RelationshipToIncidentType{ Data: datadogV2.RelationshipToIncidentTypeData{ Id: IncidentTypeDataID, Type: datadogV2.INCIDENTTYPETYPE_INCIDENT_TYPES, }, }, }, Type: datadogV2.INCIDENTNOTIFICATIONRULETYPE_INCIDENT_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateIncidentNotificationRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.CreateIncidentNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.CreateIncidentNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.CreateIncidentNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Create incident notification rule returns "Created" response

```
// Create incident notification rule returns "Created" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.IncidentsApi;
import com.datadog.api.client.v2.model.CreateIncidentNotificationRuleRequest;
import com.datadog.api.client.v2.model.IncidentNotificationRule;
import com.datadog.api.client.v2.model.IncidentNotificationRuleConditionsItems;
import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateAttributes;
import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateAttributesVisibility;
import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateData;
import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateDataRelationships;
import com.datadog.api.client.v2.model.IncidentNotificationRuleType;
import com.datadog.api.client.v2.model.IncidentTypeType;
import com.datadog.api.client.v2.model.RelationshipToIncidentType;
import com.datadog.api.client.v2.model.RelationshipToIncidentTypeData;
import java.util.Arrays;
import java.util.Collections;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.createIncidentNotificationRule", true);
    IncidentsApi apiInstance = new IncidentsApi(defaultClient);

    // there is a valid "incident_type" in the system
    String INCIDENT_TYPE_DATA_ID = System.getenv("INCIDENT_TYPE_DATA_ID");

    CreateIncidentNotificationRuleRequest body =
        new CreateIncidentNotificationRuleRequest()
            .data(
                new IncidentNotificationRuleCreateData()
                    .attributes(
                        new IncidentNotificationRuleCreateAttributes()
                            .conditions(
                                Collections.singletonList(
                                    new IncidentNotificationRuleConditionsItems()
                                        .field("severity")
                                        .values(Arrays.asList("SEV-1", "SEV-2"))))
                            .handles(Collections.singletonList("@test-email@company.com"))
                            .visibility(
                                IncidentNotificationRuleCreateAttributesVisibility.ORGANIZATION)
                            .trigger("incident_created_trigger")
                            .enabled(true))
                    .relationships(
                        new IncidentNotificationRuleCreateDataRelationships()
                            .incidentType(
                                new RelationshipToIncidentType()
                                    .data(
                                        new RelationshipToIncidentTypeData()
                                            .id(INCIDENT_TYPE_DATA_ID)
                                            .type(IncidentTypeType.INCIDENT_TYPES))))
                    .type(IncidentNotificationRuleType.INCIDENT_NOTIFICATION_RULES));

    try {
      IncidentNotificationRule result = apiInstance.createIncidentNotificationRule(body);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling IncidentsApi#createIncidentNotificationRule");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
``` """ Create incident notification rule returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.create_incident_notification_rule_request import CreateIncidentNotificationRuleRequest from datadog_api_client.v2.model.incident_notification_rule_conditions_items import ( IncidentNotificationRuleConditionsItems, ) from datadog_api_client.v2.model.incident_notification_rule_create_attributes import ( IncidentNotificationRuleCreateAttributes, ) from datadog_api_client.v2.model.incident_notification_rule_create_attributes_visibility import ( IncidentNotificationRuleCreateAttributesVisibility, ) from datadog_api_client.v2.model.incident_notification_rule_create_data import IncidentNotificationRuleCreateData from datadog_api_client.v2.model.incident_notification_rule_create_data_relationships import ( IncidentNotificationRuleCreateDataRelationships, ) from datadog_api_client.v2.model.incident_notification_rule_type import IncidentNotificationRuleType from datadog_api_client.v2.model.incident_type_type import IncidentTypeType from datadog_api_client.v2.model.relationship_to_incident_type import RelationshipToIncidentType from datadog_api_client.v2.model.relationship_to_incident_type_data import RelationshipToIncidentTypeData # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = environ["INCIDENT_TYPE_DATA_ID"] body = CreateIncidentNotificationRuleRequest( data=IncidentNotificationRuleCreateData( attributes=IncidentNotificationRuleCreateAttributes( conditions=[ IncidentNotificationRuleConditionsItems( field="severity", values=[ "SEV-1", "SEV-2", ], ), ], handles=[ "@test-email@company.com", ], visibility=IncidentNotificationRuleCreateAttributesVisibility.ORGANIZATION, trigger="incident_created_trigger", enabled=True, ), relationships=IncidentNotificationRuleCreateDataRelationships( incident_type=RelationshipToIncidentType( data=RelationshipToIncidentTypeData( id=INCIDENT_TYPE_DATA_ID, type=IncidentTypeType.INCIDENT_TYPES, ), ), ), type=IncidentNotificationRuleType.INCIDENT_NOTIFICATION_RULES, ), ) configuration = Configuration() configuration.unstable_operations["create_incident_notification_rule"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.create_incident_notification_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create incident notification rule returns "Created" response ``` # Create incident notification rule returns "Created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_incident_notification_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"] body = DatadogAPIClient::V2::CreateIncidentNotificationRuleRequest.new({ data: DatadogAPIClient::V2::IncidentNotificationRuleCreateData.new({ attributes: DatadogAPIClient::V2::IncidentNotificationRuleCreateAttributes.new({ conditions: [ 
##### Create incident notification rule returns "Created" response

```
# Create incident notification rule returns "Created" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.create_incident_notification_rule".to_sym] = true
end

api_instance = DatadogAPIClient::V2::IncidentsAPI.new

# there is a valid "incident_type" in the system
INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"]

body = DatadogAPIClient::V2::CreateIncidentNotificationRuleRequest.new({
  data: DatadogAPIClient::V2::IncidentNotificationRuleCreateData.new({
    attributes: DatadogAPIClient::V2::IncidentNotificationRuleCreateAttributes.new({
      conditions: [
        DatadogAPIClient::V2::IncidentNotificationRuleConditionsItems.new({
          field: "severity",
          values: [
            "SEV-1",
            "SEV-2",
          ],
        }),
      ],
      handles: [
        "@test-email@company.com",
      ],
      visibility: DatadogAPIClient::V2::IncidentNotificationRuleCreateAttributesVisibility::ORGANIZATION,
      trigger: "incident_created_trigger",
      enabled: true,
    }),
    relationships: DatadogAPIClient::V2::IncidentNotificationRuleCreateDataRelationships.new({
      incident_type: DatadogAPIClient::V2::RelationshipToIncidentType.new({
        data: DatadogAPIClient::V2::RelationshipToIncidentTypeData.new({
          id: INCIDENT_TYPE_DATA_ID,
          type: DatadogAPIClient::V2::IncidentTypeType::INCIDENT_TYPES,
        }),
      }),
    }),
    type: DatadogAPIClient::V2::IncidentNotificationRuleType::INCIDENT_NOTIFICATION_RULES,
  }),
})
p api_instance.create_incident_notification_rule(body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Create incident notification rule returns "Created" response

```
// Create incident notification rule returns "Created" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_incidents::IncidentsAPI;
use datadog_api_client::datadogV2::model::CreateIncidentNotificationRuleRequest;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleConditionsItems;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateAttributes;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateAttributesVisibility;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateData;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateDataRelationships;
use datadog_api_client::datadogV2::model::IncidentNotificationRuleType;
use datadog_api_client::datadogV2::model::IncidentTypeType;
use datadog_api_client::datadogV2::model::RelationshipToIncidentType;
use datadog_api_client::datadogV2::model::RelationshipToIncidentTypeData;

#[tokio::main]
async fn main() {
    // there is a valid "incident_type" in the system
    let incident_type_data_id = std::env::var("INCIDENT_TYPE_DATA_ID").unwrap();
    let body = CreateIncidentNotificationRuleRequest::new(
        IncidentNotificationRuleCreateData::new(
            IncidentNotificationRuleCreateAttributes::new(
                vec![IncidentNotificationRuleConditionsItems::new(
                    "severity".to_string(),
                    vec!["SEV-1".to_string(), "SEV-2".to_string()],
                )],
                vec!["@test-email@company.com".to_string()],
                "incident_created_trigger".to_string(),
            )
            .enabled(true)
            .visibility(IncidentNotificationRuleCreateAttributesVisibility::ORGANIZATION),
            IncidentNotificationRuleType::INCIDENT_NOTIFICATION_RULES,
        )
        .relationships(
            IncidentNotificationRuleCreateDataRelationships::new().incident_type(
                RelationshipToIncidentType::new(RelationshipToIncidentTypeData::new(
                    incident_type_data_id.clone(),
                    IncidentTypeType::INCIDENT_TYPES,
                )),
            ),
        ),
    );
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.CreateIncidentNotificationRule", true);
    let api = IncidentsAPI::with_config(configuration);
    let resp = api.create_incident_notification_rule(body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Create incident notification rule returns "Created" response

```
/**
 * Create incident notification rule returns "Created" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.createIncidentNotificationRule"] = true;
const apiInstance = new v2.IncidentsApi(configuration);

// there is a valid "incident_type" in the system
const INCIDENT_TYPE_DATA_ID = process.env.INCIDENT_TYPE_DATA_ID as string;

const params: v2.IncidentsApiCreateIncidentNotificationRuleRequest = {
  body: {
    data: {
      attributes: {
        conditions: [
          {
            field: "severity",
            values: ["SEV-1", "SEV-2"],
          },
        ],
        handles: ["@test-email@company.com"],
        visibility: "organization",
        trigger: "incident_created_trigger",
        enabled: true,
      },
      relationships: {
        incidentType: {
          data: {
            id: INCIDENT_TYPE_DATA_ID,
            type: "incident_types",
          },
        },
      },
      type: "incident_notification_rules",
    },
  },
};

apiInstance
  .createIncidentNotificationRule(params)
  .then((data: v2.IncidentNotificationRule) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
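The examples above set only the basic attributes. The `renotify_on` attribute that appears in the response model and in the update request model below is part of the same create-attributes object, so it should also be accepted at creation time; the curl sketch below is illustrative only, reuses the placeholder incident type ID from the examples above, and assumes `api.datadoghq.com` is the API host for your site.

```
# Illustrative sketch only: create a rule that also re-notifies when status or severity changes.
# The incident type ID and the API host are placeholders; substitute real values for your org.
curl -X POST "https://api.datadoghq.com/api/v2/incidents/config/notification-rules" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "conditions": [
        { "field": "severity", "values": ["SEV-1", "SEV-2"] }
      ],
      "handles": ["@test-email@company.com"],
      "renotify_on": ["status", "severity"],
      "trigger": "incident_created_trigger",
      "visibility": "organization",
      "enabled": true
    },
    "relationships": {
      "incident_type": {
        "data": {
          "id": "00000000-0000-0000-0000-000000000000",
          "type": "incident_types"
        }
      }
    },
    "type": "incident_notification_rules"
  }
}
EOF
```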
* * *

## [Get an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#get-an-incident-notification-rule)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#get-an-incident-notification-rule-v2)

**Note**: This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/).

GET https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-rules/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules/{id}

### Overview

Retrieves a specific notification rule by its ID. This endpoint requires the `incident_notification_settings_read` permission.

OAuth apps require the `incident_notification_settings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description id [_required_] string The ID of the notification rule.

#### Query Strings

Name Type Description include string Comma-separated list of resources to include.
Supported values: `created_by_user`, `last_modified_by_user`, `incident_type`, `notification_template` ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#GetIncidentNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification rule. Field Type Description data [_required_] object Notification rule data from a response. attributes object The notification rule's attributes. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. created [_required_] date-time Timestamp when the notification rule was created. enabled [_required_] boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. modified [_required_] date-time Timestamp when the notification rule was last modified. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility [_required_] enum The visibility of the notification rule. Allowed enum values: `all,organization,private` id [_required_] uuid The unique identifier of the notification rule. relationships object The notification rule's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. Allowed enum values: `incident_notification_rules` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. 
icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. 
id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` Option 3 object A notification template object for inclusion in other resources. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. 
Allowed enum values: `notification_templates` ``` { "data": { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "created": "2025-01-15T10:30:00Z", "enabled": true, "handles": [ "@team-email@company.com", "@slack-channel" ], "modified": "2025-01-15T14:45:00Z", "renotify_on": [ "status", "severity" ], "trigger": "incident_created_trigger", "visibility": "organization" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "notification_template": { "data": { "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } }, "type": "incident_notification_rules" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Get an incident notification rule Copy ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an incident notification rule ``` """ Get an incident notification rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from uuid import UUID configuration = Configuration() configuration.unstable_operations["get_incident_notification_rule"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.get_incident_notification_rule( id=UUID("00000000-0000-0000-0000-000000000001"), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an incident notification rule ``` # Get an incident notification rule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_incident_notification_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.get_incident_notification_rule("00000000-0000-0000-0000-000000000001") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an incident notification rule ``` // Get an incident notification rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetIncidentNotificationRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.GetIncidentNotificationRule(ctx, uuid.MustParse("00000000-0000-0000-0000-000000000001"), 
##### Get an incident notification rule

```
// Get an incident notification rule returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
    "github.com/google/uuid"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    configuration.SetUnstableOperationEnabled("v2.GetIncidentNotificationRule", true)
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewIncidentsApi(apiClient)
    resp, r, err := api.GetIncidentNotificationRule(ctx, uuid.MustParse("00000000-0000-0000-0000-000000000001"), *datadogV2.NewGetIncidentNotificationRuleOptionalParameters())

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.GetIncidentNotificationRule`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", "  ")
    fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.GetIncidentNotificationRule`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get an incident notification rule

```
// Get an incident notification rule returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.IncidentsApi;
import com.datadog.api.client.v2.model.IncidentNotificationRule;
import java.util.UUID;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.getIncidentNotificationRule", true);
    IncidentsApi apiInstance = new IncidentsApi(defaultClient);

    try {
      IncidentNotificationRule result =
          apiInstance.getIncidentNotificationRule(
              UUID.fromString("00000000-0000-0000-0000-000000000001"));
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling IncidentsApi#getIncidentNotificationRule");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get an incident notification rule

```
// Get an incident notification rule returns "OK" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_incidents::GetIncidentNotificationRuleOptionalParams;
use datadog_api_client::datadogV2::api_incidents::IncidentsAPI;
use uuid::Uuid;

#[tokio::main]
async fn main() {
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.GetIncidentNotificationRule", true);
    let api = IncidentsAPI::with_config(configuration);
    let resp = api
        .get_incident_notification_rule(
            Uuid::parse_str("00000000-0000-0000-0000-000000000001").expect("invalid UUID"),
            GetIncidentNotificationRuleOptionalParams::default(),
        )
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get an incident notification rule

```
/**
 * Get an incident notification rule returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.getIncidentNotificationRule"] = true;
const apiInstance = new v2.IncidentsApi(configuration);

const params: v2.IncidentsApiGetIncidentNotificationRuleRequest = {
  id: "00000000-0000-0000-0000-000000000001",
};

apiInstance
  .getIncidentNotificationRule(params)
  .then((data: v2.IncidentNotificationRule) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
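None of the examples above pass the optional `include` query string documented under Arguments. A minimal curl sketch that also returns the related incident type and creating user in the `included` array of the response (illustrative; the API host and rule ID are placeholders):

```
# Illustrative sketch: fetch a rule and include related objects in the response.
# Supported include values, per the Arguments section above: created_by_user,
# last_modified_by_user, incident_type, notification_template.
export id="00000000-0000-0000-0000-000000000001"
curl -X GET "https://api.datadoghq.com/api/v2/incidents/config/notification-rules/${id}?include=created_by_user,incident_type" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```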
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getIncidentNotificationRule"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiGetIncidentNotificationRuleRequest = { id: "00000000-0000-0000-0000-000000000001", }; apiInstance .getIncidentNotificationRule(params) .then((data: v2.IncidentNotificationRule) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-notification-rule-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-rules/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules/{id} ### Overview Updates an existing notification rule with a complete replacement. This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the notification rule. #### Query Strings Name Type Description include string Comma-separated list of resources to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type`, `notification_template` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] object Notification rule data for an update request. attributes [_required_] object The attributes for creating a notification rule. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. enabled boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility enum The visibility of the notification rule. Allowed enum values: `all,organization,private` id [_required_] uuid The unique identifier of the notification rule. 
relationships object The definition of `NotificationRuleCreateDataRelationships` object. incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. Allowed enum values: `incident_notification_rules` ``` { "data": { "attributes": { "enabled": false, "conditions": [ { "field": "severity", "values": [ "SEV-1" ] } ], "handles": [ "@updated-team-email@company.com" ], "visibility": "private", "trigger": "incident_modified_trigger" }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "id": "00000000-0000-0000-0000-000000000001", "type": "incident_notification_rules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Response with a notification rule. Field Type Description data [_required_] object Notification rule data from a response. attributes object The notification rule's attributes. conditions [_required_] [object] The conditions that trigger this notification rule. field [_required_] string The incident field to evaluate values [_required_] [string] The value(s) to compare against. Multiple values are `ORed` together. created [_required_] date-time Timestamp when the notification rule was created. enabled [_required_] boolean Whether the notification rule is enabled. handles [_required_] [string] The notification handles (targets) for this rule. modified [_required_] date-time Timestamp when the notification rule was last modified. renotify_on [string] List of incident fields that trigger re-notification when changed. trigger [_required_] string The trigger event for this notification rule. visibility [_required_] enum The visibility of the notification rule. Allowed enum values: `all,organization,private` id [_required_] uuid The unique identifier of the notification rule. relationships object The notification rule's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. 
id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` notification_template object A relationship reference to a notification template. data [_required_] object The notification template relationship data. id [_required_] uuid The unique identifier of the notification template. type [_required_] enum Notification templates resource type. Allowed enum values: `notification_templates` type [_required_] enum Notification rules resource type. Allowed enum values: `incident_notification_rules` included [ ] Related objects that are included in the response. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Incident type response data. attributes object Incident type's attributes. createdAt date-time Timestamp when the incident type was created. createdBy string A unique identifier that represents the user that created the incident type. description string Text that describes the incident type. is_default boolean If true, this incident type will be used as the default incident type if a type is not specified during the creation of incident resources. lastModifiedBy string A unique identifier that represents the user that last modified the incident type. modifiedAt date-time Timestamp when the incident type was last modified. name [_required_] string The name of the incident type. 
prefix string The string that will be prepended to the incident title across the Datadog app. id [_required_] string The incident type's ID. relationships object The incident type's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` google_meet_configuration object A reference to a Google Meet Configuration resource. data [_required_] object The Google Meet configuration relationship data object. id [_required_] string The unique identifier of the Google Meet configuration. type [_required_] string The type of the Google Meet configuration. last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` microsoft_teams_configuration object A reference to a Microsoft Teams Configuration resource. data [_required_] object The Microsoft Teams configuration relationship data object. id [_required_] string The unique identifier of the Microsoft Teams configuration. type [_required_] string The type of the Microsoft Teams configuration. zoom_configuration object A reference to a Zoom configuration resource. data [_required_] object The Zoom configuration relationship data object. id [_required_] string The unique identifier of the Zoom configuration. type [_required_] string The type of the Zoom configuration. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` Option 3 object A notification template object for inclusion in other resources. attributes object The notification template's attributes. category [_required_] string The category of the notification template. content [_required_] string The content body of the notification template. created [_required_] date-time Timestamp when the notification template was created. modified [_required_] date-time Timestamp when the notification template was last modified. name [_required_] string The name of the notification template. subject [_required_] string The subject line of the notification template. id [_required_] uuid The unique identifier of the notification template. relationships object The notification template's resource relationships. created_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` incident_type object Relationship to an incident type. data [_required_] object Relationship to incident type object. id [_required_] string The incident type's ID. type [_required_] enum Incident type resource type. Allowed enum values: `incident_types` default: `incident_types` last_modified_by_user object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Notification templates resource type. 
Allowed enum values: `notification_templates` ``` { "data": { "attributes": { "conditions": [ { "field": "severity", "values": [ "SEV-1", "SEV-2" ] } ], "created": "2025-01-15T10:30:00Z", "enabled": true, "handles": [ "@team-email@company.com", "@slack-channel" ], "modified": "2025-01-15T14:45:00Z", "renotify_on": [ "status", "severity" ], "trigger": "incident_created_trigger", "visibility": "organization" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "created_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } }, "last_modified_by_user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "notification_template": { "data": { "id": "00000000-0000-0000-0000-000000000001", "type": "notification_templates" } } }, "type": "incident_notification_rules" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Update incident notification rule returns "OK" response Copy ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": false, "conditions": [ { "field": "severity", "values": [ "SEV-1" ] } ], "handles": [ "@updated-team-email@company.com" ], "visibility": "private", "trigger": "incident_modified_trigger" }, "relationships": { "incident_type": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "incident_types" } } }, "id": "00000000-0000-0000-0000-000000000001", "type": "incident_notification_rules" } } EOF ``` ##### Update incident notification rule returns "OK" response ``` // Update incident notification rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "notification_rule" in the system NotificationRuleDataID := uuid.MustParse(os.Getenv("NOTIFICATION_RULE_DATA_ID")) // there is a valid "incident_type" in the system IncidentTypeDataID := os.Getenv("INCIDENT_TYPE_DATA_ID") body := datadogV2.PutIncidentNotificationRuleRequest{ Data: datadogV2.IncidentNotificationRuleUpdateData{ Attributes: datadogV2.IncidentNotificationRuleCreateAttributes{ Enabled: datadog.PtrBool(false), Conditions: []datadogV2.IncidentNotificationRuleConditionsItems{ { Field: "severity", Values: []string{ "SEV-1", }, }, }, Handles: []string{ "@updated-team-email@company.com", }, Visibility: datadogV2.INCIDENTNOTIFICATIONRULECREATEATTRIBUTESVISIBILITY_PRIVATE.Ptr(), Trigger: "incident_modified_trigger", }, Relationships: &datadogV2.IncidentNotificationRuleCreateDataRelationships{ IncidentType: &datadogV2.RelationshipToIncidentType{ Data: datadogV2.RelationshipToIncidentTypeData{ Id: IncidentTypeDataID, Type: datadogV2.INCIDENTTYPETYPE_INCIDENT_TYPES, }, }, }, Id: NotificationRuleDataID, Type: datadogV2.INCIDENTNOTIFICATIONRULETYPE_INCIDENT_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateIncidentNotificationRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.UpdateIncidentNotificationRule(ctx, NotificationRuleDataID, body, *datadogV2.NewUpdateIncidentNotificationRuleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when 
calling `IncidentsApi.UpdateIncidentNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.UpdateIncidentNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update incident notification rule returns "OK" response ``` // Update incident notification rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentNotificationRule; import com.datadog.api.client.v2.model.IncidentNotificationRuleConditionsItems; import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateAttributes; import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateAttributesVisibility; import com.datadog.api.client.v2.model.IncidentNotificationRuleCreateDataRelationships; import com.datadog.api.client.v2.model.IncidentNotificationRuleType; import com.datadog.api.client.v2.model.IncidentNotificationRuleUpdateData; import com.datadog.api.client.v2.model.IncidentTypeType; import com.datadog.api.client.v2.model.PutIncidentNotificationRuleRequest; import com.datadog.api.client.v2.model.RelationshipToIncidentType; import com.datadog.api.client.v2.model.RelationshipToIncidentTypeData; import java.util.Collections; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateIncidentNotificationRule", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); // there is a valid "notification_rule" in the system UUID NOTIFICATION_RULE_DATA_ID = null; try { NOTIFICATION_RULE_DATA_ID = UUID.fromString(System.getenv("NOTIFICATION_RULE_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } // there is a valid "incident_type" in the system String INCIDENT_TYPE_DATA_ID = System.getenv("INCIDENT_TYPE_DATA_ID"); PutIncidentNotificationRuleRequest body = new PutIncidentNotificationRuleRequest() .data( new IncidentNotificationRuleUpdateData() .attributes( new IncidentNotificationRuleCreateAttributes() .enabled(false) .conditions( Collections.singletonList( new IncidentNotificationRuleConditionsItems() .field("severity") .values(Collections.singletonList("SEV-1")))) .handles(Collections.singletonList("@updated-team-email@company.com")) .visibility(IncidentNotificationRuleCreateAttributesVisibility.PRIVATE) .trigger("incident_modified_trigger")) .relationships( new IncidentNotificationRuleCreateDataRelationships() .incidentType( new RelationshipToIncidentType() .data( new RelationshipToIncidentTypeData() .id(INCIDENT_TYPE_DATA_ID) .type(IncidentTypeType.INCIDENT_TYPES)))) .id(NOTIFICATION_RULE_DATA_ID) .type(IncidentNotificationRuleType.INCIDENT_NOTIFICATION_RULES)); try { IncidentNotificationRule result = apiInstance.updateIncidentNotificationRule(NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling 
IncidentsApi#updateIncidentNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update incident notification rule returns "OK" response ``` """ Update incident notification rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from datadog_api_client.v2.model.incident_notification_rule_conditions_items import ( IncidentNotificationRuleConditionsItems, ) from datadog_api_client.v2.model.incident_notification_rule_create_attributes import ( IncidentNotificationRuleCreateAttributes, ) from datadog_api_client.v2.model.incident_notification_rule_create_attributes_visibility import ( IncidentNotificationRuleCreateAttributesVisibility, ) from datadog_api_client.v2.model.incident_notification_rule_create_data_relationships import ( IncidentNotificationRuleCreateDataRelationships, ) from datadog_api_client.v2.model.incident_notification_rule_type import IncidentNotificationRuleType from datadog_api_client.v2.model.incident_notification_rule_update_data import IncidentNotificationRuleUpdateData from datadog_api_client.v2.model.incident_type_type import IncidentTypeType from datadog_api_client.v2.model.put_incident_notification_rule_request import PutIncidentNotificationRuleRequest from datadog_api_client.v2.model.relationship_to_incident_type import RelationshipToIncidentType from datadog_api_client.v2.model.relationship_to_incident_type_data import RelationshipToIncidentTypeData # there is a valid "notification_rule" in the system NOTIFICATION_RULE_DATA_ID = environ["NOTIFICATION_RULE_DATA_ID"] # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = environ["INCIDENT_TYPE_DATA_ID"] body = PutIncidentNotificationRuleRequest( data=IncidentNotificationRuleUpdateData( attributes=IncidentNotificationRuleCreateAttributes( enabled=False, conditions=[ IncidentNotificationRuleConditionsItems( field="severity", values=[ "SEV-1", ], ), ], handles=[ "@updated-team-email@company.com", ], visibility=IncidentNotificationRuleCreateAttributesVisibility.PRIVATE, trigger="incident_modified_trigger", ), relationships=IncidentNotificationRuleCreateDataRelationships( incident_type=RelationshipToIncidentType( data=RelationshipToIncidentTypeData( id=INCIDENT_TYPE_DATA_ID, type=IncidentTypeType.INCIDENT_TYPES, ), ), ), id=NOTIFICATION_RULE_DATA_ID, type=IncidentNotificationRuleType.INCIDENT_NOTIFICATION_RULES, ), ) configuration = Configuration() configuration.unstable_operations["update_incident_notification_rule"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.update_incident_notification_rule(id=NOTIFICATION_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update incident notification rule returns "OK" response ``` # Update incident notification rule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_incident_notification_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new # there is a valid "notification_rule" in the system NOTIFICATION_RULE_DATA_ID = ENV["NOTIFICATION_RULE_DATA_ID"] # there is a valid "incident_type" in the system INCIDENT_TYPE_DATA_ID = ENV["INCIDENT_TYPE_DATA_ID"] body = DatadogAPIClient::V2::PutIncidentNotificationRuleRequest.new({ data: DatadogAPIClient::V2::IncidentNotificationRuleUpdateData.new({ attributes: DatadogAPIClient::V2::IncidentNotificationRuleCreateAttributes.new({ enabled: false, conditions: [ DatadogAPIClient::V2::IncidentNotificationRuleConditionsItems.new({ field: "severity", values: [ "SEV-1", ], }), ], handles: [ "@updated-team-email@company.com", ], visibility: DatadogAPIClient::V2::IncidentNotificationRuleCreateAttributesVisibility::PRIVATE, trigger: "incident_modified_trigger", }), relationships: DatadogAPIClient::V2::IncidentNotificationRuleCreateDataRelationships.new({ incident_type: DatadogAPIClient::V2::RelationshipToIncidentType.new({ data: DatadogAPIClient::V2::RelationshipToIncidentTypeData.new({ id: INCIDENT_TYPE_DATA_ID, type: DatadogAPIClient::V2::IncidentTypeType::INCIDENT_TYPES, }), }), }), id: NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::IncidentNotificationRuleType::INCIDENT_NOTIFICATION_RULES, }), }) p api_instance.update_incident_notification_rule(NOTIFICATION_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update incident notification rule returns "OK" response ``` // Update incident notification rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::UpdateIncidentNotificationRuleOptionalParams; use datadog_api_client::datadogV2::model::IncidentNotificationRuleConditionsItems; use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateAttributes; use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateAttributesVisibility; use datadog_api_client::datadogV2::model::IncidentNotificationRuleCreateDataRelationships; use datadog_api_client::datadogV2::model::IncidentNotificationRuleType; use datadog_api_client::datadogV2::model::IncidentNotificationRuleUpdateData; use datadog_api_client::datadogV2::model::IncidentTypeType; use datadog_api_client::datadogV2::model::PutIncidentNotificationRuleRequest; use datadog_api_client::datadogV2::model::RelationshipToIncidentType; use datadog_api_client::datadogV2::model::RelationshipToIncidentTypeData; #[tokio::main] async fn main() { // there is a valid "notification_rule" in the system let notification_rule_data_id = uuid::Uuid::parse_str(&std::env::var("NOTIFICATION_RULE_DATA_ID").unwrap()) .expect("Invalid UUID"); // there is a valid "incident_type" in the system let 
incident_type_data_id = std::env::var("INCIDENT_TYPE_DATA_ID").unwrap(); let body = PutIncidentNotificationRuleRequest::new( IncidentNotificationRuleUpdateData::new( IncidentNotificationRuleCreateAttributes::new( vec![IncidentNotificationRuleConditionsItems::new( "severity".to_string(), vec!["SEV-1".to_string()], )], vec!["@updated-team-email@company.com".to_string()], "incident_modified_trigger".to_string(), ) .enabled(false) .visibility(IncidentNotificationRuleCreateAttributesVisibility::PRIVATE), notification_rule_data_id.clone(), IncidentNotificationRuleType::INCIDENT_NOTIFICATION_RULES, ) .relationships( IncidentNotificationRuleCreateDataRelationships::new().incident_type( RelationshipToIncidentType::new(RelationshipToIncidentTypeData::new( incident_type_data_id.clone(), IncidentTypeType::INCIDENT_TYPES, )), ), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateIncidentNotificationRule", true); let api = IncidentsAPI::with_config(configuration); let resp = api .update_incident_notification_rule( notification_rule_data_id.clone(), body, UpdateIncidentNotificationRuleOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update incident notification rule returns "OK" response ``` /** * Update incident notification rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateIncidentNotificationRule"] = true; const apiInstance = new v2.IncidentsApi(configuration); // there is a valid "notification_rule" in the system const NOTIFICATION_RULE_DATA_ID = process.env .NOTIFICATION_RULE_DATA_ID as string; // there is a valid "incident_type" in the system const INCIDENT_TYPE_DATA_ID = process.env.INCIDENT_TYPE_DATA_ID as string; const params: v2.IncidentsApiUpdateIncidentNotificationRuleRequest = { body: { data: { attributes: { enabled: false, conditions: [ { field: "severity", values: ["SEV-1"], }, ], handles: ["@updated-team-email@company.com"], visibility: "private", trigger: "incident_modified_trigger", }, relationships: { incidentType: { data: { id: INCIDENT_TYPE_DATA_ID, type: "incident_types", }, }, }, id: NOTIFICATION_RULE_DATA_ID, type: "incident_notification_rules", }, }, id: NOTIFICATION_RULE_DATA_ID, }; apiInstance .updateIncidentNotificationRule(params) .then((data: v2.IncidentNotificationRule) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-notification-rule-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.ap2.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.eu/api/v2/incidents/config/notification-rules/{id}https://api.ddog-gov.com/api/v2/incidents/config/notification-rules/{id}https://api.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us3.datadoghq.com/api/v2/incidents/config/notification-rules/{id}https://api.us5.datadoghq.com/api/v2/incidents/config/notification-rules/{id} ### Overview Deletes a notification rule by its ID. This endpoint requires the `incident_notification_settings_write` permission. OAuth apps require the `incident_notification_settings_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description id [_required_] string The ID of the notification rule. #### Query Strings Name Type Description include string Comma-separated list of resources to include. Supported values: `created_by_user`, `last_modified_by_user`, `incident_type`, `notification_template` ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentNotificationRule-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### Delete an incident notification rule ``` # Path parameters export id="00000000-0000-0000-0000-000000000001" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v2/incidents/config/notification-rules/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an incident notification rule ``` """ Delete an incident notification rule returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi from uuid import UUID configuration = Configuration() configuration.unstable_operations["delete_incident_notification_rule"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) api_instance.delete_incident_notification_rule( id=UUID("00000000-0000-0000-0000-000000000001"), ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an incident notification rule ``` # Delete an incident notification rule returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_incident_notification_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new api_instance.delete_incident_notification_rule("00000000-0000-0000-0000-000000000001") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete an incident notification rule ``` // Delete an incident notification rule returns "No Content" response package main import ( "context" "fmt" "os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteIncidentNotificationRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) r, err := api.DeleteIncidentNotificationRule(ctx, uuid.MustParse("00000000-0000-0000-0000-000000000001"), *datadogV2.NewDeleteIncidentNotificationRuleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.DeleteIncidentNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an incident notification rule ``` // Delete an incident notification rule returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteIncidentNotificationRule", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { apiInstance.deleteIncidentNotificationRule( UUID.fromString("00000000-0000-0000-0000-000000000001")); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#deleteIncidentNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an incident notification rule ``` // Delete an incident notification rule returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::DeleteIncidentNotificationRuleOptionalParams; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use uuid::Uuid; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteIncidentNotificationRule", true); let api = IncidentsAPI::with_config(configuration); let resp = api .delete_incident_notification_rule( Uuid::parse_str("00000000-0000-0000-0000-000000000001").expect("invalid UUID"), DeleteIncidentNotificationRuleOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an incident notification rule ``` /** * Delete an incident notification rule returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteIncidentNotificationRule"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiDeleteIncidentNotificationRuleRequest = { id: "00000000-0000-0000-0000-000000000001", }; apiInstance .deleteIncidentNotificationRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List incident attachments](https://docs.datadoghq.com/api/latest/incidents/#list-incident-attachments) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#list-incident-attachments-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/attachmentshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/attachmentshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/attachments ### Overview List incident attachments. This endpoint requires the `incident_read` permission. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description filter[attachment_type] string Filter attachments by type. Supported values are `1` (`postmortem`) and `2` (`link`). include string Resource to include in the response. Supported value: `last_modified_by_user`. ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentAttachments-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentAttachments-400-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#ListIncidentAttachments-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data [_required_] [object] attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` included [ ] Option 1 object attributes object Attributes of user object returned by the API. 
created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": [ { "attributes": { "attachment": { "documentUrl": "string", "title": "string" }, "attachment_type": "string", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-abcd-0002-0000-000000000000", "relationships": { "last_modified_by_user": { "data": { "id": "", "type": "users" } } }, "type": "incident_attachments" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/incidents/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/incidents/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/incidents/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/incidents/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/incidents/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/incidents/?code-lang=typescript) ##### List incident attachments ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/incidents/${incident_id}/attachments" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List incident attachments ``` """ Get a list of attachments returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.incidents_api import IncidentsApi configuration = Configuration() configuration.unstable_operations["list_incident_attachments"] = True with ApiClient(configuration) as api_client: api_instance = IncidentsApi(api_client) response = api_instance.list_incident_attachments( incident_id="incident_id", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List incident attachments ``` # Get a list of attachments returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_incident_attachments".to_sym] = true end api_instance = DatadogAPIClient::V2::IncidentsAPI.new p api_instance.list_incident_attachments("incident_id") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List incident attachments ``` // Get a list of attachments returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListIncidentAttachments", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIncidentsApi(apiClient) resp, r, err := api.ListIncidentAttachments(ctx, "incident_id", *datadogV2.NewListIncidentAttachmentsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IncidentsApi.ListIncidentAttachments`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `IncidentsApi.ListIncidentAttachments`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List incident attachments ``` // Get a list of attachments returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IncidentsApi; import com.datadog.api.client.v2.model.IncidentAttachmentsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listIncidentAttachments", true); IncidentsApi apiInstance = new IncidentsApi(defaultClient); try { IncidentAttachmentsResponse result = apiInstance.listIncidentAttachments("incident_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IncidentsApi#listIncidentAttachments"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List incident attachments ``` // Get a list of attachments returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_incidents::IncidentsAPI; use datadog_api_client::datadogV2::api_incidents::ListIncidentAttachmentsOptionalParams; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListIncidentAttachments", true); let api = IncidentsAPI::with_config(configuration); let resp = api .list_incident_attachments( "incident_id".to_string(), ListIncidentAttachmentsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List incident attachments ``` /** * Get a list of attachments returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listIncidentAttachments"] = true; const apiInstance = new v2.IncidentsApi(configuration); const params: v2.IncidentsApiListIncidentAttachmentsRequest = { incidentId: "incident_id", }; apiInstance .listIncidentAttachments(params) .then((data: v2.IncidentAttachmentsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create incident attachment](https://docs.datadoghq.com/api/latest/incidents/#create-incident-attachment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#create-incident-attachment-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.datadoghq.eu/api/v2/incidents/{incident_id}/attachmentshttps://api.ddog-gov.com/api/v2/incidents/{incident_id}/attachmentshttps://api.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/attachmentshttps://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/attachments ### Overview Create an incident attachment. This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. #### Query Strings Name Type Description include string Resource to include in the response. Supported value: `last_modified_by_user`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data object attributes object attachment object documentUrl string title string attachment_type enum id string type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` ``` { "data": { "attributes": { "attachment": { "documentUrl": "string", "title": "string" }, "attachment_type": "string" }, "id": "string", "type": "incident_attachments" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentAttachment-201-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentAttachment-400-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentAttachment-403-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#CreateIncidentAttachment-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` included [ ] Option 1 object attributes object Attributes of user object returned by the API. 
created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "attachment": { "documentUrl": "string", "title": "string" }, "attachment_type": "string", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-abcd-0002-0000-000000000000", "relationships": { "last_modified_by_user": { "data": { "id": "", "type": "users" } } }, "type": "incident_attachments" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) ##### Create incident attachment Copy ``` # Path parameters export incident_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/attachments" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "incident_attachments" } } EOF ``` * * * ## [Delete incident attachment](https://docs.datadoghq.com/api/latest/incidents/#delete-incident-attachment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#delete-incident-attachment-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
DELETE https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id} ### Overview This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. attachment_id [_required_] The ID of the attachment. ### Response * [204](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentAttachment-204-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentAttachment-400-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentAttachment-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentAttachment-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#DeleteIncidentAttachment-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) ##### Delete incident attachment Copy ``` # Path parameters export incident_id="CHANGE_ME" export attachment_id="00000000-0000-0000-0000-000000000002" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/incidents/${incident_id}/attachments/${attachment_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Update incident attachment](https://docs.datadoghq.com/api/latest/incidents/#update-incident-attachment) * [v2 (latest)](https://docs.datadoghq.com/api/latest/incidents/#update-incident-attachment-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
PATCH https://api.ap1.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.ap2.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.datadoghq.eu/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.ddog-gov.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.us3.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id}https://api.us5.datadoghq.com/api/v2/incidents/{incident_id}/attachments/{attachment_id} ### Overview This endpoint requires the `incident_write` permission. OAuth apps require the `incident_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#incidents) to access this endpoint. ### Arguments #### Path Parameters Name Type Description incident_id [_required_] string The UUID of the incident. attachment_id [_required_] The ID of the attachment. #### Query Strings Name Type Description include string Resource to include in the response. Supported value: `last_modified_by_user`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data object attributes object attachment object documentUrl string title string id string type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` ``` { "data": { "attributes": { "attachment": { "documentUrl": "string", "title": "string" } }, "id": "string", "type": "incident_attachments" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentAttachment-200-v2) * [400](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentAttachment-400-v2) * [403](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentAttachment-403-v2) * [404](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentAttachment-404-v2) * [429](https://docs.datadoghq.com/api/latest/incidents/#UpdateIncidentAttachment-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) Field Type Description data object attributes [_required_] object attachment object documentUrl string title string attachment_type enum modified date-time id [_required_] string relationships [_required_] object last_modified_by_user object data [_required_] object id [_required_] string type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum The incident attachment resource type. Allowed enum values: `incident_attachments` default: `incident_attachments` included [ ] Option 1 object attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "attachment": { "documentUrl": "string", "title": "string" }, "attachment_type": "string", "modified": "2019-09-19T10:00:00.000Z" }, "id": "00000000-abcd-0002-0000-000000000000", "relationships": { "last_modified_by_user": { "data": { "id": "", "type": "users" } } }, "type": "incident_attachments" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "type": "users" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/incidents/) * [Example](https://docs.datadoghq.com/api/latest/incidents/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/incidents/?code-lang=curl) ##### Update incident attachment ``` # Path parameters export incident_id="CHANGE_ME" export attachment_id="00000000-0000-0000-0000-000000000002" # Curl command curl -X PATCH "https://api.datadoghq.com/api/v2/incidents/${incident_id}/attachments/${attachment_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "incident_attachments" } } EOF ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/ ## [API Reference](https://docs.datadoghq.com/api/latest/?tab=java#api-reference) The Datadog API is an HTTP REST API. It uses resource-oriented URLs, returns JSON from all requests, and uses standard HTTP response codes to indicate the success or failure of requests. Use the Datadog API to access the Datadog platform programmatically. ### [Getting started](https://docs.datadoghq.com/api/latest/?tab=java#getting-started) Authenticate to the API with an [API key](https://docs.datadoghq.com/account_management/api-app-keys/#api-keys) using the header `DD-API-KEY`.
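As a minimal sketch of header-based authentication (using Python's `requests` package rather than an official client library, and assuming `DD_API_KEY` is exported in your environment and that you target the `datadoghq.com` site), you can call the API key validation endpoint:

```python
# Minimal sketch: authenticate a raw HTTP request with the DD-API-KEY header.
# Assumes DD_API_KEY is set in the environment and the datadoghq.com site.
import os

import requests

response = requests.get(
    "https://api.datadoghq.com/api/v1/validate",
    headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
)
# A 200 response with {"valid": true} means the API key is accepted.
print(response.status_code, response.json())
```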
For some endpoints, you also need an [Application key](https://docs.datadoghq.com/account_management/api-app-keys/#application-keys), which uses the header `DD-APPLICATION-KEY`. To try out the API: [![Run in Postman](https://run.pstmn.io/button.svg)](https://god.gw.postman.com/run-collection/20651290-809b13c1-4ada-46c1-af65-ab276c434068?action=collection%2Ffork&source=rip_markdown&collection-url=entityId%3D20651290-809b13c1-4ada-46c1-af65-ab276c434068%26entityType%3Dcollection%26workspaceId%3Dbf049f54-c695-4e91-b879-0cad1854bafa)

**Note**: To authenticate to the Datadog API through Postman, add your Datadog API and Application key values to the **Collection variables** of the Datadog API collection.

[Using the API](https://docs.datadoghq.com/api/v1/using-the-api/) is a guide to the endpoints.

**Notes**:

* Add your API and application key values to the **Variables** tab of the Datadog API Collection.
* cURL code examples assume usage of BASH and GNU coreutils. On macOS, you can install coreutils with the [Homebrew package manager](https://brew.sh): `brew install coreutils`

### [Client libraries](https://docs.datadoghq.com/api/latest/?tab=java#client-libraries)

By default, the Datadog API Docs show examples in cURL. Select one of our official [client libraries](https://docs.datadoghq.com/developers/community/libraries/) languages in each endpoint to see code examples from that library. To install each library:

* [Java](https://docs.datadoghq.com/api/latest/?tab=java)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/?tab=java)
* [Python](https://docs.datadoghq.com/api/latest/?tab=java)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/?tab=java)
* [Ruby](https://docs.datadoghq.com/api/latest/?tab=java)
* [Go](https://docs.datadoghq.com/api/latest/?tab=java)
* [Typescript](https://docs.datadoghq.com/api/latest/?tab=java)
* [Rust](https://docs.datadoghq.com/api/latest/?tab=java)

#### Installation (Java)

Maven - Add this dependency to your project’s POM:

```
<dependency>
  <groupId>com.datadoghq</groupId>
  <artifactId>datadog-api-client</artifactId>
  <version>2.48.0</version>
  <scope>compile</scope>
</dependency>
```

Gradle - Add this dependency to your project’s build file:

```
compile "com.datadoghq:datadog-api-client:2.48.0"
```

#### Usage (Java)

```
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.Configuration;
import com.datadog.api.client.<version>.api.*;
import com.datadog.api.client.<version>.model.*;
```

**Note**: Replace `<version>` with v1 or v2, depending on which endpoints you want to use.

#### Examples

Maven `pom.xml` for running examples:

```
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>example</artifactId>
  <version>1</version>
  <dependencies>
    <dependency>
      <groupId>com.datadoghq</groupId>
      <artifactId>datadog-api-client</artifactId>
      <version>2.48.0</version>
      <scope>compile</scope>
    </dependency>
  </dependencies>
</project>
```

Make sure that the `CLASSPATH` variable contains all dependencies.

```
export CLASSPATH=$(mvn -q exec:exec -Dexec.executable=echo -Dexec.args="%classpath")
```

Gradle `build.gradle` for running examples:

```
plugins { id 'java' id 'application' }
repositories { jcenter() }
dependencies { implementation 'com.datadoghq:datadog-api-client:2.48.0' }
application { mainClassName = 'Example.java' }
```

Execute the example by running the `gradle run` command.
#### Installation (Python [legacy])

```
pip install datadog
```

#### Usage (Python [legacy])

```
import datadog
```

#### Installation (Python)

```
pip3 install datadog-api-client
```

#### Usage (Python)

```
import datadog_api_client
```

#### Installation (Ruby [legacy])

```
gem install dogapi
```

#### Usage (Ruby [legacy])

```
require 'dogapi'
```

#### Installation (Ruby)

```
gem install datadog_api_client -v 2.47.0
```

#### Usage (Ruby)

```
require 'datadog_api_client'
```

#### Installation (Go)

```
go mod init main && go get github.com/DataDog/datadog-api-client-go/v2/api/datadog
```

#### Usage (Go)

```
import (
    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadog<VERSION>"
)
```

**Note**: Replace `<VERSION>` with `V1` or `V2`, depending on which endpoints you want to use.

#### Installation (Typescript)

The package is under [@datadog/datadog-api-client](https://www.npmjs.com/package/@datadog/datadog-api-client) and can be installed through NPM or Yarn:

```
# NPM
npm install @datadog/datadog-api-client
# Yarn
yarn add @datadog/datadog-api-client
```

#### Usage (Typescript)

```
import { <version> } from '@datadog/datadog-api-client';
```

**Note**: Replace `<version>` with v1 or v2, depending on which endpoints you want to use.

#### Installation (Rust)

Run `cargo add datadog-api-client`, or add the following to `Cargo.toml` under `[dependencies]`:

```
datadog-api-client = "0"
```

#### Usage (Rust)

Try the following snippet to validate your Datadog API key:

```
use datadog_api_client::datadog::Configuration;
use datadog_api_client::datadogV1::api_authentication::AuthenticationAPI;

#[tokio::main]
async fn main() {
    let configuration = Configuration::new();
    let api = AuthenticationAPI::with_config(configuration);
    let resp = api.validate().await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

Or check out the libraries directly:

* [datadog-api-client-java](https://github.com/DataDog/datadog-api-client-java)
* [datadog-api-client-python](https://github.com/DataDog/datadog-api-client-python)
* [datadog-api-client-ruby](https://github.com/DataDog/datadog-api-client-ruby)
* [datadog-api-client-go](https://github.com/DataDog/datadog-api-client-go)
* [datadog-api-client-typescript](https://github.com/DataDog/datadog-api-client-typescript)
* [datadog-api-client-rust](https://github.com/DataDog/datadog-api-client-rust)

Trying to get started with the application instead? Check out Datadog’s general [Getting Started docs](https://docs.datadoghq.com/getting_started/application/).
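The Rust usage snippet above validates your Datadog API key. For parity, here is a rough sketch of the same check with the Python client; it assumes the v1 `AuthenticationApi` exposed by `datadog-api-client` and reads `DD_API_KEY` (and optionally `DD_SITE`) from the environment, mirroring the other Python examples on this page.

```python
"""
Sketch only: validate the configured Datadog API key with the Python client.
Assumes the v1 AuthenticationApi from datadog-api-client and DD_API_KEY
(and optionally DD_SITE) set in the environment.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.authentication_api import AuthenticationApi

configuration = Configuration()  # reads DD_API_KEY / DD_SITE from the environment

with ApiClient(configuration) as api_client:
    api_instance = AuthenticationApi(api_client)
    response = api_instance.validate()  # calls GET /api/v1/validate
    print(response)
```

Run it the same way as the other Python examples on this page, with `DD_SITE` and `DD_API_KEY` exported before invoking `python3`.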
## [Further reading](https://docs.datadoghq.com/api/latest/?tab=java#further-reading) Additional helpful documentation, links, and articles: [Using the API](https://docs.datadoghq.com/api/latest/using-the-api/), [Authorization Scopes](https://docs.datadoghq.com/api/latest/scopes/), [Rate Limits](https://docs.datadoghq.com/api/latest/rate-limits/) --- # Source: https://docs.datadoghq.com/api/latest/ip-allowlist # IP Allowlist The IP allowlist API is used to manage the IP addresses that can access the Datadog API and web UI. It does not block access to intake APIs or public dashboards. This is an enterprise-only feature. Request access by contacting Datadog support, or see the [IP Allowlist page](https://docs.datadoghq.com/account_management/org_settings/ip_allowlist/) for more information. ## [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist-v2) GET https://api.ap1.datadoghq.com/api/v2/ip_allowlisthttps://api.ap2.datadoghq.com/api/v2/ip_allowlisthttps://api.datadoghq.eu/api/v2/ip_allowlisthttps://api.ddog-gov.com/api/v2/ip_allowlisthttps://api.datadoghq.com/api/v2/ip_allowlisthttps://api.us3.datadoghq.com/api/v2/ip_allowlisthttps://api.us5.datadoghq.com/api/v2/ip_allowlist ### Overview Returns the IP allowlist and its enabled or disabled state. This endpoint requires the `org_management` permission. OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ip-allowlist) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/ip-allowlist/#GetIPAllowlist-200-v2) * [403](https://docs.datadoghq.com/api/latest/ip-allowlist/#GetIPAllowlist-403-v2) * [404](https://docs.datadoghq.com/api/latest/ip-allowlist/#GetIPAllowlist-404-v2) * [429](https://docs.datadoghq.com/api/latest/ip-allowlist/#GetIPAllowlist-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) Response containing information about the IP allowlist. Field Type Description data object IP allowlist data.
attributes object Attributes of the IP allowlist. enabled boolean Whether the IP allowlist logic is enabled or not. entries [object] Array of entries in the IP allowlist. data [_required_] object Data of the IP allowlist entry object. attributes object Attributes of the IP allowlist entry. cidr_block string The CIDR block describing the IP range of the entry. created_at date-time Creation time of the entry. modified_at date-time Time of last entry modification. note string A note describing the IP allowlist entry. id string The unique identifier of the IP allowlist entry. type [_required_] enum IP allowlist Entry type. Allowed enum values: `ip_allowlist_entry` default: `ip_allowlist_entry` id string The unique identifier of the org. type [_required_] enum IP allowlist type. Allowed enum values: `ip_allowlist` default: `ip_allowlist` ``` { "data": { "attributes": { "enabled": false, "entries": [ { "data": { "attributes": { "cidr_block": "string", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "note": "string" }, "id": "string", "type": "ip_allowlist_entry" } } ] }, "id": "string", "type": "ip_allowlist" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=typescript) ##### Get IP Allowlist Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ip_allowlist" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get IP Allowlist ``` """ Get IP Allowlist returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ip_allowlist_api import IPAllowlistApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = IPAllowlistApi(api_client) response = api_instance.get_ip_allowlist() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get IP Allowlist ``` # Get IP Allowlist returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::IPAllowlistAPI.new p api_instance.get_ip_allowlist() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get IP Allowlist ``` // Get IP Allowlist returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIPAllowlistApi(apiClient) resp, r, err := api.GetIPAllowlist(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IPAllowlistApi.GetIPAllowlist`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IPAllowlistApi.GetIPAllowlist`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get IP Allowlist ``` // Get IP Allowlist returns "OK" response 
import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IpAllowlistApi; import com.datadog.api.client.v2.model.IPAllowlistResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); IpAllowlistApi apiInstance = new IpAllowlistApi(defaultClient); try { IPAllowlistResponse result = apiInstance.getIPAllowlist(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IpAllowlistApi#getIPAllowlist"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get IP Allowlist ``` // Get IP Allowlist returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ip_allowlist::IPAllowlistAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = IPAllowlistAPI::with_config(configuration); let resp = api.get_ip_allowlist().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get IP Allowlist ``` /** * Get IP Allowlist returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.IPAllowlistApi(configuration); apiInstance .getIPAllowlist() .then((data: v2.IPAllowlistResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) * [v2 (latest)](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist-v2) PATCH https://api.ap1.datadoghq.com/api/v2/ip_allowlisthttps://api.ap2.datadoghq.com/api/v2/ip_allowlisthttps://api.datadoghq.eu/api/v2/ip_allowlisthttps://api.ddog-gov.com/api/v2/ip_allowlisthttps://api.datadoghq.com/api/v2/ip_allowlisthttps://api.us3.datadoghq.com/api/v2/ip_allowlisthttps://api.us5.datadoghq.com/api/v2/ip_allowlist ### Overview Edit the entries in the IP allowlist, and enable or disable it. This endpoint requires the `org_management` permission. 
OAuth apps require the `org_management` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#ip-allowlist) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) Field Type Description data [_required_] object IP allowlist data. attributes object Attributes of the IP allowlist. enabled boolean Whether the IP allowlist logic is enabled or not. entries [object] Array of entries in the IP allowlist. data [_required_] object Data of the IP allowlist entry object. attributes object Attributes of the IP allowlist entry. cidr_block string The CIDR block describing the IP range of the entry. created_at date-time Creation time of the entry. modified_at date-time Time of last entry modification. note string A note describing the IP allowlist entry. id string The unique identifier of the IP allowlist entry. type [_required_] enum IP allowlist Entry type. Allowed enum values: `ip_allowlist_entry` default: `ip_allowlist_entry` id string The unique identifier of the org. type [_required_] enum IP allowlist type. Allowed enum values: `ip_allowlist` default: `ip_allowlist` ``` { "data": { "attributes": { "entries": [ { "data": { "attributes": { "note": "Example-IP-Allowlist", "cidr_block": "127.0.0.1" }, "type": "ip_allowlist_entry" } } ], "enabled": false }, "type": "ip_allowlist" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/ip-allowlist/#UpdateIPAllowlist-200-v2) * [400](https://docs.datadoghq.com/api/latest/ip-allowlist/#UpdateIPAllowlist-400-v2) * [403](https://docs.datadoghq.com/api/latest/ip-allowlist/#UpdateIPAllowlist-403-v2) * [404](https://docs.datadoghq.com/api/latest/ip-allowlist/#UpdateIPAllowlist-404-v2) * [429](https://docs.datadoghq.com/api/latest/ip-allowlist/#UpdateIPAllowlist-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) Response containing information about the IP allowlist. Field Type Description data object IP allowlist data. attributes object Attributes of the IP allowlist. enabled boolean Whether the IP allowlist logic is enabled or not. entries [object] Array of entries in the IP allowlist. data [_required_] object Data of the IP allowlist entry object. attributes object Attributes of the IP allowlist entry. cidr_block string The CIDR block describing the IP range of the entry. created_at date-time Creation time of the entry. modified_at date-time Time of last entry modification. note string A note describing the IP allowlist entry. id string The unique identifier of the IP allowlist entry. type [_required_] enum IP allowlist Entry type. Allowed enum values: `ip_allowlist_entry` default: `ip_allowlist_entry` id string The unique identifier of the org. type [_required_] enum IP allowlist type. Allowed enum values: `ip_allowlist` default: `ip_allowlist` ``` { "data": { "attributes": { "enabled": false, "entries": [ { "data": { "attributes": { "cidr_block": "string", "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "note": "string" }, "id": "string", "type": "ip_allowlist_entry" } } ] }, "id": "string", "type": "ip_allowlist" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ip-allowlist/) * [Example](https://docs.datadoghq.com/api/latest/ip-allowlist/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ip-allowlist/?code-lang=typescript) ##### Update IP Allowlist returns "OK" response Copy ``` # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ip_allowlist" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "entries": [ { "data": { "attributes": { "note": "Example-IP-Allowlist", "cidr_block": "127.0.0.1" }, "type": "ip_allowlist_entry" } } ], "enabled": false }, "type": "ip_allowlist" } } EOF ``` ##### Update IP Allowlist returns "OK" response ``` // Update IP Allowlist returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.IPAllowlistUpdateRequest{ Data: datadogV2.IPAllowlistData{ Attributes: &datadogV2.IPAllowlistAttributes{ Entries: []datadogV2.IPAllowlistEntry{ { Data: datadogV2.IPAllowlistEntryData{ Attributes: &datadogV2.IPAllowlistEntryAttributes{ Note: datadog.PtrString("Example-IP-Allowlist"), CidrBlock: datadog.PtrString("127.0.0.1"), }, Type: datadogV2.IPALLOWLISTENTRYTYPE_IP_ALLOWLIST_ENTRY, }, }, }, Enabled: datadog.PtrBool(false), }, Type: datadogV2.IPALLOWLISTTYPE_IP_ALLOWLIST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewIPAllowlistApi(apiClient) resp, r, err := api.UpdateIPAllowlist(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IPAllowlistApi.UpdateIPAllowlist`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IPAllowlistApi.UpdateIPAllowlist`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update IP Allowlist returns "OK" response ``` // Update IP Allowlist returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.IpAllowlistApi; import com.datadog.api.client.v2.model.IPAllowlistAttributes; import com.datadog.api.client.v2.model.IPAllowlistData; import com.datadog.api.client.v2.model.IPAllowlistEntry; import com.datadog.api.client.v2.model.IPAllowlistEntryAttributes; import com.datadog.api.client.v2.model.IPAllowlistEntryData; import com.datadog.api.client.v2.model.IPAllowlistEntryType; import com.datadog.api.client.v2.model.IPAllowlistResponse; import com.datadog.api.client.v2.model.IPAllowlistType; import com.datadog.api.client.v2.model.IPAllowlistUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); IpAllowlistApi apiInstance = new IpAllowlistApi(defaultClient); IPAllowlistUpdateRequest body = new IPAllowlistUpdateRequest() .data( new IPAllowlistData() .attributes( new IPAllowlistAttributes() .entries( Collections.singletonList( new IPAllowlistEntry() .data( new IPAllowlistEntryData() .attributes( new IPAllowlistEntryAttributes() .note("Example-IP-Allowlist") .cidrBlock("127.0.0.1")) .type(IPAllowlistEntryType.IP_ALLOWLIST_ENTRY)))) .enabled(false)) .type(IPAllowlistType.IP_ALLOWLIST)); try { IPAllowlistResponse result = apiInstance.updateIPAllowlist(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IpAllowlistApi#updateIPAllowlist"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update IP Allowlist returns "OK" response ``` """ Update IP Allowlist returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.ip_allowlist_api import IPAllowlistApi from datadog_api_client.v2.model.ip_allowlist_attributes import IPAllowlistAttributes from datadog_api_client.v2.model.ip_allowlist_data import IPAllowlistData from datadog_api_client.v2.model.ip_allowlist_entry import IPAllowlistEntry from datadog_api_client.v2.model.ip_allowlist_entry_attributes import IPAllowlistEntryAttributes from datadog_api_client.v2.model.ip_allowlist_entry_data import IPAllowlistEntryData from datadog_api_client.v2.model.ip_allowlist_entry_type import IPAllowlistEntryType from datadog_api_client.v2.model.ip_allowlist_type import IPAllowlistType from datadog_api_client.v2.model.ip_allowlist_update_request import IPAllowlistUpdateRequest body = IPAllowlistUpdateRequest( data=IPAllowlistData( attributes=IPAllowlistAttributes( entries=[ IPAllowlistEntry( 
data=IPAllowlistEntryData( attributes=IPAllowlistEntryAttributes( note="Example-IP-Allowlist", cidr_block="127.0.0.1", ), type=IPAllowlistEntryType.IP_ALLOWLIST_ENTRY, ), ), ], enabled=False, ), type=IPAllowlistType.IP_ALLOWLIST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = IPAllowlistApi(api_client) response = api_instance.update_ip_allowlist(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update IP Allowlist returns "OK" response ``` # Update IP Allowlist returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::IPAllowlistAPI.new body = DatadogAPIClient::V2::IPAllowlistUpdateRequest.new({ data: DatadogAPIClient::V2::IPAllowlistData.new({ attributes: DatadogAPIClient::V2::IPAllowlistAttributes.new({ entries: [ DatadogAPIClient::V2::IPAllowlistEntry.new({ data: DatadogAPIClient::V2::IPAllowlistEntryData.new({ attributes: DatadogAPIClient::V2::IPAllowlistEntryAttributes.new({ note: "Example-IP-Allowlist", cidr_block: "127.0.0.1", }), type: DatadogAPIClient::V2::IPAllowlistEntryType::IP_ALLOWLIST_ENTRY, }), }), ], enabled: false, }), type: DatadogAPIClient::V2::IPAllowlistType::IP_ALLOWLIST, }), }) p api_instance.update_ip_allowlist(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update IP Allowlist returns "OK" response ``` // Update IP Allowlist returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_ip_allowlist::IPAllowlistAPI; use datadog_api_client::datadogV2::model::IPAllowlistAttributes; use datadog_api_client::datadogV2::model::IPAllowlistData; use datadog_api_client::datadogV2::model::IPAllowlistEntry; use datadog_api_client::datadogV2::model::IPAllowlistEntryAttributes; use datadog_api_client::datadogV2::model::IPAllowlistEntryData; use datadog_api_client::datadogV2::model::IPAllowlistEntryType; use datadog_api_client::datadogV2::model::IPAllowlistType; use datadog_api_client::datadogV2::model::IPAllowlistUpdateRequest; #[tokio::main] async fn main() { let body = IPAllowlistUpdateRequest::new( IPAllowlistData::new(IPAllowlistType::IP_ALLOWLIST).attributes( IPAllowlistAttributes::new() .enabled(false) .entries(vec![IPAllowlistEntry::new( IPAllowlistEntryData::new(IPAllowlistEntryType::IP_ALLOWLIST_ENTRY).attributes( IPAllowlistEntryAttributes::new() .cidr_block("127.0.0.1".to_string()) .note("Example-IP-Allowlist".to_string()), ), )]), ), ); let configuration = datadog::Configuration::new(); let api = IPAllowlistAPI::with_config(configuration); let resp = api.update_ip_allowlist(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update IP Allowlist returns "OK" response ``` /** * Update IP Allowlist returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.IPAllowlistApi(configuration); const params: v2.IPAllowlistApiUpdateIPAllowlistRequest = { body: { data: { attributes: { entries: [ { data: { attributes: { note: "Example-IP-Allowlist", cidrBlock: "127.0.0.1", }, type: "ip_allowlist_entry", }, }, ], enabled: false, }, type: "ip_allowlist", }, }, }; apiInstance .updateIPAllowlist(params) .then((data: v2.IPAllowlistResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/ip-ranges/ # IP Ranges Get a list of IP prefixes belonging to Datadog. ## [List IP Ranges](https://docs.datadoghq.com/api/latest/ip-ranges/#list-ip-ranges) * [v1 (latest)](https://docs.datadoghq.com/api/latest/ip-ranges/#list-ip-ranges-v1) GET https://ip-ranges.ap1.datadoghq.com/https://ip-ranges.ap2.datadoghq.com/https://ip-ranges.datadoghq.eu/https://ip-ranges.ddog-gov.com/https://ip-ranges.datadoghq.com/https://ip-ranges.us3.datadoghq.com/https://ip-ranges.us5.datadoghq.com/ ### Overview Get information about Datadog IP ranges. ### Response * [200](https://docs.datadoghq.com/api/latest/ip-ranges/#GetIPRanges-200-v1) * [429](https://docs.datadoghq.com/api/latest/ip-ranges/#GetIPRanges-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/ip-ranges/) * [Example](https://docs.datadoghq.com/api/latest/ip-ranges/) IP ranges.
Field Type Description agents object Available prefix information for the Agent endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. api object Available prefix information for the API endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. apm object Available prefix information for the APM endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. global object Available prefix information for all Datadog endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. logs object Available prefix information for the Logs endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. modified string Date when last updated, in the form `YYYY-MM-DD-hh-mm-ss`. orchestrator object Available prefix information for the Orchestrator endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. process object Available prefix information for the Process endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. remote-configuration object Available prefix information for the Remote Configuration endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. synthetics object Available prefix information for the Synthetics endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv4_by_location object List of IPv4 prefixes by location. [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. prefixes_ipv6_by_location object List of IPv6 prefixes by location. [string] List of IPv6 prefixes. synthetics-private-locations object Available prefix information for the Synthetics Private Locations endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. version int64 Version of the IP list. webhooks object Available prefix information for the Webhook endpoints. prefixes_ipv4 [string] List of IPv4 prefixes. prefixes_ipv6 [string] List of IPv6 prefixes. ``` { "agents": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "api": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "apm": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "global": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "logs": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "modified": "2019-10-31-20-00-00", "orchestrator": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "process": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "remote-configuration": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "synthetics": { "prefixes_ipv4": [], "prefixes_ipv4_by_location": { "": [] }, "prefixes_ipv6": [], "prefixes_ipv6_by_location": { "": [] } }, "synthetics-private-locations": { "prefixes_ipv4": [], "prefixes_ipv6": [] }, "version": 11, "webhooks": { "prefixes_ipv4": [], "prefixes_ipv6": [] } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/ip-ranges/) * [Example](https://docs.datadoghq.com/api/latest/ip-ranges/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/ip-ranges/?code-lang=typescript) ##### List IP Ranges Copy ``` # Curl command curl -X GET "https://ip-ranges.ap1.datadoghq.com"https://ip-ranges.ap2.datadoghq.com"https://ip-ranges.datadoghq.eu"https://ip-ranges.ddog-gov.com"https://ip-ranges.datadoghq.com"https://ip-ranges.us3.datadoghq.com"https://ip-ranges.us5.datadoghq.com/" \ -H "Accept: application/json" ``` ##### List IP Ranges ``` """ List IP Ranges returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.ip_ranges_api import IPRangesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = IPRangesApi(api_client) response = api_instance.get_ip_ranges() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" python3 "example.py" ``` ##### List IP Ranges ``` # List IP Ranges returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::IPRangesAPI.new p api_instance.get_ip_ranges() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" rb "example.rb" ``` ##### List IP Ranges ``` // List IP Ranges returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewIPRangesApi(apiClient) resp, r, err := api.GetIPRanges(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `IPRangesApi.GetIPRanges`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `IPRangesApi.GetIPRanges`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" go run "main.go" ``` ##### List IP Ranges ``` // List IP Ranges returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.IpRangesApi; import com.datadog.api.client.v1.model.IPRanges; public class Example { 
public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); IpRangesApi apiInstance = new IpRangesApi(defaultClient); try { IPRanges result = apiInstance.getIPRanges(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling IpRangesApi#getIPRanges"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" java "Example.java" ``` ##### List IP Ranges ``` // List IP Ranges returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_ip_ranges::IPRangesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = IPRangesAPI::with_config(configuration); let resp = api.get_ip_ranges().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" cargo run ``` ##### List IP Ranges ``` /** * List IP Ranges returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.IPRangesApi(configuration); apiInstance .getIPRanges() .then((data: v1.IPRanges) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=165a87e1-922a-4b1f-9e3c-f8cb92880f42&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=cfc80acf-2cb7-4e74-8c86-c1505102f113&pt=IP%20Ranges&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fip-ranges%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=165a87e1-922a-4b1f-9e3c-f8cb92880f42&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=cfc80acf-2cb7-4e74-8c86-c1505102f113&pt=IP%20Ranges&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fip-ranges%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=9ac8729a-978b-42b6-b5ad-25f35e8ef79a&bo=2&sid=ba46b8a0f0bf11f08761a3199a385bde&vid=ba46dde0f0bf11f0b26b6ba9935020b6&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=IP%20Ranges&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fip-ranges%2F&r=<=1047&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=944965) --- # Source: https://docs.datadoghq.com/api/latest/key-management/ # Key Management Manage your Datadog API and application keys. You need an API key and an application key for a user with the required permissions to interact with these endpoints. Consult the following pages to view and manage your keys: * [API Keys](https://app.datadoghq.com/organization-settings/api-keys) * [Application Keys](https://app.datadoghq.com/personal-settings/application-keys) ## [Delete an application key owned by current user](https://docs.datadoghq.com/api/latest/key-management/#delete-an-application-key-owned-by-current-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#delete-an-application-key-owned-by-current-user-v2) DELETE https://api.ap1.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/current_user/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/current_user/application_keys/{app_key_id} ### Overview Delete an application key owned by current user This endpoint requires the `user_app_keys` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. 
### Response * [204](https://docs.datadoghq.com/api/latest/key-management/#DeleteCurrentUserApplicationKey-204-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#DeleteCurrentUserApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#DeleteCurrentUserApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#DeleteCurrentUserApplicationKey-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Delete an application key owned by current user Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys/${app_key_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an application key owned by current user ``` """ Delete an application key owned by current user returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) api_instance.delete_current_user_application_key( app_key_id=APPLICATION_KEY_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an application key owned by current user ``` # Delete an application key owned by current user returns "No Content" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] api_instance.delete_current_user_application_key(APPLICATION_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an application key owned by current user ``` // Delete an application key owned by current user returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) r, err := api.DeleteCurrentUserApplicationKey(ctx, ApplicationKeyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.DeleteCurrentUserApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an application key owned by current user ``` // Delete an application key owned by current user returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); try { apiInstance.deleteCurrentUserApplicationKey(APPLICATION_KEY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#deleteCurrentUserApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an application key owned by current user ``` // Delete an application key owned by current user returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let 
application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .delete_current_user_application_key(application_key_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an application key owned by current user ``` /** * Delete an application key owned by current user returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: v2.KeyManagementApiDeleteCurrentUserApplicationKeyRequest = { appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .deleteCurrentUserApplicationKey(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all API keys](https://docs.datadoghq.com/api/latest/key-management/#get-all-api-keys) * [v1](https://docs.datadoghq.com/api/latest/key-management/#get-all-api-keys-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-all-api-keys-v2) GET https://api.ap1.datadoghq.com/api/v1/api_keyhttps://api.ap2.datadoghq.com/api/v1/api_keyhttps://api.datadoghq.eu/api/v1/api_keyhttps://api.ddog-gov.com/api/v1/api_keyhttps://api.datadoghq.com/api/v1/api_keyhttps://api.us3.datadoghq.com/api/v1/api_keyhttps://api.us5.datadoghq.com/api/v1/api_key ### Overview Get all API keys available for your account. This endpoint requires the `api_keys_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-200-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-403-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) List of API and application keys available for a given organization. Field Type Description api_keys [object] Array of API keys. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "api_keys": [ { "created_by": "test_user", "key": "1234512345123456abcabc912349abcd", "name": "app_key" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get all API keys Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/api_key" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all API keys ``` """ Get all API keys returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.list_api_keys() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all API keys ``` # Get all API keys returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.list_api_keys() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.ListAPIKeys(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.ListAPIKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.ListAPIKeys`:\n%s\n", responseContent) } ``` Copy 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApiKeyListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApiKeyListResponse result = apiInstance.listAPIKeys(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#listAPIKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.list_api_keys().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all API keys ``` /** * Get all API keys returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); apiInstance .listAPIKeys() .then((data: v1.ApiKeyListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/api_keyshttps://api.ap2.datadoghq.com/api/v2/api_keyshttps://api.datadoghq.eu/api/v2/api_keyshttps://api.ddog-gov.com/api/v2/api_keyshttps://api.datadoghq.com/api/v2/api_keyshttps://api.us3.datadoghq.com/api/v2/api_keyshttps://api.us5.datadoghq.com/api/v2/api_keys ### Overview List all API keys available for your account. This endpoint requires the `api_keys_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort enum API key attribute used to sort results. Sort order is ascending by default. In order to specify a descending sort, prefix the attribute with a minus sign. Allowed enum values: `created_at, -created_at, last4, -last4, modified_at, -modified_at, name, -name` filter string Filter API keys by the specified string. filter[created_at][start] string Only include API keys created on or after the specified date. filter[created_at][end] string Only include API keys created on or before the specified date. filter[modified_at][start] string Only include API keys modified on or after the specified date. filter[modified_at][end] string Only include API keys modified on or before the specified date. include string Comma separated list of resource paths for related resources to include in the response. Supported resource paths are `created_by` and `modified_by`. filter[remote_config_read_enabled] boolean Filter API keys by remote config read enabled status. filter[category] string Filter API keys by category. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-403-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#ListAPIKeys-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for a list of API keys. Field Type Description data [object] Array of API keys. attributes object Attributes of a partial API key. category string The category of the API key. created_at string Creation date of the API key. date_last_used date-time Date the API Key was last used. last4 string The last four characters of the API key. modified_at string Date the API key was last modified. name string Name of the API key. remote_config_read_enabled boolean The remote config read enabled status. id string ID of the API key. relationships object Resources related to the API key. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object Relationship to user. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum API Keys resource type. Allowed enum values: `api_keys` default: `api_keys` included [ ] Array of objects related to the API key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` meta object Additional information related to api keys response. max_allowed int64 Max allowed number of API keys. page object Additional information related to the API keys response. total_filtered_count int64 Total filtered application key count. 
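The `page[size]`, `page[number]`, and `filter` query strings described above can be combined to walk the key list page by page. Below is a minimal sketch using the Python client, in the same style as the Code Example section further down; only `filter` appears in the official example below, so treat the `page_size` and `page_number` keyword-argument names as an assumption about how the client maps those query strings. The example payload that follows shows the shape of each page of results.

```
"""
Paginate through API keys - a minimal sketch, not an official example.
Assumes the v2 KeyManagementApi maps page[size] and page[number] to the
page_size and page_number keyword arguments; `filter` is taken from the
official example below. The filter value "metrics" is illustrative.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.key_management_api import KeyManagementApi

configuration = Configuration()

with ApiClient(configuration) as api_client:
    api_instance = KeyManagementApi(api_client)

    page_size = 25
    page_number = 0
    while True:
        # Request one page of up to `page_size` keys whose names match the filter.
        response = api_instance.list_api_keys(
            page_size=page_size,
            page_number=page_number,
            filter="metrics",
        )
        print(response)
        # Stop when a page comes back with fewer entries than requested.
        if len(response.data) < page_size:
            break
        page_number += 1
```

Each response carries `meta.page.total_filtered_count`, so the loop could alternatively stop once `page_size * (page_number + 1)` reaches that count.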
``` { "data": [ { "attributes": { "category": "string", "created_at": "2020-11-23T10:00:00.000Z", "date_last_used": "2020-11-27T10:00:00.000Z", "last4": "abcd", "modified_at": "2020-11-23T10:00:00.000Z", "name": "API Key for submitting metrics", "remote_config_read_enabled": false }, "id": "string", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } } }, "type": "api_keys" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "max_allowed": "integer", "page": { "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get all API keys Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/api_keys" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all API keys ``` """ Get all API keys returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "api_key" in the system API_KEY_DATA_ATTRIBUTES_NAME = environ["API_KEY_DATA_ATTRIBUTES_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.list_api_keys( filter=API_KEY_DATA_ATTRIBUTES_NAME, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all API keys ``` # Get all API keys returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "api_key" in the system API_KEY_DATA_ATTRIBUTES_NAME = ENV["API_KEY_DATA_ATTRIBUTES_NAME"] opts = { filter: API_KEY_DATA_ATTRIBUTES_NAME, } p api_instance.list_api_keys(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "api_key" in the system APIKeyDataAttributesName := os.Getenv("API_KEY_DATA_ATTRIBUTES_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.ListAPIKeys(ctx, *datadogV2.NewListAPIKeysOptionalParameters().WithFilter(APIKeyDataAttributesName)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.ListAPIKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.ListAPIKeys`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.api.KeyManagementApi.ListAPIKeysOptionalParameters; import com.datadog.api.client.v2.model.APIKeysResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "api_key" in the system String API_KEY_DATA_ATTRIBUTES_NAME = System.getenv("API_KEY_DATA_ATTRIBUTES_NAME"); try { APIKeysResponse result = apiInstance.listAPIKeys( new ListAPIKeysOptionalParameters().filter(API_KEY_DATA_ATTRIBUTES_NAME)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#listAPIKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all API keys ``` // Get all API keys returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::api_key_management::ListAPIKeysOptionalParams; #[tokio::main] async fn main() { // there is a valid "api_key" in the system let api_key_data_attributes_name = std::env::var("API_KEY_DATA_ATTRIBUTES_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .list_api_keys( ListAPIKeysOptionalParams::default().filter(api_key_data_attributes_name.clone()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all API keys ``` /** * Get all API keys returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "api_key" in the system const API_KEY_DATA_ATTRIBUTES_NAME = process.env .API_KEY_DATA_ATTRIBUTES_NAME as string; const params: v2.KeyManagementApiListAPIKeysRequest 
= { filter: API_KEY_DATA_ATTRIBUTES_NAME, }; apiInstance .listAPIKeys(params) .then((data: v2.APIKeysResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an API key](https://docs.datadoghq.com/api/latest/key-management/#create-an-api-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#create-an-api-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#create-an-api-key-v2) POST https://api.ap1.datadoghq.com/api/v1/api_keyhttps://api.ap2.datadoghq.com/api/v1/api_keyhttps://api.datadoghq.eu/api/v1/api_keyhttps://api.ddog-gov.com/api/v1/api_keyhttps://api.datadoghq.com/api/v1/api_keyhttps://api.us3.datadoghq.com/api/v1/api_keyhttps://api.us5.datadoghq.com/api/v1/api_key ### Overview Creates an API key with a given name. This endpoint requires the `api_keys_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Expand All Field Type Description created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "name": "example user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-200-v1) * [400](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-400-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-403-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An API key with its associated metadata. Field Type Description api_key object Datadog API key. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "api_key": { "created_by": "test_user", "key": "1234512345123456abcabc912349abcd", "name": "app_key" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Create an API key Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/api_key" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create an API key ``` """ Create an API key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi from datadog_api_client.v1.model.api_key import ApiKey body = ApiKey( name="example user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.create_api_key(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an API key ``` # Create an API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new body = DatadogAPIClient::V1::ApiKey.new({ name: "example user", }) p api_instance.create_api_key(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an API key ``` // Create an API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ApiKey{ Name: datadog.PtrString("example user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.CreateAPIKey(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.CreateAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.CreateAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an API key ``` // Create an API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApiKey; import com.datadog.api.client.v1.model.ApiKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApiKey body = new ApiKey().name("example user"); try { ApiKeyResponse result = apiInstance.createAPIKey(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#createAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an API key ``` // Create an API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV1::model::ApiKey; #[tokio::main] async fn main() { let body = ApiKey::new().name("example user".to_string()); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.create_api_key(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an API key ``` /** * Create an API key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiCreateAPIKeyRequest = { body: { name: "example user", }, }; apiInstance .createAPIKey(params) .then((data: v1.ApiKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/api_keyshttps://api.ap2.datadoghq.com/api/v2/api_keyshttps://api.datadoghq.eu/api/v2/api_keyshttps://api.ddog-gov.com/api/v2/api_keyshttps://api.datadoghq.com/api/v2/api_keyshttps://api.us3.datadoghq.com/api/v2/api_keyshttps://api.us5.datadoghq.com/api/v2/api_keys ### Overview Create an API key. This endpoint requires the `api_keys_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Field Type Description data [_required_] object Object used to create an API key. attributes [_required_] object Attributes used to create an API Key. category string The APIKeyCreateAttributes category. name [_required_] string Name of the API key. remote_config_read_enabled boolean The APIKeyCreateAttributes remote_config_read_enabled. type [_required_] enum API Keys resource type. Allowed enum values: `api_keys` default: `api_keys` ``` { "data": { "type": "api_keys", "attributes": { "name": "Example-Key-Management" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-201-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-403-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#CreateAPIKey-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an API key. Field Type Description data object Datadog API key. attributes object Attributes of a full API key. category string The category of the API key. created_at date-time Creation date of the API key. date_last_used date-time Date the API Key was last used key string The API key. last4 string The last four characters of the API key. modified_at date-time Date the API key was last modified. name string Name of the API key. remote_config_read_enabled boolean The remote config read enabled status. id string ID of the API key. relationships object Resources related to the API key. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum API Keys resource type. Allowed enum values: `api_keys` default: `api_keys` included [ ] Array of objects related to the API key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. 
email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. 
Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "category": "string", "created_at": "2020-11-23T10:00:00.000Z", "date_last_used": "2020-11-27T10:00:00.000Z", "key": "string", "last4": "abcd", "modified_at": "2020-11-23T10:00:00.000Z", "name": "API Key for submitting metrics", "remote_config_read_enabled": false }, "id": "string", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } } }, "type": "api_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
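Beyond the `name`-only request shown in the Code Example section below, the optional attributes from the request model (such as `remote_config_read_enabled`) can be set in the same call. The following is a minimal Python sketch reusing the imports from the official example below; the `remote_config_read_enabled` keyword mirrors the attribute name in the request model, but treat the exact client signature as an assumption, and the key name is illustrative.

```
"""
Create an API key with an optional attribute set - a minimal sketch,
not an official example. Assumes APIKeyCreateAttributes accepts the
remote_config_read_enabled keyword listed in the request model above.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.key_management_api import KeyManagementApi
from datadog_api_client.v2.model.api_key_create_attributes import APIKeyCreateAttributes
from datadog_api_client.v2.model.api_key_create_data import APIKeyCreateData
from datadog_api_client.v2.model.api_key_create_request import APIKeyCreateRequest
from datadog_api_client.v2.model.api_keys_type import APIKeysType

body = APIKeyCreateRequest(
    data=APIKeyCreateData(
        type=APIKeysType.API_KEYS,
        attributes=APIKeyCreateAttributes(
            name="Remote-config-reader",        # illustrative name
            remote_config_read_enabled=True,    # optional attribute from the request model above
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = KeyManagementApi(api_client)
    # The 201 response model above includes the full `key` value.
    response = api_instance.create_api_key(body=body)
    print(response)
```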
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Create an API key returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/api_keys" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "api_keys", "attributes": { "name": "Example-Key-Management" } } } EOF ``` ##### Create an API key returns "Created" response ``` // Create an API key returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.APIKeyCreateRequest{ Data: datadogV2.APIKeyCreateData{ Type: datadogV2.APIKEYSTYPE_API_KEYS, Attributes: datadogV2.APIKeyCreateAttributes{ Name: "Example-Key-Management", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.CreateAPIKey(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.CreateAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.CreateAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an API key returns "Created" response ``` // Create an API key returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.APIKeyCreateAttributes; import com.datadog.api.client.v2.model.APIKeyCreateData; import com.datadog.api.client.v2.model.APIKeyCreateRequest; import com.datadog.api.client.v2.model.APIKeyResponse; import com.datadog.api.client.v2.model.APIKeysType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); APIKeyCreateRequest body = new APIKeyCreateRequest() .data( new APIKeyCreateData() .type(APIKeysType.API_KEYS) .attributes(new APIKeyCreateAttributes().name("Example-Key-Management"))); try { APIKeyResponse result = 
apiInstance.createAPIKey(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#createAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an API key returns "Created" response ``` """ Create an API key returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.api_key_create_attributes import APIKeyCreateAttributes from datadog_api_client.v2.model.api_key_create_data import APIKeyCreateData from datadog_api_client.v2.model.api_key_create_request import APIKeyCreateRequest from datadog_api_client.v2.model.api_keys_type import APIKeysType body = APIKeyCreateRequest( data=APIKeyCreateData( type=APIKeysType.API_KEYS, attributes=APIKeyCreateAttributes( name="Example-Key-Management", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.create_api_key(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an API key returns "Created" response ``` # Create an API key returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new body = DatadogAPIClient::V2::APIKeyCreateRequest.new({ data: DatadogAPIClient::V2::APIKeyCreateData.new({ type: DatadogAPIClient::V2::APIKeysType::API_KEYS, attributes: DatadogAPIClient::V2::APIKeyCreateAttributes.new({ name: "Example-Key-Management", }), }), }) p api_instance.create_api_key(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an API key returns "Created" response ``` // Create an API key returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::APIKeyCreateAttributes; use datadog_api_client::datadogV2::model::APIKeyCreateData; use datadog_api_client::datadogV2::model::APIKeyCreateRequest; use datadog_api_client::datadogV2::model::APIKeysType; #[tokio::main] async fn main() { let body = APIKeyCreateRequest::new(APIKeyCreateData::new( APIKeyCreateAttributes::new("Example-Key-Management".to_string()), APIKeysType::API_KEYS, )); let configuration = datadog::Configuration::new(); 
let api = KeyManagementAPI::with_config(configuration); let resp = api.create_api_key(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an API key returns "Created" response ``` /** * Create an API key returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); const params: v2.KeyManagementApiCreateAPIKeyRequest = { body: { data: { type: "api_keys", attributes: { name: "Example-Key-Management", }, }, }, }; apiInstance .createAPIKey(params) .then((data: v2.APIKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an application key owned by current user](https://docs.datadoghq.com/api/latest/key-management/#edit-an-application-key-owned-by-current-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#edit-an-application-key-owned-by-current-user-v2) PATCH https://api.ap1.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/current_user/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/current_user/application_keys/{app_key_id} ### Overview Edit an application key owned by current user. The `key` field is not returned for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires the `user_app_keys` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Field Type Description data [_required_] object Object used to update an application key. attributes [_required_] object Attributes used to update an application Key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id [_required_] string ID of the application key. type [_required_] enum Application Keys resource type. 
Allowed enum values: `application_keys` default: `application_keys` ``` { "data": { "id": "string", "type": "application_keys", "attributes": { "name": "Application Key for managing dashboards-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#UpdateCurrentUserApplicationKey-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#UpdateCurrentUserApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#UpdateCurrentUserApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#UpdateCurrentUserApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#UpdateCurrentUserApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. 
id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
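The request model for this endpoint also accepts a `scopes` array, which the language examples in the Code Example section below do not exercise. Here is a minimal Python sketch that restricts an application key to dashboard scopes, reusing the imports and the `update_current_user_application_key` call from the official example below; the `scopes` keyword mirrors the attribute name in the request model, but treat the exact client signature as an assumption. The scope names are the ones shown in the 200 response example above.

```
"""
Scope an application key owned by the current user - a minimal sketch,
not an official example. Assumes ApplicationKeyUpdateAttributes accepts
the `scopes` keyword listed in the request model above.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.key_management_api import KeyManagementApi
from datadog_api_client.v2.model.application_key_update_attributes import ApplicationKeyUpdateAttributes
from datadog_api_client.v2.model.application_key_update_data import ApplicationKeyUpdateData
from datadog_api_client.v2.model.application_key_update_request import ApplicationKeyUpdateRequest
from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType

# ID of an application key owned by the current user.
APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"]

body = ApplicationKeyUpdateRequest(
    data=ApplicationKeyUpdateData(
        id=APPLICATION_KEY_DATA_ID,
        type=ApplicationKeysType.APPLICATION_KEYS,
        attributes=ApplicationKeyUpdateAttributes(
            # Restrict the key to dashboard access (scope names from the response example above).
            scopes=["dashboards_read", "dashboards_write"],
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = KeyManagementApi(api_client)
    response = api_instance.update_current_user_application_key(
        app_key_id=APPLICATION_KEY_DATA_ID, body=body
    )
    print(response)
```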
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Edit an application key owned by current user returns "OK" response Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys/${app_key_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "string", "type": "application_keys", "attributes": { "name": "Application Key for managing dashboards-updated" } } } EOF ``` ##### Edit an application key owned by current user returns "OK" response ``` // Edit an application key owned by current user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") body := datadogV2.ApplicationKeyUpdateRequest{ Data: datadogV2.ApplicationKeyUpdateData{ Id: ApplicationKeyDataID, Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS, Attributes: datadogV2.ApplicationKeyUpdateAttributes{ Name: datadog.PtrString("Application Key for managing dashboards-updated"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.UpdateCurrentUserApplicationKey(ctx, ApplicationKeyDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.UpdateCurrentUserApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.UpdateCurrentUserApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an application key owned by current user returns "OK" response ``` // Edit an application key owned by current user returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeyUpdateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyUpdateData; import com.datadog.api.client.v2.model.ApplicationKeyUpdateRequest; import com.datadog.api.client.v2.model.ApplicationKeysType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ATTRIBUTES_NAME = System.getenv("APPLICATION_KEY_DATA_ATTRIBUTES_NAME"); String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); ApplicationKeyUpdateRequest body = new ApplicationKeyUpdateRequest() .data( new ApplicationKeyUpdateData() .id(APPLICATION_KEY_DATA_ID) .type(ApplicationKeysType.APPLICATION_KEYS) .attributes( new ApplicationKeyUpdateAttributes() .name("Application Key for managing dashboards-updated"))); try { ApplicationKeyResponse result = apiInstance.updateCurrentUserApplicationKey(APPLICATION_KEY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#updateCurrentUserApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an application key owned by current user returns "OK" response ``` """ Edit an application key owned by current user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.application_key_update_attributes import ApplicationKeyUpdateAttributes from datadog_api_client.v2.model.application_key_update_data import ApplicationKeyUpdateData from datadog_api_client.v2.model.application_key_update_request import ApplicationKeyUpdateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ATTRIBUTES_NAME = environ["APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] body = ApplicationKeyUpdateRequest( data=ApplicationKeyUpdateData( id=APPLICATION_KEY_DATA_ID, type=ApplicationKeysType.APPLICATION_KEYS, attributes=ApplicationKeyUpdateAttributes( name="Application Key for managing dashboards-updated", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.update_current_user_application_key(app_key_id=APPLICATION_KEY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to 
`example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an application key owned by current user returns "OK" response ``` # Edit an application key owned by current user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ATTRIBUTES_NAME = ENV["APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] body = DatadogAPIClient::V2::ApplicationKeyUpdateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyUpdateData.new({ id: APPLICATION_KEY_DATA_ID, type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, attributes: DatadogAPIClient::V2::ApplicationKeyUpdateAttributes.new({ name: "Application Key for managing dashboards-updated", }), }), }) p api_instance.update_current_user_application_key(APPLICATION_KEY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an application key owned by current user returns "OK" response ``` // Edit an application key owned by current user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateData; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let body = ApplicationKeyUpdateRequest::new(ApplicationKeyUpdateData::new( ApplicationKeyUpdateAttributes::new() .name("Application Key for managing dashboards-updated".to_string()), application_key_data_id.clone(), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .update_current_user_application_key(application_key_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an application key owned by current user returns "OK" response ``` /** * Edit an application key owned by current user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: 
v2.KeyManagementApiUpdateCurrentUserApplicationKeyRequest = { body: { data: { id: APPLICATION_KEY_DATA_ID, type: "application_keys", attributes: { name: "Application Key for managing dashboards-updated", }, }, }, appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .updateCurrentUserApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get API key](https://docs.datadoghq.com/api/latest/key-management/#get-api-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#get-api-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-api-key-v2) GET https://api.ap1.datadoghq.com/api/v1/api_key/{key}https://api.ap2.datadoghq.com/api/v1/api_key/{key}https://api.datadoghq.eu/api/v1/api_key/{key}https://api.ddog-gov.com/api/v1/api_key/{key}https://api.datadoghq.com/api/v1/api_key/{key}https://api.us3.datadoghq.com/api/v1/api_key/{key}https://api.us5.datadoghq.com/api/v1/api_key/{key} ### Overview Get a given API key. This endpoint requires the `api_keys_read` permission. ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific API key you are working with. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-200-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-404-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An API key with its associated metadata. Field Type Description api_key object Datadog API key. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "api_key": { "created_by": "test_user", "key": "1234512345123456abcabc912349abcd", "name": "app_key" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
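As a supplement to this endpoint's reference: the per-language examples on this page print the whole response, so the following hedged curl sketch shows where the key value itself lives in the v1 payload. It assumes the US1 host (`api.datadoghq.com`) and uses `jq` purely for illustration; neither is prescribed by the reference above.

```
# Supplementary sketch (not part of the reference). Assumes the US1 host and
# that jq is available; substitute the host for your Datadog site.
export key="CHANGE_ME"   # the {key} path parameter described above

# The v1 response nests everything under the top-level "api_key" object.
curl -s -X GET "https://api.datadoghq.com/api/v1/api_key/${key}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" |
  jq '{name: .api_key.name, created_by: .api_key.created_by, key: .api_key.key}'
```

The v2 endpoint documented next returns the same information with the key value nested under `data.attributes.key` instead.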
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get API key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/api_key/${key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get API key ``` """ Get API key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.get_api_key( key="key", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get API key ``` # Get API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.get_api_key("key") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get API key ``` // Get API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.GetAPIKey(ctx, "key") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.GetAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.GetAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get API key ``` // Get API 
key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApiKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApiKeyResponse result = apiInstance.getAPIKey("key"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#getAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get API key ``` // Get API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.get_api_key("key".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get API key ``` /** * Get API key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiGetAPIKeyRequest = { key: "key", }; apiInstance .getAPIKey(params) .then((data: v1.ApiKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.ap2.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.eu/api/v2/api_keys/{api_key_id}https://api.ddog-gov.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us3.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us5.datadoghq.com/api/v2/api_keys/{api_key_id} ### Overview Get an API key. This endpoint requires the `api_keys_read` permission. ### Arguments #### Path Parameters Name Type Description api_key_id [_required_] string The ID of the API key. 
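In addition to the `api_key_id` path parameter, this endpoint accepts the `include` query string described in the next subsection; none of the per-language examples further down set it. The following hedged curl sketch (assuming the US1 host and `jq`, neither of which is prescribed by the reference) requests the related user objects alongside the key:

```
# Supplementary sketch (not part of the reference). Assumes the US1 host and jq.
export api_key_id="CHANGE_ME"

curl -s -X GET "https://api.datadoghq.com/api/v2/api_keys/${api_key_id}?include=created_by,modified_by" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" |
  # relationships.created_by carries only the user id; the full user objects
  # requested via `include` arrive in the top-level "included" array.
  jq '{creator_id: .data.relationships.created_by.data.id,
       included_types: [.included[].type]}'
```

Without `include`, expect only the identifiers under `relationships`; the embedded user objects in `included` should only appear when requested this way.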
#### Query Strings Name Type Description include string Comma separated list of resource paths for related resources to include in the response. Supported resource paths are `created_by` and `modified_by`. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-200-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#GetAPIKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an API key. Field Type Description data object Datadog API key. attributes object Attributes of a full API key. category string The category of the API key. created_at date-time Creation date of the API key. date_last_used date-time Date the API Key was last used key string The API key. last4 string The last four characters of the API key. modified_at date-time Date the API key was last modified. name string Name of the API key. remote_config_read_enabled boolean The remote config read enabled status. id string ID of the API key. relationships object Resources related to the API key. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum API Keys resource type. Allowed enum values: `api_keys` default: `api_keys` included [ ] Array of objects related to the API key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. 
data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "category": "string", "created_at": "2020-11-23T10:00:00.000Z", "date_last_used": "2020-11-27T10:00:00.000Z", "key": "string", "last4": "abcd", "modified_at": "2020-11-23T10:00:00.000Z", "name": "API Key for submitting metrics", "remote_config_read_enabled": false }, "id": "string", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } } }, "type": "api_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get API key Copy ``` # Path parameters export api_key_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/api_keys/${api_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get API key ``` """ Get API key returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "api_key" in the system API_KEY_DATA_ID = environ["API_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.get_api_key( api_key_id=API_KEY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get API key ``` # Get API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "api_key" in the system API_KEY_DATA_ID = ENV["API_KEY_DATA_ID"] p api_instance.get_api_key(API_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get API key ``` // Get API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "api_key" in the system APIKeyDataID := os.Getenv("API_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.GetAPIKey(ctx, APIKeyDataID, *datadogV2.NewGetAPIKeyOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.GetAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.GetAPIKey`:\n%s\n", responseContent) } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get API key ``` // Get API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.APIKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "api_key" in the system String API_KEY_DATA_ID = System.getenv("API_KEY_DATA_ID"); try { APIKeyResponse result = apiInstance.getAPIKey(API_KEY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#getAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get API key ``` // Get API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::GetAPIKeyOptionalParams; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { // there is a valid "api_key" in the system let api_key_data_id = std::env::var("API_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .get_api_key(api_key_data_id.clone(), GetAPIKeyOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get API key ``` /** * Get API key returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "api_key" in the system const API_KEY_DATA_ID = process.env.API_KEY_DATA_ID as string; const params: v2.KeyManagementApiGetAPIKeyRequest = { apiKeyId: API_KEY_DATA_ID, }; apiInstance .getAPIKey(params) .then((data: v2.APIKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get one application key owned by current user](https://docs.datadoghq.com/api/latest/key-management/#get-one-application-key-owned-by-current-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-one-application-key-owned-by-current-user-v2) GET https://api.ap1.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/current_user/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/current_user/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/current_user/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/current_user/application_keys/{app_key_id} ### Overview Get an application key owned by current user. The `key` field is not returned for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires the `user_app_keys` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#GetCurrentUserApplicationKey-200-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#GetCurrentUserApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#GetCurrentUserApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#GetCurrentUserApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. 
last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. 
Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
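As a supplement to this endpoint's reference: the overview notes that the `key` field is not returned for organizations in One-Time Read mode, so a client should be prepared for it to be absent. A hedged curl sketch (assuming the US1 host and `jq`, neither required by the reference) that falls back to the documented `last4` field:

```
# Supplementary sketch (not part of the reference). Assumes the US1 host and jq.
export app_key_id="CHANGE_ME"

curl -s -X GET "https://api.datadoghq.com/api/v2/current_user/application_keys/${app_key_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" |
  # In One-Time Read mode attributes.key is omitted, so fall back to the
  # last four characters; name and scopes come from the same attributes object.
  jq '{name: .data.attributes.name,
       scopes: .data.attributes.scopes,
       key_or_last4: (.data.attributes.key // .data.attributes.last4)}'
```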
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get one application key owned by current user Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys/${app_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get one application key owned by current user ``` """ Get one application key owned by current user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.get_current_user_application_key( app_key_id=APPLICATION_KEY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get one application key owned by current user ``` # Get one application key owned by current user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] p api_instance.get_current_user_application_key(APPLICATION_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get one application key owned by current user ``` // Get one application key owned by current user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := 
api.GetCurrentUserApplicationKey(ctx, ApplicationKeyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.GetCurrentUserApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.GetCurrentUserApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get one application key owned by current user ``` // Get one application key owned by current user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); try { ApplicationKeyResponse result = apiInstance.getCurrentUserApplicationKey(APPLICATION_KEY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#getCurrentUserApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get one application key owned by current user ``` // Get one application key owned by current user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .get_current_user_application_key(application_key_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get one application key owned by current user ``` /** * Get one application key owned by current user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); 
const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: v2.KeyManagementApiGetCurrentUserApplicationKeyRequest = { appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .getCurrentUserApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an application key for current user](https://docs.datadoghq.com/api/latest/key-management/#create-an-application-key-for-current-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#create-an-application-key-for-current-user-v2) POST https://api.ap1.datadoghq.com/api/v2/current_user/application_keyshttps://api.ap2.datadoghq.com/api/v2/current_user/application_keyshttps://api.datadoghq.eu/api/v2/current_user/application_keyshttps://api.ddog-gov.com/api/v2/current_user/application_keyshttps://api.datadoghq.com/api/v2/current_user/application_keyshttps://api.us3.datadoghq.com/api/v2/current_user/application_keyshttps://api.us5.datadoghq.com/api/v2/current_user/application_keys ### Overview Create an application key for current user This endpoint requires the `user_app_keys` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Field Type Description data [_required_] object Object used to create an application key. attributes [_required_] object Attributes used to create an application Key. name [_required_] string Name of the application key. scopes [string] Array of scopes to grant the application key. type [_required_] enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` ##### Create an Application key with scopes for current user returns "Created" response ``` { "data": { "type": "application_keys", "attributes": { "name": "Example-Key-Management", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] } } } ``` Copy ##### Create an application key for current user returns "Created" response ``` { "data": { "type": "application_keys", "attributes": { "name": "Example-Key-Management" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/key-management/#CreateCurrentUserApplicationKey-201-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#CreateCurrentUserApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#CreateCurrentUserApplicationKey-403-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#CreateCurrentUserApplicationKey-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. 
key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. 
date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
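As a supplement to this endpoint's reference: the create response returns the generated key value under `data.attributes.key`, and `data.id` is the identifier that the other application-key endpoints on this page take as `app_key_id`. A hedged curl sketch (assuming the US1 host and `jq`, neither required by the reference) that creates a scoped key and extracts both:

```
# Supplementary sketch (not part of the reference). Assumes the US1 host and jq.
curl -s -X POST "https://api.datadoghq.com/api/v2/current_user/application_keys" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d '{
    "data": {
      "type": "application_keys",
      "attributes": {
        "name": "Example-Key-Management",
        "scopes": ["dashboards_read", "dashboards_write", "dashboards_public_share"]
      }
    }
  }' |
  # data.id feeds the {app_key_id} path parameter used by the GET and PATCH
  # application-key endpoints above; attributes.key is the new key value.
  jq '{id: .data.id, key: .data.attributes.key, scopes: .data.attributes.scopes}'
```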
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Create an Application key with scopes for current user returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "application_keys", "attributes": { "name": "Example-Key-Management", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] } } } EOF ``` ##### Create an application key for current user returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "application_keys", "attributes": { "name": "Example-Key-Management" } } } EOF ``` ##### Create an Application key with scopes for current user returns "Created" response ``` // Create an Application key with scopes for current user returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ApplicationKeyCreateRequest{ Data: datadogV2.ApplicationKeyCreateData{ Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS, Attributes: datadogV2.ApplicationKeyCreateAttributes{ Name: "Example-Key-Management", Scopes: *datadog.NewNullableList(&[]string{ "dashboards_read", "dashboards_write", "dashboards_public_share", }), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.CreateCurrentUserApplicationKey(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.CreateCurrentUserApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.CreateCurrentUserApplicationKey`:\n%s\n", responseContent) } ``` Copy ##### Create an application key for current user returns "Created" response ``` // Create an application key for current user returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ApplicationKeyCreateRequest{ Data: datadogV2.ApplicationKeyCreateData{ Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS, Attributes: datadogV2.ApplicationKeyCreateAttributes{ Name: "Example-Key-Management", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.CreateCurrentUserApplicationKey(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.CreateCurrentUserApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.CreateCurrentUserApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an Application key with scopes for current user returns "Created" response ``` // Create an Application key with scopes for current user returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ApplicationKeyCreateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyCreateData; import com.datadog.api.client.v2.model.ApplicationKeyCreateRequest; import com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeysType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApplicationKeyCreateRequest body = new ApplicationKeyCreateRequest() .data( new ApplicationKeyCreateData() .type(ApplicationKeysType.APPLICATION_KEYS) .attributes( new ApplicationKeyCreateAttributes() .name("Example-Key-Management") .scopes( Arrays.asList( "dashboards_read", "dashboards_write", "dashboards_public_share")))); try { ApplicationKeyResponse result = apiInstance.createCurrentUserApplicationKey(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#createCurrentUserApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an application key for current user returns "Created" response ``` // Create an application key for current user returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ApplicationKeyCreateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyCreateData; import com.datadog.api.client.v2.model.ApplicationKeyCreateRequest; import com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeysType; public 
class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApplicationKeyCreateRequest body = new ApplicationKeyCreateRequest() .data( new ApplicationKeyCreateData() .type(ApplicationKeysType.APPLICATION_KEYS) .attributes( new ApplicationKeyCreateAttributes().name("Example-Key-Management"))); try { ApplicationKeyResponse result = apiInstance.createCurrentUserApplicationKey(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#createCurrentUserApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an Application key with scopes for current user returns "Created" response ``` """ Create an Application key with scopes for current user returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.application_key_create_attributes import ApplicationKeyCreateAttributes from datadog_api_client.v2.model.application_key_create_data import ApplicationKeyCreateData from datadog_api_client.v2.model.application_key_create_request import ApplicationKeyCreateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType body = ApplicationKeyCreateRequest( data=ApplicationKeyCreateData( type=ApplicationKeysType.APPLICATION_KEYS, attributes=ApplicationKeyCreateAttributes( name="Example-Key-Management", scopes=[ "dashboards_read", "dashboards_write", "dashboards_public_share", ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.create_current_user_application_key(body=body) print(response) ``` Copy ##### Create an application key for current user returns "Created" response ``` """ Create an application key for current user returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.application_key_create_attributes import ApplicationKeyCreateAttributes from datadog_api_client.v2.model.application_key_create_data import ApplicationKeyCreateData from datadog_api_client.v2.model.application_key_create_request import ApplicationKeyCreateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType body = ApplicationKeyCreateRequest( data=ApplicationKeyCreateData( type=ApplicationKeysType.APPLICATION_KEYS, attributes=ApplicationKeyCreateAttributes( name="Example-Key-Management", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.create_current_user_application_key(body=body) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an Application key with scopes for current user returns "Created" response ``` # Create an Application key with scopes for current user returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new body = DatadogAPIClient::V2::ApplicationKeyCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyCreateData.new({ type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, attributes: DatadogAPIClient::V2::ApplicationKeyCreateAttributes.new({ name: "Example-Key-Management", scopes: [ "dashboards_read", "dashboards_write", "dashboards_public_share", ], }), }), }) p api_instance.create_current_user_application_key(body) ``` Copy ##### Create an application key for current user returns "Created" response ``` # Create an application key for current user returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new body = DatadogAPIClient::V2::ApplicationKeyCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyCreateData.new({ type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, attributes: DatadogAPIClient::V2::ApplicationKeyCreateAttributes.new({ name: "Example-Key-Management", }), }), }) p api_instance.create_current_user_application_key(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an Application key with scopes for current user returns "Created" response ``` // Create an Application key with scopes for current user returns "Created" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::ApplicationKeyCreateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyCreateData; use datadog_api_client::datadogV2::model::ApplicationKeyCreateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { let body = ApplicationKeyCreateRequest::new(ApplicationKeyCreateData::new( ApplicationKeyCreateAttributes::new("Example-Key-Management".to_string()).scopes(Some( vec![ "dashboards_read".to_string(), "dashboards_write".to_string(), "dashboards_public_share".to_string(), ], )), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.create_current_user_application_key(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an application key for current user returns "Created" response ``` // Create an application key for current user returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::ApplicationKeyCreateAttributes; use 
datadog_api_client::datadogV2::model::ApplicationKeyCreateData; use datadog_api_client::datadogV2::model::ApplicationKeyCreateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { let body = ApplicationKeyCreateRequest::new(ApplicationKeyCreateData::new( ApplicationKeyCreateAttributes::new("Example-Key-Management".to_string()), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.create_current_user_application_key(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an Application key with scopes for current user returns "Created" response ``` /** * Create an Application key with scopes for current user returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); const params: v2.KeyManagementApiCreateCurrentUserApplicationKeyRequest = { body: { data: { type: "application_keys", attributes: { name: "Example-Key-Management", scopes: [ "dashboards_read", "dashboards_write", "dashboards_public_share", ], }, }, }, }; apiInstance .createCurrentUserApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an application key for current user returns "Created" response ``` /** * Create an application key for current user returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); const params: v2.KeyManagementApiCreateCurrentUserApplicationKeyRequest = { body: { data: { type: "application_keys", attributes: { name: "Example-Key-Management", }, }, }, }; apiInstance .createCurrentUserApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an API key](https://docs.datadoghq.com/api/latest/key-management/#edit-an-api-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#edit-an-api-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#edit-an-api-key-v2) PUT https://api.ap1.datadoghq.com/api/v1/api_key/{key}https://api.ap2.datadoghq.com/api/v1/api_key/{key}https://api.datadoghq.eu/api/v1/api_key/{key}https://api.ddog-gov.com/api/v1/api_key/{key}https://api.datadoghq.com/api/v1/api_key/{key}https://api.us3.datadoghq.com/api/v1/api_key/{key}https://api.us5.datadoghq.com/api/v1/api_key/{key} ### Overview Edit an API key name. This endpoint requires the `api_keys_write` permission. ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific API key you are working with. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Expand All Field Type Description created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "name": "example user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-200-v1) * [400](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-400-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-404-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An API key with its associated metadata. Field Type Description api_key object Datadog API key. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "api_key": { "created_by": "test_user", "key": "1234512345123456abcabc912349abcd", "name": "app_key" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Edit an API key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/api_key/${key}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Edit an API key ``` """ Edit an API key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi from datadog_api_client.v1.model.api_key import ApiKey body = ApiKey( name="example user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.update_api_key(key="key", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an API key ``` # Edit an API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new body = DatadogAPIClient::V1::ApiKey.new({ name: "example user", }) p api_instance.update_api_key("key", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an API key ``` // Edit an API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ApiKey{ Name: datadog.PtrString("example user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.UpdateAPIKey(ctx, "key", body) if err != nil { fmt.Fprintf(os.Stderr, 
"Error when calling `KeyManagementApi.UpdateAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.UpdateAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an API key ``` // Edit an API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApiKey; import com.datadog.api.client.v1.model.ApiKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApiKey body = new ApiKey().name("example user"); try { ApiKeyResponse result = apiInstance.updateAPIKey("key", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#updateAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an API key ``` // Edit an API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV1::model::ApiKey; #[tokio::main] async fn main() { let body = ApiKey::new().name("example user".to_string()); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.update_api_key("key".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an API key ``` /** * Edit an API key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiUpdateAPIKeyRequest = { body: { name: "example user", }, key: "key", }; apiInstance .updateAPIKey(params) .then((data: v1.ApiKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.ap2.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.eu/api/v2/api_keys/{api_key_id}https://api.ddog-gov.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us3.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us5.datadoghq.com/api/v2/api_keys/{api_key_id} ### Overview Update an API key. This endpoint requires the `api_keys_write` permission. ### Arguments #### Path Parameters Name Type Description api_key_id [_required_] string The ID of the API key. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Field Type Description data [_required_] object Object used to update an API key. attributes [_required_] object Attributes used to update an API Key. category string The APIKeyUpdateAttributes category. name [_required_] string Name of the API key. remote_config_read_enabled boolean The APIKeyUpdateAttributes remote_config_read_enabled. id [_required_] string ID of the API key. type [_required_] enum API Keys resource type. Allowed enum values: `api_keys` default: `api_keys` ``` { "data": { "type": "api_keys", "id": "string", "attributes": { "name": "Example-Key-Management" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#UpdateAPIKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an API key. Field Type Description data object Datadog API key. attributes object Attributes of a full API key. category string The category of the API key. created_at date-time Creation date of the API key. date_last_used date-time Date the API Key was last used key string The API key. last4 string The last four characters of the API key. modified_at date-time Date the API key was last modified. name string Name of the API key. remote_config_read_enabled boolean The remote config read enabled status. id string ID of the API key. relationships object Resources related to the API key. created_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum API Keys resource type. 
Allowed enum values: `api_keys` default: `api_keys` included [ ] Array of objects related to the API key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. 
Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "category": "string", "created_at": "2020-11-23T10:00:00.000Z", "date_last_used": "2020-11-27T10:00:00.000Z", "key": "string", "last4": "abcd", "modified_at": "2020-11-23T10:00:00.000Z", "name": "API Key for submitting metrics", "remote_config_read_enabled": false }, "id": "string", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-0000-000000000000", "type": "users" } } }, "type": "api_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Edit an API key returns "OK" response Copy ``` # Path parameters export api_key_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/api_keys/${api_key_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "api_keys", "id": "string", "attributes": { "name": "Example-Key-Management" } } } EOF ``` ##### Edit an API key returns "OK" response ``` // Edit an API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "api_key" in the system APIKeyDataID := os.Getenv("API_KEY_DATA_ID") body := datadogV2.APIKeyUpdateRequest{ Data: datadogV2.APIKeyUpdateData{ Type: datadogV2.APIKEYSTYPE_API_KEYS, Id: APIKeyDataID, Attributes: datadogV2.APIKeyUpdateAttributes{ Name: "Example-Key-Management", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.UpdateAPIKey(ctx, APIKeyDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.UpdateAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.UpdateAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an API key returns "OK" response ``` // Edit an API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.APIKeyResponse; import com.datadog.api.client.v2.model.APIKeyUpdateAttributes; import com.datadog.api.client.v2.model.APIKeyUpdateData; import com.datadog.api.client.v2.model.APIKeyUpdateRequest; import com.datadog.api.client.v2.model.APIKeysType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "api_key" in the system String API_KEY_DATA_ID = 
System.getenv("API_KEY_DATA_ID"); APIKeyUpdateRequest body = new APIKeyUpdateRequest() .data( new APIKeyUpdateData() .type(APIKeysType.API_KEYS) .id(API_KEY_DATA_ID) .attributes(new APIKeyUpdateAttributes().name("Example-Key-Management"))); try { APIKeyResponse result = apiInstance.updateAPIKey(API_KEY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#updateAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an API key returns "OK" response ``` """ Edit an API key returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.api_key_update_attributes import APIKeyUpdateAttributes from datadog_api_client.v2.model.api_key_update_data import APIKeyUpdateData from datadog_api_client.v2.model.api_key_update_request import APIKeyUpdateRequest from datadog_api_client.v2.model.api_keys_type import APIKeysType # there is a valid "api_key" in the system API_KEY_DATA_ID = environ["API_KEY_DATA_ID"] body = APIKeyUpdateRequest( data=APIKeyUpdateData( type=APIKeysType.API_KEYS, id=API_KEY_DATA_ID, attributes=APIKeyUpdateAttributes( name="Example-Key-Management", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.update_api_key(api_key_id=API_KEY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an API key returns "OK" response ``` # Edit an API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "api_key" in the system API_KEY_DATA_ID = ENV["API_KEY_DATA_ID"] body = DatadogAPIClient::V2::APIKeyUpdateRequest.new({ data: DatadogAPIClient::V2::APIKeyUpdateData.new({ type: DatadogAPIClient::V2::APIKeysType::API_KEYS, id: API_KEY_DATA_ID, attributes: DatadogAPIClient::V2::APIKeyUpdateAttributes.new({ name: "Example-Key-Management", }), }), }) p api_instance.update_api_key(API_KEY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an API key returns "OK" response ``` // Edit an API key returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::APIKeyUpdateAttributes; use datadog_api_client::datadogV2::model::APIKeyUpdateData; use datadog_api_client::datadogV2::model::APIKeyUpdateRequest; use datadog_api_client::datadogV2::model::APIKeysType; #[tokio::main] async fn main() { // there is a valid "api_key" in the system let api_key_data_id = std::env::var("API_KEY_DATA_ID").unwrap(); let body = APIKeyUpdateRequest::new(APIKeyUpdateData::new( APIKeyUpdateAttributes::new("Example-Key-Management".to_string()), api_key_data_id.clone(), APIKeysType::API_KEYS, )); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.update_api_key(api_key_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an API key returns "OK" response ``` /** * Edit an API key returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "api_key" in the system const API_KEY_DATA_ID = process.env.API_KEY_DATA_ID as string; const params: v2.KeyManagementApiUpdateAPIKeyRequest = { body: { data: { type: "api_keys", id: API_KEY_DATA_ID, attributes: { name: "Example-Key-Management", }, }, }, apiKeyId: API_KEY_DATA_ID, }; apiInstance .updateAPIKey(params) .then((data: v2.APIKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an API key](https://docs.datadoghq.com/api/latest/key-management/#delete-an-api-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#delete-an-api-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#delete-an-api-key-v2) DELETE https://api.ap1.datadoghq.com/api/v1/api_key/{key}https://api.ap2.datadoghq.com/api/v1/api_key/{key}https://api.datadoghq.eu/api/v1/api_key/{key}https://api.ddog-gov.com/api/v1/api_key/{key}https://api.datadoghq.com/api/v1/api_key/{key}https://api.us3.datadoghq.com/api/v1/api_key/{key}https://api.us5.datadoghq.com/api/v1/api_key/{key} ### Overview Delete a given API key. This endpoint requires the `api_keys_delete` permission. ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific API key you are working with. 
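The client examples below cover this endpoint in each supported language. If you call the API directly instead, the request is a plain HTTP DELETE with the key value in the path and the standard authentication headers. The following is a minimal sketch only, written in Python with the third-party `requests` package; it assumes the default `api.datadoghq.com` site (substitute your region's endpoint) and a hypothetical `DD_API_KEY_TO_DELETE` environment variable holding the key value to remove.

```
# Minimal sketch: delete an API key through the v1 endpoint with a plain HTTP client.
# Assumes the api.datadoghq.com site and a hypothetical DD_API_KEY_TO_DELETE variable;
# the official client libraries shown below are the supported path.
import os

import requests

key = os.environ["DD_API_KEY_TO_DELETE"]  # the key value to delete (path parameter)

resp = requests.delete(
    f"https://api.datadoghq.com/api/v1/api_key/{key}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)

if resp.status_code == 200:
    # On success, the deleted key's metadata is returned under "api_key" (see Response below).
    print(resp.json()["api_key"]["name"])
else:
    # Error responses (400/403/404/429) carry an "errors" array of messages.
    print(resp.status_code, resp.json().get("errors"))
```

Note that this v1 endpoint identifies the key by its value; the v2 endpoint further down identifies it by its ID instead.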
### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-200-v1) * [400](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-400-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-404-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An API key with its associated metadata. Field Type Description api_key object Datadog API key. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. ``` { "api_key": { "created_by": "test_user", "key": "1234512345123456abcabc912349abcd", "name": "app_key" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Delete an API key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/api_key/${key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an API key ``` """ Delete an API key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.delete_api_key( key="key", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an API key ``` # Delete an API key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.delete_api_key("key") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an API key ``` // Delete an API key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.DeleteAPIKey(ctx, "key") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.DeleteAPIKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.DeleteAPIKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" go run "main.go" ``` ##### Delete an API key ``` // Delete an API key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApiKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApiKeyResponse result = apiInstance.deleteAPIKey("key"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#deleteAPIKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an API key ``` // Delete an API key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.delete_api_key("key".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an API key ``` /** * Delete an API key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiDeleteAPIKeyRequest = { key: "key", }; apiInstance .deleteAPIKey(params) .then((data: v1.ApiKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` DELETE https://api.ap1.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.ap2.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.eu/api/v2/api_keys/{api_key_id}https://api.ddog-gov.com/api/v2/api_keys/{api_key_id}https://api.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us3.datadoghq.com/api/v2/api_keys/{api_key_id}https://api.us5.datadoghq.com/api/v2/api_keys/{api_key_id} ### Overview Delete an API key. This endpoint requires the `api_keys_delete` permission. 
### Arguments

#### Path Parameters

| Name | Type | Description |
|---|---|---|
| api_key_id [_required_] | string | The ID of the API key. |

### Response

* [204](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-204-v2)
* [403](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-403-v2)
* [404](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-404-v2)
* [429](https://docs.datadoghq.com/api/latest/key-management/#DeleteAPIKey-429-v2)

No Content

Forbidden — API error response.

| Field | Type | Description |
|---|---|---|
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found — API error response.

| Field | Type | Description |
|---|---|---|
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests — API error response.

| Field | Type | Description |
|---|---|---|
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript)

##### Delete an API key

```
# Path parameters
export api_key_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site, for example api.datadoghq.eu)
curl -X DELETE "https://api.datadoghq.com/api/v2/api_keys/${api_key_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an API key

```
"""
Delete an API key returns "No Content" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.key_management_api import KeyManagementApi

# there is a valid "api_key" in the system
API_KEY_DATA_ID = environ["API_KEY_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = KeyManagementApi(api_client)
    api_instance.delete_api_key(
        api_key_id=API_KEY_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, us3.datadoghq.com, us5.datadoghq.com, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete an API key

```
# Delete an API key returns "No Content" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::KeyManagementAPI.new

# there is a valid "api_key" in the system
API_KEY_DATA_ID = ENV["API_KEY_DATA_ID"]
api_instance.delete_api_key(API_KEY_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete an API key

```
// Delete an API key returns "No Content" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "api_key" in the system
	APIKeyDataID := os.Getenv("API_KEY_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewKeyManagementApi(apiClient)
	r, err := api.DeleteAPIKey(ctx, APIKeyDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.DeleteAPIKey`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete an API key

```
// Delete an API key returns "No Content" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.KeyManagementApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    KeyManagementApi apiInstance = new KeyManagementApi(defaultClient);

    // there is a valid "api_key" in the system
    String API_KEY_DATA_ID = System.getenv("API_KEY_DATA_ID");

    try {
      apiInstance.deleteAPIKey(API_KEY_DATA_ID);
    } catch (ApiException e) {
      System.err.println("Exception when calling KeyManagementApi#deleteAPIKey");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Delete an API key

```
// Delete an API key returns "No Content" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI;

#[tokio::main]
async fn main() {
    // there is a valid "api_key" in the system
    let api_key_data_id = std::env::var("API_KEY_DATA_ID").unwrap();

    let configuration = datadog::Configuration::new();
    let api = KeyManagementAPI::with_config(configuration);
    let resp = api.delete_api_key(api_key_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
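# Set DD_SITE to a single Datadog site, for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com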
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an API key ``` /** * Delete an API key returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "api_key" in the system const API_KEY_DATA_ID = process.env.API_KEY_DATA_ID as string; const params: v2.KeyManagementApiDeleteAPIKeyRequest = { apiKeyId: API_KEY_DATA_ID, }; apiInstance .deleteAPIKey(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all application keys owned by current user](https://docs.datadoghq.com/api/latest/key-management/#get-all-application-keys-owned-by-current-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-all-application-keys-owned-by-current-user-v2) GET https://api.ap1.datadoghq.com/api/v2/current_user/application_keyshttps://api.ap2.datadoghq.com/api/v2/current_user/application_keyshttps://api.datadoghq.eu/api/v2/current_user/application_keyshttps://api.ddog-gov.com/api/v2/current_user/application_keyshttps://api.datadoghq.com/api/v2/current_user/application_keyshttps://api.us3.datadoghq.com/api/v2/current_user/application_keyshttps://api.us5.datadoghq.com/api/v2/current_user/application_keys ### Overview List all application keys available for current user This endpoint requires the `user_app_keys` permission. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort enum Application key attribute used to sort results. Sort order is ascending by default. In order to specify a descending sort, prefix the attribute with a minus sign. Allowed enum values: `created_at, -created_at, last4, -last4, name, -name` filter string Filter application keys by the specified string. filter[created_at][start] string Only include application keys created on or after the specified date. filter[created_at][end] string Only include application keys created on or before the specified date. include string Resource path for related resources to include in the response. Only `owned_by` is supported. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#ListCurrentUserApplicationKeys-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#ListCurrentUserApplicationKeys-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#ListCurrentUserApplicationKeys-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#ListCurrentUserApplicationKeys-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#ListCurrentUserApplicationKeys-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for a list of application keys. 
Field Type Description data [object] Array of application keys. attributes object Attributes of a partial application key. created_at string Creation date of the application key. last4 string The last four characters of the application key. last_used_at string Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` meta object Additional information related to the application key response. max_allowed_per_user int64 Max allowed number of application keys per user. page object Additional information related to the application key response. total_filtered_count int64 Total filtered application key count. ``` { "data": [ { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "max_allowed_per_user": "integer", "page": { "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get all application keys owned by current user Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/current_user/application_keys" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all application keys owned by current user ``` """ Get all application keys owned by current user returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.list_current_user_application_keys() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all application keys owned by current user ``` # Get all application keys owned by current user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new p api_instance.list_current_user_application_keys() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all application keys owned by current user ``` // Get all application keys owned by current user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.ListCurrentUserApplicationKeys(ctx, *datadogV2.NewListCurrentUserApplicationKeysOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.ListCurrentUserApplicationKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.ListCurrentUserApplicationKeys`:\n%s\n", responseContent) } ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all application keys owned by current user ``` // Get all application keys owned by current user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ListApplicationKeysResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ListApplicationKeysResponse result = apiInstance.listCurrentUserApplicationKeys(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#listCurrentUserApplicationKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all application keys owned by current user ``` // Get all application keys owned by current user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::api_key_management::ListCurrentUserApplicationKeysOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .list_current_user_application_keys(ListCurrentUserApplicationKeysOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all application keys owned by current user ``` /** * Get all application keys owned by current user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); apiInstance .listCurrentUserApplicationKeys() .then((data: v2.ListApplicationKeysResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all application keys](https://docs.datadoghq.com/api/latest/key-management/#get-all-application-keys) * [v1](https://docs.datadoghq.com/api/latest/key-management/#get-all-application-keys-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-all-application-keys-v2) GET https://api.ap1.datadoghq.com/api/v1/application_keyhttps://api.ap2.datadoghq.com/api/v1/application_keyhttps://api.datadoghq.eu/api/v1/application_keyhttps://api.ddog-gov.com/api/v1/application_keyhttps://api.datadoghq.com/api/v1/application_keyhttps://api.us3.datadoghq.com/api/v1/application_keyhttps://api.us5.datadoghq.com/api/v1/application_key ### Overview Get all application keys available for your Datadog account. This endpoint is disabled for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires any of the following permissions: * `org_app_keys_read` * `user_app_keys` ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-200-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-403-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An application key response. Field Type Description application_keys [object] Array of application keys. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "application_keys": [ { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "app_key", "owner": "test_user" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get all application keys Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/application_key" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all application keys ``` """ Get all application keys returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.list_application_keys() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all application keys ``` # Get all application keys returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.list_application_keys() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.ListApplicationKeys(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.ListApplicationKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.ListApplicationKeys`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApplicationKeyListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApplicationKeyListResponse result = apiInstance.listApplicationKeys(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#listApplicationKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.list_application_keys().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all application keys ``` /** * Get all application keys returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); apiInstance .listApplicationKeys() .then((data: v1.ApplicationKeyListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/application_keyshttps://api.ap2.datadoghq.com/api/v2/application_keyshttps://api.datadoghq.eu/api/v2/application_keyshttps://api.ddog-gov.com/api/v2/application_keyshttps://api.datadoghq.com/api/v2/application_keyshttps://api.us3.datadoghq.com/api/v2/application_keyshttps://api.us5.datadoghq.com/api/v2/application_keys ### Overview List all application keys available for your org This endpoint requires the `org_app_keys_read` permission. 
### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort enum Application key attribute used to sort results. Sort order is ascending by default. In order to specify a descending sort, prefix the attribute with a minus sign. Allowed enum values: `created_at, -created_at, last4, -last4, name, -name` filter string Filter application keys by the specified string. filter[created_at][start] string Only include application keys created on or after the specified date. filter[created_at][end] string Only include application keys created on or before the specified date. include string Resource path for related resources to include in the response. Only `owned_by` is supported. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#ListApplicationKeys-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for a list of application keys. Field Type Description data [object] Array of application keys. attributes object Attributes of a partial application key. created_at string Creation date of the application key. last4 string The last four characters of the application key. last_used_at string Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. 
id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` meta object Additional information related to the application key response. max_allowed_per_user int64 Max allowed number of application keys per user. page object Additional information related to the application key response. total_filtered_count int64 Total filtered application key count. 
``` { "data": [ { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "max_allowed_per_user": "integer", "page": { "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get all application keys Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/application_keys" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all application keys ``` """ Get all application keys returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.list_application_keys() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all application keys ``` # Get all application keys returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new p api_instance.list_application_keys() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.ListApplicationKeys(ctx, *datadogV2.NewListApplicationKeysOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.ListApplicationKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.ListApplicationKeys`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ListApplicationKeysResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ListApplicationKeysResponse result = apiInstance.listApplicationKeys(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#listApplicationKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all application keys ``` // Get all application keys returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::api_key_management::ListApplicationKeysOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .list_application_keys(ListApplicationKeysOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all application keys ``` /** * Get all application keys returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); apiInstance .listApplicationKeys() .then((data: v2.ListApplicationKeysResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an application key](https://docs.datadoghq.com/api/latest/key-management/#create-an-application-key) * [v1 (latest)](https://docs.datadoghq.com/api/latest/key-management/#create-an-application-key-v1) POST https://api.ap1.datadoghq.com/api/v1/application_keyhttps://api.ap2.datadoghq.com/api/v1/application_keyhttps://api.datadoghq.eu/api/v1/application_keyhttps://api.ddog-gov.com/api/v1/application_keyhttps://api.datadoghq.com/api/v1/application_keyhttps://api.us3.datadoghq.com/api/v1/application_keyhttps://api.us5.datadoghq.com/api/v1/application_key ### Overview Create an application key with a given name. This endpoint is disabled for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires the `user_app_keys` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Expand All Field Type Description hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "name": "example user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#CreateApplicationKey-200-v1) * [400](https://docs.datadoghq.com/api/latest/key-management/#CreateApplicationKey-400-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#CreateApplicationKey-403-v1) * [409](https://docs.datadoghq.com/api/latest/key-management/#CreateApplicationKey-409-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#CreateApplicationKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An application key response. Field Type Description application_key object An application key with its associated metadata. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "application_key": { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "app_key", "owner": "test_user" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Create an application key Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/application_key" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create an application key ``` """ Create an application key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi from datadog_api_client.v1.model.application_key import ApplicationKey body = ApplicationKey( name="example user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.create_application_key(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an application key ``` # Create an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new body = DatadogAPIClient::V1::ApplicationKey.new({ name: "example user", }) p api_instance.create_application_key(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an application key ``` // Create an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ApplicationKey{ Name: datadog.PtrString("example user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := 
api.CreateApplicationKey(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.CreateApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.CreateApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an application key ``` // Create an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApplicationKey; import com.datadog.api.client.v1.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApplicationKey body = new ApplicationKey().name("example user"); try { ApplicationKeyResponse result = apiInstance.createApplicationKey(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#createApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an application key ``` // Create an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV1::model::ApplicationKey; #[tokio::main] async fn main() { let body = ApplicationKey::new().name("example user".to_string()); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.create_application_key(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an application key ``` /** * Create an application key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiCreateApplicationKeyRequest = { body: { name: "example user", }, }; apiInstance .createApplicationKey(params) .then((data: v1.ApplicationKeyResponse) => { 
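// data.application_key contains the new key's hash, name, and owner (see the response model above)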
console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an application key](https://docs.datadoghq.com/api/latest/key-management/#get-an-application-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#get-an-application-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#get-an-application-key-v2) GET https://api.ap1.datadoghq.com/api/v1/application_key/{key}https://api.ap2.datadoghq.com/api/v1/application_key/{key}https://api.datadoghq.eu/api/v1/application_key/{key}https://api.ddog-gov.com/api/v1/application_key/{key}https://api.datadoghq.com/api/v1/application_key/{key}https://api.us3.datadoghq.com/api/v1/application_key/{key}https://api.us5.datadoghq.com/api/v1/application_key/{key} ### Overview Get a given application key. This endpoint is disabled for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires any of the following permissions: * `org_app_keys_read` * `user_app_keys` ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific APP key you are working with. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-200-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-404-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An application key response. Field Type Description application_key object An application key with its associated metadata. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "application_key": { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "app_key", "owner": "test_user" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get an application key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/application_key/${key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an application key ``` """ Get an application key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.get_application_key( key="key", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an application key ``` # Get an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.get_application_key("key") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an application key ``` // Get an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.GetApplicationKey(ctx, "key") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.GetApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.GetApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
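# Set DD_SITE to exactly one Datadog site for your organization (for example
# datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu,
# ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com), and supply a valid
# DD_API_KEY and DD_APP_KEY before running the command below.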
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an application key ``` // Get an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApplicationKeyResponse result = apiInstance.getApplicationKey("key"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#getApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an application key ``` // Get an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.get_application_key("key".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an application key ``` /** * Get an application key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiGetApplicationKeyRequest = { key: "key", }; apiInstance .getApplicationKey(params) .then((data: v1.ApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/application_keys/{app_key_id} ### Overview Get an application key for your org. This endpoint requires the `org_app_keys_read` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. #### Query Strings Name Type Description include string Resource path for related resources to include in the response. Only `owned_by` is supported. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#GetApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. 
verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. 
Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Get an application key Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/application_keys/${app_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an application key ``` """ Get an application key returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.get_application_key( app_key_id=APPLICATION_KEY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an application key ``` # Get an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] p api_instance.get_application_key(APPLICATION_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an application key ``` // Get an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.GetApplicationKey(ctx, ApplicationKeyDataID, *datadogV2.NewGetApplicationKeyOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.GetApplicationKey`: %v\n", err) 
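// err is non-nil for the 400, 403, 404, and 429 responses documented above;
// r holds the raw *http.Response if the status code or headers are needed.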
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.GetApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an application key ``` // Get an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import com.datadog.api.client.v2.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); try { ApplicationKeyResponse result = apiInstance.getApplicationKey(APPLICATION_KEY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#getApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an application key ``` // Get an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::GetApplicationKeyOptionalParams; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .get_application_key( application_key_data_id.clone(), GetApplicationKeyOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an application key ``` /** * Get an application key returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: 
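// appKeyId is required; depending on the client version, the request type also
// accepts the optional `include` query string documented above (only "owned_by"
// is supported) to return the owning user under `included`.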
v2.KeyManagementApiGetApplicationKeyRequest = { appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .getApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an application key](https://docs.datadoghq.com/api/latest/key-management/#edit-an-application-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#edit-an-application-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#edit-an-application-key-v2) PUT https://api.ap1.datadoghq.com/api/v1/application_key/{key}https://api.ap2.datadoghq.com/api/v1/application_key/{key}https://api.datadoghq.eu/api/v1/application_key/{key}https://api.ddog-gov.com/api/v1/application_key/{key}https://api.datadoghq.com/api/v1/application_key/{key}https://api.us3.datadoghq.com/api/v1/application_key/{key}https://api.us5.datadoghq.com/api/v1/application_key/{key} ### Overview Edit an application key name. This endpoint is disabled for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires any of the following permissions: * `org_app_keys_write` * `user_app_keys` ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific APP key you are working with. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Expand All Field Type Description hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "name": "example user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-200-v1) * [400](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-400-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-404-v1) * [409](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-409-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An application key response. Field Type Description application_key object An application key with its associated metadata. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "application_key": { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "app_key", "owner": "test_user" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Edit an application key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/application_key/${key}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Edit an application key ``` """ Edit an application key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi from datadog_api_client.v1.model.application_key import ApplicationKey body = ApplicationKey( name="example user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.update_application_key(key="key", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an application key ``` # Edit an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new body = DatadogAPIClient::V1::ApplicationKey.new({ name: "example user", }) p 
api_instance.update_application_key("key", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an application key ``` // Edit an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ApplicationKey{ Name: datadog.PtrString("example user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.UpdateApplicationKey(ctx, "key", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.UpdateApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.UpdateApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an application key ``` // Edit an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApplicationKey; import com.datadog.api.client.v1.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); ApplicationKey body = new ApplicationKey().name("example user"); try { ApplicationKeyResponse result = apiInstance.updateApplicationKey("key", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#updateApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an application key ``` // Edit an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV1::model::ApplicationKey; #[tokio::main] async fn main() { let body = ApplicationKey::new().name("example user".to_string()); let configuration = datadog::Configuration::new(); let api = 
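// The configuration created above is expected to read DD_API_KEY, DD_APP_KEY,
// and DD_SITE from the environment, as set in the run instructions below.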
KeyManagementAPI::with_config(configuration); let resp = api.update_application_key("key".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an application key ``` /** * Edit an application key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiUpdateApplicationKeyRequest = { body: { name: "example user", }, key: "key", }; apiInstance .updateApplicationKey(params) .then((data: v1.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/application_keys/{app_key_id} ### Overview Edit an application key This endpoint requires the `org_app_keys_write` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Field Type Description data [_required_] object Object used to update an application key. attributes [_required_] object Attributes used to update an application Key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id [_required_] string ID of the application key. type [_required_] enum Application Keys resource type. 
Allowed enum values: `application_keys` default: `application_keys` ``` { "data": { "id": "string", "type": "application_keys", "attributes": { "name": "Application Key for managing dashboards-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-200-v2) * [400](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#UpdateApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. 
Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Edit an application key returns "OK" response Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/application_keys/${app_key_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "string", "type": "application_keys", "attributes": { "name": "Application Key for managing dashboards-updated" } } } EOF ``` ##### Edit an application key returns "OK" response ``` // Edit an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") body := datadogV2.ApplicationKeyUpdateRequest{ Data: datadogV2.ApplicationKeyUpdateData{ Id: ApplicationKeyDataID, Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS, Attributes: datadogV2.ApplicationKeyUpdateAttributes{ Name: datadog.PtrString("Application Key for managing dashboards-updated"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) resp, r, err := api.UpdateApplicationKey(ctx, ApplicationKeyDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.UpdateApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.UpdateApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an application key returns "OK" response ``` // Edit an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; import 
com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeyUpdateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyUpdateData; import com.datadog.api.client.v2.model.ApplicationKeyUpdateRequest; import com.datadog.api.client.v2.model.ApplicationKeysType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ATTRIBUTES_NAME = System.getenv("APPLICATION_KEY_DATA_ATTRIBUTES_NAME"); String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); ApplicationKeyUpdateRequest body = new ApplicationKeyUpdateRequest() .data( new ApplicationKeyUpdateData() .id(APPLICATION_KEY_DATA_ID) .type(ApplicationKeysType.APPLICATION_KEYS) .attributes( new ApplicationKeyUpdateAttributes() .name("Application Key for managing dashboards-updated"))); try { ApplicationKeyResponse result = apiInstance.updateApplicationKey(APPLICATION_KEY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#updateApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an application key returns "OK" response ``` """ Edit an application key returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi from datadog_api_client.v2.model.application_key_update_attributes import ApplicationKeyUpdateAttributes from datadog_api_client.v2.model.application_key_update_data import ApplicationKeyUpdateData from datadog_api_client.v2.model.application_key_update_request import ApplicationKeyUpdateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ATTRIBUTES_NAME = environ["APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] body = ApplicationKeyUpdateRequest( data=ApplicationKeyUpdateData( id=APPLICATION_KEY_DATA_ID, type=ApplicationKeysType.APPLICATION_KEYS, attributes=ApplicationKeyUpdateAttributes( name="Application Key for managing dashboards-updated", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.update_application_key(app_key_id=APPLICATION_KEY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an 
application key returns "OK" response ``` # Edit an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ATTRIBUTES_NAME = ENV["APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] body = DatadogAPIClient::V2::ApplicationKeyUpdateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyUpdateData.new({ id: APPLICATION_KEY_DATA_ID, type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, attributes: DatadogAPIClient::V2::ApplicationKeyUpdateAttributes.new({ name: "Application Key for managing dashboards-updated", }), }), }) p api_instance.update_application_key(APPLICATION_KEY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an application key returns "OK" response ``` // Edit an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateData; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let body = ApplicationKeyUpdateRequest::new(ApplicationKeyUpdateData::new( ApplicationKeyUpdateAttributes::new() .name("Application Key for managing dashboards-updated".to_string()), application_key_data_id.clone(), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .update_application_key(application_key_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an application key returns "OK" response ``` /** * Edit an application key returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: v2.KeyManagementApiUpdateApplicationKeyRequest = { body: { data: { id: APPLICATION_KEY_DATA_ID, type: "application_keys", attributes: { name: "Application Key for managing dashboards-updated", }, }, }, appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .updateApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an application key](https://docs.datadoghq.com/api/latest/key-management/#delete-an-application-key) * [v1](https://docs.datadoghq.com/api/latest/key-management/#delete-an-application-key-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/key-management/#delete-an-application-key-v2) DELETE https://api.ap1.datadoghq.com/api/v1/application_key/{key}https://api.ap2.datadoghq.com/api/v1/application_key/{key}https://api.datadoghq.eu/api/v1/application_key/{key}https://api.ddog-gov.com/api/v1/application_key/{key}https://api.datadoghq.com/api/v1/application_key/{key}https://api.us3.datadoghq.com/api/v1/application_key/{key}https://api.us5.datadoghq.com/api/v1/application_key/{key} ### Overview Delete a given application key. This endpoint is disabled for organizations in [One-Time Read mode](https://docs.datadoghq.com/account_management/api-app-keys/#one-time-read-mode). This endpoint requires any of the following permissions: * `org_app_keys_write` * `user_app_keys` ### Arguments #### Path Parameters Name Type Description key [_required_] string The specific APP key you are working with. ### Response * [200](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-200-v1) * [403](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-403-v1) * [404](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-404-v1) * [429](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) An application key response. Field Type Description application_key object An application key with its associated metadata. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. ``` { "application_key": { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "app_key", "owner": "test_user" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Delete an application key Copy ``` # Path parameters export key="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/application_key/${key}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an application key ``` """ Delete an application key returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.key_management_api import KeyManagementApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) response = api_instance.delete_application_key( key="key", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an application key ``` # Delete an application key returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::KeyManagementAPI.new p api_instance.delete_application_key("key") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an application key ``` // Delete an application key returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewKeyManagementApi(apiClient) resp, r, err := api.DeleteApplicationKey(ctx, "key") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.DeleteApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `KeyManagementApi.DeleteApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
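# Use a single DD_SITE value (for example datadoghq.com) together with valid
# API and application keys when running this example.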
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an application key ``` // Delete an application key returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.KeyManagementApi; import com.datadog.api.client.v1.model.ApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); try { ApplicationKeyResponse result = apiInstance.deleteApplicationKey("key"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#deleteApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an application key ``` // Delete an application key returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api.delete_application_key("key".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an application key ``` /** * Delete an application key returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.KeyManagementApi(configuration); const params: v1.KeyManagementApiDeleteApplicationKeyRequest = { key: "key", }; apiInstance .deleteApplicationKey(params) .then((data: v1.ApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` DELETE https://api.ap1.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/application_keys/{app_key_id} ### Overview Delete an application key This endpoint requires the `org_app_keys_write` permission. ### Arguments #### Path Parameters Name Type Description app_key_id [_required_] string The ID of the application key. ### Response * [204](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-204-v2) * [403](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/key-management/#DeleteApplicationKey-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/key-management/) * [Example](https://docs.datadoghq.com/api/latest/key-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/key-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/key-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/key-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/key-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/key-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/key-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/key-management/?code-lang=typescript) ##### Delete an application key Copy ``` # Path parameters export app_key_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/application_keys/${app_key_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an application key ``` """ Delete an application key returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.key_management_api import KeyManagementApi # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = environ["APPLICATION_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = KeyManagementApi(api_client) api_instance.delete_application_key( app_key_id=APPLICATION_KEY_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an application key ``` # Delete an application key returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::KeyManagementAPI.new # there is a valid "application_key" in the system APPLICATION_KEY_DATA_ID = ENV["APPLICATION_KEY_DATA_ID"] api_instance.delete_application_key(APPLICATION_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an application key ``` // Delete an application key returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "application_key" in the system ApplicationKeyDataID := os.Getenv("APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewKeyManagementApi(apiClient) r, err := api.DeleteApplicationKey(ctx, ApplicationKeyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `KeyManagementApi.DeleteApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an application key ``` // Delete an application key returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.KeyManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); KeyManagementApi apiInstance = new KeyManagementApi(defaultClient); // there is a valid "application_key" in the system String APPLICATION_KEY_DATA_ID = System.getenv("APPLICATION_KEY_DATA_ID"); try { apiInstance.deleteApplicationKey(APPLICATION_KEY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling KeyManagementApi#deleteApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an application key ``` // Delete an application key returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_key_management::KeyManagementAPI; #[tokio::main] async fn main() { // there is a valid "application_key" in the system let application_key_data_id = std::env::var("APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = KeyManagementAPI::with_config(configuration); let resp = api .delete_application_key(application_key_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an application key ``` /** * Delete an application key returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.KeyManagementApi(configuration); // there is a valid "application_key" in the system const APPLICATION_KEY_DATA_ID = process.env.APPLICATION_KEY_DATA_ID as string; const params: v2.KeyManagementApiDeleteApplicationKeyRequest = { appKeyId: APPLICATION_KEY_DATA_ID, }; apiInstance .deleteApplicationKey(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=ca59bb50-6422-4256-ae9f-6f0afdc31ea6&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=f1a49637-8b9d-4f98-a01a-4a47b32ad73f&pt=Key%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fkey-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=ca59bb50-6422-4256-ae9f-6f0afdc31ea6&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=f1a49637-8b9d-4f98-a01a-4a47b32ad73f&pt=Key%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fkey-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=aeacf556-316d-4de3-9a6a-2766ee3cf09d&bo=2&sid=b77a5fe0f0bf11f0bab0291dd4b5f9fd&vid=b77a7130f0bf11f0b69eb7008fe8e23b&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Key%20Management&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fkey-management%2F&r=<=3241&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=557113) --- # Source: https://docs.datadoghq.com/api/latest/logs-archives/ # Logs Archives Archives forward all the logs ingested to a cloud storage system. See the [Archives Page](https://app.datadoghq.com/logs/pipelines/archives) for a list of the archives currently configured in Datadog. ## [Get all archives](https://docs.datadoghq.com/api/latest/logs-archives/#get-all-archives) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#get-all-archives-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/archiveshttps://api.ap2.datadoghq.com/api/v2/logs/config/archiveshttps://api.datadoghq.eu/api/v2/logs/config/archiveshttps://api.ddog-gov.com/api/v2/logs/config/archiveshttps://api.datadoghq.com/api/v2/logs/config/archiveshttps://api.us3.datadoghq.com/api/v2/logs/config/archiveshttps://api.us5.datadoghq.com/api/v2/logs/config/archives ### Overview Get the list of configured logs archives with their definitions. This endpoint requires the `logs_read_archives` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#ListLogsArchives-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#ListLogsArchives-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#ListLogsArchives-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) The available archives. Field Type Description data [object] A list of archives. attributes object The attributes associated with the archive. destination [_required_] object An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. 
integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. account_id [_required_] string The account ID for the integration. role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. state enum The state of the archive. Allowed enum values: `UNKNOWN,WORKING,FAILING,WORKING_AUTH_LEGACY` id string The archive ID. type [_required_] string The type of the resource. The value should always be archives. default: `archives` ``` { "data": [ { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ], "state": "WORKING" }, "id": "a2zcMylnM4OCHpYusxIi3g", "type": "archives" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Get all archives Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all archives ``` """ Get all archives returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.list_logs_archives() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all archives ``` # Get all archives returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new p api_instance.list_logs_archives() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all archives ``` // Get all archives returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.ListLogsArchives(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.ListLogsArchives`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.ListLogsArchives`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all archives ``` // Get all 
archives returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchives; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); try { LogsArchives result = apiInstance.listLogsArchives(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#listLogsArchives"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all archives ``` // Get all archives returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.list_logs_archives().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all archives ``` /** * Get all archives returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); apiInstance .listLogsArchives() .then((data: v2.LogsArchives) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an archive](https://docs.datadoghq.com/api/latest/logs-archives/#create-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#create-an-archive-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/config/archiveshttps://api.ap2.datadoghq.com/api/v2/logs/config/archiveshttps://api.datadoghq.eu/api/v2/logs/config/archiveshttps://api.ddog-gov.com/api/v2/logs/config/archiveshttps://api.datadoghq.com/api/v2/logs/config/archiveshttps://api.us3.datadoghq.com/api/v2/logs/config/archiveshttps://api.us5.datadoghq.com/api/v2/logs/config/archives ### Overview Create an archive in your organization. This endpoint requires the `logs_write_archives` permission. 
### Request #### Body Data (required) The definition of the new archive. * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Field Type Description data object The definition of an archive. attributes object The attributes associated with the archive. destination [_required_] An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. account_id [_required_] string The account ID for the integration. role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. type [_required_] string The type of the resource. The value should always be archives. 
default: `archives` ``` { "data": { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ] }, "type": "archives" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#CreateLogsArchive-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#CreateLogsArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#CreateLogsArchive-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#CreateLogsArchive-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) The logs archive. Field Type Description data object The definition of an archive. attributes object The attributes associated with the archive. destination [_required_] object An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. account_id [_required_] string The account ID for the integration. role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. state enum The state of the archive. 
Allowed enum values: `UNKNOWN,WORKING,FAILING,WORKING_AUTH_LEGACY` id string The archive ID. type [_required_] string The type of the resource. The value should always be archives. default: `archives` ``` { "data": { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ], "state": "WORKING" }, "id": "a2zcMylnM4OCHpYusxIi3g", "type": "archives" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Create an archive Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "destination": { "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" } }, "name": "Nginx Archive", "query": "source:nginx" }, "type": "archives" } } EOF ``` ##### Create an archive ``` """ Create an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi from datadog_api_client.v2.model.logs_archive_create_request import LogsArchiveCreateRequest from datadog_api_client.v2.model.logs_archive_create_request_attributes import LogsArchiveCreateRequestAttributes from datadog_api_client.v2.model.logs_archive_create_request_definition import LogsArchiveCreateRequestDefinition from datadog_api_client.v2.model.logs_archive_destination_azure import LogsArchiveDestinationAzure from 
datadog_api_client.v2.model.logs_archive_destination_azure_type import LogsArchiveDestinationAzureType from datadog_api_client.v2.model.logs_archive_integration_azure import LogsArchiveIntegrationAzure body = LogsArchiveCreateRequest( data=LogsArchiveCreateRequestDefinition( attributes=LogsArchiveCreateRequestAttributes( destination=LogsArchiveDestinationAzure( container="container-name", integration=LogsArchiveIntegrationAzure( client_id="aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenant_id="aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", ), storage_account="account-name", type=LogsArchiveDestinationAzureType.AZURE, ), include_tags=False, name="Nginx Archive", query="source:nginx", rehydration_max_scan_size_in_gb=100, rehydration_tags=[ "team:intake", "team:app", ], ), type="archives", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.create_logs_archive(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an archive ``` # Create an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new body = DatadogAPIClient::V2::LogsArchiveCreateRequest.new({ data: DatadogAPIClient::V2::LogsArchiveCreateRequestDefinition.new({ attributes: DatadogAPIClient::V2::LogsArchiveCreateRequestAttributes.new({ destination: DatadogAPIClient::V2::LogsArchiveDestinationAzure.new({ container: "container-name", integration: DatadogAPIClient::V2::LogsArchiveIntegrationAzure.new({ client_id: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenant_id: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }), storage_account: "account-name", type: DatadogAPIClient::V2::LogsArchiveDestinationAzureType::AZURE, }), include_tags: false, name: "Nginx Archive", query: "source:nginx", rehydration_max_scan_size_in_gb: 100, rehydration_tags: [ "team:intake", "team:app", ], }), type: "archives", }), }) p api_instance.create_logs_archive(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an archive ``` // Create an archive returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsArchiveCreateRequest{ Data: &datadogV2.LogsArchiveCreateRequestDefinition{ Attributes: &datadogV2.LogsArchiveCreateRequestAttributes{ Destination: datadogV2.LogsArchiveCreateRequestDestination{ LogsArchiveDestinationAzure: &datadogV2.LogsArchiveDestinationAzure{ Container: "container-name", Integration: datadogV2.LogsArchiveIntegrationAzure{ ClientId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", TenantId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }, StorageAccount: "account-name", Type: datadogV2.LOGSARCHIVEDESTINATIONAZURETYPE_AZURE, }}, IncludeTags: datadog.PtrBool(false), 
Name: "Nginx Archive", Query: "source:nginx", RehydrationMaxScanSizeInGb: *datadog.NewNullableInt64(datadog.PtrInt64(100)), RehydrationTags: []string{ "team:intake", "team:app", }, }, Type: "archives", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.CreateLogsArchive(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.CreateLogsArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.CreateLogsArchive`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an archive ``` // Create an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchive; import com.datadog.api.client.v2.model.LogsArchiveCreateRequest; import com.datadog.api.client.v2.model.LogsArchiveCreateRequestAttributes; import com.datadog.api.client.v2.model.LogsArchiveCreateRequestDefinition; import com.datadog.api.client.v2.model.LogsArchiveCreateRequestDestination; import com.datadog.api.client.v2.model.LogsArchiveDestinationAzure; import com.datadog.api.client.v2.model.LogsArchiveDestinationAzureType; import com.datadog.api.client.v2.model.LogsArchiveIntegrationAzure; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); LogsArchiveCreateRequest body = new LogsArchiveCreateRequest() .data( new LogsArchiveCreateRequestDefinition() .attributes( new LogsArchiveCreateRequestAttributes() .destination( new LogsArchiveCreateRequestDestination( new LogsArchiveDestinationAzure() .container("container-name") .integration( new LogsArchiveIntegrationAzure() .clientId("aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa") .tenantId("aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa")) .storageAccount("account-name") .type(LogsArchiveDestinationAzureType.AZURE))) .includeTags(false) .name("Nginx Archive") .query("source:nginx") .rehydrationMaxScanSizeInGb(100L) .rehydrationTags(Arrays.asList("team:intake", "team:app"))) .type("archives")); try { LogsArchive result = apiInstance.createLogsArchive(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#createLogsArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" 
java "Example.java" ``` ##### Create an archive ``` // Create an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequest; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestAttributes; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestDefinition; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestDestination; use datadog_api_client::datadogV2::model::LogsArchiveDestinationAzure; use datadog_api_client::datadogV2::model::LogsArchiveDestinationAzureType; use datadog_api_client::datadogV2::model::LogsArchiveIntegrationAzure; #[tokio::main] async fn main() { let body = LogsArchiveCreateRequest::new().data( LogsArchiveCreateRequestDefinition::new("archives".to_string()).attributes( LogsArchiveCreateRequestAttributes::new( LogsArchiveCreateRequestDestination::LogsArchiveDestinationAzure(Box::new( LogsArchiveDestinationAzure::new( "container-name".to_string(), LogsArchiveIntegrationAzure::new( "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa".to_string(), "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa".to_string(), ), "account-name".to_string(), LogsArchiveDestinationAzureType::AZURE, ), )), "Nginx Archive".to_string(), "source:nginx".to_string(), ) .include_tags(false) .rehydration_max_scan_size_in_gb(Some(100)) .rehydration_tags(vec!["team:intake".to_string(), "team:app".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.create_logs_archive(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an archive ``` /** * Create an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiCreateLogsArchiveRequest = { body: { data: { attributes: { destination: { container: "container-name", integration: { clientId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenantId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }, storageAccount: "account-name", type: "azure", }, includeTags: false, name: "Nginx Archive", query: "source:nginx", rehydrationMaxScanSizeInGb: 100, rehydrationTags: ["team:intake", "team:app"], }, type: "archives", }, }, }; apiInstance .createLogsArchive(params) .then((data: v2.LogsArchive) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an archive](https://docs.datadoghq.com/api/latest/logs-archives/#get-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#get-an-archive-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}https://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id} ### Overview Get a specific archive from your organization. This endpoint requires the `logs_read_archives` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchive-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchive-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchive-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchive-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) The logs archive. Field Type Description data object The definition of an archive. attributes object The attributes associated with the archive. destination [_required_] object An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. 
account_id [_required_] string The account ID for the integration. role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. state enum The state of the archive. Allowed enum values: `UNKNOWN,WORKING,FAILING,WORKING_AUTH_LEGACY` id string The archive ID. type [_required_] string The type of the resource. The value should always be archives. default: `archives` ``` { "data": { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ], "state": "WORKING" }, "id": "a2zcMylnM4OCHpYusxIi3g", "type": "archives" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Get an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an archive ``` """ Get an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.get_logs_archive( archive_id="archive_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an archive ``` # Get an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new p api_instance.get_logs_archive("archive_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an archive ``` // Get an archive returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.GetLogsArchive(ctx, "archive_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.GetLogsArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.GetLogsArchive`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an archive ``` // Get an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchive; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); try { LogsArchive result = apiInstance.getLogsArchive("archive_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#getLogsArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an archive ``` // Get an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.get_logs_archive("archive_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an archive ``` /** * Get an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiGetLogsArchiveRequest = { archiveId: "archive_id", }; apiInstance .getLogsArchive(params) .then((data: v2.LogsArchive) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an archive](https://docs.datadoghq.com/api/latest/logs-archives/#update-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#update-an-archive-v2) PUT https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}https://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id} ### Overview Update a given archive configuration. **Note** : Using this method updates your archive configuration by **replacing** your current configuration with the new one sent to your Datadog organization. This endpoint requires the `logs_write_archives` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Request #### Body Data (required) New definition of the archive. * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Field Type Description data object The definition of an archive. attributes object The attributes associated with the archive. destination [_required_] An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. account_id [_required_] string The account ID for the integration. role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. 
Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. type [_required_] string The type of the resource. The value should always be archives. default: `archives` ``` { "data": { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ] }, "type": "archives" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchive-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchive-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchive-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchive-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) The logs archive. Field Type Description data object The definition of an archive. attributes object The attributes associated with the archive. destination [_required_] object An archive's destination. Option 1 object The Azure archive destination. container [_required_] string The container where the archive will be stored. integration [_required_] object The Azure archive's integration destination. client_id [_required_] string A client ID. tenant_id [_required_] string A tenant ID. path string The archive path. region string The region where the archive will be stored. storage_account [_required_] string The associated storage account. type [_required_] enum Type of the Azure archive destination. Allowed enum values: `azure` default: `azure` Option 2 object The GCS archive destination. bucket [_required_] string The bucket where the archive will be stored. integration [_required_] object The GCS archive's integration destination. client_email [_required_] string A client email. project_id string A project ID. path string The archive path. type [_required_] enum Type of the GCS archive destination. Allowed enum values: `gcs` default: `gcs` Option 3 object The S3 archive destination. bucket [_required_] string The bucket where the archive will be stored. encryption object The S3 encryption settings. key string An Amazon Resource Name (ARN) used to identify an AWS KMS key. type [_required_] enum Type of S3 encryption for a destination. Allowed enum values: `NO_OVERRIDE,SSE_S3,SSE_KMS` integration [_required_] object The S3 Archive's integration destination. account_id [_required_] string The account ID for the integration. 
role_name [_required_] string The path of the integration. path string The archive path. storage_class enum The storage class where the archive will be stored. Allowed enum values: `STANDARD,STANDARD_IA,ONEZONE_IA,INTELLIGENT_TIERING,GLACIER_IR` default: `STANDARD` type [_required_] enum Type of the S3 archive destination. Allowed enum values: `s3` default: `s3` include_tags boolean To store the tags in the archive, set the value "true". If it is set to "false", the tags will be deleted when the logs are sent to the archive. name [_required_] string The archive name. query [_required_] string The archive query/filter. Logs matching this query are included in the archive. rehydration_max_scan_size_in_gb int64 Maximum scan size for rehydration from this archive. rehydration_tags [string] An array of tags to add to rehydrated logs from an archive. state enum The state of the archive. Allowed enum values: `UNKNOWN,WORKING,FAILING,WORKING_AUTH_LEGACY` id string The archive ID. type [_required_] string The type of the resource. The value should always be archives. default: `archives` ``` { "data": { "attributes": { "destination": { "container": "container-name", "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" }, "path": "string", "region": "string", "storage_account": "account-name", "type": "azure" }, "include_tags": false, "name": "Nginx Archive", "query": "source:nginx", "rehydration_max_scan_size_in_gb": 100, "rehydration_tags": [ "team:intake", "team:app" ], "state": "WORKING" }, "id": "a2zcMylnM4OCHpYusxIi3g", "type": "archives" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Update an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "destination": { "integration": { "client_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", "tenant_id": "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa" } }, "name": "Nginx Archive", "query": "source:nginx" }, "type": "archives" } } EOF ``` ##### Update an archive ``` """ Update an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi from datadog_api_client.v2.model.logs_archive_create_request import LogsArchiveCreateRequest from datadog_api_client.v2.model.logs_archive_create_request_attributes import LogsArchiveCreateRequestAttributes from datadog_api_client.v2.model.logs_archive_create_request_definition import LogsArchiveCreateRequestDefinition from datadog_api_client.v2.model.logs_archive_destination_azure import LogsArchiveDestinationAzure from datadog_api_client.v2.model.logs_archive_destination_azure_type import LogsArchiveDestinationAzureType from datadog_api_client.v2.model.logs_archive_integration_azure import LogsArchiveIntegrationAzure body = LogsArchiveCreateRequest( data=LogsArchiveCreateRequestDefinition( attributes=LogsArchiveCreateRequestAttributes( destination=LogsArchiveDestinationAzure( container="container-name", integration=LogsArchiveIntegrationAzure( client_id="aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenant_id="aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", ), storage_account="account-name", type=LogsArchiveDestinationAzureType.AZURE, ), include_tags=False, name="Nginx Archive", query="source:nginx", rehydration_max_scan_size_in_gb=100, rehydration_tags=[ "team:intake", "team:app", ], ), type="archives", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.update_logs_archive(archive_id="archive_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an archive ``` # Update an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new body 
= DatadogAPIClient::V2::LogsArchiveCreateRequest.new({ data: DatadogAPIClient::V2::LogsArchiveCreateRequestDefinition.new({ attributes: DatadogAPIClient::V2::LogsArchiveCreateRequestAttributes.new({ destination: DatadogAPIClient::V2::LogsArchiveDestinationAzure.new({ container: "container-name", integration: DatadogAPIClient::V2::LogsArchiveIntegrationAzure.new({ client_id: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenant_id: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }), storage_account: "account-name", type: DatadogAPIClient::V2::LogsArchiveDestinationAzureType::AZURE, }), include_tags: false, name: "Nginx Archive", query: "source:nginx", rehydration_max_scan_size_in_gb: 100, rehydration_tags: [ "team:intake", "team:app", ], }), type: "archives", }), }) p api_instance.update_logs_archive("archive_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an archive ``` // Update an archive returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsArchiveCreateRequest{ Data: &datadogV2.LogsArchiveCreateRequestDefinition{ Attributes: &datadogV2.LogsArchiveCreateRequestAttributes{ Destination: datadogV2.LogsArchiveCreateRequestDestination{ LogsArchiveDestinationAzure: &datadogV2.LogsArchiveDestinationAzure{ Container: "container-name", Integration: datadogV2.LogsArchiveIntegrationAzure{ ClientId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", TenantId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }, StorageAccount: "account-name", Type: datadogV2.LOGSARCHIVEDESTINATIONAZURETYPE_AZURE, }}, IncludeTags: datadog.PtrBool(false), Name: "Nginx Archive", Query: "source:nginx", RehydrationMaxScanSizeInGb: *datadog.NewNullableInt64(datadog.PtrInt64(100)), RehydrationTags: []string{ "team:intake", "team:app", }, }, Type: "archives", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.UpdateLogsArchive(ctx, "archive_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.UpdateLogsArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.UpdateLogsArchive`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an archive ``` // Update an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchive; import com.datadog.api.client.v2.model.LogsArchiveCreateRequest; import 
com.datadog.api.client.v2.model.LogsArchiveCreateRequestAttributes; import com.datadog.api.client.v2.model.LogsArchiveCreateRequestDefinition; import com.datadog.api.client.v2.model.LogsArchiveCreateRequestDestination; import com.datadog.api.client.v2.model.LogsArchiveDestinationAzure; import com.datadog.api.client.v2.model.LogsArchiveDestinationAzureType; import com.datadog.api.client.v2.model.LogsArchiveIntegrationAzure; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); LogsArchiveCreateRequest body = new LogsArchiveCreateRequest() .data( new LogsArchiveCreateRequestDefinition() .attributes( new LogsArchiveCreateRequestAttributes() .destination( new LogsArchiveCreateRequestDestination( new LogsArchiveDestinationAzure() .container("container-name") .integration( new LogsArchiveIntegrationAzure() .clientId("aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa") .tenantId("aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa")) .storageAccount("account-name") .type(LogsArchiveDestinationAzureType.AZURE))) .includeTags(false) .name("Nginx Archive") .query("source:nginx") .rehydrationMaxScanSizeInGb(100L) .rehydrationTags(Arrays.asList("team:intake", "team:app"))) .type("archives")); try { LogsArchive result = apiInstance.updateLogsArchive("archive_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#updateLogsArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an archive ``` // Update an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequest; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestAttributes; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestDefinition; use datadog_api_client::datadogV2::model::LogsArchiveCreateRequestDestination; use datadog_api_client::datadogV2::model::LogsArchiveDestinationAzure; use datadog_api_client::datadogV2::model::LogsArchiveDestinationAzureType; use datadog_api_client::datadogV2::model::LogsArchiveIntegrationAzure; #[tokio::main] async fn main() { let body = LogsArchiveCreateRequest::new().data( LogsArchiveCreateRequestDefinition::new("archives".to_string()).attributes( LogsArchiveCreateRequestAttributes::new( LogsArchiveCreateRequestDestination::LogsArchiveDestinationAzure(Box::new( LogsArchiveDestinationAzure::new( "container-name".to_string(), LogsArchiveIntegrationAzure::new( "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa".to_string(), "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa".to_string(), ), "account-name".to_string(), LogsArchiveDestinationAzureType::AZURE, ), )), "Nginx Archive".to_string(), "source:nginx".to_string(), ) .include_tags(false) .rehydration_max_scan_size_in_gb(Some(100)) .rehydration_tags(vec!["team:intake".to_string(), 
"team:app".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api .update_logs_archive("archive_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an archive ``` /** * Update an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiUpdateLogsArchiveRequest = { body: { data: { attributes: { destination: { container: "container-name", integration: { clientId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", tenantId: "aaaaaaaa-1a1a-1a1a-1a1a-aaaaaaaaaaaa", }, storageAccount: "account-name", type: "azure", }, includeTags: false, name: "Nginx Archive", query: "source:nginx", rehydrationMaxScanSizeInGb: 100, rehydrationTags: ["team:intake", "team:app"], }, type: "archives", }, }, archiveId: "archive_id", }; apiInstance .updateLogsArchive(params) .then((data: v2.LogsArchive) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an archive](https://docs.datadoghq.com/api/latest/logs-archives/#delete-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#delete-an-archive-v2) DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}https://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}https://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}https://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id} ### Overview Delete a given archive from your organization. This endpoint requires the `logs_write_archives` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Response * [204](https://docs.datadoghq.com/api/latest/logs-archives/#DeleteLogsArchive-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#DeleteLogsArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#DeleteLogsArchive-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#DeleteLogsArchive-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#DeleteLogsArchive-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Delete an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an archive ``` """ Delete an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) api_instance.delete_logs_archive( archive_id="archive_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an archive ``` # Delete an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new api_instance.delete_logs_archive("archive_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an archive ``` // Delete an archive returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
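// The datadog package handles site and credential configuration; datadogV2 contains the generated Logs Archives client.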
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) r, err := api.DeleteLogsArchive(ctx, "archive_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.DeleteLogsArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an archive ``` // Delete an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); try { apiInstance.deleteLogsArchive("archive_id"); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#deleteLogsArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an archive ``` // Delete an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.delete_logs_archive("archive_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an archive ``` /** * Delete an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiDeleteLogsArchiveRequest = { archiveId: "archive_id", }; apiInstance .deleteLogsArchive(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List read roles for an archive](https://docs.datadoghq.com/api/latest/logs-archives/#list-read-roles-for-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#list-read-roles-for-an-archive-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}/readershttps://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readers ### Overview Returns all read roles a given archive is restricted to. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#ListArchiveReadRoles-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#ListArchiveReadRoles-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#ListArchiveReadRoles-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#ListArchiveReadRoles-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#ListArchiveReadRoles-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Response containing information about multiple roles. Field Type Description data [object] Array of returned roles. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. 
``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "user_count": "integer" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### List read roles for an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}/readers" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List read roles for an archive ``` """ List read roles for an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.list_archive_read_roles( archive_id="archive_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List read roles for an archive ``` # List read roles for an archive returns "OK" response require "datadog_api_client" api_instance 
= DatadogAPIClient::V2::LogsArchivesAPI.new p api_instance.list_archive_read_roles("archive_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List read roles for an archive ``` // List read roles for an archive returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.ListArchiveReadRoles(ctx, "archive_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.ListArchiveReadRoles`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.ListArchiveReadRoles`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List read roles for an archive ``` // List read roles for an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.RolesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); try { RolesResponse result = apiInstance.listArchiveReadRoles("archive_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#listArchiveReadRoles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List read roles for an archive ``` // List read roles for an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.list_archive_read_roles("archive_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List read roles for an archive ``` /** * List read roles for an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiListArchiveReadRolesRequest = { archiveId: "archive_id", }; apiInstance .listArchiveReadRoles(params) .then((data: v2.RolesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Grant role to an archive](https://docs.datadoghq.com/api/latest/logs-archives/#grant-role-to-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#grant-role-to-an-archive-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}/readershttps://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readers ### Overview Adds a read role to an archive. ([Roles API](https://docs.datadoghq.com/api/v2/roles/)) This endpoint requires the `logs_write_archives` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Field Type Description data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/logs-archives/#AddReadRoleToArchive-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#AddReadRoleToArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#AddReadRoleToArchive-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#AddReadRoleToArchive-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#AddReadRoleToArchive-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Grant role to an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}/readers" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Grant role to an archive ``` """ Grant role to an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType body = RelationshipToRole( data=RelationshipToRoleData( id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type=RolesType.ROLES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) api_instance.add_read_role_to_archive(archive_id="archive_id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Grant role to an archive ``` # Grant role to an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new body = DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type: DatadogAPIClient::V2::RolesType::ROLES, }), }) 
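# The role ID above comes from the Roles API; a successful grant returns HTTP 204 with no response body.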
api_instance.add_read_role_to_archive("archive_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Grant role to an archive ``` // Grant role to an archive returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString("3653d3c6-0c75-11ea-ad28-fb5701eabc7d"), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) r, err := api.AddReadRoleToArchive(ctx, "archive_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.AddReadRoleToArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Grant role to an archive ``` // Grant role to an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); RelationshipToRole body = new RelationshipToRole() .data( new RelationshipToRoleData() .id("3653d3c6-0c75-11ea-ad28-fb5701eabc7d") .type(RolesType.ROLES)); try { apiInstance.addReadRoleToArchive("archive_id", body); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#addReadRoleToArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Grant role to an archive ``` // Grant role to an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { 
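// Build the relationship payload that identifies the role to be granted read access to the archive.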
let body = RelationshipToRole::new().data( RelationshipToRoleData::new() .id("3653d3c6-0c75-11ea-ad28-fb5701eabc7d".to_string()) .type_(RolesType::ROLES), ); let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api .add_read_role_to_archive("archive_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Grant role to an archive ``` /** * Grant role to an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiAddReadRoleToArchiveRequest = { body: { data: { id: "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type: "roles", }, }, archiveId: "archive_id", }; apiInstance .addReadRoleToArchive(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Revoke role from an archive](https://docs.datadoghq.com/api/latest/logs-archives/#revoke-role-from-an-archive) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#revoke-role-from-an-archive-v2) DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.ap2.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.eu/api/v2/logs/config/archives/{archive_id}/readershttps://api.ddog-gov.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us3.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readershttps://api.us5.datadoghq.com/api/v2/logs/config/archives/{archive_id}/readers ### Overview Removes a role from an archive. ([Roles API](https://docs.datadoghq.com/api/v2/roles/)) This endpoint requires the `logs_write_archives` permission. ### Arguments #### Path Parameters Name Type Description archive_id [_required_] string The ID of the archive. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Field Type Description data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/logs-archives/#RemoveRoleFromArchive-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#RemoveRoleFromArchive-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#RemoveRoleFromArchive-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-archives/#RemoveRoleFromArchive-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#RemoveRoleFromArchive-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Revoke role from an archive Copy ``` # Path parameters export archive_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archives/${archive_id}/readers" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Revoke role from an archive ``` """ Revoke role from an archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType body = RelationshipToRole( data=RelationshipToRoleData( id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type=RolesType.ROLES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) 
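# A successful revoke returns HTTP 204 (no content), so there is no response body to print.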
api_instance.remove_role_from_archive(archive_id="archive_id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Revoke role from an archive ``` # Revoke role from an archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new body = DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type: DatadogAPIClient::V2::RolesType::ROLES, }), }) api_instance.remove_role_from_archive("archive_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Revoke role from an archive ``` // Revoke role from an archive returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString("3653d3c6-0c75-11ea-ad28-fb5701eabc7d"), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) r, err := api.RemoveRoleFromArchive(ctx, "archive_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.RemoveRoleFromArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Revoke role from an archive ``` // Revoke role from an archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); RelationshipToRole body = new RelationshipToRole() .data( new RelationshipToRoleData() .id("3653d3c6-0c75-11ea-ad28-fb5701eabc7d") .type(RolesType.ROLES)); try { apiInstance.removeRoleFromArchive("archive_id", body); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#removeRoleFromArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); 
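      // A 403 typically indicates the keys lack the logs_write_archives permission; a 404 means the archive was not found.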
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Revoke role from an archive ``` // Revoke role from an archive returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { let body = RelationshipToRole::new().data( RelationshipToRoleData::new() .id("3653d3c6-0c75-11ea-ad28-fb5701eabc7d".to_string()) .type_(RolesType::ROLES), ); let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api .remove_role_from_archive("archive_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Revoke role from an archive ``` /** * Revoke role from an archive returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiRemoveRoleFromArchiveRequest = { body: { data: { id: "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", type: "roles", }, }, archiveId: "archive_id", }; apiInstance .removeRoleFromArchive(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get archive order](https://docs.datadoghq.com/api/latest/logs-archives/#get-archive-order) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#get-archive-order-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.ap2.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.datadoghq.eu/api/v2/logs/config/archive-orderhttps://api.ddog-gov.com/api/v2/logs/config/archive-orderhttps://api.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.us3.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.us5.datadoghq.com/api/v2/logs/config/archive-order ### Overview Get the current order of your archives. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. 
### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchiveOrder-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchiveOrder-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#GetLogsArchiveOrder-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) A ordered list of archive IDs. Field Type Description data object The definition of an archive order. attributes [_required_] object The attributes associated with the archive order. archive_ids [_required_] [string] An ordered array of `` strings, the order of archive IDs in the array define the overall archives order for Datadog. type [_required_] enum Type of the archive order definition. Allowed enum values: `archive_order` default: `archive_order` ``` { "data": { "attributes": { "archive_ids": [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g" ] }, "type": "archive_order" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Get archive order Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archive-order" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get archive order ``` """ Get archive order returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.get_logs_archive_order() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get archive order ``` # Get archive order returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new p api_instance.get_logs_archive_order() 
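# The response contains the archive IDs in their current processing order.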
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get archive order ``` // Get archive order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.GetLogsArchiveOrder(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.GetLogsArchiveOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.GetLogsArchiveOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get archive order ``` // Get archive order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchiveOrder; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); try { LogsArchiveOrder result = apiInstance.getLogsArchiveOrder(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#getLogsArchiveOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get archive order ``` // Get archive order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.get_logs_archive_order().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
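# Set DD_SITE to a single Datadog site (for example, datadoghq.com or datadoghq.eu); the concatenated value below lists every available site.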
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get archive order ``` /** * Get archive order returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); apiInstance .getLogsArchiveOrder() .then((data: v2.LogsArchiveOrder) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update archive order](https://docs.datadoghq.com/api/latest/logs-archives/#update-archive-order) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-archives/#update-archive-order-v2) PUT https://api.ap1.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.ap2.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.datadoghq.eu/api/v2/logs/config/archive-orderhttps://api.ddog-gov.com/api/v2/logs/config/archive-orderhttps://api.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.us3.datadoghq.com/api/v2/logs/config/archive-orderhttps://api.us5.datadoghq.com/api/v2/logs/config/archive-order ### Overview Update the order of your archives. Since logs are processed sequentially, reordering an archive may change the structure and content of the data processed by other archives. **Note** : Using the `PUT` method updates your archive’s order by replacing the current order with the new one. This endpoint requires the `logs_write_archives` permission. ### Request #### Body Data (required) An object containing the new ordered list of archive IDs. * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) Field Type Description data object The definition of an archive order. attributes [_required_] object The attributes associated with the archive order. archive_ids [_required_] [string] An ordered array of `` strings, the order of archive IDs in the array define the overall archives order for Datadog. type [_required_] enum Type of the archive order definition. Allowed enum values: `archive_order` default: `archive_order` ``` { "data": { "attributes": { "archive_ids": [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g" ] }, "type": "archive_order" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchiveOrder-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchiveOrder-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchiveOrder-403-v2) * [422](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchiveOrder-422-v2) * [429](https://docs.datadoghq.com/api/latest/logs-archives/#UpdateLogsArchiveOrder-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) A ordered list of archive IDs. Field Type Description data object The definition of an archive order. 
attributes [_required_] object The attributes associated with the archive order. archive_ids [_required_] [string] An ordered array of `` strings, the order of archive IDs in the array define the overall archives order for Datadog. type [_required_] enum Type of the archive order definition. Allowed enum values: `archive_order` default: `archive_order` ``` { "data": { "attributes": { "archive_ids": [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g" ] }, "type": "archive_order" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-archives/) * [Example](https://docs.datadoghq.com/api/latest/logs-archives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-archives/?code-lang=typescript) ##### Update archive order Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/archive-order" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "archive_ids": [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g" ] }, "type": "archive_order" } } EOF ``` ##### Update archive order ``` """ Update archive order returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_archives_api import LogsArchivesApi from datadog_api_client.v2.model.logs_archive_order import LogsArchiveOrder from datadog_api_client.v2.model.logs_archive_order_attributes import LogsArchiveOrderAttributes from datadog_api_client.v2.model.logs_archive_order_definition import LogsArchiveOrderDefinition from datadog_api_client.v2.model.logs_archive_order_definition_type import LogsArchiveOrderDefinitionType body = LogsArchiveOrder( data=LogsArchiveOrderDefinition( 
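        # The new order fully replaces the current one; list archive IDs in the desired processing order.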
attributes=LogsArchiveOrderAttributes( archive_ids=[ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g", ], ), type=LogsArchiveOrderDefinitionType.ARCHIVE_ORDER, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsArchivesApi(api_client) response = api_instance.update_logs_archive_order(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update archive order ``` # Update archive order returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsArchivesAPI.new body = DatadogAPIClient::V2::LogsArchiveOrder.new({ data: DatadogAPIClient::V2::LogsArchiveOrderDefinition.new({ attributes: DatadogAPIClient::V2::LogsArchiveOrderAttributes.new({ archive_ids: [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g", ], }), type: DatadogAPIClient::V2::LogsArchiveOrderDefinitionType::ARCHIVE_ORDER, }), }) p api_instance.update_logs_archive_order(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update archive order ``` // Update archive order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsArchiveOrder{ Data: &datadogV2.LogsArchiveOrderDefinition{ Attributes: datadogV2.LogsArchiveOrderAttributes{ ArchiveIds: []string{ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g", }, }, Type: datadogV2.LOGSARCHIVEORDERDEFINITIONTYPE_ARCHIVE_ORDER, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsArchivesApi(apiClient) resp, r, err := api.UpdateLogsArchiveOrder(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsArchivesApi.UpdateLogsArchiveOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsArchivesApi.UpdateLogsArchiveOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update archive order ``` // Update archive order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsArchivesApi; import com.datadog.api.client.v2.model.LogsArchiveOrder; import 
com.datadog.api.client.v2.model.LogsArchiveOrderAttributes; import com.datadog.api.client.v2.model.LogsArchiveOrderDefinition; import com.datadog.api.client.v2.model.LogsArchiveOrderDefinitionType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsArchivesApi apiInstance = new LogsArchivesApi(defaultClient); LogsArchiveOrder body = new LogsArchiveOrder() .data( new LogsArchiveOrderDefinition() .attributes( new LogsArchiveOrderAttributes() .archiveIds( Arrays.asList( "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g"))) .type(LogsArchiveOrderDefinitionType.ARCHIVE_ORDER)); try { LogsArchiveOrder result = apiInstance.updateLogsArchiveOrder(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsArchivesApi#updateLogsArchiveOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update archive order ``` // Update archive order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_archives::LogsArchivesAPI; use datadog_api_client::datadogV2::model::LogsArchiveOrder; use datadog_api_client::datadogV2::model::LogsArchiveOrderAttributes; use datadog_api_client::datadogV2::model::LogsArchiveOrderDefinition; use datadog_api_client::datadogV2::model::LogsArchiveOrderDefinitionType; #[tokio::main] async fn main() { let body = LogsArchiveOrder::new().data(LogsArchiveOrderDefinition::new( LogsArchiveOrderAttributes::new(vec![ "a2zcMylnM4OCHpYusxIi1g".to_string(), "a2zcMylnM4OCHpYusxIi2g".to_string(), "a2zcMylnM4OCHpYusxIi3g".to_string(), ]), LogsArchiveOrderDefinitionType::ARCHIVE_ORDER, )); let configuration = datadog::Configuration::new(); let api = LogsArchivesAPI::with_config(configuration); let resp = api.update_logs_archive_order(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update archive order ``` /** * Update archive order returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsArchivesApi(configuration); const params: v2.LogsArchivesApiUpdateLogsArchiveOrderRequest = { body: { data: { attributes: { archiveIds: [ "a2zcMylnM4OCHpYusxIi1g", "a2zcMylnM4OCHpYusxIi2g", "a2zcMylnM4OCHpYusxIi3g", ], }, type: "archive_order", }, }, }; apiInstance .updateLogsArchiveOrder(params) .then((data: v2.LogsArchiveOrder) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=08b78d98-66d5-4336-b8d3-2e559efcef0e&bo=1&sid=ca97aff0f0bf11f0b225ebe7018368cd&vid=ca980060f0bf11f09385b797b877861f&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Logs%20Archives&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-archives%2F&r=<=2386&evt=pageLoad&sv=2&cdb=AQAQ&rn=858097) ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=892cca82-d58a-46c9-91cf-7057e8326871&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=142a5ea7-c314-41ff-8288-28aa8ccd8278&pt=Logs%20Archives&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-archives%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=892cca82-d58a-46c9-91cf-7057e8326871&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=142a5ea7-c314-41ff-8288-28aa8ccd8278&pt=Logs%20Archives&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-archives%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) --- # Source: https://docs.datadoghq.com/api/latest/logs-custom-destinations/ # Logs Custom Destinations Custom Destinations forward all the logs ingested to an external destination. **Note** : Log forwarding is not available for the Government (US1-FED) site. Contact your account representative for more information. See the [Custom Destinations Page](https://app.datadoghq.com/logs/pipelines/log-forwarding/custom-destinations) for a list of the custom destinations currently configured in web UI. ## [Get all custom destinations](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#get-all-custom-destinations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#get-all-custom-destinations-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.ap2.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.datadoghq.eu/api/v2/logs/config/custom-destinationshttps://api.ddog-gov.com/api/v2/logs/config/custom-destinationshttps://api.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.us3.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations ### Overview Get the list of configured custom destinations in your organization with their definitions. 
This endpoint requires any of the following permissions: * `logs_read_config` * `logs_read_data` ### Response * [200](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#ListLogsCustomDestinations-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#ListLogsCustomDestinations-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#ListLogsCustomDestinations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) The available custom destinations. Field Type Description data [object] A list of custom destinations. attributes object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be filtered. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination A custom destination's location to forward logs. Option 1 object The HTTP destination. auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` Option 2 object Custom header access authentication. header_name [_required_] string The header name of the authentication. type [_required_] enum Type of the custom header access authentication. Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. Basic access authentication. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). 
You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name string The custom destination name. query string The custom destination query filter. Logs matching this query are forwarded to the destination. id string The custom destination ID. type enum The type of the resource. The value should always be `custom_destination`. Allowed enum values: `custom_destination` default: `custom_destination` ``` { "data": [ { "attributes": { "enabled": true, "forward_tags": true, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "type": "basic" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139", "type": "custom_destination" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=typescript) ##### Get all custom destinations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all custom destinations ``` """ Get all custom destinations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) response = api_instance.list_logs_custom_destinations() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all custom destinations ``` # Get all custom destinations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new p api_instance.list_logs_custom_destinations() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all custom destinations ``` // Get all custom destinations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) resp, r, err := api.ListLogsCustomDestinations(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.ListLogsCustomDestinations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.ListLogsCustomDestinations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all custom destinations ``` // Get all custom destinations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); try { CustomDestinationsResponse result = apiInstance.listLogsCustomDestinations(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#listLogsCustomDestinations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all custom destinations ``` // Get all custom destinations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api.list_logs_custom_destinations().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all custom destinations ``` /** * Get all custom destinations returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); apiInstance .listLogsCustomDestinations() .then((data: v2.CustomDestinationsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a custom destination](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#create-a-custom-destination) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#create-a-custom-destination-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.ap2.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.datadoghq.eu/api/v2/logs/config/custom-destinationshttps://api.ddog-gov.com/api/v2/logs/config/custom-destinationshttps://api.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.us3.datadoghq.com/api/v2/logs/config/custom-destinationshttps://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations ### Overview Create a custom destination in your organization. This endpoint requires the `logs_write_forwarding_rules` permission. ### Request #### Body Data (required) The definition of the new custom destination. * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) Field Type Description data object The definition of a custom destination. attributes [_required_] object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be filtered. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination [_required_] A custom destination's location to forward logs. Option 1 object The HTTP destination. auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. password [_required_] string The password of the authentication. This field is not returned by the API. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` username [_required_] string The username of the authentication. This field is not returned by the API. Option 2 object Custom header access authentication. header_name [_required_] string The header name of the authentication. header_value [_required_] string The header value of the authentication. This field is not returned by the API. type [_required_] enum Type of the custom header access authentication. 
Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. access_token [_required_] string Access token of the Splunk HTTP Event Collector. This field is not returned by the API. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. password [_required_] string The password of the authentication. This field is not returned by the API. username [_required_] string The username of the authentication. This field is not returned by the API. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name [_required_] string The custom destination name. query string The custom destination query and filter. Logs matching this query are forwarded to the destination. type [_required_] enum The type of the resource. The value should always be `custom_destination`. 
Allowed enum values: `custom_destination` default: `custom_destination` ##### Create a Basic HTTP custom destination returns "OK" response ``` { "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "password": "datadog-custom-destination-password", "type": "basic", "username": "datadog-custom-destination-username" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } } ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` { "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "header_value": "my-secret", "type": "custom_header", "header_name": "MY-AUTHENTICATION-HEADER" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } } ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` { "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "type": "microsoft_sentinel", "tenant_id": "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", "client_id": "9a2f4d83-2b5e-429e-a35a-2b3c4182db71", "data_collection_endpoint": "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", "data_collection_rule_id": "dcr-000a00a000a00000a000000aa000a0aa", "stream_name": "Custom-MyTable" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#CreateLogsCustomDestination-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#CreateLogsCustomDestination-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#CreateLogsCustomDestination-403-v2) * [409](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#CreateLogsCustomDestination-409-v2) * [429](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#CreateLogsCustomDestination-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) The custom destination. Field Type Description data object The definition of a custom destination. attributes object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be filtered. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. 
It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination A custom destination's location to forward logs. Option 1 object The HTTP destination. auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` Option 2 object Custom header access authentication. header_name [_required_] string The header name of the authentication. type [_required_] enum Type of the custom header access authentication. Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. Basic access authentication. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name string The custom destination name. query string The custom destination query filter. Logs matching this query are forwarded to the destination. id string The custom destination ID. type enum The type of the resource. The value should always be `custom_destination`. 
```
{ "data": { "attributes": { "enabled": true, "forward_tags": true, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "type": "basic" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139", "type": "custom_destination" } }
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Conflict

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=typescript)

##### Create a Basic HTTP custom destination returns "OK" response

```
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v2/logs/config/custom-destinations" \
 -H "Accept: application/json" \
 -H "Content-Type: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
 -d @- << EOF
{ "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "password": "datadog-custom-destination-password", "type": "basic", "username": "datadog-custom-destination-username" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } }
EOF
```

##### Create a Custom Header HTTP custom destination returns "OK" response

```
# Curl command
curl -X POST
"https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "header_value": "my-secret", "type": "custom_header", "header_name": "MY-AUTHENTICATION-HEADER" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } } EOF ``` ##### Create a Microsoft Sentinel custom destination returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": false, "forward_tags": false, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "type": "microsoft_sentinel", "tenant_id": "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", "client_id": "9a2f4d83-2b5e-429e-a35a-2b3c4182db71", "data_collection_endpoint": "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", "data_collection_rule_id": "dcr-000a00a000a00000a000000aa000a0aa", "stream_name": "Custom-MyTable" }, "name": "Nginx logs", "query": "source:nginx" }, "type": "custom_destination" } } EOF ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` // Create a Basic HTTP custom destination returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CustomDestinationCreateRequest{ Data: &datadogV2.CustomDestinationCreateRequestDefinition{ Attributes: datadogV2.CustomDestinationCreateRequestAttributes{ Enabled: datadog.PtrBool(false), ForwardTags: datadog.PtrBool(false), ForwardTagsRestrictionList: []string{ "datacenter", "host", }, ForwardTagsRestrictionListType: datadogV2.CUSTOMDESTINATIONATTRIBUTETAGSRESTRICTIONLISTTYPE_ALLOW_LIST.Ptr(), ForwarderDestination: datadogV2.CustomDestinationForwardDestination{ CustomDestinationForwardDestinationHttp: &datadogV2.CustomDestinationForwardDestinationHttp{ Auth: datadogV2.CustomDestinationHttpDestinationAuth{ CustomDestinationHttpDestinationAuthBasic: &datadogV2.CustomDestinationHttpDestinationAuthBasic{ Password: "datadog-custom-destination-password", Type: datadogV2.CUSTOMDESTINATIONHTTPDESTINATIONAUTHBASICTYPE_BASIC, Username: "datadog-custom-destination-username", }}, Endpoint: "https://example.com", Type: datadogV2.CUSTOMDESTINATIONFORWARDDESTINATIONHTTPTYPE_HTTP, }}, Name: "Nginx logs", Query: datadog.PtrString("source:nginx"), }, Type: datadogV2.CUSTOMDESTINATIONTYPE_CUSTOM_DESTINATION, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) resp, r, err := api.CreateLogsCustomDestination(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.CreateLogsCustomDestination`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.CreateLogsCustomDestination`:\n%s\n", responseContent) } ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` // Create a Custom Header HTTP custom destination returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CustomDestinationCreateRequest{ Data: &datadogV2.CustomDestinationCreateRequestDefinition{ Attributes: datadogV2.CustomDestinationCreateRequestAttributes{ Enabled: datadog.PtrBool(false), ForwardTags: datadog.PtrBool(false), ForwardTagsRestrictionList: []string{ "datacenter", "host", }, ForwardTagsRestrictionListType: datadogV2.CUSTOMDESTINATIONATTRIBUTETAGSRESTRICTIONLISTTYPE_ALLOW_LIST.Ptr(), ForwarderDestination: datadogV2.CustomDestinationForwardDestination{ CustomDestinationForwardDestinationHttp: &datadogV2.CustomDestinationForwardDestinationHttp{ Auth: datadogV2.CustomDestinationHttpDestinationAuth{ CustomDestinationHttpDestinationAuthCustomHeader: &datadogV2.CustomDestinationHttpDestinationAuthCustomHeader{ HeaderValue: "my-secret", Type: datadogV2.CUSTOMDESTINATIONHTTPDESTINATIONAUTHCUSTOMHEADERTYPE_CUSTOM_HEADER, HeaderName: "MY-AUTHENTICATION-HEADER", }}, Endpoint: "https://example.com", Type: datadogV2.CUSTOMDESTINATIONFORWARDDESTINATIONHTTPTYPE_HTTP, }}, Name: "Nginx logs", Query: datadog.PtrString("source:nginx"), }, Type: datadogV2.CUSTOMDESTINATIONTYPE_CUSTOM_DESTINATION, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) resp, r, err := api.CreateLogsCustomDestination(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.CreateLogsCustomDestination`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.CreateLogsCustomDestination`:\n%s\n", responseContent) } ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` // Create a Microsoft Sentinel custom destination returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CustomDestinationCreateRequest{ Data: &datadogV2.CustomDestinationCreateRequestDefinition{ Attributes: datadogV2.CustomDestinationCreateRequestAttributes{ Enabled: datadog.PtrBool(false), ForwardTags: datadog.PtrBool(false), ForwardTagsRestrictionList: []string{ "datacenter", "host", }, ForwardTagsRestrictionListType: datadogV2.CUSTOMDESTINATIONATTRIBUTETAGSRESTRICTIONLISTTYPE_ALLOW_LIST.Ptr(), ForwarderDestination: datadogV2.CustomDestinationForwardDestination{ CustomDestinationForwardDestinationMicrosoftSentinel: 
&datadogV2.CustomDestinationForwardDestinationMicrosoftSentinel{ Type: datadogV2.CUSTOMDESTINATIONFORWARDDESTINATIONMICROSOFTSENTINELTYPE_MICROSOFT_SENTINEL, TenantId: "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", ClientId: "9a2f4d83-2b5e-429e-a35a-2b3c4182db71", DataCollectionEndpoint: "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", DataCollectionRuleId: "dcr-000a00a000a00000a000000aa000a0aa", StreamName: "Custom-MyTable", }}, Name: "Nginx logs", Query: datadog.PtrString("source:nginx"), }, Type: datadogV2.CUSTOMDESTINATIONTYPE_CUSTOM_DESTINATION, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) resp, r, err := api.CreateLogsCustomDestination(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.CreateLogsCustomDestination`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.CreateLogsCustomDestination`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` // Create a Basic HTTP custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationAttributeTagsRestrictionListType; import com.datadog.api.client.v2.model.CustomDestinationCreateRequest; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestAttributes; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestDefinition; import com.datadog.api.client.v2.model.CustomDestinationForwardDestination; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationHttp; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationHttpType; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuth; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuthBasic; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuthBasicType; import com.datadog.api.client.v2.model.CustomDestinationResponse; import com.datadog.api.client.v2.model.CustomDestinationType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); CustomDestinationCreateRequest body = new CustomDestinationCreateRequest() .data( new CustomDestinationCreateRequestDefinition() .attributes( new CustomDestinationCreateRequestAttributes() .enabled(false) .forwardTags(false) .forwardTagsRestrictionList(Arrays.asList("datacenter", "host")) .forwardTagsRestrictionListType( CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST) .forwarderDestination( new CustomDestinationForwardDestination( new CustomDestinationForwardDestinationHttp() .auth( new CustomDestinationHttpDestinationAuth( 
new CustomDestinationHttpDestinationAuthBasic() .password("datadog-custom-destination-password") .type( CustomDestinationHttpDestinationAuthBasicType .BASIC) .username( "datadog-custom-destination-username"))) .endpoint("https://example.com") .type(CustomDestinationForwardDestinationHttpType.HTTP))) .name("Nginx logs") .query("source:nginx")) .type(CustomDestinationType.CUSTOM_DESTINATION)); try { CustomDestinationResponse result = apiInstance.createLogsCustomDestination(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#createLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` // Create a Custom Header HTTP custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationAttributeTagsRestrictionListType; import com.datadog.api.client.v2.model.CustomDestinationCreateRequest; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestAttributes; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestDefinition; import com.datadog.api.client.v2.model.CustomDestinationForwardDestination; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationHttp; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationHttpType; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuth; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuthCustomHeader; import com.datadog.api.client.v2.model.CustomDestinationHttpDestinationAuthCustomHeaderType; import com.datadog.api.client.v2.model.CustomDestinationResponse; import com.datadog.api.client.v2.model.CustomDestinationType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); CustomDestinationCreateRequest body = new CustomDestinationCreateRequest() .data( new CustomDestinationCreateRequestDefinition() .attributes( new CustomDestinationCreateRequestAttributes() .enabled(false) .forwardTags(false) .forwardTagsRestrictionList(Arrays.asList("datacenter", "host")) .forwardTagsRestrictionListType( CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST) .forwarderDestination( new CustomDestinationForwardDestination( new CustomDestinationForwardDestinationHttp() .auth( new CustomDestinationHttpDestinationAuth( new CustomDestinationHttpDestinationAuthCustomHeader() .headerValue("my-secret") .type( CustomDestinationHttpDestinationAuthCustomHeaderType .CUSTOM_HEADER) .headerName("MY-AUTHENTICATION-HEADER"))) .endpoint("https://example.com") .type(CustomDestinationForwardDestinationHttpType.HTTP))) .name("Nginx logs") .query("source:nginx")) .type(CustomDestinationType.CUSTOM_DESTINATION)); try { CustomDestinationResponse result = apiInstance.createLogsCustomDestination(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#createLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); 
System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` // Create a Microsoft Sentinel custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationAttributeTagsRestrictionListType; import com.datadog.api.client.v2.model.CustomDestinationCreateRequest; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestAttributes; import com.datadog.api.client.v2.model.CustomDestinationCreateRequestDefinition; import com.datadog.api.client.v2.model.CustomDestinationForwardDestination; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationMicrosoftSentinel; import com.datadog.api.client.v2.model.CustomDestinationForwardDestinationMicrosoftSentinelType; import com.datadog.api.client.v2.model.CustomDestinationResponse; import com.datadog.api.client.v2.model.CustomDestinationType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); CustomDestinationCreateRequest body = new CustomDestinationCreateRequest() .data( new CustomDestinationCreateRequestDefinition() .attributes( new CustomDestinationCreateRequestAttributes() .enabled(false) .forwardTags(false) .forwardTagsRestrictionList(Arrays.asList("datacenter", "host")) .forwardTagsRestrictionListType( CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST) .forwarderDestination( new CustomDestinationForwardDestination( new CustomDestinationForwardDestinationMicrosoftSentinel() .type( CustomDestinationForwardDestinationMicrosoftSentinelType .MICROSOFT_SENTINEL) .tenantId("f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2") .clientId("9a2f4d83-2b5e-429e-a35a-2b3c4182db71") .dataCollectionEndpoint( "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com") .dataCollectionRuleId( "dcr-000a00a000a00000a000000aa000a0aa") .streamName("Custom-MyTable"))) .name("Nginx logs") .query("source:nginx")) .type(CustomDestinationType.CUSTOM_DESTINATION)); try { CustomDestinationResponse result = apiInstance.createLogsCustomDestination(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#createLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` """ Create a Basic HTTP custom destination returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi from 
datadog_api_client.v2.model.custom_destination_attribute_tags_restriction_list_type import ( CustomDestinationAttributeTagsRestrictionListType, ) from datadog_api_client.v2.model.custom_destination_create_request import CustomDestinationCreateRequest from datadog_api_client.v2.model.custom_destination_create_request_attributes import ( CustomDestinationCreateRequestAttributes, ) from datadog_api_client.v2.model.custom_destination_create_request_definition import ( CustomDestinationCreateRequestDefinition, ) from datadog_api_client.v2.model.custom_destination_forward_destination_http import ( CustomDestinationForwardDestinationHttp, ) from datadog_api_client.v2.model.custom_destination_forward_destination_http_type import ( CustomDestinationForwardDestinationHttpType, ) from datadog_api_client.v2.model.custom_destination_http_destination_auth_basic import ( CustomDestinationHttpDestinationAuthBasic, ) from datadog_api_client.v2.model.custom_destination_http_destination_auth_basic_type import ( CustomDestinationHttpDestinationAuthBasicType, ) from datadog_api_client.v2.model.custom_destination_type import CustomDestinationType body = CustomDestinationCreateRequest( data=CustomDestinationCreateRequestDefinition( attributes=CustomDestinationCreateRequestAttributes( enabled=False, forward_tags=False, forward_tags_restriction_list=[ "datacenter", "host", ], forward_tags_restriction_list_type=CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST, forwarder_destination=CustomDestinationForwardDestinationHttp( auth=CustomDestinationHttpDestinationAuthBasic( password="datadog-custom-destination-password", type=CustomDestinationHttpDestinationAuthBasicType.BASIC, username="datadog-custom-destination-username", ), endpoint="https://example.com", type=CustomDestinationForwardDestinationHttpType.HTTP, ), name="Nginx logs", query="source:nginx", ), type=CustomDestinationType.CUSTOM_DESTINATION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) response = api_instance.create_logs_custom_destination(body=body) print(response) ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` """ Create a Custom Header HTTP custom destination returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi from datadog_api_client.v2.model.custom_destination_attribute_tags_restriction_list_type import ( CustomDestinationAttributeTagsRestrictionListType, ) from datadog_api_client.v2.model.custom_destination_create_request import CustomDestinationCreateRequest from datadog_api_client.v2.model.custom_destination_create_request_attributes import ( CustomDestinationCreateRequestAttributes, ) from datadog_api_client.v2.model.custom_destination_create_request_definition import ( CustomDestinationCreateRequestDefinition, ) from datadog_api_client.v2.model.custom_destination_forward_destination_http import ( CustomDestinationForwardDestinationHttp, ) from datadog_api_client.v2.model.custom_destination_forward_destination_http_type import ( CustomDestinationForwardDestinationHttpType, ) from datadog_api_client.v2.model.custom_destination_http_destination_auth_custom_header import ( CustomDestinationHttpDestinationAuthCustomHeader, ) from datadog_api_client.v2.model.custom_destination_http_destination_auth_custom_header_type import ( CustomDestinationHttpDestinationAuthCustomHeaderType, ) from 
datadog_api_client.v2.model.custom_destination_type import CustomDestinationType body = CustomDestinationCreateRequest( data=CustomDestinationCreateRequestDefinition( attributes=CustomDestinationCreateRequestAttributes( enabled=False, forward_tags=False, forward_tags_restriction_list=[ "datacenter", "host", ], forward_tags_restriction_list_type=CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST, forwarder_destination=CustomDestinationForwardDestinationHttp( auth=CustomDestinationHttpDestinationAuthCustomHeader( header_value="my-secret", type=CustomDestinationHttpDestinationAuthCustomHeaderType.CUSTOM_HEADER, header_name="MY-AUTHENTICATION-HEADER", ), endpoint="https://example.com", type=CustomDestinationForwardDestinationHttpType.HTTP, ), name="Nginx logs", query="source:nginx", ), type=CustomDestinationType.CUSTOM_DESTINATION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) response = api_instance.create_logs_custom_destination(body=body) print(response) ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` """ Create a Microsoft Sentinel custom destination returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi from datadog_api_client.v2.model.custom_destination_attribute_tags_restriction_list_type import ( CustomDestinationAttributeTagsRestrictionListType, ) from datadog_api_client.v2.model.custom_destination_create_request import CustomDestinationCreateRequest from datadog_api_client.v2.model.custom_destination_create_request_attributes import ( CustomDestinationCreateRequestAttributes, ) from datadog_api_client.v2.model.custom_destination_create_request_definition import ( CustomDestinationCreateRequestDefinition, ) from datadog_api_client.v2.model.custom_destination_forward_destination_microsoft_sentinel import ( CustomDestinationForwardDestinationMicrosoftSentinel, ) from datadog_api_client.v2.model.custom_destination_forward_destination_microsoft_sentinel_type import ( CustomDestinationForwardDestinationMicrosoftSentinelType, ) from datadog_api_client.v2.model.custom_destination_type import CustomDestinationType body = CustomDestinationCreateRequest( data=CustomDestinationCreateRequestDefinition( attributes=CustomDestinationCreateRequestAttributes( enabled=False, forward_tags=False, forward_tags_restriction_list=[ "datacenter", "host", ], forward_tags_restriction_list_type=CustomDestinationAttributeTagsRestrictionListType.ALLOW_LIST, forwarder_destination=CustomDestinationForwardDestinationMicrosoftSentinel( type=CustomDestinationForwardDestinationMicrosoftSentinelType.MICROSOFT_SENTINEL, tenant_id="f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", client_id="9a2f4d83-2b5e-429e-a35a-2b3c4182db71", data_collection_endpoint="https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", data_collection_rule_id="dcr-000a00a000a00000a000000aa000a0aa", stream_name="Custom-MyTable", ), name="Nginx logs", query="source:nginx", ), type=CustomDestinationType.CUSTOM_DESTINATION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) response = api_instance.create_logs_custom_destination(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` 
and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` # Create a Basic HTTP custom destination returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new body = DatadogAPIClient::V2::CustomDestinationCreateRequest.new({ data: DatadogAPIClient::V2::CustomDestinationCreateRequestDefinition.new({ attributes: DatadogAPIClient::V2::CustomDestinationCreateRequestAttributes.new({ enabled: false, forward_tags: false, forward_tags_restriction_list: [ "datacenter", "host", ], forward_tags_restriction_list_type: DatadogAPIClient::V2::CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST, forwarder_destination: DatadogAPIClient::V2::CustomDestinationForwardDestinationHttp.new({ auth: DatadogAPIClient::V2::CustomDestinationHttpDestinationAuthBasic.new({ password: "datadog-custom-destination-password", type: DatadogAPIClient::V2::CustomDestinationHttpDestinationAuthBasicType::BASIC, username: "datadog-custom-destination-username", }), endpoint: "https://example.com", type: DatadogAPIClient::V2::CustomDestinationForwardDestinationHttpType::HTTP, }), name: "Nginx logs", query: "source:nginx", }), type: DatadogAPIClient::V2::CustomDestinationType::CUSTOM_DESTINATION, }), }) p api_instance.create_logs_custom_destination(body) ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` # Create a Custom Header HTTP custom destination returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new body = DatadogAPIClient::V2::CustomDestinationCreateRequest.new({ data: DatadogAPIClient::V2::CustomDestinationCreateRequestDefinition.new({ attributes: DatadogAPIClient::V2::CustomDestinationCreateRequestAttributes.new({ enabled: false, forward_tags: false, forward_tags_restriction_list: [ "datacenter", "host", ], forward_tags_restriction_list_type: DatadogAPIClient::V2::CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST, forwarder_destination: DatadogAPIClient::V2::CustomDestinationForwardDestinationHttp.new({ auth: DatadogAPIClient::V2::CustomDestinationHttpDestinationAuthCustomHeader.new({ header_value: "my-secret", type: DatadogAPIClient::V2::CustomDestinationHttpDestinationAuthCustomHeaderType::CUSTOM_HEADER, header_name: "MY-AUTHENTICATION-HEADER", }), endpoint: "https://example.com", type: DatadogAPIClient::V2::CustomDestinationForwardDestinationHttpType::HTTP, }), name: "Nginx logs", query: "source:nginx", }), type: DatadogAPIClient::V2::CustomDestinationType::CUSTOM_DESTINATION, }), }) p api_instance.create_logs_custom_destination(body) ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` # Create a Microsoft Sentinel custom destination returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new body = DatadogAPIClient::V2::CustomDestinationCreateRequest.new({ data: DatadogAPIClient::V2::CustomDestinationCreateRequestDefinition.new({ attributes: DatadogAPIClient::V2::CustomDestinationCreateRequestAttributes.new({ enabled: false, forward_tags: false, forward_tags_restriction_list: [ "datacenter", "host", ], forward_tags_restriction_list_type: DatadogAPIClient::V2::CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST, 
forwarder_destination: DatadogAPIClient::V2::CustomDestinationForwardDestinationMicrosoftSentinel.new({ type: DatadogAPIClient::V2::CustomDestinationForwardDestinationMicrosoftSentinelType::MICROSOFT_SENTINEL, tenant_id: "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", client_id: "9a2f4d83-2b5e-429e-a35a-2b3c4182db71", data_collection_endpoint: "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", data_collection_rule_id: "dcr-000a00a000a00000a000000aa000a0aa", stream_name: "Custom-MyTable", }), name: "Nginx logs", query: "source:nginx", }), type: DatadogAPIClient::V2::CustomDestinationType::CUSTOM_DESTINATION, }), }) p api_instance.create_logs_custom_destination(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` // Create a Basic HTTP custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; use datadog_api_client::datadogV2::model::CustomDestinationAttributeTagsRestrictionListType; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequest; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestAttributes; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestDefinition; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestination; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationHttp; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationHttpType; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuth; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuthBasic; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuthBasicType; use datadog_api_client::datadogV2::model::CustomDestinationType; #[tokio::main] async fn main() { let body = CustomDestinationCreateRequest::new() .data(CustomDestinationCreateRequestDefinition::new( CustomDestinationCreateRequestAttributes::new( CustomDestinationForwardDestination::CustomDestinationForwardDestinationHttp(Box::new( CustomDestinationForwardDestinationHttp::new( CustomDestinationHttpDestinationAuth::CustomDestinationHttpDestinationAuthBasic( Box::new(CustomDestinationHttpDestinationAuthBasic::new( "datadog-custom-destination-password".to_string(), CustomDestinationHttpDestinationAuthBasicType::BASIC, "datadog-custom-destination-username".to_string(), )), ), "https://example.com".to_string(), CustomDestinationForwardDestinationHttpType::HTTP, ), )), "Nginx logs".to_string(), ) .enabled(false) .forward_tags(false) .forward_tags_restriction_list(vec!["datacenter".to_string(), "host".to_string()]) .forward_tags_restriction_list_type( CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST, ) .query("source:nginx".to_string()), CustomDestinationType::CUSTOM_DESTINATION, )); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api.create_logs_custom_destination(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a Custom 
Header HTTP custom destination returns "OK" response ``` // Create a Custom Header HTTP custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; use datadog_api_client::datadogV2::model::CustomDestinationAttributeTagsRestrictionListType; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequest; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestAttributes; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestDefinition; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestination; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationHttp; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationHttpType; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuth; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuthCustomHeader; use datadog_api_client::datadogV2::model::CustomDestinationHttpDestinationAuthCustomHeaderType; use datadog_api_client::datadogV2::model::CustomDestinationType; #[tokio::main] async fn main() { let body = CustomDestinationCreateRequest ::new().data( CustomDestinationCreateRequestDefinition::new( CustomDestinationCreateRequestAttributes::new( CustomDestinationForwardDestination::CustomDestinationForwardDestinationHttp( Box::new( CustomDestinationForwardDestinationHttp::new( CustomDestinationHttpDestinationAuth::CustomDestinationHttpDestinationAuthCustomHeader( Box::new( CustomDestinationHttpDestinationAuthCustomHeader::new( "MY-AUTHENTICATION-HEADER".to_string(), "my-secret".to_string(), CustomDestinationHttpDestinationAuthCustomHeaderType::CUSTOM_HEADER, ), ), ), "https://example.com".to_string(), CustomDestinationForwardDestinationHttpType::HTTP, ), ), ), "Nginx logs".to_string(), ) .enabled(false) .forward_tags(false) .forward_tags_restriction_list(vec!["datacenter".to_string(), "host".to_string()]) .forward_tags_restriction_list_type(CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST) .query("source:nginx".to_string()), CustomDestinationType::CUSTOM_DESTINATION, ), ); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api.create_logs_custom_destination(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` // Create a Microsoft Sentinel custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; use datadog_api_client::datadogV2::model::CustomDestinationAttributeTagsRestrictionListType; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequest; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestAttributes; use datadog_api_client::datadogV2::model::CustomDestinationCreateRequestDefinition; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestination; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationMicrosoftSentinel; use datadog_api_client::datadogV2::model::CustomDestinationForwardDestinationMicrosoftSentinelType; use datadog_api_client::datadogV2::model::CustomDestinationType; #[tokio::main] async fn main() { let body = CustomDestinationCreateRequest ::new().data( 
CustomDestinationCreateRequestDefinition::new( CustomDestinationCreateRequestAttributes::new( CustomDestinationForwardDestination::CustomDestinationForwardDestinationMicrosoftSentinel( Box::new( CustomDestinationForwardDestinationMicrosoftSentinel::new( "9a2f4d83-2b5e-429e-a35a-2b3c4182db71".to_string(), "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com".to_string(), "dcr-000a00a000a00000a000000aa000a0aa".to_string(), "Custom-MyTable".to_string(), "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2".to_string(), CustomDestinationForwardDestinationMicrosoftSentinelType::MICROSOFT_SENTINEL, ), ), ), "Nginx logs".to_string(), ) .enabled(false) .forward_tags(false) .forward_tags_restriction_list(vec!["datacenter".to_string(), "host".to_string()]) .forward_tags_restriction_list_type(CustomDestinationAttributeTagsRestrictionListType::ALLOW_LIST) .query("source:nginx".to_string()), CustomDestinationType::CUSTOM_DESTINATION, ), ); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api.create_logs_custom_destination(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Basic HTTP custom destination returns "OK" response ``` /** * Create a Basic HTTP custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); const params: v2.LogsCustomDestinationsApiCreateLogsCustomDestinationRequest = { body: { data: { attributes: { enabled: false, forwardTags: false, forwardTagsRestrictionList: ["datacenter", "host"], forwardTagsRestrictionListType: "ALLOW_LIST", forwarderDestination: { auth: { password: "datadog-custom-destination-password", type: "basic", username: "datadog-custom-destination-username", }, endpoint: "https://example.com", type: "http", }, name: "Nginx logs", query: "source:nginx", }, type: "custom_destination", }, }, }; apiInstance .createLogsCustomDestination(params) .then((data: v2.CustomDestinationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a Custom Header HTTP custom destination returns "OK" response ``` /** * Create a Custom Header HTTP custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); const params: v2.LogsCustomDestinationsApiCreateLogsCustomDestinationRequest = { body: { data: { attributes: { enabled: false, forwardTags: false, forwardTagsRestrictionList: ["datacenter", "host"], forwardTagsRestrictionListType: "ALLOW_LIST", forwarderDestination: { auth: { headerValue: "my-secret", type: "custom_header", headerName: "MY-AUTHENTICATION-HEADER", }, endpoint: "https://example.com", type: "http", }, name: "Nginx logs", query: "source:nginx", }, type: "custom_destination", }, }, }; apiInstance .createLogsCustomDestination(params) .then((data: v2.CustomDestinationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a Microsoft Sentinel custom destination returns "OK" response ``` /** * Create a Microsoft Sentinel custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); const params: v2.LogsCustomDestinationsApiCreateLogsCustomDestinationRequest = { body: { data: { attributes: { enabled: false, forwardTags: false, forwardTagsRestrictionList: ["datacenter", "host"], forwardTagsRestrictionListType: "ALLOW_LIST", forwarderDestination: { type: "microsoft_sentinel", tenantId: "f3c9a8a1-4c2e-4d2e-b911-9f3c28c3c8b2", clientId: "9a2f4d83-2b5e-429e-a35a-2b3c4182db71", dataCollectionEndpoint: "https://my-dce-5kyl.eastus-1.ingest.monitor.azure.com", dataCollectionRuleId: "dcr-000a00a000a00000a000000aa000a0aa", streamName: "Custom-MyTable", }, name: "Nginx logs", query: "source:nginx", }, type: "custom_destination", }, }, }; apiInstance .createLogsCustomDestination(params) .then((data: v2.CustomDestinationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a custom destination](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#get-a-custom-destination) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#get-a-custom-destination-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ap2.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.eu/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ddog-gov.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us3.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id} ### Overview Get a specific custom destination in your organization. This endpoint requires any of the following permissions: * `logs_read_config` * `logs_read_data` ### Arguments #### Path Parameters Name Type Description custom_destination_id [_required_] string The ID of the custom destination. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#GetLogsCustomDestination-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#GetLogsCustomDestination-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#GetLogsCustomDestination-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#GetLogsCustomDestination-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#GetLogsCustomDestination-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) The custom destination. Field Type Description data object The definition of a custom destination. attributes object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be filtered. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination A custom destination's location to forward logs. Option 1 object The HTTP destination. 
auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` Option 2 object Custom header access authentication. header_name [_required_] string The header name of the authentication. type [_required_] enum Type of the custom header access authentication. Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. Basic access authentication. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name string The custom destination name. query string The custom destination query filter. Logs matching this query are forwarded to the destination. id string The custom destination ID. type enum The type of the resource. The value should always be `custom_destination`. 
Allowed enum values: `custom_destination` default: `custom_destination`

```
{ "data": { "attributes": { "enabled": true, "forward_tags": true, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "type": "basic" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139", "type": "custom_destination" } }
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)
* [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=typescript)

##### Get a custom destination

```
# Path parameters
export custom_destination_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v2/logs/config/custom-destinations/${custom_destination_id}" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a custom destination

```
"""
Get a custom destination returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi

# there is a valid "custom_destination" in the system
CUSTOM_DESTINATION_DATA_ID = environ["CUSTOM_DESTINATION_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsCustomDestinationsApi(api_client)
    response = api_instance.get_logs_custom_destination(
        custom_destination_id=CUSTOM_DESTINATION_DATA_ID,
    )
    print(response)
```
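For completeness, the same call can be made with any HTTP client instead of the official libraries shown in this section. The following hedged sketch uses Python's `requests` package (an assumption, not an official Datadog example) and reads a few fields from the response schema documented above.

```
# A minimal sketch using `requests` instead of the official clients: fetch one
# custom destination and branch on the documented `forwarder_destination` type.
import os
import requests

site = os.environ.get("DD_SITE", "datadoghq.com")  # adjust for your Datadog site
destination_id = os.environ["CUSTOM_DESTINATION_DATA_ID"]

resp = requests.get(
    f"https://api.{site}/api/v2/logs/config/custom-destinations/{destination_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    timeout=10,
)
resp.raise_for_status()

attrs = resp.json()["data"]["attributes"]
dest = attrs["forwarder_destination"]
print(f"{attrs['name']} (enabled={attrs['enabled']}, query={attrs.get('query')!r})")
if dest["type"] in ("http", "splunk_hec", "elasticsearch"):
    print("forwards to", dest["endpoint"])
elif dest["type"] == "microsoft_sentinel":
    print("forwards to", dest["data_collection_endpoint"])
else:
    print("forwarder type:", dest["type"])
```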
#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a custom destination

```
# Get a custom destination returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new

# there is a valid "custom_destination" in the system
CUSTOM_DESTINATION_DATA_ID = ENV["CUSTOM_DESTINATION_DATA_ID"]
p api_instance.get_logs_custom_destination(CUSTOM_DESTINATION_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a custom destination

```
// Get a custom destination returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "custom_destination" in the system
	CustomDestinationDataID := os.Getenv("CUSTOM_DESTINATION_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewLogsCustomDestinationsApi(apiClient)
	resp, r, err := api.GetLogsCustomDestination(ctx, CustomDestinationDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.GetLogsCustomDestination`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.GetLogsCustomDestination`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following command:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a custom destination

```
// Get a custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); // there is a valid "custom_destination" in the system String CUSTOM_DESTINATION_DATA_ID = System.getenv("CUSTOM_DESTINATION_DATA_ID"); try { CustomDestinationResponse result = apiInstance.getLogsCustomDestination(CUSTOM_DESTINATION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#getLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " +
e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a custom destination ``` // Get a custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; #[tokio::main] async fn main() { // there is a valid "custom_destination" in the system let custom_destination_data_id = std::env::var("CUSTOM_DESTINATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api .get_logs_custom_destination(custom_destination_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a custom destination ``` /** * Get a custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); // there is a valid "custom_destination" in the system const CUSTOM_DESTINATION_DATA_ID = process.env .CUSTOM_DESTINATION_DATA_ID as string; const params: v2.LogsCustomDestinationsApiGetLogsCustomDestinationRequest = { customDestinationId: CUSTOM_DESTINATION_DATA_ID, }; apiInstance .getLogsCustomDestination(params) .then((data: v2.CustomDestinationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a custom destination](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#update-a-custom-destination) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#update-a-custom-destination-v2) PATCH https://api.ap1.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ap2.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.eu/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ddog-gov.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us3.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id} ### Overview Update the given fields of a specific custom destination in your organization. This endpoint requires the `logs_write_forwarding_rules` permission. ### Arguments #### Path Parameters Name Type Description custom_destination_id [_required_] string The ID of the custom destination. ### Request #### Body Data (required) New definition of the custom destination’s fields. * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) Field Type Description data object The definition of a custom destination. attributes object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be restricted from being forwarded. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination A custom destination's location to forward logs. Option 1 object The HTTP destination. auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. password [_required_] string The password of the authentication. This field is not returned by the API. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` username [_required_] string The username of the authentication. This field is not returned by the API. Option 2 object Custom header access authentication. 
header_name [_required_] string The header name of the authentication. header_value [_required_] string The header value of the authentication. This field is not returned by the API. type [_required_] enum Type of the custom header access authentication. Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. access_token [_required_] string Access token of the Splunk HTTP Event Collector. This field is not returned by the API. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. password [_required_] string The password of the authentication. This field is not returned by the API. username [_required_] string The username of the authentication. This field is not returned by the API. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name string The custom destination name. query string The custom destination query and filter. Logs matching this query are forwarded to the destination. id [_required_] string The custom destination ID. type [_required_] enum The type of the resource. The value should always be `custom_destination`. 
Allowed enum values: `custom_destination` default: `custom_destination` ``` { "data": { "attributes": { "name": "Nginx logs (Updated)", "query": "source:nginx", "enabled": false, "forward_tags": false, "forward_tags_restriction_list_type": "BLOCK_LIST" }, "type": "custom_destination", "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-404-v2) * [409](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-409-v2) * [429](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#UpdateLogsCustomDestination-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) The custom destination. Field Type Description data object The definition of a custom destination. attributes object The attributes associated with the custom destination. enabled boolean Whether logs matching this custom destination should be forwarded or not. default: `true` forward_tags boolean Whether tags from the forwarded logs should be forwarded or not. default: `true` forward_tags_restriction_list [string] List of [keys of tags](https://docs.datadoghq.com/getting_started/tagging/#define-tags) to be filtered. An empty list represents no restriction is in place and either all or no tags will be forwarded depending on `forward_tags_restriction_list_type` parameter. default: forward_tags_restriction_list_type enum How `forward_tags_restriction_list` parameter should be interpreted. If `ALLOW_LIST`, then only tags whose keys on the forwarded logs match the ones on the restriction list are forwarded. `BLOCK_LIST` works the opposite way. It does not forward the tags matching the ones on the list. Allowed enum values: `ALLOW_LIST,BLOCK_LIST` default: `ALLOW_LIST` forwarder_destination A custom destination's location to forward logs. Option 1 object The HTTP destination. auth [_required_] Authentication method of the HTTP requests. Option 1 object Basic access authentication. type [_required_] enum Type of the basic access authentication. Allowed enum values: `basic` default: `basic` Option 2 object Custom header access authentication. header_name [_required_] string The header name of the authentication. type [_required_] enum Type of the custom header access authentication. Allowed enum values: `custom_header` default: `custom_header` endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the HTTP destination. Allowed enum values: `http` default: `http` Option 2 object The Splunk HTTP Event Collector (HEC) destination. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. type [_required_] enum Type of the Splunk HTTP Event Collector (HEC) destination. Allowed enum values: `splunk_hec` default: `splunk_hec` Option 3 object The Elasticsearch destination. auth [_required_] object Basic access authentication. 
Basic access authentication. endpoint [_required_] string The destination for which logs will be forwarded to. Must have HTTPS scheme and forwarding back to Datadog is not allowed. index_name [_required_] string Name of the Elasticsearch index (must follow [Elasticsearch's criteria](https://www.elastic.co/guide/en/elasticsearch/reference/8.11/indices-create-index.html#indices-create-api-path-params)). index_rotation string Date pattern with US locale and UTC timezone to be appended to the index name after adding `-` (that is, `${index_name}-${indexPattern}`). You can customize the index rotation naming pattern by choosing one of these options: * Hourly: `yyyy-MM-dd-HH` (as an example, it would render: `2022-10-19-09`) * Daily: `yyyy-MM-dd` (as an example, it would render: `2022-10-19`) * Weekly: `yyyy-'W'ww` (as an example, it would render: `2022-W42`) * Monthly: `yyyy-MM` (as an example, it would render: `2022-10`) If this field is missing or is blank, it means that the index name will always be the same (that is, no rotation). type [_required_] enum Type of the Elasticsearch destination. Allowed enum values: `elasticsearch` default: `elasticsearch` Option 4 object The Microsoft Sentinel destination. client_id [_required_] string Client ID from the Datadog Azure integration. data_collection_endpoint [_required_] string Azure data collection endpoint. data_collection_rule_id [_required_] string Azure data collection rule ID. stream_name [_required_] string Azure stream name. tenant_id [_required_] string Tenant ID from the Datadog Azure integration. type [_required_] enum Type of the Microsoft Sentinel destination. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` name string The custom destination name. query string The custom destination query filter. Logs matching this query are forwarded to the destination. id string The custom destination ID. type enum The type of the resource. The value should always be `custom_destination`. Allowed enum values: `custom_destination` default: `custom_destination` ``` { "data": { "attributes": { "enabled": true, "forward_tags": true, "forward_tags_restriction_list": [ "datacenter", "host" ], "forward_tags_restriction_list_type": "ALLOW_LIST", "forwarder_destination": { "auth": { "type": "basic" }, "endpoint": "https://example.com", "type": "http" }, "name": "Nginx logs", "query": "source:nginx" }, "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139", "type": "custom_destination" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=typescript) ##### Update a custom destination returns "OK" response Copy ``` # Path parameters export custom_destination_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations/${custom_destination_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Nginx logs (Updated)", "query": "source:nginx", "enabled": false, "forward_tags": false, "forward_tags_restriction_list_type": "BLOCK_LIST" }, "type": "custom_destination", "id": "be5d7a69-d0c8-4d4d-8ee8-bba292d98139" } } EOF ``` ##### Update a custom destination returns "OK" response ``` // Update a custom destination returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "custom_destination" in the system CustomDestinationDataID := os.Getenv("CUSTOM_DESTINATION_DATA_ID") body := datadogV2.CustomDestinationUpdateRequest{ Data: &datadogV2.CustomDestinationUpdateRequestDefinition{ Attributes: &datadogV2.CustomDestinationUpdateRequestAttributes{ Name: datadog.PtrString("Nginx logs (Updated)"), Query: datadog.PtrString("source:nginx"), Enabled: datadog.PtrBool(false), ForwardTags: datadog.PtrBool(false), ForwardTagsRestrictionListType: datadogV2.CUSTOMDESTINATIONATTRIBUTETAGSRESTRICTIONLISTTYPE_BLOCK_LIST.Ptr(), }, Type: datadogV2.CUSTOMDESTINATIONTYPE_CUSTOM_DESTINATION, Id: CustomDestinationDataID, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) resp, r, err := api.UpdateLogsCustomDestination(ctx, CustomDestinationDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.UpdateLogsCustomDestination`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsCustomDestinationsApi.UpdateLogsCustomDestination`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a custom destination returns "OK" response ``` // Update a custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; import com.datadog.api.client.v2.model.CustomDestinationAttributeTagsRestrictionListType; import com.datadog.api.client.v2.model.CustomDestinationResponse; import com.datadog.api.client.v2.model.CustomDestinationType; import com.datadog.api.client.v2.model.CustomDestinationUpdateRequest; import com.datadog.api.client.v2.model.CustomDestinationUpdateRequestAttributes; import com.datadog.api.client.v2.model.CustomDestinationUpdateRequestDefinition; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); // there is a valid "custom_destination" in the system String CUSTOM_DESTINATION_DATA_ID = System.getenv("CUSTOM_DESTINATION_DATA_ID"); CustomDestinationUpdateRequest body = new CustomDestinationUpdateRequest() .data( new CustomDestinationUpdateRequestDefinition() .attributes( new CustomDestinationUpdateRequestAttributes() .name("Nginx logs (Updated)") .query("source:nginx") .enabled(false) .forwardTags(false) .forwardTagsRestrictionListType( CustomDestinationAttributeTagsRestrictionListType.BLOCK_LIST)) .type(CustomDestinationType.CUSTOM_DESTINATION) .id(CUSTOM_DESTINATION_DATA_ID)); try { CustomDestinationResponse result = apiInstance.updateLogsCustomDestination(CUSTOM_DESTINATION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#updateLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a custom destination returns "OK" response ``` """ Update a custom destination returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi from datadog_api_client.v2.model.custom_destination_attribute_tags_restriction_list_type import ( CustomDestinationAttributeTagsRestrictionListType, ) from datadog_api_client.v2.model.custom_destination_type import CustomDestinationType from datadog_api_client.v2.model.custom_destination_update_request 
import CustomDestinationUpdateRequest from datadog_api_client.v2.model.custom_destination_update_request_attributes import ( CustomDestinationUpdateRequestAttributes, ) from datadog_api_client.v2.model.custom_destination_update_request_definition import ( CustomDestinationUpdateRequestDefinition, ) # there is a valid "custom_destination" in the system CUSTOM_DESTINATION_DATA_ID = environ["CUSTOM_DESTINATION_DATA_ID"] body = CustomDestinationUpdateRequest( data=CustomDestinationUpdateRequestDefinition( attributes=CustomDestinationUpdateRequestAttributes( name="Nginx logs (Updated)", query="source:nginx", enabled=False, forward_tags=False, forward_tags_restriction_list_type=CustomDestinationAttributeTagsRestrictionListType.BLOCK_LIST, ), type=CustomDestinationType.CUSTOM_DESTINATION, id=CUSTOM_DESTINATION_DATA_ID, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) response = api_instance.update_logs_custom_destination(custom_destination_id=CUSTOM_DESTINATION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a custom destination returns "OK" response ``` # Update a custom destination returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new # there is a valid "custom_destination" in the system CUSTOM_DESTINATION_DATA_ID = ENV["CUSTOM_DESTINATION_DATA_ID"] body = DatadogAPIClient::V2::CustomDestinationUpdateRequest.new({ data: DatadogAPIClient::V2::CustomDestinationUpdateRequestDefinition.new({ attributes: DatadogAPIClient::V2::CustomDestinationUpdateRequestAttributes.new({ name: "Nginx logs (Updated)", query: "source:nginx", enabled: false, forward_tags: false, forward_tags_restriction_list_type: DatadogAPIClient::V2::CustomDestinationAttributeTagsRestrictionListType::BLOCK_LIST, }), type: DatadogAPIClient::V2::CustomDestinationType::CUSTOM_DESTINATION, id: CUSTOM_DESTINATION_DATA_ID, }), }) p api_instance.update_logs_custom_destination(CUSTOM_DESTINATION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a custom destination returns "OK" response ``` // Update a custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; use datadog_api_client::datadogV2::model::CustomDestinationAttributeTagsRestrictionListType; use datadog_api_client::datadogV2::model::CustomDestinationType; use datadog_api_client::datadogV2::model::CustomDestinationUpdateRequest; use datadog_api_client::datadogV2::model::CustomDestinationUpdateRequestAttributes; use datadog_api_client::datadogV2::model::CustomDestinationUpdateRequestDefinition; #[tokio::main] async fn main() { // there is a valid "custom_destination" in the system let custom_destination_data_id = 
std::env::var("CUSTOM_DESTINATION_DATA_ID").unwrap(); let body = CustomDestinationUpdateRequest::new().data( CustomDestinationUpdateRequestDefinition::new( custom_destination_data_id.clone(), CustomDestinationType::CUSTOM_DESTINATION, ) .attributes( CustomDestinationUpdateRequestAttributes::new() .enabled(false) .forward_tags(false) .forward_tags_restriction_list_type( CustomDestinationAttributeTagsRestrictionListType::BLOCK_LIST, ) .name("Nginx logs (Updated)".to_string()) .query("source:nginx".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api .update_logs_custom_destination(custom_destination_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a custom destination returns "OK" response ``` /** * Update a custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); // there is a valid "custom_destination" in the system const CUSTOM_DESTINATION_DATA_ID = process.env .CUSTOM_DESTINATION_DATA_ID as string; const params: v2.LogsCustomDestinationsApiUpdateLogsCustomDestinationRequest = { body: { data: { attributes: { name: "Nginx logs (Updated)", query: "source:nginx", enabled: false, forwardTags: false, forwardTagsRestrictionListType: "BLOCK_LIST", }, type: "custom_destination", id: CUSTOM_DESTINATION_DATA_ID, }, }, customDestinationId: CUSTOM_DESTINATION_DATA_ID, }; apiInstance .updateLogsCustomDestination(params) .then((data: v2.CustomDestinationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a custom destination](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#delete-a-custom-destination) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#delete-a-custom-destination-v2) DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ap2.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.eu/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.ddog-gov.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us3.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id}https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations/{custom_destination_id} ### Overview Delete a specific custom destination in your organization. This endpoint requires the `logs_write_forwarding_rules` permission. ### Arguments #### Path Parameters Name Type Description custom_destination_id [_required_] string The ID of the custom destination. ### Response * [204](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#DeleteLogsCustomDestination-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#DeleteLogsCustomDestination-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#DeleteLogsCustomDestination-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#DeleteLogsCustomDestination-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-custom-destinations/#DeleteLogsCustomDestination-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) * [Example](https://docs.datadoghq.com/api/latest/logs-custom-destinations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-custom-destinations/?code-lang=typescript) ##### Delete a custom destination Copy ``` # Path parameters export custom_destination_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/custom-destinations/${custom_destination_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a custom destination ``` """ Delete a custom destination returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_custom_destinations_api import LogsCustomDestinationsApi # there is a valid "custom_destination" in the system CUSTOM_DESTINATION_DATA_ID = environ["CUSTOM_DESTINATION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsCustomDestinationsApi(api_client) api_instance.delete_logs_custom_destination( custom_destination_id=CUSTOM_DESTINATION_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a custom destination ``` # Delete a custom destination returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsCustomDestinationsAPI.new # there is a valid "custom_destination" in the system CUSTOM_DESTINATION_DATA_ID = ENV["CUSTOM_DESTINATION_DATA_ID"] api_instance.delete_logs_custom_destination(CUSTOM_DESTINATION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a custom destination ``` // Delete a custom destination returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "custom_destination" in the system CustomDestinationDataID := os.Getenv("CUSTOM_DESTINATION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsCustomDestinationsApi(apiClient) r, err := api.DeleteLogsCustomDestination(ctx, 
CustomDestinationDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsCustomDestinationsApi.DeleteLogsCustomDestination`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a custom destination ``` // Delete a custom destination returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsCustomDestinationsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsCustomDestinationsApi apiInstance = new LogsCustomDestinationsApi(defaultClient); // there is a valid "custom_destination" in the system String CUSTOM_DESTINATION_DATA_ID = System.getenv("CUSTOM_DESTINATION_DATA_ID"); try { apiInstance.deleteLogsCustomDestination(CUSTOM_DESTINATION_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling LogsCustomDestinationsApi#deleteLogsCustomDestination"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a custom destination ``` // Delete a custom destination returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_custom_destinations::LogsCustomDestinationsAPI; #[tokio::main] async fn main() { // there is a valid "custom_destination" in the system let custom_destination_data_id = std::env::var("CUSTOM_DESTINATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = LogsCustomDestinationsAPI::with_config(configuration); let resp = api .delete_logs_custom_destination(custom_destination_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a custom destination ``` /** * Delete a custom destination returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsCustomDestinationsApi(configuration); // there is a valid "custom_destination" in the system const CUSTOM_DESTINATION_DATA_ID = process.env .CUSTOM_DESTINATION_DATA_ID as string; const params: v2.LogsCustomDestinationsApiDeleteLogsCustomDestinationRequest = { customDestinationId: CUSTOM_DESTINATION_DATA_ID, }; 
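// A successful delete returns 204 No Content, so the resolved value has no response body to inspect.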
apiInstance .deleteLogsCustomDestination(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
--- # Source: https://docs.datadoghq.com/api/latest/logs-indexes/ # Logs Indexes Manage configuration of [log indexes](https://docs.datadoghq.com/logs/indexes/). ## [Get all indexes](https://docs.datadoghq.com/api/latest/logs-indexes/#get-all-indexes) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#get-all-indexes-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/indexeshttps://api.ap2.datadoghq.com/api/v1/logs/config/indexeshttps://api.datadoghq.eu/api/v1/logs/config/indexeshttps://api.ddog-gov.com/api/v1/logs/config/indexeshttps://api.datadoghq.com/api/v1/logs/config/indexeshttps://api.us3.datadoghq.com/api/v1/logs/config/indexeshttps://api.us5.datadoghq.com/api/v1/logs/config/indexes ### Overview The Index object describes the configuration of a log index. This endpoint returns an array of the `LogIndex` objects of your organization. This endpoint requires the `logs_read_config` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#ListLogIndexes-200-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#ListLogIndexes-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#ListLogIndexes-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Object with all Index configurations for a given organization. Field Type Description indexes [object] Array of Log index configurations. daily_limit int64 The number of log events you can send in this index per day before you are rate-limited.

daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object An exclusion filter is defined by a query, a sampling rule, and an active/inactive toggle. query string The default query is `*`, meaning all logs flowing into the index would be excluded. Scope the exclusion filter down to a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. is_rate_limited boolean A boolean stating whether the index is rate limited, meaning more logs than the daily limit have been sent. The rate limit is reset every day at 2pm UTC. name [_required_] string The name of the index. num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions. ``` { "indexes": [ { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "is_rate_limited": false, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript) ##### Get all indexes Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/indexes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all indexes ``` """ Get all indexes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsIndexesApi(api_client) response = api_instance.list_log_indexes() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all indexes ``` # Get all indexes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new p api_instance.list_log_indexes() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all indexes ``` // Get all indexes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.ListLogIndexes(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.ListLogIndexes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.ListLogIndexes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all indexes ``` // Get all indexes returns "OK" response 
import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsIndexListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); try { LogsIndexListResponse result = apiInstance.listLogIndexes(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#listLogIndexes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all indexes ``` // Get all indexes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.list_log_indexes().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all indexes ``` /** * Get all indexes returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); apiInstance .listLogIndexes() .then((data: v1.LogsIndexListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an index](https://docs.datadoghq.com/api/latest/logs-indexes/#get-an-index) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#get-an-index-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.ap2.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.eu/api/v1/logs/config/indexes/{name}https://api.ddog-gov.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us3.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us5.datadoghq.com/api/v1/logs/config/indexes/{name} ### Overview Get one log index from your organization. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. 
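As a quick illustration of the fields documented in the response model below, here is a minimal Python sketch built on the same `datadog_api_client` library used in the code examples for this endpoint. The index name `main` and the use of the model's `to_dict()` helper are illustrative assumptions; adapt them to your organization.

```
"""
Minimal sketch: fetch one log index and print its retention and rate-limit fields.
Assumes DD_API_KEY and DD_APP_KEY are exported and that an index named "main" exists.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsIndexesApi(api_client)
    index = api_instance.get_logs_index(name="main")  # "main" is an assumed index name

    # Fields from the response model below; to_dict() leaves unset optional fields absent.
    info = index.to_dict()
    print("index:", info.get("name"))
    print("standard tier retention (days):", info.get("num_retention_days"))
    print("flex tier retention (days):", info.get("num_flex_logs_retention_days"))
    print("currently rate limited:", info.get("is_rate_limited"))
```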
### Arguments #### Path Parameters Name Type Description name [_required_] string Name of the log index. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndex-200-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndex-403-v1) * [404](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndex-404-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndex-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Object describing a Datadog Log index. Field Type Description daily_limit int64 The number of log events you can send in this index per day before you are rate-limited. daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object An exclusion filter is defined by a query, a sampling rule, and an active/inactive toggle. query string The default query is `*`, meaning all logs flowing into the index would be excluded. Scope the exclusion filter down to a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. is_rate_limited boolean A boolean stating whether the index is rate limited, meaning more logs than the daily limit have been sent. The rate limit is reset every day at 2pm UTC. name [_required_] string The name of the index. num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions.
``` { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "is_rate_limited": false, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript) ##### Get an index Copy ``` # Path parameters export name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/indexes/${name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an index ``` """ Get an index returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsIndexesApi(api_client) response = api_instance.get_logs_index( name="name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an index ``` # Get an index returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new p api_instance.get_logs_index("name") ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an index ``` // Get an index returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.GetLogsIndex(ctx, "name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.GetLogsIndex`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.GetLogsIndex`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an index ``` // Get an index returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsIndex; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); try { LogsIndex result = apiInstance.getLogsIndex("name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#getLogsIndex"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an index ``` // Get an index returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.get_logs_index("name".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an index ``` /** * Get an index returns "OK" 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); const params: v1.LogsIndexesApiGetLogsIndexRequest = { name: "name", }; apiInstance .getLogsIndex(params) .then((data: v1.LogsIndex) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an index](https://docs.datadoghq.com/api/latest/logs-indexes/#create-an-index) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#create-an-index-v1) POST https://api.ap1.datadoghq.com/api/v1/logs/config/indexeshttps://api.ap2.datadoghq.com/api/v1/logs/config/indexeshttps://api.datadoghq.eu/api/v1/logs/config/indexeshttps://api.ddog-gov.com/api/v1/logs/config/indexeshttps://api.datadoghq.com/api/v1/logs/config/indexeshttps://api.us3.datadoghq.com/api/v1/logs/config/indexeshttps://api.us5.datadoghq.com/api/v1/logs/config/indexes ### Overview Creates a new index. Returns the Index object passed in the request body when the request is successful. This endpoint requires the `logs_modify_indexes` permission. ### Request #### Body Data (required) Object containing the new index. * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Field Type Description daily_limit int64 The number of log events you can send in this index per day before you are rate-limited. daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object Exclusion filter is defined by a query, a sampling rule, and a active/inactive toggle. query string Default query is `*`, meaning all logs flowing in the index would be excluded. Scope down exclusion filter to only a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. is_rate_limited boolean A boolean stating if the index is rate limited, meaning more logs than the daily limit have been sent. Rate limit is reset every-day at 2pm UTC. name [_required_] string The name of the index. 
num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions. ``` { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#CreateLogsIndex-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-indexes/#CreateLogsIndex-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#CreateLogsIndex-403-v1) * [422](https://docs.datadoghq.com/api/latest/logs-indexes/#CreateLogsIndex-422-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#CreateLogsIndex-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Object describing a Datadog Log index. Field Type Description daily_limit int64 The number of log events you can send in this index per day before you are rate-limited. daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object Exclusion filter is defined by a query, a sampling rule, and a active/inactive toggle. query string Default query is `*`, meaning all logs flowing in the index would be excluded. Scope down exclusion filter to only a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. is_rate_limited boolean A boolean stating if the index is rate limited, meaning more logs than the daily limit have been sent. Rate limit is reset every-day at 2pm UTC. name [_required_] string The name of the index. 
num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions. ``` { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "is_rate_limited": false, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when the max limit has been reached. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
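Note that this endpoint returns two different error shapes: Invalid Parameter (400) and Unprocessable Entity (422) wrap a single `error` object with `code`, `details`, and `message`, while Forbidden (403) and Too many requests (429) return a flat `errors` array, as in the example below. A hypothetical helper for normalizing both shapes when calling the raw HTTP API could look like the sketch below; it is not part of any Datadog client library.

```
def error_messages(payload):
    """Hypothetical helper: flatten both documented error shapes into a list of strings."""
    if "errors" in payload:  # Forbidden (403) / Too many requests (429)
        return list(payload["errors"])
    if "error" in payload:  # Invalid Parameter (400) / Unprocessable Entity (422)
        err = payload["error"]
        return [err.get("message") or err.get("code") or "unknown error"]
    return []


print(error_messages({"errors": ["Bad Request"]}))
print(error_messages({"error": {"code": "string", "details": [], "message": "string"}}))
```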
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript)

##### Create an index

```
# Curl command (use the API host for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/indexes" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "exclusion_filters": [
    {
      "filter": {
        "sample_rate": 1
      },
      "name": "payment"
    }
  ],
  "filter": {},
  "name": "main"
}
EOF
```

##### Create an index

```
"""
Create an index returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi
from datadog_api_client.v1.model.logs_daily_limit_reset import LogsDailyLimitReset
from datadog_api_client.v1.model.logs_exclusion import LogsExclusion
from datadog_api_client.v1.model.logs_exclusion_filter import LogsExclusionFilter
from datadog_api_client.v1.model.logs_filter import LogsFilter
from datadog_api_client.v1.model.logs_index import LogsIndex

body = LogsIndex(
    daily_limit=300000000,
    daily_limit_reset=LogsDailyLimitReset(
        reset_time="14:00",
        reset_utc_offset="+02:00",
    ),
    daily_limit_warning_threshold_percentage=70.0,
    exclusion_filters=[
        LogsExclusion(
            filter=LogsExclusionFilter(
                query="*",
                sample_rate=1.0,
            ),
            name="payment",
        ),
    ],
    filter=LogsFilter(
        query="source:python",
    ),
    name="main",
    num_flex_logs_retention_days=360,
    num_retention_days=15,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsIndexesApi(api_client)
    response = api_instance.create_logs_index(body=body)
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Create an index

```
# Create an index returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new

body = DatadogAPIClient::V1::LogsIndex.new({
  daily_limit: 300000000,
  daily_limit_reset: DatadogAPIClient::V1::LogsDailyLimitReset.new({
    reset_time: "14:00",
    reset_utc_offset: "+02:00",
  }),
  daily_limit_warning_threshold_percentage: 70,
  exclusion_filters: [
    DatadogAPIClient::V1::LogsExclusion.new({
      filter: DatadogAPIClient::V1::LogsExclusionFilter.new({
        query: "*",
        sample_rate: 1.0,
      }),
      name: "payment",
    }),
  ],
  filter: DatadogAPIClient::V1::LogsFilter.new({
    query: "source:python",
  }),
  name: "main",
  num_flex_logs_retention_days: 360,
  num_retention_days: 15,
})
p api_instance.create_logs_index(body)
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an index ``` // Create an index returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsIndex{ DailyLimit: datadog.PtrInt64(300000000), DailyLimitReset: &datadogV1.LogsDailyLimitReset{ ResetTime: datadog.PtrString("14:00"), ResetUtcOffset: datadog.PtrString("+02:00"), }, DailyLimitWarningThresholdPercentage: datadog.PtrFloat64(70), ExclusionFilters: []datadogV1.LogsExclusion{ { Filter: &datadogV1.LogsExclusionFilter{ Query: datadog.PtrString("*"), SampleRate: 1.0, }, Name: "payment", }, }, Filter: datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, Name: "main", NumFlexLogsRetentionDays: datadog.PtrInt64(360), NumRetentionDays: datadog.PtrInt64(15), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.CreateLogsIndex(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.CreateLogsIndex`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.CreateLogsIndex`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an index ``` // Create an index returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsDailyLimitReset; import com.datadog.api.client.v1.model.LogsExclusion; import com.datadog.api.client.v1.model.LogsExclusionFilter; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsIndex; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); LogsIndex body = new LogsIndex() .dailyLimit(300000000L) .dailyLimitReset(new LogsDailyLimitReset().resetTime("14:00").resetUtcOffset("+02:00")) .dailyLimitWarningThresholdPercentage(70.0) .exclusionFilters( Collections.singletonList( new LogsExclusion() .filter(new LogsExclusionFilter().query("*").sampleRate(1.0)) .name("payment"))) .filter(new LogsFilter().query("source:python")) .name("main") .numFlexLogsRetentionDays(360L) .numRetentionDays(15L); try { LogsIndex result = apiInstance.createLogsIndex(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#createLogsIndex"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); 
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an index ``` // Create an index returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; use datadog_api_client::datadogV1::model::LogsDailyLimitReset; use datadog_api_client::datadogV1::model::LogsExclusion; use datadog_api_client::datadogV1::model::LogsExclusionFilter; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsIndex; #[tokio::main] async fn main() { let body = LogsIndex::new( LogsFilter::new().query("source:python".to_string()), "main".to_string(), ) .daily_limit(300000000) .daily_limit_reset( LogsDailyLimitReset::new() .reset_time("14:00".to_string()) .reset_utc_offset("+02:00".to_string()), ) .daily_limit_warning_threshold_percentage(70.0 as f64) .exclusion_filters(vec![LogsExclusion::new("payment".to_string()) .filter(LogsExclusionFilter::new(1.0).query("*".to_string()))]) .num_flex_logs_retention_days(360) .num_retention_days(15); let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.create_logs_index(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an index ``` /** * Create an index returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); const params: v1.LogsIndexesApiCreateLogsIndexRequest = { body: { dailyLimit: 300000000, dailyLimitReset: { resetTime: "14:00", resetUtcOffset: "+02:00", }, dailyLimitWarningThresholdPercentage: 70, exclusionFilters: [ { filter: { query: "*", sampleRate: 1.0, }, name: "payment", }, ], filter: { query: "source:python", }, name: "main", numFlexLogsRetentionDays: 360, numRetentionDays: 15, }, }; apiInstance .createLogsIndex(params) .then((data: v1.LogsIndex) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an index](https://docs.datadoghq.com/api/latest/logs-indexes/#update-an-index) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#update-an-index-v1) PUT https://api.ap1.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.ap2.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.eu/api/v1/logs/config/indexes/{name}https://api.ddog-gov.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us3.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us5.datadoghq.com/api/v1/logs/config/indexes/{name} ### Overview Update an index as identified by its name. Returns the Index object passed in the request body when the request is successful. Using the `PUT` method updates your index’s configuration by **replacing** your current configuration with the new one sent to your Datadog organization. ### Arguments #### Path Parameters Name Type Description name [_required_] string Name of the log index. ### Request #### Body Data (required) Object containing the new `LogsIndexUpdateRequest`. * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Field Type Description daily_limit int64 The number of log events you can send in this index per day before you are rate-limited. daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. disable_daily_limit boolean If true, sets the `daily_limit` value to null and the index is not limited on a daily basis (any specified `daily_limit` value in the request is ignored). If false or omitted, the index's current `daily_limit` is maintained. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object Exclusion filter is defined by a query, a sampling rule, and a active/inactive toggle. query string Default query is `*`, meaning all logs flowing in the index would be excluded. Scope down exclusion filter to only a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. 
num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. **Note** : Changing this value affects all logs already in this index. It may also affect billing. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions. **Note** : Changing this value affects all logs already in this index. It may also affect billing. ``` { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndex-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndex-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndex-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndex-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Object describing a Datadog Log index. Field Type Description daily_limit int64 The number of log events you can send in this index per day before you are rate-limited. daily_limit_reset object Object containing options to override the default daily limit reset time. reset_time string String in `HH:00` format representing the time of day the daily limit should be reset. The hours must be between 00 and 23 (inclusive). reset_utc_offset string String in `(-|+)HH:00` format representing the UTC offset to apply to the given reset time. The hours must be between -12 and +14 (inclusive). daily_limit_warning_threshold_percentage double A percentage threshold of the daily quota at which a Datadog warning event is generated. exclusion_filters [object] An array of exclusion objects. The logs are tested against the query of each filter, following the order of the array. Only the first matching active exclusion matters, others (if any) are ignored. filter object Exclusion filter is defined by a query, a sampling rule, and a active/inactive toggle. query string Default query is `*`, meaning all logs flowing in the index would be excluded. Scope down exclusion filter to only a subset of logs with a log query. sample_rate [_required_] double Sample rate to apply to logs going through this exclusion filter, a value of 1.0 excludes all logs matching the query. is_enabled boolean Whether or not the exclusion filter is active. name [_required_] string Name of the index exclusion filter. filter [_required_] object Filter for logs. query string The filter query. is_rate_limited boolean A boolean stating if the index is rate limited, meaning more logs than the daily limit have been sent. 
Rate limit is reset every-day at 2pm UTC. name [_required_] string The name of the index. num_flex_logs_retention_days int64 The total number of days logs are stored in Standard and Flex Tier before being deleted from the index. If Standard Tier is enabled on this index, logs are first retained in Standard Tier for the number of days specified through `num_retention_days`, and then stored in Flex Tier until the number of days specified in `num_flex_logs_retention_days` is reached. The available values depend on retention plans specified in your organization's contract/subscriptions. num_retention_days int64 The number of days logs are stored in Standard Tier before aging into the Flex Tier or being deleted from the index. The available values depend on retention plans specified in your organization's contract/subscriptions. ``` { "daily_limit": 300000000, "daily_limit_reset": { "reset_time": "14:00", "reset_utc_offset": "+02:00" }, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [ { "filter": { "query": "*", "sample_rate": 1 }, "is_enabled": false, "name": "payment" } ], "filter": { "query": "source:python" }, "is_rate_limited": false, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15 } ``` Copy Invalid Parameter Error * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. 
Field Type Description
error object Error returned by the Logs API
code string Code identifying the error
details [object] Additional error details
message string Error message

```
{ "error": { "code": "string", "details": [], "message": "string" } }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript)

##### Update an index

```
# Path parameters
export name="CHANGE_ME"

# Curl command (use the API host for your Datadog site)
curl -X PUT "https://api.datadoghq.com/api/v1/logs/config/indexes/${name}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "exclusion_filters": [
    {
      "filter": {
        "sample_rate": 1
      },
      "name": "payment"
    }
  ],
  "filter": {}
}
EOF
```

##### Update an index

```
"""
Update an index returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi
from datadog_api_client.v1.model.logs_daily_limit_reset import LogsDailyLimitReset
from datadog_api_client.v1.model.logs_exclusion import LogsExclusion
from datadog_api_client.v1.model.logs_exclusion_filter import LogsExclusionFilter
from datadog_api_client.v1.model.logs_filter import LogsFilter
from datadog_api_client.v1.model.logs_index_update_request import LogsIndexUpdateRequest

body = LogsIndexUpdateRequest(
    daily_limit=300000000,
    daily_limit_reset=LogsDailyLimitReset(
        reset_time="14:00",
        reset_utc_offset="+02:00",
    ),
    daily_limit_warning_threshold_percentage=70.0,
    disable_daily_limit=False,
    exclusion_filters=[
        LogsExclusion(
            filter=LogsExclusionFilter(
                query="*",
                sample_rate=1.0,
            ),
            name="payment",
        ),
    ],
    filter=LogsFilter(
        query="source:python",
    ),
    num_flex_logs_retention_days=360,
    num_retention_days=15,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsIndexesApi(api_client)
    response = api_instance.update_logs_index(name="name", body=body)
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Update an index

```
# Update an index returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new body = DatadogAPIClient::V1::LogsIndexUpdateRequest.new({ daily_limit: 300000000, daily_limit_reset: DatadogAPIClient::V1::LogsDailyLimitReset.new({ reset_time: "14:00", reset_utc_offset: "+02:00", }), daily_limit_warning_threshold_percentage: 70, disable_daily_limit: false, exclusion_filters: [ DatadogAPIClient::V1::LogsExclusion.new({ filter:
DatadogAPIClient::V1::LogsExclusionFilter.new({ query: "*", sample_rate: 1.0, }), name: "payment", }), ], filter: DatadogAPIClient::V1::LogsFilter.new({ query: "source:python", }), num_flex_logs_retention_days: 360, num_retention_days: 15, }) p api_instance.update_logs_index("name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an index ``` // Update an index returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsIndexUpdateRequest{ DailyLimit: datadog.PtrInt64(300000000), DailyLimitReset: &datadogV1.LogsDailyLimitReset{ ResetTime: datadog.PtrString("14:00"), ResetUtcOffset: datadog.PtrString("+02:00"), }, DailyLimitWarningThresholdPercentage: datadog.PtrFloat64(70), DisableDailyLimit: datadog.PtrBool(false), ExclusionFilters: []datadogV1.LogsExclusion{ { Filter: &datadogV1.LogsExclusionFilter{ Query: datadog.PtrString("*"), SampleRate: 1.0, }, Name: "payment", }, }, Filter: datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, NumFlexLogsRetentionDays: datadog.PtrInt64(360), NumRetentionDays: datadog.PtrInt64(15), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.UpdateLogsIndex(ctx, "name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.UpdateLogsIndex`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.UpdateLogsIndex`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an index ``` // Update an index returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsDailyLimitReset; import com.datadog.api.client.v1.model.LogsExclusion; import com.datadog.api.client.v1.model.LogsExclusionFilter; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsIndex; import com.datadog.api.client.v1.model.LogsIndexUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); LogsIndexUpdateRequest body = new LogsIndexUpdateRequest() .dailyLimit(300000000L) .dailyLimitReset(new LogsDailyLimitReset().resetTime("14:00").resetUtcOffset("+02:00")) .dailyLimitWarningThresholdPercentage(70.0) .disableDailyLimit(false) .exclusionFilters( Collections.singletonList( new 
LogsExclusion() .filter(new LogsExclusionFilter().query("*").sampleRate(1.0)) .name("payment"))) .filter(new LogsFilter().query("source:python")) .numFlexLogsRetentionDays(360L) .numRetentionDays(15L); try { LogsIndex result = apiInstance.updateLogsIndex("name", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#updateLogsIndex"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an index ``` // Update an index returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; use datadog_api_client::datadogV1::model::LogsDailyLimitReset; use datadog_api_client::datadogV1::model::LogsExclusion; use datadog_api_client::datadogV1::model::LogsExclusionFilter; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsIndexUpdateRequest; #[tokio::main] async fn main() { let body = LogsIndexUpdateRequest::new(LogsFilter::new().query("source:python".to_string())) .daily_limit(300000000) .daily_limit_reset( LogsDailyLimitReset::new() .reset_time("14:00".to_string()) .reset_utc_offset("+02:00".to_string()), ) .daily_limit_warning_threshold_percentage(70.0 as f64) .disable_daily_limit(false) .exclusion_filters(vec![LogsExclusion::new("payment".to_string()) .filter(LogsExclusionFilter::new(1.0).query("*".to_string()))]) .num_flex_logs_retention_days(360) .num_retention_days(15); let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.update_logs_index("name".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an index ``` /** * Update an index returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); const params: v1.LogsIndexesApiUpdateLogsIndexRequest = { body: { dailyLimit: 300000000, dailyLimitReset: { resetTime: "14:00", resetUtcOffset: "+02:00", }, dailyLimitWarningThresholdPercentage: 70, disableDailyLimit: false, exclusionFilters: [ { filter: { query: "*", sampleRate: 1.0, }, name: "payment", }, ], filter: { query: "source:python", }, numFlexLogsRetentionDays: 360, numRetentionDays: 15, }, name: "name", }; apiInstance .updateLogsIndex(params) .then((data: v1.LogsIndex) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an index](https://docs.datadoghq.com/api/latest/logs-indexes/#delete-an-index) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#delete-an-index-v1) DELETE https://api.ap1.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.ap2.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.eu/api/v1/logs/config/indexes/{name}https://api.ddog-gov.com/api/v1/logs/config/indexes/{name}https://api.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us3.datadoghq.com/api/v1/logs/config/indexes/{name}https://api.us5.datadoghq.com/api/v1/logs/config/indexes/{name} ### Overview Delete an existing index from your organization. Index deletions are permanent and cannot be reverted. You cannot recreate an index with the same name as deleted ones. This endpoint requires the `logs_modify_indexes` permission. ### Arguments #### Path Parameters Name Type Description name [_required_] string Name of the log index. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#DeleteLogsIndex-200-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#DeleteLogsIndex-403-v1) * [404](https://docs.datadoghq.com/api/latest/logs-indexes/#DeleteLogsIndex-404-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#DeleteLogsIndex-429-v1) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
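Before moving on to the code examples: because index deletions are permanent and a deleted index's name cannot be reused, it can be worth verifying that the index exists, and asking for confirmation, before calling the delete endpoint. The sketch below uses the Python client methods shown in the code examples that follow (`get_logs_index`, `delete_logs_index`); the confirmation step and the `obsolete-index` name are illustrative, not part of the client.

```
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi

INDEX_NAME = "obsolete-index"  # hypothetical index name

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = LogsIndexesApi(api_client)
    api.get_logs_index(name=INDEX_NAME)  # raises an exception if the index does not exist
    # Deletion cannot be reverted and the name cannot be recreated, so confirm first.
    if input(f"Delete index '{INDEX_NAME}' permanently? [y/N] ").strip().lower() == "y":
        api.delete_logs_index(name=INDEX_NAME)
```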
```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript)

##### Delete an index

```
# Path parameters
export name="CHANGE_ME"

# Curl command (use the API host for your Datadog site)
curl -X DELETE "https://api.datadoghq.com/api/v1/logs/config/indexes/${name}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an index

```
"""
Delete an index returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsIndexesApi(api_client)
    api_instance.delete_logs_index(
        name="name",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete an index

```
# Delete an index returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new
p api_instance.delete_logs_index("name")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete an index

```
// Delete an index returns "OK" response
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewLogsIndexesApi(apiClient)
	r, err := api.DeleteLogsIndex(ctx, "name")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.DeleteLogsIndex`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete an index

```
// Delete an index returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import
com.datadog.api.client.v1.api.LogsIndexesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); try { apiInstance.deleteLogsIndex("name"); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#deleteLogsIndex"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an index ``` // Delete an index returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.delete_logs_index("name".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an index ``` /** * Delete an index returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); const params: v1.LogsIndexesApiDeleteLogsIndexRequest = { name: "name", }; apiInstance .deleteLogsIndex(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get indexes order](https://docs.datadoghq.com/api/latest/logs-indexes/#get-indexes-order) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#get-indexes-order-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/index-orderhttps://api.ap2.datadoghq.com/api/v1/logs/config/index-orderhttps://api.datadoghq.eu/api/v1/logs/config/index-orderhttps://api.ddog-gov.com/api/v1/logs/config/index-orderhttps://api.datadoghq.com/api/v1/logs/config/index-orderhttps://api.us3.datadoghq.com/api/v1/logs/config/index-orderhttps://api.us5.datadoghq.com/api/v1/logs/config/index-order ### Overview Get the current order of your log indexes. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. 
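Since the order array is what routes each log to the first matching index, a common pattern is to read the current order, move one index, and write the full list back with the update endpoint documented further down this page. The sketch below uses the Python client methods shown in the code examples on this page (`get_logs_index_order`, `update_logs_index_order`); the reordering logic itself and the `"payments"` index name are illustrative.

```
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi
from datadog_api_client.v1.model.logs_indexes_order import LogsIndexesOrder

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = LogsIndexesApi(api_client)

    order = api.get_logs_index_order()  # e.g. index_names == ["main", "payments", "web"]
    names = list(order.index_names)

    # Move "payments" to the front so its filter is evaluated first (illustrative).
    names.insert(0, names.pop(names.index("payments")))

    # The request body carries the full ordered list of index names.
    updated = api.update_logs_index_order(body=LogsIndexesOrder(index_names=names))
    print(updated.index_names)
```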
### Response

* [200](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndexOrder-200-v1)
* [403](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndexOrder-403-v1)
* [429](https://docs.datadoghq.com/api/latest/logs-indexes/#GetLogsIndexOrder-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/logs-indexes/)
* [Example](https://docs.datadoghq.com/api/latest/logs-indexes/)

Object containing the ordered list of log index names.

Field Type Description
index_names [_required_] [string] Array of strings identifying by their name(s) the index(es) of your organization. Logs are tested against the query filter of each index one by one, following the order of the array. Logs are eventually stored in the first matching index.

```
{ "index_names": [ "main", "payments", "web" ] }
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/logs-indexes/)
* [Example](https://docs.datadoghq.com/api/latest/logs-indexes/)

Error response object.

Field Type Description
errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/logs-indexes/)
* [Example](https://docs.datadoghq.com/api/latest/logs-indexes/)

Error response object.

Field Type Description
errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript)

##### Get indexes order

```
# Curl command (use the API host for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v1/logs/config/index-order" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get indexes order

```
"""
Get indexes order returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsIndexesApi(api_client)
    response = api_instance.get_logs_index_order()
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get indexes order

```
# Get indexes order returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new
p api_instance.get_logs_index_order()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get indexes order ``` // Get indexes order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.GetLogsIndexOrder(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.GetLogsIndexOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.GetLogsIndexOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get indexes order ``` // Get indexes order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsIndexesOrder; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); try { LogsIndexesOrder result = apiInstance.getLogsIndexOrder(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#getLogsIndexOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get indexes order ``` // Get indexes order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.get_logs_index_order().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get indexes order ``` /** * Get indexes order returns "OK" response */ import { client, v1 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); apiInstance .getLogsIndexOrder() .then((data: v1.LogsIndexesOrder) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update indexes order](https://docs.datadoghq.com/api/latest/logs-indexes/#update-indexes-order) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-indexes/#update-indexes-order-v1) PUT https://api.ap1.datadoghq.com/api/v1/logs/config/index-orderhttps://api.ap2.datadoghq.com/api/v1/logs/config/index-orderhttps://api.datadoghq.eu/api/v1/logs/config/index-orderhttps://api.ddog-gov.com/api/v1/logs/config/index-orderhttps://api.datadoghq.com/api/v1/logs/config/index-orderhttps://api.us3.datadoghq.com/api/v1/logs/config/index-orderhttps://api.us5.datadoghq.com/api/v1/logs/config/index-order ### Overview This endpoint updates the index order of your organization. It returns the index order object passed in the request body when the request is successful. This endpoint requires the `logs_modify_indexes` permission. ### Request #### Body Data (required) Object containing the new ordered list of index names * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Expand All Field Type Description index_names [_required_] [string] Array of strings identifying by their name(s) the index(es) of your organization. Logs are tested against the query filter of each index one by one, following the order of the array. Logs are eventually stored in the first matching index. ``` { "index_names": [ "main", "payments", "web" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndexOrder-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndexOrder-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndexOrder-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-indexes/#UpdateLogsIndexOrder-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Object containing the ordered list of log index names. Expand All Field Type Description index_names [_required_] [string] Array of strings identifying by their name(s) the index(es) of your organization. Logs are tested against the query filter of each index one by one, following the order of the array. Logs are eventually stored in the first matching index. ``` { "index_names": [ "main", "payments", "web" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Response returned by the Logs API when errors occur. 
Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-indexes/) * [Example](https://docs.datadoghq.com/api/latest/logs-indexes/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-indexes/?code-lang=typescript) ##### Update indexes order Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/index-order" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "index_names": [ "main", "payments", "web" ] } EOF ``` ##### Update indexes order ``` """ Update indexes order returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_indexes_api import LogsIndexesApi from datadog_api_client.v1.model.logs_indexes_order import LogsIndexesOrder body = LogsIndexesOrder( index_names=[ "main", "payments", "web", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsIndexesApi(api_client) response = api_instance.update_logs_index_order(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update indexes order ``` # Update indexes order returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsIndexesAPI.new body = DatadogAPIClient::V1::LogsIndexesOrder.new({ index_names: [ "main", "payments", "web", ], }) p api_instance.update_logs_index_order(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" 
``` ##### Update indexes order ``` // Update indexes order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsIndexesOrder{ IndexNames: []string{ "main", "payments", "web", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsIndexesApi(apiClient) resp, r, err := api.UpdateLogsIndexOrder(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsIndexesApi.UpdateLogsIndexOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsIndexesApi.UpdateLogsIndexOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update indexes order ``` // Update indexes order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsIndexesApi; import com.datadog.api.client.v1.model.LogsIndexesOrder; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsIndexesApi apiInstance = new LogsIndexesApi(defaultClient); LogsIndexesOrder body = new LogsIndexesOrder().indexNames(Arrays.asList("main", "payments", "web")); try { LogsIndexesOrder result = apiInstance.updateLogsIndexOrder(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsIndexesApi#updateLogsIndexOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update indexes order ``` // Update indexes order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_indexes::LogsIndexesAPI; use datadog_api_client::datadogV1::model::LogsIndexesOrder; #[tokio::main] async fn main() { let body = LogsIndexesOrder::new(vec![ "main".to_string(), "payments".to_string(), "web".to_string(), ]); let configuration = datadog::Configuration::new(); let api = LogsIndexesAPI::with_config(configuration); let resp = api.update_logs_index_order(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update indexes order ``` /** * Update indexes order returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsIndexesApi(configuration); const params: v1.LogsIndexesApiUpdateLogsIndexOrderRequest = { body: { indexNames: ["main", "payments", "web"], }, }; apiInstance .updateLogsIndexOrder(params) .then((data: v1.LogsIndexesOrder) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/logs-metrics/ # Logs Metrics Manage configuration of [log-based metrics](https://app.datadoghq.com/logs/pipelines/generate-metrics) for your organization. ## [Get all log-based metrics](https://docs.datadoghq.com/api/latest/logs-metrics/#get-all-log-based-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-metrics/#get-all-log-based-metrics-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/metricshttps://api.ap2.datadoghq.com/api/v2/logs/config/metricshttps://api.datadoghq.eu/api/v2/logs/config/metricshttps://api.ddog-gov.com/api/v2/logs/config/metricshttps://api.datadoghq.com/api/v2/logs/config/metricshttps://api.us3.datadoghq.com/api/v2/logs/config/metricshttps://api.us5.datadoghq.com/api/v2/logs/config/metrics ### Overview Get the list of configured log-based metrics with their definitions. This endpoint requires the `logs_read_config` permission.
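Before the full response model and per-language examples below, here is a minimal sketch of how the list response might be consumed, assuming the `datadog-api-client` Python package is installed and `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` are exported in the environment. It prints one summary line per configured log-based metric; the field names follow the response model documented below, and `to_dict()` is the client's generic model-to-dictionary helper.

```
# Minimal sketch: list log-based metrics and print each metric's name and aggregation type.
# Assumes DD_API_KEY, DD_APP_KEY, and DD_SITE are set in the environment.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsMetricsApi(api_client)
    response = api_instance.list_logs_metrics()
    # `data` is a list of log-based metric objects (see the model below);
    # converting to plain dicts keeps optional-field access simple.
    for metric in response.to_dict().get("data", []):
        compute = metric.get("attributes", {}).get("compute", {})
        print(f"{metric.get('id')}: {compute.get('aggregation_type')}")
```

The same traversal applies to the other client libraries shown later in this section, since every language's response mirrors the `data[].attributes` structure in the model.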
### Response * [200](https://docs.datadoghq.com/api/latest/logs-metrics/#ListLogsMetrics-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-metrics/#ListLogsMetrics-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-metrics/#ListLogsMetrics-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) All the available log-based metric objects. Field Type Description data [object] A list of log-based metric objects. attributes object The object describing a Datadog log-based metric. compute object The compute rule to compute the log-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the log-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. group_by [object] The rules for the group by. path string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the log-based metric. type enum The type of the resource. The value should always be logs_metrics. Allowed enum values: `logs_metrics` default: `logs_metrics` ``` { "data": [ { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ] }, "id": "logs.page.load.count", "type": "logs_metrics" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=typescript) ##### Get all log-based metrics ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/logs/config/metrics" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all log-based metrics ``` """ Get all log-based metrics returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) response = api_instance.list_logs_metrics() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all log-based metrics ``` # Get all log-based metrics returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new p api_instance.list_logs_metrics() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all log-based metrics ``` // Get all log-based metrics returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) resp, r, err := api.ListLogsMetrics(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.ListLogsMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsMetricsApi.ListLogsMetrics`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run
"main.go" ``` ##### Get all log-based metrics ``` // Get all log-based metrics returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; import com.datadog.api.client.v2.model.LogsMetricsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); try { LogsMetricsResponse result = apiInstance.listLogsMetrics(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#listLogsMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all log-based metrics ``` // Get all log-based metrics returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api.list_logs_metrics().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all log-based metrics ``` /** * Get all log-based metrics returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); apiInstance .listLogsMetrics() .then((data: v2.LogsMetricsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a log-based metric](https://docs.datadoghq.com/api/latest/logs-metrics/#create-a-log-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-metrics/#create-a-log-based-metric-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/config/metricshttps://api.ap2.datadoghq.com/api/v2/logs/config/metricshttps://api.datadoghq.eu/api/v2/logs/config/metricshttps://api.ddog-gov.com/api/v2/logs/config/metricshttps://api.datadoghq.com/api/v2/logs/config/metricshttps://api.us3.datadoghq.com/api/v2/logs/config/metricshttps://api.us5.datadoghq.com/api/v2/logs/config/metrics ### Overview Create a metric based on your ingested logs in your organization. Returns the log-based metric object from the request body when the request is successful. This endpoint requires the `logs_generate_metrics` permission. ### Request #### Body Data (required) The definition of the new log-based metric. * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) Field Type Description data [_required_] object The new log-based metric properties. attributes [_required_] object The object describing the Datadog log-based metric to create. compute [_required_] object The compute rule to compute the log-based metric. aggregation_type [_required_] enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the log-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id [_required_] string The name of the log-based metric. type [_required_] enum The type of the resource. The value should always be logs_metrics. 
Allowed enum values: `logs_metrics` default: `logs_metrics` ``` { "data": { "id": "ExampleLogsMetric", "type": "logs_metrics", "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-metrics/#CreateLogsMetric-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-metrics/#CreateLogsMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-metrics/#CreateLogsMetric-403-v2) * [409](https://docs.datadoghq.com/api/latest/logs-metrics/#CreateLogsMetric-409-v2) * [429](https://docs.datadoghq.com/api/latest/logs-metrics/#CreateLogsMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) The log-based metric object. Field Type Description data object The log-based metric properties. attributes object The object describing a Datadog log-based metric. compute object The compute rule to compute the log-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the log-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. group_by [object] The rules for the group by. path string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the log-based metric. type enum The type of the resource. The value should always be logs_metrics. Allowed enum values: `logs_metrics` default: `logs_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ] }, "id": "logs.page.load.count", "type": "logs_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=typescript) ##### Create a log-based metric returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/metrics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "ExampleLogsMetric", "type": "logs_metrics", "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" } } } } EOF ``` ##### Create a log-based metric returns "OK" response ``` // Create a log-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsMetricCreateRequest{ Data: datadogV2.LogsMetricCreateData{ Id: "ExampleLogsMetric", Type: datadogV2.LOGSMETRICTYPE_LOGS_METRICS, Attributes: datadogV2.LogsMetricCreateAttributes{ Compute: datadogV2.LogsMetricCompute{ AggregationType: datadogV2.LOGSMETRICCOMPUTEAGGREGATIONTYPE_DISTRIBUTION, IncludePercentiles: datadog.PtrBool(true), Path: datadog.PtrString("@duration"), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) resp, r, err := api.CreateLogsMetric(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.CreateLogsMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsMetricsApi.CreateLogsMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a log-based metric returns "OK" response ``` // Create a log-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; import com.datadog.api.client.v2.model.LogsMetricCompute; import com.datadog.api.client.v2.model.LogsMetricComputeAggregationType; import com.datadog.api.client.v2.model.LogsMetricCreateAttributes; import com.datadog.api.client.v2.model.LogsMetricCreateData; import com.datadog.api.client.v2.model.LogsMetricCreateRequest; import com.datadog.api.client.v2.model.LogsMetricResponse; import 
com.datadog.api.client.v2.model.LogsMetricType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); LogsMetricCreateRequest body = new LogsMetricCreateRequest() .data( new LogsMetricCreateData() .id("ExampleLogsMetric") .type(LogsMetricType.LOGS_METRICS) .attributes( new LogsMetricCreateAttributes() .compute( new LogsMetricCompute() .aggregationType(LogsMetricComputeAggregationType.DISTRIBUTION) .includePercentiles(true) .path("@duration")))); try { LogsMetricResponse result = apiInstance.createLogsMetric(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#createLogsMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a log-based metric returns "OK" response ``` """ Create a log-based metric returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi from datadog_api_client.v2.model.logs_metric_compute import LogsMetricCompute from datadog_api_client.v2.model.logs_metric_compute_aggregation_type import LogsMetricComputeAggregationType from datadog_api_client.v2.model.logs_metric_create_attributes import LogsMetricCreateAttributes from datadog_api_client.v2.model.logs_metric_create_data import LogsMetricCreateData from datadog_api_client.v2.model.logs_metric_create_request import LogsMetricCreateRequest from datadog_api_client.v2.model.logs_metric_type import LogsMetricType body = LogsMetricCreateRequest( data=LogsMetricCreateData( id="ExampleLogsMetric", type=LogsMetricType.LOGS_METRICS, attributes=LogsMetricCreateAttributes( compute=LogsMetricCompute( aggregation_type=LogsMetricComputeAggregationType.DISTRIBUTION, include_percentiles=True, path="@duration", ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) response = api_instance.create_logs_metric(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a log-based metric returns "OK" response ``` # Create a log-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new body = DatadogAPIClient::V2::LogsMetricCreateRequest.new({ data: DatadogAPIClient::V2::LogsMetricCreateData.new({ id: "ExampleLogsMetric", type: DatadogAPIClient::V2::LogsMetricType::LOGS_METRICS, attributes: DatadogAPIClient::V2::LogsMetricCreateAttributes.new({ compute: DatadogAPIClient::V2::LogsMetricCompute.new({ aggregation_type: 
DatadogAPIClient::V2::LogsMetricComputeAggregationType::DISTRIBUTION, include_percentiles: true, path: "@duration", }), }), }), }) p api_instance.create_logs_metric(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a log-based metric returns "OK" response ``` // Create a log-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; use datadog_api_client::datadogV2::model::LogsMetricCompute; use datadog_api_client::datadogV2::model::LogsMetricComputeAggregationType; use datadog_api_client::datadogV2::model::LogsMetricCreateAttributes; use datadog_api_client::datadogV2::model::LogsMetricCreateData; use datadog_api_client::datadogV2::model::LogsMetricCreateRequest; use datadog_api_client::datadogV2::model::LogsMetricType; #[tokio::main] async fn main() { let body = LogsMetricCreateRequest::new(LogsMetricCreateData::new( LogsMetricCreateAttributes::new( LogsMetricCompute::new(LogsMetricComputeAggregationType::DISTRIBUTION) .include_percentiles(true) .path("@duration".to_string()), ), "ExampleLogsMetric".to_string(), LogsMetricType::LOGS_METRICS, )); let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api.create_logs_metric(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a log-based metric returns "OK" response ``` /** * Create a log-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); const params: v2.LogsMetricsApiCreateLogsMetricRequest = { body: { data: { id: "ExampleLogsMetric", type: "logs_metrics", attributes: { compute: { aggregationType: "distribution", includePercentiles: true, path: "@duration", }, }, }, }, }; apiInstance .createLogsMetric(params) .then((data: v2.LogsMetricResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a log-based metric](https://docs.datadoghq.com/api/latest/logs-metrics/#get-a-log-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-metrics/#get-a-log-based-metric-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/logs/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/logs/config/metrics/{metric_id} ### Overview Get a specific log-based metric from your organization. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the log-based metric. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-metrics/#GetLogsMetric-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-metrics/#GetLogsMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-metrics/#GetLogsMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-metrics/#GetLogsMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) The log-based metric object. Field Type Description data object The log-based metric properties. attributes object The object describing a Datadog log-based metric. compute object The compute rule to compute the log-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the log-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. group_by [object] The rules for the group by. path string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the log-based metric. type enum The type of the resource. The value should always be logs_metrics. 
Allowed enum values: `logs_metrics` default: `logs_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ] }, "id": "logs.page.load.count", "type": "logs_metrics" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=typescript) ##### Get a log-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a log-based metric ``` """ Get a log-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ID = environ["LOGS_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) response = api_instance.get_logs_metric( metric_id=LOGS_METRIC_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a log-based metric ``` # Get a log-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ID = ENV["LOGS_METRIC_DATA_ID"] p api_instance.get_logs_metric(LOGS_METRIC_DATA_ID) ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a log-based metric ``` // Get a log-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "logs_metric" in the system LogsMetricDataID := os.Getenv("LOGS_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) resp, r, err := api.GetLogsMetric(ctx, LogsMetricDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.GetLogsMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsMetricsApi.GetLogsMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a log-based metric ``` // Get a log-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; import com.datadog.api.client.v2.model.LogsMetricResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); // there is a valid "logs_metric" in the system String LOGS_METRIC_DATA_ID = System.getenv("LOGS_METRIC_DATA_ID"); try { LogsMetricResponse result = apiInstance.getLogsMetric(LOGS_METRIC_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#getLogsMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a log-based metric ``` // Get a log-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; #[tokio::main] async fn main() { // there is a valid "logs_metric" in the system let logs_metric_data_id = std::env::var("LOGS_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api.get_logs_metric(logs_metric_data_id.clone()).await; if let Ok(value) = 
resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a log-based metric ``` /** * Get a log-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); // there is a valid "logs_metric" in the system const LOGS_METRIC_DATA_ID = process.env.LOGS_METRIC_DATA_ID as string; const params: v2.LogsMetricsApiGetLogsMetricRequest = { metricId: LOGS_METRIC_DATA_ID, }; apiInstance .getLogsMetric(params) .then((data: v2.LogsMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a log-based metric](https://docs.datadoghq.com/api/latest/logs-metrics/#update-a-log-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-metrics/#update-a-log-based-metric-v2) PATCH https://api.ap1.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/logs/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/logs/config/metrics/{metric_id} ### Overview Update a specific log-based metric from your organization. Returns the log-based metric object from the request body when the request is successful. This endpoint requires the `logs_generate_metrics` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the log-based metric. ### Request #### Body Data (required) New definition of the log-based metric. * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) Field Type Description data [_required_] object The new log-based metric properties. attributes [_required_] object The log-based metric properties that will be updated. compute object The compute rule to compute the log-based metric. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. 
By default, the path attribute is used as the tag name. type [_required_] enum The type of the resource. The value should always be logs_metrics. Allowed enum values: `logs_metrics` default: `logs_metrics` ##### Update a log-based metric returns "OK" response ``` { "data": { "type": "logs_metrics", "attributes": { "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]-updated" } } } } ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` { "data": { "type": "logs_metrics", "attributes": { "compute": { "include_percentiles": false } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-metrics/#UpdateLogsMetric-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-metrics/#UpdateLogsMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-metrics/#UpdateLogsMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-metrics/#UpdateLogsMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-metrics/#UpdateLogsMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) The log-based metric object. Field Type Description data object The log-based metric properties. attributes object The object describing a Datadog log-based metric. compute object The compute rule to compute the log-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the log-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The log-based metric filter. Logs matching this filter will be aggregated in this metric. query string The search query - following the log search syntax. group_by [object] The rules for the group by. path string The path to the value the log-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the log-based metric. type enum The type of the resource. The value should always be logs_metrics. Allowed enum values: `logs_metrics` default: `logs_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ] }, "id": "logs.page.load.count", "type": "logs_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=typescript) ##### Update a log-based metric returns "OK" response Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "logs_metrics", "attributes": { "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]-updated" } } } } EOF ``` ##### Update a log-based metric with include_percentiles field returns "OK" response Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "logs_metrics", "attributes": { "compute": { "include_percentiles": false } } } } EOF ``` ##### Update a log-based metric returns "OK" response ``` // Update a log-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "logs_metric" in the system LogsMetricDataID := os.Getenv("LOGS_METRIC_DATA_ID") body := datadogV2.LogsMetricUpdateRequest{ Data: datadogV2.LogsMetricUpdateData{ Type: datadogV2.LOGSMETRICTYPE_LOGS_METRICS, Attributes: datadogV2.LogsMetricUpdateAttributes{ Filter: &datadogV2.LogsMetricFilter{ Query: datadog.PtrString("service:web* AND @http.status_code:[200 TO 299]-updated"), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) resp, r, err := api.UpdateLogsMetric(ctx, LogsMetricDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.UpdateLogsMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
fmt.Fprintf(os.Stdout, "Response from `LogsMetricsApi.UpdateLogsMetric`:\n%s\n", responseContent) } ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` // Update a log-based metric with include_percentiles field returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "logs_metric_percentile" in the system LogsMetricPercentileDataID := os.Getenv("LOGS_METRIC_PERCENTILE_DATA_ID") body := datadogV2.LogsMetricUpdateRequest{ Data: datadogV2.LogsMetricUpdateData{ Type: datadogV2.LOGSMETRICTYPE_LOGS_METRICS, Attributes: datadogV2.LogsMetricUpdateAttributes{ Compute: &datadogV2.LogsMetricUpdateCompute{ IncludePercentiles: datadog.PtrBool(false), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) resp, r, err := api.UpdateLogsMetric(ctx, LogsMetricPercentileDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.UpdateLogsMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsMetricsApi.UpdateLogsMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a log-based metric returns "OK" response ``` // Update a log-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; import com.datadog.api.client.v2.model.LogsMetricFilter; import com.datadog.api.client.v2.model.LogsMetricResponse; import com.datadog.api.client.v2.model.LogsMetricType; import com.datadog.api.client.v2.model.LogsMetricUpdateAttributes; import com.datadog.api.client.v2.model.LogsMetricUpdateData; import com.datadog.api.client.v2.model.LogsMetricUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); // there is a valid "logs_metric" in the system String LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = System.getenv("LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"); String LOGS_METRIC_DATA_ID = System.getenv("LOGS_METRIC_DATA_ID"); LogsMetricUpdateRequest body = new LogsMetricUpdateRequest() .data( new LogsMetricUpdateData() .type(LogsMetricType.LOGS_METRICS) .attributes( new LogsMetricUpdateAttributes() .filter( new LogsMetricFilter() .query( "service:web* AND @http.status_code:[200 TO" + " 299]-updated")))); try { LogsMetricResponse result = apiInstance.updateLogsMetric(LOGS_METRIC_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#updateLogsMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); 
e.printStackTrace(); } } } ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` // Update a log-based metric with include_percentiles field returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; import com.datadog.api.client.v2.model.LogsMetricResponse; import com.datadog.api.client.v2.model.LogsMetricType; import com.datadog.api.client.v2.model.LogsMetricUpdateAttributes; import com.datadog.api.client.v2.model.LogsMetricUpdateCompute; import com.datadog.api.client.v2.model.LogsMetricUpdateData; import com.datadog.api.client.v2.model.LogsMetricUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); // there is a valid "logs_metric_percentile" in the system String LOGS_METRIC_PERCENTILE_DATA_ID = System.getenv("LOGS_METRIC_PERCENTILE_DATA_ID"); LogsMetricUpdateRequest body = new LogsMetricUpdateRequest() .data( new LogsMetricUpdateData() .type(LogsMetricType.LOGS_METRICS) .attributes( new LogsMetricUpdateAttributes() .compute(new LogsMetricUpdateCompute().includePercentiles(false)))); try { LogsMetricResponse result = apiInstance.updateLogsMetric(LOGS_METRIC_PERCENTILE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#updateLogsMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a log-based metric returns "OK" response ``` """ Update a log-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi from datadog_api_client.v2.model.logs_metric_filter import LogsMetricFilter from datadog_api_client.v2.model.logs_metric_type import LogsMetricType from datadog_api_client.v2.model.logs_metric_update_attributes import LogsMetricUpdateAttributes from datadog_api_client.v2.model.logs_metric_update_data import LogsMetricUpdateData from datadog_api_client.v2.model.logs_metric_update_request import LogsMetricUpdateRequest # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = environ["LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"] LOGS_METRIC_DATA_ID = environ["LOGS_METRIC_DATA_ID"] body = LogsMetricUpdateRequest( data=LogsMetricUpdateData( type=LogsMetricType.LOGS_METRICS, attributes=LogsMetricUpdateAttributes( filter=LogsMetricFilter( query="service:web* AND @http.status_code:[200 TO 299]-updated", ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) response = api_instance.update_logs_metric(metric_id=LOGS_METRIC_DATA_ID, body=body) print(response) ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` """ Update a 
log-based metric with include_percentiles field returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi from datadog_api_client.v2.model.logs_metric_type import LogsMetricType from datadog_api_client.v2.model.logs_metric_update_attributes import LogsMetricUpdateAttributes from datadog_api_client.v2.model.logs_metric_update_compute import LogsMetricUpdateCompute from datadog_api_client.v2.model.logs_metric_update_data import LogsMetricUpdateData from datadog_api_client.v2.model.logs_metric_update_request import LogsMetricUpdateRequest # there is a valid "logs_metric_percentile" in the system LOGS_METRIC_PERCENTILE_DATA_ID = environ["LOGS_METRIC_PERCENTILE_DATA_ID"] body = LogsMetricUpdateRequest( data=LogsMetricUpdateData( type=LogsMetricType.LOGS_METRICS, attributes=LogsMetricUpdateAttributes( compute=LogsMetricUpdateCompute( include_percentiles=False, ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) response = api_instance.update_logs_metric(metric_id=LOGS_METRIC_PERCENTILE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a log-based metric returns "OK" response ``` # Update a log-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = ENV["LOGS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"] LOGS_METRIC_DATA_ID = ENV["LOGS_METRIC_DATA_ID"] body = DatadogAPIClient::V2::LogsMetricUpdateRequest.new({ data: DatadogAPIClient::V2::LogsMetricUpdateData.new({ type: DatadogAPIClient::V2::LogsMetricType::LOGS_METRICS, attributes: DatadogAPIClient::V2::LogsMetricUpdateAttributes.new({ filter: DatadogAPIClient::V2::LogsMetricFilter.new({ query: "service:web* AND @http.status_code:[200 TO 299]-updated", }), }), }), }) p api_instance.update_logs_metric(LOGS_METRIC_DATA_ID, body) ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` # Update a log-based metric with include_percentiles field returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new # there is a valid "logs_metric_percentile" in the system LOGS_METRIC_PERCENTILE_DATA_ID = ENV["LOGS_METRIC_PERCENTILE_DATA_ID"] body = DatadogAPIClient::V2::LogsMetricUpdateRequest.new({ data: DatadogAPIClient::V2::LogsMetricUpdateData.new({ type: DatadogAPIClient::V2::LogsMetricType::LOGS_METRICS, attributes: DatadogAPIClient::V2::LogsMetricUpdateAttributes.new({ compute: DatadogAPIClient::V2::LogsMetricUpdateCompute.new({ include_percentiles: false, }), }), }), }) p api_instance.update_logs_metric(LOGS_METRIC_PERCENTILE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a log-based metric returns "OK" response ``` // Update a log-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; use datadog_api_client::datadogV2::model::LogsMetricFilter; use datadog_api_client::datadogV2::model::LogsMetricType; use datadog_api_client::datadogV2::model::LogsMetricUpdateAttributes; use datadog_api_client::datadogV2::model::LogsMetricUpdateData; use datadog_api_client::datadogV2::model::LogsMetricUpdateRequest; #[tokio::main] async fn main() { // there is a valid "logs_metric" in the system let logs_metric_data_id = std::env::var("LOGS_METRIC_DATA_ID").unwrap(); let body = LogsMetricUpdateRequest::new(LogsMetricUpdateData::new( LogsMetricUpdateAttributes::new().filter( LogsMetricFilter::new() .query("service:web* AND @http.status_code:[200 TO 299]-updated".to_string()), ), LogsMetricType::LOGS_METRICS, )); let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api .update_logs_metric(logs_metric_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` // Update a log-based metric with include_percentiles field returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; use datadog_api_client::datadogV2::model::LogsMetricType; use datadog_api_client::datadogV2::model::LogsMetricUpdateAttributes; use datadog_api_client::datadogV2::model::LogsMetricUpdateCompute; use datadog_api_client::datadogV2::model::LogsMetricUpdateData; use datadog_api_client::datadogV2::model::LogsMetricUpdateRequest; #[tokio::main] async fn main() { // there is a valid "logs_metric_percentile" in the system let logs_metric_percentile_data_id = std::env::var("LOGS_METRIC_PERCENTILE_DATA_ID").unwrap(); let body = LogsMetricUpdateRequest::new(LogsMetricUpdateData::new( LogsMetricUpdateAttributes::new() .compute(LogsMetricUpdateCompute::new().include_percentiles(false)), LogsMetricType::LOGS_METRICS, )); let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api .update_logs_metric(logs_metric_percentile_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a log-based metric returns "OK" response ``` /** * Update a log-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); // there is a valid "logs_metric" in the system const LOGS_METRIC_DATA_ID = process.env.LOGS_METRIC_DATA_ID as string; const params: v2.LogsMetricsApiUpdateLogsMetricRequest = { body: { data: { type: "logs_metrics", attributes: { filter: { query: "service:web* AND @http.status_code:[200 TO 299]-updated", }, }, }, }, metricId: 
LOGS_METRIC_DATA_ID, }; apiInstance .updateLogsMetric(params) .then((data: v2.LogsMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a log-based metric with include_percentiles field returns "OK" response ``` /** * Update a log-based metric with include_percentiles field returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); // there is a valid "logs_metric_percentile" in the system const LOGS_METRIC_PERCENTILE_DATA_ID = process.env .LOGS_METRIC_PERCENTILE_DATA_ID as string; const params: v2.LogsMetricsApiUpdateLogsMetricRequest = { body: { data: { type: "logs_metrics", attributes: { compute: { includePercentiles: false, }, }, }, }, metricId: LOGS_METRIC_PERCENTILE_DATA_ID, }; apiInstance .updateLogsMetric(params) .then((data: v2.LogsMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a log-based metric](https://docs.datadoghq.com/api/latest/logs-metrics/#delete-a-log-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-metrics/#delete-a-log-based-metric-v2) DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/logs/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/logs/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/logs/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/logs/config/metrics/{metric_id} ### Overview Delete a specific log-based metric from your organization. This endpoint requires the `logs_generate_metrics` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the log-based metric. ### Response * [204](https://docs.datadoghq.com/api/latest/logs-metrics/#DeleteLogsMetric-204-v2) * [403](https://docs.datadoghq.com/api/latest/logs-metrics/#DeleteLogsMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-metrics/#DeleteLogsMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-metrics/#DeleteLogsMetric-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-metrics/) * [Example](https://docs.datadoghq.com/api/latest/logs-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-metrics/?code-lang=typescript) ##### Delete a log-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/metrics/${metric_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a log-based metric ``` """ Delete a log-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_metrics_api import LogsMetricsApi # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ID = environ["LOGS_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsMetricsApi(api_client) api_instance.delete_logs_metric( metric_id=LOGS_METRIC_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a log-based metric ``` # Delete a log-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsMetricsAPI.new # there is a valid "logs_metric" in the system LOGS_METRIC_DATA_ID = ENV["LOGS_METRIC_DATA_ID"] api_instance.delete_logs_metric(LOGS_METRIC_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a log-based metric ``` // Delete a log-based metric returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "logs_metric" in the system LogsMetricDataID := os.Getenv("LOGS_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsMetricsApi(apiClient) r, err := api.DeleteLogsMetric(ctx, LogsMetricDataID) if err != 
nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsMetricsApi.DeleteLogsMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a log-based metric ``` // Delete a log-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsMetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsMetricsApi apiInstance = new LogsMetricsApi(defaultClient); // there is a valid "logs_metric" in the system String LOGS_METRIC_DATA_ID = System.getenv("LOGS_METRIC_DATA_ID"); try { apiInstance.deleteLogsMetric(LOGS_METRIC_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling LogsMetricsApi#deleteLogsMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a log-based metric ``` // Delete a log-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_metrics::LogsMetricsAPI; #[tokio::main] async fn main() { // there is a valid "logs_metric" in the system let logs_metric_data_id = std::env::var("LOGS_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = LogsMetricsAPI::with_config(configuration); let resp = api.delete_logs_metric(logs_metric_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a log-based metric ``` /** * Delete a log-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsMetricsApi(configuration); // there is a valid "logs_metric" in the system const LOGS_METRIC_DATA_ID = process.env.LOGS_METRIC_DATA_ID as string; const params: v2.LogsMetricsApiDeleteLogsMetricRequest = { metricId: LOGS_METRIC_DATA_ID, }; apiInstance .deleteLogsMetric(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=5e9e70e3-641d-43b8-a0e4-9a19b0bbbadf&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d26e35da-5fae-450c-983a-4246a23ddd6a&pt=Logs%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=5e9e70e3-641d-43b8-a0e4-9a19b0bbbadf&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d26e35da-5fae-450c-983a-4246a23ddd6a&pt=Logs%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) Feedback ## Was this page helpful? Yes 🎉 No 👎 Next ![](https://survey-images.hotjar.com/surveys/logo/90f40352a7464c849f5ce82ccd0e758d) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=839fbc45-1e3c-482f-a54f-4e463c9c7a6c&bo=2&sid=c4ba0d50f0bf11f09afde9c0412ba014&vid=c4ba67e0f0bf11f099fcdb9b6c7a6b3e&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Logs%20Metrics&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-metrics%2F&r=<=1167&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=874204) --- # Source: https://docs.datadoghq.com/api/latest/logs-pipelines/ # Logs Pipelines Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying. * See the [pipelines configuration page](https://app.datadoghq.com/logs/pipelines) for a list of the pipelines and processors currently configured in web UI. * Additional API-related information about processors can be found in the [processors documentation](https://docs.datadoghq.com/logs/log_configuration/processors/?tab=api#lookup-processor). * For more information about Pipelines, see the [pipeline documentation](https://docs.datadoghq.com/logs/log_configuration/pipelines). **Notes:** **Grok parsing rules may effect JSON output and require returned data to be configured before using in a request.** For example, if you are using the data returned from a request for another request body, and have a parsing rule that uses a regex pattern like `\s` for spaces, you will need to configure all escaped spaces as `%{space}` to use in the body data. 
## [Get pipeline order](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-pipeline-order) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-pipeline-order-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.ap2.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.datadoghq.eu/api/v1/logs/config/pipeline-orderhttps://api.ddog-gov.com/api/v1/logs/config/pipeline-orderhttps://api.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.us3.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.us5.datadoghq.com/api/v1/logs/config/pipeline-order ### Overview Get the current order of your pipelines. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipelineOrder-200-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipelineOrder-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipelineOrder-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Object containing the ordered list of pipeline IDs. Expand All Field Type Description pipeline_ids [_required_] [string] Ordered Array of `` strings, the order of pipeline IDs in the array define the overall Pipelines order for Datadog. ``` { "pipeline_ids": [ "tags", "org_ids", "products" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript) ##### Get pipeline order Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipeline-order" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get pipeline order ``` """ Get pipeline order returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.get_logs_pipeline_order() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get pipeline order ``` # Get pipeline order returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new p api_instance.get_logs_pipeline_order() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get pipeline order ``` // Get pipeline order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.GetLogsPipelineOrder(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.GetLogsPipelineOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.GetLogsPipelineOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go 
run "main.go" ``` ##### Get pipeline order ``` // Get pipeline order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsPipelinesOrder; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); try { LogsPipelinesOrder result = apiInstance.getLogsPipelineOrder(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#getLogsPipelineOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get pipeline order ``` // Get pipeline order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.get_logs_pipeline_order().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get pipeline order ``` /** * Get pipeline order returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); apiInstance .getLogsPipelineOrder() .then((data: v1.LogsPipelinesOrder) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update pipeline order](https://docs.datadoghq.com/api/latest/logs-pipelines/#update-pipeline-order) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#update-pipeline-order-v1) PUT https://api.ap1.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.ap2.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.datadoghq.eu/api/v1/logs/config/pipeline-orderhttps://api.ddog-gov.com/api/v1/logs/config/pipeline-orderhttps://api.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.us3.datadoghq.com/api/v1/logs/config/pipeline-orderhttps://api.us5.datadoghq.com/api/v1/logs/config/pipeline-order ### Overview Update the order of your pipelines. Since logs are processed sequentially, reordering a pipeline may change the structure and content of the data processed by other pipelines and their processors. **Note** : Using the `PUT` method updates your pipeline order by replacing your current order with the new one sent to your Datadog organization. This endpoint requires the `logs_write_pipelines` permission. ### Request #### Body Data (required) Object containing the new ordered list of pipeline IDs. * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Expand All Field Type Description pipeline_ids [_required_] [string] Ordered Array of `` strings, the order of pipeline IDs in the array define the overall Pipelines order for Datadog. ``` { "pipeline_ids": [ "tags", "org_ids", "products" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipelineOrder-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipelineOrder-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipelineOrder-403-v1) * [422](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipelineOrder-422-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipelineOrder-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Object containing the ordered list of pipeline IDs. Expand All Field Type Description pipeline_ids [_required_] [string] Ordered Array of `` strings, the order of pipeline IDs in the array define the overall Pipelines order for Datadog. ``` { "pipeline_ids": [ "tags", "org_ids", "products" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript) ##### Update pipeline order Copy ``` # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipeline-order" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "pipeline_ids": [ "tags", "org_ids", "products" ] } EOF ``` ##### Update pipeline order ``` """ Update pipeline order returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi from datadog_api_client.v1.model.logs_pipelines_order import LogsPipelinesOrder body = LogsPipelinesOrder( pipeline_ids=[ "tags", "org_ids", "products", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.update_logs_pipeline_order(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update pipeline order ``` # Update pipeline order returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new body = DatadogAPIClient::V1::LogsPipelinesOrder.new({ pipeline_ids: [ "tags", "org_ids", "products", ], }) p api_instance.update_logs_pipeline_order(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update pipeline order ``` // Update pipeline order returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsPipelinesOrder{ PipelineIds: []string{ "tags", "org_ids", "products", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.UpdateLogsPipelineOrder(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.UpdateLogsPipelineOrder`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.UpdateLogsPipelineOrder`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update pipeline order ``` // Update pipeline order returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsPipelinesOrder; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); LogsPipelinesOrder body = new LogsPipelinesOrder().pipelineIds(Arrays.asList("tags", "org_ids", "products")); try { LogsPipelinesOrder result = apiInstance.updateLogsPipelineOrder(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#updateLogsPipelineOrder"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update pipeline order ``` // Update pipeline order returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; use datadog_api_client::datadogV1::model::LogsPipelinesOrder; #[tokio::main] async fn main() { let body = LogsPipelinesOrder::new(vec![ "tags".to_string(), "org_ids".to_string(), "products".to_string(), ]); let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.update_logs_pipeline_order(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update pipeline order ``` /** * Update pipeline order returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiUpdateLogsPipelineOrderRequest = { body: { pipelineIds: ["tags", "org_ids", "products"], }, }; apiInstance .updateLogsPipelineOrder(params) .then((data: v1.LogsPipelinesOrder) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all pipelines](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-all-pipelines) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-all-pipelines-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/pipelineshttps://api.ap2.datadoghq.com/api/v1/logs/config/pipelineshttps://api.datadoghq.eu/api/v1/logs/config/pipelineshttps://api.ddog-gov.com/api/v1/logs/config/pipelineshttps://api.datadoghq.com/api/v1/logs/config/pipelineshttps://api.us3.datadoghq.com/api/v1/logs/config/pipelineshttps://api.us5.datadoghq.com/api/v1/logs/config/pipelines ### Overview Get all pipelines from your organization. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#ListLogsPipelines-200-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#ListLogsPipelines-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#ListLogsPipelines-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Array of all log pipeline objects configured for the organization. Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. 
name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. * `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. **Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. * Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. 
**Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `msg` type [_required_] enum Type of logs message remapper. Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. 
**Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. * Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. target [_required_] string Name of the target attribute which value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` needs to be space split in the formula as it can also be contained in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, it replaces all missing attributes of expression by `0`, `false` skip the operation if an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of object), it is replaced by an empty string or the entire operation is skipped depending on your selection. 
* If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If true, it replaces all missing attributes of `template` by an empty string. If `false` (default), skips the operation for missing attributes. name string Name of the processor. target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. 
For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. 
source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. 
name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. ``` { "description": "string", "filter": { "query": "source:python" }, "id": "string", "is_enabled": false, "is_read_only": false, "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar", "support_rules": "rule_name_1 foo\nrule_name_2 bar" }, "is_enabled": false, "name": "string", "samples": [], "source": "message", "type": "grok-parser" } ], "tags": [], "type": "pipeline" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript)

##### Get all pipelines

```
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get all pipelines

```
"""
Get all pipelines returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsPipelinesApi(api_client)
    response = api_instance.list_logs_pipelines()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get all pipelines

```
# Get all pipelines returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new
p api_instance.list_logs_pipelines()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all pipelines

```
// Get all pipelines returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewLogsPipelinesApi(apiClient)
	resp, r, err := api.ListLogsPipelines(ctx)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.ListLogsPipelines`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.ListLogsPipelines`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Get all pipelines

```
// Get all pipelines returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsPipelinesApi;
import com.datadog.api.client.v1.model.LogsPipeline;
import java.util.List;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);

    try {
      List<LogsPipeline> result = apiInstance.listLogsPipelines();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling LogsPipelinesApi#listLogsPipelines");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get all pipelines

```
// Get all pipelines returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = LogsPipelinesAPI::with_config(configuration);
    let resp = api.list_logs_pipelines().await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get all pipelines

```
/**
 * Get all pipelines returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.LogsPipelinesApi(configuration);

apiInstance
  .listLogsPipelines()
  .then((data: v1.LogsPipeline[]) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Create a pipeline](https://docs.datadoghq.com/api/latest/logs-pipelines/#create-a-pipeline)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#create-a-pipeline-v1)

POST https://api.ap1.datadoghq.com/api/v1/logs/config/pipelines https://api.ap2.datadoghq.com/api/v1/logs/config/pipelines https://api.datadoghq.eu/api/v1/logs/config/pipelines https://api.ddog-gov.com/api/v1/logs/config/pipelines https://api.datadoghq.com/api/v1/logs/config/pipelines https://api.us3.datadoghq.com/api/v1/logs/config/pipelines https://api.us5.datadoghq.com/api/v1/logs/config/pipelines

### Overview

Create a pipeline in your organization.
This endpoint requires the `logs_write_pipelines` permission. ### Request #### Body Data (required) Definition of the new pipeline. * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name [_required_] string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. * `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. **Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. 
* Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. **Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `msg` type [_required_] enum Type of logs message remapper. Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. 
If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. * Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. target [_required_] string Name of the target attribute which value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. 
By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` needs to be space split in the formula as it can also be contained in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, it replaces all missing attributes of expression by `0`, `false` skip the operation if an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of object), it is replaced by an empty string or the entire operation is skipped depending on your selection. * If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If true, it replaces all missing attributes of `template` by an empty string. If `false` (default), skips the operation for missing attributes. name string Name of the processor. target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. 
Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. 
Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. 
mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. 
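For a concrete starting point before the full request examples below, here is a minimal sketch of calling this endpoint with curl. It is illustrative only and uses just fields documented in the body model above: the pipeline name and processor name are placeholders, the filter query reuses the `source:python` value from the examples, and the single processor is a `message-remapper` pointed at its documented default source `msg`. Replace `api.datadoghq.com` with the endpoint for your Datadog site.

```
# Minimal illustrative sketch (not a complete reference): the pipeline and processor
# names are placeholders, and api.datadoghq.com should be swapped for your site's endpoint.
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "name": "Example pipeline",
  "filter": {
    "query": "source:python"
  },
  "processors": [
    {
      "type": "message-remapper",
      "name": "Define msg as the official message of the log",
      "is_enabled": true,
      "sources": ["msg"]
    }
  ],
  "tags": []
}
EOF
```

If the call succeeds, the created pipeline (including its generated `id`) is returned in the shape described by the 200 response model below.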
##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppend", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_to_array", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps" } } ], "tags": [] } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppendNoPreserve", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_and_remove_source", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps", "preserve_source": false } } ], "tags": [] } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppendPreserve", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_and_keep_source", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps", "preserve_source": true } } ], "tags": [] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#CreateLogsPipeline-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-pipelines/#CreateLogsPipeline-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#CreateLogsPipeline-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#CreateLogsPipeline-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying. **Note** : These endpoints are only available for admin users. Make sure to use an application key created by an admin. Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name [_required_] string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. 
* `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. **Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. * Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. **Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `msg` type [_required_] enum Type of logs message remapper. 
Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. 
* Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. target [_required_] string Name of the target attribute which value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` needs to be space split in the formula as it can also be contained in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, it replaces all missing attributes of expression by `0`, `false` skip the operation if an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of object), it is replaced by an empty string or the entire operation is skipped depending on your selection. * If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If true, it replaces all missing attributes of `template` by an empty string. If `false` (default), skips the operation for missing attributes. name string Name of the processor. 
target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. 
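As a hedged sketch of the Reference Table variant above, such a processor could be added to a pipeline as follows. The `service_owners` table name and the `service_id`/`owner` attributes are hypothetical, and the request shape follows the curl examples later on this page.

```
# Minimal sketch: lookup processor backed by a Reference Table
# "service_owners", "service_id", and "owner" are illustrative names
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "filter": { "query": "source:python" },
  "name": "examplePipelineReferenceTableLookup",
  "processors": [
    {
      "type": "lookup-processor",
      "is_enabled": true,
      "name": "map_service_to_owner",
      "source": "service_id",
      "target": "owner",
      "lookup_enrichment_table": "service_owners"
    }
  ],
  "tags": []
}
EOF
```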
name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. 
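The code examples later on this page only exercise the `append` operation; the `length` and `select` operations use the same request shape. The sketch below is illustrative only: the pipeline name, the `httpRequest.headers` array, and the `Referrer` filter value are made-up examples.

```
# Minimal sketch: array processor with a "length" and a "select" operation
# Attribute paths and filter values are illustrative
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "filter": { "query": "source:python" },
  "name": "examplePipelineArrayLengthSelect",
  "processors": [
    {
      "type": "array-processor",
      "is_enabled": true,
      "name": "count_headers",
      "operation": { "type": "length", "source": "httpRequest.headers", "target": "headersCount" }
    },
    {
      "type": "array-processor",
      "is_enabled": true,
      "name": "extract_referrer",
      "operation": {
        "type": "select",
        "source": "httpRequest.headers",
        "filter": "name:Referrer",
        "value_to_extract": "value",
        "target": "referrer"
      }
    }
  ],
  "tags": []
}
EOF
```

The `select` operation here looks for the array element whose `name` is `Referrer` and copies its `value` key into the `referrer` attribute, matching the `filter` and `value_to_extract` semantics described above.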
Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. 
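A decoder processor such as the one described above could be configured as in this rough sketch, assuming a base64-encoded UTF-8 payload; the `payload.encoded` and `payload.decoded` attribute names and the pipeline name are illustrative.

```
# Minimal sketch: decoder processor turning a base64-encoded attribute back into text
# "payload.encoded" and "payload.decoded" are illustrative attribute names
curl -X POST "https://api.datadoghq.com/api/v1/logs/config/pipelines" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "filter": { "query": "source:python" },
  "name": "examplePipelineDecoder",
  "processors": [
    {
      "type": "decoder-processor",
      "is_enabled": true,
      "name": "decode_payload",
      "source": "payload.encoded",
      "target": "payload.decoded",
      "binary_to_text_encoding": "base64",
      "input_representation": "utf_8"
    }
  ],
  "tags": []
}
EOF
```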
targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. ``` { "description": "string", "filter": { "query": "source:python" }, "id": "string", "is_enabled": false, "is_read_only": false, "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar", "support_rules": "rule_name_1 foo\nrule_name_2 bar" }, "is_enabled": false, "name": "string", "samples": [], "source": "message", "type": "grok-parser" } ], "tags": [], "type": "pipeline" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript) ##### Create a pipeline with Array Processor Append Operation returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipelines" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppend", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_to_array", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps" } } ], "tags": [] } EOF ``` ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipelines" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppendNoPreserve", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_and_remove_source", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps", "preserve_source": false } } ], "tags": [] } EOF ``` ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipelines" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "query": "source:python" }, "name": "testPipelineArrayAppendPreserve", "processors": [ { "type": "array-processor", "is_enabled": true, "name": "append_ip_and_keep_source", "operation": { "type": "append", "source": "network.client.ip", "target": "sourceIps", "preserve_source": true } } ], "tags": [] } EOF ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` // Create a pipeline with Array Processor Append Operation returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsPipeline{ Filter: 
&datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, Name: "testPipelineArrayAppend", Processors: []datadogV1.LogsProcessor{ datadogV1.LogsProcessor{ LogsArrayProcessor: &datadogV1.LogsArrayProcessor{ Type: datadogV1.LOGSARRAYPROCESSORTYPE_ARRAY_PROCESSOR, IsEnabled: datadog.PtrBool(true), Name: datadog.PtrString("append_ip_to_array"), Operation: datadogV1.LogsArrayProcessorOperation{ LogsArrayProcessorOperationAppend: &datadogV1.LogsArrayProcessorOperationAppend{ Type: datadogV1.LOGSARRAYPROCESSOROPERATIONAPPENDTYPE_APPEND, Source: "network.client.ip", Target: "sourceIps", }}, }}, }, Tags: []string{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.CreateLogsPipeline(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.CreateLogsPipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.CreateLogsPipeline`:\n%s\n", responseContent) } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` // Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsPipeline{ Filter: &datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, Name: "testPipelineArrayAppendNoPreserve", Processors: []datadogV1.LogsProcessor{ datadogV1.LogsProcessor{ LogsArrayProcessor: &datadogV1.LogsArrayProcessor{ Type: datadogV1.LOGSARRAYPROCESSORTYPE_ARRAY_PROCESSOR, IsEnabled: datadog.PtrBool(true), Name: datadog.PtrString("append_ip_and_remove_source"), Operation: datadogV1.LogsArrayProcessorOperation{ LogsArrayProcessorOperationAppend: &datadogV1.LogsArrayProcessorOperationAppend{ Type: datadogV1.LOGSARRAYPROCESSOROPERATIONAPPENDTYPE_APPEND, Source: "network.client.ip", Target: "sourceIps", PreserveSource: datadog.PtrBool(false), }}, }}, }, Tags: []string{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.CreateLogsPipeline(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.CreateLogsPipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.CreateLogsPipeline`:\n%s\n", responseContent) } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` // Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsPipeline{ Filter: &datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, Name: "testPipelineArrayAppendPreserve", Processors: []datadogV1.LogsProcessor{ 
datadogV1.LogsProcessor{ LogsArrayProcessor: &datadogV1.LogsArrayProcessor{ Type: datadogV1.LOGSARRAYPROCESSORTYPE_ARRAY_PROCESSOR, IsEnabled: datadog.PtrBool(true), Name: datadog.PtrString("append_ip_and_keep_source"), Operation: datadogV1.LogsArrayProcessorOperation{ LogsArrayProcessorOperationAppend: &datadogV1.LogsArrayProcessorOperationAppend{ Type: datadogV1.LOGSARRAYPROCESSOROPERATIONAPPENDTYPE_APPEND, Source: "network.client.ip", Target: "sourceIps", PreserveSource: datadog.PtrBool(true), }}, }}, }, Tags: []string{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.CreateLogsPipeline(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.CreateLogsPipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.CreateLogsPipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` // Create a pipeline with Array Processor Append Operation returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsArrayProcessor; import com.datadog.api.client.v1.model.LogsArrayProcessorOperation; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppend; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppendType; import com.datadog.api.client.v1.model.LogsArrayProcessorType; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsPipeline; import com.datadog.api.client.v1.model.LogsProcessor; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); LogsPipeline body = new LogsPipeline() .filter(new LogsFilter().query("source:python")) .name("testPipelineArrayAppend") .processors( Collections.singletonList( new LogsProcessor( new LogsArrayProcessor() .type(LogsArrayProcessorType.ARRAY_PROCESSOR) .isEnabled(true) .name("append_ip_to_array") .operation( new LogsArrayProcessorOperation( new LogsArrayProcessorOperationAppend() .type(LogsArrayProcessorOperationAppendType.APPEND) .source("network.client.ip") .target("sourceIps")))))); try { LogsPipeline result = apiInstance.createLogsPipeline(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#createLogsPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` // Create a pipeline with Array 
Processor Append Operation with preserve_source false returns "OK" // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsArrayProcessor; import com.datadog.api.client.v1.model.LogsArrayProcessorOperation; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppend; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppendType; import com.datadog.api.client.v1.model.LogsArrayProcessorType; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsPipeline; import com.datadog.api.client.v1.model.LogsProcessor; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); LogsPipeline body = new LogsPipeline() .filter(new LogsFilter().query("source:python")) .name("testPipelineArrayAppendNoPreserve") .processors( Collections.singletonList( new LogsProcessor( new LogsArrayProcessor() .type(LogsArrayProcessorType.ARRAY_PROCESSOR) .isEnabled(true) .name("append_ip_and_remove_source") .operation( new LogsArrayProcessorOperation( new LogsArrayProcessorOperationAppend() .type(LogsArrayProcessorOperationAppendType.APPEND) .source("network.client.ip") .target("sourceIps") .preserveSource(false)))))); try { LogsPipeline result = apiInstance.createLogsPipeline(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#createLogsPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` // Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsArrayProcessor; import com.datadog.api.client.v1.model.LogsArrayProcessorOperation; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppend; import com.datadog.api.client.v1.model.LogsArrayProcessorOperationAppendType; import com.datadog.api.client.v1.model.LogsArrayProcessorType; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsPipeline; import com.datadog.api.client.v1.model.LogsProcessor; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); LogsPipeline body = new LogsPipeline() .filter(new LogsFilter().query("source:python")) .name("testPipelineArrayAppendPreserve") .processors( Collections.singletonList( new LogsProcessor( new LogsArrayProcessor() .type(LogsArrayProcessorType.ARRAY_PROCESSOR) .isEnabled(true) .name("append_ip_and_keep_source") .operation( new LogsArrayProcessorOperation( new LogsArrayProcessorOperationAppend() .type(LogsArrayProcessorOperationAppendType.APPEND) .source("network.client.ip") .target("sourceIps") .preserveSource(true)))))); try { LogsPipeline result = 
apiInstance.createLogsPipeline(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#createLogsPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` """ Create a pipeline with Array Processor Append Operation returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi from datadog_api_client.v1.model.logs_array_processor import LogsArrayProcessor from datadog_api_client.v1.model.logs_array_processor_operation_append import LogsArrayProcessorOperationAppend from datadog_api_client.v1.model.logs_array_processor_operation_append_type import LogsArrayProcessorOperationAppendType from datadog_api_client.v1.model.logs_array_processor_type import LogsArrayProcessorType from datadog_api_client.v1.model.logs_filter import LogsFilter from datadog_api_client.v1.model.logs_pipeline import LogsPipeline body = LogsPipeline( filter=LogsFilter( query="source:python", ), name="testPipelineArrayAppend", processors=[ LogsArrayProcessor( type=LogsArrayProcessorType.ARRAY_PROCESSOR, is_enabled=True, name="append_ip_to_array", operation=LogsArrayProcessorOperationAppend( type=LogsArrayProcessorOperationAppendType.APPEND, source="network.client.ip", target="sourceIps", ), ), ], tags=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.create_logs_pipeline(body=body) print(response) ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` """ Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi from datadog_api_client.v1.model.logs_array_processor import LogsArrayProcessor from datadog_api_client.v1.model.logs_array_processor_operation_append import LogsArrayProcessorOperationAppend from datadog_api_client.v1.model.logs_array_processor_operation_append_type import LogsArrayProcessorOperationAppendType from datadog_api_client.v1.model.logs_array_processor_type import LogsArrayProcessorType from datadog_api_client.v1.model.logs_filter import LogsFilter from datadog_api_client.v1.model.logs_pipeline import LogsPipeline body = LogsPipeline( filter=LogsFilter( query="source:python", ), name="testPipelineArrayAppendNoPreserve", processors=[ LogsArrayProcessor( type=LogsArrayProcessorType.ARRAY_PROCESSOR, is_enabled=True, name="append_ip_and_remove_source", operation=LogsArrayProcessorOperationAppend( type=LogsArrayProcessorOperationAppendType.APPEND, source="network.client.ip", target="sourceIps", preserve_source=False, ), ), ], tags=[], ) configuration = Configuration() with ApiClient(configuration) as 
api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.create_logs_pipeline(body=body) print(response) ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` """ Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi from datadog_api_client.v1.model.logs_array_processor import LogsArrayProcessor from datadog_api_client.v1.model.logs_array_processor_operation_append import LogsArrayProcessorOperationAppend from datadog_api_client.v1.model.logs_array_processor_operation_append_type import LogsArrayProcessorOperationAppendType from datadog_api_client.v1.model.logs_array_processor_type import LogsArrayProcessorType from datadog_api_client.v1.model.logs_filter import LogsFilter from datadog_api_client.v1.model.logs_pipeline import LogsPipeline body = LogsPipeline( filter=LogsFilter( query="source:python", ), name="testPipelineArrayAppendPreserve", processors=[ LogsArrayProcessor( type=LogsArrayProcessorType.ARRAY_PROCESSOR, is_enabled=True, name="append_ip_and_keep_source", operation=LogsArrayProcessorOperationAppend( type=LogsArrayProcessorOperationAppendType.APPEND, source="network.client.ip", target="sourceIps", preserve_source=True, ), ), ], tags=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.create_logs_pipeline(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` # Create a pipeline with Array Processor Append Operation returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new body = DatadogAPIClient::V1::LogsPipeline.new({ filter: DatadogAPIClient::V1::LogsFilter.new({ query: "source:python", }), name: "testPipelineArrayAppend", processors: [ DatadogAPIClient::V1::LogsArrayProcessor.new({ type: DatadogAPIClient::V1::LogsArrayProcessorType::ARRAY_PROCESSOR, is_enabled: true, name: "append_ip_to_array", operation: DatadogAPIClient::V1::LogsArrayProcessorOperationAppend.new({ type: DatadogAPIClient::V1::LogsArrayProcessorOperationAppendType::APPEND, source: "network.client.ip", target: "sourceIps", }), }), ], tags: [], }) p api_instance.create_logs_pipeline(body) ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` # Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new body = DatadogAPIClient::V1::LogsPipeline.new({ filter: DatadogAPIClient::V1::LogsFilter.new({ query: "source:python", }), name: "testPipelineArrayAppendNoPreserve", processors: [ DatadogAPIClient::V1::LogsArrayProcessor.new({ type: DatadogAPIClient::V1::LogsArrayProcessorType::ARRAY_PROCESSOR, is_enabled: true, name: "append_ip_and_remove_source", operation: 
DatadogAPIClient::V1::LogsArrayProcessorOperationAppend.new({ type: DatadogAPIClient::V1::LogsArrayProcessorOperationAppendType::APPEND, source: "network.client.ip", target: "sourceIps", preserve_source: false, }), }), ], tags: [], }) p api_instance.create_logs_pipeline(body) ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` # Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new body = DatadogAPIClient::V1::LogsPipeline.new({ filter: DatadogAPIClient::V1::LogsFilter.new({ query: "source:python", }), name: "testPipelineArrayAppendPreserve", processors: [ DatadogAPIClient::V1::LogsArrayProcessor.new({ type: DatadogAPIClient::V1::LogsArrayProcessorType::ARRAY_PROCESSOR, is_enabled: true, name: "append_ip_and_keep_source", operation: DatadogAPIClient::V1::LogsArrayProcessorOperationAppend.new({ type: DatadogAPIClient::V1::LogsArrayProcessorOperationAppendType::APPEND, source: "network.client.ip", target: "sourceIps", preserve_source: true, }), }), ], tags: [], }) p api_instance.create_logs_pipeline(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` // Create a pipeline with Array Processor Append Operation returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; use datadog_api_client::datadogV1::model::LogsArrayProcessor; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperation; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppend; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppendType; use datadog_api_client::datadogV1::model::LogsArrayProcessorType; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsPipeline; use datadog_api_client::datadogV1::model::LogsProcessor; #[tokio::main] async fn main() { let body = LogsPipeline::new("testPipelineArrayAppend".to_string()) .filter(LogsFilter::new().query("source:python".to_string())) .processors(vec![LogsProcessor::LogsArrayProcessor(Box::new( LogsArrayProcessor::new( LogsArrayProcessorOperation::LogsArrayProcessorOperationAppend(Box::new( LogsArrayProcessorOperationAppend::new( "network.client.ip".to_string(), "sourceIps".to_string(), LogsArrayProcessorOperationAppendType::APPEND, ), )), LogsArrayProcessorType::ARRAY_PROCESSOR, ) .is_enabled(true) .name("append_ip_to_array".to_string()), ))]) .tags(vec![]); let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.create_logs_pipeline(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` // Create a pipeline with Array Processor Append Operation with preserve_source // false returns "OK" response use datadog_api_client::datadog; use
datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; use datadog_api_client::datadogV1::model::LogsArrayProcessor; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperation; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppend; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppendType; use datadog_api_client::datadogV1::model::LogsArrayProcessorType; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsPipeline; use datadog_api_client::datadogV1::model::LogsProcessor; #[tokio::main] async fn main() { let body = LogsPipeline::new("testPipelineArrayAppendNoPreserve".to_string()) .filter(LogsFilter::new().query("source:python".to_string())) .processors(vec![LogsProcessor::LogsArrayProcessor(Box::new( LogsArrayProcessor::new( LogsArrayProcessorOperation::LogsArrayProcessorOperationAppend(Box::new( LogsArrayProcessorOperationAppend::new( "network.client.ip".to_string(), "sourceIps".to_string(), LogsArrayProcessorOperationAppendType::APPEND, ) .preserve_source(false), )), LogsArrayProcessorType::ARRAY_PROCESSOR, ) .is_enabled(true) .name("append_ip_and_remove_source".to_string()), ))]) .tags(vec![]); let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.create_logs_pipeline(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` // Create a pipeline with Array Processor Append Operation with preserve_source // true returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; use datadog_api_client::datadogV1::model::LogsArrayProcessor; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperation; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppend; use datadog_api_client::datadogV1::model::LogsArrayProcessorOperationAppendType; use datadog_api_client::datadogV1::model::LogsArrayProcessorType; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsPipeline; use datadog_api_client::datadogV1::model::LogsProcessor; #[tokio::main] async fn main() { let body = LogsPipeline::new("testPipelineArrayAppendPreserve".to_string()) .filter(LogsFilter::new().query("source:python".to_string())) .processors(vec![LogsProcessor::LogsArrayProcessor(Box::new( LogsArrayProcessor::new( LogsArrayProcessorOperation::LogsArrayProcessorOperationAppend(Box::new( LogsArrayProcessorOperationAppend::new( "network.client.ip".to_string(), "sourceIps".to_string(), LogsArrayProcessorOperationAppendType::APPEND, ) .preserve_source(true), )), LogsArrayProcessorType::ARRAY_PROCESSOR, ) .is_enabled(true) .name("append_ip_and_keep_source".to_string()), ))]) .tags(vec![]); let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.create_logs_pipeline(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a pipeline with Array Processor Append Operation returns "OK" response ``` /** * Create a pipeline with Array Processor Append Operation returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiCreateLogsPipelineRequest = { body: { filter: { query: "source:python", }, name: "testPipelineArrayAppend", processors: [ { type: "array-processor", isEnabled: true, name: "append_ip_to_array", operation: { type: "append", source: "network.client.ip", target: "sourceIps", }, }, ], tags: [], }, }; apiInstance .createLogsPipeline(params) .then((data: v1.LogsPipeline) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response ``` /** * Create a pipeline with Array Processor Append Operation with preserve_source false returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiCreateLogsPipelineRequest = { body: { filter: { query: "source:python", }, name: "testPipelineArrayAppendNoPreserve", processors: [ { type: "array-processor", isEnabled: true, name: "append_ip_and_remove_source", operation: { type: "append", source: "network.client.ip", target: "sourceIps", preserveSource: false, }, }, ], tags: [], }, }; apiInstance .createLogsPipeline(params) .then((data: v1.LogsPipeline) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response ``` /** * Create a pipeline with Array Processor Append Operation with preserve_source true returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiCreateLogsPipelineRequest = { body: { filter: { query: "source:python", }, name: "testPipelineArrayAppendPreserve", processors: [ { type: "array-processor", isEnabled: true, name: "append_ip_and_keep_source", operation: { type: "append", source: "network.client.ip", target: "sourceIps", preserveSource: true, }, }, ], tags: [], }, }; apiInstance .createLogsPipeline(params) .then((data: v1.LogsPipeline) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a pipeline](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-a-pipeline) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#get-a-pipeline-v1) GET https://api.ap1.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id} ### Overview Get a specific pipeline from your organization. This endpoint takes no JSON arguments. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string ID of the pipeline to get. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipeline-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipeline-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipeline-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#GetLogsPipeline-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying. **Note** : These endpoints are only available for admin users. Make sure to use an application key created by an admin. Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name [_required_] string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. 
Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. * `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. **Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. * Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. **Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. 
default: `msg` type [_required_] enum Type of logs message remapper. Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. 
Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. * Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. target [_required_] string Name of the target attribute which value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` needs to be space split in the formula as it can also be contained in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, it replaces all missing attributes of expression by `0`, `false` skip the operation if an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of object), it is replaced by an empty string or the entire operation is skipped depending on your selection. * If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. 
is_replace_missing boolean If true, it replaces all missing attributes of `template` by an empty string. If `false` (default), skips the operation for missing attributes. name string Name of the processor. target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. 
is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. 
Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. 
sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. ``` { "description": "string", "filter": { "query": "source:python" }, "id": "string", "is_enabled": false, "is_read_only": false, "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar", "support_rules": "rule_name_1 foo\nrule_name_2 bar" }, "is_enabled": false, "name": "string", "samples": [], "source": "message", "type": "grok-parser" } ], "tags": [], "type": "pipeline" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```
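The 200 response example above shows a pipeline with a single grok parser. As a purely illustrative sketch (hypothetical names, ID, and queries; every field comes from the model documented above), a pipeline that combines a date remapper, an attribute remapper, and a category processor could be returned as:

```
{
  "description": "Hypothetical pipeline combining several processor types",
  "filter": {
    "query": "source:nginx"
  },
  "id": "abc-123-xyz",
  "is_enabled": true,
  "is_read_only": false,
  "name": "Web access pipeline (illustrative)",
  "processors": [
    {
      "name": "Use eventTime as the official timestamp",
      "is_enabled": true,
      "sources": ["eventTime"],
      "type": "date-remapper"
    },
    {
      "name": "Remap user to usr.name",
      "is_enabled": true,
      "sources": ["user"],
      "source_type": "attribute",
      "target": "usr.name",
      "target_type": "attribute",
      "preserve_source": false,
      "override_on_conflict": false,
      "type": "attribute-remapper"
    },
    {
      "name": "Categorize HTTP status codes",
      "is_enabled": true,
      "categories": [
        { "filter": { "query": "@http.status_code:[200 TO 299]" }, "name": "OK" },
        { "filter": { "query": "@http.status_code:[500 TO 599]" }, "name": "error" }
      ],
      "target": "http.status_category",
      "type": "category-processor"
    }
  ],
  "tags": [],
  "type": "pipeline"
}
```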
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a pipeline ``` // Get a pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsPipeline; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); try { LogsPipeline result = apiInstance.getLogsPipeline("pipeline_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#getLogsPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a pipeline ``` // Get a pipeline returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api.get_logs_pipeline("pipeline_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a pipeline ``` /** * Get a pipeline returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiGetLogsPipelineRequest = { pipelineId: "pipeline_id", }; apiInstance .getLogsPipeline(params) .then((data: v1.LogsPipeline) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a pipeline](https://docs.datadoghq.com/api/latest/logs-pipelines/#delete-a-pipeline) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#delete-a-pipeline-v1) DELETE https://api.ap1.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id} ### Overview Delete a given pipeline from your organization. This endpoint takes no JSON arguments. This endpoint requires the `logs_write_pipelines` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string ID of the pipeline to delete. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#DeleteLogsPipeline-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-pipelines/#DeleteLogsPipeline-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#DeleteLogsPipeline-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#DeleteLogsPipeline-429-v1) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript)

##### Delete a pipeline

```
# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site)
curl -X DELETE "https://api.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a pipeline

```
"""
Delete a pipeline returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsPipelinesApi(api_client)
    api_instance.delete_logs_pipeline(
        pipeline_id="pipeline_id",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete a pipeline

```
# Delete a pipeline returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new
p api_instance.delete_logs_pipeline("pipeline_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete a pipeline

```
// Delete a pipeline returns "OK" response

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV1.NewLogsPipelinesApi(apiClient)
    r, err := api.DeleteLogsPipeline(ctx, "pipeline_id")

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.DeleteLogsPipeline`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Delete a pipeline

```
// Delete a pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.LogsPipelinesApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient);

    try {
      apiInstance.deleteLogsPipeline("pipeline_id");
    } catch (ApiException e) {
      System.err.println("Exception when calling LogsPipelinesApi#deleteLogsPipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Delete a pipeline

```
// Delete a pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = LogsPipelinesAPI::with_config(configuration);
    let resp = api.delete_logs_pipeline("pipeline_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Delete a pipeline

```
/**
 * Delete a pipeline returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.LogsPipelinesApi(configuration);

const params: v1.LogsPipelinesApiDeleteLogsPipelineRequest = {
  pipelineId: "pipeline_id",
};

apiInstance
  .deleteLogsPipeline(params)
  .then((data: any) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us5.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a pipeline](https://docs.datadoghq.com/api/latest/logs-pipelines/#update-a-pipeline) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs-pipelines/#update-a-pipeline-v1) PUT https://api.ap1.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v1/logs/config/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v1/logs/config/pipelines/{pipeline_id} ### Overview Update a given pipeline configuration to change it’s processors or their order. **Note** : Using this method updates your pipeline configuration by **replacing** your current configuration with the new one sent to your Datadog organization. This endpoint requires the `logs_write_pipelines` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string ID of the pipeline to delete. ### Request #### Body Data (required) New definition of the pipeline. * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name [_required_] string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. * `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. 
**Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. * Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. **Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `msg` type [_required_] enum Type of logs message remapper. Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. 
is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. * Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. 
name string Name of the processor. target [_required_] string Name of the target attribute which value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` needs to be space split in the formula as it can also be contained in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, it replaces all missing attributes of expression by `0`, `false` skip the operation if an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of object), it is replaced by an empty string or the entire operation is skipped depending on your selection. * If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If true, it replaces all missing attributes of `template` by an empty string. If `false` (default), skips the operation for missing attributes. name string Name of the processor. target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. 
Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. 
Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. 
Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. 
schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. ``` { "description": "string", "filter": { "query": "source:python" }, "is_enabled": false, "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar", "support_rules": "rule_name_1 foo\nrule_name_2 bar" }, "is_enabled": false, "name": "string", "samples": [], "source": "message", "type": "grok-parser" } ], "tags": [] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipeline-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipeline-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipeline-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs-pipelines/#UpdateLogsPipeline-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Pipelines and processors operate on incoming logs, parsing and transforming them into structured attributes for easier querying. **Note** : These endpoints are only available for admin users. Make sure to use an application key created by an admin. Field Type Description description string A description of the pipeline. filter object Filter for logs. query string The filter query. id string ID of the pipeline. is_enabled boolean Whether or not the pipeline is enabled. is_read_only boolean Whether or not the pipeline can be edited. name [_required_] string Name of the pipeline. processors [ ] Ordered list of processors in this pipeline. Option 1 object Create custom grok rules to parse the full message or [a specific attribute of your raw event](https://docs.datadoghq.com/logs/log_configuration/parsing/#advanced-settings). For more information, see the [parsing section](https://docs.datadoghq.com/logs/log_configuration/parsing). grok [_required_] object Set of rules for the grok parser. match_rules [_required_] string List of match rules for the grok parser, separated by a new line. support_rules string List of support rules for the grok parser, separated by a new line. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. samples [string] List of sample logs to test this grok parser. source [_required_] string Name of the log attribute to parse. default: `message` type [_required_] enum Type of logs grok parser. Allowed enum values: `grok-parser` default: `grok-parser` Option 2 object As Datadog receives logs, it timestamps them using the value(s) from any of these default attributes. * `timestamp` * `date` * `_timestamp` * `Timestamp` * `eventTime` * `published_date` If your logs put their dates in an attribute not in this list, use the log date Remapper Processor to define their date attribute as the official log timestamp. The recognized date formats are ISO8601, UNIX (the milliseconds EPOCH format), and RFC3164. 
**Note:** If your logs don’t contain any of the default attributes and you haven’t defined your own date attribute, Datadog timestamps the logs with the date it received them. If multiple log date remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs date remapper. Allowed enum values: `date-remapper` default: `date-remapper` Option 3 object Use this Processor if you want to assign some attributes as the official status. Each incoming status value is mapped as follows. * Integers from 0 to 7 map to the Syslog severity standards * Strings beginning with `emerg` or f (case-insensitive) map to `emerg` (0) * Strings beginning with `a` (case-insensitive) map to `alert` (1) * Strings beginning with `c` (case-insensitive) map to `critical` (2) * Strings beginning with `err` (case-insensitive) map to `error` (3) * Strings beginning with `w` (case-insensitive) map to `warning` (4) * Strings beginning with `n` (case-insensitive) map to `notice` (5) * Strings beginning with `i` (case-insensitive) map to `info` (6) * Strings beginning with `d`, `trace` or `verbose` (case-insensitive) map to `debug` (7) * Strings beginning with `o` or matching `OK` or `Success` (case-insensitive) map to OK * All others map to `info` (6) **Note:** If multiple log status remapper processors can be applied to a given log, only the first one (according to the pipelines order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs status remapper. Allowed enum values: `status-remapper` default: `status-remapper` Option 4 object Use this processor if you want to assign one or more attributes as the official service. **Note:** If multiple service remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. type [_required_] enum Type of logs service remapper. Allowed enum values: `service-remapper` default: `service-remapper` Option 5 object The message is a key attribute in Datadog. It is displayed in the message column of the Log Explorer and you can do full string search on it. Use this Processor to define one or more attributes as the official log message. **Note:** If multiple log message remapper processors can be applied to a given log, only the first one (according to the pipeline order) is taken into account. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `msg` type [_required_] enum Type of logs message remapper. Allowed enum values: `message-remapper` default: `message-remapper` Option 6 object The remapper processor remaps any source attribute(s) or tag to another target attribute or tag. Constraints on the tag/attribute name are explained in the [Tag Best Practice documentation](https://docs.datadoghq.com/logs/guide/log-parsing-best-practice). Some additional constraints are applied as `:` or `,` are not allowed in the target tag/attribute name. 
is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. override_on_conflict boolean Override or not the target element if already set, preserve_source boolean Remove or preserve the remapped source element. source_type string Defines if the sources are from log `attribute` or `tag`. default: `attribute` sources [_required_] [string] Array of source attributes. target [_required_] string Final attribute or tag name to remap the sources to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` target_type string Defines if the final attribute or tag name is from log `attribute` or `tag`. default: `attribute` type [_required_] enum Type of logs attribute remapper. Allowed enum values: `attribute-remapper` default: `attribute-remapper` Option 7 object This processor extracts query parameters and other important parameters from a URL. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. normalize_ending_slashes boolean Normalize the ending slashes or not. sources [_required_] [string] Array of source attributes. default: `http.url` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.url_details` type [_required_] enum Type of logs URL parser. Allowed enum values: `url-parser` default: `url-parser` Option 8 object The User-Agent parser takes a User-Agent attribute and extracts the OS, browser, device, and other user data. It recognizes major bots like the Google Bot, Yahoo Slurp, and Bing. is_enabled boolean Whether or not the processor is enabled. is_encoded boolean Define if the source attribute is URL encoded or not. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `http.useragent` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `http.useragent_details` type [_required_] enum Type of logs User-Agent parser. Allowed enum values: `user-agent-parser` default: `user-agent-parser` Option 9 object Use the Category Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log matching a provided search query. Use categories to create groups for an analytical view. For example, URL groups, machine groups, environments, and response time buckets. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Once the log has matched one of the Processor queries, it stops. Make sure they are properly ordered in case a log could match several queries. * The names of the categories must be unique. * Once defined in the Category Processor, you can map categories to log status using the Log Status Remapper. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter object Filter for logs. query string The filter query. name string Value to assign to the target attribute. is_enabled boolean Whether or not the processor is enabled. 
name string Name of the processor. target [_required_] string Name of the target attribute whose value is defined by the matching category. type [_required_] enum Type of logs category processor. Allowed enum values: `category-processor` default: `category-processor` Option 10 object Use the Arithmetic Processor to add a new attribute (without spaces or special characters in the new attribute name) to a log with the result of the provided formula. This enables you to remap different time attributes with different units into a single attribute, or to compute operations on attributes within the same log. The formula can use parentheses and the basic arithmetic operators `-`, `+`, `*`, `/`. By default, the calculation is skipped if an attribute is missing. Select “Replace missing attribute by 0” to automatically populate missing attribute values with 0 to ensure that the calculation is done. An attribute is missing if it is not found in the log attributes, or if it cannot be converted to a number. _Notes_ : * The operator `-` must be separated by spaces in the formula, as it can also appear in attribute names. * If the target attribute already exists, it is overwritten by the result of the formula. * Results are rounded up to the 9th decimal. For example, if the result of the formula is `0.1234567891`, the actual value stored for the attribute is `0.123456789`. * If you need to scale a unit of measure, see [Scale Filter](https://docs.datadoghq.com/logs/log_configuration/parsing/?tab=filter#matcher-and-filter). expression [_required_] string Arithmetic operation between one or more log attributes. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, all missing attributes in `expression` are replaced by `0`; if `false`, the operation is skipped when an attribute is missing. name string Name of the processor. target [_required_] string Name of the attribute that contains the result of the arithmetic operation. type [_required_] enum Type of logs arithmetic processor. Allowed enum values: `arithmetic-processor` default: `arithmetic-processor` Option 11 object Use the string builder processor to add a new attribute (without spaces or special characters) to a log with the result of the provided template. This enables aggregation of different attributes or raw strings into a single attribute. The template is defined by both raw text and blocks with the syntax `%{attribute_path}`. **Notes** : * The processor only accepts attributes with values or an array of values in the blocks. * If an attribute cannot be used (object or array of objects), it is replaced by an empty string or the entire operation is skipped depending on your selection. * If the target attribute already exists, it is overwritten by the result of the template. * Results of the template cannot exceed 256 characters. is_enabled boolean Whether or not the processor is enabled. is_replace_missing boolean If `true`, all missing attributes in `template` are replaced by an empty string. If `false` (default), the operation is skipped for missing attributes. name string Name of the processor. target [_required_] string The name of the attribute that contains the result of the template. template [_required_] string A formula with one or more attributes and raw text. type [_required_] enum Type of logs string builder processor. Allowed enum values: `string-builder-processor` default: `string-builder-processor` Option 12 object Nested Pipelines are pipelines within a pipeline. 
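Before the nested pipeline details, here is a hedged sketch of the arithmetic processor described above; the attribute names and formula are illustrative:

```
{
  "expression": "(time_elapsed_ms / 1000) + response_time_s",
  "is_enabled": true,
  "is_replace_missing": false,
  "name": "Compute total request time in seconds",
  "target": "request_time_total_s",
  "type": "arithmetic-processor"
}
```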
Use Nested Pipelines to split the processing into two steps. For example, first use a high-level filtering such as team and then a second level of filtering based on the integration, service, or any other tag or attribute. A pipeline can contain Nested Pipelines and Processors whereas a Nested Pipeline can only contain Processors. filter object Filter for logs. query string The filter query. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. processors [object] Ordered list of processors in this pipeline. type [_required_] enum Type of logs pipeline processor. Allowed enum values: `pipeline` default: `pipeline` Option 13 object The GeoIP parser takes an IP address attribute and extracts if available the Continent, Country, Subdivision, and City information in the target attribute path. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [_required_] [string] Array of source attributes. default: `network.client.ip` target [_required_] string Name of the parent attribute that contains all the extracted details from the `sources`. default: `network.client.geoip` type [_required_] enum Type of GeoIP parser. Allowed enum values: `geo-ip-parser` default: `geo-ip-parser` Option 14 object Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in the processors mapping table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. default_lookup string Value to set the target attribute if the source value is not found in the list. is_enabled boolean Whether or not the processor is enabled. lookup_table [_required_] [string] Mapping table of values for the source attribute and their associated target attribute values, formatted as `["source_key1,target_value1", "source_key2,target_value2"]` name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list or the `default_lookup` if not found in the mapping list. type [_required_] enum Type of logs lookup processor. Allowed enum values: `lookup-processor` default: `lookup-processor` Option 15 object **Note** : Reference Tables are in public beta. Use the Lookup Processor to define a mapping between a log attribute and a human readable value saved in a Reference Table. For example, you can use the Lookup Processor to map an internal service ID into a human readable service name. Alternatively, you could also use it to check if the MAC address that just attempted to connect to the production environment belongs to your list of stolen machines. is_enabled boolean Whether or not the processor is enabled. lookup_enrichment_table [_required_] string Name of the Reference Table for the source attribute and their associated target attribute values. name string Name of the processor. source [_required_] string Source attribute used to perform the lookup. target [_required_] string Name of the attribute that contains the corresponding value in the mapping list. type [_required_] enum Type of logs lookup processor. 
Allowed enum values: `lookup-processor` default: `lookup-processor` Option 16 object There are two ways to improve correlation between application traces and logs. 1. Follow the documentation on [how to inject a trace ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces) and by default log integrations take care of all the rest of the setup. 2. Use the Trace remapper processor to define a log attribute as its associated trace ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.trace_id` type [_required_] enum Type of logs trace remapper. Allowed enum values: `trace-id-remapper` default: `trace-id-remapper` Option 17 object There are two ways to define correlation between application spans and logs: 1. Follow the documentation on [how to inject a span ID in the application logs](https://docs.datadoghq.com/tracing/connect_logs_and_traces). Log integrations automatically handle all remaining setup steps by default. 2. Use the span remapper processor to define a log attribute as its associated span ID. is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. sources [string] Array of source attributes. default: `dd.span_id` type [_required_] enum Type of logs span remapper. Allowed enum values: `span-id-remapper` default: `span-id-remapper` Option 18 object A processor for extracting, aggregating, or transforming values from JSON arrays within your logs. Supported operations are: * Select value from matching element * Compute array length * Append a value to an array is_enabled boolean Whether or not the processor is enabled. name string Name of the processor. operation [_required_] Configuration of the array processor operation to perform. Option 1 object Operation that appends a value to a target array attribute. preserve_source boolean Remove or preserve the remapped source element. default: `true` source [_required_] string Attribute path containing the value to append. target [_required_] string Attribute path of the array to append to. type [_required_] enum Operation type. Allowed enum values: `append` Option 2 object Operation that computes the length of a `source` array and stores the result in the `target` attribute. source [_required_] string Attribute path of the array to measure. target [_required_] string Attribute that receives the computed length. type [_required_] enum Operation type. Allowed enum values: `length` Option 3 object Operation that finds an object in a `source` array using a `filter`, and then extracts a specific value into the `target` attribute. filter [_required_] string Filter condition expressed as `key:value` used to find the matching element. source [_required_] string Attribute path of the array to search into. target [_required_] string Attribute that receives the extracted value. type [_required_] enum Operation type. Allowed enum values: `select` value_to_extract [_required_] string Key of the value to extract from the matching element. type [_required_] enum Type of logs array processor. Allowed enum values: `array-processor` default: `array-processor` Option 19 object The decoder processor decodes any source attribute containing a base64/base16-encoded UTF-8/ASCII string back to its original value, storing the result in a target attribute. binary_to_text_encoding [_required_] enum The encoding used to represent the binary data. 
Allowed enum values: `base64,base16` input_representation [_required_] enum The original representation of input string. Allowed enum values: `utf_8,integer` is_enabled boolean Whether the processor is enabled. name string Name of the processor. source [_required_] string Name of the log attribute with the encoded data. target [_required_] string Name of the log attribute that contains the decoded data. type [_required_] enum Type of logs decoder processor. Allowed enum values: `decoder-processor` default: `decoder-processor` Option 20 object A processor that has additional validations and checks for a given schema. Currently supported schema types include OCSF. is_enabled boolean Whether or not the processor is enabled. mappers [_required_] [ ] The `LogsSchemaProcessor` `mappers`. Option 1 object The schema remapper maps source log fields to their correct fields. name [_required_] string Name of the logs schema remapper. override_on_conflict boolean Override or not the target element if already set. preserve_source boolean Remove or preserve the remapped source element. sources [_required_] [string] Array of source attributes. target [_required_] string Target field to map log source field to. target_format enum If the `target_type` of the remapper is `attribute`, try to cast the value to a new specific type. If the cast is not possible, the original type is kept. `string`, `integer`, or `double` are the possible types. If the `target_type` is `tag`, this parameter may not be specified. Allowed enum values: `auto,string,integer,double` type [_required_] enum Type of logs schema remapper. Allowed enum values: `schema-remapper` Option 2 object Use the Schema Category Mapper to categorize log event into enum fields. In the case of OCSF, they can be used to map sibling fields which are composed of an ID and a name. **Notes** : * The syntax of the query is the one of Logs Explorer search bar. The query can be done on any log attribute or tag, whether it is a facet or not. Wildcards can also be used inside your query. * Categories are executed in order and processing stops at the first match. Make sure categories are properly ordered in case a log could match multiple queries. * Sibling fields always have a numerical ID field and a human-readable string name. * A fallback section handles cases where the name or ID value matches a specific value. If the name matches "Other" or the ID matches 99, the value of the sibling name field will be pulled from a source field from the original log. categories [_required_] [object] Array of filters to match or not a log and their corresponding `name` to assign a custom value to the log. filter [_required_] object Filter for logs. query string The filter query. id [_required_] int64 ID to inject into the category. name [_required_] string Value to assign to target schema field. fallback object Used to override hardcoded category values with a value pulled from a source attribute on the log. sources object Fallback sources used to populate value of field. [string] values object Values that define when the fallback is used. string name [_required_] string Name of the logs schema category mapper. targets [_required_] object Name of the target attributes which value is defined by the matching category. id string ID of the field to map log attributes to. name string Name of the field to map log attributes to. type [_required_] enum Type of logs schema category mapper. Allowed enum values: `schema-category-mapper` name [_required_] string Name of the processor. 
schema [_required_] object Configuration of the schema data to use. class_name [_required_] string Class name of the schema to use. class_uid [_required_] int64 Class UID of the schema to use. profiles [string] Optional list of profiles to modify the schema. schema_type [_required_] string Type of schema to use. version [_required_] string Version of the schema to use. type [_required_] enum Type of logs schema processor. Allowed enum values: `schema-processor` default: `schema-processor` tags [string] A list of tags associated with the pipeline. type string Type of pipeline. ``` { "description": "string", "filter": { "query": "source:python" }, "id": "string", "is_enabled": false, "is_read_only": false, "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar", "support_rules": "rule_name_1 foo\nrule_name_2 bar" }, "is_enabled": false, "name": "string", "samples": [], "source": "message", "type": "grok-parser" } ], "tags": [], "type": "pipeline" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/logs-pipelines/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-pipelines/?code-lang=typescript) ##### Update a pipeline Copy ``` # Path parameters export pipeline_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs/config/pipelines/${pipeline_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "", "processors": [ { "grok": { "match_rules": "rule_name_1 foo\nrule_name_2 bar" } } ] } EOF ``` ##### Update a pipeline ``` """ Update a pipeline returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_pipelines_api import LogsPipelinesApi from datadog_api_client.v1.model.logs_filter import LogsFilter from datadog_api_client.v1.model.logs_grok_parser import LogsGrokParser from datadog_api_client.v1.model.logs_grok_parser_rules import LogsGrokParserRules from datadog_api_client.v1.model.logs_grok_parser_type import LogsGrokParserType from datadog_api_client.v1.model.logs_pipeline import LogsPipeline body = LogsPipeline( filter=LogsFilter( query="source:python", ), name="", processors=[ LogsGrokParser( grok=LogsGrokParserRules( match_rules="rule_name_1 foo\nrule_name_2 bar", support_rules="rule_name_1 foo\nrule_name_2 bar", ), is_enabled=False, samples=[], source="message", type=LogsGrokParserType.GROK_PARSER, ), ], tags=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsPipelinesApi(api_client) response = api_instance.update_logs_pipeline(pipeline_id="pipeline_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a pipeline ``` # Update a pipeline returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsPipelinesAPI.new body = DatadogAPIClient::V1::LogsPipeline.new({ filter: DatadogAPIClient::V1::LogsFilter.new({ query: "source:python", }), name: "", processors: [ DatadogAPIClient::V1::LogsGrokParser.new({ grok: DatadogAPIClient::V1::LogsGrokParserRules.new({ match_rules: 'rule_name_1 foo\nrule_name_2 bar', support_rules: 'rule_name_1 foo\nrule_name_2 bar', }), is_enabled: false, samples: [], source: "message", type: DatadogAPIClient::V1::LogsGrokParserType::GROK_PARSER, }), ], tags: [], }) p api_instance.update_logs_pipeline("pipeline_id", body) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a pipeline ``` // Update a pipeline returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsPipeline{ Filter: &datadogV1.LogsFilter{ Query: datadog.PtrString("source:python"), }, Name: "", Processors: []datadogV1.LogsProcessor{ datadogV1.LogsProcessor{ LogsGrokParser: &datadogV1.LogsGrokParser{ Grok: datadogV1.LogsGrokParserRules{ MatchRules: `rule_name_1 foo rule_name_2 bar`, SupportRules: datadog.PtrString(`rule_name_1 foo rule_name_2 bar`), }, IsEnabled: datadog.PtrBool(false), Samples: []string{}, Source: "message", Type: datadogV1.LOGSGROKPARSERTYPE_GROK_PARSER, }}, }, Tags: []string{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsPipelinesApi(apiClient) resp, r, err := api.UpdateLogsPipeline(ctx, "pipeline_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsPipelinesApi.UpdateLogsPipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsPipelinesApi.UpdateLogsPipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a pipeline ``` // Update a pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsPipelinesApi; import com.datadog.api.client.v1.model.LogsFilter; import com.datadog.api.client.v1.model.LogsGrokParser; import com.datadog.api.client.v1.model.LogsGrokParserRules; import com.datadog.api.client.v1.model.LogsGrokParserType; import com.datadog.api.client.v1.model.LogsPipeline; import com.datadog.api.client.v1.model.LogsProcessor; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsPipelinesApi apiInstance = new LogsPipelinesApi(defaultClient); LogsPipeline body = new LogsPipeline() .filter(new LogsFilter().query("source:python")) .name("") .processors( Collections.singletonList( new LogsProcessor( new LogsGrokParser() .grok( new LogsGrokParserRules() .matchRules(""" rule_name_1 foo rule_name_2 bar """) .supportRules(""" rule_name_1 foo rule_name_2 bar """)) .isEnabled(false) .source("message") .type(LogsGrokParserType.GROK_PARSER)))); try { LogsPipeline result = apiInstance.updateLogsPipeline("pipeline_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsPipelinesApi#updateLogsPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); 
System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a pipeline ``` // Update a pipeline returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs_pipelines::LogsPipelinesAPI; use datadog_api_client::datadogV1::model::LogsFilter; use datadog_api_client::datadogV1::model::LogsGrokParser; use datadog_api_client::datadogV1::model::LogsGrokParserRules; use datadog_api_client::datadogV1::model::LogsGrokParserType; use datadog_api_client::datadogV1::model::LogsPipeline; use datadog_api_client::datadogV1::model::LogsProcessor; #[tokio::main] async fn main() { let body = LogsPipeline::new("".to_string()) .filter(LogsFilter::new().query("source:python".to_string())) .processors(vec![LogsProcessor::LogsGrokParser(Box::new( LogsGrokParser::new( LogsGrokParserRules::new( r#"rule_name_1 foo rule_name_2 bar"# .to_string(), ) .support_rules( r#"rule_name_1 foo rule_name_2 bar"# .to_string(), ), "message".to_string(), LogsGrokParserType::GROK_PARSER, ) .is_enabled(false) .samples(vec![]), ))]) .tags(vec![]); let configuration = datadog::Configuration::new(); let api = LogsPipelinesAPI::with_config(configuration); let resp = api .update_logs_pipeline("pipeline_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a pipeline ``` /** * Update a pipeline returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsPipelinesApi(configuration); const params: v1.LogsPipelinesApiUpdateLogsPipelineRequest = { body: { filter: { query: "source:python", }, name: "", processors: [ { grok: { matchRules: "rule_name_1 foo\nrule_name_2 bar", supportRules: "rule_name_1 foo\nrule_name_2 bar", }, isEnabled: false, samples: [], source: "message", type: "grok-parser", }, ], tags: [], }, pipelineId: "pipeline_id", }; apiInstance .updateLogsPipeline(params) .then((data: v1.LogsPipeline) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=ab6484ce-3d71-4972-8884-d04755f4ed8e&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=55e66d20-2363-41f7-8b7d-c0af9054772c&pt=Logs%20Pipelines&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-pipelines%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=ab6484ce-3d71-4972-8884-d04755f4ed8e&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=55e66d20-2363-41f7-8b7d-c0af9054772c&pt=Logs%20Pipelines&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-pipelines%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=03e60368-05b4-42a5-b3c4-50e98f5d61d9&bo=2&sid=ca97aff0f0bf11f0b225ebe7018368cd&vid=ca980060f0bf11f09385b797b877861f&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Logs%20Pipelines&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-pipelines%2F&r=<=2659&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=348194) --- # Source: https://docs.datadoghq.com/api/latest/logs-restriction-queries/ # Logs Restriction Queries **Note: This endpoint is in public beta. If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** A Restriction Query is a logs query that restricts which logs the `logs_read_data` permission grants read access to. For users whose roles have Restriction Queries, any log query they make only returns those log events that also match one of their Restriction Queries. This is true whether the user queries log events from any log-related feature, including the log explorer, Live Tail, re-hydration, or a dashboard widget. Restriction Queries currently only support use of the following components of log events: * Reserved attributes * The log message * Tags To restrict read access on log data, add a team tag to log events to indicate which teams own them, and then scope Restriction Queries to the relevant values of the team tag. Tags can be applied to log events in many ways, and a log event can have multiple tags with the same key (like team) and different values. This means the same log event can be visible to roles whose restriction queries are scoped to different team values. See [How to Set Up RBAC for Logs](https://docs.datadoghq.com/logs/guide/logs-rbac/?tab=api#restrict-access-to-logs) for details on how to add restriction queries. ## [List restriction queries](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#list-restriction-queries) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#list-restriction-queries-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
GET https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.ap2.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.datadoghq.eu/api/v2/logs/config/restriction_querieshttps://api.ddog-gov.com/api/v2/logs/config/restriction_querieshttps://api.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.us3.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries ### Overview Returns all restriction queries, including their names and IDs. This endpoint requires the `logs_read_config` permission. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueries-200-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueries-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueries-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about multiple restriction queries. Field Type Description data [object] Array of returned restriction queries. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. default: `logs_restriction_queries` ``` { "data": [ { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### List restriction queries Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List restriction queries ``` """ List restriction queries returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi configuration = Configuration() configuration.unstable_operations["list_restriction_queries"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.list_restriction_queries() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List restriction queries ``` # List restriction queries returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_restriction_queries".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new p api_instance.list_restriction_queries() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List restriction queries ``` // List restriction queries returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListRestrictionQueries", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.ListRestrictionQueries(ctx, *datadogV2.NewListRestrictionQueriesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.ListRestrictionQueries`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.ListRestrictionQueries`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List restriction queries ``` // List restriction queries returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RestrictionQueryListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listRestrictionQueries", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); try { RestrictionQueryListResponse result = apiInstance.listRestrictionQueries(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsRestrictionQueriesApi#listRestrictionQueries"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List restriction queries ``` // List restriction queries returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::ListRestrictionQueriesOptionalParams; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListRestrictionQueries", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .list_restriction_queries(ListRestrictionQueriesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List restriction queries ``` /** * List restriction queries returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listRestrictionQueries"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); apiInstance .listRestrictionQueries() .then((data: v2.RestrictionQueryListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#create-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#create-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.ap2.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.datadoghq.eu/api/v2/logs/config/restriction_querieshttps://api.ddog-gov.com/api/v2/logs/config/restriction_querieshttps://api.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.us3.datadoghq.com/api/v2/logs/config/restriction_querieshttps://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries ### Overview Create a new restriction query for your organization. This endpoint requires the `user_access_manage` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Field Type Description data object Data related to the creation of a restriction query. attributes object Attributes of the created restriction query. restriction_query [_required_] string The restriction query. type enum Restriction query resource type. Allowed enum values: `logs_restriction_queries` default: `logs_restriction_queries` ``` { "data": { "attributes": { "restriction_query": "env:sandbox" }, "type": "logs_restriction_queries" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#CreateRestrictionQuery-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#CreateRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#CreateRestrictionQuery-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#CreateRestrictionQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about a single restriction query. Field Type Description data object Restriction query object returned by the API. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. 
default: `logs_restriction_queries` ``` { "data": { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Create a restriction query returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "restriction_query": "env:sandbox" }, "type": "logs_restriction_queries" } } EOF ``` ##### Create a restriction query returns "OK" response ``` // Create a restriction query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RestrictionQueryCreatePayload{ Data: &datadogV2.RestrictionQueryCreateData{ Attributes: &datadogV2.RestrictionQueryCreateAttributes{ RestrictionQuery: "env:sandbox", }, Type: datadogV2.LOGSRESTRICTIONQUERIESTYPE_LOGS_RESTRICTION_QUERIES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.CreateRestrictionQuery(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when 
calling `LogsRestrictionQueriesApi.CreateRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.CreateRestrictionQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a restriction query returns "OK" response ``` // Create a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.LogsRestrictionQueriesType; import com.datadog.api.client.v2.model.RestrictionQueryCreateAttributes; import com.datadog.api.client.v2.model.RestrictionQueryCreateData; import com.datadog.api.client.v2.model.RestrictionQueryCreatePayload; import com.datadog.api.client.v2.model.RestrictionQueryWithoutRelationshipsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); RestrictionQueryCreatePayload body = new RestrictionQueryCreatePayload() .data( new RestrictionQueryCreateData() .attributes( new RestrictionQueryCreateAttributes().restrictionQuery("env:sandbox")) .type(LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES)); try { RestrictionQueryWithoutRelationshipsResponse result = apiInstance.createRestrictionQuery(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsRestrictionQueriesApi#createRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a restriction query returns "OK" response ``` """ Create a restriction query returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi from datadog_api_client.v2.model.logs_restriction_queries_type import LogsRestrictionQueriesType from datadog_api_client.v2.model.restriction_query_create_attributes import RestrictionQueryCreateAttributes from datadog_api_client.v2.model.restriction_query_create_data import RestrictionQueryCreateData from datadog_api_client.v2.model.restriction_query_create_payload import RestrictionQueryCreatePayload body = RestrictionQueryCreatePayload( data=RestrictionQueryCreateData( attributes=RestrictionQueryCreateAttributes( restriction_query="env:sandbox", ), type=LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES, ), ) 
configuration = Configuration() configuration.unstable_operations["create_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.create_restriction_query(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a restriction query returns "OK" response ``` # Create a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new body = DatadogAPIClient::V2::RestrictionQueryCreatePayload.new({ data: DatadogAPIClient::V2::RestrictionQueryCreateData.new({ attributes: DatadogAPIClient::V2::RestrictionQueryCreateAttributes.new({ restriction_query: "env:sandbox", }), type: DatadogAPIClient::V2::LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES, }), }) p api_instance.create_restriction_query(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a restriction query returns "OK" response ``` // Create a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; use datadog_api_client::datadogV2::model::LogsRestrictionQueriesType; use datadog_api_client::datadogV2::model::RestrictionQueryCreateAttributes; use datadog_api_client::datadogV2::model::RestrictionQueryCreateData; use datadog_api_client::datadogV2::model::RestrictionQueryCreatePayload; #[tokio::main] async fn main() { let body = RestrictionQueryCreatePayload::new().data( RestrictionQueryCreateData::new() .attributes(RestrictionQueryCreateAttributes::new( "env:sandbox".to_string(), )) .type_(LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api.create_restriction_query(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a restriction query returns "OK" response ``` /** * Create a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createRestrictionQuery"] = true; const apiInstance = new 
v2.LogsRestrictionQueriesApi(configuration); const params: v2.LogsRestrictionQueriesApiCreateRestrictionQueryRequest = { body: { data: { attributes: { restrictionQuery: "env:sandbox", }, type: "logs_restriction_queries", }, }, }; apiInstance .createRestrictionQuery(params) .then((data: v2.RestrictionQueryWithoutRelationshipsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id} ### Overview Get a restriction query in the organization specified by the restriction query’s `restriction_query_id`. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRestrictionQuery-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRestrictionQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about a single restriction query. Field Type Description data object Restriction query object returned by the API. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. 
user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. relationships object Relationships of the user object. roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Restriction query resource type. Allowed enum values: `logs_restriction_queries` default: `logs_restriction_queries` included [ ] Array of objects related to the restriction query. Option 1 object Partial role object. attributes [_required_] object Attributes of the role for a restriction query. name string The role name. id [_required_] string ID of the role. type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "relationships": { "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "logs_restriction_queries" }, "included": [ { "attributes": { "name": "Datadog Admin Role" }, "id": "", "type": "roles" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Get a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a restriction query ``` """ Get a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.get_restriction_query( restriction_query_id=RESTRICTION_QUERY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a restriction query ``` # Get a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] p api_instance.get_restriction_query(RESTRICTION_QUERY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a restriction query ``` // Get a restriction query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.GetRestrictionQuery(ctx, RestrictionQueryDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.GetRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.GetRestrictionQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a restriction query ``` // Get a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RestrictionQueryWithRelationshipsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); try { RestrictionQueryWithRelationshipsResponse result = apiInstance.getRestrictionQuery(RESTRICTION_QUERY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsRestrictionQueriesApi#getRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a restriction query ``` // Get a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .get_restriction_query(restriction_query_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a restriction query ``` /** * Get a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiGetRestrictionQueryRequest = { restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .getRestrictionQuery(params) .then((data: v2.RestrictionQueryWithRelationshipsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Replace a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#replace-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#replace-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id} ### Overview Replace a restriction query. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Field Type Description data object Data related to the update of a restriction query. attributes object Attributes of the edited restriction query. restriction_query [_required_] string The restriction query. type enum Restriction query resource type. 
Allowed enum values: `logs_restriction_queries` default: `logs_restriction_queries` ``` { "data": { "attributes": { "restriction_query": "env:sandbox" }, "type": "logs_restriction_queries" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ReplaceRestrictionQuery-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ReplaceRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ReplaceRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ReplaceRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ReplaceRestrictionQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about a single restriction query. Field Type Description data object Restriction query object returned by the API. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. default: `logs_restriction_queries` ``` { "data": { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Replace a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "restriction_query": "env:sandbox" } } } EOF ``` ##### Replace a restriction query ``` """ Replace a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi from datadog_api_client.v2.model.logs_restriction_queries_type import LogsRestrictionQueriesType from datadog_api_client.v2.model.restriction_query_update_attributes import RestrictionQueryUpdateAttributes from datadog_api_client.v2.model.restriction_query_update_data import RestrictionQueryUpdateData from datadog_api_client.v2.model.restriction_query_update_payload import RestrictionQueryUpdatePayload # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] body = RestrictionQueryUpdatePayload( data=RestrictionQueryUpdateData( attributes=RestrictionQueryUpdateAttributes( restriction_query="env:staging", ), type=LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES, ), ) configuration = Configuration() configuration.unstable_operations["replace_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.replace_restriction_query(restriction_query_id=RESTRICTION_QUERY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Replace a restriction query ``` # Replace a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.replace_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] body = DatadogAPIClient::V2::RestrictionQueryUpdatePayload.new({ data: 
DatadogAPIClient::V2::RestrictionQueryUpdateData.new({ attributes: DatadogAPIClient::V2::RestrictionQueryUpdateAttributes.new({ restriction_query: "env:staging", }), type: DatadogAPIClient::V2::LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES, }), }) p api_instance.replace_restriction_query(RESTRICTION_QUERY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Replace a restriction query ``` // Replace a restriction query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") body := datadogV2.RestrictionQueryUpdatePayload{ Data: &datadogV2.RestrictionQueryUpdateData{ Attributes: &datadogV2.RestrictionQueryUpdateAttributes{ RestrictionQuery: "env:staging", }, Type: datadogV2.LOGSRESTRICTIONQUERIESTYPE_LOGS_RESTRICTION_QUERIES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ReplaceRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.ReplaceRestrictionQuery(ctx, RestrictionQueryDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.ReplaceRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.ReplaceRestrictionQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Replace a restriction query ``` // Replace a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.LogsRestrictionQueriesType; import com.datadog.api.client.v2.model.RestrictionQueryUpdateAttributes; import com.datadog.api.client.v2.model.RestrictionQueryUpdateData; import com.datadog.api.client.v2.model.RestrictionQueryUpdatePayload; import com.datadog.api.client.v2.model.RestrictionQueryWithoutRelationshipsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.replaceRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); RestrictionQueryUpdatePayload body = new RestrictionQueryUpdatePayload() .data( 
new RestrictionQueryUpdateData() .attributes( new RestrictionQueryUpdateAttributes().restrictionQuery("env:staging")) .type(LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES)); try { RestrictionQueryWithoutRelationshipsResponse result = apiInstance.replaceRestrictionQuery(RESTRICTION_QUERY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsRestrictionQueriesApi#replaceRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Replace a restriction query ``` // Replace a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; use datadog_api_client::datadogV2::model::LogsRestrictionQueriesType; use datadog_api_client::datadogV2::model::RestrictionQueryUpdateAttributes; use datadog_api_client::datadogV2::model::RestrictionQueryUpdateData; use datadog_api_client::datadogV2::model::RestrictionQueryUpdatePayload; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); let body = RestrictionQueryUpdatePayload::new().data( RestrictionQueryUpdateData::new() .attributes(RestrictionQueryUpdateAttributes::new( "env:staging".to_string(), )) .type_(LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ReplaceRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .replace_restriction_query(restriction_query_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Replace a restriction query ``` /** * Replace a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.replaceRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiReplaceRestrictionQueryRequest = { body: { data: { attributes: { restrictionQuery: "env:staging", }, type: "logs_restriction_queries", }, }, restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .replaceRestrictionQuery(params) .then((data: v2.RestrictionQueryWithoutRelationshipsResponse) => { console.log( "API 
called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#update-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#update-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id} ### Overview Edit a restriction query. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Field Type Description data object Data related to the update of a restriction query. attributes object Attributes of the edited restriction query. restriction_query [_required_] string The restriction query. type enum Restriction query resource type. Allowed enum values: `logs_restriction_queries` default: `logs_restriction_queries` ``` { "data": { "attributes": { "restriction_query": "env:sandbox" }, "type": "logs_restriction_queries" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#UpdateRestrictionQuery-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#UpdateRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#UpdateRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#UpdateRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#UpdateRestrictionQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about a single restriction query. Field Type Description data object Restriction query object returned by the API. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. 
restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. default: `logs_restriction_queries` ``` { "data": { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Update a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "restriction_query": "env:sandbox" } } } EOF ``` ##### Update a restriction query ``` """ Update a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi from datadog_api_client.v2.model.logs_restriction_queries_type import LogsRestrictionQueriesType from datadog_api_client.v2.model.restriction_query_update_attributes import RestrictionQueryUpdateAttributes from datadog_api_client.v2.model.restriction_query_update_data import RestrictionQueryUpdateData from datadog_api_client.v2.model.restriction_query_update_payload import RestrictionQueryUpdatePayload # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] body = RestrictionQueryUpdatePayload( data=RestrictionQueryUpdateData( attributes=RestrictionQueryUpdateAttributes( restriction_query="env:production", ), type=LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES, ), ) configuration = Configuration() configuration.unstable_operations["update_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.update_restriction_query(restriction_query_id=RESTRICTION_QUERY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a restriction query ``` # Update a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] body = DatadogAPIClient::V2::RestrictionQueryUpdatePayload.new({ data: 
DatadogAPIClient::V2::RestrictionQueryUpdateData.new({ attributes: DatadogAPIClient::V2::RestrictionQueryUpdateAttributes.new({ restriction_query: "env:production", }), type: DatadogAPIClient::V2::LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES, }), }) p api_instance.update_restriction_query(RESTRICTION_QUERY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a restriction query ``` // Update a restriction query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") body := datadogV2.RestrictionQueryUpdatePayload{ Data: &datadogV2.RestrictionQueryUpdateData{ Attributes: &datadogV2.RestrictionQueryUpdateAttributes{ RestrictionQuery: "env:production", }, Type: datadogV2.LOGSRESTRICTIONQUERIESTYPE_LOGS_RESTRICTION_QUERIES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.UpdateRestrictionQuery(ctx, RestrictionQueryDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.UpdateRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.UpdateRestrictionQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a restriction query ``` // Update a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.LogsRestrictionQueriesType; import com.datadog.api.client.v2.model.RestrictionQueryUpdateAttributes; import com.datadog.api.client.v2.model.RestrictionQueryUpdateData; import com.datadog.api.client.v2.model.RestrictionQueryUpdatePayload; import com.datadog.api.client.v2.model.RestrictionQueryWithoutRelationshipsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); RestrictionQueryUpdatePayload body = new RestrictionQueryUpdatePayload() .data( new 
RestrictionQueryUpdateData() .attributes( new RestrictionQueryUpdateAttributes().restrictionQuery("env:production")) .type(LogsRestrictionQueriesType.LOGS_RESTRICTION_QUERIES)); try { RestrictionQueryWithoutRelationshipsResponse result = apiInstance.updateRestrictionQuery(RESTRICTION_QUERY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsRestrictionQueriesApi#updateRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a restriction query ``` // Update a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; use datadog_api_client::datadogV2::model::LogsRestrictionQueriesType; use datadog_api_client::datadogV2::model::RestrictionQueryUpdateAttributes; use datadog_api_client::datadogV2::model::RestrictionQueryUpdateData; use datadog_api_client::datadogV2::model::RestrictionQueryUpdatePayload; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); let body = RestrictionQueryUpdatePayload::new().data( RestrictionQueryUpdateData::new() .attributes(RestrictionQueryUpdateAttributes::new( "env:production".to_string(), )) .type_(LogsRestrictionQueriesType::LOGS_RESTRICTION_QUERIES), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .update_restriction_query(restriction_query_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a restriction query ``` /** * Update a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiUpdateRestrictionQueryRequest = { body: { data: { attributes: { restrictionQuery: "env:production", }, type: "logs_restriction_queries", }, }, restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .updateRestrictionQuery(params) .then((data: v2.RestrictionQueryWithoutRelationshipsResponse) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#delete-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#delete-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id} ### Overview Deletes a restriction query. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Response * [204](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#DeleteRestrictionQuery-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#DeleteRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#DeleteRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#DeleteRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#DeleteRestrictionQuery-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Delete a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a restriction query ``` """ Delete a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) api_instance.delete_restriction_query( restriction_query_id=RESTRICTION_QUERY_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a restriction query ``` # Delete a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] api_instance.delete_restriction_query(RESTRICTION_QUERY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a restriction query ``` // Delete a restriction query returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
configuration.SetUnstableOperationEnabled("v2.DeleteRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) r, err := api.DeleteRestrictionQuery(ctx, RestrictionQueryDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.DeleteRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a restriction query ``` // Delete a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); try { apiInstance.deleteRestrictionQuery(RESTRICTION_QUERY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling LogsRestrictionQueriesApi#deleteRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a restriction query ``` // Delete a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .delete_restriction_query(restriction_query_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a restriction query ``` /** * Delete a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.deleteRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiDeleteRestrictionQueryRequest = { restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .deleteRestrictionQuery(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List roles for a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#list-roles-for-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#list-roles-for-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roles ### Overview Returns all roles that have a given restriction query. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueryRoles-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueryRoles-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueryRoles-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueryRoles-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListRestrictionQueryRoles-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about roles attached to a restriction query. Field Type Description data [object] Array of roles. attributes [_required_] object Attributes of the role for a restriction query. name string The role name. id [_required_] string ID of the role. type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": [ { "attributes": { "name": "Datadog Admin Role" }, "id": "", "type": "roles" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### List roles for a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}/roles" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List roles for a restriction query ``` """ List roles for a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] configuration = Configuration() configuration.unstable_operations["list_restriction_query_roles"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.list_restriction_query_roles( restriction_query_id=RESTRICTION_QUERY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
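# Set DD_SITE to the Datadog site you use (for example datadoghq.com, datadoghq.eu, or us3.datadoghq.com) before running: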
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List roles for a restriction query ``` # List roles for a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_restriction_query_roles".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] p api_instance.list_restriction_query_roles(RESTRICTION_QUERY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List roles for a restriction query ``` // List roles for a restriction query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListRestrictionQueryRoles", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.ListRestrictionQueryRoles(ctx, RestrictionQueryDataID, *datadogV2.NewListRestrictionQueryRolesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.ListRestrictionQueryRoles`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.ListRestrictionQueryRoles`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List roles for a restriction query ``` // List roles for a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RestrictionQueryRolesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listRestrictionQueryRoles", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); try { RestrictionQueryRolesResponse result = apiInstance.listRestrictionQueryRoles(RESTRICTION_QUERY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when 
calling LogsRestrictionQueriesApi#listRestrictionQueryRoles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List roles for a restriction query ``` // List roles for a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::ListRestrictionQueryRolesOptionalParams; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListRestrictionQueryRoles", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .list_restriction_query_roles( restriction_query_data_id.clone(), ListRestrictionQueryRolesOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List roles for a restriction query ``` /** * List roles for a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listRestrictionQueryRoles"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiListRestrictionQueryRolesRequest = { restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .listRestrictionQueryRoles(params) .then((data: v2.RestrictionQueryRolesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Grant role to a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#grant-role-to-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#grant-role-to-a-restriction-query-v2) **Note** : This endpoint is in public beta. 
If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roles ### Overview Adds a role to a restriction query. **Note** : This operation automatically grants the `logs_read_data` permission to the role if it doesn’t already have it. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Field Type Description data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "id": "string", "type": "roles" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#AddRoleToRestrictionQuery-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#AddRoleToRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#AddRoleToRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#AddRoleToRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#AddRoleToRestrictionQuery-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Grant role to a restriction query returns "OK" response Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}/roles" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "string", "type": "roles" } } EOF ``` ##### Grant role to a restriction query returns "OK" response ``` // Grant role to a restriction query returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString(RoleDataID), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.AddRoleToRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) r, err := api.AddRoleToRestrictionQuery(ctx, RestrictionQueryDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.AddRoleToRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Grant role to a restriction query returns "OK" response ``` // Grant role to a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.addRoleToRestrictionQuery", true); LogsRestrictionQueriesApi 
apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); RelationshipToRole body = new RelationshipToRole() .data(new RelationshipToRoleData().id(ROLE_DATA_ID).type(RolesType.ROLES)); try { apiInstance.addRoleToRestrictionQuery(RESTRICTION_QUERY_DATA_ID, body); } catch (ApiException e) { System.err.println( "Exception when calling LogsRestrictionQueriesApi#addRoleToRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Grant role to a restriction query returns "OK" response ``` """ Grant role to a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = RelationshipToRole( data=RelationshipToRoleData( id=ROLE_DATA_ID, type=RolesType.ROLES, ), ) configuration = Configuration() configuration.unstable_operations["add_role_to_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) api_instance.add_role_to_restriction_query(restriction_query_id=RESTRICTION_QUERY_DATA_ID, body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Grant role to a restriction query returns "OK" response ``` # Grant role to a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.add_role_to_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, }), }) api_instance.add_role_to_restriction_query(RESTRICTION_QUERY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Grant role to a restriction query returns "OK" response ``` // Grant role to a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = RelationshipToRole::new().data( RelationshipToRoleData::new() .id(role_data_id.clone()) .type_(RolesType::ROLES), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.AddRoleToRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .add_role_to_restriction_query(restriction_query_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Grant role to a restriction query returns "OK" response ``` /** * Grant role to a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.addRoleToRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiAddRoleToRestrictionQueryRequest = { body: { data: { id: ROLE_DATA_ID, type: "roles", }, }, restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .addRoleToRestrictionQuery(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Revoke role from a restriction query](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#revoke-role-from-a-restriction-query) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#revoke-role-from-a-restriction-query-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.eu/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.ddog-gov.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roleshttps://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/{restriction_query_id}/roles ### Overview Removes a role from a restriction query. This endpoint requires the `user_access_manage` permission. ### Arguments #### Path Parameters Name Type Description restriction_query_id [_required_] string The ID of the restriction query. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Field Type Description data object Relationship to role object. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#RemoveRoleFromRestrictionQuery-204-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#RemoveRoleFromRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#RemoveRoleFromRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#RemoveRoleFromRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#RemoveRoleFromRestrictionQuery-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Revoke role from a restriction query Copy ``` # Path parameters export restriction_query_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/${restriction_query_id}/roles" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Revoke role from a restriction query ``` """ Revoke role from a restriction query returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi from datadog_api_client.v2.model.relationship_to_role import RelationshipToRole from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = environ["RESTRICTION_QUERY_DATA_ID"] # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = RelationshipToRole( data=RelationshipToRoleData( id=ROLE_DATA_ID, type=RolesType.ROLES, ), ) configuration = Configuration() configuration.unstable_operations["remove_role_from_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) api_instance.remove_role_from_restriction_query(restriction_query_id=RESTRICTION_QUERY_DATA_ID, body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Revoke role from a restriction query ``` # Revoke role from a restriction query returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| 
config.unstable_operations["v2.remove_role_from_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "restriction_query" in the system RESTRICTION_QUERY_DATA_ID = ENV["RESTRICTION_QUERY_DATA_ID"] # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::RelationshipToRole.new({ data: DatadogAPIClient::V2::RelationshipToRoleData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, }), }) api_instance.remove_role_from_restriction_query(RESTRICTION_QUERY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Revoke role from a restriction query ``` // Revoke role from a restriction query returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "restriction_query" in the system RestrictionQueryDataID := os.Getenv("RESTRICTION_QUERY_DATA_ID") // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV2.RelationshipToRole{ Data: &datadogV2.RelationshipToRoleData{ Id: datadog.PtrString(RoleDataID), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.RemoveRoleFromRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) r, err := api.RemoveRoleFromRestrictionQuery(ctx, RestrictionQueryDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.RemoveRoleFromRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Revoke role from a restriction query ``` // Revoke role from a restriction query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RelationshipToRole; import com.datadog.api.client.v2.model.RelationshipToRoleData; import com.datadog.api.client.v2.model.RolesType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.removeRoleFromRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "restriction_query" in the system String RESTRICTION_QUERY_DATA_ID = System.getenv("RESTRICTION_QUERY_DATA_ID"); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); RelationshipToRole body = new RelationshipToRole() .data(new 
RelationshipToRoleData().id(ROLE_DATA_ID).type(RolesType.ROLES)); try { apiInstance.removeRoleFromRestrictionQuery(RESTRICTION_QUERY_DATA_ID, body); } catch (ApiException e) { System.err.println( "Exception when calling LogsRestrictionQueriesApi#removeRoleFromRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Revoke role from a restriction query ``` // Revoke role from a restriction query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; use datadog_api_client::datadogV2::model::RelationshipToRole; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "restriction_query" in the system let restriction_query_data_id = std::env::var("RESTRICTION_QUERY_DATA_ID").unwrap(); // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = RelationshipToRole::new().data( RelationshipToRoleData::new() .id(role_data_id.clone()) .type_(RolesType::ROLES), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.RemoveRoleFromRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .remove_role_from_restriction_query(restriction_query_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Revoke role from a restriction query ``` /** * Revoke role from a restriction query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.removeRoleFromRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "restriction_query" in the system const RESTRICTION_QUERY_DATA_ID = process.env .RESTRICTION_QUERY_DATA_ID as string; // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiRemoveRoleFromRestrictionQueryRequest = { body: { data: { id: ROLE_DATA_ID, type: "roles", }, }, restrictionQueryId: RESTRICTION_QUERY_DATA_ID, }; apiInstance .removeRoleFromRestrictionQuery(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all restriction queries for a given user](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-all-restriction-queries-for-a-given-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-all-restriction-queries-for-a-given-user-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/user/{user_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/user/{user_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/user/{user_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/user/{user_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/user/{user_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/user/{user_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/user/{user_id} ### Overview Get all restriction queries for a given user. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListUserRestrictionQueries-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListUserRestrictionQueries-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListUserRestrictionQueries-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListUserRestrictionQueries-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#ListUserRestrictionQueries-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about multiple restriction queries. Field Type Description data [object] Array of returned restriction queries. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. 
default: `logs_restriction_queries` ``` { "data": [ { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Get all restriction queries for a given user Copy ``` # Path parameters export user_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/user/${user_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all restriction queries for a given user ``` """ Get all restriction queries for a given user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() configuration.unstable_operations["list_user_restriction_queries"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.list_user_restriction_queries( user_id=USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all restriction queries for a given user ``` # Get all restriction queries for a given user returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_user_restriction_queries".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] p api_instance.list_user_restriction_queries(USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all restriction queries for a given user ``` // Get all restriction queries for a given user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListUserRestrictionQueries", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.ListUserRestrictionQueries(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.ListUserRestrictionQueries`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.ListUserRestrictionQueries`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all restriction queries for a given user ``` // Get all restriction queries for a given user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RestrictionQueryListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listUserRestrictionQueries", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { RestrictionQueryListResponse result = apiInstance.listUserRestrictionQueries(USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when 
calling LogsRestrictionQueriesApi#listUserRestrictionQueries"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all restriction queries for a given user ``` // Get all restriction queries for a given user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListUserRestrictionQueries", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api .list_user_restriction_queries(user_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all restriction queries for a given user ``` /** * Get all restriction queries for a given user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listUserRestrictionQueries"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiListUserRestrictionQueriesRequest = { userId: USER_DATA_ID, }; apiInstance .listUserRestrictionQueries(params) .then((data: v2.RestrictionQueryListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get restriction query for a given role](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-restriction-query-for-a-given-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#get-restriction-query-for-a-given-role-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
GET https://api.ap1.datadoghq.com/api/v2/logs/config/restriction_queries/role/{role_id}https://api.ap2.datadoghq.com/api/v2/logs/config/restriction_queries/role/{role_id}https://api.datadoghq.eu/api/v2/logs/config/restriction_queries/role/{role_id}https://api.ddog-gov.com/api/v2/logs/config/restriction_queries/role/{role_id}https://api.datadoghq.com/api/v2/logs/config/restriction_queries/role/{role_id}https://api.us3.datadoghq.com/api/v2/logs/config/restriction_queries/role/{role_id}https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/role/{role_id} ### Overview Get restriction query for a given role. This endpoint requires the `logs_read_config` permission. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The ID of the role. ### Response * [200](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRoleRestrictionQuery-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRoleRestrictionQuery-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRoleRestrictionQuery-403-v2) * [404](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRoleRestrictionQuery-404-v2) * [429](https://docs.datadoghq.com/api/latest/logs-restriction-queries/#GetRoleRestrictionQuery-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) Response containing information about multiple restriction queries. Field Type Description data [object] Array of returned restriction queries. attributes object Attributes of the restriction query. created_at date-time Creation time of the restriction query. last_modifier_email string Email of the user who last modified this restriction query. last_modifier_name string Name of the user who last modified this restriction query. modified_at date-time Time of last restriction query modification. restriction_query string The query that defines the restriction. Only the content matching the query can be returned. role_count int64 Number of roles associated with this restriction query. user_count int64 Number of users associated with this restriction query. id string ID of the restriction query. type string Restriction queries type. default: `logs_restriction_queries` ``` { "data": [ { "attributes": { "created_at": "2020-03-17T21:06:44.000Z", "last_modifier_email": "user@example.com", "last_modifier_name": "John Doe", "modified_at": "2020-03-17T21:15:15.000Z", "restriction_query": "env:sandbox", "role_count": 3, "user_count": 5 }, "id": "79a0e60a-644a-11ea-ad29-43329f7f58b5", "type": "logs_restriction_queries" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) * [Example](https://docs.datadoghq.com/api/latest/logs-restriction-queries/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs-restriction-queries/?code-lang=typescript) ##### Get restriction query for a given role Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/config/restriction_queries/role/${role_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get restriction query for a given role ``` """ Get restriction query for a given role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_restriction_queries_api import LogsRestrictionQueriesApi # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_role_restriction_query"] = True with ApiClient(configuration) as api_client: api_instance = LogsRestrictionQueriesApi(api_client) response = api_instance.get_role_restriction_query( role_id=ROLE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get restriction query for a given role ``` # Get restriction query for a given role returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_role_restriction_query".to_sym] = true end api_instance = DatadogAPIClient::V2::LogsRestrictionQueriesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] p api_instance.get_role_restriction_query(ROLE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get restriction query for a given role ``` // Get restriction query for a given role returns "OK" 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetRoleRestrictionQuery", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsRestrictionQueriesApi(apiClient) resp, r, err := api.GetRoleRestrictionQuery(ctx, RoleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsRestrictionQueriesApi.GetRoleRestrictionQuery`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsRestrictionQueriesApi.GetRoleRestrictionQuery`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get restriction query for a given role ``` // Get restriction query for a given role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsRestrictionQueriesApi; import com.datadog.api.client.v2.model.RestrictionQueryListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getRoleRestrictionQuery", true); LogsRestrictionQueriesApi apiInstance = new LogsRestrictionQueriesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); try { RestrictionQueryListResponse result = apiInstance.getRoleRestrictionQuery(ROLE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling LogsRestrictionQueriesApi#getRoleRestrictionQuery"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get restriction query for a given role ``` // Get restriction query for a given role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs_restriction_queries::LogsRestrictionQueriesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetRoleRestrictionQuery", true); let api = LogsRestrictionQueriesAPI::with_config(configuration); let resp = api.get_role_restriction_query(role_data_id.clone()).await; if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get restriction query for a given role ``` /** * Get restriction query for a given role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getRoleRestrictionQuery"] = true; const apiInstance = new v2.LogsRestrictionQueriesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.LogsRestrictionQueriesApiGetRoleRestrictionQueryRequest = { roleId: ROLE_DATA_ID, }; apiInstance .getRoleRestrictionQuery(params) .then((data: v2.RestrictionQueryListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=105f844d-15b3-40cf-93c4-f8814fc1475d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8b7b43e2-a176-4c0b-980e-6825fa15f936&pt=Logs%20Restriction%20Queries&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-restriction-queries%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=105f844d-15b3-40cf-93c4-f8814fc1475d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8b7b43e2-a176-4c0b-980e-6825fa15f936&pt=Logs%20Restriction%20Queries&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-restriction-queries%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=2835aed7-0f24-44a4-8504-9fc2a2aabe6c&bo=2&sid=c4ba0d50f0bf11f09afde9c0412ba014&vid=c4ba67e0f0bf11f099fcdb9b6c7a6b3e&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Logs%20Restriction%20Queries&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs-restriction-queries%2F&r=<=2043&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=812273) --- # Source: https://docs.datadoghq.com/api/latest/logs/ # Logs Search your logs and send them to your Datadog platform over HTTP. See the [Log Management page](https://docs.datadoghq.com/logs/) for more information. 
## [Send logs](https://docs.datadoghq.com/api/latest/logs/#send-logs) * [v1](https://docs.datadoghq.com/api/latest/logs/#send-logs-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs/#send-logs-v2) POST https://http-intake.logs.ap1.datadoghq.com/v1/inputhttps://http-intake.logs.ap2.datadoghq.com/v1/inputhttps://http-intake.logs.datadoghq.eu/v1/inputhttps://http-intake.logs.ddog-gov.com/v1/inputhttps://http-intake.logs.datadoghq.com/v1/inputhttps://http-intake.logs.us3.datadoghq.com/v1/inputhttps://http-intake.logs.us5.datadoghq.com/v1/input ### Overview Send your logs to your Datadog platform over HTTP. Limits per HTTP request are: * Maximum content size per payload (uncompressed): 5MB * Maximum size for a single log: 1MB * Maximum array size if sending multiple logs in an array: 1000 entries Any log exceeding 1MB is accepted and truncated by Datadog: * For a single log request, the API truncates the log at 1MB and returns a 2xx. * For a multi-logs request, the API processes all logs, truncates only logs larger than 1MB, and returns a 2xx. Datadog recommends sending your logs compressed. Add the `Content-Encoding: gzip` header to the request when sending compressed logs. The status codes answered by the HTTP API are: * 200: OK * 400: Bad request (likely an issue in the payload formatting) * 403: Permission issue (likely using an invalid API Key) * 413: Payload too large (batch is above 5MB uncompressed) * 5xx: Internal error, request should be retried after some time ### Arguments #### Query Strings Name Type Description ddtags string Log tags can be passed as query parameters with `text/plain` content type. #### Header Parameters Name Type Description Content-Encoding string HTTP header used to compress the media-type. ### Request #### Body Data (required) Log to send (JSON format). * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Expand All Field Type Description ddsource string The integration name associated with your log: the technology from which the log originated. When it matches an integration name, Datadog automatically installs the corresponding parsers and facets. See [reserved attributes](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes). ddtags string Tags associated with your logs. hostname string The name of the originating host of the log. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. See [reserved attributes](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes). ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." 
response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Expand All Field Type Description ddsource string The integration name associated with your log: the technology from which the log originated. When it matches an integration name, Datadog automatically installs the corresponding parsers and facets. See [reserved attributes](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes). ddtags string Tags associated with your logs. hostname string The name of the originating host of the log. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. See [reserved attributes](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes). ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-400-v1) * [429](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-429-v1) Response from server (always 200 empty JSON). * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Expand All Field Type Description No response body ``` {} ``` Copy unexpected error * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Expand All Field Type Description code [_required_] int32 Error code. message [_required_] string Error message. ``` { "code": 0, "message": "Your browser sent an invalid request." } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. 
See one of the other client libraries for an example of sending deflate-compressed data. ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. See one of the other client libraries for an example of sending deflate-compressed data. ## Multi Logplex Messages # Submit log messages. See one of the other client libraries for an example of sending deflate-compressed data. ## Simple Logplex Message # Submit log string. See one of the other client libraries for an example of sending deflate-compressed data. ## Multi Raw Messages # Submit log string. See one of the other client libraries for an example of sending deflate-compressed data. ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. See one of the other client libraries for an example of sending deflate-compressed data. ``` ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. # Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. # Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/json;simple" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Multi Logplex Messages # Submit log messages. # Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple Logplex Message # Submit log string. 
# Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Multi Raw Messages # Submit log string. # Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. # Curl command echo $(cat << EOF [ { "message": "Example-Log", "ddtags": "host:ExampleLog" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ``` ##### Send logs returns "Response from server (always 200 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF [ { "message": "hello" }, { "message": "world" } ] EOF ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/json;simple" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "ddsource": "agent", "ddtags": "env:prod,user:joe.doe", "hostname": "fa1e1e739d95", "message": "hello world" } EOF ## Multi Logplex Messages # Submit log messages. 
# Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF hello world EOF ## Simple Logplex Message # Submit log string. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF hello world EOF ## Multi Raw Messages # Submit log string. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF hello world EOF ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/v1/input" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF hello world EOF ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` // Send deflate logs returns "Response from server (always 200 empty JSON)." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := []datadogV1.HTTPLogItem{ { Message: "Example-Log", Ddtags: datadog.PtrString("host:ExampleLog"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV1.CONTENTENCODING_DEFLATE)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` // Send gzip logs returns "Response from server (always 200 empty JSON)." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := []datadogV1.HTTPLogItem{ { Message: "Example-Log", Ddtags: datadog.PtrString("host:ExampleLog"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV1.CONTENTENCODING_GZIP)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` // Send logs returns "Response from server (always 200 empty JSON)." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := []datadogV1.HTTPLogItem{ { Message: "Example-Log", Ddtags: datadog.PtrString("host:ExampleLog"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV1.NewSubmitLogOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` // Send deflate logs returns "Response from server (always 200 empty JSON)." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsApi; import com.datadog.api.client.v1.api.LogsApi.SubmitLogOptionalParameters; import com.datadog.api.client.v1.model.ContentEncoding; import com.datadog.api.client.v1.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog")); try { apiInstance.submitLog( body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.DEFLATE)); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` // Send gzip logs returns "Response from server (always 200 empty JSON)." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsApi; import com.datadog.api.client.v1.api.LogsApi.SubmitLogOptionalParameters; import com.datadog.api.client.v1.model.ContentEncoding; import com.datadog.api.client.v1.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog")); try { apiInstance.submitLog( body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.GZIP)); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` // Send logs returns "Response from server (always 200 empty JSON)." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsApi; import com.datadog.api.client.v1.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem().message("Example-Log").ddtags("host:ExampleLog")); try { apiInstance.submitLog(body); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` """ Send deflate logs returns "Response from server (always 200 empty JSON)." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_api import LogsApi from datadog_api_client.v1.model.content_encoding import ContentEncoding from datadog_api_client.v1.model.http_log import HTTPLog from datadog_api_client.v1.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( message="Example-Log", ddtags="host:ExampleLog", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(content_encoding=ContentEncoding.DEFLATE, body=body) print(response) ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` """ Send gzip logs returns "Response from server (always 200 empty JSON)." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_api import LogsApi from datadog_api_client.v1.model.content_encoding import ContentEncoding from datadog_api_client.v1.model.http_log import HTTPLog from datadog_api_client.v1.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( message="Example-Log", ddtags="host:ExampleLog", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(content_encoding=ContentEncoding.GZIP, body=body) print(response) ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` """ Send logs returns "Response from server (always 200 empty JSON)." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_api import LogsApi from datadog_api_client.v1.model.http_log import HTTPLog from datadog_api_client.v1.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( message="Example-Log", ddtags="host:ExampleLog", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` # Send deflate logs returns "Response from server (always 200 empty JSON)." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsAPI.new body = [ DatadogAPIClient::V1::HTTPLogItem.new({ message: "Example-Log", ddtags: "host:ExampleLog", }), ] opts = { content_encoding: ContentEncoding::DEFLATE, } p api_instance.submit_log(body, opts) ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` # Send gzip logs returns "Response from server (always 200 empty JSON)." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsAPI.new body = [ DatadogAPIClient::V1::HTTPLogItem.new({ message: "Example-Log", ddtags: "host:ExampleLog", }), ] opts = { content_encoding: ContentEncoding::GZIP, } p api_instance.submit_log(body, opts) ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` # Send logs returns "Response from server (always 200 empty JSON)." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsAPI.new body = [ DatadogAPIClient::V1::HTTPLogItem.new({ message: "Example-Log", ddtags: "host:ExampleLog", }), ] p api_instance.submit_log(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` // Send deflate logs returns "Response from server (always 200 empty JSON)." 
// response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs::LogsAPI; use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV1::model::ContentEncoding; use datadog_api_client::datadogV1::model::HTTPLogItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new("Example-Log".to_string()) .ddtags("host:ExampleLog".to_string()) .additional_properties(BTreeMap::from([]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log( body, SubmitLogOptionalParams::default().content_encoding(ContentEncoding::DEFLATE), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` // Send gzip logs returns "Response from server (always 200 empty JSON)." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs::LogsAPI; use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV1::model::ContentEncoding; use datadog_api_client::datadogV1::model::HTTPLogItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new("Example-Log".to_string()) .ddtags("host:ExampleLog".to_string()) .additional_properties(BTreeMap::from([]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log( body, SubmitLogOptionalParams::default().content_encoding(ContentEncoding::GZIP), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` // Send logs returns "Response from server (always 200 empty JSON)." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs::LogsAPI; use datadog_api_client::datadogV1::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV1::model::HTTPLogItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new("Example-Log".to_string()) .ddtags("host:ExampleLog".to_string()) .additional_properties(BTreeMap::from([]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log(body, SubmitLogOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send deflate logs returns "Response from server (always 200 empty JSON)." response ``` /** * Send deflate logs returns "Response from server (always 200 empty JSON)." 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsApi(configuration); const params: v1.LogsApiSubmitLogRequest = { body: [ { message: "Example-Log", ddtags: "host:ExampleLog", }, ], contentEncoding: "deflate", }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send gzip logs returns "Response from server (always 200 empty JSON)." response ``` /** * Send gzip logs returns "Response from server (always 200 empty JSON)." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsApi(configuration); const params: v1.LogsApiSubmitLogRequest = { body: [ { message: "Example-Log", ddtags: "host:ExampleLog", }, ], contentEncoding: "gzip", }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send logs returns "Response from server (always 200 empty JSON)." response ``` /** * Send logs returns "Response from server (always 200 empty JSON)." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsApi(configuration); const params: v1.LogsApiSubmitLogRequest = { body: [ { message: "Example-Log", ddtags: "host:ExampleLog", }, ], }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` POST https://http-intake.logs.ap1.datadoghq.com/api/v2/logshttps://http-intake.logs.ap2.datadoghq.com/api/v2/logshttps://http-intake.logs.datadoghq.eu/api/v2/logshttps://http-intake.logs.ddog-gov.com/api/v2/logshttps://http-intake.logs.datadoghq.com/api/v2/logshttps://http-intake.logs.us3.datadoghq.com/api/v2/logshttps://http-intake.logs.us5.datadoghq.com/api/v2/logs ### Overview Send your logs to your Datadog platform over HTTP. Limits per HTTP request are: * Maximum content size per payload (uncompressed): 5MB * Maximum size for a single log: 1MB * Maximum array size if sending multiple logs in an array: 1000 entries Any log exceeding 1MB is accepted and truncated by Datadog: * For a single log request, the API truncates the log at 1MB and returns a 2xx. * For a multi-logs request, the API processes all logs, truncates only logs larger than 1MB, and returns a 2xx. Datadog recommends sending your logs compressed. Add the `Content-Encoding: gzip` header to the request when sending compressed logs. Log events can be submitted with a timestamp that is up to 18 hours in the past. 
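As a concrete illustration of the compression recommendation above, here is a minimal sketch that gzips a JSON batch before posting it to the v2 intake endpoint. It uses Python's standard `gzip` module together with the third-party `requests` library (an assumption for this sketch, not part of the official clients), the US1 `datadoghq.com` intake host shown in the examples below, and a `DD_API_KEY` environment variable.

```python
# Minimal sketch: submit a gzip-compressed batch to the v2 logs intake endpoint.
# Assumptions for illustration: the datadoghq.com (US1) intake host, the
# third-party `requests` package, and a DD_API_KEY environment variable.
import gzip
import json
import os

import requests

logs = [
    {
        "ddsource": "nginx",
        "ddtags": "env:staging,version:5.1",
        "hostname": "i-012345678",
        "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
        "service": "payment",
    }
]

# Compress the serialized batch; the Content-Encoding header tells the intake
# that the request body is gzip-compressed.
payload = gzip.compress(json.dumps(logs).encode("utf-8"))

response = requests.post(
    "https://http-intake.logs.datadoghq.com/api/v2/logs",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Content-Encoding": "gzip",
        "DD-API-KEY": os.environ["DD_API_KEY"],
    },
)

# 202 means the batch was accepted for processing.
print(response.status_code, response.text)
```

The official client libraries shown further down handle this for you through their content-encoding options (for example, `ContentEncoding.GZIP`).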
The status codes answered by the HTTP API are: * 202: Accepted: the request has been accepted for processing * 400: Bad request (likely an issue in the payload formatting) * 401: Unauthorized (likely a missing API Key) * 403: Permission issue (likely using an invalid API Key) * 408: Request Timeout, request should be retried after some time * 413: Payload too large (batch is above 5MB uncompressed) * 429: Too Many Requests, request should be retried after some time * 500: Internal Server Error, the server encountered an unexpected condition that prevented it from fulfilling the request, request should be retried after some time * 503: Service Unavailable, the server is not ready to handle the request probably because it is overloaded, request should be retried after some time ### Arguments #### Query Strings Name Type Description ddtags string Log tags can be passed as query parameters with `text/plain` content type. #### Header Parameters Name Type Description Content-Encoding string HTTP header used to compress the media-type. ### Request #### Body Data (required) Log to send (JSON format). * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Expand All Field Type Description ddsource string The integration name associated with your log: the technology from which the log originated. When it matches an integration name, Datadog automatically installs the corresponding parsers and facets. See [reserved attributes](https://docs.datadoghq.com/logs/log_configuration/attributes_naming_convention/#reserved-attributes). ddtags string Tags associated with your logs. hostname string The name of the originating host of the log. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_configuration/attributes_naming_convention/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. See [reserved attributes](https://docs.datadoghq.com/logs/log_configuration/attributes_naming_convention/#reserved-attributes). ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." 
response ``` [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment", "status": "info" } ] ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-202-v2) * [400](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-400-v2) * [401](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-401-v2) * [403](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-403-v2) * [408](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-408-v2) * [413](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-413-v2) * [429](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-429-v2) * [500](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-500-v2) * [503](https://docs.datadoghq.com/api/latest/logs/#SubmitLog-503-v2) Request accepted for processing (always 202 empty JSON). * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Request Timeout * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Payload Too Large * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Service Unavailable * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Invalid query performed. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. See one of the other client libraries for an example of sending deflate-compressed data. ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. See one of the other client libraries for an example of sending deflate-compressed data. ## Multi Logplex Messages # Submit log messages. See one of the other client libraries for an example of sending deflate-compressed data. ## Simple Logplex Message # Submit log string. See one of the other client libraries for an example of sending deflate-compressed data. ## Multi Raw Messages # Submit log string. See one of the other client libraries for an example of sending deflate-compressed data. ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. See one of the other client libraries for an example of sending deflate-compressed data. ``` ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. 
# Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. # Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Multi Logplex Messages # Submit log messages. # Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple Logplex Message # Submit log string. # Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Multi Raw Messages # Submit log string. 
# Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. # Curl command echo $(cat << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } ] EOF ) | gzip | curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "Content-Encoding: gzip" \ -H "DD-API-KEY: ${DD_API_KEY}" \ --data-binary @- ``` ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response Copy ``` ## Multi JSON Messages # Pass multiple log objects at once. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF [ { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello", "service": "payment" }, { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345679", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] World", "service": "payment" } ] EOF ## Simple JSON Message # Log attributes can be passed as `key:value` pairs in valid JSON messages. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } EOF ## Multi Logplex Messages # Submit log messages. 
# Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF 2019-11-19T14:37:58,995 INFO [process.name][20081] Hello 2019-11-19T14:37:58,995 INFO [process.name][20081] World EOF ## Simple Logplex Message # Submit log string. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: application/logplex-1" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF 2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World EOF ## Multi Raw Messages # Submit log string. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF 2019-11-19T14:37:58,995 INFO [process.name][20081] Hello 2019-11-19T14:37:58,995 INFO [process.name][20081] World EOF ## Simple Raw Message # Submit log string. Log attributes can be passed as query parameters in the URL. This enables the addition of tags or the source by using the `ddtags` and `ddsource` parameters: `?host=my-hostname&service=my-service&ddsource=my-source&ddtags=env:prod,user:my-user`. # Curl command curl -X POST "https://http-intake.logs.ap1.datadoghq.com"https://http-intake.logs.ap2.datadoghq.com"https://http-intake.logs.datadoghq.eu"https://http-intake.logs.ddog-gov.com"https://http-intake.logs.datadoghq.com"https://http-intake.logs.us3.datadoghq.com"https://http-intake.logs.us5.datadoghq.com/api/v2/logs" \ -H "Accept: application/json" \ -H "Content-Type: text/plain" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF 2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World EOF ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := []datadogV2.HTTPLogItem{ { Ddsource: datadog.PtrString("nginx"), Ddtags: datadog.PtrString("env:staging,version:5.1"), Hostname: datadog.PtrString("i-012345678"), Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", Service: datadog.PtrString("payment"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV2.CONTENTENCODING_DEFLATE)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := []datadogV2.HTTPLogItem{ { Ddsource: datadog.PtrString("nginx"), Ddtags: datadog.PtrString("env:staging,version:5.1"), Hostname: datadog.PtrString("i-012345678"), Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", Service: datadog.PtrString("payment"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters().WithContentEncoding(datadogV2.CONTENTENCODING_GZIP)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send logs returns "Request accepted for processing (always 202 empty JSON)." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := []datadogV2.HTTPLogItem{ { Ddsource: datadog.PtrString("nginx"), Ddtags: datadog.PtrString("env:staging,version:5.1"), Hostname: datadog.PtrString("i-012345678"), Message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", Service: datadog.PtrString("payment"), AdditionalProperties: map[string]interface{}{ "status": "info", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.SubmitLog(ctx, body, *datadogV2.NewSubmitLogOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.SubmitLog`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.SubmitLog`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.api.LogsApi.SubmitLogOptionalParameters; import com.datadog.api.client.v2.model.ContentEncoding; import com.datadog.api.client.v2.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem() .ddsource("nginx") .ddtags("env:staging,version:5.1") .hostname("i-012345678") .message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World") .service("payment")); try { apiInstance.submitLog( body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.DEFLATE)); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.api.LogsApi.SubmitLogOptionalParameters; import com.datadog.api.client.v2.model.ContentEncoding; import com.datadog.api.client.v2.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem() .ddsource("nginx") .ddtags("env:staging,version:5.1") .hostname("i-012345678") .message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World") .service("payment")); try { apiInstance.submitLog( body, new SubmitLogOptionalParameters().contentEncoding(ContentEncoding.GZIP)); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send logs returns "Request accepted for processing (always 202 empty JSON)." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.model.HTTPLogItem; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); List body = Collections.singletonList( new HTTPLogItem() .ddsource("nginx") .ddtags("env:staging,version:5.1") .hostname("i-012345678") .message("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World") .service("payment") .putAdditionalProperty("status", "info")); try { apiInstance.submitLog(body); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#submitLog"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` """ Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.content_encoding import ContentEncoding from datadog_api_client.v2.model.http_log import HTTPLog from datadog_api_client.v2.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( ddsource="nginx", ddtags="env:staging,version:5.1", hostname="i-012345678", message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service="payment", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(content_encoding=ContentEncoding.DEFLATE, body=body) print(response) ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` """ Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.content_encoding import ContentEncoding from datadog_api_client.v2.model.http_log import HTTPLog from datadog_api_client.v2.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( ddsource="nginx", ddtags="env:staging,version:5.1", hostname="i-012345678", message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service="payment", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(content_encoding=ContentEncoding.GZIP, body=body) print(response) ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` """ Send logs returns "Request accepted for processing (always 202 empty JSON)." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.http_log import HTTPLog from datadog_api_client.v2.model.http_log_item import HTTPLogItem body = HTTPLog( [ HTTPLogItem( ddsource="nginx", ddtags="env:staging,version:5.1", hostname="i-012345678", message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service="payment", status="info", ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.submit_log(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` # Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = [ DatadogAPIClient::V2::HTTPLogItem.new({ ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }), ] opts = { content_encoding: ContentEncoding::DEFLATE, } p api_instance.submit_log(body, opts) ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` # Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = [ DatadogAPIClient::V2::HTTPLogItem.new({ ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }), ] opts = { content_encoding: ContentEncoding::GZIP, } p api_instance.submit_log(body, opts) ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` # Send logs returns "Request accepted for processing (always 202 empty JSON)." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = [ DatadogAPIClient::V2::HTTPLogItem.new({ ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", status: "info", }), ] p api_instance.submit_log(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send deflate logs returns "Request accepted for processing (always 202 empty // JSON)." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV2::model::ContentEncoding; use datadog_api_client::datadogV2::model::HTTPLogItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(), ) .ddsource("nginx".to_string()) .ddtags("env:staging,version:5.1".to_string()) .hostname("i-012345678".to_string()) .service("payment".to_string()) .additional_properties(BTreeMap::from([]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log( body, SubmitLogOptionalParams::default().content_encoding(ContentEncoding::DEFLATE), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send gzip logs returns "Request accepted for processing (always 202 empty // JSON)." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV2::model::ContentEncoding; use datadog_api_client::datadogV2::model::HTTPLogItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(), ) .ddsource("nginx".to_string()) .ddtags("env:staging,version:5.1".to_string()) .hostname("i-012345678".to_string()) .service("payment".to_string()) .additional_properties(BTreeMap::from([]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log( body, SubmitLogOptionalParams::default().content_encoding(ContentEncoding::GZIP), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` // Send logs returns "Request accepted for processing (always 202 empty JSON)." // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::api_logs::SubmitLogOptionalParams; use datadog_api_client::datadogV2::model::HTTPLogItem; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = vec![HTTPLogItem::new( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World".to_string(), ) .ddsource("nginx".to_string()) .ddtags("env:staging,version:5.1".to_string()) .hostname("i-012345678".to_string()) .service("payment".to_string()) .additional_properties(BTreeMap::from([( "status".to_string(), Value::from("info"), )]))]; let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .submit_log(body, SubmitLogOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response ``` /** * Send deflate logs returns "Request accepted for processing (always 202 empty JSON)." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiSubmitLogRequest = { body: [ { ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }, ], contentEncoding: "deflate", }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." response ``` /** * Send gzip logs returns "Request accepted for processing (always 202 empty JSON)." 
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiSubmitLogRequest = { body: [ { ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }, ], contentEncoding: "gzip", }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Send logs returns "Request accepted for processing (always 202 empty JSON)." response ``` /** * Send logs returns "Request accepted for processing (always 202 empty JSON)." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiSubmitLogRequest = { body: [ { ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", additionalProperties: { status: "info", }, }, ], }; apiInstance .submitLog(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Aggregate events](https://docs.datadoghq.com/api/latest/logs/#aggregate-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs/#aggregate-events-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/analytics/aggregatehttps://api.ap2.datadoghq.com/api/v2/logs/analytics/aggregatehttps://api.datadoghq.eu/api/v2/logs/analytics/aggregatehttps://api.ddog-gov.com/api/v2/logs/analytics/aggregatehttps://api.datadoghq.com/api/v2/logs/analytics/aggregatehttps://api.us3.datadoghq.com/api/v2/logs/analytics/aggregatehttps://api.us5.datadoghq.com/api/v2/logs/analytics/aggregate ### Overview The API endpoint to aggregate events into buckets and compute metrics and timeseries. This endpoint requires the `logs_read_data` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Field Type Description compute [object] The list of metrics or timeseries to compute for the retrieved buckets. aggregation [_required_] enum An aggregation function Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` interval string The time buckets' size (only used for type=timeseries) Defaults to a resolution of 150 points metric string The metric to use type enum The type of compute Allowed enum values: `timeseries,total` default: `total` filter object The search and filter query settings from string The minimum time for the requested logs, supports date math and regular timestamps (milliseconds). default: `now-15m` indexes [string] For customers with multiple indexes, the indexes to search. Defaults to ['*'] which means all indexes. 
default: `*` query string The search query - following the log search syntax. default: `*` storage_tier enum Specifies storage type as indexes, online-archives or flex Allowed enum values: `indexes,online-archives,flex` default: `indexes` to string The maximum time for the requested logs, supports date math and regular timestamps (milliseconds). default: `now` group_by [object] The rules for the group by facet [_required_] string The name of the facet to use (required) histogram object Used to perform a histogram computation (only for measure facets). Note: at most 100 buckets are allowed, the number of buckets is (max - min)/interval. interval [_required_] double The bin size of the histogram buckets max [_required_] double The maximum value for the measure used in the histogram (values greater than this one are filtered out) min [_required_] double The minimum value for the measure used in the histogram (values smaller than this one are filtered out) limit int64 The maximum buckets to return for this group by. Note: at most 10000 buckets are allowed. If grouping by multiple facets, the product of limits must not exceed 10000. default: `10` missing The value to use for logs that don't have the facet used to group by Option 1 string The missing value to use if there is string valued facet. Option 2 double The missing value to use if there is a number valued facet. sort object A sort rule aggregation enum An aggregation function Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` metric string The metric to sort by (only used for `type=measure`) order enum The order to use, ascending or descending Allowed enum values: `asc,desc` type enum The type of sorting algorithm Allowed enum values: `alphabetical,measure` default: `alphabetical` total A resulting object to put the given computes in over all the matching records. Option 1 boolean If set to true, creates an additional bucket labeled "$facet_total" Option 2 string A string to use as the key value for the total bucket Option 3 double A number to use as the key value for the total bucket options object **DEPRECATED** : Global query options that are used during the query. Note: These fields are currently deprecated and do not affect the query results. timeOffset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging settings cursor string The returned paging point to use to get the next results. Note: at most 1000 results can be paged. 
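To make the `histogram` group-by concrete: with `min=0`, `max=10`, and `interval=1`, the request produces `(10 - 0) / 1 = 10` buckets over the measure. The sketch below uses the Python client under a few assumptions: the `@duration` measure facet, the nanosecond-scale bounds, and the generated model name `LogsGroupByHistogram` are illustrative placeholders rather than values taken from this reference.

```
"""
Histogram group-by sketch (assumptions: a `@duration` measure facet exists,
and the generated models match this client version).
With min=0, max=10_000_000_000 and interval=1_000_000_000 the API returns
(max - min) / interval = 10 fixed-width buckets.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest
from datadog_api_client.v2.model.logs_group_by import LogsGroupBy
from datadog_api_client.v2.model.logs_group_by_histogram import LogsGroupByHistogram
from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter

body = LogsAggregateRequest(
    filter=LogsQueryFilter(_from="now-15m", to="now", query="*"),
    group_by=[
        LogsGroupBy(
            facet="@duration",  # must be a measure facet (assumed to exist)
            histogram=LogsGroupByHistogram(
                interval=1_000_000_000.0,  # bin width
                min=0.0,                   # values below this are filtered out
                max=10_000_000_000.0,      # values above this are filtered out
            ),
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsApi(api_client)
    print(api_instance.aggregate_logs(body=body))
```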
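The `page.cursor` request field and the `meta.page.after` response field described above pair up for cursor-based paging through buckets (at most 1000 results can be paged). A minimal sketch with the Python client follows; the `host` facet, the bucket limit of 10, and the generated model name `LogsAggregateRequestPage` are assumptions for illustration, not prescriptions.

```
"""
Cursor pagination sketch: repeat the same aggregate request, adding the
`page[cursor]` returned in `meta.page.after`, until no cursor comes back.
Model names are assumed to match the generated datadog-api-client models.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.logs_api import LogsApi
from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest
from datadog_api_client.v2.model.logs_aggregate_request_page import LogsAggregateRequestPage
from datadog_api_client.v2.model.logs_group_by import LogsGroupBy
from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = LogsApi(api_client)
    cursor = None
    while True:
        kwargs = {
            "filter": LogsQueryFilter(_from="now-15m", to="now", query="*"),
            "group_by": [LogsGroupBy(facet="host", limit=10)],
        }
        if cursor:
            # Resume from the cursor returned by the previous call.
            kwargs["page"] = LogsAggregateRequestPage(cursor=cursor)
        response = api_instance.aggregate_logs(body=LogsAggregateRequest(**kwargs))
        # Unset attributes raise AttributeError in the generated models,
        # so read them defensively with getattr.
        data = getattr(response, "data", None)
        for bucket in getattr(data, "buckets", None) or []:
            print(getattr(bucket, "by", {}))
        meta = getattr(response, "meta", None)
        page = getattr(meta, "page", None)
        cursor = getattr(page, "after", None)
        if not cursor:
            break
```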
##### Aggregate compute events returns "OK" response ``` { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" } } ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" }, "group_by": [ { "facet": "host", "missing": "miss", "sort": { "type": "measure", "order": "asc", "aggregation": "pc90", "metric": "@duration" } } ] } ``` Copy ##### Aggregate events returns "OK" response ``` { "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs/#AggregateLogs-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs/#AggregateLogs-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs/#AggregateLogs-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs/#AggregateLogs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) The response object for the logs aggregate API endpoint Field Type Description data object The query results buckets [object] The list of matching buckets, one item per bucket by object The key, value pairs for each group by The values for each group by computes object A map of the metric name -> value for regular compute or list of values for a timeseries A bucket value, can be either a timeseries or a single value Option 1 string A single string value Option 2 double A single number value Option 3 [object] A timeseries array time string The time value for this point value double The value for this point meta object The metadata associated with a request elapsed int64 The time elapsed in milliseconds page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request status enum The status of the response Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. code string A unique code for this type of warning detail string A detailed explanation of this specific warning title string A short human-readable summary of the warning ``` { "data": { "buckets": [ { "by": { "": "undefined" }, "computes": { "": { "description": "undefined", "type": "undefined" } } } ] }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Aggregate compute events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" } } EOF ``` ##### Aggregate compute events with group by returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" }, "group_by": [ { "facet": "host", "missing": "miss", "sort": { "type": "measure", "order": "asc", "aggregation": "pc90", "metric": "@duration" } } ] } EOF ``` ##### Aggregate events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "indexes": [ "main" ], "query": "*", "to": "now" } } EOF ``` ##### Aggregate compute events returns "OK" response ``` // Aggregate compute events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsAggregateRequest{ Compute: []datadogV2.LogsCompute{ { Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_COUNT, Interval: datadog.PtrString("5m"), Type: datadogV2.LOGSCOMPUTETYPE_TIMESERIES.Ptr(), }, }, Filter: &datadogV2.LogsQueryFilter{ From: datadog.PtrString("now-15m"), 
Indexes: []string{ "main", }, Query: datadog.PtrString("*"), To: datadog.PtrString("now"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.AggregateLogs(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent) } ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` // Aggregate compute events with group by returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsAggregateRequest{ Compute: []datadogV2.LogsCompute{ { Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_COUNT, Interval: datadog.PtrString("5m"), Type: datadogV2.LOGSCOMPUTETYPE_TIMESERIES.Ptr(), }, }, Filter: &datadogV2.LogsQueryFilter{ From: datadog.PtrString("now-15m"), Indexes: []string{ "main", }, Query: datadog.PtrString("*"), To: datadog.PtrString("now"), }, GroupBy: []datadogV2.LogsGroupBy{ { Facet: "host", Missing: &datadogV2.LogsGroupByMissing{ LogsGroupByMissingString: datadog.PtrString("miss")}, Sort: &datadogV2.LogsAggregateSort{ Type: datadogV2.LOGSAGGREGATESORTTYPE_MEASURE.Ptr(), Order: datadogV2.LOGSSORTORDER_ASCENDING.Ptr(), Aggregation: datadogV2.LOGSAGGREGATIONFUNCTION_PERCENTILE_90.Ptr(), Metric: datadog.PtrString("@duration"), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.AggregateLogs(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent) } ``` Copy ##### Aggregate events returns "OK" response ``` // Aggregate events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsAggregateRequest{ Filter: &datadogV2.LogsQueryFilter{ From: datadog.PtrString("now-15m"), Indexes: []string{ "main", }, Query: datadog.PtrString("*"), To: datadog.PtrString("now"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.AggregateLogs(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.AggregateLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.AggregateLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Aggregate compute events returns "OK" response ``` // Aggregate compute events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.model.LogsAggregateRequest; import com.datadog.api.client.v2.model.LogsAggregateResponse; import com.datadog.api.client.v2.model.LogsAggregationFunction; import com.datadog.api.client.v2.model.LogsCompute; import com.datadog.api.client.v2.model.LogsComputeType; import com.datadog.api.client.v2.model.LogsQueryFilter; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsAggregateRequest body = new LogsAggregateRequest() .compute( Collections.singletonList( new LogsCompute() .aggregation(LogsAggregationFunction.COUNT) .interval("5m") .type(LogsComputeType.TIMESERIES))) .filter( new LogsQueryFilter() .from("now-15m") .indexes(Collections.singletonList("main")) .query("*") .to("now")); try { LogsAggregateResponse result = apiInstance.aggregateLogs(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#aggregateLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` // Aggregate compute events with group by returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.model.LogsAggregateRequest; import com.datadog.api.client.v2.model.LogsAggregateResponse; import com.datadog.api.client.v2.model.LogsAggregateSort; import com.datadog.api.client.v2.model.LogsAggregateSortType; import com.datadog.api.client.v2.model.LogsAggregationFunction; import com.datadog.api.client.v2.model.LogsCompute; import com.datadog.api.client.v2.model.LogsComputeType; import com.datadog.api.client.v2.model.LogsGroupBy; import com.datadog.api.client.v2.model.LogsGroupByMissing; import com.datadog.api.client.v2.model.LogsQueryFilter; import com.datadog.api.client.v2.model.LogsSortOrder; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsAggregateRequest body = new LogsAggregateRequest() .compute( Collections.singletonList( new LogsCompute() .aggregation(LogsAggregationFunction.COUNT) .interval("5m") .type(LogsComputeType.TIMESERIES))) .filter( new LogsQueryFilter() .from("now-15m") .indexes(Collections.singletonList("main")) .query("*") .to("now")) .groupBy( Collections.singletonList( new LogsGroupBy() .facet("host") .missing(new LogsGroupByMissing("miss")) .sort( new LogsAggregateSort() .type(LogsAggregateSortType.MEASURE) .order(LogsSortOrder.ASCENDING) .aggregation(LogsAggregationFunction.PERCENTILE_90) .metric("@duration")))); try { LogsAggregateResponse result = apiInstance.aggregateLogs(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception 
when calling LogsApi#aggregateLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Aggregate events returns "OK" response ``` // Aggregate events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.model.LogsAggregateRequest; import com.datadog.api.client.v2.model.LogsAggregateResponse; import com.datadog.api.client.v2.model.LogsQueryFilter; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsAggregateRequest body = new LogsAggregateRequest() .filter( new LogsQueryFilter() .from("now-15m") .indexes(Collections.singletonList("main")) .query("*") .to("now")); try { LogsAggregateResponse result = apiInstance.aggregateLogs(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#aggregateLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Aggregate compute events returns "OK" response ``` """ Aggregate compute events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest from datadog_api_client.v2.model.logs_aggregation_function import LogsAggregationFunction from datadog_api_client.v2.model.logs_compute import LogsCompute from datadog_api_client.v2.model.logs_compute_type import LogsComputeType from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter body = LogsAggregateRequest( compute=[ LogsCompute( aggregation=LogsAggregationFunction.COUNT, interval="5m", type=LogsComputeType.TIMESERIES, ), ], filter=LogsQueryFilter( _from="now-15m", indexes=[ "main", ], query="*", to="now", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.aggregate_logs(body=body) print(response) ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` """ Aggregate compute events with group by returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest from datadog_api_client.v2.model.logs_aggregate_sort import LogsAggregateSort from datadog_api_client.v2.model.logs_aggregate_sort_type import LogsAggregateSortType from datadog_api_client.v2.model.logs_aggregation_function import LogsAggregationFunction from datadog_api_client.v2.model.logs_compute import LogsCompute from datadog_api_client.v2.model.logs_compute_type import LogsComputeType 
from datadog_api_client.v2.model.logs_group_by import LogsGroupBy from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter from datadog_api_client.v2.model.logs_sort_order import LogsSortOrder body = LogsAggregateRequest( compute=[ LogsCompute( aggregation=LogsAggregationFunction.COUNT, interval="5m", type=LogsComputeType.TIMESERIES, ), ], filter=LogsQueryFilter( _from="now-15m", indexes=[ "main", ], query="*", to="now", ), group_by=[ LogsGroupBy( facet="host", missing="miss", sort=LogsAggregateSort( type=LogsAggregateSortType.MEASURE, order=LogsSortOrder.ASCENDING, aggregation=LogsAggregationFunction.PERCENTILE_90, metric="@duration", ), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.aggregate_logs(body=body) print(response) ``` Copy ##### Aggregate events returns "OK" response ``` """ Aggregate events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.logs_aggregate_request import LogsAggregateRequest from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter body = LogsAggregateRequest( filter=LogsQueryFilter( _from="now-15m", indexes=[ "main", ], query="*", to="now", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.aggregate_logs(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Aggregate compute events returns "OK" response ``` # Aggregate compute events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = DatadogAPIClient::V2::LogsAggregateRequest.new({ compute: [ DatadogAPIClient::V2::LogsCompute.new({ aggregation: DatadogAPIClient::V2::LogsAggregationFunction::COUNT, interval: "5m", type: DatadogAPIClient::V2::LogsComputeType::TIMESERIES, }), ], filter: DatadogAPIClient::V2::LogsQueryFilter.new({ from: "now-15m", indexes: [ "main", ], query: "*", to: "now", }), }) p api_instance.aggregate_logs(body) ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` # Aggregate compute events with group by returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = DatadogAPIClient::V2::LogsAggregateRequest.new({ compute: [ DatadogAPIClient::V2::LogsCompute.new({ aggregation: DatadogAPIClient::V2::LogsAggregationFunction::COUNT, interval: "5m", type: DatadogAPIClient::V2::LogsComputeType::TIMESERIES, }), ], filter: DatadogAPIClient::V2::LogsQueryFilter.new({ from: "now-15m", indexes: [ "main", ], query: "*", to: "now", }), group_by: [ DatadogAPIClient::V2::LogsGroupBy.new({ facet: "host", missing: "miss", sort: DatadogAPIClient::V2::LogsAggregateSort.new({ type: DatadogAPIClient::V2::LogsAggregateSortType::MEASURE, order: DatadogAPIClient::V2::LogsSortOrder::ASCENDING, aggregation: DatadogAPIClient::V2::LogsAggregationFunction::PERCENTILE_90, metric: "@duration", }), }), ], }) p api_instance.aggregate_logs(body) ``` Copy ##### Aggregate events returns "OK" response ``` # Aggregate 
events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = DatadogAPIClient::V2::LogsAggregateRequest.new({ filter: DatadogAPIClient::V2::LogsQueryFilter.new({ from: "now-15m", indexes: [ "main", ], query: "*", to: "now", }), }) p api_instance.aggregate_logs(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Aggregate compute events returns "OK" response ``` // Aggregate compute events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::model::LogsAggregateRequest; use datadog_api_client::datadogV2::model::LogsAggregationFunction; use datadog_api_client::datadogV2::model::LogsCompute; use datadog_api_client::datadogV2::model::LogsComputeType; use datadog_api_client::datadogV2::model::LogsQueryFilter; #[tokio::main] async fn main() { let body = LogsAggregateRequest::new() .compute(vec![LogsCompute::new(LogsAggregationFunction::COUNT) .interval("5m".to_string()) .type_(LogsComputeType::TIMESERIES)]) .filter( LogsQueryFilter::new() .from("now-15m".to_string()) .indexes(vec!["main".to_string()]) .query("*".to_string()) .to("now".to_string()), ); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api.aggregate_logs(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` // Aggregate compute events with group by returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::model::LogsAggregateRequest; use datadog_api_client::datadogV2::model::LogsAggregateSort; use datadog_api_client::datadogV2::model::LogsAggregateSortType; use datadog_api_client::datadogV2::model::LogsAggregationFunction; use datadog_api_client::datadogV2::model::LogsCompute; use datadog_api_client::datadogV2::model::LogsComputeType; use datadog_api_client::datadogV2::model::LogsGroupBy; use datadog_api_client::datadogV2::model::LogsGroupByMissing; use datadog_api_client::datadogV2::model::LogsQueryFilter; use datadog_api_client::datadogV2::model::LogsSortOrder; #[tokio::main] async fn main() { let body = LogsAggregateRequest::new() .compute(vec![LogsCompute::new(LogsAggregationFunction::COUNT) .interval("5m".to_string()) .type_(LogsComputeType::TIMESERIES)]) .filter( LogsQueryFilter::new() .from("now-15m".to_string()) .indexes(vec!["main".to_string()]) .query("*".to_string()) .to("now".to_string()), ) .group_by(vec![LogsGroupBy::new("host".to_string()) .missing(LogsGroupByMissing::LogsGroupByMissingString( "miss".to_string(), )) .sort( LogsAggregateSort::new() .aggregation(LogsAggregationFunction::PERCENTILE_90) .metric("@duration".to_string()) .order(LogsSortOrder::ASCENDING) .type_(LogsAggregateSortType::MEASURE), )]); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api.aggregate_logs(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### 
Aggregate events returns "OK" response ``` // Aggregate events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::model::LogsAggregateRequest; use datadog_api_client::datadogV2::model::LogsQueryFilter; #[tokio::main] async fn main() { let body = LogsAggregateRequest::new().filter( LogsQueryFilter::new() .from("now-15m".to_string()) .indexes(vec!["main".to_string()]) .query("*".to_string()) .to("now".to_string()), ); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api.aggregate_logs(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Aggregate compute events returns "OK" response ``` /** * Aggregate compute events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiAggregateLogsRequest = { body: { compute: [ { aggregation: "count", interval: "5m", type: "timeseries", }, ], filter: { from: "now-15m", indexes: ["main"], query: "*", to: "now", }, }, }; apiInstance .aggregateLogs(params) .then((data: v2.LogsAggregateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Aggregate compute events with group by returns "OK" response ``` /** * Aggregate compute events with group by returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiAggregateLogsRequest = { body: { compute: [ { aggregation: "count", interval: "5m", type: "timeseries", }, ], filter: { from: "now-15m", indexes: ["main"], query: "*", to: "now", }, groupBy: [ { facet: "host", missing: "miss", sort: { type: "measure", order: "asc", aggregation: "pc90", metric: "@duration", }, }, ], }, }; apiInstance .aggregateLogs(params) .then((data: v2.LogsAggregateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Aggregate events returns "OK" response ``` /** * Aggregate events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiAggregateLogsRequest = { body: { filter: { from: "now-15m", indexes: ["main"], query: "*", to: "now", }, }, }; apiInstance .aggregateLogs(params) .then((data: v2.LogsAggregateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search logs](https://docs.datadoghq.com/api/latest/logs/#search-logs) * [v1 (latest)](https://docs.datadoghq.com/api/latest/logs/#search-logs-v1) POST https://api.ap1.datadoghq.com/api/v1/logs-queries/listhttps://api.ap2.datadoghq.com/api/v1/logs-queries/listhttps://api.datadoghq.eu/api/v1/logs-queries/listhttps://api.ddog-gov.com/api/v1/logs-queries/listhttps://api.datadoghq.com/api/v1/logs-queries/listhttps://api.us3.datadoghq.com/api/v1/logs-queries/listhttps://api.us5.datadoghq.com/api/v1/logs-queries/list ### Overview List endpoint returns logs that match a log search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). **If you are considering archiving logs for your organization, consider use of the Datadog archive capabilities instead of the log list API. See[Datadog Logs Archive documentation](https://docs.datadoghq.com/logs/archives).** This endpoint requires the `logs_read_data` permission. ### Request #### Body Data (required) Logs filter * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Field Type Description index string The log index on which the request is performed. For multi-index organizations, the default is all live indexes. Historical indexes of rehydrated logs must be specified. limit int32 Number of logs return in the response. query string The search query - following the log search syntax. sort enum Time-ascending `asc` or time-descending `desc` results. Allowed enum values: `asc,desc` startAt string Hash identifier of the first log to return in the list, available in a log `id` attribute. This parameter is used for the pagination feature. **Note** : This parameter is ignored if the corresponding log is out of the scope of the specified time window. time [_required_] object Timeframe to retrieve the log from. from [_required_] date-time Minimum timestamp for requested logs. timezone string Timezone can be specified both as an offset (for example "UTC+03:00") or a regional zone (for example "Europe/Paris"). to [_required_] date-time Maximum timestamp for requested logs. ``` { "index": "main", "query": "host:Test*", "sort": "asc", "time": { "from": "2021-11-11T10:11:11+00:00", "timezone": "Europe/Paris", "to": "2021-11-11T11:11:11+00:00" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs/#ListLogs-200-v1) * [400](https://docs.datadoghq.com/api/latest/logs/#ListLogs-400-v1) * [403](https://docs.datadoghq.com/api/latest/logs/#ListLogs-403-v1) * [429](https://docs.datadoghq.com/api/latest/logs/#ListLogs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Response object with all logs matching the request and pagination information. Field Type Description logs [object] Array of logs matching the request and the `nextLogId` if sent. content object JSON object containing all log attributes and their associated values. attributes object JSON object of attributes from your log. 
host string Name of the machine from where the logs are being sent. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. tags [string] Array of tags associated with your log. timestamp date-time Timestamp of your log. id string ID of the Log. nextLogId string Hash identifier of the next log to return in the list. This parameter is used for the pagination feature. status string Status of the response. ``` { "logs": [ { "content": { "attributes": { "customAttribute": 123, "duration": 2345 }, "host": "i-0123", "message": "Host connected to remote", "service": "agent", "tags": [ "team:A" ], "timestamp": "2020-05-26T13:36:14Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA" } ], "nextLogId": "string", "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Response returned by the Logs API when errors occur. Field Type Description error object Error returned by the Logs API code string Code identifying the error details [object] Additional error details message string Error message ``` { "error": { "code": "string", "details": [], "message": "string" } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Search test logs returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/logs-queries/list" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "index": "main", "query": "host:Test*", "sort": "asc", "time": { "from": "2021-11-11T10:11:11+00:00", "timezone": "Europe/Paris", "to": "2021-11-11T11:11:11+00:00" } } EOF ``` ##### Search test logs returns "OK" response ``` // Search test logs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.LogsListRequest{ Index: datadog.PtrString("main"), Query: datadog.PtrString("host:Test*"), Sort: datadogV1.LOGSSORT_TIME_ASCENDING.Ptr(), Time: datadogV1.LogsListRequestTime{ From: time.Now().Add(time.Hour * -1), Timezone: datadog.PtrString("Europe/Paris"), To: time.Now(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewLogsApi(apiClient) resp, r, err := api.ListLogs(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.ListLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.ListLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search test logs returns "OK" response ``` // Search test logs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.LogsApi; import com.datadog.api.client.v1.model.LogsListRequest; import com.datadog.api.client.v1.model.LogsListRequestTime; import com.datadog.api.client.v1.model.LogsListResponse; import com.datadog.api.client.v1.model.LogsSort; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsListRequest body = new LogsListRequest() .index("main") .query("host:Test*") .sort(LogsSort.TIME_ASCENDING) .time( new LogsListRequestTime() .from(OffsetDateTime.now().plusHours(-1)) .timezone("Europe/Paris") 
.to(OffsetDateTime.now())); try { LogsListResponse result = apiInstance.listLogs(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#listLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search test logs returns "OK" response ``` """ Search test logs returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.logs_api import LogsApi from datadog_api_client.v1.model.logs_list_request import LogsListRequest from datadog_api_client.v1.model.logs_list_request_time import LogsListRequestTime from datadog_api_client.v1.model.logs_sort import LogsSort body = LogsListRequest( index="main", query="host:Test*", sort=LogsSort.TIME_ASCENDING, time=LogsListRequestTime( _from=(datetime.now() + relativedelta(hours=-1)), timezone="Europe/Paris", to=datetime.now(), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.list_logs(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search test logs returns "OK" response ``` # Search test logs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::LogsAPI.new body = DatadogAPIClient::V1::LogsListRequest.new({ index: "main", query: "host:Test*", sort: DatadogAPIClient::V1::LogsSort::TIME_ASCENDING, time: DatadogAPIClient::V1::LogsListRequestTime.new({ from: (Time.now + -1 * 3600), timezone: "Europe/Paris", to: Time.now, }), }) p api_instance.list_logs(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search test logs returns "OK" response ``` // Search test logs returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_logs::LogsAPI; use datadog_api_client::datadogV1::model::LogsListRequest; use datadog_api_client::datadogV1::model::LogsListRequestTime; use datadog_api_client::datadogV1::model::LogsSort; #[tokio::main] async fn main() { let body = LogsListRequest::new( LogsListRequestTime::new( DateTime::parse_from_rfc3339("2021-11-11T10:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), DateTime::parse_from_rfc3339("2021-11-11T11:11:11+00:00") .expect("Failed to 
parse datetime") .with_timezone(&Utc), ) .timezone("Europe/Paris".to_string()), ) .index("main".to_string()) .query("host:Test*".to_string()) .sort(LogsSort::TIME_ASCENDING); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api.list_logs(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search test logs returns "OK" response ``` /** * Search test logs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.LogsApi(configuration); const params: v1.LogsApiListLogsRequest = { body: { index: "main", query: "host:Test*", sort: "asc", time: { from: new Date(new Date().getTime() + -1 * 3600 * 1000), timezone: "Europe/Paris", to: new Date(), }, }, }; apiInstance .listLogs(params) .then((data: v1.LogsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search logs (POST)](https://docs.datadoghq.com/api/latest/logs/#search-logs-post) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs/#search-logs-post-v2) POST https://api.ap1.datadoghq.com/api/v2/logs/events/searchhttps://api.ap2.datadoghq.com/api/v2/logs/events/searchhttps://api.datadoghq.eu/api/v2/logs/events/searchhttps://api.ddog-gov.com/api/v2/logs/events/searchhttps://api.datadoghq.com/api/v2/logs/events/searchhttps://api.us3.datadoghq.com/api/v2/logs/events/searchhttps://api.us5.datadoghq.com/api/v2/logs/events/search ### Overview List endpoint returns logs that match a log search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to search and filter your logs. **If you are considering archiving logs for your organization, consider use of the Datadog archive capabilities instead of the log list API. See[Datadog Logs Archive documentation](https://docs.datadoghq.com/logs/archives).** This endpoint requires the `logs_read_data` permission. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Field Type Description filter object The search and filter query settings from string The minimum time for the requested logs, supports date math and regular timestamps (milliseconds). default: `now-15m` indexes [string] For customers with multiple indexes, the indexes to search. Defaults to ['*'] which means all indexes. default: `*` query string The search query - following the log search syntax. 
default: `*` storage_tier enum Specifies storage type as indexes, online-archives or flex Allowed enum values: `indexes,online-archives,flex` default: `indexes` to string The maximum time for the requested logs, supports date math and regular timestamps (milliseconds). default: `now` options object **DEPRECATED** : Global query options that are used during the query. Note: These fields are currently deprecated and do not affect the query results. timeOffset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing logs. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of logs in the response. default: `10` sort enum Sort parameters when querying logs. Allowed enum values: `timestamp,-timestamp` ##### Search logs returns "OK" response ``` { "filter": { "query": "datadog-agent", "indexes": [ "main" ], "from": "2020-09-17T11:48:36+01:00", "to": "2020-09-17T12:48:36+01:00" }, "sort": "timestamp", "page": { "limit": 5 } } ``` Copy ##### Search logs returns "OK" response with pagination ``` { "filter": { "from": "now-15m", "indexes": [ "main" ], "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/logs/#ListLogs-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs/#ListLogs-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs/#ListLogs-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs/#ListLogs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Response object with all logs matching the request and pagination information. Field Type Description data [object] Array of logs matching the request. attributes object JSON object containing all log attributes and their associated values. attributes object JSON object of attributes from your log. host string Name of the machine from where the logs are being sent. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. status string Status of the message associated with your log. tags [string] Array of tags associated with your log. timestamp date-time Timestamp of your log. id string Unique ID of the Log. type enum Type of the event. Allowed enum values: `log` default: `log` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request elapsed int64 The time elapsed in milliseconds page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. 
request_id string The identifier of the request status enum The status of the response Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. code string A unique code for this type of warning detail string A detailed explanation of this specific warning title string A short human-readable summary of the warning ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "host": "i-0123", "message": "Host connected to remote", "service": "agent", "status": "INFO", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "log" } ], "links": { "next": "https://app.datadoghq.com/api/v2/logs/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Search logs returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "query": "datadog-agent", "indexes": [ "main" ], "from": "2020-09-17T11:48:36+01:00", "to": "2020-09-17T12:48:36+01:00" }, "sort": "timestamp", "page": { "limit": 5 } } EOF ``` ##### Search logs returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "indexes": [ "main" ], "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### Search logs returns "OK" response ``` // Search logs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsListRequest{ Filter: &datadogV2.LogsQueryFilter{ Query: datadog.PtrString("datadog-agent"), Indexes: []string{ "main", }, From: datadog.PtrString("2020-09-17T11:48:36+01:00"), To: datadog.PtrString("2020-09-17T12:48:36+01:00"), }, Sort: datadogV2.LOGSSORT_TIMESTAMP_ASCENDING.Ptr(), Page: &datadogV2.LogsListRequestPage{ Limit: datadog.PtrInt32(5), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.ListLogs(ctx, *datadogV2.NewListLogsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.ListLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.ListLogs`:\n%s\n", responseContent) } ``` Copy ##### Search logs returns "OK" response with pagination ``` // Search logs returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.LogsListRequest{ Filter: &datadogV2.LogsQueryFilter{ From: datadog.PtrString("now-15m"), Indexes: []string{ "main", }, To: datadog.PtrString("now"), }, Options: 
&datadogV2.LogsQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.LogsListRequestPage{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.LOGSSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, _ := api.ListLogsWithPagination(ctx, *datadogV2.NewListLogsOptionalParameters().WithBody(body)) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.ListLogs`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search logs returns "OK" response ``` // Search logs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.api.LogsApi.ListLogsOptionalParameters; import com.datadog.api.client.v2.model.LogsListRequest; import com.datadog.api.client.v2.model.LogsListRequestPage; import com.datadog.api.client.v2.model.LogsListResponse; import com.datadog.api.client.v2.model.LogsQueryFilter; import com.datadog.api.client.v2.model.LogsSort; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsListRequest body = new LogsListRequest() .filter( new LogsQueryFilter() .query("datadog-agent") .indexes(Collections.singletonList("main")) .from("2020-09-17T11:48:36+01:00") .to("2020-09-17T12:48:36+01:00")) .sort(LogsSort.TIMESTAMP_ASCENDING) .page(new LogsListRequestPage().limit(5)); try { LogsListResponse result = apiInstance.listLogs(new ListLogsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#listLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search logs returns "OK" response with pagination ``` // Search logs returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.api.LogsApi.ListLogsOptionalParameters; import com.datadog.api.client.v2.model.Log; import com.datadog.api.client.v2.model.LogsListRequest; import com.datadog.api.client.v2.model.LogsListRequestPage; import com.datadog.api.client.v2.model.LogsQueryFilter; import com.datadog.api.client.v2.model.LogsQueryOptions; import com.datadog.api.client.v2.model.LogsSort; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); LogsListRequest body = new LogsListRequest() .filter( new LogsQueryFilter() 
.from("now-15m") .indexes(Collections.singletonList("main")) .to("now")) .options(new LogsQueryOptions().timezone("GMT")) .page(new LogsListRequestPage().limit(2)) .sort(LogsSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.listLogsWithPagination(new ListLogsOptionalParameters().body(body)); for (Log item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println("Exception when calling LogsApi#listLogsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search logs returns "OK" response ``` """ Search logs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.logs_list_request import LogsListRequest from datadog_api_client.v2.model.logs_list_request_page import LogsListRequestPage from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter from datadog_api_client.v2.model.logs_sort import LogsSort body = LogsListRequest( filter=LogsQueryFilter( query="datadog-agent", indexes=[ "main", ], _from="2020-09-17T11:48:36+01:00", to="2020-09-17T12:48:36+01:00", ), sort=LogsSort.TIMESTAMP_ASCENDING, page=LogsListRequestPage( limit=5, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.list_logs(body=body) print(response) ``` Copy ##### Search logs returns "OK" response with pagination ``` """ Search logs returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi from datadog_api_client.v2.model.logs_list_request import LogsListRequest from datadog_api_client.v2.model.logs_list_request_page import LogsListRequestPage from datadog_api_client.v2.model.logs_query_filter import LogsQueryFilter from datadog_api_client.v2.model.logs_query_options import LogsQueryOptions from datadog_api_client.v2.model.logs_sort import LogsSort body = LogsListRequest( filter=LogsQueryFilter( _from="now-15m", indexes=[ "main", ], to="now", ), options=LogsQueryOptions( timezone="GMT", ), page=LogsListRequestPage( limit=2, ), sort=LogsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) items = api_instance.list_logs_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search logs returns "OK" response ``` # Search logs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = DatadogAPIClient::V2::LogsListRequest.new({ filter: DatadogAPIClient::V2::LogsQueryFilter.new({ query: "datadog-agent", indexes: [ "main", ], from: 
"2020-09-17T11:48:36+01:00", to: "2020-09-17T12:48:36+01:00", }), sort: DatadogAPIClient::V2::LogsSort::TIMESTAMP_ASCENDING, page: DatadogAPIClient::V2::LogsListRequestPage.new({ limit: 5, }), }) opts = { body: body, } p api_instance.list_logs(opts) ``` Copy ##### Search logs returns "OK" response with pagination ``` # Search logs returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new body = DatadogAPIClient::V2::LogsListRequest.new({ filter: DatadogAPIClient::V2::LogsQueryFilter.new({ from: "now-15m", indexes: [ "main", ], to: "now", }), options: DatadogAPIClient::V2::LogsQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::LogsListRequestPage.new({ limit: 2, }), sort: DatadogAPIClient::V2::LogsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } api_instance.list_logs_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search logs returns "OK" response ``` // Search logs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::ListLogsOptionalParams; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::model::LogsListRequest; use datadog_api_client::datadogV2::model::LogsListRequestPage; use datadog_api_client::datadogV2::model::LogsQueryFilter; use datadog_api_client::datadogV2::model::LogsSort; #[tokio::main] async fn main() { let body = LogsListRequest::new() .filter( LogsQueryFilter::new() .from("2020-09-17T11:48:36+01:00".to_string()) .indexes(vec!["main".to_string()]) .query("datadog-agent".to_string()) .to("2020-09-17T12:48:36+01:00".to_string()), ) .page(LogsListRequestPage::new().limit(5)) .sort(LogsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .list_logs(ListLogsOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search logs returns "OK" response with pagination ``` // Search logs returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::ListLogsOptionalParams; use datadog_api_client::datadogV2::api_logs::LogsAPI; use datadog_api_client::datadogV2::model::LogsListRequest; use datadog_api_client::datadogV2::model::LogsListRequestPage; use datadog_api_client::datadogV2::model::LogsQueryFilter; use datadog_api_client::datadogV2::model::LogsQueryOptions; use datadog_api_client::datadogV2::model::LogsSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = LogsListRequest::new() .filter( LogsQueryFilter::new() .from("now-15m".to_string()) .indexes(vec!["main".to_string()]) .to("now".to_string()), ) .options(LogsQueryOptions::new().timezone("GMT".to_string())) .page(LogsListRequestPage::new().limit(2)) .sort(LogsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let response = api.list_logs_with_pagination(ListLogsOptionalParams::default().body(body)); pin_mut!(response); while let 
Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search logs returns "OK" response ``` /** * Search logs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiListLogsRequest = { body: { filter: { query: "datadog-agent", indexes: ["main"], from: "2020-09-17T11:48:36+01:00", to: "2020-09-17T12:48:36+01:00", }, sort: "timestamp", page: { limit: 5, }, }, }; apiInstance .listLogs(params) .then((data: v2.LogsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search logs returns "OK" response with pagination ``` /** * Search logs returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); const params: v2.LogsApiListLogsRequest = { body: { filter: { from: "now-15m", indexes: ["main"], to: "now", }, options: { timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.listLogsWithPagination(params)) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search logs (GET)](https://docs.datadoghq.com/api/latest/logs/#search-logs-get) * [v2 (latest)](https://docs.datadoghq.com/api/latest/logs/#search-logs-get-v2) GET https://api.ap1.datadoghq.com/api/v2/logs/eventshttps://api.ap2.datadoghq.com/api/v2/logs/eventshttps://api.datadoghq.eu/api/v2/logs/eventshttps://api.ddog-gov.com/api/v2/logs/eventshttps://api.datadoghq.com/api/v2/logs/eventshttps://api.us3.datadoghq.com/api/v2/logs/eventshttps://api.us5.datadoghq.com/api/v2/logs/events ### Overview List endpoint returns logs that match a log search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to search and filter your logs. **If you are considering archiving logs for your organization, consider use of the Datadog archive capabilities instead of the log list API. See[Datadog Logs Archive documentation](https://docs.datadoghq.com/logs/archives).** This endpoint requires the `logs_read_data` permission. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following logs syntax. filter[indexes] array For customers with multiple indexes, the indexes to search. Defaults to ‘*’ which means all indexes filter[from] string Minimum timestamp for requested logs. filter[to] string Maximum timestamp for requested logs. 
filter[storage_tier] enum Specifies the storage type to be used Allowed enum values: `indexes, online-archives, flex` sort enum Order of logs in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of logs in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/logs/#ListLogsGet-200-v2) * [400](https://docs.datadoghq.com/api/latest/logs/#ListLogsGet-400-v2) * [403](https://docs.datadoghq.com/api/latest/logs/#ListLogsGet-403-v2) * [429](https://docs.datadoghq.com/api/latest/logs/#ListLogsGet-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) Response object with all logs matching the request and pagination information. Field Type Description data [object] Array of logs matching the request. attributes object JSON object containing all log attributes and their associated values. attributes object JSON object of attributes from your log. host string Name of the machine from where the logs are being sent. message string The message [reserved attribute](https://docs.datadoghq.com/logs/log_collection/#reserved-attributes) of your log. By default, Datadog ingests the value of the message attribute as the body of the log entry. That value is then highlighted and displayed in the Logstream, where it is indexed for full text search. service string The name of the application or service generating the log events. It is used to switch from Logs to APM, so make sure you define the same value when you use both products. status string Status of the message associated with your log. tags [string] Array of tags associated with your log. timestamp date-time Timestamp of your log. id string Unique ID of the Log. type enum Type of the event. Allowed enum values: `log` default: `log` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request elapsed int64 The time elapsed in milliseconds page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request status enum The status of the response Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. 
code string A unique code for this type of warning detail string A detailed explanation of this specific warning title string A short human-readable summary of the warning ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "host": "i-0123", "message": "Host connected to remote", "service": "agent", "status": "INFO", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "log" } ], "links": { "next": "https://app.datadoghq.com/api/v2/logs/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/logs/) * [Example](https://docs.datadoghq.com/api/latest/logs/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/logs/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/logs/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/logs/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/logs/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/logs/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/logs/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/logs/?code-lang=typescript) ##### Search logs (GET) Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/logs/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search logs (GET) ``` """ Search logs (GET) returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.logs_api import LogsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = LogsApi(api_client) response = api_instance.list_logs_get() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search logs (GET) ``` # Search logs (GET) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::LogsAPI.new p api_instance.list_logs_get() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search logs (GET) ``` // Search logs (GET) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewLogsApi(apiClient) resp, r, err := api.ListLogsGet(ctx, *datadogV2.NewListLogsGetOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `LogsApi.ListLogsGet`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `LogsApi.ListLogsGet`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search logs (GET) ``` // Search logs (GET) returns "OK" response import com.datadog.api.client.ApiClient; import 
com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.LogsApi; import com.datadog.api.client.v2.model.LogsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); LogsApi apiInstance = new LogsApi(defaultClient); try { LogsListResponse result = apiInstance.listLogsGet(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling LogsApi#listLogsGet"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search logs (GET) ``` // Search logs (GET) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_logs::ListLogsGetOptionalParams; use datadog_api_client::datadogV2::api_logs::LogsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = LogsAPI::with_config(configuration); let resp = api .list_logs_get(ListLogsGetOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search logs (GET) ``` /** * Search logs (GET) returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.LogsApi(configuration); apiInstance .listLogsGet() .then((data: v2.LogsListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=9e8f20d5-8eaf-44be-967b-49d5f65384fb&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=7b41c1c2-01a6-4439-a1b9-69524d0154e8&pt=Logs&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=9e8f20d5-8eaf-44be-967b-49d5f65384fb&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=7b41c1c2-01a6-4439-a1b9-69524d0154e8&pt=Logs&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=e5e78025-2f6e-49d3-9eeb-1baef02fa048&bo=2&sid=c4ba0d50f0bf11f09afde9c0412ba014&vid=c4ba67e0f0bf11f099fcdb9b6c7a6b3e&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Logs&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Flogs%2F&r=<=2392&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=745547) --- # Source: https://docs.datadoghq.com/api/latest/metrics # Metrics The metrics endpoint allows you to: * Post metrics data so it can be graphed on Datadog’s dashboards * Query metrics from any time period * Modify tag configurations for metrics * View tags and volumes for metrics **Note** : A graph can only contain a set number of points and as the timeframe over which a metric is viewed increases, aggregation between points occurs to stay below that set number. The Post, Patch, and Delete `manage_tags` API methods can only be performed by a user who has the `Manage Tags for Metrics` permission. See the [Metrics page](https://docs.datadoghq.com/metrics/) for more information. ## [Create a tag configuration](https://docs.datadoghq.com/api/latest/metrics/#create-a-tag-configuration) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#create-a-tag-configuration-v2) POST https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/tagshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/tags ### Overview Create and define a list of queryable tag keys for an existing count/gauge/rate/distribution metric. Optionally, include percentile aggregations on any distribution metric. By setting `exclude_tags_mode` to true, the behavior is changed from an allow-list to a deny-list, and tags in the defined list are not queryable. Can only be used with application keys of users with the `Manage Tags for Metrics` permission. This endpoint requires the `metric_tags_write` permission. 
### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object Object for a single metric to configure tags on. attributes object Object containing the definition of a metric tag configuration to be created. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include/exclude percentiles for a distribution metric. Defaults to false. Can only be applied to metrics that have a `metric_type` of `distribution`. metric_type [_required_] enum The metric's type. Allowed enum values: `gauge,count,rate,distribution` default: `gauge` tags [_required_] [string] A list of tag keys that will be queryable for your metric. default: id [_required_] string The metric name for this resource. type [_required_] enum The metric tag configuration resource type. Allowed enum values: `manage_tags` default: `manage_tags` ``` { "data": { "type": "manage_tags", "id": "ExampleMetric", "attributes": { "tags": [ "app", "datacenter" ], "metric_type": "gauge" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/metrics/#CreateTagConfiguration-201-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#CreateTagConfiguration-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#CreateTagConfiguration-403-v2) * [409](https://docs.datadoghq.com/api/latest/metrics/#CreateTagConfiguration-409-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#CreateTagConfiguration-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object which includes a single metric’s tag configuration. Field Type Description data object Object for a single metric tag configuration. attributes object Object containing the definition of a metric tag configuration's attributes. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` created_at date-time Timestamp when the tag configuration was created. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `metric_type` is `distribution`. metric_type enum The metric's type.
Allowed enum values: `gauge,count,rate,distribution` default: `gauge` modified_at date-time Timestamp when the tag configuration was last modified. tags [string] List of tag keys on which to group. id string The metric name for this resource. type enum The metric tag configuration resource type. Allowed enum values: `manage_tags` default: `manage_tags` ``` { "data": { "attributes": { "aggregations": [ { "space": "sum", "time": "sum" } ], "created_at": "2020-03-25T09:48:37.463835Z", "exclude_tags_mode": false, "include_percentiles": true, "metric_type": "count", "modified_at": "2020-03-25T09:48:37.463835Z", "tags": [ "app", "datacenter" ] }, "id": "test.metric.latency", "type": "manage_tags" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Create a tag configuration returns "Created" response Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/tags" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "manage_tags", "id": "ExampleMetric", "attributes": { "tags": [ "app", "datacenter" ], "metric_type": "gauge" } } } EOF ``` ##### Create a tag configuration returns "Created" response ``` // Create a tag configuration returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MetricTagConfigurationCreateRequest{ Data: datadogV2.MetricTagConfigurationCreateData{ Type: datadogV2.METRICTAGCONFIGURATIONTYPE_MANAGE_TAGS, Id: "ExampleMetric", 
Attributes: &datadogV2.MetricTagConfigurationCreateAttributes{ Tags: []string{ "app", "datacenter", }, MetricType: datadogV2.METRICTAGCONFIGURATIONMETRICTYPES_GAUGE, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.CreateTagConfiguration(ctx, "ExampleMetric", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.CreateTagConfiguration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.CreateTagConfiguration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a tag configuration returns "Created" response ``` // Create a tag configuration returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricTagConfigurationCreateAttributes; import com.datadog.api.client.v2.model.MetricTagConfigurationCreateData; import com.datadog.api.client.v2.model.MetricTagConfigurationCreateRequest; import com.datadog.api.client.v2.model.MetricTagConfigurationMetricTypes; import com.datadog.api.client.v2.model.MetricTagConfigurationResponse; import com.datadog.api.client.v2.model.MetricTagConfigurationType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricTagConfigurationCreateRequest body = new MetricTagConfigurationCreateRequest() .data( new MetricTagConfigurationCreateData() .type(MetricTagConfigurationType.MANAGE_TAGS) .id("ExampleMetric") .attributes( new MetricTagConfigurationCreateAttributes() .tags(Arrays.asList("app", "datacenter")) .metricType(MetricTagConfigurationMetricTypes.GAUGE))); try { MetricTagConfigurationResponse result = apiInstance.createTagConfiguration("ExampleMetric", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#createTagConfiguration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a tag configuration returns "Created" response ``` """ Create a tag configuration returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_tag_configuration_create_attributes import ( 
MetricTagConfigurationCreateAttributes, ) from datadog_api_client.v2.model.metric_tag_configuration_create_data import MetricTagConfigurationCreateData from datadog_api_client.v2.model.metric_tag_configuration_create_request import MetricTagConfigurationCreateRequest from datadog_api_client.v2.model.metric_tag_configuration_metric_types import MetricTagConfigurationMetricTypes from datadog_api_client.v2.model.metric_tag_configuration_type import MetricTagConfigurationType body = MetricTagConfigurationCreateRequest( data=MetricTagConfigurationCreateData( type=MetricTagConfigurationType.MANAGE_TAGS, id="ExampleMetric", attributes=MetricTagConfigurationCreateAttributes( tags=[ "app", "datacenter", ], metric_type=MetricTagConfigurationMetricTypes.GAUGE, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.create_tag_configuration(metric_name="ExampleMetric", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a tag configuration returns "Created" response ``` # Create a tag configuration returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::MetricTagConfigurationCreateRequest.new({ data: DatadogAPIClient::V2::MetricTagConfigurationCreateData.new({ type: DatadogAPIClient::V2::MetricTagConfigurationType::MANAGE_TAGS, id: "ExampleMetric", attributes: DatadogAPIClient::V2::MetricTagConfigurationCreateAttributes.new({ tags: [ "app", "datacenter", ], metric_type: DatadogAPIClient::V2::MetricTagConfigurationMetricTypes::GAUGE, }), }), }) p api_instance.create_tag_configuration("ExampleMetric", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a tag configuration returns "Created" response ``` // Create a tag configuration returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::MetricTagConfigurationCreateAttributes; use datadog_api_client::datadogV2::model::MetricTagConfigurationCreateData; use datadog_api_client::datadogV2::model::MetricTagConfigurationCreateRequest; use datadog_api_client::datadogV2::model::MetricTagConfigurationMetricTypes; use datadog_api_client::datadogV2::model::MetricTagConfigurationType; #[tokio::main] async fn main() { let body = MetricTagConfigurationCreateRequest::new( MetricTagConfigurationCreateData::new( "ExampleMetric".to_string(), MetricTagConfigurationType::MANAGE_TAGS, ) .attributes(MetricTagConfigurationCreateAttributes::new( MetricTagConfigurationMetricTypes::GAUGE, vec!["app".to_string(), "datacenter".to_string()], )), ); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .create_tag_configuration("ExampleMetric".to_string(), body) 
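        // The first argument is the metric_name path parameter; it matches the
        // id set on the request body above.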
.await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a tag configuration returns "Created" response ``` /** * Create a tag configuration returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiCreateTagConfigurationRequest = { body: { data: { type: "manage_tags", id: "ExampleMetric", attributes: { tags: ["app", "datacenter"], metricType: "gauge", }, }, }, metricName: "ExampleMetric", }; apiInstance .createTagConfiguration(params) .then((data: v2.MetricTagConfigurationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get active metrics list](https://docs.datadoghq.com/api/latest/metrics/#get-active-metrics-list) * [v1 (latest)](https://docs.datadoghq.com/api/latest/metrics/#get-active-metrics-list-v1) GET https://api.ap1.datadoghq.com/api/v1/metricshttps://api.ap2.datadoghq.com/api/v1/metricshttps://api.datadoghq.eu/api/v1/metricshttps://api.ddog-gov.com/api/v1/metricshttps://api.datadoghq.com/api/v1/metricshttps://api.us3.datadoghq.com/api/v1/metricshttps://api.us5.datadoghq.com/api/v1/metrics ### Overview Get the list of actively reporting metrics from a given time until now. This endpoint requires the `metrics_read` permission. OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Query Strings Name Type Description from [_required_] integer Seconds since the Unix epoch. host string Hostname for filtering the list of metrics returned. If set, metrics retrieved are those with the corresponding hostname tag. tag_filter string Filter metrics that have been submitted with the given tags. Supports boolean and wildcard expressions. Cannot be combined with other filters. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetrics-200-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetrics-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetrics-403-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetrics-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Object listing all metric names stored by Datadog since a given time. Expand All Field Type Description from string Time when the metrics were active, seconds since the Unix epoch. metrics [string] List of metric names. 
```
{ "from": "string", "metrics": [] }
```

Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python-legacy)

##### Get active metrics list

```
# Required query arguments
export from="CHANGE_ME"

# Curl command (use the API endpoint for your Datadog site, for example api.datadoghq.eu or api.us3.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v1/metrics?from=${from}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get active metrics list

```
"""
Get active metrics list returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.metrics_api import MetricsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MetricsApi(api_client)
    response = api_instance.list_active_metrics(
        _from=9223372036854775807,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following command (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get active metrics list

```
# Get active metrics list returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::MetricsAPI.new
p api_instance.list_active_metrics(9223372036854775807)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following command (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get active metrics list ``` // Get active metrics list returns "OK" response package main import ( "context" "encoding/json" "fmt" "os"
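// NewDefaultContext and NewConfiguration below pick up DD_API_KEY, DD_APP_KEY,
// and (if set) DD_SITE from the environment, as used in the run instructions.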
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.ListActiveMetrics(ctx, 9223372036854775807, *datadogV1.NewListActiveMetricsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListActiveMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListActiveMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get active metrics list ``` // Get active metrics list returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.MetricsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricsListResponse result = apiInstance.listActiveMetrics(9223372036854775807L); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listActiveMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get active metrics list ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } initialize(**options) # Taking the last 24hours from_time = int(time.time()) - 60 * 60 * 24 * 1 result = api.Metric.list(from_time) print(result) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get active metrics list ``` // Get active metrics list returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::ListActiveMetricsOptionalParams; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_active_metrics( 9223372036854775807, ListActiveMetricsOptionalParams::default(), ) .await; if let Ok(value) = resp { 
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get active metrics list ``` /** * Get active metrics list returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiListActiveMetricsRequest = { from: 9223372036854775807, }; apiInstance .listActiveMetrics(params) .then((data: v1.MetricsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Query timeseries data across multiple products](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-data-across-multiple-products) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-data-across-multiple-products-v2) POST https://api.ap1.datadoghq.com/api/v2/query/timeserieshttps://api.ap2.datadoghq.com/api/v2/query/timeserieshttps://api.datadoghq.eu/api/v2/query/timeserieshttps://api.ddog-gov.com/api/v2/query/timeserieshttps://api.datadoghq.com/api/v2/query/timeserieshttps://api.us3.datadoghq.com/api/v2/query/timeserieshttps://api.us5.datadoghq.com/api/v2/query/timeseries ### Overview Query timeseries data across various data sources and process the data by applying formulas and functions. This endpoint requires the `timeseries_query` permission. OAuth apps require the `timeseries_query` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object A single timeseries query to be executed. attributes [_required_] object The object describing a timeseries formula request. formulas [object] List of formulas to be calculated and returned as responses. formula [_required_] string Formula string, referencing one or more queries with their name property. limit object Message for specifying limits to the number of values returned by a query. This limit is only for scalar queries and has no effect on timeseries queries. count int32 The number of results to which to limit. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` from [_required_] int64 Start date (inclusive) of the query in milliseconds since the Unix epoch. interval int64 A time interval in milliseconds. May be overridden by a larger interval if the query would result in too many points for the specified timeframe. Defaults to a reasonable interval for the given timeframe. queries [_required_] [ ] List of queries to be run and used as inputs to the formulas. 
Option 1 object An individual timeseries metrics query. data_source [_required_] enum A data source that is powered by the Metrics platform. Allowed enum values: `metrics,cloud_cost` default: `metrics` name string The variable name for use in formulas. query [_required_] string A classic metrics query string. Option 2 object An individual timeseries events query. compute [_required_] object The instructions for what to compute for this query. aggregation [_required_] enum The type of aggregation that can be performed on events-based queries. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` default: `count` interval int64 Interval for compute in milliseconds. metric string The "measure" attribute on which to perform the computation. data_source [_required_] enum A data source that is powered by the Events Platform. Allowed enum values: `logs,rum,dora` default: `logs` group_by [object] The list of facets on which to split results. facet [_required_] string The facet by which to split groups. limit int32 The maximum buckets to return for this group by. Note: at most 10000 buckets are allowed. If grouping by multiple facets, the product of limits must not exceed 10000. default: `10` sort object The dimension by which to sort a query's results. aggregation [_required_] enum The type of aggregation that can be performed on events-based queries. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` default: `count` metric string The metric's calculated value which should be used to define the sort order of a query's results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` type enum The type of sort to use on the calculated value. Allowed enum values: `alphabetical,measure` indexes [string] The indexes in which to search. name string The variable name for use in formulas. search object Configuration of the search/filter for an events query. query string The search/filter string for an events query. to [_required_] int64 End date (exclusive) of the query in milliseconds since the Unix epoch. type [_required_] enum The type of the resource. The value should always be timeseries_request. Allowed enum values: `timeseries_request` default: `timeseries_request` ``` { "data": { "attributes": { "formulas": [ { "formula": "a", "limit": { "count": 10, "order": "desc" } } ], "from": 1636625471000, "interval": 5000, "queries": [ { "data_source": "metrics", "query": "avg:datadog.estimated_usage.metrics.custom{*}", "name": "a" } ], "to": 1636629071000 }, "type": "timeseries_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#QueryTimeseriesData-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#QueryTimeseriesData-400-v2) * [401](https://docs.datadoghq.com/api/latest/metrics/#QueryTimeseriesData-401-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#QueryTimeseriesData-403-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#QueryTimeseriesData-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) A message containing one response to a timeseries query made with timeseries formula query request. Field Type Description data object A message containing the response to a timeseries query. attributes object The object describing a timeseries response. series [object] Array of response series. The index here corresponds to the index in the `formulas` or `queries` array from the request. 
group_tags [string] List of tags that apply to a single response value. query_index int32 The index of the query in the "formulas" array (or "queries" array if no "formulas" was specified). unit [object] Detailed information about the unit. The first element describes the "primary unit" (for example, `bytes` in `bytes per second`). The second element describes the "per unit" (for example, `second` in `bytes per second`). If the second element is not present, the API returns null. family string Unit family, allows for conversion between units of the same family, for scaling. name string Unit name plural string Plural form of the unit name. scale_factor double Factor for scaling between units of the same family. short_name string Abbreviation of the unit. times [integer] Array of times, 1-1 match with individual values arrays. values [array] Array of value-arrays. The index here corresponds to the index in the `formulas` or `queries` array from the request. type enum The type of the resource. The value should always be timeseries_response. Allowed enum values: `timeseries_response` default: `timeseries_response` errors string The error generated by the request. ``` { "data": { "attributes": { "series": [ { "group_tags": [ "env:production" ], "query_index": 0, "unit": [ { "family": "time", "name": "minute", "plural": "minutes", "scale_factor": 60, "short_name": "min" } ] } ], "times": [], "values": [ 1575317847, 0.5 ] }, "type": "timeseries_response" }, "errors": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Timeseries cross product query returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/query/timeseries" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "formulas": [ { "formula": "a", "limit": { "count": 10, "order": "desc" } } ], "from": 1636625471000, "interval": 5000, "queries": [ { "data_source": "metrics", "query": "avg:datadog.estimated_usage.metrics.custom{*}", "name": "a" } ], "to": 1636629071000 }, "type": "timeseries_request" } } EOF ``` ##### Timeseries cross product query returns "OK" response ``` // Timeseries cross product query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.TimeseriesFormulaQueryRequest{ Data: datadogV2.TimeseriesFormulaRequest{ Attributes: datadogV2.TimeseriesFormulaRequestAttributes{ Formulas: []datadogV2.QueryFormula{ { Formula: "a", Limit: &datadogV2.FormulaLimit{ Count: datadog.PtrInt32(10), Order: datadogV2.QUERYSORTORDER_DESC.Ptr(), }, }, }, From: 1636625471000, Interval: datadog.PtrInt64(5000), Queries: []datadogV2.TimeseriesQuery{ datadogV2.TimeseriesQuery{ MetricsTimeseriesQuery: &datadogV2.MetricsTimeseriesQuery{ DataSource: datadogV2.METRICSDATASOURCE_METRICS, Query: "avg:datadog.estimated_usage.metrics.custom{*}", Name: datadog.PtrString("a"), }}, }, To: 1636629071000, }, Type: datadogV2.TIMESERIESFORMULAREQUESTTYPE_TIMESERIES_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.QueryTimeseriesData(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.QueryTimeseriesData`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.QueryTimeseriesData`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Timeseries cross product query returns "OK" response ``` // Timeseries cross product query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; 
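// The request below nests formulas and queries under data.attributes,
// mirroring the JSON body shown in the curl example above.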
import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.FormulaLimit; import com.datadog.api.client.v2.model.MetricsDataSource; import com.datadog.api.client.v2.model.MetricsTimeseriesQuery; import com.datadog.api.client.v2.model.QueryFormula; import com.datadog.api.client.v2.model.QuerySortOrder; import com.datadog.api.client.v2.model.TimeseriesFormulaQueryRequest; import com.datadog.api.client.v2.model.TimeseriesFormulaQueryResponse; import com.datadog.api.client.v2.model.TimeseriesFormulaRequest; import com.datadog.api.client.v2.model.TimeseriesFormulaRequestAttributes; import com.datadog.api.client.v2.model.TimeseriesFormulaRequestType; import com.datadog.api.client.v2.model.TimeseriesQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); TimeseriesFormulaQueryRequest body = new TimeseriesFormulaQueryRequest() .data( new TimeseriesFormulaRequest() .attributes( new TimeseriesFormulaRequestAttributes() .formulas( Collections.singletonList( new QueryFormula() .formula("a") .limit( new FormulaLimit() .count(10) .order(QuerySortOrder.DESC)))) .from(1636625471000L) .interval(5000L) .queries( Collections.singletonList( new TimeseriesQuery( new MetricsTimeseriesQuery() .dataSource(MetricsDataSource.METRICS) .query("avg:datadog.estimated_usage.metrics.custom{*}") .name("a")))) .to(1636629071000L)) .type(TimeseriesFormulaRequestType.TIMESERIES_REQUEST)); try { TimeseriesFormulaQueryResponse result = apiInstance.queryTimeseriesData(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#queryTimeseriesData"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Timeseries cross product query returns "OK" response ``` """ Timeseries cross product query returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.formula_limit import FormulaLimit from datadog_api_client.v2.model.metrics_data_source import MetricsDataSource from datadog_api_client.v2.model.metrics_timeseries_query import MetricsTimeseriesQuery from datadog_api_client.v2.model.query_formula import QueryFormula from datadog_api_client.v2.model.query_sort_order import QuerySortOrder from datadog_api_client.v2.model.timeseries_formula_query_request import TimeseriesFormulaQueryRequest from datadog_api_client.v2.model.timeseries_formula_request import TimeseriesFormulaRequest from datadog_api_client.v2.model.timeseries_formula_request_attributes import TimeseriesFormulaRequestAttributes from datadog_api_client.v2.model.timeseries_formula_request_queries import TimeseriesFormulaRequestQueries from datadog_api_client.v2.model.timeseries_formula_request_type import TimeseriesFormulaRequestType body = TimeseriesFormulaQueryRequest( data=TimeseriesFormulaRequest( 
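        # _from and to are millisecond Unix timestamps; interval is a duration in
        # milliseconds (see the request model above).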
attributes=TimeseriesFormulaRequestAttributes( formulas=[ QueryFormula( formula="a", limit=FormulaLimit( count=10, order=QuerySortOrder.DESC, ), ), ], _from=1636625471000, interval=5000, queries=TimeseriesFormulaRequestQueries( [ MetricsTimeseriesQuery( data_source=MetricsDataSource.METRICS, query="avg:datadog.estimated_usage.metrics.custom{*}", name="a", ), ] ), to=1636629071000, ), type=TimeseriesFormulaRequestType.TIMESERIES_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.query_timeseries_data(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Timeseries cross product query returns "OK" response ``` # Timeseries cross product query returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::TimeseriesFormulaQueryRequest.new({ data: DatadogAPIClient::V2::TimeseriesFormulaRequest.new({ attributes: DatadogAPIClient::V2::TimeseriesFormulaRequestAttributes.new({ formulas: [ DatadogAPIClient::V2::QueryFormula.new({ formula: "a", limit: DatadogAPIClient::V2::FormulaLimit.new({ count: 10, order: DatadogAPIClient::V2::QuerySortOrder::DESC, }), }), ], from: 1636625471000, interval: 5000, queries: [ DatadogAPIClient::V2::MetricsTimeseriesQuery.new({ data_source: DatadogAPIClient::V2::MetricsDataSource::METRICS, query: "avg:datadog.estimated_usage.metrics.custom{*}", name: "a", }), ], to: 1636629071000, }), type: DatadogAPIClient::V2::TimeseriesFormulaRequestType::TIMESERIES_REQUEST, }), }) p api_instance.query_timeseries_data(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Timeseries cross product query returns "OK" response ``` // Timeseries cross product query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::FormulaLimit; use datadog_api_client::datadogV2::model::MetricsDataSource; use datadog_api_client::datadogV2::model::MetricsTimeseriesQuery; use datadog_api_client::datadogV2::model::QueryFormula; use datadog_api_client::datadogV2::model::QuerySortOrder; use datadog_api_client::datadogV2::model::TimeseriesFormulaQueryRequest; use datadog_api_client::datadogV2::model::TimeseriesFormulaRequest; use datadog_api_client::datadogV2::model::TimeseriesFormulaRequestAttributes; use datadog_api_client::datadogV2::model::TimeseriesFormulaRequestType; use datadog_api_client::datadogV2::model::TimeseriesQuery; #[tokio::main] async fn main() { let body = TimeseriesFormulaQueryRequest::new(TimeseriesFormulaRequest::new( TimeseriesFormulaRequestAttributes::new( 1636625471000, vec![TimeseriesQuery::MetricsTimeseriesQuery(Box::new( MetricsTimeseriesQuery::new( MetricsDataSource::METRICS, "avg:datadog.estimated_usage.metrics.custom{*}".to_string(), ) 
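                    // `name` is the variable referenced by the formula "a" configured below.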
.name("a".to_string()), ))], 1636629071000, ) .formulas(vec![QueryFormula::new("a".to_string()) .limit(FormulaLimit::new().count(10).order(QuerySortOrder::DESC))]) .interval(5000), TimeseriesFormulaRequestType::TIMESERIES_REQUEST, )); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.query_timeseries_data(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Timeseries cross product query returns "OK" response ``` /** * Timeseries cross product query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiQueryTimeseriesDataRequest = { body: { data: { attributes: { formulas: [ { formula: "a", limit: { count: 10, order: "desc", }, }, ], from: 1636625471000, interval: 5000, queries: [ { dataSource: "metrics", query: "avg:datadog.estimated_usage.metrics.custom{*}", name: "a", }, ], to: 1636629071000, }, type: "timeseries_request", }, }, }; apiInstance .queryTimeseriesData(params) .then((data: v2.TimeseriesFormulaQueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Submit distribution points](https://docs.datadoghq.com/api/latest/metrics/#submit-distribution-points) * [v1 (latest)](https://docs.datadoghq.com/api/latest/metrics/#submit-distribution-points-v1) POST https://api.ap1.datadoghq.com/api/v1/distribution_pointshttps://api.ap2.datadoghq.com/api/v1/distribution_pointshttps://api.datadoghq.eu/api/v1/distribution_pointshttps://api.ddog-gov.com/api/v1/distribution_pointshttps://api.datadoghq.com/api/v1/distribution_pointshttps://api.us3.datadoghq.com/api/v1/distribution_pointshttps://api.us5.datadoghq.com/api/v1/distribution_points ### Overview The distribution points end-point allows you to post distribution data that can be graphed on Datadog’s dashboards. ### Arguments #### Header Parameters Name Type Description Content-Encoding string HTTP header used to compress the media-type. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description series [_required_] [object] A list of distribution points series to submit to Datadog. host string The name of the host that produced the distribution point metric. metric [_required_] string The name of the distribution points metric. points [_required_] [array] Points relating to the distribution point metric. All points must be tuples with timestamp and a list of values (cannot be a string). 
Timestamps should be in POSIX time in seconds. tags [string] A list of tags associated with the distribution point metric. type enum The type of the distribution point. Allowed enum values: `distribution` default: `distribution` ##### Submit deflate distribution points returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1.dist", "points": [ [ 1636629071, [ 1.0, 2.0 ] ] ] } ] } ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1.dist", "points": [ [ 1636629071, [ 1.0, 2.0 ] ] ] } ] } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-202-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-403-v1) * [408](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-408-v1) * [413](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-413-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#SubmitDistributionPoints-429-v1) Payload accepted * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) The payload accepted for intake. Expand All Field Type Description status string The status of the intake payload. ``` { "status": "ok" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Request timeout * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Payload too large * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Submit deflate distribution points returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. See one of the other client libraries for an example of sending deflate-compressed data. ``` ##### Submit distribution points returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. # Template variables export NOW="$(date +%s)" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/distribution_points" \ -H "Accept: application/json" \ -H "Content-Type: text/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "series": [ { "metric": "system.load.1.dist", "points": [[${NOW}, [1234.5]]] } ] } EOF ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` // Submit deflate distribution points returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.DistributionPointsPayload{ Series: []datadogV1.DistributionPointsSeries{ { Metric: "system.load.1.dist", Points: [][]datadogV1.DistributionPointItem{ { {DistributionPointTimestamp: datadog.PtrFloat64(float64(time.Now().Unix()))}, {DistributionPointData: &[]float64{ 1.0, 2.0, }}, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.SubmitDistributionPoints(ctx, body, *datadogV1.NewSubmitDistributionPointsOptionalParameters().WithContentEncoding(datadogV1.DISTRIBUTIONPOINTSCONTENTENCODING_DEFLATE)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitDistributionPoints`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitDistributionPoints`:\n%s\n", responseContent) } ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` // Submit distribution points returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.DistributionPointsPayload{ Series: []datadogV1.DistributionPointsSeries{ { Metric: "system.load.1.dist", Points: [][]datadogV1.DistributionPointItem{ { {DistributionPointTimestamp: datadog.PtrFloat64(float64(time.Now().Unix()))}, {DistributionPointData: &[]float64{ 1.0, 2.0, }}, }, }, }, }, } ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.SubmitDistributionPoints(ctx, body, *datadogV1.NewSubmitDistributionPointsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitDistributionPoints`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitDistributionPoints`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` // Submit deflate distribution points returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.api.MetricsApi.SubmitDistributionPointsOptionalParameters; import com.datadog.api.client.v1.model.DistributionPointItem; import com.datadog.api.client.v1.model.DistributionPointsContentEncoding; import com.datadog.api.client.v1.model.DistributionPointsPayload; import com.datadog.api.client.v1.model.DistributionPointsSeries; import com.datadog.api.client.v1.model.IntakePayloadAccepted; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); DistributionPointsPayload body = new DistributionPointsPayload() .series( Collections.singletonList( new DistributionPointsSeries() .metric("system.load.1.dist") .points( Collections.singletonList( Arrays.asList( new DistributionPointItem( Long.valueOf( OffsetDateTime.now().toInstant().getEpochSecond()) .doubleValue()), new DistributionPointItem(Arrays.asList(1.0, 2.0))))))); try { IntakePayloadAccepted result = apiInstance.submitDistributionPoints( body, new SubmitDistributionPointsOptionalParameters() .contentEncoding(DistributionPointsContentEncoding.DEFLATE)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitDistributionPoints"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` // Submit distribution points returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.DistributionPointItem; import com.datadog.api.client.v1.model.DistributionPointsPayload; import com.datadog.api.client.v1.model.DistributionPointsSeries; import com.datadog.api.client.v1.model.IntakePayloadAccepted; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { 
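    // getDefaultApiClient() reads DD_API_KEY (and DD_SITE) from the environment;
    // this intake endpoint does not require an application key.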
ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); DistributionPointsPayload body = new DistributionPointsPayload() .series( Collections.singletonList( new DistributionPointsSeries() .metric("system.load.1.dist") .points( Collections.singletonList( Arrays.asList( new DistributionPointItem( Long.valueOf( OffsetDateTime.now().toInstant().getEpochSecond()) .doubleValue()), new DistributionPointItem(Arrays.asList(1.0, 2.0))))))); try { IntakePayloadAccepted result = apiInstance.submitDistributionPoints(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitDistributionPoints"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` """ Submit deflate distribution points returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi from datadog_api_client.v1.model.distribution_point import DistributionPoint from datadog_api_client.v1.model.distribution_points_content_encoding import DistributionPointsContentEncoding from datadog_api_client.v1.model.distribution_points_payload import DistributionPointsPayload from datadog_api_client.v1.model.distribution_points_series import DistributionPointsSeries body = DistributionPointsPayload( series=[ DistributionPointsSeries( metric="system.load.1.dist", points=[ DistributionPoint( [ datetime.now().timestamp(), [1.0, 2.0], ] ), ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_distribution_points( content_encoding=DistributionPointsContentEncoding.DEFLATE, body=body ) print(response) ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` """ Submit distribution points returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi from datadog_api_client.v1.model.distribution_point import DistributionPoint from datadog_api_client.v1.model.distribution_points_payload import DistributionPointsPayload from datadog_api_client.v1.model.distribution_points_series import DistributionPointsSeries body = DistributionPointsPayload( series=[ DistributionPointsSeries( metric="system.load.1.dist", points=[ DistributionPoint( [ datetime.now().timestamp(), [1.0, 2.0], ] ), ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_distribution_points(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` # Submit deflate distribution points returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new body = DatadogAPIClient::V1::DistributionPointsPayload.new({ series: [ DatadogAPIClient::V1::DistributionPointsSeries.new({ metric: "system.load.1.dist", points: [ [ Time.now, [ 1.0, 2.0, ], ], ], }), ], }) opts = { content_encoding: DistributionPointsContentEncoding::DEFLATE, } p api_instance.submit_distribution_points(body, opts) ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` # Submit distribution points returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new body = DatadogAPIClient::V1::DistributionPointsPayload.new({ series: [ DatadogAPIClient::V1::DistributionPointsSeries.new({ metric: "system.load.1.dist", points: [ [ Time.now, [ 1.0, 2.0, ], ], ], }), ], }) p api_instance.submit_distribution_points(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` // Submit deflate distribution points returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; use datadog_api_client::datadogV1::api_metrics::SubmitDistributionPointsOptionalParams; use datadog_api_client::datadogV1::model::DistributionPointItem; use datadog_api_client::datadogV1::model::DistributionPointsContentEncoding; use datadog_api_client::datadogV1::model::DistributionPointsPayload; use datadog_api_client::datadogV1::model::DistributionPointsSeries; #[tokio::main] async fn main() { let body = DistributionPointsPayload::new(vec![DistributionPointsSeries::new( "system.load.1.dist".to_string(), vec![vec![ DistributionPointItem::DistributionPointTimestamp(1636629071.0 as f64), DistributionPointItem::DistributionPointData(vec![1.0, 2.0]), ]], )]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_distribution_points( body, SubmitDistributionPointsOptionalParams::default() .content_encoding(DistributionPointsContentEncoding::DEFLATE), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` // Submit distribution points returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; use datadog_api_client::datadogV1::api_metrics::SubmitDistributionPointsOptionalParams; use datadog_api_client::datadogV1::model::DistributionPointItem; use datadog_api_client::datadogV1::model::DistributionPointsPayload; use datadog_api_client::datadogV1::model::DistributionPointsSeries; #[tokio::main] async fn main() { let body = DistributionPointsPayload::new(vec![DistributionPointsSeries::new( "system.load.1.dist".to_string(), vec![vec![ 
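        // Each point is a [timestamp in seconds, list of sampled values] pair,
        // matching the points array in the request body model above.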
DistributionPointItem::DistributionPointTimestamp(1636629071.0 as f64), DistributionPointItem::DistributionPointData(vec![1.0, 2.0]), ]], )]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_distribution_points(body, SubmitDistributionPointsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Submit deflate distribution points returns "Payload accepted" response ``` /** * Submit deflate distribution points returns "Payload accepted" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiSubmitDistributionPointsRequest = { body: { series: [ { metric: "system.load.1.dist", points: [[Math.round(new Date().getTime() / 1000), [1.0, 2.0]]], }, ], }, contentEncoding: "deflate", }; apiInstance .submitDistributionPoints(params) .then((data: v1.IntakePayloadAccepted) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Submit distribution points returns "Payload accepted" response ``` /** * Submit distribution points returns "Payload accepted" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiSubmitDistributionPointsRequest = { body: { series: [ { metric: "system.load.1.dist", points: [[Math.round(new Date().getTime() / 1000), [1.0, 2.0]]], }, ], }, }; apiInstance .submitDistributionPoints(params) .then((data: v1.IntakePayloadAccepted) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Submit metrics](https://docs.datadoghq.com/api/latest/metrics/#submit-metrics) * [v1](https://docs.datadoghq.com/api/latest/metrics/#submit-metrics-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#submit-metrics-v2) POST https://api.ap1.datadoghq.com/api/v1/serieshttps://api.ap2.datadoghq.com/api/v1/serieshttps://api.datadoghq.eu/api/v1/serieshttps://api.ddog-gov.com/api/v1/serieshttps://api.datadoghq.com/api/v1/serieshttps://api.us3.datadoghq.com/api/v1/serieshttps://api.us5.datadoghq.com/api/v1/series ### Overview The metrics end-point allows you to post time-series data that can be graphed on Datadog’s dashboards. The maximum payload size is 3.2 megabytes (3200000 bytes). Compressed payloads must have a decompressed size of less than 62 megabytes (62914560 bytes). 
If you’re submitting metrics directly to the Datadog API without using DogStatsD, expect: * 64 bits for the timestamp * 64 bits for the value * 40 bytes for the metric names * 50 bytes for the timeseries * The full payload is approximately 100 bytes. However, with the DogStatsD API, compression is applied, which reduces the payload size. ### Arguments #### Header Parameters Name Type Description Content-Encoding string HTTP header used to compress the media-type. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description series [_required_] [object] A list of timeseries to submit to Datadog. host string The name of the host that produced the metric. interval int64 If the type of the metric is rate or count, define the corresponding interval in seconds. metric [_required_] string The name of the timeseries. points [_required_] [array] Points relating to a metric. All points must be tuples with timestamp and a scalar value (cannot be a string). Timestamps should be in POSIX time in seconds, and cannot be more than ten minutes in the future or more than one hour in the past. tags [string] A list of tags associated with the metric. type string The type of the metric. Valid types are "",`count`, `gauge`, and `rate`. ##### Submit deflate metrics returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1", "type": "gauge", "points": [ [ 1636629071, 1.1 ] ], "tags": [ "test:ExampleMetric" ] } ] } ``` Copy ##### Submit metrics returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1", "type": "gauge", "points": [ [ 1636629071, 1.1 ] ], "tags": [ "test:ExampleMetric" ] } ] } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-202-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-403-v1) * [408](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-408-v1) * [413](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-413-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-429-v1) Payload accepted * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) The payload accepted for intake. Expand All Field Type Description status string The status of the intake payload. ``` { "status": "ok" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Request timeout * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Payload too large * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby-legacy) ##### Submit deflate metrics returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. See one of the other client libraries for an example of sending deflate-compressed data. ``` ##### Submit metrics returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. # Template variables export NOW="$(date +%s)" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/series" \ -H "Accept: application/json" \ -H "Content-Type: text/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "series": [ { "metric": "system.load.1", "points": [[${NOW}, 1234.5]] } ] } EOF ``` ##### Submit deflate metrics returns "Payload accepted" response ``` // Submit deflate metrics returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.MetricsPayload{ Series: []datadogV1.Series{ { Metric: "system.load.1", Type: datadog.PtrString("gauge"), Points: [][]*float64{ { datadog.PtrFloat64(float64(time.Now().Unix())), datadog.PtrFloat64(1.1), }, }, Tags: []string{ "test:ExampleMetric", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.SubmitMetrics(ctx, body, *datadogV1.NewSubmitMetricsOptionalParameters().WithContentEncoding(datadogV1.METRICCONTENTENCODING_DEFLATE)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitMetrics`:\n%s\n", responseContent) } ``` Copy ##### Submit metrics returns 
"Payload accepted" response ``` // Submit metrics returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.MetricsPayload{ Series: []datadogV1.Series{ { Metric: "system.load.1", Type: datadog.PtrString("gauge"), Points: [][]*float64{ { datadog.PtrFloat64(float64(time.Now().Unix())), datadog.PtrFloat64(1.1), }, }, Tags: []string{ "test:ExampleMetric", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.SubmitMetrics(ctx, body, *datadogV1.NewSubmitMetricsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Submit deflate metrics returns "Payload accepted" response ``` // Submit deflate metrics returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.api.MetricsApi.SubmitMetricsOptionalParameters; import com.datadog.api.client.v1.model.IntakePayloadAccepted; import com.datadog.api.client.v1.model.MetricContentEncoding; import com.datadog.api.client.v1.model.MetricsPayload; import com.datadog.api.client.v1.model.Series; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricsPayload body = new MetricsPayload() .series( Collections.singletonList( new Series() .metric("system.load.1") .type("gauge") .points( Collections.singletonList( Arrays.asList( Long.valueOf(OffsetDateTime.now().toInstant().getEpochSecond()) .doubleValue(), 1.1))) .tags(Collections.singletonList("test:ExampleMetric")))); try { IntakePayloadAccepted result = apiInstance.submitMetrics( body, new SubmitMetricsOptionalParameters().contentEncoding(MetricContentEncoding.DEFLATE)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Submit metrics returns "Payload accepted" response ``` // Submit metrics returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.IntakePayloadAccepted; import com.datadog.api.client.v1.model.MetricsPayload; import 
com.datadog.api.client.v1.model.Series; import java.time.OffsetDateTime; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricsPayload body = new MetricsPayload() .series( Collections.singletonList( new Series() .metric("system.load.1") .type("gauge") .points( Collections.singletonList( Arrays.asList( Long.valueOf(OffsetDateTime.now().toInstant().getEpochSecond()) .doubleValue(), 1.1))) .tags(Collections.singletonList("test:ExampleMetric")))); try { IntakePayloadAccepted result = apiInstance.submitMetrics(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Submit metrics returns "Payload accepted" response ``` from datadog import initialize, api import time options = { 'api_key': '' ## EU customers need to define 'api_host' as below #'api_host': 'https://api.datadoghq.eu/' } initialize(**options) now = time.time() future_10s = now + 10 # Submit a single point with a timestamp of `now` api.Metric.send(metric='page.views', points=1000) # Submit a point with a timestamp (must be current) api.Metric.send(metric='my.pair', points=(now, 15)) # Submit multiple points. api.Metric.send( metric='my.series', points=[ (now, 15), (future_10s, 16) ] ) # Submit a point with a host and tags.
api.Metric.send( metric='my.series', points=100, host="myhost.example.com", tags=["version:1"] ) # Submit multiple metrics api.Metric.send([{ 'metric': 'my.series', 'points': 15 }, { 'metric': 'my1.series', 'points': 16 }]) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python "example.py" ``` ##### Submit deflate metrics returns "Payload accepted" response ``` """ Submit deflate metrics returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi from datadog_api_client.v1.model.metric_content_encoding import MetricContentEncoding from datadog_api_client.v1.model.metrics_payload import MetricsPayload from datadog_api_client.v1.model.point import Point from datadog_api_client.v1.model.series import Series body = MetricsPayload( series=[ Series( metric="system.load.1", type="gauge", points=[ Point( [ datetime.now().timestamp(), 1.1, ] ), ], tags=[ "test:ExampleMetric", ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_metrics(content_encoding=MetricContentEncoding.DEFLATE, body=body) print(response) ``` Copy ##### Submit metrics returns "Payload accepted" response ``` """ Submit metrics returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi from datadog_api_client.v1.model.metrics_payload import MetricsPayload from datadog_api_client.v1.model.point import Point from datadog_api_client.v1.model.series import Series body = MetricsPayload( series=[ Series( metric="system.load.1", type="gauge", points=[ Point( [ datetime.now().timestamp(), 1.1, ] ), ], tags=[ "test:ExampleMetric", ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_metrics(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Submit metrics returns "Payload accepted" response ``` require 'rubygems' require 'dogapi' api_key = '' dog = Dogapi::Client.new(api_key) # Submit one metric value. 
dog.emit_point('some.metric.name', 50.0, :host => "my_host.example.com") # Submit multiple metric values points = [ [Time.now, 0], [Time.now + 10, 10.0], [Time.now + 20, 20.0] ] dog.emit_points('some.metric.name', points, :tags => ["version:1"]) # Emit differents metrics in a single request to be more efficient dog.batch_metrics do dog.emit_point('test.api.test_metric',10) dog.emit_point('test.api.this_other_metric', 1) end ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Submit deflate metrics returns "Payload accepted" response ``` # Submit deflate metrics returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new body = DatadogAPIClient::V1::MetricsPayload.new({ series: [ DatadogAPIClient::V1::Series.new({ metric: "system.load.1", type: "gauge", points: [ [ Time.now.to_f, 1.1, ], ], tags: [ "test:ExampleMetric", ], }), ], }) opts = { content_encoding: MetricContentEncoding::DEFLATE, } p api_instance.submit_metrics(body, opts) ``` Copy ##### Submit metrics returns "Payload accepted" response ``` # Submit metrics returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new body = DatadogAPIClient::V1::MetricsPayload.new({ series: [ DatadogAPIClient::V1::Series.new({ metric: "system.load.1", type: "gauge", points: [ [ Time.now.to_f, 1.1, ], ], tags: [ "test:ExampleMetric", ], }), ], }) p api_instance.submit_metrics(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Submit deflate metrics returns "Payload accepted" response ``` // Submit deflate metrics returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; use datadog_api_client::datadogV1::api_metrics::SubmitMetricsOptionalParams; use datadog_api_client::datadogV1::model::MetricContentEncoding; use datadog_api_client::datadogV1::model::MetricsPayload; use datadog_api_client::datadogV1::model::Series; #[tokio::main] async fn main() { let body = MetricsPayload::new(vec![Series::new( "system.load.1".to_string(), vec![vec![Some(1636629071.0 as f64), Some(1.1 as f64)]], ) .tags(vec!["test:ExampleMetric".to_string()]) .type_("gauge".to_string())]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_metrics( body, SubmitMetricsOptionalParams::default().content_encoding(MetricContentEncoding::DEFLATE), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Submit metrics returns "Payload accepted" response ``` // Submit metrics returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; use datadog_api_client::datadogV1::api_metrics::SubmitMetricsOptionalParams; use datadog_api_client::datadogV1::model::MetricsPayload; use 
datadog_api_client::datadogV1::model::Series; #[tokio::main] async fn main() { let body = MetricsPayload::new(vec![Series::new( "system.load.1".to_string(), vec![vec![Some(1636629071.0 as f64), Some(1.1 as f64)]], ) .tags(vec!["test:ExampleMetric".to_string()]) .type_("gauge".to_string())]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_metrics(body, SubmitMetricsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Submit deflate metrics returns "Payload accepted" response ``` /** * Submit deflate metrics returns "Payload accepted" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiSubmitMetricsRequest = { body: { series: [ { metric: "system.load.1", type: "gauge", points: [[Math.round(new Date().getTime() / 1000), 1.1]], tags: ["test:ExampleMetric"], }, ], }, contentEncoding: "deflate", }; apiInstance .submitMetrics(params) .then((data: v1.IntakePayloadAccepted) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Submit metrics returns "Payload accepted" response ``` /** * Submit metrics returns "Payload accepted" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiSubmitMetricsRequest = { body: { series: [ { metric: "system.load.1", type: "gauge", points: [[Math.round(new Date().getTime() / 1000), 1.1]], tags: ["test:ExampleMetric"], }, ], }, }; apiInstance .submitMetrics(params) .then((data: v1.IntakePayloadAccepted) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/serieshttps://api.ap2.datadoghq.com/api/v2/serieshttps://api.datadoghq.eu/api/v2/serieshttps://api.ddog-gov.com/api/v2/serieshttps://api.datadoghq.com/api/v2/serieshttps://api.us3.datadoghq.com/api/v2/serieshttps://api.us5.datadoghq.com/api/v2/series ### Overview The metrics end-point allows you to post time-series data that can be graphed on Datadog’s dashboards. The maximum payload size is 500 kilobytes (512000 bytes). Compressed payloads must have a decompressed size of less than 5 megabytes (5242880 bytes). 
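Compared to v1, the v2 endpoint takes points as `{timestamp, value}` objects, encodes the metric type as an integer enum, and attaches the host through the `resources` array rather than a `host` field. The sketch below (an illustration, not an official client example) submits one gauge point over plain HTTP; it assumes the `requests` package is installed, a `DD_API_KEY` environment variable is set, and the `api.datadoghq.com` host.

```
# Sketch: minimal uncompressed v2 series submission over plain HTTP.
# Assumes `requests` is installed and DD_API_KEY is set in the environment.
import os
import time

import requests

payload = {
    "series": [
        {
            "metric": "system.load.1",
            "type": 3,  # 0 = unspecified, 1 = count, 2 = rate, 3 = gauge
            "points": [{"timestamp": int(time.time()), "value": 0.7}],
            # In v2 the host is expressed as a resource, not a separate field.
            "resources": [{"name": "dummyhost", "type": "host"}],
        }
    ]
}

resp = requests.post(
    "https://api.datadoghq.com/api/v2/series",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "Content-Type": "application/json",
    },
    json=payload,
)
print(resp.status_code, resp.json())  # expect 202 and {"errors": []} on success
```

Because the uncompressed limit here is 500 KB (and 5 MB decompressed when compressed), batch large series across multiple requests or compress the body with the Content-Encoding header as in the compression examples below.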
If you’re submitting metrics directly to the Datadog API without using DogStatsD, expect: * 64 bits for the timestamp * 64 bits for the value * 20 bytes for the metric names * 50 bytes for the timeseries * The full payload is approximately 100 bytes. Host name is one of the resources in the Resources field. ### Arguments #### Header Parameters Name Type Description Content-Encoding string HTTP header used to compress the media-type. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description series [_required_] [object] A list of timeseries to submit to Datadog. interval int64 If the type of the metric is rate or count, define the corresponding interval in seconds. metadata object Metadata for the metric. origin object Metric origin information. metric_type int32 The origin metric type code product int32 The origin product code service int32 The origin service code metric [_required_] string The name of the timeseries. points [_required_] [object] Points relating to a metric. All points must be objects with timestamp and a scalar value (cannot be a string). Timestamps should be in POSIX time in seconds, and cannot be more than ten minutes in the future or more than one hour in the past. timestamp int64 The timestamp should be in seconds and current. Current is defined as not more than 10 minutes in the future or more than 1 hour in the past. value double The numeric value format should be a 64bit float gauge-type value. resources [object] A list of resources to associate with this metric. name string The name of the resource. type string The type of the resource. source_type_name string The source type name. tags [string] A list of tags associated with the metric. type enum The type of metric. The available types are `0` (unspecified), `1` (count), `2` (rate), and `3` (gauge). Allowed enum values: `0,1,2,3` unit string The unit of point value. ##### Submit metrics returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1", "type": 0, "points": [ { "timestamp": 1636629071, "value": 0.7 } ], "resources": [ { "name": "dummyhost", "type": "host" } ] } ] } ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` { "series": [ { "metric": "system.load.1", "type": 0, "points": [ { "timestamp": 1636629071, "value": 0.7 } ] } ] } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-202-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-403-v2) * [408](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-408-v2) * [413](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-413-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#SubmitMetrics-429-v2) Payload accepted * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) The payload accepted for intake. Expand All Field Type Description errors [string] A list of errors. ``` { "errors": [] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Request timeout * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Payload too large * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Submit metrics returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. # Template variables export NOW="$(date +%s)" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/series" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "series": [ { "metric": "system.load.1", "type": 0, "points": [ { "timestamp": 1636629071, "value": 0.7 } ], "resources": [ { "name": "dummyhost", "type": "host" } ] } ] } EOF ``` ##### Submit metrics with compression returns "Payload accepted" response Copy ``` ## Dynamic Points # Post time-series data that can be graphed on Datadog’s dashboards. See one of the other client libraries for an example of sending deflate-compressed data. 
``` ##### Submit metrics returns "Payload accepted" response ``` // Submit metrics returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MetricPayload{ Series: []datadogV2.MetricSeries{ { Metric: "system.load.1", Type: datadogV2.METRICINTAKETYPE_UNSPECIFIED.Ptr(), Points: []datadogV2.MetricPoint{ { Timestamp: datadog.PtrInt64(time.Now().Unix()), Value: datadog.PtrFloat64(0.7), }, }, Resources: []datadogV2.MetricResource{ { Name: datadog.PtrString("dummyhost"), Type: datadog.PtrString("host"), }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.SubmitMetrics(ctx, body, *datadogV2.NewSubmitMetricsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitMetrics`:\n%s\n", responseContent) } ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` // Submit metrics with compression returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MetricPayload{ Series: []datadogV2.MetricSeries{ { Metric: "system.load.1", Type: datadogV2.METRICINTAKETYPE_UNSPECIFIED.Ptr(), Points: []datadogV2.MetricPoint{ { Timestamp: datadog.PtrInt64(time.Now().Unix()), Value: datadog.PtrFloat64(0.7), }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.SubmitMetrics(ctx, body, *datadogV2.NewSubmitMetricsOptionalParameters().WithContentEncoding(datadogV2.METRICCONTENTENCODING_ZSTD1)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.SubmitMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.SubmitMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Submit metrics returns "Payload accepted" response ``` // Submit metrics returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.IntakePayloadAccepted; import com.datadog.api.client.v2.model.MetricIntakeType; import com.datadog.api.client.v2.model.MetricPayload; import com.datadog.api.client.v2.model.MetricPoint; import com.datadog.api.client.v2.model.MetricResource; import com.datadog.api.client.v2.model.MetricSeries; import java.time.OffsetDateTime; import 
java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricPayload body = new MetricPayload() .series( Collections.singletonList( new MetricSeries() .metric("system.load.1") .type(MetricIntakeType.UNSPECIFIED) .points( Collections.singletonList( new MetricPoint() .timestamp(OffsetDateTime.now().toInstant().getEpochSecond()) .value(0.7))) .resources( Collections.singletonList( new MetricResource().name("dummyhost").type("host"))))); try { IntakePayloadAccepted result = apiInstance.submitMetrics(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` // Submit metrics with compression returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.api.MetricsApi.SubmitMetricsOptionalParameters; import com.datadog.api.client.v2.model.IntakePayloadAccepted; import com.datadog.api.client.v2.model.MetricContentEncoding; import com.datadog.api.client.v2.model.MetricIntakeType; import com.datadog.api.client.v2.model.MetricPayload; import com.datadog.api.client.v2.model.MetricPoint; import com.datadog.api.client.v2.model.MetricSeries; import java.time.OffsetDateTime; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricPayload body = new MetricPayload() .series( Collections.singletonList( new MetricSeries() .metric("system.load.1") .type(MetricIntakeType.UNSPECIFIED) .points( Collections.singletonList( new MetricPoint() .timestamp(OffsetDateTime.now().toInstant().getEpochSecond()) .value(0.7))))); try { IntakePayloadAccepted result = apiInstance.submitMetrics( body, new SubmitMetricsOptionalParameters().contentEncoding(MetricContentEncoding.ZSTD1)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#submitMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Submit metrics returns "Payload accepted" response ``` """ Submit metrics returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_intake_type import MetricIntakeType from datadog_api_client.v2.model.metric_payload import MetricPayload from datadog_api_client.v2.model.metric_point import MetricPoint from 
datadog_api_client.v2.model.metric_resource import MetricResource from datadog_api_client.v2.model.metric_series import MetricSeries body = MetricPayload( series=[ MetricSeries( metric="system.load.1", type=MetricIntakeType.UNSPECIFIED, points=[ MetricPoint( timestamp=int(datetime.now().timestamp()), value=0.7, ), ], resources=[ MetricResource( name="dummyhost", type="host", ), ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_metrics(body=body) print(response) ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` """ Submit metrics with compression returns "Payload accepted" response """ from datetime import datetime from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_content_encoding import MetricContentEncoding from datadog_api_client.v2.model.metric_intake_type import MetricIntakeType from datadog_api_client.v2.model.metric_payload import MetricPayload from datadog_api_client.v2.model.metric_point import MetricPoint from datadog_api_client.v2.model.metric_series import MetricSeries body = MetricPayload( series=[ MetricSeries( metric="system.load.1", type=MetricIntakeType.UNSPECIFIED, points=[ MetricPoint( timestamp=int(datetime.now().timestamp()), value=0.7, ), ], ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.submit_metrics(content_encoding=MetricContentEncoding.ZSTD1, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Submit metrics returns "Payload accepted" response ``` # Submit metrics returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::MetricPayload.new({ series: [ DatadogAPIClient::V2::MetricSeries.new({ metric: "system.load.1", type: DatadogAPIClient::V2::MetricIntakeType::UNSPECIFIED, points: [ DatadogAPIClient::V2::MetricPoint.new({ timestamp: Time.now.to_i, value: 0.7, }), ], resources: [ DatadogAPIClient::V2::MetricResource.new({ name: "dummyhost", type: "host", }), ], }), ], }) p api_instance.submit_metrics(body) ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` # Submit metrics with compression returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::MetricPayload.new({ series: [ DatadogAPIClient::V2::MetricSeries.new({ metric: "system.load.1", type: DatadogAPIClient::V2::MetricIntakeType::UNSPECIFIED, points: [ DatadogAPIClient::V2::MetricPoint.new({ timestamp: Time.now.to_i, value: 0.7, }), ], }), ], }) opts = { content_encoding: MetricContentEncoding::ZSTD1, } p api_instance.submit_metrics(body, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Submit metrics returns "Payload accepted" response ``` // Submit metrics returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::api_metrics::SubmitMetricsOptionalParams; use datadog_api_client::datadogV2::model::MetricIntakeType; use datadog_api_client::datadogV2::model::MetricPayload; use datadog_api_client::datadogV2::model::MetricPoint; use datadog_api_client::datadogV2::model::MetricResource; use datadog_api_client::datadogV2::model::MetricSeries; #[tokio::main] async fn main() { let body = MetricPayload::new(vec![MetricSeries::new( "system.load.1".to_string(), vec![MetricPoint::new().timestamp(1636629071).value(0.7 as f64)], ) .resources(vec![MetricResource::new() .name("dummyhost".to_string()) .type_("host".to_string())]) .type_(MetricIntakeType::UNSPECIFIED)]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_metrics(body, SubmitMetricsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` // Submit metrics with compression returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::api_metrics::SubmitMetricsOptionalParams; use datadog_api_client::datadogV2::model::MetricContentEncoding; use datadog_api_client::datadogV2::model::MetricIntakeType; use datadog_api_client::datadogV2::model::MetricPayload; use datadog_api_client::datadogV2::model::MetricPoint; use datadog_api_client::datadogV2::model::MetricSeries; #[tokio::main] async fn main() { let body = MetricPayload::new(vec![MetricSeries::new( "system.load.1".to_string(), vec![MetricPoint::new().timestamp(1636629071).value(0.7 as f64)], ) .type_(MetricIntakeType::UNSPECIFIED)]); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .submit_metrics( body, SubmitMetricsOptionalParams::default().content_encoding(MetricContentEncoding::ZSTD1), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Submit metrics returns "Payload accepted" response ``` /** * Submit metrics returns "Payload accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiSubmitMetricsRequest = { body: { series: [ { metric: "system.load.1", type: 0, points: [ { timestamp: Math.round(new Date().getTime() / 1000), value: 0.7, }, ], resources: [ { name: "dummyhost", type: "host", }, ], }, ], }, }; apiInstance .submitMetrics(params) .then((data: v2.IntakePayloadAccepted) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Submit metrics with compression returns "Payload accepted" response ``` /** * Submit metrics with compression returns "Payload accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiSubmitMetricsRequest = { body: { series: [ { metric: "system.load.1", type: 0, points: [ { timestamp: Math.round(new Date().getTime() / 1000), value: 0.7, }, ], }, ], }, contentEncoding: "zstd1", }; apiInstance .submitMetrics(params) .then((data: v2.IntakePayloadAccepted) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * ## [Get metric metadata](https://docs.datadoghq.com/api/latest/metrics/#get-metric-metadata) * [v1 (latest)](https://docs.datadoghq.com/api/latest/metrics/#get-metric-metadata-v1) GET https://api.ap1.datadoghq.com/api/v1/metrics/{metric_name}https://api.ap2.datadoghq.com/api/v1/metrics/{metric_name}https://api.datadoghq.eu/api/v1/metrics/{metric_name}https://api.ddog-gov.com/api/v1/metrics/{metric_name}https://api.datadoghq.com/api/v1/metrics/{metric_name}https://api.us3.datadoghq.com/api/v1/metrics/{metric_name}https://api.us5.datadoghq.com/api/v1/metrics/{metric_name} ### Overview Get metadata about a specific metric. This endpoint requires the `metrics_read` permission. OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string Name of the metric for which to get metadata. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#GetMetricMetadata-200-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#GetMetricMetadata-403-v1) * [404](https://docs.datadoghq.com/api/latest/metrics/#GetMetricMetadata-404-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#GetMetricMetadata-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Object with all metric related metadata. Expand All Field Type Description description string Metric description. integration string Name of the integration that sent the metric if applicable. per_unit string Per unit of the metric such as `second` in `bytes per second`. short_name string A more human-readable and abbreviated version of the metric name. statsd_interval int64 StatsD flush interval of the metric in seconds if applicable. type string Metric type such as `gauge` or `rate`. unit string Primary unit of the metric such as `byte` or `operation`. ``` { "description": "string", "integration": "string", "per_unit": "second", "short_name": "string", "statsd_interval": "integer", "type": "count", "unit": "byte" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python-legacy) ##### Get metric metadata Copy ``` # Path parameters export metric_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/metrics/${metric_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get metric metadata ``` """ Get metric metadata returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.get_metric_metadata( metric_name="metric_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get metric metadata ``` # Get metric metadata returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new p api_instance.get_metric_metadata("metric_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get metric metadata ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Get metadata on metric result = dog.get_metadata('system.net.bytes_sent') ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get metric metadata ``` // Get metric metadata returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.GetMetricMetadata(ctx, "metric_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.GetMetricMetadata`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.GetMetricMetadata`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get metric metadata ``` // Get metric metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.MetricMetadata; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricMetadata result = apiInstance.getMetricMetadata("metric_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#getMetricMetadata"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get metric metadata ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) metric = 'system.cpu.idle' api.Metadata.get(metric_name=metric) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get metric metadata ``` // Get metric metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = 
datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.get_metric_metadata("metric_name".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get metric metadata ``` /** * Get metric metadata returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiGetMetricMetadataRequest = { metricName: "metric_name", }; apiInstance .getMetricMetadata(params) .then((data: v1.MetricMetadata) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List tag configuration by name](https://docs.datadoghq.com/api/latest/metrics/#list-tag-configuration-by-name) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#list-tag-configuration-by-name-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/tagshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/tags ### Overview Returns the tag configuration for the given metric name. This endpoint requires the `metrics_read` permission. OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurationByName-200-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurationByName-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurationByName-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurationByName-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object which includes a single metric’s tag configuration. Field Type Description data object Object for a single metric tag configuration. attributes object Object containing the definition of a metric tag configuration attributes. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. 
Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` created_at date-time Timestamp when the tag configuration was created. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `metric_type` is `distribution`. metric_type enum The metric's type. Allowed enum values: `gauge,count,rate,distribution` default: `gauge` modified_at date-time Timestamp when the tag configuration was last modified. tags [string] List of tag keys on which to group. id string The metric name for this resource. type enum The metric tag configuration resource type. Allowed enum values: `manage_tags` default: `manage_tags` ``` { "data": { "attributes": { "aggregations": [ { "space": "sum", "time": "sum" } ], "created_at": "2020-03-25T09:48:37.463835Z", "exclude_tags_mode": false, "include_percentiles": true, "metric_type": "count", "modified_at": "2020-03-25T09:48:37.463835Z", "tags": [ "app", "datacenter" ] }, "id": "test.metric.latency", "type": "manage_tags" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### List tag configuration by name Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/tags" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List tag configuration by name ``` """ List tag configuration by name returns "Success" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = environ["METRIC_TAG_CONFIGURATION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_tag_configuration_by_name( metric_name=METRIC_TAG_CONFIGURATION_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List tag configuration by name ``` # List tag configuration by name returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = ENV["METRIC_TAG_CONFIGURATION_DATA_ID"] p api_instance.list_tag_configuration_by_name(METRIC_TAG_CONFIGURATION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List tag configuration by name ``` // List tag configuration by name returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "metric_tag_configuration" in the system MetricTagConfigurationDataID := os.Getenv("METRIC_TAG_CONFIGURATION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListTagConfigurationByName(ctx, MetricTagConfigurationDataID) if err != nil { 
fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListTagConfigurationByName`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListTagConfigurationByName`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List tag configuration by name ``` // List tag configuration by name returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricTagConfigurationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); // there is a valid "metric_tag_configuration" in the system String METRIC_TAG_CONFIGURATION_DATA_ID = System.getenv("METRIC_TAG_CONFIGURATION_DATA_ID"); try { MetricTagConfigurationResponse result = apiInstance.listTagConfigurationByName(METRIC_TAG_CONFIGURATION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listTagConfigurationByName"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List tag configuration by name ``` // List tag configuration by name returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { // there is a valid "metric_tag_configuration" in the system let metric_tag_configuration_data_id = std::env::var("METRIC_TAG_CONFIGURATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_tag_configuration_by_name(metric_tag_configuration_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List tag configuration by name ``` /** * List tag configuration by name returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); // there is a valid "metric_tag_configuration" in the system const 
METRIC_TAG_CONFIGURATION_DATA_ID = process.env .METRIC_TAG_CONFIGURATION_DATA_ID as string; const params: v2.MetricsApiListTagConfigurationByNameRequest = { metricName: METRIC_TAG_CONFIGURATION_DATA_ID, }; apiInstance .listTagConfigurationByName(params) .then((data: v2.MetricTagConfigurationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Query scalar data across multiple products](https://docs.datadoghq.com/api/latest/metrics/#query-scalar-data-across-multiple-products) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#query-scalar-data-across-multiple-products-v2) POST https://api.ap1.datadoghq.com/api/v2/query/scalarhttps://api.ap2.datadoghq.com/api/v2/query/scalarhttps://api.datadoghq.eu/api/v2/query/scalarhttps://api.ddog-gov.com/api/v2/query/scalarhttps://api.datadoghq.com/api/v2/query/scalarhttps://api.us3.datadoghq.com/api/v2/query/scalarhttps://api.us5.datadoghq.com/api/v2/query/scalar ### Overview Query scalar values (as seen on Query Value, Table, and Toplist widgets). Multiple data sources are supported with the ability to process the data using formulas and functions. This endpoint requires the `timeseries_query` permission. OAuth apps require the `timeseries_query` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object A single scalar query to be executed. attributes [_required_] object The object describing a scalar formula request. formulas [object] List of formulas to be calculated and returned as responses. formula [_required_] string Formula string, referencing one or more queries with their name property. limit object Message for specifying limits to the number of values returned by a query. This limit is only for scalar queries and has no effect on timeseries queries. count int32 The number of results to which to limit. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` from [_required_] int64 Start date (inclusive) of the query in milliseconds since the Unix epoch. queries [_required_] [ ] List of queries to be run and used as inputs to the formulas. Option 1 object An individual scalar metrics query. aggregator [_required_] enum The type of aggregation that can be performed on metrics-based queries. Allowed enum values: `avg,min,max,sum,last,percentile,mean,l2norm,area` default: `avg` data_source [_required_] enum A data source that is powered by the Metrics platform. Allowed enum values: `metrics,cloud_cost` default: `metrics` name string The variable name for use in formulas. query [_required_] string A classic metrics query string. Option 2 object An individual scalar events query. compute [_required_] object The instructions for what to compute for this query. aggregation [_required_] enum The type of aggregation that can be performed on events-based queries. 
Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` default: `count` interval int64 Interval for compute in milliseconds. metric string The "measure" attribute on which to perform the computation. data_source [_required_] enum A data source that is powered by the Events Platform. Allowed enum values: `logs,rum,dora` default: `logs` group_by [object] The list of facets on which to split results. facet [_required_] string The facet by which to split groups. limit int32 The maximum buckets to return for this group by. Note: at most 10000 buckets are allowed. If grouping by multiple facets, the product of limits must not exceed 10000. default: `10` sort object The dimension by which to sort a query's results. aggregation [_required_] enum The type of aggregation that can be performed on events-based queries. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` default: `count` metric string The metric's calculated value which should be used to define the sort order of a query's results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` type enum The type of sort to use on the calculated value. Allowed enum values: `alphabetical,measure` indexes [string] The indexes in which to search. name string The variable name for use in formulas. search object Configuration of the search/filter for an events query. query string The search/filter string for an events query. to [_required_] int64 End date (exclusive) of the query in milliseconds since the Unix epoch. type [_required_] enum The type of the resource. The value should always be scalar_request. Allowed enum values: `scalar_request` default: `scalar_request` ``` { "data": { "attributes": { "formulas": [ { "formula": "a", "limit": { "count": 10, "order": "desc" } } ], "from": 1636625471000, "queries": [ { "aggregator": "avg", "data_source": "metrics", "query": "avg:system.cpu.user{*}", "name": "a" } ], "to": 1636629071000 }, "type": "scalar_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#QueryScalarData-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#QueryScalarData-400-v2) * [401](https://docs.datadoghq.com/api/latest/metrics/#QueryScalarData-401-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#QueryScalarData-403-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#QueryScalarData-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) A message containing one or more responses to scalar queries. Field Type Description data object A message containing the response to a scalar query. attributes object The object describing a scalar response. columns [ ] List of response columns, each corresponding to an individual formula or query in the request and with values in parallel arrays matching the series list. Option 1 object A column containing the tag keys and values in a group. name string The name of the tag key or group. type enum The type of column present for groups. Allowed enum values: `group` default: `group` values [array] The array of tag values for each group found for the results of the formulas or queries. Option 2 object A column containing the numerical results for a formula or query. meta object Metadata for the resulting numerical values. unit [object] Detailed information about the unit. First element describes the "primary unit" (for example, `bytes` in `bytes per second`). 
The second element describes the "per unit" (for example, `second` in `bytes per second`). If the second element is not present, the API returns null. family string Unit family, allows for conversion between units of the same family, for scaling. name string Unit name plural string Plural form of the unit name. scale_factor double Factor for scaling between units of the same family. short_name string Abbreviation of the unit. name string The name referencing the formula or query for this column. type enum The type of column present for numbers. Allowed enum values: `number` default: `number` values [number] The array of numerical values for one formula or query. type enum The type of the resource. The value should always be scalar_response. Allowed enum values: `scalar_response` default: `scalar_response` errors string An error generated when processing a request. ``` { "data": { "attributes": { "columns": [ { "name": "env", "type": "group", "values": [ [ "staging" ] ] } ] }, "type": "scalar_response" }, "errors": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
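The request samples further down all run a single query named `a` with the formula `"a"`. To make the relationship between `formulas` and `queries` concrete, here is a hedged Python sketch (same client and model classes as the bundled Python example) that names two metric queries `a` and `b` and asks for their ratio; the second metric query string is illustrative, not taken from this page.

```
"""
Sketch: combine two named scalar queries with a formula ("a / b").
Metric names are illustrative; timestamps are milliseconds since the epoch.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.metrics_api import MetricsApi
from datadog_api_client.v2.model.metrics_aggregator import MetricsAggregator
from datadog_api_client.v2.model.metrics_data_source import MetricsDataSource
from datadog_api_client.v2.model.metrics_scalar_query import MetricsScalarQuery
from datadog_api_client.v2.model.query_formula import QueryFormula
from datadog_api_client.v2.model.scalar_formula_query_request import ScalarFormulaQueryRequest
from datadog_api_client.v2.model.scalar_formula_request import ScalarFormulaRequest
from datadog_api_client.v2.model.scalar_formula_request_attributes import ScalarFormulaRequestAttributes
from datadog_api_client.v2.model.scalar_formula_request_queries import ScalarFormulaRequestQueries
from datadog_api_client.v2.model.scalar_formula_request_type import ScalarFormulaRequestType

body = ScalarFormulaQueryRequest(
    data=ScalarFormulaRequest(
        attributes=ScalarFormulaRequestAttributes(
            # Each formula refers to queries by their `name` property.
            formulas=[QueryFormula(formula="a / b")],
            queries=ScalarFormulaRequestQueries(
                [
                    MetricsScalarQuery(
                        aggregator=MetricsAggregator.AVG,
                        data_source=MetricsDataSource.METRICS,
                        query="avg:system.cpu.user{*}",
                        name="a",
                    ),
                    MetricsScalarQuery(
                        aggregator=MetricsAggregator.AVG,
                        data_source=MetricsDataSource.METRICS,
                        query="avg:system.cpu.system{*}",  # illustrative second query
                        name="b",
                    ),
                ]
            ),
            _from=1636625471000,
            to=1636629071000,
        ),
        type=ScalarFormulaRequestType.SCALAR_REQUEST,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    print(MetricsApi(api_client).query_scalar_data(body=body))
```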
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Scalar cross product query returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/query/scalar" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "formulas": [ { "formula": "a", "limit": { "count": 10, "order": "desc" } } ], "from": 1636625471000, "queries": [ { "aggregator": "avg", "data_source": "metrics", "query": "avg:system.cpu.user{*}", "name": "a" } ], "to": 1636629071000 }, "type": "scalar_request" } } EOF ``` ##### Scalar cross product query returns "OK" response ``` // Scalar cross product query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ScalarFormulaQueryRequest{ Data: datadogV2.ScalarFormulaRequest{ Attributes: datadogV2.ScalarFormulaRequestAttributes{ Formulas: []datadogV2.QueryFormula{ { Formula: "a", Limit: &datadogV2.FormulaLimit{ Count: datadog.PtrInt32(10), Order: datadogV2.QUERYSORTORDER_DESC.Ptr(), }, }, }, From: 1636625471000, Queries: []datadogV2.ScalarQuery{ datadogV2.ScalarQuery{ MetricsScalarQuery: &datadogV2.MetricsScalarQuery{ Aggregator: datadogV2.METRICSAGGREGATOR_AVG, DataSource: datadogV2.METRICSDATASOURCE_METRICS, Query: "avg:system.cpu.user{*}", Name: datadog.PtrString("a"), }}, }, To: 1636629071000, }, Type: datadogV2.SCALARFORMULAREQUESTTYPE_SCALAR_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.QueryScalarData(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.QueryScalarData`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.QueryScalarData`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Scalar cross product query returns "OK" response ``` // Scalar cross product query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.FormulaLimit; import 
com.datadog.api.client.v2.model.MetricsAggregator; import com.datadog.api.client.v2.model.MetricsDataSource; import com.datadog.api.client.v2.model.MetricsScalarQuery; import com.datadog.api.client.v2.model.QueryFormula; import com.datadog.api.client.v2.model.QuerySortOrder; import com.datadog.api.client.v2.model.ScalarFormulaQueryRequest; import com.datadog.api.client.v2.model.ScalarFormulaQueryResponse; import com.datadog.api.client.v2.model.ScalarFormulaRequest; import com.datadog.api.client.v2.model.ScalarFormulaRequestAttributes; import com.datadog.api.client.v2.model.ScalarFormulaRequestType; import com.datadog.api.client.v2.model.ScalarQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); ScalarFormulaQueryRequest body = new ScalarFormulaQueryRequest() .data( new ScalarFormulaRequest() .attributes( new ScalarFormulaRequestAttributes() .formulas( Collections.singletonList( new QueryFormula() .formula("a") .limit( new FormulaLimit() .count(10) .order(QuerySortOrder.DESC)))) .from(1636625471000L) .queries( Collections.singletonList( new ScalarQuery( new MetricsScalarQuery() .aggregator(MetricsAggregator.AVG) .dataSource(MetricsDataSource.METRICS) .query("avg:system.cpu.user{*}") .name("a")))) .to(1636629071000L)) .type(ScalarFormulaRequestType.SCALAR_REQUEST)); try { ScalarFormulaQueryResponse result = apiInstance.queryScalarData(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#queryScalarData"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Scalar cross product query returns "OK" response ``` """ Scalar cross product query returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.formula_limit import FormulaLimit from datadog_api_client.v2.model.metrics_aggregator import MetricsAggregator from datadog_api_client.v2.model.metrics_data_source import MetricsDataSource from datadog_api_client.v2.model.metrics_scalar_query import MetricsScalarQuery from datadog_api_client.v2.model.query_formula import QueryFormula from datadog_api_client.v2.model.query_sort_order import QuerySortOrder from datadog_api_client.v2.model.scalar_formula_query_request import ScalarFormulaQueryRequest from datadog_api_client.v2.model.scalar_formula_request import ScalarFormulaRequest from datadog_api_client.v2.model.scalar_formula_request_attributes import ScalarFormulaRequestAttributes from datadog_api_client.v2.model.scalar_formula_request_queries import ScalarFormulaRequestQueries from datadog_api_client.v2.model.scalar_formula_request_type import ScalarFormulaRequestType body = ScalarFormulaQueryRequest( data=ScalarFormulaRequest( attributes=ScalarFormulaRequestAttributes( formulas=[ QueryFormula( formula="a", limit=FormulaLimit( count=10, 
order=QuerySortOrder.DESC, ), ), ], _from=1636625471000, queries=ScalarFormulaRequestQueries( [ MetricsScalarQuery( aggregator=MetricsAggregator.AVG, data_source=MetricsDataSource.METRICS, query="avg:system.cpu.user{*}", name="a", ), ] ), to=1636629071000, ), type=ScalarFormulaRequestType.SCALAR_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.query_scalar_data(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Scalar cross product query returns "OK" response ``` # Scalar cross product query returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::ScalarFormulaQueryRequest.new({ data: DatadogAPIClient::V2::ScalarFormulaRequest.new({ attributes: DatadogAPIClient::V2::ScalarFormulaRequestAttributes.new({ formulas: [ DatadogAPIClient::V2::QueryFormula.new({ formula: "a", limit: DatadogAPIClient::V2::FormulaLimit.new({ count: 10, order: DatadogAPIClient::V2::QuerySortOrder::DESC, }), }), ], from: 1636625471000, queries: [ DatadogAPIClient::V2::MetricsScalarQuery.new({ aggregator: DatadogAPIClient::V2::MetricsAggregator::AVG, data_source: DatadogAPIClient::V2::MetricsDataSource::METRICS, query: "avg:system.cpu.user{*}", name: "a", }), ], to: 1636629071000, }), type: DatadogAPIClient::V2::ScalarFormulaRequestType::SCALAR_REQUEST, }), }) p api_instance.query_scalar_data(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Scalar cross product query returns "OK" response ``` // Scalar cross product query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::FormulaLimit; use datadog_api_client::datadogV2::model::MetricsAggregator; use datadog_api_client::datadogV2::model::MetricsDataSource; use datadog_api_client::datadogV2::model::MetricsScalarQuery; use datadog_api_client::datadogV2::model::QueryFormula; use datadog_api_client::datadogV2::model::QuerySortOrder; use datadog_api_client::datadogV2::model::ScalarFormulaQueryRequest; use datadog_api_client::datadogV2::model::ScalarFormulaRequest; use datadog_api_client::datadogV2::model::ScalarFormulaRequestAttributes; use datadog_api_client::datadogV2::model::ScalarFormulaRequestType; use datadog_api_client::datadogV2::model::ScalarQuery; #[tokio::main] async fn main() { let body = ScalarFormulaQueryRequest::new(ScalarFormulaRequest::new( ScalarFormulaRequestAttributes::new( 1636625471000, vec![ScalarQuery::MetricsScalarQuery(Box::new( MetricsScalarQuery::new( MetricsAggregator::AVG, MetricsDataSource::METRICS, "avg:system.cpu.user{*}".to_string(), ) .name("a".to_string()), ))], 1636629071000, ) .formulas(vec![QueryFormula::new("a".to_string()) 
.limit(FormulaLimit::new().count(10).order(QuerySortOrder::DESC))]), ScalarFormulaRequestType::SCALAR_REQUEST, )); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.query_scalar_data(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Scalar cross product query returns "OK" response ``` /** * Scalar cross product query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiQueryScalarDataRequest = { body: { data: { attributes: { formulas: [ { formula: "a", limit: { count: 10, order: "desc", }, }, ], from: 1636625471000, queries: [ { aggregator: "avg", dataSource: "metrics", query: "avg:system.cpu.user{*}", name: "a", }, ], to: 1636629071000, }, type: "scalar_request", }, }, }; apiInstance .queryScalarData(params) .then((data: v2.ScalarFormulaQueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit metric metadata](https://docs.datadoghq.com/api/latest/metrics/#edit-metric-metadata) * [v1 (latest)](https://docs.datadoghq.com/api/latest/metrics/#edit-metric-metadata-v1) PUT https://api.ap1.datadoghq.com/api/v1/metrics/{metric_name}https://api.ap2.datadoghq.com/api/v1/metrics/{metric_name}https://api.datadoghq.eu/api/v1/metrics/{metric_name}https://api.ddog-gov.com/api/v1/metrics/{metric_name}https://api.datadoghq.com/api/v1/metrics/{metric_name}https://api.us3.datadoghq.com/api/v1/metrics/{metric_name}https://api.us5.datadoghq.com/api/v1/metrics/{metric_name} ### Overview Edit metadata of a specific metric. Find out more about [supported types](https://docs.datadoghq.com/developers/metrics). This endpoint requires the `metrics_metadata_write` permission. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string Name of the metric for which to edit metadata. ### Request #### Body Data (required) New metadata. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Expand All Field Type Description description string Metric description. integration string Name of the integration that sent the metric if applicable. per_unit string Per unit of the metric such as `second` in `bytes per second`. short_name string A more human-readable and abbreviated version of the metric name. statsd_interval int64 StatsD flush interval of the metric in seconds if applicable. type string Metric type such as `gauge` or `rate`. unit string Primary unit of the metric such as `byte` or `operation`. 
``` { "description": "string", "per_unit": "second", "short_name": "string", "statsd_interval": "integer", "type": "count", "unit": "byte" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#UpdateMetricMetadata-200-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#UpdateMetricMetadata-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#UpdateMetricMetadata-403-v1) * [404](https://docs.datadoghq.com/api/latest/metrics/#UpdateMetricMetadata-404-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#UpdateMetricMetadata-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Object with all metric related metadata. Expand All Field Type Description description string Metric description. integration string Name of the integration that sent the metric if applicable. per_unit string Per unit of the metric such as `second` in `bytes per second`. short_name string A more human-readable and abbreviated version of the metric name. statsd_interval int64 StatsD flush interval of the metric in seconds if applicable. type string Metric type such as `gauge` or `rate`. unit string Primary unit of the metric such as `byte` or `operation`. ``` { "description": "string", "integration": "string", "per_unit": "second", "short_name": "string", "statsd_interval": "integer", "type": "count", "unit": "byte" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Edit metric metadata Copy ``` # Path parameters export metric_name="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/metrics/${metric_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Edit metric metadata ``` """ Edit metric metadata returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi from datadog_api_client.v1.model.metric_metadata import MetricMetadata body = MetricMetadata( per_unit="second", type="count", unit="byte", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.update_metric_metadata(metric_name="metric_name", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit metric metadata ``` # Edit metric metadata returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new body = DatadogAPIClient::V1::MetricMetadata.new({ per_unit: "second", type: "count", unit: "byte", }) p api_instance.update_metric_metadata("metric_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit metric metadata ``` // Edit metric metadata returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.MetricMetadata{ PerUnit: datadog.PtrString("second"), Type: datadog.PtrString("count"), Unit: datadog.PtrString("byte"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.UpdateMetricMetadata(ctx, "metric_name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.UpdateMetricMetadata`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, 
_ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.UpdateMetricMetadata`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit metric metadata ``` // Edit metric metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.MetricMetadata; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricMetadata body = new MetricMetadata().perUnit("second").type("count").unit("byte"); try { MetricMetadata result = apiInstance.updateMetricMetadata("metric_name", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#updateMetricMetadata"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit metric metadata ``` // Edit metric metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; use datadog_api_client::datadogV1::model::MetricMetadata; #[tokio::main] async fn main() { let body = MetricMetadata::new() .per_unit("second".to_string()) .type_("count".to_string()) .unit("byte".to_string()); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .update_metric_metadata("metric_name".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit metric metadata ``` /** * Edit metric metadata returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiUpdateMetricMetadataRequest = { body: { perUnit: "second", type: "count", unit: "byte", }, metricName: "metric_name", }; apiInstance .updateMetricMetadata(params) .then((data: v1.MetricMetadata) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a tag configuration](https://docs.datadoghq.com/api/latest/metrics/#update-a-tag-configuration) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#update-a-tag-configuration-v2) PATCH https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/tagshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/tags ### Overview Update the tag configuration of a metric or percentile aggregations of a distribution metric or custom aggregations of a count, rate, or gauge metric. By setting `exclude_tags_mode` to true the behavior is changed from an allow-list to a deny-list, and tags in the defined list will not be queryable. Can only be used with application keys from users with the `Manage Tags for Metrics` permission. This endpoint requires a tag configuration to be created first. This endpoint requires the `metric_tags_write` permission. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object Object for a single tag configuration to be edited. attributes object Object containing the definition of a metric tag configuration to be updated. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include/exclude percentiles for a distribution metric. Defaults to false. Can only be applied to metrics that have a `metric_type` of `distribution`. tags [string] A list of tag keys that will be queryable for your metric. default: id [_required_] string The metric name for this resource. type [_required_] enum The metric tag configuration resource type. 
Allowed enum values: `manage_tags` default: `manage_tags` ``` { "data": { "type": "manage_tags", "id": "test.metric.latency", "attributes": { "tags": [ "app" ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#UpdateTagConfiguration-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#UpdateTagConfiguration-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#UpdateTagConfiguration-403-v2) * [422](https://docs.datadoghq.com/api/latest/metrics/#UpdateTagConfiguration-422-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#UpdateTagConfiguration-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object which includes a single metric’s tag configuration. Field Type Description data object Object for a single metric tag configuration. attributes object Object containing the definition of a metric tag configuration attributes. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` created_at date-time Timestamp when the tag configuration was created. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `metric_type` is `distribution`. metric_type enum The metric's type. Allowed enum values: `gauge,count,rate,distribution` default: `gauge` modified_at date-time Timestamp when the tag configuration was last modified. tags [string] List of tag keys on which to group. id string The metric name for this resource. type enum The metric tag configuration resource type. Allowed enum values: `manage_tags` default: `manage_tags` ``` { "data": { "attributes": { "aggregations": [ { "space": "sum", "time": "sum" } ], "created_at": "2020-03-25T09:48:37.463835Z", "exclude_tags_mode": false, "include_percentiles": true, "metric_type": "count", "modified_at": "2020-03-25T09:48:37.463835Z", "tags": [ "app", "datacenter" ] }, "id": "test.metric.latency", "type": "manage_tags" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
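None of the samples below set `exclude_tags_mode`, so they all keep the default allow-list behavior described in the overview. The following Python sketch flips a configuration to a deny-list instead; it assumes the `MetricTagConfigurationUpdateAttributes` model exposes the documented `exclude_tags_mode` field alongside `tags`, and the excluded tag key is only an example.

```
"""
Sketch: switch a tag configuration to deny-list behavior.
Assumes the update-attributes model exposes the documented `exclude_tags_mode` field.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.metrics_api import MetricsApi
from datadog_api_client.v2.model.metric_tag_configuration_type import MetricTagConfigurationType
from datadog_api_client.v2.model.metric_tag_configuration_update_attributes import (
    MetricTagConfigurationUpdateAttributes,
)
from datadog_api_client.v2.model.metric_tag_configuration_update_data import MetricTagConfigurationUpdateData
from datadog_api_client.v2.model.metric_tag_configuration_update_request import (
    MetricTagConfigurationUpdateRequest,
)

METRIC_NAME = environ["METRIC_TAG_CONFIGURATION_DATA_ID"]

body = MetricTagConfigurationUpdateRequest(
    data=MetricTagConfigurationUpdateData(
        type=MetricTagConfigurationType.MANAGE_TAGS,
        id=METRIC_NAME,
        attributes=MetricTagConfigurationUpdateAttributes(
            # With exclude_tags_mode=True the listed tags become a deny-list:
            # they are dropped, and every other submitted tag stays queryable.
            exclude_tags_mode=True,
            tags=["host"],  # illustrative tag key to exclude
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MetricsApi(api_client)
    print(api_instance.update_tag_configuration(metric_name=METRIC_NAME, body=body))
```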
``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Update a tag configuration returns "OK" response Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/tags" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "manage_tags", "id": "test.metric.latency", "attributes": { "tags": [ "app" ] } } } EOF ``` ##### Update a tag configuration returns "OK" response ``` // Update a tag configuration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "metric_tag_configuration" in the system MetricTagConfigurationDataID := os.Getenv("METRIC_TAG_CONFIGURATION_DATA_ID") body := datadogV2.MetricTagConfigurationUpdateRequest{ Data: datadogV2.MetricTagConfigurationUpdateData{ Type: datadogV2.METRICTAGCONFIGURATIONTYPE_MANAGE_TAGS, Id: MetricTagConfigurationDataID, Attributes: &datadogV2.MetricTagConfigurationUpdateAttributes{ Tags: []string{ "app", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.UpdateTagConfiguration(ctx, MetricTagConfigurationDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.UpdateTagConfiguration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.UpdateTagConfiguration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a tag configuration returns "OK" response ``` // Update a tag configuration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricTagConfigurationResponse; import 
com.datadog.api.client.v2.model.MetricTagConfigurationType; import com.datadog.api.client.v2.model.MetricTagConfigurationUpdateAttributes; import com.datadog.api.client.v2.model.MetricTagConfigurationUpdateData; import com.datadog.api.client.v2.model.MetricTagConfigurationUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); // there is a valid "metric_tag_configuration" in the system String METRIC_TAG_CONFIGURATION_DATA_ID = System.getenv("METRIC_TAG_CONFIGURATION_DATA_ID"); MetricTagConfigurationUpdateRequest body = new MetricTagConfigurationUpdateRequest() .data( new MetricTagConfigurationUpdateData() .type(MetricTagConfigurationType.MANAGE_TAGS) .id(METRIC_TAG_CONFIGURATION_DATA_ID) .attributes( new MetricTagConfigurationUpdateAttributes() .tags(Collections.singletonList("app")))); try { MetricTagConfigurationResponse result = apiInstance.updateTagConfiguration(METRIC_TAG_CONFIGURATION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#updateTagConfiguration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a tag configuration returns "OK" response ``` """ Update a tag configuration returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_tag_configuration_type import MetricTagConfigurationType from datadog_api_client.v2.model.metric_tag_configuration_update_attributes import ( MetricTagConfigurationUpdateAttributes, ) from datadog_api_client.v2.model.metric_tag_configuration_update_data import MetricTagConfigurationUpdateData from datadog_api_client.v2.model.metric_tag_configuration_update_request import MetricTagConfigurationUpdateRequest # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = environ["METRIC_TAG_CONFIGURATION_DATA_ID"] body = MetricTagConfigurationUpdateRequest( data=MetricTagConfigurationUpdateData( type=MetricTagConfigurationType.MANAGE_TAGS, id=METRIC_TAG_CONFIGURATION_DATA_ID, attributes=MetricTagConfigurationUpdateAttributes( tags=[ "app", ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.update_tag_configuration(metric_name=METRIC_TAG_CONFIGURATION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a tag configuration returns "OK" response 
``` # Update a tag configuration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = ENV["METRIC_TAG_CONFIGURATION_DATA_ID"] body = DatadogAPIClient::V2::MetricTagConfigurationUpdateRequest.new({ data: DatadogAPIClient::V2::MetricTagConfigurationUpdateData.new({ type: DatadogAPIClient::V2::MetricTagConfigurationType::MANAGE_TAGS, id: METRIC_TAG_CONFIGURATION_DATA_ID, attributes: DatadogAPIClient::V2::MetricTagConfigurationUpdateAttributes.new({ tags: [ "app", ], }), }), }) p api_instance.update_tag_configuration(METRIC_TAG_CONFIGURATION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a tag configuration returns "OK" response ``` // Update a tag configuration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::MetricTagConfigurationType; use datadog_api_client::datadogV2::model::MetricTagConfigurationUpdateAttributes; use datadog_api_client::datadogV2::model::MetricTagConfigurationUpdateData; use datadog_api_client::datadogV2::model::MetricTagConfigurationUpdateRequest; #[tokio::main] async fn main() { // there is a valid "metric_tag_configuration" in the system let metric_tag_configuration_data_id = std::env::var("METRIC_TAG_CONFIGURATION_DATA_ID").unwrap(); let body = MetricTagConfigurationUpdateRequest::new( MetricTagConfigurationUpdateData::new( metric_tag_configuration_data_id.clone(), MetricTagConfigurationType::MANAGE_TAGS, ) .attributes(MetricTagConfigurationUpdateAttributes::new().tags(vec!["app".to_string()])), ); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .update_tag_configuration(metric_tag_configuration_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a tag configuration returns "OK" response ``` /** * Update a tag configuration returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); // there is a valid "metric_tag_configuration" in the system const METRIC_TAG_CONFIGURATION_DATA_ID = process.env .METRIC_TAG_CONFIGURATION_DATA_ID as string; const params: v2.MetricsApiUpdateTagConfigurationRequest = { body: { data: { type: "manage_tags", id: METRIC_TAG_CONFIGURATION_DATA_ID, attributes: { tags: ["app"], }, }, }, metricName: METRIC_TAG_CONFIGURATION_DATA_ID, }; apiInstance .updateTagConfiguration(params) .then((data: v2.MetricTagConfigurationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a tag configuration](https://docs.datadoghq.com/api/latest/metrics/#delete-a-tag-configuration) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#delete-a-tag-configuration-v2) DELETE https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/tagshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/tagshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/tagshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/tags ### Overview Deletes a metric’s tag configuration. Can only be used with application keys from users with the `Manage Tags for Metrics` permission. This endpoint requires the `metric_tags_write` permission. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Response * [204](https://docs.datadoghq.com/api/latest/metrics/#DeleteTagConfiguration-204-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#DeleteTagConfiguration-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#DeleteTagConfiguration-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#DeleteTagConfiguration-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Delete a tag configuration Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/tags" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a tag configuration ``` """ Delete a tag configuration returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) api_instance.delete_tag_configuration( metric_name="ExampleMetric", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a tag configuration ``` # Delete a tag configuration returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new api_instance.delete_tag_configuration("ExampleMetric") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a tag configuration ``` // Delete a tag configuration returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) r, err := api.DeleteTagConfiguration(ctx, "ExampleMetric") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.DeleteTagConfiguration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a tag configuration ``` // Delete a tag configuration returns "No Content" 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { apiInstance.deleteTagConfiguration("ExampleMetric"); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#deleteTagConfiguration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a tag configuration ``` // Delete a tag configuration returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .delete_tag_configuration("ExampleMetric".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a tag configuration ``` /** * Delete a tag configuration returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiDeleteTagConfigurationRequest = { metricName: "ExampleMetric", }; apiInstance .deleteTagConfiguration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search metrics](https://docs.datadoghq.com/api/latest/metrics/#search-metrics) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/metrics/#search-metrics-v1) GET https://api.ap1.datadoghq.com/api/v1/searchhttps://api.ap2.datadoghq.com/api/v1/searchhttps://api.datadoghq.eu/api/v1/searchhttps://api.ddog-gov.com/api/v1/searchhttps://api.datadoghq.com/api/v1/searchhttps://api.us3.datadoghq.com/api/v1/searchhttps://api.us5.datadoghq.com/api/v1/search ### Overview **Note** : This endpoint is deprecated. Use `/api/v2/metrics` instead. Search for metrics from the last 24 hours in Datadog. This endpoint requires the `metrics_read` permission. 
OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Query Strings Name Type Description q [_required_] string Query string to search metrics upon. Can optionally be prefixed with `metrics:`. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListMetrics-200-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListMetrics-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListMetrics-403-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListMetrics-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Object containing the list of metrics matching the search query. Field Type Description results object Search result. metrics [string] List of metrics that match the search query. ``` { "results": { "metrics": [] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby-legacy) ##### Search metrics Copy ``` # Required query arguments export q="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/search?q=${q}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search metrics ``` """ Search metrics returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_metrics( q="q", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search metrics ``` # Search metrics returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new p api_instance.list_metrics("q") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search metrics ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.search("metrics:test") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search metrics ``` // Search metrics returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.ListMetrics(ctx, "q") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: 
%v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search metrics ``` // Search metrics returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.MetricSearchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricSearchResponse result = apiInstance.listMetrics("q"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search metrics ``` // Search metrics returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.list_metrics("q".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search metrics ``` /** * Search metrics returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiListMetricsRequest = { q: "q", }; apiInstance .listMetrics(params) .then((data: v1.MetricSearchResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of metrics](https://docs.datadoghq.com/api/latest/metrics/#get-a-list-of-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#get-a-list-of-metrics-v2) GET https://api.ap1.datadoghq.com/api/v2/metricshttps://api.ap2.datadoghq.com/api/v2/metricshttps://api.datadoghq.eu/api/v2/metricshttps://api.ddog-gov.com/api/v2/metricshttps://api.datadoghq.com/api/v2/metricshttps://api.us3.datadoghq.com/api/v2/metricshttps://api.us5.datadoghq.com/api/v2/metrics ### Overview Returns all metrics that can be configured in the Metrics Summary page or with Metrics without Limits™ (matching additional filters if specified). Optionally, paginate by using the `page[cursor]` and/or `page[size]` query parameters. To fetch the first page, pass in a query parameter with either a valid `page[size]` or an empty cursor like `page[cursor]=`. To fetch the next page, pass in the `next_cursor` value from the response as the new `page[cursor]` value. Once the `meta.pagination.next_cursor` value is null, all pages have been retrieved. This endpoint requires the `metrics_read` permission. OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[configured] boolean Filter custom metrics that have configured tags. filter[tags_configured] string Filter tag configurations by configured tags. filter[metric_type] enum Filter metrics by metric type. Allowed enum values: `non_distribution, distribution` filter[include_percentiles] boolean Filter distributions with additional percentile aggregations enabled or disabled. filter[queried] boolean (Preview) Filter custom metrics that have or have not been queried in the specified window[seconds]. If no window is provided or the window is less than 2 hours, a default of 2 hours will be applied. filter[tags] string Filter metrics that have been submitted with the given tags. Supports boolean and wildcard expressions. Can only be combined with the filter[queried] filter. filter[related_assets] boolean (Preview) Filter metrics that are used in dashboards, monitors, notebooks, SLOs. window[seconds] integer The number of seconds of look back (from now) to apply to a filter[tag] or filter[queried] query. Default value is 3600 (1 hour), maximum value is 2,592,000 (30 days). page[size] integer Maximum number of results returned. page[cursor] string String to query the next page of results. This key is provided with each valid response from the API in `meta.pagination.next_cursor`. Once the `meta.pagination.next_cursor` key is null, all pages have been retrieved. 
### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurations-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurations-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurations-403-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListTagConfigurations-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes metrics and metric tag configurations. Field Type Description data [ ] Array of metrics and metric tag configurations. Option 1 object Object for a single metric tag configuration. id string The metric name for this resource. type enum The metric resource type. Allowed enum values: `metrics` default: `metrics` Option 2 object Object for a single metric tag configuration. attributes object Object containing the definition of a metric tag configuration attributes. aggregations [object] Deprecated. You no longer need to configure specific time and space aggregations for Metrics Without Limits. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` created_at date-time Timestamp when the tag configuration was created. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. Requires `tags` property. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `metric_type` is `distribution`. metric_type enum The metric's type. Allowed enum values: `gauge,count,rate,distribution` default: `gauge` modified_at date-time Timestamp when the tag configuration was last modified. tags [string] List of tag keys on which to group. id string The metric name for this resource. type enum The metric tag configuration resource type. Allowed enum values: `manage_tags` default: `manage_tags` links object Pagination links. Only present if pagination query parameters were provided. first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to previous page. self string Link to current page. meta object Response metadata object. pagination object Paging attributes. Only present if pagination query parameters were provided. cursor string The cursor used to get the current results, if any. limit int32 Number of results returned next_cursor string The cursor used to get the next results, if any. type enum Type of metric pagination. Allowed enum values: `cursor_limit` default: `cursor_limit` ``` { "data": [ { "id": "test.metric.latency", "type": "metrics" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "cursor": "string", "limit": "integer", "next_cursor": "string", "type": "cursor_limit" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Get a list of metrics Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of metrics ``` """ Get a list of metrics returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_tag_configurations() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of metrics ``` # Get a list of metrics returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new p api_instance.list_tag_configurations() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of metrics ``` // Get a list of metrics returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListTagConfigurations(ctx, *datadogV2.NewListTagConfigurationsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListTagConfigurations`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListTagConfigurations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of metrics ``` // Get a list of metrics returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricsAndMetricTagConfigurationsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricsAndMetricTagConfigurationsResponse result = apiInstance.listTagConfigurations(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listTagConfigurations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of metrics ``` // Get a list of metrics returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::ListTagConfigurationsOptionalParams; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_tag_configurations(ListTagConfigurationsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of metrics ``` /** * Get a list of metrics returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); apiInstance .listTagConfigurations() .then((data: v2.MetricsAndMetricTagConfigurationsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Query timeseries points](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-points) * [v1 (latest)](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-points-v1) GET https://api.ap1.datadoghq.com/api/v1/queryhttps://api.ap2.datadoghq.com/api/v1/queryhttps://api.datadoghq.eu/api/v1/queryhttps://api.ddog-gov.com/api/v1/queryhttps://api.datadoghq.com/api/v1/queryhttps://api.us3.datadoghq.com/api/v1/queryhttps://api.us5.datadoghq.com/api/v1/query ### Overview Query timeseries points. This endpoint requires the `timeseries_query` permission. OAuth apps require the `timeseries_query` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Query Strings Name Type Description from [_required_] integer Start of the queried time period, seconds since the Unix epoch. to [_required_] integer End of the queried time period, seconds since the Unix epoch. query [_required_] string Query string. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#QueryMetrics-200-v1) * [400](https://docs.datadoghq.com/api/latest/metrics/#QueryMetrics-400-v1) * [403](https://docs.datadoghq.com/api/latest/metrics/#QueryMetrics-403-v1) * [429](https://docs.datadoghq.com/api/latest/metrics/#QueryMetrics-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response Object that includes your query and the list of metrics retrieved. Field Type Description error string Message indicating the errors if status is not `ok`. from_date int64 Start of requested time window, milliseconds since Unix epoch. group_by [string] List of tag keys on which to group. message string Message indicating `success` if status is `ok`. query string Query string res_type string Type of response. series [object] List of timeseries queried. aggr string Aggregation type. display_name string Display name of the metric. end int64 End of the time window, milliseconds since Unix epoch. expression string Metric expression. interval int64 Number of milliseconds between data samples. length int64 Number of data samples. metric string Metric name. pointlist [array] List of points of the timeseries in milliseconds. query_index int64 The index of the series' query within the request. scope string Metric scope, comma separated list of tags. start int64 Start of the time window, milliseconds since Unix epoch. tag_set [string] Unique tags identifying this series. unit [object] Detailed information about the metric unit. The first element describes the "primary unit" (for example, `bytes` in `bytes per second`). The second element describes the "per unit" (for example, `second` in `bytes per second`). If the second element is not present, the API returns null. family string Unit family, allows for conversion between units of the same family, for scaling. name string Unit name plural string Plural form of the unit name. scale_factor double Factor for scaling between units of the same family. 
short_name string Abbreviation of the unit. status string Status of the query. to_date int64 End of requested time window, milliseconds since Unix epoch. ``` { "error": "string", "from_date": "integer", "group_by": [], "message": "string", "query": "string", "res_type": "time_series", "series": [ { "aggr": "avg", "display_name": "system.cpu.idle", "end": "integer", "expression": "system.cpu.idle{host:foo,env:test}", "interval": "integer", "length": "integer", "metric": "system.cpu.idle", "pointlist": [ [ 1681683300000, 77.62145685254418 ] ], "query_index": "integer", "scope": "host:foo,env:test", "start": "integer", "tag_set": [], "unit": [ { "family": "time", "name": "minute", "plural": "minutes", "scale_factor": 60, "short_name": "min" } ] } ], "status": "ok", "to_date": "integer" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python-legacy) ##### Query timeseries points Copy ``` # Required query arguments export from="CHANGE_ME" export to="CHANGE_ME" export query="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/query?from=${from}&to=${to}&query=${query}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Query timeseries points ``` """ Query timeseries points returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.query_metrics( _from=int((datetime.now() + relativedelta(days=-1)).timestamp()), to=int(datetime.now().timestamp()), query="system.cpu.idle{*}", ) print(response) ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Query timeseries points ``` # Query timeseries points returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MetricsAPI.new p api_instance.query_metrics((Time.now + -1 * 86400).to_i, Time.now.to_i, "system.cpu.idle{*}") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Query timeseries points ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Get points from the last hour from = Time.now - 3600 to = Time.now query = 'system.cpu.idle{*}by{host}' dog.get_points(query, from, to) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Query timeseries points ``` // Query timeseries points returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMetricsApi(apiClient) resp, r, err := api.QueryMetrics(ctx, time.Now().AddDate(0, 0, -1).Unix(), time.Now().Unix(), "system.cpu.idle{*}") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.QueryMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.QueryMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Query timeseries points ``` // Query timeseries points returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MetricsApi; import com.datadog.api.client.v1.model.MetricsQueryResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricsQueryResponse result = apiInstance.queryMetrics( OffsetDateTime.now().plusDays(-1).toInstant().getEpochSecond(), OffsetDateTime.now().toInstant().getEpochSecond(), 
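// The two arguments above are Unix-epoch seconds (here, a window covering the last 24 hours); the final argument below is the metric query string.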
"system.cpu.idle{*}"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#queryMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Query timeseries points ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } initialize(**options) now = int(time.time()) query = 'system.cpu.idle{*}by{host}' print(api.Metric.query(start=now - 3600, end=now, query=query)) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Query timeseries points ``` // Query timeseries points returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .query_metrics(1636542671, 1636629071, "system.cpu.idle{*}".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Query timeseries points ``` /** * Query timeseries points returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MetricsApi(configuration); const params: v1.MetricsApiQueryMetricsRequest = { from: Math.round( new Date(new Date().getTime() + -1 * 86400 * 1000).getTime() / 1000 ), to: Math.round(new Date().getTime() / 1000), query: "system.cpu.idle{*}", }; apiInstance .queryMetrics(params) .then((data: v1.MetricsQueryResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List tags by metric name](https://docs.datadoghq.com/api/latest/metrics/#list-tags-by-metric-name) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#list-tags-by-metric-name-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/all-tagshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/all-tagshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/all-tagshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/all-tagshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/all-tagshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/all-tagshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/all-tags ### Overview View indexed tag key-value pairs for a given metric name over the previous hour. This endpoint requires the `metrics_read` permission. OAuth apps require the `metrics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#metrics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListTagsByMetricName-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListTagsByMetricName-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListTagsByMetricName-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#ListTagsByMetricName-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListTagsByMetricName-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes a single metric’s indexed tags. Field Type Description data object Object for a single metric's indexed tags. attributes object Object containing the definition of a metric's tags. tags [string] List of indexed tag value pairs. id string The metric name for this resource. type enum The metric resource type. Allowed enum values: `metrics` default: `metrics` ``` { "data": { "attributes": { "tags": [ "sport:golf", "sport:football", "animal:dog" ] }, "id": "test.metric.latency", "type": "metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### List tags by metric name Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/all-tags" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List tags by metric name ``` """ List tags by metric name returns "Success" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = environ["METRIC_TAG_CONFIGURATION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_tags_by_metric_name( metric_name=METRIC_TAG_CONFIGURATION_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List tags by metric name ``` # List tags by metric name returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new # there is a valid "metric_tag_configuration" in the system METRIC_TAG_CONFIGURATION_DATA_ID = ENV["METRIC_TAG_CONFIGURATION_DATA_ID"] p api_instance.list_tags_by_metric_name(METRIC_TAG_CONFIGURATION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List tags by metric name ``` // List tags by metric name returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "metric_tag_configuration" in the system MetricTagConfigurationDataID := os.Getenv("METRIC_TAG_CONFIGURATION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) 
configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListTagsByMetricName(ctx, MetricTagConfigurationDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListTagsByMetricName`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListTagsByMetricName`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List tags by metric name ``` // List tags by metric name returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricAllTagsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); // there is a valid "metric_tag_configuration" in the system String METRIC_TAG_CONFIGURATION_DATA_ID = System.getenv("METRIC_TAG_CONFIGURATION_DATA_ID"); try { MetricAllTagsResponse result = apiInstance.listTagsByMetricName(METRIC_TAG_CONFIGURATION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listTagsByMetricName"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List tags by metric name ``` // List tags by metric name returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { // there is a valid "metric_tag_configuration" in the system let metric_tag_configuration_data_id = std::env::var("METRIC_TAG_CONFIGURATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_tags_by_metric_name(metric_tag_configuration_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List tags by metric name ``` /** * List tags by metric name returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); // there is a valid "metric_tag_configuration" in the system const METRIC_TAG_CONFIGURATION_DATA_ID = process.env .METRIC_TAG_CONFIGURATION_DATA_ID as string; const params: v2.MetricsApiListTagsByMetricNameRequest = { metricName: METRIC_TAG_CONFIGURATION_DATA_ID, }; apiInstance .listTagsByMetricName(params) .then((data: v2.MetricAllTagsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List active tags and aggregations](https://docs.datadoghq.com/api/latest/metrics/#list-active-tags-and-aggregations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#list-active-tags-and-aggregations-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/active-configurationshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/active-configurationshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/active-configurationshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/active-configurationshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/active-configurationshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/active-configurationshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/active-configurations ### Overview List tags and aggregations that are actively queried on dashboards, notebooks, monitors, the Metrics Explorer, and using the API for a given metric name. This endpoint requires the `metrics_read` permission. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. #### Query Strings Name Type Description window[seconds] integer The number of seconds of look back (from now). Default value is 604,800 (1 week), minimum value is 7200 (2 hours), maximum value is 2,630,000 (1 month). ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetricConfigurations-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetricConfigurations-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetricConfigurations-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetricConfigurations-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListActiveMetricConfigurations-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes a single metric’s actively queried tags and aggregations. Field Type Description data object Object for a single metric's actively queried tags and aggregations. attributes object Object containing the definition of a metric's actively queried tags and aggregations. active_aggregations [object] List of aggregation combinations that have been actively queried. space [_required_] enum A space aggregation for use in query. Allowed enum values: `avg,max,min,sum` time [_required_] enum A time aggregation for use in query. Allowed enum values: `avg,count,max,min,sum` active_tags [string] List of tag keys that have been actively queried. 
id string The metric name for this resource. type enum The metric actively queried configuration resource type. Allowed enum values: `actively_queried_configurations` default: `actively_queried_configurations` ``` { "data": { "attributes": { "active_aggregations": [ { "space": "sum", "time": "sum" } ], "active_tags": [ "app", "datacenter" ] }, "id": "test.metric.latency", "type": "actively_queried_configurations" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### List active tags and aggregations Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/active-configurations" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List active tags and aggregations ``` """ List active tags and aggregations returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_active_metric_configurations( metric_name="static_test_metric_donotdelete", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List active tags and aggregations ``` # List active tags and aggregations returns "Success" response require "datadog_api_client" 
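# The client reads the DD_API_KEY and DD_APP_KEY environment variables set in the run instructions below.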
api_instance = DatadogAPIClient::V2::MetricsAPI.new p api_instance.list_active_metric_configurations("static_test_metric_donotdelete") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List active tags and aggregations ``` // List active tags and aggregations returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListActiveMetricConfigurations(ctx, "static_test_metric_donotdelete", *datadogV2.NewListActiveMetricConfigurationsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListActiveMetricConfigurations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListActiveMetricConfigurations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List active tags and aggregations ``` // List active tags and aggregations returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricSuggestedTagsAndAggregationsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricSuggestedTagsAndAggregationsResponse result = apiInstance.listActiveMetricConfigurations("static_test_metric_donotdelete"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listActiveMetricConfigurations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List active tags and aggregations ``` // List active tags and aggregations returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::ListActiveMetricConfigurationsOptionalParams; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let 
configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_active_metric_configurations( "static_test_metric_donotdelete".to_string(), ListActiveMetricConfigurationsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List active tags and aggregations ``` /** * List active tags and aggregations returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiListActiveMetricConfigurationsRequest = { metricName: "static_test_metric_donotdelete", }; apiInstance .listActiveMetricConfigurations(params) .then((data: v2.MetricSuggestedTagsAndAggregationsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List distinct metric volumes by metric name](https://docs.datadoghq.com/api/latest/metrics/#list-distinct-metric-volumes-by-metric-name) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#list-distinct-metric-volumes-by-metric-name-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/volumeshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/volumeshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/volumeshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/volumeshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/volumeshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/volumeshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/volumes ### Overview View distinct metrics volumes for the given metric name. Custom metrics generated in-app from other products will return `null` for ingested volumes. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListVolumesByMetricName-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListVolumesByMetricName-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListVolumesByMetricName-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#ListVolumesByMetricName-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListVolumesByMetricName-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object which includes a single metric’s volume. Field Type Description data Possible response objects for a metric's volume. Option 1 object Object for a single metric's distinct volume. 
attributes object Object containing the definition of a metric's distinct volume. distinct_volume int64 Distinct volume for the given metric. id string The metric name for this resource. type enum The metric distinct volume type. Allowed enum values: `distinct_metric_volumes` default: `distinct_metric_volumes` Option 2 object Object for a single metric's ingested and indexed volume. attributes object Object containing the definition of a metric's ingested and indexed volume. indexed_volume int64 Indexed volume for the given metric. ingested_volume int64 Ingested volume for the given metric. id string The metric name for this resource. type enum The metric ingested and indexed volume type. Allowed enum values: `metric_volumes` default: `metric_volumes` ``` { "data": { "attributes": { "distinct_volume": 10 }, "id": "test.metric.latency", "type": "distinct_metric_volumes" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### List distinct metric volumes by metric name Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/volumes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List distinct metric volumes by metric name ``` """ List distinct metric volumes by metric name returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_volumes_by_metric_name( metric_name="static_test_metric_donotdelete", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List distinct metric volumes by metric name ``` # List distinct metric volumes by metric name returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new p api_instance.list_volumes_by_metric_name("static_test_metric_donotdelete") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List distinct metric volumes by metric name ``` // List distinct metric volumes by metric name returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListVolumesByMetricName(ctx, "static_test_metric_donotdelete") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListVolumesByMetricName`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListVolumesByMetricName`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List distinct metric volumes by metric name ``` // List distinct metric volumes by metric name returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricVolumesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricVolumesResponse result = apiInstance.listVolumesByMetricName("static_test_metric_donotdelete"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listVolumesByMetricName"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List distinct metric volumes by metric name ``` // List distinct metric volumes by metric name returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .list_volumes_by_metric_name("static_test_metric_donotdelete".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List distinct metric volumes by metric name ``` /** * List distinct metric volumes by metric name returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiListVolumesByMetricNameRequest = { metricName: "static_test_metric_donotdelete", }; apiInstance .listVolumesByMetricName(params) .then((data: v2.MetricVolumesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Configure tags for multiple metrics](https://docs.datadoghq.com/api/latest/metrics/#configure-tags-for-multiple-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#configure-tags-for-multiple-metrics-v2) POST https://api.ap1.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.ap2.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.datadoghq.eu/api/v2/metrics/config/bulk-tagshttps://api.ddog-gov.com/api/v2/metrics/config/bulk-tagshttps://api.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.us3.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.us5.datadoghq.com/api/v2/metrics/config/bulk-tags ### Overview Create and define a list of queryable tag keys for a set of existing count, gauge, rate, and distribution metrics. Metrics are selected by passing a metric name prefix. Use the Delete method of this API path to remove tag configurations. Results can be sent to a set of account email addresses, just like the same operation in the Datadog web app. If multiple calls include the same metric, the last configuration applied (not by submit order) is used, do not expect deterministic ordering of concurrent calls. The `exclude_tags_mode` value will set all metrics that match the prefix to the same exclusion state, metric tag configurations do not support mixed inclusion and exclusion for tags on the same metric. Can only be used with application keys of users with the `Manage Tags for Metrics` permission. This endpoint requires the `metric_tags_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object Request object to bulk configure tags for metrics matching the given prefix. attributes object Optional parameters for bulk creating metric tag configurations. emails [string] A list of account emails to notify when the configuration is applied. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. Defaults to false. include_actively_queried_tags_window double When provided, all tags that have been actively queried are configured (and, therefore, remain queryable) for each metric that matches the given prefix. Minimum value is 1 second, and maximum value is 7,776,000 seconds (90 days). override_existing_configurations boolean When set to true, the configuration overrides any existing configurations for the given metric with the new set of tags in this configuration request. If false, old configurations are kept and are merged with the set of tags in this configuration request. Defaults to true. tags [string] A list of tag names to apply to the configuration. id [_required_] string A text prefix to match against metric names. type [_required_] enum The metric bulk configure tags resource. 
Allowed enum values: `metric_bulk_configure_tags` default: `metric_bulk_configure_tags` ``` { "data": { "attributes": { "emails": [ "string" ], "tags": [ "test", "examplemetric" ] }, "id": "system.load.1", "type": "metric_bulk_configure_tags" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/metrics/#CreateBulkTagsMetricsConfiguration-202-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#CreateBulkTagsMetricsConfiguration-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#CreateBulkTagsMetricsConfiguration-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#CreateBulkTagsMetricsConfiguration-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#CreateBulkTagsMetricsConfiguration-429-v2) Accepted * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Wrapper for a single bulk tag configuration status response. Field Type Description data object The status of a request to bulk configure metric tags. It contains the fields from the original request for reference. attributes object Optional attributes for the status of a bulk tag configuration request. emails [string] A list of account emails to notify when the configuration is applied. exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. status string The status of the request. tags [string] A list of tag names to apply to the configuration. id [_required_] string A text prefix to match against metric names. type [_required_] enum The metric bulk configure tags resource. Allowed enum values: `metric_bulk_configure_tags` default: `metric_bulk_configure_tags` ``` { "data": { "attributes": { "emails": [ "sue@example.com", "bob@example.com" ], "exclude_tags_mode": false, "status": "Accepted", "tags": [ "host", "pod_name", "is_shadow" ] }, "id": "kafka.lag", "type": "metric_bulk_configure_tags" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Configure tags for multiple metrics returns "Accepted" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/config/bulk-tags" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "emails": [ "string" ], "tags": [ "test", "examplemetric" ] }, "id": "system.load.1", "type": "metric_bulk_configure_tags" } } EOF ``` ##### Configure tags for multiple metrics returns "Accepted" response ``` // Configure tags for multiple metrics returns "Accepted" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataAttributesEmail := os.Getenv("USER_DATA_ATTRIBUTES_EMAIL") body := datadogV2.MetricBulkTagConfigCreateRequest{ Data: datadogV2.MetricBulkTagConfigCreate{ Attributes: &datadogV2.MetricBulkTagConfigCreateAttributes{ Emails: []string{ UserDataAttributesEmail, }, Tags: []string{ "test", "examplemetric", }, }, Id: "system.load.1", Type: datadogV2.METRICBULKCONFIGURETAGSTYPE_BULK_MANAGE_TAGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.CreateBulkTagsMetricsConfiguration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.CreateBulkTagsMetricsConfiguration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.CreateBulkTagsMetricsConfiguration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Configure tags for multiple metrics returns "Accepted" response ``` // Configure tags for multiple metrics returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricBulkConfigureTagsType; import com.datadog.api.client.v2.model.MetricBulkTagConfigCreate; import com.datadog.api.client.v2.model.MetricBulkTagConfigCreateAttributes; import com.datadog.api.client.v2.model.MetricBulkTagConfigCreateRequest; import 
com.datadog.api.client.v2.model.MetricBulkTagConfigResponse; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ATTRIBUTES_EMAIL = System.getenv("USER_DATA_ATTRIBUTES_EMAIL"); MetricBulkTagConfigCreateRequest body = new MetricBulkTagConfigCreateRequest() .data( new MetricBulkTagConfigCreate() .attributes( new MetricBulkTagConfigCreateAttributes() .emails(Collections.singletonList(USER_DATA_ATTRIBUTES_EMAIL)) .tags(Arrays.asList("test", "examplemetric"))) .id("system.load.1") .type(MetricBulkConfigureTagsType.BULK_MANAGE_TAGS)); try { MetricBulkTagConfigResponse result = apiInstance.createBulkTagsMetricsConfiguration(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#createBulkTagsMetricsConfiguration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Configure tags for multiple metrics returns "Accepted" response ``` """ Configure tags for multiple metrics returns "Accepted" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_bulk_configure_tags_type import MetricBulkConfigureTagsType from datadog_api_client.v2.model.metric_bulk_tag_config_create import MetricBulkTagConfigCreate from datadog_api_client.v2.model.metric_bulk_tag_config_create_attributes import MetricBulkTagConfigCreateAttributes from datadog_api_client.v2.model.metric_bulk_tag_config_create_request import MetricBulkTagConfigCreateRequest from datadog_api_client.v2.model.metric_bulk_tag_config_email_list import MetricBulkTagConfigEmailList from datadog_api_client.v2.model.metric_bulk_tag_config_tag_name_list import MetricBulkTagConfigTagNameList # there is a valid "user" in the system USER_DATA_ATTRIBUTES_EMAIL = environ["USER_DATA_ATTRIBUTES_EMAIL"] body = MetricBulkTagConfigCreateRequest( data=MetricBulkTagConfigCreate( attributes=MetricBulkTagConfigCreateAttributes( emails=MetricBulkTagConfigEmailList( [ USER_DATA_ATTRIBUTES_EMAIL, ] ), tags=MetricBulkTagConfigTagNameList( [ "test", "examplemetric", ] ), ), id="system.load.1", type=MetricBulkConfigureTagsType.BULK_MANAGE_TAGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.create_bulk_tags_metrics_configuration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### 
Configure tags for multiple metrics returns "Accepted" response ``` # Configure tags for multiple metrics returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new # there is a valid "user" in the system USER_DATA_ATTRIBUTES_EMAIL = ENV["USER_DATA_ATTRIBUTES_EMAIL"] body = DatadogAPIClient::V2::MetricBulkTagConfigCreateRequest.new({ data: DatadogAPIClient::V2::MetricBulkTagConfigCreate.new({ attributes: DatadogAPIClient::V2::MetricBulkTagConfigCreateAttributes.new({ emails: [ USER_DATA_ATTRIBUTES_EMAIL, ], tags: [ "test", "examplemetric", ], }), id: "system.load.1", type: DatadogAPIClient::V2::MetricBulkConfigureTagsType::BULK_MANAGE_TAGS, }), }) p api_instance.create_bulk_tags_metrics_configuration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Configure tags for multiple metrics returns "Accepted" response ``` // Configure tags for multiple metrics returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::MetricBulkConfigureTagsType; use datadog_api_client::datadogV2::model::MetricBulkTagConfigCreate; use datadog_api_client::datadogV2::model::MetricBulkTagConfigCreateAttributes; use datadog_api_client::datadogV2::model::MetricBulkTagConfigCreateRequest; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_attributes_email = std::env::var("USER_DATA_ATTRIBUTES_EMAIL").unwrap(); let body = MetricBulkTagConfigCreateRequest::new( MetricBulkTagConfigCreate::new( "system.load.1".to_string(), MetricBulkConfigureTagsType::BULK_MANAGE_TAGS, ) .attributes( MetricBulkTagConfigCreateAttributes::new() .emails(vec![user_data_attributes_email.clone()]) .tags(vec!["test".to_string(), "examplemetric".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.create_bulk_tags_metrics_configuration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Configure tags for multiple metrics returns "Accepted" response ``` /** * Configure tags for multiple metrics returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); // there is a valid "user" in the system const USER_DATA_ATTRIBUTES_EMAIL = process.env .USER_DATA_ATTRIBUTES_EMAIL as string; const params: v2.MetricsApiCreateBulkTagsMetricsConfigurationRequest = { body: { data: { attributes: { emails: [USER_DATA_ATTRIBUTES_EMAIL], tags: ["test", "examplemetric"], }, id: "system.load.1", type: "metric_bulk_configure_tags", }, }, }; apiInstance .createBulkTagsMetricsConfiguration(params) .then((data: 
v2.MetricBulkTagConfigResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete tags for multiple metrics](https://docs.datadoghq.com/api/latest/metrics/#delete-tags-for-multiple-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#delete-tags-for-multiple-metrics-v2) DELETE https://api.ap1.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.ap2.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.datadoghq.eu/api/v2/metrics/config/bulk-tagshttps://api.ddog-gov.com/api/v2/metrics/config/bulk-tagshttps://api.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.us3.datadoghq.com/api/v2/metrics/config/bulk-tagshttps://api.us5.datadoghq.com/api/v2/metrics/config/bulk-tags ### Overview Delete all custom lists of queryable tag keys for a set of existing count, gauge, rate, and distribution metrics. Metrics are selected by passing a metric name prefix. Results can be sent to a set of account email addresses, just like the same operation in the Datadog web app. Can only be used with application keys of users with the `Manage Tags for Metrics` permission. This endpoint requires the `metric_tags_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Field Type Description data [_required_] object Request object to bulk delete all tag configurations for metrics matching the given prefix. attributes object Optional parameters for bulk deleting metric tag configurations. emails [string] A list of account emails to notify when the configuration is applied. id [_required_] string A text prefix to match against metric names. type [_required_] enum The metric bulk configure tags resource. Allowed enum values: `metric_bulk_configure_tags` default: `metric_bulk_configure_tags` ``` { "data": { "attributes": { "emails": [ "sue@example.com", "bob@example.com" ] }, "id": "kafka.lag", "type": "metric_bulk_configure_tags" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/metrics/#DeleteBulkTagsMetricsConfiguration-202-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#DeleteBulkTagsMetricsConfiguration-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#DeleteBulkTagsMetricsConfiguration-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#DeleteBulkTagsMetricsConfiguration-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#DeleteBulkTagsMetricsConfiguration-429-v2) Accepted * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Wrapper for a single bulk tag configuration status response. Field Type Description data object The status of a request to bulk configure metric tags. It contains the fields from the original request for reference. attributes object Optional attributes for the status of a bulk tag configuration request. emails [string] A list of account emails to notify when the configuration is applied. 
exclude_tags_mode boolean When set to true, the configuration will exclude the configured tags and include any other submitted tags. When set to false, the configuration will include the configured tags and exclude any other submitted tags. status string The status of the request. tags [string] A list of tag names to apply to the configuration. id [_required_] string A text prefix to match against metric names. type [_required_] enum The metric bulk configure tags resource. Allowed enum values: `metric_bulk_configure_tags` default: `metric_bulk_configure_tags` ``` { "data": { "attributes": { "emails": [ "sue@example.com", "bob@example.com" ], "exclude_tags_mode": false, "status": "Accepted", "tags": [ "host", "pod_name", "is_shadow" ] }, "id": "kafka.lag", "type": "metric_bulk_configure_tags" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Delete tags for multiple metrics Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/config/bulk-tags" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "kafka.lag", "type": "metric_bulk_configure_tags" } } EOF ``` ##### Delete tags for multiple metrics ``` """ Delete tags for multiple metrics returns "Accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi from datadog_api_client.v2.model.metric_bulk_configure_tags_type import MetricBulkConfigureTagsType from datadog_api_client.v2.model.metric_bulk_tag_config_delete import MetricBulkTagConfigDelete from datadog_api_client.v2.model.metric_bulk_tag_config_delete_attributes import MetricBulkTagConfigDeleteAttributes from datadog_api_client.v2.model.metric_bulk_tag_config_delete_request import MetricBulkTagConfigDeleteRequest from datadog_api_client.v2.model.metric_bulk_tag_config_email_list import MetricBulkTagConfigEmailList body = MetricBulkTagConfigDeleteRequest( data=MetricBulkTagConfigDelete( attributes=MetricBulkTagConfigDeleteAttributes( emails=MetricBulkTagConfigEmailList( [ "sue@example.com", "bob@example.com", ] ), ), id="kafka.lag", type=MetricBulkConfigureTagsType.BULK_MANAGE_TAGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.delete_bulk_tags_metrics_configuration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete tags for multiple metrics ``` # Delete tags for multiple metrics returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new body = DatadogAPIClient::V2::MetricBulkTagConfigDeleteRequest.new({ data: DatadogAPIClient::V2::MetricBulkTagConfigDelete.new({ attributes: DatadogAPIClient::V2::MetricBulkTagConfigDeleteAttributes.new({ emails: [ "sue@example.com", "bob@example.com", ], }), id: "kafka.lag", type: DatadogAPIClient::V2::MetricBulkConfigureTagsType::BULK_MANAGE_TAGS, }), }) p api_instance.delete_bulk_tags_metrics_configuration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete tags for multiple metrics ``` // Delete tags for multiple metrics returns "Accepted" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MetricBulkTagConfigDeleteRequest{ Data: datadogV2.MetricBulkTagConfigDelete{ Attributes: &datadogV2.MetricBulkTagConfigDeleteAttributes{ Emails: []string{ "sue@example.com", "bob@example.com", }, }, Id: "kafka.lag", Type: datadogV2.METRICBULKCONFIGURETAGSTYPE_BULK_MANAGE_TAGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.DeleteBulkTagsMetricsConfiguration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.DeleteBulkTagsMetricsConfiguration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.DeleteBulkTagsMetricsConfiguration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete tags for multiple metrics ``` // Delete tags for multiple metrics returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricBulkConfigureTagsType; import com.datadog.api.client.v2.model.MetricBulkTagConfigDelete; import com.datadog.api.client.v2.model.MetricBulkTagConfigDeleteAttributes; import com.datadog.api.client.v2.model.MetricBulkTagConfigDeleteRequest; import com.datadog.api.client.v2.model.MetricBulkTagConfigResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); MetricBulkTagConfigDeleteRequest body = new MetricBulkTagConfigDeleteRequest() .data( new MetricBulkTagConfigDelete() .attributes( new MetricBulkTagConfigDeleteAttributes() .emails(Arrays.asList("sue@example.com", "bob@example.com"))) .id("kafka.lag") .type(MetricBulkConfigureTagsType.BULK_MANAGE_TAGS)); try { MetricBulkTagConfigResponse result = apiInstance.deleteBulkTagsMetricsConfiguration(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#deleteBulkTagsMetricsConfiguration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete tags for multiple metrics ``` // Delete tags for multiple metrics returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; use datadog_api_client::datadogV2::model::MetricBulkConfigureTagsType; use datadog_api_client::datadogV2::model::MetricBulkTagConfigDelete; use datadog_api_client::datadogV2::model::MetricBulkTagConfigDeleteAttributes; use datadog_api_client::datadogV2::model::MetricBulkTagConfigDeleteRequest; #[tokio::main] async fn main() { let body = MetricBulkTagConfigDeleteRequest::new( MetricBulkTagConfigDelete::new( "kafka.lag".to_string(), MetricBulkConfigureTagsType::BULK_MANAGE_TAGS, ) .attributes(MetricBulkTagConfigDeleteAttributes::new().emails(vec![ "sue@example.com".to_string(), "bob@example.com".to_string(), ])), ); let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.delete_bulk_tags_metrics_configuration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete tags for multiple metrics ``` /** * Delete tags for multiple metrics returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiDeleteBulkTagsMetricsConfigurationRequest = { body: { data: { attributes: { emails: ["sue@example.com", "bob@example.com"], }, id: "kafka.lag", type: "metric_bulk_configure_tags", }, }, }; apiInstance .deleteBulkTagsMetricsConfiguration(params) .then((data: v2.MetricBulkTagConfigResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Tag Configuration Cardinality Estimator](https://docs.datadoghq.com/api/latest/metrics/#tag-configuration-cardinality-estimator) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#tag-configuration-cardinality-estimator-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/estimatehttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/estimatehttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/estimatehttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/estimatehttps://api.datadoghq.com/api/v2/metrics/{metric_name}/estimatehttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/estimatehttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/estimate ### Overview Returns the estimated cardinality for a metric with a given tag, percentile and number of aggregations configuration using Metrics without Limits™. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. #### Query Strings Name Type Description filter[groups] string Filtered tag keys that the metric is configured to query with. filter[hours_ago] integer The number of hours of look back (from now) to estimate cardinality with. If unspecified, it defaults to 0 hours. filter[num_aggregations] integer Deprecated. Number of aggregations has no impact on volume. filter[pct] boolean A boolean, for distribution metrics only, to estimate cardinality if the metric includes additional percentile aggregators. filter[timespan_h] integer A window, in hours, from the look back to estimate cardinality with. The minimum and default is 1 hour. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#EstimateMetricsOutputSeries-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#EstimateMetricsOutputSeries-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#EstimateMetricsOutputSeries-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#EstimateMetricsOutputSeries-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#EstimateMetricsOutputSeries-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes metric cardinality estimates. Field Type Description data object Object for a metric cardinality estimate. attributes object Object containing the definition of a metric estimate attribute. estimate_type enum Estimate type based on the queried configuration. By default, `count_or_gauge` is returned. `distribution` is returned for distribution metrics without percentiles enabled. Lastly, `percentile` is returned if `filter[pct]=true` is queried with a distribution metric. Allowed enum values: `count_or_gauge,distribution,percentile` default: `count_or_gauge` estimated_at date-time Timestamp when the cardinality estimate was requested. estimated_output_series int64 Estimated cardinality of the metric based on the queried configuration. id string The metric name for this resource. type enum The metric estimate resource type. 
Allowed enum values: `metric_cardinality_estimate` default: `metric_cardinality_estimate` ``` { "data": { "attributes": { "estimate_type": "distribution", "estimated_at": "2022-04-27T09:48:37.463835Z", "estimated_output_series": 50 }, "id": "test.metric.latency", "type": "metric_cardinality_estimate" } } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Tag Configuration Cardinality Estimator Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/estimate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Tag Configuration Cardinality Estimator ``` """ Tag Configuration Cardinality Estimator returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.estimate_metrics_output_series( metric_name="system.cpu.idle", filter_groups="app,host", filter_num_aggregations=4, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Tag Configuration Cardinality Estimator ``` # Tag Configuration Cardinality Estimator returns "Success" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::MetricsAPI.new opts = { filter_groups: "app,host", filter_num_aggregations: 4, } p api_instance.estimate_metrics_output_series("system.cpu.idle", opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Tag Configuration Cardinality Estimator ``` // Tag Configuration Cardinality Estimator returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.EstimateMetricsOutputSeries(ctx, "system.cpu.idle", *datadogV2.NewEstimateMetricsOutputSeriesOptionalParameters().WithFilterGroups("app,host").WithFilterNumAggregations(4)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.EstimateMetricsOutputSeries`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.EstimateMetricsOutputSeries`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Tag Configuration Cardinality Estimator ``` // Tag Configuration Cardinality Estimator returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.api.MetricsApi.EstimateMetricsOutputSeriesOptionalParameters; import com.datadog.api.client.v2.model.MetricEstimateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricEstimateResponse result = apiInstance.estimateMetricsOutputSeries( "system.cpu.idle", new EstimateMetricsOutputSeriesOptionalParameters() .filterGroups("app,host") .filterNumAggregations(4)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#estimateMetricsOutputSeries"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Tag Configuration Cardinality Estimator ``` // Tag Configuration Cardinality Estimator returns 
"Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::EstimateMetricsOutputSeriesOptionalParams; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .estimate_metrics_output_series( "system.cpu.idle".to_string(), EstimateMetricsOutputSeriesOptionalParams::default() .filter_groups("app,host".to_string()) .filter_num_aggregations(4), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Tag Configuration Cardinality Estimator ``` /** * Tag Configuration Cardinality Estimator returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiEstimateMetricsOutputSeriesRequest = { metricName: "system.cpu.idle", filterGroups: "app,host", filterNumAggregations: 4, }; apiInstance .estimateMetricsOutputSeries(params) .then((data: v2.MetricEstimateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Related Assets to a Metric](https://docs.datadoghq.com/api/latest/metrics/#related-assets-to-a-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#related-assets-to-a-metric-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/assetshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/assetshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/assetshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/assetshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/assetshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/assetshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/assets ### Overview Returns dashboards, monitors, notebooks, and SLOs that a metric is stored in, if any. Updated every 24 hours. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. 
### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#ListMetricAssets-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#ListMetricAssets-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#ListMetricAssets-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#ListMetricAssets-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#ListMetricAssets-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes related dashboards, monitors, notebooks, and SLOs. Field Type Description data object Metric assets response data. id [_required_] string The metric name for this resource. relationships object Relationships to assets related to the metric. dashboards object An object containing the list of dashboards that can be referenced in the `included` data. data [object] A list of dashboards that can be referenced in the `included` data. id string The related dashboard's ID. type enum Dashboard resource type. Allowed enum values: `dashboards` monitors object A object containing the list of monitors that can be referenced in the `included` data. data [object] A list of monitors that can be referenced in the `included` data. id string The related monitor's ID. type enum Monitor resource type. Allowed enum values: `monitors` notebooks object An object containing the list of notebooks that can be referenced in the `included` data. data [object] A list of notebooks that can be referenced in the `included` data. id string The related notebook's ID. type enum Notebook resource type. Allowed enum values: `notebooks` slos object An object containing a list of SLOs that can be referenced in the `included` data. data [object] A list of SLOs that can be referenced in the `included` data. id string The SLO ID. type enum SLO resource type. Allowed enum values: `slos` type [_required_] enum The metric resource type. Allowed enum values: `metrics` default: `metrics` included [ ] Array of objects related to the metric assets. Option 1 object A dashboard object with title and popularity. attributes object Attributes related to the dashboard, including title, popularity, and url. popularity double Value from 0 to 5 that ranks popularity of the dashboard. tags [string] List of tag keys used in the asset. title string Title of the asset. url string URL path of the asset. id [_required_] string The related dashboard's ID. type [_required_] enum Dashboard resource type. Allowed enum values: `dashboards` Option 2 object A monitor object with title. attributes object Assets related to the object, including title, url, and tags. tags [string] List of tag keys used in the asset. title string Title of the asset. url string URL path of the asset. id [_required_] string The related monitor's ID. type [_required_] enum Monitor resource type. Allowed enum values: `monitors` Option 3 object A notebook object with title. attributes object Assets related to the object, including title, url, and tags. tags [string] List of tag keys used in the asset. title string Title of the asset. url string URL path of the asset. id [_required_] string The related notebook's ID. type [_required_] enum Notebook resource type. Allowed enum values: `notebooks` Option 4 object A SLO object with title. attributes object Assets related to the object, including title, url, and tags. tags [string] List of tag keys used in the asset. title string Title of the asset. url string URL path of the asset. 
id [_required_] string The SLO ID. type [_required_] enum SLO resource type. Allowed enum values: `slos` ``` { "data": { "id": "test.metric.latency", "relationships": { "dashboards": { "data": [ { "id": "xxx-yyy-zzz", "type": "dashboards" } ] }, "monitors": { "data": [ { "id": "1775073", "type": "monitors" } ] }, "notebooks": { "data": [ { "id": "12345", "type": "notebooks" } ] }, "slos": { "data": [ { "id": "9ffef113b389520db54391d67d652dfb", "type": "slos" } ] } }, "type": "metrics" }, "included": [ { "attributes": { "popularity": "number", "tags": [ "env", "service", "host", "datacenter" ], "title": "string", "url": "string" }, "id": "xxx-yyy-zzz", "type": "dashboards" } ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Related Assets to a Metric Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/assets" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Related Assets to a Metric ``` """ Related Assets to a Metric returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.list_metric_assets( metric_name="system.cpu.user", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Related Assets to a Metric ``` # Related Assets to a Metric returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new p api_instance.list_metric_assets("system.cpu.user") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Related Assets to a Metric ``` // Related Assets to a Metric returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.ListMetricAssets(ctx, "system.cpu.user") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.ListMetricAssets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MetricsApi.ListMetricAssets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Related Assets to a Metric ``` // Related Assets to a Metric returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricAssetsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricAssetsResponse result = apiInstance.listMetricAssets("system.cpu.user"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#listMetricAssets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Related Assets to a Metric ``` // Related Assets to a Metric returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api.list_metric_assets("system.cpu.user".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Related Assets to a Metric ``` /** * Related Assets to a Metric returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiListMetricAssetsRequest = { metricName: "system.cpu.user", }; apiInstance .listMetricAssets(params) .then((data: v2.MetricAssetsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get tag key cardinality details](https://docs.datadoghq.com/api/latest/metrics/#get-tag-key-cardinality-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/metrics/#get-tag-key-cardinality-details-v2) GET https://api.ap1.datadoghq.com/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.ap2.datadoghq.com/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.datadoghq.eu/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.ddog-gov.com/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.datadoghq.com/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.us3.datadoghq.com/api/v2/metrics/{metric_name}/tag-cardinalitieshttps://api.us5.datadoghq.com/api/v2/metrics/{metric_name}/tag-cardinalities ### Overview Returns the cardinality details of tags for a specific metric. This endpoint requires the `metrics_read` permission. ### Arguments #### Path Parameters Name Type Description metric_name [_required_] string The name of the metric. ### Response * [200](https://docs.datadoghq.com/api/latest/metrics/#GetMetricTagCardinalityDetails-200-v2) * [400](https://docs.datadoghq.com/api/latest/metrics/#GetMetricTagCardinalityDetails-400-v2) * [403](https://docs.datadoghq.com/api/latest/metrics/#GetMetricTagCardinalityDetails-403-v2) * [404](https://docs.datadoghq.com/api/latest/metrics/#GetMetricTagCardinalityDetails-404-v2) * [429](https://docs.datadoghq.com/api/latest/metrics/#GetMetricTagCardinalityDetails-429-v2) Success * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) Response object that includes an array of objects representing the cardinality details of a metric’s tags. Field Type Description data [object] A list of tag cardinalities associated with the given metric. attributes object An object containing properties related to the tag key cardinality_delta int64 This describes the recent change in the tag keys cardinality id string The name of the tag key. type string This describes the endpoint action. default: `tag_cardinality` meta object Response metadata object. metric_name string The name of metric for which the tag cardinalities are returned. This matches the metric name provided in the request. ``` { "data": [ { "attributes": { "cardinality_delta": "integer" }, "id": "string", "type": "string" } ], "meta": { "metric_name": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. 
status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/metrics/) * [Example](https://docs.datadoghq.com/api/latest/metrics/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/metrics/?code-lang=typescript) ##### Get tag key cardinality details Copy ``` # Path parameters export metric_name="dist.http.endpoint.request" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/metrics/${metric_name}/tag-cardinalities" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get tag key cardinality details ``` """ Get tag key cardinality details returns "Success" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.metrics_api import MetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MetricsApi(api_client) response = api_instance.get_metric_tag_cardinality_details( metric_name="metric_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get tag key cardinality details ``` # Get tag key cardinality details returns "Success" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MetricsAPI.new p api_instance.get_metric_tag_cardinality_details("metric_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get tag key cardinality details ``` // Get tag key cardinality details returns "Success" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMetricsApi(apiClient) resp, r, err := api.GetMetricTagCardinalityDetails(ctx, "metric_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MetricsApi.GetMetricTagCardinalityDetails`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`MetricsApi.GetMetricTagCardinalityDetails`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get tag key cardinality details ``` // Get tag key cardinality details returns "Success" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MetricsApi; import com.datadog.api.client.v2.model.MetricTagCardinalitiesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MetricsApi apiInstance = new MetricsApi(defaultClient); try { MetricTagCardinalitiesResponse result = apiInstance.getMetricTagCardinalityDetails("dist.http.endpoint.request"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MetricsApi#getMetricTagCardinalityDetails"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get tag key cardinality details ``` // Get tag key cardinality details returns "Success" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_metrics::MetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MetricsAPI::with_config(configuration); let resp = api .get_metric_tag_cardinality_details("metric_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get tag key cardinality details ``` /** * Get tag key cardinality details returns "Success" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MetricsApi(configuration); const params: v2.MetricsApiGetMetricTagCardinalityDetailsRequest = { metricName: "metric_name", }; apiInstance .getMetricTagCardinalityDetails(params) .then((data: v2.MetricTagCardinalitiesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=2acd63f6-6e49-43e3-af1c-ff70bd439b42&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=4a9fc478-4f3f-4485-9e3a-ae35804a236e&pt=Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmetrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=2acd63f6-6e49-43e3-af1c-ff70bd439b42&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=4a9fc478-4f3f-4485-9e3a-ae35804a236e&pt=Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmetrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=acb6f3e5-2482-4b8e-aaa5-e7d9c6955245&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Metrics&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmetrics%2F&r=<=33234&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=222373) --- # Source: https://docs.datadoghq.com/api/latest/microsoft-teams-integration/ # Microsoft Teams Integration Configure your [Datadog Microsoft Teams integration](https://docs.datadoghq.com/integrations/microsoft_teams/) directly through the Datadog API. Note: These endpoints do not support legacy connector handles. ## [Create tenant-based handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#create-tenant-based-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#create-tenant-based-handle-v2) POST https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles ### Overview Create a tenant-based handle in the Datadog Microsoft Teams integration. ### Request #### Body Data (required) Tenant-based handle payload. * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Field Type Description data [_required_] object Tenant-based handle data from a response. attributes [_required_] object Tenant-based handle attributes. channel_id [_required_] string Channel id. name [_required_] string Tenant-based handle name. team_id [_required_] string Team id. 
tenant_id [_required_] string Tenant id. type [_required_] enum Specifies the tenant-based handle resource type. Allowed enum values: `tenant-based-handle` default: `tenant-based-handle` ``` { "data": { "attributes": { "channel_id": "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", "name": "Example-Microsoft-Teams-Integration", "team_id": "e5f50a58-c929-4fb3-8866-e2cd836de3c2", "tenant_id": "4d3bac44-0230-4732-9e70-cc00736f0a97" }, "type": "tenant-based-handle" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-201-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-404-v2) * [409](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-409-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateTenantBasedHandle-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a tenant-based handle. Field Type Description data [_required_] object Tenant-based handle data from a response. attributes object Tenant-based handle attributes. channel_id string Channel id. name string Tenant-based handle name. team_id string Team id. tenant_id string Tenant id. id string The ID of the tenant-based handle. type enum Specifies the tenant-based handle resource type. Allowed enum values: `tenant-based-handle` default: `tenant-based-handle` ``` { "data": { "attributes": { "channel_id": "fake-channel-id", "name": "fake-handle-name", "team_id": "00000000-0000-0000-0000-000000000000", "tenant_id": "00000000-0000-0000-0000-000000000001" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "tenant-based-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Create api handle returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "channel_id": "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", "name": "Example-Microsoft-Teams-Integration", "team_id": "e5f50a58-c929-4fb3-8866-e2cd836de3c2", "tenant_id": "4d3bac44-0230-4732-9e70-cc00736f0a97" }, "type": "tenant-based-handle" } } EOF ``` ##### Create api handle returns "CREATED" response ``` // Create api handle returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MicrosoftTeamsCreateTenantBasedHandleRequest{ Data: datadogV2.MicrosoftTeamsTenantBasedHandleRequestData{ Attributes: datadogV2.MicrosoftTeamsTenantBasedHandleRequestAttributes{ ChannelId: "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", Name: "Example-Microsoft-Teams-Integration", TeamId: "e5f50a58-c929-4fb3-8866-e2cd836de3c2", TenantId: "4d3bac44-0230-4732-9e70-cc00736f0a97", }, Type: datadogV2.MICROSOFTTEAMSTENANTBASEDHANDLETYPE_TENANT_BASED_HANDLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.CreateTenantBasedHandle(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.CreateTenantBasedHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`MicrosoftTeamsIntegrationApi.CreateTenantBasedHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create api handle returns "CREATED" response ``` // Create api handle returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsCreateTenantBasedHandleRequest; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleRequestAttributes; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleRequestData; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleResponse; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); MicrosoftTeamsCreateTenantBasedHandleRequest body = new MicrosoftTeamsCreateTenantBasedHandleRequest() .data( new MicrosoftTeamsTenantBasedHandleRequestData() .attributes( new MicrosoftTeamsTenantBasedHandleRequestAttributes() .channelId( "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2") .name("Example-Microsoft-Teams-Integration") .teamId("e5f50a58-c929-4fb3-8866-e2cd836de3c2") .tenantId("4d3bac44-0230-4732-9e70-cc00736f0a97")) .type(MicrosoftTeamsTenantBasedHandleType.TENANT_BASED_HANDLE)); try { MicrosoftTeamsTenantBasedHandleResponse result = apiInstance.createTenantBasedHandle(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#createTenantBasedHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create api handle returns "CREATED" response ``` """ Create api handle returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi from datadog_api_client.v2.model.microsoft_teams_create_tenant_based_handle_request import ( MicrosoftTeamsCreateTenantBasedHandleRequest, ) from datadog_api_client.v2.model.microsoft_teams_tenant_based_handle_request_attributes import ( MicrosoftTeamsTenantBasedHandleRequestAttributes, ) from datadog_api_client.v2.model.microsoft_teams_tenant_based_handle_request_data import ( MicrosoftTeamsTenantBasedHandleRequestData, ) from datadog_api_client.v2.model.microsoft_teams_tenant_based_handle_type import MicrosoftTeamsTenantBasedHandleType body = MicrosoftTeamsCreateTenantBasedHandleRequest( 
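    # Note: the channel_id, team_id (groupId), and tenant_id values below are the
    # documentation's example placeholders. In practice they can typically be read
    # from a channel's "Get link to channel" URL in Microsoft Teams.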
data=MicrosoftTeamsTenantBasedHandleRequestData( attributes=MicrosoftTeamsTenantBasedHandleRequestAttributes( channel_id="19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", name="Example-Microsoft-Teams-Integration", team_id="e5f50a58-c929-4fb3-8866-e2cd836de3c2", tenant_id="4d3bac44-0230-4732-9e70-cc00736f0a97", ), type=MicrosoftTeamsTenantBasedHandleType.TENANT_BASED_HANDLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.create_tenant_based_handle(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create api handle returns "CREATED" response ``` # Create api handle returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new body = DatadogAPIClient::V2::MicrosoftTeamsCreateTenantBasedHandleRequest.new({ data: DatadogAPIClient::V2::MicrosoftTeamsTenantBasedHandleRequestData.new({ attributes: DatadogAPIClient::V2::MicrosoftTeamsTenantBasedHandleRequestAttributes.new({ channel_id: "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", name: "Example-Microsoft-Teams-Integration", team_id: "e5f50a58-c929-4fb3-8866-e2cd836de3c2", tenant_id: "4d3bac44-0230-4732-9e70-cc00736f0a97", }), type: DatadogAPIClient::V2::MicrosoftTeamsTenantBasedHandleType::TENANT_BASED_HANDLE, }), }) p api_instance.create_tenant_based_handle(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create api handle returns "CREATED" response ``` // Create api handle returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; use datadog_api_client::datadogV2::model::MicrosoftTeamsCreateTenantBasedHandleRequest; use datadog_api_client::datadogV2::model::MicrosoftTeamsTenantBasedHandleRequestAttributes; use datadog_api_client::datadogV2::model::MicrosoftTeamsTenantBasedHandleRequestData; use datadog_api_client::datadogV2::model::MicrosoftTeamsTenantBasedHandleType; #[tokio::main] async fn main() { let body = MicrosoftTeamsCreateTenantBasedHandleRequest::new( MicrosoftTeamsTenantBasedHandleRequestData::new( MicrosoftTeamsTenantBasedHandleRequestAttributes::new( "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2".to_string(), "Example-Microsoft-Teams-Integration".to_string(), "e5f50a58-c929-4fb3-8866-e2cd836de3c2".to_string(), "4d3bac44-0230-4732-9e70-cc00736f0a97".to_string(), ), MicrosoftTeamsTenantBasedHandleType::TENANT_BASED_HANDLE, ), ); let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api.create_tenant_based_handle(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create api handle returns "CREATED" response ``` /** * Create api handle returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiCreateTenantBasedHandleRequest = { body: { data: { attributes: { channelId: "19:iD_D2xy_sAa-JV851JJYwIa6mlW9F9Nxm3SLyZq68qY1@thread.tacv2", name: "Example-Microsoft-Teams-Integration", teamId: "e5f50a58-c929-4fb3-8866-e2cd836de3c2", tenantId: "4d3bac44-0230-4732-9e70-cc00736f0a97", }, type: "tenant-based-handle", }, }, }; apiInstance .createTenantBasedHandle(params) .then((data: v2.MicrosoftTeamsTenantBasedHandleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Workflows webhook handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#create-workflows-webhook-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#create-workflows-webhook-handle-v2) POST https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles ### Overview Create a Workflows webhook handle in the Datadog Microsoft Teams integration. ### Request #### Body Data (required) Workflows Webhook handle payload. * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Field Type Description data [_required_] object Workflows Webhook handle data from a response. attributes [_required_] object Workflows Webhook handle attributes. name [_required_] string Workflows Webhook handle name. url [_required_] string Workflows Webhook URL. type [_required_] enum Specifies the Workflows webhook handle resource type. 
Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": { "attributes": { "name": "Example-Microsoft-Teams-Integration", "url": "https://example.logic.azure.com/workflows/123" }, "type": "workflows-webhook-handle" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-201-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-404-v2) * [409](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-409-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#CreateWorkflowsWebhookHandle-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a Workflows webhook handle. Field Type Description data [_required_] object Workflows Webhook handle data from a response. attributes object Workflows Webhook handle attributes. name string Workflows Webhook handle name. id string The ID of the Workflows webhook handle. type enum Specifies the Workflows webhook handle resource type. Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": { "attributes": { "name": "fake-handle-name" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "workflows-webhook-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Create workflow webhook handle returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Example-Microsoft-Teams-Integration", "url": "https://example.logic.azure.com/workflows/123" }, "type": "workflows-webhook-handle" } } EOF ``` ##### Create workflow webhook handle returns "CREATED" response ``` // Create workflow webhook handle returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MicrosoftTeamsCreateWorkflowsWebhookHandleRequest{ Data: datadogV2.MicrosoftTeamsWorkflowsWebhookHandleRequestData{ Attributes: datadogV2.MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes{ Name: "Example-Microsoft-Teams-Integration", Url: "https://example.logic.azure.com/workflows/123", }, Type: datadogV2.MICROSOFTTEAMSWORKFLOWSWEBHOOKHANDLETYPE_WORKFLOWS_WEBHOOK_HANDLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.CreateWorkflowsWebhookHandle(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.CreateWorkflowsWebhookHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.CreateWorkflowsWebhookHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create workflow webhook handle returns "CREATED" response ``` // Create 
workflow webhook handle returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsCreateWorkflowsWebhookHandleRequest; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleRequestData; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleResponse; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); MicrosoftTeamsCreateWorkflowsWebhookHandleRequest body = new MicrosoftTeamsCreateWorkflowsWebhookHandleRequest() .data( new MicrosoftTeamsWorkflowsWebhookHandleRequestData() .attributes( new MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes() .name("Example-Microsoft-Teams-Integration") .url("https://example.logic.azure.com/workflows/123")) .type(MicrosoftTeamsWorkflowsWebhookHandleType.WORKFLOWS_WEBHOOK_HANDLE)); try { MicrosoftTeamsWorkflowsWebhookHandleResponse result = apiInstance.createWorkflowsWebhookHandle(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#createWorkflowsWebhookHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create workflow webhook handle returns "CREATED" response ``` """ Create workflow webhook handle returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi from datadog_api_client.v2.model.microsoft_teams_create_workflows_webhook_handle_request import ( MicrosoftTeamsCreateWorkflowsWebhookHandleRequest, ) from datadog_api_client.v2.model.microsoft_teams_workflows_webhook_handle_request_attributes import ( MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes, ) from datadog_api_client.v2.model.microsoft_teams_workflows_webhook_handle_request_data import ( MicrosoftTeamsWorkflowsWebhookHandleRequestData, ) from datadog_api_client.v2.model.microsoft_teams_workflows_webhook_handle_type import ( MicrosoftTeamsWorkflowsWebhookHandleType, ) body = MicrosoftTeamsCreateWorkflowsWebhookHandleRequest( data=MicrosoftTeamsWorkflowsWebhookHandleRequestData( attributes=MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes( name="Example-Microsoft-Teams-Integration", url="https://example.logic.azure.com/workflows/123", ), type=MicrosoftTeamsWorkflowsWebhookHandleType.WORKFLOWS_WEBHOOK_HANDLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.create_workflows_webhook_handle(body=body) 
print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create workflow webhook handle returns "CREATED" response ``` # Create workflow webhook handle returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new body = DatadogAPIClient::V2::MicrosoftTeamsCreateWorkflowsWebhookHandleRequest.new({ data: DatadogAPIClient::V2::MicrosoftTeamsWorkflowsWebhookHandleRequestData.new({ attributes: DatadogAPIClient::V2::MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes.new({ name: "Example-Microsoft-Teams-Integration", url: "https://example.logic.azure.com/workflows/123", }), type: DatadogAPIClient::V2::MicrosoftTeamsWorkflowsWebhookHandleType::WORKFLOWS_WEBHOOK_HANDLE, }), }) p api_instance.create_workflows_webhook_handle(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create workflow webhook handle returns "CREATED" response ``` // Create workflow webhook handle returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; use datadog_api_client::datadogV2::model::MicrosoftTeamsCreateWorkflowsWebhookHandleRequest; use datadog_api_client::datadogV2::model::MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes; use datadog_api_client::datadogV2::model::MicrosoftTeamsWorkflowsWebhookHandleRequestData; use datadog_api_client::datadogV2::model::MicrosoftTeamsWorkflowsWebhookHandleType; #[tokio::main] async fn main() { let body = MicrosoftTeamsCreateWorkflowsWebhookHandleRequest::new( MicrosoftTeamsWorkflowsWebhookHandleRequestData::new( MicrosoftTeamsWorkflowsWebhookHandleRequestAttributes::new( "Example-Microsoft-Teams-Integration".to_string(), "https://example.logic.azure.com/workflows/123".to_string(), ), MicrosoftTeamsWorkflowsWebhookHandleType::WORKFLOWS_WEBHOOK_HANDLE, ), ); let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api.create_workflows_webhook_handle(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create workflow webhook handle returns "CREATED" response ``` /** * Create workflow webhook handle returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: 
v2.MicrosoftTeamsIntegrationApiCreateWorkflowsWebhookHandleRequest = { body: { data: { attributes: { name: "Example-Microsoft-Teams-Integration", url: "https://example.logic.azure.com/workflows/123", }, type: "workflows-webhook-handle", }, }, }; apiInstance .createWorkflowsWebhookHandle(params) .then((data: v2.MicrosoftTeamsWorkflowsWebhookHandleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete tenant-based handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#delete-tenant-based-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#delete-tenant-based-handle-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id} ### Overview Delete a tenant-based handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your tenant-based handle id. ### Response * [204](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteTenantBasedHandle-204-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteTenantBasedHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteTenantBasedHandle-403-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteTenantBasedHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteTenantBasedHandle-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Delete tenant-based handle Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/${handle_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete tenant-based handle ``` """ Delete tenant-based handle returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) api_instance.delete_tenant_based_handle( handle_id="handle_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete tenant-based handle ``` # Delete tenant-based handle returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new api_instance.delete_tenant_based_handle("handle_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete tenant-based handle ``` // Delete tenant-based handle returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) r, err := api.DeleteTenantBasedHandle(ctx, "handle_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`MicrosoftTeamsIntegrationApi.DeleteTenantBasedHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete tenant-based handle ``` // Delete tenant-based handle returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { apiInstance.deleteTenantBasedHandle("handle_id"); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#deleteTenantBasedHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete tenant-based handle ``` // Delete tenant-based handle returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .delete_tenant_based_handle("handle_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete tenant-based handle ``` /** * Delete tenant-based handle returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiDeleteTenantBasedHandleRequest = { handleId: "handle_id", }; apiInstance .deleteTenantBasedHandle(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Workflows webhook handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#delete-workflows-webhook-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#delete-workflows-webhook-handle-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id} ### Overview Delete a Workflows webhook handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your Workflows webhook handle id. ### Response * [204](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteWorkflowsWebhookHandle-204-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteWorkflowsWebhookHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteWorkflowsWebhookHandle-403-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteWorkflowsWebhookHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#DeleteWorkflowsWebhookHandle-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Delete Workflows webhook handle Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/${handle_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Workflows webhook handle ``` """ Delete Workflows webhook handle returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) api_instance.delete_workflows_webhook_handle( handle_id="handle_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Workflows webhook handle ``` # Delete Workflows webhook handle returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new api_instance.delete_workflows_webhook_handle("handle_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Workflows webhook handle ``` // Delete Workflows webhook handle returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) r, err := api.DeleteWorkflowsWebhookHandle(ctx, "handle_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.DeleteWorkflowsWebhookHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to 
`main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Workflows webhook handle ``` // Delete Workflows webhook handle returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { apiInstance.deleteWorkflowsWebhookHandle("handle_id"); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#deleteWorkflowsWebhookHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Workflows webhook handle ``` // Delete Workflows webhook handle returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .delete_workflows_webhook_handle("handle_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Workflows webhook handle ``` /** * Delete Workflows webhook handle returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiDeleteWorkflowsWebhookHandleRequest = { handleId: "handle_id", }; apiInstance .deleteWorkflowsWebhookHandle(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all tenant-based handles](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-all-tenant-based-handles) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-all-tenant-based-handles-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handleshttps://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles ### Overview Get a list of all tenant-based handles from the Datadog Microsoft Teams integration. ### Arguments #### Query Strings Name Type Description tenant_id string Your tenant id. name string Your tenant-based handle name. ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-404-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListTenantBasedHandles-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response with a list of tenant-based handles. Field Type Description data [_required_] [object] An array of tenant-based handles. attributes object Tenant-based handle attributes. channel_id string Channel id. channel_name string Channel name. name string Tenant-based handle name. team_id string Team id. team_name string Team name. tenant_id string Tenant id. tenant_name string Tenant name. id string The ID of the tenant-based handle. type enum Tenant-based handle resource type. 
Allowed enum values: `ms-teams-tenant-based-handle-info` default: `ms-teams-tenant-based-handle-info` ``` { "data": [ { "attributes": { "channel_id": "fake-channel-id", "channel_name": "fake-channel-name", "name": "fake-handle-name", "team_id": "00000000-0000-0000-0000-000000000000", "team_name": "fake-team-name", "tenant_id": "00000000-0000-0000-0000-000000000001", "tenant_name": "fake-tenant-name" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "ms-teams-tenant-based-handle-info" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Get all tenant-based handles Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all tenant-based handles ``` """ Get all tenant-based handles returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.list_tenant_based_handles() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all tenant-based handles ``` # Get all tenant-based handles returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new p api_instance.list_tenant_based_handles() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all tenant-based handles ``` // Get all tenant-based handles returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.ListTenantBasedHandles(ctx, *datadogV2.NewListTenantBasedHandlesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.ListTenantBasedHandles`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.ListTenantBasedHandles`:\n%s\n", responseContent) } ``` 
Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all tenant-based handles ``` // Get all tenant-based handles returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandlesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { MicrosoftTeamsTenantBasedHandlesResponse result = apiInstance.listTenantBasedHandles(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#listTenantBasedHandles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all tenant-based handles ``` // Get all tenant-based handles returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::ListTenantBasedHandlesOptionalParams; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .list_tenant_based_handles(ListTenantBasedHandlesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all tenant-based handles ``` /** * Get all tenant-based handles returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); apiInstance .listTenantBasedHandles() .then((data: v2.MicrosoftTeamsTenantBasedHandlesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all Workflows webhook handles](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-all-workflows-webhook-handles) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-all-workflows-webhook-handles-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handleshttps://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles ### Overview Get a list of all Workflows webhook handles from the Datadog Microsoft Teams integration. ### Arguments #### Query Strings Name Type Description name string Your Workflows webhook handle name. ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-404-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#ListWorkflowsWebhookHandles-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response with a list of Workflows webhook handles. Field Type Description data [_required_] [object] An array of Workflows webhook handles. attributes object Workflows Webhook handle attributes. name string Workflows Webhook handle name. id string The ID of the Workflows webhook handle. type enum Specifies the Workflows webhook handle resource type. Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": [ { "attributes": { "name": "fake-handle-name" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "workflows-webhook-handle" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Get all Workflows webhook handles Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all Workflows webhook handles ``` """ Get all Workflows webhook handles returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.list_workflows_webhook_handles() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all Workflows webhook handles ``` # Get all Workflows webhook handles returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new p api_instance.list_workflows_webhook_handles() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
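# Replace the empty DD_API_KEY and DD_APP_KEY values with a valid API key and application key before running.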
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all Workflows webhook handles ``` // Get all Workflows webhook handles returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.ListWorkflowsWebhookHandles(ctx, *datadogV2.NewListWorkflowsWebhookHandlesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.ListWorkflowsWebhookHandles`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.ListWorkflowsWebhookHandles`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all Workflows webhook handles ``` // Get all Workflows webhook handles returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandlesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { MicrosoftTeamsWorkflowsWebhookHandlesResponse result = apiInstance.listWorkflowsWebhookHandles(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#listWorkflowsWebhookHandles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all Workflows webhook handles ``` // Get all Workflows webhook handles returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::ListWorkflowsWebhookHandlesOptionalParams; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .list_workflows_webhook_handles(ListWorkflowsWebhookHandlesOptionalParams::default()) .await; if let Ok(value) = resp { 
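        // On success, pretty-print the returned list of Workflows webhook handles.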
println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all Workflows webhook handles ``` /** * Get all Workflows webhook handles returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); apiInstance .listWorkflowsWebhookHandles() .then((data: v2.MicrosoftTeamsWorkflowsWebhookHandlesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get channel information by name](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-channel-information-by-name) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-channel-information-by-name-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/{tenant_name}/{team_name}/{channel_name} ### Overview Get the tenant, team, and channel ID of a channel in the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description tenant_name [_required_] string Your tenant name. team_name [_required_] string Your team name. channel_name [_required_] string Your channel name. ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetChannelByName-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetChannelByName-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetChannelByName-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetChannelByName-404-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetChannelByName-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response with channel, team, and tenant ID information. Field Type Description data object Channel data from a response. 
attributes object Channel attributes. is_primary boolean Indicates if this is the primary channel. team_id string Team id. tenant_id string Tenant id. id string The ID of the channel. type enum Channel info resource type. Allowed enum values: `ms-teams-channel-info` default: `ms-teams-channel-info` ``` { "data": { "attributes": { "is_primary": true, "team_id": "00000000-0000-0000-0000-000000000000", "tenant_id": "00000000-0000-0000-0000-000000000001" }, "id": "19:b41k24b14bn1nwffkernfkwrnfneubgkr@thread.tacv2", "type": "ms-teams-channel-info" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Get channel information by name Copy ``` # Path parameters export tenant_name="CHANGE_ME" export team_name="CHANGE_ME" export channel_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/channel/${tenant_name}/${team_name}/${channel_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get channel information by name ``` """ Get channel information by name returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.get_channel_by_name( tenant_name="tenant_name", team_name="team_name", channel_name="channel_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get channel information by name ``` # Get channel information by name returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new p api_instance.get_channel_by_name("tenant_name", "team_name", "channel_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get channel information by name ``` // Get channel information by name returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.GetChannelByName(ctx, "tenant_name", "team_name", "channel_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.GetChannelByName`: 
%v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.GetChannelByName`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get channel information by name ``` // Get channel information by name returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsGetChannelByNameResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { MicrosoftTeamsGetChannelByNameResponse result = apiInstance.getChannelByName("tenant_name", "team_name", "channel_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MicrosoftTeamsIntegrationApi#getChannelByName"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get channel information by name ``` // Get channel information by name returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .get_channel_by_name( "tenant_name".to_string(), "team_name".to_string(), "channel_name".to_string(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get channel information by name ``` /** * Get channel information by name returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiGetChannelByNameRequest = { tenantName: "tenant_name", teamName: "team_name", channelName: "channel_name", }; apiInstance .getChannelByName(params) .then((data: v2.MicrosoftTeamsGetChannelByNameResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get tenant-based handle information](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-tenant-based-handle-information) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-tenant-based-handle-information-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id} ### Overview Get the tenant, team, and channel information of a tenant-based handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your tenant-based handle id. ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-404-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetTenantBasedHandle-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a tenant-based handle. Field Type Description data [_required_] object Tenant-based handle data from a response. attributes object Tenant-based handle attributes. channel_id string Channel id. name string Tenant-based handle name. team_id string Team id. tenant_id string Tenant id. id string The ID of the tenant-based handle. type enum Specifies the tenant-based handle resource type. Allowed enum values: `tenant-based-handle` default: `tenant-based-handle` ``` { "data": { "attributes": { "channel_id": "fake-channel-id", "name": "fake-handle-name", "team_id": "00000000-0000-0000-0000-000000000000", "tenant_id": "00000000-0000-0000-0000-000000000001" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "tenant-based-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Get tenant-based handle information Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/${handle_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get tenant-based handle information ``` """ Get tenant-based handle information returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.get_tenant_based_handle( handle_id="handle_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get tenant-based handle information ``` # Get tenant-based handle information returns "OK" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new p api_instance.get_tenant_based_handle("handle_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get tenant-based handle information ``` // Get tenant-based handle information returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.GetTenantBasedHandle(ctx, "handle_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.GetTenantBasedHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.GetTenantBasedHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get tenant-based handle information ``` // Get tenant-based handle information returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { MicrosoftTeamsTenantBasedHandleResponse result = apiInstance.getTenantBasedHandle("handle_id"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#getTenantBasedHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get tenant-based handle information ``` // Get tenant-based handle information returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = 
api.get_tenant_based_handle("handle_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get tenant-based handle information ``` /** * Get tenant-based handle information returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiGetTenantBasedHandleRequest = { handleId: "handle_id", }; apiInstance .getTenantBasedHandle(params) .then((data: v2.MicrosoftTeamsTenantBasedHandleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Workflows webhook handle information](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-workflows-webhook-handle-information) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#get-workflows-webhook-handle-information-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id} ### Overview Get the name of a Workflows webhook handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your Workflows webhook handle id. 
### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-404-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#GetWorkflowsWebhookHandle-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a Workflows webhook handle. Field Type Description data [_required_] object Workflows Webhook handle data from a response. attributes object Workflows Webhook handle attributes. name string Workflows Webhook handle name. id string The ID of the Workflows webhook handle. type enum Specifies the Workflows webhook handle resource type. Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": { "attributes": { "name": "fake-handle-name" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "workflows-webhook-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Get Workflows webhook handle information Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/${handle_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Workflows webhook handle information ``` """ Get Workflows webhook handle information returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.get_workflows_webhook_handle( handle_id="handle_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Workflows webhook handle information ``` # Get Workflows webhook handle information returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new p api_instance.get_workflows_webhook_handle("handle_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Workflows webhook handle information ``` // Get Workflows webhook handle information returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.GetWorkflowsWebhookHandle(ctx, "handle_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.GetWorkflowsWebhookHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.GetWorkflowsWebhookHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Workflows webhook handle information ``` // Get Workflows webhook handle information returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); try { MicrosoftTeamsWorkflowsWebhookHandleResponse result = apiInstance.getWorkflowsWebhookHandle("handle_id"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#getWorkflowsWebhookHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Workflows webhook handle information ``` // Get Workflows webhook handle information returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .get_workflows_webhook_handle("handle_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Workflows webhook handle information ``` /** * Get Workflows webhook handle information returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); const params: v2.MicrosoftTeamsIntegrationApiGetWorkflowsWebhookHandleRequest = { handleId: "handle_id", }; apiInstance .getWorkflowsWebhookHandle(params) .then((data: v2.MicrosoftTeamsWorkflowsWebhookHandleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update tenant-based handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#update-tenant-based-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#update-tenant-based-handle-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/{handle_id} ### Overview Update a tenant-based handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your tenant-based handle id. ### Request #### Body Data (required) Tenant-based handle payload. * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Field Type Description data [_required_] object Tenant-based handle data from a response. attributes [_required_] object Tenant-based handle attributes. channel_id string Channel id. name string Tenant-based handle name. team_id string Team id. tenant_id string Tenant id. type [_required_] enum Specifies the tenant-based handle resource type. Allowed enum values: `tenant-based-handle` default: `tenant-based-handle` ``` { "data": { "attributes": { "name": "fake-handle-name--updated" }, "type": "tenant-based-handle" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-404-v2) * [409](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-409-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateTenantBasedHandle-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a tenant-based handle. Field Type Description data [_required_] object Tenant-based handle data from a response. attributes object Tenant-based handle attributes. channel_id string Channel id. 
name string Tenant-based handle name. team_id string Team id. tenant_id string Tenant id. id string The ID of the tenant-based handle. type enum Specifies the tenant-based handle resource type. Allowed enum values: `tenant-based-handle` default: `tenant-based-handle` ``` { "data": { "attributes": { "channel_id": "fake-channel-id", "name": "fake-handle-name", "team_id": "00000000-0000-0000-0000-000000000000", "tenant_id": "00000000-0000-0000-0000-000000000001" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "tenant-based-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Update api handle returns "OK" response Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/tenant-based-handles/${handle_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "fake-handle-name--updated" }, "type": "tenant-based-handle" } } EOF ``` ##### Update api handle returns "OK" response ``` // Update api handle returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "tenant_based_handle" in the system TenantBasedHandleDataID := os.Getenv("TENANT_BASED_HANDLE_DATA_ID") body := datadogV2.MicrosoftTeamsUpdateTenantBasedHandleRequest{ Data: datadogV2.MicrosoftTeamsUpdateTenantBasedHandleRequestData{ Attributes: datadogV2.MicrosoftTeamsTenantBasedHandleAttributes{ Name: datadog.PtrString("fake-handle-name--updated"), }, Type: datadogV2.MICROSOFTTEAMSTENANTBASEDHANDLETYPE_TENANT_BASED_HANDLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.UpdateTenantBasedHandle(ctx, TenantBasedHandleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.UpdateTenantBasedHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.UpdateTenantBasedHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update api handle returns "OK" response ``` // Update api handle returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleAttributes; import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleResponse; 
import com.datadog.api.client.v2.model.MicrosoftTeamsTenantBasedHandleType; import com.datadog.api.client.v2.model.MicrosoftTeamsUpdateTenantBasedHandleRequest; import com.datadog.api.client.v2.model.MicrosoftTeamsUpdateTenantBasedHandleRequestData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); // there is a valid "tenant_based_handle" in the system String TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME = System.getenv("TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME"); String TENANT_BASED_HANDLE_DATA_ID = System.getenv("TENANT_BASED_HANDLE_DATA_ID"); MicrosoftTeamsUpdateTenantBasedHandleRequest body = new MicrosoftTeamsUpdateTenantBasedHandleRequest() .data( new MicrosoftTeamsUpdateTenantBasedHandleRequestData() .attributes( new MicrosoftTeamsTenantBasedHandleAttributes() .name("fake-handle-name--updated")) .type(MicrosoftTeamsTenantBasedHandleType.TENANT_BASED_HANDLE)); try { MicrosoftTeamsTenantBasedHandleResponse result = apiInstance.updateTenantBasedHandle(TENANT_BASED_HANDLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#updateTenantBasedHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update api handle returns "OK" response ``` """ Update api handle returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi from datadog_api_client.v2.model.microsoft_teams_tenant_based_handle_attributes import ( MicrosoftTeamsTenantBasedHandleAttributes, ) from datadog_api_client.v2.model.microsoft_teams_tenant_based_handle_type import MicrosoftTeamsTenantBasedHandleType from datadog_api_client.v2.model.microsoft_teams_update_tenant_based_handle_request import ( MicrosoftTeamsUpdateTenantBasedHandleRequest, ) from datadog_api_client.v2.model.microsoft_teams_update_tenant_based_handle_request_data import ( MicrosoftTeamsUpdateTenantBasedHandleRequestData, ) # there is a valid "tenant_based_handle" in the system TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME = environ["TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME"] TENANT_BASED_HANDLE_DATA_ID = environ["TENANT_BASED_HANDLE_DATA_ID"] body = MicrosoftTeamsUpdateTenantBasedHandleRequest( data=MicrosoftTeamsUpdateTenantBasedHandleRequestData( attributes=MicrosoftTeamsTenantBasedHandleAttributes( name="fake-handle-name--updated", ), type=MicrosoftTeamsTenantBasedHandleType.TENANT_BASED_HANDLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.update_tenant_based_handle(handle_id=TENANT_BASED_HANDLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update api handle returns "OK" response ``` # Update api handle returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new # there is a valid "tenant_based_handle" in the system TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME = ENV["TENANT_BASED_HANDLE_DATA_ATTRIBUTES_NAME"] TENANT_BASED_HANDLE_DATA_ID = ENV["TENANT_BASED_HANDLE_DATA_ID"] body = DatadogAPIClient::V2::MicrosoftTeamsUpdateTenantBasedHandleRequest.new({ data: DatadogAPIClient::V2::MicrosoftTeamsUpdateTenantBasedHandleRequestData.new({ attributes: DatadogAPIClient::V2::MicrosoftTeamsTenantBasedHandleAttributes.new({ name: "fake-handle-name--updated", }), type: DatadogAPIClient::V2::MicrosoftTeamsTenantBasedHandleType::TENANT_BASED_HANDLE, }), }) p api_instance.update_tenant_based_handle(TENANT_BASED_HANDLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update api handle returns "OK" response ``` // Update api handle returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; use datadog_api_client::datadogV2::model::MicrosoftTeamsTenantBasedHandleAttributes; use datadog_api_client::datadogV2::model::MicrosoftTeamsTenantBasedHandleType; use datadog_api_client::datadogV2::model::MicrosoftTeamsUpdateTenantBasedHandleRequest; use datadog_api_client::datadogV2::model::MicrosoftTeamsUpdateTenantBasedHandleRequestData; #[tokio::main] async fn main() { // there is a valid "tenant_based_handle" in the system let tenant_based_handle_data_id = std::env::var("TENANT_BASED_HANDLE_DATA_ID").unwrap(); let body = MicrosoftTeamsUpdateTenantBasedHandleRequest::new( MicrosoftTeamsUpdateTenantBasedHandleRequestData::new( MicrosoftTeamsTenantBasedHandleAttributes::new() .name("fake-handle-name--updated".to_string()), MicrosoftTeamsTenantBasedHandleType::TENANT_BASED_HANDLE, ), ); let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .update_tenant_based_handle(tenant_based_handle_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update api handle returns "OK" response ``` /** * Update api handle returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); // there is a valid 
"tenant_based_handle" in the system const TENANT_BASED_HANDLE_DATA_ID = process.env .TENANT_BASED_HANDLE_DATA_ID as string; const params: v2.MicrosoftTeamsIntegrationApiUpdateTenantBasedHandleRequest = { body: { data: { attributes: { name: "fake-handle-name--updated", }, type: "tenant-based-handle", }, }, handleId: TENANT_BASED_HANDLE_DATA_ID, }; apiInstance .updateTenantBasedHandle(params) .then((data: v2.MicrosoftTeamsTenantBasedHandleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Workflows webhook handle](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#update-workflows-webhook-handle) * [v2 (latest)](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#update-workflows-webhook-handle-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ap2.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.eu/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.ddog-gov.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us3.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id}https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/{handle_id} ### Overview Update a Workflows webhook handle from the Datadog Microsoft Teams integration. ### Arguments #### Path Parameters Name Type Description handle_id [_required_] string Your Workflows webhook handle id. ### Request #### Body Data (required) Workflows Webhook handle payload. * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Field Type Description data [_required_] object Workflows Webhook handle data from a response. attributes [_required_] object Workflows Webhook handle attributes. name string Workflows Webhook handle name. url string Workflows Webhook URL. type [_required_] enum Specifies the Workflows webhook handle resource type. 
Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": { "attributes": { "name": "fake-handle-name--updated" }, "type": "workflows-webhook-handle" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-200-v2) * [400](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-400-v2) * [403](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-403-v2) * [404](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-404-v2) * [409](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-409-v2) * [412](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-412-v2) * [429](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/#UpdateWorkflowsWebhookHandle-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) Response of a Workflows webhook handle. Field Type Description data [_required_] object Workflows Webhook handle data from a response. attributes object Workflows Webhook handle attributes. name string Workflows Webhook handle name. id string The ID of the Workflows webhook handle. type enum Specifies the Workflows webhook handle resource type. Allowed enum values: `workflows-webhook-handle` default: `workflows-webhook-handle` ``` { "data": { "attributes": { "name": "fake-handle-name" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "workflows-webhook-handle" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Failed Precondition * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) * [Example](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/microsoft-teams-integration/?code-lang=typescript) ##### Update workflow webhook handle returns "OK" response Copy ``` # Path parameters export handle_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/ms-teams/configuration/workflows-webhook-handles/${handle_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "fake-handle-name--updated" }, "type": "workflows-webhook-handle" } } EOF ``` ##### Update workflow webhook handle returns "OK" response ``` // Update workflow webhook handle returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "workflows_webhook_handle" in the system WorkflowsWebhookHandleDataID := os.Getenv("WORKFLOWS_WEBHOOK_HANDLE_DATA_ID") body := datadogV2.MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest{ Data: datadogV2.MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData{ Attributes: datadogV2.MicrosoftTeamsWorkflowsWebhookHandleAttributes{ Name: datadog.PtrString("fake-handle-name--updated"), }, Type: datadogV2.MICROSOFTTEAMSWORKFLOWSWEBHOOKHANDLETYPE_WORKFLOWS_WEBHOOK_HANDLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMicrosoftTeamsIntegrationApi(apiClient) resp, r, err := api.UpdateWorkflowsWebhookHandle(ctx, WorkflowsWebhookHandleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MicrosoftTeamsIntegrationApi.UpdateWorkflowsWebhookHandle`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MicrosoftTeamsIntegrationApi.UpdateWorkflowsWebhookHandle`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update workflow webhook handle returns "OK" response ``` // Update workflow webhook handle returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.MicrosoftTeamsIntegrationApi; import com.datadog.api.client.v2.model.MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest; import com.datadog.api.client.v2.model.MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleAttributes; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleResponse; import com.datadog.api.client.v2.model.MicrosoftTeamsWorkflowsWebhookHandleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MicrosoftTeamsIntegrationApi apiInstance = new MicrosoftTeamsIntegrationApi(defaultClient); // there is a valid "workflows_webhook_handle" in the system String WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME = System.getenv("WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME"); String WORKFLOWS_WEBHOOK_HANDLE_DATA_ID = System.getenv("WORKFLOWS_WEBHOOK_HANDLE_DATA_ID"); MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest body = new MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest() .data( new MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData() .attributes( new MicrosoftTeamsWorkflowsWebhookHandleAttributes() .name("fake-handle-name--updated")) .type(MicrosoftTeamsWorkflowsWebhookHandleType.WORKFLOWS_WEBHOOK_HANDLE)); try { MicrosoftTeamsWorkflowsWebhookHandleResponse result = apiInstance.updateWorkflowsWebhookHandle(WORKFLOWS_WEBHOOK_HANDLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling MicrosoftTeamsIntegrationApi#updateWorkflowsWebhookHandle"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update workflow webhook handle returns "OK" response ``` """ Update workflow webhook handle returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.microsoft_teams_integration_api import MicrosoftTeamsIntegrationApi from datadog_api_client.v2.model.microsoft_teams_update_workflows_webhook_handle_request import ( MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest, ) from datadog_api_client.v2.model.microsoft_teams_update_workflows_webhook_handle_request_data import ( MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData, ) from datadog_api_client.v2.model.microsoft_teams_workflows_webhook_handle_attributes import ( MicrosoftTeamsWorkflowsWebhookHandleAttributes, ) from datadog_api_client.v2.model.microsoft_teams_workflows_webhook_handle_type import ( MicrosoftTeamsWorkflowsWebhookHandleType, ) # there is a valid "workflows_webhook_handle" in the system WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME = environ["WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME"] WORKFLOWS_WEBHOOK_HANDLE_DATA_ID = environ["WORKFLOWS_WEBHOOK_HANDLE_DATA_ID"] body = MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest( data=MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData( attributes=MicrosoftTeamsWorkflowsWebhookHandleAttributes( 
name="fake-handle-name--updated", ), type=MicrosoftTeamsWorkflowsWebhookHandleType.WORKFLOWS_WEBHOOK_HANDLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MicrosoftTeamsIntegrationApi(api_client) response = api_instance.update_workflows_webhook_handle(handle_id=WORKFLOWS_WEBHOOK_HANDLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update workflow webhook handle returns "OK" response ``` # Update workflow webhook handle returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MicrosoftTeamsIntegrationAPI.new # there is a valid "workflows_webhook_handle" in the system WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME = ENV["WORKFLOWS_WEBHOOK_HANDLE_DATA_ATTRIBUTES_NAME"] WORKFLOWS_WEBHOOK_HANDLE_DATA_ID = ENV["WORKFLOWS_WEBHOOK_HANDLE_DATA_ID"] body = DatadogAPIClient::V2::MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest.new({ data: DatadogAPIClient::V2::MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData.new({ attributes: DatadogAPIClient::V2::MicrosoftTeamsWorkflowsWebhookHandleAttributes.new({ name: "fake-handle-name--updated", }), type: DatadogAPIClient::V2::MicrosoftTeamsWorkflowsWebhookHandleType::WORKFLOWS_WEBHOOK_HANDLE, }), }) p api_instance.update_workflows_webhook_handle(WORKFLOWS_WEBHOOK_HANDLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update workflow webhook handle returns "OK" response ``` // Update workflow webhook handle returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_microsoft_teams_integration::MicrosoftTeamsIntegrationAPI; use datadog_api_client::datadogV2::model::MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest; use datadog_api_client::datadogV2::model::MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData; use datadog_api_client::datadogV2::model::MicrosoftTeamsWorkflowsWebhookHandleAttributes; use datadog_api_client::datadogV2::model::MicrosoftTeamsWorkflowsWebhookHandleType; #[tokio::main] async fn main() { // there is a valid "workflows_webhook_handle" in the system let workflows_webhook_handle_data_id = std::env::var("WORKFLOWS_WEBHOOK_HANDLE_DATA_ID").unwrap(); let body = MicrosoftTeamsUpdateWorkflowsWebhookHandleRequest::new( MicrosoftTeamsUpdateWorkflowsWebhookHandleRequestData::new( MicrosoftTeamsWorkflowsWebhookHandleAttributes::new() .name("fake-handle-name--updated".to_string()), MicrosoftTeamsWorkflowsWebhookHandleType::WORKFLOWS_WEBHOOK_HANDLE, ), ); let configuration = datadog::Configuration::new(); let api = MicrosoftTeamsIntegrationAPI::with_config(configuration); let resp = api .update_workflows_webhook_handle(workflows_webhook_handle_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and 
its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update workflow webhook handle returns "OK" response ``` /** * Update workflow webhook handle returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MicrosoftTeamsIntegrationApi(configuration); // there is a valid "workflows_webhook_handle" in the system const WORKFLOWS_WEBHOOK_HANDLE_DATA_ID = process.env .WORKFLOWS_WEBHOOK_HANDLE_DATA_ID as string; const params: v2.MicrosoftTeamsIntegrationApiUpdateWorkflowsWebhookHandleRequest = { body: { data: { attributes: { name: "fake-handle-name--updated", }, type: "workflows-webhook-handle", }, }, handleId: WORKFLOWS_WEBHOOK_HANDLE_DATA_ID, }; apiInstance .updateWorkflowsWebhookHandle(params) .then((data: v2.MicrosoftTeamsWorkflowsWebhookHandleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/monitors # Monitors [Monitors](https://docs.datadoghq.com/monitors) allow you to watch a metric or check that you care about and notify your team when a defined threshold is exceeded. For more information, see [Creating Monitors](https://docs.datadoghq.com/monitors/create/types/). **Note:** `curl` commands require [url encoding](https://curl.se/docs/url-syntax.html).
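If you prefer a client library over raw `curl`, the following is a minimal, illustrative sketch using the official `datadog-api-client` Python package. It mirrors the metric query alert body ("Bytes received on host0") used in the `curl` examples later in this section, and it assumes `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` are exported in your environment, as in the other examples in this document.

```python
# Minimal sketch: create a metric query alert with the official Python client.
# Assumes DD_API_KEY, DD_APP_KEY, and DD_SITE are set in the environment.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.monitors_api import MonitorsApi
from datadog_api_client.v1.model.monitor import Monitor
from datadog_api_client.v1.model.monitor_options import MonitorOptions
from datadog_api_client.v1.model.monitor_type import MonitorType

# Same request body as the curl example for this endpoint.
body = Monitor(
    name="Bytes received on host0",
    type=MonitorType.QUERY_ALERT,
    query="avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100",
    message="You may need to add web hosts if this is consistently high.",
    tags=["app:webserver", "frontend"],
    options=MonitorOptions(
        notify_no_data=True,
        no_data_timeframe=20,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MonitorsApi(api_client)
    # POST /api/v1/monitor
    response = api_instance.create_monitor(body=body)
    print(response)
```

Run it the same way as the other Python examples in this document: save it to `example.py` and invoke `python3 "example.py"` with the `DD_*` variables set.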
## [Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-v1) POST https://api.ap1.datadoghq.com/api/v1/monitorhttps://api.ap2.datadoghq.com/api/v1/monitorhttps://api.datadoghq.eu/api/v1/monitorhttps://api.ddog-gov.com/api/v1/monitorhttps://api.datadoghq.com/api/v1/monitorhttps://api.us3.datadoghq.com/api/v1/monitorhttps://api.us5.datadoghq.com/api/v1/monitor ### Overview Create a monitor using the specified options. #### [Monitor Types](https://docs.datadoghq.com/api/latest/monitors/#monitor-types) The type of monitor chosen from: * anomaly: `query alert` * APM: `query alert` or `trace-analytics alert` * composite: `composite` * custom: `service check` * forecast: `query alert` * host: `service check` * integration: `query alert` or `service check` * live process: `process alert` * logs: `log alert` * metric: `query alert` * network: `service check` * outlier: `query alert` * process: `service check` * rum: `rum alert` * SLO: `slo alert` * watchdog: `event-v2 alert` * event-v2: `event-v2 alert` * audit: `audit alert` * error-tracking: `error-tracking alert` * database-monitoring: `database-monitoring alert` * network-performance: `network-performance alert` * cloud cost: `cost alert` **Notes** : * Synthetic monitors are created through the Synthetics API. See the [Synthetics API](https://docs.datadoghq.com/api/latest/synthetics/) documentation for more information. * Log monitors require an unscoped App Key. #### [Query Types](https://docs.datadoghq.com/api/latest/monitors/#query-types) ##### [Metric Alert Query](https://docs.datadoghq.com/api/latest/monitors/#metric-alert-query) Example: `time_aggr(time_window):space_aggr:metric{tags} [by {key}] operator #` * `time_aggr`: avg, sum, max, min, change, or pct_change * `time_window`: `last_#m` (with `#` between 1 and 10080 depending on the monitor type) or `last_#h`(with `#` between 1 and 168 depending on the monitor type) or `last_1d`, or `last_1w` * `space_aggr`: avg, sum, min, or max * `tags`: one or more tags (comma-separated), or * * `key`: a ‘key’ in key:value tag syntax; defines a separate alert for each tag in the group (multi-alert) * `operator`: <, <=, >, >=, ==, or != * `#`: an integer or decimal number used to set the threshold If you are using the `_change_` or `_pct_change_` time aggregator, instead use `change_aggr(time_aggr(time_window), timeshift):space_aggr:metric{tags} [by {key}] operator #` with: * `change_aggr` change, pct_change * `time_aggr` avg, sum, max, min [Learn more](https://docs.datadoghq.com/monitors/create/types/#define-the-conditions) * `time_window` last_#m (between 1 and 2880 depending on the monitor type), last_#h (between 1 and 48 depending on the monitor type), or last_#d (1 or 2) * `timeshift` #m_ago (5, 10, 15, or 30), #h_ago (1, 2, or 4), or 1d_ago Use this to create an outlier monitor using the following query: `avg(last_30m):outliers(avg:system.cpu.user{role:es-events-data} by {host}, 'dbscan', 7) > 0` ##### [Service Check Query](https://docs.datadoghq.com/api/latest/monitors/#service-check-query) Example: `"check".over(tags).last(count).by(group).count_by_status()` * `check` name of the check, for example `datadog.agent.up` * `tags` one or more quoted tags (comma-separated), or “*”. for example: `.over("env:prod", "role:db")`; `over` cannot be blank. * `count` must be at greater than or equal to your max threshold (defined in the `options`). It is limited to 100. 
For example, if you’ve specified to notify on 1 critical, 3 ok, and 2 warn statuses, `count` should be at least 3. * `group` must be specified for check monitors. Per-check grouping is already explicitly known for some service checks. For example, Postgres integration monitors are tagged by `db`, `host`, and `port`, and Network monitors by `host`, `instance`, and `url`. See [Service Checks](https://docs.datadoghq.com/api/latest/service-checks/) documentation for more information. ##### [Event Alert Query](https://docs.datadoghq.com/api/latest/monitors/#event-alert-query) **Note:** The Event Alert Query has been replaced by the Event V2 Alert Query. For more information, see the [Event Migration guide](https://docs.datadoghq.com/service_management/events/guides/migrating_to_new_events_features/). ##### [Event V2 Alert Query](https://docs.datadoghq.com/api/latest/monitors/#event-v2-alert-query) Example: `events(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg` and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. ##### [Process Alert Query](https://docs.datadoghq.com/api/latest/monitors/#process-alert-query) Example: `processes(search).over(tags).rollup('count').last(timeframe) operator #` * `search` free text search string for querying processes. Matching processes match results on the [Live Processes](https://docs.datadoghq.com/infrastructure/process/?tab=linuxwindows) page. * `tags` one or more tags (comma-separated) * `timeframe` the timeframe to roll up the counts. Examples: 10m, 4h. Supported timeframes: s, m, h and d * `operator` <, <=, >, >=, ==, or != * `#` an integer or decimal number used to set the threshold ##### [Logs Alert Query](https://docs.datadoghq.com/api/latest/monitors/#logs-alert-query) Example: `logs(query).index(index_name).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `index_name` For multi-index organizations, the log index in which the request is performed. * `rollup_method` The stats roll-up method - supports `count`, `avg` and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. ##### [Composite Query](https://docs.datadoghq.com/api/latest/monitors/#composite-query) Example: `12345 && 67890`, where `12345` and `67890` are the IDs of non-composite monitors * `name` [_required_ , _default_ = **dynamic, based on query**]: The name of the alert. * `message` [_required_ , _default_ = **dynamic, based on query**]: A message to include with notifications for this monitor. Email notifications can be sent to specific users by using the same ‘@username’ notation as events. * `tags` [_optional_ , _default_ = **empty list**]: A list of tags to associate with your monitor. 
When getting all monitor details via the API, use the `monitor_tags` argument to filter results by these tags. It is only available via the API and isn’t visible or editable in the Datadog UI. ##### [SLO Alert Query](https://docs.datadoghq.com/api/latest/monitors/#slo-alert-query) Example: `error_budget("slo_id").over("time_window") operator #` * `slo_id`: The alphanumeric SLO ID of the SLO you are configuring the alert for. * `time_window`: The time window of the SLO target you wish to alert on. Valid options: `7d`, `30d`, `90d`. * `operator`: `>=` or `>` ##### [Audit Alert Query](https://docs.datadoghq.com/api/latest/monitors/#audit-alert-query) Example: `audits(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg` and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. ##### [CI Pipelines Alert Query](https://docs.datadoghq.com/api/latest/monitors/#ci-pipelines-alert-query) Example: `ci-pipelines(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg`, and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. ##### [CI Tests Alert Query](https://docs.datadoghq.com/api/latest/monitors/#ci-tests-alert-query) Example: `ci-tests(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg`, and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. ##### [Error Tracking Alert Query](https://docs.datadoghq.com/api/latest/monitors/#error-tracking-alert-query) “New issue” example: `error-tracking(query).source(issue_source).new().rollup(rollup_method[, measure]).by(group_by).last(time_window) operator #` “High impact issue” example: `error-tracking(query).source(issue_source).impact().rollup(rollup_method[, measure]).by(group_by).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `issue_source` The issue source - supports `all`, `browser`, `mobile` and `backend` and defaults to `all` if omitted. * `rollup_method` The stats roll-up method - supports `count`, `avg`, and `cardinality` and defaults to `count` if omitted. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. 
* `group by` Comma-separated list of attributes to group by - should contain at least `issue.id`. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. **Database Monitoring Alert Query** Example: `database-monitoring(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg`, and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. **Network Performance Alert Query** Example: `network-performance(query).rollup(rollup_method[, measure]).last(time_window) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `rollup_method` The stats roll-up method - supports `count`, `avg`, and `cardinality`. * `measure` For `avg` and cardinality `rollup_method` - specify the measure or the facet name you want to use. * `time_window` #m (between 1 and 2880), #h (between 1 and 48). * `operator` `<`, `<=`, `>`, `>=`, `==`, or `!=`. * `#` an integer or decimal number used to set the threshold. **Cost Alert Query** Example: `formula(query).timeframe_type(time_window).function(parameter) operator #` * `query` The search query - following the [Log search syntax](https://docs.datadoghq.com/logs/search_syntax/). * `timeframe_type` The timeframe type to evaluate the cost - for `forecast` supports `current` - for `change`, `anomaly`, `threshold` supports `last` * `time_window` - supports daily roll-up e.g. `7d` * `function` - [optional, defaults to `threshold` monitor if omitted] supports `change`, `anomaly`, `forecast` * `parameter` Specify the parameter of the type * for `change`: * supports `relative`, `absolute` * [optional] supports `#`, where `#` is an integer or decimal number used to set the threshold * for `anomaly`: * supports `direction=both`, `direction=above`, `direction=below` * [optional] supports `threshold=#`, where `#` is an integer or decimal number used to set the threshold * `operator` * for `threshold` supports `<`, `<=`, `>`, `>=`, `==`, or `!=` * for `change` supports `>`, `<` * for `anomaly` supports `>=` * for `forecast` supports `>` * `#` an integer or decimal number used to set the threshold. This endpoint requires the `monitors_write` permission. OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Request #### Body Data (required) Create a monitor request body. * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. 
IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. 
groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. 
renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. 
limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. 
object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ##### Create a Cost Monitor returns "OK" response ``` { "name": "Example Monitor", "type": "cost alert", "query": "formula(\"exclude_null(query1)\").last(\"7d\").anomaly(direction=\"above\", threshold=10) >= 5", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "thresholds": { "critical": 5, "warning": 3 }, "variables": [ { "data_source": "cloud_cost", "query": "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)", "name": "query1", "aggregator": "sum" } ], "include_tags": true } } ``` Copy ##### Create a Data Quality monitor returns "OK" response ``` { "name": "Example-Monitor", "type": "data-quality alert", "query": "formula(\"query1\").last(\"5m\") > 100", "message": "Data quality alert triggered", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "thresholds": { "critical": 100 }, "variables": [ { "name": "query1", "data_source": "data_quality_metrics", "measure": "row_count", "filter": "search for column where `database:production AND table:users`", "group_by": [ "entity_id" ] } ] } } ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` { "name": "Example-Monitor", "type": "rum alert", "query": "formula(\"query2 / query1 * 100\").last(\"15m\") >= 0.8", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "thresholds": { "critical": 0.8 }, "variables": [ { "data_source": "rum", "name": "query2", "search": { "query": "" }, "indexes": [ "*" ], "compute": { "aggregation": "count" }, "group_by": [] }, { "data_source": "rum", "name": "query1", "search": { "query": "status:error" }, "indexes": [ "*" ], "compute": { "aggregation": "count" }, "group_by": [] } ] } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitor-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitor-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. 
Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. 
This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. 
For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. 
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. 
Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", 
"variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) ##### Create a Cost Monitor returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "options": { "no_data_timeframe": 20, "notify_no_data": true }, "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} \u003e 100", "tags": [ "app:webserver", "frontend" ], "type": "query alert" } EOF ``` ##### Create a Data Quality monitor returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor" \ -H "Accept: application/json" \ -H "Content-Type: 
application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "options": { "no_data_timeframe": 20, "notify_no_data": true }, "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} \u003e 100", "tags": [ "app:webserver", "frontend" ], "type": "query alert" } EOF ``` ##### Create a RUM formula and functions monitor returns "OK" response Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "options": { "no_data_timeframe": 20, "notify_no_data": true }, "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} \u003e 100", "tags": [ "app:webserver", "frontend" ], "type": "query alert" } EOF ``` ##### Create a Cost Monitor returns "OK" response ``` // Create a Cost Monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Monitor{ Name: datadog.PtrString("Example Monitor"), Type: datadogV1.MONITORTYPE_COST_ALERT, Query: `formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5`, Message: datadog.PtrString("some message Notify: @hipchat-channel"), Tags: []string{ "test:examplemonitor", "env:ci", }, Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)), Options: &datadogV1.MonitorOptions{ Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(5), Warning: *datadog.NewNullableFloat64(datadog.PtrFloat64(3)), }, Variables: []datadogV1.MonitorFormulaAndFunctionQueryDefinition{ datadogV1.MonitorFormulaAndFunctionQueryDefinition{ MonitorFormulaAndFunctionCostQueryDefinition: &datadogV1.MonitorFormulaAndFunctionCostQueryDefinition{ DataSource: datadogV1.MONITORFORMULAANDFUNCTIONCOSTDATASOURCE_CLOUD_COST, Query: "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)", Name: "query1", Aggregator: datadogV1.MONITORFORMULAANDFUNCTIONCOSTAGGREGATOR_SUM.Ptr(), }}, }, IncludeTags: datadog.PtrBool(true), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitor(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitor`:\n%s\n", responseContent) } ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` // Create a RUM formula and functions monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Monitor{ Name: datadog.PtrString("Example-Monitor"), Type: datadogV1.MONITORTYPE_RUM_ALERT, Query: `formula("query2 / query1 * 100").last("15m") >= 0.8`, Message: datadog.PtrString("some message Notify: @hipchat-channel"), Tags: []string{ "test:examplemonitor", "env:ci", }, Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)), Options: &datadogV1.MonitorOptions{ Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(0.8), }, Variables: []datadogV1.MonitorFormulaAndFunctionQueryDefinition{ datadogV1.MonitorFormulaAndFunctionQueryDefinition{ MonitorFormulaAndFunctionEventQueryDefinition: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinition{ DataSource: datadogV1.MONITORFORMULAANDFUNCTIONEVENTSDATASOURCE_RUM, Name: "query2", Search: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionSearch{ Query: "", }, Indexes: []string{ "*", }, Compute: datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionCompute{ Aggregation: datadogV1.MONITORFORMULAANDFUNCTIONEVENTAGGREGATION_COUNT, }, GroupBy: []datadogV1.MonitorFormulaAndFunctionEventQueryGroupBy{}, }}, datadogV1.MonitorFormulaAndFunctionQueryDefinition{ MonitorFormulaAndFunctionEventQueryDefinition: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinition{ DataSource: datadogV1.MONITORFORMULAANDFUNCTIONEVENTSDATASOURCE_RUM, Name: "query1", Search: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionSearch{ Query: "status:error", }, Indexes: []string{ "*", }, Compute: datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionCompute{ Aggregation: datadogV1.MONITORFORMULAANDFUNCTIONEVENTAGGREGATION_COUNT, }, GroupBy: []datadogV1.MonitorFormulaAndFunctionEventQueryGroupBy{}, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitor(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitor`:\n%s\n", responseContent) } ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` // Create a ci-pipelines formula and functions monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Monitor{ Name: datadog.PtrString("Example-Monitor"), Type: datadogV1.MONITORTYPE_CI_PIPELINES_ALERT, Query: `formula("query1 / query2 * 100").last("15m") >= 0.8`, Message: datadog.PtrString("some message Notify: @hipchat-channel"), Tags: []string{ "test:examplemonitor", "env:ci", }, Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)), Options: &datadogV1.MonitorOptions{ Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(0.8), }, Variables: []datadogV1.MonitorFormulaAndFunctionQueryDefinition{ datadogV1.MonitorFormulaAndFunctionQueryDefinition{ MonitorFormulaAndFunctionEventQueryDefinition: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinition{ DataSource: datadogV1.MONITORFORMULAANDFUNCTIONEVENTSDATASOURCE_CI_PIPELINES, Name: "query1", Search: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionSearch{ Query: 
"@ci.status:error", }, Indexes: []string{ "*", }, Compute: datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionCompute{ Aggregation: datadogV1.MONITORFORMULAANDFUNCTIONEVENTAGGREGATION_COUNT, }, GroupBy: []datadogV1.MonitorFormulaAndFunctionEventQueryGroupBy{}, }}, datadogV1.MonitorFormulaAndFunctionQueryDefinition{ MonitorFormulaAndFunctionEventQueryDefinition: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinition{ DataSource: datadogV1.MONITORFORMULAANDFUNCTIONEVENTSDATASOURCE_CI_PIPELINES, Name: "query2", Search: &datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionSearch{ Query: "", }, Indexes: []string{ "*", }, Compute: datadogV1.MonitorFormulaAndFunctionEventQueryDefinitionCompute{ Aggregation: datadogV1.MONITORFORMULAANDFUNCTIONEVENTAGGREGATION_COUNT, }, GroupBy: []datadogV1.MonitorFormulaAndFunctionEventQueryGroupBy{}, }}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitor(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Cost Monitor returns "OK" response ``` // Create a Cost Monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionCostAggregator; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionCostDataSource; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionCostQueryDefinition; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionQueryDefinition; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); Monitor body = new Monitor() .name("Example Monitor") .type(MonitorType.COST_ALERT) .query( """ formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .thresholds(new MonitorThresholds().critical(5.0).warning(3.0)) .variables( Collections.singletonList( new MonitorFormulaAndFunctionQueryDefinition( new MonitorFormulaAndFunctionCostQueryDefinition() .dataSource(MonitorFormulaAndFunctionCostDataSource.CLOUD_COST) .query( "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product" + " IN (amplify ,athena, backup, bedrock ) } by" + " {aws_product}.rollup(sum, 86400)") 
.name("query1") .aggregator(MonitorFormulaAndFunctionCostAggregator.SUM)))) .includeTags(true)); try { Monitor result = apiInstance.createMonitor(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` // Create a RUM formula and functions monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventAggregation; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinition; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinitionCompute; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinitionSearch; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventsDataSource; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionQueryDefinition; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); Monitor body = new Monitor() .name("Example-Monitor") .type(MonitorType.RUM_ALERT) .query(""" formula("query2 / query1 * 100").last("15m") >= 0.8 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .thresholds(new MonitorThresholds().critical(0.8)) .variables( Arrays.asList( new MonitorFormulaAndFunctionQueryDefinition( new MonitorFormulaAndFunctionEventQueryDefinition() .dataSource(MonitorFormulaAndFunctionEventsDataSource.RUM) .name("query2") .search( new MonitorFormulaAndFunctionEventQueryDefinitionSearch() .query("")) .indexes(Collections.singletonList("*")) .compute( new MonitorFormulaAndFunctionEventQueryDefinitionCompute() .aggregation( MonitorFormulaAndFunctionEventAggregation.COUNT))), new MonitorFormulaAndFunctionQueryDefinition( new MonitorFormulaAndFunctionEventQueryDefinition() .dataSource(MonitorFormulaAndFunctionEventsDataSource.RUM) .name("query1") .search( new MonitorFormulaAndFunctionEventQueryDefinitionSearch() .query("status:error")) .indexes(Collections.singletonList("*")) .compute( new MonitorFormulaAndFunctionEventQueryDefinitionCompute() .aggregation( MonitorFormulaAndFunctionEventAggregation .COUNT)))))); try { Monitor result = apiInstance.createMonitor(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` // Create a ci-pipelines formula and functions monitor returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventAggregation; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinition; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinitionCompute; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventQueryDefinitionSearch; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionEventsDataSource; import com.datadog.api.client.v1.model.MonitorFormulaAndFunctionQueryDefinition; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); Monitor body = new Monitor() .name("Example-Monitor") .type(MonitorType.CI_PIPELINES_ALERT) .query(""" formula("query1 / query2 * 100").last("15m") >= 0.8 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .thresholds(new MonitorThresholds().critical(0.8)) .variables( Arrays.asList( new MonitorFormulaAndFunctionQueryDefinition( new MonitorFormulaAndFunctionEventQueryDefinition() .dataSource( MonitorFormulaAndFunctionEventsDataSource.CI_PIPELINES) .name("query1") .search( new MonitorFormulaAndFunctionEventQueryDefinitionSearch() .query("@ci.status:error")) .indexes(Collections.singletonList("*")) .compute( new MonitorFormulaAndFunctionEventQueryDefinitionCompute() .aggregation( MonitorFormulaAndFunctionEventAggregation.COUNT))), new MonitorFormulaAndFunctionQueryDefinition( new MonitorFormulaAndFunctionEventQueryDefinition() .dataSource( MonitorFormulaAndFunctionEventsDataSource.CI_PIPELINES) .name("query2") .search( new MonitorFormulaAndFunctionEventQueryDefinitionSearch() .query("")) .indexes(Collections.singletonList("*")) .compute( new MonitorFormulaAndFunctionEventQueryDefinitionCompute() .aggregation( MonitorFormulaAndFunctionEventAggregation .COUNT)))))); try { Monitor result = apiInstance.createMonitor(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a monitor returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Create a new monitor monitor_options = { "notify_no_data": True, "no_data_timeframe": 20 } tags = ["app:webserver", "frontend"] api.Monitor.create( type="query alert", query="avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", name="Bytes received on host0", 
message="We may need to add web hosts if this is consistently high.", tags=tags, options=monitor_options ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Create a Cost Monitor returns "OK" response ``` """ Create a Cost Monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_formula_and_function_cost_aggregator import ( MonitorFormulaAndFunctionCostAggregator, ) from datadog_api_client.v1.model.monitor_formula_and_function_cost_data_source import ( MonitorFormulaAndFunctionCostDataSource, ) from datadog_api_client.v1.model.monitor_formula_and_function_cost_query_definition import ( MonitorFormulaAndFunctionCostQueryDefinition, ) from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type import MonitorType body = Monitor( name="Example Monitor", type=MonitorType.COST_ALERT, query='formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( thresholds=MonitorThresholds( critical=5.0, warning=3.0, ), variables=[ MonitorFormulaAndFunctionCostQueryDefinition( data_source=MonitorFormulaAndFunctionCostDataSource.CLOUD_COST, query="sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)", name="query1", aggregator=MonitorFormulaAndFunctionCostAggregator.SUM, ), ], include_tags=True, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor(body=body) print(response) ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` """ Create a RUM formula and functions monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_formula_and_function_event_aggregation import ( MonitorFormulaAndFunctionEventAggregation, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition import ( MonitorFormulaAndFunctionEventQueryDefinition, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition_compute import ( MonitorFormulaAndFunctionEventQueryDefinitionCompute, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition_search import ( MonitorFormulaAndFunctionEventQueryDefinitionSearch, ) from datadog_api_client.v1.model.monitor_formula_and_function_events_data_source import ( MonitorFormulaAndFunctionEventsDataSource, ) from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type 
import MonitorType body = Monitor( name="Example-Monitor", type=MonitorType.RUM_ALERT, query='formula("query2 / query1 * 100").last("15m") >= 0.8', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( thresholds=MonitorThresholds( critical=0.8, ), variables=[ MonitorFormulaAndFunctionEventQueryDefinition( data_source=MonitorFormulaAndFunctionEventsDataSource.RUM, name="query2", search=MonitorFormulaAndFunctionEventQueryDefinitionSearch( query="", ), indexes=[ "*", ], compute=MonitorFormulaAndFunctionEventQueryDefinitionCompute( aggregation=MonitorFormulaAndFunctionEventAggregation.COUNT, ), group_by=[], ), MonitorFormulaAndFunctionEventQueryDefinition( data_source=MonitorFormulaAndFunctionEventsDataSource.RUM, name="query1", search=MonitorFormulaAndFunctionEventQueryDefinitionSearch( query="status:error", ), indexes=[ "*", ], compute=MonitorFormulaAndFunctionEventQueryDefinitionCompute( aggregation=MonitorFormulaAndFunctionEventAggregation.COUNT, ), group_by=[], ), ], ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor(body=body) print(response) ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` """ Create a ci-pipelines formula and functions monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_formula_and_function_event_aggregation import ( MonitorFormulaAndFunctionEventAggregation, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition import ( MonitorFormulaAndFunctionEventQueryDefinition, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition_compute import ( MonitorFormulaAndFunctionEventQueryDefinitionCompute, ) from datadog_api_client.v1.model.monitor_formula_and_function_event_query_definition_search import ( MonitorFormulaAndFunctionEventQueryDefinitionSearch, ) from datadog_api_client.v1.model.monitor_formula_and_function_events_data_source import ( MonitorFormulaAndFunctionEventsDataSource, ) from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type import MonitorType body = Monitor( name="Example-Monitor", type=MonitorType.CI_PIPELINES_ALERT, query='formula("query1 / query2 * 100").last("15m") >= 0.8', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( thresholds=MonitorThresholds( critical=0.8, ), variables=[ MonitorFormulaAndFunctionEventQueryDefinition( data_source=MonitorFormulaAndFunctionEventsDataSource.CI_PIPELINES, name="query1", search=MonitorFormulaAndFunctionEventQueryDefinitionSearch( query="@ci.status:error", ), indexes=[ "*", ], compute=MonitorFormulaAndFunctionEventQueryDefinitionCompute( aggregation=MonitorFormulaAndFunctionEventAggregation.COUNT, ), group_by=[], ), MonitorFormulaAndFunctionEventQueryDefinition( data_source=MonitorFormulaAndFunctionEventsDataSource.CI_PIPELINES, name="query2", search=MonitorFormulaAndFunctionEventQueryDefinitionSearch( query="", ), indexes=[ "*", ], compute=MonitorFormulaAndFunctionEventQueryDefinitionCompute( 
aggregation=MonitorFormulaAndFunctionEventAggregation.COUNT, ), group_by=[], ), ], ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a monitor returns "OK" response ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Create a new monitor options = { 'notify_no_data' => true, 'no_data_timeframe' => 20 } tags = ['app:webserver', 'frontend'] dog.monitor( 'query alert', 'avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100', name: 'Bytes received on host0', message: 'We may need to add web hosts if this is consistently high.', tags: tags, options: options ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Cost Monitor returns "OK" response ``` # Create a Cost Monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new body = DatadogAPIClient::V1::Monitor.new({ name: "Example Monitor", type: DatadogAPIClient::V1::MonitorType::COST_ALERT, query: 'formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 5, warning: 3, }), variables: [ DatadogAPIClient::V1::MonitorFormulaAndFunctionCostQueryDefinition.new({ data_source: DatadogAPIClient::V1::MonitorFormulaAndFunctionCostDataSource::CLOUD_COST, query: "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)", name: "query1", aggregator: DatadogAPIClient::V1::MonitorFormulaAndFunctionCostAggregator::SUM, }), ], include_tags: true, }), }) p api_instance.create_monitor(body) ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` # Create a RUM formula and functions monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new body = DatadogAPIClient::V1::Monitor.new({ name: "Example-Monitor", type: DatadogAPIClient::V1::MonitorType::RUM_ALERT, query: 'formula("query2 / query1 * 100").last("15m") >= 0.8', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 0.8, }), variables: [ DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinition.new({ data_source: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventsDataSource::RUM, name: "query2", search: 
DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionSearch.new({ query: "", }), indexes: [ "*", ], compute: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionCompute.new({ aggregation: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventAggregation::COUNT, }), group_by: [], }), DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinition.new({ data_source: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventsDataSource::RUM, name: "query1", search: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionSearch.new({ query: "status:error", }), indexes: [ "*", ], compute: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionCompute.new({ aggregation: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventAggregation::COUNT, }), group_by: [], }), ], }), }) p api_instance.create_monitor(body) ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` # Create a ci-pipelines formula and functions monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new body = DatadogAPIClient::V1::Monitor.new({ name: "Example-Monitor", type: DatadogAPIClient::V1::MonitorType::CI_PIPELINES_ALERT, query: 'formula("query1 / query2 * 100").last("15m") >= 0.8', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 0.8, }), variables: [ DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinition.new({ data_source: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventsDataSource::CI_PIPELINES, name: "query1", search: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionSearch.new({ query: "@ci.status:error", }), indexes: [ "*", ], compute: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionCompute.new({ aggregation: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventAggregation::COUNT, }), group_by: [], }), DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinition.new({ data_source: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventsDataSource::CI_PIPELINES, name: "query2", search: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionSearch.new({ query: "", }), indexes: [ "*", ], compute: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventQueryDefinitionCompute.new({ aggregation: DatadogAPIClient::V1::MonitorFormulaAndFunctionEventAggregation::COUNT, }), group_by: [], }), ], }), }) p api_instance.create_monitor(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Cost Monitor returns "OK" response ``` // Create a Cost Monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionCostAggregator; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionCostDataSource; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionCostQueryDefinition; use 
datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionQueryDefinition; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; #[tokio::main] async fn main() { let body = Monitor::new( r#"formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5"#.to_string(), MonitorType::COST_ALERT, ) .message("some message Notify: @hipchat-channel".to_string()) .name("Example Monitor".to_string()) .options( MonitorOptions::new() .include_tags(true) .thresholds(MonitorThresholds::new().critical(5.0 as f64).warning(Some(3.0 as f64))) .variables( vec![ MonitorFormulaAndFunctionQueryDefinition::MonitorFormulaAndFunctionCostQueryDefinition( Box::new( MonitorFormulaAndFunctionCostQueryDefinition::new( MonitorFormulaAndFunctionCostDataSource::CLOUD_COST, "query1".to_string(), "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)".to_string(), ).aggregator(MonitorFormulaAndFunctionCostAggregator::SUM), ), ) ], ), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` // Create a RUM formula and functions monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventAggregation; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinition; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinitionCompute; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinitionSearch; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventsDataSource; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionQueryDefinition; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; #[tokio::main] async fn main() { let body = Monitor::new(r#"formula("query2 / query1 * 100").last("15m") >= 0.8"#.to_string(), MonitorType::RUM_ALERT) .message("some message Notify: @hipchat-channel".to_string()) .name("Example-Monitor".to_string()) .options( MonitorOptions::new() .thresholds(MonitorThresholds::new().critical(0.8 as f64)) .variables( vec![ MonitorFormulaAndFunctionQueryDefinition::MonitorFormulaAndFunctionEventQueryDefinition( Box::new( MonitorFormulaAndFunctionEventQueryDefinition::new( MonitorFormulaAndFunctionEventQueryDefinitionCompute::new( MonitorFormulaAndFunctionEventAggregation::COUNT, ), MonitorFormulaAndFunctionEventsDataSource::RUM, "query2".to_string(), ) .group_by(vec![]) .indexes(vec!["*".to_string()]) .search( MonitorFormulaAndFunctionEventQueryDefinitionSearch::new("".to_string()), ), ), ), MonitorFormulaAndFunctionQueryDefinition::MonitorFormulaAndFunctionEventQueryDefinition( Box::new( MonitorFormulaAndFunctionEventQueryDefinition::new( MonitorFormulaAndFunctionEventQueryDefinitionCompute::new( 
MonitorFormulaAndFunctionEventAggregation::COUNT, ), MonitorFormulaAndFunctionEventsDataSource::RUM, "query1".to_string(), ) .group_by(vec![]) .indexes(vec!["*".to_string()]) .search( MonitorFormulaAndFunctionEventQueryDefinitionSearch::new( "status:error".to_string(), ), ), ), ) ], ), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` // Create a ci-pipelines formula and functions monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventAggregation; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinition; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinitionCompute; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventQueryDefinitionSearch; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionEventsDataSource; use datadog_api_client::datadogV1::model::MonitorFormulaAndFunctionQueryDefinition; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; #[tokio::main] async fn main() { let body = Monitor::new( r#"formula("query1 / query2 * 100").last("15m") >= 0.8"#.to_string(), MonitorType::CI_PIPELINES_ALERT, ) .message("some message Notify: @hipchat-channel".to_string()) .name("Example-Monitor".to_string()) .options( MonitorOptions::new() .thresholds(MonitorThresholds::new().critical(0.8 as f64)) .variables( vec![ MonitorFormulaAndFunctionQueryDefinition::MonitorFormulaAndFunctionEventQueryDefinition( Box::new( MonitorFormulaAndFunctionEventQueryDefinition::new( MonitorFormulaAndFunctionEventQueryDefinitionCompute::new( MonitorFormulaAndFunctionEventAggregation::COUNT, ), MonitorFormulaAndFunctionEventsDataSource::CI_PIPELINES, "query1".to_string(), ) .group_by(vec![]) .indexes(vec!["*".to_string()]) .search( MonitorFormulaAndFunctionEventQueryDefinitionSearch::new( "@ci.status:error".to_string(), ), ), ), ), MonitorFormulaAndFunctionQueryDefinition::MonitorFormulaAndFunctionEventQueryDefinition( Box::new( MonitorFormulaAndFunctionEventQueryDefinition::new( MonitorFormulaAndFunctionEventQueryDefinitionCompute::new( MonitorFormulaAndFunctionEventAggregation::COUNT, ), MonitorFormulaAndFunctionEventsDataSource::CI_PIPELINES, "query2".to_string(), ) .group_by(vec![]) .indexes(vec!["*".to_string()]) .search( MonitorFormulaAndFunctionEventQueryDefinitionSearch::new("".to_string()), ), ), ) ], ), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Cost Monitor returns "OK" response ``` /** * Create a Cost Monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); const params: v1.MonitorsApiCreateMonitorRequest = { body: { name: "Example Monitor", type: "cost alert", query: `formula("exclude_null(query1)").last("7d").anomaly(direction="above", threshold=10) >= 5`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { thresholds: { critical: 5, warning: 3, }, variables: [ { dataSource: "cloud_cost", query: "sum:aws.cost.net.amortized.shared.resources.allocated{aws_product IN (amplify ,athena, backup, bedrock ) } by {aws_product}.rollup(sum, 86400)", name: "query1", aggregator: "sum", }, ], includeTags: true, }, }, }; apiInstance .createMonitor(params) .then((data: v1.Monitor) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a RUM formula and functions monitor returns "OK" response ``` /** * Create a RUM formula and functions monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); const params: v1.MonitorsApiCreateMonitorRequest = { body: { name: "Example-Monitor", type: "rum alert", query: `formula("query2 / query1 * 100").last("15m") >= 0.8`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { thresholds: { critical: 0.8, }, variables: [ { dataSource: "rum", name: "query2", search: { query: "", }, indexes: ["*"], compute: { aggregation: "count", }, groupBy: [], }, { dataSource: "rum", name: "query1", search: { query: "status:error", }, indexes: ["*"], compute: { aggregation: "count", }, groupBy: [], }, ], }, }, }; apiInstance .createMonitor(params) .then((data: v1.Monitor) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a ci-pipelines formula and functions monitor returns "OK" response ``` /** * Create a ci-pipelines formula and functions monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); const params: v1.MonitorsApiCreateMonitorRequest = { body: { name: "Example-Monitor", type: "ci-pipelines alert", query: `formula("query1 / query2 * 100").last("15m") >= 0.8`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { thresholds: { critical: 0.8, }, variables: [ { dataSource: "ci_pipelines", name: "query1", search: { query: "@ci.status:error", }, indexes: ["*"], compute: { aggregation: "count", }, groupBy: [], }, { dataSource: "ci_pipelines", name: "query2", search: { query: "", }, indexes: ["*"], compute: { aggregation: "count", }, groupBy: [], }, ], }, }, }; apiInstance .createMonitor(params) .then((data: v1.Monitor) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Monitors search](https://docs.datadoghq.com/api/latest/monitors/#monitors-search) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#monitors-search-v1) GET https://api.ap1.datadoghq.com/api/v1/monitor/searchhttps://api.ap2.datadoghq.com/api/v1/monitor/searchhttps://api.datadoghq.eu/api/v1/monitor/searchhttps://api.ddog-gov.com/api/v1/monitor/searchhttps://api.datadoghq.com/api/v1/monitor/searchhttps://api.us3.datadoghq.com/api/v1/monitor/searchhttps://api.us5.datadoghq.com/api/v1/monitor/search ### Overview Search and filter your monitors details. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Query Strings Name Type Description query string After entering a search query in your [Manage Monitor page](https://app.datadoghq.com/monitors/manage) use the query parameter value in the URL of the page as value for this parameter. Consult the dedicated [manage monitor documentation](https://docs.datadoghq.com/monitors/manage/#find-the-monitors) page to learn more. The query can contain any number of space-separated monitor attributes, for instance `query="type:metric status:alert"`. page integer Page to start paginating from. per_page integer Number of monitors to return per page. sort string String for sort order, composed of field and sort order separate by a comma, for example `name,asc`. Supported sort directions: `asc`, `desc`. Supported fields: * `name` * `status` * `tags` ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitors-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitors-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitors-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitors-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) The response from a monitor search. Field Type Description counts object The counts of monitors per different criteria. muted [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. status [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. tag [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. type [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. metadata object Metadata about the response. page int64 The page to start paginating from. page_count int64 The number of pages. per_page int64 The number of monitors to return per page. total_count int64 The total number of monitors. monitors [object] The list of found monitors. classification string Classification of the monitor. creator object Object describing the creator of the shared element. email string Email of the creator. 
handle string Handle of the creator. name string Name of the creator. id int64 ID of the monitor. last_triggered_ts int64 Latest timestamp the monitor triggered. metrics [string] Metrics used by the monitor. name string The monitor name. notifications [object] The notification triggered by the monitor. handle string The email address that received the notification. name string The username receiving the notification org_id int64 The ID of the organization. quality_issues [string] Quality issues detected with the monitor. query string The monitor query. scopes [string] The scope(s) to which the downtime applies, for example `host:app2`. Provide multiple scopes as a comma-separated list, for example `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (that is `env:dev AND env:prod`), NOT any of them. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated with the monitor. type enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "counts": { "muted": [ { "count": 3, "name": false }, { "count": 3, "name": true } ], "status": [ { "count": 4, "name": "No Data" }, { "count": 2, "name": "OK" } ], "tag": [ { "count": 6, "name": "service:cassandra" } ], "type": [ { "count": 6, "name": "metric" } ] }, "metadata": { "page": 0, "page_count": 6, "per_page": 30, "total_count": 6 }, "monitors": [ { "classification": "metric", "creator": { "handle": "john@datadoghq.com", "name": "John Doe" }, "id": 2699850, "last_triggered_ts": null, "metrics": [ "system.cpu.user" ], "name": "Cassandra CPU is high on {{host.name}} in {{availability-zone.name}}", "notifications": [ { "handle": "jane@datadoghq.com", "name": "Jane Doe" } ], "org_id": 1234, "quality_issues": [ "broken_at_handle", "noisy_monitor" ], "scopes": [ "!availability-zone:us-east-1c", "name:cassandra" ], "status": "No Data", "tags": [ "service:cassandra" ], "type": "query alert" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
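A supplementary curl sketch for the search query strings documented above; the `query` and `sort` values are illustrative only, and the query is URL-encoded (assuming the `datadoghq.com` site):

```
# Sketch only: metric-type monitors currently in Alert, sorted by name ascending
curl -X GET "https://api.datadoghq.com/api/v1/monitor/search?query=type%3Ametric%20status%3Aalert&sort=name%2Casc&page=0&per_page=30" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The generated client examples below call the same endpoint without parameters.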
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Monitors search Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/search" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Monitors search ``` """ Monitors search returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.search_monitors() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Monitors search ``` # Monitors search returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new p api_instance.search_monitors() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Monitors search ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Search monitors dog.search_monitors # Examples of possible query parameters: # dog.search_monitors(query="id:7100311") # dog.search_monitors(query="title:foo metric:system.core.idle status:Alert") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Monitors search ``` // Monitors search returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.SearchMonitors(ctx, *datadogV1.NewSearchMonitorsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.SearchMonitors`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.SearchMonitors`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Monitors search ``` // Monitors search returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.MonitorSearchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { MonitorSearchResponse result = apiInstance.searchMonitors(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#searchMonitors"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Monitors search ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Search monitors api.Monitor.search() # Examples of possible query parameters: # api.Monitor.search(query="id:7100311") # api.Monitor.search(query="title:foo metric:system.core.idle status:Alert") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Monitors search ``` // Monitors search returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::api_monitors::SearchMonitorsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .search_monitors(SearchMonitorsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Monitors search ``` /** * Monitors search returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); apiInstance .searchMonitors() .then((data: v1.MonitorSearchResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor-v1) POST https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}/unmutehttps://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}/unmutehttps://api.datadoghq.eu/api/v1/monitor/{monitor_id}/unmutehttps://api.ddog-gov.com/api/v1/monitor/{monitor_id}/unmutehttps://api.datadoghq.com/api/v1/monitor/{monitor_id}/unmutehttps://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}/unmutehttps://api.us5.datadoghq.com/api/v1/monitor/{monitor_id}/unmute ### Overview Unmute the specified monitor. This endpoint requires the `monitors_write` permission. OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The id of the monitor #### Query Strings Name Type Description scope string The scope to apply the mute to. For example, if your alert is grouped by `{host}`, you might mute `host:app1`. all_scopes boolean Clear muting across all scopes. Default is `false`. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UnmuteMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#UnmuteMonitor-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#UnmuteMonitor-401-v1) * [404](https://docs.datadoghq.com/api/latest/monitors/#UnmuteMonitor-404-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#UnmuteMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. 
For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. 
**Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. 
If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. 
order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. 
name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Authentication error

* [Model](https://docs.datadoghq.com/api/latest/monitors/)
* [Example](https://docs.datadoghq.com/api/latest/monitors/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Monitor Not Found error

* [Model](https://docs.datadoghq.com/api/latest/monitors/)
* [Example](https://docs.datadoghq.com/api/latest/monitors/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/monitors/)
* [Example](https://docs.datadoghq.com/api/latest/monitors/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy)

##### Unmute a monitor

```
# Path parameters
export monitor_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed)
curl -X POST "https://api.datadoghq.com/api/v1/monitor/${monitor_id}/unmute" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Unmute a monitor

```
require 'dogapi'

api_key = ''
app_key = ''

dog = Dogapi::Client.new(api_key, app_key)

# Unmute an alert
dog.unmute_monitor(62_628)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Unmute a monitor

```
from datadog import initialize, api

options = {
    'api_key': '',
    'app_key': ''
}

initialize(**options)

# Unmute the monitor
api.Monitor.unmute(2088)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy), then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py"
```

* * *

## [Get all monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors-v1)

GET https://api.ap1.datadoghq.com/api/v1/monitor https://api.ap2.datadoghq.com/api/v1/monitor https://api.datadoghq.eu/api/v1/monitor https://api.ddog-gov.com/api/v1/monitor https://api.datadoghq.com/api/v1/monitor https://api.us3.datadoghq.com/api/v1/monitor https://api.us5.datadoghq.com/api/v1/monitor

### Overview

Get all monitors from your organization. This endpoint requires the `monitors_read` permission.
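For example, large organizations can page through the monitor list rather than fetching everything at once. The sketch below uses the Python client and assumes its `list_monitors` keyword arguments (`monitor_tags`, `page`, `page_size`) mirror the query strings documented under Arguments below; adjust the names if your client version differs.

```
"""
Minimal sketch: page through monitors 100 at a time, filtered by a service tag.
Assumes list_monitors accepts keyword arguments named after the documented
query strings (monitor_tags, page, page_size).
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.monitors_api import MonitorsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MonitorsApi(api_client)
    page = 0
    while True:
        monitors = api_instance.list_monitors(
            monitor_tags="service:my-app",  # filter by service/custom tags
            page=page,
            page_size=100,
        )
        if not monitors:
            break
        for monitor in monitors:
            print(monitor.id, monitor.name)
        page += 1
```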
OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Query Strings Name Type Description group_states string When specified, shows additional information about the group states. Choose one or more from `all`, `alert`, `warn`, and `no data`. name string A string to filter monitors by name. tags string A comma separated list indicating what tags, if any, should be used to filter the list of monitors by scope. For example, `host:host0`. monitor_tags string A comma separated list indicating what service and/or custom tags, if any, should be used to filter the list of monitors. Tags created in the Datadog UI automatically have the service key prepended. For example, `service:my-app`. with_downtimes boolean If this argument is set to true, then the returned data includes all current active downtimes for each monitor. id_offset integer Use this parameter for paginating through large sets of monitors. Start with a value of zero, make a request, set the value to the last ID of result set, and then repeat until the response is empty. page integer The page to start paginating from. If this argument is not specified, the request returns all monitors without pagination. page_size integer The number of monitors to return per page. If the page argument is not specified, the default behavior returns all monitors without a `page_size` limit. However, if page is specified and `page_size` is not, the argument defaults to 100. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#ListMonitors-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#ListMonitors-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#ListMonitors-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#ListMonitors-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) An array of monitor objects. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. 
Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. 
For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. 
timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. 
Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. 
Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/monitors/)
* [Example](https://docs.datadoghq.com/api/latest/monitors/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy)

##### Get all monitors

```
# Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed)
curl -X GET "https://api.datadoghq.com/api/v1/monitor" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get all monitors

```
"""
Get all monitors returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.monitors_api import MonitorsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MonitorsApi(api_client)
    response = api_instance.list_monitors()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get all monitors

```
# Get all monitors returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::MonitorsAPI.new
p api_instance.list_monitors()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all monitors

```
require 'dogapi'

api_key = ''
app_key = ''

dog = Dogapi::Client.new(api_key, app_key)

# Get all monitor details
dog.get_all_monitors
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all monitors

``` // Get all monitors returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx :=
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.ListMonitors(ctx, *datadogV1.NewListMonitorsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ListMonitors`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ListMonitors`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all monitors ``` // Get all monitors returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { List result = apiInstance.listMonitors(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#listMonitors"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all monitors ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Get all monitor details print(api.Monitor.get_all()) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get all monitors ``` // Get all monitors returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::ListMonitorsOptionalParams; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .list_monitors(ListMonitorsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
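# Set DD_SITE to your Datadog site (for example datadoghq.com, datadoghq.eu, or us3.datadoghq.com).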
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all monitors ``` /** * Get all monitors returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); apiInstance .listMonitors() .then((data: v1.Monitor[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Monitors group search](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search-v1) GET https://api.ap1.datadoghq.com/api/v1/monitor/groups/searchhttps://api.ap2.datadoghq.com/api/v1/monitor/groups/searchhttps://api.datadoghq.eu/api/v1/monitor/groups/searchhttps://api.ddog-gov.com/api/v1/monitor/groups/searchhttps://api.datadoghq.com/api/v1/monitor/groups/searchhttps://api.us3.datadoghq.com/api/v1/monitor/groups/searchhttps://api.us5.datadoghq.com/api/v1/monitor/groups/search ### Overview Search and filter your monitor groups details. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Query Strings Name Type Description query string After entering a search query on the [Triggered Monitors page](https://app.datadoghq.com/monitors/triggered), use the query parameter value in the URL of the page as a value for this parameter. For more information, see the [Manage Monitors documentation](https://docs.datadoghq.com/monitors/manage/#triggered-monitors). The query can contain any number of space-separated monitor attributes, for instance: `query="type:metric group_status:alert"`. page integer Page to start paginating from. per_page integer Number of monitors to return per page. sort string String for sort order, composed of field and sort order separate by a comma, for example `name,asc`. Supported sort directions: `asc`, `desc`. Supported fields: * `name` * `status` * `tags` ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitorGroups-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitorGroups-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitorGroups-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#SearchMonitorGroups-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) The response of a monitor group search. Field Type Description counts object The counts of monitor groups per different criteria. status [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. type [object] Search facets. count int64 The number of found monitors with the listed value. name The facet value. groups [object] The list of found monitor groups. group string The name of the group. 
group_tags [string] The list of tags of the monitor group. last_nodata_ts int64 Latest timestamp the monitor group was in NO_DATA state. last_triggered_ts int64 Latest timestamp the monitor group triggered. monitor_id int64 The ID of the monitor. monitor_name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` metadata object Metadata about the response. page int64 The page to start paginating from. page_count int64 The number of pages. per_page int64 The number of monitors to return per page. total_count int64 The total number of monitors. ``` { "counts": { "status": [ { "count": 2, "name": "OK" } ], "type": [ { "count": 2, "name": "metric" } ] }, "groups": [ { "group": "*", "group_tags": [ "*" ], "last_nodata_ts": 0, "last_triggered_ts": 1525702966, "monitor_id": 2738266, "monitor_name": "[demo] Cassandra disk usage is high on {{host.name}}", "status": "OK" }, { "group": "*", "group_tags": [ "*" ], "last_nodata_ts": 0, "last_triggered_ts": 1525703008, "monitor_id": 1576648, "monitor_name": "[demo] Disk usage is high on {{host.name}}", "status": "OK" } ], "metadata": { "page": 0, "page_count": 2, "per_page": 30, "total_count": 2 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Monitors group search Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/groups/search" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Monitors group search ``` """ Monitors group search returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.search_monitor_groups() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Monitors group search ``` # Monitors group search returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new p api_instance.search_monitor_groups() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Monitors group search ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Search monitor groups dog.search_monitor_groups ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Monitors group search ``` // Monitors group search returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.SearchMonitorGroups(ctx, 
*datadogV1.NewSearchMonitorGroupsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.SearchMonitorGroups`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.SearchMonitorGroups`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Monitors group search ``` // Monitors group search returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.MonitorGroupSearchResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { MonitorGroupSearchResponse result = apiInstance.searchMonitorGroups(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#searchMonitorGroups"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Monitors group search ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Search monitor groups api.Monitor.search_groups() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Monitors group search ``` // Monitors group search returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::api_monitors::SearchMonitorGroupsOptionalParams; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .search_monitor_groups(SearchMonitorGroupsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Monitors group search ``` /** * Monitors 
##### Monitors group search ``` /** * Monitors group search returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); apiInstance .searchMonitorGroups() .then((data: v1.MonitorGroupSearchResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor-v1) POST https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}/mutehttps://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}/mutehttps://api.datadoghq.eu/api/v1/monitor/{monitor_id}/mutehttps://api.ddog-gov.com/api/v1/monitor/{monitor_id}/mutehttps://api.datadoghq.com/api/v1/monitor/{monitor_id}/mutehttps://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}/mutehttps://api.us5.datadoghq.com/api/v1/monitor/{monitor_id}/mute ### Overview Mute the specified monitor. This endpoint requires the `monitors_write` permission. OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The ID of the monitor. #### Query Strings Name Type Description scope string The scope to apply the mute to. For example, if your alert is grouped by `{host}`, you might mute `host:app1`. end integer A POSIX timestamp for when the mute should end. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#MuteMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#MuteMonitor-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#MuteMonitor-401-v1) * [404](https://docs.datadoghq.com/api/latest/monitors/#MuteMonitor-404-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#MuteMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator.
handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. 
min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. 
Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. 
Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. 
For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Monitor Not Found error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Mute a monitor Copy ``` # Path parameters export monitor_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/${monitor_id}/mute" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Mute a monitor ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Mute a monitor dog.mute_monitor(62_628) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Mute a monitor ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Mute a monitor api.Monitor.mute(2088) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` * * * ## [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor-v1) PUT https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}https://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}https://api.datadoghq.eu/api/v1/monitor/{monitor_id}https://api.ddog-gov.com/api/v1/monitor/{monitor_id}https://api.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us5.datadoghq.com/api/v1/monitor/{monitor_id} ### Overview Edit the specified monitor. This endpoint requires the `monitors_write` permission. 
* * * ## [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor-v1) PUT https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}https://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}https://api.datadoghq.eu/api/v1/monitor/{monitor_id}https://api.ddog-gov.com/api/v1/monitor/{monitor_id}https://api.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us5.datadoghq.com/api/v1/monitor/{monitor_id} ### Overview Edit the specified monitor. This endpoint requires the `monitors_write` permission. OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The ID of the monitor. ### Request #### Body Data (required) Edit a monitor request body. * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active, evaluates conditions, and notifies as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify) block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55.
This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. 
For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. 
aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. 
Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "name": "My monitor-updated", "priority": null, "options": { "evaluation_delay": null, "new_group_delay": 600, "new_host_delay": null, "renotify_interval": null, "thresholds": { "critical": 2, "warning": null }, "timeout_h": null } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-401-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-403-v1) * [404](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-404-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. 
name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. 
min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. 
Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. 
Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. 
For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Monitor Not Found error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Edit a monitor returns "OK" response Copy ``` # Path parameters export monitor_id="6.66486743e+08" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/${monitor_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "My monitor-updated", "priority": null, "options": { "evaluation_delay": null, "new_group_delay": 600, "new_host_delay": null, "renotify_interval": null, "thresholds": { "critical": 2, "warning": null }, "timeout_h": null } } EOF ``` ##### Edit a monitor returns "OK" response ``` // Edit a monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "monitor" in the system MonitorID, _ := strconv.ParseInt(os.Getenv("MONITOR_ID"), 10, 64) body := datadogV1.MonitorUpdateRequest{ Name: datadog.PtrString("My monitor-updated"), Priority: *datadog.NewNullableInt64(nil), Options: &datadogV1.MonitorOptions{ EvaluationDelay: *datadog.NewNullableInt64(nil), NewGroupDelay: *datadog.NewNullableInt64(datadog.PtrInt64(600)), NewHostDelay: *datadog.NewNullableInt64(nil), RenotifyInterval: *datadog.NewNullableInt64(nil), Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(2), Warning: *datadog.NewNullableFloat64(nil), }, TimeoutH: *datadog.NewNullableInt64(nil), }, } ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitor(ctx, MonitorID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a monitor returns "OK" response ``` // Edit a monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor" in the system Long MONITOR_ID = Long.parseLong(System.getenv("MONITOR_ID")); String MONITOR_NAME = System.getenv("MONITOR_NAME"); MonitorUpdateRequest body = new MonitorUpdateRequest() .name("My monitor-updated") .priority(null) .options( new MonitorOptions() .evaluationDelay(null) .newGroupDelay(600L) .newHostDelay(null) .renotifyInterval(null) .thresholds(new MonitorThresholds().critical(2.0).warning(null)) .timeoutH(null)); try { Monitor result = apiInstance.updateMonitor(MONITOR_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#updateMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a monitor returns "OK" response ``` """ Edit a monitor returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_update_request import MonitorUpdateRequest # there is a valid "monitor" in the system MONITOR_ID = environ["MONITOR_ID"] MONITOR_NAME = environ["MONITOR_NAME"] body = MonitorUpdateRequest( name="My monitor-updated", priority=None, options=MonitorOptions( evaluation_delay=None, new_group_delay=600, new_host_delay=None, renotify_interval=None, 
thresholds=MonitorThresholds( critical=2.0, warning=None, ), timeout_h=None, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor(monitor_id=int(MONITOR_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a monitor returns "OK" response ``` # Edit a monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new # there is a valid "monitor" in the system MONITOR_ID = ENV["MONITOR_ID"] MONITOR_NAME = ENV["MONITOR_NAME"] body = DatadogAPIClient::V1::MonitorUpdateRequest.new({ name: "My monitor-updated", priority: nil, options: DatadogAPIClient::V1::MonitorOptions.new({ evaluation_delay: nil, new_group_delay: 600, new_host_delay: nil, renotify_interval: nil, thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 2, warning: nil, }), timeout_h: nil, }), }) p api_instance.update_monitor(MONITOR_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a monitor returns "OK" response ``` // Edit a monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorUpdateRequest; #[tokio::main] async fn main() { // there is a valid "monitor" in the system let monitor_id: i64 = std::env::var("MONITOR_ID").unwrap().parse().unwrap(); let body = MonitorUpdateRequest::new() .name("My monitor-updated".to_string()) .options( MonitorOptions::new() .evaluation_delay(None) .new_group_delay(Some(600)) .new_host_delay(None) .renotify_interval(None) .thresholds(MonitorThresholds::new().critical(2.0 as f64).warning(None)) .timeout_h(None), ) .priority(None); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.update_monitor(monitor_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a monitor returns "OK" response ``` /** * Edit a monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); // there is a valid "monitor" in the system const MONITOR_ID = parseInt(process.env.MONITOR_ID as 
string); const params: v1.MonitorsApiUpdateMonitorRequest = { body: { name: "My monitor-updated", priority: undefined, options: { evaluationDelay: undefined, newGroupDelay: 600, newHostDelay: undefined, renotifyInterval: undefined, thresholds: { critical: 2, warning: undefined, }, timeoutH: undefined, }, }, monitorId: MONITOR_ID, }; apiInstance .updateMonitor(params) .then((data: v1.Monitor) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Unmute all monitors](https://docs.datadoghq.com/api/latest/monitors/#unmute-all-monitors) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/monitors/#unmute-all-monitors-v1) POST https://api.ap1.datadoghq.com/v1/monitor/unmute_allhttps://api.ap2.datadoghq.com/v1/monitor/unmute_allhttps://api.datadoghq.eu/v1/monitor/unmute_allhttps://api.ddog-gov.com/v1/monitor/unmute_allhttps://api.datadoghq.com/v1/monitor/unmute_allhttps://api.us3.datadoghq.com/v1/monitor/unmute_allhttps://api.us5.datadoghq.com/v1/monitor/unmute_all ### Overview Disables muting all monitors. Throws an error if mute all was not enabled previously. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UnmuteAllMonitors-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#UnmuteAllMonitors-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#UnmuteAllMonitors-401-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#UnmuteAllMonitors-429-v1) OK Bad Request Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) ##### Unmute all monitors Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/v1/monitor/unmute_all" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Get a monitor's details](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details-v1) GET https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}https://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}https://api.datadoghq.eu/api/v1/monitor/{monitor_id}https://api.ddog-gov.com/api/v1/monitor/{monitor_id}https://api.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us5.datadoghq.com/api/v1/monitor/{monitor_id} ### Overview Get details about the specified monitor from your organization. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The ID of the monitor #### Query Strings Name Type Description group_states string When specified, shows additional information about the group states. Choose one or more from `all`, `alert`, `warn`, and `no data`. with_downtimes boolean If this argument is set to true, then the returned data includes all current active downtimes for the monitor. with_assets boolean If this argument is set to `true`, the returned data includes all assets tied to this monitor. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#GetMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#GetMonitor-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#GetMonitor-403-v1) * [404](https://docs.datadoghq.com/api/latest/monitors/#GetMonitor-404-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#GetMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. 
name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. 
min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. 
Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. 
Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. 
For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": "Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Monitor Not Found error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Get a monitor's details Copy ``` # Path parameters export monitor_id="6.66486743e+08" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monitor/${monitor_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a monitor's details ``` """ Get a monitor's details returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi # there is a valid "monitor" in the system MONITOR_ID = environ["MONITOR_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.get_monitor( monitor_id=int(MONITOR_ID), with_downtimes=True, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a monitor's details ``` # Get a monitor's details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new # there is a valid "monitor" in the system MONITOR_ID = ENV["MONITOR_ID"] opts = { with_downtimes: true, } p api_instance.get_monitor(MONITOR_ID.to_i, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a monitor's details ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Get a monitors's details dog.get_monitor(91_879, group_states: 'all') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a monitor's details ``` // Get a monitor's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "monitor" in the system MonitorID, _ := strconv.ParseInt(os.Getenv("MONITOR_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.GetMonitor(ctx, MonitorID, *datadogV1.NewGetMonitorOptionalParameters().WithWithDowntimes(true)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.GetMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.GetMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a monitor's details ``` // Get a monitor's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.api.MonitorsApi.GetMonitorOptionalParameters; import com.datadog.api.client.v1.model.Monitor; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor" in the system Long MONITOR_ID = Long.parseLong(System.getenv("MONITOR_ID")); try { Monitor result = apiInstance.getMonitor( MONITOR_ID, new GetMonitorOptionalParameters().withDowntimes(true)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#getMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" 
java "Example.java" ``` ##### Get a monitor's details ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Get a monitor's details api.Monitor.get(2081, group_states='all') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get a monitor's details ``` // Get a monitor's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::GetMonitorOptionalParams; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor" in the system let monitor_id: i64 = std::env::var("MONITOR_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .get_monitor( monitor_id.clone(), GetMonitorOptionalParams::default().with_downtimes(true), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a monitor's details ``` /** * Get a monitor's details returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); // there is a valid "monitor" in the system const MONITOR_ID = parseInt(process.env.MONITOR_ID as string); const params: v1.MonitorsApiGetMonitorRequest = { monitorId: MONITOR_ID, withDowntimes: true, }; apiInstance .getMonitor(params) .then((data: v1.Monitor) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Mute all monitors](https://docs.datadoghq.com/api/latest/monitors/#mute-all-monitors) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/monitors/#mute-all-monitors-v1) POST https://api.ap1.datadoghq.com/v1/monitor/mute_allhttps://api.ap2.datadoghq.com/v1/monitor/mute_allhttps://api.datadoghq.eu/v1/monitor/mute_allhttps://api.ddog-gov.com/v1/monitor/mute_allhttps://api.datadoghq.com/v1/monitor/mute_allhttps://api.us3.datadoghq.com/v1/monitor/mute_allhttps://api.us5.datadoghq.com/v1/monitor/mute_all ### Overview Muting prevents all monitors from notifying through email and posts to the [event stream](https://docs.datadoghq.com/events). State changes are only visible by checking the alert page. 
### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#MuteAllMonitors-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#MuteAllMonitors-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#MuteAllMonitors-401-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#MuteAllMonitors-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Object describing a monitor. Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. 
escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. 
Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. 
warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. 
Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "assets": [ { "category": "runbook", "name": "Monitor Runbook", "resource_key": "12345", "resource_type": "string", "url": "/notebooks/12345" } ], "created": "2019-09-19T10:00:00.000Z", "creator": { "email": "string", "handle": "string", "name": "string" }, "deleted": "2019-09-19T10:00:00.000Z", "draft_status": "string", "id": "integer", "matching_downtimes": [ { "end": 1412792983, "id": 1625, "scope": [ "env:staging" ], "start": 1412792983 } ], "message": "string", "modified": "2019-09-19T10:00:00.000Z", "multi": false, "name": "My monitor", "options": { "aggregation": { "group_by": "host", "metric": "metrics.name", "type": "count" }, "device_ids": [], "enable_logs_sample": false, "enable_samples": false, "escalation_message": "string", "evaluation_delay": "integer", "group_retention_duration": "string", "groupby_simple_monitor": false, "include_tags": false, "locked": false, "min_failure_duration": "integer", "min_location_failed": "integer", "new_group_delay": "integer", "new_host_delay": "integer", "no_data_timeframe": "integer", "notification_preset_name": "string", "notify_audit": false, "notify_by": [], "notify_no_data": false, "on_missing_data": "string", "renotify_interval": "integer", "renotify_occurrences": "integer", "renotify_statuses": [], "require_full_window": false, "scheduling_options": { "custom_schedule": { "recurrences": [ { "rrule": "FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR", "start": "2023-08-31T16:30:00", "timezone": 
"Europe/Paris" } ] }, "evaluation_window": { "day_starts": "04:00", "hour_starts": 0, "month_starts": 1, "timezone": "Europe/Paris" } }, "silenced": { "": "integer" }, "synthetics_check_id": "string", "threshold_windows": { "recovery_window": "string", "trigger_window": "string" }, "thresholds": { "critical": "number", "critical_recovery": "number", "ok": "number", "unknown": "number", "warning": "number", "warning_recovery": "number" }, "timeout_h": "integer", "variables": [ { "compute": { "aggregation": "avg", "interval": 60000, "metric": "@duration" }, "data_source": "rum", "group_by": [ { "facet": "status", "limit": 10, "sort": { "aggregation": "avg", "metric": "string", "order": "string" } } ], "indexes": [ "days-3", "days-7" ], "name": "query_errors", "search": { "query": "service:query" } } ] }, "overall_state": "string", "priority": "integer", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "restricted_roles": [], "state": { "groups": { "": { "last_nodata_ts": "integer", "last_notified_ts": "integer", "last_resolved_ts": "integer", "last_triggered_ts": "integer", "name": "string", "status": "string" } } }, "tags": [], "type": "query alert" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) ##### Mute all monitors Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/v1/monitor/mute_all" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-v1) DELETE https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}https://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}https://api.datadoghq.eu/api/v1/monitor/{monitor_id}https://api.ddog-gov.com/api/v1/monitor/{monitor_id}https://api.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}https://api.us5.datadoghq.com/api/v1/monitor/{monitor_id} ### Overview Delete the specified monitor This endpoint requires the `monitors_write` permission. OAuth apps require the `monitors_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The ID of the monitor. 
#### Query Strings Name Type Description force string Delete the monitor even if it’s referenced by other resources (for example SLO, composite monitor). ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-400-v1) * [401](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-401-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-403-v1) * [404](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-404-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response from the delete monitor call. Expand All Field Type Description deleted_monitor_id int64 ID of the deleted monitor. ``` { "deleted_monitor_id": 666486743 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item not found error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Delete a monitor Copy ``` # Path parameters export monitor_id="666486743" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v1/monitor/${monitor_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a monitor ``` """ Delete a monitor returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi # there is a valid "monitor" in the system MONITOR_ID = environ["MONITOR_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.delete_monitor( monitor_id=int(MONITOR_ID), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a monitor ``` # Delete a monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new # there is a valid "monitor" in the system MONITOR_ID = ENV["MONITOR_ID"] p api_instance.delete_monitor(MONITOR_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete a monitor ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Delete a monitor dog.delete_monitor(62_625) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete a monitor ``` // Delete a monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::DeleteMonitorOptionalParams; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a
valid "monitor" in the system let monitor_id: i64 = std::env::var("MONITOR_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .delete_monitor(monitor_id.clone(), DeleteMonitorOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a monitor ``` /** * Delete a monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); // there is a valid "monitor" in the system const MONITOR_ID = parseInt(process.env.MONITOR_ID as string); const params: v1.MonitorsApiDeleteMonitorRequest = { monitorId: MONITOR_ID, }; apiInstance .deleteMonitor(params) .then((data: v1.DeletedMonitor) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` ##### Delete a monitor ``` // Delete a monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "monitor" in the system MonitorID, _ := strconv.ParseInt(os.Getenv("MONITOR_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.DeleteMonitor(ctx, MonitorID, *datadogV1.NewDeleteMonitorOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.DeleteMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.DeleteMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a monitor ``` // Delete a monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.DeletedMonitor; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new 
MonitorsApi(defaultClient); // there is a valid "monitor" in the system Long MONITOR_ID = Long.parseLong(System.getenv("MONITOR_ID")); try { DeletedMonitor result = apiInstance.deleteMonitor(MONITOR_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#deleteMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a monitor ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Delete a monitor api.Monitor.delete(2081) # Force delete a monitor to override warnings api.Monitor.delete(2081, force=True) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` * * * ## [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted-v1) GET https://api.ap1.datadoghq.com/api/v1/monitor/can_deletehttps://api.ap2.datadoghq.com/api/v1/monitor/can_deletehttps://api.datadoghq.eu/api/v1/monitor/can_deletehttps://api.ddog-gov.com/api/v1/monitor/can_deletehttps://api.datadoghq.com/api/v1/monitor/can_deletehttps://api.us3.datadoghq.com/api/v1/monitor/can_deletehttps://api.us5.datadoghq.com/api/v1/monitor/can_delete ### Overview Check if the given monitors can be deleted. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Query Strings Name Type Description monitor_ids [_required_] array The IDs of the monitors to check. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#CheckCanDeleteMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#CheckCanDeleteMonitor-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#CheckCanDeleteMonitor-403-v1) * [409](https://docs.datadoghq.com/api/latest/monitors/#CheckCanDeleteMonitor-409-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#CheckCanDeleteMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response of monitor IDs that can or can’t be safely deleted. Field Type Description data [_required_] object Wrapper object with the list of monitor IDs. ok [integer] An array of Monitor IDs that can be safely deleted. errors object A mapping of Monitor ID to strings denoting where it's used. [string] Strings denoting where a monitor is used.
``` { "data": { "ok": [] }, "errors": { "": [] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Deletion conflict error * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response of monitor IDs that can or can’t be safely deleted. Field Type Description data [_required_] object Wrapper object with the list of monitor IDs. ok [integer] An array of Monitor IDs that can be safely deleted. errors object A mapping of Monitor ID to strings denoting where it's used. [string] Strings denoting where a monitor is used. ``` { "data": { "ok": [] }, "errors": { "": [] } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) ##### Check if a monitor can be deleted Copy ``` # Required query arguments export monitor_ids="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v1/monitor/can_delete?monitor_ids=${monitor_ids}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Check if a monitor can be deleted ``` """ Check if a monitor can be deleted returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi # there is a valid "monitor" in the system MONITOR_ID = environ["MONITOR_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.check_can_delete_monitor( monitor_ids=[ int(MONITOR_ID), ], ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Check if a monitor can be deleted ``` # Check if a monitor can be deleted returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new # there is a valid "monitor" in the system MONITOR_ID = ENV["MONITOR_ID"] p api_instance.check_can_delete_monitor([ MONITOR_ID.to_i, ]) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Check if a monitor can be deleted ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Check if you can delete the given monitors. dog.can_delete_monitors([56838, 771060, 1000376]) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Check if a monitor can be deleted ``` // Check if a monitor can be deleted returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "monitor" in the system MonitorID, _ := strconv.ParseInt(os.Getenv("MONITOR_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.CheckCanDeleteMonitor(ctx, []int64{ MonitorID, }) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CheckCanDeleteMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CheckCanDeleteMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Check if a monitor can be deleted ``` // Check if a monitor can be deleted returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.CheckCanDeleteMonitorResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor" in the system Long MONITOR_ID = Long.parseLong(System.getenv("MONITOR_ID")); try { CheckCanDeleteMonitorResponse result =
apiInstance.checkCanDeleteMonitor(Collections.singletonList(MONITOR_ID)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#checkCanDeleteMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Check if a monitor can be deleted ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Check if you can delete the given monitors. api.Monitor.can_delete(monitor_ids=[56838, 771060, 1000376]) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Check if a monitor can be deleted ``` // Check if a monitor can be deleted returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor" in the system let monitor_id: i64 = std::env::var("MONITOR_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.check_can_delete_monitor(vec![monitor_id.clone()]).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Check if a monitor can be deleted ``` /** * Check if a monitor can be deleted returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); // there is a valid "monitor" in the system const MONITOR_ID = parseInt(process.env.MONITOR_ID as string); const params: v1.MonitorsApiCheckCanDeleteMonitorRequest = { monitorIds: [MONITOR_ID], }; apiInstance .checkCanDeleteMonitor(params) .then((data: v1.CheckCanDeleteMonitorResponse) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate a monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor-v1) POST https://api.ap1.datadoghq.com/api/v1/monitor/validatehttps://api.ap2.datadoghq.com/api/v1/monitor/validatehttps://api.datadoghq.eu/api/v1/monitor/validatehttps://api.ddog-gov.com/api/v1/monitor/validatehttps://api.datadoghq.com/api/v1/monitor/validatehttps://api.us3.datadoghq.com/api/v1/monitor/validatehttps://api.us5.datadoghq.com/api/v1/monitor/validate ### Overview Validate the monitor provided in the request. **Note**: Log monitors require an unscoped App Key. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Request #### Body Data (required) Monitor request object * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset. resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active, evaluates conditions, and notifies as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`.
The resulting downtime applies to sources that match ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify) block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically insert their triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non-negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non-negative integer. Use new_group_delay instead.
default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users are notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts.
hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. 
measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. 
Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ##### Validate a monitor returns "OK" response ``` { "name": "Example-Monitor", "type": "log alert", "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source\").last(\"5m\") > 2", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "enable_logs_sample": true, "escalation_message": "the situation has escalated", "evaluation_delay": 700, "include_tags": true, "locked": false, "new_host_delay": 600, "no_data_timeframe": null, "notify_audit": false, "notify_no_data": false, "on_missing_data": "show_and_notify_no_data", "notification_preset_name": "hide_handles", "renotify_interval": 60, "require_full_window": true, "timeout_h": 24, "thresholds": { "critical": 2, "warning": 1 } } } ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` { "name": "Example-Monitor", "type": "log alert", "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source,status\").last(\"5m\") > 2", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "enable_logs_sample": true, "escalation_message": "the situation has escalated", "evaluation_delay": 700, "group_retention_duration": "2d", "include_tags": true, "locked": false, "new_host_delay": 600, "no_data_timeframe": null, "notify_audit": false, "notify_by": [ "status" ], "notify_no_data": false, "on_missing_data": "show_and_notify_no_data", "renotify_interval": 60, "require_full_window": true, "timeout_h": 24, "thresholds": { "critical": 2, "warning": 1 } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitor-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitor-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Expand All Field Type Description No response body ``` {} ``` Copy Invalid JSON * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby-legacy) ##### Validate a monitor returns "OK" response Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v1/monitor/validate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Monitor", "type": "log alert", "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source\").last(\"5m\") > 2", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "enable_logs_sample": true, "escalation_message": "the situation has escalated", "evaluation_delay": 700, "include_tags": true, "locked": false, "new_host_delay": 600, "no_data_timeframe": null, "notify_audit": false, "notify_no_data": false, "on_missing_data": "show_and_notify_no_data", "notification_preset_name": "hide_handles", "renotify_interval": 60, "require_full_window": true, "timeout_h": 24, "thresholds": { "critical": 2, "warning": 1 } } } EOF ``` ##### Validate a multi-alert monitor returns "OK" response Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v1/monitor/validate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Monitor", "type": "log alert", "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source,status\").last(\"5m\") > 2", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "enable_logs_sample": true, "escalation_message": "the situation has escalated", "evaluation_delay": 700, "group_retention_duration": "2d", "include_tags": true, "locked": false, "new_host_delay": 600, "no_data_timeframe": null, "notify_audit": false, "notify_by": [ "status" ], "notify_no_data": false, "on_missing_data": "show_and_notify_no_data", "renotify_interval": 60, "require_full_window": true, "timeout_h": 24, "thresholds": { "critical": 2, "warning": 1 } } } EOF ``` ##### Validate a monitor returns "OK" response ``` // Validate a monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Monitor{
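/* The composite literal that follows mirrors the "Validate a monitor" JSON request body shown earlier: name, type (log alert), query, message, tags, priority, and the options object (thresholds, renotify_interval, on_missing_data, and so on). */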
Name: datadog.PtrString("Example-Monitor"), Type: datadogV1.MONITORTYPE_LOG_ALERT, Query: `logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2`, Message: datadog.PtrString("some message Notify: @hipchat-channel"), Tags: []string{ "test:examplemonitor", "env:ci", }, Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)), Options: &datadogV1.MonitorOptions{ EnableLogsSample: datadog.PtrBool(true), EscalationMessage: datadog.PtrString("the situation has escalated"), EvaluationDelay: *datadog.NewNullableInt64(datadog.PtrInt64(700)), IncludeTags: datadog.PtrBool(true), Locked: datadog.PtrBool(false), NewHostDelay: *datadog.NewNullableInt64(datadog.PtrInt64(600)), NoDataTimeframe: *datadog.NewNullableInt64(nil), NotifyAudit: datadog.PtrBool(false), NotifyNoData: datadog.PtrBool(false), OnMissingData: datadogV1.ONMISSINGDATAOPTION_SHOW_AND_NOTIFY_NO_DATA.Ptr(), NotificationPresetName: datadogV1.MONITOROPTIONSNOTIFICATIONPRESETS_HIDE_HANDLES.Ptr(), RenotifyInterval: *datadog.NewNullableInt64(datadog.PtrInt64(60)), RequireFullWindow: datadog.PtrBool(true), TimeoutH: *datadog.NewNullableInt64(datadog.PtrInt64(24)), Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(2), Warning: *datadog.NewNullableFloat64(datadog.PtrFloat64(1)), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.ValidateMonitor(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ValidateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ValidateMonitor`:\n%s\n", responseContent) } ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` // Validate a multi-alert monitor returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Monitor{ Name: datadog.PtrString("Example-Monitor"), Type: datadogV1.MONITORTYPE_LOG_ALERT, Query: `logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2`, Message: datadog.PtrString("some message Notify: @hipchat-channel"), Tags: []string{ "test:examplemonitor", "env:ci", }, Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)), Options: &datadogV1.MonitorOptions{ EnableLogsSample: datadog.PtrBool(true), EscalationMessage: datadog.PtrString("the situation has escalated"), EvaluationDelay: *datadog.NewNullableInt64(datadog.PtrInt64(700)), GroupRetentionDuration: datadog.PtrString("2d"), IncludeTags: datadog.PtrBool(true), Locked: datadog.PtrBool(false), NewHostDelay: *datadog.NewNullableInt64(datadog.PtrInt64(600)), NoDataTimeframe: *datadog.NewNullableInt64(nil), NotifyAudit: datadog.PtrBool(false), NotifyBy: []string{ "status", }, NotifyNoData: datadog.PtrBool(false), OnMissingData: datadogV1.ONMISSINGDATAOPTION_SHOW_AND_NOTIFY_NO_DATA.Ptr(), RenotifyInterval: *datadog.NewNullableInt64(datadog.PtrInt64(60)), RequireFullWindow: datadog.PtrBool(true), TimeoutH: *datadog.NewNullableInt64(datadog.PtrInt64(24)), Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(2), Warning: *datadog.NewNullableFloat64(datadog.PtrFloat64(1)), }, }, } ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.ValidateMonitor(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ValidateMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ValidateMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate a monitor returns "OK" response ``` // Validate a monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorOptionsNotificationPresets; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import com.datadog.api.client.v1.model.OnMissingDataOption; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); Monitor body = new Monitor() .name("Example-Monitor") .type(MonitorType.LOG_ALERT) .query( """ logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .enableLogsSample(true) .escalationMessage("the situation has escalated") .evaluationDelay(700L) .includeTags(true) .locked(false) .newHostDelay(600L) .noDataTimeframe(null) .notifyAudit(false) .notifyNoData(false) .onMissingData(OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA) .notificationPresetName(MonitorOptionsNotificationPresets.HIDE_HANDLES) .renotifyInterval(60L) .requireFullWindow(true) .timeoutH(24L) .thresholds(new MonitorThresholds().critical(2.0).warning(1.0))); try { apiInstance.validateMonitor(body); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#validateMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` // Validate a multi-alert monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import com.datadog.api.client.v1.model.OnMissingDataOption; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
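        // The default client is configured from the environment; the run
        // instructions below supply DD_API_KEY and DD_APP_KEY (and DD_SITE).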
ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); Monitor body = new Monitor() .name("Example-Monitor") .type(MonitorType.LOG_ALERT) .query( """ logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .enableLogsSample(true) .escalationMessage("the situation has escalated") .evaluationDelay(700L) .groupRetentionDuration("2d") .includeTags(true) .locked(false) .newHostDelay(600L) .noDataTimeframe(null) .notifyAudit(false) .notifyBy(Collections.singletonList("status")) .notifyNoData(false) .onMissingData(OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA) .renotifyInterval(60L) .requireFullWindow(true) .timeoutH(24L) .thresholds(new MonitorThresholds().critical(2.0).warning(1.0))); try { apiInstance.validateMonitor(body); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#validateMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate a monitor returns "OK" response ``` from datadog import initialize, api options = { "api_key": "", "app_key": "" } initialize(**options) monitor_type = "query alert" query = "avg(last_1h):sum:system.net.bytes_rcvd{host:host0} > 200" monitor_options = {"thresholds": {"critical": 90.0}} # Validate a monitor's definition api.Monitor.validate( type=monitor_type, query=query, options=monitor_options, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Validate a monitor returns "OK" response ``` """ Validate a monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_options_notification_presets import MonitorOptionsNotificationPresets from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type import MonitorType from datadog_api_client.v1.model.on_missing_data_option import OnMissingDataOption body = Monitor( name="Example-Monitor", type=MonitorType.LOG_ALERT, query='logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( enable_logs_sample=True, escalation_message="the situation has escalated", evaluation_delay=700, include_tags=True, locked=False, 
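        # locked is deprecated in favor of restricted_roles (see the monitor
        # options reference below); it is kept here to match the curl example.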
new_host_delay=600, no_data_timeframe=None, notify_audit=False, notify_no_data=False, on_missing_data=OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA, notification_preset_name=MonitorOptionsNotificationPresets.HIDE_HANDLES, renotify_interval=60, require_full_window=True, timeout_h=24, thresholds=MonitorThresholds( critical=2.0, warning=1.0, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.validate_monitor(body=body) print(response) ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` """ Validate a multi-alert monitor returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type import MonitorType from datadog_api_client.v1.model.on_missing_data_option import OnMissingDataOption body = Monitor( name="Example-Monitor", type=MonitorType.LOG_ALERT, query='logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( enable_logs_sample=True, escalation_message="the situation has escalated", evaluation_delay=700, group_retention_duration="2d", include_tags=True, locked=False, new_host_delay=600, no_data_timeframe=None, notify_audit=False, notify_by=[ "status", ], notify_no_data=False, on_missing_data=OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA, renotify_interval=60, require_full_window=True, timeout_h=24, thresholds=MonitorThresholds( critical=2.0, warning=1.0, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.validate_monitor(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate a monitor returns "OK" response ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) type = 'query alert' query = 'THIS IS A BAD QUERY' parameters = { name: 'Bytes received on host0', message: 'We may need to add web hosts if this is consistently high.', tags: ['app:webserver', 'frontend'], options: { notify_no_data: true, no_data_timeframe: 20, thresholds: { critical: 90.0 } } } # Validate a monitor definition dog.validate_monitor(type, query, parameters) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate a monitor returns "OK" response ``` # Validate a monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new body = 
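  # The request body below mirrors the JSON from the curl example above; the
  # hash keys match the documented monitor option names.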
DatadogAPIClient::V1::Monitor.new({ name: "Example-Monitor", type: DatadogAPIClient::V1::MonitorType::LOG_ALERT, query: 'logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ enable_logs_sample: true, escalation_message: "the situation has escalated", evaluation_delay: 700, include_tags: true, locked: false, new_host_delay: 600, no_data_timeframe: nil, notify_audit: false, notify_no_data: false, on_missing_data: DatadogAPIClient::V1::OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA, notification_preset_name: DatadogAPIClient::V1::MonitorOptionsNotificationPresets::HIDE_HANDLES, renotify_interval: 60, require_full_window: true, timeout_h: 24, thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 2, warning: 1, }), }), }) p api_instance.validate_monitor(body) ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` # Validate a multi-alert monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new body = DatadogAPIClient::V1::Monitor.new({ name: "Example-Monitor", type: DatadogAPIClient::V1::MonitorType::LOG_ALERT, query: 'logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ enable_logs_sample: true, escalation_message: "the situation has escalated", evaluation_delay: 700, group_retention_duration: "2d", include_tags: true, locked: false, new_host_delay: 600, no_data_timeframe: nil, notify_audit: false, notify_by: [ "status", ], notify_no_data: false, on_missing_data: DatadogAPIClient::V1::OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA, renotify_interval: 60, require_full_window: true, timeout_h: 24, thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 2, warning: 1, }), }), }) p api_instance.validate_monitor(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate a monitor returns "OK" response ``` // Validate a monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorOptionsNotificationPresets; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; use datadog_api_client::datadogV1::model::OnMissingDataOption; #[tokio::main] async fn main() { let body = Monitor::new( r#"logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2"#.to_string(), MonitorType::LOG_ALERT, ) .message("some message Notify: @hipchat-channel".to_string()) .name("Example-Monitor".to_string()) .options( MonitorOptions::new() .enable_logs_sample(true) .escalation_message("the situation has escalated".to_string()) .evaluation_delay(Some(700)) .include_tags(true) .locked(false) 
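            // Nullable options are passed as Option values here (Some(600), None),
            // mirroring the 600 and null sent for new_host_delay and
            // no_data_timeframe in the curl JSON above.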
.new_host_delay(Some(600)) .no_data_timeframe(None) .notification_preset_name(MonitorOptionsNotificationPresets::HIDE_HANDLES) .notify_audit(false) .notify_no_data(false) .on_missing_data(OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA) .renotify_interval(Some(60)) .require_full_window(true) .thresholds(MonitorThresholds::new().critical(2.0 as f64).warning(Some(1.0 as f64))) .timeout_h(Some(24)), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.validate_monitor(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` // Validate a multi-alert monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; use datadog_api_client::datadogV1::model::OnMissingDataOption; #[tokio::main] async fn main() { let body = Monitor::new( r#"logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2"#.to_string(), MonitorType::LOG_ALERT, ) .message("some message Notify: @hipchat-channel".to_string()) .name("Example-Monitor".to_string()) .options( MonitorOptions::new() .enable_logs_sample(true) .escalation_message("the situation has escalated".to_string()) .evaluation_delay(Some(700)) .group_retention_duration("2d".to_string()) .include_tags(true) .locked(false) .new_host_delay(Some(600)) .no_data_timeframe(None) .notify_audit(false) .notify_by(vec!["status".to_string()]) .notify_no_data(false) .on_missing_data(OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA) .renotify_interval(Some(60)) .require_full_window(true) .thresholds(MonitorThresholds::new().critical(2.0 as f64).warning(Some(1.0 as f64))) .timeout_h(Some(24)), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.validate_monitor(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate a monitor returns "OK" response ``` /** * Validate a monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); const params: v1.MonitorsApiValidateMonitorRequest = { body: { name: "Example-Monitor", type: "log alert", query: `logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { enableLogsSample: true, escalationMessage: "the situation has escalated", evaluationDelay: 700, includeTags: true, 
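        // Option names are camelCase in the TypeScript client and correspond to
        // the snake_case keys in the curl JSON above (includeTags -> include_tags).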
locked: false, newHostDelay: 600, noDataTimeframe: undefined, notifyAudit: false, notifyNoData: false, onMissingData: "show_and_notify_no_data", notificationPresetName: "hide_handles", renotifyInterval: 60, requireFullWindow: true, timeoutH: 24, thresholds: { critical: 2, warning: 1, }, }, }, }; apiInstance .validateMonitor(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Validate a multi-alert monitor returns "OK" response ``` /** * Validate a multi-alert monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); const params: v1.MonitorsApiValidateMonitorRequest = { body: { name: "Example-Monitor", type: "log alert", query: `logs("service:foo AND type:error").index("main").rollup("count").by("source,status").last("5m") > 2`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { enableLogsSample: true, escalationMessage: "the situation has escalated", evaluationDelay: 700, groupRetentionDuration: "2d", includeTags: true, locked: false, newHostDelay: 600, noDataTimeframe: undefined, notifyAudit: false, notifyBy: ["status"], notifyNoData: false, onMissingData: "show_and_notify_no_data", renotifyInterval: 60, requireFullWindow: true, timeoutH: 24, thresholds: { critical: 2, warning: 1, }, }, }, }; apiInstance .validateMonitor(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate an existing monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor) * [v1 (latest)](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor-v1) POST https://api.ap1.datadoghq.com/api/v1/monitor/{monitor_id}/validatehttps://api.ap2.datadoghq.com/api/v1/monitor/{monitor_id}/validatehttps://api.datadoghq.eu/api/v1/monitor/{monitor_id}/validatehttps://api.ddog-gov.com/api/v1/monitor/{monitor_id}/validatehttps://api.datadoghq.com/api/v1/monitor/{monitor_id}/validatehttps://api.us3.datadoghq.com/api/v1/monitor/{monitor_id}/validatehttps://api.us5.datadoghq.com/api/v1/monitor/{monitor_id}/validate ### Overview Validate the monitor provided in the request. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description monitor_id [_required_] integer The ID of the monitor ### Request #### Body Data (required) Monitor request object * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description assets [object] The list of monitor assets tied to a monitor, which represents key links for users to take action on monitor alerts (for example, runbooks). 
category [_required_] enum Indicates the type of asset this entity represents on a monitor. Allowed enum values: `runbook` name [_required_] string Name for the monitor asset resource_key string Represents the identifier of the internal Datadog resource that this asset represents. IDs in this field should be passed in as strings. resource_type enum Type of internal Datadog resource associated with a monitor asset. Allowed enum values: `notebook` url [_required_] string URL link for the asset. For links with an internal resource type set, this should be the relative path to where the Datadog domain is appended internally. For external links, this should be the full URL path. created date-time Timestamp of the monitor creation. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. deleted date-time Whether or not the monitor is deleted. (Always `null`) draft_status enum Indicates whether the monitor is in a draft or published state. `draft`: The monitor appears as Draft and does not send notifications. `published`: The monitor is active and evaluates conditions and notify as configured. This field is in preview. The draft value is only available to customers with the feature enabled. Allowed enum values: `draft,published` default: `published` id int64 ID of this monitor. matching_downtimes [object] A list of active v1 downtimes that match this monitor. end int64 POSIX timestamp to end the downtime. id [_required_] int64 The downtime ID. scope [string] The scope(s) to which the downtime applies. Must be in `key:value` format. For example, `host:app2`. Provide multiple scopes as a comma-separated list like `env:dev,env:prod`. The resulting downtime applies to sources that matches ALL provided scopes (`env:dev` **AND** `env:prod`). start int64 POSIX timestamp to start the downtime. message string A message to include with notifications for this monitor. modified date-time Last timestamp when the monitor was edited. multi boolean Whether or not the monitor is broken down on different groups. name string The monitor name. options object List of options associated with your monitor. aggregation object Type of aggregation performed in the monitor query. group_by string Group to break down the monitor on. metric string Metric name used in the monitor. type string Metric type used in the monitor. device_ids [string] **DEPRECATED** : IDs of the device the Synthetics monitor is running on. enable_logs_sample boolean Whether or not to send a log sample when the log monitor triggers. enable_samples boolean Whether or not to send a list of samples when the monitor triggers. This is only used by CI Test and Pipeline monitors. escalation_message string We recommend using the [is_renotify](https://docs.datadoghq.com/monitors/notify/?tab=is_alert#renotify), block in the original message instead. A message to include with a re-notification. Supports the `@username` notification we allow elsewhere. Not applicable if `renotify_interval` is `None`. evaluation_delay int64 Time (in seconds) to delay evaluation, as a non-negative integer. For example, if the value is set to `300` (5min), the timeframe is set to `last_5m` and the time is 7:00, the monitor evaluates data from 6:50 to 6:55. This is useful for AWS CloudWatch and other backfilled metrics to ensure the monitor always has data during evaluation. 
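To make the `evaluation_delay` arithmetic above concrete, here is a minimal illustrative sketch; the `evaluated_window` helper is made up for this example and is not part of any Datadog client. It reproduces the documented case: with a `last_5m` timeframe and `evaluation_delay` set to `300`, a monitor evaluating at 7:00 looks at data from 6:50 to 6:55.

```
from datetime import datetime, timedelta


def evaluated_window(now: datetime, timeframe_minutes: int, evaluation_delay_s: int):
    """Return the (start, end) window a monitor evaluates: the window ends
    evaluation_delay_s seconds before `now` and spans timeframe_minutes."""
    end = now - timedelta(seconds=evaluation_delay_s)
    start = end - timedelta(minutes=timeframe_minutes)
    return start, end


# The documented case: evaluation_delay of 300 (5 min), a last_5m timeframe, at 7:00.
start, end = evaluated_window(datetime(2024, 1, 1, 7, 0), 5, 300)
print(start.time(), end.time())  # 06:50:00 06:55:00
```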
group_retention_duration string The time span after which groups with missing data are dropped from the monitor state. The minimum value is one hour, and the maximum value is 72 hours. Example values are: "60m", "1h", and "2d". This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. groupby_simple_monitor boolean **DEPRECATED** : Whether the log alert monitor triggers a single alert or multiple alerts when any group breaches a threshold. Use `notify_by` instead. include_tags boolean A Boolean indicating whether notifications from this monitor automatically inserts its triggering tags into the title. **Examples** * If `True`, `[Triggered on {host:h1}] Monitor Title` * If `False`, `[Triggered] Monitor Title` default: `true` locked boolean **DEPRECATED** : Whether or not the monitor is locked (only editable by creator and admins). Use `restricted_roles` instead. min_failure_duration int64 How long the test should be in failure before alerting (integer, number of seconds, max 7200). min_location_failed int64 The minimum number of locations in failure at the same time during at least one moment in the `min_failure_duration` period (`min_location_failed` and `min_failure_duration` are part of the advanced alerting rules - integer, >= 1). default: `1` new_group_delay int64 Time (in seconds) to skip evaluations for new groups. For example, this option can be used to skip evaluations for new hosts while they initialize. Must be a non negative integer. new_host_delay int64 **DEPRECATED** : Time (in seconds) to allow a host to boot and applications to fully start before starting the evaluation of monitor results. Should be a non negative integer. Use new_group_delay instead. default: `300` no_data_timeframe int64 The number of minutes before a monitor notifies after data stops reporting. Datadog recommends at least 2x the monitor timeframe for query alerts or 2 minutes for service checks. If omitted, 2x the evaluation timeframe is used for query alerts, and 24 hours is used for service checks. notification_preset_name enum Toggles the display of additional content sent in the monitor notification. Allowed enum values: `show_all,hide_query,hide_handles,hide_all` default: `show_all` notify_audit boolean A Boolean indicating whether tagged users is notified on changes to this monitor. notify_by [string] Controls what granularity a monitor alerts on. Only available for monitors with groupings. For instance, a monitor grouped by `cluster`, `namespace`, and `pod` can be configured to only notify on each new `cluster` violating the alert conditions by setting `notify_by` to `["cluster"]`. Tags mentioned in `notify_by` must be a subset of the grouping tags in the query. For example, a query grouped by `cluster` and `namespace` cannot notify on `region`. Setting `notify_by` to `["*"]` configures the monitor to notify as a simple-alert. notify_no_data boolean A Boolean indicating whether this monitor notifies when data stops reporting. Defaults to `false`. on_missing_data enum Controls how groups or monitors are treated if an evaluation does not return any data points. The default option results in different behavior depending on the monitor query type. For monitors using Count queries, an empty monitor evaluation is treated as 0 and is compared to the threshold conditions. For monitors using any query type other than Count, for example Gauge, Measure, or Rate, the monitor shows the last known status. 
This option is only available for APM Trace Analytics, Audit Trail, CI, Error Tracking, Event, Logs, and RUM monitors. Allowed enum values: `default,show_no_data,show_and_notify_no_data,resolve` renotify_interval int64 The number of minutes after the last notification before a monitor re-notifies on the current status. It only re-notifies if it’s not resolved. renotify_occurrences int64 The number of times re-notification messages should be sent on the current status at the provided re-notification interval. renotify_statuses [string] The types of monitor statuses for which re-notification messages are sent. Default: **null** if `renotify_interval` is **null**. If `renotify_interval` is set, defaults to renotify on `Alert` and `No Data`. require_full_window boolean A Boolean indicating whether this monitor needs a full window of data before it’s evaluated. We highly recommend you set this to `false` for sparse metrics, otherwise some evaluations are skipped. Default is false. This setting only applies to metric monitors. scheduling_options object Configuration options for scheduling. custom_schedule object Configuration options for the custom schedule. **This feature is in private beta.** recurrences [object] Array of custom schedule recurrences. rrule string Defines the recurrence rule (RRULE) for a given schedule. start string Defines the start date and time of the recurring schedule. timezone string Defines the timezone the schedule runs on. evaluation_window object Configuration options for the evaluation window. If `hour_starts` is set, no other fields may be set. Otherwise, `day_starts` and `month_starts` must be set together. day_starts string The time of the day at which a one day cumulative evaluation window starts. hour_starts int32 The minute of the hour at which a one hour cumulative evaluation window starts. month_starts int32 The day of the month at which a one month cumulative evaluation window starts. timezone string The timezone of the time of the day of the cumulative evaluation window start. silenced object **DEPRECATED** : Information about the downtime applied to the monitor. Only shows v1 downtimes. int64 UTC epoch timestamp in seconds when the downtime for the group expires. synthetics_check_id string **DEPRECATED** : ID of the corresponding Synthetic check. threshold_windows object Alerting time window options. recovery_window string Describes how long an anomalous metric must be normal before the alert recovers. trigger_window string Describes how long a metric must be anomalous before an alert triggers. thresholds object List of the different monitor threshold available. critical double The monitor `CRITICAL` threshold. critical_recovery double The monitor `CRITICAL` recovery threshold. ok double The monitor `OK` threshold. unknown double The monitor UNKNOWN threshold. warning double The monitor `WARNING` threshold. warning_recovery double The monitor `WARNING` recovery threshold. timeout_h int64 The number of hours of the monitor not reporting data before it automatically resolves from a triggered state. The minimum allowed value is 0 hours. The maximum allowed value is 24 hours. variables [ ] List of requests that can be used in the monitor query. **This feature is currently in beta.** Option 1 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `rum,ci_pipelines,ci_tests,audit,events,logs,spans,database_queries,network` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` metric string Metric used for sorting group by results. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. Option 2 object A formula and functions cost query. aggregator enum Aggregation methods for metric queries. Allowed enum values: `avg,sum,max,min,last,area,l2norm,percentile,stddev` data_source [_required_] enum Data source for cost queries. Allowed enum values: `metrics,cloud_cost,datadog_usage` name [_required_] string Name of the query for use in formulas. query [_required_] string The monitor query. Option 3 object A formula and functions data quality query. data_source [_required_] enum Data source for data quality queries. Allowed enum values: `data_quality_metrics` filter [_required_] string Filter expression used to match on data entities. Uses Aastra query syntax. group_by [string] Optional grouping fields for aggregation. measure [_required_] string The data quality measure to query. Common values include: `bytes`, `cardinality`, `custom`, `freshness`, `max`, `mean`, `min`, `nullness`, `percent_negative`, `percent_zero`, `row_count`, `stddev`, `sum`, `uniqueness`. Additional values may be supported. monitor_options object Monitor configuration options for data quality queries. crontab_override string Crontab expression to override the default schedule. custom_sql string Custom SQL query for the monitor. custom_where string Custom WHERE clause for the query. group_by_columns [string] Columns to group results by. model_type_override enum Override for the model type used in anomaly detection. Allowed enum values: `freshness,percentage,any` name [_required_] string Name of the query for use in formulas. schema_version string Schema version for the data quality query. scope string Optional scoping expression to further filter metrics. Uses metrics filter syntax. This is useful when an entity has been configured to emit metrics with additional tags. overall_state enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` priority int64 Integer from 1 (high) to 5 (low) indicating alert severity. query [_required_] string The monitor query. restricted_roles [string] A list of unique role identifiers to define which roles are allowed to edit the monitor. The unique identifiers for all roles can be pulled from the [Roles API](https://docs.datadoghq.com/api/latest/roles/#list-roles) and are located in the `data.id` field. Editing a monitor includes any updates to the monitor configuration, monitor deletion, and muting of the monitor for any amount of time. 
You can use the [Restriction Policies API](https://docs.datadoghq.com/api/latest/restriction-policies/) to manage write authorization for individual monitors by teams and users, in addition to roles. state object Wrapper object with the different monitor states. groups object Dictionary where the keys are groups (comma separated lists of tags) and the values are the list of groups your monitor is broken down on. object Monitor state for a single group. last_nodata_ts int64 Latest timestamp the monitor was in NO_DATA state. last_notified_ts int64 Latest timestamp of the notification sent for this monitor group. last_resolved_ts int64 Latest timestamp the monitor group was resolved. last_triggered_ts int64 Latest timestamp the monitor group triggered. name string The name of the monitor. status enum The different states your monitor can be in. Allowed enum values: `Alert,Ignored,No Data,OK,Skipped,Unknown,Warn` tags [string] Tags associated to your monitor. type [_required_] enum The type of the monitor. For more information about `type`, see the [monitor options](https://docs.datadoghq.com/monitors/guide/monitor_api_options/) docs. Allowed enum values: `composite,event alert,log alert,metric alert,process alert,query alert,rum alert,service check,synthetics alert,trace-analytics alert,slo alert,event-v2 alert,audit alert,ci-pipelines alert,ci-tests alert,error-tracking alert,database-monitoring alert,network-performance alert,cost alert,data-quality alert` ``` { "name": "Example-Monitor", "type": "log alert", "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source\").last(\"5m\") > 2", "message": "some message Notify: @hipchat-channel", "tags": [ "test:examplemonitor", "env:ci" ], "priority": 3, "options": { "enable_logs_sample": true, "escalation_message": "the situation has escalated", "evaluation_delay": 700, "include_tags": true, "locked": false, "new_host_delay": 600, "no_data_timeframe": null, "notify_audit": false, "notify_no_data": false, "on_missing_data": "show_and_notify_no_data", "notification_preset_name": "hide_handles", "renotify_interval": 60, "require_full_window": true, "timeout_h": 24, "thresholds": { "critical": 2, "warning": 1 } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitor-200-v1) * [400](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitor-400-v1) * [403](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitor-403-v1) * [429](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitor-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Expand All Field Type Description No response body ``` {} ``` Copy Invalid JSON * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript)

##### Validate an existing monitor returns "OK" response

```
# Path parameters
export monitor_id="666486743"
# Curl command (use the API endpoint for your Datadog site, for example api.datadoghq.com or api.datadoghq.eu)
curl -X POST "https://api.datadoghq.com/api/v1/monitor/${monitor_id}/validate" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "name": "Example-Monitor",
  "type": "log alert",
  "query": "logs(\"service:foo AND type:error\").index(\"main\").rollup(\"count\").by(\"source\").last(\"5m\") > 2",
  "message": "some message Notify: @hipchat-channel",
  "tags": [
    "test:examplemonitor",
    "env:ci"
  ],
  "priority": 3,
  "options": {
    "enable_logs_sample": true,
    "escalation_message": "the situation has escalated",
    "evaluation_delay": 700,
    "include_tags": true,
    "locked": false,
    "new_host_delay": 600,
    "no_data_timeframe": null,
    "notify_audit": false,
    "notify_no_data": false,
    "on_missing_data": "show_and_notify_no_data",
    "notification_preset_name": "hide_handles",
    "renotify_interval": 60,
    "require_full_window": true,
    "timeout_h": 24,
    "thresholds": {
      "critical": 2,
      "warning": 1
    }
  }
}
EOF
```

##### Validate an existing monitor returns "OK" response

```
// Validate an existing monitor returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"strconv"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	// there is a valid "monitor" in the system
	MonitorID, _ := strconv.ParseInt(os.Getenv("MONITOR_ID"), 10, 64)

	body := datadogV1.Monitor{
		Name:    datadog.PtrString("Example-Monitor"),
		Type:    datadogV1.MONITORTYPE_LOG_ALERT,
		Query:   `logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2`,
		Message: datadog.PtrString("some message Notify: @hipchat-channel"),
		Tags: []string{
			"test:examplemonitor",
			"env:ci",
		},
		Priority: *datadog.NewNullableInt64(datadog.PtrInt64(3)),
		Options: &datadogV1.MonitorOptions{
			EnableLogsSample:       datadog.PtrBool(true),
			EscalationMessage:      datadog.PtrString("the situation has escalated"),
			EvaluationDelay:        *datadog.NewNullableInt64(datadog.PtrInt64(700)),
			IncludeTags:            datadog.PtrBool(true),
			Locked:                 datadog.PtrBool(false),
			NewHostDelay:           *datadog.NewNullableInt64(datadog.PtrInt64(600)),
			NoDataTimeframe:        *datadog.NewNullableInt64(nil),
			NotifyAudit:            datadog.PtrBool(false),
			NotifyNoData:           datadog.PtrBool(false),
			OnMissingData:          datadogV1.ONMISSINGDATAOPTION_SHOW_AND_NOTIFY_NO_DATA.Ptr(),
			NotificationPresetName: datadogV1.MONITOROPTIONSNOTIFICATIONPRESETS_HIDE_HANDLES.Ptr(),
			RenotifyInterval:       *datadog.NewNullableInt64(datadog.PtrInt64(60)),
			RequireFullWindow:      datadog.PtrBool(true),
			TimeoutH:
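			// timeout_h is capped at 24 hours by the API (see the timeout_h option
			// description above); like the other nullable numeric options it is
			// wrapped in NewNullableInt64.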
*datadog.NewNullableInt64(datadog.PtrInt64(24)), Thresholds: &datadogV1.MonitorThresholds{ Critical: datadog.PtrFloat64(2), Warning: *datadog.NewNullableFloat64(datadog.PtrFloat64(1)), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewMonitorsApi(apiClient) resp, r, err := api.ValidateExistingMonitor(ctx, MonitorID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ValidateExistingMonitor`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ValidateExistingMonitor`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate an existing monitor returns "OK" response ``` // Validate an existing monitor returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.MonitorsApi; import com.datadog.api.client.v1.model.Monitor; import com.datadog.api.client.v1.model.MonitorOptions; import com.datadog.api.client.v1.model.MonitorOptionsNotificationPresets; import com.datadog.api.client.v1.model.MonitorThresholds; import com.datadog.api.client.v1.model.MonitorType; import com.datadog.api.client.v1.model.OnMissingDataOption; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor" in the system Long MONITOR_ID = Long.parseLong(System.getenv("MONITOR_ID")); Monitor body = new Monitor() .name("Example-Monitor") .type(MonitorType.LOG_ALERT) .query( """ logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2 """) .message("some message Notify: @hipchat-channel") .tags(Arrays.asList("test:examplemonitor", "env:ci")) .priority(3L) .options( new MonitorOptions() .enableLogsSample(true) .escalationMessage("the situation has escalated") .evaluationDelay(700L) .includeTags(true) .locked(false) .newHostDelay(600L) .noDataTimeframe(null) .notifyAudit(false) .notifyNoData(false) .onMissingData(OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA) .notificationPresetName(MonitorOptionsNotificationPresets.HIDE_HANDLES) .renotifyInterval(60L) .requireFullWindow(true) .timeoutH(24L) .thresholds(new MonitorThresholds().critical(2.0).warning(1.0))); try { apiInstance.validateExistingMonitor(MONITOR_ID, body); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#validateExistingMonitor"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
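# DD_SITE selects your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.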
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate an existing monitor returns "OK" response ``` """ Validate an existing monitor returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.monitors_api import MonitorsApi from datadog_api_client.v1.model.monitor import Monitor from datadog_api_client.v1.model.monitor_options import MonitorOptions from datadog_api_client.v1.model.monitor_options_notification_presets import MonitorOptionsNotificationPresets from datadog_api_client.v1.model.monitor_thresholds import MonitorThresholds from datadog_api_client.v1.model.monitor_type import MonitorType from datadog_api_client.v1.model.on_missing_data_option import OnMissingDataOption # there is a valid "monitor" in the system MONITOR_ID = environ["MONITOR_ID"] body = Monitor( name="Example-Monitor", type=MonitorType.LOG_ALERT, query='logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2', message="some message Notify: @hipchat-channel", tags=[ "test:examplemonitor", "env:ci", ], priority=3, options=MonitorOptions( enable_logs_sample=True, escalation_message="the situation has escalated", evaluation_delay=700, include_tags=True, locked=False, new_host_delay=600, no_data_timeframe=None, notify_audit=False, notify_no_data=False, on_missing_data=OnMissingDataOption.SHOW_AND_NOTIFY_NO_DATA, notification_preset_name=MonitorOptionsNotificationPresets.HIDE_HANDLES, renotify_interval=60, require_full_window=True, timeout_h=24, thresholds=MonitorThresholds( critical=2.0, warning=1.0, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.validate_existing_monitor(monitor_id=int(MONITOR_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate an existing monitor returns "OK" response ``` # Validate an existing monitor returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::MonitorsAPI.new # there is a valid "monitor" in the system MONITOR_ID = ENV["MONITOR_ID"] body = DatadogAPIClient::V1::Monitor.new({ name: "Example-Monitor", type: DatadogAPIClient::V1::MonitorType::LOG_ALERT, query: 'logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2', message: "some message Notify: @hipchat-channel", tags: [ "test:examplemonitor", "env:ci", ], priority: 3, options: DatadogAPIClient::V1::MonitorOptions.new({ enable_logs_sample: true, escalation_message: "the situation has escalated", evaluation_delay: 700, include_tags: true, locked: false, new_host_delay: 600, no_data_timeframe: nil, notify_audit: false, notify_no_data: false, on_missing_data: DatadogAPIClient::V1::OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA, notification_preset_name: DatadogAPIClient::V1::MonitorOptionsNotificationPresets::HIDE_HANDLES, renotify_interval: 60, require_full_window: true, timeout_h: 24, thresholds: DatadogAPIClient::V1::MonitorThresholds.new({ critical: 2, warning: 1, }), 
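    # MONITOR_ID comes from the environment and is converted with to_i below,
    # because the {monitor_id} path parameter must be an integer.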
}), }) p api_instance.validate_existing_monitor(MONITOR_ID.to_i, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate an existing monitor returns "OK" response ``` // Validate an existing monitor returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_monitors::MonitorsAPI; use datadog_api_client::datadogV1::model::Monitor; use datadog_api_client::datadogV1::model::MonitorOptions; use datadog_api_client::datadogV1::model::MonitorOptionsNotificationPresets; use datadog_api_client::datadogV1::model::MonitorThresholds; use datadog_api_client::datadogV1::model::MonitorType; use datadog_api_client::datadogV1::model::OnMissingDataOption; #[tokio::main] async fn main() { // there is a valid "monitor" in the system let monitor_id: i64 = std::env::var("MONITOR_ID").unwrap().parse().unwrap(); let body = Monitor::new( r#"logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2"#.to_string(), MonitorType::LOG_ALERT, ) .message("some message Notify: @hipchat-channel".to_string()) .name("Example-Monitor".to_string()) .options( MonitorOptions::new() .enable_logs_sample(true) .escalation_message("the situation has escalated".to_string()) .evaluation_delay(Some(700)) .include_tags(true) .locked(false) .new_host_delay(Some(600)) .no_data_timeframe(None) .notification_preset_name(MonitorOptionsNotificationPresets::HIDE_HANDLES) .notify_audit(false) .notify_no_data(false) .on_missing_data(OnMissingDataOption::SHOW_AND_NOTIFY_NO_DATA) .renotify_interval(Some(60)) .require_full_window(true) .thresholds(MonitorThresholds::new().critical(2.0 as f64).warning(Some(1.0 as f64))) .timeout_h(Some(24)), ) .priority(Some(3)) .tags(vec!["test:examplemonitor".to_string(), "env:ci".to_string()]); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .validate_existing_monitor(monitor_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate an existing monitor returns "OK" response ``` /** * Validate an existing monitor returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.MonitorsApi(configuration); // there is a valid "monitor" in the system const MONITOR_ID = parseInt(process.env.MONITOR_ID as string); const params: v1.MonitorsApiValidateExistingMonitorRequest = { body: { name: "Example-Monitor", type: "log alert", query: `logs("service:foo AND type:error").index("main").rollup("count").by("source").last("5m") > 2`, message: "some message Notify: @hipchat-channel", tags: ["test:examplemonitor", "env:ci"], priority: 3, options: { enableLogsSample: true, escalationMessage: "the situation has escalated", 
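      // noDataTimeframe is left undefined further down, whereas the curl example
      // sends an explicit null; monitorId (set from MONITOR_ID above) supplies the
      // {monitor_id} path parameter.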
evaluationDelay: 700, includeTags: true, locked: false, newHostDelay: 600, noDataTimeframe: undefined, notifyAudit: false, notifyNoData: false, onMissingData: "show_and_notify_no_data", notificationPresetName: "hide_handles", renotifyInterval: 60, requireFullWindow: true, timeoutH: 24, thresholds: { critical: 2, warning: 1, }, }, }, monitorId: MONITOR_ID, }; apiInstance .validateExistingMonitor(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy-v2) GET https://api.ap1.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.eu/api/v2/monitor/policy/{policy_id}https://api.ddog-gov.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/monitor/policy/{policy_id} ### Overview Get a monitor configuration policy by `policy_id`. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string ID of the monitor configuration policy. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorConfigPolicy-200-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorConfigPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorConfigPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorConfigPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving a monitor configuration policy. Field Type Description data object A monitor configuration policy data. attributes object Policy and policy type for a monitor configuration policy. policy Configuration for the policy. Option 1 object Tag attributes of a monitor configuration policy. tag_key string The key of the tag. tag_key_required boolean If a tag key is required for monitor creation. valid_tag_values [string] Valid values for the tag. policy_type enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` id string ID of this monitor configuration policy. type enum Monitor configuration policy resource type. 
Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": { "attributes": { "policy": { "tag_key": "datacenter", "tag_key_required": true, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get a monitor configuration policy Copy ``` # Path parameters export policy_id="00000000-0000-1234-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/policy/${policy_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a monitor configuration policy ``` """ Get a monitor configuration policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = environ["MONITOR_CONFIGURATION_POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.get_monitor_config_policy( policy_id=MONITOR_CONFIGURATION_POLICY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a monitor configuration policy ``` # Get a monitor configuration policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = ENV["MONITOR_CONFIGURATION_POLICY_DATA_ID"] p 
api_instance.get_monitor_config_policy(MONITOR_CONFIGURATION_POLICY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a monitor configuration policy ``` // Get a monitor configuration policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_configuration_policy" in the system MonitorConfigurationPolicyDataID := os.Getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.GetMonitorConfigPolicy(ctx, MonitorConfigurationPolicyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.GetMonitorConfigPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.GetMonitorConfigPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a monitor configuration policy ``` // Get a monitor configuration policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorConfigPolicyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_configuration_policy" in the system String MONITOR_CONFIGURATION_POLICY_DATA_ID = System.getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID"); try { MonitorConfigPolicyResponse result = apiInstance.getMonitorConfigPolicy(MONITOR_CONFIGURATION_POLICY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#getMonitorConfigPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a monitor configuration policy ``` // Get a monitor configuration policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] 
async fn main() { // there is a valid "monitor_configuration_policy" in the system let monitor_configuration_policy_data_id = std::env::var("MONITOR_CONFIGURATION_POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .get_monitor_config_policy(monitor_configuration_policy_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a monitor configuration policy ``` /** * Get a monitor configuration policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_configuration_policy" in the system const MONITOR_CONFIGURATION_POLICY_DATA_ID = process.env .MONITOR_CONFIGURATION_POLICY_DATA_ID as string; const params: v2.MonitorsApiGetMonitorConfigPolicyRequest = { policyId: MONITOR_CONFIGURATION_POLICY_DATA_ID, }; apiInstance .getMonitorConfigPolicy(params) .then((data: v2.MonitorConfigPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies-v2) GET https://api.ap1.datadoghq.com/api/v2/monitor/policyhttps://api.ap2.datadoghq.com/api/v2/monitor/policyhttps://api.datadoghq.eu/api/v2/monitor/policyhttps://api.ddog-gov.com/api/v2/monitor/policyhttps://api.datadoghq.com/api/v2/monitor/policyhttps://api.us3.datadoghq.com/api/v2/monitor/policyhttps://api.us5.datadoghq.com/api/v2/monitor/policy ### Overview Get all monitor configuration policies. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#ListMonitorConfigPolicies-200-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#ListMonitorConfigPolicies-403-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#ListMonitorConfigPolicies-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving all monitor configuration policies. Field Type Description data [object] An array of monitor configuration policies. attributes object Policy and policy type for a monitor configuration policy. policy Configuration for the policy. 
Option 1 object Tag attributes of a monitor configuration policy. tag_key string The key of the tag. tag_key_required boolean If a tag key is required for monitor creation. valid_tag_values [string] Valid values for the tag. policy_type enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` id string ID of this monitor configuration policy. type enum Monitor configuration policy resource type. Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": [ { "attributes": { "policy": { "tag_key": "datacenter", "tag_key_required": true, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get all monitor configuration policies Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/policy" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all monitor configuration policies ``` """ Get all monitor configuration policies returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.list_monitor_config_policies() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all monitor configuration policies ``` # Get all monitor configuration policies returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new p api_instance.list_monitor_config_policies() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all monitor configuration policies ``` // Get all monitor configuration policies returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.ListMonitorConfigPolicies(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ListMonitorConfigPolicies`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ListMonitorConfigPolicies`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all monitor configuration policies ``` // Get all monitor configuration policies returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorConfigPolicyListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { MonitorConfigPolicyListResponse result = apiInstance.listMonitorConfigPolicies(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#listMonitorConfigPolicies"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all monitor configuration policies ``` // Get all monitor configuration policies returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.list_monitor_config_policies().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" cargo run ``` ##### Get all monitor configuration policies ``` /** * Get all monitor configuration policies returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); apiInstance .listMonitorConfigPolicies() .then((data: v2.MonitorConfigPolicyListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-configuration-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-configuration-policy-v2) POST https://api.ap1.datadoghq.com/api/v2/monitor/policyhttps://api.ap2.datadoghq.com/api/v2/monitor/policyhttps://api.datadoghq.eu/api/v2/monitor/policyhttps://api.ddog-gov.com/api/v2/monitor/policyhttps://api.datadoghq.com/api/v2/monitor/policyhttps://api.us3.datadoghq.com/api/v2/monitor/policyhttps://api.us5.datadoghq.com/api/v2/monitor/policy ### Overview Create a monitor configuration policy. This endpoint requires the `monitor_config_policy_write` permission. ### Request #### Body Data (required) Create a monitor configuration policy request body. * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object A monitor configuration policy data. attributes [_required_] object Policy and policy type for a monitor configuration policy. policy [_required_] Configuration for the policy. Option 1 object Tag attributes of a monitor configuration policy. tag_key [_required_] string The key of the tag. tag_key_required [_required_] boolean If a tag key is required for monitor creation. valid_tag_values [_required_] [string] Valid values for the tag. policy_type [_required_] enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` type [_required_] enum Monitor configuration policy resource type. Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": { "attributes": { "policy_type": "tag", "policy": { "tag_key": "examplemonitor", "tag_key_required": false, "valid_tag_values": [ "prod", "staging" ] } }, "type": "monitor-config-policy" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorConfigPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorConfigPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorConfigPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorConfigPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving a monitor configuration policy. Field Type Description data object A monitor configuration policy data. attributes object Policy and policy type for a monitor configuration policy. policy Configuration for the policy. 
Option 1 object Tag attributes of a monitor configuration policy. tag_key string The key of the tag. tag_key_required boolean If a tag key is required for monitor creation. valid_tag_values [string] Valid values for the tag. policy_type enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` id string ID of this monitor configuration policy. type enum Monitor configuration policy resource type. Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": { "attributes": { "policy": { "tag_key": "datacenter", "tag_key_required": true, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Create a monitor configuration policy returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/policy" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "policy_type": "tag", "policy": { "tag_key": "examplemonitor", "tag_key_required": false, "valid_tag_values": [ "prod", "staging" ] } }, "type": "monitor-config-policy" } } EOF ``` ##### Create a monitor configuration policy returns "OK" response ``` // Create a monitor configuration policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorConfigPolicyCreateRequest{ Data: datadogV2.MonitorConfigPolicyCreateData{ Attributes: datadogV2.MonitorConfigPolicyAttributeCreateRequest{ PolicyType: datadogV2.MONITORCONFIGPOLICYTYPE_TAG, Policy: datadogV2.MonitorConfigPolicyPolicyCreateRequest{ MonitorConfigPolicyTagPolicyCreateRequest: &datadogV2.MonitorConfigPolicyTagPolicyCreateRequest{ TagKey: "examplemonitor", 
TagKeyRequired: false, ValidTagValues: []string{ "prod", "staging", }, }}, }, Type: datadogV2.MONITORCONFIGPOLICYRESOURCETYPE_MONITOR_CONFIG_POLICY, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitorConfigPolicy(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitorConfigPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitorConfigPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a monitor configuration policy returns "OK" response ``` // Create a monitor configuration policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorConfigPolicyAttributeCreateRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyCreateData; import com.datadog.api.client.v2.model.MonitorConfigPolicyCreateRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyPolicyCreateRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyResourceType; import com.datadog.api.client.v2.model.MonitorConfigPolicyResponse; import com.datadog.api.client.v2.model.MonitorConfigPolicyTagPolicyCreateRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorConfigPolicyCreateRequest body = new MonitorConfigPolicyCreateRequest() .data( new MonitorConfigPolicyCreateData() .attributes( new MonitorConfigPolicyAttributeCreateRequest() .policyType(MonitorConfigPolicyType.TAG) .policy( new MonitorConfigPolicyPolicyCreateRequest( new MonitorConfigPolicyTagPolicyCreateRequest() .tagKey("examplemonitor") .tagKeyRequired(false) .validTagValues(Arrays.asList("prod", "staging"))))) .type(MonitorConfigPolicyResourceType.MONITOR_CONFIG_POLICY)); try { MonitorConfigPolicyResponse result = apiInstance.createMonitorConfigPolicy(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitorConfigPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a monitor configuration policy returns "OK" response ``` """ Create a monitor 
configuration policy returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_config_policy_attribute_create_request import ( MonitorConfigPolicyAttributeCreateRequest, ) from datadog_api_client.v2.model.monitor_config_policy_create_data import MonitorConfigPolicyCreateData from datadog_api_client.v2.model.monitor_config_policy_create_request import MonitorConfigPolicyCreateRequest from datadog_api_client.v2.model.monitor_config_policy_resource_type import MonitorConfigPolicyResourceType from datadog_api_client.v2.model.monitor_config_policy_tag_policy_create_request import ( MonitorConfigPolicyTagPolicyCreateRequest, ) from datadog_api_client.v2.model.monitor_config_policy_type import MonitorConfigPolicyType body = MonitorConfigPolicyCreateRequest( data=MonitorConfigPolicyCreateData( attributes=MonitorConfigPolicyAttributeCreateRequest( policy_type=MonitorConfigPolicyType.TAG, policy=MonitorConfigPolicyTagPolicyCreateRequest( tag_key="examplemonitor", tag_key_required=False, valid_tag_values=[ "prod", "staging", ], ), ), type=MonitorConfigPolicyResourceType.MONITOR_CONFIG_POLICY, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor_config_policy(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a monitor configuration policy returns "OK" response ``` # Create a monitor configuration policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorConfigPolicyCreateRequest.new({ data: DatadogAPIClient::V2::MonitorConfigPolicyCreateData.new({ attributes: DatadogAPIClient::V2::MonitorConfigPolicyAttributeCreateRequest.new({ policy_type: DatadogAPIClient::V2::MonitorConfigPolicyType::TAG, policy: DatadogAPIClient::V2::MonitorConfigPolicyTagPolicyCreateRequest.new({ tag_key: "examplemonitor", tag_key_required: false, valid_tag_values: [ "prod", "staging", ], }), }), type: DatadogAPIClient::V2::MonitorConfigPolicyResourceType::MONITOR_CONFIG_POLICY, }), }) p api_instance.create_monitor_config_policy(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a monitor configuration policy returns "OK" response ``` // Create a monitor configuration policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorConfigPolicyAttributeCreateRequest; use datadog_api_client::datadogV2::model::MonitorConfigPolicyCreateData; use datadog_api_client::datadogV2::model::MonitorConfigPolicyCreateRequest; use datadog_api_client::datadogV2::model::MonitorConfigPolicyPolicyCreateRequest; use 
datadog_api_client::datadogV2::model::MonitorConfigPolicyResourceType; use datadog_api_client::datadogV2::model::MonitorConfigPolicyTagPolicyCreateRequest; use datadog_api_client::datadogV2::model::MonitorConfigPolicyType; #[tokio::main] async fn main() { let body = MonitorConfigPolicyCreateRequest::new(MonitorConfigPolicyCreateData::new( MonitorConfigPolicyAttributeCreateRequest::new( MonitorConfigPolicyPolicyCreateRequest::MonitorConfigPolicyTagPolicyCreateRequest( Box::new(MonitorConfigPolicyTagPolicyCreateRequest::new( "examplemonitor".to_string(), false, vec!["prod".to_string(), "staging".to_string()], )), ), MonitorConfigPolicyType::TAG, ), MonitorConfigPolicyResourceType::MONITOR_CONFIG_POLICY, )); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor_config_policy(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a monitor configuration policy returns "OK" response ``` /** * Create a monitor configuration policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiCreateMonitorConfigPolicyRequest = { body: { data: { attributes: { policyType: "tag", policy: { tagKey: "examplemonitor", tagKeyRequired: false, validTagValues: ["prod", "staging"], }, }, type: "monitor-config-policy", }, }, }; apiInstance .createMonitorConfigPolicy(params) .then((data: v2.MonitorConfigPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor-configuration-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor-configuration-policy-v2) PATCH https://api.ap1.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.eu/api/v2/monitor/policy/{policy_id}https://api.ddog-gov.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/monitor/policy/{policy_id} ### Overview Edit a monitor configuration policy. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string ID of the monitor configuration policy. ### Request #### Body Data (required) Description of the update. 
* [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object A monitor configuration policy data. attributes [_required_] object Policy and policy type for a monitor configuration policy. policy [_required_] Configuration for the policy. Option 1 object Tag attributes of a monitor configuration policy. tag_key string The key of the tag. tag_key_required boolean If a tag key is required for monitor creation. valid_tag_values [string] Valid values for the tag. policy_type [_required_] enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` id [_required_] string ID of this monitor configuration policy. type [_required_] enum Monitor configuration policy resource type. Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": { "attributes": { "policy": { "tag_key": "examplemonitor", "tag_key_required": false, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorConfigPolicy-200-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorConfigPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorConfigPolicy-404-v2) * [422](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorConfigPolicy-422-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorConfigPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving a monitor configuration policy. Field Type Description data object A monitor configuration policy data. attributes object Policy and policy type for a monitor configuration policy. policy Configuration for the policy. Option 1 object Tag attributes of a monitor configuration policy. tag_key string The key of the tag. tag_key_required boolean If a tag key is required for monitor creation. valid_tag_values [string] Valid values for the tag. policy_type enum The monitor configuration policy type. Allowed enum values: `tag` default: `tag` id string ID of this monitor configuration policy. type enum Monitor configuration policy resource type. Allowed enum values: `monitor-config-policy` default: `monitor-config-policy` ``` { "data": { "attributes": { "policy": { "tag_key": "datacenter", "tag_key_required": true, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Edit a monitor configuration policy returns "OK" response Copy ``` # Path parameters export policy_id="00000000-0000-1234-0000-000000000000" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/policy/${policy_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "policy": { "tag_key": "examplemonitor", "tag_key_required": false, "valid_tag_values": [ "prod", "staging" ] }, "policy_type": "tag" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-config-policy" } } EOF ``` ##### Edit a monitor configuration policy returns "OK" response ``` // Edit a monitor configuration policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_configuration_policy" in the system MonitorConfigurationPolicyDataID := os.Getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID") body := datadogV2.MonitorConfigPolicyEditRequest{ Data: datadogV2.MonitorConfigPolicyEditData{ Attributes: datadogV2.MonitorConfigPolicyAttributeEditRequest{ Policy: datadogV2.MonitorConfigPolicyPolicy{ MonitorConfigPolicyTagPolicy: &datadogV2.MonitorConfigPolicyTagPolicy{ TagKey: datadog.PtrString("examplemonitor"), TagKeyRequired: datadog.PtrBool(false), ValidTagValues: []string{ "prod", "staging", }, }}, PolicyType: datadogV2.MONITORCONFIGPOLICYTYPE_TAG, }, Id: MonitorConfigurationPolicyDataID, Type: datadogV2.MONITORCONFIGPOLICYRESOURCETYPE_MONITOR_CONFIG_POLICY, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitorConfigPolicy(ctx, MonitorConfigurationPolicyDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitorConfigPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitorConfigPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a monitor configuration policy returns "OK" response ``` // Edit a monitor configuration policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorConfigPolicyAttributeEditRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyEditData; import com.datadog.api.client.v2.model.MonitorConfigPolicyEditRequest; import com.datadog.api.client.v2.model.MonitorConfigPolicyPolicy; import com.datadog.api.client.v2.model.MonitorConfigPolicyResourceType; import com.datadog.api.client.v2.model.MonitorConfigPolicyResponse; import com.datadog.api.client.v2.model.MonitorConfigPolicyTagPolicy; import com.datadog.api.client.v2.model.MonitorConfigPolicyType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_configuration_policy" in the system String MONITOR_CONFIGURATION_POLICY_DATA_ID = System.getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID"); MonitorConfigPolicyEditRequest body = new MonitorConfigPolicyEditRequest() .data( new MonitorConfigPolicyEditData() .attributes( new MonitorConfigPolicyAttributeEditRequest() .policy( new MonitorConfigPolicyPolicy( new MonitorConfigPolicyTagPolicy() .tagKey("examplemonitor") .tagKeyRequired(false) .validTagValues(Arrays.asList("prod", "staging")))) .policyType(MonitorConfigPolicyType.TAG)) .id(MONITOR_CONFIGURATION_POLICY_DATA_ID) .type(MonitorConfigPolicyResourceType.MONITOR_CONFIG_POLICY)); try { MonitorConfigPolicyResponse result = apiInstance.updateMonitorConfigPolicy(MONITOR_CONFIGURATION_POLICY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#updateMonitorConfigPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a monitor configuration policy returns "OK" response ``` """ Edit a monitor configuration policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_config_policy_attribute_edit_request import ( MonitorConfigPolicyAttributeEditRequest, ) from datadog_api_client.v2.model.monitor_config_policy_edit_data import MonitorConfigPolicyEditData from datadog_api_client.v2.model.monitor_config_policy_edit_request import MonitorConfigPolicyEditRequest from datadog_api_client.v2.model.monitor_config_policy_resource_type import MonitorConfigPolicyResourceType from datadog_api_client.v2.model.monitor_config_policy_tag_policy import MonitorConfigPolicyTagPolicy from 
datadog_api_client.v2.model.monitor_config_policy_type import MonitorConfigPolicyType # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = environ["MONITOR_CONFIGURATION_POLICY_DATA_ID"] body = MonitorConfigPolicyEditRequest( data=MonitorConfigPolicyEditData( attributes=MonitorConfigPolicyAttributeEditRequest( policy=MonitorConfigPolicyTagPolicy( tag_key="examplemonitor", tag_key_required=False, valid_tag_values=[ "prod", "staging", ], ), policy_type=MonitorConfigPolicyType.TAG, ), id=MONITOR_CONFIGURATION_POLICY_DATA_ID, type=MonitorConfigPolicyResourceType.MONITOR_CONFIG_POLICY, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor_config_policy(policy_id=MONITOR_CONFIGURATION_POLICY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a monitor configuration policy returns "OK" response ``` # Edit a monitor configuration policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = ENV["MONITOR_CONFIGURATION_POLICY_DATA_ID"] body = DatadogAPIClient::V2::MonitorConfigPolicyEditRequest.new({ data: DatadogAPIClient::V2::MonitorConfigPolicyEditData.new({ attributes: DatadogAPIClient::V2::MonitorConfigPolicyAttributeEditRequest.new({ policy: DatadogAPIClient::V2::MonitorConfigPolicyTagPolicy.new({ tag_key: "examplemonitor", tag_key_required: false, valid_tag_values: [ "prod", "staging", ], }), policy_type: DatadogAPIClient::V2::MonitorConfigPolicyType::TAG, }), id: MONITOR_CONFIGURATION_POLICY_DATA_ID, type: DatadogAPIClient::V2::MonitorConfigPolicyResourceType::MONITOR_CONFIG_POLICY, }), }) p api_instance.update_monitor_config_policy(MONITOR_CONFIGURATION_POLICY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a monitor configuration policy returns "OK" response ``` // Edit a monitor configuration policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorConfigPolicyAttributeEditRequest; use datadog_api_client::datadogV2::model::MonitorConfigPolicyEditData; use datadog_api_client::datadogV2::model::MonitorConfigPolicyEditRequest; use datadog_api_client::datadogV2::model::MonitorConfigPolicyPolicy; use datadog_api_client::datadogV2::model::MonitorConfigPolicyResourceType; use datadog_api_client::datadogV2::model::MonitorConfigPolicyTagPolicy; use datadog_api_client::datadogV2::model::MonitorConfigPolicyType; #[tokio::main] async fn main() { // there is a valid "monitor_configuration_policy" in the system let monitor_configuration_policy_data_id = 
std::env::var("MONITOR_CONFIGURATION_POLICY_DATA_ID").unwrap(); let body = MonitorConfigPolicyEditRequest::new(MonitorConfigPolicyEditData::new( MonitorConfigPolicyAttributeEditRequest::new( MonitorConfigPolicyPolicy::MonitorConfigPolicyTagPolicy(Box::new( MonitorConfigPolicyTagPolicy::new() .tag_key("examplemonitor".to_string()) .tag_key_required(false) .valid_tag_values(vec!["prod".to_string(), "staging".to_string()]), )), MonitorConfigPolicyType::TAG, ), monitor_configuration_policy_data_id.clone(), MonitorConfigPolicyResourceType::MONITOR_CONFIG_POLICY, )); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .update_monitor_config_policy(monitor_configuration_policy_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a monitor configuration policy returns "OK" response ``` /** * Edit a monitor configuration policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_configuration_policy" in the system const MONITOR_CONFIGURATION_POLICY_DATA_ID = process.env .MONITOR_CONFIGURATION_POLICY_DATA_ID as string; const params: v2.MonitorsApiUpdateMonitorConfigPolicyRequest = { body: { data: { attributes: { policy: { tagKey: "examplemonitor", tagKeyRequired: false, validTagValues: ["prod", "staging"], }, policyType: "tag", }, id: MONITOR_CONFIGURATION_POLICY_DATA_ID, type: "monitor-config-policy", }, }, policyId: MONITOR_CONFIGURATION_POLICY_DATA_ID, }; apiInstance .updateMonitorConfigPolicy(params) .then((data: v2.MonitorConfigPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-configuration-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-configuration-policy-v2) DELETE https://api.ap1.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.ap2.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.eu/api/v2/monitor/policy/{policy_id}https://api.ddog-gov.com/api/v2/monitor/policy/{policy_id}https://api.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us3.datadoghq.com/api/v2/monitor/policy/{policy_id}https://api.us5.datadoghq.com/api/v2/monitor/policy/{policy_id} ### Overview Delete a monitor configuration policy. This endpoint requires the `monitor_config_policy_write` permission. 
### Arguments #### Path Parameters Name Type Description policy_id [_required_] string ID of the monitor configuration policy. ### Response * [204](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorConfigPolicy-204-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorConfigPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorConfigPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorConfigPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorConfigPolicy-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Delete a monitor configuration policy Copy ``` # Path parameters export policy_id="00000000-0000-1234-0000-000000000000" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/policy/${policy_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a monitor configuration policy ``` """ Delete a monitor configuration policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = environ["MONITOR_CONFIGURATION_POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) api_instance.delete_monitor_config_policy( policy_id=MONITOR_CONFIGURATION_POLICY_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a monitor configuration policy ``` # Delete a monitor configuration policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_configuration_policy" in the system MONITOR_CONFIGURATION_POLICY_DATA_ID = ENV["MONITOR_CONFIGURATION_POLICY_DATA_ID"] api_instance.delete_monitor_config_policy(MONITOR_CONFIGURATION_POLICY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a monitor configuration policy ``` // Delete a monitor configuration policy returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_configuration_policy" in the system MonitorConfigurationPolicyDataID := os.Getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) r, err := api.DeleteMonitorConfigPolicy(ctx, MonitorConfigurationPolicyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.DeleteMonitorConfigPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a monitor configuration policy ``` // Delete a monitor configuration policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_configuration_policy" in the system String MONITOR_CONFIGURATION_POLICY_DATA_ID = System.getenv("MONITOR_CONFIGURATION_POLICY_DATA_ID"); try { apiInstance.deleteMonitorConfigPolicy(MONITOR_CONFIGURATION_POLICY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#deleteMonitorConfigPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java 
"Example.java" ``` ##### Delete a monitor configuration policy ``` // Delete a monitor configuration policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor_configuration_policy" in the system let monitor_configuration_policy_data_id = std::env::var("MONITOR_CONFIGURATION_POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .delete_monitor_config_policy(monitor_configuration_policy_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a monitor configuration policy ``` /** * Delete a monitor configuration policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_configuration_policy" in the system const MONITOR_CONFIGURATION_POLICY_DATA_ID = process.env .MONITOR_CONFIGURATION_POLICY_DATA_ID as string; const params: v2.MonitorsApiDeleteMonitorConfigPolicyRequest = { policyId: MONITOR_CONFIGURATION_POLICY_DATA_ID, }; apiInstance .deleteMonitorConfigPolicy(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.ap2.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.datadoghq.eu/api/v2/monitor/notification_rule/{rule_id}https://api.ddog-gov.com/api/v2/monitor/notification_rule/{rule_id}https://api.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.us3.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} ### Overview Returns a monitor notification rule by `rule_id`. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string ID of the monitor notification rule to fetch. #### Query Strings Name Type Description include string Comma-separated list of resource paths for related resources to include in the response. Supported resource path is `created_by`. 
### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) A monitor notification rule. Field Type Description data object Monitor notification rule data. attributes object Attributes of the monitor notification rule. conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applied. fallback_recipients [string] If none of the `conditions` applied, `fallback_recipients` will get notified. created date-time Creation time of the monitor notification rule. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. modified date-time Time the monitor notification rule was last modified. name string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. id string The ID of the monitor notification rule. relationships object All relationships associated with monitor notification rule. created_by object The user who created the monitor notification rule. data object Data for the user who created the monitor notification rule. id string User ID of the monitor notification rule creator. type enum Users resource type. Allowed enum values: `users` default: `users` type enum Monitor notification rule resource type. Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` included [ ] Array of objects related to the monitor notification rule that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. 
org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "conditional_recipients": { "conditions": [ { "recipients": [ "slack-test-channel", "jira-test" ], "scope": "transition_type:alert" } ], "fallback_recipients": [ "slack-test-channel", "jira-test" ] }, "created": "2020-01-02T03:04:00.000Z", "filter": { "tags": [ "team:product", "host:abc" ] }, "modified": "2020-01-02T03:04:00.000Z", "name": "A notification rule name", "recipients": [ "slack-test-channel", "jira-test" ] }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } } }, "type": "monitor-notification-rule" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get a monitor notification rule Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/${rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a monitor notification rule ``` """ Get a monitor notification rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = environ["MONITOR_NOTIFICATION_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.get_monitor_notification_rule( rule_id=MONITOR_NOTIFICATION_RULE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a monitor notification rule ``` # Get a monitor notification rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = ENV["MONITOR_NOTIFICATION_RULE_DATA_ID"] p api_instance.get_monitor_notification_rule(MONITOR_NOTIFICATION_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a monitor notification rule ``` // Get a monitor notification rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_notification_rule" in the system MonitorNotificationRuleDataID := os.Getenv("MONITOR_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.GetMonitorNotificationRule(ctx, MonitorNotificationRuleDataID, 
*datadogV2.NewGetMonitorNotificationRuleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.GetMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.GetMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a monitor notification rule ``` // Get a monitor notification rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_notification_rule" in the system String MONITOR_NOTIFICATION_RULE_DATA_ID = System.getenv("MONITOR_NOTIFICATION_RULE_DATA_ID"); try { MonitorNotificationRuleResponse result = apiInstance.getMonitorNotificationRule(MONITOR_NOTIFICATION_RULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#getMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a monitor notification rule ``` // Get a monitor notification rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::GetMonitorNotificationRuleOptionalParams; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor_notification_rule" in the system let monitor_notification_rule_data_id = std::env::var("MONITOR_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .get_monitor_notification_rule( monitor_notification_rule_data_id.clone(), GetMonitorNotificationRuleOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a monitor notification rule ``` /** * Get a monitor notification rule returns "OK" response */ import 
{ client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_notification_rule" in the system const MONITOR_NOTIFICATION_RULE_DATA_ID = process.env .MONITOR_NOTIFICATION_RULE_DATA_ID as string; const params: v2.MonitorsApiGetMonitorNotificationRuleRequest = { ruleId: MONITOR_NOTIFICATION_RULE_DATA_ID, }; apiInstance .getMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/monitor/notification_rulehttps://api.ap2.datadoghq.com/api/v2/monitor/notification_rulehttps://api.datadoghq.eu/api/v2/monitor/notification_rulehttps://api.ddog-gov.com/api/v2/monitor/notification_rulehttps://api.datadoghq.com/api/v2/monitor/notification_rulehttps://api.us3.datadoghq.com/api/v2/monitor/notification_rulehttps://api.us5.datadoghq.com/api/v2/monitor/notification_rule ### Overview Returns a list of all monitor notification rules. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Query Strings Name Type Description page integer The page to start paginating from. If `page` is not specified, the argument defaults to the first page. per_page integer The number of rules to return per page. If `per_page` is not specified, the argument defaults to 100. sort string String for sort order, composed of field and sort order separated by a colon, for example `name:asc`. Supported sort directions: `asc`, `desc`. Supported fields: `name`, `created_at`. filters string JSON-encoded filter object. Supported keys: * `text`: Free-text query matched against rule name, tags, and recipients. * `tags`: Array of strings. Return rules that have any of these tags. * `recipients`: Array of strings. Return rules that have any of these recipients. include string Comma-separated list of resource paths for related resources to include in the response. Supported resource path is `created_by`. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorNotificationRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving all monitor notification rules. Field Type Description data [object] A list of monitor notification rules. attributes object Attributes of the monitor notification rule. 
conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applied. fallback_recipients [string] If none of the `conditions` applied, `fallback_recipients` will get notified. created date-time Creation time of the monitor notification rule. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. modified date-time Time the monitor notification rule was last modified. name string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. id string The ID of the monitor notification rule. relationships object All relationships associated with monitor notification rule. created_by object The user who created the monitor notification rule. data object Data for the user who created the monitor notification rule. id string User ID of the monitor notification rule creator. type enum Users resource type. Allowed enum values: `users` default: `users` type enum Monitor notification rule resource type. Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` included [ ] Array of objects related to the monitor notification rules. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. 
type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": [ { "attributes": { "conditional_recipients": { "conditions": [ { "recipients": [ "slack-test-channel", "jira-test" ], "scope": "transition_type:alert" } ], "fallback_recipients": [ "slack-test-channel", "jira-test" ] }, "created": "2020-01-02T03:04:00.000Z", "filter": { "tags": [ "team:product", "host:abc" ] }, "modified": "2020-01-02T03:04:00.000Z", "name": "A notification rule name", "recipients": [ "slack-test-channel", "jira-test" ] }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } } }, "type": "monitor-notification-rule" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get all monitor notification rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all monitor notification rules ``` """ Get all monitor notification rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.get_monitor_notification_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all monitor notification rules ``` # Get all monitor notification rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new p api_instance.get_monitor_notification_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all monitor notification rules ``` // Get all monitor notification rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.GetMonitorNotificationRules(ctx, *datadogV2.NewGetMonitorNotificationRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.GetMonitorNotificationRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.GetMonitorNotificationRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all monitor notification rules ``` // Get all monitor notification rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { MonitorNotificationRuleListResponse result = apiInstance.getMonitorNotificationRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#getMonitorNotificationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all monitor notification rules ``` // Get all monitor notification rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::GetMonitorNotificationRulesOptionalParams; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .get_monitor_notification_rules(GetMonitorNotificationRulesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all monitor notification rules ``` /** * Get all monitor notification rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); apiInstance .getMonitorNotificationRules() .then((data: v2.MonitorNotificationRuleListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-notification-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/monitor/notification_rulehttps://api.ap2.datadoghq.com/api/v2/monitor/notification_rulehttps://api.datadoghq.eu/api/v2/monitor/notification_rulehttps://api.ddog-gov.com/api/v2/monitor/notification_rulehttps://api.datadoghq.com/api/v2/monitor/notification_rulehttps://api.us3.datadoghq.com/api/v2/monitor/notification_rulehttps://api.us5.datadoghq.com/api/v2/monitor/notification_rule ### Overview Creates a monitor notification rule. This endpoint requires the `monitor_config_policy_write` permission. ### Request #### Body Data (required) Request body to create a monitor notification rule. * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Object to create a monitor notification rule. attributes [_required_] object Attributes of the monitor notification rule. conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applied. fallback_recipients [string] If none of the `conditions` applied, `fallback_recipients` will get notified. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. name [_required_] string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. type enum Monitor notification rule resource type. 
Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` ##### Create a monitor notification rule returns "OK" response ``` { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor" ] }, "name": "test rule", "recipients": [ "slack-test-channel", "jira-test" ] }, "type": "monitor-notification-rule" } } ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor" ] }, "name": "test rule", "conditional_recipients": { "conditions": [ { "scope": "transition_type:is_alert", "recipients": [ "slack-test-channel", "jira-test" ] } ] } }, "type": "monitor-notification-rule" } } ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` { "data": { "attributes": { "filter": { "scope": "test:example-monitor" }, "name": "test rule", "recipients": [ "slack-test-channel", "jira-test" ] }, "type": "monitor-notification-rule" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorNotificationRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) A monitor notification rule. Field Type Description data object Monitor notification rule data. attributes object Attributes of the monitor notification rule. conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applied. fallback_recipients [string] If none of the `conditions` applied, `fallback_recipients` will get notified. created date-time Creation time of the monitor notification rule. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. modified date-time Time the monitor notification rule was last modified. name string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. id string The ID of the monitor notification rule. relationships object All relationships associated with monitor notification rule. created_by object The user who created the monitor notification rule. data object Data for the user who created the monitor notification rule. id string User ID of the monitor notification rule creator. type enum Users resource type. 
Allowed enum values: `users` default: `users` type enum Monitor notification rule resource type. Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` included [ ] Array of objects related to the monitor notification rule that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "conditional_recipients": { "conditions": [ { "recipients": [ "slack-test-channel", "jira-test" ], "scope": "transition_type:alert" } ], "fallback_recipients": [ "slack-test-channel", "jira-test" ] }, "created": "2020-01-02T03:04:00.000Z", "filter": { "tags": [ "team:product", "host:abc" ] }, "modified": "2020-01-02T03:04:00.000Z", "name": "A notification rule name", "recipients": [ "slack-test-channel", "jira-test" ] }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } } }, "type": "monitor-notification-rule" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Create a monitor notification rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor" ] }, "name": "test rule", "recipients": [ "slack-test-channel", "jira-test" ] }, "type": "monitor-notification-rule" } } EOF ``` ##### Create a monitor notification rule with conditional recipients returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor" ] }, "name": "test rule", "conditional_recipients": { "conditions": [ { "scope": "transition_type:is_alert", "recipients": [ "slack-test-channel", "jira-test" ] } ] } }, "type": "monitor-notification-rule" } } EOF ``` ##### Create a monitor notification rule with scope returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "scope": "test:example-monitor" }, "name": "test rule", "recipients": [ "slack-test-channel", "jira-test" ] }, "type": "monitor-notification-rule" } } EOF ``` ##### Create a monitor notification rule returns "OK" response ``` // Create a monitor notification rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorNotificationRuleCreateRequest{ Data: datadogV2.MonitorNotificationRuleCreateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterTags: &datadogV2.MonitorNotificationRuleFilterTags{ Tags: []string{ "test:example-monitor", }, }}, Name: "test rule", Recipients: []string{ "slack-test-channel", "jira-test", }, }, Type: 
datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitorNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` // Create a monitor notification rule with conditional recipients returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorNotificationRuleCreateRequest{ Data: datadogV2.MonitorNotificationRuleCreateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterTags: &datadogV2.MonitorNotificationRuleFilterTags{ Tags: []string{ "test:example-monitor", }, }}, Name: "test rule", ConditionalRecipients: &datadogV2.MonitorNotificationRuleConditionalRecipients{ Conditions: []datadogV2.MonitorNotificationRuleCondition{ { Scope: "transition_type:is_alert", Recipients: []string{ "slack-test-channel", "jira-test", }, }, }, }, }, Type: datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitorNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` // Create a monitor notification rule with scope returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorNotificationRuleCreateRequest{ Data: datadogV2.MonitorNotificationRuleCreateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterScope: &datadogV2.MonitorNotificationRuleFilterScope{ Scope: "test:example-monitor", }}, Name: "test rule", Recipients: []string{ "slack-test-channel", "jira-test", }, }, Type: datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitorNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`MonitorsApi.CreateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a monitor notification rule returns "OK" response ``` // Create a monitor notification rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequestData; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterTags; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorNotificationRuleCreateRequest body = new MonitorNotificationRuleCreateRequest() .data( new MonitorNotificationRuleCreateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterTags() .tags(Collections.singletonList("test:example-monitor")))) .name("test rule") .recipients(Arrays.asList("slack-test-channel", "jira-test"))) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.createMonitorNotificationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` // Create a monitor notification rule with conditional recipients returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleCondition; import com.datadog.api.client.v2.model.MonitorNotificationRuleConditionalRecipients; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequestData; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterTags; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import 
com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorNotificationRuleCreateRequest body = new MonitorNotificationRuleCreateRequest() .data( new MonitorNotificationRuleCreateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterTags() .tags(Collections.singletonList("test:example-monitor")))) .name("test rule") .conditionalRecipients( new MonitorNotificationRuleConditionalRecipients() .conditions( Collections.singletonList( new MonitorNotificationRuleCondition() .scope("transition_type:is_alert") .recipients( Arrays.asList( "slack-test-channel", "jira-test")))))) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.createMonitorNotificationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` // Create a monitor notification rule with scope returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleCreateRequestData; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterScope; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorNotificationRuleCreateRequest body = new MonitorNotificationRuleCreateRequest() .data( new MonitorNotificationRuleCreateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterScope() .scope("test:example-monitor"))) .name("test rule") .recipients(Arrays.asList("slack-test-channel", "jira-test"))) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.createMonitorNotificationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a monitor notification rule returns "OK" response ``` """ Create a monitor notification rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_create_request import MonitorNotificationRuleCreateRequest from datadog_api_client.v2.model.monitor_notification_rule_create_request_data import ( MonitorNotificationRuleCreateRequestData, ) from datadog_api_client.v2.model.monitor_notification_rule_filter_tags import MonitorNotificationRuleFilterTags from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType body = MonitorNotificationRuleCreateRequest( data=MonitorNotificationRuleCreateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterTags( tags=[ "test:example-monitor", ], ), name="test rule", recipients=[ "slack-test-channel", "jira-test", ], ), type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor_notification_rule(body=body) print(response) ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` """ Create a monitor notification rule with conditional recipients returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_condition import MonitorNotificationRuleCondition from datadog_api_client.v2.model.monitor_notification_rule_conditional_recipients import ( MonitorNotificationRuleConditionalRecipients, ) from datadog_api_client.v2.model.monitor_notification_rule_create_request import MonitorNotificationRuleCreateRequest from datadog_api_client.v2.model.monitor_notification_rule_create_request_data import ( MonitorNotificationRuleCreateRequestData, ) from datadog_api_client.v2.model.monitor_notification_rule_filter_tags import MonitorNotificationRuleFilterTags from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType body = MonitorNotificationRuleCreateRequest( data=MonitorNotificationRuleCreateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterTags( tags=[ "test:example-monitor", ], ), name="test rule", conditional_recipients=MonitorNotificationRuleConditionalRecipients( conditions=[ MonitorNotificationRuleCondition( scope="transition_type:is_alert", recipients=[ "slack-test-channel", "jira-test", ], ), ], ), ), type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor_notification_rule(body=body) print(response) ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` """ Create a 
monitor notification rule with scope returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_create_request import MonitorNotificationRuleCreateRequest from datadog_api_client.v2.model.monitor_notification_rule_create_request_data import ( MonitorNotificationRuleCreateRequestData, ) from datadog_api_client.v2.model.monitor_notification_rule_filter_scope import MonitorNotificationRuleFilterScope from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType body = MonitorNotificationRuleCreateRequest( data=MonitorNotificationRuleCreateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterScope( scope="test:example-monitor", ), name="test rule", recipients=[ "slack-test-channel", "jira-test", ], ), type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor_notification_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a monitor notification rule returns "OK" response ``` # Create a monitor notification rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorNotificationRuleCreateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleCreateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterTags.new({ tags: [ "test:example-monitor", ], }), name: "test rule", recipients: [ "slack-test-channel", "jira-test", ], }), type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.create_monitor_notification_rule(body) ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` # Create a monitor notification rule with conditional recipients returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorNotificationRuleCreateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleCreateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterTags.new({ tags: [ "test:example-monitor", ], }), name: "test rule", conditional_recipients: DatadogAPIClient::V2::MonitorNotificationRuleConditionalRecipients.new({ conditions: [ DatadogAPIClient::V2::MonitorNotificationRuleCondition.new({ scope: "transition_type:is_alert", recipients: [ "slack-test-channel", "jira-test", ], }), ], }), }), type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.create_monitor_notification_rule(body) ``` Copy ##### Create a monitor 
notification rule with scope returns "OK" response ``` # Create a monitor notification rule with scope returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorNotificationRuleCreateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleCreateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterScope.new({ scope: "test:example-monitor", }), name: "test rule", recipients: [ "slack-test-channel", "jira-test", ], }), type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.create_monitor_notification_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a monitor notification rule returns "OK" response ``` // Create a monitor notification rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequestData; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterTags; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; #[tokio::main] async fn main() { let body = MonitorNotificationRuleCreateRequest::new( MonitorNotificationRuleCreateRequestData::new( MonitorNotificationRuleAttributes::new("test rule".to_string()) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterTags(Box::new( MonitorNotificationRuleFilterTags::new(vec![ "test:example-monitor".to_string() ]), )), ) .recipients(vec![ "slack-test-channel".to_string(), "jira-test".to_string(), ]), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor_notification_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` // Create a monitor notification rule with conditional recipients returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCondition; use datadog_api_client::datadogV2::model::MonitorNotificationRuleConditionalRecipients; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequestData; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterTags; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; 
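// Note: the filter below uses the tags variant of `MonitorNotificationRuleFilter`
// (a notification must match all of the listed tags); the scope variant
// (`MonitorNotificationRuleFilterScope`) used in the next example is the alternative,
// matching a key:value scope instead. `conditional_recipients` takes the place of the
// flat `recipients` list (the attributes reference on this page notes the two cannot
// be combined), with each condition routing notifications to its own recipients when
// its scope matches.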
#[tokio::main] async fn main() { let body = MonitorNotificationRuleCreateRequest::new( MonitorNotificationRuleCreateRequestData::new( MonitorNotificationRuleAttributes::new("test rule".to_string()) .conditional_recipients(MonitorNotificationRuleConditionalRecipients::new(vec![ MonitorNotificationRuleCondition::new( vec!["slack-test-channel".to_string(), "jira-test".to_string()], "transition_type:is_alert".to_string(), ), ])) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterTags(Box::new( MonitorNotificationRuleFilterTags::new(vec![ "test:example-monitor".to_string() ]), )), ), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor_notification_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` // Create a monitor notification rule with scope returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCreateRequestData; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterScope; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; #[tokio::main] async fn main() { let body = MonitorNotificationRuleCreateRequest::new( MonitorNotificationRuleCreateRequestData::new( MonitorNotificationRuleAttributes::new("test rule".to_string()) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterScope(Box::new( MonitorNotificationRuleFilterScope::new("test:example-monitor".to_string()), )), ) .recipients(vec![ "slack-test-channel".to_string(), "jira-test".to_string(), ]), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor_notification_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a monitor notification rule returns "OK" response ``` /** * Create a monitor notification rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiCreateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { tags: ["test:example-monitor"], }, name: "test rule", recipients: ["slack-test-channel", "jira-test"], }, type: "monitor-notification-rule", }, }, }; apiInstance .createMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a monitor notification rule with conditional recipients returns "OK" response ``` /** * Create a monitor notification rule with conditional recipients returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiCreateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { tags: ["test:example-monitor"], }, name: "test rule", conditionalRecipients: { conditions: [ { scope: "transition_type:is_alert", recipients: ["slack-test-channel", "jira-test"], }, ], }, }, type: "monitor-notification-rule", }, }, }; apiInstance .createMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a monitor notification rule with scope returns "OK" response ``` /** * Create a monitor notification rule with scope returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiCreateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { scope: "test:example-monitor", }, name: "test rule", recipients: ["slack-test-channel", "jira-test"], }, type: "monitor-notification-rule", }, }, }; apiInstance .createMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#update-a-monitor-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#update-a-monitor-notification-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.ap2.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.datadoghq.eu/api/v2/monitor/notification_rule/{rule_id}https://api.ddog-gov.com/api/v2/monitor/notification_rule/{rule_id}https://api.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.us3.datadoghq.com/api/v2/monitor/notification_rule/{rule_id}https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} ### Overview Updates a monitor notification rule by `rule_id`. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string ID of the monitor notification rule to update. ### Request #### Body Data (required) Request body to update the monitor notification rule. * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Object to update a monitor notification rule. 
attributes [_required_] object Attributes of the monitor notification rule. conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applies. fallback_recipients [string] If none of the `conditions` apply, `fallback_recipients` will be notified. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. name [_required_] string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. id [_required_] string The ID of the monitor notification rule. type enum Monitor notification rule resource type. Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` ##### Update a monitor notification rule returns "OK" response ``` { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor", "host:abc" ] }, "name": "updated rule", "recipients": [ "slack-test-channel" ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } ``` ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor", "host:abc" ] }, "name": "updated rule", "conditional_recipients": { "conditions": [ { "scope": "transition_type:is_alert", "recipients": [ "slack-test-channel", "jira-test" ] } ] } }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } ``` ##### Update a monitor notification rule with scope returns "OK" response ``` { "data": { "attributes": { "filter": { "scope": "test:example-monitor" }, "name": "updated rule", "recipients": [ "slack-test-channel" ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } ``` ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) A monitor notification rule. Field Type Description data object Monitor notification rule data. attributes object Attributes of the monitor notification rule.
conditional_recipients object Use conditional recipients to define different recipients for different situations. Cannot be used with `recipients`. conditions [_required_] [object] Conditions of the notification rule. recipients [_required_] [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. scope [_required_] string The scope to which the monitor applies. fallback_recipients [string] If none of the `conditions` apply, `fallback_recipients` will be notified. created date-time Creation time of the monitor notification rule. filter Filter used to associate the notification rule with monitors. Option 1 object Filter monitor notifications by tags. A monitor notification must match all tags. tags [_required_] [string] A list of tags (key:value pairs), which can be used to filter monitor notifications on monitor and group tags. Option 2 object Filter monitor notifications. A monitor notification must match the scope. scope [_required_] string A scope composed of one or several key:value pairs, which can be used to filter monitor notifications on monitor and group tags. modified date-time Time the monitor notification rule was last modified. name string The name of the monitor notification rule. recipients [string] A list of recipients to notify. Uses the same format as the monitor `message` field. Must not start with an '@'. Cannot be used with `conditional_recipients`. id string The ID of the monitor notification rule. relationships object All relationships associated with monitor notification rule. created_by object The user who created the monitor notification rule. data object Data for the user who created the monitor notification rule. id string User ID of the monitor notification rule creator. type enum Users resource type. Allowed enum values: `users` default: `users` type enum Monitor notification rule resource type. Allowed enum values: `monitor-notification-rule` default: `monitor-notification-rule` included [ ] Array of objects related to the monitor notification rule that the user requested. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects.
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "conditional_recipients": { "conditions": [ { "recipients": [ "slack-test-channel", "jira-test" ], "scope": "transition_type:alert" } ], "fallback_recipients": [ "slack-test-channel", "jira-test" ] }, "created": "2020-01-02T03:04:00.000Z", "filter": { "tags": [ "team:product", "host:abc" ] }, "modified": "2020-01-02T03:04:00.000Z", "name": "A notification rule name", "recipients": [ "slack-test-channel", "jira-test" ] }, "id": "00000000-0000-1234-0000-000000000000", "relationships": { "created_by": { "data": { "id": "00000000-0000-1234-0000-000000000000", "type": "users" } } }, "type": "monitor-notification-rule" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Update a monitor notification rule returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor", "host:abc" ] }, "name": "updated rule", "recipients": [ "slack-test-channel" ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } EOF ``` ##### Update a monitor notification rule with conditional_recipients returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "tags": [ "test:example-monitor", "host:abc" ] }, "name": "updated rule", "conditional_recipients": { "conditions": [ { "scope": "transition_type:is_alert", "recipients": [ "slack-test-channel", "jira-test" ] } ] } }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } EOF ``` ##### Update a monitor notification rule with scope returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "scope": "test:example-monitor" }, "name": "updated rule", "recipients": [ "slack-test-channel" ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-notification-rule" } } EOF ``` ##### Update a monitor notification rule returns "OK" response ``` // Update a monitor notification rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_notification_rule" in the system MonitorNotificationRuleDataID := 
os.Getenv("MONITOR_NOTIFICATION_RULE_DATA_ID") body := datadogV2.MonitorNotificationRuleUpdateRequest{ Data: datadogV2.MonitorNotificationRuleUpdateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterTags: &datadogV2.MonitorNotificationRuleFilterTags{ Tags: []string{ "test:example-monitor", "host:abc", }, }}, Name: "updated rule", Recipients: []string{ "slack-test-channel", }, }, Id: MonitorNotificationRuleDataID, Type: datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitorNotificationRule(ctx, MonitorNotificationRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` // Update a monitor notification rule with conditional_recipients returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_notification_rule" in the system MonitorNotificationRuleDataID := os.Getenv("MONITOR_NOTIFICATION_RULE_DATA_ID") body := datadogV2.MonitorNotificationRuleUpdateRequest{ Data: datadogV2.MonitorNotificationRuleUpdateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterTags: &datadogV2.MonitorNotificationRuleFilterTags{ Tags: []string{ "test:example-monitor", "host:abc", }, }}, Name: "updated rule", ConditionalRecipients: &datadogV2.MonitorNotificationRuleConditionalRecipients{ Conditions: []datadogV2.MonitorNotificationRuleCondition{ { Scope: "transition_type:is_alert", Recipients: []string{ "slack-test-channel", "jira-test", }, }, }, }, }, Id: MonitorNotificationRuleDataID, Type: datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitorNotificationRule(ctx, MonitorNotificationRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` // Update a monitor notification rule with scope returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_notification_rule" in the system 
MonitorNotificationRuleDataID := os.Getenv("MONITOR_NOTIFICATION_RULE_DATA_ID") body := datadogV2.MonitorNotificationRuleUpdateRequest{ Data: datadogV2.MonitorNotificationRuleUpdateRequestData{ Attributes: datadogV2.MonitorNotificationRuleAttributes{ Filter: &datadogV2.MonitorNotificationRuleFilter{ MonitorNotificationRuleFilterScope: &datadogV2.MonitorNotificationRuleFilterScope{ Scope: "test:example-monitor", }}, Name: "updated rule", Recipients: []string{ "slack-test-channel", }, }, Id: MonitorNotificationRuleDataID, Type: datadogV2.MONITORNOTIFICATIONRULERESOURCETYPE_MONITOR_NOTIFICATION_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitorNotificationRule(ctx, MonitorNotificationRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitorNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a monitor notification rule returns "OK" response ``` // Update a monitor notification rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterTags; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequestData; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_notification_rule" in the system String MONITOR_NOTIFICATION_RULE_DATA_ID = System.getenv("MONITOR_NOTIFICATION_RULE_DATA_ID"); MonitorNotificationRuleUpdateRequest body = new MonitorNotificationRuleUpdateRequest() .data( new MonitorNotificationRuleUpdateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterTags() .tags(Arrays.asList("test:example-monitor", "host:abc")))) .name("updated rule") .recipients(Collections.singletonList("slack-test-channel"))) .id(MONITOR_NOTIFICATION_RULE_DATA_ID) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.updateMonitorNotificationRule(MONITOR_NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling 
MonitorsApi#updateMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` // Update a monitor notification rule with conditional_recipients returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleCondition; import com.datadog.api.client.v2.model.MonitorNotificationRuleConditionalRecipients; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterTags; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequestData; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_notification_rule" in the system String MONITOR_NOTIFICATION_RULE_DATA_ID = System.getenv("MONITOR_NOTIFICATION_RULE_DATA_ID"); MonitorNotificationRuleUpdateRequest body = new MonitorNotificationRuleUpdateRequest() .data( new MonitorNotificationRuleUpdateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterTags() .tags(Arrays.asList("test:example-monitor", "host:abc")))) .name("updated rule") .conditionalRecipients( new MonitorNotificationRuleConditionalRecipients() .conditions( Collections.singletonList( new MonitorNotificationRuleCondition() .scope("transition_type:is_alert") .recipients( Arrays.asList( "slack-test-channel", "jira-test")))))) .id(MONITOR_NOTIFICATION_RULE_DATA_ID) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.updateMonitorNotificationRule(MONITOR_NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#updateMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` // Update a monitor notification rule with scope returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorNotificationRuleAttributes; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilter; import com.datadog.api.client.v2.model.MonitorNotificationRuleFilterScope; import com.datadog.api.client.v2.model.MonitorNotificationRuleResourceType; import com.datadog.api.client.v2.model.MonitorNotificationRuleResponse; import 
com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequest; import com.datadog.api.client.v2.model.MonitorNotificationRuleUpdateRequestData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_notification_rule" in the system String MONITOR_NOTIFICATION_RULE_DATA_ID = System.getenv("MONITOR_NOTIFICATION_RULE_DATA_ID"); MonitorNotificationRuleUpdateRequest body = new MonitorNotificationRuleUpdateRequest() .data( new MonitorNotificationRuleUpdateRequestData() .attributes( new MonitorNotificationRuleAttributes() .filter( new MonitorNotificationRuleFilter( new MonitorNotificationRuleFilterScope() .scope("test:example-monitor"))) .name("updated rule") .recipients(Collections.singletonList("slack-test-channel"))) .id(MONITOR_NOTIFICATION_RULE_DATA_ID) .type(MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE)); try { MonitorNotificationRuleResponse result = apiInstance.updateMonitorNotificationRule(MONITOR_NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#updateMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a monitor notification rule returns "OK" response ``` """ Update a monitor notification rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_filter_tags import MonitorNotificationRuleFilterTags from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType from datadog_api_client.v2.model.monitor_notification_rule_update_request import MonitorNotificationRuleUpdateRequest from datadog_api_client.v2.model.monitor_notification_rule_update_request_data import ( MonitorNotificationRuleUpdateRequestData, ) # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = environ["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = MonitorNotificationRuleUpdateRequest( data=MonitorNotificationRuleUpdateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterTags( tags=[ "test:example-monitor", "host:abc", ], ), name="updated rule", recipients=[ "slack-test-channel", ], ), id=MONITOR_NOTIFICATION_RULE_DATA_ID, type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor_notification_rule(rule_id=MONITOR_NOTIFICATION_RULE_DATA_ID, body=body) print(response) ``` Copy ##### Update a monitor 
notification rule with conditional_recipients returns "OK" response ``` """ Update a monitor notification rule with conditional_recipients returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_condition import MonitorNotificationRuleCondition from datadog_api_client.v2.model.monitor_notification_rule_conditional_recipients import ( MonitorNotificationRuleConditionalRecipients, ) from datadog_api_client.v2.model.monitor_notification_rule_filter_tags import MonitorNotificationRuleFilterTags from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType from datadog_api_client.v2.model.monitor_notification_rule_update_request import MonitorNotificationRuleUpdateRequest from datadog_api_client.v2.model.monitor_notification_rule_update_request_data import ( MonitorNotificationRuleUpdateRequestData, ) # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = environ["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = MonitorNotificationRuleUpdateRequest( data=MonitorNotificationRuleUpdateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterTags( tags=[ "test:example-monitor", "host:abc", ], ), name="updated rule", conditional_recipients=MonitorNotificationRuleConditionalRecipients( conditions=[ MonitorNotificationRuleCondition( scope="transition_type:is_alert", recipients=[ "slack-test-channel", "jira-test", ], ), ], ), ), id=MONITOR_NOTIFICATION_RULE_DATA_ID, type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor_notification_rule(rule_id=MONITOR_NOTIFICATION_RULE_DATA_ID, body=body) print(response) ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` """ Update a monitor notification rule with scope returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_notification_rule_attributes import MonitorNotificationRuleAttributes from datadog_api_client.v2.model.monitor_notification_rule_filter_scope import MonitorNotificationRuleFilterScope from datadog_api_client.v2.model.monitor_notification_rule_resource_type import MonitorNotificationRuleResourceType from datadog_api_client.v2.model.monitor_notification_rule_update_request import MonitorNotificationRuleUpdateRequest from datadog_api_client.v2.model.monitor_notification_rule_update_request_data import ( MonitorNotificationRuleUpdateRequestData, ) # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = environ["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = MonitorNotificationRuleUpdateRequest( data=MonitorNotificationRuleUpdateRequestData( attributes=MonitorNotificationRuleAttributes( filter=MonitorNotificationRuleFilterScope( scope="test:example-monitor", ), name="updated rule", recipients=[ "slack-test-channel", ], ), id=MONITOR_NOTIFICATION_RULE_DATA_ID, type=MonitorNotificationRuleResourceType.MONITOR_NOTIFICATION_RULE, ), ) configuration = 
Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor_notification_rule(rule_id=MONITOR_NOTIFICATION_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a monitor notification rule returns "OK" response ``` # Update a monitor notification rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = ENV["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterTags.new({ tags: [ "test:example-monitor", "host:abc", ], }), name: "updated rule", recipients: [ "slack-test-channel", ], }), id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.update_monitor_notification_rule(MONITOR_NOTIFICATION_RULE_DATA_ID, body) ``` Copy ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` # Update a monitor notification rule with conditional_recipients returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = ENV["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterTags.new({ tags: [ "test:example-monitor", "host:abc", ], }), name: "updated rule", conditional_recipients: DatadogAPIClient::V2::MonitorNotificationRuleConditionalRecipients.new({ conditions: [ DatadogAPIClient::V2::MonitorNotificationRuleCondition.new({ scope: "transition_type:is_alert", recipients: [ "slack-test-channel", "jira-test", ], }), ], }), }), id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.update_monitor_notification_rule(MONITOR_NOTIFICATION_RULE_DATA_ID, body) ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` # Update a monitor notification rule with scope returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = ENV["MONITOR_NOTIFICATION_RULE_DATA_ID"] body = DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequest.new({ data: DatadogAPIClient::V2::MonitorNotificationRuleUpdateRequestData.new({ attributes: DatadogAPIClient::V2::MonitorNotificationRuleAttributes.new({ filter: DatadogAPIClient::V2::MonitorNotificationRuleFilterScope.new({ scope: 
"test:example-monitor", }), name: "updated rule", recipients: [ "slack-test-channel", ], }), id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE, }), }) p api_instance.update_monitor_notification_rule(MONITOR_NOTIFICATION_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a monitor notification rule returns "OK" response ``` // Update a monitor notification rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterTags; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "monitor_notification_rule" in the system let monitor_notification_rule_data_id = std::env::var("MONITOR_NOTIFICATION_RULE_DATA_ID").unwrap(); let body = MonitorNotificationRuleUpdateRequest::new( MonitorNotificationRuleUpdateRequestData::new( MonitorNotificationRuleAttributes::new("updated rule".to_string()) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterTags(Box::new( MonitorNotificationRuleFilterTags::new(vec![ "test:example-monitor".to_string(), "host:abc".to_string(), ]), )), ) .recipients(vec!["slack-test-channel".to_string()]), monitor_notification_rule_data_id.clone(), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .update_monitor_notification_rule(monitor_notification_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` // Update a monitor notification rule with conditional_recipients returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleCondition; use datadog_api_client::datadogV2::model::MonitorNotificationRuleConditionalRecipients; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterTags; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "monitor_notification_rule" in the system let monitor_notification_rule_data_id = std::env::var("MONITOR_NOTIFICATION_RULE_DATA_ID").unwrap(); let 
body = MonitorNotificationRuleUpdateRequest::new( MonitorNotificationRuleUpdateRequestData::new( MonitorNotificationRuleAttributes::new("updated rule".to_string()) .conditional_recipients(MonitorNotificationRuleConditionalRecipients::new(vec![ MonitorNotificationRuleCondition::new( vec!["slack-test-channel".to_string(), "jira-test".to_string()], "transition_type:is_alert".to_string(), ), ])) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterTags(Box::new( MonitorNotificationRuleFilterTags::new(vec![ "test:example-monitor".to_string(), "host:abc".to_string(), ]), )), ), monitor_notification_rule_data_id.clone(), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .update_monitor_notification_rule(monitor_notification_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` // Update a monitor notification rule with scope returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorNotificationRuleAttributes; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilter; use datadog_api_client::datadogV2::model::MonitorNotificationRuleFilterScope; use datadog_api_client::datadogV2::model::MonitorNotificationRuleResourceType; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequest; use datadog_api_client::datadogV2::model::MonitorNotificationRuleUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "monitor_notification_rule" in the system let monitor_notification_rule_data_id = std::env::var("MONITOR_NOTIFICATION_RULE_DATA_ID").unwrap(); let body = MonitorNotificationRuleUpdateRequest::new( MonitorNotificationRuleUpdateRequestData::new( MonitorNotificationRuleAttributes::new("updated rule".to_string()) .filter( MonitorNotificationRuleFilter::MonitorNotificationRuleFilterScope(Box::new( MonitorNotificationRuleFilterScope::new("test:example-monitor".to_string()), )), ) .recipients(vec!["slack-test-channel".to_string()]), monitor_notification_rule_data_id.clone(), ) .type_(MonitorNotificationRuleResourceType::MONITOR_NOTIFICATION_RULE), ); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .update_monitor_notification_rule(monitor_notification_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a monitor notification rule returns "OK" response ``` /** * Update a monitor notification rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_notification_rule" in the system const MONITOR_NOTIFICATION_RULE_DATA_ID = process.env 
.MONITOR_NOTIFICATION_RULE_DATA_ID as string; const params: v2.MonitorsApiUpdateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { tags: ["test:example-monitor", "host:abc"], }, name: "updated rule", recipients: ["slack-test-channel"], }, id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: "monitor-notification-rule", }, }, ruleId: MONITOR_NOTIFICATION_RULE_DATA_ID, }; apiInstance .updateMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a monitor notification rule with conditional_recipients returns "OK" response ``` /** * Update a monitor notification rule with conditional_recipients returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_notification_rule" in the system const MONITOR_NOTIFICATION_RULE_DATA_ID = process.env .MONITOR_NOTIFICATION_RULE_DATA_ID as string; const params: v2.MonitorsApiUpdateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { tags: ["test:example-monitor", "host:abc"], }, name: "updated rule", conditionalRecipients: { conditions: [ { scope: "transition_type:is_alert", recipients: ["slack-test-channel", "jira-test"], }, ], }, }, id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: "monitor-notification-rule", }, }, ruleId: MONITOR_NOTIFICATION_RULE_DATA_ID, }; apiInstance .updateMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a monitor notification rule with scope returns "OK" response ``` /** * Update a monitor notification rule with scope returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_notification_rule" in the system const MONITOR_NOTIFICATION_RULE_DATA_ID = process.env .MONITOR_NOTIFICATION_RULE_DATA_ID as string; const params: v2.MonitorsApiUpdateMonitorNotificationRuleRequest = { body: { data: { attributes: { filter: { scope: "test:example-monitor", }, name: "updated rule", recipients: ["slack-test-channel"], }, id: MONITOR_NOTIFICATION_RULE_DATA_ID, type: "monitor-notification-rule", }, }, ruleId: MONITOR_NOTIFICATION_RULE_DATA_ID, }; apiInstance .updateMonitorNotificationRule(params) .then((data: v2.MonitorNotificationRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-notification-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} https://api.ap2.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} https://api.datadoghq.eu/api/v2/monitor/notification_rule/{rule_id} https://api.ddog-gov.com/api/v2/monitor/notification_rule/{rule_id} https://api.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} https://api.us3.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} https://api.us5.datadoghq.com/api/v2/monitor/notification_rule/{rule_id} ### Overview Deletes a monitor notification rule by `rule_id`. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string ID of the monitor notification rule to delete. ### Response * [204](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorNotificationRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorNotificationRule-429-v2) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Field Type Description errors [_required_] [string] A list of errors.
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Delete a monitor notification rule ``` # Path parameters export rule_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X DELETE "https://api.datadoghq.com/api/v2/monitor/notification_rule/${rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a monitor notification rule ``` """ Delete a monitor notification rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = environ["MONITOR_NOTIFICATION_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) api_instance.delete_monitor_notification_rule( rule_id=MONITOR_NOTIFICATION_RULE_DATA_ID, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a monitor notification rule ``` # Delete a monitor notification rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_notification_rule" in the system MONITOR_NOTIFICATION_RULE_DATA_ID = ENV["MONITOR_NOTIFICATION_RULE_DATA_ID"] api_instance.delete_monitor_notification_rule(MONITOR_NOTIFICATION_RULE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete a monitor notification rule ``` // Delete a monitor notification rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_notification_rule" in the system MonitorNotificationRuleDataID := os.Getenv("MONITOR_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) r, err := api.DeleteMonitorNotificationRule(ctx, MonitorNotificationRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling
`MonitorsApi.DeleteMonitorNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a monitor notification rule ``` // Delete a monitor notification rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_notification_rule" in the system String MONITOR_NOTIFICATION_RULE_DATA_ID = System.getenv("MONITOR_NOTIFICATION_RULE_DATA_ID"); try { apiInstance.deleteMonitorNotificationRule(MONITOR_NOTIFICATION_RULE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#deleteMonitorNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a monitor notification rule ``` // Delete a monitor notification rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor_notification_rule" in the system let monitor_notification_rule_data_id = std::env::var("MONITOR_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = MonitorsAPI::with_config(configuration); let resp = api .delete_monitor_notification_rule(monitor_notification_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a monitor notification rule ``` /** * Delete a monitor notification rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_notification_rule" in the system const MONITOR_NOTIFICATION_RULE_DATA_ID = process.env .MONITOR_NOTIFICATION_RULE_DATA_ID as string; const params: v2.MonitorsApiDeleteMonitorNotificationRuleRequest = { ruleId: MONITOR_NOTIFICATION_RULE_DATA_ID, }; apiInstance .deleteMonitorNotificationRule(params) .then((data: any) => { console.log( "API called 
successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/monitor/template/{template_id}https://api.ap2.datadoghq.com/api/v2/monitor/template/{template_id}https://api.datadoghq.eu/api/v2/monitor/template/{template_id}https://api.ddog-gov.com/api/v2/monitor/template/{template_id}https://api.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us3.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us5.datadoghq.com/api/v2/monitor/template/{template_id} ### Overview Retrieve a monitor user template by its ID. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Arguments #### Path Parameters Name Type Description template_id [_required_] string ID of the monitor user template. #### Query Strings Name Type Description with_all_versions boolean Whether to include all versions of the template in the response in the versions field. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorUserTemplate-200-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorUserTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#GetMonitorUserTemplate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving a monitor user template. Field Type Description data object Monitor user template data. attributes object A monitor user template object. created date-time The created timestamp of the template. description string A brief description of the monitor user template. modified date-time The last modified timestamp. When the template version was created. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. versions [object] All versions of the monitor user template. created date-time The created timestamp of the template. description string A brief description of the monitor user template. id string The unique identifier. 
The initial version will match the template ID. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. id string The unique identifier. type enum Monitor user template resource type. Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "modified": "2024-02-02T03:04:23.274966+00:00", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0, "versions": [ { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "id": "00000000-0000-1234-0000-000000000000", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0 } ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get a monitor user template ``` # Path parameters export template_id="00000000-0000-1234-0000-000000000000" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/v2/monitor/template/${template_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a monitor user template ``` """ Get a monitor user template returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = environ["MONITOR_USER_TEMPLATE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.get_monitor_user_template( template_id=MONITOR_USER_TEMPLATE_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a monitor user template ``` # Get a monitor user template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = ENV["MONITOR_USER_TEMPLATE_DATA_ID"] p api_instance.get_monitor_user_template(MONITOR_USER_TEMPLATE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a monitor user template ``` // Get a monitor user template returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_user_template" in the system MonitorUserTemplateDataID := os.Getenv("MONITOR_USER_TEMPLATE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration()
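// Note: the "Get a monitor user template" endpoint is in Preview, so this example
// enables the corresponding unstable operation on the client configuration before
// making the request below.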
configuration.SetUnstableOperationEnabled("v2.GetMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.GetMonitorUserTemplate(ctx, MonitorUserTemplateDataID, *datadogV2.NewGetMonitorUserTemplateOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.GetMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.GetMonitorUserTemplate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a monitor user template ``` // Get a monitor user template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_user_template" in the system String MONITOR_USER_TEMPLATE_DATA_ID = System.getenv("MONITOR_USER_TEMPLATE_DATA_ID"); try { MonitorUserTemplateResponse result = apiInstance.getMonitorUserTemplate(MONITOR_USER_TEMPLATE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#getMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a monitor user template ``` // Get a monitor user template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::GetMonitorUserTemplateOptionalParams; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { // there is a valid "monitor_user_template" in the system let monitor_user_template_data_id = std::env::var("MONITOR_USER_TEMPLATE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api .get_monitor_user_template( monitor_user_template_data_id.clone(), GetMonitorUserTemplateOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example 
to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a monitor user template ``` /** * Get a monitor user template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_user_template" in the system const MONITOR_USER_TEMPLATE_DATA_ID = process.env .MONITOR_USER_TEMPLATE_DATA_ID as string; const params: v2.MonitorsApiGetMonitorUserTemplateRequest = { templateId: MONITOR_USER_TEMPLATE_DATA_ID, }; apiInstance .getMonitorUserTemplate(params) .then((data: v2.MonitorUserTemplateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all monitor user templates](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/monitor/templatehttps://api.ap2.datadoghq.com/api/v2/monitor/templatehttps://api.datadoghq.eu/api/v2/monitor/templatehttps://api.ddog-gov.com/api/v2/monitor/templatehttps://api.datadoghq.com/api/v2/monitor/templatehttps://api.us3.datadoghq.com/api/v2/monitor/templatehttps://api.us5.datadoghq.com/api/v2/monitor/template ### Overview Retrieve all monitor user templates. This endpoint requires the `monitors_read` permission. OAuth apps require the `monitors_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#monitors) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#ListMonitorUserTemplates-200-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#ListMonitorUserTemplates-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving all monitor user templates. Field Type Description data [object] An array of monitor user templates. attributes object Attributes for a monitor user template. created date-time The created timestamp of the template. description string A brief description of the monitor user template. modified date-time The last modified timestamp. When the template version was created. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. 
name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. id string The unique identifier. type enum Monitor user template resource type. Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": [ { "attributes": { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "modified": "2024-02-02T03:04:23.274966+00:00", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0 }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Get all monitor user templates ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/v2/monitor/template" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all monitor user templates ``` """ Get all monitor user templates returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi configuration = Configuration() configuration.unstable_operations["list_monitor_user_templates"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.list_monitor_user_templates() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands, where `DD_SITE` is your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all monitor user templates ``` # Get all monitor user templates returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_monitor_user_templates".to_sym] = true end api_instance
= DatadogAPIClient::V2::MonitorsAPI.new p api_instance.list_monitor_user_templates() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all monitor user templates ``` // Get all monitor user templates returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListMonitorUserTemplates", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.ListMonitorUserTemplates(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ListMonitorUserTemplates`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.ListMonitorUserTemplates`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all monitor user templates ``` // Get all monitor user templates returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listMonitorUserTemplates", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { MonitorUserTemplateListResponse result = apiInstance.listMonitorUserTemplates(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#listMonitorUserTemplates"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all monitor user templates ``` // Get all monitor user templates returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListMonitorUserTemplates", true); let api = MonitorsAPI::with_config(configuration); let resp = 
api.list_monitor_user_templates().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all monitor user templates ``` /** * Get all monitor user templates returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listMonitorUserTemplates"] = true; const apiInstance = new v2.MonitorsApi(configuration); apiInstance .listMonitorUserTemplates() .then((data: v2.MonitorUserTemplateListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-user-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor-user-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/monitor/templatehttps://api.ap2.datadoghq.com/api/v2/monitor/templatehttps://api.datadoghq.eu/api/v2/monitor/templatehttps://api.ddog-gov.com/api/v2/monitor/templatehttps://api.datadoghq.com/api/v2/monitor/templatehttps://api.us3.datadoghq.com/api/v2/monitor/templatehttps://api.us5.datadoghq.com/api/v2/monitor/template ### Overview Create a new monitor user template. This endpoint requires the `monitor_config_policy_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Monitor user template data. attributes [_required_] object Attributes for a monitor user template. description string A brief description of the monitor user template. monitor_definition [_required_] object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [_required_] [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title [_required_] string The title of the monitor user template. type [_required_] enum Monitor user template resource type. 
Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "type": "monitor-user-template" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorUserTemplate-200-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorUserTemplate-400-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#CreateMonitorUserTemplate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for creating a monitor user template. Field Type Description data object Monitor user template list response data. attributes object Attributes for a monitor user template. created date-time The created timestamp of the template. description string A brief description of the monitor user template. modified date-time The last modified timestamp. When the template version was created. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. id string The unique identifier. type enum Monitor user template resource type. Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "modified": "2024-02-02T03:04:23.274966+00:00", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0 }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Create a monitor user template returns "OK" response ``` # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X POST "https://api.datadoghq.com/api/v2/monitor/template" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "type": "monitor-user-template" } } EOF ``` ##### Create a monitor user template returns "OK" response ``` // Create a monitor user template returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorUserTemplateCreateRequest{ Data: datadogV2.MonitorUserTemplateCreateData{ Attributes: datadogV2.MonitorUserTemplateRequestAttributes{ Description: *datadog.NewNullableString(datadog.PtrString("A description.")), MonitorDefinition: map[string]interface{}{ "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, Tags: []string{ "integration:Azure", }, TemplateVariables: []datadogV2.MonitorUserTemplateTemplateVariablesItems{ { AvailableValues: []string{ "value1", "value2", }, Defaults: []string{ "defaultValue", }, Name: "regionName", TagKey: datadog.PtrString("datacenter"), }, }, Title: "Postgres DB example-monitor", }, Type: datadogV2.MONITORUSERTEMPLATERESOURCETYPE_MONITOR_USER_TEMPLATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.CreateMonitorUserTemplate(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.CreateMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.CreateMonitorUserTemplate`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a monitor user template returns "OK" response ``` // Create a monitor user template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateCreateData; import com.datadog.api.client.v2.model.MonitorUserTemplateCreateRequest; import com.datadog.api.client.v2.model.MonitorUserTemplateCreateResponse; import com.datadog.api.client.v2.model.MonitorUserTemplateRequestAttributes; import com.datadog.api.client.v2.model.MonitorUserTemplateResourceType; import com.datadog.api.client.v2.model.MonitorUserTemplateTemplateVariablesItems; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorUserTemplateCreateRequest body = new MonitorUserTemplateCreateRequest() .data( new MonitorUserTemplateCreateData() .attributes( new MonitorUserTemplateRequestAttributes() .description("A description.") .monitorDefinition( Map.ofEntries( Map.entry("message", "A msg."), Map.entry("name", "A name example-monitor"), Map.entry( "query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), Map.entry("type", "query alert"))) .tags(Collections.singletonList("integration:Azure")) .templateVariables( Collections.singletonList( new MonitorUserTemplateTemplateVariablesItems() .availableValues(Arrays.asList("value1", "value2")) .defaults(Collections.singletonList("defaultValue")) .name("regionName") .tagKey("datacenter"))) .title("Postgres DB example-monitor")) .type(MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE)); try { MonitorUserTemplateCreateResponse result = apiInstance.createMonitorUserTemplate(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#createMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a monitor user template returns "OK" response ``` """ Create a monitor user template returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_user_template_create_data import MonitorUserTemplateCreateData from datadog_api_client.v2.model.monitor_user_template_create_request import MonitorUserTemplateCreateRequest from datadog_api_client.v2.model.monitor_user_template_request_attributes import 
MonitorUserTemplateRequestAttributes from datadog_api_client.v2.model.monitor_user_template_resource_type import MonitorUserTemplateResourceType from datadog_api_client.v2.model.monitor_user_template_template_variables_items import ( MonitorUserTemplateTemplateVariablesItems, ) body = MonitorUserTemplateCreateRequest( data=MonitorUserTemplateCreateData( attributes=MonitorUserTemplateRequestAttributes( description="A description.", monitor_definition=dict( [ ("message", "A msg."), ("name", "A name example-monitor"), ("query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ("type", "query alert"), ] ), tags=[ "integration:Azure", ], template_variables=[ MonitorUserTemplateTemplateVariablesItems( available_values=[ "value1", "value2", ], defaults=[ "defaultValue", ], name="regionName", tag_key="datacenter", ), ], title="Postgres DB example-monitor", ), type=MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE, ), ) configuration = Configuration() configuration.unstable_operations["create_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.create_monitor_user_template(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a monitor user template returns "OK" response ``` # Create a monitor user template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorUserTemplateCreateRequest.new({ data: DatadogAPIClient::V2::MonitorUserTemplateCreateData.new({ attributes: DatadogAPIClient::V2::MonitorUserTemplateRequestAttributes.new({ description: "A description.", monitor_definition: { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, tags: [ "integration:Azure", ], template_variables: [ DatadogAPIClient::V2::MonitorUserTemplateTemplateVariablesItems.new({ available_values: [ "value1", "value2", ], defaults: [ "defaultValue", ], name: "regionName", tag_key: "datacenter", }), ], title: "Postgres DB example-monitor", }), type: DatadogAPIClient::V2::MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, }), }) p api_instance.create_monitor_user_template(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a monitor user template returns "OK" response ``` // Create a monitor user template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorUserTemplateCreateData; use datadog_api_client::datadogV2::model::MonitorUserTemplateCreateRequest; use 
datadog_api_client::datadogV2::model::MonitorUserTemplateRequestAttributes; use datadog_api_client::datadogV2::model::MonitorUserTemplateResourceType; use datadog_api_client::datadogV2::model::MonitorUserTemplateTemplateVariablesItems; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = MonitorUserTemplateCreateRequest::new(MonitorUserTemplateCreateData::new( MonitorUserTemplateRequestAttributes::new( BTreeMap::from([ ("message".to_string(), Value::from("A msg.")), ("name".to_string(), Value::from("A name example-monitor")), ( "query".to_string(), Value::from("avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ), ("type".to_string(), Value::from("query alert")), ]), vec!["integration:Azure".to_string()], "Postgres DB example-monitor".to_string(), ) .description(Some("A description.".to_string())) .template_variables(vec![MonitorUserTemplateTemplateVariablesItems::new( "regionName".to_string(), ) .available_values(vec!["value1".to_string(), "value2".to_string()]) .defaults(vec!["defaultValue".to_string()]) .tag_key("datacenter".to_string())]), MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api.create_monitor_user_template(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a monitor user template returns "OK" response ``` /** * Create a monitor user template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiCreateMonitorUserTemplateRequest = { body: { data: { attributes: { description: "A description.", monitorDefinition: { message: "A msg.", name: "A name example-monitor", query: "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", type: "query alert", }, tags: ["integration:Azure"], templateVariables: [ { availableValues: ["value1", "value2"], defaults: ["defaultValue"], name: "regionName", tagKey: "datacenter", }, ], title: "Postgres DB example-monitor", }, type: "monitor-user-template", }, }, }; apiInstance .createMonitorUserTemplate(params) .then((data: v2.MonitorUserTemplateCreateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a monitor user template to a new version](https://docs.datadoghq.com/api/latest/monitors/#update-a-monitor-user-template-to-a-new-version) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#update-a-monitor-user-template-to-a-new-version-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/monitor/template/{template_id}https://api.ap2.datadoghq.com/api/v2/monitor/template/{template_id}https://api.datadoghq.eu/api/v2/monitor/template/{template_id}https://api.ddog-gov.com/api/v2/monitor/template/{template_id}https://api.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us3.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us5.datadoghq.com/api/v2/monitor/template/{template_id} ### Overview Creates a new version of an existing monitor user template. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description template_id [_required_] string ID of the monitor user template. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Monitor user template data. attributes [_required_] object Attributes for a monitor user template. description string A brief description of the monitor user template. monitor_definition [_required_] object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [_required_] [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title [_required_] string The title of the monitor user template. id [_required_] string The unique identifier. type [_required_] enum Monitor user template resource type. 
Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorUserTemplate-200-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorUserTemplate-400-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorUserTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#UpdateMonitorUserTemplate-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Response for retrieving a monitor user template. Field Type Description data object Monitor user template data. attributes object A monitor user template object. created date-time The created timestamp of the template. description string A brief description of the monitor user template. modified date-time The last modified timestamp. When the template version was created. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. versions [object] All versions of the monitor user template. created date-time The created timestamp of the template. description string A brief description of the monitor user template. id string The unique identifier. The initial version will match the template ID. monitor_definition object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title string The title of the monitor user template. version int64 The version of the monitor user template. id string The unique identifier. type enum Monitor user template resource type. 
Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "modified": "2024-02-02T03:04:23.274966+00:00", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0, "versions": [ { "created": "2024-01-02T03:04:23.274966+00:00", "description": "This is a template for monitoring user activity.", "id": "00000000-0000-1234-0000-000000000000", "monitor_definition": { "message": "You may need to add web hosts if this is consistently high.", "name": "Bytes received on host0", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "product:Our Custom App", "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres CPU Monitor", "version": 0 } ] }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Update a monitor user template to a new version returns "OK" response Copy ``` # Path parameters export template_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute your site's API host, e.g. api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X PUT "https://api.datadoghq.com/api/v2/monitor/template/${template_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } EOF ``` ##### Update a monitor user template to a new version returns "OK" response ``` // Update a monitor user template to a new version returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_user_template" in the system MonitorUserTemplateDataID := os.Getenv("MONITOR_USER_TEMPLATE_DATA_ID") body := datadogV2.MonitorUserTemplateUpdateRequest{ Data: datadogV2.MonitorUserTemplateUpdateData{ Attributes: datadogV2.MonitorUserTemplateRequestAttributes{ Description: *datadog.NewNullableString(datadog.PtrString("A description.")), MonitorDefinition: map[string]interface{}{ "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, Tags: []string{ "integration:Azure", }, TemplateVariables: []datadogV2.MonitorUserTemplateTemplateVariablesItems{ { AvailableValues: []string{ "value1", "value2", }, Defaults: []string{ "defaultValue", }, Name: "regionName", TagKey: datadog.PtrString("datacenter"), }, }, Title: "Postgres DB example-monitor", }, Id: MonitorUserTemplateDataID, Type: datadogV2.MONITORUSERTEMPLATERESOURCETYPE_MONITOR_USER_TEMPLATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) resp, r, err := api.UpdateMonitorUserTemplate(ctx, MonitorUserTemplateDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.UpdateMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) }
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `MonitorsApi.UpdateMonitorUserTemplate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a monitor user template to a new version returns "OK" response ``` // Update a monitor user template to a new version returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateRequestAttributes; import com.datadog.api.client.v2.model.MonitorUserTemplateResourceType; import com.datadog.api.client.v2.model.MonitorUserTemplateResponse; import com.datadog.api.client.v2.model.MonitorUserTemplateTemplateVariablesItems; import com.datadog.api.client.v2.model.MonitorUserTemplateUpdateData; import com.datadog.api.client.v2.model.MonitorUserTemplateUpdateRequest; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_user_template" in the system String MONITOR_USER_TEMPLATE_DATA_ID = System.getenv("MONITOR_USER_TEMPLATE_DATA_ID"); MonitorUserTemplateUpdateRequest body = new MonitorUserTemplateUpdateRequest() .data( new MonitorUserTemplateUpdateData() .attributes( new MonitorUserTemplateRequestAttributes() .description("A description.") .monitorDefinition( Map.ofEntries( Map.entry("message", "A msg."), Map.entry("name", "A name example-monitor"), Map.entry( "query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), Map.entry("type", "query alert"))) .tags(Collections.singletonList("integration:Azure")) .templateVariables( Collections.singletonList( new MonitorUserTemplateTemplateVariablesItems() .availableValues(Arrays.asList("value1", "value2")) .defaults(Collections.singletonList("defaultValue")) .name("regionName") .tagKey("datacenter"))) .title("Postgres DB example-monitor")) .id(MONITOR_USER_TEMPLATE_DATA_ID) .type(MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE)); try { MonitorUserTemplateResponse result = apiInstance.updateMonitorUserTemplate(MONITOR_USER_TEMPLATE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#updateMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a monitor user template to a new version returns "OK" response ``` """ Update a monitor user 
template to a new version returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_user_template_request_attributes import MonitorUserTemplateRequestAttributes from datadog_api_client.v2.model.monitor_user_template_resource_type import MonitorUserTemplateResourceType from datadog_api_client.v2.model.monitor_user_template_template_variables_items import ( MonitorUserTemplateTemplateVariablesItems, ) from datadog_api_client.v2.model.monitor_user_template_update_data import MonitorUserTemplateUpdateData from datadog_api_client.v2.model.monitor_user_template_update_request import MonitorUserTemplateUpdateRequest # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = environ["MONITOR_USER_TEMPLATE_DATA_ID"] body = MonitorUserTemplateUpdateRequest( data=MonitorUserTemplateUpdateData( attributes=MonitorUserTemplateRequestAttributes( description="A description.", monitor_definition=dict( [ ("message", "A msg."), ("name", "A name example-monitor"), ("query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ("type", "query alert"), ] ), tags=[ "integration:Azure", ], template_variables=[ MonitorUserTemplateTemplateVariablesItems( available_values=[ "value1", "value2", ], defaults=[ "defaultValue", ], name="regionName", tag_key="datacenter", ), ], title="Postgres DB example-monitor", ), id=MONITOR_USER_TEMPLATE_DATA_ID, type=MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE, ), ) configuration = Configuration() configuration.unstable_operations["update_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) response = api_instance.update_monitor_user_template(template_id=MONITOR_USER_TEMPLATE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a monitor user template to a new version returns "OK" response ``` # Update a monitor user template to a new version returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = ENV["MONITOR_USER_TEMPLATE_DATA_ID"] body = DatadogAPIClient::V2::MonitorUserTemplateUpdateRequest.new({ data: DatadogAPIClient::V2::MonitorUserTemplateUpdateData.new({ attributes: DatadogAPIClient::V2::MonitorUserTemplateRequestAttributes.new({ description: "A description.", monitor_definition: { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, tags: [ "integration:Azure", ], template_variables: [ DatadogAPIClient::V2::MonitorUserTemplateTemplateVariablesItems.new({ available_values: [ "value1", "value2", ], defaults: [ "defaultValue", ], name: "regionName", tag_key: "datacenter", }), ], title: "Postgres DB example-monitor", }), id: MONITOR_USER_TEMPLATE_DATA_ID, type: 
DatadogAPIClient::V2::MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, }), }) p api_instance.update_monitor_user_template(MONITOR_USER_TEMPLATE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a monitor user template to a new version returns "OK" response ``` // Update a monitor user template to a new version returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorUserTemplateRequestAttributes; use datadog_api_client::datadogV2::model::MonitorUserTemplateResourceType; use datadog_api_client::datadogV2::model::MonitorUserTemplateTemplateVariablesItems; use datadog_api_client::datadogV2::model::MonitorUserTemplateUpdateData; use datadog_api_client::datadogV2::model::MonitorUserTemplateUpdateRequest; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "monitor_user_template" in the system let monitor_user_template_data_id = std::env::var("MONITOR_USER_TEMPLATE_DATA_ID").unwrap(); let body = MonitorUserTemplateUpdateRequest::new(MonitorUserTemplateUpdateData::new( MonitorUserTemplateRequestAttributes::new( BTreeMap::from([ ("message".to_string(), Value::from("A msg.")), ("name".to_string(), Value::from("A name example-monitor")), ( "query".to_string(), Value::from("avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ), ("type".to_string(), Value::from("query alert")), ]), vec!["integration:Azure".to_string()], "Postgres DB example-monitor".to_string(), ) .description(Some("A description.".to_string())) .template_variables(vec![MonitorUserTemplateTemplateVariablesItems::new( "regionName".to_string(), ) .available_values(vec!["value1".to_string(), "value2".to_string()]) .defaults(vec!["defaultValue".to_string()]) .tag_key("datacenter".to_string())]), monitor_user_template_data_id.clone(), MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api .update_monitor_user_template(monitor_user_template_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a monitor user template to a new version returns "OK" response ``` /** * Update a monitor user template to a new version returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_user_template" in the system const MONITOR_USER_TEMPLATE_DATA_ID = 
process.env .MONITOR_USER_TEMPLATE_DATA_ID as string; const params: v2.MonitorsApiUpdateMonitorUserTemplateRequest = { body: { data: { attributes: { description: "A description.", monitorDefinition: { message: "A msg.", name: "A name example-monitor", query: "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", type: "query alert", }, tags: ["integration:Azure"], templateVariables: [ { availableValues: ["value1", "value2"], defaults: ["defaultValue"], name: "regionName", tagKey: "datacenter", }, ], title: "Postgres DB example-monitor", }, id: MONITOR_USER_TEMPLATE_DATA_ID, type: "monitor-user-template", }, }, templateId: MONITOR_USER_TEMPLATE_DATA_ID, }; apiInstance .updateMonitorUserTemplate(params) .then((data: v2.MonitorUserTemplateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-user-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor-user-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/monitor/template/{template_id}https://api.ap2.datadoghq.com/api/v2/monitor/template/{template_id}https://api.datadoghq.eu/api/v2/monitor/template/{template_id}https://api.ddog-gov.com/api/v2/monitor/template/{template_id}https://api.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us3.datadoghq.com/api/v2/monitor/template/{template_id}https://api.us5.datadoghq.com/api/v2/monitor/template/{template_id} ### Overview Delete an existing monitor user template by its ID. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description template_id [_required_] string ID of the monitor user template. ### Response * [204](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorUserTemplate-204-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorUserTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#DeleteMonitorUserTemplate-429-v2) OK Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Delete a monitor user template Copy ``` # Path parameters export template_id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute your site's API host, e.g. api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X DELETE "https://api.datadoghq.com/api/v2/monitor/template/${template_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a monitor user template ``` """ Delete a monitor user template returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi configuration = Configuration() configuration.unstable_operations["delete_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) api_instance.delete_monitor_user_template( template_id="template_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" # set DD_SITE to your Datadog site if it is not datadoghq.com ``` ##### Delete a monitor user template ``` # Delete a monitor user template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new api_instance.delete_monitor_user_template("template_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" # set DD_SITE to your Datadog site if it is not datadoghq.com ``` ##### Delete a monitor user template ``` // Delete a monitor user template returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) r, err := api.DeleteMonitorUserTemplate(ctx, "template_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.DeleteMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a monitor user template ``` // Delete a monitor user template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); try { apiInstance.deleteMonitorUserTemplate("template_id"); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#deleteMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a monitor user template ``` // Delete a monitor user template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api .delete_monitor_user_template("template_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a monitor user template ``` /** * Delete a monitor user template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiDeleteMonitorUserTemplateRequest = { templateId: "template_id", }; apiInstance .deleteMonitorUserTemplate(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor-user-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor-user-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/monitor/template/validatehttps://api.ap2.datadoghq.com/api/v2/monitor/template/validatehttps://api.datadoghq.eu/api/v2/monitor/template/validatehttps://api.ddog-gov.com/api/v2/monitor/template/validatehttps://api.datadoghq.com/api/v2/monitor/template/validatehttps://api.us3.datadoghq.com/api/v2/monitor/template/validatehttps://api.us5.datadoghq.com/api/v2/monitor/template/validate ### Overview Validate the structure and content of a monitor user template. This endpoint requires the `monitor_config_policy_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Monitor user template data. attributes [_required_] object Attributes for a monitor user template. description string A brief description of the monitor user template. monitor_definition [_required_] object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [_required_] [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title [_required_] string The title of the monitor user template. type [_required_] enum Monitor user template resource type. 
Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "type": "monitor-user-template" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitorUserTemplate-204-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitorUserTemplate-400-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#ValidateMonitorUserTemplate-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Validate a monitor user template returns "OK" response Copy ``` # Curl command (api.datadoghq.com shown; substitute your site's API host, e.g. api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X POST "https://api.datadoghq.com/api/v2/monitor/template/validate" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "type": "monitor-user-template" } } EOF ``` ##### Validate a monitor user template returns "OK" response ``` // Validate a monitor user template returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.MonitorUserTemplateCreateRequest{ Data: datadogV2.MonitorUserTemplateCreateData{ Attributes: datadogV2.MonitorUserTemplateRequestAttributes{ Description: *datadog.NewNullableString(datadog.PtrString("A description.")), MonitorDefinition: map[string]interface{}{ "message": "A msg.", "name":
"A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, Tags: []string{ "integration:Azure", }, TemplateVariables: []datadogV2.MonitorUserTemplateTemplateVariablesItems{ { AvailableValues: []string{ "value1", "value2", }, Defaults: []string{ "defaultValue", }, Name: "regionName", TagKey: datadog.PtrString("datacenter"), }, }, Title: "Postgres DB example-monitor", }, Type: datadogV2.MONITORUSERTEMPLATERESOURCETYPE_MONITOR_USER_TEMPLATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ValidateMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) r, err := api.ValidateMonitorUserTemplate(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ValidateMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate a monitor user template returns "OK" response ``` // Validate a monitor user template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateCreateData; import com.datadog.api.client.v2.model.MonitorUserTemplateCreateRequest; import com.datadog.api.client.v2.model.MonitorUserTemplateRequestAttributes; import com.datadog.api.client.v2.model.MonitorUserTemplateResourceType; import com.datadog.api.client.v2.model.MonitorUserTemplateTemplateVariablesItems; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.validateMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); MonitorUserTemplateCreateRequest body = new MonitorUserTemplateCreateRequest() .data( new MonitorUserTemplateCreateData() .attributes( new MonitorUserTemplateRequestAttributes() .description("A description.") .monitorDefinition( Map.ofEntries( Map.entry("message", "A msg."), Map.entry("name", "A name example-monitor"), Map.entry( "query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), Map.entry("type", "query alert"))) .tags(Collections.singletonList("integration:Azure")) .templateVariables( Collections.singletonList( new MonitorUserTemplateTemplateVariablesItems() .availableValues(Arrays.asList("value1", "value2")) .defaults(Collections.singletonList("defaultValue")) .name("regionName") .tagKey("datacenter"))) .title("Postgres DB example-monitor")) .type(MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE)); try { apiInstance.validateMonitorUserTemplate(body); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#validateMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` 
Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate a monitor user template returns "OK" response ``` """ Validate a monitor user template returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_user_template_create_data import MonitorUserTemplateCreateData from datadog_api_client.v2.model.monitor_user_template_create_request import MonitorUserTemplateCreateRequest from datadog_api_client.v2.model.monitor_user_template_request_attributes import MonitorUserTemplateRequestAttributes from datadog_api_client.v2.model.monitor_user_template_resource_type import MonitorUserTemplateResourceType from datadog_api_client.v2.model.monitor_user_template_template_variables_items import ( MonitorUserTemplateTemplateVariablesItems, ) body = MonitorUserTemplateCreateRequest( data=MonitorUserTemplateCreateData( attributes=MonitorUserTemplateRequestAttributes( description="A description.", monitor_definition=dict( [ ("message", "A msg."), ("name", "A name example-monitor"), ("query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ("type", "query alert"), ] ), tags=[ "integration:Azure", ], template_variables=[ MonitorUserTemplateTemplateVariablesItems( available_values=[ "value1", "value2", ], defaults=[ "defaultValue", ], name="regionName", tag_key="datacenter", ), ], title="Postgres DB example-monitor", ), type=MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE, ), ) configuration = Configuration() configuration.unstable_operations["validate_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) api_instance.validate_monitor_user_template(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate a monitor user template returns "OK" response ``` # Validate a monitor user template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.validate_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new body = DatadogAPIClient::V2::MonitorUserTemplateCreateRequest.new({ data: DatadogAPIClient::V2::MonitorUserTemplateCreateData.new({ attributes: DatadogAPIClient::V2::MonitorUserTemplateRequestAttributes.new({ description: "A description.", monitor_definition: { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, tags: [ "integration:Azure", ], template_variables: [ DatadogAPIClient::V2::MonitorUserTemplateTemplateVariablesItems.new({ available_values: [ "value1", "value2", ], defaults: [ "defaultValue", ], name: "regionName", tag_key: "datacenter", }), ], title: "Postgres DB example-monitor", }), type: 
DatadogAPIClient::V2::MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, }), }) api_instance.validate_monitor_user_template(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate a monitor user template returns "OK" response ``` // Validate a monitor user template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorUserTemplateCreateData; use datadog_api_client::datadogV2::model::MonitorUserTemplateCreateRequest; use datadog_api_client::datadogV2::model::MonitorUserTemplateRequestAttributes; use datadog_api_client::datadogV2::model::MonitorUserTemplateResourceType; use datadog_api_client::datadogV2::model::MonitorUserTemplateTemplateVariablesItems; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = MonitorUserTemplateCreateRequest::new(MonitorUserTemplateCreateData::new( MonitorUserTemplateRequestAttributes::new( BTreeMap::from([ ("message".to_string(), Value::from("A msg.")), ("name".to_string(), Value::from("A name example-monitor")), ( "query".to_string(), Value::from("avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ), ("type".to_string(), Value::from("query alert")), ]), vec!["integration:Azure".to_string()], "Postgres DB example-monitor".to_string(), ) .description(Some("A description.".to_string())) .template_variables(vec![MonitorUserTemplateTemplateVariablesItems::new( "regionName".to_string(), ) .available_values(vec!["value1".to_string(), "value2".to_string()]) .defaults(vec!["defaultValue".to_string()]) .tag_key("datacenter".to_string())]), MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ValidateMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api.validate_monitor_user_template(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate a monitor user template returns "OK" response ``` /** * Validate a monitor user template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.validateMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); const params: v2.MonitorsApiValidateMonitorUserTemplateRequest = { body: { data: { attributes: { description: "A description.", monitorDefinition: { message: "A msg.", name: "A name example-monitor", query: "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", type: "query alert", }, tags: ["integration:Azure"], templateVariables: [ { availableValues: ["value1", "value2"], defaults: ["defaultValue"], name: 
"regionName", tagKey: "datacenter", }, ], title: "Postgres DB example-monitor", }, type: "monitor-user-template", }, }, }; apiInstance .validateMonitorUserTemplate(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate an existing monitor user template](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor-user-template) * [v2 (latest)](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor-user-template-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/monitor/template/{template_id}/validatehttps://api.ap2.datadoghq.com/api/v2/monitor/template/{template_id}/validatehttps://api.datadoghq.eu/api/v2/monitor/template/{template_id}/validatehttps://api.ddog-gov.com/api/v2/monitor/template/{template_id}/validatehttps://api.datadoghq.com/api/v2/monitor/template/{template_id}/validatehttps://api.us3.datadoghq.com/api/v2/monitor/template/{template_id}/validatehttps://api.us5.datadoghq.com/api/v2/monitor/template/{template_id}/validate ### Overview Validate the structure and content of an existing monitor user template being updated to a new version. This endpoint requires the `monitor_config_policy_write` permission. ### Arguments #### Path Parameters Name Type Description template_id [_required_] string ID of the monitor user template. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) Field Type Description data [_required_] object Monitor user template data. attributes [_required_] object Attributes for a monitor user template. description string A brief description of the monitor user template. monitor_definition [_required_] object A valid monitor definition in the same format as the [V1 Monitor API](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor). tags [_required_] [string] The definition of `MonitorUserTemplateTags` object. template_variables [object] The definition of `MonitorUserTemplateTemplateVariables` object. available_values [string] Available values for the variable. defaults [string] Default values of the template variable. name [_required_] string The name of the template variable. tag_key string The tag key associated with the variable. This works the same as dashboard template variables. title [_required_] string The title of the monitor user template. id [_required_] string The unique identifier. type [_required_] enum Monitor user template resource type. 
Allowed enum values: `monitor-user-template` default: `monitor-user-template` ``` { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitorUserTemplate-204-v2) * [400](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitorUserTemplate-400-v2) * [404](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitorUserTemplate-404-v2) * [429](https://docs.datadoghq.com/api/latest/monitors/#ValidateExistingMonitorUserTemplate-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/monitors/) * [Example](https://docs.datadoghq.com/api/latest/monitors/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/monitors/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/monitors/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/monitors/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/monitors/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/monitors/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/monitors/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/monitors/?code-lang=typescript) ##### Validate an existing monitor user template returns "OK" response Copy ``` # Path parameters export template_id="00000000-0000-1234-0000-000000000000" # Curl command (api.datadoghq.com shown; substitute your site's API host, e.g. api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X POST "https://api.datadoghq.com/api/v2/monitor/template/${template_id}/validate" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "A description.", "monitor_definition": { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert" }, "tags": [ "integration:Azure" ], "template_variables": [ { "available_values": [ "value1", "value2" ], "defaults": [ "defaultValue" ], "name": "regionName", "tag_key": "datacenter" } ], "title": "Postgres DB example-monitor" }, "id": "00000000-0000-1234-0000-000000000000", "type": "monitor-user-template" } } EOF ``` ##### Validate an existing monitor user template returns "OK" response ``` // Validate an existing monitor user template returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "monitor_user_template" in the system MonitorUserTemplateDataID := os.Getenv("MONITOR_USER_TEMPLATE_DATA_ID") body := datadogV2.MonitorUserTemplateUpdateRequest{ Data: datadogV2.MonitorUserTemplateUpdateData{ Attributes: datadogV2.MonitorUserTemplateRequestAttributes{ Description: *datadog.NewNullableString(datadog.PtrString("A description.")), MonitorDefinition: map[string]interface{}{ "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, Tags: []string{ "integration:Azure", }, TemplateVariables: []datadogV2.MonitorUserTemplateTemplateVariablesItems{ { AvailableValues: []string{ "value1", "value2", }, Defaults: []string{ "defaultValue", }, Name: "regionName", TagKey: datadog.PtrString("datacenter"), }, }, Title: "Postgres DB example-monitor", }, Id: MonitorUserTemplateDataID, Type: datadogV2.MONITORUSERTEMPLATERESOURCETYPE_MONITOR_USER_TEMPLATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ValidateExistingMonitorUserTemplate", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewMonitorsApi(apiClient) r, err := api.ValidateExistingMonitorUserTemplate(ctx, MonitorUserTemplateDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `MonitorsApi.ValidateExistingMonitorUserTemplate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy
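Because this endpoint only checks the proposed new version and persists nothing, a common pattern is to dry-run an edit here and, only if it passes, publish it with the update endpoint above. The sketch below chains the two Python calls shown in this section, `validate_existing_monitor_user_template` followed by `update_monitor_user_template`; the trimmed request body (required fields only) and the `MONITOR_USER_TEMPLATE_DATA_ID` environment variable simply mirror the surrounding examples and are not mandated by the API.

```
"""
Validate a new version of an existing template, then publish it (sketch).
Combines the validate and update calls from the Python examples in this section.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.monitors_api import MonitorsApi
from datadog_api_client.v2.model.monitor_user_template_request_attributes import MonitorUserTemplateRequestAttributes
from datadog_api_client.v2.model.monitor_user_template_resource_type import MonitorUserTemplateResourceType
from datadog_api_client.v2.model.monitor_user_template_update_data import MonitorUserTemplateUpdateData
from datadog_api_client.v2.model.monitor_user_template_update_request import MonitorUserTemplateUpdateRequest

# ID of an existing monitor user template (as in the other examples in this section).
TEMPLATE_ID = environ["MONITOR_USER_TEMPLATE_DATA_ID"]

# Required fields only; the full examples also set description and template_variables.
body = MonitorUserTemplateUpdateRequest(
    data=MonitorUserTemplateUpdateData(
        attributes=MonitorUserTemplateRequestAttributes(
            monitor_definition={
                "message": "A msg.",
                "name": "A name example-monitor",
                "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100",
                "type": "query alert",
            },
            tags=["integration:Azure"],
            title="Postgres DB example-monitor",
        ),
        id=TEMPLATE_ID,
        type=MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE,
    ),
)

configuration = Configuration()
configuration.unstable_operations["validate_existing_monitor_user_template"] = True
configuration.unstable_operations["update_monitor_user_template"] = True

with ApiClient(configuration) as api_client:
    api_instance = MonitorsApi(api_client)
    # Dry run: a validation failure raises an exception before anything is persisted.
    api_instance.validate_existing_monitor_user_template(template_id=TEMPLATE_ID, body=body)
    # Validation passed: publish the new version of the template.
    response = api_instance.update_monitor_user_template(template_id=TEMPLATE_ID, body=body)
    print(response)
```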
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate an existing monitor user template returns "OK" response ``` // Validate an existing monitor user template returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.MonitorsApi; import com.datadog.api.client.v2.model.MonitorUserTemplateRequestAttributes; import com.datadog.api.client.v2.model.MonitorUserTemplateResourceType; import com.datadog.api.client.v2.model.MonitorUserTemplateTemplateVariablesItems; import com.datadog.api.client.v2.model.MonitorUserTemplateUpdateData; import com.datadog.api.client.v2.model.MonitorUserTemplateUpdateRequest; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.validateExistingMonitorUserTemplate", true); MonitorsApi apiInstance = new MonitorsApi(defaultClient); // there is a valid "monitor_user_template" in the system String MONITOR_USER_TEMPLATE_DATA_ID = System.getenv("MONITOR_USER_TEMPLATE_DATA_ID"); MonitorUserTemplateUpdateRequest body = new MonitorUserTemplateUpdateRequest() .data( new MonitorUserTemplateUpdateData() .attributes( new MonitorUserTemplateRequestAttributes() .description("A description.") .monitorDefinition( Map.ofEntries( Map.entry("message", "A msg."), Map.entry("name", "A name example-monitor"), Map.entry( "query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), Map.entry("type", "query alert"))) .tags(Collections.singletonList("integration:Azure")) .templateVariables( Collections.singletonList( new MonitorUserTemplateTemplateVariablesItems() .availableValues(Arrays.asList("value1", "value2")) .defaults(Collections.singletonList("defaultValue")) .name("regionName") .tagKey("datacenter"))) .title("Postgres DB example-monitor")) .id(MONITOR_USER_TEMPLATE_DATA_ID) .type(MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE)); try { apiInstance.validateExistingMonitorUserTemplate(MONITOR_USER_TEMPLATE_DATA_ID, body); } catch (ApiException e) { System.err.println("Exception when calling MonitorsApi#validateExistingMonitorUserTemplate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate an existing monitor user template returns "OK" response ``` """ Validate an existing monitor user template returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.monitors_api import MonitorsApi from datadog_api_client.v2.model.monitor_user_template_request_attributes import 
MonitorUserTemplateRequestAttributes from datadog_api_client.v2.model.monitor_user_template_resource_type import MonitorUserTemplateResourceType from datadog_api_client.v2.model.monitor_user_template_template_variables_items import ( MonitorUserTemplateTemplateVariablesItems, ) from datadog_api_client.v2.model.monitor_user_template_update_data import MonitorUserTemplateUpdateData from datadog_api_client.v2.model.monitor_user_template_update_request import MonitorUserTemplateUpdateRequest # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = environ["MONITOR_USER_TEMPLATE_DATA_ID"] body = MonitorUserTemplateUpdateRequest( data=MonitorUserTemplateUpdateData( attributes=MonitorUserTemplateRequestAttributes( description="A description.", monitor_definition=dict( [ ("message", "A msg."), ("name", "A name example-monitor"), ("query", "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ("type", "query alert"), ] ), tags=[ "integration:Azure", ], template_variables=[ MonitorUserTemplateTemplateVariablesItems( available_values=[ "value1", "value2", ], defaults=[ "defaultValue", ], name="regionName", tag_key="datacenter", ), ], title="Postgres DB example-monitor", ), id=MONITOR_USER_TEMPLATE_DATA_ID, type=MonitorUserTemplateResourceType.MONITOR_USER_TEMPLATE, ), ) configuration = Configuration() configuration.unstable_operations["validate_existing_monitor_user_template"] = True with ApiClient(configuration) as api_client: api_instance = MonitorsApi(api_client) api_instance.validate_existing_monitor_user_template(template_id=MONITOR_USER_TEMPLATE_DATA_ID, body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate an existing monitor user template returns "OK" response ``` # Validate an existing monitor user template returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.validate_existing_monitor_user_template".to_sym] = true end api_instance = DatadogAPIClient::V2::MonitorsAPI.new # there is a valid "monitor_user_template" in the system MONITOR_USER_TEMPLATE_DATA_ID = ENV["MONITOR_USER_TEMPLATE_DATA_ID"] body = DatadogAPIClient::V2::MonitorUserTemplateUpdateRequest.new({ data: DatadogAPIClient::V2::MonitorUserTemplateUpdateData.new({ attributes: DatadogAPIClient::V2::MonitorUserTemplateRequestAttributes.new({ description: "A description.", monitor_definition: { "message": "A msg.", "name": "A name example-monitor", "query": "avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", "type": "query alert", }, tags: [ "integration:Azure", ], template_variables: [ DatadogAPIClient::V2::MonitorUserTemplateTemplateVariablesItems.new({ available_values: [ "value1", "value2", ], defaults: [ "defaultValue", ], name: "regionName", tag_key: "datacenter", }), ], title: "Postgres DB example-monitor", }), id: MONITOR_USER_TEMPLATE_DATA_ID, type: DatadogAPIClient::V2::MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, }), }) api_instance.validate_existing_monitor_user_template(MONITOR_USER_TEMPLATE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the 
example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate an existing monitor user template returns "OK" response ``` // Validate an existing monitor user template returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_monitors::MonitorsAPI; use datadog_api_client::datadogV2::model::MonitorUserTemplateRequestAttributes; use datadog_api_client::datadogV2::model::MonitorUserTemplateResourceType; use datadog_api_client::datadogV2::model::MonitorUserTemplateTemplateVariablesItems; use datadog_api_client::datadogV2::model::MonitorUserTemplateUpdateData; use datadog_api_client::datadogV2::model::MonitorUserTemplateUpdateRequest; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "monitor_user_template" in the system let monitor_user_template_data_id = std::env::var("MONITOR_USER_TEMPLATE_DATA_ID").unwrap(); let body = MonitorUserTemplateUpdateRequest::new(MonitorUserTemplateUpdateData::new( MonitorUserTemplateRequestAttributes::new( BTreeMap::from([ ("message".to_string(), Value::from("A msg.")), ("name".to_string(), Value::from("A name example-monitor")), ( "query".to_string(), Value::from("avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100"), ), ("type".to_string(), Value::from("query alert")), ]), vec!["integration:Azure".to_string()], "Postgres DB example-monitor".to_string(), ) .description(Some("A description.".to_string())) .template_variables(vec![MonitorUserTemplateTemplateVariablesItems::new( "regionName".to_string(), ) .available_values(vec!["value1".to_string(), "value2".to_string()]) .defaults(vec!["defaultValue".to_string()]) .tag_key("datacenter".to_string())]), monitor_user_template_data_id.clone(), MonitorUserTemplateResourceType::MONITOR_USER_TEMPLATE, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ValidateExistingMonitorUserTemplate", true); let api = MonitorsAPI::with_config(configuration); let resp = api .validate_existing_monitor_user_template(monitor_user_template_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate an existing monitor user template returns "OK" response ``` /** * Validate an existing monitor user template returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.validateExistingMonitorUserTemplate"] = true; const apiInstance = new v2.MonitorsApi(configuration); // there is a valid "monitor_user_template" in the system const MONITOR_USER_TEMPLATE_DATA_ID = process.env .MONITOR_USER_TEMPLATE_DATA_ID as string; const params: v2.MonitorsApiValidateExistingMonitorUserTemplateRequest = { body: { data: { attributes: { description: "A description.", monitorDefinition: { message: "A msg.", name: "A name example-monitor", query: 
"avg(last_5m):sum:system.net.bytes_rcvd{host:host0} > 100", type: "query alert", }, tags: ["integration:Azure"], templateVariables: [ { availableValues: ["value1", "value2"], defaults: ["defaultValue"], name: "regionName", tagKey: "datacenter", }, ], title: "Postgres DB example-monitor", }, id: MONITOR_USER_TEMPLATE_DATA_ID, type: "monitor-user-template", }, }, templateId: MONITOR_USER_TEMPLATE_DATA_ID, }; apiInstance .validateExistingMonitorUserTemplate(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=3739093d-2e4b-4de6-96e8-bfee9390624a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=c06e3c58-9146-4399-bf9e-49f62e42ca81&pt=Monitors&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmonitors%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=3739093d-2e4b-4de6-96e8-bfee9390624a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=c06e3c58-9146-4399-bf9e-49f62e42ca81&pt=Monitors&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmonitors%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=48f2d69f-8b1b-4119-b1c0-1295a47fb142&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Monitors&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fmonitors%2F&r=<=33873&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=509539) --- # Source: https://docs.datadoghq.com/api/latest/network-device-monitoring/ # Network Device Monitoring The Network Device Monitoring API allows you to fetch devices and interfaces and their attributes. See the [Network Device Monitoring page](https://docs.datadoghq.com/network_monitoring/) for more information. ## [Get the list of devices](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-devices) * [v2 (latest)](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-devices-v2) GET https://api.ap1.datadoghq.com/api/v2/ndm/deviceshttps://api.ap2.datadoghq.com/api/v2/ndm/deviceshttps://api.datadoghq.eu/api/v2/ndm/deviceshttps://api.ddog-gov.com/api/v2/ndm/deviceshttps://api.datadoghq.com/api/v2/ndm/deviceshttps://api.us3.datadoghq.com/api/v2/ndm/deviceshttps://api.us5.datadoghq.com/api/v2/ndm/devices ### Overview Get the list of devices. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort string The field to sort the devices by. filter[tag] string Filter devices by tag. 
### Response * [200](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDevices-200-v2) * [400](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDevices-400-v2) * [403](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDevices-403-v2) * [429](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDevices-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) List devices response. Field Type Description data [object] The list devices response data. attributes object The device attributes description string The device description device_type string The device type integration string The device integration interface_statuses object Count of the device interfaces by status down int64 The number of interfaces that are down off int64 The number of interfaces that are off up int64 The number of interfaces that are up warning int64 The number of interfaces that are in a warning state ip_address string The device IP address location string The device location model string The device model name string The device name os_hostname string The device OS hostname os_name string The device OS name os_version string The device OS version ping_status string The device ping status product_name string The device product name serial_number string The device serial number status string The device SNMP status subnet string The device subnet sys_object_id string The device `sys_object_id` tags [string] The list of device tags vendor string The device vendor version string The device version id string The device ID type string The type of the resource. The value should always be device. meta object Object describing meta attributes of response. page object Pagination object. total_filtered_count int64 Total count of devices matched by the filter. ``` { "data": [ { "attributes": { "description": "a device monitored with NDM", "device_type": "other", "integration": "snmp", "interface_statuses": { "down": "integer", "off": "integer", "up": "integer", "warning": "integer" }, "ip_address": "1.2.3.4", "location": "paris", "model": "xx-123", "name": "example device", "os_hostname": "string", "os_name": "example OS", "os_version": "1.0.2", "ping_status": "unmonitored", "product_name": "example device", "serial_number": "X12345", "status": "ok", "subnet": "1.2.3.4/24", "sys_object_id": "1.3.6.1.4.1.99999", "tags": [ "device_ip:1.2.3.4", "device_id:example:1.2.3.4" ], "vendor": "example vendor", "version": "1.2.3" }, "id": "example:1.2.3.4", "type": "string" } ], "meta": { "page": { "total_filtered_count": 1 } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=typescript) ##### Get the list of devices Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ndm/devices" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of devices ``` """ Get the list of devices returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NetworkDeviceMonitoringApi(api_client) response = api_instance.list_devices( page_size=1, page_number=0, filter_tag="device_namespace:default", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of devices ``` # Get the list of devices returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::NetworkDeviceMonitoringAPI.new opts = { page_size: 1, page_number: 0, filter_tag: "device_namespace:default", } p api_instance.list_devices(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of devices ``` // Get the list of devices returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewNetworkDeviceMonitoringApi(apiClient) resp, r, err := api.ListDevices(ctx, *datadogV2.NewListDevicesOptionalParameters().WithPageSize(1).WithPageNumber(0).WithFilterTag("device_namespace:default")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NetworkDeviceMonitoringApi.ListDevices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NetworkDeviceMonitoringApi.ListDevices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of devices ``` // Get the list of devices returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi.ListDevicesOptionalParameters; import com.datadog.api.client.v2.model.ListDevicesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NetworkDeviceMonitoringApi apiInstance = new NetworkDeviceMonitoringApi(defaultClient); try { ListDevicesResponse result = apiInstance.listDevices( new ListDevicesOptionalParameters() .pageSize(1L) .pageNumber(0L) .filterTag("device_namespace:default")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NetworkDeviceMonitoringApi#listDevices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of devices ``` // Get the list of devices returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_network_device_monitoring::ListDevicesOptionalParams; use datadog_api_client::datadogV2::api_network_device_monitoring::NetworkDeviceMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = NetworkDeviceMonitoringAPI::with_config(configuration); let resp = api .list_devices( ListDevicesOptionalParams::default() .page_size(1) .page_number(0) .filter_tag("device_namespace:default".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of devices ``` /** * Get the list of devices returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.NetworkDeviceMonitoringApi(configuration); const params: v2.NetworkDeviceMonitoringApiListDevicesRequest = { pageSize: 1, pageNumber: 0, filterTag: "device_namespace:default", }; apiInstance .listDevices(params) .then((data: v2.ListDevicesResponse) => { 
console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the device details](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-device-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-device-details-v2) GET https://api.ap1.datadoghq.com/api/v2/ndm/devices/{device_id}https://api.ap2.datadoghq.com/api/v2/ndm/devices/{device_id}https://api.datadoghq.eu/api/v2/ndm/devices/{device_id}https://api.ddog-gov.com/api/v2/ndm/devices/{device_id}https://api.datadoghq.com/api/v2/ndm/devices/{device_id}https://api.us3.datadoghq.com/api/v2/ndm/devices/{device_id}https://api.us5.datadoghq.com/api/v2/ndm/devices/{device_id} ### Overview Get the device details. ### Arguments #### Path Parameters Name Type Description device_id [_required_] string The id of the device to fetch. ### Response * [200](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetDevice-200-v2) * [403](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetDevice-403-v2) * [404](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetDevice-404-v2) * [429](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetDevice-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) The `GetDevice` operation’s response. Field Type Description data object Get device response data. attributes object The device attributes description string A description of the device. device_type string The type of the device. integration string The integration of the device. ip_address string The IP address of the device. location string The location of the device. model string The model of the device. name string The name of the device. os_hostname string The operating system hostname of the device. os_name string The operating system name of the device. os_version string The operating system version of the device. ping_status string The ping status of the device. product_name string The product name of the device. serial_number string The serial number of the device. status string The status of the device. subnet string The subnet of the device. sys_object_id string The device `sys_object_id`. tags [string] A list of tags associated with the device. vendor string The vendor of the device. version string The version of the device. id string The device ID type string The type of the resource. The value should always be device. 
``` { "data": { "attributes": { "description": "a device monitored with NDM", "device_type": "other", "integration": "snmp", "ip_address": "1.2.3.4", "location": "paris", "model": "xx-123", "name": "example device", "os_hostname": "1.0.2", "os_name": "example OS", "os_version": "1.0.2", "ping_status": "unmonitored", "product_name": "example device", "serial_number": "X12345", "status": "ok", "subnet": "1.2.3.4/24", "sys_object_id": "1.3.6.1.4.1.99999", "tags": [ "device_ip:1.2.3.4", "device_id:example:1.2.3.4" ], "vendor": "example vendor", "version": "1.2.3" }, "id": "example:1.2.3.4", "type": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=typescript) ##### Get the device details Copy ``` # Path parameters export device_id="example:1.2.3.4" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ndm/devices/${device_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the device details ``` """ Get the device details returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NetworkDeviceMonitoringApi(api_client) response = api_instance.get_device( device_id="default_device", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the device details ``` # Get the device 
details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::NetworkDeviceMonitoringAPI.new p api_instance.get_device("default_device") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the device details ``` // Get the device details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewNetworkDeviceMonitoringApi(apiClient) resp, r, err := api.GetDevice(ctx, "default_device") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NetworkDeviceMonitoringApi.GetDevice`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NetworkDeviceMonitoringApi.GetDevice`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the device details ``` // Get the device details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi; import com.datadog.api.client.v2.model.GetDeviceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NetworkDeviceMonitoringApi apiInstance = new NetworkDeviceMonitoringApi(defaultClient); try { GetDeviceResponse result = apiInstance.getDevice("default_device"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NetworkDeviceMonitoringApi#getDevice"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the device details ``` // Get the device details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_network_device_monitoring::NetworkDeviceMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = NetworkDeviceMonitoringAPI::with_config(configuration); let resp = api.get_device("default_device".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", 
resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the device details ``` /** * Get the device details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.NetworkDeviceMonitoringApi(configuration); const params: v2.NetworkDeviceMonitoringApiGetDeviceRequest = { deviceId: "default_device", }; apiInstance .getDevice(params) .then((data: v2.GetDeviceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of interfaces of the device](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-interfaces-of-the-device) * [v2 (latest)](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-interfaces-of-the-device-v2) GET https://api.ap1.datadoghq.com/api/v2/ndm/interfaceshttps://api.ap2.datadoghq.com/api/v2/ndm/interfaceshttps://api.datadoghq.eu/api/v2/ndm/interfaceshttps://api.ddog-gov.com/api/v2/ndm/interfaceshttps://api.datadoghq.com/api/v2/ndm/interfaceshttps://api.us3.datadoghq.com/api/v2/ndm/interfaceshttps://api.us5.datadoghq.com/api/v2/ndm/interfaces ### Overview Get the list of interfaces of the device. ### Arguments #### Query Strings Name Type Description device_id [_required_] string The ID of the device to get interfaces from. get_ip_addresses boolean Whether to get the IP addresses of the interfaces. ### Response * [200](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetInterfaces-200-v2) * [403](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetInterfaces-403-v2) * [429](https://docs.datadoghq.com/api/latest/network-device-monitoring/#GetInterfaces-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) The `GetInterfaces` operation’s response. Field Type Description data [object] Get Interfaces response attributes object The interface attributes alias string The interface alias description string The interface description index int64 The interface index ip_addresses [string] The interface IP addresses mac_address string The interface MAC address name string The interface name status enum The interface status Allowed enum values: `up,down,warning,off` id string The interface ID type string The type of the resource. The value should always be interface. 
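The `status` enum (`up`, `down`, `warning`, `off`) makes it straightforward to roll a device's interfaces up into a small health summary. Here is a minimal Python sketch: the device ID is the example value used elsewhere on this page, and converting the response with `to_dict()` is an assumption about the client library, not something the endpoint requires.

```
"""
Tally a device's interfaces by status (up / down / warning / off).

Minimal sketch: the device ID is the example value from this page, and
`to_dict()` is assumed to be available on the client's response model so
the counting works on plain dictionaries.
"""
from collections import Counter

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = NetworkDeviceMonitoringApi(api_client)
    response = api.get_interfaces(device_id="default:1.2.3.4")

    # Count interfaces per status value.
    interfaces = response.to_dict().get("data", [])
    counts = Counter(item.get("attributes", {}).get("status") for item in interfaces)
    for status in ("up", "down", "warning", "off"):
        print(f"{status}: {counts.get(status, 0)}")
```

The 200 response example below shows the `data` items this sketch counts over.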
``` { "data": [ { "attributes": { "alias": "interface_0", "description": "a network interface", "index": 0, "ip_addresses": [ "1.1.1.1", "1.1.1.2" ], "mac_address": "00:00:00:00:00:00", "name": "if0", "status": "up" }, "id": "example:1.2.3.4:99", "type": "string" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=typescript) ##### Get the list of interfaces of the device Copy ``` # Required query arguments export device_id="example:1.2.3.4" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ndm/interfaces?device_id=${device_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of interfaces of the device ``` """ Get the list of interfaces of the device returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NetworkDeviceMonitoringApi(api_client) response = api_instance.get_interfaces( device_id="default:1.2.3.4", get_ip_addresses=True, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of interfaces of the device ``` # Get the list of interfaces of the device returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::NetworkDeviceMonitoringAPI.new opts = { get_ip_addresses: true, } p api_instance.get_interfaces("default:1.2.3.4", opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of interfaces of the device ``` // Get the list of interfaces of the device returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewNetworkDeviceMonitoringApi(apiClient) resp, r, err := api.GetInterfaces(ctx, "default:1.2.3.4", *datadogV2.NewGetInterfacesOptionalParameters().WithGetIpAddresses(true)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NetworkDeviceMonitoringApi.GetInterfaces`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NetworkDeviceMonitoringApi.GetInterfaces`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of interfaces of the device ``` // Get the list of interfaces of the device returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi.GetInterfacesOptionalParameters; import com.datadog.api.client.v2.model.GetInterfacesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NetworkDeviceMonitoringApi apiInstance = new NetworkDeviceMonitoringApi(defaultClient); try { GetInterfacesResponse result = apiInstance.getInterfaces( "default:1.2.3.4", new GetInterfacesOptionalParameters().getIpAddresses(true)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NetworkDeviceMonitoringApi#getInterfaces"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of interfaces of the device ``` // Get the list of interfaces of the device returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_network_device_monitoring::GetInterfacesOptionalParams; use datadog_api_client::datadogV2::api_network_device_monitoring::NetworkDeviceMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = NetworkDeviceMonitoringAPI::with_config(configuration); let resp = api .get_interfaces( 
"default:1.2.3.4".to_string(), GetInterfacesOptionalParams::default().get_ip_addresses(true), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of interfaces of the device ``` /** * Get the list of interfaces of the device returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.NetworkDeviceMonitoringApi(configuration); const params: v2.NetworkDeviceMonitoringApiGetInterfacesRequest = { deviceId: "default:1.2.3.4", getIpAddresses: true, }; apiInstance .getInterfaces(params) .then((data: v2.GetInterfacesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of tags for a device](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-tags-for-a-device) * [v2 (latest)](https://docs.datadoghq.com/api/latest/network-device-monitoring/#get-the-list-of-tags-for-a-device-v2) GET https://api.ap1.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.ap2.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.datadoghq.eu/api/v2/ndm/tags/devices/{device_id}https://api.ddog-gov.com/api/v2/ndm/tags/devices/{device_id}https://api.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.us3.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.us5.datadoghq.com/api/v2/ndm/tags/devices/{device_id} ### Overview Get the list of tags for a device. ### Arguments #### Path Parameters Name Type Description device_id [_required_] string The id of the device to fetch tags for. ### Response * [200](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDeviceUserTags-200-v2) * [403](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDeviceUserTags-403-v2) * [404](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDeviceUserTags-404-v2) * [429](https://docs.datadoghq.com/api/latest/network-device-monitoring/#ListDeviceUserTags-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) List tags response. Field Type Description data object The list tags response data. attributes object The definition of ListTagsResponseDataAttributes object. tags [string] The list of tags id string The device ID type string The type of the resource. The value should always be tags. 
``` { "data": { "attributes": { "tags": [ "tag:test", "tag:testbis" ] }, "id": "example:1.2.3.4", "type": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=typescript) ##### Get the list of tags for a device Copy ``` # Path parameters export device_id="example:1.2.3.4" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ndm/tags/devices/${device_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of tags for a device ``` """ Get the list of tags for a device returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NetworkDeviceMonitoringApi(api_client) response = api_instance.list_device_user_tags( device_id="default_device", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of tags for a device ``` # Get the list of tags for a device returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::NetworkDeviceMonitoringAPI.new p api_instance.list_device_user_tags("default_device") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of tags for a device ``` // Get the list of tags for a device returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewNetworkDeviceMonitoringApi(apiClient) resp, r, err := api.ListDeviceUserTags(ctx, "default_device") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NetworkDeviceMonitoringApi.ListDeviceUserTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NetworkDeviceMonitoringApi.ListDeviceUserTags`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of tags for a device ``` // Get the list of tags for a device returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi; import com.datadog.api.client.v2.model.ListTagsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NetworkDeviceMonitoringApi apiInstance = new NetworkDeviceMonitoringApi(defaultClient); try { ListTagsResponse result = apiInstance.listDeviceUserTags("default_device"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NetworkDeviceMonitoringApi#listDeviceUserTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of tags for a device ``` // Get the list of tags for a device returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_network_device_monitoring::NetworkDeviceMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = NetworkDeviceMonitoringAPI::with_config(configuration); let resp = api .list_device_user_tags("default_device".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of tags for a device ``` /** * Get the list of tags for a device returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.NetworkDeviceMonitoringApi(configuration); const params: v2.NetworkDeviceMonitoringApiListDeviceUserTagsRequest = { deviceId: "default_device", }; apiInstance .listDeviceUserTags(params) .then((data: v2.ListTagsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update the tags for a device](https://docs.datadoghq.com/api/latest/network-device-monitoring/#update-the-tags-for-a-device) * [v2 (latest)](https://docs.datadoghq.com/api/latest/network-device-monitoring/#update-the-tags-for-a-device-v2) PATCH https://api.ap1.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.ap2.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.datadoghq.eu/api/v2/ndm/tags/devices/{device_id}https://api.ddog-gov.com/api/v2/ndm/tags/devices/{device_id}https://api.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.us3.datadoghq.com/api/v2/ndm/tags/devices/{device_id}https://api.us5.datadoghq.com/api/v2/ndm/tags/devices/{device_id} ### Overview Update the tags for a device. ### Arguments #### Path Parameters Name Type Description device_id [_required_] string The id of the device to update tags for. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) Field Type Description data object The list tags response data. attributes object The definition of ListTagsResponseDataAttributes object. tags [string] The list of tags id string The device ID type string The type of the resource. The value should always be tags. ``` { "data": { "attributes": { "tags": [ "tag:test", "tag:testbis" ] }, "id": "default_device", "type": "tags" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/network-device-monitoring/#UpdateDeviceUserTags-200-v2) * [403](https://docs.datadoghq.com/api/latest/network-device-monitoring/#UpdateDeviceUserTags-403-v2) * [404](https://docs.datadoghq.com/api/latest/network-device-monitoring/#UpdateDeviceUserTags-404-v2) * [429](https://docs.datadoghq.com/api/latest/network-device-monitoring/#UpdateDeviceUserTags-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) List tags response. Field Type Description data object The list tags response data. attributes object The definition of ListTagsResponseDataAttributes object. tags [string] The list of tags id string The device ID type string The type of the resource. The value should always be tags. 
``` { "data": { "attributes": { "tags": [ "tag:test", "tag:testbis" ] }, "id": "example:1.2.3.4", "type": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/network-device-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/network-device-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/network-device-monitoring/?code-lang=typescript) ##### Update the tags for a device returns "OK" response Copy ``` # Path parameters export device_id="example:1.2.3.4" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/ndm/tags/devices/${device_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "tags": [ "tag:test", "tag:testbis" ] }, "id": "default_device", "type": "tags" } } EOF ``` ##### Update the tags for a device returns "OK" response ``` // Update the tags for a device returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ListTagsResponse{ Data: &datadogV2.ListTagsResponseData{ Attributes: &datadogV2.ListTagsResponseDataAttributes{ Tags: []string{ "tag:test", "tag:testbis", }, }, Id: datadog.PtrString("default_device"), Type: datadog.PtrString("tags"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewNetworkDeviceMonitoringApi(apiClient) resp, r, err := api.UpdateDeviceUserTags(ctx, "default_device", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NetworkDeviceMonitoringApi.UpdateDeviceUserTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`NetworkDeviceMonitoringApi.UpdateDeviceUserTags`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update the tags for a device returns "OK" response ``` // Update the tags for a device returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.NetworkDeviceMonitoringApi; import com.datadog.api.client.v2.model.ListTagsResponse; import com.datadog.api.client.v2.model.ListTagsResponseData; import com.datadog.api.client.v2.model.ListTagsResponseDataAttributes; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NetworkDeviceMonitoringApi apiInstance = new NetworkDeviceMonitoringApi(defaultClient); ListTagsResponse body = new ListTagsResponse() .data( new ListTagsResponseData() .attributes( new ListTagsResponseDataAttributes() .tags(Arrays.asList("tag:test", "tag:testbis"))) .id("default_device") .type("tags")); try { ListTagsResponse result = apiInstance.updateDeviceUserTags("default_device", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NetworkDeviceMonitoringApi#updateDeviceUserTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update the tags for a device returns "OK" response ``` """ Update the tags for a device returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.network_device_monitoring_api import NetworkDeviceMonitoringApi from datadog_api_client.v2.model.list_tags_response import ListTagsResponse from datadog_api_client.v2.model.list_tags_response_data import ListTagsResponseData from datadog_api_client.v2.model.list_tags_response_data_attributes import ListTagsResponseDataAttributes body = ListTagsResponse( data=ListTagsResponseData( attributes=ListTagsResponseDataAttributes( tags=[ "tag:test", "tag:testbis", ], ), id="default_device", type="tags", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NetworkDeviceMonitoringApi(api_client) response = api_instance.update_device_user_tags(device_id="default_device", body=body) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update the tags for a device returns "OK" response ``` # 
Update the tags for a device returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::NetworkDeviceMonitoringAPI.new body = DatadogAPIClient::V2::ListTagsResponse.new({ data: DatadogAPIClient::V2::ListTagsResponseData.new({ attributes: DatadogAPIClient::V2::ListTagsResponseDataAttributes.new({ tags: [ "tag:test", "tag:testbis", ], }), id: "default_device", type: "tags", }), }) p api_instance.update_device_user_tags("default_device", body) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Update the tags for a device returns "OK" response ``` // Update the tags for a device returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_network_device_monitoring::NetworkDeviceMonitoringAPI; use datadog_api_client::datadogV2::model::ListTagsResponse; use datadog_api_client::datadogV2::model::ListTagsResponseData; use datadog_api_client::datadogV2::model::ListTagsResponseDataAttributes; #[tokio::main] async fn main() { let body = ListTagsResponse::new().data( ListTagsResponseData::new() .attributes( ListTagsResponseDataAttributes::new() .tags(vec!["tag:test".to_string(), "tag:testbis".to_string()]), ) .id("default_device".to_string()) .type_("tags".to_string()), ); let configuration = datadog::Configuration::new(); let api = NetworkDeviceMonitoringAPI::with_config(configuration); let resp = api .update_device_user_tags("default_device".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update the tags for a device returns "OK" response ``` /** * Update the tags for a device returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.NetworkDeviceMonitoringApi(configuration); const params: v2.NetworkDeviceMonitoringApiUpdateDeviceUserTagsRequest = { body: { data: { attributes: { tags: ["tag:test", "tag:testbis"], }, id: "default_device", type: "tags", }, }, deviceId: "default_device", }; apiInstance .updateDeviceUserTags(params) .then((data: v2.ListTagsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=7bfc6e4e-3957-4b28-836c-4394ebabea1a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d0e1e88c-cee3-4d98-9977-dde9c3ba8306&pt=Network%20Device%20Monitoring&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fnetwork-device-monitoring%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=7bfc6e4e-3957-4b28-836c-4394ebabea1a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d0e1e88c-cee3-4d98-9977-dde9c3ba8306&pt=Network%20Device%20Monitoring&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fnetwork-device-monitoring%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=f8cb8352-2655-45f4-9002-45cb5899e364&bo=2&sid=e7969980f0bf11f08c054320cbbd0092&vid=e7972810f0bf11f09bf29f699b2daf47&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Network%20Device%20Monitoring&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fnetwork-device-monitoring%2F&r=<=1239&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=811024) --- # Source: https://docs.datadoghq.com/api/latest/notebooks/ # Notebooks Interact with your notebooks through the API to make it easier to organize, find, and share all of your notebooks with your team and organization. For more information, see the [Notebooks documentation](https://docs.datadoghq.com/notebooks/). ## [Create a notebook](https://docs.datadoghq.com/api/latest/notebooks/#create-a-notebook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/notebooks/#create-a-notebook-v1) POST https://api.ap1.datadoghq.com/api/v1/notebookshttps://api.ap2.datadoghq.com/api/v1/notebookshttps://api.datadoghq.eu/api/v1/notebookshttps://api.ddog-gov.com/api/v1/notebookshttps://api.datadoghq.com/api/v1/notebookshttps://api.us3.datadoghq.com/api/v1/notebookshttps://api.us5.datadoghq.com/api/v1/notebooks ### Overview Create a notebook using the specified options. This endpoint requires the `notebooks_write` permission. ### Request #### Body Data (required) The JSON description of the notebook you want to create. * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Field Type Description data [_required_] object The data for a notebook create request. attributes [_required_] object The data attributes of a notebook. cells [_required_] [object] List of cells to display in the notebook. attributes [_required_] The attributes of a notebook cell in create cell request. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. 
[More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. 
limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. 
query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. 
See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. 
Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. 
tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
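To make the relationship between the `formulas` and `queries` fields described above more concrete, here is a rough, hypothetical sketch of a request fragment that defines one formula-and-functions metrics query and a formula referencing it by `name`; the metric, query name, and alias are placeholder values, not part of the documented schema:

```
# Hypothetical request fragment: one formula-and-functions metrics query
# plus a formula that references it by name. All values are placeholders.
request_fragment = {
    "queries": [
        {
            "data_source": "metrics",
            "name": "query1",
            "query": "avg:system.load.1{*}",
            "aggregator": "avg",
        }
    ],
    "formulas": [
        {
            "formula": "query1 * 100",
            "alias": "load_pct",
        }
    ],
}
```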
data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. 
keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. 
alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. 
unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. 
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. 
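Assembled from the distribution cell fields listed above, a minimal notebook `distribution` cell could look roughly like the following sketch; the widget query, title, bucket count, and graph size are placeholder values:

```
# Hypothetical notebook `distribution` cell assembled from the fields
# documented above; query and display values are placeholders.
distribution_cell = {
    "type": "notebook_cells",
    "attributes": {
        "definition": {
            "type": "distribution",
            "requests": [
                {"q": "avg:system.load.1{*} by {host}"}
            ],
            "xaxis": {
                "include_zero": True,
                "min": "auto",
                "max": "auto",
                "num_buckets": 10,
            },
            "title": "Load distribution",
        },
        "graph_size": "m",
    },
}
```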
scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. 
Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time [_required_] Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. type [_required_] enum Type of the Notebook resource. Allowed enum values: `notebooks` default: `notebooks`

```
{
  "data": {
    "attributes": {
      "cells": [
        {
          "attributes": {
            "definition": {
              "text": "## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```",
              "type": "markdown"
            }
          },
          "type": "notebook_cells"
        },
        {
          "attributes": {
            "definition": {
              "requests": [
                {
                  "display_type": "line",
                  "q": "avg:system.load.1{*}",
                  "style": {
                    "line_type": "solid",
                    "line_width": "normal",
                    "palette": "dog_classic"
                  }
                }
              ],
              "show_legend": true,
              "type": "timeseries",
              "yaxis": {
                "scale": "linear"
              }
            },
            "graph_size": "m",
            "split_by": {
              "keys": [],
              "tags": []
            },
            "time": null
          },
          "type": "notebook_cells"
        }
      ],
      "name": "Example-Notebook",
      "status": "published",
      "time": {
        "live_span": "1h"
      }
    },
    "type": "notebooks"
  }
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/notebooks/#CreateNotebook-200-v1)
* [400](https://docs.datadoghq.com/api/latest/notebooks/#CreateNotebook-400-v1)
* [403](https://docs.datadoghq.com/api/latest/notebooks/#CreateNotebook-403-v1)
* [429](https://docs.datadoghq.com/api/latest/notebooks/#CreateNotebook-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/notebooks/)
* [Example](https://docs.datadoghq.com/api/latest/notebooks/)

The description of a notebook response. Field Type Description data object The data for a notebook. attributes [_required_] object The attributes of a notebook. author object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user.
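For orientation, the example request body shown above could be submitted with any HTTP client. The following is a minimal, unofficial Python sketch; it assumes the v1 notebooks endpoint on the `datadoghq.com` site and the standard `DD-API-KEY`/`DD-APPLICATION-KEY` headers, with credentials read from environment variables:

```
# Minimal, unofficial sketch: create a notebook containing one markdown cell.
# Assumes the v1 notebooks endpoint and standard Datadog auth headers;
# adjust the host for your Datadog site (for example, api.datadoghq.eu).
import os

import requests

payload = {
    "data": {
        "type": "notebooks",
        "attributes": {
            "name": "Example-Notebook",
            "status": "published",
            "time": {"live_span": "1h"},
            "cells": [
                {
                    "type": "notebook_cells",
                    "attributes": {
                        "definition": {"type": "markdown", "text": "## Some test markdown"}
                    },
                }
            ],
        },
    }
}

resp = requests.post(
    "https://api.datadoghq.com/api/v1/notebooks",  # assumed endpoint path
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        "Content-Type": "application/json",
    },
    json=payload,
)
resp.raise_for_status()
print(resp.status_code)  # the response body follows the model described below
```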
handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. cells [_required_] [object] List of cells to display in the notebook. attributes [_required_] The attributes of a notebook cell response. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. [More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. 
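As a concrete illustration of the timeseries request fields above, here is a rough, hypothetical sketch of a single request that charts a log count grouped by service as bars; the search string, facet, and limit are placeholder values:

```
# Hypothetical timeseries widget request built from the fields above:
# a log query grouped by service, rendered as bars. Values are placeholders.
timeseries_request = {
    "display_type": "bars",
    "log_query": {
        "index": "*",
        "compute": {"aggregation": "count"},
        "search": {"query": "status:error"},
        "group_by": [
            {
                "facet": "service",
                "limit": 10,
                "sort": {"aggregation": "count", "order": "desc"},
            }
        ],
    },
    "on_right_yaxis": False,
}
```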
process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. 
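The `queries` and `formulas` fields above are meant to be combined: each query carries a `name`, and a formula string then references those names. A minimal sketch, assuming placeholder metric and search strings:

```python
# Sketch of a request combining a metrics query (Option 1) and an events
# query against the logs data source (Option 2) through formulas.
request_with_formulas = {
    "queries": [
        {   # Option 1: formula and functions metrics query
            "data_source": "metrics",
            "name": "cpu",
            "query": "avg:system.cpu.user{*} by {host}",
        },
        {   # Option 2: formula and functions events query (logs)
            "data_source": "logs",
            "name": "errors",
            "compute": {"aggregation": "count"},
            "search": {"query": "status:error"},
            "indexes": [],  # omit or use [] to query all indexes at once
        },
    ],
    "formulas": [
        {"formula": "cpu"},
        {"formula": "errors", "alias": "error count"},
    ],
}
```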
slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. 
default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. 
URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
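As an example of the `conditional_formats` entries described above, the sketch below colors values above an arbitrary threshold with the `white_on_red` palette and the rest with `white_on_green`:

```python
# Sketch of conditional formatting for a top list request: the threshold of
# 100 is an arbitrary illustrative value.
conditional_formats = [
    {"comparator": ">", "value": 100, "palette": "white_on_red"},
    {"comparator": "<=", "value": 100, "palette": "white_on_green"},
]
```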
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. 
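A `process_query` in the classic (non-formula) shape described above might look like the following sketch; the metric name and search term are placeholders:

```python
# Sketch of a classic process query: up to 10 processes matching "nginx",
# ranked by an illustrative process CPU metric (placeholder metric name).
process_query = {
    "metric": "process.stat.cpu.total_pct",  # your chosen metric
    "search_by": "nginx",                    # your chosen search term
    "filter_by": [],                         # optional list of processes to filter by
    "limit": 10,                             # max number of items in the filter list
}
```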
query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. 
Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
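Putting the SLO measures query option above into practice, a query might track the remaining error budget for one SLO. The `slo_id` below is a placeholder; the other values use allowed enum options:

```python
# Sketch of a formula-and-functions SLO query (data_source "slo").
slo_query = {
    "data_source": "slo",
    "slo_id": "example-slo-id",           # placeholder SLO ID
    "measure": "error_budget_remaining",  # e.g. good_events, slo_status, burn_rate
    "group_mode": "overall",              # or "components"
    "slo_query_type": "metric",           # metric, monitor, or time_slice
    "name": "budget",
}
```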
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. 
Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. 
See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. 
Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED**: The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity: error, warning, ok, or info * A line type: dashed, solid, or bold. In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A comma-separated list of index names. Use `*` to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name.
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. 
limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. 
aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. 
aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. 
It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. 
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] string Notebook cell ID. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` created date-time UTC time stamp for when the notebook was created. metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` modified date-time UTC time stamp for when the notebook was last modified. name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time [_required_] Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] int64 Unique notebook ID, assigned when you create the notebook. type [_required_] enum Type of the Notebook resource. 
Allowed enum values: `notebooks` default: `notebooks`

```
{
  "data": {
    "attributes": {
      "author": {
        "created_at": "2019-09-19T10:00:00.000Z",
        "disabled": false,
        "email": "string",
        "handle": "string",
        "icon": "string",
        "name": "string",
        "status": "string",
        "title": "string",
        "verified": false
      },
      "cells": [
        {
          "attributes": {
            "definition": {
              "requests": [
                {
                  "display_type": "line",
                  "q": "avg:system.load.1{*}",
                  "style": {
                    "line_type": "solid",
                    "line_width": "normal",
                    "palette": "dog_classic"
                  }
                }
              ],
              "show_legend": true,
              "type": "timeseries",
              "yaxis": {
                "scale": "linear"
              }
            },
            "graph_size": "m",
            "split_by": {
              "keys": [],
              "tags": []
            },
            "time": null
          },
          "id": "abcd1234",
          "type": "notebook_cells"
        }
      ],
      "created": "2021-02-24T23:14:15.173964+00:00",
      "metadata": {
        "is_template": false,
        "take_snapshots": false,
        "type": "investigation"
      },
      "modified": "2021-02-24T23:15:23.274966+00:00",
      "name": "Example Notebook",
      "status": "published",
      "time": {
        "live_span": "5m"
      }
    },
    "id": 123456,
    "type": "notebooks"
  }
}
```

Bad Request * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Authentication Error * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```
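Before the full per-language examples in the next section, it can help to see the smallest request this schema accepts: a notebook with a single markdown cell. The following is a trimmed-down sketch, not an official example; it assumes the US1 endpoint `https://api.datadoghq.com` and the usual `DD-API-KEY`/`DD-APPLICATION-KEY` headers, so substitute the host for your Datadog site.

```
# Minimal sketch: create a notebook containing one markdown cell (assumes the US1 site).
curl -X POST "https://api.datadoghq.com/api/v1/notebooks" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "cells": [
        {
          "attributes": {
            "definition": {
              "text": "## A single markdown cell",
              "type": "markdown"
            }
          },
          "type": "notebook_cells"
        }
      ],
      "name": "Minimal Notebook",
      "time": {
        "live_span": "1h"
      }
    },
    "type": "notebooks"
  }
}
EOF
```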
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=typescript)

##### Create a notebook returns "OK" response

```
## json-request-body
#
# Curl command (replace the host with the endpoint for your Datadog site, for example https://api.datadoghq.eu)
curl -X POST "https://api.datadoghq.com/api/v1/notebooks" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "cells": [
        {
          "attributes": {
            "definition": {
              "text": "## Some test markdown\n\nWith some example content.",
              "type": "markdown"
            }
          },
          "type": "notebook_cells"
        },
        {
          "attributes": {
            "definition": {
              "requests": [
                {
                  "display_type": "line",
                  "q": "avg:system.load.1{*}",
                  "style": {
                    "line_type": "solid",
                    "line_width": "normal",
                    "palette": "dog_classic"
                  }
                }
              ],
              "show_legend": true,
              "type": "timeseries",
              "yaxis": {
                "scale": "linear"
              }
            },
            "graph_size": "m",
            "split_by": {
              "keys": [],
              "tags": []
            },
            "time": null
          },
          "type": "notebook_cells"
        }
      ],
      "name": "Example Notebook",
      "time": {
        "live_span": "1h"
      }
    },
    "type": "notebooks"
  }
}
EOF
```

##### Create a notebook returns "OK" response ``` // Create a notebook returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.NotebookCreateRequest{ Data: datadogV1.NotebookCreateData{ Attributes: datadogV1.NotebookCreateDataAttributes{ Cells: []datadogV1.NotebookCellCreateRequest{ { Attributes: datadogV1.NotebookCellCreateRequestAttributes{ NotebookMarkdownCellAttributes: &datadogV1.NotebookMarkdownCellAttributes{ Definition: datadogV1.NotebookMarkdownCellDefinition{ Text: `## Some test markdown ` + "```" + ` var x, y; x = 5; y = 6; ` + "```", Type: datadogV1.NOTEBOOKMARKDOWNCELLDEFINITIONTYPE_MARKDOWN, }, }}, Type: datadogV1.NOTEBOOKCELLRESOURCETYPE_NOTEBOOK_CELLS, }, { Attributes: datadogV1.NotebookCellCreateRequestAttributes{ NotebookTimeseriesCellAttributes: &datadogV1.NotebookTimeseriesCellAttributes{ Definition: datadogV1.TimeseriesWidgetDefinition{ Requests: []datadogV1.TimeseriesWidgetRequest{ { DisplayType: datadogV1.WIDGETDISPLAYTYPE_LINE.Ptr(), Q: datadog.PtrString("avg:system.load.1{*}"), Style: &datadogV1.WidgetRequestStyle{ LineType: datadogV1.WIDGETLINETYPE_SOLID.Ptr(), LineWidth: datadogV1.WIDGETLINEWIDTH_NORMAL.Ptr(), Palette: datadog.PtrString("dog_classic"), }, }, }, ShowLegend: datadog.PtrBool(true), Type: datadogV1.TIMESERIESWIDGETDEFINITIONTYPE_TIMESERIES, Yaxis: &datadogV1.WidgetAxis{ Scale: datadog.PtrString("linear"), }, }, GraphSize: datadogV1.NOTEBOOKGRAPHSIZE_MEDIUM.Ptr(), SplitBy: &datadogV1.NotebookSplitBy{ Keys: []string{}, Tags: []string{}, }, Time: *datadogV1.NewNullableNotebookCellTime(nil), }}, Type: datadogV1.NOTEBOOKCELLRESOURCETYPE_NOTEBOOK_CELLS, }, }, Name: "Example-Notebook",
Status: datadogV1.NOTEBOOKSTATUS_PUBLISHED.Ptr(), Time: datadogV1.NotebookGlobalTime{ NotebookRelativeTime: &datadogV1.NotebookRelativeTime{ LiveSpan: datadogV1.WIDGETLIVESPAN_PAST_ONE_HOUR, }}, }, Type: datadogV1.NOTEBOOKRESOURCETYPE_NOTEBOOKS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewNotebooksApi(apiClient) resp, r, err := api.CreateNotebook(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NotebooksApi.CreateNotebook`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NotebooksApi.CreateNotebook`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a notebook returns "OK" response ``` // Create a notebook returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.NotebooksApi; import com.datadog.api.client.v1.model.NotebookCellCreateRequest; import com.datadog.api.client.v1.model.NotebookCellCreateRequestAttributes; import com.datadog.api.client.v1.model.NotebookCellResourceType; import com.datadog.api.client.v1.model.NotebookCreateData; import com.datadog.api.client.v1.model.NotebookCreateDataAttributes; import com.datadog.api.client.v1.model.NotebookCreateRequest; import com.datadog.api.client.v1.model.NotebookGlobalTime; import com.datadog.api.client.v1.model.NotebookGraphSize; import com.datadog.api.client.v1.model.NotebookMarkdownCellAttributes; import com.datadog.api.client.v1.model.NotebookMarkdownCellDefinition; import com.datadog.api.client.v1.model.NotebookMarkdownCellDefinitionType; import com.datadog.api.client.v1.model.NotebookRelativeTime; import com.datadog.api.client.v1.model.NotebookResourceType; import com.datadog.api.client.v1.model.NotebookResponse; import com.datadog.api.client.v1.model.NotebookSplitBy; import com.datadog.api.client.v1.model.NotebookStatus; import com.datadog.api.client.v1.model.NotebookTimeseriesCellAttributes; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinition; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinitionType; import com.datadog.api.client.v1.model.TimeseriesWidgetRequest; import com.datadog.api.client.v1.model.WidgetAxis; import com.datadog.api.client.v1.model.WidgetDisplayType; import com.datadog.api.client.v1.model.WidgetLineType; import com.datadog.api.client.v1.model.WidgetLineWidth; import com.datadog.api.client.v1.model.WidgetLiveSpan; import com.datadog.api.client.v1.model.WidgetRequestStyle; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NotebooksApi apiInstance = new NotebooksApi(defaultClient); NotebookCreateRequest body = new NotebookCreateRequest() .data( new NotebookCreateData() .attributes( new NotebookCreateDataAttributes() .cells( Arrays.asList( new NotebookCellCreateRequest() .attributes( new NotebookCellCreateRequestAttributes( new NotebookMarkdownCellAttributes() 
.definition( new NotebookMarkdownCellDefinition() .text( """ ## Some test markdown ``` var x, y; x = 5; y = 6; ``` """) .type( NotebookMarkdownCellDefinitionType .MARKDOWN)))) .type(NotebookCellResourceType.NOTEBOOK_CELLS), new NotebookCellCreateRequest() .attributes( new NotebookCellCreateRequestAttributes( new NotebookTimeseriesCellAttributes() .definition( new TimeseriesWidgetDefinition() .requests( Collections.singletonList( new TimeseriesWidgetRequest() .displayType( WidgetDisplayType.LINE) .q("avg:system.load.1{*}") .style( new WidgetRequestStyle() .lineType( WidgetLineType .SOLID) .lineWidth( WidgetLineWidth .NORMAL) .palette( "dog_classic")))) .showLegend(true) .type( TimeseriesWidgetDefinitionType .TIMESERIES) .yaxis( new WidgetAxis().scale("linear"))) .graphSize(NotebookGraphSize.MEDIUM) .splitBy(new NotebookSplitBy()) .time(null))) .type(NotebookCellResourceType.NOTEBOOK_CELLS))) .name("Example-Notebook") .status(NotebookStatus.PUBLISHED) .time( new NotebookGlobalTime( new NotebookRelativeTime() .liveSpan(WidgetLiveSpan.PAST_ONE_HOUR)))) .type(NotebookResourceType.NOTEBOOKS)); try { NotebookResponse result = apiInstance.createNotebook(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NotebooksApi#createNotebook"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a notebook returns "OK" response ``` """ Create a notebook returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.notebooks_api import NotebooksApi from datadog_api_client.v1.model.notebook_cell_create_request import NotebookCellCreateRequest from datadog_api_client.v1.model.notebook_cell_resource_type import NotebookCellResourceType from datadog_api_client.v1.model.notebook_create_data import NotebookCreateData from datadog_api_client.v1.model.notebook_create_data_attributes import NotebookCreateDataAttributes from datadog_api_client.v1.model.notebook_create_request import NotebookCreateRequest from datadog_api_client.v1.model.notebook_graph_size import NotebookGraphSize from datadog_api_client.v1.model.notebook_markdown_cell_attributes import NotebookMarkdownCellAttributes from datadog_api_client.v1.model.notebook_markdown_cell_definition import NotebookMarkdownCellDefinition from datadog_api_client.v1.model.notebook_markdown_cell_definition_type import NotebookMarkdownCellDefinitionType from datadog_api_client.v1.model.notebook_relative_time import NotebookRelativeTime from datadog_api_client.v1.model.notebook_resource_type import NotebookResourceType from datadog_api_client.v1.model.notebook_split_by import NotebookSplitBy from datadog_api_client.v1.model.notebook_status import NotebookStatus from datadog_api_client.v1.model.notebook_timeseries_cell_attributes import NotebookTimeseriesCellAttributes from datadog_api_client.v1.model.timeseries_widget_definition import TimeseriesWidgetDefinition from datadog_api_client.v1.model.timeseries_widget_definition_type import 
TimeseriesWidgetDefinitionType from datadog_api_client.v1.model.timeseries_widget_request import TimeseriesWidgetRequest from datadog_api_client.v1.model.widget_axis import WidgetAxis from datadog_api_client.v1.model.widget_display_type import WidgetDisplayType from datadog_api_client.v1.model.widget_line_type import WidgetLineType from datadog_api_client.v1.model.widget_line_width import WidgetLineWidth from datadog_api_client.v1.model.widget_live_span import WidgetLiveSpan from datadog_api_client.v1.model.widget_request_style import WidgetRequestStyle body = NotebookCreateRequest( data=NotebookCreateData( attributes=NotebookCreateDataAttributes( cells=[ NotebookCellCreateRequest( attributes=NotebookMarkdownCellAttributes( definition=NotebookMarkdownCellDefinition( text="## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```", type=NotebookMarkdownCellDefinitionType.MARKDOWN, ), ), type=NotebookCellResourceType.NOTEBOOK_CELLS, ), NotebookCellCreateRequest( attributes=NotebookTimeseriesCellAttributes( definition=TimeseriesWidgetDefinition( requests=[ TimeseriesWidgetRequest( display_type=WidgetDisplayType.LINE, q="avg:system.load.1{*}", style=WidgetRequestStyle( line_type=WidgetLineType.SOLID, line_width=WidgetLineWidth.NORMAL, palette="dog_classic", ), ), ], show_legend=True, type=TimeseriesWidgetDefinitionType.TIMESERIES, yaxis=WidgetAxis( scale="linear", ), ), graph_size=NotebookGraphSize.MEDIUM, split_by=NotebookSplitBy( keys=[], tags=[], ), time=None, ), type=NotebookCellResourceType.NOTEBOOK_CELLS, ), ], name="Example-Notebook", status=NotebookStatus.PUBLISHED, time=NotebookRelativeTime( live_span=WidgetLiveSpan.PAST_ONE_HOUR, ), ), type=NotebookResourceType.NOTEBOOKS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NotebooksApi(api_client) response = api_instance.create_notebook(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a notebook returns "OK" response ``` # Create a notebook returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::NotebooksAPI.new body = DatadogAPIClient::V1::NotebookCreateRequest.new({ data: DatadogAPIClient::V1::NotebookCreateData.new({ attributes: DatadogAPIClient::V1::NotebookCreateDataAttributes.new({ cells: [ DatadogAPIClient::V1::NotebookCellCreateRequest.new({ attributes: DatadogAPIClient::V1::NotebookMarkdownCellAttributes.new({ definition: DatadogAPIClient::V1::NotebookMarkdownCellDefinition.new({ text: '## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```', type: DatadogAPIClient::V1::NotebookMarkdownCellDefinitionType::MARKDOWN, }), }), type: DatadogAPIClient::V1::NotebookCellResourceType::NOTEBOOK_CELLS, }), DatadogAPIClient::V1::NotebookCellCreateRequest.new({ attributes: DatadogAPIClient::V1::NotebookTimeseriesCellAttributes.new({ definition: DatadogAPIClient::V1::TimeseriesWidgetDefinition.new({ requests: [ DatadogAPIClient::V1::TimeseriesWidgetRequest.new({ display_type: DatadogAPIClient::V1::WidgetDisplayType::LINE, q: "avg:system.load.1{*}", style: DatadogAPIClient::V1::WidgetRequestStyle.new({ line_type: DatadogAPIClient::V1::WidgetLineType::SOLID, line_width: 
DatadogAPIClient::V1::WidgetLineWidth::NORMAL, palette: "dog_classic", }), }), ], show_legend: true, type: DatadogAPIClient::V1::TimeseriesWidgetDefinitionType::TIMESERIES, yaxis: DatadogAPIClient::V1::WidgetAxis.new({ scale: "linear", }), }), graph_size: DatadogAPIClient::V1::NotebookGraphSize::MEDIUM, split_by: DatadogAPIClient::V1::NotebookSplitBy.new({ keys: [], tags: [], }), time: nil, }), type: DatadogAPIClient::V1::NotebookCellResourceType::NOTEBOOK_CELLS, }), ], name: "Example-Notebook", status: DatadogAPIClient::V1::NotebookStatus::PUBLISHED, time: DatadogAPIClient::V1::NotebookRelativeTime.new({ live_span: DatadogAPIClient::V1::WidgetLiveSpan::PAST_ONE_HOUR, }), }), type: DatadogAPIClient::V1::NotebookResourceType::NOTEBOOKS, }), }) p api_instance.create_notebook(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a notebook returns "OK" response ``` // Create a notebook returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_notebooks::NotebooksAPI; use datadog_api_client::datadogV1::model::NotebookCellCreateRequest; use datadog_api_client::datadogV1::model::NotebookCellCreateRequestAttributes; use datadog_api_client::datadogV1::model::NotebookCellResourceType; use datadog_api_client::datadogV1::model::NotebookCreateData; use datadog_api_client::datadogV1::model::NotebookCreateDataAttributes; use datadog_api_client::datadogV1::model::NotebookCreateRequest; use datadog_api_client::datadogV1::model::NotebookGlobalTime; use datadog_api_client::datadogV1::model::NotebookGraphSize; use datadog_api_client::datadogV1::model::NotebookMarkdownCellAttributes; use datadog_api_client::datadogV1::model::NotebookMarkdownCellDefinition; use datadog_api_client::datadogV1::model::NotebookMarkdownCellDefinitionType; use datadog_api_client::datadogV1::model::NotebookRelativeTime; use datadog_api_client::datadogV1::model::NotebookResourceType; use datadog_api_client::datadogV1::model::NotebookSplitBy; use datadog_api_client::datadogV1::model::NotebookStatus; use datadog_api_client::datadogV1::model::NotebookTimeseriesCellAttributes; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinition; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinitionType; use datadog_api_client::datadogV1::model::TimeseriesWidgetRequest; use datadog_api_client::datadogV1::model::WidgetAxis; use datadog_api_client::datadogV1::model::WidgetDisplayType; use datadog_api_client::datadogV1::model::WidgetLineType; use datadog_api_client::datadogV1::model::WidgetLineWidth; use datadog_api_client::datadogV1::model::WidgetLiveSpan; use datadog_api_client::datadogV1::model::WidgetRequestStyle; #[tokio::main] async fn main() { let body = NotebookCreateRequest::new(NotebookCreateData::new( NotebookCreateDataAttributes::new( vec![ NotebookCellCreateRequest::new( NotebookCellCreateRequestAttributes::NotebookMarkdownCellAttributes(Box::new( NotebookMarkdownCellAttributes::new(NotebookMarkdownCellDefinition::new( r#"## Some test markdown ``` var x, y; x = 5; y = 6; ```"# .to_string(), NotebookMarkdownCellDefinitionType::MARKDOWN, )), )), NotebookCellResourceType::NOTEBOOK_CELLS, ), NotebookCellCreateRequest::new( 
NotebookCellCreateRequestAttributes::NotebookTimeseriesCellAttributes( Box::new( NotebookTimeseriesCellAttributes::new( TimeseriesWidgetDefinition::new( vec![TimeseriesWidgetRequest::new() .display_type(WidgetDisplayType::LINE) .q("avg:system.load.1{*}".to_string()) .style( WidgetRequestStyle::new() .line_type(WidgetLineType::SOLID) .line_width(WidgetLineWidth::NORMAL) .palette("dog_classic".to_string()), )], TimeseriesWidgetDefinitionType::TIMESERIES, ) .show_legend(true) .yaxis(WidgetAxis::new().scale("linear".to_string())), ) .graph_size(NotebookGraphSize::MEDIUM) .split_by(NotebookSplitBy::new(vec![], vec![])) .time(None), ), ), NotebookCellResourceType::NOTEBOOK_CELLS, ), ], "Example-Notebook".to_string(), NotebookGlobalTime::NotebookRelativeTime(Box::new(NotebookRelativeTime::new( WidgetLiveSpan::PAST_ONE_HOUR, ))), ) .status(NotebookStatus::PUBLISHED), NotebookResourceType::NOTEBOOKS, )); let configuration = datadog::Configuration::new(); let api = NotebooksAPI::with_config(configuration); let resp = api.create_notebook(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a notebook returns "OK" response ``` /** * Create a notebook returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.NotebooksApi(configuration); const params: v1.NotebooksApiCreateNotebookRequest = { body: { data: { attributes: { cells: [ { attributes: { definition: { text: `## Some test markdown ` + "```" + ` var x, y; x = 5; y = 6; ` + "```", type: "markdown", }, }, type: "notebook_cells", }, { attributes: { definition: { requests: [ { displayType: "line", q: "avg:system.load.1{*}", style: { lineType: "solid", lineWidth: "normal", palette: "dog_classic", }, }, ], showLegend: true, type: "timeseries", yaxis: { scale: "linear", }, }, graphSize: "m", splitBy: { keys: [], tags: [], }, time: undefined, }, type: "notebook_cells", }, ], name: "Example-Notebook", status: "published", time: { liveSpan: "1h", }, }, type: "notebooks", }, }, }; apiInstance .createNotebook(params) .then((data: v1.NotebookResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all notebooks](https://docs.datadoghq.com/api/latest/notebooks/#get-all-notebooks) * [v1 (latest)](https://docs.datadoghq.com/api/latest/notebooks/#get-all-notebooks-v1) GET https://api.ap1.datadoghq.com/api/v1/notebookshttps://api.ap2.datadoghq.com/api/v1/notebookshttps://api.datadoghq.eu/api/v1/notebookshttps://api.ddog-gov.com/api/v1/notebookshttps://api.datadoghq.com/api/v1/notebookshttps://api.us3.datadoghq.com/api/v1/notebookshttps://api.us5.datadoghq.com/api/v1/notebooks ### Overview Get all notebooks. This can also be used to search for notebooks with a particular `query` in the notebook `name` or author `handle`. This endpoint requires the `notebooks_read` permission. ### Arguments #### Query Strings Name Type Description author_handle string Return notebooks created by the given `author_handle`. exclude_author_handle string Return notebooks not created by the given `author_handle`. start integer The index of the first notebook you want returned. count integer The number of notebooks to be returned. sort_field string Sort by field `modified`, `name`, or `created`. sort_dir string Sort by direction `asc` or `desc`. query string Return only notebooks with `query` string in notebook name or author handle. include_cells boolean Value of `false` excludes the `cells` and global `time` for each notebook. is_template boolean True value returns only template notebooks. Default is false (returns only non-template notebooks). type string If type is provided, returns only notebooks with that metadata type. Default does not have type filtering. ### Response * [200](https://docs.datadoghq.com/api/latest/notebooks/#ListNotebooks-200-v1) * [400](https://docs.datadoghq.com/api/latest/notebooks/#ListNotebooks-400-v1) * [403](https://docs.datadoghq.com/api/latest/notebooks/#ListNotebooks-403-v1) * [429](https://docs.datadoghq.com/api/latest/notebooks/#ListNotebooks-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Notebooks get all response. Field Type Description data [object] List of notebook definitions. attributes [_required_] object The attributes of a notebook in get all response. author object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. cells [object] List of cells to display in the notebook. attributes [_required_] The attributes of a notebook cell response. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. [More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. 
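As a quick illustration of the query strings documented above (the response schema continues below), a filtered, sorted list request might look like the following sketch. It assumes the US1 endpoint `https://api.datadoghq.com` and the same `DD-API-KEY`/`DD-APPLICATION-KEY` headers used in the other examples on this page; only documented query parameters are used.

```
# Sketch: list the five most recently modified non-template notebooks
# whose name or author handle matches "postmortem" (assumes the US1 site).
curl -G "https://api.datadoghq.com/api/v1/notebooks" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  --data-urlencode "count=5" \
  --data-urlencode "sort_field=modified" \
  --data-urlencode "sort_dir=desc" \
  --data-urlencode "query=postmortem" \
  --data-urlencode "is_template=false"
```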
definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. 
limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. 
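To make the `queries` options above more concrete, here is a hedged sketch (plain Python dicts) of a formula-and-functions metrics query (Option 1) and an events query against logs (Option 2), plus `formulas` entries that reference them by `name`. The metric, search string, and aliases are placeholders chosen for illustration.

```python
# Option 1: a formula and functions metrics query.
metrics_query = {
    "data_source": "metrics",
    "name": "cpu",                       # referenced from formulas by this name
    "query": "avg:system.cpu.user{*}",   # illustrative metrics query definition
}

# Option 2: a formula and functions events query (logs data source).
events_query = {
    "data_source": "logs",
    "name": "errors",
    "compute": {"aggregation": "count"},
    "search": {"query": "status:error"},  # illustrative events search string
    "indexes": [],                        # empty list queries all indexes at once
}

# Formula entries that operate on the queries above.
formulas = [
    {"formula": "cpu", "alias": "Average CPU"},
    {"formula": "errors", "alias": "Error count"},
]
```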
Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. 
Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. 
hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. 
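Putting the timeseries cell attributes (Option 2 above) together, the following is a minimal sketch of such a cell's attributes, assuming a single classic `q` request and a relative cell timeframe; the query, title, and timeframe are illustrative placeholders.

```python
# Minimal attributes for a notebook timeseries cell (Option 2 above).
timeseries_cell_attributes = {
    "definition": {
        "type": "timeseries",
        "title": "CPU by host",  # illustrative title
        "requests": [
            {
                "q": "avg:system.cpu.user{*} by {host}",  # illustrative widget query
                "display_type": "line",
            }
        ],
        "show_legend": True,
    },
    "graph_size": "m",
    # Relative cell timeframe; when omitted (null), the notebook global time is used.
    "time": {"live_span": "1h"},
}
```

The `requests` entries can also use the `queries` and `formulas` form sketched earlier instead of a classic `q` string.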
limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. 
query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. 
cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. 
service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. 
Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. 
tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. 
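For comparison with the timeseries sketch earlier, here is a minimal heat map cell definition (Option 4 above) as a Python dict, again assuming a classic `q` request; the query and title are placeholders.

```python
# Minimal definition for a notebook heatmap cell (Option 4 above).
heatmap_definition = {
    "type": "heatmap",
    "title": "CPU distribution by host",  # illustrative title
    "requests": [
        {"q": "avg:system.cpu.user{*} by {host}"}  # illustrative widget query
    ],
    "show_legend": False,
    "yaxis": {"include_zero": True, "scale": "linear"},
}
```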
keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. 
alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. 
unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. 
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. 
scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. 
Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] string Notebook cell ID. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` created date-time UTC time stamp for when the notebook was created. metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` modified date-time UTC time stamp for when the notebook was last modified. name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] int64 Unique notebook ID, assigned when you create the notebook. type [_required_] enum Type of the Notebook resource. Allowed enum values: `notebooks` default: `notebooks` meta object Searches metadata returned by the API. page object Pagination metadata returned by the API. total_count int64 The total number of notebooks that would be returned if the request was not filtered by `start` and `count` parameters. total_filtered_count int64 The total number of notebooks returned. 
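The `meta.page` counters above (`total_count`, `total_filtered_count`) are what you use to page through large notebook lists. The sketch below is a minimal illustration, not one of the official client samples that follow: it assumes the Python client exposes the endpoint's optional `start` and `count` query parameters as keyword arguments on `list_notebooks`, and that each returned item carries the `id` and `attributes.name` fields described in this schema.

```
"""
Minimal pagination sketch for "Get all notebooks" using the optional
`start` (offset) and `count` (page size) query parameters.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.notebooks_api import NotebooksApi

PAGE_SIZE = 50  # value passed as the `count` parameter

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = NotebooksApi(api_client)
    start = 0
    while True:
        # Fetch one page; `start` and `count` are the parameters referenced
        # by `meta.page.total_count` in the response schema above.
        page = api_instance.list_notebooks(start=start, count=PAGE_SIZE)
        notebooks = page.data or []
        for notebook in notebooks:
            print(notebook.id, notebook.attributes.name)
        if len(notebooks) < PAGE_SIZE:
            break  # last page reached
        start += PAGE_SIZE
```

Set `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` in the environment, as in the installation instructions below, before running it.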
``` { "data": [ { "attributes": { "author": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "name": "string", "status": "string", "title": "string", "verified": false }, "cells": [ { "attributes": { "definition": { "text": "# Example Header \nexample content", "type": "markdown" } }, "id": "abcd1234", "type": "notebook_cells" } ], "created": "2021-02-24T23:14:15.173964+00:00", "metadata": { "is_template": false, "take_snapshots": false, "type": "investigation" }, "modified": "2021-02-24T23:15:23.274966+00:00", "name": "Example Notebook", "status": "published", "time": { "live_span": "5m" } }, "id": 123456, "type": "notebooks" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=typescript) ##### Get all notebooks Copy ``` # Curl command (replace the host with the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/notebooks" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all notebooks ``` """ Get all notebooks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.notebooks_api import NotebooksApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NotebooksApi(api_client) response = api_instance.list_notebooks() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all notebooks ``` # Get all notebooks returns "OK" response require "datadog_api_client" api_instance =
DatadogAPIClient::V1::NotebooksAPI.new p api_instance.list_notebooks() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all notebooks ``` // Get all notebooks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewNotebooksApi(apiClient) resp, r, err := api.ListNotebooks(ctx, *datadogV1.NewListNotebooksOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NotebooksApi.ListNotebooks`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NotebooksApi.ListNotebooks`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all notebooks ``` // Get all notebooks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.NotebooksApi; import com.datadog.api.client.v1.model.NotebooksResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NotebooksApi apiInstance = new NotebooksApi(defaultClient); try { NotebooksResponse result = apiInstance.listNotebooks(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NotebooksApi#listNotebooks"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all notebooks ``` // Get all notebooks returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_notebooks::ListNotebooksOptionalParams; use datadog_api_client::datadogV1::api_notebooks::NotebooksAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = NotebooksAPI::with_config(configuration); let resp = api .list_notebooks(ListNotebooksOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all notebooks ``` /** * Get all notebooks returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.NotebooksApi(configuration); apiInstance .listNotebooks() .then((data: v1.NotebooksResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a notebook](https://docs.datadoghq.com/api/latest/notebooks/#delete-a-notebook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/notebooks/#delete-a-notebook-v1) DELETE https://api.ap1.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.ap2.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.eu/api/v1/notebooks/{notebook_id}https://api.ddog-gov.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us3.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us5.datadoghq.com/api/v1/notebooks/{notebook_id} ### Overview Delete a notebook using the specified ID. This endpoint requires the `notebooks_write` permission. ### Arguments #### Path Parameters Name Type Description notebook_id [_required_] integer Unique ID, assigned when you create the notebook. ### Response * [204](https://docs.datadoghq.com/api/latest/notebooks/#DeleteNotebook-204-v1) * [400](https://docs.datadoghq.com/api/latest/notebooks/#DeleteNotebook-400-v1) * [403](https://docs.datadoghq.com/api/latest/notebooks/#DeleteNotebook-403-v1) * [404](https://docs.datadoghq.com/api/latest/notebooks/#DeleteNotebook-404-v1) * [429](https://docs.datadoghq.com/api/latest/notebooks/#DeleteNotebook-429-v1) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=typescript) ##### Delete a notebook Copy ``` # Path parameters export notebook_id="CHANGE_ME" # Curl command (replace the host with the API endpoint for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v1/notebooks/${notebook_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a notebook ``` """ Delete a notebook returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.notebooks_api import NotebooksApi # there is a valid "notebook" in the system NOTEBOOK_DATA_ID = environ["NOTEBOOK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NotebooksApi(api_client) api_instance.delete_notebook( notebook_id=int(NOTEBOOK_DATA_ID), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a notebook ``` # Delete a notebook returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::NotebooksAPI.new # there is a valid "notebook" in the system NOTEBOOK_DATA_ID = ENV["NOTEBOOK_DATA_ID"] api_instance.delete_notebook(NOTEBOOK_DATA_ID.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete a notebook ``` // Delete a notebook returns "OK" response package main import ( "context" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "notebook" in the system NotebookDataID, _ := strconv.ParseInt(os.Getenv("NOTEBOOK_DATA_ID"), 10, 64) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewNotebooksApi(apiClient) r, err := api.DeleteNotebook(ctx, NotebookDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NotebooksApi.DeleteNotebook`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go)
and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a notebook ``` // Delete a notebook returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.NotebooksApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NotebooksApi apiInstance = new NotebooksApi(defaultClient); // there is a valid "notebook" in the system Long NOTEBOOK_DATA_ID = Long.parseLong(System.getenv("NOTEBOOK_DATA_ID")); try { apiInstance.deleteNotebook(NOTEBOOK_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling NotebooksApi#deleteNotebook"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a notebook ``` // Delete a notebook returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_notebooks::NotebooksAPI; #[tokio::main] async fn main() { // there is a valid "notebook" in the system let notebook_data_id: i64 = std::env::var("NOTEBOOK_DATA_ID").unwrap().parse().unwrap(); let configuration = datadog::Configuration::new(); let api = NotebooksAPI::with_config(configuration); let resp = api.delete_notebook(notebook_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a notebook ``` /** * Delete a notebook returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.NotebooksApi(configuration); // there is a valid "notebook" in the system const NOTEBOOK_DATA_ID = parseInt(process.env.NOTEBOOK_DATA_ID as string); const params: v1.NotebooksApiDeleteNotebookRequest = { notebookId: NOTEBOOK_DATA_ID, }; apiInstance .deleteNotebook(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a notebook](https://docs.datadoghq.com/api/latest/notebooks/#update-a-notebook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/notebooks/#update-a-notebook-v1) PUT https://api.ap1.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.ap2.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.eu/api/v1/notebooks/{notebook_id}https://api.ddog-gov.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us3.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us5.datadoghq.com/api/v1/notebooks/{notebook_id} ### Overview Update a notebook using the specified ID. This endpoint requires the `notebooks_write` permission. ### Arguments #### Path Parameters Name Type Description notebook_id [_required_] integer Unique ID, assigned when you create the notebook. ### Request #### Body Data (required) Update notebook request body. * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Field Type Description data [_required_] object The data for a notebook update request. attributes [_required_] object The data attributes of a notebook. cells [_required_] [ ] List of cells to display in the notebook. Option 1 object The description of a notebook cell create request. attributes [_required_] The attributes of a notebook cell in create cell request. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. [More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. 
Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. 
formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. 
aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
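Pulling together the notebook `timeseries` cell fields described above, a hedged sketch of what such a cell's attributes might look like; the metric query, marker, palette name, and title are placeholders, and most optional fields are omitted:

```json
{
  "definition": {
    "type": "timeseries",
    "title": "CPU usage by host",
    "show_legend": true,
    "requests": [
      {
        "q": "avg:system.cpu.user{*} by {host}",
        "display_type": "line",
        "style": { "palette": "dog_classic", "line_type": "solid", "line_width": "normal" },
        "on_right_yaxis": false
      }
    ],
    "markers": [
      { "display_type": "error dashed", "label": "Alert threshold", "value": "y = 90" }
    ],
    "yaxis": { "include_zero": true, "scale": "linear", "min": "auto", "max": "auto" }
  },
  "graph_size": "m",
  "time": { "live_span": "1h" }
}
```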
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. 
cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. 
limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
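The `queries` array accepts any mix of the query options above. A hedged sketch showing a metrics query, an events (logs) query, and an SLO measures query side by side; the query names, search filter, and SLO ID are placeholders, and the events `compute`/`search` sub-fields shown here follow the usual shape rather than the abbreviated descriptions above:

```json
[
  {
    "data_source": "metrics",
    "name": "query1",
    "query": "sum:trace.servlet.request.errors{*}.as_count()",
    "aggregator": "sum"
  },
  {
    "data_source": "logs",
    "name": "query2",
    "compute": { "aggregation": "count" },
    "search": { "query": "status:error service:web-store" },
    "indexes": []
  },
  {
    "data_source": "slo",
    "name": "query3",
    "slo_id": "<SLO_ID>",
    "measure": "slo_status",
    "group_mode": "overall",
    "slo_query_type": "metric"
  }
]
```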
data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. Option 2 object The group to sort the widget by. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity: error, warning, ok, or info * A line type: dashed, solid, or bold. In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90).
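Before the heat map request fields continue below, here is a hedged sketch of a complete notebook `toplist` cell assembled from the fields described above; the metric query, limit, and title are placeholders:

```json
{
  "definition": {
    "type": "toplist",
    "title": "Top hosts by CPU",
    "requests": [
      {
        "queries": [
          { "data_source": "metrics", "name": "query1", "query": "avg:system.cpu.user{*} by {host}", "aggregator": "avg" }
        ],
        "formulas": [
          { "formula": "query1", "limit": { "count": 10, "order": "desc" } }
        ],
        "response_format": "scalar"
      }
    ],
    "style": {
      "display": { "type": "stacked", "legend": "automatic" },
      "scaling": "relative"
    }
  },
  "graph_size": "m",
  "time": null
}
```

Setting `time` to `null` leaves the cell on the notebook's global time, as noted above.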
requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either `and` or `or`. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula (only applies when `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method.
facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. 
Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. 
stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert`
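A hedged sketch of a notebook `heatmap` cell using the fields above; the metric query, bucket count, palette, and title are illustrative placeholders:

```json
{
  "definition": {
    "type": "heatmap",
    "title": "System load distribution",
    "show_legend": false,
    "requests": [
      { "q": "avg:system.load.1{*} by {host}", "style": { "palette": "blue" } }
    ],
    "xaxis": { "num_buckets": 50 },
    "yaxis": { "include_zero": true, "scale": "sqrt" }
  },
  "graph_size": "l",
  "time": { "live_span": "4h" }
}
```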
Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED**: The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity: error, warning, ok, or info * A line type: dashed, solid, or bold. In the case of a Distribution widget, this can be set to `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name.
row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A comma-separated list of index names. Use "*" to query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula (only applies when `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found in the Datadog documentation. palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group.
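For the `apm_stats_query` described above, a minimal sketch might look like the following; the environment, service, operation name, primary tag, and column name are placeholders and are only meant to show which field carries which kind of value:

```json
{
  "env": "prod",
  "service": "web-store",
  "name": "rack.request",
  "primary_tag": "datacenter:*",
  "row_type": "resource",
  "columns": [
    { "name": "p90", "alias": "Latency p90", "cell_display_mode": "number", "order": "desc" }
  ]
}
```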
sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
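The log-query shape above (a `compute` or `multi_compute` aggregation, optional `group_by` entries with a facet and limit, a `search.query`, and an `index`) is reused by `apm_query`, `log_query`, `network_query`, `rum_query`, `security_query`, and `profile_metrics_query` alike. A minimal sketch in Python; the facet names, index, and search string are placeholders, not values from this reference:

```python
# Hypothetical log-style query following the compute/group_by/search shape above.
# Facet names, the index, and the search string are illustrative only.
log_query = {
    "index": "main",                      # or "*" to query all indexes at once
    "compute": {
        "aggregation": "avg",             # aggregation method
        "facet": "@duration",             # facet to aggregate over
        "interval": 60,                   # time interval in seconds
    },
    "group_by": [
        {
            "facet": "service",           # tag prefix to group by
            "limit": 10,                  # maximum number of items in the group
        }
    ],
    "search": {"query": "status:error"},  # query applied to the logs
}
```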
data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. 
additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. 
resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. 
to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies the maximum value to show on the x-axis. It takes a number, a percentile (for example, `p90` for the 90th percentile), or `auto` for default behavior. default: `auto` min string Specifies the minimum value to show on the x-axis. It takes a number, a percentile (for example, `p90` for the 90th percentile), or `auto` for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or `auto` for default behavior. default: `auto` min string Specifies the minimum value to show on the y-axis. It takes a number, or `auto` for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use `[]` to query all indexes at once. logset string **DEPRECATED**: ID of the log set to use. message_display enum Amount of log lines to display. Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not. show_message_column boolean Whether to show the message column or not. sort object Which column and order to sort by. column [_required_] string Facet path for the column. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using.
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` Option 2 object The description of a notebook cell update request. attributes [_required_] The attributes of a notebook cell in update cell request. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. [More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. 
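For orientation on the `log_stream` and `markdown` cell attributes listed above, here is a hedged sketch of what the corresponding `attributes` payloads could look like; the query, column names, and markdown text are placeholder values:

```python
# Hypothetical notebook cell attributes, mirroring the field listings above.
# Column names, the query, and the markdown text are placeholders.
log_stream_cell_attributes = {
    "definition": {
        "type": "log_stream",
        "query": "service:web-store status:error",
        "columns": ["host", "service"],
        "indexes": [],                     # [] queries all indexes at once
        "message_display": "inline",
        "show_date_column": True,
        "show_message_column": True,
        "sort": {"column": "time", "order": "desc"},
    },
    "graph_size": "l",
}

markdown_cell_attributes = {
    "definition": {
        "type": "markdown",
        "text": "# Investigation notes\nObservations go here.",
    },
}
```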
legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. 
facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. 
Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. 
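To make the pairing of `queries` and `formulas` above concrete, a minimal sketch of a metrics query (Option 1) referenced by a formula; the metric name, formula expression, and alias are hypothetical:

```python
# Hypothetical formulas-and-functions request body for a timeseries widget,
# pairing one metrics query (Option 1 above) with a formula that references it.
timeseries_request = {
    "queries": [
        {
            "data_source": "metrics",
            "name": "cpu",                                 # name used in formulas below
            "query": "avg:system.cpu.user{*} by {host}",   # illustrative metrics query
        }
    ],
    "formulas": [
        {
            "formula": "cpu * 100",                        # expression built from query names
            "alias": "CPU user (%)",
            "limit": {"count": 10, "order": "desc"},
        }
    ],
    "response_format": "timeseries",
    "display_type": "line",
}
```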
slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. 
Option 1 object Wrapper for live span. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies the maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies the minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of tag values, like hostname or service, with the most or least of any metric value, such as the highest consumers of CPU or the hosts with the least disk space. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`.
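The per-cell `time` object described above accepts either a relative or an absolute timeframe. Two hedged sketches, assuming ISO 8601 strings for the `date-time` fields:

```python
# Hypothetical cell-level timeframes; when `time` is null, the notebook's
# global time is used instead (as noted above).
relative_time = {"live_span": "1h"}          # Option 1: relative timeframe

absolute_time = {                            # Option 2: absolute timeframe
    "start": "2024-01-01T00:00:00Z",         # date-time start
    "end": "2024-01-01T06:00:00Z",           # date-time end
    "live": False,                           # do not shift to end at the current time
}
```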
requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. 
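The `conditional_formats` entries above drive per-value coloring in the top list. A short hedged sketch with an arbitrary threshold; the comparators and palettes are taken from the allowed enum values listed above:

```python
# Hypothetical conditional formats: color values above and below a threshold.
conditional_formats = [
    {"comparator": ">", "value": 90.0, "palette": "white_on_red"},
    {"comparator": "<=", "value": 90.0, "palette": "white_on_green"},
]
```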
interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
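A hedged sketch of the events query (Option 2) above with the `logs` data source; the `compute`, `search`, and `group_by` sub-fields are not detailed in this listing, so the aggregation, query string, and facet shown are assumptions:

```python
# Hypothetical formulas-and-functions events query over logs.
# The compute/search/group_by sub-objects are assumed to take an aggregation,
# a query string, and a facet respectively; the values are placeholders.
log_events_query = {
    "data_source": "logs",
    "name": "errors",                         # name used in formulas
    "indexes": [],                            # [] queries all indexes at once
    "compute": {"aggregation": "count"},      # assumed compute option
    "search": {"query": "status:error"},      # assumed search option
    "group_by": [{"facet": "service"}],       # assumed group-by entry shape
}
```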
data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. Option 2 object The group to sort the widget by. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. 
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. 
Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. 
This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. 
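As a sketch of how two of the query options above combine in a single `queries` array, the following pairs an Option 1 (metrics) entry with an Option 2 (event platform, here `logs`) entry; the query strings are illustrative placeholders, and an empty `indexes` array queries all indexes. The remaining fields of Option 5 continue below.

```
{
  "queries": [
    {
      "data_source": "metrics",
      "name": "query1",
      "query": "avg:system.load.1{*} by {host}"
    },
    {
      "data_source": "logs",
      "name": "query2",
      "compute": { "aggregation": "count" },
      "search": { "query": "status:error" },
      "indexes": []
    }
  ]
}
```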
resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. 
live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. 
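For reference, the `markers` array described above might be populated as in the sketch below: each `display_type` combines a severity with a line type, the `value` is either a single value or a range, and the last entry shows the Distribution-specific `percentile` form (the labels are placeholders). The remaining `apm_stats_query` fields continue after this example.

```
{
  "markers": [
    { "display_type": "error dashed", "label": "Upper bound", "value": "y = 15" },
    { "display_type": "warning solid", "value": "0 < y < 10" },
    { "display_type": "percentile", "value": "90" }
  ]
}
```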
primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. unit_scale object The definition of `NumberFormatUnitScale` object. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. 
compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. 
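The SLO (Option 6) and Cloud Cost (Option 7) query shapes listed above can be sketched as follows; the `slo_id` and the cost query string are hypothetical placeholders. The remaining APM resource stats fields of the histogram request's query option continue below.

```
[
  {
    "data_source": "slo",
    "slo_id": "slo_id_placeholder",
    "measure": "slo_status",
    "group_mode": "overall",
    "name": "query1"
  },
  {
    "data_source": "cloud_cost",
    "name": "query2",
    "query": "sum:aws.cost.amortized{*} by {aws_product}",
    "aggregator": "sum"
  }
]
```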
primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. 
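The three `time` variants for a widget (a preset live span, an arbitrary live span, and a fixed span, the last of which is detailed just below) might look like the following; the timestamps are illustrative epoch-second values.

```
[
  { "live_span": "1h" },
  { "type": "live", "unit": "minute", "value": 17 },
  { "type": "fixed", "from": 1688169600, "to": 1688256000 }
]
```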
Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. 
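Putting the `log_stream` definition fields above together, a minimal cell of this type might look like the sketch below (the search query, columns, and sort column are placeholders); the widget `time` options follow.

```
{
  "attributes": {
    "definition": {
      "type": "log_stream",
      "query": "service:web-store status:error",
      "indexes": [],
      "columns": ["host", "service"],
      "message_display": "inline",
      "show_date_column": true,
      "show_message_column": true,
      "sort": { "column": "time", "order": "desc" }
    },
    "graph_size": "l"
  },
  "type": "notebook_cells"
}
```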
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] string Notebook cell ID. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time [_required_] Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. type [_required_] enum Type of the Notebook resource. 
Allowed enum values: `notebooks` default: `notebooks`

```
{
  "data": {
    "attributes": {
      "cells": [
        {
          "attributes": {
            "definition": {
              "text": "## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```",
              "type": "markdown"
            }
          },
          "type": "notebook_cells"
        },
        {
          "attributes": {
            "definition": {
              "requests": [
                {
                  "display_type": "line",
                  "q": "avg:system.load.1{*}",
                  "style": {
                    "line_type": "solid",
                    "line_width": "normal",
                    "palette": "dog_classic"
                  }
                }
              ],
              "show_legend": true,
              "type": "timeseries",
              "yaxis": {
                "scale": "linear"
              }
            },
            "graph_size": "m",
            "split_by": {
              "keys": [],
              "tags": []
            },
            "time": null
          },
          "type": "notebook_cells"
        }
      ],
      "name": "Example-Notebook-updated",
      "status": "published",
      "time": {
        "live_span": "1h"
      }
    },
    "type": "notebooks"
  }
}
```

### Response * [200](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-200-v1) * [400](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-400-v1) * [403](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-403-v1) * [404](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-404-v1) * [409](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-409-v1) * [429](https://docs.datadoghq.com/api/latest/notebooks/#UpdateNotebook-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) The description of a notebook response. Field Type Description data object The data for a notebook. attributes [_required_] object The attributes of a notebook. author object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. cells [_required_] [object] List of cells to display in the notebook. attributes [_required_] The attributes of a notebook cell response. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. [More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition.
tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. 
sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. 
Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. 
default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. 
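Putting the top list request pieces together, the sketch below pairs one formula-and-functions metrics query with a formula, a result limit, and a conditional format, using only fields listed in this schema. The metric, query name, and threshold are hypothetical:

```python
# Hypothetical top list request: one metrics query, one formula, one conditional format.
toplist_request = {
    "response_format": "scalar",
    "queries": [
        {
            "data_source": "metrics",
            "name": "cpu",                                # name referenced by the formula
            "query": "avg:system.cpu.user{*} by {host}",  # hypothetical metrics query
            "aggregator": "avg",
        }
    ],
    "formulas": [
        {
            "formula": "cpu",
            "limit": {"count": 10, "order": "desc"},
            "conditional_formats": [
                {"comparator": ">", "value": 80.0, "palette": "white_on_red"}
            ],
        }
    ],
}
```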
palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. 
search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. 
Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. 
Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. 
Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. 
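For the top list `style` and widget `time` settings described above, one possible combination is sketched below; exactly one of the three time variants would be used per widget, and the values shown are only examples:

```python
# Hypothetical top list style plus the three time-setting variants described above.
toplist_style = {
    "display": {"type": "stacked", "legend": "automatic"},  # or {"type": "flat"}
    "palette": "dog_classic",                               # example palette name
    "scaling": "absolute",                                  # or "relative"
}

# Time variants: only one of these shapes appears per widget.
time_live_simple = {"live_span": "1h"}                            # wrapper for live span
time_live_new = {"type": "live", "unit": "minute", "value": 17}   # arbitrary live span
time_fixed = {"type": "fixed", "from": 1704067200, "to": 1704153600}  # epoch seconds
```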
override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. 
Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. 
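The heatmap cell's `events`, `markers`, and `event_query` fields described above can be sketched as follows. The queries and marker values are illustrative only; the `y = 15` value and the severity/line-type combination come straight from the field descriptions:

```python
# Hypothetical heatmap overlay settings: event overlays, markers, and an event query.
heatmap_overlays = {
    "events": [
        {"q": "tags:deployment", "tags_execution": "and"}  # event overlay query (hypothetical)
    ],
    "markers": [
        {
            "display_type": "error dashed",  # severity plus line type
            "label": "Alert threshold",      # label displayed over the marker
            "value": "y = 15",               # single value, or a range such as "0 < y < 10"
        }
    ],
    "event_query": {
        "search": "sources:github",          # query made on the event (hypothetical)
        "tags_execution": "and",             # either "and" or "or"
    },
}
```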
profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. 
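As a concrete reading of the events query option above (Option 2), here is a sketch of a logs-backed query that could sit in the `queries` array; the facet and search string are hypothetical:

```python
# Hypothetical formula-and-functions events query (data_source: logs).
events_query = {
    "data_source": "logs",                # one of the event platform data sources
    "name": "error_logs",                 # name used in formulas
    "compute": {"aggregation": "count"},  # aggregation method (count, sum, avg, ...)
    "search": {"query": "status:error"},  # events search string
    "group_by": [{"facet": "service", "limit": 10}],
    "indexes": [],                        # omit or use [] to query all indexes at once
}
```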
limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
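Similarly, the SLO measures query (Option 6 above) might look like the following sketch; the SLO ID is a placeholder:

```python
# Hypothetical SLO measures query for use in the queries array.
slo_query = {
    "data_source": "slo",
    "slo_id": "<slo-id>",                 # placeholder: ID of the SLO to query
    "measure": "error_budget_remaining",  # one of the documented SLO measures
    "group_mode": "overall",              # "overall" or "components"
    "slo_query_type": "metric",           # "metric", "monitor", or "time_slice"
    "name": "budget",                     # name used in formulas
}
```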
data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. 
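Unlike the timeseries and top list requests, the heatmap request described above can also carry a single `query` together with `request_type: histogram`. A sketch under the same assumptions, with a hypothetical metric name:

```python
# Hypothetical heatmap request using a single metrics query as a histogram.
heatmap_request = {
    "request_type": "histogram",
    "query": {
        "data_source": "metrics",
        "name": "latency",                     # name used in formulas
        "query": "avg:trace.http.request{*}",  # hypothetical metrics query
    },
    "style": {"palette": "dog_classic"},       # example palette name
}
```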
Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. 
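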
URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. 
Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. 
name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. 
sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. 
live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] string Notebook cell ID. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` created date-time UTC time stamp for when the notebook was created. 
metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` modified date-time UTC time stamp for when the notebook was last modified. name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time [_required_] Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] int64 Unique notebook ID, assigned when you create the notebook. type [_required_] enum Type of the Notebook resource. Allowed enum values: `notebooks` default: `notebooks` ``` { "data": { "attributes": { "author": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "name": "string", "status": "string", "title": "string", "verified": false }, "cells": [ { "attributes": { "definition": { "requests": [ { "display_type": "line", "q": "avg:system.load.1{*}", "style": { "line_type": "solid", "line_width": "normal", "palette": "dog_classic" } } ], "show_legend": true, "type": "timeseries", "yaxis": { "scale": "linear" } }, "graph_size": "m", "split_by": { "keys": [], "tags": [] }, "time": null }, "id": "abcd1234", "type": "notebook_cells" } ], "created": "2021-02-24T23:14:15.173964+00:00", "metadata": { "is_template": false, "take_snapshots": false, "type": "investigation" }, "modified": "2021-02-24T23:15:23.274966+00:00", "name": "Example Notebook", "status": "published", "time": { "live_span": "5m" } }, "id": 123456, "type": "notebooks" } } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Authentication Error * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Conflict * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=typescript) ##### Update a notebook returns "OK" response Copy ``` ## json-request-body # # Path parameters export notebook_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/notebooks/${notebook_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "cells": [ { "attributes": { "definition": { "text": "## Some updated test markdown\n\nWith some example content.", "type": "markdown" } }, "type": "notebook_cells" }, { "attributes": { "definition": { "requests": [ { "display_type": "bars", "q": "avg:system.load.1{*}", "style": { "line_type": "solid", "line_width": "normal", "palette": "warm" } } ], "show_legend": true, "type": "timeseries", "yaxis": { "scale": "linear" } }, "graph_size": "m", "split_by": { "keys": [], "tags": [] }, "time": null }, "id": "abcd1234", "type": "notebook_cells" } ], "name": "Example Notebook", "time": { "live_span": "1h" } }, "type": "notebooks" } } EOF ``` ##### Update a notebook returns "OK" response ``` // Update a notebook returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "strconv" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "notebook" in the system NotebookDataID, _ := strconv.ParseInt(os.Getenv("NOTEBOOK_DATA_ID"), 10, 64) body := datadogV1.NotebookUpdateRequest{ Data: datadogV1.NotebookUpdateData{ Attributes: datadogV1.NotebookUpdateDataAttributes{ Cells: []datadogV1.NotebookUpdateCell{ datadogV1.NotebookUpdateCell{ NotebookCellCreateRequest: &datadogV1.NotebookCellCreateRequest{ Attributes: datadogV1.NotebookCellCreateRequestAttributes{ NotebookMarkdownCellAttributes: &datadogV1.NotebookMarkdownCellAttributes{ Definition: datadogV1.NotebookMarkdownCellDefinition{ Text: `## Some test markdown ` + "```" + ` var x, y; x = 5; y = 6; ` + "```", Type: datadogV1.NOTEBOOKMARKDOWNCELLDEFINITIONTYPE_MARKDOWN, }, }}, Type: datadogV1.NOTEBOOKCELLRESOURCETYPE_NOTEBOOK_CELLS, }}, datadogV1.NotebookUpdateCell{ NotebookCellCreateRequest: &datadogV1.NotebookCellCreateRequest{ Attributes: datadogV1.NotebookCellCreateRequestAttributes{ NotebookTimeseriesCellAttributes: &datadogV1.NotebookTimeseriesCellAttributes{ Definition: datadogV1.TimeseriesWidgetDefinition{ Requests: []datadogV1.TimeseriesWidgetRequest{ { DisplayType: 
datadogV1.WIDGETDISPLAYTYPE_LINE.Ptr(), Q: datadog.PtrString("avg:system.load.1{*}"), Style: &datadogV1.WidgetRequestStyle{ LineType: datadogV1.WIDGETLINETYPE_SOLID.Ptr(), LineWidth: datadogV1.WIDGETLINEWIDTH_NORMAL.Ptr(), Palette: datadog.PtrString("dog_classic"), }, }, }, ShowLegend: datadog.PtrBool(true), Type: datadogV1.TIMESERIESWIDGETDEFINITIONTYPE_TIMESERIES, Yaxis: &datadogV1.WidgetAxis{ Scale: datadog.PtrString("linear"), }, }, GraphSize: datadogV1.NOTEBOOKGRAPHSIZE_MEDIUM.Ptr(), SplitBy: &datadogV1.NotebookSplitBy{ Keys: []string{}, Tags: []string{}, }, Time: *datadogV1.NewNullableNotebookCellTime(nil), }}, Type: datadogV1.NOTEBOOKCELLRESOURCETYPE_NOTEBOOK_CELLS, }}, }, Name: "Example-Notebook-updated", Status: datadogV1.NOTEBOOKSTATUS_PUBLISHED.Ptr(), Time: datadogV1.NotebookGlobalTime{ NotebookRelativeTime: &datadogV1.NotebookRelativeTime{ LiveSpan: datadogV1.WIDGETLIVESPAN_PAST_ONE_HOUR, }}, }, Type: datadogV1.NOTEBOOKRESOURCETYPE_NOTEBOOKS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewNotebooksApi(apiClient) resp, r, err := api.UpdateNotebook(ctx, NotebookDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `NotebooksApi.UpdateNotebook`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `NotebooksApi.UpdateNotebook`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a notebook returns "OK" response ``` // Update a notebook returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.NotebooksApi; import com.datadog.api.client.v1.model.NotebookCellCreateRequest; import com.datadog.api.client.v1.model.NotebookCellCreateRequestAttributes; import com.datadog.api.client.v1.model.NotebookCellResourceType; import com.datadog.api.client.v1.model.NotebookGlobalTime; import com.datadog.api.client.v1.model.NotebookGraphSize; import com.datadog.api.client.v1.model.NotebookMarkdownCellAttributes; import com.datadog.api.client.v1.model.NotebookMarkdownCellDefinition; import com.datadog.api.client.v1.model.NotebookMarkdownCellDefinitionType; import com.datadog.api.client.v1.model.NotebookRelativeTime; import com.datadog.api.client.v1.model.NotebookResourceType; import com.datadog.api.client.v1.model.NotebookResponse; import com.datadog.api.client.v1.model.NotebookSplitBy; import com.datadog.api.client.v1.model.NotebookStatus; import com.datadog.api.client.v1.model.NotebookTimeseriesCellAttributes; import com.datadog.api.client.v1.model.NotebookUpdateCell; import com.datadog.api.client.v1.model.NotebookUpdateData; import com.datadog.api.client.v1.model.NotebookUpdateDataAttributes; import com.datadog.api.client.v1.model.NotebookUpdateRequest; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinition; import com.datadog.api.client.v1.model.TimeseriesWidgetDefinitionType; import com.datadog.api.client.v1.model.TimeseriesWidgetRequest; import com.datadog.api.client.v1.model.WidgetAxis; import 
com.datadog.api.client.v1.model.WidgetDisplayType; import com.datadog.api.client.v1.model.WidgetLineType; import com.datadog.api.client.v1.model.WidgetLineWidth; import com.datadog.api.client.v1.model.WidgetLiveSpan; import com.datadog.api.client.v1.model.WidgetRequestStyle; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); NotebooksApi apiInstance = new NotebooksApi(defaultClient); // there is a valid "notebook" in the system Long NOTEBOOK_DATA_ID = Long.parseLong(System.getenv("NOTEBOOK_DATA_ID")); NotebookUpdateRequest body = new NotebookUpdateRequest() .data( new NotebookUpdateData() .attributes( new NotebookUpdateDataAttributes() .cells( Arrays.asList( new NotebookUpdateCell( new NotebookCellCreateRequest() .attributes( new NotebookCellCreateRequestAttributes( new NotebookMarkdownCellAttributes() .definition( new NotebookMarkdownCellDefinition() .text( """ ## Some test markdown ``` var x, y; x = 5; y = 6; ``` """) .type( NotebookMarkdownCellDefinitionType .MARKDOWN)))) .type(NotebookCellResourceType.NOTEBOOK_CELLS)), new NotebookUpdateCell( new NotebookCellCreateRequest() .attributes( new NotebookCellCreateRequestAttributes( new NotebookTimeseriesCellAttributes() .definition( new TimeseriesWidgetDefinition() .requests( Collections.singletonList( new TimeseriesWidgetRequest() .displayType( WidgetDisplayType .LINE) .q( "avg:system.load.1{*}") .style( new WidgetRequestStyle() .lineType( WidgetLineType .SOLID) .lineWidth( WidgetLineWidth .NORMAL) .palette( "dog_classic")))) .showLegend(true) .type( TimeseriesWidgetDefinitionType .TIMESERIES) .yaxis( new WidgetAxis() .scale("linear"))) .graphSize(NotebookGraphSize.MEDIUM) .splitBy(new NotebookSplitBy()) .time(null))) .type(NotebookCellResourceType.NOTEBOOK_CELLS)))) .name("Example-Notebook-updated") .status(NotebookStatus.PUBLISHED) .time( new NotebookGlobalTime( new NotebookRelativeTime() .liveSpan(WidgetLiveSpan.PAST_ONE_HOUR)))) .type(NotebookResourceType.NOTEBOOKS)); try { NotebookResponse result = apiInstance.updateNotebook(NOTEBOOK_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling NotebooksApi#updateNotebook"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a notebook returns "OK" response ``` """ Update a notebook returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.notebooks_api import NotebooksApi from datadog_api_client.v1.model.notebook_cell_create_request import NotebookCellCreateRequest from datadog_api_client.v1.model.notebook_cell_resource_type import NotebookCellResourceType from datadog_api_client.v1.model.notebook_graph_size import NotebookGraphSize from datadog_api_client.v1.model.notebook_markdown_cell_attributes import NotebookMarkdownCellAttributes from datadog_api_client.v1.model.notebook_markdown_cell_definition 
import NotebookMarkdownCellDefinition from datadog_api_client.v1.model.notebook_markdown_cell_definition_type import NotebookMarkdownCellDefinitionType from datadog_api_client.v1.model.notebook_relative_time import NotebookRelativeTime from datadog_api_client.v1.model.notebook_resource_type import NotebookResourceType from datadog_api_client.v1.model.notebook_split_by import NotebookSplitBy from datadog_api_client.v1.model.notebook_status import NotebookStatus from datadog_api_client.v1.model.notebook_timeseries_cell_attributes import NotebookTimeseriesCellAttributes from datadog_api_client.v1.model.notebook_update_data import NotebookUpdateData from datadog_api_client.v1.model.notebook_update_data_attributes import NotebookUpdateDataAttributes from datadog_api_client.v1.model.notebook_update_request import NotebookUpdateRequest from datadog_api_client.v1.model.timeseries_widget_definition import TimeseriesWidgetDefinition from datadog_api_client.v1.model.timeseries_widget_definition_type import TimeseriesWidgetDefinitionType from datadog_api_client.v1.model.timeseries_widget_request import TimeseriesWidgetRequest from datadog_api_client.v1.model.widget_axis import WidgetAxis from datadog_api_client.v1.model.widget_display_type import WidgetDisplayType from datadog_api_client.v1.model.widget_line_type import WidgetLineType from datadog_api_client.v1.model.widget_line_width import WidgetLineWidth from datadog_api_client.v1.model.widget_live_span import WidgetLiveSpan from datadog_api_client.v1.model.widget_request_style import WidgetRequestStyle # there is a valid "notebook" in the system NOTEBOOK_DATA_ID = environ["NOTEBOOK_DATA_ID"] body = NotebookUpdateRequest( data=NotebookUpdateData( attributes=NotebookUpdateDataAttributes( cells=[ NotebookCellCreateRequest( attributes=NotebookMarkdownCellAttributes( definition=NotebookMarkdownCellDefinition( text="## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```", type=NotebookMarkdownCellDefinitionType.MARKDOWN, ), ), type=NotebookCellResourceType.NOTEBOOK_CELLS, ), NotebookCellCreateRequest( attributes=NotebookTimeseriesCellAttributes( definition=TimeseriesWidgetDefinition( requests=[ TimeseriesWidgetRequest( display_type=WidgetDisplayType.LINE, q="avg:system.load.1{*}", style=WidgetRequestStyle( line_type=WidgetLineType.SOLID, line_width=WidgetLineWidth.NORMAL, palette="dog_classic", ), ), ], show_legend=True, type=TimeseriesWidgetDefinitionType.TIMESERIES, yaxis=WidgetAxis( scale="linear", ), ), graph_size=NotebookGraphSize.MEDIUM, split_by=NotebookSplitBy( keys=[], tags=[], ), time=None, ), type=NotebookCellResourceType.NOTEBOOK_CELLS, ), ], name="Example-Notebook-updated", status=NotebookStatus.PUBLISHED, time=NotebookRelativeTime( live_span=WidgetLiveSpan.PAST_ONE_HOUR, ), ), type=NotebookResourceType.NOTEBOOKS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = NotebooksApi(api_client) response = api_instance.update_notebook(notebook_id=int(NOTEBOOK_DATA_ID), body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a notebook returns "OK" response ``` # Update a notebook returns "OK" response require "datadog_api_client" api_instance 
= DatadogAPIClient::V1::NotebooksAPI.new # there is a valid "notebook" in the system NOTEBOOK_DATA_ID = ENV["NOTEBOOK_DATA_ID"] body = DatadogAPIClient::V1::NotebookUpdateRequest.new({ data: DatadogAPIClient::V1::NotebookUpdateData.new({ attributes: DatadogAPIClient::V1::NotebookUpdateDataAttributes.new({ cells: [ DatadogAPIClient::V1::NotebookCellCreateRequest.new({ attributes: DatadogAPIClient::V1::NotebookMarkdownCellAttributes.new({ definition: DatadogAPIClient::V1::NotebookMarkdownCellDefinition.new({ text: "## Some test markdown\n\n```\nvar x, y;\nx = 5;\ny = 6;\n```", type: DatadogAPIClient::V1::NotebookMarkdownCellDefinitionType::MARKDOWN, }), }), type: DatadogAPIClient::V1::NotebookCellResourceType::NOTEBOOK_CELLS, }), DatadogAPIClient::V1::NotebookCellCreateRequest.new({ attributes: DatadogAPIClient::V1::NotebookTimeseriesCellAttributes.new({ definition: DatadogAPIClient::V1::TimeseriesWidgetDefinition.new({ requests: [ DatadogAPIClient::V1::TimeseriesWidgetRequest.new({ display_type: DatadogAPIClient::V1::WidgetDisplayType::LINE, q: "avg:system.load.1{*}", style: DatadogAPIClient::V1::WidgetRequestStyle.new({ line_type: DatadogAPIClient::V1::WidgetLineType::SOLID, line_width: DatadogAPIClient::V1::WidgetLineWidth::NORMAL, palette: "dog_classic", }), }), ], show_legend: true, type: DatadogAPIClient::V1::TimeseriesWidgetDefinitionType::TIMESERIES, yaxis: DatadogAPIClient::V1::WidgetAxis.new({ scale: "linear", }), }), graph_size: DatadogAPIClient::V1::NotebookGraphSize::MEDIUM, split_by: DatadogAPIClient::V1::NotebookSplitBy.new({ keys: [], tags: [], }), time: nil, }), type: DatadogAPIClient::V1::NotebookCellResourceType::NOTEBOOK_CELLS, }), ], name: "Example-Notebook-updated", status: DatadogAPIClient::V1::NotebookStatus::PUBLISHED, time: DatadogAPIClient::V1::NotebookRelativeTime.new({ live_span: DatadogAPIClient::V1::WidgetLiveSpan::PAST_ONE_HOUR, }), }), type: DatadogAPIClient::V1::NotebookResourceType::NOTEBOOKS, }), }) p api_instance.update_notebook(NOTEBOOK_DATA_ID.to_i, body) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us3.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Update a notebook returns "OK" response ``` // Update a notebook returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_notebooks::NotebooksAPI; use datadog_api_client::datadogV1::model::NotebookCellCreateRequest; use datadog_api_client::datadogV1::model::NotebookCellCreateRequestAttributes; use datadog_api_client::datadogV1::model::NotebookCellResourceType; use datadog_api_client::datadogV1::model::NotebookGlobalTime; use datadog_api_client::datadogV1::model::NotebookGraphSize; use datadog_api_client::datadogV1::model::NotebookMarkdownCellAttributes; use datadog_api_client::datadogV1::model::NotebookMarkdownCellDefinition; use datadog_api_client::datadogV1::model::NotebookMarkdownCellDefinitionType; use datadog_api_client::datadogV1::model::NotebookRelativeTime; use datadog_api_client::datadogV1::model::NotebookResourceType; use datadog_api_client::datadogV1::model::NotebookSplitBy; use datadog_api_client::datadogV1::model::NotebookStatus; use datadog_api_client::datadogV1::model::NotebookTimeseriesCellAttributes; use datadog_api_client::datadogV1::model::NotebookUpdateCell; use 
datadog_api_client::datadogV1::model::NotebookUpdateData; use datadog_api_client::datadogV1::model::NotebookUpdateDataAttributes; use datadog_api_client::datadogV1::model::NotebookUpdateRequest; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinition; use datadog_api_client::datadogV1::model::TimeseriesWidgetDefinitionType; use datadog_api_client::datadogV1::model::TimeseriesWidgetRequest; use datadog_api_client::datadogV1::model::WidgetAxis; use datadog_api_client::datadogV1::model::WidgetDisplayType; use datadog_api_client::datadogV1::model::WidgetLineType; use datadog_api_client::datadogV1::model::WidgetLineWidth; use datadog_api_client::datadogV1::model::WidgetLiveSpan; use datadog_api_client::datadogV1::model::WidgetRequestStyle; #[tokio::main] async fn main() { // there is a valid "notebook" in the system let notebook_data_id: i64 = std::env::var("NOTEBOOK_DATA_ID").unwrap().parse().unwrap(); let body = NotebookUpdateRequest::new(NotebookUpdateData::new( NotebookUpdateDataAttributes::new( vec![ NotebookUpdateCell::NotebookCellCreateRequest(Box::new( NotebookCellCreateRequest::new( NotebookCellCreateRequestAttributes::NotebookMarkdownCellAttributes( Box::new(NotebookMarkdownCellAttributes::new( NotebookMarkdownCellDefinition::new( r#"## Some test markdown ``` var x, y; x = 5; y = 6; ```"# .to_string(), NotebookMarkdownCellDefinitionType::MARKDOWN, ), )), ), NotebookCellResourceType::NOTEBOOK_CELLS, ), )), NotebookUpdateCell::NotebookCellCreateRequest(Box::new( NotebookCellCreateRequest::new( NotebookCellCreateRequestAttributes::NotebookTimeseriesCellAttributes( Box::new( NotebookTimeseriesCellAttributes::new( TimeseriesWidgetDefinition::new( vec![TimeseriesWidgetRequest::new() .display_type(WidgetDisplayType::LINE) .q("avg:system.load.1{*}".to_string()) .style( WidgetRequestStyle::new() .line_type(WidgetLineType::SOLID) .line_width(WidgetLineWidth::NORMAL) .palette("dog_classic".to_string()), )], TimeseriesWidgetDefinitionType::TIMESERIES, ) .show_legend(true) .yaxis(WidgetAxis::new().scale("linear".to_string())), ) .graph_size(NotebookGraphSize::MEDIUM) .split_by(NotebookSplitBy::new(vec![], vec![])) .time(None), ), ), NotebookCellResourceType::NOTEBOOK_CELLS, ), )), ], "Example-Notebook-updated".to_string(), NotebookGlobalTime::NotebookRelativeTime(Box::new(NotebookRelativeTime::new( WidgetLiveSpan::PAST_ONE_HOUR, ))), ) .status(NotebookStatus::PUBLISHED), NotebookResourceType::NOTEBOOKS, )); let configuration = datadog::Configuration::new(); let api = NotebooksAPI::with_config(configuration); let resp = api.update_notebook(notebook_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a notebook returns "OK" response ``` /** * Update a notebook returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.NotebooksApi(configuration); // there is a valid "notebook" in the system const NOTEBOOK_DATA_ID = parseInt(process.env.NOTEBOOK_DATA_ID as string); const params: v1.NotebooksApiUpdateNotebookRequest = { body: { data: { 
attributes: { cells: [ { attributes: { definition: { text: `## Some test markdown ` + "```" + ` var x, y; x = 5; y = 6; ` + "```", type: "markdown", }, }, type: "notebook_cells", }, { attributes: { definition: { requests: [ { displayType: "line", q: "avg:system.load.1{*}", style: { lineType: "solid", lineWidth: "normal", palette: "dog_classic", }, }, ], showLegend: true, type: "timeseries", yaxis: { scale: "linear", }, }, graphSize: "m", splitBy: { keys: [], tags: [], }, time: undefined, }, type: "notebook_cells", }, ], name: "Example-Notebook-updated", status: "published", time: { liveSpan: "1h", }, }, type: "notebooks", }, }, notebookId: NOTEBOOK_DATA_ID, }; apiInstance .updateNotebook(params) .then((data: v1.NotebookResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
# Set DD_SITE to your Datadog site (for example datadoghq.eu or us3.datadoghq.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
* * * ## [Get a notebook](https://docs.datadoghq.com/api/latest/notebooks/#get-a-notebook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/notebooks/#get-a-notebook-v1) GET https://api.ap1.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.ap2.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.eu/api/v1/notebooks/{notebook_id}https://api.ddog-gov.com/api/v1/notebooks/{notebook_id}https://api.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us3.datadoghq.com/api/v1/notebooks/{notebook_id}https://api.us5.datadoghq.com/api/v1/notebooks/{notebook_id} ### Overview Get a notebook using the specified notebook ID. This endpoint requires the `notebooks_read` permission. A minimal request sketch is shown below. ### Arguments #### Path Parameters Name Type Description notebook_id [_required_] integer Unique ID, assigned when you create the notebook. ### Response * [200](https://docs.datadoghq.com/api/latest/notebooks/#GetNotebook-200-v1) * [400](https://docs.datadoghq.com/api/latest/notebooks/#GetNotebook-400-v1) * [403](https://docs.datadoghq.com/api/latest/notebooks/#GetNotebook-403-v1) * [404](https://docs.datadoghq.com/api/latest/notebooks/#GetNotebook-404-v1) * [429](https://docs.datadoghq.com/api/latest/notebooks/#GetNotebook-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) The description of a notebook response. Field Type Description data object The data for a notebook. attributes [_required_] object The attributes of a notebook. author object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. name string Name of the user. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. cells [_required_] [object] List of cells to display in the notebook. attributes [_required_] The attributes of a notebook cell response. Valid cell types are `markdown`, `timeseries`, `toplist`, `heatmap`, `distribution`, `log_stream`. 
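The request sketched below is the minimal example referenced in the Overview of the Get a notebook endpoint: a plain curl call that assumes `DD_API_KEY` and `DD_APP_KEY` are exported in your shell and uses the `api.datadoghq.com` host. Substitute the API host for your Datadog site and a real notebook ID.
```
# Path parameter (hypothetical placeholder value)
export notebook_id="CHANGE_ME"
# Fetch the notebook; requires the notebooks_read permission
curl -X GET "https://api.datadoghq.com/api/v1/notebooks/${notebook_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
A successful call returns the notebook response model described in the surrounding field reference.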
[More information on each graph visualization type.](https://docs.datadoghq.com/dashboards/widgets/) Option 1 object The attributes of a notebook `markdown` cell. definition [_required_] object Text in a notebook is formatted with [Markdown](https://daringfireball.net/projects/markdown/), which enables the use of headings, subheadings, links, images, lists, and code blocks. text [_required_] string The markdown content. type [_required_] enum Type of the markdown cell. Allowed enum values: `markdown` default: `markdown` Option 2 object The attributes of a notebook `timeseries` cell. definition [_required_] object The timeseries visualization allows you to display the evolution of one or more metrics, log events, or Indexed Spans over time. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_columns [string] Columns displayed in the legend. legend_layout enum Layout of the legend. Allowed enum values: `auto,horizontal,vertical` legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of timeseries widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. display_type enum Type of display to use for the request. Allowed enum values: `area,bars,line,overlay` event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. 
limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. metadata [object] Used to define expression aliases. alias_name string Expression alias. expression [_required_] string Expression name. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. on_right_yaxis boolean Whether or not to display a second y-axis on the right. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. 
group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. 
Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. 
query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. right_yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` show_legend boolean (screenboard only) Show the legend for this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the timeseries widget. Allowed enum values: `timeseries` default: `timeseries` yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 3 object The attributes of a notebook `toplist` cell. definition [_required_] object The top list visualization enables you to display a list of Tag value like hostname or service with the most or least of any metric value, such as highest consumers of CPU, hosts with the least disk space, etc. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. requests [_required_] [object] List of top list widget requests. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. 
interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. audit_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. 
facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. 
Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. 
See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. 
Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. sort object The controls for sorting the widget. count int64 The number of items to limit the widget to. order_by [ ] The array of items to sort the widget by in order. Option 1 object The formula to sort the widget by. index [_required_] int64 The index of the formula to sort by. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to formula. Allowed enum values: `formula` Option 2 object The group to sort the widget by. name [_required_] string The name of the group. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` type [_required_] enum Set the sort type to group. Allowed enum values: `group` style object Define request widget style. line_type enum Type of lines displayed. Allowed enum values: `dashed,dotted,solid` line_width enum Width of line displayed. Allowed enum values: `normal,thick,thin` palette string Color palette to apply to the widget. style object Style customization for a top list widget. display Top list widget display options. Option 1 object Top list widget stacked display options. legend enum Top list widget stacked legend behavior. Allowed enum values: `automatic,inline,none` type [_required_] enum Top list widget stacked display type. Allowed enum values: `stacked` default: `stacked` Option 2 object Top list widget flat display. type [_required_] enum Top list widget flat display type. Allowed enum values: `flat` default: `flat` palette string Color palette to apply to the widget. scaling enum Top list widget scaling definition. Allowed enum values: `absolute,relative` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of your widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the top list widget. Allowed enum values: `toplist` default: `toplist` graph_size enum The size of the graph. 
Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 4 object The attributes of a notebook `heatmap` cell. definition [_required_] object The heat map visualization shows metrics aggregated across many tags, such as hosts. The more hosts that have a particular value, the darker that square is. custom_links [object] List of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. events [object] List of widget events. q [_required_] string Query definition. tags_execution string The execution method for multi-value filters. legend_size string Available legend sizes for a widget. Should be one of "0", "2", "4", "8", "16", or "auto". markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] List of widget types. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. event_query object The event query. search [_required_] string The query being made on the event. 
tags_execution [_required_] string The execution method for multi-value filters. Can be either and or or. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. 
aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. 
data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. 
Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. 
compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean Whether or not to display the legend on this widget. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the heat map widget. Allowed enum values: `heatmap` default: `heatmap` xaxis object X Axis controls for the heat map widget. num_buckets int64 Number of time buckets to target, also known as the resolution of the time bins. This is only applicable for distribution of points (group distributions use the roll-up modifier). yaxis object Axis controls for the widget. include_zero boolean Set to `true` to include zero. label string The label of the axis to display on the graph. Only usable on Scatterplot Widgets. max string Specifies maximum numeric value to show on the axis. Defaults to `auto`. default: `auto` min string Specifies minimum numeric value to show on the axis. Defaults to `auto`. default: `auto` scale string Specifies the scale type. Possible values are `linear`, `log`, `sqrt`, and `pow##` (for example `pow2` or `pow0.5`). default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. 
keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 5 object The attributes of a notebook `distribution` cell. definition [_required_] object The Distribution visualization is another way of showing metrics aggregated across one or several tags, such as hosts. Unlike the heat map, a distribution graph’s x-axis is quantity rather than time. custom_links [object] A list of custom links. is_hidden boolean The flag for toggling context menu link visibility. label string The label for the custom link URL. Keep the label short and descriptive. Use metrics and tags as variables. link string The URL of the custom link. URL must include `http` or `https`. A relative URL must start with `/`. override_label string The label ID that refers to a context menu link. Can be `logs`, `hosts`, `traces`, `profiles`, `processes`, `containers`, or `rum`. legend_size string **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. markers [object] List of markers. display_type string Combination of: * A severity error, warning, ok, or info * A line type: dashed, solid, or bold In this case of a Distribution widget, this can be set to be `percentile`. label string Label to display over the marker. time string Timestamp for the widget. value [_required_] string Value to apply. Can be a single value y = 15 or a range of values 0 < y < 10. For Distribution widgets with `display_type` set to `percentile`, this should be a numeric percentile value (for example, "90" for P90). requests [_required_] [object] Array of one request object to display in the widget. See the dedicated [Request JSON schema documentation](https://docs.datadoghq.com/dashboards/graphing_json/request_json) to learn how to build the `REQUEST_SCHEMA`. apm_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. apm_stats_query object The APM stats query for table and distributions widgets. columns [object] Column properties used by the front end for display. 
alias string A user-assigned alias for the column. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` name [_required_] string Column name. order enum Widget sorting methods. Allowed enum values: `asc,desc` env [_required_] string Environment name. name [_required_] string Operation name associated with service. primary_tag [_required_] string The organization's host group name and value. resource string Resource name. row_type [_required_] enum The level of detail for the request. Allowed enum values: `service,resource,span` service [_required_] string Service name. event_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. formulas [object] List of formulas that operate on queries. alias string Expression alias. cell_display_mode enum Define a display mode for the table cell. Allowed enum values: `number,bar,trend` cell_display_mode_options object Cell display mode options for the widget formula. (only if `cell_display_mode` is set to `trend`). trend_type enum Trend type for the cell display mode options. Allowed enum values: `area,line,bars` y_scale enum Y scale for the cell display mode options. Allowed enum values: `shared,independent` conditional_formats [object] List of conditional formats. comparator [_required_] enum Comparator to apply. Allowed enum values: `=,>,>=,<,<=` custom_bg_color string Color palette to apply to the background, same values available as palette. custom_fg_color string Color palette to apply to the foreground, same values available as palette. hide_value boolean True hides values. image_url string Displays an image as the background. metric string Metric from the request to correlate this conditional format with. palette [_required_] enum Color palette to apply. Allowed enum values: `blue,custom_bg,custom_image,custom_text,gray_on_white,grey,green,orange,red,red_on_white,white_on_gray,white_on_green,green_on_white,white_on_red,white_on_yellow,yellow_on_white,black_on_light_yellow,black_on_light_green,black_on_light_red` timeframe string Defines the displayed timeframe. value [_required_] double Value for the comparator. formula [_required_] string String expression built from queries, formulas, and functions. limit object Options for limiting results returned. count int64 Number of results to return. order enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` number_format object Number format options for the widget. unit Number format unit. Option 1 object Canonical unit. Option 2 object Custom unit. 
unit_scale object The definition of `NumberFormatUnitScale` object. type enum The type of unit scale. Allowed enum values: `canonical_unit` unit_name string The name of the unit. style object Styling options for widget formulas. palette string The color palette used to display the formula. A guide to the available color palettes can be found at palette_index int64 Index specifying which color to use within the palette. log_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. network_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. process_query object The process query to use in the widget. filter_by [string] List of processes. limit int64 Max number of items in the filter list. metric [_required_] string Your chosen metric. search_by string Your chosen search term. profile_metrics_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. q string Widget query. queries [ ] List of queries that can be returned directly or used in formulas. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object Process query using formulas and functions. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data sources that rely on the process backend. Allowed enum values: `process,container` is_normalized_cpu boolean Whether to normalize the CPU percentages. limit int64 Number of hits to return. metric [_required_] string Process metric name. name [_required_] string Name of query for use in formulas. sort enum Direction of sort. Allowed enum values: `asc,desc` default: `desc` tag_filters [string] An array of tags to filter by. text_filter string Text to use as filter. Option 4 object A formula and functions APM dependency stats query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM dependency stats queries. Allowed enum values: `apm_dependency_stats` env [_required_] string APM environment. 
is_upstream boolean Determines whether stats for upstream or downstream dependencies should be queried. name [_required_] string Name of query to use in formulas. operation_name [_required_] string Name of operation on service. primary_tag_name string The name of the second primary tag used within APM; required when `primary_tag_value` is specified. See . primary_tag_value string Filter APM data by the second primary tag. `primary_tag_name` must also be specified. resource_name [_required_] string APM resource. service [_required_] string APM service. stat [_required_] enum APM statistic. Allowed enum values: `avg_duration,avg_root_duration,avg_spans_per_trace,error_rate,pct_exec_time,pct_of_traces,total_traces_count` Option 5 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` Option 6 object A formula and functions metrics query. additional_query_filters string Additional filters applied to the SLO query. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for SLO measures queries. Allowed enum values: `slo` group_mode enum Group mode to query measures. Allowed enum values: `overall,components` measure [_required_] enum SLO measures queries. Allowed enum values: `good_events,bad_events,good_minutes,bad_minutes,slo_status,error_budget_remaining,burn_rate,error_budget_burndown` name string Name of the query for use in formulas. slo_id [_required_] string ID of an SLO to query measures. slo_query_type enum Name of the query for use in formulas. Allowed enum values: `metric,monitor,time_slice` Option 7 object A formula and functions Cloud Cost query. aggregator enum Aggregator used for the request. Allowed enum values: `avg,last,max,min,sum,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for Cloud Cost queries. Allowed enum values: `cloud_cost` name [_required_] string Name of the query for use in formulas. query [_required_] string Query for Cloud Cost data. query Query definition for Distribution Widget Histogram Request Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` Option 2 object A formula and functions events query. compute [_required_] object Compute options. aggregation [_required_] enum Aggregation methods for event platform queries. Allowed enum values: `count,cardinality,median,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg` interval int64 A time interval in milliseconds. metric string Measurable attribute to compute. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for event platform-based queries. Allowed enum values: `logs,spans,network,rum,security_signals,profiles,audit,events,ci_tests,ci_pipelines,incident_analytics,product_analytics,on_call_events` group_by [object] Group by options. facet [_required_] string Event facet. limit int64 Number of groups to return. sort object Options for sorting group by results. indexes [string] An array of index names to query in the stream. Omit or use `[]` to query all indexes at once. name [_required_] string Name of the query for use in formulas. search object Search options. query [_required_] string Events search string. storage string Option for storage location. Feature in Private Beta. Option 3 object APM resource stats query using formulas and functions. cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for APM resource stats queries. Allowed enum values: `apm_resource_stats` env [_required_] string APM environment. group_by [string] Array of fields to group results by. name [_required_] string Name of this query to use in formulas. operation_name string Name of operation on service. primary_tag_name string Name of the second primary tag used within APM. Required when `primary_tag_value` is specified. See primary_tag_value string Value of the second primary tag by which to filter APM data. `primary_tag_name` must also be specified. resource_name string APM resource name. service [_required_] string APM service name. stat [_required_] enum APM resource stat name. Allowed enum values: `errors,error_rate,hits,latency_avg,latency_distribution,latency_max,latency_p50,latency_p75,latency_p90,latency_p95,latency_p99` request_type enum Request type for the histogram request. Allowed enum values: `histogram` response_format enum Timeseries, scalar, or event list response. Event list response formats are supported by Geomap widgets. Allowed enum values: `timeseries,scalar,event_list` rum_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. 
[Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. security_query object The log query. compute object Define computation for a log query. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. group_by [object] List of tag prefixes to group by in the case of a cluster check. facet [_required_] string Facet name. limit int64 Maximum number of items in the group. sort object Define a sorting method. aggregation [_required_] string The aggregation method. facet string Facet name. order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` index string A coma separated-list of index names. Use "*" query all indexes at once. [Multiple Indexes](https://docs.datadoghq.com/logs/indexes/#multiple-indexes) multi_compute [object] This field is mutually exclusive with `compute`. aggregation [_required_] string The aggregation method. facet string Facet name. interval int64 Define a time interval in seconds. search object The query being made on the logs. query [_required_] string Search value to apply. style object Widget style definition. palette string Color palette to apply to the widget. show_legend boolean **DEPRECATED** : (Deprecated) The widget legend was replaced by a tooltip and sidebar. time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the distribution widget. Allowed enum values: `distribution` default: `distribution` xaxis object X Axis controls for the distribution widget. include_zero boolean True includes zero. max string Specifies maximum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` min string Specifies minimum value to show on the x-axis. It takes a number, percentile (p90 === 90th percentile), or auto for default behavior. default: `auto` num_buckets int64 Number of value buckets to target, also known as the resolution of the value bins. 
scale string Specifies the scale type. Possible values are `linear`. default: `linear` yaxis object Y Axis controls for the distribution widget. include_zero boolean True includes zero. label string The label of the axis to display on the graph. max string Specifies the maximum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` min string Specifies minimum value to show on the y-axis. It takes a number, or auto for default behavior. default: `auto` scale string Specifies the scale type. Possible values are `linear` or `log`. default: `linear` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` split_by object Object describing how to split the graph to display multiple visualizations per request. keys [_required_] [string] Keys to split on. tags [_required_] [string] Tags to split on. time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. Option 6 object The attributes of a notebook `log_stream` cell. definition [_required_] object The Log Stream displays a log flow matching the defined query. Only available on FREE layout dashboards. columns [string] Which columns to display on the widget. indexes [string] An array of index names to query in the stream. Use [] to query all indexes at once. logset string **DEPRECATED** : ID of the log set to use. message_display enum Amount of log lines to display Allowed enum values: `inline,expanded-md,expanded-lg` query string Query to filter the log stream with. show_date_column boolean Whether to show the date column or not show_message_column boolean Whether to show the message column or not sort object Which column and order to sort by column [_required_] string Facet path for the column order [_required_] enum Widget sorting methods. Allowed enum values: `asc,desc` time Time setting for the widget. Option 1 object Wrapper for live span hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Used for arbitrary live span times, such as 17 minutes or 6 hours. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. type [_required_] enum Type "live" denotes a live span in the new format. Allowed enum values: `live` unit [_required_] enum Unit of the time span. Allowed enum values: `minute,hour,day,week,month,year` value [_required_] int64 Value of the time span. Option 3 object Used for fixed span times, such as 'March 1 to March 7'. from [_required_] int64 Start time in seconds since epoch. hide_incomplete_cost_data boolean Whether to hide incomplete cost data in the widget. to [_required_] int64 End time in seconds since epoch. type [_required_] enum Type "fixed" denotes a fixed span. Allowed enum values: `fixed` title string Title of the widget. title_align enum How to align the text on the widget. 
Allowed enum values: `center,left,right` title_size string Size of the title. type [_required_] enum Type of the log stream widget. Allowed enum values: `log_stream` default: `log_stream` graph_size enum The size of the graph. Allowed enum values: `xs,s,m,l,xl` time object Timeframe for the notebook cell. When 'null', the notebook global time is used. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] string Notebook cell ID. type [_required_] enum Type of the Notebook Cell resource. Allowed enum values: `notebook_cells` default: `notebook_cells` created date-time UTC time stamp for when the notebook was created. metadata object Metadata associated with the notebook. is_template boolean Whether or not the notebook is a template. take_snapshots boolean Whether or not the notebook takes snapshot image backups of the notebook's fixed-time graphs. type enum Metadata type of the notebook. Allowed enum values: `postmortem,runbook,investigation,documentation,report` modified date-time UTC time stamp for when the notebook was last modified. name [_required_] string The name of the notebook. status enum Publication status of the notebook. For now, always "published". Allowed enum values: `published` default: `published` time [_required_] Notebook global timeframe. Option 1 object Relative timeframe. live_span [_required_] enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,week_to_date,month_to_date,1y,alert` Option 2 object Absolute timeframe. end [_required_] date-time The end time. live boolean Indicates whether the timeframe should be shifted to end at the current time. start [_required_] date-time The start time. id [_required_] int64 Unique notebook ID, assigned when you create the notebook. type [_required_] enum Type of the Notebook resource. Allowed enum values: `notebooks` default: `notebooks` ``` { "data": { "attributes": { "author": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "name": "string", "status": "string", "title": "string", "verified": false }, "cells": [ { "attributes": { "definition": { "requests": [ { "display_type": "line", "q": "avg:system.load.1{*}", "style": { "line_type": "solid", "line_width": "normal", "palette": "dog_classic" } } ], "show_legend": true, "type": "timeseries", "yaxis": { "scale": "linear" } }, "graph_size": "m", "split_by": { "keys": [], "tags": [] }, "time": null }, "id": "abcd1234", "type": "notebook_cells" } ], "created": "2021-02-24T23:14:15.173964+00:00", "metadata": { "is_template": false, "take_snapshots": false, "type": "investigation" }, "modified": "2021-02-24T23:15:23.274966+00:00", "name": "Example Notebook", "status": "published", "time": { "live_span": "5m" } }, "id": 123456, "type": "notebooks" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Authentication Error * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/notebooks/) * [Example](https://docs.datadoghq.com/api/latest/notebooks/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/notebooks/?code-lang=typescript)

##### Get a notebook

```
# Path parameters
export notebook_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the API host for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v1/notebooks/${notebook_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a notebook

```
"""
Get a notebook returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.notebooks_api import NotebooksApi

# there is a valid "notebook" in the system
NOTEBOOK_DATA_ID = environ["NOTEBOOK_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = NotebooksApi(api_client)
    response = api_instance.get_notebook(
        notebook_id=int(NOTEBOOK_DATA_ID),
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a notebook

```
# Get a notebook returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V1::NotebooksAPI.new

# there is a valid "notebook" in the system
NOTEBOOK_DATA_ID = ENV["NOTEBOOK_DATA_ID"]
p api_instance.get_notebook(NOTEBOOK_DATA_ID.to_i)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a notebook

```
// Get a notebook returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"strconv"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	// there is a valid "notebook" in the system
	NotebookDataID, _ := strconv.ParseInt(os.Getenv("NOTEBOOK_DATA_ID"), 10, 64)

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewNotebooksApi(apiClient)
	resp, r, err := api.GetNotebook(ctx, NotebookDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `NotebooksApi.GetNotebook`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `NotebooksApi.GetNotebook`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a notebook

```
// Get a notebook returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.NotebooksApi;
import com.datadog.api.client.v1.model.NotebookResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    NotebooksApi apiInstance = new NotebooksApi(defaultClient);

    // there is a valid "notebook" in the system
    Long NOTEBOOK_DATA_ID = Long.parseLong(System.getenv("NOTEBOOK_DATA_ID"));

    try {
      NotebookResponse result = apiInstance.getNotebook(NOTEBOOK_DATA_ID);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling NotebooksApi#getNotebook");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get a notebook

```
// Get a notebook returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_notebooks::NotebooksAPI;

#[tokio::main]
async fn main() {
    // there is a valid "notebook" in the system
    let notebook_data_id: i64 = std::env::var("NOTEBOOK_DATA_ID").unwrap().parse().unwrap();

    let configuration = datadog::Configuration::new();
    let api = NotebooksAPI::with_config(configuration);
    let resp = api.get_notebook(notebook_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Get a notebook

```
/**
 * Get a notebook returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.NotebooksApi(configuration);

// there is a valid "notebook" in the system
const NOTEBOOK_DATA_ID = parseInt(process.env.NOTEBOOK_DATA_ID as string);

const params: v1.NotebooksApiGetNotebookRequest = {
  notebookId: NOTEBOOK_DATA_ID,
};

apiInstance
  .getNotebook(params)
  .then((data: v1.NotebookResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

---

# Source: https://docs.datadoghq.com/api/latest/observability-pipelines/

# Observability Pipelines

Observability Pipelines allows you to collect and process logs within your own infrastructure, and then route them to downstream integrations.

## [List pipelines](https://docs.datadoghq.com/api/latest/observability-pipelines/#list-pipelines)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#list-pipelines-v2)

**Note**: This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access.

GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines

### Overview

Retrieve a list of pipelines.
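As a quick orientation, here is a minimal sketch of what a request to this endpoint can look like with curl, assuming the US1 API host and the standard `DD_API_KEY`/`DD_APP_KEY` environment variables; the required permission and the optional paging parameters are described below.

```
# List pipelines (Preview) - a minimal sketch, not an official example.
# Assumes DD_API_KEY and DD_APP_KEY are exported and your org has Preview access.
curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines?page%5Bsize%5D=10&page%5Bnumber%5D=1" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```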
This endpoint requires the `observability_pipelines_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/observability-pipelines/#ListPipelines-200-v2) * [400](https://docs.datadoghq.com/api/latest/observability-pipelines/#ListPipelines-400-v2) * [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#ListPipelines-403-v2) * [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#ListPipelines-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Represents the response payload containing a list of pipelines and associated metadata. Expand All Field Type Description _required_] [object] The `schema` `data`. _required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). _required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. _required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. 
acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. 
Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. 
credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. 
method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. 
ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. 
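To make the processor-group structure described above easier to picture, here is a small sketch of one group entry as a plain Python dict, modeled on the response example later in this section. The IDs, the search queries, and the `message` field name are hypothetical placeholders; the `field` key of the `parse_json` processor is documented immediately below.

```python
# Sketch of one processor group containing a `filter` and a `parse_json` processor.
# All IDs, queries, and field names are hypothetical placeholders.
processor_group = {
    "id": "grouped-processors",
    "enabled": True,
    "include": "service:my-service",       # condition for when this group executes
    "inputs": ["datadog-agent-source"],    # upstream component feeding the group
    "display_name": "my component",
    "processors": [
        {
            "id": "filter-processor",
            "type": "filter",
            "enabled": True,
            "include": "status:error",     # only logs matching this query continue downstream
        },
        {
            "id": "parse-json-processor",
            "type": "parse_json",
            "enabled": True,
            "include": "*",
            "field": "message",            # log field containing an embedded JSON string
        },
    ],
}
```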
field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. _required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of static fields (key-value pairs) that is added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field, that is received from the source, should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. 
strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. 
action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. 
Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. 
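For reference, here is a minimal sketch of the `dedupe` processor described above, using only its documented fields; the ID, query, and field paths are hypothetical placeholders.

```python
# Sketch of a `dedupe` processor that removes duplicate fields from log events.
# The ID, query, and field paths are hypothetical placeholders.
dedupe_processor = {
    "id": "dedupe-processor",
    "type": "dedupe",
    "enabled": True,
    "include": "*",                                   # which logs this processor targets
    "fields": ["host", "attributes.request_id"],      # field paths checked for duplicates
    "mode": "match",                                  # allowed values: `match` or `ignore`
}
```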
comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 the number of events allowed in a given time window. Events sent after the threshold has been reached, are dropped. type [_required_] enum The processor type. The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. 
default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
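The `datadog_tags` processor described above can be sketched with its documented fields as follows; the ID, query, and tag keys are hypothetical placeholders.

```python
# Sketch of a `datadog_tags` processor that keeps only tags whose keys are listed.
# The ID, query, and tag keys are hypothetical placeholders.
datadog_tags_processor = {
    "id": "datadog-tags-processor",
    "type": "datadog_tags",
    "enabled": True,
    "include": "*",              # which logs this processor targets
    "action": "include",         # `include` keeps matching tag keys; `exclude` removes them
    "mode": "filter",            # the only documented processing mode
    "keys": ["env", "team"],     # tag keys the action applies to
}
```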
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. 
Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages.
Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). 
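A minimal sketch of the `rsyslog` source described above, listening over TCP with TLS enabled. The component ID and certificate paths are hypothetical placeholders, and the `tls` key name follows the `kafka` source shown in the response example earlier in this section.

```python
# Sketch of an `rsyslog` source receiving syslog messages over TCP with TLS.
# The ID and certificate paths are hypothetical placeholders.
rsyslog_source = {
    "id": "rsyslog-source",
    "type": "rsyslog",
    "mode": "tcp",                                     # allowed values: `tcp` or `udp`
    "tls": {
        "crt_file": "/etc/ssl/certs/pipeline.crt",     # client certificate (required when TLS is configured)
        "key_file": "/etc/ssl/private/pipeline.key",   # private key for mutual TLS
        "ca_file": "/etc/ssl/certs/ca.crt",            # CA used to validate the server certificate
    },
}
```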
project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). 
method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. id [_required_] string Unique identifier for the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. default: `pipelines` object Metadata about the response. totalCount int64 The total number of pipelines. ``` { "data": [ { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "filter-processor" ], "type": "datadog_logs" } ], "processors": [ { "display_name": "my component", "enabled": true, "id": "grouped-processors", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ [] ] } ], "sources": [ { "group_id": "consumer-group-0", "id": "kafka-source", "librdkafka_options": [ { "name": "fetch.message.max.bytes", "value": "1048576" } ], "sasl": { "mechanism": "string" }, "tls": { "ca_file": "string", "crt_file": "/path/to/cert.crt", "key_file": "string" }, "topics": [ "topic1", "topic2" ], "type": "kafka" } ] }, "name": "Main Observability Pipeline" }, "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6", "type": "pipelines" } ], "meta": { "totalCount": 42 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript) ##### List pipelines Copy ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List pipelines ``` """ List pipelines returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi configuration = Configuration() configuration.unstable_operations["list_pipelines"] = True with ApiClient(configuration) as api_client: api_instance = ObservabilityPipelinesApi(api_client) response = api_instance.list_pipelines() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List pipelines ``` # List pipelines returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_pipelines".to_sym] = true end api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new p api_instance.list_pipelines() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List pipelines ``` // List pipelines returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListPipelines", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewObservabilityPipelinesApi(apiClient) resp, r, err := api.ListPipelines(ctx, *datadogV2.NewListPipelinesOptionalParameters()) if err
!= nil { fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.ListPipelines`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.ListPipelines`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List pipelines ``` // List pipelines returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ObservabilityPipelinesApi; import com.datadog.api.client.v2.model.ListPipelinesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listPipelines", true); ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient); try { ListPipelinesResponse result = apiInstance.listPipelines(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ObservabilityPipelinesApi#listPipelines"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List pipelines ``` // List pipelines returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_observability_pipelines::ListPipelinesOptionalParams; use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListPipelines", true); let api = ObservabilityPipelinesAPI::with_config(configuration); let resp = api .list_pipelines(ListPipelinesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List pipelines ``` /** * List pipelines returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listPipelines"] = true; const apiInstance = new v2.ObservabilityPipelinesApi(configuration); apiInstance .listPipelines() .then((data: v2.ListPipelinesResponse) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines/#create-a-new-pipeline) * [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#create-a-new-pipeline-v2) **Note** : This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access. POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelineshttps://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines ### Overview Create a new pipeline. This endpoint requires the `observability_pipelines_deploy` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Expand All Field Type Description _required_] object Contains the pipeline configuration. _required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). _required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. _required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class.
Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. 
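For illustration, a `splunk_hec` entry in the `destinations` array described above might look like the following sketch. The component ID, input ID, and index name are placeholders rather than values from this reference; only `id`, `inputs`, and `type` are required, the remaining fields are optional.

```
{
  "id": "splunk-hec-destination",
  "inputs": ["my-processor-group"],
  "type": "splunk_hec",
  "encoding": "json",
  "index": "main",
  "auto_extract_timestamp": true
}
```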
inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. 
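As a sketch of the `rsyslog` destination described above, the entry below assumes the TLS settings are nested under a `tls` key, which this extract describes but does not name; the IDs, keepalive value, and file paths are illustrative.

```
{
  "id": "rsyslog-destination",
  "inputs": ["my-processor-group"],
  "type": "rsyslog",
  "keepalive": 60000,
  "tls": {
    "ca_file": "/etc/ssl/certs/ca.pem",
    "crt_file": "/etc/ssl/certs/pipeline-client.crt",
    "key_file": "/etc/ssl/private/pipeline-client.key"
  }
}
```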
dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. 
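A hedged sketch of the `google_chronicle` destination described above follows. The customer ID, log type, and credentials path are placeholders, and the GCP credentials object is assumed to sit under an `auth` key, which this extract does not name.

```
{
  "id": "chronicle-destination",
  "inputs": ["my-processor-group"],
  "type": "google_chronicle",
  "customer_id": "00000000-aaaa-bbbb-cccc-000000000000",
  "encoding": "json",
  "log_type": "MY_CUSTOM_LOG_TYPE",
  "auth": {
    "credentials_file": "/var/secrets/gcp-credentials.json"
  }
}
```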
inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. 
Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. _required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. 
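For example, the `parse_json` processor described above could appear inside a processor group's `processors` list as in the following sketch; the ID, search query, and field name are illustrative.

```
{
  "enabled": true,
  "id": "parse-json-processor",
  "include": "source:nginx",
  "field": "message",
  "type": "parse_json"
}
```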
partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of static fields (key-value pairs) that is added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field, that is received from the source, should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. 
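Putting the quota processor fields described above together, a daily quota with one override might be sketched as follows. The IDs, queries, and byte limits are illustrative, and the `limit`, `overrides`, and `fields` key names are assumptions, since this extract describes those nested objects without naming them.

```
{
  "enabled": true,
  "id": "daily-quota",
  "include": "service:checkout",
  "name": "checkout-daily-quota",
  "drop_events": true,
  "limit": { "enforce": "bytes", "limit": 10000000000 },
  "partition_fields": ["env"],
  "overrides": [
    {
      "fields": [{ "name": "env", "value": "prod" }],
      "limit": { "enforce": "bytes", "limit": 50000000000 }
    }
  ],
  "type": "quota"
}
```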
group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. 
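Looking back at the `parse_grok` processor described earlier, a minimal sketch might resemble the following. The Grok pattern itself is illustrative, and the `rules`, `match_rules`, and `support_rules` key names are assumptions, since this extract lists those nested rule objects without naming their fields.

```
{
  "enabled": true,
  "id": "parse-grok-processor",
  "include": "source:nginx",
  "disable_library_rules": false,
  "rules": [
    {
      "source": "message",
      "match_rules": [
        { "name": "extract_level", "rule": "%{word:level} %{data:msg}" }
      ],
      "support_rules": []
    }
  ],
  "type": "parse_grok"
}
```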
target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. 
Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 the number of events allowed in a given time window. Events sent after the threshold has been reached, are dropped. type [_required_] enum The processor type. 
The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. 
Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`.
Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. 
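Returning to the `kafka` source described earlier, one entry in the `sources` list might be sketched as follows. The group ID, topic, and option values are illustrative, and the `sasl` and `librdkafka_options` key names are assumptions, since this extract describes those objects without naming their fields.

```
{
  "id": "kafka-source",
  "type": "kafka",
  "group_id": "consumer-group-0",
  "topics": ["app-logs"],
  "sasl": { "mechanism": "SCRAM-SHA-256" },
  "librdkafka_options": [
    { "name": "fetch.message.max.bytes", "value": "1048576" }
  ]
}
```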
key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. 
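Similarly, the `google_pubsub` source described above might be sketched as follows; the project, subscription, and credentials path are placeholders, and the GCP credentials object is assumed to sit under an `auth` key, which this extract does not name.

```
{
  "id": "pubsub-source",
  "type": "google_pubsub",
  "project": "my-gcp-project",
  "subscription": "observability-pipelines-logs",
  "decoding": "json",
  "auth": {
    "credentials_file": "/var/secrets/gcp-credentials.json"
  }
}
```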
key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. 
default: `pipelines` ``` { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "my-processor-group" ], "type": "datadog_logs" } ], "processors": [ { "enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ { "enabled": true, "id": "filter-processor", "include": "status:error", "type": "filter" } ] } ], "sources": [ { "id": "datadog-agent-source", "type": "datadog_agent" } ] }, "name": "Main Observability Pipeline" }, "type": "pipelines" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/observability-pipelines/#CreatePipeline-201-v2) * [400](https://docs.datadoghq.com/api/latest/observability-pipelines/#CreatePipeline-400-v2) * [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#CreatePipeline-403-v2) * [409](https://docs.datadoghq.com/api/latest/observability-pipelines/#CreatePipeline-409-v2) * [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#CreatePipeline-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Top-level schema representing a pipeline. Expand All Field Type Description _required_] object Contains the pipeline’s ID, type, and configuration attributes. _required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). _required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. _required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. 
key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. 
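As a point of reference for the `splunk_hec` destination described above, a minimal entry in the `destinations` array could look like the following sketch; the index, sourcetype, and input IDs are illustrative values, not defaults.

```
{
  "id": "splunk-hec-destination",
  "type": "splunk_hec",
  "inputs": ["my-processor-group"],
  "encoding": "json",
  "auto_extract_timestamp": true,
  "index": "main",
  "sourcetype": "observability_pipelines"
}
```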
id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. 
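To illustrate the `microsoft_sentinel` destination fields above, here is a sketch of a `destinations` entry; all IDs and the table name are placeholders.

```
{
  "id": "sentinel-destination",
  "type": "microsoft_sentinel",
  "inputs": ["my-processor-group"],
  "client_id": "00000000-0000-0000-0000-000000000000",
  "tenant_id": "00000000-0000-0000-0000-000000000000",
  "dcr_immutable_id": "dcr-00000000000000000000000000000000",
  "table": "Custom-ObservabilityPipelinesLogs"
}
```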
Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. 
method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. 
inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. 
Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. _required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. 
enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of static fields (key-value pairs) that is added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field, that is received from the source, should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. 
object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. 
proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. 
target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. 
includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 The number of events allowed in a given time window. Events sent after the threshold has been reached are dropped. type [_required_] enum The processor type. The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component.
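Returning to the `throttle` processor described above, a sketch of one entry in a processor group's `processors` list; the threshold, window, and group-by field values are illustrative.

```
{
  "id": "throttle-processor",
  "type": "throttle",
  "enabled": true,
  "include": "service:my-service",
  "group_by": ["host"],
  "threshold": 1000,
  "window": 60.0
}
```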
enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`.
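For example, an `http_server` source entry combining the fields described above might look like the following minimal sketch; the ID and chosen enum values are illustrative.

```
{
  "id": "http-intake-source",
  "type": "http_server",
  "auth_strategy": "plain",
  "decoding": "json"
}
```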
Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. 
object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. 
object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. id [_required_] string Unique identifier for the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. default: `pipelines` ``` { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "filter-processor" ], "type": "datadog_logs" } ], "processors": [ { "display_name": "my component", "enabled": true, "id": "grouped-processors", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ [] ] } ], "sources": [ { "group_id": "consumer-group-0", "id": "kafka-source", "librdkafka_options": [ { "name": "fetch.message.max.bytes", "value": "1048576" } ], "sasl": { "mechanism": "string" }, "tls": { "ca_file": "string", "crt_file": "/path/to/cert.crt", "key_file": "string" }, "topics": [ "topic1", "topic2" ], "type": "kafka" } ] }, "name": "Main Observability Pipeline" }, "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6", "type": "pipelines" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript) ##### Create a new pipeline returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "my-processor-group" ], "type": "datadog_logs" } ], "processors": [ { "enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ { "enabled": true, "id": "filter-processor", "include": "status:error", "type": "filter" } ] } ], "sources": [ { "id": "datadog-agent-source", "type": "datadog_agent" } ] }, "name": "Main Observability Pipeline" }, "type": "pipelines" } } EOF ``` ##### Create a new pipeline returns "OK" response ``` // Create a new pipeline returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ObservabilityPipelineSpec{ Data: datadogV2.ObservabilityPipelineSpecData{ Attributes: datadogV2.ObservabilityPipelineDataAttributes{ Config: datadogV2.ObservabilityPipelineConfig{ Destinations: []datadogV2.ObservabilityPipelineConfigDestinationItem{ datadogV2.ObservabilityPipelineConfigDestinationItem{ ObservabilityPipelineDatadogLogsDestination: &datadogV2.ObservabilityPipelineDatadogLogsDestination{ Id: "datadog-logs-destination", Inputs: []string{ "my-processor-group", }, Type: datadogV2.OBSERVABILITYPIPELINEDATADOGLOGSDESTINATIONTYPE_DATADOG_LOGS, }}, }, Processors: []datadogV2.ObservabilityPipelineConfigProcessorGroup{ { Enabled: true, Id: "my-processor-group", Include: "service:my-service", Inputs: []string{ "datadog-agent-source", }, Processors: 
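// Nested processors run in order inside this processor group; the filter below
// passes only events matching "status:error" on to the Datadog Logs destination.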
[]datadogV2.ObservabilityPipelineConfigProcessorItem{ datadogV2.ObservabilityPipelineConfigProcessorItem{ ObservabilityPipelineFilterProcessor: &datadogV2.ObservabilityPipelineFilterProcessor{ Enabled: true, Id: "filter-processor", Include: "status:error", Type: datadogV2.OBSERVABILITYPIPELINEFILTERPROCESSORTYPE_FILTER, }}, }, }, }, Sources: []datadogV2.ObservabilityPipelineConfigSourceItem{ datadogV2.ObservabilityPipelineConfigSourceItem{ ObservabilityPipelineDatadogAgentSource: &datadogV2.ObservabilityPipelineDatadogAgentSource{ Id: "datadog-agent-source", Type: datadogV2.OBSERVABILITYPIPELINEDATADOGAGENTSOURCETYPE_DATADOG_AGENT, }}, }, }, Name: "Main Observability Pipeline", }, Type: "pipelines", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreatePipeline", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewObservabilityPipelinesApi(apiClient) resp, r, err := api.CreatePipeline(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.CreatePipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.CreatePipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new pipeline returns "OK" response ``` // Create a new pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ObservabilityPipelinesApi; import com.datadog.api.client.v2.model.ObservabilityPipeline; import com.datadog.api.client.v2.model.ObservabilityPipelineConfig; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigDestinationItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorGroup; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigSourceItem; import com.datadog.api.client.v2.model.ObservabilityPipelineDataAttributes; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSource; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSourceType; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestination; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestinationType; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessor; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessorType; import com.datadog.api.client.v2.model.ObservabilityPipelineSpec; import com.datadog.api.client.v2.model.ObservabilityPipelineSpecData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createPipeline", true); ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient); ObservabilityPipelineSpec body = new ObservabilityPipelineSpec() .data( new 
ObservabilityPipelineSpecData() .attributes( new ObservabilityPipelineDataAttributes() .config( new ObservabilityPipelineConfig() .destinations( Collections.singletonList( new ObservabilityPipelineConfigDestinationItem( new ObservabilityPipelineDatadogLogsDestination() .id("datadog-logs-destination") .inputs( Collections.singletonList( "my-processor-group")) .type( ObservabilityPipelineDatadogLogsDestinationType .DATADOG_LOGS)))) .processors( Collections.singletonList( new ObservabilityPipelineConfigProcessorGroup() .enabled(true) .id("my-processor-group") .include("service:my-service") .inputs( Collections.singletonList( "datadog-agent-source")) .processors( Collections.singletonList( new ObservabilityPipelineConfigProcessorItem( new ObservabilityPipelineFilterProcessor() .enabled(true) .id("filter-processor") .include("status:error") .type( ObservabilityPipelineFilterProcessorType .FILTER)))))) .sources( Collections.singletonList( new ObservabilityPipelineConfigSourceItem( new ObservabilityPipelineDatadogAgentSource() .id("datadog-agent-source") .type( ObservabilityPipelineDatadogAgentSourceType .DATADOG_AGENT))))) .name("Main Observability Pipeline")) .type("pipelines")); try { ObservabilityPipeline result = apiInstance.createPipeline(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ObservabilityPipelinesApi#createPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new pipeline returns "OK" response ``` """ Create a new pipeline returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig from datadog_api_client.v2.model.observability_pipeline_config_processor_group import ( ObservabilityPipelineConfigProcessorGroup, ) from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source import ( ObservabilityPipelineDatadogAgentSource, ) from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source_type import ( ObservabilityPipelineDatadogAgentSourceType, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import ( ObservabilityPipelineDatadogLogsDestination, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import ( ObservabilityPipelineDatadogLogsDestinationType, ) from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import ( ObservabilityPipelineFilterProcessorType, ) from datadog_api_client.v2.model.observability_pipeline_spec import ObservabilityPipelineSpec from datadog_api_client.v2.model.observability_pipeline_spec_data import 
ObservabilityPipelineSpecData body = ObservabilityPipelineSpec( data=ObservabilityPipelineSpecData( attributes=ObservabilityPipelineDataAttributes( config=ObservabilityPipelineConfig( destinations=[ ObservabilityPipelineDatadogLogsDestination( id="datadog-logs-destination", inputs=[ "my-processor-group", ], type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS, ), ], processors=[ ObservabilityPipelineConfigProcessorGroup( enabled=True, id="my-processor-group", include="service:my-service", inputs=[ "datadog-agent-source", ], processors=[ ObservabilityPipelineFilterProcessor( enabled=True, id="filter-processor", include="status:error", type=ObservabilityPipelineFilterProcessorType.FILTER, ), ], ), ], sources=[ ObservabilityPipelineDatadogAgentSource( id="datadog-agent-source", type=ObservabilityPipelineDatadogAgentSourceType.DATADOG_AGENT, ), ], ), name="Main Observability Pipeline", ), type="pipelines", ), ) configuration = Configuration() configuration.unstable_operations["create_pipeline"] = True with ApiClient(configuration) as api_client: api_instance = ObservabilityPipelinesApi(api_client) response = api_instance.create_pipeline(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new pipeline returns "OK" response ``` # Create a new pipeline returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_pipeline".to_sym] = true end api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new body = DatadogAPIClient::V2::ObservabilityPipelineSpec.new({ data: DatadogAPIClient::V2::ObservabilityPipelineSpecData.new({ attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({ config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({ destinations: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({ id: "datadog-logs-destination", inputs: [ "my-processor-group", ], type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, }), ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineConfigProcessorGroup.new({ enabled: true, id: "my-processor-group", include: "service:my-service", inputs: [ "datadog-agent-source", ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({ enabled: true, id: "filter-processor", include: "status:error", type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER, }), ], }), ], sources: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSource.new({ id: "datadog-agent-source", type: DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, }), ], }), name: "Main Observability Pipeline", }), type: "pipelines", }), }) p api_instance.create_pipeline(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new pipeline returns "OK" response ``` // 
Create a new pipeline returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfig; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigDestinationItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorGroup; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigSourceItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineDataAttributes; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSource; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSourceType; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestination; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestinationType; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessor; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessorType; use datadog_api_client::datadogV2::model::ObservabilityPipelineSpec; use datadog_api_client::datadogV2::model::ObservabilityPipelineSpecData; #[tokio::main] async fn main() { let body = ObservabilityPipelineSpec::new( ObservabilityPipelineSpecData::new( ObservabilityPipelineDataAttributes::new( ObservabilityPipelineConfig::new( vec![ ObservabilityPipelineConfigDestinationItem::ObservabilityPipelineDatadogLogsDestination( Box::new( ObservabilityPipelineDatadogLogsDestination::new( "datadog-logs-destination".to_string(), vec!["my-processor-group".to_string()], ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, ), ), ) ], vec![ ObservabilityPipelineConfigSourceItem::ObservabilityPipelineDatadogAgentSource( Box::new( ObservabilityPipelineDatadogAgentSource::new( "datadog-agent-source".to_string(), ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, ), ), ) ], ).processors( vec![ ObservabilityPipelineConfigProcessorGroup::new( true, "my-processor-group".to_string(), "service:my-service".to_string(), vec!["datadog-agent-source".to_string()], vec![ ObservabilityPipelineConfigProcessorItem::ObservabilityPipelineFilterProcessor( Box::new( ObservabilityPipelineFilterProcessor::new( true, "filter-processor".to_string(), "status:error".to_string(), ObservabilityPipelineFilterProcessorType::FILTER, ), ), ) ], ) ], ), "Main Observability Pipeline".to_string(), ), "pipelines".to_string(), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreatePipeline", true); let api = ObservabilityPipelinesAPI::with_config(configuration); let resp = api.create_pipeline(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new pipeline returns "OK" response ``` /** * Create a new pipeline returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); 
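// createPipeline is an unstable (Preview) operation in this client, so it must be
// explicitly enabled on the configuration before it can be called.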
configuration.unstableOperations["v2.createPipeline"] = true; const apiInstance = new v2.ObservabilityPipelinesApi(configuration); const params: v2.ObservabilityPipelinesApiCreatePipelineRequest = { body: { data: { attributes: { config: { destinations: [ { id: "datadog-logs-destination", inputs: ["my-processor-group"], type: "datadog_logs", }, ], processors: [ { enabled: true, id: "my-processor-group", include: "service:my-service", inputs: ["datadog-agent-source"], processors: [ { enabled: true, id: "filter-processor", include: "status:error", type: "filter", }, ], }, ], sources: [ { id: "datadog-agent-source", type: "datadog_agent", }, ], }, name: "Main Observability Pipeline", }, type: "pipelines", }, }, }; apiInstance .createPipeline(params) .then((data: v2.ObservabilityPipeline) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a specific pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines/#get-a-specific-pipeline) * [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#get-a-specific-pipeline-v2) **Note** : This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access. GET https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id} ### Overview Get a specific pipeline by its ID. This endpoint requires the `observability_pipelines_read` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string The ID of the pipeline to retrieve. ### Response * [200](https://docs.datadoghq.com/api/latest/observability-pipelines/#GetPipeline-200-v2) * [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#GetPipeline-403-v2) * [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#GetPipeline-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Top-level schema representing a pipeline. Expand All Field Type Description _required_] object Contains the pipeline’s ID, type, and configuration attributes. _required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). _required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. _required_] [ ] A list of destination components where processed logs are sent. 
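For reference, a minimal `destinations` array with a single `datadog_logs` destination, matching the request example earlier on this page, looks like the following sketch; each supported destination type is described below.

```
{
  "destinations": [
    {
      "id": "datadog-logs-destination",
      "inputs": ["my-processor-group"],
      "type": "datadog_logs"
    }
  ]
}
```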
object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. 
If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. 
object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. 
Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. 
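A sketch of an `amazon_security_lake` destination entry, using the fields defined immediately below; all values are placeholders, and the AWS authentication and TLS objects are omitted.

```
{
  "id": "security-lake-destination",
  "type": "amazon_security_lake",
  "inputs": ["my-processor-group"],
  "bucket": "my-security-lake-bucket",
  "region": "us-east-1",
  "custom_source_name": "my-custom-source"
}
```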
object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. 
include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. _required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of static fields (key-value pairs) that is added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. 
_required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field, that is received from the source, should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. 
display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. 
direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. 
The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. 
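For orientation, here is a minimal, illustrative entry for the `dedupe` processor described above, using only the fields listed in this reference; the ID, query, and field paths are placeholder values:

```
{
  "id": "dedupe-processor",
  "type": "dedupe",
  "include": "service:my-service",
  "enabled": true,
  "fields": ["host", "env"],
  "mode": "match"
}
```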
enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 The number of events allowed in a given time window. Events sent after the threshold has been reached are dropped. type [_required_] enum The processor type. The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode.
Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. 
Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services.
ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
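As a concrete reference, a minimal sketch of the `http_server` source described above, using only the fields listed in this reference; the ID and certificate paths are placeholders, and the `tls` object can be omitted when TLS is not needed:

```
{
  "id": "http-server-source",
  "type": "http_server",
  "auth_strategy": "plain",
  "decoding": "json",
  "tls": {
    "crt_file": "/path/to/cert.crt",
    "key_file": "/path/to/cert.key"
  }
}
```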
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
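For comparison, a minimal sketch of the `http_client` source described above, again limited to the fields listed in this reference; values are placeholders, and `scrape_interval_secs` and `scrape_timeout_secs` are optional:

```
{
  "id": "http-client-source",
  "type": "http_client",
  "auth_strategy": "bearer",
  "decoding": "json",
  "scrape_interval_secs": 60,
  "scrape_timeout_secs": 10
}
```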
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. id [_required_] string Unique identifier for the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. default: `pipelines` ``` { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "filter-processor" ], "type": "datadog_logs" } ], "processors": [ { "display_name": "my component", "enabled": true, "id": "grouped-processors", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ [] ] } ], "sources": [ { "group_id": "consumer-group-0", "id": "kafka-source", "librdkafka_options": [ { "name": "fetch.message.max.bytes", "value": "1048576" } ], "sasl": { "mechanism": "string" }, "tls": { "ca_file": "string", "crt_file": "/path/to/cert.crt", "key_file": "string" }, "topics": [ "topic1", "topic2" ], "type": "kafka" } ] }, "name": "Main Observability Pipeline" }, "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6", "type": "pipelines" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript) ##### Get a specific pipeline Copy ``` # Path parameters export pipeline_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a specific pipeline ``` """ Get a specific pipeline returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi # there is a valid "pipeline" in the system PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_pipeline"] = True with ApiClient(configuration) as api_client: api_instance = ObservabilityPipelinesApi(api_client) response = api_instance.get_pipeline( pipeline_id=PIPELINE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a specific pipeline ``` # Get a specific pipeline returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_pipeline".to_sym] = true end api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new # there is a valid "pipeline" in the system PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"] p api_instance.get_pipeline(PIPELINE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a specific pipeline ``` // Get a specific pipeline returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "pipeline" in the system PipelineDataID := os.Getenv("PIPELINE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetPipeline", true) apiClient := datadog.NewAPIClient(configuration) api :=
datadogV2.NewObservabilityPipelinesApi(apiClient) resp, r, err := api.GetPipeline(ctx, PipelineDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.GetPipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.GetPipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a specific pipeline ``` // Get a specific pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ObservabilityPipelinesApi; import com.datadog.api.client.v2.model.ObservabilityPipeline; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getPipeline", true); ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient); // there is a valid "pipeline" in the system String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID"); try { ObservabilityPipeline result = apiInstance.getPipeline(PIPELINE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ObservabilityPipelinesApi#getPipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a specific pipeline ``` // Get a specific pipeline returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI; #[tokio::main] async fn main() { // there is a valid "pipeline" in the system let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetPipeline", true); let api = ObservabilityPipelinesAPI::with_config(configuration); let resp = api.get_pipeline(pipeline_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a specific pipeline ``` /** * Get a specific pipeline returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); 
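// Note: getPipeline is an unstable (Preview) operation in this client, so it must be enabled on the configuration before it can be called.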
configuration.unstableOperations["v2.getPipeline"] = true; const apiInstance = new v2.ObservabilityPipelinesApi(configuration); // there is a valid "pipeline" in the system const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string; const params: v2.ObservabilityPipelinesApiGetPipelineRequest = { pipelineId: PIPELINE_DATA_ID, }; apiInstance .getPipeline(params) .then((data: v2.ObservabilityPipeline) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines/#update-a-pipeline) * [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#update-a-pipeline-v2) **Note** : This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access. PUT https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id} ### Overview Update a pipeline. This endpoint requires the `observability_pipelines_deploy` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string The ID of the pipeline to update. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Expand All Field Type Description _required_] object Contains the pipeline’s ID, type, and configuration attributes. _required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). _required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. _required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). 
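To show how these components fit into the request, here is a minimal, illustrative body for this Update call that wires a `datadog_agent` source directly into the `datadog_logs` destination described above. The IDs are placeholders, processor groups are omitted for brevity, and whether such a minimal configuration is sufficient depends on your pipeline:

```
{
  "data": {
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines",
    "attributes": {
      "name": "Main Observability Pipeline",
      "config": {
        "sources": [
          { "id": "datadog-agent-source", "type": "datadog_agent" }
        ],
        "destinations": [
          {
            "id": "datadog-logs-destination",
            "inputs": ["datadog-agent-source"],
            "type": "datadog_logs"
          }
        ]
      }
    }
  }
}
```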
assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. 
Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. 
Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. 
Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. 
id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. 
enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. 
Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. [_required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. [_required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. [_required_] [object] A list of static fields (key-value pairs) that are added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. [_required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field received from the source should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).
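A minimal sketch of two of the processors described above, a `filter` followed by a `remove_fields` processor, as they might appear inside a processor group's `processors` array. The IDs, queries, and field names are illustrative placeholders.

```
[
  {
    "id": "error-filter",
    "type": "filter",
    "enabled": true,
    "include": "status:error"
  },
  {
    "id": "drop-debug-fields",
    "type": "remove_fields",
    "enabled": true,
    "include": "*",
    "fields": [
      "debug_payload",
      "internal_trace"
    ]
  }
]
```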
include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. [_required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. [_required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. [_required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule.
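A minimal sketch of the `sample` processor described above, keeping 10% of matching logs; the ID, query, and percentage are illustrative. Only one of `percentage` or `rate` would typically be set.

```
{
  "id": "sample-noisy-logs",
  "type": "sample",
  "enabled": true,
  "include": "service:web-frontend status:info",
  "percentage": 10
}
```

The `rate` form (for example, `"rate": 10`) samples 1 in N events instead of a percentage.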
rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. [_required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. [_required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` [_required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` [_required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` [_required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. [_required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching.
Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. [_required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` [_required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. [_required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. [_required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. [_required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. [_required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`.
Allowed enum values: `add_env_vars` default: `add_env_vars` [_required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. [_required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` [_required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. [_required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. [_required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply.
Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 The number of events allowed in a given time window. Events sent after the threshold has been reached are dropped. type [_required_] enum The processor type. The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the `custom_processor` processor. default: `*` [_required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` sources [_required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components).
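To make the shape of these processors concrete, here is a hedged sketch of a `dedupe` and a `throttle` processor as array entries; the IDs, queries, field paths, and numeric limits are placeholders.

```
[
  {
    "id": "dedupe-host-tags",
    "type": "dedupe",
    "enabled": true,
    "include": "*",
    "fields": [
      "host",
      "hostname"
    ],
    "mode": "match"
  },
  {
    "id": "throttle-chatty-service",
    "type": "throttle",
    "enabled": true,
    "include": "service:batch-worker",
    "group_by": [
      "service"
    ],
    "threshold": 1000,
    "window": 60
  }
]
```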
[object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
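A minimal sketch of the `kafka` source described above. The group ID and topic names are placeholders, and the optional SASL, TLS, and `librdkafka` option blocks are left out because their key names are not shown in this reference.

```
{
  "id": "kafka-source",
  "type": "kafka",
  "group_id": "observability-pipelines-consumer",
  "topics": [
    "app-logs",
    "audit-logs"
  ]
}
```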
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`.
Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. 
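As a sketch of two of the sources described above, the following array shows an `http_server` source and an `rsyslog` source. The IDs and settings are illustrative, and the optional TLS block is omitted.

```
[
  {
    "id": "http-intake",
    "type": "http_server",
    "auth_strategy": "plain",
    "decoding": "json"
  },
  {
    "id": "rsyslog-intake",
    "type": "rsyslog",
    "mode": "tcp"
  }
]
```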
object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
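A hedged sketch of the `google_pubsub` source described above. The project and subscription names are placeholders, and the GCP credentials and TLS objects from the schema are omitted because their key names are not listed in this reference.

```
{
  "id": "pubsub-source",
  "type": "google_pubsub",
  "project": "my-gcp-project",
  "subscription": "logs-subscription",
  "decoding": "json"
}
```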
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. [_required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. id [_required_] string Unique identifier for the pipeline. type [_required_] string The resource type identifier.
For pipeline resources, this should always be set to `pipelines`. default: `pipelines`

```
{
  "data": {
    "attributes": {
      "config": {
        "destinations": [
          {
            "id": "updated-datadog-logs-destination-id",
            "inputs": [
              "my-processor-group"
            ],
            "type": "datadog_logs"
          }
        ],
        "processors": [
          {
            "enabled": true,
            "id": "my-processor-group",
            "include": "service:my-service",
            "inputs": [
              "datadog-agent-source"
            ],
            "processors": [
              {
                "enabled": true,
                "id": "filter-processor",
                "include": "status:error",
                "type": "filter"
              }
            ]
          }
        ],
        "sources": [
          {
            "id": "datadog-agent-source",
            "type": "datadog_agent"
          }
        ]
      },
      "name": "Updated Pipeline Name"
    },
    "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
    "type": "pipelines"
  }
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-200-v2)
* [400](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-400-v2)
* [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-403-v2)
* [404](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-404-v2)
* [409](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-409-v2)
* [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#UpdatePipeline-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/)
* [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/)

Top-level schema representing a pipeline.

Field Type Description data [_required_] object Contains the pipeline’s ID, type, and configuration attributes. attributes [_required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). config [_required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. destinations [_required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services.
ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. 
Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. 
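To illustrate the destinations described above, here is a sketch of an `elasticsearch` and an `azure_storage` destination. The IDs, index and container names, prefixes, and input references are placeholders.

```
[
  {
    "id": "es-destination",
    "type": "elasticsearch",
    "inputs": [
      "my-processor-group"
    ],
    "api_version": "v8",
    "bulk_index": "pipeline-logs"
  },
  {
    "id": "azure-archive",
    "type": "azure_storage",
    "inputs": [
      "my-processor-group"
    ],
    "container_name": "archived-logs",
    "blob_prefix": "pipelines/"
  }
]
```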
inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. [_required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region. external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`.
Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` [_required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM.
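A minimal sketch of the `amazon_security_lake` destination described above. The bucket, region, custom source name, and input references are placeholders, and the optional AWS authentication and TLS blocks are omitted because their key names are not listed in this reference.

```
{
  "id": "security-lake-destination",
  "type": "amazon_security_lake",
  "inputs": [
    "my-processor-group"
  ],
  "bucket": "my-security-lake-bucket",
  "region": "us-east-1",
  "custom_source_name": "datadog-pipeline-logs"
}
```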
object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. processors [_required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components).
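A hedged sketch of the `crowdstrike_next_gen_siem` destination described above; the ID and inputs are placeholders, and the optional compression and TLS blocks are omitted because their key names are not shown in this reference.

```
{
  "id": "crowdstrike-siem-destination",
  "type": "crowdstrike_next_gen_siem",
  "inputs": [
    "my-processor-group"
  ],
  "encoding": "json"
}
```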
include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The `quota` processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and are sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. limit [_required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` overrides [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. fields [_required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. limit [_required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota`
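To make the quota fields above concrete, here is a minimal, illustrative sketch of a `quota` processor entry as it could appear inside a processor group's `processors` list. The ids, search queries, and limits are placeholders, and the nested object names (`limit`, `overrides`, `fields`) follow the field descriptions above rather than any additional source.

```python
import json

# Hypothetical quota processor entry: enforce a 10 GB daily quota per `env`,
# with a higher limit for `env:prod`; matching logs are dropped once the quota is met.
quota_processor = {
    "type": "quota",
    "id": "quota-processor",          # unique component id (placeholder)
    "enabled": True,
    "include": "service:payments",    # logs this processor targets (placeholder)
    "name": "payments-daily-quota",
    "drop_events": True,
    "limit": {"enforce": "bytes", "limit": 10_000_000_000},
    "partition_fields": ["env"],      # track the quota per unique `env` value
    "overrides": [
        {
            "fields": [{"name": "env", "value": "prod"}],
            "limit": {"enforce": "bytes", "limit": 50_000_000_000},
        }
    ],
}

print(json.dumps(quota_processor, indent=2))
```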
object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [object] A list of static fields (key-value pairs) that are added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. preserve_source [_required_] boolean Indicates whether the original field received from the source should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. metrics [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series.
include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). 
include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. 
Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. 
Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 the number of events allowed in a given time window. Events sent after the threshold has been reached, are dropped. type [_required_] enum The processor type. 
The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. 
Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. Allowed enum values: `splunk_tcp` default: `splunk_tcp` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. 
Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. 
key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. 
key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. id [_required_] string Unique identifier for the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. 
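The schema above ties sources, processor groups, and destinations together purely through their `id` and `inputs` fields. As an illustrative, non-normative sketch (not part of any Datadog client library), the snippet below checks that wiring for a minimal config shaped like the model above; every id and query is a placeholder.

```python
# Minimal sketch: verify that every `inputs` reference in a pipeline config
# points at a component id that is actually declared.
def undefined_inputs(config: dict) -> set:
    declared = {
        component["id"]
        for key in ("sources", "processors", "destinations")
        for component in config.get(key, [])
    }
    referenced = {
        ref
        for key in ("processors", "destinations")
        for component in config.get(key, [])
        for ref in component.get("inputs", [])
    }
    return referenced - declared


config = {
    "sources": [{"id": "datadog-agent-source", "type": "datadog_agent"}],
    "processors": [
        {
            "id": "my-processor-group",
            "enabled": True,
            "include": "service:my-service",
            "inputs": ["datadog-agent-source"],
            "processors": [],
        }
    ],
    "destinations": [
        {"id": "datadog-logs-destination", "inputs": ["my-processor-group"], "type": "datadog_logs"}
    ],
}

print(undefined_inputs(config))  # -> set(), i.e. the wiring is consistent
```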
default: `pipelines` ``` { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "filter-processor" ], "type": "datadog_logs" } ], "processors": [ { "display_name": "my component", "enabled": true, "id": "grouped-processors", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ [] ] } ], "sources": [ { "group_id": "consumer-group-0", "id": "kafka-source", "librdkafka_options": [ { "name": "fetch.message.max.bytes", "value": "1048576" } ], "sasl": { "mechanism": "string" }, "tls": { "ca_file": "string", "crt_file": "/path/to/cert.crt", "key_file": "string" }, "topics": [ "topic1", "topic2" ], "type": "kafka" } ] }, "name": "Main Observability Pipeline" }, "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6", "type": "pipelines" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript) ##### Update a pipeline returns "OK" response Copy ``` # Path parameters export pipeline_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config": { "destinations": [ { "id": "updated-datadog-logs-destination-id", "inputs": [ "my-processor-group" ], "type": "datadog_logs" } ], "processors": [ { "enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ { "enabled": true, "id": "filter-processor", "include": "status:error", "type": "filter" } ] } ], "sources": [ { "id": "datadog-agent-source", "type": "datadog_agent" } ] }, "name": "Updated Pipeline Name" }, "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6", "type": "pipelines" } } EOF ``` ##### Update a pipeline returns "OK" response ``` // Update a pipeline returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "pipeline" in the system PipelineDataID := os.Getenv("PIPELINE_DATA_ID") body := datadogV2.ObservabilityPipeline{ Data: datadogV2.ObservabilityPipelineData{ Attributes: datadogV2.ObservabilityPipelineDataAttributes{ Config: datadogV2.ObservabilityPipelineConfig{ Destinations: []datadogV2.ObservabilityPipelineConfigDestinationItem{ datadogV2.ObservabilityPipelineConfigDestinationItem{ ObservabilityPipelineDatadogLogsDestination: &datadogV2.ObservabilityPipelineDatadogLogsDestination{ Id: "updated-datadog-logs-destination-id", Inputs: []string{ "my-processor-group", }, Type: datadogV2.OBSERVABILITYPIPELINEDATADOGLOGSDESTINATIONTYPE_DATADOG_LOGS, }}, }, Processors: []datadogV2.ObservabilityPipelineConfigProcessorGroup{ { Enabled: true, Id: "my-processor-group", Include: "service:my-service", Inputs: []string{ "datadog-agent-source", }, Processors: []datadogV2.ObservabilityPipelineConfigProcessorItem{ datadogV2.ObservabilityPipelineConfigProcessorItem{ ObservabilityPipelineFilterProcessor: &datadogV2.ObservabilityPipelineFilterProcessor{ Enabled: true, Id: "filter-processor", Include: "status:error", Type: datadogV2.OBSERVABILITYPIPELINEFILTERPROCESSORTYPE_FILTER, }}, }, }, }, Sources: []datadogV2.ObservabilityPipelineConfigSourceItem{ datadogV2.ObservabilityPipelineConfigSourceItem{ ObservabilityPipelineDatadogAgentSource: &datadogV2.ObservabilityPipelineDatadogAgentSource{ 
Id: "datadog-agent-source", Type: datadogV2.OBSERVABILITYPIPELINEDATADOGAGENTSOURCETYPE_DATADOG_AGENT, }}, }, }, Name: "Updated Pipeline Name", }, Id: PipelineDataID, Type: "pipelines", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdatePipeline", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewObservabilityPipelinesApi(apiClient) resp, r, err := api.UpdatePipeline(ctx, PipelineDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.UpdatePipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.UpdatePipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a pipeline returns "OK" response ``` // Update a pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ObservabilityPipelinesApi; import com.datadog.api.client.v2.model.ObservabilityPipeline; import com.datadog.api.client.v2.model.ObservabilityPipelineConfig; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigDestinationItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorGroup; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigSourceItem; import com.datadog.api.client.v2.model.ObservabilityPipelineData; import com.datadog.api.client.v2.model.ObservabilityPipelineDataAttributes; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSource; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSourceType; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestination; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestinationType; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessor; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessorType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updatePipeline", true); ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient); // there is a valid "pipeline" in the system String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID"); ObservabilityPipeline body = new ObservabilityPipeline() .data( new ObservabilityPipelineData() .attributes( new ObservabilityPipelineDataAttributes() .config( new ObservabilityPipelineConfig() .destinations( Collections.singletonList( new ObservabilityPipelineConfigDestinationItem( new ObservabilityPipelineDatadogLogsDestination() .id("updated-datadog-logs-destination-id") .inputs( Collections.singletonList( "my-processor-group")) .type( ObservabilityPipelineDatadogLogsDestinationType .DATADOG_LOGS)))) .processors( Collections.singletonList( new 
ObservabilityPipelineConfigProcessorGroup() .enabled(true) .id("my-processor-group") .include("service:my-service") .inputs( Collections.singletonList( "datadog-agent-source")) .processors( Collections.singletonList( new ObservabilityPipelineConfigProcessorItem( new ObservabilityPipelineFilterProcessor() .enabled(true) .id("filter-processor") .include("status:error") .type( ObservabilityPipelineFilterProcessorType .FILTER)))))) .sources( Collections.singletonList( new ObservabilityPipelineConfigSourceItem( new ObservabilityPipelineDatadogAgentSource() .id("datadog-agent-source") .type( ObservabilityPipelineDatadogAgentSourceType .DATADOG_AGENT))))) .name("Updated Pipeline Name")) .id(PIPELINE_DATA_ID) .type("pipelines")); try { ObservabilityPipeline result = apiInstance.updatePipeline(PIPELINE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ObservabilityPipelinesApi#updatePipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a pipeline returns "OK" response ``` """ Update a pipeline returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi from datadog_api_client.v2.model.observability_pipeline import ObservabilityPipeline from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig from datadog_api_client.v2.model.observability_pipeline_config_processor_group import ( ObservabilityPipelineConfigProcessorGroup, ) from datadog_api_client.v2.model.observability_pipeline_data import ObservabilityPipelineData from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source import ( ObservabilityPipelineDatadogAgentSource, ) from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source_type import ( ObservabilityPipelineDatadogAgentSourceType, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import ( ObservabilityPipelineDatadogLogsDestination, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import ( ObservabilityPipelineDatadogLogsDestinationType, ) from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import ( ObservabilityPipelineFilterProcessorType, ) # there is a valid "pipeline" in the system PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"] body = ObservabilityPipeline( data=ObservabilityPipelineData( attributes=ObservabilityPipelineDataAttributes( config=ObservabilityPipelineConfig( destinations=[ ObservabilityPipelineDatadogLogsDestination( id="updated-datadog-logs-destination-id", inputs=[ "my-processor-group", ], 
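# `inputs` points at the processor group id declared in `processors` below,
# so logs flow from that group into this destination.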
type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS, ), ], processors=[ ObservabilityPipelineConfigProcessorGroup( enabled=True, id="my-processor-group", include="service:my-service", inputs=[ "datadog-agent-source", ], processors=[ ObservabilityPipelineFilterProcessor( enabled=True, id="filter-processor", include="status:error", type=ObservabilityPipelineFilterProcessorType.FILTER, ), ], ), ], sources=[ ObservabilityPipelineDatadogAgentSource( id="datadog-agent-source", type=ObservabilityPipelineDatadogAgentSourceType.DATADOG_AGENT, ), ], ), name="Updated Pipeline Name", ), id=PIPELINE_DATA_ID, type="pipelines", ), ) configuration = Configuration() configuration.unstable_operations["update_pipeline"] = True with ApiClient(configuration) as api_client: api_instance = ObservabilityPipelinesApi(api_client) response = api_instance.update_pipeline(pipeline_id=PIPELINE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a pipeline returns "OK" response ``` # Update a pipeline returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_pipeline".to_sym] = true end api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new # there is a valid "pipeline" in the system PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"] body = DatadogAPIClient::V2::ObservabilityPipeline.new({ data: DatadogAPIClient::V2::ObservabilityPipelineData.new({ attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({ config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({ destinations: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({ id: "updated-datadog-logs-destination-id", inputs: [ "my-processor-group", ], type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, }), ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineConfigProcessorGroup.new({ enabled: true, id: "my-processor-group", include: "service:my-service", inputs: [ "datadog-agent-source", ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({ enabled: true, id: "filter-processor", include: "status:error", type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER, }), ], }), ], sources: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSource.new({ id: "datadog-agent-source", type: DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, }), ], }), name: "Updated Pipeline Name", }), id: PIPELINE_DATA_ID, type: "pipelines", }), }) p api_instance.update_pipeline(PIPELINE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a pipeline returns "OK" response ``` // Update a pipeline returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI; use datadog_api_client::datadogV2::model::ObservabilityPipeline; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfig; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigDestinationItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorGroup; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigSourceItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineData; use datadog_api_client::datadogV2::model::ObservabilityPipelineDataAttributes; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSource; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSourceType; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestination; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestinationType; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessor; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessorType; #[tokio::main] async fn main() { // there is a valid "pipeline" in the system let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap(); let body = ObservabilityPipeline::new( ObservabilityPipelineData::new( ObservabilityPipelineDataAttributes::new( ObservabilityPipelineConfig::new( vec![ ObservabilityPipelineConfigDestinationItem::ObservabilityPipelineDatadogLogsDestination( Box::new( ObservabilityPipelineDatadogLogsDestination::new( "updated-datadog-logs-destination-id".to_string(), vec!["my-processor-group".to_string()], ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, ), ), ) ], vec![ ObservabilityPipelineConfigSourceItem::ObservabilityPipelineDatadogAgentSource( Box::new( ObservabilityPipelineDatadogAgentSource::new( "datadog-agent-source".to_string(), ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, ), ), ) ], ).processors( vec![ ObservabilityPipelineConfigProcessorGroup::new( true, "my-processor-group".to_string(), "service:my-service".to_string(), vec!["datadog-agent-source".to_string()], vec![ ObservabilityPipelineConfigProcessorItem::ObservabilityPipelineFilterProcessor( Box::new( ObservabilityPipelineFilterProcessor::new( true, "filter-processor".to_string(), "status:error".to_string(), ObservabilityPipelineFilterProcessorType::FILTER, ), ), ) ], ) ], ), "Updated Pipeline Name".to_string(), ), pipeline_data_id.clone(), "pipelines".to_string(), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdatePipeline", true); let api = ObservabilityPipelinesAPI::with_config(configuration); let resp = api.update_pipeline(pipeline_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a pipeline returns "OK" response ``` /** * Update a pipeline returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
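// updatePipeline is gated as an unstable operation in this client (the
// Observability Pipelines endpoints are in Preview); the assignment just
// below enables it before the API call is made.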
client.createConfiguration(); configuration.unstableOperations["v2.updatePipeline"] = true; const apiInstance = new v2.ObservabilityPipelinesApi(configuration); // there is a valid "pipeline" in the system const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string; const params: v2.ObservabilityPipelinesApiUpdatePipelineRequest = { body: { data: { attributes: { config: { destinations: [ { id: "updated-datadog-logs-destination-id", inputs: ["my-processor-group"], type: "datadog_logs", }, ], processors: [ { enabled: true, id: "my-processor-group", include: "service:my-service", inputs: ["datadog-agent-source"], processors: [ { enabled: true, id: "filter-processor", include: "status:error", type: "filter", }, ], }, ], sources: [ { id: "datadog-agent-source", type: "datadog_agent", }, ], }, name: "Updated Pipeline Name", }, id: PIPELINE_DATA_ID, type: "pipelines", }, }, pipelineId: PIPELINE_DATA_ID, }; apiInstance .updatePipeline(params) .then((data: v2.ObservabilityPipeline) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines/#delete-a-pipeline) * [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#delete-a-pipeline-v2) **Note** : This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access. DELETE https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id}https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/{pipeline_id} ### Overview Delete a pipeline. This endpoint requires the `observability_pipelines_delete` permission. ### Arguments #### Path Parameters Name Type Description pipeline_id [_required_] string The ID of the pipeline to delete. ### Response * [204](https://docs.datadoghq.com/api/latest/observability-pipelines/#DeletePipeline-204-v2) * [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#DeletePipeline-403-v2) * [404](https://docs.datadoghq.com/api/latest/observability-pipelines/#DeletePipeline-404-v2) * [409](https://docs.datadoghq.com/api/latest/observability-pipelines/#DeletePipeline-409-v2) * [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#DeletePipeline-429-v2) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```
Not Found * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{
  "errors": [
    "Bad Request"
  ]
}
```
Conflict * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{
  "errors": [
    "Bad Request"
  ]
}
```
Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{
  "errors": [
    "Bad Request"
  ]
}
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript)
##### Delete a pipeline
```
# Path parameters
export pipeline_id="CHANGE_ME"
# Curl command
curl -X DELETE "https://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/${pipeline_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
##### Delete a pipeline
```
"""
Delete a pipeline returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = environ["PIPELINE_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["delete_pipeline"] = True
with ApiClient(configuration) as api_client:
    api_instance = ObservabilityPipelinesApi(api_client)
    api_instance.delete_pipeline(
        pipeline_id=PIPELINE_DATA_ID,
    )
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Delete a pipeline
```
# Delete a pipeline returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.delete_pipeline".to_sym] = true
end
api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new

# there is a valid "pipeline" in the system
PIPELINE_DATA_ID = ENV["PIPELINE_DATA_ID"]
api_instance.delete_pipeline(PIPELINE_DATA_ID)
```
#### Instructions
First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Delete a pipeline
```
// Delete a pipeline returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	// there is a valid "pipeline" in the system
	PipelineDataID := os.Getenv("PIPELINE_DATA_ID")

	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.DeletePipeline", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewObservabilityPipelinesApi(apiClient)
	r, err := api.DeletePipeline(ctx, PipelineDataID)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.DeletePipeline`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Delete a pipeline
```
// Delete a pipeline returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.ObservabilityPipelinesApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.deletePipeline", true);
    ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient);

    // there is a valid "pipeline" in the system
    String PIPELINE_DATA_ID = System.getenv("PIPELINE_DATA_ID");

    try {
      apiInstance.deletePipeline(PIPELINE_DATA_ID);
    } catch (ApiException e) {
      System.err.println("Exception when calling ObservabilityPipelinesApi#deletePipeline");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
##### Delete a pipeline
```
// Delete a pipeline returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI;

#[tokio::main]
async fn main() {
    // there is a valid "pipeline" in the system
    let pipeline_data_id = std::env::var("PIPELINE_DATA_ID").unwrap();
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.DeletePipeline", true);
    let api = ObservabilityPipelinesAPI::with_config(configuration);
    let resp = api.delete_pipeline(pipeline_data_id.clone()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
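// A successful delete returns HTTP 204 with no response body, so the Ok value
// printed above carries no payload.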
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Delete a pipeline
```
/**
 * Delete a pipeline returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.deletePipeline"] = true;
const apiInstance = new v2.ObservabilityPipelinesApi(configuration);

// there is a valid "pipeline" in the system
const PIPELINE_DATA_ID = process.env.PIPELINE_DATA_ID as string;

const params: v2.ObservabilityPipelinesApiDeletePipelineRequest = {
  pipelineId: PIPELINE_DATA_ID,
};

apiInstance
  .deletePipeline(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
* * *
## [Validate an observability pipeline](https://docs.datadoghq.com/api/latest/observability-pipelines/#validate-an-observability-pipeline) * [v2 (latest)](https://docs.datadoghq.com/api/latest/observability-pipelines/#validate-an-observability-pipeline-v2) **Note** : This endpoint is in Preview. Fill out this [form](https://www.datadoghq.com/product-preview/observability-pipelines-api-and-terraform-support/) to request access. POST https://api.ap1.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.ap2.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.datadoghq.eu/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.ddog-gov.com/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.us3.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validatehttps://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validate
### Overview
Validates a pipeline configuration without creating or updating any resources. Returns a list of validation errors, if any. This endpoint requires the `observability_pipelines_read` permission.
### Request
#### Body Data (required)
* [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Field Type Description data [_required_] object Contains the pipeline configuration. attributes [_required_] object Defines the pipeline’s name and its components (sources, processors, and destinations). config [_required_] object Specifies the pipeline's configuration, including its sources, processors, and destinations. destinations [_required_] [ ] A list of destination components where processed logs are sent. object The `datadog_logs` destination forwards logs to Datadog Log Management. id [_required_] string The unique identifier for this component.
inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `datadog_logs`. Allowed enum values: `datadog_logs` default: `datadog_logs` object The `amazon_s3` destination sends your logs in Datadog-rehydratable format to an Amazon S3 bucket for archiving. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string S3 bucket name. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys. region [_required_] string AWS region of the S3 bucket. storage_class [_required_] enum S3 storage class. Allowed enum values: `STANDARD,REDUCED_REDUNDANCY,INTELLIGENT_TIERING,STANDARD_IA,EXPRESS_ONEZONE,ONEZONE_IA,GLACIER,GLACIER_IR,DEEP_ARCHIVE` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `google_cloud_storage` destination stores logs in a Google Cloud Storage (GCS) bucket. It requires a bucket name, GCP authentication, and metadata fields. acl enum Access control list setting for objects written to the bucket. Allowed enum values: `private,project-private,public-read,authenticated-read,bucket-owner-read,bucket-owner-full-control` object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. bucket [_required_] string Name of the GCS bucket. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. key_prefix string Optional prefix for object keys within the GCS bucket. [object] Custom metadata to attach to each object uploaded to the GCS bucket. name [_required_] string The metadata key. value [_required_] string The metadata value. storage_class [_required_] enum Storage class used for objects stored in GCS. Allowed enum values: `STANDARD,NEARLINE,COLDLINE,ARCHIVE` type [_required_] enum The destination type. Always `google_cloud_storage`. Allowed enum values: `google_cloud_storage` default: `google_cloud_storage` object The `splunk_hec` destination forwards logs to Splunk using the HTTP Event Collector (HEC). auto_extract_timestamp boolean If `true`, Splunk tries to extract timestamps from incoming log events. If `false`, Splunk assigns the time the event was received. encoding enum Encoding format for log events. 
Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). index string Optional name of the Splunk index where logs are written. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. sourcetype string The Splunk sourcetype to assign to log events. type [_required_] enum The destination type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `sumo_logic` destination forwards logs to Sumo Logic. encoding enum The output encoding format. Allowed enum values: `json,raw_message,logfmt` [object] A list of custom headers to include in the request to Sumo Logic. name [_required_] string The header field name. value [_required_] string The header field value. header_host_name string Optional override for the host name header. header_source_category string Optional override for the source category header. header_source_name string Optional override for the source name header. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `elasticsearch` destination writes logs to an Elasticsearch cluster. api_version enum The Elasticsearch API version to use. Set to `auto` to auto-detect. Allowed enum values: `auto,v6,v7,v8` bulk_index string The index to write logs to in Elasticsearch. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `elasticsearch`. Allowed enum values: `elasticsearch` default: `elasticsearch` object The `rsyslog` destination forwards logs to an external `rsyslog` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` destination forwards logs to an external `syslog-ng` server over TCP or UDP using the syslog protocol. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. keepalive int64 Optional socket keepalive duration in milliseconds. object Configuration for enabling TLS encryption between the pipeline component and external services. 
ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `azure_storage` destination forwards logs to an Azure Blob Storage container. blob_prefix string Optional prefix for blobs written to the container. container_name [_required_] string The name of the Azure Blob Storage container to store logs in. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `azure_storage`. Allowed enum values: `azure_storage` default: `azure_storage` object The `microsoft_sentinel` destination forwards logs to Microsoft Sentinel. client_id [_required_] string Azure AD client ID used for authentication. dcr_immutable_id [_required_] string The immutable ID of the Data Collection Rule (DCR). id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. table [_required_] string The name of the Log Analytics table where logs are sent. tenant_id [_required_] string Azure AD tenant ID. type [_required_] enum The destination type. The value should always be `microsoft_sentinel`. Allowed enum values: `microsoft_sentinel` default: `microsoft_sentinel` object The `google_chronicle` destination sends logs to Google Chronicle. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. customer_id [_required_] string The Google Chronicle customer ID. encoding enum The encoding format for the logs sent to Chronicle. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. log_type string The log type metadata associated with the Chronicle destination. type [_required_] enum The destination type. The value should always be `google_chronicle`. Allowed enum values: `google_chronicle` default: `google_chronicle` object The `new_relic` destination sends logs to the New Relic platform. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The New Relic region. Allowed enum values: `us,eu` type [_required_] enum The destination type. The value should always be `new_relic`. Allowed enum values: `new_relic` default: `new_relic` object The `sentinel_one` destination sends logs to SentinelOne. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] enum The SentinelOne region to send logs to. Allowed enum values: `us,eu,ca,data_set_us` type [_required_] enum The destination type. The value should always be `sentinel_one`. 
Allowed enum values: `sentinel_one` default: `sentinel_one` object The `opensearch` destination writes logs to an OpenSearch cluster. bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `opensearch`. Allowed enum values: `opensearch` default: `opensearch` object The `amazon_opensearch` destination writes logs to Amazon OpenSearch. _required_] object Authentication settings for the Amazon OpenSearch destination. The `strategy` field determines whether basic or AWS-based authentication is used. assume_role string The ARN of the role to assume (used with `aws` strategy). aws_region string AWS region external_id string External ID for the assumed role (used with `aws` strategy). session_name string Session name for the assumed role (used with `aws` strategy). strategy [_required_] enum The authentication strategy to use. Allowed enum values: `basic,aws` bulk_index string The index to write logs to. id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. type [_required_] enum The destination type. The value should always be `amazon_opensearch`. Allowed enum values: `amazon_opensearch` default: `amazon_opensearch` object The `socket` destination sends logs over TCP or UDP to a remote server. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` _required_] Framing method configuration. object Each log event is delimited by a newline character. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingNewlineDelimitedMethod` object. Allowed enum values: `newline_delimited` object Event data is not delimited at all. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingBytesMethod` object. Allowed enum values: `bytes` object Each log event is separated using the specified delimiter character. delimiter [_required_] string A single ASCII character used as a delimiter. method [_required_] enum The definition of `ObservabilityPipelineSocketDestinationFramingCharacterDelimitedMethod` object. Allowed enum values: `character_delimited` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. mode [_required_] enum Protocol used to send logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` object The `amazon_security_lake` destination sends your logs to Amazon Security Lake. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). 
assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. bucket [_required_] string Name of the Amazon S3 bucket in Security Lake (3-63 characters). custom_source_name [_required_] string Custom source name for the logs in Security Lake. id [_required_] string Unique identifier for the destination component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. region [_required_] string AWS region of the S3 bucket. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. Always `amazon_security_lake`. Allowed enum values: `amazon_security_lake` default: `amazon_security_lake` object The `crowdstrike_next_gen_siem` destination forwards logs to CrowdStrike Next Gen SIEM. object Compression configuration for log events. algorithm [_required_] enum Compression algorithm for log events. Allowed enum values: `gzip,zlib` level int64 Compression level. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The destination type. The value should always be `crowdstrike_next_gen_siem`. Allowed enum values: `crowdstrike_next_gen_siem` default: `crowdstrike_next_gen_siem` object The `google_pubsub` destination publishes logs to a Google Cloud Pub/Sub topic. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. encoding [_required_] enum Encoding format for log events. Allowed enum values: `json,raw_message` id [_required_] string The unique identifier for this component. inputs [_required_] [string] A list of component IDs whose output is used as the `input` for this component. project [_required_] string The GCP project ID that owns the Pub/Sub topic. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. 
Used for mutual TLS authentication. topic [_required_] string The Pub/Sub topic name to publish logs to. type [_required_] enum The destination type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` [object] A list of processor groups that transform or enrich log data. display_name string The display name for a component. enabled [_required_] boolean Whether this processor group is enabled. id [_required_] string The unique identifier for the processor group. include [_required_] string Conditional expression for when this processor group should execute. inputs [_required_] [string] A list of IDs for components whose output is used as the input for this processor group. _required_] [ ] Processors applied sequentially within this group. Events flow through each processor in order. object The `filter` processor allows conditional processing of logs based on a Datadog search query. Logs that match the `include` query are passed through; others are discarded. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs should pass through the filter. Logs that match this query continue to downstream components; others are dropped. type [_required_] enum The processor type. The value should always be `filter`. Allowed enum values: `filter` default: `filter` object The `parse_json` processor extracts JSON from a specified field and flattens it into the event. This is useful when logs contain embedded JSON as a string. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. field [_required_] string The name of the log field that contains a JSON string. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `parse_json`. Allowed enum values: `parse_json` default: `parse_json` object The Quota Processor measures logging traffic for logs that match a specified filter. When the configured daily quota is met, the processor can drop or alert. display_name string The display name for a component. drop_events boolean If set to `true`, logs that matched the quota filter and sent after the quota has been met are dropped; only logs that did not match the filter query continue through the pipeline. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). ignore_when_missing_partitions boolean If `true`, the processor skips quota checks when partition fields are missing from the logs. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. 
enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. name [_required_] string Name of the quota. overflow_action enum The action to take when the quota is exceeded. Options: * `drop`: Drop the event. * `no_action`: Let the event pass through. * `overflow_routing`: Route to an overflow destination. Allowed enum values: `drop,no_action,overflow_routing` [object] A list of alternate quota rules that apply to specific sets of events, identified by matching field values. Each override can define a custom limit. _required_] [object] A list of field matchers used to apply a specific override. If an event matches all listed key-value pairs, the corresponding override limit is enforced. name [_required_] string The field name. value [_required_] string The field value. _required_] object The maximum amount of data or number of events allowed before the quota is enforced. Can be specified in bytes or events. enforce [_required_] enum Unit for quota enforcement in bytes for data size or events for count. Allowed enum values: `bytes,events` limit [_required_] int64 The limit for quota enforcement. partition_fields [string] A list of fields used to segment log traffic for quota enforcement. Quotas are tracked independently by unique combinations of these field values. type [_required_] enum The processor type. The value should always be `quota`. Allowed enum values: `quota` default: `quota` object The `add_fields` processor adds static key-value fields to logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of static fields (key-value pairs) that is added to each log event processed by this component. name [_required_] string The field name. value [_required_] string The field value. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_fields`. Allowed enum values: `add_fields` default: `add_fields` object The `remove_fields` processor deletes specified fields from logs. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of field names to be removed from each log event. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `remove_fields`. Allowed enum values: `remove_fields` default: `remove_fields` object The `rename_fields` processor changes field names. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. _required_] [object] A list of rename rules specifying which fields to rename in the event, what to rename them to, and whether to preserve the original fields. destination [_required_] string The field name to assign the renamed value to. 
preserve_source [_required_] boolean Indicates whether the original field, that is received from the source, should be kept (`true`) or removed (`false`) after renaming. source [_required_] string The original field name in the log event that should be renamed. id [_required_] string A unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `rename_fields`. Allowed enum values: `rename_fields` default: `rename_fields` object The `generate_datadog_metrics` processor creates custom metrics from logs and sends them to Datadog. Metrics can be counters, gauges, or distributions and optionally grouped by log fields. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include string A Datadog search query used to determine which logs this processor targets. [object] Configuration for generating individual metrics. group_by [string] Optional fields used to group the metric series. include [_required_] string Datadog filter query to match logs for metric generation. metric_type [_required_] enum Type of metric to create. Allowed enum values: `count,gauge,distribution` name [_required_] string Name of the custom metric to be created. _required_] Specifies how the value of the generated metric is computed. object Strategy that increments a generated metric by one for each matching event. strategy [_required_] enum Increments the metric by 1 for each matching event. Allowed enum values: `increment_by_one` object Strategy that increments a generated metric based on the value of a log field. field [_required_] string Name of the log field containing the numeric value to increment the metric by. strategy [_required_] enum Uses a numeric field in the log event as the metric increment. Allowed enum values: `increment_by_field` type [_required_] enum The processor type. Always `generate_datadog_metrics`. Allowed enum values: `generate_datadog_metrics` default: `generate_datadog_metrics` object The `sample` processor allows probabilistic sampling of logs at a fixed rate. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. percentage double The percentage of logs to sample. rate int64 Number of events to sample (1 in N). type [_required_] enum The processor type. The value should always be `sample`. Allowed enum values: `sample` default: `sample` object The `parse_grok` processor extracts structured fields from unstructured log messages using Grok patterns. disable_library_rules boolean If set to `true`, disables the default Grok rules provided by Datadog. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string A unique identifier for this processor. 
include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] The list of Grok parsing rules. If multiple matching rules are provided, they are evaluated in order. The first successful match is applied. _required_] [object] A list of Grok parsing rules that define how to extract fields from the source field. Each rule must contain a name and a valid Grok pattern. name [_required_] string The name of the rule. rule [_required_] string The definition of the Grok rule. source [_required_] string The name of the field in the log event to apply the Grok rules to. [object] A list of Grok helper rules that can be referenced by the parsing rules. name [_required_] string The name of the Grok helper rule. rule [_required_] string The definition of the Grok helper rule. type [_required_] enum The processor type. The value should always be `parse_grok`. Allowed enum values: `parse_grok` default: `parse_grok` object The `sensitive_data_scanner` processor detects and optionally redacts sensitive data in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of rules for identifying and acting on sensitive data patterns. object Configuration for keywords used to reinforce sensitive data pattern detection. keywords [_required_] [string] A list of keywords to match near the sensitive pattern. proximity [_required_] int64 Maximum number of tokens between a keyword and a sensitive value match. name [_required_] string A name identifying the rule. _required_] Defines what action to take when sensitive data is matched. object Configuration for completely redacting matched sensitive data. action [_required_] enum Action type that completely replaces the matched sensitive data with a fixed replacement string to remove all visibility. Allowed enum values: `redact` _required_] object Configuration for fully redacting sensitive data. replace [_required_] string The `ObservabilityPipelineSensitiveDataScannerProcessorActionRedactOptions` `replace`. object Configuration for hashing matched sensitive values. action [_required_] enum Action type that replaces the matched sensitive data with a hashed representation, preserving structure while securing content. Allowed enum values: `hash` options object The `ObservabilityPipelineSensitiveDataScannerProcessorActionHash` `options`. object Configuration for partially redacting matched sensitive data. action [_required_] enum Action type that redacts part of the sensitive data while preserving a configurable number of characters, typically used for masking purposes (e.g., show last 4 digits of a credit card). Allowed enum values: `partial_redact` _required_] object Controls how partial redaction is applied, including character count and direction. characters [_required_] int64 The `ObservabilityPipelineSensitiveDataScannerProcessorActionPartialRedactOptions` `characters`. direction [_required_] enum Indicates whether to redact characters from the first or last part of the matched value. 
Allowed enum values: `first,last` _required_] Pattern detection configuration for identifying sensitive data using either a custom regex or a library reference. object Defines a custom regex-based pattern for identifying sensitive data in logs. _required_] object Options for defining a custom regex pattern. rule [_required_] string A regular expression used to detect sensitive values. Must be a valid regex. type [_required_] enum Indicates a custom regular expression is used for matching. Allowed enum values: `custom` object Specifies a pattern from Datadog’s sensitive data detection library to match known sensitive data types. _required_] object Options for selecting a predefined library pattern and enabling keyword support. id [_required_] string Identifier for a predefined pattern from the sensitive data scanner pattern library. use_recommended_keywords boolean Whether to augment the pattern with recommended keywords (optional). type [_required_] enum Indicates that a predefined library pattern is used. Allowed enum values: `library` _required_] Determines which parts of the log the pattern-matching rule should be applied to. object Includes only specific fields for sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Applies the rule only to included fields. Allowed enum values: `include` object Excludes specific fields from sensitive data scanning. _required_] object Fields to which the scope rule applies. fields [_required_] [string] The `ObservabilityPipelineSensitiveDataScannerProcessorScopeOptions` `fields`. target [_required_] enum Excludes specific fields from processing. Allowed enum values: `exclude` object Applies scanning across all available fields. target [_required_] enum Applies the rule to all fields. Allowed enum values: `all` tags [_required_] [string] Tags assigned to this rule for filtering and classification. type [_required_] enum The processor type. The value should always be `sensitive_data_scanner`. Allowed enum values: `sensitive_data_scanner` default: `sensitive_data_scanner` object The `ocsf_mapper` processor transforms logs into the OCSF schema using a predefined mapping configuration. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] A list of mapping rules to convert events to the OCSF format. include [_required_] string A Datadog search query used to select the logs that this mapping should apply to. _required_] Defines a single mapping rule for transforming logs into the OCSF schema. Option 1 enum Predefined library mappings for common log formats. Allowed enum values: `CloudTrail Account Change,GCP Cloud Audit CreateBucket,GCP Cloud Audit CreateSink,GCP Cloud Audit SetIamPolicy,GCP Cloud Audit UpdateSink,Github Audit Log API Activity,Google Workspace Admin Audit addPrivilege,Microsoft 365 Defender Incident,Microsoft 365 Defender UserLoggedIn,Okta System Log Authentication,Palo Alto Networks Firewall Traffic` type [_required_] enum The processor type. The value should always be `ocsf_mapper`. 
Allowed enum values: `ocsf_mapper` default: `ocsf_mapper` object The `add_env_vars` processor adds environment variable values to log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this processor in the pipeline. include [_required_] string A Datadog search query used to determine which logs this processor targets. type [_required_] enum The processor type. The value should always be `add_env_vars`. Allowed enum values: `add_env_vars` default: `add_env_vars` _required_] [object] A list of environment variable mappings to apply to log fields. field [_required_] string The target field in the log event. name [_required_] string The name of the environment variable to read. object The `dedupe` processor removes duplicate fields in log events. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. fields [_required_] [string] A list of log field paths to check for duplicates. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. mode [_required_] enum The deduplication mode to apply to the fields. Allowed enum values: `match,ignore` type [_required_] enum The processor type. The value should always be `dedupe`. Allowed enum values: `dedupe` default: `dedupe` object The `enrichment_table` processor enriches logs using a static CSV file or GeoIP database. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. object Defines a static enrichment table loaded from a CSV file. _required_] object File encoding format. delimiter [_required_] string The `encoding` `delimiter`. includes_headers [_required_] boolean The `encoding` `includes_headers`. type [_required_] enum Specifies the encoding format (e.g., CSV) used for enrichment tables. Allowed enum values: `csv` _required_] [object] Key fields used to look up enrichment values. column [_required_] string The `items` `column`. comparison [_required_] enum Defines how to compare key fields for enrichment table lookups. Allowed enum values: `equals` field [_required_] string The `items` `field`. path [_required_] string Path to the CSV file. _required_] [object] Schema defining column names and their types. column [_required_] string The `items` `column`. type [_required_] enum Declares allowed data types for enrichment table columns. Allowed enum values: `string,boolean,integer,float,date,timestamp` object Uses a GeoIP database to enrich logs based on an IP field. key_field [_required_] string Path to the IP field in the log. locale [_required_] string Locale used to resolve geographical names. path [_required_] string Path to the GeoIP database file. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. target [_required_] string Path where enrichment results should be stored in the log. type [_required_] enum The processor type. The value should always be `enrichment_table`. Allowed enum values: `enrichment_table` default: `enrichment_table` object The `reduce` processor aggregates and merges logs based on matching keys and merge strategies. display_name string The display name for a component. 
enabled [_required_] boolean Whether this processor is enabled. group_by [_required_] [string] A list of fields used to group log events for merging. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. _required_] [object] List of merge strategies defining how values from grouped events should be combined. path [_required_] string The field path in the log event. strategy [_required_] enum The merge strategy to apply. Allowed enum values: `discard,retain,sum,max,min,array,concat,concat_newline,concat_raw,shortest_array,longest_array,flat_unique` type [_required_] enum The processor type. The value should always be `reduce`. Allowed enum values: `reduce` default: `reduce` object The `throttle` processor limits the number of events that pass through over a given time window. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. group_by [string] Optional list of fields used to group events before the threshold has been reached. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. threshold [_required_] int64 the number of events allowed in a given time window. Events sent after the threshold has been reached, are dropped. type [_required_] enum The processor type. The value should always be `throttle`. Allowed enum values: `throttle` default: `throttle` window [_required_] double The time window in seconds over which the threshold applies. object The `custom_processor` processor transforms events using [Vector Remap Language (VRL)](https://vector.dev/docs/reference/vrl/) scripts with advanced filtering capabilities. display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this processor. include [_required_] string A Datadog search query used to determine which logs this processor targets. This field should always be set to `*` for the custom_processor processor. default: `*` _required_] [object] Array of VRL remap rules. drop_on_error [_required_] boolean Whether to drop events that caused errors during processing. enabled boolean Whether this remap rule is enabled. include [_required_] string A Datadog search query used to filter events for this specific remap rule. name [_required_] string A descriptive name for this remap rule. source [_required_] string The VRL script source code that defines the processing logic. type [_required_] enum The processor type. The value should always be `custom_processor`. Allowed enum values: `custom_processor` default: `custom_processor` object The `datadog_tags` processor includes or excludes specific Datadog tags in your logs. action [_required_] enum The action to take on tags with matching keys. Allowed enum values: `include,exclude` display_name string The display name for a component. enabled [_required_] boolean Whether this processor is enabled. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). include [_required_] string A Datadog search query used to determine which logs this processor targets. keys [_required_] [string] A list of tag keys. mode [_required_] enum The processing mode. 
Allowed enum values: `filter` type [_required_] enum The processor type. The value should always be `datadog_tags`. Allowed enum values: `datadog_tags` default: `datadog_tags` _required_] [ ] A list of configured data sources for the pipeline. object The `kafka` source ingests data from Apache Kafka topics. group_id [_required_] string Consumer group ID used by the Kafka client. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). [object] Optional list of advanced Kafka client configuration options, defined as key-value pairs. name [_required_] string The name of the `librdkafka` configuration option to set. value [_required_] string The value assigned to the specified `librdkafka` configuration option. object Specifies the SASL mechanism for authenticating with a Kafka cluster. mechanism enum SASL mechanism used for Kafka authentication. Allowed enum values: `PLAIN,SCRAM-SHA-256,SCRAM-SHA-512` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. topics [_required_] [string] A list of Kafka topic names to subscribe to. The source ingests messages from each topic specified. type [_required_] enum The source type. The value should always be `kafka`. Allowed enum values: `kafka` default: `kafka` object The `datadog_agent` source collects logs from the Datadog Agent. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `datadog_agent`. Allowed enum values: `datadog_agent` default: `datadog_agent` object The `splunk_tcp` source receives logs from a Splunk Universal Forwarder over TCP. TLS is supported for secure transmission. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_tcp`. 
Allowed enum values: `splunk_hec` default: `splunk_hec` object The `splunk_hec` source implements the Splunk HTTP Event Collector (HEC) API. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `splunk_hec`. Allowed enum values: `splunk_hec` default: `splunk_hec` object The `amazon_s3` source ingests logs from an Amazon S3 bucket. It supports AWS authentication and TLS encryption. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). region [_required_] string AWS region where the S3 bucket resides. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. Always `amazon_s3`. Allowed enum values: `amazon_s3` default: `amazon_s3` object The `fluentd` source ingests logs from a Fluentd-compatible service. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluentd`. Allowed enum values: `fluentd` default: `fluentd` object The `fluent_bit` source ingests logs from Fluent Bit. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (for example, as the `input` to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services.
ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `fluent_bit`. Allowed enum values: `fluent_bit` default: `fluent_bit` object The `http_server` source collects logs over HTTP POST from external services. auth_strategy [_required_] enum HTTP authentication method. Allowed enum values: `none,plain` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string Unique ID for the HTTP server source. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_server`. Allowed enum values: `http_server` default: `http_server` object The `sumo_logic` source receives logs from Sumo Logic collectors. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). type [_required_] enum The source type. The value should always be `sumo_logic`. Allowed enum values: `sumo_logic` default: `sumo_logic` object The `rsyslog` source listens for logs over TCP or UDP from an `rsyslog` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `rsyslog`. Allowed enum values: `rsyslog` default: `rsyslog` object The `syslog_ng` source listens for logs over TCP or UDP from a `syslog-ng` server using the syslog protocol. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used by the syslog source to receive messages. Allowed enum values: `tcp,udp` object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `syslog_ng`. Allowed enum values: `syslog_ng` default: `syslog_ng` object The `amazon_data_firehose` source ingests logs from AWS Data Firehose. object AWS authentication credentials used for accessing AWS services such as S3. If omitted, the system’s default credentials are used (for example, the IAM role and environment variables). assume_role string The Amazon Resource Name (ARN) of the role to assume. external_id string A unique identifier for cross-account role assumption. session_name string A session identifier used for logging and tracing the assumed role session. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `amazon_data_firehose`. Allowed enum values: `amazon_data_firehose` default: `amazon_data_firehose` object The `google_pubsub` source ingests logs from a Google Cloud Pub/Sub subscription. object GCP credentials used to authenticate with Google Cloud Storage. credentials_file [_required_] string Path to the GCP service account key file. decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). project [_required_] string The GCP project ID that owns the Pub/Sub subscription. subscription [_required_] string The Pub/Sub subscription name from which messages are consumed. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `google_pubsub`. Allowed enum values: `google_pubsub` default: `google_pubsub` object The `http_client` source scrapes logs from HTTP endpoints at regular intervals. auth_strategy enum Optional authentication strategy for HTTP requests. Allowed enum values: `basic,bearer` decoding [_required_] enum The decoding format used to interpret incoming logs. Allowed enum values: `bytes,gelf,json,syslog` id [_required_] string The unique identifier for this component. 
Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). scrape_interval_secs int64 The interval (in seconds) between HTTP scrape requests. scrape_timeout_secs int64 The timeout (in seconds) for each scrape request. object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `http_client`. Allowed enum values: `http_client` default: `http_client` object The `logstash` source ingests logs from a Logstash forwarder. id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). object Configuration for enabling TLS encryption between the pipeline component and external services. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `logstash`. Allowed enum values: `logstash` default: `logstash` object The `socket` source ingests logs over TCP or UDP. _required_] Framing method configuration for the socket source. object Byte frames which are delimited by a newline character. method [_required_] enum Byte frames which are delimited by a newline character. Allowed enum values: `newline_delimited` object Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). method [_required_] enum Byte frames are passed through as-is according to the underlying I/O boundaries (for example, split between messages or stream segments). Allowed enum values: `bytes` object Byte frames which are delimited by a chosen character. delimiter [_required_] string A single ASCII character used to delimit events. method [_required_] enum Byte frames which are delimited by a chosen character. Allowed enum values: `character_delimited` object Byte frames according to the octet counting format as per RFC6587. method [_required_] enum Byte frames according to the octet counting format as per RFC6587. Allowed enum values: `octet_counting` object Byte frames which are chunked GELF messages. method [_required_] enum Byte frames which are chunked GELF messages. Allowed enum values: `chunked_gelf` id [_required_] string The unique identifier for this component. Used to reference this component in other parts of the pipeline (e.g., as input to downstream components). mode [_required_] enum Protocol used to receive logs. Allowed enum values: `tcp,udp` object TLS configuration. Relevant only when `mode` is `tcp`. ca_file string Path to the Certificate Authority (CA) file used to validate the server’s TLS certificate. 
crt_file [_required_] string Path to the TLS client certificate file used to authenticate the pipeline component with upstream or downstream services. key_file string Path to the private key file associated with the TLS client certificate. Used for mutual TLS authentication. type [_required_] enum The source type. The value should always be `socket`. Allowed enum values: `socket` default: `socket` name [_required_] string Name of the pipeline. type [_required_] string The resource type identifier. For pipeline resources, this should always be set to `pipelines`. default: `pipelines` ``` { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "my-processor-group" ], "type": "datadog_logs" } ], "processors": [ { "enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ { "enabled": true, "id": "filter-processor", "include": "status:error", "type": "filter" } ] } ], "sources": [ { "id": "datadog-agent-source", "type": "datadog_agent" } ] }, "name": "Main Observability Pipeline" }, "type": "pipelines" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/observability-pipelines/#ValidatePipeline-200-v2) * [400](https://docs.datadoghq.com/api/latest/observability-pipelines/#ValidatePipeline-400-v2) * [403](https://docs.datadoghq.com/api/latest/observability-pipelines/#ValidatePipeline-403-v2) * [429](https://docs.datadoghq.com/api/latest/observability-pipelines/#ValidatePipeline-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) Response containing validation errors. Expand All Field Type Description [object] The `ValidationResponse` `errors`. _required_] object Describes additional metadata for validation errors, including field names and error messages. field string The field name that caused the error. id string The ID of the component in which the error occurred. message [_required_] string The detailed error message. title [_required_] string A short, human-readable summary of the error. ``` { "errors": [ { "meta": { "field": "region", "id": "datadog-agent-source", "message": "Field 'region' is required" }, "title": "Field 'region' is required" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/observability-pipelines/) * [Example](https://docs.datadoghq.com/api/latest/observability-pipelines/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/observability-pipelines/?code-lang=typescript) ##### Validate an observability pipeline returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/remote_config/products/obs_pipelines/pipelines/validate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config": { "destinations": [ { "id": "datadog-logs-destination", "inputs": [ "my-processor-group" ], "type": "datadog_logs" } ], "processors": [ { "enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": [ "datadog-agent-source" ], "processors": [ { "enabled": true, "id": "filter-processor", "include": "status:error", "type": "filter" } ] } ], "sources": [ { "id": "datadog-agent-source", "type": "datadog_agent" } ] }, "name": "Main Observability Pipeline" }, "type": "pipelines" } } EOF ``` ##### Validate an observability pipeline returns "OK" response ``` // Validate an observability pipeline returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ObservabilityPipelineSpec{ Data: datadogV2.ObservabilityPipelineSpecData{ Attributes: datadogV2.ObservabilityPipelineDataAttributes{ Config: datadogV2.ObservabilityPipelineConfig{ Destinations: []datadogV2.ObservabilityPipelineConfigDestinationItem{ datadogV2.ObservabilityPipelineConfigDestinationItem{ ObservabilityPipelineDatadogLogsDestination: &datadogV2.ObservabilityPipelineDatadogLogsDestination{ Id: "datadog-logs-destination", Inputs: []string{ "my-processor-group", }, Type: datadogV2.OBSERVABILITYPIPELINEDATADOGLOGSDESTINATIONTYPE_DATADOG_LOGS, }}, }, Processors: []datadogV2.ObservabilityPipelineConfigProcessorGroup{ { Enabled: true, Id: "my-processor-group", Include: "service:my-service", Inputs: []string{ "datadog-agent-source", }, Processors: []datadogV2.ObservabilityPipelineConfigProcessorItem{ datadogV2.ObservabilityPipelineConfigProcessorItem{ ObservabilityPipelineFilterProcessor: &datadogV2.ObservabilityPipelineFilterProcessor{ Enabled: true, Id: "filter-processor", Include: "status:error", Type: datadogV2.OBSERVABILITYPIPELINEFILTERPROCESSORTYPE_FILTER, }}, }, }, }, Sources: []datadogV2.ObservabilityPipelineConfigSourceItem{ datadogV2.ObservabilityPipelineConfigSourceItem{ ObservabilityPipelineDatadogAgentSource: &datadogV2.ObservabilityPipelineDatadogAgentSource{ Id: "datadog-agent-source", Type: datadogV2.OBSERVABILITYPIPELINEDATADOGAGENTSOURCETYPE_DATADOG_AGENT, }}, }, }, Name: "Main Observability Pipeline", 
}, Type: "pipelines", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ValidatePipeline", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewObservabilityPipelinesApi(apiClient) resp, r, err := api.ValidatePipeline(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ObservabilityPipelinesApi.ValidatePipeline`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ObservabilityPipelinesApi.ValidatePipeline`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate an observability pipeline returns "OK" response ``` // Validate an observability pipeline returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ObservabilityPipelinesApi; import com.datadog.api.client.v2.model.ObservabilityPipelineConfig; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigDestinationItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorGroup; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigProcessorItem; import com.datadog.api.client.v2.model.ObservabilityPipelineConfigSourceItem; import com.datadog.api.client.v2.model.ObservabilityPipelineDataAttributes; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSource; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogAgentSourceType; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestination; import com.datadog.api.client.v2.model.ObservabilityPipelineDatadogLogsDestinationType; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessor; import com.datadog.api.client.v2.model.ObservabilityPipelineFilterProcessorType; import com.datadog.api.client.v2.model.ObservabilityPipelineSpec; import com.datadog.api.client.v2.model.ObservabilityPipelineSpecData; import com.datadog.api.client.v2.model.ValidationResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.validatePipeline", true); ObservabilityPipelinesApi apiInstance = new ObservabilityPipelinesApi(defaultClient); ObservabilityPipelineSpec body = new ObservabilityPipelineSpec() .data( new ObservabilityPipelineSpecData() .attributes( new ObservabilityPipelineDataAttributes() .config( new ObservabilityPipelineConfig() .destinations( Collections.singletonList( new ObservabilityPipelineConfigDestinationItem( new ObservabilityPipelineDatadogLogsDestination() .id("datadog-logs-destination") .inputs( Collections.singletonList( "my-processor-group")) .type( ObservabilityPipelineDatadogLogsDestinationType .DATADOG_LOGS)))) .processors( Collections.singletonList( new ObservabilityPipelineConfigProcessorGroup() .enabled(true) .id("my-processor-group") .include("service:my-service") .inputs( Collections.singletonList( "datadog-agent-source")) .processors( 
Collections.singletonList( new ObservabilityPipelineConfigProcessorItem( new ObservabilityPipelineFilterProcessor() .enabled(true) .id("filter-processor") .include("status:error") .type( ObservabilityPipelineFilterProcessorType .FILTER)))))) .sources( Collections.singletonList( new ObservabilityPipelineConfigSourceItem( new ObservabilityPipelineDatadogAgentSource() .id("datadog-agent-source") .type( ObservabilityPipelineDatadogAgentSourceType .DATADOG_AGENT))))) .name("Main Observability Pipeline")) .type("pipelines")); try { ValidationResponse result = apiInstance.validatePipeline(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ObservabilityPipelinesApi#validatePipeline"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate an observability pipeline returns "OK" response ``` """ Validate an observability pipeline returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig from datadog_api_client.v2.model.observability_pipeline_config_processor_group import ( ObservabilityPipelineConfigProcessorGroup, ) from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source import ( ObservabilityPipelineDatadogAgentSource, ) from datadog_api_client.v2.model.observability_pipeline_datadog_agent_source_type import ( ObservabilityPipelineDatadogAgentSourceType, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import ( ObservabilityPipelineDatadogLogsDestination, ) from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import ( ObservabilityPipelineDatadogLogsDestinationType, ) from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import ( ObservabilityPipelineFilterProcessorType, ) from datadog_api_client.v2.model.observability_pipeline_spec import ObservabilityPipelineSpec from datadog_api_client.v2.model.observability_pipeline_spec_data import ObservabilityPipelineSpecData body = ObservabilityPipelineSpec( data=ObservabilityPipelineSpecData( attributes=ObservabilityPipelineDataAttributes( config=ObservabilityPipelineConfig( destinations=[ ObservabilityPipelineDatadogLogsDestination( id="datadog-logs-destination", inputs=[ "my-processor-group", ], type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS, ), ], processors=[ ObservabilityPipelineConfigProcessorGroup( enabled=True, id="my-processor-group", include="service:my-service", inputs=[ "datadog-agent-source", ], processors=[ ObservabilityPipelineFilterProcessor( enabled=True, id="filter-processor", 
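# "include" is a Datadog search query used to determine which logs this processor targets.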
include="status:error", type=ObservabilityPipelineFilterProcessorType.FILTER, ), ], ), ], sources=[ ObservabilityPipelineDatadogAgentSource( id="datadog-agent-source", type=ObservabilityPipelineDatadogAgentSourceType.DATADOG_AGENT, ), ], ), name="Main Observability Pipeline", ), type="pipelines", ), ) configuration = Configuration() configuration.unstable_operations["validate_pipeline"] = True with ApiClient(configuration) as api_client: api_instance = ObservabilityPipelinesApi(api_client) response = api_instance.validate_pipeline(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate an observability pipeline returns "OK" response ``` # Validate an observability pipeline returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.validate_pipeline".to_sym] = true end api_instance = DatadogAPIClient::V2::ObservabilityPipelinesAPI.new body = DatadogAPIClient::V2::ObservabilityPipelineSpec.new({ data: DatadogAPIClient::V2::ObservabilityPipelineSpecData.new({ attributes: DatadogAPIClient::V2::ObservabilityPipelineDataAttributes.new({ config: DatadogAPIClient::V2::ObservabilityPipelineConfig.new({ destinations: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestination.new({ id: "datadog-logs-destination", inputs: [ "my-processor-group", ], type: DatadogAPIClient::V2::ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, }), ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineConfigProcessorGroup.new({ enabled: true, id: "my-processor-group", include: "service:my-service", inputs: [ "datadog-agent-source", ], processors: [ DatadogAPIClient::V2::ObservabilityPipelineFilterProcessor.new({ enabled: true, id: "filter-processor", include: "status:error", type: DatadogAPIClient::V2::ObservabilityPipelineFilterProcessorType::FILTER, }), ], }), ], sources: [ DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSource.new({ id: "datadog-agent-source", type: DatadogAPIClient::V2::ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, }), ], }), name: "Main Observability Pipeline", }), type: "pipelines", }), }) p api_instance.validate_pipeline(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate an observability pipeline returns "OK" response ``` // Validate an observability pipeline returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_observability_pipelines::ObservabilityPipelinesAPI; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfig; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigDestinationItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorGroup; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigProcessorItem; use datadog_api_client::datadogV2::model::ObservabilityPipelineConfigSourceItem; 
use datadog_api_client::datadogV2::model::ObservabilityPipelineDataAttributes; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSource; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogAgentSourceType; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestination; use datadog_api_client::datadogV2::model::ObservabilityPipelineDatadogLogsDestinationType; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessor; use datadog_api_client::datadogV2::model::ObservabilityPipelineFilterProcessorType; use datadog_api_client::datadogV2::model::ObservabilityPipelineSpec; use datadog_api_client::datadogV2::model::ObservabilityPipelineSpecData; #[tokio::main] async fn main() { let body = ObservabilityPipelineSpec::new( ObservabilityPipelineSpecData::new( ObservabilityPipelineDataAttributes::new( ObservabilityPipelineConfig::new( vec![ ObservabilityPipelineConfigDestinationItem::ObservabilityPipelineDatadogLogsDestination( Box::new( ObservabilityPipelineDatadogLogsDestination::new( "datadog-logs-destination".to_string(), vec!["my-processor-group".to_string()], ObservabilityPipelineDatadogLogsDestinationType::DATADOG_LOGS, ), ), ) ], vec![ ObservabilityPipelineConfigSourceItem::ObservabilityPipelineDatadogAgentSource( Box::new( ObservabilityPipelineDatadogAgentSource::new( "datadog-agent-source".to_string(), ObservabilityPipelineDatadogAgentSourceType::DATADOG_AGENT, ), ), ) ], ).processors( vec![ ObservabilityPipelineConfigProcessorGroup::new( true, "my-processor-group".to_string(), "service:my-service".to_string(), vec!["datadog-agent-source".to_string()], vec![ ObservabilityPipelineConfigProcessorItem::ObservabilityPipelineFilterProcessor( Box::new( ObservabilityPipelineFilterProcessor::new( true, "filter-processor".to_string(), "status:error".to_string(), ObservabilityPipelineFilterProcessorType::FILTER, ), ), ) ], ) ], ), "Main Observability Pipeline".to_string(), ), "pipelines".to_string(), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ValidatePipeline", true); let api = ObservabilityPipelinesAPI::with_config(configuration); let resp = api.validate_pipeline(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate an observability pipeline returns "OK" response ``` /** * Validate an observability pipeline returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.validatePipeline"] = true; const apiInstance = new v2.ObservabilityPipelinesApi(configuration); const params: v2.ObservabilityPipelinesApiValidatePipelineRequest = { body: { data: { attributes: { config: { destinations: [ { id: "datadog-logs-destination", inputs: ["my-processor-group"], type: "datadog_logs", }, ], processors: [ { enabled: true, id: "my-processor-group", include: "service:my-service", inputs: ["datadog-agent-source"], processors: [ { enabled: true, id: "filter-processor", include: "status:error", type: "filter", }, ], }, ], 
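// The datadog_agent source collects logs from the Datadog Agent.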
sources: [ { id: "datadog-agent-source", type: "datadog_agent", }, ], }, name: "Main Observability Pipeline", }, type: "pipelines", }, }, }; apiInstance .validatePipeline(params) .then((data: v2.ValidationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
--- # Source: https://docs.datadoghq.com/api/latest/okta-integration/ # Okta Integration Configure your [Datadog Okta integration](https://docs.datadoghq.com/integrations/okta/) directly through the Datadog API. ## [List Okta accounts](https://docs.datadoghq.com/api/latest/okta-integration/#list-okta-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/okta-integration/#list-okta-accounts-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/okta/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/okta/accountshttps://api.datadoghq.eu/api/v2/integrations/okta/accountshttps://api.ddog-gov.com/api/v2/integrations/okta/accountshttps://api.datadoghq.com/api/v2/integrations/okta/accountshttps://api.us3.datadoghq.com/api/v2/integrations/okta/accountshttps://api.us5.datadoghq.com/api/v2/integrations/okta/accounts ### Overview List Okta accounts. This endpoint requires the `integrations_read` permission.
### Response * [200](https://docs.datadoghq.com/api/latest/okta-integration/#ListOktaAccounts-200-v2) * [400](https://docs.datadoghq.com/api/latest/okta-integration/#ListOktaAccounts-400-v2) * [403](https://docs.datadoghq.com/api/latest/okta-integration/#ListOktaAccounts-403-v2) * [404](https://docs.datadoghq.com/api/latest/okta-integration/#ListOktaAccounts-404-v2) * [429](https://docs.datadoghq.com/api/latest/okta-integration/#ListOktaAccounts-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) The expected response schema when getting Okta accounts. Field Type Description data [object] List of Okta accounts. attributes [_required_] object Attributes object for an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain of the Okta account. name [_required_] string The name of the Okta account. id [_required_] string The ID of the Okta account, a UUID hash of the account name. type [_required_] enum Account type for an Okta account. Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": [ { "attributes": { "api_key": "string", "auth_method": "oauth", "client_id": "string", "client_secret": "string", "domain": "https://example.okta.com/", "name": "Okta-Prod" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=typescript) ##### List Okta accounts Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Okta accounts ``` """ List Okta accounts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.okta_integration_api import OktaIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OktaIntegrationApi(api_client) response = api_instance.list_okta_accounts() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Okta accounts ``` # List Okta accounts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OktaIntegrationAPI.new p api_instance.list_okta_accounts() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Okta accounts ``` // List Okta accounts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOktaIntegrationApi(apiClient) resp, r, err := api.ListOktaAccounts(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OktaIntegrationApi.ListOktaAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OktaIntegrationApi.ListOktaAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" go run "main.go" ``` ##### List Okta accounts ``` // List Okta accounts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OktaIntegrationApi; import com.datadog.api.client.v2.model.OktaAccountsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OktaIntegrationApi apiInstance = new OktaIntegrationApi(defaultClient); try { OktaAccountsResponse result = apiInstance.listOktaAccounts(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OktaIntegrationApi#listOktaAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Okta accounts ``` // List Okta accounts returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_okta_integration::OktaIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OktaIntegrationAPI::with_config(configuration); let resp = api.list_okta_accounts().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Okta accounts ``` /** * List Okta accounts returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OktaIntegrationApi(configuration); apiInstance .listOktaAccounts() .then((data: v2.OktaAccountsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add Okta account](https://docs.datadoghq.com/api/latest/okta-integration/#add-okta-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/okta-integration/#add-okta-account-v2) POST https://api.ap1.datadoghq.com/api/v2/integrations/okta/accountshttps://api.ap2.datadoghq.com/api/v2/integrations/okta/accountshttps://api.datadoghq.eu/api/v2/integrations/okta/accountshttps://api.ddog-gov.com/api/v2/integrations/okta/accountshttps://api.datadoghq.com/api/v2/integrations/okta/accountshttps://api.us3.datadoghq.com/api/v2/integrations/okta/accountshttps://api.us5.datadoghq.com/api/v2/integrations/okta/accounts ### Overview Create an Okta account. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) Field Type Description data [_required_] object Schema for an Okta account. attributes [_required_] object Attributes object for an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain of the Okta account. name [_required_] string The name of the Okta account. id string The ID of the Okta account, a UUID hash of the account name. type [_required_] enum Account type for an Okta account. Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": { "attributes": { "auth_method": "oauth", "domain": "https://example.okta.com/", "name": "exampleoktaintegration", "client_id": "client_id", "client_secret": "client_secret" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/okta-integration/#CreateOktaAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/okta-integration/#CreateOktaAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/okta-integration/#CreateOktaAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/okta-integration/#CreateOktaAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/okta-integration/#CreateOktaAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) Response object for an Okta account. Field Type Description data object Schema for an Okta account. attributes [_required_] object Attributes object for an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain of the Okta account. name [_required_] string The name of the Okta account. id string The ID of the Okta account, a UUID hash of the account name. 
type [_required_] enum Account type for an Okta account. Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": { "attributes": { "api_key": "string", "auth_method": "oauth", "client_id": "string", "client_secret": "string", "domain": "https://example.okta.com/", "name": "Okta-Prod" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=typescript) ##### Add Okta account returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "auth_method": "oauth", "domain": "https://example.okta.com/", "name": "exampleoktaintegration", "client_id": "client_id", "client_secret": "client_secret" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } } EOF ``` ##### Add Okta account returns "OK" response ``` // Add Okta account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.OktaAccountRequest{ Data: datadogV2.OktaAccount{ Attributes: datadogV2.OktaAccountAttributes{ AuthMethod: "oauth", Domain: "https://example.okta.com/", Name: "exampleoktaintegration", ClientId: datadog.PtrString("client_id"), ClientSecret: datadog.PtrString("client_secret"), }, Id: datadog.PtrString("f749daaf-682e-4208-a38d-c9b43162c609"), 
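// Type must be the "okta-accounts" enum value; Id is optional and, per the schema above, is a UUID hash of the account name.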
Type: datadogV2.OKTAACCOUNTTYPE_OKTA_ACCOUNTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOktaIntegrationApi(apiClient) resp, r, err := api.CreateOktaAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OktaIntegrationApi.CreateOktaAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OktaIntegrationApi.CreateOktaAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add Okta account returns "OK" response ``` // Add Okta account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OktaIntegrationApi; import com.datadog.api.client.v2.model.OktaAccount; import com.datadog.api.client.v2.model.OktaAccountAttributes; import com.datadog.api.client.v2.model.OktaAccountRequest; import com.datadog.api.client.v2.model.OktaAccountResponse; import com.datadog.api.client.v2.model.OktaAccountType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OktaIntegrationApi apiInstance = new OktaIntegrationApi(defaultClient); OktaAccountRequest body = new OktaAccountRequest() .data( new OktaAccount() .attributes( new OktaAccountAttributes() .authMethod("oauth") .domain("https://example.okta.com/") .name("exampleoktaintegration") .clientId("client_id") .clientSecret("client_secret")) .id("f749daaf-682e-4208-a38d-c9b43162c609") .type(OktaAccountType.OKTA_ACCOUNTS)); try { OktaAccountResponse result = apiInstance.createOktaAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OktaIntegrationApi#createOktaAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add Okta account returns "OK" response ``` """ Add Okta account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.okta_integration_api import OktaIntegrationApi from datadog_api_client.v2.model.okta_account import OktaAccount from datadog_api_client.v2.model.okta_account_attributes import OktaAccountAttributes from datadog_api_client.v2.model.okta_account_request import OktaAccountRequest from datadog_api_client.v2.model.okta_account_type import OktaAccountType body = OktaAccountRequest( data=OktaAccount( attributes=OktaAccountAttributes( auth_method="oauth", domain="https://example.okta.com/", name="exampleoktaintegration", 
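# client_id and client_secret identify the Okta app integration; auth_method, domain, and name are the required attributes.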
client_id="client_id", client_secret="client_secret", ), id="f749daaf-682e-4208-a38d-c9b43162c609", type=OktaAccountType.OKTA_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OktaIntegrationApi(api_client) response = api_instance.create_okta_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add Okta account returns "OK" response ``` # Add Okta account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OktaIntegrationAPI.new body = DatadogAPIClient::V2::OktaAccountRequest.new({ data: DatadogAPIClient::V2::OktaAccount.new({ attributes: DatadogAPIClient::V2::OktaAccountAttributes.new({ auth_method: "oauth", domain: "https://example.okta.com/", name: "exampleoktaintegration", client_id: "client_id", client_secret: "client_secret", }), id: "f749daaf-682e-4208-a38d-c9b43162c609", type: DatadogAPIClient::V2::OktaAccountType::OKTA_ACCOUNTS, }), }) p api_instance.create_okta_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add Okta account returns "OK" response ``` // Add Okta account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_okta_integration::OktaIntegrationAPI; use datadog_api_client::datadogV2::model::OktaAccount; use datadog_api_client::datadogV2::model::OktaAccountAttributes; use datadog_api_client::datadogV2::model::OktaAccountRequest; use datadog_api_client::datadogV2::model::OktaAccountType; #[tokio::main] async fn main() { let body = OktaAccountRequest::new( OktaAccount::new( OktaAccountAttributes::new( "oauth".to_string(), "https://example.okta.com/".to_string(), "exampleoktaintegration".to_string(), ) .client_id("client_id".to_string()) .client_secret("client_secret".to_string()), OktaAccountType::OKTA_ACCOUNTS, ) .id("f749daaf-682e-4208-a38d-c9b43162c609".to_string()), ); let configuration = datadog::Configuration::new(); let api = OktaIntegrationAPI::with_config(configuration); let resp = api.create_okta_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add Okta account returns "OK" response ``` /** * Add Okta account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OktaIntegrationApi(configuration); const params: v2.OktaIntegrationApiCreateOktaAccountRequest = { body: { data: { attributes: { 
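// Required attributes per the request body model above: authMethod, domain, and name; apiKey, clientId, and clientSecret are optional.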
authMethod: "oauth", domain: "https://example.okta.com/", name: "exampleoktaintegration", clientId: "client_id", clientSecret: "client_secret", }, id: "f749daaf-682e-4208-a38d-c9b43162c609", type: "okta-accounts", }, }, }; apiInstance .createOktaAccount(params) .then((data: v2.OktaAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Okta account](https://docs.datadoghq.com/api/latest/okta-integration/#get-okta-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/okta-integration/#get-okta-account-v2) GET https://api.ap1.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/okta/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/{account_id} ### Overview Get an Okta account. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/okta-integration/#GetOktaAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/okta-integration/#GetOktaAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/okta-integration/#GetOktaAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/okta-integration/#GetOktaAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/okta-integration/#GetOktaAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) Response object for an Okta account. Field Type Description data object Schema for an Okta account. attributes [_required_] object Attributes object for an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain of the Okta account. name [_required_] string The name of the Okta account. id string The ID of the Okta account, a UUID hash of the account name. type [_required_] enum Account type for an Okta account. Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": { "attributes": { "api_key": "string", "auth_method": "oauth", "client_id": "string", "client_secret": "string", "domain": "https://example.okta.com/", "name": "Okta-Prod" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=typescript) ##### Get Okta account Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/${account_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Okta account ``` """ Get Okta account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.okta_integration_api import OktaIntegrationApi # there is a valid "okta_account" in the system OKTA_ACCOUNT_DATA_ID = environ["OKTA_ACCOUNT_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OktaIntegrationApi(api_client) response = api_instance.get_okta_account( account_id=OKTA_ACCOUNT_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Okta account ``` # Get Okta account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OktaIntegrationAPI.new # there is a valid "okta_account" in the system OKTA_ACCOUNT_DATA_ID = ENV["OKTA_ACCOUNT_DATA_ID"] p api_instance.get_okta_account(OKTA_ACCOUNT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
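# The Get example also expects OKTA_ACCOUNT_DATA_ID to be exported with the ID of an existing Okta account.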
DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Okta account ``` // Get Okta account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "okta_account" in the system OktaAccountDataID := os.Getenv("OKTA_ACCOUNT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOktaIntegrationApi(apiClient) resp, r, err := api.GetOktaAccount(ctx, OktaAccountDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OktaIntegrationApi.GetOktaAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OktaIntegrationApi.GetOktaAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Okta account ``` // Get Okta account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OktaIntegrationApi; import com.datadog.api.client.v2.model.OktaAccountResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OktaIntegrationApi apiInstance = new OktaIntegrationApi(defaultClient); // there is a valid "okta_account" in the system String OKTA_ACCOUNT_DATA_ID = System.getenv("OKTA_ACCOUNT_DATA_ID"); try { OktaAccountResponse result = apiInstance.getOktaAccount(OKTA_ACCOUNT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OktaIntegrationApi#getOktaAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Okta account ``` // Get Okta account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_okta_integration::OktaIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "okta_account" in the system let okta_account_data_id = std::env::var("OKTA_ACCOUNT_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OktaIntegrationAPI::with_config(configuration); let resp = api.get_okta_account(okta_account_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Okta account ``` /** * Get Okta account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OktaIntegrationApi(configuration); // there is a valid "okta_account" in the system const OKTA_ACCOUNT_DATA_ID = process.env.OKTA_ACCOUNT_DATA_ID as string; const params: v2.OktaIntegrationApiGetOktaAccountRequest = { accountId: OKTA_ACCOUNT_DATA_ID, }; apiInstance .getOktaAccount(params) .then((data: v2.OktaAccountResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Okta account](https://docs.datadoghq.com/api/latest/okta-integration/#update-okta-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/okta-integration/#update-okta-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/okta/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/{account_id} ### Overview Update an Okta account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) Field Type Description data [_required_] object Data object for updating an Okta account. attributes object Attributes object for updating an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain associated with an Okta account. type enum Account type for an Okta account. 
Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": { "attributes": { "auth_method": "oauth", "domain": "https://example.okta.com/", "client_id": "client_id", "client_secret": "client_secret" }, "type": "okta-accounts" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/okta-integration/#UpdateOktaAccount-200-v2) * [400](https://docs.datadoghq.com/api/latest/okta-integration/#UpdateOktaAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/okta-integration/#UpdateOktaAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/okta-integration/#UpdateOktaAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/okta-integration/#UpdateOktaAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) Response object for an Okta account. Field Type Description data object Schema for an Okta account. attributes [_required_] object Attributes object for an Okta account. api_key string The API key of the Okta account. auth_method [_required_] string The authorization method for an Okta account. client_id string The Client ID of an Okta app integration. client_secret string The client secret of an Okta app integration. domain [_required_] string The domain of the Okta account. name [_required_] string The name of the Okta account. id string The ID of the Okta account, a UUID hash of the account name. type [_required_] enum Account type for an Okta account. Allowed enum values: `okta-accounts` default: `okta-accounts` ``` { "data": { "attributes": { "api_key": "string", "auth_method": "oauth", "client_id": "string", "client_secret": "string", "domain": "https://example.okta.com/", "name": "Okta-Prod" }, "id": "f749daaf-682e-4208-a38d-c9b43162c609", "type": "okta-accounts" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=typescript) ##### Update Okta account returns "OK" response Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/${account_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "auth_method": "oauth", "domain": "https://example.okta.com/", "client_id": "client_id", "client_secret": "client_secret" }, "type": "okta-accounts" } } EOF ``` ##### Update Okta account returns "OK" response ``` // Update Okta account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "okta_account" in the system OktaAccountDataID := os.Getenv("OKTA_ACCOUNT_DATA_ID") body := datadogV2.OktaAccountUpdateRequest{ Data: datadogV2.OktaAccountUpdateRequestData{ Attributes: &datadogV2.OktaAccountUpdateRequestAttributes{ AuthMethod: "oauth", Domain: "https://example.okta.com/", ClientId: datadog.PtrString("client_id"), ClientSecret: datadog.PtrString("client_secret"), }, Type: datadogV2.OKTAACCOUNTTYPE_OKTA_ACCOUNTS.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOktaIntegrationApi(apiClient) resp, r, err := api.UpdateOktaAccount(ctx, OktaAccountDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OktaIntegrationApi.UpdateOktaAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OktaIntegrationApi.UpdateOktaAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Okta account returns "OK" response ``` // Update Okta account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OktaIntegrationApi; import com.datadog.api.client.v2.model.OktaAccountResponse; import com.datadog.api.client.v2.model.OktaAccountType; import com.datadog.api.client.v2.model.OktaAccountUpdateRequest; import 
com.datadog.api.client.v2.model.OktaAccountUpdateRequestAttributes; import com.datadog.api.client.v2.model.OktaAccountUpdateRequestData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OktaIntegrationApi apiInstance = new OktaIntegrationApi(defaultClient); // there is a valid "okta_account" in the system String OKTA_ACCOUNT_DATA_ID = System.getenv("OKTA_ACCOUNT_DATA_ID"); OktaAccountUpdateRequest body = new OktaAccountUpdateRequest() .data( new OktaAccountUpdateRequestData() .attributes( new OktaAccountUpdateRequestAttributes() .authMethod("oauth") .domain("https://example.okta.com/") .clientId("client_id") .clientSecret("client_secret")) .type(OktaAccountType.OKTA_ACCOUNTS)); try { OktaAccountResponse result = apiInstance.updateOktaAccount(OKTA_ACCOUNT_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OktaIntegrationApi#updateOktaAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Okta account returns "OK" response ``` """ Update Okta account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.okta_integration_api import OktaIntegrationApi from datadog_api_client.v2.model.okta_account_type import OktaAccountType from datadog_api_client.v2.model.okta_account_update_request import OktaAccountUpdateRequest from datadog_api_client.v2.model.okta_account_update_request_attributes import OktaAccountUpdateRequestAttributes from datadog_api_client.v2.model.okta_account_update_request_data import OktaAccountUpdateRequestData # there is a valid "okta_account" in the system OKTA_ACCOUNT_DATA_ID = environ["OKTA_ACCOUNT_DATA_ID"] body = OktaAccountUpdateRequest( data=OktaAccountUpdateRequestData( attributes=OktaAccountUpdateRequestAttributes( auth_method="oauth", domain="https://example.okta.com/", client_id="client_id", client_secret="client_secret", ), type=OktaAccountType.OKTA_ACCOUNTS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OktaIntegrationApi(api_client) response = api_instance.update_okta_account(account_id=OKTA_ACCOUNT_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Okta account returns "OK" response ``` # Update Okta account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OktaIntegrationAPI.new # there is a valid "okta_account" in the system OKTA_ACCOUNT_DATA_ID = ENV["OKTA_ACCOUNT_DATA_ID"] body = DatadogAPIClient::V2::OktaAccountUpdateRequest.new({ data: 
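# Per the update body model above, auth_method and domain are required; api_key, client_id, and client_secret are optional.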
DatadogAPIClient::V2::OktaAccountUpdateRequestData.new({ attributes: DatadogAPIClient::V2::OktaAccountUpdateRequestAttributes.new({ auth_method: "oauth", domain: "https://example.okta.com/", client_id: "client_id", client_secret: "client_secret", }), type: DatadogAPIClient::V2::OktaAccountType::OKTA_ACCOUNTS, }), }) p api_instance.update_okta_account(OKTA_ACCOUNT_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Okta account returns "OK" response ``` // Update Okta account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_okta_integration::OktaIntegrationAPI; use datadog_api_client::datadogV2::model::OktaAccountType; use datadog_api_client::datadogV2::model::OktaAccountUpdateRequest; use datadog_api_client::datadogV2::model::OktaAccountUpdateRequestAttributes; use datadog_api_client::datadogV2::model::OktaAccountUpdateRequestData; #[tokio::main] async fn main() { // there is a valid "okta_account" in the system let okta_account_data_id = std::env::var("OKTA_ACCOUNT_DATA_ID").unwrap(); let body = OktaAccountUpdateRequest::new( OktaAccountUpdateRequestData::new() .attributes( OktaAccountUpdateRequestAttributes::new( "oauth".to_string(), "https://example.okta.com/".to_string(), ) .client_id("client_id".to_string()) .client_secret("client_secret".to_string()), ) .type_(OktaAccountType::OKTA_ACCOUNTS), ); let configuration = datadog::Configuration::new(); let api = OktaIntegrationAPI::with_config(configuration); let resp = api .update_okta_account(okta_account_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Okta account returns "OK" response ``` /** * Update Okta account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OktaIntegrationApi(configuration); // there is a valid "okta_account" in the system const OKTA_ACCOUNT_DATA_ID = process.env.OKTA_ACCOUNT_DATA_ID as string; const params: v2.OktaIntegrationApiUpdateOktaAccountRequest = { body: { data: { attributes: { authMethod: "oauth", domain: "https://example.okta.com/", clientId: "client_id", clientSecret: "client_secret", }, type: "okta-accounts", }, }, accountId: OKTA_ACCOUNT_DATA_ID, }; apiInstance .updateOktaAccount(params) .then((data: v2.OktaAccountResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Okta account](https://docs.datadoghq.com/api/latest/okta-integration/#delete-okta-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/okta-integration/#delete-okta-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.ap2.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.eu/api/v2/integrations/okta/accounts/{account_id}https://api.ddog-gov.com/api/v2/integrations/okta/accounts/{account_id}https://api.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us3.datadoghq.com/api/v2/integrations/okta/accounts/{account_id}https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/{account_id} ### Overview Delete an Okta account. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/okta-integration/#DeleteOktaAccount-204-v2) * [400](https://docs.datadoghq.com/api/latest/okta-integration/#DeleteOktaAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/okta-integration/#DeleteOktaAccount-403-v2) * [404](https://docs.datadoghq.com/api/latest/okta-integration/#DeleteOktaAccount-404-v2) * [429](https://docs.datadoghq.com/api/latest/okta-integration/#DeleteOktaAccount-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/okta-integration/) * [Example](https://docs.datadoghq.com/api/latest/okta-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/okta-integration/?code-lang=typescript) ##### Delete Okta account Copy ``` # Path parameters export account_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integrations/okta/accounts/${account_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Okta account ``` """ Delete Okta account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.okta_integration_api import OktaIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OktaIntegrationApi(api_client) api_instance.delete_okta_account( account_id="account_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Okta account ``` # Delete Okta account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OktaIntegrationAPI.new api_instance.delete_okta_account("account_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Okta account ``` // Delete Okta account returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOktaIntegrationApi(apiClient) r, err := api.DeleteOktaAccount(ctx, "account_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OktaIntegrationApi.DeleteOktaAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Okta account ``` // Delete Okta account returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OktaIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OktaIntegrationApi apiInstance = new OktaIntegrationApi(defaultClient); try { apiInstance.deleteOktaAccount("account_id"); } catch (ApiException e) { System.err.println("Exception when calling OktaIntegrationApi#deleteOktaAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Okta account ``` // Delete Okta account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_okta_integration::OktaIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OktaIntegrationAPI::with_config(configuration); let resp = api.delete_okta_account("account_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Okta account ``` /** * Delete Okta account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OktaIntegrationApi(configuration); const params: v2.OktaIntegrationApiDeleteOktaAccountRequest = { accountId: "account_id", }; apiInstance .deleteOktaAccount(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=b3a691cb-da21-4dd7-80fe-912a5e780a3d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=caf6489a-3627-4cda-88a6-bc374ed8d81d&pt=Okta%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fokta-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=b3a691cb-da21-4dd7-80fe-912a5e780a3d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=caf6489a-3627-4cda-88a6-bc374ed8d81d&pt=Okta%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fokta-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=d61d8f36-d8a3-4d42-88b2-57f86ddce2e4&bo=2&sid=3d8da700f0c011f0b447d932ca871b75&vid=3d8dcb40f0c011f095171bdc3f922644&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Okta%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fokta-integration%2F&r=<=1174&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=80503) --- # Source: https://docs.datadoghq.com/api/latest/on-call-paging/ # On-Call Paging Trigger and manage [Datadog On-Call](https://docs.datadoghq.com/service_management/on-call/) pages directly through the Datadog API. ## [Create On-Call Page](https://docs.datadoghq.com/api/latest/on-call-paging/#create-on-call-page) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call-paging/#create-on-call-page-v2) POST https://saffron.oncall.datadoghq.com/api/v2/on-call/pageshttps://lava.oncall.datadoghq.com/api/v2/on-call/pageshttps://beige.oncall.datadoghq.eu/api/v2/on-call/pageshttps://navy.oncall.datadoghq.com/api/v2/on-call/pageshttps://navy.oncall.datadoghq.com/api/v2/on-call/pageshttps://teal.oncall.datadoghq.com/api/v2/on-call/pageshttps://coral.oncall.datadoghq.com/api/v2/on-call/pages ### Overview Trigger a new On-Call Page. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) Field Type Description data object The main request body, including attributes and resource type. attributes object Details about the On-Call Page you want to create. description string A short summary of the issue or context. tags [string] Tags to help categorize or filter the page. target [_required_] object Information about the target to notify (such as a team or user). identifier string Identifier for the target (for example, team handle or user ID). type enum The kind of target, `team_id` | `team_handle` | `user_id`. Allowed enum values: `team_id,team_handle,user_id` title [_required_] string The title of the page. urgency [_required_] enum On-Call Page urgency level. 
Allowed enum values: `low,high` default: `high` type [_required_] enum The type of resource used when creating an On-Call Page. Allowed enum values: `pages` default: `pages` ``` { "data": { "attributes": { "description": "string", "tags": [], "target": { "identifier": "string", "type": "team_id" }, "title": "Service: Test is down", "urgency": "high" }, "type": "pages" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/on-call-paging/#CreateOnCallPage-200-v2) * [429](https://docs.datadoghq.com/api/latest/on-call-paging/#CreateOnCallPage-429-v2) OK. * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) The full response object after creating a new On-Call Page. Field Type Description data object The information returned after successfully creating a page. id string The unique ID of the created page. type [_required_] enum The type of resource used when creating an On-Call Page. Allowed enum values: `pages` default: `pages` ``` { "data": { "id": "15e74b8b-f865-48d0-bcc5-453323ed2c8f", "type": "pages" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=typescript) ##### Create On-Call Page Copy ``` # Curl command curl -X POST "https://saffron.oncall.datadoghq.com"https://lava.oncall.datadoghq.com"https://beige.oncall.datadoghq.eu"https://navy.oncall.datadoghq.com"https://navy.oncall.datadoghq.com"https://teal.oncall.datadoghq.com"https://coral.oncall.datadoghq.com/api/v2/on-call/pages" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "target": {}, "title": "Service: Test is down", "urgency": "high" }, "type": "pages" } } EOF ``` ##### Create On-Call Page ``` """ Create On-Call Page returns "OK." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_paging_api import OnCallPagingApi from datadog_api_client.v2.model.create_page_request import CreatePageRequest from datadog_api_client.v2.model.create_page_request_data import CreatePageRequestData from datadog_api_client.v2.model.create_page_request_data_attributes import CreatePageRequestDataAttributes from datadog_api_client.v2.model.create_page_request_data_attributes_target import CreatePageRequestDataAttributesTarget from datadog_api_client.v2.model.create_page_request_data_type import CreatePageRequestDataType from datadog_api_client.v2.model.on_call_page_target_type import OnCallPageTargetType from datadog_api_client.v2.model.page_urgency import PageUrgency body = CreatePageRequest( data=CreatePageRequestData( attributes=CreatePageRequestDataAttributes( description="Page details.", tags=[ "service:test", ], target=CreatePageRequestDataAttributesTarget( identifier="my-team", type=OnCallPageTargetType.TEAM_HANDLE, ), title="Page title", urgency=PageUrgency.LOW, ), type=CreatePageRequestDataType.PAGES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallPagingApi(api_client) response = api_instance.create_on_call_page(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create On-Call Page ``` # Create On-Call Page returns "OK." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallPagingAPI.new body = DatadogAPIClient::V2::CreatePageRequest.new({ data: DatadogAPIClient::V2::CreatePageRequestData.new({ attributes: DatadogAPIClient::V2::CreatePageRequestDataAttributes.new({ description: "Page details.", tags: [ "service:test", ], target: DatadogAPIClient::V2::CreatePageRequestDataAttributesTarget.new({ identifier: "my-team", type: DatadogAPIClient::V2::OnCallPageTargetType::TEAM_HANDLE, }), title: "Page title", urgency: DatadogAPIClient::V2::PageUrgency::LOW, }), type: DatadogAPIClient::V2::CreatePageRequestDataType::PAGES, }), }) p api_instance.create_on_call_page(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create On-Call Page ``` // Create On-Call Page returns "OK." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreatePageRequest{ Data: &datadogV2.CreatePageRequestData{ Attributes: &datadogV2.CreatePageRequestDataAttributes{ Description: datadog.PtrString("Page details."), Tags: []string{ "service:test", }, Target: datadogV2.CreatePageRequestDataAttributesTarget{ Identifier: datadog.PtrString("my-team"), Type: datadogV2.ONCALLPAGETARGETTYPE_TEAM_HANDLE.Ptr(), }, Title: "Page title", Urgency: datadogV2.PAGEURGENCY_LOW, }, Type: datadogV2.CREATEPAGEREQUESTDATATYPE_PAGES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallPagingApi(apiClient) resp, r, err := api.CreateOnCallPage(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallPagingApi.CreateOnCallPage`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallPagingApi.CreateOnCallPage`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create On-Call Page ``` // Create On-Call Page returns "OK." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallPagingApi; import com.datadog.api.client.v2.model.CreatePageRequest; import com.datadog.api.client.v2.model.CreatePageRequestData; import com.datadog.api.client.v2.model.CreatePageRequestDataAttributes; import com.datadog.api.client.v2.model.CreatePageRequestDataAttributesTarget; import com.datadog.api.client.v2.model.CreatePageRequestDataType; import com.datadog.api.client.v2.model.CreatePageResponse; import com.datadog.api.client.v2.model.OnCallPageTargetType; import com.datadog.api.client.v2.model.PageUrgency; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallPagingApi apiInstance = new OnCallPagingApi(defaultClient); CreatePageRequest body = new CreatePageRequest() .data( new CreatePageRequestData() .attributes( new CreatePageRequestDataAttributes() .description("Page details.") .tags(Collections.singletonList("service:test")) .target( new CreatePageRequestDataAttributesTarget() .identifier("my-team") .type(OnCallPageTargetType.TEAM_HANDLE)) .title("Page title") .urgency(PageUrgency.LOW)) .type(CreatePageRequestDataType.PAGES)); try { CreatePageResponse result = apiInstance.createOnCallPage(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallPagingApi#createOnCallPage"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to 
`Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create On-Call Page ``` // Create On-Call Page returns "OK." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call_paging::OnCallPagingAPI; use datadog_api_client::datadogV2::model::CreatePageRequest; use datadog_api_client::datadogV2::model::CreatePageRequestData; use datadog_api_client::datadogV2::model::CreatePageRequestDataAttributes; use datadog_api_client::datadogV2::model::CreatePageRequestDataAttributesTarget; use datadog_api_client::datadogV2::model::CreatePageRequestDataType; use datadog_api_client::datadogV2::model::OnCallPageTargetType; use datadog_api_client::datadogV2::model::PageUrgency; #[tokio::main] async fn main() { let body = CreatePageRequest::new().data( CreatePageRequestData::new(CreatePageRequestDataType::PAGES).attributes( CreatePageRequestDataAttributes::new( CreatePageRequestDataAttributesTarget::new() .identifier("my-team".to_string()) .type_(OnCallPageTargetType::TEAM_HANDLE), "Page title".to_string(), PageUrgency::LOW, ) .description("Page details.".to_string()) .tags(vec!["service:test".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = OnCallPagingAPI::with_config(configuration); let resp = api.create_on_call_page(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create On-Call Page ``` /** * Create On-Call Page returns "OK." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallPagingApi(configuration); const params: v2.OnCallPagingApiCreateOnCallPageRequest = { body: { data: { attributes: { description: "Page details.", tags: ["service:test"], target: { identifier: "my-team", type: "team_handle", }, title: "Page title", urgency: "low", }, type: "pages", }, }, }; apiInstance .createOnCallPage(params) .then((data: v2.CreatePageResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Acknowledge On-Call Page](https://docs.datadoghq.com/api/latest/on-call-paging/#acknowledge-on-call-page) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call-paging/#acknowledge-on-call-page-v2) POST https://saffron.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/acknowledge https://lava.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/acknowledge https://beige.oncall.datadoghq.eu/api/v2/on-call/pages/{page_id}/acknowledge https://navy.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/acknowledge https://teal.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/acknowledge https://coral.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/acknowledge ### Overview Acknowledges an On-Call Page. ### Arguments #### Path Parameters Name Type Description page_id [_required_] string The page ID. ### Response * [202](https://docs.datadoghq.com/api/latest/on-call-paging/#AcknowledgeOnCallPage-202-v2) * [429](https://docs.datadoghq.com/api/latest/on-call-paging/#AcknowledgeOnCallPage-429-v2) Accepted. Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=typescript) ##### Acknowledge On-Call Page Copy ``` # Path parameters export page_id="15e74b8b-f865-48d0-bcc5-453323ed2c8f" # Curl command (replace the host with the On-Call endpoint for your Datadog site) curl -X POST "https://navy.oncall.datadoghq.com/api/v2/on-call/pages/${page_id}/acknowledge" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Acknowledge On-Call Page ``` """ Acknowledge On-Call Page returns "Accepted."
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_paging_api import OnCallPagingApi from uuid import UUID configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallPagingApi(api_client) api_instance.acknowledge_on_call_page( page_id=UUID("15e74b8b-f865-48d0-bcc5-453323ed2c8f"), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Acknowledge On-Call Page ``` # Acknowledge On-Call Page returns "Accepted." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallPagingAPI.new p api_instance.acknowledge_on_call_page("15e74b8b-f865-48d0-bcc5-453323ed2c8f") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Acknowledge On-Call Page ``` // Acknowledge On-Call Page returns "Accepted." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallPagingApi(apiClient) r, err := api.AcknowledgeOnCallPage(ctx, uuid.MustParse("15e74b8b-f865-48d0-bcc5-453323ed2c8f")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallPagingApi.AcknowledgeOnCallPage`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Acknowledge On-Call Page ``` // Acknowledge On-Call Page returns "Accepted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallPagingApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallPagingApi apiInstance = new OnCallPagingApi(defaultClient); try { apiInstance.acknowledgeOnCallPage(UUID.fromString("15e74b8b-f865-48d0-bcc5-453323ed2c8f")); } catch (ApiException e) { System.err.println("Exception when calling OnCallPagingApi#acknowledgeOnCallPage"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Acknowledge On-Call Page ``` // Acknowledge On-Call Page returns "Accepted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call_paging::OnCallPagingAPI; use uuid::Uuid; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OnCallPagingAPI::with_config(configuration); let resp = api .acknowledge_on_call_page( Uuid::parse_str("15e74b8b-f865-48d0-bcc5-453323ed2c8f").expect("invalid UUID"), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Acknowledge On-Call Page ``` /** * Acknowledge On-Call Page returns "Accepted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallPagingApi(configuration); const params: v2.OnCallPagingApiAcknowledgeOnCallPageRequest = { pageId: "15e74b8b-f865-48d0-bcc5-453323ed2c8f", }; apiInstance .acknowledgeOnCallPage(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Escalate On-Call Page](https://docs.datadoghq.com/api/latest/on-call-paging/#escalate-on-call-page) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call-paging/#escalate-on-call-page-v2) POST https://saffron.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/escalate https://lava.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/escalate https://beige.oncall.datadoghq.eu/api/v2/on-call/pages/{page_id}/escalate https://navy.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/escalate https://teal.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/escalate https://coral.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/escalate ### Overview Escalates an On-Call Page. ### Arguments #### Path Parameters Name Type Description page_id [_required_] string The page ID. ### Response * [202](https://docs.datadoghq.com/api/latest/on-call-paging/#EscalateOnCallPage-202-v2) * [429](https://docs.datadoghq.com/api/latest/on-call-paging/#EscalateOnCallPage-429-v2) Accepted. Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=typescript) ##### Escalate On-Call Page Copy ``` # Path parameters export page_id="15e74b8b-f865-48d0-bcc5-453323ed2c8f" # Curl command (replace the host with the On-Call endpoint for your Datadog site) curl -X POST "https://navy.oncall.datadoghq.com/api/v2/on-call/pages/${page_id}/escalate" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Escalate On-Call Page ``` """ Escalate On-Call Page returns "Accepted."
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_paging_api import OnCallPagingApi from uuid import UUID configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallPagingApi(api_client) api_instance.escalate_on_call_page( page_id=UUID("15e74b8b-f865-48d0-bcc5-453323ed2c8f"), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Escalate On-Call Page ``` # Escalate On-Call Page returns "Accepted." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallPagingAPI.new p api_instance.escalate_on_call_page("15e74b8b-f865-48d0-bcc5-453323ed2c8f") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Escalate On-Call Page ``` // Escalate On-Call Page returns "Accepted." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallPagingApi(apiClient) r, err := api.EscalateOnCallPage(ctx, uuid.MustParse("15e74b8b-f865-48d0-bcc5-453323ed2c8f")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallPagingApi.EscalateOnCallPage`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Escalate On-Call Page ``` // Escalate On-Call Page returns "Accepted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallPagingApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallPagingApi apiInstance = new OnCallPagingApi(defaultClient); try { apiInstance.escalateOnCallPage(UUID.fromString("15e74b8b-f865-48d0-bcc5-453323ed2c8f")); } catch (ApiException e) { System.err.println("Exception when calling OnCallPagingApi#escalateOnCallPage"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Escalate On-Call Page ``` // Escalate On-Call Page returns "Accepted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call_paging::OnCallPagingAPI; use uuid::Uuid; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OnCallPagingAPI::with_config(configuration); let resp = api .escalate_on_call_page( Uuid::parse_str("15e74b8b-f865-48d0-bcc5-453323ed2c8f").expect("invalid UUID"), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Escalate On-Call Page ``` /** * Escalate On-Call Page returns "Accepted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallPagingApi(configuration); const params: v2.OnCallPagingApiEscalateOnCallPageRequest = { pageId: "15e74b8b-f865-48d0-bcc5-453323ed2c8f", }; apiInstance .escalateOnCallPage(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Resolve On-Call Page](https://docs.datadoghq.com/api/latest/on-call-paging/#resolve-on-call-page) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call-paging/#resolve-on-call-page-v2) POST https://saffron.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/resolve https://lava.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/resolve https://beige.oncall.datadoghq.eu/api/v2/on-call/pages/{page_id}/resolve https://navy.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/resolve https://teal.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/resolve https://coral.oncall.datadoghq.com/api/v2/on-call/pages/{page_id}/resolve ### Overview Resolves an On-Call Page. ### Arguments #### Path Parameters Name Type Description page_id [_required_] string The page ID. ### Response * [202](https://docs.datadoghq.com/api/latest/on-call-paging/#ResolveOnCallPage-202-v2) * [429](https://docs.datadoghq.com/api/latest/on-call-paging/#ResolveOnCallPage-429-v2) Accepted. Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call-paging/) * [Example](https://docs.datadoghq.com/api/latest/on-call-paging/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call-paging/?code-lang=typescript) ##### Resolve On-Call Page Copy ``` # Path parameters export page_id="15e74b8b-f865-48d0-bcc5-453323ed2c8f" # Curl command (replace the host with the On-Call endpoint for your Datadog site) curl -X POST "https://navy.oncall.datadoghq.com/api/v2/on-call/pages/${page_id}/resolve" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Resolve On-Call Page ``` """ Resolve On-Call Page returns "Accepted."
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_paging_api import OnCallPagingApi from uuid import UUID configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallPagingApi(api_client) api_instance.resolve_on_call_page( page_id=UUID("15e74b8b-f865-48d0-bcc5-453323ed2c8f"), ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Resolve On-Call Page ``` # Resolve On-Call Page returns "Accepted." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallPagingAPI.new p api_instance.resolve_on_call_page("15e74b8b-f865-48d0-bcc5-453323ed2c8f") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Resolve On-Call Page ``` // Resolve On-Call Page returns "Accepted." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallPagingApi(apiClient) r, err := api.ResolveOnCallPage(ctx, uuid.MustParse("15e74b8b-f865-48d0-bcc5-453323ed2c8f")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallPagingApi.ResolveOnCallPage`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Resolve On-Call Page ``` // Resolve On-Call Page returns "Accepted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallPagingApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallPagingApi apiInstance = new OnCallPagingApi(defaultClient); try { apiInstance.resolveOnCallPage(UUID.fromString("15e74b8b-f865-48d0-bcc5-453323ed2c8f")); } catch (ApiException e) { System.err.println("Exception when calling OnCallPagingApi#resolveOnCallPage"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Resolve On-Call Page ``` // Resolve On-Call Page returns "Accepted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call_paging::OnCallPagingAPI; use uuid::Uuid; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OnCallPagingAPI::with_config(configuration); let resp = api .resolve_on_call_page( Uuid::parse_str("15e74b8b-f865-48d0-bcc5-453323ed2c8f").expect("invalid UUID"), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Resolve On-Call Page ``` /** * Resolve On-Call Page returns "Accepted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallPagingApi(configuration); const params: v2.OnCallPagingApiResolveOnCallPageRequest = { pageId: "15e74b8b-f865-48d0-bcc5-453323ed2c8f", }; apiInstance .resolveOnCallPage(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/on-call/ # On-Call Configure your [Datadog On-Call](https://docs.datadoghq.com/service_management/on-call/) directly through the Datadog API. ## [Create On-Call schedule](https://docs.datadoghq.com/api/latest/on-call/#create-on-call-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#create-on-call-schedule-v2) POST https://api.ap1.datadoghq.com/api/v2/on-call/schedules https://api.ap2.datadoghq.com/api/v2/on-call/schedules https://api.datadoghq.eu/api/v2/on-call/schedules https://api.ddog-gov.com/api/v2/on-call/schedules https://api.datadoghq.com/api/v2/on-call/schedules https://api.us3.datadoghq.com/api/v2/on-call/schedules https://api.us5.datadoghq.com/api/v2/on-call/schedules ### Overview Create a new On-Call schedule. This endpoint requires the `on_call_write` permission. ### Arguments #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `layers`, `layers.members`, `layers.members.user`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data [_required_] object The core data wrapper for creating a schedule, encompassing attributes, relationships, and the resource type. attributes [_required_] object Describes the main attributes for creating a new schedule, including name, layers, and time zone. layers [_required_] [object] The layers of On-Call coverage that define rotation intervals and restrictions.
effective_date [_required_] date-time The date/time when this layer becomes active (in ISO 8601). end_date date-time The date/time after which this layer no longer applies (in ISO 8601). interval [_required_] object Defines how often the rotation repeats, using a combination of days and optional seconds. Should be at least 1 hour. days int32 The number of days in each rotation cycle. seconds int64 Any additional seconds for the rotation cycle (up to 30 days). members [_required_] [object] A list of members who participate in this layer's rotation. user object Identifies the user participating in this layer as a single object with an `id`. id string The user's ID. name [_required_] string The name of this layer. restrictions [object] Zero or more time-based restrictions (for example, only weekdays, during business hours). end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. rotation_start [_required_] date-time The date/time when the rotation for this layer starts (in ISO 8601). time_zone string The time zone for this layer. name [_required_] string A human-readable name for the new schedule. time_zone [_required_] string The time zone in which the schedule is defined. relationships object Gathers relationship objects for the schedule creation request, including the teams to associate. teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` ``` { "data": { "attributes": { "layers": [ { "effective_date": "2021-11-01T11:11:11+00:00", "end_date": "2021-11-21T11:11:11+00:00", "interval": { "days": 1 }, "members": [ { "user": { "id": "string" } } ], "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2021-11-06T11:11:11+00:00" } ], "name": "Example-On-Call", "time_zone": "America/New_York" }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "schedules" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallSchedule-201-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallSchedule-403-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallSchedule-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Top-level container for a schedule object, including both the `data` payload and any related `included` resources (such as teams, layers, or members). Field Type Description data object Represents the primary data object for a schedule, linking attributes and relationships. attributes object Provides core properties of a schedule object such as its name and time zone. 
name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` included [ ] Any additional resources related to this schedule, such as teams and layers. Option 1 object Provides a reference to a team, including ID, type, and basic attributes/relationships. attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` Option 2 object Encapsulates a layer resource, holding attributes like rotation details, plus relationships to the members covering that layer. attributes object Describes key properties of a Layer, including rotation details, name, start/end times, and any restrictions. effective_date date-time When the layer becomes active (ISO 8601). end_date date-time When the layer ceases to be active (ISO 8601). interval object Defines how often the rotation repeats, using a combination of days and optional seconds. Should be at least 1 hour. days int32 The number of days in each rotation cycle. seconds int64 Any additional seconds for the rotation cycle (up to 30 days). name string The name of this layer. restrictions [object] An optional list of time restrictions for when this layer is in effect. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. rotation_start date-time The date/time when the rotation starts (ISO 8601). time_zone string The time zone for this layer. id string A unique identifier for this layer. relationships object Holds references to objects related to the Layer entity, such as its members. members object Holds an array of references to the members of a Layer, each containing member IDs. data [object] The list of members who belong to this layer. id [_required_] string The unique user ID of the layer member. type [_required_] enum Members resource type. Allowed enum values: `members` default: `members` type [_required_] enum Layers resource type. 
Allowed enum values: `layers` default: `layers` Option 3 object Represents a single member entry in a schedule, referencing a specific user. id string The unique identifier for this schedule member. relationships object Defines relationships for a schedule member, primarily referencing a single user. user object Wraps the user data reference for a schedule member. data [_required_] object Points to the user data associated with this schedule member, including an ID and type. id [_required_] string The user's unique identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Schedule Members resource type. Allowed enum values: `members` default: `members` Option 4 object Represents a user object in the context of a schedule, including their `id`, type, and basic attributes. attributes object Provides basic user information for a schedule, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "name": "On-Call Schedule", "time_zone": "America/New_York" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "layers": { "data": [ { "id": "00000000-0000-0000-0000-000000000001", "type": "layers" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "schedules" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "effective_date": "2025-02-03T05:00:00Z", "end_date": "2025-12-31T00:00:00Z", "interval": { "days": 1 }, "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2025-02-01T00:00:00Z" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "members": { "data": [ { "id": "00000000-0000-0000-0000-000000000002", "type": "members" } ] } }, "type": "layers" }, { "id": "00000000-0000-0000-0000-000000000002", "relationships": { "user": { "data": { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } } }, "type": "members" }, { "attributes": { "email": "foo@bar.com", "name": "User 1" }, "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Create On-Call schedule returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/schedules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "layers": [ { "effective_date": "2021-11-01T11:11:11+00:00", "end_date": "2021-11-21T11:11:11+00:00", "interval": { "days": 1 }, "members": [ { "user": { "id": "string" } } ], "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2021-11-06T11:11:11+00:00" } ], "name": "Example-On-Call", "time_zone": "America/New_York" }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "schedules" } } EOF ``` ##### Create On-Call schedule returns "Created" response ``` // Create On-Call schedule returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.ScheduleCreateRequest{ Data: datadogV2.ScheduleCreateRequestData{ Attributes: datadogV2.ScheduleCreateRequestDataAttributes{ Layers: []datadogV2.ScheduleCreateRequestDataAttributesLayersItems{ { EffectiveDate: time.Now().AddDate(0, 0, -10), EndDate: datadog.PtrTime(time.Now().AddDate(0, 0, 10)), Interval: datadogV2.LayerAttributesInterval{ Days: datadog.PtrInt32(1), }, Members: []datadogV2.ScheduleRequestDataAttributesLayersItemsMembersItems{ { User: &datadogV2.ScheduleRequestDataAttributesLayersItemsMembersItemsUser{ Id: datadog.PtrString(UserDataID), }, }, }, Name: "Layer 1", Restrictions: []datadogV2.TimeRestriction{ { EndDay: datadogV2.WEEKDAY_FRIDAY.Ptr(), EndTime: datadog.PtrString("17:00:00"), StartDay: datadogV2.WEEKDAY_MONDAY.Ptr(), StartTime: datadog.PtrString("09:00:00"), }, }, RotationStart: time.Now().AddDate(0, 0, -5), }, }, Name: "Example-On-Call", TimeZone: "America/New_York", }, Relationships: &datadogV2.ScheduleCreateRequestDataRelationships{ Teams: &datadogV2.DataRelationshipsTeams{ Data: []datadogV2.DataRelationshipsTeamsDataItems{ { Id: DdTeamDataID, Type: 
datadogV2.DATARELATIONSHIPSTEAMSDATAITEMSTYPE_TEAMS, }, }, }, }, Type: datadogV2.SCHEDULECREATEREQUESTDATATYPE_SCHEDULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.CreateOnCallSchedule(ctx, body, *datadogV2.NewCreateOnCallScheduleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.CreateOnCallSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.CreateOnCallSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create On-Call schedule returns "Created" response ``` // Create On-Call schedule returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.DataRelationshipsTeams; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItems; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItemsType; import com.datadog.api.client.v2.model.LayerAttributesInterval; import com.datadog.api.client.v2.model.Schedule; import com.datadog.api.client.v2.model.ScheduleCreateRequest; import com.datadog.api.client.v2.model.ScheduleCreateRequestData; import com.datadog.api.client.v2.model.ScheduleCreateRequestDataAttributes; import com.datadog.api.client.v2.model.ScheduleCreateRequestDataAttributesLayersItems; import com.datadog.api.client.v2.model.ScheduleCreateRequestDataRelationships; import com.datadog.api.client.v2.model.ScheduleCreateRequestDataType; import com.datadog.api.client.v2.model.ScheduleRequestDataAttributesLayersItemsMembersItems; import com.datadog.api.client.v2.model.ScheduleRequestDataAttributesLayersItemsMembersItemsUser; import com.datadog.api.client.v2.model.TimeRestriction; import com.datadog.api.client.v2.model.Weekday; import java.time.OffsetDateTime; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); ScheduleCreateRequest body = new ScheduleCreateRequest() .data( new ScheduleCreateRequestData() .attributes( new ScheduleCreateRequestDataAttributes() .layers( Collections.singletonList( new ScheduleCreateRequestDataAttributesLayersItems() .effectiveDate(OffsetDateTime.now().plusDays(-10)) .endDate(OffsetDateTime.now().plusDays(10)) .interval(new LayerAttributesInterval().days(1)) .members( Collections.singletonList( new ScheduleRequestDataAttributesLayersItemsMembersItems() .user( new ScheduleRequestDataAttributesLayersItemsMembersItemsUser() .id(USER_DATA_ID)))) .name("Layer 1") .restrictions( Collections.singletonList( new TimeRestriction() .endDay(Weekday.FRIDAY) .endTime("17:00:00") 
.startDay(Weekday.MONDAY) .startTime("09:00:00"))) .rotationStart(OffsetDateTime.now().plusDays(-5)))) .name("Example-On-Call") .timeZone("America/New_York")) .relationships( new ScheduleCreateRequestDataRelationships() .teams( new DataRelationshipsTeams() .data( Collections.singletonList( new DataRelationshipsTeamsDataItems() .id(DD_TEAM_DATA_ID) .type(DataRelationshipsTeamsDataItemsType.TEAMS))))) .type(ScheduleCreateRequestDataType.SCHEDULES)); try { Schedule result = apiInstance.createOnCallSchedule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#createOnCallSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create On-Call schedule returns "Created" response ``` """ Create On-Call schedule returns "Created" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.data_relationships_teams import DataRelationshipsTeams from datadog_api_client.v2.model.data_relationships_teams_data_items import DataRelationshipsTeamsDataItems from datadog_api_client.v2.model.data_relationships_teams_data_items_type import DataRelationshipsTeamsDataItemsType from datadog_api_client.v2.model.layer_attributes_interval import LayerAttributesInterval from datadog_api_client.v2.model.schedule_create_request import ScheduleCreateRequest from datadog_api_client.v2.model.schedule_create_request_data import ScheduleCreateRequestData from datadog_api_client.v2.model.schedule_create_request_data_attributes import ScheduleCreateRequestDataAttributes from datadog_api_client.v2.model.schedule_create_request_data_attributes_layers_items import ( ScheduleCreateRequestDataAttributesLayersItems, ) from datadog_api_client.v2.model.schedule_create_request_data_relationships import ( ScheduleCreateRequestDataRelationships, ) from datadog_api_client.v2.model.schedule_create_request_data_type import ScheduleCreateRequestDataType from datadog_api_client.v2.model.schedule_request_data_attributes_layers_items_members_items import ( ScheduleRequestDataAttributesLayersItemsMembersItems, ) from datadog_api_client.v2.model.schedule_request_data_attributes_layers_items_members_items_user import ( ScheduleRequestDataAttributesLayersItemsMembersItemsUser, ) from datadog_api_client.v2.model.time_restriction import TimeRestriction from datadog_api_client.v2.model.weekday import Weekday # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = ScheduleCreateRequest( data=ScheduleCreateRequestData( attributes=ScheduleCreateRequestDataAttributes( layers=[ ScheduleCreateRequestDataAttributesLayersItems( effective_date=(datetime.now() + relativedelta(days=-10)), end_date=(datetime.now() + relativedelta(days=10)), 
interval=LayerAttributesInterval( days=1, ), members=[ ScheduleRequestDataAttributesLayersItemsMembersItems( user=ScheduleRequestDataAttributesLayersItemsMembersItemsUser( id=USER_DATA_ID, ), ), ], name="Layer 1", restrictions=[ TimeRestriction( end_day=Weekday.FRIDAY, end_time="17:00:00", start_day=Weekday.MONDAY, start_time="09:00:00", ), ], rotation_start=(datetime.now() + relativedelta(days=-5)), ), ], name="Example-On-Call", time_zone="America/New_York", ), relationships=ScheduleCreateRequestDataRelationships( teams=DataRelationshipsTeams( data=[ DataRelationshipsTeamsDataItems( id=DD_TEAM_DATA_ID, type=DataRelationshipsTeamsDataItemsType.TEAMS, ), ], ), ), type=ScheduleCreateRequestDataType.SCHEDULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.create_on_call_schedule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create On-Call schedule returns "Created" response ``` # Create On-Call schedule returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::ScheduleCreateRequest.new({ data: DatadogAPIClient::V2::ScheduleCreateRequestData.new({ attributes: DatadogAPIClient::V2::ScheduleCreateRequestDataAttributes.new({ layers: [ DatadogAPIClient::V2::ScheduleCreateRequestDataAttributesLayersItems.new({ effective_date: (Time.now + -10 * 86400), end_date: (Time.now + 10 * 86400), interval: DatadogAPIClient::V2::LayerAttributesInterval.new({ days: 1, }), members: [ DatadogAPIClient::V2::ScheduleRequestDataAttributesLayersItemsMembersItems.new({ user: DatadogAPIClient::V2::ScheduleRequestDataAttributesLayersItemsMembersItemsUser.new({ id: USER_DATA_ID, }), }), ], name: "Layer 1", restrictions: [ DatadogAPIClient::V2::TimeRestriction.new({ end_day: DatadogAPIClient::V2::Weekday::FRIDAY, end_time: "17:00:00", start_day: DatadogAPIClient::V2::Weekday::MONDAY, start_time: "09:00:00", }), ], rotation_start: (Time.now + -5 * 86400), }), ], name: "Example-On-Call", time_zone: "America/New_York", }), relationships: DatadogAPIClient::V2::ScheduleCreateRequestDataRelationships.new({ teams: DatadogAPIClient::V2::DataRelationshipsTeams.new({ data: [ DatadogAPIClient::V2::DataRelationshipsTeamsDataItems.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::DataRelationshipsTeamsDataItemsType::TEAMS, }), ], }), }), type: DatadogAPIClient::V2::ScheduleCreateRequestDataType::SCHEDULES, }), }) p api_instance.create_on_call_schedule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create On-Call schedule returns "Created" response ``` // Create On-Call schedule returns "Created" response use 
chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::CreateOnCallScheduleOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use datadog_api_client::datadogV2::model::DataRelationshipsTeams; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItems; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItemsType; use datadog_api_client::datadogV2::model::LayerAttributesInterval; use datadog_api_client::datadogV2::model::ScheduleCreateRequest; use datadog_api_client::datadogV2::model::ScheduleCreateRequestData; use datadog_api_client::datadogV2::model::ScheduleCreateRequestDataAttributes; use datadog_api_client::datadogV2::model::ScheduleCreateRequestDataAttributesLayersItems; use datadog_api_client::datadogV2::model::ScheduleCreateRequestDataRelationships; use datadog_api_client::datadogV2::model::ScheduleCreateRequestDataType; use datadog_api_client::datadogV2::model::ScheduleRequestDataAttributesLayersItemsMembersItems; use datadog_api_client::datadogV2::model::ScheduleRequestDataAttributesLayersItemsMembersItemsUser; use datadog_api_client::datadogV2::model::TimeRestriction; use datadog_api_client::datadogV2::model::Weekday; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = ScheduleCreateRequest::new( ScheduleCreateRequestData::new( ScheduleCreateRequestDataAttributes::new( vec![ScheduleCreateRequestDataAttributesLayersItems::new( DateTime::parse_from_rfc3339("2021-11-01T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), LayerAttributesInterval::new().days(1), vec![ ScheduleRequestDataAttributesLayersItemsMembersItems::new().user( ScheduleRequestDataAttributesLayersItemsMembersItemsUser::new() .id(user_data_id.clone()), ), ], "Layer 1".to_string(), DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .end_date( DateTime::parse_from_rfc3339("2021-11-21T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .restrictions(vec![TimeRestriction::new() .end_day(Weekday::FRIDAY) .end_time("17:00:00".to_string()) .start_day(Weekday::MONDAY) .start_time("09:00:00".to_string())])], "Example-On-Call".to_string(), "America/New_York".to_string(), ), ScheduleCreateRequestDataType::SCHEDULES, ) .relationships(ScheduleCreateRequestDataRelationships::new().teams( DataRelationshipsTeams::new().data(vec![DataRelationshipsTeamsDataItems::new( dd_team_data_id.clone(), DataRelationshipsTeamsDataItemsType::TEAMS, )]), )), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .create_on_call_schedule(body, CreateOnCallScheduleOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create On-Call schedule returns "Created" response ``` /** * Create On-Call schedule returns "Created" response 
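 * Requires the USER_DATA_ID and DD_TEAM_DATA_ID environment variables to reference an existing user and Datadog team.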
*/ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.OnCallApiCreateOnCallScheduleRequest = { body: { data: { attributes: { layers: [ { effectiveDate: new Date(new Date().getTime() + -10 * 86400 * 1000), endDate: new Date(new Date().getTime() + 10 * 86400 * 1000), interval: { days: 1, }, members: [ { user: { id: USER_DATA_ID, }, }, ], name: "Layer 1", restrictions: [ { endDay: "friday", endTime: "17:00:00", startDay: "monday", startTime: "09:00:00", }, ], rotationStart: new Date(new Date().getTime() + -5 * 86400 * 1000), }, ], name: "Example-On-Call", timeZone: "America/New_York", }, relationships: { teams: { data: [ { id: DD_TEAM_DATA_ID, type: "teams", }, ], }, }, type: "schedules", }, }, }; apiInstance .createOnCallSchedule(params) .then((data: v2.Schedule) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get On-Call schedule](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-schedule-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.ap2.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.eu/api/v2/on-call/schedules/{schedule_id}https://api.ddog-gov.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us3.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us5.datadoghq.com/api/v2/on-call/schedules/{schedule_id} ### Overview Get an On-Call schedule This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description schedule_id [_required_] string The ID of the schedule #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `layers`, `layers.members`, `layers.members.user`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallSchedule-200-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallSchedule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Top-level container for a schedule object, including both the `data` payload and any related `included` resources (such as teams, layers, or members). Field Type Description data object Represents the primary data object for a schedule, linking attributes and relationships. 
attributes object Provides core properties of a schedule object such as its name and time zone. name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` included [ ] Any additional resources related to this schedule, such as teams and layers. Option 1 object Provides a reference to a team, including ID, type, and basic attributes/relationships. attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` Option 2 object Encapsulates a layer resource, holding attributes like rotation details, plus relationships to the members covering that layer. attributes object Describes key properties of a Layer, including rotation details, name, start/end times, and any restrictions. effective_date date-time When the layer becomes active (ISO 8601). end_date date-time When the layer ceases to be active (ISO 8601). interval object Defines how often the rotation repeats, using a combination of days and optional seconds. Should be at least 1 hour. days int32 The number of days in each rotation cycle. seconds int64 Any additional seconds for the rotation cycle (up to 30 days). name string The name of this layer. restrictions [object] An optional list of time restrictions for when this layer is in effect. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. rotation_start date-time The date/time when the rotation starts (ISO 8601). time_zone string The time zone for this layer. id string A unique identifier for this layer. relationships object Holds references to objects related to the Layer entity, such as its members. members object Holds an array of references to the members of a Layer, each containing member IDs. data [object] The list of members who belong to this layer. id [_required_] string The unique user ID of the layer member. type [_required_] enum Members resource type. Allowed enum values: `members` default: `members` type [_required_] enum Layers resource type. 
Allowed enum values: `layers` default: `layers` Option 3 object Represents a single member entry in a schedule, referencing a specific user. id string The unique identifier for this schedule member. relationships object Defines relationships for a schedule member, primarily referencing a single user. user object Wraps the user data reference for a schedule member. data [_required_] object Points to the user data associated with this schedule member, including an ID and type. id [_required_] string The user's unique identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Schedule Members resource type. Allowed enum values: `members` default: `members` Option 4 object Represents a user object in the context of a schedule, including their `id`, type, and basic attributes. attributes object Provides basic user information for a schedule, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "name": "On-Call Schedule", "time_zone": "America/New_York" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "layers": { "data": [ { "id": "00000000-0000-0000-0000-000000000001", "type": "layers" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "schedules" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "effective_date": "2025-02-03T05:00:00Z", "end_date": "2025-12-31T00:00:00Z", "interval": { "days": 1 }, "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2025-02-01T00:00:00Z" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "members": { "data": [ { "id": "00000000-0000-0000-0000-000000000002", "type": "members" } ] } }, "type": "layers" }, { "id": "00000000-0000-0000-0000-000000000002", "relationships": { "user": { "data": { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } } }, "type": "members" }, { "attributes": { "email": "foo@bar.com", "name": "User 1" }, "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get On-Call schedule Copy ``` # Path parameters export schedule_id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/schedules/${schedule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get On-Call schedule ``` """ Get On-Call schedule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "schedule" in the system SCHEDULE_DATA_ID = environ["SCHEDULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_on_call_schedule( schedule_id=SCHEDULE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get On-Call schedule ``` # Get On-Call schedule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "schedule" in the system SCHEDULE_DATA_ID = ENV["SCHEDULE_DATA_ID"] p api_instance.get_on_call_schedule(SCHEDULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get On-Call schedule ``` // Get On-Call schedule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "schedule" in the system ScheduleDataID := os.Getenv("SCHEDULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetOnCallSchedule(ctx, ScheduleDataID, 
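// Optional query parameters (such as `include`) are left at their defaults here.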
*datadogV2.NewGetOnCallScheduleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.GetOnCallSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetOnCallSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get On-Call schedule ``` // Get On-Call schedule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.Schedule; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "schedule" in the system String SCHEDULE_DATA_ID = System.getenv("SCHEDULE_DATA_ID"); try { Schedule result = apiInstance.getOnCallSchedule(SCHEDULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getOnCallSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get On-Call schedule ``` // Get On-Call schedule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::GetOnCallScheduleOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "schedule" in the system let schedule_data_id = std::env::var("SCHEDULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_on_call_schedule( schedule_data_id.clone(), GetOnCallScheduleOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get On-Call schedule ``` /** * Get On-Call schedule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "schedule" in the system const SCHEDULE_DATA_ID = process.env.SCHEDULE_DATA_ID as string; const params: v2.OnCallApiGetOnCallScheduleRequest = { scheduleId: 
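// scheduleId fills the {schedule_id} path parameter of GET /api/v2/on-call/schedules/{schedule_id}.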
SCHEDULE_DATA_ID, }; apiInstance .getOnCallSchedule(params) .then((data: v2.Schedule) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete On-Call schedule](https://docs.datadoghq.com/api/latest/on-call/#delete-on-call-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#delete-on-call-schedule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.ap2.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.eu/api/v2/on-call/schedules/{schedule_id}https://api.ddog-gov.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us3.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us5.datadoghq.com/api/v2/on-call/schedules/{schedule_id} ### Overview Delete an On-Call schedule This endpoint requires the `on_call_write` permission. ### Arguments #### Path Parameters Name Type Description schedule_id [_required_] string The ID of the schedule ### Response * [204](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallSchedule-204-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallSchedule-429-v2) No Content Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Delete On-Call schedule Copy ``` # Path parameters export schedule_id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/schedules/${schedule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete On-Call schedule ``` """ Delete On-Call schedule returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "schedule" in the system SCHEDULE_DATA_ID = environ["SCHEDULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) api_instance.delete_on_call_schedule( schedule_id=SCHEDULE_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete On-Call schedule ``` # Delete On-Call schedule returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "schedule" in the system SCHEDULE_DATA_ID = ENV["SCHEDULE_DATA_ID"] api_instance.delete_on_call_schedule(SCHEDULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete On-Call schedule ``` // Delete On-Call schedule returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "schedule" in the system ScheduleDataID := os.Getenv("SCHEDULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) r, err := api.DeleteOnCallSchedule(ctx, ScheduleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.DeleteOnCallSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete On-Call schedule ``` // Delete On-Call schedule returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "schedule" in the system String SCHEDULE_DATA_ID = System.getenv("SCHEDULE_DATA_ID"); try { apiInstance.deleteOnCallSchedule(SCHEDULE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#deleteOnCallSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete On-Call schedule ``` // Delete On-Call schedule returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "schedule" in the system let schedule_data_id = std::env::var("SCHEDULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api.delete_on_call_schedule(schedule_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete On-Call schedule ``` /** * Delete On-Call schedule returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "schedule" in the system const SCHEDULE_DATA_ID = process.env.SCHEDULE_DATA_ID as string; const params: v2.OnCallApiDeleteOnCallScheduleRequest = { scheduleId: SCHEDULE_DATA_ID, }; apiInstance .deleteOnCallSchedule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update On-Call schedule](https://docs.datadoghq.com/api/latest/on-call/#update-on-call-schedule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#update-on-call-schedule-v2) PUT https://api.ap1.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.ap2.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.eu/api/v2/on-call/schedules/{schedule_id}https://api.ddog-gov.com/api/v2/on-call/schedules/{schedule_id}https://api.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us3.datadoghq.com/api/v2/on-call/schedules/{schedule_id}https://api.us5.datadoghq.com/api/v2/on-call/schedules/{schedule_id} ### Overview Update an On-Call schedule This endpoint requires the `on_call_write` permission. ### Arguments #### Path Parameters Name Type Description schedule_id [_required_] string The ID of the schedule #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `layers`, `layers.members`, `layers.members.user`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data [_required_] object Contains all data needed to update an existing schedule, including its attributes (such as name and time zone) and any relationships to teams. attributes [_required_] object Defines the updatable attributes for a schedule, such as name, time zone, and layers. layers [_required_] [object] The updated list of layers (rotations) for this schedule. effective_date [_required_] date-time When this updated layer takes effect (ISO 8601 format). end_date date-time When this updated layer should stop being active (ISO 8601 format). id string A unique identifier for the layer being updated. interval [_required_] object Defines how often the rotation repeats, using a combination of days and optional seconds. Should be at least 1 hour. days int32 The number of days in each rotation cycle. seconds int64 Any additional seconds for the rotation cycle (up to 30 days). members [_required_] [object] The members assigned to this layer. user object Identifies the user participating in this layer as a single object with an `id`. id string The user's ID. name [_required_] string The name for this layer (for example, "Secondary Coverage"). restrictions [object] Any time restrictions that define when this layer is active. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. rotation_start [_required_] date-time The date/time at which the rotation begins (ISO 8601 format). time_zone string The time zone for this layer. name [_required_] string A short name for the schedule. 
time_zone [_required_] string The time zone used when interpreting rotation times. id [_required_] string The ID of the schedule to be updated. relationships object Houses relationships for the schedule update, typically referencing teams. teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` ``` { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "attributes": { "layers": [ { "id": "00000000-0000-0000-0000-000000000001", "effective_date": "2021-11-01T11:11:11+00:00", "end_date": "2021-11-21T11:11:11+00:00", "interval": { "seconds": 3600 }, "members": [ { "user": { "id": "string" } } ], "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2021-11-06T11:11:11+00:00" } ], "name": "Example-On-Call", "time_zone": "America/New_York" }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "schedules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallSchedule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Top-level container for a schedule object, including both the `data` payload and any related `included` resources (such as teams, layers, or members). Field Type Description data object Represents the primary data object for a schedule, linking attributes and relationships. attributes object Provides core properties of a schedule object such as its name and time zone. name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` included [ ] Any additional resources related to this schedule, such as teams and layers. Option 1 object Provides a reference to a team, including ID, type, and basic attributes/relationships. 
attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` Option 2 object Encapsulates a layer resource, holding attributes like rotation details, plus relationships to the members covering that layer. attributes object Describes key properties of a Layer, including rotation details, name, start/end times, and any restrictions. effective_date date-time When the layer becomes active (ISO 8601). end_date date-time When the layer ceases to be active (ISO 8601). interval object Defines how often the rotation repeats, using a combination of days and optional seconds. Should be at least 1 hour. days int32 The number of days in each rotation cycle. seconds int64 Any additional seconds for the rotation cycle (up to 30 days). name string The name of this layer. restrictions [object] An optional list of time restrictions for when this layer is in effect. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. rotation_start date-time The date/time when the rotation starts (ISO 8601). time_zone string The time zone for this layer. id string A unique identifier for this layer. relationships object Holds references to objects related to the Layer entity, such as its members. members object Holds an array of references to the members of a Layer, each containing member IDs. data [object] The list of members who belong to this layer. id [_required_] string The unique user ID of the layer member. type [_required_] enum Members resource type. Allowed enum values: `members` default: `members` type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` Option 3 object Represents a single member entry in a schedule, referencing a specific user. id string The unique identifier for this schedule member. relationships object Defines relationships for a schedule member, primarily referencing a single user. user object Wraps the user data reference for a schedule member. data [_required_] object Points to the user data associated with this schedule member, including an ID and type. id [_required_] string The user's unique identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Schedule Members resource type. Allowed enum values: `members` default: `members` Option 4 object Represents a user object in the context of a schedule, including their `id`, type, and basic attributes. attributes object Provides basic user information for a schedule, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "name": "On-Call Schedule", "time_zone": "America/New_York" }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "relationships": { "layers": { "data": [ { "id": "00000000-0000-0000-0000-000000000001", "type": "layers" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "schedules" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "effective_date": "2025-02-03T05:00:00Z", "end_date": "2025-12-31T00:00:00Z", "interval": { "days": 1 }, "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2025-02-01T00:00:00Z" }, "id": "00000000-0000-0000-0000-000000000001", "relationships": { "members": { "data": [ { "id": "00000000-0000-0000-0000-000000000002", "type": "members" } ] } }, "type": "layers" }, { "id": "00000000-0000-0000-0000-000000000002", "relationships": { "user": { "data": { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } } }, "type": "members" }, { "attributes": { "email": "foo@bar.com", "name": "User 1" }, "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Update On-Call schedule returns "OK" response Copy ``` # Path parameters export schedule_id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/schedules/${schedule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "attributes": { "layers": [ { "id": "00000000-0000-0000-0000-000000000001", "effective_date": "2021-11-01T11:11:11+00:00", "end_date": "2021-11-21T11:11:11+00:00", "interval": { "seconds": 3600 }, "members": [ { "user": { "id": "string" } } ], "name": "Layer 1", "restrictions": [ { "end_day": "friday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" } ], "rotation_start": "2021-11-06T11:11:11+00:00" } ], "name": "Example-On-Call", "time_zone": "America/New_York" }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "schedules" } } EOF ``` ##### Update On-Call schedule returns "OK" response ``` // Update On-Call schedule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "schedule" in the system ScheduleDataID := os.Getenv("SCHEDULE_DATA_ID") ScheduleDataRelationshipsLayersData0ID := os.Getenv("SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID") // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.ScheduleUpdateRequest{ Data: datadogV2.ScheduleUpdateRequestData{ Id: ScheduleDataID, Attributes: datadogV2.ScheduleUpdateRequestDataAttributes{ Layers: []datadogV2.ScheduleUpdateRequestDataAttributesLayersItems{ { Id: datadog.PtrString(ScheduleDataRelationshipsLayersData0ID), EffectiveDate: time.Now().AddDate(0, 0, -10), EndDate: datadog.PtrTime(time.Now().AddDate(0, 0, 10)), Interval: datadogV2.LayerAttributesInterval{ Seconds: datadog.PtrInt64(3600), }, Members: []datadogV2.ScheduleRequestDataAttributesLayersItemsMembersItems{ { User: &datadogV2.ScheduleRequestDataAttributesLayersItemsMembersItemsUser{ Id: datadog.PtrString(UserDataID), }, }, }, Name: "Layer 1", Restrictions: []datadogV2.TimeRestriction{ { EndDay: datadogV2.WEEKDAY_FRIDAY.Ptr(), EndTime: datadog.PtrString("17:00:00"), StartDay: datadogV2.WEEKDAY_MONDAY.Ptr(), StartTime: datadog.PtrString("09:00:00"), }, }, RotationStart: time.Now().AddDate(0, 0, -5), }, }, Name: "Example-On-Call", TimeZone: "America/New_York", }, Relationships: 
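// Associate the schedule with the team referenced by DD_TEAM_DATA_ID.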
&datadogV2.ScheduleUpdateRequestDataRelationships{ Teams: &datadogV2.DataRelationshipsTeams{ Data: []datadogV2.DataRelationshipsTeamsDataItems{ { Id: DdTeamDataID, Type: datadogV2.DATARELATIONSHIPSTEAMSDATAITEMSTYPE_TEAMS, }, }, }, }, Type: datadogV2.SCHEDULEUPDATEREQUESTDATATYPE_SCHEDULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.UpdateOnCallSchedule(ctx, ScheduleDataID, body, *datadogV2.NewUpdateOnCallScheduleOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.UpdateOnCallSchedule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.UpdateOnCallSchedule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update On-Call schedule returns "OK" response ``` // Update On-Call schedule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.DataRelationshipsTeams; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItems; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItemsType; import com.datadog.api.client.v2.model.LayerAttributesInterval; import com.datadog.api.client.v2.model.Schedule; import com.datadog.api.client.v2.model.ScheduleRequestDataAttributesLayersItemsMembersItems; import com.datadog.api.client.v2.model.ScheduleRequestDataAttributesLayersItemsMembersItemsUser; import com.datadog.api.client.v2.model.ScheduleUpdateRequest; import com.datadog.api.client.v2.model.ScheduleUpdateRequestData; import com.datadog.api.client.v2.model.ScheduleUpdateRequestDataAttributes; import com.datadog.api.client.v2.model.ScheduleUpdateRequestDataAttributesLayersItems; import com.datadog.api.client.v2.model.ScheduleUpdateRequestDataRelationships; import com.datadog.api.client.v2.model.ScheduleUpdateRequestDataType; import com.datadog.api.client.v2.model.TimeRestriction; import com.datadog.api.client.v2.model.Weekday; import java.time.OffsetDateTime; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "schedule" in the system String SCHEDULE_DATA_ID = System.getenv("SCHEDULE_DATA_ID"); String SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID = System.getenv("SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID"); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); ScheduleUpdateRequest body = new ScheduleUpdateRequest() .data( new ScheduleUpdateRequestData() .id(SCHEDULE_DATA_ID) .attributes( new ScheduleUpdateRequestDataAttributes() .layers( Collections.singletonList( new ScheduleUpdateRequestDataAttributesLayersItems() 
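// Identify the existing layer by its ID, shift its window relative to now, and switch to an hourly interval (3600 seconds).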
.id(SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID) .effectiveDate(OffsetDateTime.now().plusDays(-10)) .endDate(OffsetDateTime.now().plusDays(10)) .interval(new LayerAttributesInterval().seconds(3600L)) .members( Collections.singletonList( new ScheduleRequestDataAttributesLayersItemsMembersItems() .user( new ScheduleRequestDataAttributesLayersItemsMembersItemsUser() .id(USER_DATA_ID)))) .name("Layer 1") .restrictions( Collections.singletonList( new TimeRestriction() .endDay(Weekday.FRIDAY) .endTime("17:00:00") .startDay(Weekday.MONDAY) .startTime("09:00:00"))) .rotationStart(OffsetDateTime.now().plusDays(-5)))) .name("Example-On-Call") .timeZone("America/New_York")) .relationships( new ScheduleUpdateRequestDataRelationships() .teams( new DataRelationshipsTeams() .data( Collections.singletonList( new DataRelationshipsTeamsDataItems() .id(DD_TEAM_DATA_ID) .type(DataRelationshipsTeamsDataItemsType.TEAMS))))) .type(ScheduleUpdateRequestDataType.SCHEDULES)); try { Schedule result = apiInstance.updateOnCallSchedule(SCHEDULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#updateOnCallSchedule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update On-Call schedule returns "OK" response ``` """ Update On-Call schedule returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.data_relationships_teams import DataRelationshipsTeams from datadog_api_client.v2.model.data_relationships_teams_data_items import DataRelationshipsTeamsDataItems from datadog_api_client.v2.model.data_relationships_teams_data_items_type import DataRelationshipsTeamsDataItemsType from datadog_api_client.v2.model.layer_attributes_interval import LayerAttributesInterval from datadog_api_client.v2.model.schedule_request_data_attributes_layers_items_members_items import ( ScheduleRequestDataAttributesLayersItemsMembersItems, ) from datadog_api_client.v2.model.schedule_request_data_attributes_layers_items_members_items_user import ( ScheduleRequestDataAttributesLayersItemsMembersItemsUser, ) from datadog_api_client.v2.model.schedule_update_request import ScheduleUpdateRequest from datadog_api_client.v2.model.schedule_update_request_data import ScheduleUpdateRequestData from datadog_api_client.v2.model.schedule_update_request_data_attributes import ScheduleUpdateRequestDataAttributes from datadog_api_client.v2.model.schedule_update_request_data_attributes_layers_items import ( ScheduleUpdateRequestDataAttributesLayersItems, ) from datadog_api_client.v2.model.schedule_update_request_data_relationships import ( ScheduleUpdateRequestDataRelationships, ) from datadog_api_client.v2.model.schedule_update_request_data_type import ScheduleUpdateRequestDataType from datadog_api_client.v2.model.time_restriction import 
TimeRestriction from datadog_api_client.v2.model.weekday import Weekday # there is a valid "schedule" in the system SCHEDULE_DATA_ID = environ["SCHEDULE_DATA_ID"] SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID = environ["SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = ScheduleUpdateRequest( data=ScheduleUpdateRequestData( id=SCHEDULE_DATA_ID, attributes=ScheduleUpdateRequestDataAttributes( layers=[ ScheduleUpdateRequestDataAttributesLayersItems( id=SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID, effective_date=(datetime.now() + relativedelta(days=-10)), end_date=(datetime.now() + relativedelta(days=10)), interval=LayerAttributesInterval( seconds=3600, ), members=[ ScheduleRequestDataAttributesLayersItemsMembersItems( user=ScheduleRequestDataAttributesLayersItemsMembersItemsUser( id=USER_DATA_ID, ), ), ], name="Layer 1", restrictions=[ TimeRestriction( end_day=Weekday.FRIDAY, end_time="17:00:00", start_day=Weekday.MONDAY, start_time="09:00:00", ), ], rotation_start=(datetime.now() + relativedelta(days=-5)), ), ], name="Example-On-Call", time_zone="America/New_York", ), relationships=ScheduleUpdateRequestDataRelationships( teams=DataRelationshipsTeams( data=[ DataRelationshipsTeamsDataItems( id=DD_TEAM_DATA_ID, type=DataRelationshipsTeamsDataItemsType.TEAMS, ), ], ), ), type=ScheduleUpdateRequestDataType.SCHEDULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.update_on_call_schedule(schedule_id=SCHEDULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update On-Call schedule returns "OK" response ``` # Update On-Call schedule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "schedule" in the system SCHEDULE_DATA_ID = ENV["SCHEDULE_DATA_ID"] SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID = ENV["SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::ScheduleUpdateRequest.new({ data: DatadogAPIClient::V2::ScheduleUpdateRequestData.new({ id: SCHEDULE_DATA_ID, attributes: DatadogAPIClient::V2::ScheduleUpdateRequestDataAttributes.new({ layers: [ DatadogAPIClient::V2::ScheduleUpdateRequestDataAttributesLayersItems.new({ id: SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID, effective_date: (Time.now + -10 * 86400), end_date: (Time.now + 10 * 86400), interval: DatadogAPIClient::V2::LayerAttributesInterval.new({ seconds: 3600, }), members: [ DatadogAPIClient::V2::ScheduleRequestDataAttributesLayersItemsMembersItems.new({ user: DatadogAPIClient::V2::ScheduleRequestDataAttributesLayersItemsMembersItemsUser.new({ id: USER_DATA_ID, }), }), ], name: "Layer 1", restrictions: [ DatadogAPIClient::V2::TimeRestriction.new({ end_day: DatadogAPIClient::V2::Weekday::FRIDAY, end_time: "17:00:00", start_day: DatadogAPIClient::V2::Weekday::MONDAY, 
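# Together with end_day/end_time above, this restriction limits the layer to Monday-Friday, 09:00-17:00.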
start_time: "09:00:00", }), ], rotation_start: (Time.now + -5 * 86400), }), ], name: "Example-On-Call", time_zone: "America/New_York", }), relationships: DatadogAPIClient::V2::ScheduleUpdateRequestDataRelationships.new({ teams: DatadogAPIClient::V2::DataRelationshipsTeams.new({ data: [ DatadogAPIClient::V2::DataRelationshipsTeamsDataItems.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::DataRelationshipsTeamsDataItemsType::TEAMS, }), ], }), }), type: DatadogAPIClient::V2::ScheduleUpdateRequestDataType::SCHEDULES, }), }) p api_instance.update_on_call_schedule(SCHEDULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update On-Call schedule returns "OK" response ``` // Update On-Call schedule returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use datadog_api_client::datadogV2::api_on_call::UpdateOnCallScheduleOptionalParams; use datadog_api_client::datadogV2::model::DataRelationshipsTeams; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItems; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItemsType; use datadog_api_client::datadogV2::model::LayerAttributesInterval; use datadog_api_client::datadogV2::model::ScheduleRequestDataAttributesLayersItemsMembersItems; use datadog_api_client::datadogV2::model::ScheduleRequestDataAttributesLayersItemsMembersItemsUser; use datadog_api_client::datadogV2::model::ScheduleUpdateRequest; use datadog_api_client::datadogV2::model::ScheduleUpdateRequestData; use datadog_api_client::datadogV2::model::ScheduleUpdateRequestDataAttributes; use datadog_api_client::datadogV2::model::ScheduleUpdateRequestDataAttributesLayersItems; use datadog_api_client::datadogV2::model::ScheduleUpdateRequestDataRelationships; use datadog_api_client::datadogV2::model::ScheduleUpdateRequestDataType; use datadog_api_client::datadogV2::model::TimeRestriction; use datadog_api_client::datadogV2::model::Weekday; #[tokio::main] async fn main() { // there is a valid "schedule" in the system let schedule_data_id = std::env::var("SCHEDULE_DATA_ID").unwrap(); let schedule_data_relationships_layers_data_0_id = std::env::var("SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = ScheduleUpdateRequest::new( ScheduleUpdateRequestData::new( ScheduleUpdateRequestDataAttributes::new( vec![ScheduleUpdateRequestDataAttributesLayersItems::new( DateTime::parse_from_rfc3339("2021-11-01T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), LayerAttributesInterval::new().seconds(3600), vec![ ScheduleRequestDataAttributesLayersItemsMembersItems::new().user( ScheduleRequestDataAttributesLayersItemsMembersItemsUser::new() .id(user_data_id.clone()), ), ], "Layer 1".to_string(), DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .end_date( DateTime::parse_from_rfc3339("2021-11-21T11:11:11+00:00") .expect("Failed to parse datetime") 
.with_timezone(&Utc), ) .id(schedule_data_relationships_layers_data_0_id.clone()) .restrictions(vec![TimeRestriction::new() .end_day(Weekday::FRIDAY) .end_time("17:00:00".to_string()) .start_day(Weekday::MONDAY) .start_time("09:00:00".to_string())])], "Example-On-Call".to_string(), "America/New_York".to_string(), ), schedule_data_id.clone(), ScheduleUpdateRequestDataType::SCHEDULES, ) .relationships(ScheduleUpdateRequestDataRelationships::new().teams( DataRelationshipsTeams::new().data(vec![DataRelationshipsTeamsDataItems::new( dd_team_data_id.clone(), DataRelationshipsTeamsDataItemsType::TEAMS, )]), )), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .update_on_call_schedule( schedule_data_id.clone(), body, UpdateOnCallScheduleOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update On-Call schedule returns "OK" response ``` /** * Update On-Call schedule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "schedule" in the system const SCHEDULE_DATA_ID = process.env.SCHEDULE_DATA_ID as string; const SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID = process.env .SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.OnCallApiUpdateOnCallScheduleRequest = { body: { data: { id: SCHEDULE_DATA_ID, attributes: { layers: [ { id: SCHEDULE_DATA_RELATIONSHIPS_LAYERS_DATA_0_ID, effectiveDate: new Date(new Date().getTime() + -10 * 86400 * 1000), endDate: new Date(new Date().getTime() + 10 * 86400 * 1000), interval: { seconds: 3600, }, members: [ { user: { id: USER_DATA_ID, }, }, ], name: "Layer 1", restrictions: [ { endDay: "friday", endTime: "17:00:00", startDay: "monday", startTime: "09:00:00", }, ], rotationStart: new Date(new Date().getTime() + -5 * 86400 * 1000), }, ], name: "Example-On-Call", timeZone: "America/New_York", }, relationships: { teams: { data: [ { id: DD_TEAM_DATA_ID, type: "teams", }, ], }, }, type: "schedules", }, }, scheduleId: SCHEDULE_DATA_ID, }; apiInstance .updateOnCallSchedule(params) .then((data: v2.Schedule) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com,
# us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Create On-Call escalation policy](https://docs.datadoghq.com/api/latest/on-call/#create-on-call-escalation-policy)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#create-on-call-escalation-policy-v2)

POST https://api.ap1.datadoghq.com/api/v2/on-call/escalation-policies
https://api.ap2.datadoghq.com/api/v2/on-call/escalation-policies
https://api.datadoghq.eu/api/v2/on-call/escalation-policies
https://api.ddog-gov.com/api/v2/on-call/escalation-policies
https://api.datadoghq.com/api/v2/on-call/escalation-policies
https://api.us3.datadoghq.com/api/v2/on-call/escalation-policies
https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies

### Overview

Create a new On-Call escalation policy. This endpoint requires the `on_call_write` permission.

### Arguments

#### Query Strings

Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `steps`, `steps.targets`.

### Request

#### Body Data (required)

* [Model](https://docs.datadoghq.com/api/latest/on-call/)
* [Example](https://docs.datadoghq.com/api/latest/on-call/)

Field Type Description data [_required_] object Represents the data for creating an escalation policy, including its attributes, relationships, and resource type. attributes [_required_] object Defines the attributes for creating an escalation policy, including its description, name, resolution behavior, retries, and steps. name [_required_] string Specifies the name for the new escalation policy. resolve_page_on_policy_end boolean Indicates whether the page is automatically resolved when the policy ends. retries int64 Specifies how many times the escalation sequence is retried if there is no response. steps [_required_] [object] A list of escalation steps, each defining assignment, escalation timeout, and targets for the new policy. assignment enum Specifies how this escalation step will assign targets (example `default` or `round-robin`). Allowed enum values: `default,round-robin` escalate_after_seconds int64 Defines how many seconds to wait before escalating to the next step. targets [_required_] [object] Specifies the collection of escalation targets for this step. config object Configuration for an escalation target, such as schedule position. schedule object Schedule-specific configuration for an escalation target. position enum Specifies the position of a schedule target (example `previous`, `current`, or `next`). Allowed enum values: `previous,current,next` id string Specifies the unique identifier for this target. type enum Specifies the type of escalation target (example `users`, `schedules`, or `teams`). Allowed enum values: `users,schedules,teams` relationships object Represents relationships in an escalation policy creation request, including references to teams. teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type.
Allowed enum values: `teams` default: `teams` type [_required_] enum Indicates that the resource is of type `policies`. Allowed enum values: `policies` default: `policies` ``` { "data": { "attributes": { "name": "Example-On-Call", "resolve_page_on_policy_end": true, "retries": 2, "steps": [ { "assignment": "default", "escalate_after_seconds": 3600, "targets": [ { "id": "string", "type": "users" }, { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "schedules" }, { "config": { "schedule": { "position": "previous" } }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "schedules" }, { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] }, { "assignment": "round-robin", "escalate_after_seconds": 3600, "targets": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } ] }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "policies" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallEscalationPolicy-201-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallEscalationPolicy-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallEscalationPolicy-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallEscalationPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#CreateOnCallEscalationPolicy-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Represents a complete escalation policy response, including policy data and optionally included related resources. Field Type Description data object Represents the data for a single escalation policy, including its attributes, ID, relationships, and resource type. attributes object Defines the main attributes of an escalation policy, such as its name and behavior on policy end. name [_required_] string Specifies the name of the escalation policy. resolve_page_on_policy_end boolean Indicates whether the page is automatically resolved when the policy ends. retries int64 Specifies how many times the escalation sequence is retried if there is no response. id string Specifies the unique identifier of the escalation policy. relationships object Represents the relationships for an escalation policy, including references to steps and teams. steps [_required_] object Defines the relationship to a collection of steps within an escalation policy. Contains an array of step data references. data [object] An array of references to the steps defined in this escalation policy. id [_required_] string Specifies the unique identifier for the step resource. type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Indicates that the resource is of type `policies`. Allowed enum values: `policies` default: `policies` included [ ] Provides any included related resources, such as steps or targets, returned with the policy. Option 1 object Represents a single step in an escalation policy, including its attributes, relationships, and resource type. 
attributes object Defines attributes for an escalation policy step, such as assignment strategy and escalation timeout. assignment enum Specifies how this escalation step will assign targets (example `default` or `round-robin`). Allowed enum values: `default,round-robin` escalate_after_seconds int64 Specifies how many seconds to wait before escalating to the next step. id string Specifies the unique identifier of this escalation policy step. relationships object Represents the relationship of an escalation policy step to its targets. targets object A list of escalation targets for a step data [ ] The `EscalationTargets` `data`. Option 1 object Represents a team target for an escalation policy step, including the team's ID and resource type. id [_required_] string Specifies the unique identifier of the team resource. type [_required_] enum Indicates that the resource is of type `teams`. Allowed enum values: `teams` default: `teams` Option 2 object Represents a user target for an escalation policy step, including the user's ID and resource type. id [_required_] string Specifies the unique identifier of the user resource. type [_required_] enum Indicates that the resource is of type `users`. Allowed enum values: `users` default: `users` Option 3 object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` Option 4 object Relationship reference to a configured schedule target. id [_required_] string Specifies the unique identifier of the configured schedule target. type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` Option 2 object Represents a user object in the context of an escalation policy, including their `id`, type, and basic attributes. attributes object Provides basic user information for an escalation policy, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` Option 3 object Represents the primary data object for a schedule, linking attributes and relationships. attributes object Provides core properties of a schedule object such as its name and time zone. name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. 
id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` Option 4 object Full resource representation of a configured schedule target with position (previous, current, or next). attributes [_required_] object Attributes for a configured schedule target, including position. position [_required_] enum Specifies the position of a schedule target (example `previous`, `current`, or `next`). Allowed enum values: `previous,current,next` id [_required_] string Specifies the unique identifier of the configured schedule target. relationships [_required_] object Represents the relationships of a configured schedule target. schedule [_required_] object Holds the schedule reference for a configured schedule target. data [_required_] object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` Option 5 object Provides a reference to a team, including ID, type, and basic attributes/relationships. attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. 
Allowed enum values: `teams` default: `teams` ``` { "data": { "attributes": { "name": "Escalation Policy 1", "resolve_page_on_policy_end": true, "retries": 2 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "steps": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "steps" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "policies" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "assignment": "default", "escalate_after_seconds": 3600 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "targets": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "id": "00000000-aba2-0000-0000-000000000000_previous", "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } }, "type": "steps" }, { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "attributes": { "position": "previous" }, "id": "00000000-aba2-0000-0000-000000000000_previous", "relationships": { "schedule": { "data": { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" } } }, "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
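The `included` array in the 201 example above (steps, users, schedules, schedule targets, and teams) is only populated when the request opts in through the `include` query string described under Arguments, which is why the SDK examples below pass `include="steps.targets"`. As a rough sketch only, a raw request with the query string attached might look like the following; the `api.datadoghq.com` site, the exported `DD_API_KEY`/`DD_APP_KEY` variables, and the single-step body are illustrative assumptions, and the full documented request body appears in the example above:

```
# Sketch (not the canonical example): create a policy and ask for the related
# step and target resources to be returned in the response's "included" array.
curl -X POST "https://api.datadoghq.com/api/v2/on-call/escalation-policies?include=steps,steps.targets" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "type": "policies",
    "attributes": {
      "name": "Example-On-Call",
      "steps": [
        {
          "assignment": "default",
          "escalate_after_seconds": 3600,
          "targets": [ { "id": "<user-id>", "type": "users" } ]
        }
      ]
    }
  }
}
EOF
```

The 429 "Too many requests" response that follows, like the other error responses for this endpoint, is a plain list of error strings.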
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Create On-Call escalation policy returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Example-On-Call", "resolve_page_on_policy_end": true, "retries": 2, "steps": [ { "assignment": "default", "escalate_after_seconds": 3600, "targets": [ { "id": "string", "type": "users" }, { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "schedules" }, { "config": { "schedule": { "position": "previous" } }, "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "schedules" }, { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] }, { "assignment": "round-robin", "escalate_after_seconds": 3600, "targets": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } ] }, "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "policies" } } EOF ``` ##### Create On-Call escalation policy returns "Created" response ``` // Create On-Call escalation policy returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "schedule" in the system ScheduleDataID := os.Getenv("SCHEDULE_DATA_ID") // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.EscalationPolicyCreateRequest{ Data: datadogV2.EscalationPolicyCreateRequestData{ Attributes: datadogV2.EscalationPolicyCreateRequestDataAttributes{ Name: "Example-On-Call", ResolvePageOnPolicyEnd: datadog.PtrBool(true), Retries: datadog.PtrInt64(2), Steps: []datadogV2.EscalationPolicyCreateRequestDataAttributesStepsItems{ { Assignment: datadogV2.ESCALATIONPOLICYSTEPATTRIBUTESASSIGNMENT_DEFAULT.Ptr(), EscalateAfterSeconds: datadog.PtrInt64(3600), Targets: []datadogV2.EscalationPolicyStepTarget{ { Id: datadog.PtrString(UserDataID), Type: datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_USERS.Ptr(), }, { Id: datadog.PtrString(ScheduleDataID), Type: datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_SCHEDULES.Ptr(), }, { Config: &datadogV2.EscalationPolicyStepTargetConfig{ Schedule: &datadogV2.EscalationPolicyStepTargetConfigSchedule{ Position: datadogV2.SCHEDULETARGETPOSITION_PREVIOUS.Ptr(), }, }, Id: datadog.PtrString(ScheduleDataID), Type: datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_SCHEDULES.Ptr(), }, { Id: datadog.PtrString(DdTeamDataID), Type: 
datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_TEAMS.Ptr(), }, }, }, { Assignment: datadogV2.ESCALATIONPOLICYSTEPATTRIBUTESASSIGNMENT_ROUND_ROBIN.Ptr(), EscalateAfterSeconds: datadog.PtrInt64(3600), Targets: []datadogV2.EscalationPolicyStepTarget{ { Id: datadog.PtrString(DdTeamDataID), Type: datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_TEAMS.Ptr(), }, }, }, }, }, Relationships: &datadogV2.EscalationPolicyCreateRequestDataRelationships{ Teams: &datadogV2.DataRelationshipsTeams{ Data: []datadogV2.DataRelationshipsTeamsDataItems{ { Id: DdTeamDataID, Type: datadogV2.DATARELATIONSHIPSTEAMSDATAITEMSTYPE_TEAMS, }, }, }, }, Type: datadogV2.ESCALATIONPOLICYCREATEREQUESTDATATYPE_POLICIES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.CreateOnCallEscalationPolicy(ctx, body, *datadogV2.NewCreateOnCallEscalationPolicyOptionalParameters().WithInclude("steps.targets")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.CreateOnCallEscalationPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.CreateOnCallEscalationPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create On-Call escalation policy returns "Created" response ``` // Create On-Call escalation policy returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.api.OnCallApi.CreateOnCallEscalationPolicyOptionalParameters; import com.datadog.api.client.v2.model.DataRelationshipsTeams; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItems; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItemsType; import com.datadog.api.client.v2.model.EscalationPolicy; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequest; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequestData; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequestDataAttributes; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequestDataAttributesStepsItems; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequestDataRelationships; import com.datadog.api.client.v2.model.EscalationPolicyCreateRequestDataType; import com.datadog.api.client.v2.model.EscalationPolicyStepAttributesAssignment; import com.datadog.api.client.v2.model.EscalationPolicyStepTarget; import com.datadog.api.client.v2.model.EscalationPolicyStepTargetConfig; import com.datadog.api.client.v2.model.EscalationPolicyStepTargetConfigSchedule; import com.datadog.api.client.v2.model.EscalationPolicyStepTargetType; import com.datadog.api.client.v2.model.ScheduleTargetPosition; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String 
USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "schedule" in the system String SCHEDULE_DATA_ID = System.getenv("SCHEDULE_DATA_ID"); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); EscalationPolicyCreateRequest body = new EscalationPolicyCreateRequest() .data( new EscalationPolicyCreateRequestData() .attributes( new EscalationPolicyCreateRequestDataAttributes() .name("Example-On-Call") .resolvePageOnPolicyEnd(true) .retries(2L) .steps( Arrays.asList( new EscalationPolicyCreateRequestDataAttributesStepsItems() .assignment( EscalationPolicyStepAttributesAssignment.DEFAULT) .escalateAfterSeconds(3600L) .targets( Arrays.asList( new EscalationPolicyStepTarget() .id(USER_DATA_ID) .type(EscalationPolicyStepTargetType.USERS), new EscalationPolicyStepTarget() .id(SCHEDULE_DATA_ID) .type(EscalationPolicyStepTargetType.SCHEDULES), new EscalationPolicyStepTarget() .config( new EscalationPolicyStepTargetConfig() .schedule( new EscalationPolicyStepTargetConfigSchedule() .position( ScheduleTargetPosition .PREVIOUS))) .id(SCHEDULE_DATA_ID) .type(EscalationPolicyStepTargetType.SCHEDULES), new EscalationPolicyStepTarget() .id(DD_TEAM_DATA_ID) .type(EscalationPolicyStepTargetType.TEAMS))), new EscalationPolicyCreateRequestDataAttributesStepsItems() .assignment( EscalationPolicyStepAttributesAssignment.ROUND_ROBIN) .escalateAfterSeconds(3600L) .targets( Collections.singletonList( new EscalationPolicyStepTarget() .id(DD_TEAM_DATA_ID) .type(EscalationPolicyStepTargetType.TEAMS)))))) .relationships( new EscalationPolicyCreateRequestDataRelationships() .teams( new DataRelationshipsTeams() .data( Collections.singletonList( new DataRelationshipsTeamsDataItems() .id(DD_TEAM_DATA_ID) .type(DataRelationshipsTeamsDataItemsType.TEAMS))))) .type(EscalationPolicyCreateRequestDataType.POLICIES)); try { EscalationPolicy result = apiInstance.createOnCallEscalationPolicy( body, new CreateOnCallEscalationPolicyOptionalParameters().include("steps.targets")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#createOnCallEscalationPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create On-Call escalation policy returns "Created" response ``` """ Create On-Call escalation policy returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.data_relationships_teams import DataRelationshipsTeams from datadog_api_client.v2.model.data_relationships_teams_data_items import DataRelationshipsTeamsDataItems from datadog_api_client.v2.model.data_relationships_teams_data_items_type import DataRelationshipsTeamsDataItemsType from datadog_api_client.v2.model.escalation_policy_create_request import EscalationPolicyCreateRequest from datadog_api_client.v2.model.escalation_policy_create_request_data import EscalationPolicyCreateRequestData from 
datadog_api_client.v2.model.escalation_policy_create_request_data_attributes import ( EscalationPolicyCreateRequestDataAttributes, ) from datadog_api_client.v2.model.escalation_policy_create_request_data_attributes_steps_items import ( EscalationPolicyCreateRequestDataAttributesStepsItems, ) from datadog_api_client.v2.model.escalation_policy_create_request_data_relationships import ( EscalationPolicyCreateRequestDataRelationships, ) from datadog_api_client.v2.model.escalation_policy_create_request_data_type import EscalationPolicyCreateRequestDataType from datadog_api_client.v2.model.escalation_policy_step_attributes_assignment import ( EscalationPolicyStepAttributesAssignment, ) from datadog_api_client.v2.model.escalation_policy_step_target import EscalationPolicyStepTarget from datadog_api_client.v2.model.escalation_policy_step_target_config import EscalationPolicyStepTargetConfig from datadog_api_client.v2.model.escalation_policy_step_target_config_schedule import ( EscalationPolicyStepTargetConfigSchedule, ) from datadog_api_client.v2.model.escalation_policy_step_target_type import EscalationPolicyStepTargetType from datadog_api_client.v2.model.schedule_target_position import ScheduleTargetPosition # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "schedule" in the system SCHEDULE_DATA_ID = environ["SCHEDULE_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = EscalationPolicyCreateRequest( data=EscalationPolicyCreateRequestData( attributes=EscalationPolicyCreateRequestDataAttributes( name="Example-On-Call", resolve_page_on_policy_end=True, retries=2, steps=[ EscalationPolicyCreateRequestDataAttributesStepsItems( assignment=EscalationPolicyStepAttributesAssignment.DEFAULT, escalate_after_seconds=3600, targets=[ EscalationPolicyStepTarget( id=USER_DATA_ID, type=EscalationPolicyStepTargetType.USERS, ), EscalationPolicyStepTarget( id=SCHEDULE_DATA_ID, type=EscalationPolicyStepTargetType.SCHEDULES, ), EscalationPolicyStepTarget( config=EscalationPolicyStepTargetConfig( schedule=EscalationPolicyStepTargetConfigSchedule( position=ScheduleTargetPosition.PREVIOUS, ), ), id=SCHEDULE_DATA_ID, type=EscalationPolicyStepTargetType.SCHEDULES, ), EscalationPolicyStepTarget( id=DD_TEAM_DATA_ID, type=EscalationPolicyStepTargetType.TEAMS, ), ], ), EscalationPolicyCreateRequestDataAttributesStepsItems( assignment=EscalationPolicyStepAttributesAssignment.ROUND_ROBIN, escalate_after_seconds=3600, targets=[ EscalationPolicyStepTarget( id=DD_TEAM_DATA_ID, type=EscalationPolicyStepTargetType.TEAMS, ), ], ), ], ), relationships=EscalationPolicyCreateRequestDataRelationships( teams=DataRelationshipsTeams( data=[ DataRelationshipsTeamsDataItems( id=DD_TEAM_DATA_ID, type=DataRelationshipsTeamsDataItemsType.TEAMS, ), ], ), ), type=EscalationPolicyCreateRequestDataType.POLICIES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.create_on_call_escalation_policy(include="steps.targets", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create On-Call escalation policy returns 
"Created" response ``` # Create On-Call escalation policy returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "schedule" in the system SCHEDULE_DATA_ID = ENV["SCHEDULE_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::EscalationPolicyCreateRequest.new({ data: DatadogAPIClient::V2::EscalationPolicyCreateRequestData.new({ attributes: DatadogAPIClient::V2::EscalationPolicyCreateRequestDataAttributes.new({ name: "Example-On-Call", resolve_page_on_policy_end: true, retries: 2, steps: [ DatadogAPIClient::V2::EscalationPolicyCreateRequestDataAttributesStepsItems.new({ assignment: DatadogAPIClient::V2::EscalationPolicyStepAttributesAssignment::DEFAULT, escalate_after_seconds: 3600, targets: [ DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::USERS, }), DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ id: SCHEDULE_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::SCHEDULES, }), DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ config: DatadogAPIClient::V2::EscalationPolicyStepTargetConfig.new({ schedule: DatadogAPIClient::V2::EscalationPolicyStepTargetConfigSchedule.new({ position: DatadogAPIClient::V2::ScheduleTargetPosition::PREVIOUS, }), }), id: SCHEDULE_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::SCHEDULES, }), DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::TEAMS, }), ], }), DatadogAPIClient::V2::EscalationPolicyCreateRequestDataAttributesStepsItems.new({ assignment: DatadogAPIClient::V2::EscalationPolicyStepAttributesAssignment::ROUND_ROBIN, escalate_after_seconds: 3600, targets: [ DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::TEAMS, }), ], }), ], }), relationships: DatadogAPIClient::V2::EscalationPolicyCreateRequestDataRelationships.new({ teams: DatadogAPIClient::V2::DataRelationshipsTeams.new({ data: [ DatadogAPIClient::V2::DataRelationshipsTeamsDataItems.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::DataRelationshipsTeamsDataItemsType::TEAMS, }), ], }), }), type: DatadogAPIClient::V2::EscalationPolicyCreateRequestDataType::POLICIES, }), }) opts = { include: "steps.targets", } p api_instance.create_on_call_escalation_policy(body, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create On-Call escalation policy returns "Created" response ``` // Create On-Call escalation policy returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::CreateOnCallEscalationPolicyOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use datadog_api_client::datadogV2::model::DataRelationshipsTeams; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItems; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItemsType; use 
datadog_api_client::datadogV2::model::EscalationPolicyCreateRequest; use datadog_api_client::datadogV2::model::EscalationPolicyCreateRequestData; use datadog_api_client::datadogV2::model::EscalationPolicyCreateRequestDataAttributes; use datadog_api_client::datadogV2::model::EscalationPolicyCreateRequestDataAttributesStepsItems; use datadog_api_client::datadogV2::model::EscalationPolicyCreateRequestDataRelationships; use datadog_api_client::datadogV2::model::EscalationPolicyCreateRequestDataType; use datadog_api_client::datadogV2::model::EscalationPolicyStepAttributesAssignment; use datadog_api_client::datadogV2::model::EscalationPolicyStepTarget; use datadog_api_client::datadogV2::model::EscalationPolicyStepTargetConfig; use datadog_api_client::datadogV2::model::EscalationPolicyStepTargetConfigSchedule; use datadog_api_client::datadogV2::model::EscalationPolicyStepTargetType; use datadog_api_client::datadogV2::model::ScheduleTargetPosition; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid "schedule" in the system let schedule_data_id = std::env::var("SCHEDULE_DATA_ID").unwrap(); // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = EscalationPolicyCreateRequest::new( EscalationPolicyCreateRequestData::new( EscalationPolicyCreateRequestDataAttributes::new( "Example-On-Call".to_string(), vec![ EscalationPolicyCreateRequestDataAttributesStepsItems::new(vec![ EscalationPolicyStepTarget::new() .id(user_data_id.clone()) .type_(EscalationPolicyStepTargetType::USERS), EscalationPolicyStepTarget::new() .id(schedule_data_id.clone()) .type_(EscalationPolicyStepTargetType::SCHEDULES), EscalationPolicyStepTarget::new() .config( EscalationPolicyStepTargetConfig::new().schedule( EscalationPolicyStepTargetConfigSchedule::new() .position(ScheduleTargetPosition::PREVIOUS), ), ) .id(schedule_data_id.clone()) .type_(EscalationPolicyStepTargetType::SCHEDULES), EscalationPolicyStepTarget::new() .id(dd_team_data_id.clone()) .type_(EscalationPolicyStepTargetType::TEAMS), ]) .assignment(EscalationPolicyStepAttributesAssignment::DEFAULT) .escalate_after_seconds(3600), EscalationPolicyCreateRequestDataAttributesStepsItems::new(vec![ EscalationPolicyStepTarget::new() .id(dd_team_data_id.clone()) .type_(EscalationPolicyStepTargetType::TEAMS), ]) .assignment(EscalationPolicyStepAttributesAssignment::ROUND_ROBIN) .escalate_after_seconds(3600), ], ) .resolve_page_on_policy_end(true) .retries(2), EscalationPolicyCreateRequestDataType::POLICIES, ) .relationships(EscalationPolicyCreateRequestDataRelationships::new().teams( DataRelationshipsTeams::new().data(vec![DataRelationshipsTeamsDataItems::new( dd_team_data_id.clone(), DataRelationshipsTeamsDataItemsType::TEAMS, )]), )), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .create_on_call_escalation_policy( body, CreateOnCallEscalationPolicyOptionalParams::default() .include("steps.targets".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" cargo run ```

##### Create On-Call escalation policy returns "Created" response

```
/**
 * Create On-Call escalation policy returns "Created" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.OnCallApi(configuration);

// there is a valid "user" in the system
const USER_DATA_ID = process.env.USER_DATA_ID as string;

// there is a valid "schedule" in the system
const SCHEDULE_DATA_ID = process.env.SCHEDULE_DATA_ID as string;

// there is a valid "dd_team" in the system
const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string;

const params: v2.OnCallApiCreateOnCallEscalationPolicyRequest = {
  body: {
    data: {
      attributes: {
        name: "Example-On-Call",
        resolvePageOnPolicyEnd: true,
        retries: 2,
        steps: [
          {
            assignment: "default",
            escalateAfterSeconds: 3600,
            targets: [
              { id: USER_DATA_ID, type: "users" },
              { id: SCHEDULE_DATA_ID, type: "schedules" },
              {
                config: { schedule: { position: "previous" } },
                id: SCHEDULE_DATA_ID,
                type: "schedules",
              },
              { id: DD_TEAM_DATA_ID, type: "teams" },
            ],
          },
          {
            assignment: "round-robin",
            escalateAfterSeconds: 3600,
            targets: [{ id: DD_TEAM_DATA_ID, type: "teams" }],
          },
        ],
      },
      relationships: {
        teams: {
          data: [{ id: DD_TEAM_DATA_ID, type: "teams" }],
        },
      },
      type: "policies",
    },
  },
  include: "steps.targets",
};

apiInstance
  .createOnCallEscalationPolicy(params)
  .then((data: v2.EscalationPolicy) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site, for example: datadoghq.com, us3.datadoghq.com,
# us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Update On-Call escalation policy](https://docs.datadoghq.com/api/latest/on-call/#update-on-call-escalation-policy)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#update-on-call-escalation-policy-v2)

PUT https://api.ap1.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}
https://api.ap2.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}
https://api.datadoghq.eu/api/v2/on-call/escalation-policies/{policy_id}
https://api.ddog-gov.com/api/v2/on-call/escalation-policies/{policy_id}
https://api.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}
https://api.us3.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}
https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}

### Overview

Update an On-Call escalation policy. This endpoint requires the `on_call_write` permission.

### Arguments

#### Path Parameters

Name Type Description policy_id [_required_] string The ID of the escalation policy

#### Query Strings

Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `steps`, `steps.targets`.

### Request

#### Body Data (required)

* [Model](https://docs.datadoghq.com/api/latest/on-call/)
* [Example](https://docs.datadoghq.com/api/latest/on-call/)

Field Type Description data [_required_] object Represents the data for updating an existing escalation policy, including its ID, attributes, relationships, and resource type.
attributes [_required_] object Defines the attributes that can be updated for an escalation policy, such as description, name, resolution behavior, retries, and steps. name [_required_] string Specifies the name of the escalation policy. resolve_page_on_policy_end boolean Indicates whether the page is automatically resolved when the policy ends. retries int64 Specifies how many times the escalation sequence is retried if there is no response. steps [_required_] [object] A list of escalation steps, each defining assignment, escalation timeout, and targets. assignment enum Specifies how this escalation step will assign targets (example `default` or `round-robin`). Allowed enum values: `default,round-robin` escalate_after_seconds int64 Defines how many seconds to wait before escalating to the next step. id string Specifies the unique identifier of this step. targets [_required_] [object] Specifies the collection of escalation targets for this step. config object Configuration for an escalation target, such as schedule position. schedule object Schedule-specific configuration for an escalation target. position enum Specifies the position of a schedule target (example `previous`, `current`, or `next`). Allowed enum values: `previous,current,next` id string Specifies the unique identifier for this target. type enum Specifies the type of escalation target (example `users`, `schedules`, or `teams`). Allowed enum values: `users,schedules,teams` id [_required_] string Specifies the unique identifier of the escalation policy being updated. relationships object Represents relationships in an escalation policy update request, including references to teams. teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Indicates that the resource is of type `policies`. Allowed enum values: `policies` default: `policies` ``` { "data": { "attributes": { "name": "Example-On-Call-updated", "resolve_page_on_policy_end": false, "retries": 0, "steps": [ { "assignment": "default", "escalate_after_seconds": 3600, "id": "00000000-aba1-0000-0000-000000000000", "targets": [ { "id": "string", "type": "users" } ] } ] }, "id": "ab000000-0000-0000-0000-000000000000", "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "policies" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#UpdateOnCallEscalationPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Represents a complete escalation policy response, including policy data and optionally included related resources. Field Type Description data object Represents the data for a single escalation policy, including its attributes, ID, relationships, and resource type. 
attributes object Defines the main attributes of an escalation policy, such as its name and behavior on policy end. name [_required_] string Specifies the name of the escalation policy. resolve_page_on_policy_end boolean Indicates whether the page is automatically resolved when the policy ends. retries int64 Specifies how many times the escalation sequence is retried if there is no response. id string Specifies the unique identifier of the escalation policy. relationships object Represents the relationships for an escalation policy, including references to steps and teams. steps [_required_] object Defines the relationship to a collection of steps within an escalation policy. Contains an array of step data references. data [object] An array of references to the steps defined in this escalation policy. id [_required_] string Specifies the unique identifier for the step resource. type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Indicates that the resource is of type `policies`. Allowed enum values: `policies` default: `policies` included [ ] Provides any included related resources, such as steps or targets, returned with the policy. Option 1 object Represents a single step in an escalation policy, including its attributes, relationships, and resource type. attributes object Defines attributes for an escalation policy step, such as assignment strategy and escalation timeout. assignment enum Specifies how this escalation step will assign targets (example `default` or `round-robin`). Allowed enum values: `default,round-robin` escalate_after_seconds int64 Specifies how many seconds to wait before escalating to the next step. id string Specifies the unique identifier of this escalation policy step. relationships object Represents the relationship of an escalation policy step to its targets. targets object A list of escalation targets for a step data [ ] The `EscalationTargets` `data`. Option 1 object Represents a team target for an escalation policy step, including the team's ID and resource type. id [_required_] string Specifies the unique identifier of the team resource. type [_required_] enum Indicates that the resource is of type `teams`. Allowed enum values: `teams` default: `teams` Option 2 object Represents a user target for an escalation policy step, including the user's ID and resource type. id [_required_] string Specifies the unique identifier of the user resource. type [_required_] enum Indicates that the resource is of type `users`. Allowed enum values: `users` default: `users` Option 3 object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` Option 4 object Relationship reference to a configured schedule target. id [_required_] string Specifies the unique identifier of the configured schedule target. 
type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` Option 2 object Represents a user object in the context of an escalation policy, including their `id`, type, and basic attributes. attributes object Provides basic user information for an escalation policy, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` Option 3 object Represents the primary data object for a schedule, linking attributes and relationships. attributes object Provides core properties of a schedule object such as its name and time zone. name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` Option 4 object Full resource representation of a configured schedule target with position (previous, current, or next). attributes [_required_] object Attributes for a configured schedule target, including position. position [_required_] enum Specifies the position of a schedule target (example `previous`, `current`, or `next`). Allowed enum values: `previous,current,next` id [_required_] string Specifies the unique identifier of the configured schedule target. relationships [_required_] object Represents the relationships of a configured schedule target. schedule [_required_] object Holds the schedule reference for a configured schedule target. data [_required_] object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` Option 5 object Provides a reference to a team, including ID, type, and basic attributes/relationships. attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. 
handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` ``` { "data": { "attributes": { "name": "Escalation Policy 1", "resolve_page_on_policy_end": true, "retries": 2 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "steps": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "steps" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "policies" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "assignment": "default", "escalate_after_seconds": 3600 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "targets": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "id": "00000000-aba2-0000-0000-000000000000_previous", "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } }, "type": "steps" }, { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "attributes": { "position": "previous" }, "id": "00000000-aba2-0000-0000-000000000000_previous", "relationships": { "schedule": { "data": { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" } } }, "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Update On-Call escalation policy returns "OK" response Copy ``` # Path parameters export policy_id="a3000000-0000-0000-0000-000000000000" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/${policy_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Example-On-Call-updated", "resolve_page_on_policy_end": false, "retries": 0, "steps": [ { "assignment": "default", "escalate_after_seconds": 3600, "id": "00000000-aba1-0000-0000-000000000000", "targets": [ { "id": "string", "type": "users" } ] } ] }, "id": "ab000000-0000-0000-0000-000000000000", "relationships": { "teams": { "data": [ { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "teams" } ] } }, "type": "policies" } } EOF ``` ##### Update On-Call escalation policy returns "OK" response ``` // Update On-Call escalation policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "escalation_policy" in the system EscalationPolicyDataID := os.Getenv("ESCALATION_POLICY_DATA_ID") EscalationPolicyDataRelationshipsStepsData0ID := os.Getenv("ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID") // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.EscalationPolicyUpdateRequest{ Data: datadogV2.EscalationPolicyUpdateRequestData{ Attributes: datadogV2.EscalationPolicyUpdateRequestDataAttributes{ Name: "Example-On-Call-updated", ResolvePageOnPolicyEnd: datadog.PtrBool(false), Retries: datadog.PtrInt64(0), Steps: []datadogV2.EscalationPolicyUpdateRequestDataAttributesStepsItems{ { Assignment: datadogV2.ESCALATIONPOLICYSTEPATTRIBUTESASSIGNMENT_DEFAULT.Ptr(), EscalateAfterSeconds: datadog.PtrInt64(3600), Id: datadog.PtrString(EscalationPolicyDataRelationshipsStepsData0ID), Targets: []datadogV2.EscalationPolicyStepTarget{ { Id: datadog.PtrString(UserDataID), Type: datadogV2.ESCALATIONPOLICYSTEPTARGETTYPE_USERS.Ptr(), }, }, }, }, }, Id: EscalationPolicyDataID, Relationships: &datadogV2.EscalationPolicyUpdateRequestDataRelationships{ Teams: &datadogV2.DataRelationshipsTeams{ Data: []datadogV2.DataRelationshipsTeamsDataItems{ { Id: DdTeamDataID, Type: datadogV2.DATARELATIONSHIPSTEAMSDATAITEMSTYPE_TEAMS, }, }, }, }, Type: datadogV2.ESCALATIONPOLICYUPDATEREQUESTDATATYPE_POLICIES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.UpdateOnCallEscalationPolicy(ctx, EscalationPolicyDataID, body, *datadogV2.NewUpdateOnCallEscalationPolicyOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.UpdateOnCallEscalationPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.UpdateOnCallEscalationPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update On-Call escalation policy returns "OK" response ``` // Update On-Call escalation policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.DataRelationshipsTeams; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItems; import com.datadog.api.client.v2.model.DataRelationshipsTeamsDataItemsType; import com.datadog.api.client.v2.model.EscalationPolicy; import com.datadog.api.client.v2.model.EscalationPolicyStepAttributesAssignment; import com.datadog.api.client.v2.model.EscalationPolicyStepTarget; import com.datadog.api.client.v2.model.EscalationPolicyStepTargetType; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequest; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequestData; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequestDataAttributes; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequestDataAttributesStepsItems; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequestDataRelationships; import com.datadog.api.client.v2.model.EscalationPolicyUpdateRequestDataType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "escalation_policy" in the system String ESCALATION_POLICY_DATA_ID = System.getenv("ESCALATION_POLICY_DATA_ID"); String ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID = System.getenv("ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID"); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); EscalationPolicyUpdateRequest body = new EscalationPolicyUpdateRequest() .data( new EscalationPolicyUpdateRequestData() .attributes( new EscalationPolicyUpdateRequestDataAttributes() .name("Example-On-Call-updated") .resolvePageOnPolicyEnd(false) .retries(0L) .steps( Collections.singletonList( new EscalationPolicyUpdateRequestDataAttributesStepsItems() .assignment( EscalationPolicyStepAttributesAssignment.DEFAULT) .escalateAfterSeconds(3600L) .id(ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID) .targets( Collections.singletonList( new EscalationPolicyStepTarget() .id(USER_DATA_ID) .type(EscalationPolicyStepTargetType.USERS)))))) .id(ESCALATION_POLICY_DATA_ID) .relationships( new 
EscalationPolicyUpdateRequestDataRelationships() .teams( new DataRelationshipsTeams() .data( Collections.singletonList( new DataRelationshipsTeamsDataItems() .id(DD_TEAM_DATA_ID) .type(DataRelationshipsTeamsDataItemsType.TEAMS))))) .type(EscalationPolicyUpdateRequestDataType.POLICIES)); try { EscalationPolicy result = apiInstance.updateOnCallEscalationPolicy(ESCALATION_POLICY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#updateOnCallEscalationPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update On-Call escalation policy returns "OK" response ``` """ Update On-Call escalation policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.data_relationships_teams import DataRelationshipsTeams from datadog_api_client.v2.model.data_relationships_teams_data_items import DataRelationshipsTeamsDataItems from datadog_api_client.v2.model.data_relationships_teams_data_items_type import DataRelationshipsTeamsDataItemsType from datadog_api_client.v2.model.escalation_policy_step_attributes_assignment import ( EscalationPolicyStepAttributesAssignment, ) from datadog_api_client.v2.model.escalation_policy_step_target import EscalationPolicyStepTarget from datadog_api_client.v2.model.escalation_policy_step_target_type import EscalationPolicyStepTargetType from datadog_api_client.v2.model.escalation_policy_update_request import EscalationPolicyUpdateRequest from datadog_api_client.v2.model.escalation_policy_update_request_data import EscalationPolicyUpdateRequestData from datadog_api_client.v2.model.escalation_policy_update_request_data_attributes import ( EscalationPolicyUpdateRequestDataAttributes, ) from datadog_api_client.v2.model.escalation_policy_update_request_data_attributes_steps_items import ( EscalationPolicyUpdateRequestDataAttributesStepsItems, ) from datadog_api_client.v2.model.escalation_policy_update_request_data_relationships import ( EscalationPolicyUpdateRequestDataRelationships, ) from datadog_api_client.v2.model.escalation_policy_update_request_data_type import EscalationPolicyUpdateRequestDataType # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = environ["ESCALATION_POLICY_DATA_ID"] ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID = environ["ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = EscalationPolicyUpdateRequest( data=EscalationPolicyUpdateRequestData( attributes=EscalationPolicyUpdateRequestDataAttributes( name="Example-On-Call-updated", resolve_page_on_policy_end=False, retries=0, steps=[ EscalationPolicyUpdateRequestDataAttributesStepsItems( assignment=EscalationPolicyStepAttributesAssignment.DEFAULT, 
escalate_after_seconds=3600, id=ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID, targets=[ EscalationPolicyStepTarget( id=USER_DATA_ID, type=EscalationPolicyStepTargetType.USERS, ), ], ), ], ), id=ESCALATION_POLICY_DATA_ID, relationships=EscalationPolicyUpdateRequestDataRelationships( teams=DataRelationshipsTeams( data=[ DataRelationshipsTeamsDataItems( id=DD_TEAM_DATA_ID, type=DataRelationshipsTeamsDataItemsType.TEAMS, ), ], ), ), type=EscalationPolicyUpdateRequestDataType.POLICIES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.update_on_call_escalation_policy(policy_id=ESCALATION_POLICY_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update On-Call escalation policy returns "OK" response ``` # Update On-Call escalation policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = ENV["ESCALATION_POLICY_DATA_ID"] ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID = ENV["ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::EscalationPolicyUpdateRequest.new({ data: DatadogAPIClient::V2::EscalationPolicyUpdateRequestData.new({ attributes: DatadogAPIClient::V2::EscalationPolicyUpdateRequestDataAttributes.new({ name: "Example-On-Call-updated", resolve_page_on_policy_end: false, retries: 0, steps: [ DatadogAPIClient::V2::EscalationPolicyUpdateRequestDataAttributesStepsItems.new({ assignment: DatadogAPIClient::V2::EscalationPolicyStepAttributesAssignment::DEFAULT, escalate_after_seconds: 3600, id: ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID, targets: [ DatadogAPIClient::V2::EscalationPolicyStepTarget.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::EscalationPolicyStepTargetType::USERS, }), ], }), ], }), id: ESCALATION_POLICY_DATA_ID, relationships: DatadogAPIClient::V2::EscalationPolicyUpdateRequestDataRelationships.new({ teams: DatadogAPIClient::V2::DataRelationshipsTeams.new({ data: [ DatadogAPIClient::V2::DataRelationshipsTeamsDataItems.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::DataRelationshipsTeamsDataItemsType::TEAMS, }), ], }), }), type: DatadogAPIClient::V2::EscalationPolicyUpdateRequestDataType::POLICIES, }), }) p api_instance.update_on_call_escalation_policy(ESCALATION_POLICY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update On-Call escalation policy returns "OK" response ``` // Update On-Call escalation policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use 
datadog_api_client::datadogV2::api_on_call::UpdateOnCallEscalationPolicyOptionalParams; use datadog_api_client::datadogV2::model::DataRelationshipsTeams; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItems; use datadog_api_client::datadogV2::model::DataRelationshipsTeamsDataItemsType; use datadog_api_client::datadogV2::model::EscalationPolicyStepAttributesAssignment; use datadog_api_client::datadogV2::model::EscalationPolicyStepTarget; use datadog_api_client::datadogV2::model::EscalationPolicyStepTargetType; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequest; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequestData; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequestDataAttributes; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequestDataAttributesStepsItems; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequestDataRelationships; use datadog_api_client::datadogV2::model::EscalationPolicyUpdateRequestDataType; #[tokio::main] async fn main() { // there is a valid "escalation_policy" in the system let escalation_policy_data_id = std::env::var("ESCALATION_POLICY_DATA_ID").unwrap(); let escalation_policy_data_relationships_steps_data_0_id = std::env::var("ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = EscalationPolicyUpdateRequest::new( EscalationPolicyUpdateRequestData::new( EscalationPolicyUpdateRequestDataAttributes::new( "Example-On-Call-updated".to_string(), vec![ EscalationPolicyUpdateRequestDataAttributesStepsItems::new(vec![ EscalationPolicyStepTarget::new() .id(user_data_id.clone()) .type_(EscalationPolicyStepTargetType::USERS), ]) .assignment(EscalationPolicyStepAttributesAssignment::DEFAULT) .escalate_after_seconds(3600) .id(escalation_policy_data_relationships_steps_data_0_id.clone()), ], ) .resolve_page_on_policy_end(false) .retries(0), escalation_policy_data_id.clone(), EscalationPolicyUpdateRequestDataType::POLICIES, ) .relationships(EscalationPolicyUpdateRequestDataRelationships::new().teams( DataRelationshipsTeams::new().data(vec![DataRelationshipsTeamsDataItems::new( dd_team_data_id.clone(), DataRelationshipsTeamsDataItemsType::TEAMS, )]), )), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .update_on_call_escalation_policy( escalation_policy_data_id.clone(), body, UpdateOnCallEscalationPolicyOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update On-Call escalation policy returns "OK" response ``` /** * Update On-Call escalation policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "escalation_policy" in the system const ESCALATION_POLICY_DATA_ID = 
process.env .ESCALATION_POLICY_DATA_ID as string; const ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID = process.env .ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.OnCallApiUpdateOnCallEscalationPolicyRequest = { body: { data: { attributes: { name: "Example-On-Call-updated", resolvePageOnPolicyEnd: false, retries: 0, steps: [ { assignment: "default", escalateAfterSeconds: 3600, id: ESCALATION_POLICY_DATA_RELATIONSHIPS_STEPS_DATA_0_ID, targets: [ { id: USER_DATA_ID, type: "users", }, ], }, ], }, id: ESCALATION_POLICY_DATA_ID, relationships: { teams: { data: [ { id: DD_TEAM_DATA_ID, type: "teams", }, ], }, }, type: "policies", }, }, policyId: ESCALATION_POLICY_DATA_ID, }; apiInstance .updateOnCallEscalationPolicy(params) .then((data: v2.EscalationPolicy) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get On-Call escalation policy](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-escalation-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-escalation-policy-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.ap2.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.datadoghq.eu/api/v2/on-call/escalation-policies/{policy_id}https://api.ddog-gov.com/api/v2/on-call/escalation-policies/{policy_id}https://api.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.us3.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id} ### Overview Get an On-Call escalation policy This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string The ID of the escalation policy #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `teams`, `steps`, `steps.targets`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallEscalationPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Represents a complete escalation policy response, including policy data and optionally included related resources. 
Field Type Description data object Represents the data for a single escalation policy, including its attributes, ID, relationships, and resource type. attributes object Defines the main attributes of an escalation policy, such as its name and behavior on policy end. name [_required_] string Specifies the name of the escalation policy. resolve_page_on_policy_end boolean Indicates whether the page is automatically resolved when the policy ends. retries int64 Specifies how many times the escalation sequence is retried if there is no response. id string Specifies the unique identifier of the escalation policy. relationships object Represents the relationships for an escalation policy, including references to steps and teams. steps [_required_] object Defines the relationship to a collection of steps within an escalation policy. Contains an array of step data references. data [object] An array of references to the steps defined in this escalation policy. id [_required_] string Specifies the unique identifier for the step resource. type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Indicates that the resource is of type `policies`. Allowed enum values: `policies` default: `policies` included [ ] Provides any included related resources, such as steps or targets, returned with the policy. Option 1 object Represents a single step in an escalation policy, including its attributes, relationships, and resource type. attributes object Defines attributes for an escalation policy step, such as assignment strategy and escalation timeout. assignment enum Specifies how this escalation step will assign targets (example `default` or `round-robin`). Allowed enum values: `default,round-robin` escalate_after_seconds int64 Specifies how many seconds to wait before escalating to the next step. id string Specifies the unique identifier of this escalation policy step. relationships object Represents the relationship of an escalation policy step to its targets. targets object A list of escalation targets for a step data [ ] The `EscalationTargets` `data`. Option 1 object Represents a team target for an escalation policy step, including the team's ID and resource type. id [_required_] string Specifies the unique identifier of the team resource. type [_required_] enum Indicates that the resource is of type `teams`. Allowed enum values: `teams` default: `teams` Option 2 object Represents a user target for an escalation policy step, including the user's ID and resource type. id [_required_] string Specifies the unique identifier of the user resource. type [_required_] enum Indicates that the resource is of type `users`. Allowed enum values: `users` default: `users` Option 3 object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` Option 4 object Relationship reference to a configured schedule target. 
id [_required_] string Specifies the unique identifier of the configured schedule target. type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` type [_required_] enum Indicates that the resource is of type `steps`. Allowed enum values: `steps` default: `steps` Option 2 object Represents a user object in the context of an escalation policy, including their `id`, type, and basic attributes. attributes object Provides basic user information for an escalation policy, including a name and email address. email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` Option 3 object Represents the primary data object for a schedule, linking attributes and relationships. attributes object Provides core properties of a schedule object such as its name and time zone. name string A short name for the schedule. time_zone string The time zone in which this schedule operates. id string The schedule's unique identifier. relationships object Groups the relationships for a schedule object, referencing layers and teams. layers object Associates layers with this schedule in a data structure. data [object] An array of layer references for this schedule. id [_required_] string The unique identifier of the layer in this relationship. type [_required_] enum Layers resource type. Allowed enum values: `layers` default: `layers` teams object Associates teams with this schedule in a data structure. data [object] An array of team references for this schedule. id [_required_] string The unique identifier of the team in this relationship. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` type [_required_] enum Schedules resource type. Allowed enum values: `schedules` default: `schedules` Option 4 object Full resource representation of a configured schedule target with position (previous, current, or next). attributes [_required_] object Attributes for a configured schedule target, including position. position [_required_] enum Specifies the position of a schedule target (example `previous`, `current`, or `next`). Allowed enum values: `previous,current,next` id [_required_] string Specifies the unique identifier of the configured schedule target. relationships [_required_] object Represents the relationships of a configured schedule target. schedule [_required_] object Holds the schedule reference for a configured schedule target. data [_required_] object Represents a schedule target for an escalation policy step, including its ID and resource type. This is a shortcut for a configured schedule target with position set to 'current'. id [_required_] string Specifies the unique identifier of the schedule resource. type [_required_] enum Indicates that the resource is of type `schedules`. Allowed enum values: `schedules` default: `schedules` type [_required_] enum Indicates that the resource is of type `schedule_target`. Allowed enum values: `schedule_target` default: `schedule_target` Option 5 object Provides a reference to a team, including ID, type, and basic attributes/relationships. attributes object Encapsulates the basic attributes of a Team reference, such as name, handle, and an optional avatar or description. 
avatar string URL or reference for the team's avatar (if available). description string A short text describing the team. handle string A unique handle/slug for the team. name string The full, human-readable name of the team. id string The team's unique identifier. type [_required_] enum Teams resource type. Allowed enum values: `teams` default: `teams` ``` { "data": { "attributes": { "name": "Escalation Policy 1", "resolve_page_on_policy_end": true, "retries": 2 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "steps": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "steps" } ] }, "teams": { "data": [ { "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" } ] } }, "type": "policies" }, "included": [ { "attributes": { "avatar": "", "description": "Team 1 description", "handle": "team1", "name": "Team 1" }, "id": "00000000-da3a-0000-0000-000000000000", "type": "teams" }, { "attributes": { "assignment": "default", "escalate_after_seconds": 3600 }, "id": "00000000-aba1-0000-0000-000000000000", "relationships": { "targets": { "data": [ { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "id": "00000000-aba2-0000-0000-000000000000_previous", "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } }, "type": "steps" }, { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" }, { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" }, { "attributes": { "position": "previous" }, "id": "00000000-aba2-0000-0000-000000000000_previous", "relationships": { "schedule": { "data": { "id": "00000000-aba2-0000-0000-000000000000", "type": "schedules" } } }, "type": "schedule_target" }, { "id": "00000000-aba3-0000-0000-000000000000", "type": "teams" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get On-Call escalation policy Copy ``` # Path parameters export policy_id="a3000000-0000-0000-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/${policy_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get On-Call escalation policy ``` """ Get On-Call escalation policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = environ["ESCALATION_POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_on_call_escalation_policy( policy_id=ESCALATION_POLICY_DATA_ID, include="steps.targets", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get On-Call escalation policy ``` # Get On-Call escalation policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = ENV["ESCALATION_POLICY_DATA_ID"] opts = { include: "steps.targets", } p api_instance.get_on_call_escalation_policy(ESCALATION_POLICY_DATA_ID, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get On-Call escalation policy ``` // Get On-Call escalation policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "escalation_policy" in the system EscalationPolicyDataID := os.Getenv("ESCALATION_POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetOnCallEscalationPolicy(ctx, EscalationPolicyDataID, 
*datadogV2.NewGetOnCallEscalationPolicyOptionalParameters().WithInclude("steps.targets")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.GetOnCallEscalationPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetOnCallEscalationPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get On-Call escalation policy ``` // Get On-Call escalation policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.api.OnCallApi.GetOnCallEscalationPolicyOptionalParameters; import com.datadog.api.client.v2.model.EscalationPolicy; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "escalation_policy" in the system String ESCALATION_POLICY_DATA_ID = System.getenv("ESCALATION_POLICY_DATA_ID"); try { EscalationPolicy result = apiInstance.getOnCallEscalationPolicy( ESCALATION_POLICY_DATA_ID, new GetOnCallEscalationPolicyOptionalParameters().include("steps.targets")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getOnCallEscalationPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get On-Call escalation policy ``` // Get On-Call escalation policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::GetOnCallEscalationPolicyOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "escalation_policy" in the system let escalation_policy_data_id = std::env::var("ESCALATION_POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_on_call_escalation_policy( escalation_policy_data_id.clone(), GetOnCallEscalationPolicyOptionalParams::default().include("steps.targets".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` 
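The escalation policy responses above use references plus a top-level `included` array: `data.relationships` only carries `(id, type)` pairs, and the full step, user, schedule, and team objects are returned in `included` when the `include` query string is set. The sketch below stitches those references back together from the raw JSON using plain `requests`; the `DD_SITE`, `DD_API_KEY`, `DD_APP_KEY`, and `POLICY_ID` environment variables and the printing logic are illustrative assumptions, not part of any Datadog client library.

```
import os

import requests

# Assumptions for this sketch: a site such as datadoghq.com and a valid escalation policy ID.
site = os.environ.get("DD_SITE", "datadoghq.com")
policy_id = os.environ["POLICY_ID"]

resp = requests.get(
    f"https://api.{site}/api/v2/on-call/escalation-policies/{policy_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={"include": "steps,steps.targets"},
)
resp.raise_for_status()
body = resp.json()

# Index every included resource by its (type, id) pair.
included = {(res["type"], res["id"]): res for res in body.get("included", [])}

# Walk the policy's step references and resolve each step and its targets.
for step_ref in body["data"]["relationships"]["steps"]["data"]:
    step = included.get((step_ref["type"], step_ref["id"]), {})
    attrs = step.get("attributes", {})
    print(f"step {step_ref['id']}: escalate after {attrs.get('escalate_after_seconds')}s")
    for target in step.get("relationships", {}).get("targets", {}).get("data", []):
        # Targets can be users, schedules, teams, or configured schedule_target resources.
        print(f"  -> {target['type']} {target['id']}")
```

Configured `schedule_target` entries resolve the same way: the matching resource in `included` carries a `position` attribute and a `relationships.schedule.data` reference back to a `schedules` resource.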
##### Get On-Call escalation policy ``` /** * Get On-Call escalation policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "escalation_policy" in the system const ESCALATION_POLICY_DATA_ID = process.env .ESCALATION_POLICY_DATA_ID as string; const params: v2.OnCallApiGetOnCallEscalationPolicyRequest = { policyId: ESCALATION_POLICY_DATA_ID, include: "steps.targets", }; apiInstance .getOnCallEscalationPolicy(params) .then((data: v2.EscalationPolicy) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete On-Call escalation policy](https://docs.datadoghq.com/api/latest/on-call/#delete-on-call-escalation-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#delete-on-call-escalation-policy-v2) DELETE https://api.ap1.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.ap2.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.datadoghq.eu/api/v2/on-call/escalation-policies/{policy_id}https://api.ddog-gov.com/api/v2/on-call/escalation-policies/{policy_id}https://api.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.us3.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id}https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/{policy_id} ### Overview Delete an On-Call escalation policy This endpoint requires the `on_call_write` permission. ### Arguments #### Path Parameters Name Type Description policy_id [_required_] string The ID of the escalation policy ### Response * [204](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallEscalationPolicy-204-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallEscalationPolicy-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallEscalationPolicy-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallEscalationPolicy-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#DeleteOnCallEscalationPolicy-429-v2) No Content Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Delete On-Call escalation policy Copy ``` # Path parameters export policy_id="a3000000-0000-0000-0000-000000000000" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/escalation-policies/${policy_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete On-Call escalation policy ``` """ Delete On-Call escalation policy returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = environ["ESCALATION_POLICY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) api_instance.delete_on_call_escalation_policy( policy_id=ESCALATION_POLICY_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete On-Call escalation policy ``` # Delete On-Call escalation policy returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = ENV["ESCALATION_POLICY_DATA_ID"] api_instance.delete_on_call_escalation_policy(ESCALATION_POLICY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete On-Call escalation policy ``` // Delete On-Call escalation policy returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "escalation_policy" in the system EscalationPolicyDataID := os.Getenv("ESCALATION_POLICY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) r, err := api.DeleteOnCallEscalationPolicy(ctx, EscalationPolicyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.DeleteOnCallEscalationPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete On-Call escalation policy ``` // Delete On-Call escalation policy returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "escalation_policy" in the system String ESCALATION_POLICY_DATA_ID = System.getenv("ESCALATION_POLICY_DATA_ID"); try { apiInstance.deleteOnCallEscalationPolicy(ESCALATION_POLICY_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#deleteOnCallEscalationPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete On-Call escalation policy ``` // Delete On-Call escalation policy returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "escalation_policy" in the system let escalation_policy_data_id = std::env::var("ESCALATION_POLICY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .delete_on_call_escalation_policy(escalation_policy_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete On-Call escalation policy ``` /** * Delete On-Call escalation policy returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "escalation_policy" in the system const ESCALATION_POLICY_DATA_ID = process.env .ESCALATION_POLICY_DATA_ID as string; const params: v2.OnCallApiDeleteOnCallEscalationPolicyRequest = { policyId: ESCALATION_POLICY_DATA_ID, }; 
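// A successful delete returns 204 No Content, so the resolved value carries no response body.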
apiInstance .deleteOnCallEscalationPolicy(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get On-Call team routing rules](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-team-routing-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-on-call-team-routing-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.ap2.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.datadoghq.eu/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.ddog-gov.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.us3.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.us5.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-rules ### Overview Get a team’s On-Call routing rules This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The team ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `rules`, `rules.policy`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallTeamRoutingRules-200-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetOnCallTeamRoutingRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Represents a complete set of team routing rules, including data and optionally included related resources. Field Type Description data object Represents the top-level data object for team routing rules, containing the ID, relationships, and resource type. id string Specifies the unique identifier of this team routing rules record. relationships object Specifies relationships for team routing rules, including rule references. rules object Holds references to a set of routing rules in a relationship. data [object] An array of references to the routing rules associated with this team. id [_required_] string Specifies the unique identifier for the related routing rule. type [_required_] enum Indicates that the resource is of type 'team_routing_rules'. Allowed enum values: `team_routing_rules` default: `team_routing_rules` type [_required_] enum Team routing rules resource type. Allowed enum values: `team_routing_rules` default: `team_routing_rules` included [ ] Provides related routing rules or other included resources. Option 1 object Represents a routing rule, including its attributes, relationships, and unique identifier. attributes object Defines the configurable attributes of a routing rule, such as actions, query, time restriction, and urgency. actions [ ] Specifies the list of actions to perform when the routing rule matches. Option 1 object Sends a message to a Slack channel. channel [_required_] string The channel ID. type [_required_] enum Indicates that the action is a send Slack message action. 
Allowed enum values: `send_slack_message` default: `send_slack_message` workspace [_required_] string The workspace ID. Option 2 object Sends a message to a Microsoft Teams channel. channel [_required_] string The channel ID. team [_required_] string The team ID. tenant [_required_] string The tenant ID. type [_required_] enum Indicates that the action is a send Microsoft Teams message action. Allowed enum values: `send_teams_message` default: `send_teams_message` query string Defines the query or condition that triggers this routing rule. time_restriction object Holds time zone information and a list of time restrictions for a routing rule. restrictions [_required_] [object] Defines the list of time-based restrictions. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. time_zone [_required_] string Specifies the time zone applicable to the restrictions. urgency enum Specifies the level of urgency for a routing rule (low, high, or dynamic). Allowed enum values: `low,high,dynamic` id string Specifies the unique identifier of this routing rule. relationships object Specifies relationships for a routing rule, linking to associated policy resources. policy object Defines the relationship that links a routing rule to a policy. data object Represents the policy data reference, containing the policy's ID and resource type. id [_required_] string Specifies the unique identifier of the policy. type [_required_] enum Indicates that the resource is of type 'policies'. Allowed enum values: `policies` default: `policies` type [_required_] enum Team routing rules resource type. Allowed enum values: `team_routing_rules` default: `team_routing_rules` ``` { "data": { "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "relationships": { "rules": { "data": [ { "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "type": "team_routing_rules" }, { "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "type": "team_routing_rules" } ] } }, "type": "team_routing_rules" }, "included": [ { "attributes": { "actions": null, "query": "tags.service:test", "time_restriction": { "restrictions": [ { "end_day": "monday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" }, { "end_day": "tuesday", "end_time": "17:00:00", "start_day": "tuesday", "start_time": "09:00:00" } ], "time_zone": "" }, "urgency": "high" }, "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "relationships": { "policy": { "data": null } }, "type": "team_routing_rules" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get On-Call team routing rules Copy ``` # Path parameters export team_id="27590dae-47be-4a7d-9abf-8f4e45124020" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/teams/${team_id}/routing-rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get On-Call team routing rules ``` """ Get On-Call team routing rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_on_call_team_routing_rules( team_id="27590dae-47be-4a7d-9abf-8f4e45124020", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get On-Call team routing rules ``` # Get On-Call team routing rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new p api_instance.get_on_call_team_routing_rules("27590dae-47be-4a7d-9abf-8f4e45124020") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get On-Call team routing rules ``` // Get On-Call team routing rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetOnCallTeamRoutingRules(ctx, "27590dae-47be-4a7d-9abf-8f4e45124020", *datadogV2.NewGetOnCallTeamRoutingRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.GetOnCallTeamRoutingRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetOnCallTeamRoutingRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and 
its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get On-Call team routing rules ``` // Get On-Call team routing rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.TeamRoutingRules; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); try { TeamRoutingRules result = apiInstance.getOnCallTeamRoutingRules("27590dae-47be-4a7d-9abf-8f4e45124020"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getOnCallTeamRoutingRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get On-Call team routing rules ``` // Get On-Call team routing rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::GetOnCallTeamRoutingRulesOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_on_call_team_routing_rules( "27590dae-47be-4a7d-9abf-8f4e45124020".to_string(), GetOnCallTeamRoutingRulesOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get On-Call team routing rules ``` /** * Get On-Call team routing rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); const params: v2.OnCallApiGetOnCallTeamRoutingRulesRequest = { teamId: "27590dae-47be-4a7d-9abf-8f4e45124020", }; apiInstance .getOnCallTeamRoutingRules(params) .then((data: v2.TeamRoutingRules) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Set On-Call team routing rules](https://docs.datadoghq.com/api/latest/on-call/#set-on-call-team-routing-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#set-on-call-team-routing-rules-v2) PUT https://api.ap1.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.ap2.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.datadoghq.eu/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.ddog-gov.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.us3.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-ruleshttps://api.us5.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-rules ### Overview Set a team’s On-Call routing rules This endpoint requires the `on_call_write` permission. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The team ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `rules`, `rules.policy`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data object Holds the data necessary to create or update team routing rules, including attributes, ID, and resource type. attributes object Represents the attributes of a request to update or create team routing rules. rules [object] A list of routing rule items that define how incoming pages should be handled. actions [ ] Specifies the list of actions to perform when the routing rule is matched. Option 1 object Sends a message to a Slack channel. channel [_required_] string The channel ID. type [_required_] enum Indicates that the action is a send Slack message action. Allowed enum values: `send_slack_message` default: `send_slack_message` workspace [_required_] string The workspace ID. Option 2 object Sends a message to a Microsoft Teams channel. channel [_required_] string The channel ID. team [_required_] string The team ID. tenant [_required_] string The tenant ID. type [_required_] enum Indicates that the action is a send Microsoft Teams message action. Allowed enum values: `send_teams_message` default: `send_teams_message` policy_id string Identifies the policy to be applied when this routing rule matches. query string Defines the query or condition that triggers this routing rule. time_restriction object Holds time zone information and a list of time restrictions for a routing rule. restrictions [_required_] [object] Defines the list of time-based restrictions. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. 
time_zone [_required_] string Specifies the time zone applicable to the restrictions. urgency enum Specifies the level of urgency for a routing rule (low, high, or dynamic). Allowed enum values: `low,high,dynamic` id string Specifies the unique identifier for this set of team routing rules. type [_required_] enum Team routing rules resource type. Allowed enum values: `team_routing_rules` default: `team_routing_rules` ``` { "data": { "attributes": { "rules": [ { "actions": [ { "channel": "channel", "type": "send_slack_message", "workspace": "workspace" } ], "query": "tags.service:test", "time_restriction": { "time_zone": "Europe/Paris", "restrictions": [ { "end_day": "monday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" }, { "end_day": "tuesday", "end_time": "17:00:00", "start_day": "tuesday", "start_time": "09:00:00" } ] } }, { "policy_id": "ab000000-0000-0000-0000-000000000000", "query": "", "urgency": "low" } ] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "team_routing_rules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#SetOnCallTeamRoutingRules-200-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#SetOnCallTeamRoutingRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Represents a complete set of team routing rules, including data and optionally included related resources. Field Type Description data object Represents the top-level data object for team routing rules, containing the ID, relationships, and resource type. id string Specifies the unique identifier of this team routing rules record. relationships object Specifies relationships for team routing rules, including rule references. rules object Holds references to a set of routing rules in a relationship. data [object] An array of references to the routing rules associated with this team. id [_required_] string Specifies the unique identifier for the related routing rule. type [_required_] enum Indicates that the resource is of type 'team_routing_rules'. Allowed enum values: `team_routing_rules` default: `team_routing_rules` type [_required_] enum Team routing rules resource type. Allowed enum values: `team_routing_rules` default: `team_routing_rules` included [ ] Provides related routing rules or other included resources. Option 1 object Represents a routing rule, including its attributes, relationships, and unique identifier. attributes object Defines the configurable attributes of a routing rule, such as actions, query, time restriction, and urgency. actions [ ] Specifies the list of actions to perform when the routing rule matches. Option 1 object Sends a message to a Slack channel. channel [_required_] string The channel ID. type [_required_] enum Indicates that the action is a send Slack message action. Allowed enum values: `send_slack_message` default: `send_slack_message` workspace [_required_] string The workspace ID. Option 2 object Sends a message to a Microsoft Teams channel. channel [_required_] string The channel ID. team [_required_] string The team ID. tenant [_required_] string The tenant ID. type [_required_] enum Indicates that the action is a send Microsoft Teams message action. Allowed enum values: `send_teams_message` default: `send_teams_message` query string Defines the query or condition that triggers this routing rule. time_restriction object Holds time zone information and a list of time restrictions for a routing rule. 
restrictions [_required_] [object] Defines the list of time-based restrictions. end_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` end_time string Specifies the ending time for this restriction. start_day enum A day of the week. Allowed enum values: `monday,tuesday,wednesday,thursday,friday,saturday,sunday` start_time string Specifies the starting time for this restriction. time_zone [_required_] string Specifies the time zone applicable to the restrictions. urgency enum Specifies the level of urgency for a routing rule (low, high, or dynamic). Allowed enum values: `low,high,dynamic` id string Specifies the unique identifier of this routing rule. relationships object Specifies relationships for a routing rule, linking to associated policy resources. policy object Defines the relationship that links a routing rule to a policy. data object Represents the policy data reference, containing the policy's ID and resource type. id [_required_] string Specifies the unique identifier of the policy. type [_required_] enum Indicates that the resource is of type 'policies'. Allowed enum values: `policies` default: `policies` type [_required_] enum Team routing rules resource type. Allowed enum values: `team_routing_rules` default: `team_routing_rules` ``` { "data": { "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "relationships": { "rules": { "data": [ { "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "type": "team_routing_rules" }, { "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "type": "team_routing_rules" } ] } }, "type": "team_routing_rules" }, "included": [ { "attributes": { "actions": null, "query": "tags.service:test", "time_restriction": { "restrictions": [ { "end_day": "monday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" }, { "end_day": "tuesday", "end_time": "17:00:00", "start_day": "tuesday", "start_time": "09:00:00" } ], "time_zone": "" }, "urgency": "high" }, "id": "03aff2d6-6cbf-496c-997f-a857bbe9a94a", "relationships": { "policy": { "data": null } }, "type": "team_routing_rules" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
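The request example above writes out each `time_restriction` entry by hand. As a hedged sketch (plain `requests` rather than the official client library, `datadoghq.com` site, placeholder team and policy IDs), the snippet below builds the per-day restriction entries programmatically and sends them with the documented `PUT` request body shape.

```
# Hedged sketch: PUT team routing rules with programmatically built
# business-hours time restrictions. Assumes `requests` and DD_API_KEY /
# DD_APP_KEY in the environment; the IDs are placeholders.
import os
import requests

team_id = "27590dae-47be-4a7d-9abf-8f4e45124020"    # placeholder team ID
policy_id = "ab000000-0000-0000-0000-000000000000"  # placeholder policy ID

weekdays = ["monday", "tuesday", "wednesday", "thursday", "friday"]
restrictions = [
    {"start_day": d, "start_time": "09:00:00", "end_day": d, "end_time": "17:00:00"}
    for d in weekdays
]

body = {
    "data": {
        "type": "team_routing_rules",
        "id": team_id,  # the SDK examples below reuse the team ID here
        "attributes": {
            "rules": [
                {
                    "query": "tags.service:test",
                    "time_restriction": {
                        "time_zone": "Europe/Paris",
                        "restrictions": restrictions,
                    },
                },
                # Second rule with an empty query that routes to an escalation
                # policy, mirroring the documented request example.
                {"policy_id": policy_id, "query": "", "urgency": "low"},
            ]
        },
    }
}

resp = requests.put(
    f"https://api.datadoghq.com/api/v2/on-call/teams/{team_id}/routing-rules",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=body,
)
resp.raise_for_status()
print(resp.json())
```

Errors come back as the `errors` array shown in the example that follows.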
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Set On-Call team routing rules returns "OK" response Copy ``` # Path parameters export team_id="27590dae-47be-4a7d-9abf-8f4e45124020" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/teams/${team_id}/routing-rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "rules": [ { "actions": [ { "channel": "channel", "type": "send_slack_message", "workspace": "workspace" } ], "query": "tags.service:test", "time_restriction": { "time_zone": "Europe/Paris", "restrictions": [ { "end_day": "monday", "end_time": "17:00:00", "start_day": "monday", "start_time": "09:00:00" }, { "end_day": "tuesday", "end_time": "17:00:00", "start_day": "tuesday", "start_time": "09:00:00" } ] } }, { "policy_id": "ab000000-0000-0000-0000-000000000000", "query": "", "urgency": "low" } ] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "team_routing_rules" } } EOF ``` ##### Set On-Call team routing rules returns "OK" response ``` // Set On-Call team routing rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") // there is a valid "escalation_policy" in the system EscalationPolicyDataID := os.Getenv("ESCALATION_POLICY_DATA_ID") body := datadogV2.TeamRoutingRulesRequest{ Data: &datadogV2.TeamRoutingRulesRequestData{ Attributes: &datadogV2.TeamRoutingRulesRequestDataAttributes{ Rules: []datadogV2.TeamRoutingRulesRequestRule{ { Actions: []datadogV2.RoutingRuleAction{ datadogV2.RoutingRuleAction{ SendSlackMessageAction: &datadogV2.SendSlackMessageAction{ Channel: "channel", Type: datadogV2.SENDSLACKMESSAGEACTIONTYPE_SEND_SLACK_MESSAGE, Workspace: "workspace", }}, }, Query: datadog.PtrString("tags.service:test"), TimeRestriction: &datadogV2.TimeRestrictions{ TimeZone: "Europe/Paris", Restrictions: []datadogV2.TimeRestriction{ { EndDay: datadogV2.WEEKDAY_MONDAY.Ptr(), EndTime: datadog.PtrString("17:00:00"), StartDay: datadogV2.WEEKDAY_MONDAY.Ptr(), StartTime: datadog.PtrString("09:00:00"), }, { EndDay: datadogV2.WEEKDAY_TUESDAY.Ptr(), EndTime: datadog.PtrString("17:00:00"), StartDay: datadogV2.WEEKDAY_TUESDAY.Ptr(), StartTime: datadog.PtrString("09:00:00"), }, }, }, }, { PolicyId: datadog.PtrString(EscalationPolicyDataID), Query: datadog.PtrString(""), Urgency: datadogV2.URGENCY_LOW.Ptr(), }, }, }, Id: datadog.PtrString(DdTeamDataID), Type: datadogV2.TEAMROUTINGRULESREQUESTDATATYPE_TEAM_ROUTING_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration 
:= datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.SetOnCallTeamRoutingRules(ctx, DdTeamDataID, body, *datadogV2.NewSetOnCallTeamRoutingRulesOptionalParameters().WithInclude("rules")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.SetOnCallTeamRoutingRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.SetOnCallTeamRoutingRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Set On-Call team routing rules returns "OK" response ``` // Set On-Call team routing rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.api.OnCallApi.SetOnCallTeamRoutingRulesOptionalParameters; import com.datadog.api.client.v2.model.RoutingRuleAction; import com.datadog.api.client.v2.model.SendSlackMessageAction; import com.datadog.api.client.v2.model.SendSlackMessageActionType; import com.datadog.api.client.v2.model.TeamRoutingRules; import com.datadog.api.client.v2.model.TeamRoutingRulesRequest; import com.datadog.api.client.v2.model.TeamRoutingRulesRequestData; import com.datadog.api.client.v2.model.TeamRoutingRulesRequestDataAttributes; import com.datadog.api.client.v2.model.TeamRoutingRulesRequestDataType; import com.datadog.api.client.v2.model.TeamRoutingRulesRequestRule; import com.datadog.api.client.v2.model.TimeRestriction; import com.datadog.api.client.v2.model.TimeRestrictions; import com.datadog.api.client.v2.model.Urgency; import com.datadog.api.client.v2.model.Weekday; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); // there is a valid "escalation_policy" in the system String ESCALATION_POLICY_DATA_ID = System.getenv("ESCALATION_POLICY_DATA_ID"); TeamRoutingRulesRequest body = new TeamRoutingRulesRequest() .data( new TeamRoutingRulesRequestData() .attributes( new TeamRoutingRulesRequestDataAttributes() .rules( Arrays.asList( new TeamRoutingRulesRequestRule() .actions( Collections.singletonList( new RoutingRuleAction( new SendSlackMessageAction() .channel("channel") .type( SendSlackMessageActionType .SEND_SLACK_MESSAGE) .workspace("workspace")))) .query("tags.service:test") .timeRestriction( new TimeRestrictions() .timeZone("Europe/Paris") .restrictions( Arrays.asList( new TimeRestriction() .endDay(Weekday.MONDAY) .endTime("17:00:00") .startDay(Weekday.MONDAY) .startTime("09:00:00"), new TimeRestriction() .endDay(Weekday.TUESDAY) .endTime("17:00:00") .startDay(Weekday.TUESDAY) .startTime("09:00:00")))), new TeamRoutingRulesRequestRule() .policyId(ESCALATION_POLICY_DATA_ID) .query("") .urgency(Urgency.LOW)))) .id(DD_TEAM_DATA_ID) .type(TeamRoutingRulesRequestDataType.TEAM_ROUTING_RULES)); try { 
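/* The `include("rules")` optional parameter mirrors the `include` query string
   documented above: with it, the response's `included` array carries the full
   routing rule objects referenced from `data.relationships.rules.data`. */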
TeamRoutingRules result = apiInstance.setOnCallTeamRoutingRules( DD_TEAM_DATA_ID, body, new SetOnCallTeamRoutingRulesOptionalParameters().include("rules")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#setOnCallTeamRoutingRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Set On-Call team routing rules returns "OK" response ``` """ Set On-Call team routing rules returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.send_slack_message_action import SendSlackMessageAction from datadog_api_client.v2.model.send_slack_message_action_type import SendSlackMessageActionType from datadog_api_client.v2.model.team_routing_rules_request import TeamRoutingRulesRequest from datadog_api_client.v2.model.team_routing_rules_request_data import TeamRoutingRulesRequestData from datadog_api_client.v2.model.team_routing_rules_request_data_attributes import TeamRoutingRulesRequestDataAttributes from datadog_api_client.v2.model.team_routing_rules_request_data_type import TeamRoutingRulesRequestDataType from datadog_api_client.v2.model.team_routing_rules_request_rule import TeamRoutingRulesRequestRule from datadog_api_client.v2.model.time_restriction import TimeRestriction from datadog_api_client.v2.model.time_restrictions import TimeRestrictions from datadog_api_client.v2.model.urgency import Urgency from datadog_api_client.v2.model.weekday import Weekday # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = environ["ESCALATION_POLICY_DATA_ID"] body = TeamRoutingRulesRequest( data=TeamRoutingRulesRequestData( attributes=TeamRoutingRulesRequestDataAttributes( rules=[ TeamRoutingRulesRequestRule( actions=[ SendSlackMessageAction( channel="channel", type=SendSlackMessageActionType.SEND_SLACK_MESSAGE, workspace="workspace", ), ], query="tags.service:test", time_restriction=TimeRestrictions( time_zone="Europe/Paris", restrictions=[ TimeRestriction( end_day=Weekday.MONDAY, end_time="17:00:00", start_day=Weekday.MONDAY, start_time="09:00:00", ), TimeRestriction( end_day=Weekday.TUESDAY, end_time="17:00:00", start_day=Weekday.TUESDAY, start_time="09:00:00", ), ], ), ), TeamRoutingRulesRequestRule( policy_id=ESCALATION_POLICY_DATA_ID, query="", urgency=Urgency.LOW, ), ], ), id=DD_TEAM_DATA_ID, type=TeamRoutingRulesRequestDataType.TEAM_ROUTING_RULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.set_on_call_team_routing_rules(team_id=DD_TEAM_DATA_ID, include="rules", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Set On-Call team routing rules returns "OK" response ``` # Set On-Call team routing rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] # there is a valid "escalation_policy" in the system ESCALATION_POLICY_DATA_ID = ENV["ESCALATION_POLICY_DATA_ID"] body = DatadogAPIClient::V2::TeamRoutingRulesRequest.new({ data: DatadogAPIClient::V2::TeamRoutingRulesRequestData.new({ attributes: DatadogAPIClient::V2::TeamRoutingRulesRequestDataAttributes.new({ rules: [ DatadogAPIClient::V2::TeamRoutingRulesRequestRule.new({ actions: [ DatadogAPIClient::V2::SendSlackMessageAction.new({ channel: "channel", type: DatadogAPIClient::V2::SendSlackMessageActionType::SEND_SLACK_MESSAGE, workspace: "workspace", }), ], query: "tags.service:test", time_restriction: DatadogAPIClient::V2::TimeRestrictions.new({ time_zone: "Europe/Paris", restrictions: [ DatadogAPIClient::V2::TimeRestriction.new({ end_day: DatadogAPIClient::V2::Weekday::MONDAY, end_time: "17:00:00", start_day: DatadogAPIClient::V2::Weekday::MONDAY, start_time: "09:00:00", }), DatadogAPIClient::V2::TimeRestriction.new({ end_day: DatadogAPIClient::V2::Weekday::TUESDAY, end_time: "17:00:00", start_day: DatadogAPIClient::V2::Weekday::TUESDAY, start_time: "09:00:00", }), ], }), }), DatadogAPIClient::V2::TeamRoutingRulesRequestRule.new({ policy_id: ESCALATION_POLICY_DATA_ID, query: "", urgency: DatadogAPIClient::V2::Urgency::LOW, }), ], }), id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::TeamRoutingRulesRequestDataType::TEAM_ROUTING_RULES, }), }) opts = { include: "rules", } p api_instance.set_on_call_team_routing_rules(DD_TEAM_DATA_ID, body, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Set On-Call team routing rules returns "OK" response ``` // Set On-Call team routing rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use datadog_api_client::datadogV2::api_on_call::SetOnCallTeamRoutingRulesOptionalParams; use datadog_api_client::datadogV2::model::RoutingRuleAction; use datadog_api_client::datadogV2::model::SendSlackMessageAction; use datadog_api_client::datadogV2::model::SendSlackMessageActionType; use datadog_api_client::datadogV2::model::TeamRoutingRulesRequest; use datadog_api_client::datadogV2::model::TeamRoutingRulesRequestData; use datadog_api_client::datadogV2::model::TeamRoutingRulesRequestDataAttributes; use datadog_api_client::datadogV2::model::TeamRoutingRulesRequestDataType; use datadog_api_client::datadogV2::model::TeamRoutingRulesRequestRule; use datadog_api_client::datadogV2::model::TimeRestriction; use datadog_api_client::datadogV2::model::TimeRestrictions; use datadog_api_client::datadogV2::model::Urgency; use datadog_api_client::datadogV2::model::Weekday; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); // there is a valid 
"escalation_policy" in the system let escalation_policy_data_id = std::env::var("ESCALATION_POLICY_DATA_ID").unwrap(); let body = TeamRoutingRulesRequest::new().data( TeamRoutingRulesRequestData::new(TeamRoutingRulesRequestDataType::TEAM_ROUTING_RULES) .attributes(TeamRoutingRulesRequestDataAttributes::new().rules(vec![ TeamRoutingRulesRequestRule::new() .actions( vec![ RoutingRuleAction::SendSlackMessageAction( Box::new( SendSlackMessageAction::new( "channel".to_string(), SendSlackMessageActionType::SEND_SLACK_MESSAGE, "workspace".to_string(), ), ), ) ], ) .query("tags.service:test".to_string()) .time_restriction( TimeRestrictions::new( vec![ TimeRestriction::new() .end_day(Weekday::MONDAY) .end_time("17:00:00".to_string()) .start_day(Weekday::MONDAY) .start_time("09:00:00".to_string()), TimeRestriction::new() .end_day(Weekday::TUESDAY) .end_time("17:00:00".to_string()) .start_day(Weekday::TUESDAY) .start_time("09:00:00".to_string()) ], "Europe/Paris".to_string(), ), ), TeamRoutingRulesRequestRule::new() .policy_id(escalation_policy_data_id.clone()) .query("".to_string()) .urgency(Urgency::LOW) ])) .id(dd_team_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .set_on_call_team_routing_rules( dd_team_data_id.clone(), body, SetOnCallTeamRoutingRulesOptionalParams::default().include("rules".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Set On-Call team routing rules returns "OK" response ``` /** * Set On-Call team routing rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; // there is a valid "escalation_policy" in the system const ESCALATION_POLICY_DATA_ID = process.env .ESCALATION_POLICY_DATA_ID as string; const params: v2.OnCallApiSetOnCallTeamRoutingRulesRequest = { body: { data: { attributes: { rules: [ { actions: [ { channel: "channel", type: "send_slack_message", workspace: "workspace", }, ], query: "tags.service:test", timeRestriction: { timeZone: "Europe/Paris", restrictions: [ { endDay: "monday", endTime: "17:00:00", startDay: "monday", startTime: "09:00:00", }, { endDay: "tuesday", endTime: "17:00:00", startDay: "tuesday", startTime: "09:00:00", }, ], }, }, { policyId: ESCALATION_POLICY_DATA_ID, query: "", urgency: "low", }, ], }, id: DD_TEAM_DATA_ID, type: "team_routing_rules", }, }, teamId: DD_TEAM_DATA_ID, include: "rules", }; apiInstance .setOnCallTeamRoutingRules(params) .then((data: v2.TeamRoutingRules) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get scheduled on-call user](https://docs.datadoghq.com/api/latest/on-call/#get-scheduled-on-call-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-scheduled-on-call-user-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.ap2.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.datadoghq.eu/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.ddog-gov.com/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.us3.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-callhttps://api.us5.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-call ### Overview Retrieves the user who is on-call for the specified schedule at a given time. This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description schedule_id [_required_] string The ID of the schedule. #### Query Strings Name Type Description include string Specifies related resources to include in the response as a comma-separated list. Allowed value: `user`. filter[at_ts] string Retrieves the on-call user at the given timestamp (ISO-8601). Defaults to the current time if omitted." ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetScheduleOnCallUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) An on-call shift with its associated data and relationships. Field Type Description data object Data for an on-call shift. attributes object Attributes for an on-call shift. end date-time The end time of the shift. start date-time The start time of the shift. id string The `ShiftData` `id`. relationships object Relationships for an on-call shift. user object Defines the relationship between a shift and the user who is working that shift. data [_required_] object Represents a reference to the user assigned to this shift, containing the user's ID and resource type. id [_required_] string Specifies the unique identifier of the user. type [_required_] enum Indicates that the related resource is of type 'users'. Allowed enum values: `users` default: `users` type [_required_] enum Indicates that the resource is of type 'shifts'. Allowed enum values: `shifts` default: `shifts` included [ ] The `Shift` `included`. Option 1 object Represents a user object in the context of a schedule, including their `id`, type, and basic attributes. attributes object Provides basic user information for a schedule, including a name and email address. 
email string The user's email address. name string The user's name. status enum The user's status. Allowed enum values: `active,deactivated,pending` id string The unique user identifier. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "end": "2025-05-07T03:53:01.206662873Z", "start": "2025-05-07T02:53:01.206662814Z" }, "id": "00000000-0000-0000-0000-000000000000", "relationships": { "user": { "data": { "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } } }, "type": "shifts" }, "included": [ { "attributes": { "email": "foo@bar.com", "name": "User 1", "status": "" }, "id": "00000000-aba1-0000-0000-000000000000", "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
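None of the per-language examples that follow pass the query strings documented above. As an illustrative raw-HTTP sketch (plain `requests` rather than the official client library, `datadoghq.com` site, placeholder schedule ID, and an arbitrary example timestamp), the snippet below uses `filter[at_ts]` to ask who was on call at a specific ISO-8601 time and `include=user` to return that user's details in `included`.

```
# Hedged sketch: who was on call for a schedule at a given time?
# Assumes `requests` and DD_API_KEY / DD_APP_KEY in the environment;
# the schedule ID is a placeholder and the timestamp is an arbitrary example.
import os
import requests

schedule_id = "3653d3c6-0c75-11ea-ad28-fb5701eabc7d"  # placeholder schedule ID

resp = requests.get(
    f"https://api.datadoghq.com/api/v2/on-call/schedules/{schedule_id}/on-call",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "filter[at_ts]": "2025-05-07T03:00:00Z",  # defaults to "now" if omitted
        "include": "user",
    },
)
resp.raise_for_status()
shift = resp.json()
# Per the documented response shape, the shift references the user by ID.
print(shift["data"]["relationships"]["user"]["data"]["id"])
```

Error responses use the `errors` array format shown next.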
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get scheduled on-call user Copy ``` # Path parameters export schedule_id="3653d3c6-0c75-11ea-ad28-fb5701eabc7d" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/schedules/${schedule_id}/on-call" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get scheduled on-call user ``` """ Get scheduled on-call user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "schedule" in the system SCHEDULE_DATA_ID = environ["SCHEDULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_schedule_on_call_user( schedule_id=SCHEDULE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get scheduled on-call user ``` # Get scheduled on-call user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "schedule" in the system SCHEDULE_DATA_ID = ENV["SCHEDULE_DATA_ID"] p api_instance.get_schedule_on_call_user(SCHEDULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get scheduled on-call user ``` // Get scheduled on-call user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "schedule" in the system ScheduleDataID := os.Getenv("SCHEDULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetScheduleOnCallUser(ctx, ScheduleDataID, *datadogV2.NewGetScheduleOnCallUserOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.GetScheduleOnCallUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetScheduleOnCallUser`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get scheduled on-call user ``` // Get scheduled on-call user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.Shift; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "schedule" in the system String SCHEDULE_DATA_ID = System.getenv("SCHEDULE_DATA_ID"); try { Shift result = apiInstance.getScheduleOnCallUser(SCHEDULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getScheduleOnCallUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get scheduled on-call user ``` // Get scheduled on-call user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::GetScheduleOnCallUserOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "schedule" in the system let schedule_data_id = std::env::var("SCHEDULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_schedule_on_call_user( schedule_data_id.clone(), GetScheduleOnCallUserOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get scheduled on-call user ``` /** * Get scheduled on-call user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "schedule" in the system const SCHEDULE_DATA_ID = process.env.SCHEDULE_DATA_ID as string; const params: v2.OnCallApiGetScheduleOnCallUserRequest = { scheduleId: SCHEDULE_DATA_ID, }; apiInstance .getScheduleOnCallUser(params) .then((data: v2.Shift) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get team on-call users](https://docs.datadoghq.com/api/latest/on-call/#get-team-on-call-users) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-team-on-call-users-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/teams/{team_id}/on-callhttps://api.ap2.datadoghq.com/api/v2/on-call/teams/{team_id}/on-callhttps://api.datadoghq.eu/api/v2/on-call/teams/{team_id}/on-callhttps://api.ddog-gov.com/api/v2/on-call/teams/{team_id}/on-callhttps://api.datadoghq.com/api/v2/on-call/teams/{team_id}/on-callhttps://api.us3.datadoghq.com/api/v2/on-call/teams/{team_id}/on-callhttps://api.us5.datadoghq.com/api/v2/on-call/teams/{team_id}/on-call ### Overview Get a team’s on-call users at a given time This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string The team ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `responders`, `escalations`, `escalations.responders`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetTeamOnCallUsers-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Root object representing a team’s on-call responder configuration. Field Type Description data object Defines the main on-call responder object for a team, including relationships and metadata. id string Unique identifier of the on-call responder configuration. relationships object Relationship objects linked to a team's on-call responder configuration, including escalations and responders. escalations object Defines the escalation policy steps linked to the team's on-call configuration. data [object] Array of escalation step references. id [_required_] string Unique identifier of the escalation step. type [_required_] enum Identifies the resource type for escalation policy steps linked to a team's on-call configuration. Allowed enum values: `escalation_policy_steps` default: `escalation_policy_steps` responders object Defines the list of users assigned as on-call responders for the team. data [object] Array of user references associated as responders. id [_required_] string Unique identifier of the responder. type [_required_] enum Identifies the resource type for individual user entities associated with on-call response. Allowed enum values: `users` default: `users` type [_required_] enum Represents the resource type for a group of users assigned to handle on-call duties within a team. 
Allowed enum values: `team_oncall_responders` default: `team_oncall_responders` included [ ] The `TeamOnCallResponders` `included`. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Represents an escalation policy step. id string Unique identifier of the escalation step. relationships object Contains the relationships of an escalation object, including its responders. responders object Lists the users involved in a specific step of the escalation policy. data [object] Array of user references assigned as responders for this escalation step. id [_required_] string Unique identifier of the user assigned to the escalation step. type [_required_] enum Represents the resource type for users assigned as responders in an escalation step. Allowed enum values: `users` default: `users` type [_required_] enum Represents the resource type for individual steps in an escalation policy used during incident response. 
Allowed enum values: `escalation_policy_steps` default: `escalation_policy_steps` ``` { "data": { "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "relationships": { "escalations": { "data": [ { "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "type": "escalation_policy_steps" } ] }, "responders": { "data": [ { "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "type": "users" } ] } }, "type": "team_oncall_responders" }, "included": [ { "attributes": { "email": "test@test.com", "name": "Test User", "status": "active" }, "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "type": "users" }, { "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "relationships": { "responders": { "data": [ { "id": "111ee23r-aaaaa-aaaa-aaww-1234wertsd23", "type": "users" } ] } }, "type": "escalation_policy_steps" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
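The example response above references responders by ID under `data.relationships` and ships the matching user objects in `included`. As a hedged consumption sketch (plain `requests` rather than the official client library, `datadoghq.com` site, placeholder team ID), the snippet below requests the documented `include` values and resolves each responder reference to the included user's name and email.

```
# Hedged sketch: list a team's current on-call responders with their user
# details. Assumes `requests` and DD_API_KEY / DD_APP_KEY in the environment;
# the team ID is a placeholder.
import os
import requests

team_id = "27590dae-47be-4a7d-9abf-8f4e45124020"  # placeholder team ID

resp = requests.get(
    f"https://api.datadoghq.com/api/v2/on-call/teams/{team_id}/on-call",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={"include": "responders,escalations.responders"},
)
resp.raise_for_status()
body = resp.json()

# Index the included user objects by ID, then resolve the responder references
# found under data.relationships.responders.data (per the documented shape).
users = {item["id"]: item for item in body.get("included", []) if item.get("type") == "users"}
for ref in body["data"]["relationships"]["responders"]["data"]:
    attrs = users.get(ref["id"], {}).get("attributes", {})
    print(ref["id"], attrs.get("name"), attrs.get("email"))
```

Non-2xx responses return the `errors` array shown in the error example that follows.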
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get team on-call users Copy ``` # Path parameters export team_id="27590dae-47be-4a7d-9abf-8f4e45124020" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/teams/${team_id}/on-call" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get team on-call users ``` """ Get team on-call users returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there are valid "routing_rules" in the system ROUTING_RULES_DATA_ID = environ["ROUTING_RULES_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_team_on_call_users( include="responders,escalations.responders", team_id=ROUTING_RULES_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get team on-call users ``` # Get team on-call users returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there are valid "routing_rules" in the system ROUTING_RULES_DATA_ID = ENV["ROUTING_RULES_DATA_ID"] opts = { include: "responders,escalations.responders", } p api_instance.get_team_on_call_users(ROUTING_RULES_DATA_ID, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get team on-call users ``` // Get team on-call users returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there are valid "routing_rules" in the system RoutingRulesDataID := os.Getenv("ROUTING_RULES_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetTeamOnCallUsers(ctx, RoutingRulesDataID, *datadogV2.NewGetTeamOnCallUsersOptionalParameters().WithInclude("responders,escalations.responders")) if err != nil { fmt.Fprintf(os.Stderr, 
"Error when calling `OnCallApi.GetTeamOnCallUsers`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetTeamOnCallUsers`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get team on-call users ``` // Get team on-call users returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.api.OnCallApi.GetTeamOnCallUsersOptionalParameters; import com.datadog.api.client.v2.model.TeamOnCallResponders; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there are valid "routing_rules" in the system String ROUTING_RULES_DATA_ID = System.getenv("ROUTING_RULES_DATA_ID"); try { TeamOnCallResponders result = apiInstance.getTeamOnCallUsers( ROUTING_RULES_DATA_ID, new GetTeamOnCallUsersOptionalParameters() .include("responders,escalations.responders")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getTeamOnCallUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get team on-call users ``` // Get team on-call users returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::GetTeamOnCallUsersOptionalParams; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there are valid "routing_rules" in the system let routing_rules_data_id = std::env::var("ROUTING_RULES_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_team_on_call_users( routing_rules_data_id.clone(), GetTeamOnCallUsersOptionalParams::default() .include("responders,escalations.responders".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get team on-call users ``` /** * Get team on-call users returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const 
apiInstance = new v2.OnCallApi(configuration); // there are valid "routing_rules" in the system const ROUTING_RULES_DATA_ID = process.env.ROUTING_RULES_DATA_ID as string; const params: v2.OnCallApiGetTeamOnCallUsersRequest = { include: "responders,escalations.responders", teamId: ROUTING_RULES_DATA_ID, }; apiInstance .getTeamOnCallUsers(params) .then((data: v2.TeamOnCallResponders) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/on-call/#delete-an-on-call-notification-channel-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#delete-an-on-call-notification-channel-for-a-user-v2) DELETE https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id} ### Overview Delete a notification channel for a user. The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_respond` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID channel_id [_required_] string The channel ID ### Response * [204](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-204-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationChannel-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Delete an On-Call notification channel for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" export channel_id="00000000-0000-0000-0000-000000000000" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/users/${user_id}/notification-channels/${channel_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an On-Call notification channel for a user ``` """ Delete an On-Call notification channel for a user returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "oncall_email_notification_channel" in the system ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = environ["ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) api_instance.delete_user_notification_channel( user_id=USER_DATA_ID, channel_id=ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an On-Call notification channel for a user ``` # Delete an On-Call notification channel for a user returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "oncall_email_notification_channel" in the system ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = ENV["ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"] api_instance.delete_user_notification_channel(USER_DATA_ID, ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then 
save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an On-Call notification channel for a user ``` // Delete an On-Call notification channel for a user returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "oncall_email_notification_channel" in the system OncallEmailNotificationChannelDataID := os.Getenv("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) r, err := api.DeleteUserNotificationChannel(ctx, UserDataID, OncallEmailNotificationChannelDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.DeleteUserNotificationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an On-Call notification channel for a user ``` // Delete an On-Call notification channel for a user returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "oncall_email_notification_channel" in the system String ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = System.getenv("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"); try { apiInstance.deleteUserNotificationChannel( USER_DATA_ID, ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#deleteUserNotificationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an On-Call notification channel for a user ``` // Delete an On-Call notification channel for a user returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid 
"oncall_email_notification_channel" in the system let oncall_email_notification_channel_data_id = std::env::var("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .delete_user_notification_channel( user_data_id.clone(), oncall_email_notification_channel_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an On-Call notification channel for a user ``` /** * Delete an On-Call notification channel for a user returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; // there is a valid "oncall_email_notification_channel" in the system const ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = process.env .ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID as string; const params: v2.OnCallApiDeleteUserNotificationChannelRequest = { userId: USER_DATA_ID, channelId: ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID, }; apiInstance .deleteUserNotificationChannel(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/on-call/#get-an-on-call-notification-channel-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-an-on-call-notification-channel-for-a-user-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id}https://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels/{channel_id} ### Overview Get a notification channel for a user. The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_read` permission. 
### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID channel_id [_required_] string The channel ID ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationChannel-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) A top-level wrapper for a user notification channel Field Type Description data object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "config": { "address": "foo@bar.com", "formats": [ "html" ], "type": "email" } }, "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "type": "notification_channels" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Get an On-Call notification channel for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" export channel_id="00000000-0000-0000-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/users/${user_id}/notification-channels/${channel_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an On-Call notification channel for a user ``` """ Get an On-Call notification channel for a user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] # there is a valid "oncall_email_notification_channel" in the system ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = environ["ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.get_user_notification_channel( user_id=USER_DATA_ID, channel_id=ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an On-Call notification channel for a user ``` # Get an On-Call notification channel for a user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] # there is a valid "oncall_email_notification_channel" in the system ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = 
ENV["ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"] p api_instance.get_user_notification_channel(USER_DATA_ID, ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an On-Call notification channel for a user ``` // Get an On-Call notification channel for a user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") // there is a valid "oncall_email_notification_channel" in the system OncallEmailNotificationChannelDataID := os.Getenv("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.GetUserNotificationChannel(ctx, UserDataID, OncallEmailNotificationChannelDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.GetUserNotificationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.GetUserNotificationChannel`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an On-Call notification channel for a user ``` // Get an On-Call notification channel for a user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.NotificationChannel; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); // there is a valid "oncall_email_notification_channel" in the system String ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = System.getenv("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID"); try { NotificationChannel result = apiInstance.getUserNotificationChannel( USER_DATA_ID, ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#getUserNotificationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an On-Call notification channel for a user ``` // Get an On-Call notification channel for a user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); // there is a valid "oncall_email_notification_channel" in the system let oncall_email_notification_channel_data_id = std::env::var("ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .get_user_notification_channel( user_data_id.clone(), oncall_email_notification_channel_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an On-Call notification channel for a user ``` /** * Get an On-Call notification channel for a user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; // there is a valid "oncall_email_notification_channel" in the system const ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID = process.env .ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID as string; const params: v2.OnCallApiGetUserNotificationChannelRequest = { userId: USER_DATA_ID, channelId: ONCALL_EMAIL_NOTIFICATION_CHANNEL_DATA_ID, }; apiInstance .getUserNotificationChannel(params) .then((data: v2.NotificationChannel) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List On-Call notification channels for a user](https://docs.datadoghq.com/api/latest/on-call/#list-on-call-notification-channels-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#list-on-call-notification-channels-for-a-user-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-channelshttps://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels ### Overview List the notification channels for a user. The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationChannels-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Response type for listing notification channels for a user Field Type Description data [object] attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. 
type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": [ { "attributes": { "active": false, "config": { "formatted_number": "", "number": "", "region": "", "sms_subscribed_at": "2019-09-19T10:00:00.000Z", "type": "phone", "verified": false } }, "id": "string", "type": "notification_channels" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### List On-Call notification channels for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/users/${user_id}/notification-channels" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List On-Call notification channels for a user ``` """ List On-Call notification channels for a user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.list_user_notification_channels( user_id=USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List On-Call notification channels for a user ``` # List On-Call notification channels for a user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] p api_instance.list_user_notification_channels(USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List On-Call notification channels for a user ``` // List On-Call notification channels for a user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.ListUserNotificationChannels(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.ListUserNotificationChannels`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP 
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.ListUserNotificationChannels`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List On-Call notification channels for a user ``` // List On-Call notification channels for a user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.ListNotificationChannelsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { ListNotificationChannelsResponse result = apiInstance.listUserNotificationChannels(USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#listUserNotificationChannels"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List On-Call notification channels for a user ``` // List On-Call notification channels for a user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .list_user_notification_channels(user_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List On-Call notification channels for a user ``` /** * List On-Call notification channels for a user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.OnCallApiListUserNotificationChannelsRequest = { userId: USER_DATA_ID, }; apiInstance .listUserNotificationChannels(params) .then((data: v2.ListNotificationChannelsResponse) 
=> { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an On-Call notification channel for a user](https://docs.datadoghq.com/api/latest/on-call/#create-an-on-call-notification-channel-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#create-an-on-call-notification-channel-for-a-user-v2) POST https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-channelshttps://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channelshttps://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-channels ### Overview Create a new notification channel for a user. The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_respond` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data [_required_] object Data for creating an on-call notification channel attributes object Attributes for creating an on-call notification channel. config Notification channel configuration Option 1 object Configuration to create a phone notification channel number [_required_] string The E-164 formatted phone number (e.g. +3371234567) type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` Option 2 object Configuration to create an e-mail notification channel address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` type [_required_] enum Indicates that the resource is of type 'notification_channels'. 
Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "config": { "address": "foo@bar.com", "formats": [ "html" ], "type": "email" } }, "type": "notification_channels" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-201-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationChannel-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) A top-level wrapper for a user notification channel Field Type Description data object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "config": { "address": "foo@bar.com", "formats": [ "html" ], "type": "email" } }, "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "type": "notification_channels" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/on-call/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/on-call/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/on-call/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/on-call/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/on-call/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/on-call/?code-lang=typescript) ##### Create an On-Call notification channel for a user returns "Created" response Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/on-call/users/${user_id}/notification-channels" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "config": { "address": "foo@bar.com", "formats": [ "html" ], "type": "email" } }, "type": "notification_channels" } } EOF ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` // Create an On-Call notification channel for a user returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.CreateUserNotificationChannelRequest{ Data: datadogV2.CreateNotificationChannelData{ Attributes: &datadogV2.CreateNotificationChannelAttributes{ Config: &datadogV2.CreateNotificationChannelConfig{ CreateEmailNotificationChannelConfig: &datadogV2.CreateEmailNotificationChannelConfig{ Address: "foo@bar.com", Formats: []datadogV2.NotificationChannelEmailFormatType{ datadogV2.NOTIFICATIONCHANNELEMAILFORMATTYPE_HTML, }, Type: datadogV2.NOTIFICATIONCHANNELEMAILCONFIGTYPE_EMAIL, }}, }, Type: datadogV2.NOTIFICATIONCHANNELTYPE_NOTIFICATION_CHANNELS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOnCallApi(apiClient) resp, r, err := api.CreateUserNotificationChannel(ctx, UserDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OnCallApi.CreateUserNotificationChannel`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OnCallApi.CreateUserNotificationChannel`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` // Create an On-Call notification channel for a user returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OnCallApi; import com.datadog.api.client.v2.model.CreateEmailNotificationChannelConfig; import com.datadog.api.client.v2.model.CreateNotificationChannelAttributes; import com.datadog.api.client.v2.model.CreateNotificationChannelConfig; import com.datadog.api.client.v2.model.CreateNotificationChannelData; import com.datadog.api.client.v2.model.CreateUserNotificationChannelRequest; import com.datadog.api.client.v2.model.NotificationChannel; import com.datadog.api.client.v2.model.NotificationChannelEmailConfigType; import com.datadog.api.client.v2.model.NotificationChannelEmailFormatType; import com.datadog.api.client.v2.model.NotificationChannelType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OnCallApi apiInstance = new OnCallApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); CreateUserNotificationChannelRequest body = new CreateUserNotificationChannelRequest() .data( new CreateNotificationChannelData() .attributes( new CreateNotificationChannelAttributes() .config( new CreateNotificationChannelConfig( new CreateEmailNotificationChannelConfig() .address("foo@bar.com") .formats( Collections.singletonList( NotificationChannelEmailFormatType.HTML)) .type(NotificationChannelEmailConfigType.EMAIL)))) .type(NotificationChannelType.NOTIFICATION_CHANNELS)); try { NotificationChannel result = apiInstance.createUserNotificationChannel(USER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OnCallApi#createUserNotificationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` """ Create an On-Call notification channel for a user returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.on_call_api import OnCallApi from datadog_api_client.v2.model.create_email_notification_channel_config import 
CreateEmailNotificationChannelConfig from datadog_api_client.v2.model.create_notification_channel_attributes import CreateNotificationChannelAttributes from datadog_api_client.v2.model.create_notification_channel_data import CreateNotificationChannelData from datadog_api_client.v2.model.create_user_notification_channel_request import CreateUserNotificationChannelRequest from datadog_api_client.v2.model.notification_channel_email_config_type import NotificationChannelEmailConfigType from datadog_api_client.v2.model.notification_channel_email_format_type import NotificationChannelEmailFormatType from datadog_api_client.v2.model.notification_channel_type import NotificationChannelType # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = CreateUserNotificationChannelRequest( data=CreateNotificationChannelData( attributes=CreateNotificationChannelAttributes( config=CreateEmailNotificationChannelConfig( address="foo@bar.com", formats=[ NotificationChannelEmailFormatType.HTML, ], type=NotificationChannelEmailConfigType.EMAIL, ), ), type=NotificationChannelType.NOTIFICATION_CHANNELS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OnCallApi(api_client) response = api_instance.create_user_notification_channel(user_id=USER_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` # Create an On-Call notification channel for a user returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OnCallAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::CreateUserNotificationChannelRequest.new({ data: DatadogAPIClient::V2::CreateNotificationChannelData.new({ attributes: DatadogAPIClient::V2::CreateNotificationChannelAttributes.new({ config: DatadogAPIClient::V2::CreateEmailNotificationChannelConfig.new({ address: "foo@bar.com", formats: [ DatadogAPIClient::V2::NotificationChannelEmailFormatType::HTML, ], type: DatadogAPIClient::V2::NotificationChannelEmailConfigType::EMAIL, }), }), type: DatadogAPIClient::V2::NotificationChannelType::NOTIFICATION_CHANNELS, }), }) p api_instance.create_user_notification_channel(USER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` // Create an On-Call notification channel for a user returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_on_call::OnCallAPI; use datadog_api_client::datadogV2::model::CreateEmailNotificationChannelConfig; use datadog_api_client::datadogV2::model::CreateNotificationChannelAttributes; use datadog_api_client::datadogV2::model::CreateNotificationChannelConfig; use 
datadog_api_client::datadogV2::model::CreateNotificationChannelData; use datadog_api_client::datadogV2::model::CreateUserNotificationChannelRequest; use datadog_api_client::datadogV2::model::NotificationChannelEmailConfigType; use datadog_api_client::datadogV2::model::NotificationChannelEmailFormatType; use datadog_api_client::datadogV2::model::NotificationChannelType; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = CreateUserNotificationChannelRequest::new( CreateNotificationChannelData::new(NotificationChannelType::NOTIFICATION_CHANNELS) .attributes(CreateNotificationChannelAttributes::new().config( CreateNotificationChannelConfig::CreateEmailNotificationChannelConfig(Box::new( CreateEmailNotificationChannelConfig::new( "foo@bar.com".to_string(), vec![NotificationChannelEmailFormatType::HTML], NotificationChannelEmailConfigType::EMAIL, ), )), )), ); let configuration = datadog::Configuration::new(); let api = OnCallAPI::with_config(configuration); let resp = api .create_user_notification_channel(user_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an On-Call notification channel for a user returns "Created" response ``` /** * Create an On-Call notification channel for a user returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OnCallApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.OnCallApiCreateUserNotificationChannelRequest = { body: { data: { attributes: { config: { address: "foo@bar.com", formats: ["html"], type: "email", }, }, type: "notification_channels", }, }, userId: USER_DATA_ID, }; apiInstance .createUserNotificationChannel(params) .then((data: v2.NotificationChannel) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/on-call/#create-an-on-call-notification-rule-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#create-an-on-call-notification-rule-for-a-user-v2) POST https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules ### Overview Create a new notification rule for a user. The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_respond` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data [_required_] object Data for creating an on-call notification rule attributes object Attributes for creating or modifying an on-call notification rule. category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. 
Allowed enum values: `notification_rules` default: `notification_rules` ``` { "data": { "attributes": { "category": "high_urgency", "delay_minutes": 0 }, "relationships": { "channel": { "data": { "id": "7c1ce93f-30a5-596a-0f7b-06bfe153704c", "type": "notification_channels" } } }, "type": "notification_rules" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#CreateUserNotificationRule-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) A top-level wrapper for a notification rule Field Type Description data [_required_] object Data for an on-call notification rule attributes object Attributes for an on-call notification rule. category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation id string Unique identifier for the rule relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. Allowed enum values: `notification_rules` default: `notification_rules` included [ ] Option 1 object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. 
sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "category": "high_urgency", "channel_settings": { "method": "sms", "type": "phone" }, "delay_minutes": 1 }, "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "relationships": { "channel": { "data": { "id": "1562fab3-a8c2-49e2-8f3a-28dcda2405e2", "type": "notification_channels" } } }, "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) ##### Create an On-Call notification rule for a user returns "Created" response Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" # Curl command (api.datadoghq.com shown; replace with your site's API host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X POST "https://api.datadoghq.com/api/v2/on-call/users/${user_id}/notification-rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "high_urgency", "delay_minutes": 0 }, "relationships": { "channel": { "data": { "id": "7c1ce93f-30a5-596a-0f7b-06bfe153704c", "type": "notification_channels" } } }, "type": "notification_rules" } } EOF ``` * * * ## [List On-Call notification rules for a user](https://docs.datadoghq.com/api/latest/on-call/#list-on-call-notification-rules-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#list-on-call-notification-rules-for-a-user-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-ruleshttps://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules ### Overview List the notification rules for a user. The authenticated user must be the target user or have the `on_call_admin` permission. This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `channel`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#ListUserNotificationRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Response type for listing notification rules for a user Field Type Description data [object] attributes object Attributes for an on-call notification rule. 
category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation id string Unique identifier for the rule relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. Allowed enum values: `notification_rules` default: `notification_rules` included [ ] Option 1 object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. 
Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": [ { "attributes": { "category": "string", "channel_settings": { "method": "sms", "type": "phone" }, "delay_minutes": "integer" }, "id": "string", "relationships": { "channel": { "data": { "id": "string", "type": "notification_channels" } } }, "type": "notification_rules" } ], "included": [ { "attributes": { "active": false, "config": { "formatted_number": "", "number": "", "region": "", "sms_subscribed_at": "2019-09-19T10:00:00.000Z", "type": "phone", "verified": false } }, "id": "string", "type": "notification_channels" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) ##### List On-Call notification rules for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" # Curl command (api.datadoghq.com shown; replace with your site's API host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/v2/on-call/users/${user_id}/notification-rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Delete an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/on-call/#delete-an-on-call-notification-rule-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#delete-an-on-call-notification-rule-for-a-user-v2) DELETE https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id} ### Overview Delete a notification rule for a user. The authenticated user must be the target user or have the `on_call_admin` permission. This endpoint requires the `on_call_respond` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID rule_id [_required_] string The rule ID ### Response * [204](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-204-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#DeleteUserNotificationRule-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) ##### Delete an On-Call notification rule for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" export rule_id="00000000-0000-0000-0000-000000000000" # Curl command (api.datadoghq.com shown; replace with your site's API host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X DELETE "https://api.datadoghq.com/api/v2/on-call/users/${user_id}/notification-rules/${rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Get an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/on-call/#get-an-on-call-notification-rule-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#get-an-on-call-notification-rule-for-a-user-v2) GET https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id} ### Overview Get a notification rule for a user. The authenticated user must be the target user or have the `on_call_admin` permission. This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID rule_id [_required_] string The rule ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `channel`. ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#GetUserNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) A top-level wrapper for a notification rule Field Type Description data [_required_] object Data for an on-call notification rule attributes object Attributes for an on-call notification rule. 
category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation id string Unique identifier for the rule relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. Allowed enum values: `notification_rules` default: `notification_rules` included [ ] Option 1 object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. 
Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "category": "high_urgency", "channel_settings": { "method": "sms", "type": "phone" }, "delay_minutes": 1 }, "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "relationships": { "channel": { "data": { "id": "1562fab3-a8c2-49e2-8f3a-28dcda2405e2", "type": "notification_channels" } } }, "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) ##### Get an On-Call notification rule for a user Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" export rule_id="00000000-0000-0000-0000-000000000000" # Curl command (api.datadoghq.com shown; replace with your site's API host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/v2/on-call/users/${user_id}/notification-rules/${rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Update an On-Call notification rule for a user](https://docs.datadoghq.com/api/latest/on-call/#update-an-on-call-notification-rule-for-a-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/on-call/#update-an-on-call-notification-rule-for-a-user-v2) PUT https://api.ap1.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/on-call/users/{user_id}/notification-rules/{rule_id} ### Overview Update a notification rule for a user. 
The authenticated user must be the target user or have the `on_call_admin` permission This endpoint requires the `on_call_read` permission. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The user ID rule_id [_required_] string The rule ID #### Query Strings Name Type Description include string Comma-separated list of included relationships to be returned. Allowed values: `channel`. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) Field Type Description data [_required_] object Data for updating an on-call notification rule attributes object Attributes for creating or modifying an on-call notification rule. category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation id string Unique identifier for the rule relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. Allowed enum values: `notification_rules` default: `notification_rules` ``` { "data": { "attributes": { "category": "high_urgency", "delay_minutes": 1 }, "id": "4d00de6b-e1c1-e5fa-b238-57acec728d0d", "relationships": { "channel": { "data": { "id": "7c1ce93f-30a5-596a-0f7b-06bfe153704c", "type": "notification_channels" } } }, "type": "notification_rules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/on-call/#UpdateUserNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) A top-level wrapper for a notification rule Field Type Description data [_required_] object Data for an on-call notification rule attributes object Attributes for an on-call notification rule. 
category enum Specifies the category a notification rule will apply to Allowed enum values: `high_urgency,low_urgency` default: `high_urgency` channel_settings Configuration for the associated channel, if necessary Option 1 object Configuration for using a phone notification channel in a notification rule method [_required_] enum Specifies the method in which a phone is used in a notification rule Allowed enum values: `sms,voice` type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` delay_minutes int64 The number of minutes that will elapse before this rule is evaluated. 0 indicates immediate evaluation id string Unique identifier for the rule relationships object Relationship object for creating a notification rule channel object Relationship object for creating a notification rule data [_required_] object Channel relationship data for creating a notification rule id string ID of the notification channel type enum Indicates that the resource is of type 'notification_channels'. Allowed enum values: `notification_channels` default: `notification_channels` type [_required_] enum Indicates that the resource is of type 'notification_rules'. Allowed enum values: `notification_rules` default: `notification_rules` included [ ] Option 1 object Data for an on-call notification channel attributes object Attributes for an on-call notification channel. active boolean Whether the notification channel is currently active. config Notification channel configuration Option 1 object Phone notification channel configuration formatted_number [_required_] string The formatted international version of Number (e.g. +33 7 1 23 45 67). number [_required_] string The E-164 formatted phone number (e.g. +3371234567) region [_required_] string The ISO 3166-1 alpha-2 two-letter country code. sms_subscribed_at date-time If present, the date the user subscribed this number to SMS messages type [_required_] enum Indicates that the notification channel is a phone Allowed enum values: `phone` default: `phone` verified [_required_] boolean Indicates whether this phone has been verified by the user in Datadog On-Call Option 2 object Email notification channel configuration address [_required_] string The e-mail address to be notified formats [_required_] [string] Preferred content formats for notifications. type [_required_] enum Indicates that the notification channel is an e-mail address Allowed enum values: `email` default: `email` Option 3 object Push notification channel configuration application_name [_required_] string The name of the application used to receive push notifications device_name [_required_] string The name of the mobile device being used type [_required_] enum Indicates that the notification channel is a mobile device for push notifications Allowed enum values: `push` default: `push` id string Unique identifier for the channel type [_required_] enum Indicates that the resource is of type 'notification_channels'. 
Allowed enum values: `notification_channels` default: `notification_channels` ``` { "data": { "attributes": { "category": "high_urgency", "channel_settings": { "method": "sms", "type": "phone" }, "delay_minutes": 1 }, "id": "27590dae-47be-4a7d-9abf-8f4e45124020", "relationships": { "channel": { "data": { "id": "1562fab3-a8c2-49e2-8f3a-28dcda2405e2", "type": "notification_channels" } } }, "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/on-call/) * [Example](https://docs.datadoghq.com/api/latest/on-call/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/on-call/?code-lang=curl) ##### Update an On-Call notification rule for a user returns "OK" response Copy ``` # Path parameters export user_id="00000000-0000-0000-0000-000000000000" export rule_id="00000000-0000-0000-0000-000000000000" # Curl command (api.datadoghq.com shown; replace with your site's API host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X PUT "https://api.datadoghq.com/api/v2/on-call/users/${user_id}/notification-rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "high_urgency", "delay_minutes": 1 }, "id": "4d00de6b-e1c1-e5fa-b238-57acec728d0d", "relationships": { "channel": { "data": { "id": "7c1ce93f-30a5-596a-0f7b-06bfe153704c", "type": "notification_channels" } } }, "type": "notification_rules" } } EOF ``` * * * 
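The notification rule endpoints above are documented with curl examples only. For comparison, here is a minimal Python sketch of the same create call, assembled from the documented path, headers, and request body. It is not from the official docs: it uses the third-party `requests` package rather than a Datadog client library, reads `DD_SITE`, `DD_API_KEY`, `DD_APP_KEY`, and `USER_DATA_ID` from the environment, and reuses the placeholder channel ID from the example payload.

```
# Minimal sketch (not an official client example): create an On-Call
# notification rule for a user with the documented REST endpoint.
# Assumes DD_API_KEY, DD_APP_KEY, and USER_DATA_ID are set in the
# environment; DD_SITE defaults to the US1 site.
import os

import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
BASE_URL = f"https://api.{DD_SITE}/api/v2"
HEADERS = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

user_id = os.environ["USER_DATA_ID"]  # ID of the target user

# Body matches the documented request schema for notification rules.
body = {
    "data": {
        "attributes": {"category": "high_urgency", "delay_minutes": 0},
        "relationships": {
            "channel": {
                "data": {
                    # Placeholder ID of an existing notification channel
                    "id": "7c1ce93f-30a5-596a-0f7b-06bfe153704c",
                    "type": "notification_channels",
                }
            }
        },
        "type": "notification_rules",
    }
}

resp = requests.post(
    f"{BASE_URL}/on-call/users/{user_id}/notification-rules",
    headers=HEADERS,
    json=body,
    timeout=10,
)
resp.raise_for_status()  # expects 201 Created on success
print(resp.json()["data"]["id"])
```

The list, get, update, and delete calls documented above follow the same pattern; only the HTTP method and the trailing `/{rule_id}` path segment change.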
--- # Source: https://docs.datadoghq.com/api/latest/opsgenie-integration/ # Opsgenie Integration Configure your [Datadog Opsgenie integration](https://docs.datadoghq.com/integrations/opsgenie/) directly through the Datadog API. ## [Get all service objects](https://docs.datadoghq.com/api/latest/opsgenie-integration/#get-all-service-objects) * [v2 (latest)](https://docs.datadoghq.com/api/latest/opsgenie-integration/#get-all-service-objects-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.ap2.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.datadoghq.eu/api/v2/integration/opsgenie/serviceshttps://api.ddog-gov.com/api/v2/integration/opsgenie/serviceshttps://api.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.us3.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.us5.datadoghq.com/api/v2/integration/opsgenie/services ### Overview Get a list of all services from the Datadog Opsgenie integration. This endpoint requires the `integrations_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/opsgenie-integration/#ListOpsgenieServices-200-v2) * [403](https://docs.datadoghq.com/api/latest/opsgenie-integration/#ListOpsgenieServices-403-v2) * [429](https://docs.datadoghq.com/api/latest/opsgenie-integration/#ListOpsgenieServices-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Response with a list of Opsgenie services. Field Type Description data [_required_] [object] An array of Opsgenie services. attributes [_required_] object The attributes from an Opsgenie service response. custom_url string The custom URL for a custom region. name string The name for the Opsgenie service. region enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` id [_required_] string The ID of the Opsgenie service. type [_required_] enum Opsgenie service resource type. 
Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": [ { "attributes": { "custom_url": null, "name": "fake-opsgenie-service-name", "region": "us" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=typescript) ##### Get all service objects Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all service objects ``` """ Get all service objects returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.opsgenie_integration_api import OpsgenieIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OpsgenieIntegrationApi(api_client) response = api_instance.list_opsgenie_services() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all service objects ``` # Get all service objects returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OpsgenieIntegrationAPI.new p api_instance.list_opsgenie_services() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all service objects ``` // Get all service objects returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) 
func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOpsgenieIntegrationApi(apiClient) resp, r, err := api.ListOpsgenieServices(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OpsgenieIntegrationApi.ListOpsgenieServices`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OpsgenieIntegrationApi.ListOpsgenieServices`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all service objects ``` // Get all service objects returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OpsgenieIntegrationApi; import com.datadog.api.client.v2.model.OpsgenieServicesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OpsgenieIntegrationApi apiInstance = new OpsgenieIntegrationApi(defaultClient); try { OpsgenieServicesResponse result = apiInstance.listOpsgenieServices(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OpsgenieIntegrationApi#listOpsgenieServices"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all service objects ``` // Get all service objects returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_opsgenie_integration::OpsgenieIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OpsgenieIntegrationAPI::with_config(configuration); let resp = api.list_opsgenie_services().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all service objects ``` /** * Get all service objects returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OpsgenieIntegrationApi(configuration); apiInstance .listOpsgenieServices() .then((data: v2.OpsgenieServicesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new service object](https://docs.datadoghq.com/api/latest/opsgenie-integration/#create-a-new-service-object) * [v2 (latest)](https://docs.datadoghq.com/api/latest/opsgenie-integration/#create-a-new-service-object-v2) POST https://api.ap1.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.ap2.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.datadoghq.eu/api/v2/integration/opsgenie/serviceshttps://api.ddog-gov.com/api/v2/integration/opsgenie/serviceshttps://api.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.us3.datadoghq.com/api/v2/integration/opsgenie/serviceshttps://api.us5.datadoghq.com/api/v2/integration/opsgenie/services ### Overview Create a new service object in the Opsgenie integration. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Opsgenie service payload * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Field Type Description data [_required_] object Opsgenie service data for a create request. attributes [_required_] object The Opsgenie service attributes for a create request. custom_url string The custom URL for a custom region. name [_required_] string The name for the Opsgenie service. opsgenie_api_key [_required_] string The Opsgenie API key for your Opsgenie service. region [_required_] enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` type [_required_] enum Opsgenie service resource type. Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": { "attributes": { "name": "Example-Opsgenie-Integration", "opsgenie_api_key": "00000000-0000-0000-0000-000000000000", "region": "us" }, "type": "opsgenie-service" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/opsgenie-integration/#CreateOpsgenieService-201-v2) * [400](https://docs.datadoghq.com/api/latest/opsgenie-integration/#CreateOpsgenieService-400-v2) * [403](https://docs.datadoghq.com/api/latest/opsgenie-integration/#CreateOpsgenieService-403-v2) * [409](https://docs.datadoghq.com/api/latest/opsgenie-integration/#CreateOpsgenieService-409-v2) * [429](https://docs.datadoghq.com/api/latest/opsgenie-integration/#CreateOpsgenieService-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Response of an Opsgenie service. Field Type Description data [_required_] object Opsgenie service data from a response. attributes [_required_] object The attributes from an Opsgenie service response. custom_url string The custom URL for a custom region. name string The name for the Opsgenie service. region enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` id [_required_] string The ID of the Opsgenie service. type [_required_] enum Opsgenie service resource type. 
Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": { "attributes": { "custom_url": null, "name": "fake-opsgenie-service-name", "region": "us" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=typescript) ##### Create a new service object returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "Example-Opsgenie-Integration", "opsgenie_api_key": "00000000-0000-0000-0000-000000000000", "region": "us" }, "type": "opsgenie-service" } } EOF ``` ##### Create a new service object returns "CREATED" response ``` // Create a new service object returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.OpsgenieServiceCreateRequest{ Data: datadogV2.OpsgenieServiceCreateData{ Attributes: datadogV2.OpsgenieServiceCreateAttributes{ Name: "Example-Opsgenie-Integration", OpsgenieApiKey: "00000000-0000-0000-0000-000000000000", Region: datadogV2.OPSGENIESERVICEREGIONTYPE_US, }, Type: datadogV2.OPSGENIESERVICETYPE_OPSGENIE_SERVICE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewOpsgenieIntegrationApi(apiClient) resp, r, err := api.CreateOpsgenieService(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OpsgenieIntegrationApi.CreateOpsgenieService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OpsgenieIntegrationApi.CreateOpsgenieService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new service object returns "CREATED" response ``` // Create a new service object returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OpsgenieIntegrationApi; import com.datadog.api.client.v2.model.OpsgenieServiceCreateAttributes; import com.datadog.api.client.v2.model.OpsgenieServiceCreateData; import com.datadog.api.client.v2.model.OpsgenieServiceCreateRequest; import com.datadog.api.client.v2.model.OpsgenieServiceRegionType; import com.datadog.api.client.v2.model.OpsgenieServiceResponse; import com.datadog.api.client.v2.model.OpsgenieServiceType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OpsgenieIntegrationApi apiInstance = new OpsgenieIntegrationApi(defaultClient); OpsgenieServiceCreateRequest body = new OpsgenieServiceCreateRequest() .data( new OpsgenieServiceCreateData() .attributes( new OpsgenieServiceCreateAttributes() .name("Example-Opsgenie-Integration") .opsgenieApiKey("00000000-0000-0000-0000-000000000000") .region(OpsgenieServiceRegionType.US)) .type(OpsgenieServiceType.OPSGENIE_SERVICE)); try { OpsgenieServiceResponse result = apiInstance.createOpsgenieService(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OpsgenieIntegrationApi#createOpsgenieService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new service object returns "CREATED" response ``` """ Create a new service object returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.opsgenie_integration_api import OpsgenieIntegrationApi from datadog_api_client.v2.model.opsgenie_service_create_attributes import OpsgenieServiceCreateAttributes from datadog_api_client.v2.model.opsgenie_service_create_data import OpsgenieServiceCreateData from datadog_api_client.v2.model.opsgenie_service_create_request import OpsgenieServiceCreateRequest from datadog_api_client.v2.model.opsgenie_service_region_type import OpsgenieServiceRegionType from 
datadog_api_client.v2.model.opsgenie_service_type import OpsgenieServiceType body = OpsgenieServiceCreateRequest( data=OpsgenieServiceCreateData( attributes=OpsgenieServiceCreateAttributes( name="Example-Opsgenie-Integration", opsgenie_api_key="00000000-0000-0000-0000-000000000000", region=OpsgenieServiceRegionType.US, ), type=OpsgenieServiceType.OPSGENIE_SERVICE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OpsgenieIntegrationApi(api_client) response = api_instance.create_opsgenie_service(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new service object returns "CREATED" response ``` # Create a new service object returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OpsgenieIntegrationAPI.new body = DatadogAPIClient::V2::OpsgenieServiceCreateRequest.new({ data: DatadogAPIClient::V2::OpsgenieServiceCreateData.new({ attributes: DatadogAPIClient::V2::OpsgenieServiceCreateAttributes.new({ name: "Example-Opsgenie-Integration", opsgenie_api_key: "00000000-0000-0000-0000-000000000000", region: DatadogAPIClient::V2::OpsgenieServiceRegionType::US, }), type: DatadogAPIClient::V2::OpsgenieServiceType::OPSGENIE_SERVICE, }), }) p api_instance.create_opsgenie_service(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new service object returns "CREATED" response ``` // Create a new service object returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_opsgenie_integration::OpsgenieIntegrationAPI; use datadog_api_client::datadogV2::model::OpsgenieServiceCreateAttributes; use datadog_api_client::datadogV2::model::OpsgenieServiceCreateData; use datadog_api_client::datadogV2::model::OpsgenieServiceCreateRequest; use datadog_api_client::datadogV2::model::OpsgenieServiceRegionType; use datadog_api_client::datadogV2::model::OpsgenieServiceType; #[tokio::main] async fn main() { let body = OpsgenieServiceCreateRequest::new(OpsgenieServiceCreateData::new( OpsgenieServiceCreateAttributes::new( "Example-Opsgenie-Integration".to_string(), "00000000-0000-0000-0000-000000000000".to_string(), OpsgenieServiceRegionType::US, ), OpsgenieServiceType::OPSGENIE_SERVICE, )); let configuration = datadog::Configuration::new(); let api = OpsgenieIntegrationAPI::with_config(configuration); let resp = api.create_opsgenie_service(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` 
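All of the create examples on this page use the documented `us` region. The request body above also accepts `region: "custom"` together with `custom_url` for a custom Opsgenie region. The TypeScript sketch below shows that variant, under the assumption that the client maps the documented `custom_url` field to `customUrl` in its usual camelCase style; the URL is a placeholder, not a real endpoint.

```
/**
 * Sketch: create an Opsgenie service in a custom region.
 * Assumes the documented `custom_url` field is exposed as `customUrl`;
 * the URL and API key below are placeholders.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.OpsgenieIntegrationApi(configuration);

const params: v2.OpsgenieIntegrationApiCreateOpsgenieServiceRequest = {
  body: {
    data: {
      attributes: {
        name: "Example-Custom-Region-Service",
        opsgenieApiKey: "00000000-0000-0000-0000-000000000000",
        region: "custom",
        customUrl: "https://opsgenie.example.com", // placeholder custom-region URL
      },
      type: "opsgenie-service",
    },
  },
};

apiInstance
  .createOpsgenieService(params)
  .then((data: v2.OpsgenieServiceResponse) =>
    console.log("Created: " + JSON.stringify(data))
  )
  .catch((error: any) => console.error(error));
```

Run it the same way as the TypeScript example that follows: save it to `example.ts` and compile with `tsc`.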
##### Create a new service object returns "CREATED" response ``` /** * Create a new service object returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OpsgenieIntegrationApi(configuration); const params: v2.OpsgenieIntegrationApiCreateOpsgenieServiceRequest = { body: { data: { attributes: { name: "Example-Opsgenie-Integration", opsgenieApiKey: "00000000-0000-0000-0000-000000000000", region: "us", }, type: "opsgenie-service", }, }, }; apiInstance .createOpsgenieService(params) .then((data: v2.OpsgenieServiceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a single service object](https://docs.datadoghq.com/api/latest/opsgenie-integration/#get-a-single-service-object) * [v2 (latest)](https://docs.datadoghq.com/api/latest/opsgenie-integration/#get-a-single-service-object-v2) GET https://api.ap1.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ap2.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.eu/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ddog-gov.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us3.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id} ### Overview Get a single service from the Datadog Opsgenie integration. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description integration_service_id [_required_] string The UUID of the service. ### Response * [200](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-200-v2) * [400](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-400-v2) * [403](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-403-v2) * [404](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-404-v2) * [409](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-409-v2) * [429](https://docs.datadoghq.com/api/latest/opsgenie-integration/#GetOpsgenieService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Response of an Opsgenie service. Field Type Description data [_required_] object Opsgenie service data from a response. attributes [_required_] object The attributes from an Opsgenie service response. custom_url string The custom URL for a custom region. name string The name for the Opsgenie service. region enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` id [_required_] string The ID of the Opsgenie service. type [_required_] enum Opsgenie service resource type. 
Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": { "attributes": { "custom_url": null, "name": "fake-opsgenie-service-name", "region": "us" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=typescript) ##### Get a single service object Copy ``` # Path parameters export integration_service_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/${integration_service_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a single service object ``` """ Get a single service object returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.opsgenie_integration_api import OpsgenieIntegrationApi # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ID = environ["OPSGENIE_SERVICE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OpsgenieIntegrationApi(api_client) response = api_instance.get_opsgenie_service( integration_service_id=OPSGENIE_SERVICE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a single service object ``` # Get a single service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OpsgenieIntegrationAPI.new # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ID = ENV["OPSGENIE_SERVICE_DATA_ID"] p api_instance.get_opsgenie_service(OPSGENIE_SERVICE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "opsgenie_service" in the system OpsgenieServiceDataID := os.Getenv("OPSGENIE_SERVICE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOpsgenieIntegrationApi(apiClient) resp, r, err := api.GetOpsgenieService(ctx, OpsgenieServiceDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OpsgenieIntegrationApi.GetOpsgenieService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OpsgenieIntegrationApi.GetOpsgenieService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OpsgenieIntegrationApi; import com.datadog.api.client.v2.model.OpsgenieServiceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OpsgenieIntegrationApi apiInstance = new OpsgenieIntegrationApi(defaultClient); // there is a valid "opsgenie_service" in the system String OPSGENIE_SERVICE_DATA_ID = System.getenv("OPSGENIE_SERVICE_DATA_ID"); try { OpsgenieServiceResponse result = apiInstance.getOpsgenieService(OPSGENIE_SERVICE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OpsgenieIntegrationApi#getOpsgenieService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_opsgenie_integration::OpsgenieIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "opsgenie_service" in the system let opsgenie_service_data_id = std::env::var("OPSGENIE_SERVICE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OpsgenieIntegrationAPI::with_config(configuration); let resp = api .get_opsgenie_service(opsgenie_service_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a single service object ``` /** * Get a single service object returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OpsgenieIntegrationApi(configuration); // there is a valid "opsgenie_service" in the system const OPSGENIE_SERVICE_DATA_ID = process.env.OPSGENIE_SERVICE_DATA_ID as string; const params: v2.OpsgenieIntegrationApiGetOpsgenieServiceRequest = { integrationServiceId: OPSGENIE_SERVICE_DATA_ID, }; apiInstance .getOpsgenieService(params) .then((data: v2.OpsgenieServiceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a single service object](https://docs.datadoghq.com/api/latest/opsgenie-integration/#update-a-single-service-object) * [v2 (latest)](https://docs.datadoghq.com/api/latest/opsgenie-integration/#update-a-single-service-object-v2) PATCH https://api.ap1.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ap2.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.eu/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ddog-gov.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us3.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id} ### Overview Update a single service object in the Datadog Opsgenie integration. This endpoint requires the `manage_integrations` permission. 
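In the update request body below, every attribute is optional, so a PATCH can send only the field being changed, for example to rotate the Opsgenie API key. A minimal TypeScript sketch of that pattern; the new key is a placeholder, and the `OPSGENIE_SERVICE_DATA_ID` environment variable mirrors the full examples further down.

```
/**
 * Sketch: update only the Opsgenie API key of an existing service.
 * Assumes OPSGENIE_SERVICE_DATA_ID holds the UUID of a valid service;
 * the new key is a placeholder value.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.OpsgenieIntegrationApi(configuration);

const serviceId = process.env.OPSGENIE_SERVICE_DATA_ID as string;

const params: v2.OpsgenieIntegrationApiUpdateOpsgenieServiceRequest = {
  integrationServiceId: serviceId,
  body: {
    data: {
      // Only the attribute being rotated is included; name, region, and
      // custom_url are optional for an update and are omitted here.
      attributes: {
        opsgenieApiKey: "11111111-1111-1111-1111-111111111111", // placeholder
      },
      id: serviceId,
      type: "opsgenie-service",
    },
  },
};

apiInstance
  .updateOpsgenieService(params)
  .then((data: v2.OpsgenieServiceResponse) =>
    console.log("Updated: " + JSON.stringify(data))
  )
  .catch((error: any) => console.error(error));
```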
### Arguments #### Path Parameters Name Type Description integration_service_id [_required_] string The UUID of the service. ### Request #### Body Data (required) Opsgenie service payload. * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Field Type Description data [_required_] object Opsgenie service for an update request. attributes [_required_] object The Opsgenie service attributes for an update request. custom_url string The custom URL for a custom region. name string The name for the Opsgenie service. opsgenie_api_key string The Opsgenie API key for your Opsgenie service. region enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` id [_required_] string The ID of the Opsgenie service. type [_required_] enum Opsgenie service resource type. Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": { "attributes": { "name": "fake-opsgenie-service-name--updated", "opsgenie_api_key": "00000000-0000-0000-0000-000000000000", "region": "eu" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-200-v2) * [400](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-400-v2) * [403](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-403-v2) * [404](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-404-v2) * [409](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-409-v2) * [429](https://docs.datadoghq.com/api/latest/opsgenie-integration/#UpdateOpsgenieService-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) Response of an Opsgenie service. Field Type Description data [_required_] object Opsgenie service data from a response. attributes [_required_] object The attributes from an Opsgenie service response. custom_url string The custom URL for a custom region. name string The name for the Opsgenie service. region enum The region for the Opsgenie service. Allowed enum values: `us,eu,custom` id [_required_] string The ID of the Opsgenie service. type [_required_] enum Opsgenie service resource type. Allowed enum values: `opsgenie-service` default: `opsgenie-service` ``` { "data": { "attributes": { "custom_url": null, "name": "fake-opsgenie-service-name", "region": "us" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=typescript) ##### Update a single service object returns "OK" response Copy ``` # Path parameters export integration_service_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/${integration_service_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "fake-opsgenie-service-name--updated", "opsgenie_api_key": "00000000-0000-0000-0000-000000000000", "region": "eu" }, "id": "596da4af-0563-4097-90ff-07230c3f9db3", "type": "opsgenie-service" } } EOF ``` ##### Update a single service object returns "OK" response ``` // Update a single service object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "opsgenie_service" in the system OpsgenieServiceDataID := os.Getenv("OPSGENIE_SERVICE_DATA_ID") body := datadogV2.OpsgenieServiceUpdateRequest{ Data: datadogV2.OpsgenieServiceUpdateData{ Attributes: datadogV2.OpsgenieServiceUpdateAttributes{ Name: datadog.PtrString("fake-opsgenie-service-name--updated"), OpsgenieApiKey: datadog.PtrString("00000000-0000-0000-0000-000000000000"), Region: datadogV2.OPSGENIESERVICEREGIONTYPE_EU.Ptr(), }, Id: OpsgenieServiceDataID, Type: datadogV2.OPSGENIESERVICETYPE_OPSGENIE_SERVICE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOpsgenieIntegrationApi(apiClient) resp, r, err := api.UpdateOpsgenieService(ctx, OpsgenieServiceDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OpsgenieIntegrationApi.UpdateOpsgenieService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OpsgenieIntegrationApi.UpdateOpsgenieService`:\n%s\n", 
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a single service object returns "OK" response ``` // Update a single service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OpsgenieIntegrationApi; import com.datadog.api.client.v2.model.OpsgenieServiceRegionType; import com.datadog.api.client.v2.model.OpsgenieServiceResponse; import com.datadog.api.client.v2.model.OpsgenieServiceType; import com.datadog.api.client.v2.model.OpsgenieServiceUpdateAttributes; import com.datadog.api.client.v2.model.OpsgenieServiceUpdateData; import com.datadog.api.client.v2.model.OpsgenieServiceUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OpsgenieIntegrationApi apiInstance = new OpsgenieIntegrationApi(defaultClient); // there is a valid "opsgenie_service" in the system String OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME = System.getenv("OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME"); String OPSGENIE_SERVICE_DATA_ID = System.getenv("OPSGENIE_SERVICE_DATA_ID"); OpsgenieServiceUpdateRequest body = new OpsgenieServiceUpdateRequest() .data( new OpsgenieServiceUpdateData() .attributes( new OpsgenieServiceUpdateAttributes() .name("fake-opsgenie-service-name--updated") .opsgenieApiKey("00000000-0000-0000-0000-000000000000") .region(OpsgenieServiceRegionType.EU)) .id(OPSGENIE_SERVICE_DATA_ID) .type(OpsgenieServiceType.OPSGENIE_SERVICE)); try { OpsgenieServiceResponse result = apiInstance.updateOpsgenieService(OPSGENIE_SERVICE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OpsgenieIntegrationApi#updateOpsgenieService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a single service object returns "OK" response ``` """ Update a single service object returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.opsgenie_integration_api import OpsgenieIntegrationApi from datadog_api_client.v2.model.opsgenie_service_region_type import OpsgenieServiceRegionType from datadog_api_client.v2.model.opsgenie_service_type import OpsgenieServiceType from datadog_api_client.v2.model.opsgenie_service_update_attributes import OpsgenieServiceUpdateAttributes from datadog_api_client.v2.model.opsgenie_service_update_data import OpsgenieServiceUpdateData from datadog_api_client.v2.model.opsgenie_service_update_request import OpsgenieServiceUpdateRequest # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME = 
environ["OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME"] OPSGENIE_SERVICE_DATA_ID = environ["OPSGENIE_SERVICE_DATA_ID"] body = OpsgenieServiceUpdateRequest( data=OpsgenieServiceUpdateData( attributes=OpsgenieServiceUpdateAttributes( name="fake-opsgenie-service-name--updated", opsgenie_api_key="00000000-0000-0000-0000-000000000000", region=OpsgenieServiceRegionType.EU, ), id=OPSGENIE_SERVICE_DATA_ID, type=OpsgenieServiceType.OPSGENIE_SERVICE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OpsgenieIntegrationApi(api_client) response = api_instance.update_opsgenie_service(integration_service_id=OPSGENIE_SERVICE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a single service object returns "OK" response ``` # Update a single service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OpsgenieIntegrationAPI.new # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME = ENV["OPSGENIE_SERVICE_DATA_ATTRIBUTES_NAME"] OPSGENIE_SERVICE_DATA_ID = ENV["OPSGENIE_SERVICE_DATA_ID"] body = DatadogAPIClient::V2::OpsgenieServiceUpdateRequest.new({ data: DatadogAPIClient::V2::OpsgenieServiceUpdateData.new({ attributes: DatadogAPIClient::V2::OpsgenieServiceUpdateAttributes.new({ name: "fake-opsgenie-service-name--updated", opsgenie_api_key: "00000000-0000-0000-0000-000000000000", region: DatadogAPIClient::V2::OpsgenieServiceRegionType::EU, }), id: OPSGENIE_SERVICE_DATA_ID, type: DatadogAPIClient::V2::OpsgenieServiceType::OPSGENIE_SERVICE, }), }) p api_instance.update_opsgenie_service(OPSGENIE_SERVICE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a single service object returns "OK" response ``` // Update a single service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_opsgenie_integration::OpsgenieIntegrationAPI; use datadog_api_client::datadogV2::model::OpsgenieServiceRegionType; use datadog_api_client::datadogV2::model::OpsgenieServiceType; use datadog_api_client::datadogV2::model::OpsgenieServiceUpdateAttributes; use datadog_api_client::datadogV2::model::OpsgenieServiceUpdateData; use datadog_api_client::datadogV2::model::OpsgenieServiceUpdateRequest; #[tokio::main] async fn main() { // there is a valid "opsgenie_service" in the system let opsgenie_service_data_id = std::env::var("OPSGENIE_SERVICE_DATA_ID").unwrap(); let body = OpsgenieServiceUpdateRequest::new(OpsgenieServiceUpdateData::new( OpsgenieServiceUpdateAttributes::new() .name("fake-opsgenie-service-name--updated".to_string()) .opsgenie_api_key("00000000-0000-0000-0000-000000000000".to_string()) .region(OpsgenieServiceRegionType::EU), opsgenie_service_data_id.clone(), OpsgenieServiceType::OPSGENIE_SERVICE, )); let configuration = datadog::Configuration::new(); let 
api = OpsgenieIntegrationAPI::with_config(configuration); let resp = api .update_opsgenie_service(opsgenie_service_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a single service object returns "OK" response ``` /** * Update a single service object returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OpsgenieIntegrationApi(configuration); // there is a valid "opsgenie_service" in the system const OPSGENIE_SERVICE_DATA_ID = process.env.OPSGENIE_SERVICE_DATA_ID as string; const params: v2.OpsgenieIntegrationApiUpdateOpsgenieServiceRequest = { body: { data: { attributes: { name: "fake-opsgenie-service-name--updated", opsgenieApiKey: "00000000-0000-0000-0000-000000000000", region: "eu", }, id: OPSGENIE_SERVICE_DATA_ID, type: "opsgenie-service", }, }, integrationServiceId: OPSGENIE_SERVICE_DATA_ID, }; apiInstance .updateOpsgenieService(params) .then((data: v2.OpsgenieServiceResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a single service object](https://docs.datadoghq.com/api/latest/opsgenie-integration/#delete-a-single-service-object) * [v2 (latest)](https://docs.datadoghq.com/api/latest/opsgenie-integration/#delete-a-single-service-object-v2) DELETE https://api.ap1.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ap2.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.eu/api/v2/integration/opsgenie/services/{integration_service_id}https://api.ddog-gov.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us3.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id}https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/{integration_service_id} ### Overview Delete a single service object in the Datadog Opsgenie integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description integration_service_id [_required_] string The UUID of the service. 
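A successful delete returns 204 with no response body, so callers typically only need to tell a completed delete apart from a service that is already gone. The TypeScript sketch below treats the documented 404 response as "already deleted"; it assumes the thrown error exposes the HTTP status as a `code` property, which may differ between client versions.

```
/**
 * Sketch: delete a service and treat the documented 404 response as
 * "already deleted". Assumes OPSGENIE_SERVICE_DATA_ID holds the service UUID
 * and that the thrown error carries the HTTP status in `code`.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.OpsgenieIntegrationApi(configuration);

const serviceId = process.env.OPSGENIE_SERVICE_DATA_ID as string;

apiInstance
  .deleteOpsgenieService({ integrationServiceId: serviceId })
  .then(() => console.log(`Deleted Opsgenie service ${serviceId}`))
  .catch((error: any) => {
    if (error?.code === 404) {
      // Nothing to clean up: the service no longer exists.
      console.log(`Opsgenie service ${serviceId} was already deleted`);
    } else {
      console.error(error);
    }
  });
```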
### Response * [204](https://docs.datadoghq.com/api/latest/opsgenie-integration/#DeleteOpsgenieService-204-v2) * [400](https://docs.datadoghq.com/api/latest/opsgenie-integration/#DeleteOpsgenieService-400-v2) * [403](https://docs.datadoghq.com/api/latest/opsgenie-integration/#DeleteOpsgenieService-403-v2) * [404](https://docs.datadoghq.com/api/latest/opsgenie-integration/#DeleteOpsgenieService-404-v2) * [429](https://docs.datadoghq.com/api/latest/opsgenie-integration/#DeleteOpsgenieService-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/opsgenie-integration/) * [Example](https://docs.datadoghq.com/api/latest/opsgenie-integration/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/opsgenie-integration/?code-lang=typescript) ##### Delete a single service object Copy ``` # Path parameters export integration_service_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/integration/opsgenie/services/${integration_service_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a single service object ``` """ Delete a single service object returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.opsgenie_integration_api import OpsgenieIntegrationApi # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ID = environ["OPSGENIE_SERVICE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OpsgenieIntegrationApi(api_client) api_instance.delete_opsgenie_service( integration_service_id=OPSGENIE_SERVICE_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then 
save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a single service object ``` # Delete a single service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OpsgenieIntegrationAPI.new # there is a valid "opsgenie_service" in the system OPSGENIE_SERVICE_DATA_ID = ENV["OPSGENIE_SERVICE_DATA_ID"] api_instance.delete_opsgenie_service(OPSGENIE_SERVICE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a single service object ``` // Delete a single service object returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "opsgenie_service" in the system OpsgenieServiceDataID := os.Getenv("OPSGENIE_SERVICE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOpsgenieIntegrationApi(apiClient) r, err := api.DeleteOpsgenieService(ctx, OpsgenieServiceDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OpsgenieIntegrationApi.DeleteOpsgenieService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a single service object ``` // Delete a single service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OpsgenieIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OpsgenieIntegrationApi apiInstance = new OpsgenieIntegrationApi(defaultClient); // there is a valid "opsgenie_service" in the system String OPSGENIE_SERVICE_DATA_ID = System.getenv("OPSGENIE_SERVICE_DATA_ID"); try { apiInstance.deleteOpsgenieService(OPSGENIE_SERVICE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling OpsgenieIntegrationApi#deleteOpsgenieService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a single service object ``` // Delete a 
single service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_opsgenie_integration::OpsgenieIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "opsgenie_service" in the system let opsgenie_service_data_id = std::env::var("OPSGENIE_SERVICE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = OpsgenieIntegrationAPI::with_config(configuration); let resp = api .delete_opsgenie_service(opsgenie_service_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a single service object ``` /** * Delete a single service object returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OpsgenieIntegrationApi(configuration); // there is a valid "opsgenie_service" in the system const OPSGENIE_SERVICE_DATA_ID = process.env.OPSGENIE_SERVICE_DATA_ID as string; const params: v2.OpsgenieIntegrationApiDeleteOpsgenieServiceRequest = { integrationServiceId: OPSGENIE_SERVICE_DATA_ID, }; apiInstance .deleteOpsgenieService(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/org-connections # Org Connections
Manage connections between organizations. Org connections allow for controlled sharing of data between different Datadog organizations. See the [Cross-Organization Visibility](https://docs.datadoghq.com/account_management/org_settings/cross_org_visibility/) page for more information. ## [List Org Connections](https://docs.datadoghq.com/api/latest/org-connections/#list-org-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/org-connections/#list-org-connections-v2) GET https://api.ap1.datadoghq.com/api/v2/org_connectionshttps://api.ap2.datadoghq.com/api/v2/org_connectionshttps://api.datadoghq.eu/api/v2/org_connectionshttps://api.ddog-gov.com/api/v2/org_connectionshttps://api.datadoghq.com/api/v2/org_connectionshttps://api.us3.datadoghq.com/api/v2/org_connectionshttps://api.us5.datadoghq.com/api/v2/org_connections ### Overview Returns a list of org connections. This endpoint requires the `org_connections_read` permission. OAuth apps require the `org_connections_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#org-connections) to access this endpoint. ### Arguments #### Query Strings Name Type Description sink_org_id string The Org ID of the sink org. source_org_id string The Org ID of the source org. limit integer The maximum number of entries to return. Default is 1000. offset integer The pagination offset to start from. Default is 0. (See the pagination sketch below.) ### Response * [200](https://docs.datadoghq.com/api/latest/org-connections/#ListOrgConnections-200-v2) * [401](https://docs.datadoghq.com/api/latest/org-connections/#ListOrgConnections-401-v2) * [403](https://docs.datadoghq.com/api/latest/org-connections/#ListOrgConnections-403-v2) * [429](https://docs.datadoghq.com/api/latest/org-connections/#ListOrgConnections-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) Response containing a list of org connections. Field Type Description data [_required_] [object] List of org connections. attributes [_required_] object Org connection attributes. connection_types [_required_] [string] List of connection types. created_at [_required_] date-time Timestamp when the connection was created. id [_required_] uuid The unique identifier of the org connection. relationships [_required_] object Related organizations and user. created_by object User relationship. data object The data for a user relationship. id string User UUID. name string User name. type enum The type of the user relationship. Allowed enum values: `users` sink_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` source_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` type [_required_] enum Org connection type. Allowed enum values: `org_connection` meta object Pagination metadata. page object Page information. total_count int64 Total number of org connections. total_filtered_count int64 Total number of org connections matching the filter.
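The `limit` and `offset` query strings above, together with `total_filtered_count` from the response metadata, are enough to page through every connection. The TypeScript sketch below shows that loop, assuming the client exposes the query strings and metadata with its usual camelCase names (`limit`, `offset`, `meta.page.totalFilteredCount`).

```
/**
 * Sketch: page through all org connections with `limit`/`offset`.
 * Assumes camelCase request parameters and response metadata fields,
 * mirroring the documented query strings and `meta.page` object.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.OrgConnectionsApi(configuration);

async function listAllOrgConnections(pageSize = 100) {
  const connections: any[] = [];
  let offset = 0;
  for (;;) {
    const page = await apiInstance.listOrgConnections({ limit: pageSize, offset });
    const items = page.data ?? [];
    connections.push(...items);
    const total = page.meta?.page?.totalFilteredCount ?? connections.length;
    if (items.length === 0 || connections.length >= total) {
      break;
    }
    offset += pageSize;
  }
  return connections;
}

listAllOrgConnections()
  .then((all) => console.log(`Fetched ${all.length} org connections`))
  .catch((error: any) => console.error(error));
```

With the defaults (`limit` 1000, `offset` 0) a single call covers most organizations; the loop only matters once more than 1000 connections exist.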
``` { "data": [ { "attributes": { "connection_types": [ "logs", "metrics" ], "created_at": "2023-01-01T12:00:00Z" }, "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "relationships": { "created_by": { "data": { "id": "usr123abc456", "name": "John Doe", "type": "users" } }, "sink_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } }, "source_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } } }, "type": "org_connection" } ], "meta": { "page": { "total_count": 0, "total_filtered_count": 0 } } } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=typescript) ##### List Org Connections Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/org_connections" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Org Connections ``` """ List Org Connections returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.org_connections_api import OrgConnectionsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrgConnectionsApi(api_client) response = api_instance.list_org_connections() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Org Connections ``` # List Org Connections returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrgConnectionsAPI.new p api_instance.list_org_connections() ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Org Connections ``` // List Org Connections returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrgConnectionsApi(apiClient) resp, r, err := api.ListOrgConnections(ctx, *datadogV2.NewListOrgConnectionsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrgConnectionsApi.ListOrgConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrgConnectionsApi.ListOrgConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Org Connections ``` // List Org Connections returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrgConnectionsApi; import com.datadog.api.client.v2.model.OrgConnectionListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrgConnectionsApi apiInstance = new OrgConnectionsApi(defaultClient); try { OrgConnectionListResponse result = apiInstance.listOrgConnections(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrgConnectionsApi#listOrgConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Org Connections ``` // List Org Connections returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_org_connections::ListOrgConnectionsOptionalParams; use datadog_api_client::datadogV2::api_org_connections::OrgConnectionsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrgConnectionsAPI::with_config(configuration); let resp = api .list_org_connections(ListOrgConnectionsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Org Connections ``` /** * List Org Connections returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrgConnectionsApi(configuration); apiInstance .listOrgConnections() .then((data: v2.OrgConnectionListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#create-org-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/org-connections/#create-org-connection-v2) POST https://api.ap1.datadoghq.com/api/v2/org_connectionshttps://api.ap2.datadoghq.com/api/v2/org_connectionshttps://api.datadoghq.eu/api/v2/org_connectionshttps://api.ddog-gov.com/api/v2/org_connectionshttps://api.datadoghq.com/api/v2/org_connectionshttps://api.us3.datadoghq.com/api/v2/org_connectionshttps://api.us5.datadoghq.com/api/v2/org_connections ### Overview Create a new org connection between the current org and a target org. This endpoint requires the `org_connections_write` permission. OAuth apps require the `org_connections_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#org-connections) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) Field Type Description data [_required_] object Org connection creation data. attributes [_required_] object Attributes for creating an org connection. connection_types [_required_] [string] List of connection types to establish. relationships [_required_] object Relationships for org connection creation. sink_org [_required_] object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` type [_required_] enum Org connection type. 
Allowed enum values: `org_connection` ``` { "data": { "type": "org_connection", "relationships": { "sink_org": { "data": { "type": "orgs", "id": "83999dcd-7f97-11f0-8de1-1ecf66f1aa85" } } }, "attributes": { "connection_types": [ "logs" ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-200-v2) * [400](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-400-v2) * [401](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-401-v2) * [403](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-403-v2) * [404](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-404-v2) * [409](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-409-v2) * [429](https://docs.datadoghq.com/api/latest/org-connections/#CreateOrgConnections-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) Response containing a single org connection. Field Type Description data [_required_] object An org connection. attributes [_required_] object Org connection attributes. connection_types [_required_] [string] List of connection types. created_at [_required_] date-time Timestamp when the connection was created. id [_required_] uuid The unique identifier of the org connection. relationships [_required_] object Related organizations and user. created_by object User relationship. data object The data for a user relationship. id string User UUID. name string User name. type enum The type of the user relationship. Allowed enum values: `users` sink_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` source_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` type [_required_] enum Org connection type. Allowed enum values: `org_connection` ``` { "data": { "attributes": { "connection_types": [ "logs", "metrics" ], "created_at": "2023-01-01T12:00:00Z" }, "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "relationships": { "created_by": { "data": { "id": "usr123abc456", "name": "John Doe", "type": "users" } }, "sink_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } }, "source_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } } }, "type": "org_connection" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=typescript) ##### Create Org Connection returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/org_connections" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "org_connection", "relationships": { "sink_org": { "data": { "type": "orgs", "id": "83999dcd-7f97-11f0-8de1-1ecf66f1aa85" } } }, "attributes": { "connection_types": [ "logs" ] } } } EOF ``` ##### Create Org Connection returns "OK" response ``` // Create Org Connection returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.OrgConnectionCreateRequest{ Data: datadogV2.OrgConnectionCreate{ Type: datadogV2.ORGCONNECTIONTYPE_ORG_CONNECTION, Relationships: datadogV2.OrgConnectionCreateRelationships{ SinkOrg: datadogV2.OrgConnectionOrgRelationship{ Data: &datadogV2.OrgConnectionOrgRelationshipData{ Type: datadogV2.ORGCONNECTIONORGRELATIONSHIPDATATYPE_ORGS.Ptr(), Id: datadog.PtrString("83999dcd-7f97-11f0-8de1-1ecf66f1aa85"), }, }, }, Attributes: datadogV2.OrgConnectionCreateAttributes{ ConnectionTypes: []datadogV2.OrgConnectionTypeEnum{ datadogV2.ORGCONNECTIONTYPEENUM_LOGS, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrgConnectionsApi(apiClient) resp, r, err := api.CreateOrgConnections(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrgConnectionsApi.CreateOrgConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: 
%v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrgConnectionsApi.CreateOrgConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Org Connection returns "OK" response ``` // Create Org Connection returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrgConnectionsApi; import com.datadog.api.client.v2.model.OrgConnectionCreate; import com.datadog.api.client.v2.model.OrgConnectionCreateAttributes; import com.datadog.api.client.v2.model.OrgConnectionCreateRelationships; import com.datadog.api.client.v2.model.OrgConnectionCreateRequest; import com.datadog.api.client.v2.model.OrgConnectionOrgRelationship; import com.datadog.api.client.v2.model.OrgConnectionOrgRelationshipData; import com.datadog.api.client.v2.model.OrgConnectionOrgRelationshipDataType; import com.datadog.api.client.v2.model.OrgConnectionResponse; import com.datadog.api.client.v2.model.OrgConnectionType; import com.datadog.api.client.v2.model.OrgConnectionTypeEnum; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrgConnectionsApi apiInstance = new OrgConnectionsApi(defaultClient); OrgConnectionCreateRequest body = new OrgConnectionCreateRequest() .data( new OrgConnectionCreate() .type(OrgConnectionType.ORG_CONNECTION) .relationships( new OrgConnectionCreateRelationships() .sinkOrg( new OrgConnectionOrgRelationship() .data( new OrgConnectionOrgRelationshipData() .type(OrgConnectionOrgRelationshipDataType.ORGS) .id("83999dcd-7f97-11f0-8de1-1ecf66f1aa85")))) .attributes( new OrgConnectionCreateAttributes() .connectionTypes( Collections.singletonList(OrgConnectionTypeEnum.LOGS)))); try { OrgConnectionResponse result = apiInstance.createOrgConnections(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrgConnectionsApi#createOrgConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Org Connection returns "OK" response ``` """ Create Org Connection returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.org_connections_api import OrgConnectionsApi from datadog_api_client.v2.model.org_connection_create import OrgConnectionCreate from datadog_api_client.v2.model.org_connection_create_attributes import OrgConnectionCreateAttributes from datadog_api_client.v2.model.org_connection_create_relationships import OrgConnectionCreateRelationships from 
datadog_api_client.v2.model.org_connection_create_request import OrgConnectionCreateRequest from datadog_api_client.v2.model.org_connection_org_relationship import OrgConnectionOrgRelationship from datadog_api_client.v2.model.org_connection_org_relationship_data import OrgConnectionOrgRelationshipData from datadog_api_client.v2.model.org_connection_org_relationship_data_type import OrgConnectionOrgRelationshipDataType from datadog_api_client.v2.model.org_connection_type import OrgConnectionType from datadog_api_client.v2.model.org_connection_type_enum import OrgConnectionTypeEnum body = OrgConnectionCreateRequest( data=OrgConnectionCreate( type=OrgConnectionType.ORG_CONNECTION, relationships=OrgConnectionCreateRelationships( sink_org=OrgConnectionOrgRelationship( data=OrgConnectionOrgRelationshipData( type=OrgConnectionOrgRelationshipDataType.ORGS, id="83999dcd-7f97-11f0-8de1-1ecf66f1aa85", ), ), ), attributes=OrgConnectionCreateAttributes( connection_types=[ OrgConnectionTypeEnum.LOGS, ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrgConnectionsApi(api_client) response = api_instance.create_org_connections(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Org Connection returns "OK" response ``` # Create Org Connection returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrgConnectionsAPI.new body = DatadogAPIClient::V2::OrgConnectionCreateRequest.new({ data: DatadogAPIClient::V2::OrgConnectionCreate.new({ type: DatadogAPIClient::V2::OrgConnectionType::ORG_CONNECTION, relationships: DatadogAPIClient::V2::OrgConnectionCreateRelationships.new({ sink_org: DatadogAPIClient::V2::OrgConnectionOrgRelationship.new({ data: DatadogAPIClient::V2::OrgConnectionOrgRelationshipData.new({ type: DatadogAPIClient::V2::OrgConnectionOrgRelationshipDataType::ORGS, id: "83999dcd-7f97-11f0-8de1-1ecf66f1aa85", }), }), }), attributes: DatadogAPIClient::V2::OrgConnectionCreateAttributes.new({ connection_types: [ DatadogAPIClient::V2::OrgConnectionTypeEnum::LOGS, ], }), }), }) p api_instance.create_org_connections(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Org Connection returns "OK" response ``` // Create Org Connection returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_org_connections::OrgConnectionsAPI; use datadog_api_client::datadogV2::model::OrgConnectionCreate; use datadog_api_client::datadogV2::model::OrgConnectionCreateAttributes; use datadog_api_client::datadogV2::model::OrgConnectionCreateRelationships; use datadog_api_client::datadogV2::model::OrgConnectionCreateRequest; use datadog_api_client::datadogV2::model::OrgConnectionOrgRelationship; use datadog_api_client::datadogV2::model::OrgConnectionOrgRelationshipData; use 
datadog_api_client::datadogV2::model::OrgConnectionOrgRelationshipDataType; use datadog_api_client::datadogV2::model::OrgConnectionType; use datadog_api_client::datadogV2::model::OrgConnectionTypeEnum; #[tokio::main] async fn main() { let body = OrgConnectionCreateRequest::new(OrgConnectionCreate::new( OrgConnectionCreateAttributes::new(vec![OrgConnectionTypeEnum::LOGS]), OrgConnectionCreateRelationships::new( OrgConnectionOrgRelationship::new().data( OrgConnectionOrgRelationshipData::new() .id("83999dcd-7f97-11f0-8de1-1ecf66f1aa85".to_string()) .type_(OrgConnectionOrgRelationshipDataType::ORGS), ), ), OrgConnectionType::ORG_CONNECTION, )); let configuration = datadog::Configuration::new(); let api = OrgConnectionsAPI::with_config(configuration); let resp = api.create_org_connections(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Org Connection returns "OK" response ``` /** * Create Org Connection returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrgConnectionsApi(configuration); const params: v2.OrgConnectionsApiCreateOrgConnectionsRequest = { body: { data: { type: "org_connection", relationships: { sinkOrg: { data: { type: "orgs", id: "83999dcd-7f97-11f0-8de1-1ecf66f1aa85", }, }, }, attributes: { connectionTypes: ["logs"], }, }, }, }; apiInstance .createOrgConnections(params) .then((data: v2.OrgConnectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#update-org-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/org-connections/#update-org-connection-v2) PATCH https://api.ap1.datadoghq.com/api/v2/org_connections/{connection_id}https://api.ap2.datadoghq.com/api/v2/org_connections/{connection_id}https://api.datadoghq.eu/api/v2/org_connections/{connection_id}https://api.ddog-gov.com/api/v2/org_connections/{connection_id}https://api.datadoghq.com/api/v2/org_connections/{connection_id}https://api.us3.datadoghq.com/api/v2/org_connections/{connection_id}https://api.us5.datadoghq.com/api/v2/org_connections/{connection_id} ### Overview Update an existing org connection. This endpoint requires the `org_connections_write` permission. OAuth apps require the `org_connections_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#org-connections) to access this endpoint. ### Arguments #### Path Parameters Name Type Description connection_id [_required_] string The unique identifier of the org connection. 
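The `connection_id` path parameter is the UUID of an existing connection, which you can look up with the List Org Connections endpoint above. The snippet below is a minimal sketch (not an official client example) that chains the two calls with the Python client used elsewhere on this page; it assumes at least one connection already exists and that the list response exposes its items under `data`, each with an `id` attribute.

```
"""
Sketch: look up an org connection ID via List, then update it.
Assumes at least one org connection exists and that the list response
exposes items under `data` with an `id` attribute.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.org_connections_api import OrgConnectionsApi
from datadog_api_client.v2.model.org_connection_type import OrgConnectionType
from datadog_api_client.v2.model.org_connection_type_enum import OrgConnectionTypeEnum
from datadog_api_client.v2.model.org_connection_update import OrgConnectionUpdate
from datadog_api_client.v2.model.org_connection_update_attributes import OrgConnectionUpdateAttributes
from datadog_api_client.v2.model.org_connection_update_request import OrgConnectionUpdateRequest

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = OrgConnectionsApi(api_client)

    # Find the connection to modify (here: simply the first one returned).
    connections = api_instance.list_org_connections()
    connection_id = connections.data[0].id

    # Replace its connection types with logs and metrics.
    body = OrgConnectionUpdateRequest(
        data=OrgConnectionUpdate(
            type=OrgConnectionType.ORG_CONNECTION,
            id=connection_id,
            attributes=OrgConnectionUpdateAttributes(
                connection_types=[
                    OrgConnectionTypeEnum.LOGS,
                    OrgConnectionTypeEnum.METRICS,
                ],
            ),
        ),
    )
    print(api_instance.update_org_connections(connection_id=connection_id, body=body))
```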
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) Field Type Description data [_required_] object Org connection update data. attributes [_required_] object Attributes for updating an org connection. connection_types [_required_] [string] Updated list of connection types. id [_required_] uuid The unique identifier of the org connection. type [_required_] enum Org connection type. Allowed enum values: `org_connection` ``` { "data": { "type": "org_connection", "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "attributes": { "connection_types": [ "logs", "metrics" ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-200-v2) * [400](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-400-v2) * [401](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-401-v2) * [403](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-403-v2) * [404](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-404-v2) * [429](https://docs.datadoghq.com/api/latest/org-connections/#UpdateOrgConnections-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) Response containing a single org connection. Field Type Description data [_required_] object An org connection. attributes [_required_] object Org connection attributes. connection_types [_required_] [string] List of connection types. created_at [_required_] date-time Timestamp when the connection was created. id [_required_] uuid The unique identifier of the org connection. relationships [_required_] object Related organizations and user. created_by object User relationship. data object The data for a user relationship. id string User UUID. name string User name. type enum The type of the user relationship. Allowed enum values: `users` sink_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` source_org object Org relationship. data object The definition of `OrgConnectionOrgRelationshipData` object. id string Org UUID. name string Org name. type enum The type of the organization relationship. Allowed enum values: `orgs` type [_required_] enum Org connection type. Allowed enum values: `org_connection` ``` { "data": { "attributes": { "connection_types": [ "logs", "metrics" ], "created_at": "2023-01-01T12:00:00Z" }, "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "relationships": { "created_by": { "data": { "id": "usr123abc456", "name": "John Doe", "type": "users" } }, "sink_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } }, "source_org": { "data": { "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "name": "Example Org", "type": "orgs" } } }, "type": "org_connection" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=typescript) ##### Update Org Connection returns "OK" response Copy ``` # Path parameters export connection_id="f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/org_connections/${connection_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "org_connection", "id": "f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a", "attributes": { "connection_types": [ "logs", "metrics" ] } } } EOF ``` ##### Update Org Connection returns "OK" response ``` // Update Org Connection returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "org_connection" in the system OrgConnectionDataID := uuid.MustParse(os.Getenv("ORG_CONNECTION_DATA_ID")) body := datadogV2.OrgConnectionUpdateRequest{ Data: datadogV2.OrgConnectionUpdate{ Type: datadogV2.ORGCONNECTIONTYPE_ORG_CONNECTION, Id: OrgConnectionDataID, Attributes: datadogV2.OrgConnectionUpdateAttributes{ ConnectionTypes: []datadogV2.OrgConnectionTypeEnum{ datadogV2.ORGCONNECTIONTYPEENUM_LOGS, datadogV2.ORGCONNECTIONTYPEENUM_METRICS, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrgConnectionsApi(apiClient) resp, r, err := 
api.UpdateOrgConnections(ctx, OrgConnectionDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrgConnectionsApi.UpdateOrgConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrgConnectionsApi.UpdateOrgConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Org Connection returns "OK" response ``` // Update Org Connection returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrgConnectionsApi; import com.datadog.api.client.v2.model.OrgConnectionResponse; import com.datadog.api.client.v2.model.OrgConnectionType; import com.datadog.api.client.v2.model.OrgConnectionTypeEnum; import com.datadog.api.client.v2.model.OrgConnectionUpdate; import com.datadog.api.client.v2.model.OrgConnectionUpdateAttributes; import com.datadog.api.client.v2.model.OrgConnectionUpdateRequest; import java.util.Arrays; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrgConnectionsApi apiInstance = new OrgConnectionsApi(defaultClient); // there is a valid "org_connection" in the system UUID ORG_CONNECTION_DATA_ID = null; try { ORG_CONNECTION_DATA_ID = UUID.fromString(System.getenv("ORG_CONNECTION_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } OrgConnectionUpdateRequest body = new OrgConnectionUpdateRequest() .data( new OrgConnectionUpdate() .type(OrgConnectionType.ORG_CONNECTION) .id(ORG_CONNECTION_DATA_ID) .attributes( new OrgConnectionUpdateAttributes() .connectionTypes( Arrays.asList( OrgConnectionTypeEnum.LOGS, OrgConnectionTypeEnum.METRICS)))); try { OrgConnectionResponse result = apiInstance.updateOrgConnections(ORG_CONNECTION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrgConnectionsApi#updateOrgConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Org Connection returns "OK" response ``` """ Update Org Connection returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.org_connections_api import OrgConnectionsApi from datadog_api_client.v2.model.org_connection_type import OrgConnectionType from datadog_api_client.v2.model.org_connection_type_enum import OrgConnectionTypeEnum from datadog_api_client.v2.model.org_connection_update import OrgConnectionUpdate from 
datadog_api_client.v2.model.org_connection_update_attributes import OrgConnectionUpdateAttributes from datadog_api_client.v2.model.org_connection_update_request import OrgConnectionUpdateRequest # there is a valid "org_connection" in the system ORG_CONNECTION_DATA_ID = environ["ORG_CONNECTION_DATA_ID"] body = OrgConnectionUpdateRequest( data=OrgConnectionUpdate( type=OrgConnectionType.ORG_CONNECTION, id=ORG_CONNECTION_DATA_ID, attributes=OrgConnectionUpdateAttributes( connection_types=[ OrgConnectionTypeEnum.LOGS, OrgConnectionTypeEnum.METRICS, ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrgConnectionsApi(api_client) response = api_instance.update_org_connections(connection_id=ORG_CONNECTION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Org Connection returns "OK" response ``` # Update Org Connection returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrgConnectionsAPI.new # there is a valid "org_connection" in the system ORG_CONNECTION_DATA_ID = ENV["ORG_CONNECTION_DATA_ID"] body = DatadogAPIClient::V2::OrgConnectionUpdateRequest.new({ data: DatadogAPIClient::V2::OrgConnectionUpdate.new({ type: DatadogAPIClient::V2::OrgConnectionType::ORG_CONNECTION, id: ORG_CONNECTION_DATA_ID, attributes: DatadogAPIClient::V2::OrgConnectionUpdateAttributes.new({ connection_types: [ DatadogAPIClient::V2::OrgConnectionTypeEnum::LOGS, DatadogAPIClient::V2::OrgConnectionTypeEnum::METRICS, ], }), }), }) p api_instance.update_org_connections(ORG_CONNECTION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Org Connection returns "OK" response ``` // Update Org Connection returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_org_connections::OrgConnectionsAPI; use datadog_api_client::datadogV2::model::OrgConnectionType; use datadog_api_client::datadogV2::model::OrgConnectionTypeEnum; use datadog_api_client::datadogV2::model::OrgConnectionUpdate; use datadog_api_client::datadogV2::model::OrgConnectionUpdateAttributes; use datadog_api_client::datadogV2::model::OrgConnectionUpdateRequest; #[tokio::main] async fn main() { // there is a valid "org_connection" in the system let org_connection_data_id = uuid::Uuid::parse_str(&std::env::var("ORG_CONNECTION_DATA_ID").unwrap()) .expect("Invalid UUID"); let body = OrgConnectionUpdateRequest::new(OrgConnectionUpdate::new( OrgConnectionUpdateAttributes::new(vec![ OrgConnectionTypeEnum::LOGS, OrgConnectionTypeEnum::METRICS, ]), org_connection_data_id.clone(), OrgConnectionType::ORG_CONNECTION, )); let configuration = datadog::Configuration::new(); let api = OrgConnectionsAPI::with_config(configuration); let resp = api .update_org_connections(org_connection_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", 
value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Org Connection returns "OK" response ``` /** * Update Org Connection returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrgConnectionsApi(configuration); // there is a valid "org_connection" in the system const ORG_CONNECTION_DATA_ID = process.env.ORG_CONNECTION_DATA_ID as string; const params: v2.OrgConnectionsApiUpdateOrgConnectionsRequest = { body: { data: { type: "org_connection", id: ORG_CONNECTION_DATA_ID, attributes: { connectionTypes: ["logs", "metrics"], }, }, }, connectionId: ORG_CONNECTION_DATA_ID, }; apiInstance .updateOrgConnections(params) .then((data: v2.OrgConnectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#delete-org-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/org-connections/#delete-org-connection-v2) DELETE https://api.ap1.datadoghq.com/api/v2/org_connections/{connection_id}https://api.ap2.datadoghq.com/api/v2/org_connections/{connection_id}https://api.datadoghq.eu/api/v2/org_connections/{connection_id}https://api.ddog-gov.com/api/v2/org_connections/{connection_id}https://api.datadoghq.com/api/v2/org_connections/{connection_id}https://api.us3.datadoghq.com/api/v2/org_connections/{connection_id}https://api.us5.datadoghq.com/api/v2/org_connections/{connection_id} ### Overview Delete an existing org connection. This endpoint requires the `org_connections_write` permission. OAuth apps require the `org_connections_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#org-connections) to access this endpoint. ### Arguments #### Path Parameters Name Type Description connection_id [_required_] string The unique identifier of the org connection. ### Response * [200](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-200-v2) * [400](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-400-v2) * [401](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-401-v2) * [403](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-403-v2) * [404](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-404-v2) * [429](https://docs.datadoghq.com/api/latest/org-connections/#DeleteOrgConnections-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/org-connections/) * [Example](https://docs.datadoghq.com/api/latest/org-connections/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/org-connections/?code-lang=typescript) ##### Delete Org Connection Copy ``` # Path parameters export connection_id="f9ec96b0-8c8a-4b0a-9b0a-1b2c3d4e5f6a" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v2/org_connections/${connection_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete Org Connection ``` """ Delete Org Connection returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.org_connections_api import OrgConnectionsApi # there is a valid "org_connection" in the system ORG_CONNECTION_DATA_ID = environ["ORG_CONNECTION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrgConnectionsApi(api_client) api_instance.delete_org_connections( connection_id=ORG_CONNECTION_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Org Connection ``` # Delete Org Connection returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrgConnectionsAPI.new # there is a valid "org_connection" in the system ORG_CONNECTION_DATA_ID = ENV["ORG_CONNECTION_DATA_ID"] p api_instance.delete_org_connections(ORG_CONNECTION_DATA_ID) 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Org Connection ``` // Delete Org Connection returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" "github.com/google/uuid" ) func main() { // there is a valid "org_connection" in the system OrgConnectionDataID := uuid.MustParse(os.Getenv("ORG_CONNECTION_DATA_ID")) ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrgConnectionsApi(apiClient) r, err := api.DeleteOrgConnections(ctx, OrgConnectionDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrgConnectionsApi.DeleteOrgConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Org Connection ``` // Delete Org Connection returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrgConnectionsApi; import java.util.UUID; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrgConnectionsApi apiInstance = new OrgConnectionsApi(defaultClient); // there is a valid "org_connection" in the system UUID ORG_CONNECTION_DATA_ID = null; try { ORG_CONNECTION_DATA_ID = UUID.fromString(System.getenv("ORG_CONNECTION_DATA_ID")); } catch (IllegalArgumentException e) { System.err.println("Error parsing UUID: " + e.getMessage()); } try { apiInstance.deleteOrgConnections(ORG_CONNECTION_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling OrgConnectionsApi#deleteOrgConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Org Connection ``` // Delete Org Connection returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_org_connections::OrgConnectionsAPI; #[tokio::main] async fn main() { // there is a valid "org_connection" in the system let org_connection_data_id = uuid::Uuid::parse_str(&std::env::var("ORG_CONNECTION_DATA_ID").unwrap()) .expect("Invalid UUID"); let configuration = datadog::Configuration::new(); let api = OrgConnectionsAPI::with_config(configuration); let resp = api 
.delete_org_connections(org_connection_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Org Connection ``` /** * Delete Org Connection returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrgConnectionsApi(configuration); // there is a valid "org_connection" in the system const ORG_CONNECTION_DATA_ID = process.env.ORG_CONNECTION_DATA_ID as string; const params: v2.OrgConnectionsApiDeleteOrgConnectionsRequest = { connectionId: ORG_CONNECTION_DATA_ID, }; apiInstance .deleteOrgConnections(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/organizations/ # Organizations Create, edit, and manage your organizations. Read more about [multi-org accounts](https://docs.datadoghq.com/account_management/multi_organization). 
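The child-organization endpoint documented below returns the new org's `public_id` together with an API key and an application key that can be used to interact with that child org. As a quick orientation, here is a hedged Python sketch (not an official example) that creates a child org and prints those identifiers; it assumes the multi-organization account feature is enabled for your account and that the create response exposes `org`, `api_key`, and `application_key` as attributes.

```
"""
Sketch: create a child organization and capture the identifiers returned
for it. Assumes the multi-organization account feature is enabled and that
the response exposes `org`, `api_key`, and `application_key` attributes.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.organizations_api import OrganizationsApi
from datadog_api_client.v1.model.organization_create_body import OrganizationCreateBody

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = OrganizationsApi(api_client)
    response = api_instance.create_child_org(
        body=OrganizationCreateBody(name="New child org")
    )

    # Identifiers needed to act on the new child organization.
    print("org public_id:", response.org.public_id)
    print("api key:", response.api_key.key)
    print("application key hash:", response.application_key.hash)
```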
## [Create a child organization](https://docs.datadoghq.com/api/latest/organizations/#create-a-child-organization) * [v1 (latest)](https://docs.datadoghq.com/api/latest/organizations/#create-a-child-organization-v1) POST https://api.ap1.datadoghq.com/api/v1/orghttps://api.ap2.datadoghq.com/api/v1/orghttps://api.datadoghq.eu/api/v1/orghttps://api.ddog-gov.com/api/v1/orghttps://api.datadoghq.com/api/v1/orghttps://api.us3.datadoghq.com/api/v1/orghttps://api.us5.datadoghq.com/api/v1/org ### Overview Create a child organization. This endpoint requires the [multi-organization account](https://docs.datadoghq.com/account_management/multi_organization/) feature and must be enabled by [contacting support](https://docs.datadoghq.com/help/). Once a new child organization is created, you can interact with it by using the `org.public_id`, `api_key.key`, and `application_key.hash` provided in the response. This endpoint requires the `org_management` permission. ### Request #### Body Data (required) Organization object that needs to be created * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Field Type Description billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. name [_required_] string The name of the new child-organization, limited to 32 characters. subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. ``` { "billing": { "type": "string" }, "name": "New child org", "subscription": { "type": "string" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#CreateChildOrg-200-v1) * [400](https://docs.datadoghq.com/api/latest/organizations/#CreateChildOrg-400-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#CreateChildOrg-403-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#CreateChildOrg-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Response object for an organization creation. Field Type Description api_key object Datadog API key. created string Date of creation of the API key. created_by string Datadog user handle that created the API key. key string API key. name string Name of your API key. application_key object An application key with its associated metadata. hash string Hash of an application key. name string Name of an application key. owner string Owner of an application key. org object Create, edit, and manage organizations. billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. created string Date of the organization creation. description string Description of the organization. name string The name of the child organization, limited to 32 characters. public_id string The `public_id` of the organization you are operating within. settings object A JSON array of settings. private_widget_share boolean Whether or not the organization users can share widgets outside of Datadog. saml object Set the boolean property enabled to enable or disable single sign on with SAML. See the SAML documentation for more information about all SAML settings. enabled boolean Whether or not SAML is enabled for this organization. saml_autocreate_access_role enum The access role of the user. 
Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` saml_autocreate_users_domains object Has two properties, `enabled` (boolean) and `domains`, which is a list of domains without the @ symbol. domains [string] List of domains where the SAML automated user creation is enabled. enabled boolean Whether or not the automated user creation based on SAML domain is enabled. saml_can_be_enabled boolean Whether or not SAML can be enabled for this organization. saml_idp_endpoint string Identity provider endpoint for SAML authentication. saml_idp_initiated_login object Has one property enabled (boolean). enabled boolean Whether SAML IdP initiated login is enabled, learn more in the [SAML documentation](https://docs.datadoghq.com/account_management/saml/#idp-initiated-login). saml_idp_metadata_uploaded boolean Whether or not a SAML identity provider metadata file was provided to the Datadog organization. saml_login_url string URL for SAML logging. saml_strict_mode object Has one property enabled (boolean). enabled boolean Whether or not the SAML strict mode is enabled. If true, all users must log in with SAML. Learn more on the [SAML Strict documentation](https://docs.datadoghq.com/account_management/saml/#saml-strict). subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. trial boolean Only available for MSP customers. Allows child organizations to be created on a trial plan. user object Create, edit, and disable users. access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "api_key": { "created": "2019-08-02 15:31:07", "created_by": "john@example.com", "key": "1234512345123456abcabc912349abcd", "name": "example user" }, "application_key": { "hash": "1234512345123459cda4eb9ced49a3d84fd0138c", "name": "example user", "owner": "example.com" }, "org": { "billing": { "type": "string" }, "created": "2019-09-26T17:28:28Z", "description": "some description", "name": "New child org", "public_id": "abcdef12345", "settings": { "private_widget_share": false, "saml": { "enabled": false }, "saml_autocreate_access_role": "ro", "saml_autocreate_users_domains": { "domains": [ "example.com" ], "enabled": false }, "saml_can_be_enabled": false, "saml_idp_endpoint": "https://my.saml.endpoint", "saml_idp_initiated_login": { "enabled": false }, "saml_idp_metadata_uploaded": false, "saml_login_url": "https://my.saml.login.url", "saml_strict_mode": { "enabled": false } }, "subscription": { "type": "string" }, "trial": false }, "user": { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "icon": "/path/to/matching/gravatar/icon", "name": "test user", "verified": true } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Create a child organization Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/org" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "New child org" } EOF ``` ##### Create a child organization ``` """ Create a child organization returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi from datadog_api_client.v1.model.organization_billing import OrganizationBilling from datadog_api_client.v1.model.organization_create_body import OrganizationCreateBody from datadog_api_client.v1.model.organization_subscription import OrganizationSubscription body = OrganizationCreateBody( billing=OrganizationBilling( type="parent_billing", ), name="New child org", subscription=OrganizationSubscription( type="pro", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.create_child_org(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a child organization ``` # Create a child organization returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new body = DatadogAPIClient::V1::OrganizationCreateBody.new({ billing: DatadogAPIClient::V1::OrganizationBilling.new({ type: "parent_billing", }), name: "New child org", subscription: DatadogAPIClient::V1::OrganizationSubscription.new({ type: "pro", }), }) p api_instance.create_child_org(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the 
example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a child organization ``` // Create a child organization returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.OrganizationCreateBody{ Billing: &datadogV1.OrganizationBilling{ Type: datadog.PtrString("parent_billing"), }, Name: "New child org", Subscription: &datadogV1.OrganizationSubscription{ Type: datadog.PtrString("pro"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.CreateChildOrg(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.CreateChildOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.CreateChildOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a child organization ``` // Create a child organization returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.OrganizationBilling; import com.datadog.api.client.v1.model.OrganizationCreateBody; import com.datadog.api.client.v1.model.OrganizationCreateResponse; import com.datadog.api.client.v1.model.OrganizationSubscription; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); OrganizationCreateBody body = new OrganizationCreateBody() .billing(new OrganizationBilling().type("parent_billing")) .name("New child org") .subscription(new OrganizationSubscription().type("pro")); try { OrganizationCreateResponse result = apiInstance.createChildOrg(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#createChildOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a child organization ``` // Create a child organization returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; use 
datadog_api_client::datadogV1::model::OrganizationBilling; use datadog_api_client::datadogV1::model::OrganizationCreateBody; use datadog_api_client::datadogV1::model::OrganizationSubscription; #[tokio::main] async fn main() { let body = OrganizationCreateBody::new("New child org".to_string()) .billing(OrganizationBilling::new().type_("parent_billing".to_string())) .subscription(OrganizationSubscription::new().type_("pro".to_string())); let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.create_child_org(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a child organization ``` /** * Create a child organization returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); const params: v1.OrganizationsApiCreateChildOrgRequest = { body: { billing: { type: "parent_billing", }, name: "New child org", subscription: { type: "pro", }, }, }; apiInstance .createChildOrg(params) .then((data: v1.OrganizationCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List your managed organizations](https://docs.datadoghq.com/api/latest/organizations/#list-your-managed-organizations) * [v1 (latest)](https://docs.datadoghq.com/api/latest/organizations/#list-your-managed-organizations-v1) GET https://api.ap1.datadoghq.com/api/v1/orghttps://api.ap2.datadoghq.com/api/v1/orghttps://api.datadoghq.eu/api/v1/orghttps://api.ddog-gov.com/api/v1/orghttps://api.datadoghq.com/api/v1/orghttps://api.us3.datadoghq.com/api/v1/orghttps://api.us5.datadoghq.com/api/v1/org ### Overview This endpoint returns data on your top-level organization. This endpoint requires the `org_management` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#ListOrgs-200-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#ListOrgs-403-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#ListOrgs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Response with the list of organizations. Field Type Description orgs [object] Array of organization objects. billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. created string Date of the organization creation. description string Description of the organization. name string The name of the child organization, limited to 32 characters. 
public_id string The `public_id` of the organization you are operating within. settings object A JSON array of settings. private_widget_share boolean Whether or not the organization users can share widgets outside of Datadog. saml object Set the boolean property enabled to enable or disable single sign on with SAML. See the SAML documentation for more information about all SAML settings. enabled boolean Whether or not SAML is enabled for this organization. saml_autocreate_access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` saml_autocreate_users_domains object Has two properties, `enabled` (boolean) and `domains`, which is a list of domains without the @ symbol. domains [string] List of domains where the SAML automated user creation is enabled. enabled boolean Whether or not the automated user creation based on SAML domain is enabled. saml_can_be_enabled boolean Whether or not SAML can be enabled for this organization. saml_idp_endpoint string Identity provider endpoint for SAML authentication. saml_idp_initiated_login object Has one property enabled (boolean). enabled boolean Whether SAML IdP initiated login is enabled, learn more in the [SAML documentation](https://docs.datadoghq.com/account_management/saml/#idp-initiated-login). saml_idp_metadata_uploaded boolean Whether or not a SAML identity provider metadata file was provided to the Datadog organization. saml_login_url string URL for SAML logging. saml_strict_mode object Has one property enabled (boolean). enabled boolean Whether or not the SAML strict mode is enabled. If true, all users must log in with SAML. Learn more on the [SAML Strict documentation](https://docs.datadoghq.com/account_management/saml/#saml-strict). subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. trial boolean Only available for MSP customers. Allows child organizations to be created on a trial plan. ``` { "orgs": [ { "billing": { "type": "string" }, "created": "2019-09-26T17:28:28Z", "description": "some description", "name": "New child org", "public_id": "abcdef12345", "settings": { "private_widget_share": false, "saml": { "enabled": false }, "saml_autocreate_access_role": "ro", "saml_autocreate_users_domains": { "domains": [ "example.com" ], "enabled": false }, "saml_can_be_enabled": false, "saml_idp_endpoint": "https://my.saml.endpoint", "saml_idp_initiated_login": { "enabled": false }, "saml_idp_metadata_uploaded": false, "saml_login_url": "https://my.saml.login.url", "saml_strict_mode": { "enabled": false } }, "subscription": { "type": "string" }, "trial": false } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
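Since each entry in the `orgs` array above carries a `public_id`, the listing pairs naturally with the Get organization information endpoint documented below. A short sketch under that assumption, reusing the Python client from the generated examples:

```python
# Sketch: list managed organizations, then fetch the full record for each public_id.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.organizations_api import OrganizationsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = OrganizationsApi(api_client)
    listing = api_instance.list_orgs()
    for org in listing.orgs:
        print(org.public_id, org.name)
        # Full details, including SAML settings, come from the per-organization endpoint.
        print(api_instance.get_org(public_id=org.public_id))
```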
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### List your managed organizations ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/org" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List your managed organizations ``` """ List your managed organizations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.list_orgs() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List your managed organizations ``` # List your managed organizations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new p api_instance.list_orgs() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List your managed organizations ``` // List your managed organizations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.ListOrgs(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.ListOrgs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.ListOrgs`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run
"main.go" ``` ##### List your managed organizations ``` // List your managed organizations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.OrganizationListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { OrganizationListResponse result = apiInstance.listOrgs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#listOrgs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List your managed organizations ``` // List your managed organizations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.list_orgs().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List your managed organizations ``` /** * List your managed organizations returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); apiInstance .listOrgs() .then((data: v1.OrganizationListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get organization information](https://docs.datadoghq.com/api/latest/organizations/#get-organization-information) * [v1 (latest)](https://docs.datadoghq.com/api/latest/organizations/#get-organization-information-v1) GET https://api.ap1.datadoghq.com/api/v1/org/{public_id}https://api.ap2.datadoghq.com/api/v1/org/{public_id}https://api.datadoghq.eu/api/v1/org/{public_id}https://api.ddog-gov.com/api/v1/org/{public_id}https://api.datadoghq.com/api/v1/org/{public_id}https://api.us3.datadoghq.com/api/v1/org/{public_id}https://api.us5.datadoghq.com/api/v1/org/{public_id} ### Overview Get organization information. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The `public_id` of the organization you are operating within. ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#GetOrg-200-v1) * [400](https://docs.datadoghq.com/api/latest/organizations/#GetOrg-400-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#GetOrg-403-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#GetOrg-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Response with an organization. Field Type Description org object Create, edit, and manage organizations. billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. created string Date of the organization creation. description string Description of the organization. name string The name of the child organization, limited to 32 characters. public_id string The `public_id` of the organization you are operating within. settings object A JSON array of settings. private_widget_share boolean Whether or not the organization users can share widgets outside of Datadog. saml object Set the boolean property enabled to enable or disable single sign on with SAML. See the SAML documentation for more information about all SAML settings. enabled boolean Whether or not SAML is enabled for this organization. saml_autocreate_access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` saml_autocreate_users_domains object Has two properties, `enabled` (boolean) and `domains`, which is a list of domains without the @ symbol. domains [string] List of domains where the SAML automated user creation is enabled. enabled boolean Whether or not the automated user creation based on SAML domain is enabled. saml_can_be_enabled boolean Whether or not SAML can be enabled for this organization. saml_idp_endpoint string Identity provider endpoint for SAML authentication. saml_idp_initiated_login object Has one property enabled (boolean). enabled boolean Whether SAML IdP initiated login is enabled, learn more in the [SAML documentation](https://docs.datadoghq.com/account_management/saml/#idp-initiated-login). 
saml_idp_metadata_uploaded boolean Whether or not a SAML identity provider metadata file was provided to the Datadog organization. saml_login_url string URL for SAML logging. saml_strict_mode object Has one property enabled (boolean). enabled boolean Whether or not the SAML strict mode is enabled. If true, all users must log in with SAML. Learn more on the [SAML Strict documentation](https://docs.datadoghq.com/account_management/saml/#saml-strict). subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. trial boolean Only available for MSP customers. Allows child organizations to be created on a trial plan. ``` { "org": { "billing": { "type": "string" }, "created": "2019-09-26T17:28:28Z", "description": "some description", "name": "New child org", "public_id": "abcdef12345", "settings": { "private_widget_share": false, "saml": { "enabled": false }, "saml_autocreate_access_role": "ro", "saml_autocreate_users_domains": { "domains": [ "example.com" ], "enabled": false }, "saml_can_be_enabled": false, "saml_idp_endpoint": "https://my.saml.endpoint", "saml_idp_initiated_login": { "enabled": false }, "saml_idp_metadata_uploaded": false, "saml_login_url": "https://my.saml.login.url", "saml_strict_mode": { "enabled": false } }, "subscription": { "type": "string" }, "trial": false } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
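Most of the fields above live in the nested `settings` object. One way to read the SAML-related flags out of a Get organization information response with the Python client is to convert the model to a plain dictionary first, which keeps missing optional fields from getting in the way; `abc123` is the placeholder `public_id` used throughout this page:

```python
# Sketch: fetch one organization and inspect its SAML-related settings.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.organizations_api import OrganizationsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    response = OrganizationsApi(api_client).get_org(public_id="abc123")

# to_dict() flattens the response model so optional fields can be read with .get().
settings = response.to_dict().get("org", {}).get("settings", {})
print("SAML enabled:         ", settings.get("saml", {}).get("enabled"))
print("SAML strict mode:     ", settings.get("saml_strict_mode", {}).get("enabled"))
print("IdP metadata uploaded:", settings.get("saml_idp_metadata_uploaded"))
```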
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Get organization information ``` # Path parameters export public_id="abc123" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/org/${public_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get organization information ``` """ Get organization information returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.get_org( public_id="abc123", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get organization information ``` # Get organization information returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new p api_instance.get_org("abc123") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get organization information ``` // Get organization information returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.GetOrg(ctx, "abc123") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.GetOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.GetOrg`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get organization information ``` // Get organization information returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.OrganizationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { OrganizationResponse result = apiInstance.getOrg("abc123"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#getOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get organization information ``` // Get organization information returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.get_org("abc123".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get organization information ``` /** * Get organization information returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); const params: v1.OrganizationsApiGetOrgRequest = { publicId: "abc123", }; apiInstance .getOrg(params) .then((data: v1.OrganizationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update your organization](https://docs.datadoghq.com/api/latest/organizations/#update-your-organization) * [v1 (latest)](https://docs.datadoghq.com/api/latest/organizations/#update-your-organization-v1) PUT https://api.ap1.datadoghq.com/api/v1/org/{public_id}https://api.ap2.datadoghq.com/api/v1/org/{public_id}https://api.datadoghq.eu/api/v1/org/{public_id}https://api.ddog-gov.com/api/v1/org/{public_id}https://api.datadoghq.com/api/v1/org/{public_id}https://api.us3.datadoghq.com/api/v1/org/{public_id}https://api.us5.datadoghq.com/api/v1/org/{public_id} ### Overview Update your organization. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The `public_id` of the organization you are operating within. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Field Type Description billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. created string Date of the organization creation. description string Description of the organization. name string The name of the child organization, limited to 32 characters. public_id string The `public_id` of the organization you are operating within. settings object A JSON array of settings. private_widget_share boolean Whether or not the organization users can share widgets outside of Datadog. saml object Set the boolean property enabled to enable or disable single sign on with SAML. See the SAML documentation for more information about all SAML settings. enabled boolean Whether or not SAML is enabled for this organization. saml_autocreate_access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` saml_autocreate_users_domains object Has two properties, `enabled` (boolean) and `domains`, which is a list of domains without the @ symbol. domains [string] List of domains where the SAML automated user creation is enabled. enabled boolean Whether or not the automated user creation based on SAML domain is enabled. saml_can_be_enabled boolean Whether or not SAML can be enabled for this organization. saml_idp_endpoint string Identity provider endpoint for SAML authentication. saml_idp_initiated_login object Has one property enabled (boolean). enabled boolean Whether SAML IdP initiated login is enabled, learn more in the [SAML documentation](https://docs.datadoghq.com/account_management/saml/#idp-initiated-login). saml_idp_metadata_uploaded boolean Whether or not a SAML identity provider metadata file was provided to the Datadog organization. saml_login_url string URL for SAML logging. saml_strict_mode object Has one property enabled (boolean). enabled boolean Whether or not the SAML strict mode is enabled. If true, all users must log in with SAML. Learn more on the [SAML Strict documentation](https://docs.datadoghq.com/account_management/saml/#saml-strict). 
subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. trial boolean Only available for MSP customers. Allows child organizations to be created on a trial plan. ``` { "billing": { "type": "string" }, "description": "some description", "name": "New child org", "public_id": "abcdef12345", "settings": { "private_widget_share": false, "saml": { "enabled": false }, "saml_autocreate_access_role": "ro", "saml_autocreate_users_domains": { "domains": [ "example.com" ], "enabled": false }, "saml_can_be_enabled": false, "saml_idp_endpoint": "https://my.saml.endpoint", "saml_idp_initiated_login": { "enabled": false }, "saml_idp_metadata_uploaded": false, "saml_login_url": "https://my.saml.login.url", "saml_strict_mode": { "enabled": false } }, "subscription": { "type": "string" }, "trial": false } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrg-200-v1) * [400](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrg-400-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrg-403-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrg-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Response with an organization. Field Type Description org object Create, edit, and manage organizations. billing object **DEPRECATED** : A JSON array of billing type. type string The type of billing. Only `parent_billing` is supported. created string Date of the organization creation. description string Description of the organization. name string The name of the child organization, limited to 32 characters. public_id string The `public_id` of the organization you are operating within. settings object A JSON array of settings. private_widget_share boolean Whether or not the organization users can share widgets outside of Datadog. saml object Set the boolean property enabled to enable or disable single sign on with SAML. See the SAML documentation for more information about all SAML settings. enabled boolean Whether or not SAML is enabled for this organization. saml_autocreate_access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` saml_autocreate_users_domains object Has two properties, `enabled` (boolean) and `domains`, which is a list of domains without the @ symbol. domains [string] List of domains where the SAML automated user creation is enabled. enabled boolean Whether or not the automated user creation based on SAML domain is enabled. saml_can_be_enabled boolean Whether or not SAML can be enabled for this organization. saml_idp_endpoint string Identity provider endpoint for SAML authentication. saml_idp_initiated_login object Has one property enabled (boolean). enabled boolean Whether SAML IdP initiated login is enabled, learn more in the [SAML documentation](https://docs.datadoghq.com/account_management/saml/#idp-initiated-login). saml_idp_metadata_uploaded boolean Whether or not a SAML identity provider metadata file was provided to the Datadog organization. saml_login_url string URL for SAML logging. saml_strict_mode object Has one property enabled (boolean). enabled boolean Whether or not the SAML strict mode is enabled. If true, all users must log in with SAML. 
Learn more on the [SAML Strict documentation](https://docs.datadoghq.com/account_management/saml/#saml-strict). subscription object **DEPRECATED** : Subscription definition. type string The subscription type. Types available are `trial`, `free`, and `pro`. trial boolean Only available for MSP customers. Allows child organizations to be created on a trial plan. ``` { "org": { "billing": { "type": "string" }, "created": "2019-09-26T17:28:28Z", "description": "some description", "name": "New child org", "public_id": "abcdef12345", "settings": { "private_widget_share": false, "saml": { "enabled": false }, "saml_autocreate_access_role": "ro", "saml_autocreate_users_domains": { "domains": [ "example.com" ], "enabled": false }, "saml_can_be_enabled": false, "saml_idp_endpoint": "https://my.saml.endpoint", "saml_idp_initiated_login": { "enabled": false }, "saml_idp_metadata_uploaded": false, "saml_login_url": "https://my.saml.login.url", "saml_strict_mode": { "enabled": false } }, "subscription": { "type": "string" }, "trial": false } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
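Every field in the request body above is optional (the curl example below even posts an empty `{}`), so an update does not have to resend the whole organization. The following is a sketch that only flips SAML strict mode, under the assumption that fields omitted from the body are left unchanged by the endpoint:

```python
# Sketch: update a single setting (SAML strict mode) rather than the full organization body.
# Assumes omitted fields are left untouched; verify this behavior before relying on it.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.organizations_api import OrganizationsApi
from datadog_api_client.v1.model.organization import Organization
from datadog_api_client.v1.model.organization_settings import OrganizationSettings
from datadog_api_client.v1.model.organization_settings_saml_strict_mode import (
    OrganizationSettingsSamlStrictMode,
)

body = Organization(
    settings=OrganizationSettings(
        saml_strict_mode=OrganizationSettingsSamlStrictMode(enabled=True),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    print(OrganizationsApi(api_client).update_org(public_id="abc123", body=body))
```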
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Update your organization ``` # Path parameters export public_id="abc123" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v1/org/${public_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Update your organization ``` """ Update your organization returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi from datadog_api_client.v1.model.access_role import AccessRole from datadog_api_client.v1.model.organization import Organization from datadog_api_client.v1.model.organization_billing import OrganizationBilling from datadog_api_client.v1.model.organization_settings import OrganizationSettings from datadog_api_client.v1.model.organization_settings_saml import OrganizationSettingsSaml from datadog_api_client.v1.model.organization_settings_saml_autocreate_users_domains import ( OrganizationSettingsSamlAutocreateUsersDomains, ) from datadog_api_client.v1.model.organization_settings_saml_idp_initiated_login import ( OrganizationSettingsSamlIdpInitiatedLogin, ) from datadog_api_client.v1.model.organization_settings_saml_strict_mode import OrganizationSettingsSamlStrictMode from datadog_api_client.v1.model.organization_subscription import OrganizationSubscription body = Organization( billing=OrganizationBilling( type="parent_billing", ), description="some description", name="New child org", public_id="abcdef12345", settings=OrganizationSettings( private_widget_share=False, saml=OrganizationSettingsSaml( enabled=False, ), saml_autocreate_access_role=AccessRole.READ_ONLY, saml_autocreate_users_domains=OrganizationSettingsSamlAutocreateUsersDomains( domains=[ "example.com", ], enabled=False, ), saml_can_be_enabled=False, saml_idp_endpoint="https://my.saml.endpoint", saml_idp_initiated_login=OrganizationSettingsSamlIdpInitiatedLogin( enabled=False, ), saml_idp_metadata_uploaded=False, saml_login_url="https://my.saml.login.url", saml_strict_mode=OrganizationSettingsSamlStrictMode( enabled=False, ), ), subscription=OrganizationSubscription( type="pro", ), trial=False, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.update_org(public_id="abc123", body=body) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com"
DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update your organization ``` # Update your organization returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new body = DatadogAPIClient::V1::Organization.new({ billing: DatadogAPIClient::V1::OrganizationBilling.new({ type: "parent_billing", }), description: "some description", name: "New child org", public_id: "abcdef12345", settings: DatadogAPIClient::V1::OrganizationSettings.new({ private_widget_share: false, saml: DatadogAPIClient::V1::OrganizationSettingsSaml.new({ enabled: false, }), saml_autocreate_access_role: DatadogAPIClient::V1::AccessRole::READ_ONLY, saml_autocreate_users_domains: DatadogAPIClient::V1::OrganizationSettingsSamlAutocreateUsersDomains.new({ domains: [ "example.com", ], enabled: false, }), saml_can_be_enabled: false, saml_idp_endpoint: "https://my.saml.endpoint", saml_idp_initiated_login: DatadogAPIClient::V1::OrganizationSettingsSamlIdpInitiatedLogin.new({ enabled: false, }), saml_idp_metadata_uploaded: false, saml_login_url: "https://my.saml.login.url", saml_strict_mode: DatadogAPIClient::V1::OrganizationSettingsSamlStrictMode.new({ enabled: false, }), }), subscription: DatadogAPIClient::V1::OrganizationSubscription.new({ type: "pro", }), trial: false, }) p api_instance.update_org("abc123", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update your organization ``` // Update your organization returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.Organization{ Billing: &datadogV1.OrganizationBilling{ Type: datadog.PtrString("parent_billing"), }, Description: datadog.PtrString("some description"), Name: datadog.PtrString("New child org"), PublicId: datadog.PtrString("abcdef12345"), Settings: &datadogV1.OrganizationSettings{ PrivateWidgetShare: datadog.PtrBool(false), Saml: &datadogV1.OrganizationSettingsSaml{ Enabled: datadog.PtrBool(false), }, SamlAutocreateAccessRole: *datadogV1.NewNullableAccessRole(datadogV1.ACCESSROLE_READ_ONLY.Ptr()), SamlAutocreateUsersDomains: &datadogV1.OrganizationSettingsSamlAutocreateUsersDomains{ Domains: []string{ "example.com", }, Enabled: datadog.PtrBool(false), }, SamlCanBeEnabled: datadog.PtrBool(false), SamlIdpEndpoint: datadog.PtrString("https://my.saml.endpoint"), SamlIdpInitiatedLogin: &datadogV1.OrganizationSettingsSamlIdpInitiatedLogin{ Enabled: datadog.PtrBool(false), }, SamlIdpMetadataUploaded: datadog.PtrBool(false), SamlLoginUrl: datadog.PtrString("https://my.saml.login.url"), SamlStrictMode: &datadogV1.OrganizationSettingsSamlStrictMode{ Enabled: datadog.PtrBool(false), }, }, Subscription: &datadogV1.OrganizationSubscription{ Type: datadog.PtrString("pro"), }, Trial: datadog.PtrBool(false), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.UpdateOrg(ctx, "abc123", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`OrganizationsApi.UpdateOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.UpdateOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update your organization ``` // Update your organization returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.AccessRole; import com.datadog.api.client.v1.model.Organization; import com.datadog.api.client.v1.model.OrganizationBilling; import com.datadog.api.client.v1.model.OrganizationResponse; import com.datadog.api.client.v1.model.OrganizationSettings; import com.datadog.api.client.v1.model.OrganizationSettingsSaml; import com.datadog.api.client.v1.model.OrganizationSettingsSamlAutocreateUsersDomains; import com.datadog.api.client.v1.model.OrganizationSettingsSamlIdpInitiatedLogin; import com.datadog.api.client.v1.model.OrganizationSettingsSamlStrictMode; import com.datadog.api.client.v1.model.OrganizationSubscription; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); Organization body = new Organization() .billing(new OrganizationBilling().type("parent_billing")) .description("some description") .name("New child org") .publicId("abcdef12345") .settings( new OrganizationSettings() .privateWidgetShare(false) .saml(new OrganizationSettingsSaml().enabled(false)) .samlAutocreateAccessRole(AccessRole.READ_ONLY) .samlAutocreateUsersDomains( new OrganizationSettingsSamlAutocreateUsersDomains() .domains(Collections.singletonList("example.com")) .enabled(false)) .samlCanBeEnabled(false) .samlIdpEndpoint("https://my.saml.endpoint") .samlIdpInitiatedLogin( new OrganizationSettingsSamlIdpInitiatedLogin().enabled(false)) .samlIdpMetadataUploaded(false) .samlLoginUrl("https://my.saml.login.url") .samlStrictMode(new OrganizationSettingsSamlStrictMode().enabled(false))) .subscription(new OrganizationSubscription().type("pro")) .trial(false); try { OrganizationResponse result = apiInstance.updateOrg("abc123", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#updateOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update your organization ``` // Update your organization returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; use datadog_api_client::datadogV1::model::AccessRole; use datadog_api_client::datadogV1::model::Organization; use datadog_api_client::datadogV1::model::OrganizationBilling; use datadog_api_client::datadogV1::model::OrganizationSettings; use datadog_api_client::datadogV1::model::OrganizationSettingsSaml; use datadog_api_client::datadogV1::model::OrganizationSettingsSamlAutocreateUsersDomains; use datadog_api_client::datadogV1::model::OrganizationSettingsSamlIdpInitiatedLogin; use datadog_api_client::datadogV1::model::OrganizationSettingsSamlStrictMode; use datadog_api_client::datadogV1::model::OrganizationSubscription; #[tokio::main] async fn main() { let body = Organization::new() .billing(OrganizationBilling::new().type_("parent_billing".to_string())) .description("some description".to_string()) .name("New child org".to_string()) .public_id("abcdef12345".to_string()) .settings( OrganizationSettings::new() .private_widget_share(false) .saml(OrganizationSettingsSaml::new().enabled(false)) .saml_autocreate_access_role(Some(AccessRole::READ_ONLY)) .saml_autocreate_users_domains( OrganizationSettingsSamlAutocreateUsersDomains::new() .domains(vec!["example.com".to_string()]) .enabled(false), ) .saml_can_be_enabled(false) .saml_idp_endpoint("https://my.saml.endpoint".to_string()) .saml_idp_initiated_login( OrganizationSettingsSamlIdpInitiatedLogin::new().enabled(false), ) .saml_idp_metadata_uploaded(false) .saml_login_url("https://my.saml.login.url".to_string()) .saml_strict_mode(OrganizationSettingsSamlStrictMode::new().enabled(false)), ) .subscription(OrganizationSubscription::new().type_("pro".to_string())) .trial(false); let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.update_org("abc123".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update your organization ``` /** * Update your organization returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); const params: v1.OrganizationsApiUpdateOrgRequest = { body: { billing: { type: "parent_billing", }, description: "some description", name: "New child org", publicId: "abcdef12345", settings: { privateWidgetShare: false, saml: { enabled: false, }, samlAutocreateAccessRole: "ro", samlAutocreateUsersDomains: { domains: ["example.com"], enabled: false, }, samlCanBeEnabled: false, samlIdpEndpoint: "https://my.saml.endpoint", samlIdpInitiatedLogin: { enabled: false, }, samlIdpMetadataUploaded: false, samlLoginUrl: "https://my.saml.login.url", samlStrictMode: { enabled: false, }, }, subscription: { type: "pro", }, trial: false, }, publicId: "abc123", }; apiInstance .updateOrg(params) .then((data: v1.OrganizationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Upload IdP metadata](https://docs.datadoghq.com/api/latest/organizations/#upload-idp-metadata) * [v1](https://docs.datadoghq.com/api/latest/organizations/#upload-idp-metadata-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/organizations/#upload-idp-metadata-v2) POST https://api.ap1.datadoghq.com/api/v1/org/{public_id}/idp_metadatahttps://api.ap2.datadoghq.com/api/v1/org/{public_id}/idp_metadatahttps://api.datadoghq.eu/api/v1/org/{public_id}/idp_metadatahttps://api.ddog-gov.com/api/v1/org/{public_id}/idp_metadatahttps://api.datadoghq.com/api/v1/org/{public_id}/idp_metadatahttps://api.us3.datadoghq.com/api/v1/org/{public_id}/idp_metadatahttps://api.us5.datadoghq.com/api/v1/org/{public_id}/idp_metadata ### Overview There are a couple of options for updating the Identity Provider (IdP) metadata from your SAML IdP. * **Multipart Form-Data** : Post the IdP metadata file using a form post. * **XML Body:** Post the IdP metadata file as the body of the request. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The `public_id` of the organization you are operating with ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Expand All Field Type Description idp_file [_required_] binary The path to the XML metadata file you wish to upload. ``` { "idp_file": "" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPForOrg-200-v1) * [400](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPForOrg-400-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPForOrg-403-v1) * [415](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPForOrg-415-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPForOrg-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) The IdP response object. Expand All Field Type Description message [_required_] string Identity provider response. ``` { "message": "IdP metadata successfully uploaded for example org" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Unsupported Media Type * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. 
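The overview above lists a raw XML body as an alternative to the multipart form upload, while the generated examples that follow only show the multipart flavor. Here is a standard-library sketch of the XML-body variant, with `application/xml` as an assumed content type; the endpoint path and auth headers mirror the curl example for this endpoint:

```python
# Sketch: post the IdP metadata file as the raw request body instead of multipart form data.
import os
import urllib.request

public_id = "abc123"
site = os.environ.get("DD_SITE", "datadoghq.com")
url = f"https://api.{site}/api/v1/org/{public_id}/idp_metadata"

with open("./idp_metadata.xml", "rb") as f:
    metadata = f.read()

request = urllib.request.Request(
    url,
    data=metadata,
    method="POST",
    headers={
        "Content-Type": "application/xml",  # assumption; multipart/form-data is the documented default
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)

with urllib.request.urlopen(request) as response:
    print(response.read().decode())
```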
Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Upload IdP metadata ``` # Path parameters export public_id="abc123" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v1/org/${public_id}/idp_metadata" \ -H "Accept: application/json" \ -H "Content-Type: multipart/form-data" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -F idp_file=@./idp_metadata.xml ``` ##### Upload IdP metadata ``` """ Upload IdP metadata returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.upload_idp_for_org( public_id="abc123", idp_file=open("./idp_metadata.xml", "rb"), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Upload IdP metadata ``` # Upload IdP metadata returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new p api_instance.upload_idp_for_org("abc123", File.open("./idp_metadata.xml", "r")) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response package main import ( "context" "encoding/json" "fmt" "io" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.UploadIdPForOrg(ctx, "abc123", func() io.Reader { fp, _ :=
os.Open("./idp_metadata.xml"); return fp }()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.UploadIdPForOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.UploadIdPForOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.IdpResponse; import java.io.File; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { IdpResponse result = apiInstance.uploadIdPForOrg("abc123", new File("./idp_metadata.xml")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#uploadIdPForOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; use std::fs; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api .upload_idp_for_org( "abc123".to_string(), fs::read("./idp_metadata.xml").unwrap(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Upload IdP metadata ``` /** * Upload IdP metadata returns "OK" response */ import * as fs from "fs"; import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); const params: v1.OrganizationsApiUploadIdPForOrgRequest = { publicId: "abc123", idpFile: { data: Buffer.from(fs.readFileSync("./idp_metadata.xml", "utf8")), name: "./idp_metadata.xml", }, }; apiInstance .uploadIdPForOrg(params) .then((data: v1.IdpResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
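The examples above cover only the multipart form-data option of the v1 endpoint. For the XML-body option mentioned in the overview, the sketch below posts the metadata file as the raw request body; it assumes the endpoint accepts an `application/xml` content type and uses a hypothetical local path `./idp_metadata.xml`, so adjust both for your setup (a rejected content type returns the 415 response documented above).

```
# Minimal sketch of the XML-body variant (assumption: the endpoint accepts
# the metadata as a raw application/xml request body)
export public_id="abc123"

curl -X POST "https://api.datadoghq.com/api/v1/org/${public_id}/idp_metadata" \
  -H "Accept: application/json" \
  -H "Content-Type: application/xml" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  --data-binary @./idp_metadata.xml
```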
POST https://api.ap1.datadoghq.com/api/v2/saml_configurations/idp_metadatahttps://api.ap2.datadoghq.com/api/v2/saml_configurations/idp_metadatahttps://api.datadoghq.eu/api/v2/saml_configurations/idp_metadatahttps://api.ddog-gov.com/api/v2/saml_configurations/idp_metadatahttps://api.datadoghq.com/api/v2/saml_configurations/idp_metadatahttps://api.us3.datadoghq.com/api/v2/saml_configurations/idp_metadatahttps://api.us5.datadoghq.com/api/v2/saml_configurations/idp_metadata ### Overview Upload or replace the IdP metadata used for your SAML login configuration. This endpoint requires the `org_management` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Field Type Description idp_file binary The IdP metadata XML file ``` { "idp_file": "string" } ``` ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPMetadata-200-v2) * [400](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPMetadata-400-v2) * [403](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPMetadata-403-v2) * [429](https://docs.datadoghq.com/api/latest/organizations/#UploadIdPMetadata-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Upload IdP metadata ``` # Curl command (use the endpoint for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/saml_configurations/idp_metadata" \ -H "Accept: application/json" \ -H "Content-Type: multipart/form-data" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -F idp_file=@./idp_metadata.xml ``` ##### Upload IdP metadata ``` """ Upload IdP metadata returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) api_instance.upload_idp_metadata( idp_file=open("fixtures/organizations/saml_configurations/valid_idp_metadata.xml", "rb"), ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Upload IdP metadata ``` # Upload IdP metadata returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrganizationsAPI.new opts = { idp_file: File.open("fixtures/organizations/saml_configurations/valid_idp_metadata.xml", "r"), } p api_instance.upload_idp_metadata(opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response package main import ( "context" "fmt" "io" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrganizationsApi(apiClient) r, err := api.UploadIdPMetadata(ctx, *datadogV2.NewUploadIdPMetadataOptionalParameters().WithIdpFile(func() io.Reader { fp, _ := os.Open("fixtures/organizations/saml_configurations/valid_idp_metadata.xml") return fp }())) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.UploadIdPMetadata`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrganizationsApi; import com.datadog.api.client.v2.api.OrganizationsApi.UploadIdPMetadataOptionalParameters; import java.io.File; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { apiInstance.uploadIdPMetadata( new UploadIdPMetadataOptionalParameters() .idpFile( new File("fixtures/organizations/saml_configurations/valid_idp_metadata.xml"))); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#uploadIdPMetadata"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Upload IdP metadata ``` // Upload IdP metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_organizations::OrganizationsAPI; use datadog_api_client::datadogV2::api_organizations::UploadIdPMetadataOptionalParams; use std::fs; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api .upload_idp_metadata(UploadIdPMetadataOptionalParams::default().idp_file( fs::read("fixtures/organizations/saml_configurations/valid_idp_metadata.xml").unwrap(), )) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Upload IdP metadata ``` /** * Upload IdP metadata returns "OK" response */ import * as fs from "fs"; import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrganizationsApi(configuration); const params: v2.OrganizationsApiUploadIdPMetadataRequest = { idpFile: { data: Buffer.from( fs.readFileSync( "fixtures/organizations/saml_configurations/valid_idp_metadata.xml", "utf8" ) ), name: "fixtures/organizations/saml_configurations/valid_idp_metadata.xml", }, }; apiInstance .uploadIdPMetadata(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Spin-off Child Organization](https://docs.datadoghq.com/api/latest/organizations/#spin-off-child-organization) * [v1 (latest)](https://docs.datadoghq.com/api/latest/organizations/#spin-off-child-organization-v1) POST https://api.ap1.datadoghq.com/api/v1/org/{public_id}/downgradehttps://api.ap2.datadoghq.com/api/v1/org/{public_id}/downgradehttps://api.datadoghq.eu/api/v1/org/{public_id}/downgradehttps://api.ddog-gov.com/api/v1/org/{public_id}/downgradehttps://api.datadoghq.com/api/v1/org/{public_id}/downgradehttps://api.us3.datadoghq.com/api/v1/org/{public_id}/downgradehttps://api.us5.datadoghq.com/api/v1/org/{public_id}/downgrade ### Overview Only available for MSP customers. Removes a child organization from the hierarchy of the master organization and places the child organization on a 30-day trial. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The `public_id` of the organization you are operating within. ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#DowngradeOrg-200-v1) * [400](https://docs.datadoghq.com/api/latest/organizations/#DowngradeOrg-400-v1) * [403](https://docs.datadoghq.com/api/latest/organizations/#DowngradeOrg-403-v1) * [429](https://docs.datadoghq.com/api/latest/organizations/#DowngradeOrg-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Status of downgrade Expand All Field Type Description message string Information pertaining to the downgraded child organization. ``` { "message": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Spin-off Child Organization ``` # Path parameters export public_id="abc123" # Curl command (use the endpoint for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v1/org/${public_id}/downgrade" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Spin-off Child Organization ``` """ Spin-off Child Organization returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.downgrade_org( public_id="abc123", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Spin-off Child Organization ``` # Spin-off Child Organization returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::OrganizationsAPI.new p api_instance.downgrade_org("abc123") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Spin-off Child Organization ``` // Spin-off Child Organization returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewOrganizationsApi(apiClient) resp, r, err := api.DowngradeOrg(ctx, "abc123") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.DowngradeOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.DowngradeOrg`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ```
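# Set DD_SITE to the Datadog site you use (for example, datadoghq.com or datadoghq.eu)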
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Spin-off Child Organization ``` // Spin-off Child Organization returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.OrganizationsApi; import com.datadog.api.client.v1.model.OrgDowngradedResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { OrgDowngradedResponse result = apiInstance.downgradeOrg("abc123"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#downgradeOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Spin-off Child Organization ``` // Spin-off Child Organization returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_organizations::OrganizationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.downgrade_org("abc123".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Spin-off Child Organization ``` /** * Spin-off Child Organization returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.OrganizationsApi(configuration); const params: v1.OrganizationsApiDowngradeOrgRequest = { publicId: "abc123", }; apiInstance .downgradeOrg(params) .then((data: v1.OrgDowngradedResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List Org Configs](https://docs.datadoghq.com/api/latest/organizations/#list-org-configs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/organizations/#list-org-configs-v2) GET https://api.ap1.datadoghq.com/api/v2/org_configshttps://api.ap2.datadoghq.com/api/v2/org_configshttps://api.datadoghq.eu/api/v2/org_configshttps://api.ddog-gov.com/api/v2/org_configshttps://api.datadoghq.com/api/v2/org_configshttps://api.us3.datadoghq.com/api/v2/org_configshttps://api.us5.datadoghq.com/api/v2/org_configs ### Overview Returns all Org Configs (name, description, and value). ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#ListOrgConfigs-200-v2) * [400](https://docs.datadoghq.com/api/latest/organizations/#ListOrgConfigs-400-v2) * [401](https://docs.datadoghq.com/api/latest/organizations/#ListOrgConfigs-401-v2) * [403](https://docs.datadoghq.com/api/latest/organizations/#ListOrgConfigs-403-v2) * [429](https://docs.datadoghq.com/api/latest/organizations/#ListOrgConfigs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) A response with multiple Org Configs. Field Type Description data [_required_] [object] An array of Org Configs. attributes [_required_] object Readable attributes of an Org Config. description [_required_] string The description of an Org Config. modified_at date-time The timestamp of the last Org Config update (if any). name [_required_] string The machine-friendly name of an Org Config. value [_required_] The value of an Org Config. value_type [_required_] string The type of an Org Config value. id [_required_] string A unique identifier for an Org Config. type [_required_] enum Data type of an Org Config. Allowed enum values: `org_configs` ``` { "data": [ { "attributes": { "description": "Frobulate the turbo encabulator manifold", "modified_at": "2019-09-19T10:00:00.000Z", "name": "monitor_timezone", "value": "undefined", "value_type": "bool" }, "id": "abcd1234", "type": "org_configs" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### List Org Configs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/org_configs" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Org Configs ``` """ List Org Configs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.list_org_configs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Org Configs ``` # List Org Configs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrganizationsAPI.new p api_instance.list_org_configs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Org Configs ``` // List Org Configs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrganizationsApi(apiClient) resp, r, err := api.ListOrgConfigs(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.ListOrgConfigs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.ListOrgConfigs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Org Configs ``` // List Org Configs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrganizationsApi; import com.datadog.api.client.v2.model.OrgConfigListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { OrgConfigListResponse result = apiInstance.listOrgConfigs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#listOrgConfigs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Org Configs ``` // List Org Configs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_organizations::OrganizationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.list_org_configs().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Org Configs ``` /** * List Org Configs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrganizationsApi(configuration); apiInstance .listOrgConfigs() .then((data: v2.OrgConfigListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a specific Org Config value](https://docs.datadoghq.com/api/latest/organizations/#get-a-specific-org-config-value) * [v2 (latest)](https://docs.datadoghq.com/api/latest/organizations/#get-a-specific-org-config-value-v2) GET https://api.ap1.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.ap2.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.datadoghq.eu/api/v2/org_configs/{org_config_name}https://api.ddog-gov.com/api/v2/org_configs/{org_config_name}https://api.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.us3.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.us5.datadoghq.com/api/v2/org_configs/{org_config_name} ### Overview Return the name, description, and value of a specific Org Config. ### Arguments #### Path Parameters Name Type Description org_config_name [_required_] string The name of an Org Config. ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-200-v2) * [400](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-400-v2) * [401](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-401-v2) * [403](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-403-v2) * [404](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/organizations/#GetOrgConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) A response with a single Org Config. Field Type Description data [_required_] object A single Org Config. attributes [_required_] object Readable attributes of an Org Config. description [_required_] string The description of an Org Config. modified_at date-time The timestamp of the last Org Config update (if any). name [_required_] string The machine-friendly name of an Org Config. value [_required_] The value of an Org Config. value_type [_required_] string The type of an Org Config value. id [_required_] string A unique identifier for an Org Config. type [_required_] enum Data type of an Org Config. Allowed enum values: `org_configs` ``` { "data": { "attributes": { "description": "Frobulate the turbo encabulator manifold", "modified_at": "2019-09-19T10:00:00.000Z", "name": "monitor_timezone", "value": "undefined", "value_type": "bool" }, "id": "abcd1234", "type": "org_configs" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Get a specific Org Config value ``` # Path parameters export org_config_name="monitor_timezone" # Curl command (use the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/org_configs/${org_config_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a specific Org Config value ``` """ Get a specific Org Config value returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.organizations_api import OrganizationsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.get_org_config( org_config_name="custom_roles", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a specific Org Config value ``` # Get a specific Org Config value returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrganizationsAPI.new p api_instance.get_org_config("custom_roles") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a specific Org Config value ``` // Get a specific Org Config value returns "OK" response package main import ( "context" "encoding/json" "fmt" "os"
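// Datadog API client packages: shared configuration plus the v2 Organizations API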
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewOrganizationsApi(apiClient) resp, r, err := api.GetOrgConfig(ctx, "custom_roles") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.GetOrgConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.GetOrgConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a specific Org Config value ``` // Get a specific Org Config value returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrganizationsApi; import com.datadog.api.client.v2.model.OrgConfigGetResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); try { OrgConfigGetResponse result = apiInstance.getOrgConfig("custom_roles"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#getOrgConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a specific Org Config value ``` // Get a specific Org Config value returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_organizations::OrganizationsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api.get_org_config("custom_roles".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a specific Org Config value ``` /** * Get a specific Org Config value returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrganizationsApi(configuration); const params: v2.OrganizationsApiGetOrgConfigRequest = { 
orgConfigName: "custom_roles", }; apiInstance .getOrgConfig(params) .then((data: v2.OrgConfigGetResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a specific Org Config](https://docs.datadoghq.com/api/latest/organizations/#update-a-specific-org-config) * [v2 (latest)](https://docs.datadoghq.com/api/latest/organizations/#update-a-specific-org-config-v2) PATCH https://api.ap1.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.ap2.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.datadoghq.eu/api/v2/org_configs/{org_config_name}https://api.ddog-gov.com/api/v2/org_configs/{org_config_name}https://api.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.us3.datadoghq.com/api/v2/org_configs/{org_config_name}https://api.us5.datadoghq.com/api/v2/org_configs/{org_config_name} ### Overview Update the value of a specific Org Config. This endpoint requires the `org_management` permission. ### Arguments #### Path Parameters Name Type Description org_config_name [_required_] string The name of an Org Config. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) Field Type Description data [_required_] object An Org Config write operation. attributes [_required_] object Writable attributes of an Org Config. value [_required_] The value of an Org Config. type [_required_] enum Data type of an Org Config. Allowed enum values: `org_configs` ``` { "data": { "attributes": { "value": "UTC" }, "type": "org_configs" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-200-v2) * [400](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-400-v2) * [401](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-401-v2) * [403](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-403-v2) * [404](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-404-v2) * [429](https://docs.datadoghq.com/api/latest/organizations/#UpdateOrgConfig-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) A response with a single Org Config. Field Type Description data [_required_] object A single Org Config. attributes [_required_] object Readable attributes of an Org Config. description [_required_] string The description of an Org Config. modified_at date-time The timestamp of the last Org Config update (if any). name [_required_] string The machine-friendly name of an Org Config. value [_required_] The value of an Org Config. value_type [_required_] string The type of an Org Config value. id [_required_] string A unique identifier for an Org Config. type [_required_] enum Data type of an Org Config. 
Allowed enum values: `org_configs` ``` { "data": { "attributes": { "description": "Frobulate the turbo encabulator manifold", "modified_at": "2019-09-19T10:00:00.000Z", "name": "monitor_timezone", "value": "undefined", "value_type": "bool" }, "id": "abcd1234", "type": "org_configs" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/organizations/) * [Example](https://docs.datadoghq.com/api/latest/organizations/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/organizations/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/organizations/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/organizations/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/organizations/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/organizations/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/organizations/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/organizations/?code-lang=typescript) ##### Update a specific Org Config returns "OK" response Copy ``` # Path parameters export org_config_name="monitor_timezone" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/org_configs/${org_config_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "value": "UTC" }, "type": "org_configs" } } EOF ``` ##### Update a specific Org Config returns "OK" response ``` // Update a specific Org Config returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.OrgConfigWriteRequest{ Data: datadogV2.OrgConfigWrite{ Attributes: datadogV2.OrgConfigWriteAttributes{ Value: "UTC", }, Type: datadogV2.ORGCONFIGTYPE_ORG_CONFIGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
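// build the API client from the default configuration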
datadog.NewAPIClient(configuration) api := datadogV2.NewOrganizationsApi(apiClient) resp, r, err := api.UpdateOrgConfig(ctx, "monitor_timezone", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `OrganizationsApi.UpdateOrgConfig`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `OrganizationsApi.UpdateOrgConfig`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a specific Org Config returns "OK" response ``` // Update a specific Org Config returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.OrganizationsApi; import com.datadog.api.client.v2.model.OrgConfigGetResponse; import com.datadog.api.client.v2.model.OrgConfigType; import com.datadog.api.client.v2.model.OrgConfigWrite; import com.datadog.api.client.v2.model.OrgConfigWriteAttributes; import com.datadog.api.client.v2.model.OrgConfigWriteRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); OrganizationsApi apiInstance = new OrganizationsApi(defaultClient); OrgConfigWriteRequest body = new OrgConfigWriteRequest() .data( new OrgConfigWrite() .attributes(new OrgConfigWriteAttributes().value("UTC")) .type(OrgConfigType.ORG_CONFIGS)); try { OrgConfigGetResponse result = apiInstance.updateOrgConfig("monitor_timezone", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling OrganizationsApi#updateOrgConfig"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a specific Org Config returns "OK" response ``` """ Update a specific Org Config returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.organizations_api import OrganizationsApi from datadog_api_client.v2.model.org_config_type import OrgConfigType from datadog_api_client.v2.model.org_config_write import OrgConfigWrite from datadog_api_client.v2.model.org_config_write_attributes import OrgConfigWriteAttributes from datadog_api_client.v2.model.org_config_write_request import OrgConfigWriteRequest body = OrgConfigWriteRequest( data=OrgConfigWrite( attributes=OrgConfigWriteAttributes( value="UTC", ), type=OrgConfigType.ORG_CONFIGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = OrganizationsApi(api_client) response = api_instance.update_org_config(org_config_name="monitor_timezone", body=body) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a specific Org Config returns "OK" response ``` # Update a specific Org Config returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::OrganizationsAPI.new body = DatadogAPIClient::V2::OrgConfigWriteRequest.new({ data: DatadogAPIClient::V2::OrgConfigWrite.new({ attributes: DatadogAPIClient::V2::OrgConfigWriteAttributes.new({ value: "UTC", }), type: DatadogAPIClient::V2::OrgConfigType::ORG_CONFIGS, }), }) p api_instance.update_org_config("monitor_timezone", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a specific Org Config returns "OK" response ``` // Update a specific Org Config returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_organizations::OrganizationsAPI; use datadog_api_client::datadogV2::model::OrgConfigType; use datadog_api_client::datadogV2::model::OrgConfigWrite; use datadog_api_client::datadogV2::model::OrgConfigWriteAttributes; use datadog_api_client::datadogV2::model::OrgConfigWriteRequest; use serde_json::Value; #[tokio::main] async fn main() { let body = OrgConfigWriteRequest::new(OrgConfigWrite::new( OrgConfigWriteAttributes::new(Value::from("UTC")), OrgConfigType::ORG_CONFIGS, )); let configuration = datadog::Configuration::new(); let api = OrganizationsAPI::with_config(configuration); let resp = api .update_org_config("monitor_timezone".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a specific Org Config returns "OK" response ``` /** * Update a specific Org Config returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.OrganizationsApi(configuration); const params: v2.OrganizationsApiUpdateOrgConfigRequest = { body: { data: { attributes: { value: "UTC", }, type: "org_configs", }, }, orgConfigName: "monitor_timezone", }; apiInstance .updateOrgConfig(params) .then((data: v2.OrgConfigGetResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/pagerduty-integration/ # PagerDuty Integration Configure your [Datadog-PagerDuty integration](https://docs.datadoghq.com/integrations/pagerduty/) directly through the Datadog API. ## [Create a new service object](https://docs.datadoghq.com/api/latest/pagerduty-integration/#create-a-new-service-object) * [v1 (latest)](https://docs.datadoghq.com/api/latest/pagerduty-integration/#create-a-new-service-object-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/pagerduty/configuration/serviceshttps://api.ap2.datadoghq.com/api/v1/integration/pagerduty/configuration/serviceshttps://api.datadoghq.eu/api/v1/integration/pagerduty/configuration/serviceshttps://api.ddog-gov.com/api/v1/integration/pagerduty/configuration/serviceshttps://api.datadoghq.com/api/v1/integration/pagerduty/configuration/serviceshttps://api.us3.datadoghq.com/api/v1/integration/pagerduty/configuration/serviceshttps://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services ### Overview Create a new service object in the PagerDuty integration. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Create a new service object request body. * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Field Type Description service_key [_required_] string Your service key in PagerDuty. service_name [_required_] string Your service name associated with a service key in PagerDuty.
``` { "service_key": "", "service_name": "" } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/pagerduty-integration/#CreatePagerDutyIntegrationService-201-v1) * [400](https://docs.datadoghq.com/api/latest/pagerduty-integration/#CreatePagerDutyIntegrationService-400-v1) * [403](https://docs.datadoghq.com/api/latest/pagerduty-integration/#CreatePagerDutyIntegrationService-403-v1) * [429](https://docs.datadoghq.com/api/latest/pagerduty-integration/#CreatePagerDutyIntegrationService-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) PagerDuty service object name. Expand All Field Type Description service_name [_required_] string Your service name associated service key in PagerDuty. ``` { "service_name": "" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=typescript) ##### Create a new service object Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "service_key": "", "service_name": "" } EOF ``` ##### Create a new service object ``` """ Create a new service object returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.pager_duty_integration_api import PagerDutyIntegrationApi from datadog_api_client.v1.model.pager_duty_service import PagerDutyService body = PagerDutyService( service_key="", service_name="", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PagerDutyIntegrationApi(api_client) response = api_instance.create_pager_duty_integration_service(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new service object ``` # Create a new service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::PagerDutyIntegrationAPI.new body = DatadogAPIClient::V1::PagerDutyService.new({ service_key: "", service_name: "", }) p api_instance.create_pager_duty_integration_service(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new service object ``` // Create a new service object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.PagerDutyService{ ServiceKey: "", ServiceName: "", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewPagerDutyIntegrationApi(apiClient) resp, r, err := api.CreatePagerDutyIntegrationService(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when 
calling `PagerDutyIntegrationApi.CreatePagerDutyIntegrationService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PagerDutyIntegrationApi.CreatePagerDutyIntegrationService`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new service object ``` // Create a new service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.PagerDutyIntegrationApi; import com.datadog.api.client.v1.model.PagerDutyService; import com.datadog.api.client.v1.model.PagerDutyServiceName; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PagerDutyIntegrationApi apiInstance = new PagerDutyIntegrationApi(defaultClient); PagerDutyService body = new PagerDutyService().serviceKey("").serviceName(""); try { PagerDutyServiceName result = apiInstance.createPagerDutyIntegrationService(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling PagerDutyIntegrationApi#createPagerDutyIntegrationService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new service object ``` // Create a new service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_pager_duty_integration::PagerDutyIntegrationAPI; use datadog_api_client::datadogV1::model::PagerDutyService; #[tokio::main] async fn main() { let body = PagerDutyService::new("".to_string(), "".to_string()); let configuration = datadog::Configuration::new(); let api = PagerDutyIntegrationAPI::with_config(configuration); let resp = api.create_pager_duty_integration_service(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new service object ``` /** * Create a new service object returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.PagerDutyIntegrationApi(configuration); const params: v1.PagerDutyIntegrationApiCreatePagerDutyIntegrationServiceRequest = { body: { serviceKey: "", 
serviceName: "", }, }; apiInstance .createPagerDutyIntegrationService(params) .then((data: v1.PagerDutyServiceName) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a single service object](https://docs.datadoghq.com/api/latest/pagerduty-integration/#get-a-single-service-object) * [v1 (latest)](https://docs.datadoghq.com/api/latest/pagerduty-integration/#get-a-single-service-object-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ap2.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.eu/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ddog-gov.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us3.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name} ### Overview Get service name in the Datadog-PagerDuty integration. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description service_name [_required_] string The service name. ### Response * [200](https://docs.datadoghq.com/api/latest/pagerduty-integration/#GetPagerDutyIntegrationService-200-v1) * [403](https://docs.datadoghq.com/api/latest/pagerduty-integration/#GetPagerDutyIntegrationService-403-v1) * [404](https://docs.datadoghq.com/api/latest/pagerduty-integration/#GetPagerDutyIntegrationService-404-v1) * [429](https://docs.datadoghq.com/api/latest/pagerduty-integration/#GetPagerDutyIntegrationService-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) PagerDuty service object name. Expand All Field Type Description service_name [_required_] string Your service name associated service key in PagerDuty. ``` { "service_name": "" } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=typescript) ##### Get a single service object Copy ``` # Path parameters export service_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/${service_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a single service object ``` """ Get a single service object returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.pager_duty_integration_api import PagerDutyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PagerDutyIntegrationApi(api_client) response = api_instance.get_pager_duty_integration_service( service_name="service_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a single service object ``` # Get a single service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::PagerDutyIntegrationAPI.new p api_instance.get_pager_duty_integration_service("service_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewPagerDutyIntegrationApi(apiClient) resp, r, err := api.GetPagerDutyIntegrationService(ctx, "service_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PagerDutyIntegrationApi.GetPagerDutyIntegrationService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PagerDutyIntegrationApi.GetPagerDutyIntegrationService`:\n%s\n", 
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.PagerDutyIntegrationApi; import com.datadog.api.client.v1.model.PagerDutyServiceName; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PagerDutyIntegrationApi apiInstance = new PagerDutyIntegrationApi(defaultClient); try { PagerDutyServiceName result = apiInstance.getPagerDutyIntegrationService("service_name"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling PagerDutyIntegrationApi#getPagerDutyIntegrationService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a single service object ``` // Get a single service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_pager_duty_integration::PagerDutyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = PagerDutyIntegrationAPI::with_config(configuration); let resp = api .get_pager_duty_integration_service("service_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a single service object ``` /** * Get a single service object returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.PagerDutyIntegrationApi(configuration); const params: v1.PagerDutyIntegrationApiGetPagerDutyIntegrationServiceRequest = { serviceName: "service_name", }; apiInstance .getPagerDutyIntegrationService(params) .then((data: v1.PagerDutyServiceName) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a single service object](https://docs.datadoghq.com/api/latest/pagerduty-integration/#update-a-single-service-object) * [v1 (latest)](https://docs.datadoghq.com/api/latest/pagerduty-integration/#update-a-single-service-object-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ap2.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.eu/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ddog-gov.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us3.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name} ### Overview Update a single service object in the Datadog-PagerDuty integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description service_name [_required_] string The service name ### Request #### Body Data (required) Update an existing service object request body. * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Expand All Field Type Description service_key [_required_] string Your service key in PagerDuty. ``` { "service_key": "" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/pagerduty-integration/#UpdatePagerDutyIntegrationService-200-v1) * [400](https://docs.datadoghq.com/api/latest/pagerduty-integration/#UpdatePagerDutyIntegrationService-400-v1) * [403](https://docs.datadoghq.com/api/latest/pagerduty-integration/#UpdatePagerDutyIntegrationService-403-v1) * [404](https://docs.datadoghq.com/api/latest/pagerduty-integration/#UpdatePagerDutyIntegrationService-404-v1) * [429](https://docs.datadoghq.com/api/latest/pagerduty-integration/#UpdatePagerDutyIntegrationService-429-v1) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=typescript) ##### Update a single service object Copy ``` # Path parameters export service_name="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/${service_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "service_key": "" } EOF ``` ##### Update a single service object ``` """ Update a single service object returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.pager_duty_integration_api import PagerDutyIntegrationApi from datadog_api_client.v1.model.pager_duty_service_key import PagerDutyServiceKey body = PagerDutyServiceKey( service_key="", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PagerDutyIntegrationApi(api_client) api_instance.update_pager_duty_integration_service(service_name="service_name", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a single service object ``` # Update a single service object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::PagerDutyIntegrationAPI.new body = DatadogAPIClient::V1::PagerDutyServiceKey.new({ service_key: "", }) p api_instance.update_pager_duty_integration_service("service_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a single service object ``` // Update a single service object returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := 
datadogV1.PagerDutyServiceKey{ ServiceKey: "", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewPagerDutyIntegrationApi(apiClient) r, err := api.UpdatePagerDutyIntegrationService(ctx, "service_name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PagerDutyIntegrationApi.UpdatePagerDutyIntegrationService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a single service object ``` // Update a single service object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.PagerDutyIntegrationApi; import com.datadog.api.client.v1.model.PagerDutyServiceKey; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PagerDutyIntegrationApi apiInstance = new PagerDutyIntegrationApi(defaultClient); PagerDutyServiceKey body = new PagerDutyServiceKey().serviceKey(""); try { apiInstance.updatePagerDutyIntegrationService("service_name", body); } catch (ApiException e) { System.err.println( "Exception when calling PagerDutyIntegrationApi#updatePagerDutyIntegrationService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a single service object ``` // Update a single service object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_pager_duty_integration::PagerDutyIntegrationAPI; use datadog_api_client::datadogV1::model::PagerDutyServiceKey; #[tokio::main] async fn main() { let body = PagerDutyServiceKey::new("".to_string()); let configuration = datadog::Configuration::new(); let api = PagerDutyIntegrationAPI::with_config(configuration); let resp = api .update_pager_duty_integration_service("service_name".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a single service object ``` /** * Update a single service object returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new 
v1.PagerDutyIntegrationApi(configuration); const params: v1.PagerDutyIntegrationApiUpdatePagerDutyIntegrationServiceRequest = { body: { serviceKey: "", }, serviceName: "service_name", }; apiInstance .updatePagerDutyIntegrationService(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a single service object](https://docs.datadoghq.com/api/latest/pagerduty-integration/#delete-a-single-service-object) * [v1 (latest)](https://docs.datadoghq.com/api/latest/pagerduty-integration/#delete-a-single-service-object-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ap2.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.eu/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.ddog-gov.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us3.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name}https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/{service_name} ### Overview Delete a single service object in the Datadog-PagerDuty integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description service_name [_required_] string The service name ### Response * [204](https://docs.datadoghq.com/api/latest/pagerduty-integration/#DeletePagerDutyIntegrationService-204-v1) * [403](https://docs.datadoghq.com/api/latest/pagerduty-integration/#DeletePagerDutyIntegrationService-403-v1) * [404](https://docs.datadoghq.com/api/latest/pagerduty-integration/#DeletePagerDutyIntegrationService-404-v1) * [429](https://docs.datadoghq.com/api/latest/pagerduty-integration/#DeletePagerDutyIntegrationService-429-v1) No Content Authentication error * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/pagerduty-integration/) * [Example](https://docs.datadoghq.com/api/latest/pagerduty-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/pagerduty-integration/?code-lang=typescript) ##### Delete a single service object Copy ``` # Path parameters export service_name="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/pagerduty/configuration/services/${service_name}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a single service object ``` """ Delete a single service object returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.pager_duty_integration_api import PagerDutyIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PagerDutyIntegrationApi(api_client) api_instance.delete_pager_duty_integration_service( service_name="service_name", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a single service object ``` # Delete a single service object returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::PagerDutyIntegrationAPI.new api_instance.delete_pager_duty_integration_service("service_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a single service object ``` // Delete a single service object returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewPagerDutyIntegrationApi(apiClient) r, err := api.DeletePagerDutyIntegrationService(ctx, "service_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PagerDutyIntegrationApi.DeletePagerDutyIntegrationService`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a single service object ``` // Delete a single service object returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.PagerDutyIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PagerDutyIntegrationApi apiInstance = new PagerDutyIntegrationApi(defaultClient); try { apiInstance.deletePagerDutyIntegrationService("service_name"); } catch (ApiException e) { System.err.println( "Exception when calling PagerDutyIntegrationApi#deletePagerDutyIntegrationService"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a single service object ``` // Delete a single service object returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_pager_duty_integration::PagerDutyIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = PagerDutyIntegrationAPI::with_config(configuration); let resp = api .delete_pager_duty_integration_service("service_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a single service object ``` /** * Delete a single service object returns "No Content" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.PagerDutyIntegrationApi(configuration); const params: v1.PagerDutyIntegrationApiDeletePagerDutyIntegrationServiceRequest = { serviceName: "service_name", }; apiInstance .deletePagerDutyIntegrationService(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=a9b1d464-2840-45fd-a6ec-38b7b41d636b&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=eab1f73b-1214-49a9-a651-0951a6e5cef6&pt=PagerDuty%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpagerduty-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=a9b1d464-2840-45fd-a6ec-38b7b41d636b&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=eab1f73b-1214-49a9-a651-0951a6e5cef6&pt=PagerDuty%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpagerduty-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=2963a30f-94ea-4d10-a5f8-435cf350e671&bo=2&sid=545458e0f0c011f0a1a8c3a4f135763b&vid=54545f80f0c011f0909d2faa02b2847c&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=PagerDuty%20Integration&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpagerduty-integration%2F&r=<=1518&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=145823) --- # Source: https://docs.datadoghq.com/api/latest/powerpack # Powerpack The Powerpack endpoints allow you to: * Get a Powerpack * Create a Powerpack * Delete a Powerpack * Get a list of all Powerpacks The Patch and Delete API methods can only be performed on a Powerpack by a user who has the powerpack create permission for that specific Powerpack. Read [Scale Graphing Expertise with Powerpacks](https://docs.datadoghq.com/dashboards/guide/powerpacks-best-practices/) for more information. ## [Get all powerpacks](https://docs.datadoghq.com/api/latest/powerpack/#get-all-powerpacks) * [v2 (latest)](https://docs.datadoghq.com/api/latest/powerpack/#get-all-powerpacks-v2) GET https://api.ap1.datadoghq.com/api/v2/powerpackshttps://api.ap2.datadoghq.com/api/v2/powerpackshttps://api.datadoghq.eu/api/v2/powerpackshttps://api.ddog-gov.com/api/v2/powerpackshttps://api.datadoghq.com/api/v2/powerpackshttps://api.us3.datadoghq.com/api/v2/powerpackshttps://api.us5.datadoghq.com/api/v2/powerpacks ### Overview Get a list of all powerpacks. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#powerpack) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[limit] integer Maximum number of powerpacks in the response. page[offset] integer Specific offset to use as the beginning of the returned page. 
### Response * [200](https://docs.datadoghq.com/api/latest/powerpack/#ListPowerpacks-200-v2) * [429](https://docs.datadoghq.com/api/latest/powerpack/#ListPowerpacks-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Response object which includes all powerpack configurations. Field Type Description data [object] List of powerpack definitions. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. available_values [string] The list of values that the template variable drop-down is limited to. defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. included [object] Array of objects related to the users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. 
service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` links object Links attributes. first string Link to last page. last string Link to first page. next string Link for the next set of results. prev string Link for the previous set of results. self string Link to current page. meta object Powerpack response metadata. pagination object Powerpack response pagination metadata. first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. 
``` { "data": [ { "attributes": { "description": "Powerpack for ABC", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "definition": { "content": "example", "type": "note" } }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 } } ] }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 }, "live_span": "5m" }, "name": "Sample Powerpack", "tags": [ "tag:foo1" ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "defaults": [ "*" ], "name": "datacenter", "prefix": "host" } ] }, "id": "string", "relationships": { "author": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "powerpack" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "links": { "first": "string", "last": "https://app.datadoghq.com/api/v2/powerpacks?page[offset]=0\u0026page[limit]=25", "next": "https://app.datadoghq.com/api/v2/powerpacks?page[offset]=25\u0026page[limit]=25", "prev": "string", "self": "https://app.datadoghq.com/api/v2/powerpacks" }, "meta": { "pagination": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=typescript) ##### Get all powerpacks Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/powerpacks" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all powerpacks ``` """ Get all powerpacks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.powerpack_api import PowerpackApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PowerpackApi(api_client) response = api_instance.list_powerpacks( page_limit=1000, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all powerpacks ``` # Get all powerpacks returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::PowerpackAPI.new opts = { page_limit: 1000, } p api_instance.list_powerpacks(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all powerpacks ``` // Get all powerpacks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewPowerpackApi(apiClient) resp, r, err := api.ListPowerpacks(ctx, *datadogV2.NewListPowerpacksOptionalParameters().WithPageLimit(1000)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PowerpackApi.ListPowerpacks`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PowerpackApi.ListPowerpacks`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" go run "main.go" ``` ##### Get all powerpacks ``` // Get all powerpacks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.PowerpackApi; import com.datadog.api.client.v2.api.PowerpackApi.ListPowerpacksOptionalParameters; import com.datadog.api.client.v2.model.ListPowerpacksResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PowerpackApi apiInstance = new PowerpackApi(defaultClient); try { ListPowerpacksResponse result = apiInstance.listPowerpacks(new ListPowerpacksOptionalParameters().pageLimit(1000L)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling PowerpackApi#listPowerpacks"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all powerpacks ``` // Get all powerpacks returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_powerpack::ListPowerpacksOptionalParams; use datadog_api_client::datadogV2::api_powerpack::PowerpackAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = PowerpackAPI::with_config(configuration); let resp = api .list_powerpacks(ListPowerpacksOptionalParams::default().page_limit(1000)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all powerpacks ``` /** * Get all powerpacks returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.PowerpackApi(configuration); const params: v2.PowerpackApiListPowerpacksRequest = { pageLimit: 1000, }; apiInstance .listPowerpacks(params) .then((data: v2.ListPowerpacksResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new powerpack](https://docs.datadoghq.com/api/latest/powerpack/#create-a-new-powerpack) * [v2 (latest)](https://docs.datadoghq.com/api/latest/powerpack/#create-a-new-powerpack-v2) POST https://api.ap1.datadoghq.com/api/v2/powerpackshttps://api.ap2.datadoghq.com/api/v2/powerpackshttps://api.datadoghq.eu/api/v2/powerpackshttps://api.ddog-gov.com/api/v2/powerpackshttps://api.datadoghq.com/api/v2/powerpackshttps://api.us3.datadoghq.com/api/v2/powerpackshttps://api.us5.datadoghq.com/api/v2/powerpacks ### Overview Create a powerpack. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#powerpack) to access this endpoint. ### Request #### Body Data (required) Create a powerpack request body. * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Field Type Description data object Powerpack data object. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. available_values [string] The list of values that the template variable drop-down is limited to. defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. 
name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. ``` { "data": { "attributes": { "description": "Sample powerpack", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "content": "test", "type": "note" } } ] }, "layout": { "height": 3, "width": 12, "x": 0, "y": 0 }, "live_span": "1h" }, "name": "Example-Powerpack", "tags": [ "tag:sample" ], "template_variables": [ { "defaults": [ "*" ], "name": "sample" } ] }, "type": "powerpack" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/powerpack/#CreatePowerpack-200-v2) * [400](https://docs.datadoghq.com/api/latest/powerpack/#CreatePowerpack-400-v2) * [429](https://docs.datadoghq.com/api/latest/powerpack/#CreatePowerpack-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Response object which includes a single powerpack configuration. Field Type Description data object Powerpack data object. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. available_values [string] The list of values that the template variable drop-down is limited to. 
defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. included [object] Array of objects related to the users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "description": "Powerpack for ABC", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "definition": { "content": "example", "type": "note" } }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 } } ] }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 }, "live_span": "5m" }, "name": "Sample Powerpack", "tags": [ "tag:foo1" ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "defaults": [ "*" ], "name": "datacenter", "prefix": "host" } ] }, "id": "string", "relationships": { "author": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "powerpack" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=typescript) ##### Create a new powerpack returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/powerpacks" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Sample powerpack", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "content": "test", "type": "note" } } ] }, "layout": { "height": 3, "width": 12, "x": 0, "y": 0 }, "live_span": "1h" }, "name": "Example-Powerpack", "tags": [ "tag:sample" ], "template_variables": [ { "defaults": [ "*" ], "name": "sample" } ] }, "type": "powerpack" } } EOF ``` ##### Create a new powerpack returns "OK" response ``` // Create a new powerpack returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.Powerpack{ Data: &datadogV2.PowerpackData{ Attributes: &datadogV2.PowerpackAttributes{ Description: datadog.PtrString("Sample powerpack"), GroupWidget: datadogV2.PowerpackGroupWidget{ Definition: datadogV2.PowerpackGroupWidgetDefinition{ LayoutType: "ordered", ShowTitle: datadog.PtrBool(true), Title: datadog.PtrString("Sample Powerpack"), Type: "group", Widgets: []datadogV2.PowerpackInnerWidgets{ { Definition: map[string]interface{}{ "content": "test", "type": "note", }, }, }, }, Layout: &datadogV2.PowerpackGroupWidgetLayout{ Height: 3, Width: 12, X: 0, Y: 0, }, LiveSpan: datadogV2.WIDGETLIVESPAN_PAST_ONE_HOUR.Ptr(), }, Name: "Example-Powerpack", Tags: []string{ "tag:sample", }, TemplateVariables: []datadogV2.PowerpackTemplateVariable{ { Defaults: []string{ "*", }, Name: "sample", }, }, }, Type: datadog.PtrString("powerpack"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewPowerpackApi(apiClient) resp, r, err := api.CreatePowerpack(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PowerpackApi.CreatePowerpack`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PowerpackApi.CreatePowerpack`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new powerpack returns "OK" response ``` // Create a new powerpack returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.PowerpackApi; import com.datadog.api.client.v2.model.Powerpack; import com.datadog.api.client.v2.model.PowerpackAttributes; import com.datadog.api.client.v2.model.PowerpackData; import com.datadog.api.client.v2.model.PowerpackGroupWidget; import com.datadog.api.client.v2.model.PowerpackGroupWidgetDefinition; import com.datadog.api.client.v2.model.PowerpackGroupWidgetLayout; import com.datadog.api.client.v2.model.PowerpackInnerWidgets; import com.datadog.api.client.v2.model.PowerpackResponse; import com.datadog.api.client.v2.model.PowerpackTemplateVariable; import com.datadog.api.client.v2.model.WidgetLiveSpan; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PowerpackApi apiInstance = new PowerpackApi(defaultClient); Powerpack body = new Powerpack() .data( new PowerpackData() .attributes( new PowerpackAttributes() .description("Sample powerpack") .groupWidget( new PowerpackGroupWidget() .definition( new PowerpackGroupWidgetDefinition() .layoutType("ordered") .showTitle(true) .title("Sample Powerpack") .type("group") .widgets( Collections.singletonList( new PowerpackInnerWidgets() .definition( Map.ofEntries( Map.entry("content", "test"), Map.entry("type", "note")))))) .layout( new PowerpackGroupWidgetLayout() .height(3L) .width(12L) .x(0L) .y(0L)) .liveSpan(WidgetLiveSpan.PAST_ONE_HOUR)) .name("Example-Powerpack") .tags(Collections.singletonList("tag:sample")) .templateVariables( Collections.singletonList( new PowerpackTemplateVariable() .defaults(Collections.singletonList("*")) .name("sample")))) .type("powerpack")); try { PowerpackResponse result = apiInstance.createPowerpack(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling PowerpackApi#createPowerpack"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new powerpack returns "OK" response ``` """ Create a new powerpack returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.powerpack_api import PowerpackApi from datadog_api_client.v2.model.powerpack import Powerpack from datadog_api_client.v2.model.powerpack_attributes import PowerpackAttributes from datadog_api_client.v2.model.powerpack_data import PowerpackData from datadog_api_client.v2.model.powerpack_group_widget import PowerpackGroupWidget from datadog_api_client.v2.model.powerpack_group_widget_definition import PowerpackGroupWidgetDefinition from datadog_api_client.v2.model.powerpack_group_widget_layout import 
PowerpackGroupWidgetLayout from datadog_api_client.v2.model.powerpack_inner_widgets import PowerpackInnerWidgets from datadog_api_client.v2.model.powerpack_template_variable import PowerpackTemplateVariable from datadog_api_client.v2.model.widget_live_span import WidgetLiveSpan body = Powerpack( data=PowerpackData( attributes=PowerpackAttributes( description="Sample powerpack", group_widget=PowerpackGroupWidget( definition=PowerpackGroupWidgetDefinition( layout_type="ordered", show_title=True, title="Sample Powerpack", type="group", widgets=[ PowerpackInnerWidgets( definition=dict([("content", "test"), ("type", "note")]), ), ], ), layout=PowerpackGroupWidgetLayout( height=3, width=12, x=0, y=0, ), live_span=WidgetLiveSpan.PAST_ONE_HOUR, ), name="Example-Powerpack", tags=[ "tag:sample", ], template_variables=[ PowerpackTemplateVariable( defaults=[ "*", ], name="sample", ), ], ), type="powerpack", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PowerpackApi(api_client) response = api_instance.create_powerpack(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new powerpack returns "OK" response ``` # Create a new powerpack returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::PowerpackAPI.new body = DatadogAPIClient::V2::Powerpack.new({ data: DatadogAPIClient::V2::PowerpackData.new({ attributes: DatadogAPIClient::V2::PowerpackAttributes.new({ description: "Sample powerpack", group_widget: DatadogAPIClient::V2::PowerpackGroupWidget.new({ definition: DatadogAPIClient::V2::PowerpackGroupWidgetDefinition.new({ layout_type: "ordered", show_title: true, title: "Sample Powerpack", type: "group", widgets: [ DatadogAPIClient::V2::PowerpackInnerWidgets.new({ definition: { "content": "test", "type": "note", }, }), ], }), layout: DatadogAPIClient::V2::PowerpackGroupWidgetLayout.new({ height: 3, width: 12, x: 0, y: 0, }), live_span: DatadogAPIClient::V2::WidgetLiveSpan::PAST_ONE_HOUR, }), name: "Example-Powerpack", tags: [ "tag:sample", ], template_variables: [ DatadogAPIClient::V2::PowerpackTemplateVariable.new({ defaults: [ "*", ], name: "sample", }), ], }), type: "powerpack", }), }) p api_instance.create_powerpack(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new powerpack returns "OK" response ``` // Create a new powerpack returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_powerpack::PowerpackAPI; use datadog_api_client::datadogV2::model::Powerpack; use datadog_api_client::datadogV2::model::PowerpackAttributes; use datadog_api_client::datadogV2::model::PowerpackData; use datadog_api_client::datadogV2::model::PowerpackGroupWidget; use datadog_api_client::datadogV2::model::PowerpackGroupWidgetDefinition; use datadog_api_client::datadogV2::model::PowerpackGroupWidgetLayout; use 
datadog_api_client::datadogV2::model::PowerpackInnerWidgets; use datadog_api_client::datadogV2::model::PowerpackTemplateVariable; use datadog_api_client::datadogV2::model::WidgetLiveSpan; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = Powerpack::new().data( PowerpackData::new() .attributes( PowerpackAttributes::new( PowerpackGroupWidget::new( PowerpackGroupWidgetDefinition::new( "ordered".to_string(), "group".to_string(), vec![PowerpackInnerWidgets::new(BTreeMap::from([ ("content".to_string(), Value::from("test")), ("type".to_string(), Value::from("note")), ]))], ) .show_title(true) .title("Sample Powerpack".to_string()), ) .layout(PowerpackGroupWidgetLayout::new(3, 12, 0, 0)) .live_span(WidgetLiveSpan::PAST_ONE_HOUR), "Example-Powerpack".to_string(), ) .description("Sample powerpack".to_string()) .tags(vec!["tag:sample".to_string()]) .template_variables(vec![PowerpackTemplateVariable::new("sample".to_string()) .defaults(vec!["*".to_string()])]), ) .type_("powerpack".to_string()), ); let configuration = datadog::Configuration::new(); let api = PowerpackAPI::with_config(configuration); let resp = api.create_powerpack(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new powerpack returns "OK" response ``` /** * Create a new powerpack returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.PowerpackApi(configuration); const params: v2.PowerpackApiCreatePowerpackRequest = { body: { data: { attributes: { description: "Sample powerpack", groupWidget: { definition: { layoutType: "ordered", showTitle: true, title: "Sample Powerpack", type: "group", widgets: [ { definition: { content: "test", type: "note", }, }, ], }, layout: { height: 3, width: 12, x: 0, y: 0, }, liveSpan: "1h", }, name: "Example-Powerpack", tags: ["tag:sample"], templateVariables: [ { defaults: ["*"], name: "sample", }, ], }, type: "powerpack", }, }, }; apiInstance .createPowerpack(params) .then((data: v2.PowerpackResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a powerpack](https://docs.datadoghq.com/api/latest/powerpack/#delete-a-powerpack) * [v2 (latest)](https://docs.datadoghq.com/api/latest/powerpack/#delete-a-powerpack-v2) DELETE https://api.ap1.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.ap2.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.eu/api/v2/powerpacks/{powerpack_id}https://api.ddog-gov.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us3.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us5.datadoghq.com/api/v2/powerpacks/{powerpack_id} ### Overview Delete a powerpack. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#powerpack) to access this endpoint. ### Arguments #### Path Parameters Name Type Description powerpack_id [_required_] string Powerpack id ### Response * [204](https://docs.datadoghq.com/api/latest/powerpack/#DeletePowerpack-204-v2) * [404](https://docs.datadoghq.com/api/latest/powerpack/#DeletePowerpack-404-v2) * [429](https://docs.datadoghq.com/api/latest/powerpack/#DeletePowerpack-429-v2) OK Powerpack Not Found * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=typescript) ##### Delete a powerpack Copy ``` # Path parameters export powerpack_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/powerpacks/${powerpack_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a powerpack ``` """ Delete a powerpack returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.powerpack_api import PowerpackApi # there is a valid "powerpack" in the system POWERPACK_DATA_ID = environ["POWERPACK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PowerpackApi(api_client) api_instance.delete_powerpack( powerpack_id=POWERPACK_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a powerpack ``` # Delete a powerpack returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::PowerpackAPI.new # there is a valid "powerpack" in the system POWERPACK_DATA_ID = ENV["POWERPACK_DATA_ID"] api_instance.delete_powerpack(POWERPACK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a powerpack ``` // Delete a powerpack returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "powerpack" in the system PowerpackDataID := os.Getenv("POWERPACK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewPowerpackApi(apiClient) r, err := api.DeletePowerpack(ctx, PowerpackDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PowerpackApi.DeletePowerpack`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a powerpack ``` // Delete a powerpack returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.PowerpackApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PowerpackApi apiInstance = new PowerpackApi(defaultClient); // there is a valid "powerpack" in the system String POWERPACK_DATA_ID = System.getenv("POWERPACK_DATA_ID"); try { apiInstance.deletePowerpack(POWERPACK_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling PowerpackApi#deletePowerpack"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a powerpack ``` // Delete a powerpack returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_powerpack::PowerpackAPI; #[tokio::main] async fn main() { // there is a valid "powerpack" in the system let powerpack_data_id = std::env::var("POWERPACK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = PowerpackAPI::with_config(configuration); let resp = api.delete_powerpack(powerpack_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a powerpack ``` /** * Delete a powerpack returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.PowerpackApi(configuration); // there is a valid "powerpack" in the system const POWERPACK_DATA_ID = process.env.POWERPACK_DATA_ID as string; const params: v2.PowerpackApiDeletePowerpackRequest = { powerpackId: POWERPACK_DATA_ID, }; apiInstance .deletePowerpack(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a Powerpack](https://docs.datadoghq.com/api/latest/powerpack/#get-a-powerpack) * [v2 (latest)](https://docs.datadoghq.com/api/latest/powerpack/#get-a-powerpack-v2) GET https://api.ap1.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.ap2.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.eu/api/v2/powerpacks/{powerpack_id}https://api.ddog-gov.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us3.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us5.datadoghq.com/api/v2/powerpacks/{powerpack_id} ### Overview Get a powerpack. This endpoint requires the `dashboards_read` permission. OAuth apps require the `dashboards_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#powerpack) to access this endpoint. ### Arguments #### Path Parameters Name Type Description powerpack_id [_required_] string ID of the powerpack. ### Response * [200](https://docs.datadoghq.com/api/latest/powerpack/#GetPowerpack-200-v2) * [404](https://docs.datadoghq.com/api/latest/powerpack/#GetPowerpack-404-v2) * [429](https://docs.datadoghq.com/api/latest/powerpack/#GetPowerpack-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Response object which includes a single powerpack configuration. Field Type Description data object Powerpack data object. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. 
Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. available_values [string] The list of values that the template variable drop-down is limited to. defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. included [object] Array of objects related to the users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "description": "Powerpack for ABC", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "definition": { "content": "example", "type": "note" } }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 } } ] }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 }, "live_span": "5m" }, "name": "Sample Powerpack", "tags": [ "tag:foo1" ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "defaults": [ "*" ], "name": "datacenter", "prefix": "host" } ] }, "id": "string", "relationships": { "author": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "powerpack" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Powerpack Not Found. * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
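The 200 payload above links a powerpack to its author only indirectly: `data.relationships.author.data.id` points at a user object carried in the top-level `included` array. The sketch below shows one way to join the two on the client side with the TypeScript client used in the code examples that follow; the `findAuthor` helper is illustrative and assumes the `PowerpackResponse` shape exposed by the generated models.

```
import { client, v2 } from "@datadog/datadog-api-client";

// Illustrative helper: look up the author's user record in `included`
// using the ID referenced by `relationships.author`.
function findAuthor(resp: v2.PowerpackResponse) {
  const authorId = resp.data?.relationships?.author?.data?.id;
  return resp.included?.find((user) => user.id === authorId);
}

const configuration = client.createConfiguration();
const apiInstance = new v2.PowerpackApi(configuration);

apiInstance
  .getPowerpack({ powerpackId: process.env.POWERPACK_DATA_ID as string })
  .then((resp: v2.PowerpackResponse) => {
    const author = findAuthor(resp);
    console.log("Powerpack author handle: " + author?.attributes?.handle);
  })
  .catch((error: any) => console.error(error));
```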
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=typescript) ##### Get a Powerpack Copy ``` # Path parameters export powerpack_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/powerpacks/${powerpack_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a Powerpack ``` """ Get a Powerpack returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.powerpack_api import PowerpackApi # there is a valid "powerpack" in the system POWERPACK_DATA_ID = environ["POWERPACK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PowerpackApi(api_client) response = api_instance.get_powerpack( powerpack_id=POWERPACK_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a Powerpack ``` # Get a Powerpack returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::PowerpackAPI.new # there is a valid "powerpack" in the system POWERPACK_DATA_ID = ENV["POWERPACK_DATA_ID"] p api_instance.get_powerpack(POWERPACK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a Powerpack ``` // Get a Powerpack returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "powerpack" in the system PowerpackDataID := os.Getenv("POWERPACK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewPowerpackApi(apiClient) resp, r, err := api.GetPowerpack(ctx, PowerpackDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PowerpackApi.GetPowerpack`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PowerpackApi.GetPowerpack`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library 
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a Powerpack ``` // Get a Powerpack returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.PowerpackApi; import com.datadog.api.client.v2.model.PowerpackResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PowerpackApi apiInstance = new PowerpackApi(defaultClient); // there is a valid "powerpack" in the system String POWERPACK_DATA_ID = System.getenv("POWERPACK_DATA_ID"); try { PowerpackResponse result = apiInstance.getPowerpack(POWERPACK_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling PowerpackApi#getPowerpack"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Powerpack ``` // Get a Powerpack returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_powerpack::PowerpackAPI; #[tokio::main] async fn main() { // there is a valid "powerpack" in the system let powerpack_data_id = std::env::var("POWERPACK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = PowerpackAPI::with_config(configuration); let resp = api.get_powerpack(powerpack_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a Powerpack ``` /** * Get a Powerpack returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.PowerpackApi(configuration); // there is a valid "powerpack" in the system const POWERPACK_DATA_ID = process.env.POWERPACK_DATA_ID as string; const params: v2.PowerpackApiGetPowerpackRequest = { powerpackId: POWERPACK_DATA_ID, }; apiInstance .getPowerpack(params) .then((data: v2.PowerpackResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a powerpack](https://docs.datadoghq.com/api/latest/powerpack/#update-a-powerpack) * [v2 (latest)](https://docs.datadoghq.com/api/latest/powerpack/#update-a-powerpack-v2) PATCH https://api.ap1.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.ap2.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.eu/api/v2/powerpacks/{powerpack_id}https://api.ddog-gov.com/api/v2/powerpacks/{powerpack_id}https://api.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us3.datadoghq.com/api/v2/powerpacks/{powerpack_id}https://api.us5.datadoghq.com/api/v2/powerpacks/{powerpack_id} ### Overview Update a powerpack. This endpoint requires the `dashboards_write` permission. OAuth apps require the `dashboards_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#powerpack) to access this endpoint. ### Arguments #### Path Parameters Name Type Description powerpack_id [_required_] string ID of the powerpack. ### Request #### Body Data (required) Update a powerpack request body. * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Field Type Description data object Powerpack data object. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. 
available_values [string] The list of values that the template variable drop-down is limited to. defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. ``` { "data": { "attributes": { "description": "Sample powerpack", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "content": "test", "type": "note" } } ] }, "layout": { "height": 3, "width": 12, "x": 0, "y": 0 }, "live_span": "1h" }, "name": "Example-Powerpack", "tags": [ "tag:sample" ], "template_variables": [ { "defaults": [ "*" ], "name": "sample" } ] }, "type": "powerpack" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/powerpack/#UpdatePowerpack-200-v2) * [400](https://docs.datadoghq.com/api/latest/powerpack/#UpdatePowerpack-400-v2) * [404](https://docs.datadoghq.com/api/latest/powerpack/#UpdatePowerpack-404-v2) * [429](https://docs.datadoghq.com/api/latest/powerpack/#UpdatePowerpack-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) Response object which includes a single powerpack configuration. Field Type Description data object Powerpack data object. attributes object Powerpack attribute object. description string Description of this powerpack. group_widget [_required_] object Powerpack group widget definition object. definition [_required_] object Powerpack group widget object. layout_type [_required_] string Layout type of widgets. show_title boolean Boolean indicating whether powerpack group title should be visible or not. title string Name for the group widget. type [_required_] string Type of widget, must be group. widgets [_required_] [object] Widgets inside the powerpack. definition [_required_] object Information about widget. layout object Powerpack inner widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. layout object Powerpack group widget layout. height [_required_] int64 The height of the widget. Should be a non-negative integer. width [_required_] int64 The width of the widget. Should be a non-negative integer. x [_required_] int64 The position of the widget on the x (horizontal) axis. Should be a non-negative integer. y [_required_] int64 The position of the widget on the y (vertical) axis. Should be a non-negative integer. live_span enum The available timeframes depend on the widget you are using. Allowed enum values: `1m,5m,10m,15m,30m,1h,4h,1d,2d,1w,1mo,3mo,6mo,1y,alert` name [_required_] string Name of the powerpack. 
tags [string] List of tags to identify this powerpack. template_variables [object] List of template variables for this powerpack. available_values [string] The list of values that the template variable drop-down is limited to. defaults [string] One or many template variable default values within the saved view, which are unioned together using `OR` if more than one is specified. name [_required_] string The name of the variable. prefix string The tag prefix associated with the variable. Only tags with this prefix appear in the variable drop-down. id string ID of the powerpack. relationships object Powerpack relationship object. author object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type string Type of widget, must be powerpack. included [object] Array of objects related to the users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "attributes": { "description": "Powerpack for ABC", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "definition": { "content": "example", "type": "note" } }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 } } ] }, "layout": { "height": 0, "width": 0, "x": 0, "y": 0 }, "live_span": "5m" }, "name": "Sample Powerpack", "tags": [ "tag:foo1" ], "template_variables": [ { "available_values": [ "my-host", "host1", "host2" ], "defaults": [ "*" ], "name": "datacenter", "prefix": "host" } ] }, "id": "string", "relationships": { "author": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "powerpack" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Powerpack Not Found * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/powerpack/) * [Example](https://docs.datadoghq.com/api/latest/powerpack/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
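The update request body uses the same schema as the create request, so a common way to change a single field is read-modify-write: fetch the current configuration, adjust what you need, and send the full object back. The sketch below uses the TypeScript client from the code examples that follow; it assumes the response `data` can be reused as the request `data` (both appear as `PowerpackData` in the generated models), which is an assumption rather than documented behavior.

```
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.PowerpackApi(configuration);
const powerpackId = process.env.POWERPACK_DATA_ID as string;

// Illustrative read-modify-write: fetch the powerpack, change one attribute,
// then PATCH the full definition back.
async function renamePowerpack(newName: string) {
  const current = await apiInstance.getPowerpack({ powerpackId });
  const data = current.data;
  if (!data?.attributes) throw new Error("unexpected response shape");

  data.attributes.name = newName;
  return apiInstance.updatePowerpack({ powerpackId, body: { data } });
}

renamePowerpack("Example-Powerpack-renamed")
  .then((resp: v2.PowerpackResponse) => console.log(JSON.stringify(resp)))
  .catch((error: any) => console.error(error));
```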
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/powerpack/?code-lang=typescript) ##### Update a powerpack returns "OK" response Copy ``` # Path parameters export powerpack_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/powerpacks/${powerpack_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Sample powerpack", "group_widget": { "definition": { "layout_type": "ordered", "show_title": true, "title": "Sample Powerpack", "type": "group", "widgets": [ { "definition": { "content": "test", "type": "note" } } ] }, "layout": { "height": 3, "width": 12, "x": 0, "y": 0 }, "live_span": "1h" }, "name": "Example-Powerpack", "tags": [ "tag:sample" ], "template_variables": [ { "defaults": [ "*" ], "name": "sample" } ] }, "type": "powerpack" } } EOF ``` ##### Update a powerpack returns "OK" response ``` // Update a powerpack returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "powerpack" in the system PowerpackDataID := os.Getenv("POWERPACK_DATA_ID") body := datadogV2.Powerpack{ Data: &datadogV2.PowerpackData{ Attributes: &datadogV2.PowerpackAttributes{ Description: datadog.PtrString("Sample powerpack"), GroupWidget: datadogV2.PowerpackGroupWidget{ Definition: datadogV2.PowerpackGroupWidgetDefinition{ LayoutType: "ordered", ShowTitle: datadog.PtrBool(true), Title: datadog.PtrString("Sample Powerpack"), Type: "group", Widgets: []datadogV2.PowerpackInnerWidgets{ { Definition: map[string]interface{}{ "content": "test", "type": "note", }, }, }, }, Layout: &datadogV2.PowerpackGroupWidgetLayout{ Height: 3, Width: 12, X: 0, Y: 0, }, LiveSpan: datadogV2.WIDGETLIVESPAN_PAST_ONE_HOUR.Ptr(), }, Name: "Example-Powerpack", Tags: []string{ "tag:sample", }, TemplateVariables: []datadogV2.PowerpackTemplateVariable{ { Defaults: []string{ "*", }, Name: "sample", }, }, }, Type: datadog.PtrString("powerpack"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewPowerpackApi(apiClient) resp, r, err := api.UpdatePowerpack(ctx, PowerpackDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `PowerpackApi.UpdatePowerpack`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `PowerpackApi.UpdatePowerpack`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a powerpack returns "OK" response ``` // Update a powerpack returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.PowerpackApi; import com.datadog.api.client.v2.model.Powerpack; import com.datadog.api.client.v2.model.PowerpackAttributes; import com.datadog.api.client.v2.model.PowerpackData; import com.datadog.api.client.v2.model.PowerpackGroupWidget; import com.datadog.api.client.v2.model.PowerpackGroupWidgetDefinition; import com.datadog.api.client.v2.model.PowerpackGroupWidgetLayout; import com.datadog.api.client.v2.model.PowerpackInnerWidgets; import com.datadog.api.client.v2.model.PowerpackResponse; import com.datadog.api.client.v2.model.PowerpackTemplateVariable; import com.datadog.api.client.v2.model.WidgetLiveSpan; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); PowerpackApi apiInstance = new PowerpackApi(defaultClient); // there is a valid "powerpack" in the system String POWERPACK_DATA_ID = System.getenv("POWERPACK_DATA_ID"); Powerpack body = new Powerpack() .data( new PowerpackData() .attributes( new PowerpackAttributes() .description("Sample powerpack") .groupWidget( new PowerpackGroupWidget() .definition( new PowerpackGroupWidgetDefinition() .layoutType("ordered") .showTitle(true) .title("Sample Powerpack") .type("group") .widgets( Collections.singletonList( new PowerpackInnerWidgets() .definition( Map.ofEntries( Map.entry("content", "test"), Map.entry("type", "note")))))) .layout( new PowerpackGroupWidgetLayout() .height(3L) .width(12L) .x(0L) .y(0L)) .liveSpan(WidgetLiveSpan.PAST_ONE_HOUR)) .name("Example-Powerpack") .tags(Collections.singletonList("tag:sample")) .templateVariables( Collections.singletonList( new PowerpackTemplateVariable() .defaults(Collections.singletonList("*")) .name("sample")))) .type("powerpack")); try { PowerpackResponse result = apiInstance.updatePowerpack(POWERPACK_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling PowerpackApi#updatePowerpack"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a powerpack returns "OK" response ``` """ Update a powerpack returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.powerpack_api import PowerpackApi from datadog_api_client.v2.model.powerpack import Powerpack from datadog_api_client.v2.model.powerpack_attributes import PowerpackAttributes from datadog_api_client.v2.model.powerpack_data import PowerpackData from 
datadog_api_client.v2.model.powerpack_group_widget import PowerpackGroupWidget from datadog_api_client.v2.model.powerpack_group_widget_definition import PowerpackGroupWidgetDefinition from datadog_api_client.v2.model.powerpack_group_widget_layout import PowerpackGroupWidgetLayout from datadog_api_client.v2.model.powerpack_inner_widgets import PowerpackInnerWidgets from datadog_api_client.v2.model.powerpack_template_variable import PowerpackTemplateVariable from datadog_api_client.v2.model.widget_live_span import WidgetLiveSpan # there is a valid "powerpack" in the system POWERPACK_DATA_ID = environ["POWERPACK_DATA_ID"] body = Powerpack( data=PowerpackData( attributes=PowerpackAttributes( description="Sample powerpack", group_widget=PowerpackGroupWidget( definition=PowerpackGroupWidgetDefinition( layout_type="ordered", show_title=True, title="Sample Powerpack", type="group", widgets=[ PowerpackInnerWidgets( definition=dict([("content", "test"), ("type", "note")]), ), ], ), layout=PowerpackGroupWidgetLayout( height=3, width=12, x=0, y=0, ), live_span=WidgetLiveSpan.PAST_ONE_HOUR, ), name="Example-Powerpack", tags=[ "tag:sample", ], template_variables=[ PowerpackTemplateVariable( defaults=[ "*", ], name="sample", ), ], ), type="powerpack", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = PowerpackApi(api_client) response = api_instance.update_powerpack(powerpack_id=POWERPACK_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a powerpack returns "OK" response ``` # Update a powerpack returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::PowerpackAPI.new # there is a valid "powerpack" in the system POWERPACK_DATA_ID = ENV["POWERPACK_DATA_ID"] body = DatadogAPIClient::V2::Powerpack.new({ data: DatadogAPIClient::V2::PowerpackData.new({ attributes: DatadogAPIClient::V2::PowerpackAttributes.new({ description: "Sample powerpack", group_widget: DatadogAPIClient::V2::PowerpackGroupWidget.new({ definition: DatadogAPIClient::V2::PowerpackGroupWidgetDefinition.new({ layout_type: "ordered", show_title: true, title: "Sample Powerpack", type: "group", widgets: [ DatadogAPIClient::V2::PowerpackInnerWidgets.new({ definition: { "content": "test", "type": "note", }, }), ], }), layout: DatadogAPIClient::V2::PowerpackGroupWidgetLayout.new({ height: 3, width: 12, x: 0, y: 0, }), live_span: DatadogAPIClient::V2::WidgetLiveSpan::PAST_ONE_HOUR, }), name: "Example-Powerpack", tags: [ "tag:sample", ], template_variables: [ DatadogAPIClient::V2::PowerpackTemplateVariable.new({ defaults: [ "*", ], name: "sample", }), ], }), type: "powerpack", }), }) p api_instance.update_powerpack(POWERPACK_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a powerpack returns "OK" response ``` // Update a powerpack returns "OK" response use 
datadog_api_client::datadog; use datadog_api_client::datadogV2::api_powerpack::PowerpackAPI; use datadog_api_client::datadogV2::model::Powerpack; use datadog_api_client::datadogV2::model::PowerpackAttributes; use datadog_api_client::datadogV2::model::PowerpackData; use datadog_api_client::datadogV2::model::PowerpackGroupWidget; use datadog_api_client::datadogV2::model::PowerpackGroupWidgetDefinition; use datadog_api_client::datadogV2::model::PowerpackGroupWidgetLayout; use datadog_api_client::datadogV2::model::PowerpackInnerWidgets; use datadog_api_client::datadogV2::model::PowerpackTemplateVariable; use datadog_api_client::datadogV2::model::WidgetLiveSpan; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "powerpack" in the system let powerpack_data_id = std::env::var("POWERPACK_DATA_ID").unwrap(); let body = Powerpack::new().data( PowerpackData::new() .attributes( PowerpackAttributes::new( PowerpackGroupWidget::new( PowerpackGroupWidgetDefinition::new( "ordered".to_string(), "group".to_string(), vec![PowerpackInnerWidgets::new(BTreeMap::from([ ("content".to_string(), Value::from("test")), ("type".to_string(), Value::from("note")), ]))], ) .show_title(true) .title("Sample Powerpack".to_string()), ) .layout(PowerpackGroupWidgetLayout::new(3, 12, 0, 0)) .live_span(WidgetLiveSpan::PAST_ONE_HOUR), "Example-Powerpack".to_string(), ) .description("Sample powerpack".to_string()) .tags(vec!["tag:sample".to_string()]) .template_variables(vec![PowerpackTemplateVariable::new("sample".to_string()) .defaults(vec!["*".to_string()])]), ) .type_("powerpack".to_string()), ); let configuration = datadog::Configuration::new(); let api = PowerpackAPI::with_config(configuration); let resp = api.update_powerpack(powerpack_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a powerpack returns "OK" response ``` /** * Update a powerpack returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.PowerpackApi(configuration); // there is a valid "powerpack" in the system const POWERPACK_DATA_ID = process.env.POWERPACK_DATA_ID as string; const params: v2.PowerpackApiUpdatePowerpackRequest = { body: { data: { attributes: { description: "Sample powerpack", groupWidget: { definition: { layoutType: "ordered", showTitle: true, title: "Sample Powerpack", type: "group", widgets: [ { definition: { content: "test", type: "note", }, }, ], }, layout: { height: 3, width: 12, x: 0, y: 0, }, liveSpan: "1h", }, name: "Example-Powerpack", tags: ["tag:sample"], templateVariables: [ { defaults: ["*"], name: "sample", }, ], }, type: "powerpack", }, }, powerpackId: POWERPACK_DATA_ID, }; apiInstance .updatePowerpack(params) .then((data: v2.PowerpackResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=289dfeab-3fe9-4031-a595-03898aef71bd&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=085ea75a-fdc7-4372-ba9c-2485f8229214&pt=Powerpack&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpowerpack%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=289dfeab-3fe9-4031-a595-03898aef71bd&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=085ea75a-fdc7-4372-ba9c-2485f8229214&pt=Powerpack&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpowerpack%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=5c0929c0-e53f-4df8-ba00-68e8b1cb84ad&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Powerpack&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fpowerpack%2F&r=<=14307&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=45383) --- # Source: https://docs.datadoghq.com/api/latest/processes/ # Processes The processes API allows you to query processes data for your organization. See the [Live Processes page](https://docs.datadoghq.com/infrastructure/process/) for more information. ## [Get all processes](https://docs.datadoghq.com/api/latest/processes/#get-all-processes) * [v2 (latest)](https://docs.datadoghq.com/api/latest/processes/#get-all-processes-v2) GET https://api.ap1.datadoghq.com/api/v2/processeshttps://api.ap2.datadoghq.com/api/v2/processeshttps://api.datadoghq.eu/api/v2/processeshttps://api.ddog-gov.com/api/v2/processeshttps://api.datadoghq.com/api/v2/processeshttps://api.us3.datadoghq.com/api/v2/processeshttps://api.us5.datadoghq.com/api/v2/processes ### Overview Get all processes for your organization. ### Arguments #### Query Strings Name Type Description search string String to search processes by. tags string Comma-separated list of tags to filter processes by. from integer Unix timestamp (number of seconds since epoch) of the start of the query window. If not provided, the start of the query window will be 15 minutes before the `to` timestamp. If neither `from` nor `to` are provided, the query window will be `[now - 15m, now]`. to integer Unix timestamp (number of seconds since epoch) of the end of the query window. If not provided, the end of the query window will be 15 minutes after the `from` timestamp. If neither `from` nor `to` are provided, the query window will be `[now - 15m, now]`. page[limit] integer Maximum number of results returned. page[cursor] string String to query the next page of results. This key is provided with each valid response from the API in `meta.page.after`. 
### Response * [200](https://docs.datadoghq.com/api/latest/processes/#ListProcesses-200-v2) * [400](https://docs.datadoghq.com/api/latest/processes/#ListProcesses-400-v2) * [403](https://docs.datadoghq.com/api/latest/processes/#ListProcesses-403-v2) * [429](https://docs.datadoghq.com/api/latest/processes/#ListProcesses-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/processes/) * [Example](https://docs.datadoghq.com/api/latest/processes/) List of process summaries. Field Type Description data [object] Array of process summary objects. attributes object Attributes for a process summary. cmdline string Process command line. host string Host running the process. pid int64 Process ID. ppid int64 Parent process ID. start string Time the process was started. tags [string] List of tags associated with the process. timestamp string Time the process was seen. user string Process owner. id string Process ID. type enum Type of process summary. Allowed enum values: `process` default: `process` meta object Response metadata object. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. size int32 Number of results returned. ``` { "data": [ { "attributes": { "cmdline": "string", "host": "string", "pid": "integer", "ppid": "integer", "start": "string", "tags": [], "timestamp": "string", "user": "string" }, "id": "string", "type": "process" } ], "meta": { "page": { "after": "911abf1204838d9cdfcb9a96d0b6a1bd03e1b514074f1ce1737c4cbd", "size": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/processes/) * [Example](https://docs.datadoghq.com/api/latest/processes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/processes/) * [Example](https://docs.datadoghq.com/api/latest/processes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/processes/) * [Example](https://docs.datadoghq.com/api/latest/processes/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/processes/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/processes/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/processes/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/processes/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/processes/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/processes/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/processes/?code-lang=typescript) ##### Get all processes Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/processes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all processes ``` """ Get all processes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.processes_api import ProcessesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ProcessesApi(api_client) response = api_instance.list_processes( search="process-agent", tags="testing:true", page_limit=2, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all processes ``` # Get all processes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ProcessesAPI.new opts = { search: "process-agent", tags: "testing:true", page_limit: 2, } p api_instance.list_processes(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all processes ``` // Get all processes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewProcessesApi(apiClient) resp, r, err := api.ListProcesses(ctx, *datadogV2.NewListProcessesOptionalParameters().WithSearch("process-agent").WithTags("testing:true").WithPageLimit(2)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ProcessesApi.ListProcesses`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ProcessesApi.ListProcesses`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all processes ``` // Get all processes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ProcessesApi; import com.datadog.api.client.v2.api.ProcessesApi.ListProcessesOptionalParameters; import com.datadog.api.client.v2.model.ProcessSummariesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ProcessesApi apiInstance = new ProcessesApi(defaultClient); try { ProcessSummariesResponse result = apiInstance.listProcesses( new ListProcessesOptionalParameters() .search("process-agent") .tags("testing:true") .pageLimit(2)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ProcessesApi#listProcesses"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all processes ``` // Get all processes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_processes::ListProcessesOptionalParams; use datadog_api_client::datadogV2::api_processes::ProcessesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ProcessesAPI::with_config(configuration); let resp = api .list_processes( ListProcessesOptionalParams::default() .search("process-agent".to_string()) .tags("testing:true".to_string()) .page_limit(2), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all processes ``` /** * Get all processes returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ProcessesApi(configuration); const params: v2.ProcessesApiListProcessesRequest = { search: "process-agent", tags: "testing:true", pageLimit: 2, }; apiInstance .listProcesses(params) .then((data: v2.ProcessSummariesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=962d7f9b-9bbd-44b9-8333-d229f395a4ae&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=47203e82-0f90-4661-893e-9ec06e20fc4a&pt=Processes&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fprocesses%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=962d7f9b-9bbd-44b9-8333-d229f395a4ae&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=47203e82-0f90-4661-893e-9ec06e20fc4a&pt=Processes&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fprocesses%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=84667f64-130d-4b52-ba90-303d62662c97&bo=2&sid=b77a5fe0f0bf11f0bab0291dd4b5f9fd&vid=b77a7130f0bf11f0b69eb7008fe8e23b&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Processes&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fprocesses%2F&r=<=834&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=141152) --- # Source: https://docs.datadoghq.com/api/latest/product-analytics/ # Product Analytics Send server-side events to Product Analytics. Server-Side Events Ingestion allows you to collect custom events from any server-side source, and retains events for 15 months. Server-side events are helpful for understanding causes of a funnel drop-off which are external to the client-side (for example, payment processing error). See the [Product Analytics page](https://docs.datadoghq.com/product_analytics/) for more information. ## [Send server-side events](https://docs.datadoghq.com/api/latest/product-analytics/#send-server-side-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/product-analytics/#send-server-side-events-v2) POST https://browser-intake.ap1.datadoghq.com/api/v2/prodlyticshttps://browser-intake.ap2.datadoghq.com/api/v2/prodlyticshttps://browser-intake.datadoghq.eu/api/v2/prodlyticshttps://browser-intake.datadoghq.com/api/v2/prodlyticshttps://browser-intake.us3.datadoghq.com/api/v2/prodlyticshttps://browser-intake.us5.datadoghq.com/api/v2/prodlyticsNot supported in the GOV region ### Overview Send server-side events to Product Analytics. Server-side events are retained for 15 months. Server-Side events in Product Analytics are helpful for tracking events that occur on the server, as opposed to client-side events, which are captured by Real User Monitoring (RUM) SDKs. This allows for a more comprehensive view of the user journey by including actions that happen on the server. Typical examples could be `checkout.completed` or `payment.processed`. Ingested server-side events are integrated into Product Analytics to allow users to select and filter these events in the event picker, similar to how views or actions are handled. 
**Requirements:** * At least one of `usr`, `account`, or `session` must be provided with a valid ID. * The `application.id` must reference a Product Analytics-enabled application. **Custom Attributes:** Any additional fields in the payload are flattened and searchable as facets. For example, a payload with `{"customer": {"tier": "premium"}}` is searchable with the syntax `@customer.tier:premium` in Datadog. The status codes answered by the HTTP API are: * 202: Accepted: The request has been accepted for processing * 400: Bad request (likely an issue in the payload formatting) * 401: Unauthorized (likely a missing API Key) * 403: Permission issue (likely using an invalid API Key) * 408: Request Timeout, request should be retried after some time * 413: Payload too large (batch is above 5MB uncompressed) * 429: Too Many Requests, request should be retried after some time * 500: Internal Server Error, the server encountered an unexpected condition that prevented it from fulfilling the request, request should be retried after some time * 503: Service Unavailable, the server is not ready to handle the request probably because it is overloaded, request should be retried after some time ### Request #### Body Data (required) Server-side event to send (JSON format). * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Field Type Description account object The account linked to your event. id [_required_] string The account ID used in Datadog. application [_required_] object The application in which you want to send your events. id [_required_] string The application ID of your application. It can be found in your [application management page](https://app.datadoghq.com/rum/list). event [_required_] object Fields used for the event. name [_required_] string The name of your event, which is used for search in the same way as view or action names. session object The session linked to your event. id [_required_] string The session ID captured by the SDK. type [_required_] enum The type of Product Analytics event. Must be `server` for server-side events. Allowed enum values: `server` usr object The user linked to your event. id [_required_] string The user ID used in Datadog. ``` { "account": { "id": "account-67890" }, "application": { "id": "123abcde-123a-123b-1234-123456789abc" }, "event": { "name": "payment.processed" }, "session": { "id": "session-abcdef" }, "type": "server", "usr": { "id": "user-12345" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-202-v2) * [400](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-400-v2) * [401](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-401-v2) * [403](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-403-v2) * [408](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-408-v2) * [413](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-413-v2) * [429](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-429-v2) * [500](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-500-v2) * [503](https://docs.datadoghq.com/api/latest/product-analytics/#SubmitProductAnalyticsEvent-503-v2) Request accepted for processing (always 202 empty JSON). 
* [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Unauthorized * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Request Timeout * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Payload Too Large * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Too Many Requests * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. ``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy Service Unavailable * [Model](https://docs.datadoghq.com/api/latest/product-analytics/) * [Example](https://docs.datadoghq.com/api/latest/product-analytics/) Error response. Field Type Description errors [object] Structured errors. detail string Error message. status string Error code. title string Error title. 
``` { "errors": [ { "detail": "Malformed payload", "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/product-analytics/?code-lang=curl) ##### Send server-side events Copy ``` ## Event with account ID # Send a server-side event linked to an account. # Curl command curl -X POST "https://browser-intake.ap1.datadoghq.com"https://browser-intake.ap2.datadoghq.com"https://browser-intake.datadoghq.eu"https://browser-intake.datadoghq.com"https://browser-intake.us3.datadoghq.com"https://browser-intake.us5.datadoghq.com/api/v2/prodlytics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "account": { "id": "account-456" }, "application": { "id": "123abcde-123a-123b-1234-123456789abc" }, "event": { "name": "checkout.completed" }, "type": "server" } EOF ## Event with custom attributes # Send a server-side event with additional custom attributes. # Curl command curl -X POST "https://browser-intake.ap1.datadoghq.com"https://browser-intake.ap2.datadoghq.com"https://browser-intake.datadoghq.eu"https://browser-intake.datadoghq.com"https://browser-intake.us3.datadoghq.com"https://browser-intake.us5.datadoghq.com/api/v2/prodlytics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "application": { "id": "123abcde-123a-123b-1234-123456789abc" }, "customer": { "tier": "premium" }, "event": { "name": "payment.processed" }, "type": "server", "usr": { "id": "123" } } EOF ## Event with session ID # Send a server-side event linked to a session. # Curl command curl -X POST "https://browser-intake.ap1.datadoghq.com"https://browser-intake.ap2.datadoghq.com"https://browser-intake.datadoghq.eu"https://browser-intake.datadoghq.com"https://browser-intake.us3.datadoghq.com"https://browser-intake.us5.datadoghq.com/api/v2/prodlytics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "application": { "id": "123abcde-123a-123b-1234-123456789abc" }, "event": { "name": "form.submitted" }, "session": { "id": "session-789" }, "type": "server" } EOF ## Simple event with user ID # Send a server-side event linked to a user. 
# Curl command curl -X POST "https://browser-intake.ap1.datadoghq.com"https://browser-intake.ap2.datadoghq.com"https://browser-intake.datadoghq.eu"https://browser-intake.datadoghq.com"https://browser-intake.us3.datadoghq.com"https://browser-intake.us5.datadoghq.com/api/v2/prodlytics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF { "application": { "id": "123abcde-123a-123b-1234-123456789abc" }, "event": { "name": "payment.processed" }, "type": "server", "usr": { "id": "123" } } EOF ``` * * *
--- # Source: https://docs.datadoghq.com/api/latest/rate-limits ## [Rate Limits](https://docs.datadoghq.com/api/latest/rate-limits/#rate-limits) Many API endpoints are rate limited. Once you exceed a certain number of requests in a specific period, Datadog returns an error. If you are rate limited, you see a 429 response code. You can either wait the length of time designated by `X-RateLimit-Period` before making calls again, or spread your calls out so that you stay under the `X-RateLimit-Limit` for each `X-RateLimit-Period`. Rate limits can be increased from the defaults by [contacting the Datadog support team](https://docs.datadoghq.com/help/). Regarding the API rate limit policy: * Datadog **does not rate limit** on data point/metric submission (see the [metrics section](https://docs.datadoghq.com/api/v1/metrics/) for more info on how the metric submission rate is handled). The limits you encounter depend on the quantity of [custom metrics](https://docs.datadoghq.com/metrics/custom_metrics/) included in your agreement. * The API for sending logs is not rate limited. * The rate limit for event submission is `250,000` events per minute per organization. * The rate limits for endpoints vary and are included in the headers detailed below. These can be extended on demand. The list above does not cover every rate limit on Datadog APIs. If you are experiencing rate limiting, reach out to [support](https://www.datadoghq.com/support/) for more information about the APIs you're using and their limits. A minimal client-side backoff sketch that uses the headers below appears at the end of this page. Rate Limit Headers | Description ---|--- `X-RateLimit-Limit` | number of requests allowed in a time period.
`X-RateLimit-Period` | length of time in seconds for resets (calendar aligned). `X-RateLimit-Remaining` | number of allowed requests left in the current time period. `X-RateLimit-Reset` | time in seconds until next reset. `X-RateLimit-Name` | name of the rate limit for increase requests. ### [Datadog API usage metrics](https://docs.datadoghq.com/api/latest/rate-limits/#datadog-api-usage-metrics) All Datadog APIs have a usage limit for a given period of time. APIs can have unique, distinct rate limit buckets or be grouped together into a single bucket depending on the resource(s) being used. For example, the monitor status API has a rate limit that allows a human or automation script to query only so many times per minute. The endpoint rejects excess requests with a 429 response code and a hint to back off until a reset period has expired. API usage metrics allow Datadog users to self-service and audit API rate limit consumption for API endpoints (excluding metrics, logs, and event submission endpoints). These metrics provide a picture of allowed and blocked requests, and are provided with the following dimensions and available tags: [Datadog API Rate Limit Usage Dashboard](https://app.datadoghq.com/dash/integration/31668/datadog-api-rate-limit-usage) [Datadog API Rate Limit Usage Dashboard](https://app.datadoghq.eu/dash/integration/1386/datadog-api-rate-limit-usage) [Datadog API Rate Limit Usage Dashboard](https://us3.datadoghq.com/dash/integration/2248/datadog-api-rate-limit-usage) [Datadog API Rate Limit Usage Dashboard](https://us5.datadoghq.com/dash/integration/1421/datadog-api-rate-limit-usage) [Datadog API Rate Limit Usage Dashboard](https://ap1.datadoghq.com/dash/integration/2698/datadog-api-rate-limit-usage) [Datadog API Rate Limit Usage Dashboard](https://app.ddog-gov.com/dash/integration/1330/datadog-api-rate-limit-usage) #### [Available metrics](https://docs.datadoghq.com/api/latest/rate-limits/#available-metrics) Dimension | Usage metric | Description | Available Tags ---|---|---|--- **Org** | `datadog.apis.usage.per_org` | The organization-wide rate limit of the number of API requests made to a specific endpoint | * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` `datadog.apis.usage.per_org_ratio` | Ratio of API requests by available dimensions to total number of requests (`limit_count`) allowed. | * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` **User (UUID)** | `datadog.apis.usage.per_user` | Number of API requests made for a specific API endpoint that is rate limited per unique user. | * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` `datadog.apis.usage.per_user_ratio` | Ratio of API requests by available dimensions to total number of requests (`limit_count`) allowed. | * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` **API Key** | `datadog.apis.usage.per_api_key` | Number of API requests made for a specific API endpoint that is rate limited per unique API Key used | * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` `datadog.apis.usage.per_api_key_ratio` | Ratio of API requests by available dimensions to total number of requests (`limit_count`) allowed. 
| * `app_key_id` * `child_org` (on parent only) * `limit_count` * `limit_name` * `limit_period` * `rate_limit_status` * `user_uuid` #### [Tag key](https://docs.datadoghq.com/api/latest/rate-limits/#tag-key) Tag name | Description ---|--- `app_key_id` | Application Key ID used by API client. This can be `N/A` for web or mobile users and open endpoints. `child_org` | Name of child org, if viewing from the parent org. Otherwise, set to `N/A`. This only applies within the same datacenter. `limit_count` | Number of requests available to each rate limit name during a request period. `limit_name` | Name of rate limit. Different endpoints can share the same name. `limit_period` | Time in seconds for each rate limit name before the consumption count is reset. `rate_limit_status` | `passed`: Request was not blocked. `blocked`: Request was blocked due to rate limits breached. `user_uuid` | UUID of user for API consumption. #### [Rollup in widgets](https://docs.datadoghq.com/api/latest/rate-limits/#rollup-in-widgets) Metric visualizations should generally be rolled up to the minute using sum(60s) to aggregate the total number of requests per minute. Ratio metrics are already normalized to the corresponding `limit_period`. ##### [Example use cases](https://docs.datadoghq.com/api/latest/rate-limits/#example-use-cases) Requests by rate limit name Graph the sum of `datadog.apis.usage.per_org`, `datadog.apis.usage.per_user`, and `datadog.apis.usage.per_api_key` by `limit_name` **Example:** `default_zero(sum:datadog.apis.usage.per_org{*} by {limit_name}) + default_zero(sum:datadog.apis.usage.per_user{*} by {limit_name}) + default_zero(sum:datadog.apis.usage.per_api_key{*} by {limit_name})` Blocked by rate limit name Graph the sum of `datadog.apis.usage.per_org`, `datadog.apis.usage.per_user`, and `datadog.apis.usage.per_api_key` by `limit_name` with `rate_limit_status:blocked` **Example:** `default_zero(sum:datadog.apis.usage.per_org{rate_limit_status:blocked} by {limit_name}) + default_zero(sum:datadog.apis.usage.per_user{rate_limit_status:blocked} by {limit_name}) + default_zero(sum:datadog.apis.usage.per_api_key{rate_limit_status:blocked} by {limit_name})` Blocked endpoint by user Graph the sum of `datadog.apis.usage.per_org`, `datadog.apis.usage.per_user`, and `datadog.apis.usage.per_api_key` by `user_uuid` with `rate_limit_status:blocked` and `limit_name:example` **Example:** `default_zero(sum:datadog.apis.usage.per_org{rate_limit_status:blocked,limit_name:example} by {user_uuid}) + default_zero(sum:datadog.apis.usage.per_user{rate_limit_status:blocked,limit_name:example} by {user_uuid}) + default_zero(sum:datadog.apis.usage.per_api_key{rate_limit_status:blocked,limit_name:example} by {user_uuid})` Blocked endpoint by app key ID Graph the sum of `datadog.apis.usage.per_org`, `datadog.apis.usage.per_user`, and `datadog.apis.usage.per_api_key` by `app_key_id` with `rate_limit_status:blocked` and `limit_name:example` **Example:** `default_zero(sum:datadog.apis.usage.per_org{rate_limit_status:blocked,limit_name:example} by {app_key_id}) + default_zero(sum:datadog.apis.usage.per_user{rate_limit_status:blocked,limit_name:example} by {app_key_id}) + default_zero(sum:datadog.apis.usage.per_api_key{rate_limit_status:blocked,limit_name:example} by {app_key_id})` Ratio of Rate Limits Used by Rate Limit Name Graph the sum of `datadog.apis.usage.per_org_ratio`, `datadog.apis.usage.per_user_ratio`, and `datadog.apis.usage.per_api_key_ratio` by `limit_name` **Example:** 
`default_zero(max:datadog.apis.usage.per_org_ratio{*} by {limit_name}) + default_zero(max:datadog.apis.usage.per_user_ratio{*} by {limit_name}) + default_zero(max:datadog.apis.usage.per_api_key_ratio{*} by {limit_name})` ### [Increase your rate limit](https://docs.datadoghq.com/api/latest/rate-limits/#increase-your-rate-limit) You can request increased rate limits by creating a Support ticket with the details below under **Help** > **New Support Ticket**. Upon receiving a rate limit increase request, our Support Engineering team reviews it on a case-by-case basis and, if needed, works with internal engineering resources to confirm the viability of the increase.
```
Title: Request to increase rate limit on endpoint: X

Details: We would like to request a rate limit increase for API endpoint: X

Example use cases/queries: Example API call as cURL or as URL with example payload

Motivation for increasing rate limit: Example - Our organization uses this endpoint to right size a container before we deploy. This deployment takes place every X hours or up to Y times per day.

Desired target rate limit: Tip - Having a specific limit increase or percentage increase in mind helps Support Engineering expedite the request to internal Engineering teams for review.
```
After Datadog Support reviews and approves the use case, they can apply the rate limit increase behind the scenes. Note that there is a maximum to how much a rate limit can be increased due to the SaaS nature of Datadog. Datadog Support reserves the right to reject rate limit increases based on use cases and Engineering recommendations. ### [Audit logs](https://docs.datadoghq.com/api/latest/rate-limits/#audit-logs) API limit and usage metrics provide insight into usage patterns and blocked requests. If you need additional details, Audit Trail offers more granular visibility into API activity. With Audit Trail, you can view data such as: * **IP address & geolocation** – Identify where API requests originated. * **Actor type** – Distinguish between service accounts and user accounts. * **API vs. app key authentication** – See whether requests were made through an API key or directly by a user. * **Correlated events** – View other events occurring at the same time, such as configuration changes or security-related actions. Audit Trail can help teams troubleshoot rate limit issues by providing more context on API consumption and blocked requests. It also enables tracking of API usage across an organization for security and compliance purposes. For more detailed visibility into API activity, consider using **[Audit Trail](https://docs.datadoghq.com/account_management/audit_trail/events/)**.
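To complement the headers and usage metrics described above, here is a minimal sketch, assuming a `requests`-based Python client, of one way to back off when a 429 is returned by waiting for `X-RateLimit-Reset`. The helper name, endpoint, and retry cap are illustrative assumptions, not part of the Datadog API.

```
# Illustrative sketch: retry a rate-limited request by honoring the
# X-RateLimit-Reset header (seconds until the next reset) on a 429.
import os
import time

import requests

HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}


def get_with_backoff(url, params=None, max_retries=5):
    """GET `url`, sleeping until the rate-limit window resets on 429 responses."""
    for _ in range(max_retries):
        resp = requests.get(url, headers=HEADERS, params=params, timeout=30)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp
        wait = float(resp.headers.get("X-RateLimit-Reset", "1"))
        time.sleep(max(wait, 1.0))
    raise RuntimeError("still rate limited after retries")


# Example call against an arbitrary authenticated endpoint (illustrative).
response = get_with_backoff("https://api.datadoghq.com/api/v1/validate")
print(response.json())
```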
--- # Source: https://docs.datadoghq.com/api/latest/reference-tables/ # Reference Tables View and manage Reference Tables in your organization. ## [Create reference table upload](https://docs.datadoghq.com/api/latest/reference-tables/#create-reference-table-upload) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#create-reference-table-upload-v2) POST https://api.ap1.datadoghq.com/api/v2/reference-tables/uploadshttps://api.ap2.datadoghq.com/api/v2/reference-tables/uploadshttps://api.datadoghq.eu/api/v2/reference-tables/uploadshttps://api.ddog-gov.com/api/v2/reference-tables/uploadshttps://api.datadoghq.com/api/v2/reference-tables/uploadshttps://api.us3.datadoghq.com/api/v2/reference-tables/uploadshttps://api.us5.datadoghq.com/api/v2/reference-tables/uploads ### Overview Create a reference table upload for bulk data ingestion. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Field Type Description data object Request data for creating an upload for a file to be ingested into a reference table. attributes object Upload configuration specifying how data is uploaded by the user, and properties of the table to associate the upload with. headers [_required_] [string] The CSV file headers that define the schema fields, provided in the same order as the columns in the uploaded file. part_count [_required_] int32 Number of parts to split the file into for multipart upload. part_size [_required_] int64 The size of each part in the upload in bytes. All parts except the last one must be at least 5,000,000 bytes. table_name [_required_] string Name of the table to associate with this upload. type [_required_] enum Upload resource type.
Allowed enum values: `upload` default: `upload` ``` { "data": { "attributes": { "headers": [ "field_1", "field_2" ], "part_count": 3, "part_size": 10000000, "table_name": "" }, "type": "upload" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTableUpload-201-v2) * [400](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTableUpload-400-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTableUpload-403-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTableUpload-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Information about the upload created containing the upload ID and pre-signed URLs to PUT chunks of the CSV file to. Field Type Description data object Upload ID and attributes of the created upload. attributes object Pre-signed URLs for uploading parts of the file. part_urls [string] The pre-signed URLs for uploading parts. These URLs expire after 5 minutes. id string Unique identifier for this upload. Use this ID when creating the reference table. type [_required_] enum Upload resource type. Allowed enum values: `upload` default: `upload` ``` { "data": { "attributes": { "part_urls": [] }, "id": "string", "type": "upload" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Create reference table upload Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/uploads" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "headers": [ "field_1", "field_2" ], "part_count": 3, "part_size": 10000000, "table_name": "" }, "type": "upload" } } EOF ``` ##### Create reference table upload ``` """ Create reference table upload returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi from datadog_api_client.v2.model.create_upload_request import CreateUploadRequest from datadog_api_client.v2.model.create_upload_request_data import CreateUploadRequestData from datadog_api_client.v2.model.create_upload_request_data_attributes import CreateUploadRequestDataAttributes from datadog_api_client.v2.model.create_upload_request_data_type import CreateUploadRequestDataType body = CreateUploadRequest( data=CreateUploadRequestData( attributes=CreateUploadRequestDataAttributes( headers=[ "id", "name", "value", ], table_name="test_upload_table_Example-Reference-Table", part_count=1, part_size=1024, ), type=CreateUploadRequestDataType.UPLOAD, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) response = api_instance.create_reference_table_upload(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create reference table upload ``` # Create reference table upload returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new body = DatadogAPIClient::V2::CreateUploadRequest.new({ data: DatadogAPIClient::V2::CreateUploadRequestData.new({ attributes: DatadogAPIClient::V2::CreateUploadRequestDataAttributes.new({ headers: [ "id", "name", "value", ], table_name: "test_upload_table_Example-Reference-Table", part_count: 1, part_size: 1024, }), type: DatadogAPIClient::V2::CreateUploadRequestDataType::UPLOAD, }), }) p api_instance.create_reference_table_upload(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create reference table upload ``` // Create reference table upload returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateUploadRequest{ Data: &datadogV2.CreateUploadRequestData{ Attributes: &datadogV2.CreateUploadRequestDataAttributes{ Headers: []string{ "id", "name", "value", }, TableName: "test_upload_table_Example-Reference-Table", PartCount: 1, PartSize: 1024, }, Type: datadogV2.CREATEUPLOADREQUESTDATATYPE_UPLOAD, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) resp, r, err := api.CreateReferenceTableUpload(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.CreateReferenceTableUpload`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ReferenceTablesApi.CreateReferenceTableUpload`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create reference table upload ``` // Create reference table upload returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.CreateUploadRequest; import com.datadog.api.client.v2.model.CreateUploadRequestData; import com.datadog.api.client.v2.model.CreateUploadRequestDataAttributes; import com.datadog.api.client.v2.model.CreateUploadRequestDataType; import com.datadog.api.client.v2.model.CreateUploadResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); CreateUploadRequest body = new CreateUploadRequest() .data( new CreateUploadRequestData() .attributes( new CreateUploadRequestDataAttributes() .headers(Arrays.asList("id", "name", "value")) .tableName("test_upload_table_Example-Reference-Table") .partCount(1) .partSize(1024L)) .type(CreateUploadRequestDataType.UPLOAD)); try { CreateUploadResponse result = apiInstance.createReferenceTableUpload(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#createReferenceTableUpload"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
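# Replace the DD_SITE value with a single Datadog site (for example, datadoghq.com)
# and supply valid DD_API_KEY and DD_APP_KEY values before running the example.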
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create reference table upload ``` // Create reference table upload returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; use datadog_api_client::datadogV2::model::CreateUploadRequest; use datadog_api_client::datadogV2::model::CreateUploadRequestData; use datadog_api_client::datadogV2::model::CreateUploadRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateUploadRequestDataType; #[tokio::main] async fn main() { let body = CreateUploadRequest::new().data( CreateUploadRequestData::new(CreateUploadRequestDataType::UPLOAD).attributes( CreateUploadRequestDataAttributes::new( vec!["id".to_string(), "name".to_string(), "value".to_string()], 1, 1024, "test_upload_table_Example-Reference-Table".to_string(), ), ), ); let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.create_reference_table_upload(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create reference table upload ``` /** * Create reference table upload returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiCreateReferenceTableUploadRequest = { body: { data: { attributes: { headers: ["id", "name", "value"], tableName: "test_upload_table_Example-Reference-Table", partCount: 1, partSize: 1024, }, type: "upload", }, }, }; apiInstance .createReferenceTableUpload(params) .then((data: v2.CreateUploadResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create reference table](https://docs.datadoghq.com/api/latest/reference-tables/#create-reference-table) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#create-reference-table-v2) POST https://api.ap1.datadoghq.com/api/v2/reference-tables/tableshttps://api.ap2.datadoghq.com/api/v2/reference-tables/tableshttps://api.datadoghq.eu/api/v2/reference-tables/tableshttps://api.ddog-gov.com/api/v2/reference-tables/tableshttps://api.datadoghq.com/api/v2/reference-tables/tableshttps://api.us3.datadoghq.com/api/v2/reference-tables/tableshttps://api.us5.datadoghq.com/api/v2/reference-tables/tables ### Overview Creates a reference table. You can provide data in two ways: 1. Call POST /api/v2/reference-tables/upload to get an upload ID. 
Then, PUT the CSV data (not the file itself) in chunks to each URL in the request body. Finally, call this POST endpoint with `upload_id` in `file_metadata`. 2. Provide `access_details` in `file_metadata` pointing to a CSV file in cloud storage. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Field Type Description data object The data object containing the table definition. attributes object Attributes that define the reference table's configuration and properties. description string Optional text describing the purpose or contents of this reference table. file_metadata Metadata specifying where and how to access the reference table's data file. Option 1 object Cloud storage file metadata for create requests. Both access_details and sync_enabled are required. access_details [_required_] object Cloud storage access configuration for the reference table data file. aws_detail object Amazon Web Services S3 storage access configuration. aws_account_id [_required_] string AWS account ID where the S3 bucket is located. aws_bucket_name [_required_] string S3 bucket containing the CSV file. file_path [_required_] string The relative file path from the S3 bucket root to the CSV file. azure_detail object Azure Blob Storage access configuration. azure_client_id [_required_] string Azure service principal (application) client ID with permissions to read from the container. azure_container_name [_required_] string Azure Blob Storage container containing the CSV file. azure_storage_account_name [_required_] string Azure storage account where the container is located. azure_tenant_id [_required_] string Azure Active Directory tenant ID. file_path [_required_] string The relative file path from the Azure container root to the CSV file. gcp_detail object Google Cloud Platform storage access configuration. file_path [_required_] string The relative file path from the GCS bucket root to the CSV file. gcp_bucket_name [_required_] string GCP bucket containing the CSV file. gcp_project_id [_required_] string GCP project ID where the bucket is located. gcp_service_account_email [_required_] string Service account email with read permissions for the GCS bucket. sync_enabled [_required_] boolean Whether this table is synced automatically. Option 2 object Local file metadata for create requests using the upload ID. upload_id [_required_] string The upload ID. schema [_required_] object Schema defining the structure and columns of the reference table. fields [_required_] [object] The schema fields. name [_required_] string The field name. type [_required_] enum The field type for reference table schema fields. Allowed enum values: `STRING,INT32` primary_keys [_required_] [string] List of field names that serve as primary keys for the table. Only one primary key is supported, and it is used as an ID to retrieve rows. source [_required_] enum The source type for creating reference table data. Only these source types can be created through this API. Allowed enum values: `LOCAL_FILE,S3,GCS,AZURE` table_name [_required_] string Name to identify this reference table. tags [string] Tags for organizing and filtering reference tables. type [_required_] enum Reference table resource type. 
Allowed enum values: `reference_table` default: `reference_table` ``` { "data": { "attributes": { "description": "string", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "123456789000", "aws_bucket_name": "example-data-bucket", "file_path": "reference-tables/users.csv" }, "azure_detail": { "azure_client_id": "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", "azure_container_name": "reference-data", "azure_storage_account_name": "examplestorageaccount", "azure_tenant_id": "cccccccc-4444-5555-6666-dddddddddddd", "file_path": "tables/users.csv" }, "gcp_detail": { "file_path": "data/reference_tables/users.csv", "gcp_bucket_name": "example-data-bucket", "gcp_project_id": "example-gcp-project-12345", "gcp_service_account_email": "example-service@example-gcp-project-12345.iam.gserviceaccount.com" } }, "sync_enabled": false }, "schema": { "fields": [ { "name": "field_1", "type": "STRING" } ], "primary_keys": [ "field_1" ] }, "source": "LOCAL_FILE", "table_name": "table_1", "tags": [ "tag_1", "tag_2" ] }, "type": "reference_table" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTable-201-v2) * [400](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTable-400-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTable-403-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#CreateReferenceTable-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) A reference table resource containing its full configuration and state. Field Type Description data object The data object containing the reference table configuration and state. attributes object Attributes that define the reference table's configuration and properties. created_by string UUID of the user who created the reference table. description string Optional text describing the purpose or contents of this reference table. file_metadata object Metadata specifying where and how to access the reference table's data file. For cloud storage tables (S3/GCS/Azure): * sync_enabled and access_details will always be present * error fields (error_message, error_row_count, error_type) are present only when errors occur For local file tables: * error fields (error_message, error_row_count) are present only when errors occur * sync_enabled, access_details are never present access_details object Cloud storage access configuration. Only present for cloud storage sources (S3, GCS, Azure). aws_detail object Amazon Web Services S3 storage access configuration. aws_account_id string AWS account ID where the S3 bucket is located. aws_bucket_name string S3 bucket containing the CSV file. file_path string The relative file path from the S3 bucket root to the CSV file. azure_detail object Azure Blob Storage access configuration. azure_client_id string Azure service principal (application) client ID with permissions to read from the container. azure_container_name string Azure Blob Storage container containing the CSV file. azure_storage_account_name string Azure storage account where the container is located. azure_tenant_id string Azure Active Directory tenant ID. file_path string The relative file path from the Azure container root to the CSV file. gcp_detail object Google Cloud Platform storage access configuration. file_path string The relative file path from the GCS bucket root to the CSV file. 
gcp_bucket_name string GCP bucket containing the CSV file. gcp_project_id string GCP project ID where the bucket is located. gcp_service_account_email string Service account email with read permissions for the GCS bucket. error_message string The error message returned from the last operation (sync for cloud storage, upload for local file). error_row_count int64 The number of rows that failed to process. error_type enum The type of error that occurred during file processing. Only applicable for cloud storage sources. Allowed enum values: `TABLE_SCHEMA_ERROR,FILE_FORMAT_ERROR,CONFIGURATION_ERROR,QUOTA_EXCEEDED,CONFLICT_ERROR,VALIDATION_ERROR,STATE_ERROR,OPERATION_ERROR,SYSTEM_ERROR` sync_enabled boolean Whether this table is synced automatically from cloud storage. Only applicable for cloud storage sources. last_updated_by string UUID of the user who last updated the reference table. row_count int64 The number of successfully processed rows in the reference table. schema object Schema defining the structure and columns of the reference table. fields [_required_] [object] The schema fields. name [_required_] string The field name. type [_required_] enum The field type for reference table schema fields. Allowed enum values: `STRING,INT32` primary_keys [_required_] [string] List of field names that serve as primary keys for the table. Only one primary key is supported, and it is used as an ID to retrieve rows. source enum The source type for reference table data. Includes all possible source types that can appear in responses. Allowed enum values: `LOCAL_FILE,S3,GCS,AZURE,SERVICENOW,SALESFORCE,DATABRICKS,SNOWFLAKE` status string The processing status of the table. table_name string Unique name to identify this reference table. Used in enrichment processors and API calls. tags [string] Tags for organizing and filtering reference tables. updated_at string When the reference table was last updated, in ISO 8601 format. id string Unique identifier for the reference table. type [_required_] enum Reference table resource type. Allowed enum values: `reference_table` default: `reference_table` ``` { "data": { "attributes": { "created_by": "00000000-0000-0000-0000-000000000000", "description": "example description", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "123456789000", "aws_bucket_name": "my-bucket", "file_path": "path/to/file.csv" } }, "sync_enabled": true }, "last_updated_by": "00000000-0000-0000-0000-000000000000", "row_count": 5, "schema": { "fields": [ { "name": "id", "type": "INT32" }, { "name": "name", "type": "STRING" } ], "primary_keys": [ "id" ] }, "source": "S3", "status": "DONE", "table_name": "test_reference_table", "tags": [ "tag1", "tag2" ], "updated_at": "2000-01-01T01:00:00+00:00" }, "id": "00000000-0000-0000-0000-000000000000", "type": "reference_table" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Create reference table Copy ``` ## Create table from cloud storage (S3) # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Customer reference data synced from S3", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "924305315327", "aws_bucket_name": "my-data-bucket", "file_path": "customers.csv" } }, "sync_enabled": true }, "schema": { "fields": [ { "name": "customer_id", "type": "STRING" }, { "name": "customer_name", "type": "STRING" }, { "name": "email", "type": "STRING" } ], "primary_keys": [ "customer_id" ] }, "source": "S3", "table_name": "customer_reference_data", "tags": [ "team:data-platform" ] }, "type": "reference_table" } } EOF ## Create table from local file upload # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "Product catalog uploaded via local file", "file_metadata": { "upload_id": "00000000-0000-0000-0000-000000000000" }, "schema": { "fields": [ { "name": "product_id", "type": "STRING" }, { "name": "product_name", "type": "STRING" }, { "name": "price", "type": "DOUBLE" } ], "primary_keys": [ "product_id" ] }, "source": "LOCAL_FILE", "table_name": "product_catalog", "tags": [ "team:ecommerce" ] }, "type": "reference_table" } } EOF ``` ##### Create reference table ``` """ Create reference table returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi from datadog_api_client.v2.model.create_table_request import CreateTableRequest from datadog_api_client.v2.model.create_table_request_data import CreateTableRequestData from datadog_api_client.v2.model.create_table_request_data_attributes import CreateTableRequestDataAttributes from 
datadog_api_client.v2.model.create_table_request_data_attributes_file_metadata_cloud_storage import ( CreateTableRequestDataAttributesFileMetadataCloudStorage, ) from datadog_api_client.v2.model.create_table_request_data_attributes_file_metadata_one_of_access_details import ( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails, ) from datadog_api_client.v2.model.create_table_request_data_attributes_file_metadata_one_of_access_details_aws_detail import ( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail, ) from datadog_api_client.v2.model.create_table_request_data_attributes_file_metadata_one_of_access_details_azure_detail import ( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail, ) from datadog_api_client.v2.model.create_table_request_data_attributes_file_metadata_one_of_access_details_gcp_detail import ( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail, ) from datadog_api_client.v2.model.create_table_request_data_attributes_schema import ( CreateTableRequestDataAttributesSchema, ) from datadog_api_client.v2.model.create_table_request_data_attributes_schema_fields_items import ( CreateTableRequestDataAttributesSchemaFieldsItems, ) from datadog_api_client.v2.model.create_table_request_data_type import CreateTableRequestDataType from datadog_api_client.v2.model.reference_table_create_source_type import ReferenceTableCreateSourceType from datadog_api_client.v2.model.reference_table_schema_field_type import ReferenceTableSchemaFieldType body = CreateTableRequest( data=CreateTableRequestData( attributes=CreateTableRequestDataAttributes( file_metadata=CreateTableRequestDataAttributesFileMetadataCloudStorage( access_details=CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails( aws_detail=CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail( aws_account_id="123456789000", aws_bucket_name="example-data-bucket", file_path="reference-tables/users.csv", ), azure_detail=CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail( azure_client_id="aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", azure_container_name="reference-data", azure_storage_account_name="examplestorageaccount", azure_tenant_id="cccccccc-4444-5555-6666-dddddddddddd", file_path="tables/users.csv", ), gcp_detail=CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail( file_path="data/reference_tables/users.csv", gcp_bucket_name="example-data-bucket", gcp_project_id="example-gcp-project-12345", gcp_service_account_email="example-service@example-gcp-project-12345.iam.gserviceaccount.com", ), ), sync_enabled=False, ), schema=CreateTableRequestDataAttributesSchema( fields=[ CreateTableRequestDataAttributesSchemaFieldsItems( name="field_1", type=ReferenceTableSchemaFieldType.STRING, ), ], primary_keys=[ "field_1", ], ), source=ReferenceTableCreateSourceType.LOCAL_FILE, table_name="table_1", tags=[ "tag_1", "tag_2", ], ), type=CreateTableRequestDataType.REFERENCE_TABLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) response = api_instance.create_reference_table(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python3 "example.py" ``` ##### Create reference table ``` # Create reference table returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new body = DatadogAPIClient::V2::CreateTableRequest.new({ data: DatadogAPIClient::V2::CreateTableRequestData.new({ attributes: DatadogAPIClient::V2::CreateTableRequestDataAttributes.new({ file_metadata: DatadogAPIClient::V2::CreateTableRequestDataAttributesFileMetadataCloudStorage.new({ access_details: DatadogAPIClient::V2::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails.new({ aws_detail: DatadogAPIClient::V2::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail.new({ aws_account_id: "123456789000", aws_bucket_name: "example-data-bucket", file_path: "reference-tables/users.csv", }), azure_detail: DatadogAPIClient::V2::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail.new({ azure_client_id: "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", azure_container_name: "reference-data", azure_storage_account_name: "examplestorageaccount", azure_tenant_id: "cccccccc-4444-5555-6666-dddddddddddd", file_path: "tables/users.csv", }), gcp_detail: DatadogAPIClient::V2::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail.new({ file_path: "data/reference_tables/users.csv", gcp_bucket_name: "example-data-bucket", gcp_project_id: "example-gcp-project-12345", gcp_service_account_email: "example-service@example-gcp-project-12345.iam.gserviceaccount.com", }), }), sync_enabled: false, }), schema: DatadogAPIClient::V2::CreateTableRequestDataAttributesSchema.new({ fields: [ DatadogAPIClient::V2::CreateTableRequestDataAttributesSchemaFieldsItems.new({ name: "field_1", type: DatadogAPIClient::V2::ReferenceTableSchemaFieldType::STRING, }), ], primary_keys: [ "field_1", ], }), source: DatadogAPIClient::V2::ReferenceTableCreateSourceType::LOCAL_FILE, table_name: "table_1", tags: [ "tag_1", "tag_2", ], }), type: DatadogAPIClient::V2::CreateTableRequestDataType::REFERENCE_TABLE, }), }) p api_instance.create_reference_table(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create reference table ``` // Create reference table returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateTableRequest{ Data: &datadogV2.CreateTableRequestData{ Attributes: &datadogV2.CreateTableRequestDataAttributes{ FileMetadata: &datadogV2.CreateTableRequestDataAttributesFileMetadata{ CreateTableRequestDataAttributesFileMetadataCloudStorage: &datadogV2.CreateTableRequestDataAttributesFileMetadataCloudStorage{ AccessDetails: datadogV2.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails{ AwsDetail: &datadogV2.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail{ AwsAccountId: "123456789000", AwsBucketName: "example-data-bucket", FilePath: "reference-tables/users.csv", }, AzureDetail: &datadogV2.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail{ AzureClientId: "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", 
AzureContainerName: "reference-data", AzureStorageAccountName: "examplestorageaccount", AzureTenantId: "cccccccc-4444-5555-6666-dddddddddddd", FilePath: "tables/users.csv", }, GcpDetail: &datadogV2.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail{ FilePath: "data/reference_tables/users.csv", GcpBucketName: "example-data-bucket", GcpProjectId: "example-gcp-project-12345", GcpServiceAccountEmail: "example-service@example-gcp-project-12345.iam.gserviceaccount.com", }, }, SyncEnabled: false, }}, Schema: datadogV2.CreateTableRequestDataAttributesSchema{ Fields: []datadogV2.CreateTableRequestDataAttributesSchemaFieldsItems{ { Name: "field_1", Type: datadogV2.REFERENCETABLESCHEMAFIELDTYPE_STRING, }, }, PrimaryKeys: []string{ "field_1", }, }, Source: datadogV2.REFERENCETABLECREATESOURCETYPE_LOCAL_FILE, TableName: "table_1", Tags: []string{ "tag_1", "tag_2", }, }, Type: datadogV2.CREATETABLEREQUESTDATATYPE_REFERENCE_TABLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) resp, r, err := api.CreateReferenceTable(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.CreateReferenceTable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ReferenceTablesApi.CreateReferenceTable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create reference table ``` // Create reference table returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.CreateTableRequest; import com.datadog.api.client.v2.model.CreateTableRequestData; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributes; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadata; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadataCloudStorage; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesSchema; import com.datadog.api.client.v2.model.CreateTableRequestDataAttributesSchemaFieldsItems; import com.datadog.api.client.v2.model.CreateTableRequestDataType; import com.datadog.api.client.v2.model.ReferenceTableCreateSourceType; import com.datadog.api.client.v2.model.ReferenceTableSchemaFieldType; import com.datadog.api.client.v2.model.TableResultV2; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); CreateTableRequest body = new CreateTableRequest() .data( new CreateTableRequestData() .attributes( new CreateTableRequestDataAttributes() .fileMetadata( new CreateTableRequestDataAttributesFileMetadata( new CreateTableRequestDataAttributesFileMetadataCloudStorage() .accessDetails( new CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails() .awsDetail( new CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail() .awsAccountId("123456789000") .awsBucketName("example-data-bucket") .filePath("reference-tables/users.csv")) .azureDetail( new CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail() .azureClientId( "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb") .azureContainerName("reference-data") .azureStorageAccountName( "examplestorageaccount") .azureTenantId( "cccccccc-4444-5555-6666-dddddddddddd") .filePath("tables/users.csv")) .gcpDetail( new CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail() .filePath("data/reference_tables/users.csv") .gcpBucketName("example-data-bucket") .gcpProjectId("example-gcp-project-12345") .gcpServiceAccountEmail( "example-service@example-gcp-project-12345.iam.gserviceaccount.com"))) .syncEnabled(false))) .schema( new CreateTableRequestDataAttributesSchema() .fields( Collections.singletonList( new CreateTableRequestDataAttributesSchemaFieldsItems() .name("field_1") .type(ReferenceTableSchemaFieldType.STRING))) .primaryKeys(Collections.singletonList("field_1"))) .source(ReferenceTableCreateSourceType.LOCAL_FILE) .tableName("table_1") .tags(Arrays.asList("tag_1", "tag_2"))) .type(CreateTableRequestDataType.REFERENCE_TABLE)); try { TableResultV2 result = apiInstance.createReferenceTable(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#createReferenceTable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create reference table ``` // Create reference table returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; use datadog_api_client::datadogV2::model::CreateTableRequest; use datadog_api_client::datadogV2::model::CreateTableRequestData; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadata; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadataCloudStorage; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail; use 
datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesSchema; use datadog_api_client::datadogV2::model::CreateTableRequestDataAttributesSchemaFieldsItems; use datadog_api_client::datadogV2::model::CreateTableRequestDataType; use datadog_api_client::datadogV2::model::ReferenceTableCreateSourceType; use datadog_api_client::datadogV2::model::ReferenceTableSchemaFieldType; #[tokio::main] async fn main() { let body = CreateTableRequest ::new().data( CreateTableRequestData::new( CreateTableRequestDataType::REFERENCE_TABLE, ).attributes( CreateTableRequestDataAttributes::new( CreateTableRequestDataAttributesSchema::new( vec![ CreateTableRequestDataAttributesSchemaFieldsItems::new( "field_1".to_string(), ReferenceTableSchemaFieldType::STRING, ) ], vec!["field_1".to_string()], ), ReferenceTableCreateSourceType::LOCAL_FILE, "table_1".to_string(), ) .file_metadata( CreateTableRequestDataAttributesFileMetadata ::CreateTableRequestDataAttributesFileMetadataCloudStorage( Box::new( CreateTableRequestDataAttributesFileMetadataCloudStorage::new( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetails::new() .aws_detail( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail ::new( "123456789000".to_string(), "example-data-bucket".to_string(), "reference-tables/users.csv".to_string(), ), ) .azure_detail( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsAzureDetail ::new( "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb".to_string(), "reference-data".to_string(), "examplestorageaccount".to_string(), "cccccccc-4444-5555-6666-dddddddddddd".to_string(), "tables/users.csv".to_string(), ), ) .gcp_detail( CreateTableRequestDataAttributesFileMetadataOneOfAccessDetailsGcpDetail ::new( "data/reference_tables/users.csv".to_string(), "example-data-bucket".to_string(), "example-gcp-project-12345".to_string(), "example-service@example-gcp-project-12345.iam.gserviceaccount.com".to_string(), ), ), false, ), ), ), ) .tags(vec!["tag_1".to_string(), "tag_2".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.create_reference_table(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create reference table ``` /** * Create reference table returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiCreateReferenceTableRequest = { body: { data: { attributes: { fileMetadata: { accessDetails: { awsDetail: { awsAccountId: "123456789000", awsBucketName: "example-data-bucket", filePath: "reference-tables/users.csv", }, azureDetail: { azureClientId: "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", azureContainerName: "reference-data", azureStorageAccountName: "examplestorageaccount", azureTenantId: "cccccccc-4444-5555-6666-dddddddddddd", filePath: "tables/users.csv", }, gcpDetail: { filePath: 
"data/reference_tables/users.csv", gcpBucketName: "example-data-bucket", gcpProjectId: "example-gcp-project-12345", gcpServiceAccountEmail: "example-service@example-gcp-project-12345.iam.gserviceaccount.com", }, }, syncEnabled: false, }, schema: { fields: [ { name: "field_1", type: "STRING", }, ], primaryKeys: ["field_1"], }, source: "LOCAL_FILE", tableName: "table_1", tags: ["tag_1", "tag_2"], }, type: "reference_table", }, }, }; apiInstance .createReferenceTable(params) .then((data: v2.TableResultV2) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List tables](https://docs.datadoghq.com/api/latest/reference-tables/#list-tables) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#list-tables-v2) GET https://api.ap1.datadoghq.com/api/v2/reference-tables/tableshttps://api.ap2.datadoghq.com/api/v2/reference-tables/tableshttps://api.datadoghq.eu/api/v2/reference-tables/tableshttps://api.ddog-gov.com/api/v2/reference-tables/tableshttps://api.datadoghq.com/api/v2/reference-tables/tableshttps://api.us3.datadoghq.com/api/v2/reference-tables/tableshttps://api.us5.datadoghq.com/api/v2/reference-tables/tables ### Overview List all reference tables in this organization. ### Arguments #### Query Strings Name Type Description page[limit] integer Number of tables to return. page[offset] integer Number of tables to skip for pagination. sort enum Sort field and direction for the list of reference tables. Use field name for ascending, prefix with “-” for descending. Allowed enum values: `updated_at, table_name, status, -updated_at, -table_name, -status` filter[status] string Filter by table status. filter[table_name][exact] string Filter by exact table name match. filter[table_name][contains] string Filter by table name containing substring. ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#ListTables-200-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#ListTables-403-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#ListTables-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) List of reference tables. Field Type Description data [_required_] [object] The reference tables. attributes object Attributes that define the reference table's configuration and properties. created_by string UUID of the user who created the reference table. description string Optional text describing the purpose or contents of this reference table. file_metadata object Metadata specifying where and how to access the reference table's data file. For cloud storage tables (S3/GCS/Azure): * sync_enabled and access_details will always be present * error fields (error_message, error_row_count, error_type) are present only when errors occur For local file tables: * error fields (error_message, error_row_count) are present only when errors occur * sync_enabled, access_details are never present access_details object Cloud storage access configuration. 
Only present for cloud storage sources (S3, GCS, Azure). aws_detail object Amazon Web Services S3 storage access configuration. aws_account_id string AWS account ID where the S3 bucket is located. aws_bucket_name string S3 bucket containing the CSV file. file_path string The relative file path from the S3 bucket root to the CSV file. azure_detail object Azure Blob Storage access configuration. azure_client_id string Azure service principal (application) client ID with permissions to read from the container. azure_container_name string Azure Blob Storage container containing the CSV file. azure_storage_account_name string Azure storage account where the container is located. azure_tenant_id string Azure Active Directory tenant ID. file_path string The relative file path from the Azure container root to the CSV file. gcp_detail object Google Cloud Platform storage access configuration. file_path string The relative file path from the GCS bucket root to the CSV file. gcp_bucket_name string GCP bucket containing the CSV file. gcp_project_id string GCP project ID where the bucket is located. gcp_service_account_email string Service account email with read permissions for the GCS bucket. error_message string The error message returned from the last operation (sync for cloud storage, upload for local file). error_row_count int64 The number of rows that failed to process. error_type enum The type of error that occurred during file processing. Only applicable for cloud storage sources. Allowed enum values: `TABLE_SCHEMA_ERROR,FILE_FORMAT_ERROR,CONFIGURATION_ERROR,QUOTA_EXCEEDED,CONFLICT_ERROR,VALIDATION_ERROR,STATE_ERROR,OPERATION_ERROR,SYSTEM_ERROR` sync_enabled boolean Whether this table is synced automatically from cloud storage. Only applicable for cloud storage sources. last_updated_by string UUID of the user who last updated the reference table. row_count int64 The number of successfully processed rows in the reference table. schema object Schema defining the structure and columns of the reference table. fields [_required_] [object] The schema fields. name [_required_] string The field name. type [_required_] enum The field type for reference table schema fields. Allowed enum values: `STRING,INT32` primary_keys [_required_] [string] List of field names that serve as primary keys for the table. Only one primary key is supported, and it is used as an ID to retrieve rows. source enum The source type for reference table data. Includes all possible source types that can appear in responses. Allowed enum values: `LOCAL_FILE,S3,GCS,AZURE,SERVICENOW,SALESFORCE,DATABRICKS,SNOWFLAKE` status string The processing status of the table. table_name string Unique name to identify this reference table. Used in enrichment processors and API calls. tags [string] Tags for organizing and filtering reference tables. updated_at string When the reference table was last updated, in ISO 8601 format. id string Unique identifier for the reference table. type [_required_] enum Reference table resource type. 
Allowed enum values: `reference_table` default: `reference_table` ``` { "data": [ { "attributes": { "created_by": "00000000-0000-0000-0000-000000000000", "description": "example description", "file_metadata": { "access_details": {}, "error_message": "", "error_row_count": 0, "upload_id": "00000000-0000-0000-0000-000000000000" }, "last_updated_by": "", "row_count": 5, "schema": { "fields": [ { "name": "id", "type": "INT32" }, { "name": "name", "type": "STRING" } ], "primary_keys": [ "id" ] }, "source": "LOCAL_FILE", "status": "DONE", "table_name": "test_reference_table", "tags": [ "tag1", "tag2" ], "updated_at": "2000-01-01T01:00:00+00:00" }, "id": "00000000-0000-0000-0000-000000000000", "type": "reference_table" }, { "attributes": { "created_by": "00000000-0000-0000-0000-000000000000", "description": "example description", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "test-account-id", "aws_bucket_name": "test-bucket", "file_path": "test_rt.csv" } }, "error_message": "", "error_row_count": 0, "sync_enabled": true }, "last_updated_by": "00000000-0000-0000-0000-000000000000", "row_count": 5, "schema": { "fields": [ { "name": "location", "type": "STRING" }, { "name": "file_name", "type": "STRING" } ], "primary_keys": [ "location" ] }, "source": "S3", "status": "DONE", "table_name": "test_reference_table_2", "tags": [ "test_tag1", "tag2", "3" ], "updated_at": "2000-01-01T01:00:00+00:00" }, "id": "00000000-0000-0000-0000-000000000000", "type": "reference_table" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
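The query strings documented above (`page[limit]`, `page[offset]`, `sort`, and the `filter[...]` parameters) can be combined on a single request; the Go and Rust examples below pass optional-parameter objects for the same purpose. A minimal sketch that calls the endpoint directly, assuming the third-party `requests` library, the datadoghq.com site, and an illustrative filter value:

```
# Sketch: list reference tables with server-side filtering, sorting, and pagination.
import os
import requests

response = requests.get(
    "https://api.datadoghq.com/api/v2/reference-tables/tables",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "page[limit]": 10,
        "page[offset]": 0,
        "sort": "-updated_at",  # prefix with "-" for descending order
        "filter[table_name][contains]": "reference",  # illustrative substring filter
    },
)
response.raise_for_status()
for table in response.json()["data"]:
    print(table["id"], table["attributes"]["table_name"])
```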
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### List tables Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List tables ``` """ List tables returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) response = api_instance.list_tables() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List tables ``` # List tables returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new p api_instance.list_tables() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List tables ``` // List tables returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) resp, r, err := api.ListTables(ctx, *datadogV2.NewListTablesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.ListTables`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ReferenceTablesApi.ListTables`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List 
tables ``` // List tables returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.TableResultV2Array; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); try { TableResultV2Array result = apiInstance.listTables(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#listTables"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List tables ``` // List tables returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ListTablesOptionalParams; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.list_tables(ListTablesOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List tables ``` /** * List tables returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); apiInstance .listTables() .then((data: v2.TableResultV2Array) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get table](https://docs.datadoghq.com/api/latest/reference-tables/#get-table) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#get-table-v2) GET https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.eu/api/v2/reference-tables/tables/{id}https://api.ddog-gov.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id} ### Overview Get a reference table by ID ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to retrieve ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#GetTable-200-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#GetTable-403-v2) * [404](https://docs.datadoghq.com/api/latest/reference-tables/#GetTable-404-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#GetTable-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) A reference table resource containing its full configuration and state. Field Type Description data object The data object containing the reference table configuration and state. attributes object Attributes that define the reference table's configuration and properties. created_by string UUID of the user who created the reference table. description string Optional text describing the purpose or contents of this reference table. file_metadata object Metadata specifying where and how to access the reference table's data file. For cloud storage tables (S3/GCS/Azure): * sync_enabled and access_details will always be present * error fields (error_message, error_row_count, error_type) are present only when errors occur For local file tables: * error fields (error_message, error_row_count) are present only when errors occur * sync_enabled, access_details are never present access_details object Cloud storage access configuration. Only present for cloud storage sources (S3, GCS, Azure). aws_detail object Amazon Web Services S3 storage access configuration. aws_account_id string AWS account ID where the S3 bucket is located. aws_bucket_name string S3 bucket containing the CSV file. file_path string The relative file path from the S3 bucket root to the CSV file. azure_detail object Azure Blob Storage access configuration. azure_client_id string Azure service principal (application) client ID with permissions to read from the container. azure_container_name string Azure Blob Storage container containing the CSV file. azure_storage_account_name string Azure storage account where the container is located. azure_tenant_id string Azure Active Directory tenant ID. file_path string The relative file path from the Azure container root to the CSV file. 
gcp_detail object Google Cloud Platform storage access configuration. file_path string The relative file path from the GCS bucket root to the CSV file. gcp_bucket_name string GCP bucket containing the CSV file. gcp_project_id string GCP project ID where the bucket is located. gcp_service_account_email string Service account email with read permissions for the GCS bucket. error_message string The error message returned from the last operation (sync for cloud storage, upload for local file). error_row_count int64 The number of rows that failed to process. error_type enum The type of error that occurred during file processing. Only applicable for cloud storage sources. Allowed enum values: `TABLE_SCHEMA_ERROR,FILE_FORMAT_ERROR,CONFIGURATION_ERROR,QUOTA_EXCEEDED,CONFLICT_ERROR,VALIDATION_ERROR,STATE_ERROR,OPERATION_ERROR,SYSTEM_ERROR` sync_enabled boolean Whether this table is synced automatically from cloud storage. Only applicable for cloud storage sources. last_updated_by string UUID of the user who last updated the reference table. row_count int64 The number of successfully processed rows in the reference table. schema object Schema defining the structure and columns of the reference table. fields [_required_] [object] The schema fields. name [_required_] string The field name. type [_required_] enum The field type for reference table schema fields. Allowed enum values: `STRING,INT32` primary_keys [_required_] [string] List of field names that serve as primary keys for the table. Only one primary key is supported, and it is used as an ID to retrieve rows. source enum The source type for reference table data. Includes all possible source types that can appear in responses. Allowed enum values: `LOCAL_FILE,S3,GCS,AZURE,SERVICENOW,SALESFORCE,DATABRICKS,SNOWFLAKE` status string The processing status of the table. table_name string Unique name to identify this reference table. Used in enrichment processors and API calls. tags [string] Tags for organizing and filtering reference tables. updated_at string When the reference table was last updated, in ISO 8601 format. id string Unique identifier for the reference table. type [_required_] enum Reference table resource type. Allowed enum values: `reference_table` default: `reference_table` ``` { "data": { "attributes": { "created_by": "00000000-0000-0000-0000-000000000000", "description": "example description", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "123456789000", "aws_bucket_name": "my-bucket", "file_path": "path/to/file.csv" } }, "sync_enabled": true }, "last_updated_by": "00000000-0000-0000-0000-000000000000", "row_count": 5, "schema": { "fields": [ { "name": "id", "type": "INT32" }, { "name": "name", "type": "STRING" } ], "primary_keys": [ "id" ] }, "source": "S3", "status": "DONE", "table_name": "test_reference_table", "tags": [ "tag1", "tag2" ], "updated_at": "2000-01-01T01:00:00+00:00" }, "id": "00000000-0000-0000-0000-000000000000", "type": "reference_table" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Get table Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get table ``` """ Get table returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) response = api_instance.get_table( id="id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get table ``` # Get table returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new p api_instance.get_table("id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get table ``` // Get table returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) resp, r, err := api.GetTable(ctx, "id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.GetTable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ReferenceTablesApi.GetTable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get table ``` // Get table returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.TableResultV2; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); try { TableResultV2 result = apiInstance.getTable("id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#getTable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get table ``` // Get table returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.get_table("id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get table ``` /** * Get table returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiGetTableRequest = { id: "id", }; apiInstance .getTable(params) .then((data: v2.TableResultV2) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get rows by id](https://docs.datadoghq.com/api/latest/reference-tables/#get-rows-by-id) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#get-rows-by-id-v2) GET https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.eu/api/v2/reference-tables/tables/{id}/rowshttps://api.ddog-gov.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id}/rows ### Overview Get reference table rows by their primary key values. ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to get rows from #### Query Strings Name Type Description row_id [_required_] array List of row IDs (primary key values) to retrieve from the reference table. ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#GetRowsByID-200-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#GetRowsByID-403-v2) * [404](https://docs.datadoghq.com/api/latest/reference-tables/#GetRowsByID-404-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#GetRowsByID-429-v2) Some or all requested rows were found. * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) List of rows from a reference table query. Field Type Description data [_required_] [object] The rows. attributes object Column values for this row in the reference table. values object Key-value pairs representing the row data, where keys are field names from the schema. id string Row identifier, corresponding to the primary key value. type [_required_] enum Row resource type. Allowed enum values: `row` default: `row` ``` { "data": [ { "attributes": { "values": {} }, "id": "string", "type": "row" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Get rows by id Copy ``` # Path parameters export id="table-123" # Required query arguments export row_id_0="row_id_0" export row_id_1="row_id_1" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}/rows?row_id=${row_id_0}&row_id=${row_id_1}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get rows by id ``` """ Get rows by id returns "Some or all requested rows were found." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) response = api_instance.get_rows_by_id( id="id", row_id=[], ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get rows by id ``` # Get rows by id returns "Some or all requested rows were found." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new p api_instance.get_rows_by_id("id", []) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get rows by id ``` // Get rows by id returns "Some or all requested rows were found." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) resp, r, err := api.GetRowsByID(ctx, "id", []string{}) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.GetRowsByID`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ReferenceTablesApi.GetRowsByID`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get rows by id ``` // Get rows by id returns "Some or all requested rows were found." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.TableRowResourceArray; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); try { TableRowResourceArray result = apiInstance.getRowsByID("table-123", Arrays.asList("row1", "row2")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#getRowsByID"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get rows by id ``` // Get rows by id returns "Some or all requested rows were found." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.get_rows_by_id("id".to_string(), vec![]).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get rows by id ``` /** * Get rows by id returns "Some or all requested rows were found." 
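 * Note: the rowId values passed in the params are the primary key values of the rows to fetch
 * (this example passes an empty list).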
response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiGetRowsByIDRequest = { id: "id", rowId: [], }; apiInstance .getRowsByID(params) .then((data: v2.TableRowResourceArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update reference table](https://docs.datadoghq.com/api/latest/reference-tables/#update-reference-table) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#update-reference-table-v2) PATCH https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.eu/api/v2/reference-tables/tables/{id}https://api.ddog-gov.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id} ### Overview Update a reference table by ID. You can update the table’s data, description, and tags. Note: The source type cannot be changed after table creation. For data updates: For existing tables of type `source:LOCAL_FILE`, call POST api/v2/reference-tables/uploads first to get an upload ID, then PUT chunks of CSV data to each provided URL, and finally call this PATCH endpoint with the upload_id in file_metadata. For existing tables with `source:` types of `S3`, `GCS`, or `AZURE`, provide updated access_details in file_metadata pointing to a CSV file in the same type of cloud storage. ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to update ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Field Type Description data object The data object containing the partial table definition updates. attributes object Attributes that define the updates to the reference table's configuration and properties. description string Optional text describing the purpose or contents of this reference table. file_metadata Metadata specifying where and how to access the reference table's data file. Option 1 object Cloud storage file metadata for patch requests. Allows partial updates of access_details and sync_enabled. access_details object Cloud storage access configuration for the reference table data file. aws_detail object Amazon Web Services S3 storage access configuration. aws_account_id string AWS account ID where the S3 bucket is located. aws_bucket_name string S3 bucket containing the CSV file. file_path string The relative file path from the S3 bucket root to the CSV file. azure_detail object Azure Blob Storage access configuration. azure_client_id string Azure service principal (application) client ID with permissions to read from the container. 
azure_container_name string Azure Blob Storage container containing the CSV file. azure_storage_account_name string Azure storage account where the container is located. azure_tenant_id string Azure Active Directory tenant ID. file_path string The relative file path from the Azure container root to the CSV file. gcp_detail object Google Cloud Platform storage access configuration. file_path string The relative file path from the GCS bucket root to the CSV file. gcp_bucket_name string GCP bucket containing the CSV file. gcp_project_id string GCP project ID where the bucket is located. gcp_service_account_email string Service account email with read permissions for the GCS bucket. sync_enabled boolean Whether this table is synced automatically. Option 2 object Local file metadata for patch requests using upload ID. upload_id [_required_] string The upload ID. schema object Schema defining the updates to the structure and columns of the reference table. Schema fields cannot be deleted or renamed. fields [_required_] [object] The schema fields. name [_required_] string The field name. type [_required_] enum The field type for reference table schema fields. Allowed enum values: `STRING,INT32` primary_keys [_required_] [string] List of field names that serve as primary keys for the table. Only one primary key is supported, and it is used as an ID to retrieve rows. Primary keys cannot be changed after table creation. tags [string] Tags for organizing and filtering reference tables. type [_required_] enum Reference table resource type. Allowed enum values: `reference_table` default: `reference_table` ``` { "data": { "attributes": { "description": "example description", "file_metadata": { "access_details": { "aws_detail": { "aws_account_id": "123456789000", "aws_bucket_name": "example-data-bucket", "file_path": "reference-tables/users.csv" }, "azure_detail": { "azure_client_id": "aaaaaaaa-1111-2222-3333-bbbbbbbbbbbb", "azure_container_name": "reference-data", "azure_storage_account_name": "examplestorageaccount", "azure_tenant_id": "cccccccc-4444-5555-6666-dddddddddddd", "file_path": "tables/users.csv" }, "gcp_detail": { "file_path": "data/reference_tables/users.csv", "gcp_bucket_name": "example-data-bucket", "gcp_project_id": "example-gcp-project-12345", "gcp_service_account_email": "example-service@example-gcp-project-12345.iam.gserviceaccount.com" } }, "sync_enabled": false }, "schema": { "fields": [ { "name": "field_1", "type": "STRING" } ], "primary_keys": [ "field_1" ] }, "tags": [ "tag_1", "tag_2" ] }, "type": "reference_table" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#UpdateReferenceTable-200-v2) * [400](https://docs.datadoghq.com/api/latest/reference-tables/#UpdateReferenceTable-400-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#UpdateReferenceTable-403-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#UpdateReferenceTable-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Update reference table Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "schema": { "fields": [ { "name": "field_1", "type": "STRING" } ], "primary_keys": [ "field_1" ] } }, "type": "reference_table" } } EOF ``` ##### Update reference table ``` """ Update reference table returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi from datadog_api_client.v2.model.patch_table_request import PatchTableRequest from datadog_api_client.v2.model.patch_table_request_data import PatchTableRequestData from datadog_api_client.v2.model.patch_table_request_data_attributes import PatchTableRequestDataAttributes from datadog_api_client.v2.model.patch_table_request_data_attributes_file_metadata_cloud_storage import ( PatchTableRequestDataAttributesFileMetadataCloudStorage, ) from datadog_api_client.v2.model.patch_table_request_data_attributes_file_metadata_one_of_access_details import ( PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails, ) from datadog_api_client.v2.model.patch_table_request_data_attributes_file_metadata_one_of_access_details_aws_detail import ( PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail, ) from datadog_api_client.v2.model.patch_table_request_data_attributes_schema import PatchTableRequestDataAttributesSchema from datadog_api_client.v2.model.patch_table_request_data_attributes_schema_fields_items import ( PatchTableRequestDataAttributesSchemaFieldsItems, ) from datadog_api_client.v2.model.patch_table_request_data_type import PatchTableRequestDataType from datadog_api_client.v2.model.reference_table_schema_field_type import ReferenceTableSchemaFieldType body = PatchTableRequest( data=PatchTableRequestData( attributes=PatchTableRequestDataAttributes( description="this is a cloud table generated via a cloud bucket sync", file_metadata=PatchTableRequestDataAttributesFileMetadataCloudStorage( access_details=PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails( aws_detail=PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail( aws_account_id="test-account-id", 
aws_bucket_name="test-bucket", file_path="test_rt.csv", ), ), sync_enabled=True, ), schema=PatchTableRequestDataAttributesSchema( fields=[ PatchTableRequestDataAttributesSchemaFieldsItems( name="id", type=ReferenceTableSchemaFieldType.INT32, ), PatchTableRequestDataAttributesSchemaFieldsItems( name="name", type=ReferenceTableSchemaFieldType.STRING, ), ], primary_keys=[ "id", ], ), sync_enabled=False, tags=[ "test_tag", ], ), type=PatchTableRequestDataType.REFERENCE_TABLE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) api_instance.update_reference_table(id="id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update reference table ``` # Update reference table returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new body = DatadogAPIClient::V2::PatchTableRequest.new({ data: DatadogAPIClient::V2::PatchTableRequestData.new({ attributes: DatadogAPIClient::V2::PatchTableRequestDataAttributes.new({ description: "this is a cloud table generated via a cloud bucket sync", file_metadata: DatadogAPIClient::V2::PatchTableRequestDataAttributesFileMetadataCloudStorage.new({ access_details: DatadogAPIClient::V2::PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails.new({ aws_detail: DatadogAPIClient::V2::PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail.new({ aws_account_id: "test-account-id", aws_bucket_name: "test-bucket", file_path: "test_rt.csv", }), }), sync_enabled: true, }), schema: DatadogAPIClient::V2::PatchTableRequestDataAttributesSchema.new({ fields: [ DatadogAPIClient::V2::PatchTableRequestDataAttributesSchemaFieldsItems.new({ name: "id", type: DatadogAPIClient::V2::ReferenceTableSchemaFieldType::INT32, }), DatadogAPIClient::V2::PatchTableRequestDataAttributesSchemaFieldsItems.new({ name: "name", type: DatadogAPIClient::V2::ReferenceTableSchemaFieldType::STRING, }), ], primary_keys: [ "id", ], }), sync_enabled: false, tags: [ "test_tag", ], }), type: DatadogAPIClient::V2::PatchTableRequestDataType::REFERENCE_TABLE, }), }) p api_instance.update_reference_table("id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update reference table ``` // Update reference table returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.PatchTableRequest{ Data: &datadogV2.PatchTableRequestData{ Attributes: &datadogV2.PatchTableRequestDataAttributes{ Description: datadog.PtrString("this is a cloud table generated via a cloud bucket sync"), FileMetadata: &datadogV2.PatchTableRequestDataAttributesFileMetadata{ PatchTableRequestDataAttributesFileMetadataCloudStorage: 
&datadogV2.PatchTableRequestDataAttributesFileMetadataCloudStorage{ AccessDetails: &datadogV2.PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails{ AwsDetail: &datadogV2.PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail{ AwsAccountId: datadog.PtrString("test-account-id"), AwsBucketName: datadog.PtrString("test-bucket"), FilePath: datadog.PtrString("test_rt.csv"), }, }, SyncEnabled: datadog.PtrBool(true), }}, Schema: &datadogV2.PatchTableRequestDataAttributesSchema{ Fields: []datadogV2.PatchTableRequestDataAttributesSchemaFieldsItems{ { Name: "id", Type: datadogV2.REFERENCETABLESCHEMAFIELDTYPE_INT32, }, { Name: "name", Type: datadogV2.REFERENCETABLESCHEMAFIELDTYPE_STRING, }, }, PrimaryKeys: []string{ "id", }, }, SyncEnabled: datadog.PtrBool(false), Tags: []string{ "test_tag", }, }, Type: datadogV2.PATCHTABLEREQUESTDATATYPE_REFERENCE_TABLE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) r, err := api.UpdateReferenceTable(ctx, "id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.UpdateReferenceTable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update reference table ``` // Update reference table returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.PatchTableRequest; import com.datadog.api.client.v2.model.PatchTableRequestData; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributes; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesFileMetadata; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesFileMetadataCloudStorage; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesSchema; import com.datadog.api.client.v2.model.PatchTableRequestDataAttributesSchemaFieldsItems; import com.datadog.api.client.v2.model.PatchTableRequestDataType; import com.datadog.api.client.v2.model.ReferenceTableSchemaFieldType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); PatchTableRequest body = new PatchTableRequest() .data( new PatchTableRequestData() .attributes( new PatchTableRequestDataAttributes() .description("this is a cloud table generated via a cloud bucket sync") .fileMetadata( new PatchTableRequestDataAttributesFileMetadata( new PatchTableRequestDataAttributesFileMetadataCloudStorage() .accessDetails( new PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails() .awsDetail( new PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail() 
.awsAccountId("test-account-id") .awsBucketName("test-bucket") .filePath("test_rt.csv"))) .syncEnabled(true))) .schema( new PatchTableRequestDataAttributesSchema() .fields( Arrays.asList( new PatchTableRequestDataAttributesSchemaFieldsItems() .name("id") .type(ReferenceTableSchemaFieldType.INT32), new PatchTableRequestDataAttributesSchemaFieldsItems() .name("name") .type(ReferenceTableSchemaFieldType.STRING))) .primaryKeys(Collections.singletonList("id"))) .syncEnabled(false) .tags(Collections.singletonList("test_tag"))) .type(PatchTableRequestDataType.REFERENCE_TABLE)); try { apiInstance.updateReferenceTable("id", body); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#updateReferenceTable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update reference table ``` // Update reference table returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; use datadog_api_client::datadogV2::model::PatchTableRequest; use datadog_api_client::datadogV2::model::PatchTableRequestData; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributes; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesFileMetadata; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesFileMetadataCloudStorage; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesSchema; use datadog_api_client::datadogV2::model::PatchTableRequestDataAttributesSchemaFieldsItems; use datadog_api_client::datadogV2::model::PatchTableRequestDataType; use datadog_api_client::datadogV2::model::ReferenceTableSchemaFieldType; #[tokio::main] async fn main() { let body = PatchTableRequest ::new().data( PatchTableRequestData::new( PatchTableRequestDataType::REFERENCE_TABLE, ).attributes( PatchTableRequestDataAttributes::new() .description("this is a cloud table generated via a cloud bucket sync".to_string()) .file_metadata( PatchTableRequestDataAttributesFileMetadata ::PatchTableRequestDataAttributesFileMetadataCloudStorage( Box::new( PatchTableRequestDataAttributesFileMetadataCloudStorage::new() .access_details( PatchTableRequestDataAttributesFileMetadataOneOfAccessDetails ::new().aws_detail( PatchTableRequestDataAttributesFileMetadataOneOfAccessDetailsAwsDetail ::new() .aws_account_id("test-account-id".to_string()) .aws_bucket_name("test-bucket".to_string()) .file_path("test_rt.csv".to_string()), ), ) .sync_enabled(true), ), ), ) .schema( PatchTableRequestDataAttributesSchema::new( vec![ PatchTableRequestDataAttributesSchemaFieldsItems::new( "id".to_string(), ReferenceTableSchemaFieldType::INT32, ), PatchTableRequestDataAttributesSchemaFieldsItems::new( "name".to_string(), ReferenceTableSchemaFieldType::STRING, ) ], 
vec!["id".to_string()], ), ) .sync_enabled(false) .tags(vec!["test_tag".to_string()]), ), ); let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.update_reference_table("id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update reference table ``` /** * Update reference table returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiUpdateReferenceTableRequest = { body: { data: { attributes: { description: "this is a cloud table generated via a cloud bucket sync", fileMetadata: { accessDetails: { awsDetail: { awsAccountId: "test-account-id", awsBucketName: "test-bucket", filePath: "test_rt.csv", }, }, syncEnabled: true, }, schema: { fields: [ { name: "id", type: "INT32", }, { name: "name", type: "STRING", }, ], primaryKeys: ["id"], }, syncEnabled: false, tags: ["test_tag"], }, type: "reference_table", }, }, id: "id", }; apiInstance .updateReferenceTable(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete table](https://docs.datadoghq.com/api/latest/reference-tables/#delete-table) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#delete-table-v2) DELETE https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.eu/api/v2/reference-tables/tables/{id}https://api.ddog-gov.com/api/v2/reference-tables/tables/{id}https://api.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}https://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id} ### Overview Delete a reference table by ID ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to delete ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteTable-200-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteTable-403-v2) * [404](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteTable-404-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteTable-429-v2) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Delete table Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete table ``` """ Delete table returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) api_instance.delete_table( id="id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete table ``` # Delete table returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new p api_instance.delete_table("id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete table ``` // Delete table returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) r, err := api.DeleteTable(ctx, "id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.DeleteTable`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete table ``` // Delete table returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); try { apiInstance.deleteTable("id"); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#deleteTable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete table ``` // Delete table returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.delete_table("id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete table ``` /** * Delete table returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiDeleteTableRequest = { id: "id", }; apiInstance .deleteTable(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Upsert rows](https://docs.datadoghq.com/api/latest/reference-tables/#upsert-rows) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#upsert-rows-v2) POST https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.eu/api/v2/reference-tables/tables/{id}/rowshttps://api.ddog-gov.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id}/rows ### Overview Create or update rows in a Reference Table by their primary key values. If a row with the specified primary key exists, it is updated; otherwise, a new row is created. ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to upsert rows into ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Field Type Description data [_required_] [object] attributes object Attributes containing row data values for row creation or update operations. values [_required_] object Key-value pairs representing row data, where keys are schema field names and values match the corresponding column types. Types allowed for Reference Table row values. Option 1 string Option 2 int32 id [_required_] string type [_required_] enum Row resource type. Allowed enum values: `row` default: `row` ``` { "data": [ { "attributes": { "values": { "": { "example": "undefined", "type": "undefined" } } }, "id": "primary_key_value", "type": "row" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-200-v2) * [400](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-400-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-403-v2) * [404](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-404-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-429-v2) * [500](https://docs.datadoghq.com/api/latest/reference-tables/#UpsertRows-500-v2) Rows created or updated successfully Bad Request * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Upsert rows Copy ``` ## Upsert a row with mixed string and int values # # Path parameters export id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}/rows" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "attributes": { "values": { "age": 25, "example_key_value": "primary_key_value", "name": "row_name" } }, "id": "primary_key_value", "type": "row" } ] } EOF ``` ##### Upsert rows ``` """ Upsert rows returns "Rows created or updated successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi from datadog_api_client.v2.model.batch_upsert_rows_request_array import BatchUpsertRowsRequestArray from datadog_api_client.v2.model.batch_upsert_rows_request_data import BatchUpsertRowsRequestData from datadog_api_client.v2.model.batch_upsert_rows_request_data_attributes import BatchUpsertRowsRequestDataAttributes from datadog_api_client.v2.model.table_row_resource_data_type import TableRowResourceDataType body = BatchUpsertRowsRequestArray( data=[ BatchUpsertRowsRequestData( attributes=BatchUpsertRowsRequestDataAttributes( values=dict( example_key_value="primary_key_value", name="row_name", ), ), id="primary_key_value", type=TableRowResourceDataType.ROW, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) api_instance.upsert_rows(id="id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Upsert rows ``` # Upsert rows returns "Rows created or updated successfully" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new body = DatadogAPIClient::V2::BatchUpsertRowsRequestArray.new({ data: [ DatadogAPIClient::V2::BatchUpsertRowsRequestData.new({ attributes: DatadogAPIClient::V2::BatchUpsertRowsRequestDataAttributes.new({ values: { example_key_value: "primary_key_value", name: "row_name", }, }), id: "primary_key_value", type: DatadogAPIClient::V2::TableRowResourceDataType::ROW, }), ], }) p api_instance.upsert_rows("id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Upsert rows ``` // Upsert rows returns "Rows created or updated successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.BatchUpsertRowsRequestArray{ Data: []datadogV2.BatchUpsertRowsRequestData{ { Attributes: &datadogV2.BatchUpsertRowsRequestDataAttributes{ Values: map[string]interface{}{ "example_key_value": "primary_key_value", "name": "row_name", }, }, Id: "primary_key_value", Type: datadogV2.TABLEROWRESOURCEDATATYPE_ROW, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) r, err := api.UpsertRows(ctx, "id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.UpsertRows`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Upsert rows ``` // Upsert rows returns "Rows created or updated successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.BatchUpsertRowsRequestArray; import com.datadog.api.client.v2.model.BatchUpsertRowsRequestData; import com.datadog.api.client.v2.model.BatchUpsertRowsRequestDataAttributes; import com.datadog.api.client.v2.model.TableRowResourceDataType; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); BatchUpsertRowsRequestArray body = new BatchUpsertRowsRequestArray() .data( Collections.singletonList( new BatchUpsertRowsRequestData() .attributes( new BatchUpsertRowsRequestDataAttributes() .values( Map.ofEntries( Map.entry("example_key_value", "primary_key_value"), Map.entry("name", 
"row_name")))) .id("primary_key_value") .type(TableRowResourceDataType.ROW))); try { apiInstance.upsertRows("id", body); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#upsertRows"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Upsert rows ``` // Upsert rows returns "Rows created or updated successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; use datadog_api_client::datadogV2::model::BatchUpsertRowsRequestArray; use datadog_api_client::datadogV2::model::BatchUpsertRowsRequestData; use datadog_api_client::datadogV2::model::BatchUpsertRowsRequestDataAttributes; use datadog_api_client::datadogV2::model::TableRowResourceDataType; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = BatchUpsertRowsRequestArray::new(vec![BatchUpsertRowsRequestData::new( "primary_key_value".to_string(), TableRowResourceDataType::ROW, ) .attributes(BatchUpsertRowsRequestDataAttributes::new(BTreeMap::from([ ( "example_key_value".to_string(), Value::from("primary_key_value"), ), ("name".to_string(), Value::from("row_name")), ])))]); let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.upsert_rows("id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Upsert rows ``` /** * Upsert rows returns "Rows created or updated successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiUpsertRowsRequest = { body: { data: [ { attributes: { values: { example_key_value: "primary_key_value", name: "row_name", }, }, id: "primary_key_value", type: "row", }, ], }, id: "id", }; apiInstance .upsertRows(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete rows](https://docs.datadoghq.com/api/latest/reference-tables/#delete-rows) * [v2 (latest)](https://docs.datadoghq.com/api/latest/reference-tables/#delete-rows-v2) DELETE https://api.ap1.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.ap2.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.eu/api/v2/reference-tables/tables/{id}/rowshttps://api.ddog-gov.com/api/v2/reference-tables/tables/{id}/rowshttps://api.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us3.datadoghq.com/api/v2/reference-tables/tables/{id}/rowshttps://api.us5.datadoghq.com/api/v2/reference-tables/tables/{id}/rows ### Overview Delete multiple rows from a Reference Table by their primary key values. ### Arguments #### Path Parameters Name Type Description id [_required_] string Unique identifier of the reference table to delete rows from ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) Field Type Description data [_required_] [object] id [_required_] string type [_required_] enum Row resource type. Allowed enum values: `row` default: `row` ``` { "data": [ { "id": "primary_key_value", "type": "row" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-200-v2) * [400](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-400-v2) * [403](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-403-v2) * [404](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-404-v2) * [429](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-429-v2) * [500](https://docs.datadoghq.com/api/latest/reference-tables/#DeleteRows-500-v2) Rows deleted successfully Bad Request * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/reference-tables/) * [Example](https://docs.datadoghq.com/api/latest/reference-tables/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/reference-tables/?code-lang=typescript) ##### Delete rows Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/reference-tables/tables/${id}/rows" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "primary_key_value", "type": "row" } ] } EOF ``` ##### Delete rows ``` """ Delete rows returns "Rows deleted successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.reference_tables_api import ReferenceTablesApi from datadog_api_client.v2.model.batch_delete_rows_request_array import BatchDeleteRowsRequestArray from datadog_api_client.v2.model.batch_delete_rows_request_data import BatchDeleteRowsRequestData from datadog_api_client.v2.model.table_row_resource_data_type import TableRowResourceDataType body = BatchDeleteRowsRequestArray( data=[ BatchDeleteRowsRequestData( id="primary_key_value", type=TableRowResourceDataType.ROW, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ReferenceTablesApi(api_client) api_instance.delete_rows(id="id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete rows ``` # Delete rows returns "Rows deleted successfully" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ReferenceTablesAPI.new body = DatadogAPIClient::V2::BatchDeleteRowsRequestArray.new({ data: [ DatadogAPIClient::V2::BatchDeleteRowsRequestData.new({ id: "primary_key_value", type: DatadogAPIClient::V2::TableRowResourceDataType::ROW, }), ], }) p api_instance.delete_rows("id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete rows ``` // Delete rows returns "Rows 
deleted successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.BatchDeleteRowsRequestArray{ Data: []datadogV2.BatchDeleteRowsRequestData{ { Id: "primary_key_value", Type: datadogV2.TABLEROWRESOURCEDATATYPE_ROW, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewReferenceTablesApi(apiClient) r, err := api.DeleteRows(ctx, "id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ReferenceTablesApi.DeleteRows`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete rows ``` // Delete rows returns "Rows deleted successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ReferenceTablesApi; import com.datadog.api.client.v2.model.BatchDeleteRowsRequestArray; import com.datadog.api.client.v2.model.BatchDeleteRowsRequestData; import com.datadog.api.client.v2.model.TableRowResourceDataType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ReferenceTablesApi apiInstance = new ReferenceTablesApi(defaultClient); BatchDeleteRowsRequestArray body = new BatchDeleteRowsRequestArray() .data( Collections.singletonList( new BatchDeleteRowsRequestData() .id("primary_key_value") .type(TableRowResourceDataType.ROW))); try { apiInstance.deleteRows("id", body); } catch (ApiException e) { System.err.println("Exception when calling ReferenceTablesApi#deleteRows"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete rows ``` // Delete rows returns "Rows deleted successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_reference_tables::ReferenceTablesAPI; use datadog_api_client::datadogV2::model::BatchDeleteRowsRequestArray; use datadog_api_client::datadogV2::model::BatchDeleteRowsRequestData; use datadog_api_client::datadogV2::model::TableRowResourceDataType; #[tokio::main] async fn main() { let body = BatchDeleteRowsRequestArray::new(vec![BatchDeleteRowsRequestData::new( "primary_key_value".to_string(), TableRowResourceDataType::ROW, )]); let configuration = datadog::Configuration::new(); let api = ReferenceTablesAPI::with_config(configuration); let resp = api.delete_rows("id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", 
resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete rows ``` /** * Delete rows returns "Rows deleted successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ReferenceTablesApi(configuration); const params: v2.ReferenceTablesApiDeleteRowsRequest = { body: { data: [ { id: "primary_key_value", type: "row", }, ], }, id: "id", }; apiInstance .deleteRows(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/restriction-policies/ # Restriction Policies A restriction policy defines the access control rules for a resource, mapping a set of relations (such as editor and viewer) to a set of allowed principals (such as roles, teams, or users). The restriction policy determines who is authorized to perform what actions on the resource. 
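As a concrete illustration of this relation-to-principal mapping, here is a minimal Python sketch that uses the same client models shown in the endpoint examples below to replace a dashboard's restriction policy with two bindings. The dashboard, role, and team IDs are placeholders; credentials are read from the `DD_API_KEY` and `DD_APP_KEY` environment variables.

```
# Minimal sketch (placeholder IDs): grant a role "editor" access and a team
# "viewer" access on a dashboard by replacing its restriction policy.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.restriction_policies_api import RestrictionPoliciesApi
from datadog_api_client.v2.model.restriction_policy import RestrictionPolicy
from datadog_api_client.v2.model.restriction_policy_attributes import RestrictionPolicyAttributes
from datadog_api_client.v2.model.restriction_policy_binding import RestrictionPolicyBinding
from datadog_api_client.v2.model.restriction_policy_type import RestrictionPolicyType
from datadog_api_client.v2.model.restriction_policy_update_request import RestrictionPolicyUpdateRequest

# The resource identifier is always formatted as "type:id", and the policy's
# own id must match the resource_id used in the request path.
resource_id = "dashboard:abc-def-ghi"

body = RestrictionPolicyUpdateRequest(
    data=RestrictionPolicy(
        id=resource_id,
        type=RestrictionPolicyType.RESTRICTION_POLICY,
        attributes=RestrictionPolicyAttributes(
            bindings=[
                # Principals are also "type:id"; role, team, user, and org are supported.
                RestrictionPolicyBinding(
                    relation="editor",
                    principals=["role:00000000-0000-1111-0000-000000000000"],
                ),
                RestrictionPolicyBinding(
                    relation="viewer",
                    principals=["team:11111111-2222-3333-4444-555555555555"],
                ),
            ],
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RestrictionPoliciesApi(api_client)
    print(api_instance.update_restriction_policy(resource_id=resource_id, body=body))
```

By default the API prevents admins from removing their own access to the resource; the `allow_self_lockout=true` query string described below overrides this.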
## [Update a restriction policy](https://docs.datadoghq.com/api/latest/restriction-policies/#update-a-restriction-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/restriction-policies/#update-a-restriction-policy-v2) POST https://api.ap1.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.ap2.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.eu/api/v2/restriction_policy/{resource_id}https://api.ddog-gov.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us3.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us5.datadoghq.com/api/v2/restriction_policy/{resource_id} ### Overview Updates the restriction policy associated with a resource. #### [Supported resources](https://docs.datadoghq.com/api/latest/restriction-policies/#supported-resources) Restriction policies can be applied to the following resources: * Dashboards: `dashboard` * Integration Services: `integration-service` * Integration Webhooks: `integration-webhook` * Notebooks: `notebook` * Powerpacks: `powerpack` * Reference Tables: `reference-table` * Security Rules: `security-rule` * Service Level Objectives: `slo` * Synthetic Global Variables: `synthetics-global-variable` * Synthetic Tests: `synthetics-test` * Synthetic Private Locations: `synthetics-private-location` * Monitors: `monitor` * Workflows: `workflow` * App Builder Apps: `app-builder-app` * Connections: `connection` * Connection Groups: `connection-group` * RUM Applications: `rum-application` * Cross Org Connections: `cross-org-connection` * Spreadsheets: `spreadsheet` * On-Call Schedules: `on-call-schedule` * On-Call Escalation Policies: `on-call-escalation-policy` * On-Call Team Routing Rules: `on-call-team-routing-rules` * Logs Pipelines: `logs-pipeline` #### [Supported relations for resources](https://docs.datadoghq.com/api/latest/restriction-policies/#supported-relations-for-resources) Resource Type | Supported Relations ---|--- Dashboards | `viewer`, `editor` Integration Services | `viewer`, `editor` Integration Webhooks | `viewer`, `editor` Notebooks | `viewer`, `editor` Powerpacks | `viewer`, `editor` Security Rules | `viewer`, `editor` Service Level Objectives | `viewer`, `editor` Synthetic Global Variables | `viewer`, `editor` Synthetic Tests | `viewer`, `editor` Synthetic Private Locations | `viewer`, `editor` Monitors | `viewer`, `editor` Reference Tables | `viewer`, `editor` Workflows | `viewer`, `runner`, `editor` App Builder Apps | `viewer`, `editor` Connections | `viewer`, `resolver`, `editor` Connection Groups | `viewer`, `editor` RUM Application | `viewer`, `editor` Cross Org Connections | `viewer`, `editor` Spreadsheets | `viewer`, `editor` On-Call Schedules | `viewer`, `overrider`, `editor` On-Call Escalation Policies | `viewer`, `editor` On-Call Team Routing Rules | `viewer`, `editor` Logs Pipelines | `viewer`, `processors_editor`, `editor` ### Arguments #### Path Parameters Name Type Description resource_id [_required_] string Identifier, formatted as `type:id`. Supported types: `dashboard`, `integration-service`, `integration-webhook`, `notebook`, `reference-table`, `security-rule`, `slo`, `workflow`, `app-builder-app`, `connection`, `connection-group`, `rum-application`, `cross-org-connection`, `spreadsheet`, `on-call-schedule`, `on-call-escalation-policy`, `on-call-team-routing-rules`, `logs-pipeline`. 
#### Query Strings Name Type Description allow_self_lockout boolean Allows admins (users with the `user_access_manage` permission) to remove their own access from the resource if set to `true`. By default, this is set to `false`, preventing admins from locking themselves out. ### Request #### Body Data (required) Restriction policy payload * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) Field Type Description data [_required_] object Restriction policy object. attributes [_required_] object Restriction policy attributes. bindings [_required_] [object] An array of bindings. principals [_required_] [string] An array of principals. A principal is a subject or group of subjects. Each principal is formatted as `type:id`. Supported types: `role`, `team`, `user`, and `org`. The org ID can be obtained through the api/v2/current_user API. The user principal type accepts service account IDs. relation [_required_] string The role/level of access. id [_required_] string The identifier, always equivalent to the value specified in the `resource_id` path parameter. type [_required_] enum Restriction policy type. Allowed enum values: `restriction_policy` default: `restriction_policy` ``` { "data": { "id": "dashboard:test-update", "type": "restriction_policy", "attributes": { "bindings": [ { "relation": "editor", "principals": [ "org:00000000-0000-beef-0000-000000000000" ] } ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/restriction-policies/#UpdateRestrictionPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/restriction-policies/#UpdateRestrictionPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/restriction-policies/#UpdateRestrictionPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/restriction-policies/#UpdateRestrictionPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) Response containing information about a single restriction policy. Field Type Description data [_required_] object Restriction policy object. attributes [_required_] object Restriction policy attributes. bindings [_required_] [object] An array of bindings. principals [_required_] [string] An array of principals. A principal is a subject or group of subjects. Each principal is formatted as `type:id`. Supported types: `role`, `team`, `user`, and `org`. The org ID can be obtained through the api/v2/current_user API. The user principal type accepts service account IDs. relation [_required_] string The role/level of access. id [_required_] string The identifier, always equivalent to the value specified in the `resource_id` path parameter. type [_required_] enum Restriction policy type. Allowed enum values: `restriction_policy` default: `restriction_policy` ``` { "data": { "attributes": { "bindings": [ { "principals": [ "role:00000000-0000-1111-0000-000000000000" ], "relation": "editor" } ] }, "id": "dashboard:abc-def-ghi", "type": "restriction_policy" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=typescript) ##### Update a restriction policy returns "OK" response Copy ``` # Path parameters export resource_id="dashboard:abc-def-ghi" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/restriction_policy/${resource_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "dashboard:test-update", "type": "restriction_policy", "attributes": { "bindings": [ { "relation": "editor", "principals": [ "org:00000000-0000-beef-0000-000000000000" ] } ] } } } EOF ``` ##### Update a restriction policy returns "OK" response ``` // Update a restriction policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system body := datadogV2.RestrictionPolicyUpdateRequest{ Data: datadogV2.RestrictionPolicy{ Id: "dashboard:test-update", Type: datadogV2.RESTRICTIONPOLICYTYPE_RESTRICTION_POLICY, Attributes: datadogV2.RestrictionPolicyAttributes{ Bindings: []datadogV2.RestrictionPolicyBinding{ { Relation: "editor", Principals: []string{ "org:00000000-0000-beef-0000-000000000000", }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRestrictionPoliciesApi(apiClient) resp, r, err := api.UpdateRestrictionPolicy(ctx, "dashboard:test-update", body, *datadogV2.NewUpdateRestrictionPolicyOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RestrictionPoliciesApi.UpdateRestrictionPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RestrictionPoliciesApi.UpdateRestrictionPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a restriction policy returns "OK" response ``` // Update a restriction policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RestrictionPoliciesApi; import com.datadog.api.client.v2.model.RestrictionPolicy; import com.datadog.api.client.v2.model.RestrictionPolicyAttributes; import com.datadog.api.client.v2.model.RestrictionPolicyBinding; import com.datadog.api.client.v2.model.RestrictionPolicyResponse; import com.datadog.api.client.v2.model.RestrictionPolicyType; import com.datadog.api.client.v2.model.RestrictionPolicyUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RestrictionPoliciesApi apiInstance = new RestrictionPoliciesApi(defaultClient); // there is a valid "user" in the system String USER_DATA_RELATIONSHIPS_ORG_DATA_ID = System.getenv("USER_DATA_RELATIONSHIPS_ORG_DATA_ID"); RestrictionPolicyUpdateRequest body = new RestrictionPolicyUpdateRequest() .data( new RestrictionPolicy() .id("dashboard:test-update") .type(RestrictionPolicyType.RESTRICTION_POLICY) .attributes( new RestrictionPolicyAttributes() .bindings( Collections.singletonList( new RestrictionPolicyBinding() .relation("editor") .principals( Collections.singletonList( "org:00000000-0000-beef-0000-000000000000")))))); try { RestrictionPolicyResponse result = apiInstance.updateRestrictionPolicy("dashboard:test-update", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RestrictionPoliciesApi#updateRestrictionPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a restriction policy returns "OK" response ``` """ Update a restriction policy returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.restriction_policies_api import RestrictionPoliciesApi from datadog_api_client.v2.model.restriction_policy import RestrictionPolicy from datadog_api_client.v2.model.restriction_policy_attributes import RestrictionPolicyAttributes from datadog_api_client.v2.model.restriction_policy_binding import RestrictionPolicyBinding from datadog_api_client.v2.model.restriction_policy_type import RestrictionPolicyType from datadog_api_client.v2.model.restriction_policy_update_request import RestrictionPolicyUpdateRequest # there is a valid "user" in the system USER_DATA_RELATIONSHIPS_ORG_DATA_ID = environ["USER_DATA_RELATIONSHIPS_ORG_DATA_ID"] body = RestrictionPolicyUpdateRequest( data=RestrictionPolicy( id="dashboard:test-update", 
type=RestrictionPolicyType.RESTRICTION_POLICY, attributes=RestrictionPolicyAttributes( bindings=[ RestrictionPolicyBinding( relation="editor", principals=[ "org:00000000-0000-beef-0000-000000000000", ], ), ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RestrictionPoliciesApi(api_client) response = api_instance.update_restriction_policy(resource_id="dashboard:test-update", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a restriction policy returns "OK" response ``` # Update a restriction policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RestrictionPoliciesAPI.new # there is a valid "user" in the system USER_DATA_RELATIONSHIPS_ORG_DATA_ID = ENV["USER_DATA_RELATIONSHIPS_ORG_DATA_ID"] body = DatadogAPIClient::V2::RestrictionPolicyUpdateRequest.new({ data: DatadogAPIClient::V2::RestrictionPolicy.new({ id: "dashboard:test-update", type: DatadogAPIClient::V2::RestrictionPolicyType::RESTRICTION_POLICY, attributes: DatadogAPIClient::V2::RestrictionPolicyAttributes.new({ bindings: [ DatadogAPIClient::V2::RestrictionPolicyBinding.new({ relation: "editor", principals: [ "org:00000000-0000-beef-0000-000000000000", ], }), ], }), }), }) p api_instance.update_restriction_policy("dashboard:test-update", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a restriction policy returns "OK" response ``` // Update a restriction policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_restriction_policies::RestrictionPoliciesAPI; use datadog_api_client::datadogV2::api_restriction_policies::UpdateRestrictionPolicyOptionalParams; use datadog_api_client::datadogV2::model::RestrictionPolicy; use datadog_api_client::datadogV2::model::RestrictionPolicyAttributes; use datadog_api_client::datadogV2::model::RestrictionPolicyBinding; use datadog_api_client::datadogV2::model::RestrictionPolicyType; use datadog_api_client::datadogV2::model::RestrictionPolicyUpdateRequest; #[tokio::main] async fn main() { // there is a valid "user" in the system let body = RestrictionPolicyUpdateRequest::new(RestrictionPolicy::new( RestrictionPolicyAttributes::new(vec![RestrictionPolicyBinding::new( vec!["org:00000000-0000-beef-0000-000000000000".to_string()], "editor".to_string(), )]), "dashboard:test-update".to_string(), RestrictionPolicyType::RESTRICTION_POLICY, )); let configuration = datadog::Configuration::new(); let api = RestrictionPoliciesAPI::with_config(configuration); let resp = api .update_restriction_policy( "dashboard:test-update".to_string(), body, UpdateRestrictionPolicyOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a restriction policy returns "OK" response ``` /** * Update a restriction policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RestrictionPoliciesApi(configuration); // there is a valid "user" in the system const params: v2.RestrictionPoliciesApiUpdateRestrictionPolicyRequest = { body: { data: { id: "dashboard:test-update", type: "restriction_policy", attributes: { bindings: [ { relation: "editor", principals: ["org:00000000-0000-beef-0000-000000000000"], }, ], }, }, }, resourceId: "dashboard:test-update", }; apiInstance .updateRestrictionPolicy(params) .then((data: v2.RestrictionPolicyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a restriction policy](https://docs.datadoghq.com/api/latest/restriction-policies/#get-a-restriction-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/restriction-policies/#get-a-restriction-policy-v2) GET https://api.ap1.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.ap2.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.eu/api/v2/restriction_policy/{resource_id}https://api.ddog-gov.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us3.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us5.datadoghq.com/api/v2/restriction_policy/{resource_id} ### Overview Retrieves the restriction policy associated with a specified resource. ### Arguments #### Path Parameters Name Type Description resource_id [_required_] string Identifier, formatted as `type:id`. Supported types: `dashboard`, `integration-service`, `integration-webhook`, `notebook`, `reference-table`, `security-rule`, `slo`, `workflow`, `app-builder-app`, `connection`, `connection-group`, `rum-application`, `cross-org-connection`, `spreadsheet`, `on-call-schedule`, `on-call-escalation-policy`, `on-call-team-routing-rules`, `logs-pipeline`. ### Response * [200](https://docs.datadoghq.com/api/latest/restriction-policies/#GetRestrictionPolicy-200-v2) * [400](https://docs.datadoghq.com/api/latest/restriction-policies/#GetRestrictionPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/restriction-policies/#GetRestrictionPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/restriction-policies/#GetRestrictionPolicy-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) Response containing information about a single restriction policy. Field Type Description data [_required_] object Restriction policy object. 
attributes [_required_] object Restriction policy attributes. bindings [_required_] [object] An array of bindings. principals [_required_] [string] An array of principals. A principal is a subject or group of subjects. Each principal is formatted as `type:id`. Supported types: `role`, `team`, `user`, and `org`. The org ID can be obtained through the api/v2/current_user API. The user principal type accepts service account IDs. relation [_required_] string The role/level of access. id [_required_] string The identifier, always equivalent to the value specified in the `resource_id` path parameter. type [_required_] enum Restriction policy type. Allowed enum values: `restriction_policy` default: `restriction_policy` ``` { "data": { "attributes": { "bindings": [ { "principals": [ "role:00000000-0000-1111-0000-000000000000" ], "relation": "editor" } ] }, "id": "dashboard:abc-def-ghi", "type": "restriction_policy" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=typescript) ##### Get a restriction policy Copy ``` # Path parameters export resource_id="dashboard:abc-def-ghi" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/restriction_policy/${resource_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a restriction policy ``` """ Get a restriction policy returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.restriction_policies_api import RestrictionPoliciesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RestrictionPoliciesApi(api_client) response = api_instance.get_restriction_policy( resource_id="dashboard:test-get", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a restriction policy ``` # Get a restriction policy returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RestrictionPoliciesAPI.new p api_instance.get_restriction_policy("dashboard:test-get") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a restriction policy ``` // Get a restriction policy returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRestrictionPoliciesApi(apiClient) resp, r, err := api.GetRestrictionPolicy(ctx, "dashboard:test-get") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RestrictionPoliciesApi.GetRestrictionPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RestrictionPoliciesApi.GetRestrictionPolicy`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a restriction policy ``` // Get a restriction policy returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RestrictionPoliciesApi; import com.datadog.api.client.v2.model.RestrictionPolicyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RestrictionPoliciesApi apiInstance = new RestrictionPoliciesApi(defaultClient); try { RestrictionPolicyResponse result = apiInstance.getRestrictionPolicy("dashboard:test-get"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RestrictionPoliciesApi#getRestrictionPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a restriction policy ``` // Get a restriction policy returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_restriction_policies::RestrictionPoliciesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RestrictionPoliciesAPI::with_config(configuration); let resp = api .get_restriction_policy("dashboard:test-get".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a restriction policy ``` /** * Get a restriction policy returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RestrictionPoliciesApi(configuration); const params: v2.RestrictionPoliciesApiGetRestrictionPolicyRequest = { resourceId: "dashboard:test-get", }; apiInstance .getRestrictionPolicy(params) .then((data: v2.RestrictionPolicyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a restriction policy](https://docs.datadoghq.com/api/latest/restriction-policies/#delete-a-restriction-policy) * [v2 (latest)](https://docs.datadoghq.com/api/latest/restriction-policies/#delete-a-restriction-policy-v2) DELETE https://api.ap1.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.ap2.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.eu/api/v2/restriction_policy/{resource_id}https://api.ddog-gov.com/api/v2/restriction_policy/{resource_id}https://api.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us3.datadoghq.com/api/v2/restriction_policy/{resource_id}https://api.us5.datadoghq.com/api/v2/restriction_policy/{resource_id} ### Overview Deletes the restriction policy associated with a specified resource. ### Arguments #### Path Parameters Name Type Description resource_id [_required_] string Identifier, formatted as `type:id`. Supported types: `dashboard`, `integration-service`, `integration-webhook`, `notebook`, `reference-table`, `security-rule`, `slo`, `workflow`, `app-builder-app`, `connection`, `connection-group`, `rum-application`, `cross-org-connection`, `spreadsheet`, `on-call-schedule`, `on-call-escalation-policy`, `on-call-team-routing-rules`, `logs-pipeline`. ### Response * [204](https://docs.datadoghq.com/api/latest/restriction-policies/#DeleteRestrictionPolicy-204-v2) * [400](https://docs.datadoghq.com/api/latest/restriction-policies/#DeleteRestrictionPolicy-400-v2) * [403](https://docs.datadoghq.com/api/latest/restriction-policies/#DeleteRestrictionPolicy-403-v2) * [429](https://docs.datadoghq.com/api/latest/restriction-policies/#DeleteRestrictionPolicy-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/restriction-policies/) * [Example](https://docs.datadoghq.com/api/latest/restriction-policies/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/restriction-policies/?code-lang=typescript) ##### Delete a restriction policy Copy ``` # Path parameters export resource_id="dashboard:abc-def-ghi" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/restriction_policy/${resource_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a restriction policy ``` """ Delete a restriction policy returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.restriction_policies_api import RestrictionPoliciesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RestrictionPoliciesApi(api_client) api_instance.delete_restriction_policy( resource_id="dashboard:test-delete", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a restriction policy ``` # Delete a restriction policy returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RestrictionPoliciesAPI.new api_instance.delete_restriction_policy("dashboard:test-delete") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a restriction policy ``` // Delete a restriction policy returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRestrictionPoliciesApi(apiClient) r, err := api.DeleteRestrictionPolicy(ctx, "dashboard:test-delete") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RestrictionPoliciesApi.DeleteRestrictionPolicy`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a restriction policy ``` // Delete a restriction policy returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RestrictionPoliciesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RestrictionPoliciesApi apiInstance = new RestrictionPoliciesApi(defaultClient); try { apiInstance.deleteRestrictionPolicy("dashboard:test-delete"); } catch (ApiException e) { System.err.println("Exception when calling RestrictionPoliciesApi#deleteRestrictionPolicy"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a restriction policy ``` // Delete a restriction policy returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_restriction_policies::RestrictionPoliciesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RestrictionPoliciesAPI::with_config(configuration); let resp = api .delete_restriction_policy("dashboard:test-delete".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a restriction policy ``` /** * Delete a restriction policy returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RestrictionPoliciesApi(configuration); const params: v2.RestrictionPoliciesApiDeleteRestrictionPolicyRequest = { resourceId: "dashboard:test-delete", }; apiInstance .deleteRestrictionPolicy(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts`, and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

---

# Source: https://docs.datadoghq.com/api/latest/roles

# Roles

The Roles API is used to create and manage Datadog roles, what [global permissions](https://docs.datadoghq.com/account_management/rbac/) they grant, and which users belong to them.

Permissions related to specific account assets can be granted to roles in the Datadog application without using this API. For example, granting read access on a specific log index to a role can be done in Datadog from the [Pipelines page](https://app.datadoghq.com/logs/pipelines).

## [List permissions](https://docs.datadoghq.com/api/latest/roles/#list-permissions)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#list-permissions-v2)

GET https://api.ap1.datadoghq.com/api/v2/permissions
GET https://api.ap2.datadoghq.com/api/v2/permissions
GET https://api.datadoghq.eu/api/v2/permissions
GET https://api.ddog-gov.com/api/v2/permissions
GET https://api.datadoghq.com/api/v2/permissions
GET https://api.us3.datadoghq.com/api/v2/permissions
GET https://api.us5.datadoghq.com/api/v2/permissions

### Overview

Returns a list of all permissions, including name, description, and ID.

This endpoint requires the `user_access_read` permission.

OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint.
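The permission IDs returned by this endpoint are the values you reference in the `relationships.permissions` payload of the role endpoints further down this page (Create role, Update a role). As a minimal sketch, assuming the same Python client setup used in the examples below, you could look up a permission ID by name; the permission name `logs_read_index_data` is only an illustration:

```
"""
Sketch only (not an official example): look up a permission ID by name.
Assumes DD_API_KEY, DD_APP_KEY, and DD_SITE are set in the environment.
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    response = api_instance.list_permissions()

    # Each entry in `data` carries an `id` plus `attributes` such as `name`.
    wanted = "logs_read_index_data"  # illustrative permission name
    for perm in response.data:
        if perm.attributes.name == wanted:
            print(f"{wanted}: {perm.id}")
            break
    else:
        print(f"No permission named {wanted!r} found")
```

The ID printed here is what the `PERMISSION_ID` environment variable stands for in the Create role and Update a role examples later in this section.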
### Response * [200](https://docs.datadoghq.com/api/latest/roles/#ListPermissions-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#ListPermissions-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#ListPermissions-403-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#ListPermissions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Payload with API-returned permissions. Field Type Description data [object] Array of permissions. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "description": "string", "display_name": "string", "display_type": "string", "group_name": "string", "name": "string", "restricted": false }, "id": "string", "type": "permissions" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### List permissions Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/permissions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List permissions ``` """ List permissions returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.list_permissions() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List permissions ``` # List permissions returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new p api_instance.list_permissions() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List permissions ``` // List permissions returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.ListPermissions(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.ListPermissions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.ListPermissions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List permissions ``` // List permissions returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; 
import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); try { PermissionsResponse result = apiInstance.listPermissions(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#listPermissions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List permissions ``` // List permissions returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.list_permissions().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List permissions ``` /** * List permissions returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); apiInstance .listPermissions() .then((data: v2.PermissionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List roles](https://docs.datadoghq.com/api/latest/roles/#list-roles) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#list-roles-v2) GET https://api.ap1.datadoghq.com/api/v2/roleshttps://api.ap2.datadoghq.com/api/v2/roleshttps://api.datadoghq.eu/api/v2/roleshttps://api.ddog-gov.com/api/v2/roleshttps://api.datadoghq.com/api/v2/roleshttps://api.us3.datadoghq.com/api/v2/roleshttps://api.us5.datadoghq.com/api/v2/roles ### Overview Returns all roles, including their names and their unique identifiers. This endpoint requires the `user_access_read` permission. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. 
page[number] integer Specific page number to return. sort enum Sort roles depending on the given field. Sort order is **ascending** by default. Sort order is **descending** if the field is prefixed by a negative sign, for example: `sort=-name`. Allowed enum values: `name, -name, modified_at, -modified_at, user_count, -user_count` filter string Filter all roles by the given string. filter[id] string Filter all roles by the given list of role IDs. ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#ListRoles-200-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#ListRoles-403-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#ListRoles-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about multiple roles. Field Type Description data [object] Array of returned roles. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "user_count": "integer" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### List roles Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List roles ``` """ List roles returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi # there is a valid "role" in the system ROLE_DATA_ATTRIBUTES_NAME = environ["ROLE_DATA_ATTRIBUTES_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.list_roles( filter=ROLE_DATA_ATTRIBUTES_NAME, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List roles ``` # List roles returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ATTRIBUTES_NAME = ENV["ROLE_DATA_ATTRIBUTES_NAME"] opts = { filter: ROLE_DATA_ATTRIBUTES_NAME, } p api_instance.list_roles(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List roles ``` // List roles returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataAttributesName := os.Getenv("ROLE_DATA_ATTRIBUTES_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.ListRoles(ctx, *datadogV2.NewListRolesOptionalParameters().WithFilter(RoleDataAttributesName)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.ListRoles`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.ListRoles`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List roles ``` // List roles returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.api.RolesApi.ListRolesOptionalParameters; import com.datadog.api.client.v2.model.RolesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ATTRIBUTES_NAME = System.getenv("ROLE_DATA_ATTRIBUTES_NAME"); try { RolesResponse result = apiInstance.listRoles( new ListRolesOptionalParameters().filter(ROLE_DATA_ATTRIBUTES_NAME)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#listRoles"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List roles ``` // List roles returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::ListRolesOptionalParams; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_attributes_name = std::env::var("ROLE_DATA_ATTRIBUTES_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api .list_roles(ListRolesOptionalParams::default().filter(role_data_attributes_name.clone())) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List roles ``` /** * List roles returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ATTRIBUTES_NAME = process.env .ROLE_DATA_ATTRIBUTES_NAME as string; const params: v2.RolesApiListRolesRequest = { filter: ROLE_DATA_ATTRIBUTES_NAME, }; apiInstance .listRoles(params) .then((data: v2.RolesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create role](https://docs.datadoghq.com/api/latest/roles/#create-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#create-role-v2) POST https://api.ap1.datadoghq.com/api/v2/roleshttps://api.ap2.datadoghq.com/api/v2/roleshttps://api.datadoghq.eu/api/v2/roleshttps://api.ddog-gov.com/api/v2/roleshttps://api.datadoghq.com/api/v2/roleshttps://api.us3.datadoghq.com/api/v2/roleshttps://api.us5.datadoghq.com/api/v2/roles ### Overview Create a new role for your organization. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data [_required_] object Data related to the creation of a role. attributes [_required_] object Attributes of the created role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name [_required_] string Name of the role. relationships object Relationships of the role object. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "type": "roles", "attributes": { "name": "Example-Role" }, "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#CreateRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#CreateRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#CreateRole-403-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#CreateRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about a created role. Field Type Description data object Role object returned by the API. attributes object Attributes of the created role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name [_required_] string Name of the role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "developers" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### Create role with a permission returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "roles", "attributes": { "name": "Example-Role" }, "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } } } } EOF ``` ##### Create role with a permission returns "OK" response ``` // Create role with a permission returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "permission" in the system PermissionID := os.Getenv("PERMISSION_ID") body := datadogV2.RoleCreateRequest{ Data: datadogV2.RoleCreateData{ Type: datadogV2.ROLESTYPE_ROLES.Ptr(), Attributes: datadogV2.RoleCreateAttributes{ Name: "Example-Role", }, Relationships: &datadogV2.RoleRelationships{ Permissions: &datadogV2.RelationshipToPermissions{ Data: []datadogV2.RelationshipToPermissionData{ { Id: datadog.PtrString(PermissionID), Type: datadogV2.PERMISSIONSTYPE_PERMISSIONS.Ptr(), }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.CreateRole(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.CreateRole`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.CreateRole`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create role with a permission returns "OK" response ``` // Create role with a permission returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsType; import com.datadog.api.client.v2.model.RelationshipToPermissionData; import com.datadog.api.client.v2.model.RelationshipToPermissions; import com.datadog.api.client.v2.model.RoleCreateAttributes; import com.datadog.api.client.v2.model.RoleCreateData; import com.datadog.api.client.v2.model.RoleCreateRequest; import com.datadog.api.client.v2.model.RoleCreateResponse; import com.datadog.api.client.v2.model.RoleRelationships; import com.datadog.api.client.v2.model.RolesType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "permission" in the system String PERMISSION_ID = System.getenv("PERMISSION_ID"); RoleCreateRequest body = new RoleCreateRequest() .data( new RoleCreateData() .type(RolesType.ROLES) .attributes(new RoleCreateAttributes().name("Example-Role")) .relationships( new RoleRelationships() .permissions( new RelationshipToPermissions() .data( Collections.singletonList( new RelationshipToPermissionData() .id(PERMISSION_ID) .type(PermissionsType.PERMISSIONS)))))); try { RoleCreateResponse result = apiInstance.createRole(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#createRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create role with a permission returns "OK" response ``` """ Create role with a permission returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.permissions_type import PermissionsType from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData from datadog_api_client.v2.model.relationship_to_permissions import RelationshipToPermissions from datadog_api_client.v2.model.role_create_attributes import RoleCreateAttributes from datadog_api_client.v2.model.role_create_data import RoleCreateData from datadog_api_client.v2.model.role_create_request import RoleCreateRequest from datadog_api_client.v2.model.role_relationships import RoleRelationships from 
datadog_api_client.v2.model.roles_type import RolesType # there is a valid "permission" in the system PERMISSION_ID = environ["PERMISSION_ID"] body = RoleCreateRequest( data=RoleCreateData( type=RolesType.ROLES, attributes=RoleCreateAttributes( name="Example-Role", ), relationships=RoleRelationships( permissions=RelationshipToPermissions( data=[ RelationshipToPermissionData( id=PERMISSION_ID, type=PermissionsType.PERMISSIONS, ), ], ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.create_role(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create role with a permission returns "OK" response ``` # Create role with a permission returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "permission" in the system PERMISSION_ID = ENV["PERMISSION_ID"] body = DatadogAPIClient::V2::RoleCreateRequest.new({ data: DatadogAPIClient::V2::RoleCreateData.new({ type: DatadogAPIClient::V2::RolesType::ROLES, attributes: DatadogAPIClient::V2::RoleCreateAttributes.new({ name: "Example-Role", }), relationships: DatadogAPIClient::V2::RoleRelationships.new({ permissions: DatadogAPIClient::V2::RelationshipToPermissions.new({ data: [ DatadogAPIClient::V2::RelationshipToPermissionData.new({ id: PERMISSION_ID, type: DatadogAPIClient::V2::PermissionsType::PERMISSIONS, }), ], }), }), }), }) p api_instance.create_role(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create role with a permission returns "OK" response ``` // Create role with a permission returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::PermissionsType; use datadog_api_client::datadogV2::model::RelationshipToPermissionData; use datadog_api_client::datadogV2::model::RelationshipToPermissions; use datadog_api_client::datadogV2::model::RoleCreateAttributes; use datadog_api_client::datadogV2::model::RoleCreateData; use datadog_api_client::datadogV2::model::RoleCreateRequest; use datadog_api_client::datadogV2::model::RoleRelationships; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "permission" in the system let permission_id = std::env::var("PERMISSION_ID").unwrap(); let body = RoleCreateRequest::new( RoleCreateData::new(RoleCreateAttributes::new("Example-Role".to_string())) .relationships(RoleRelationships::new().permissions( RelationshipToPermissions::new().data(vec![ RelationshipToPermissionData::new() .id(permission_id.clone()) .type_(PermissionsType::PERMISSIONS) ]), )) .type_(RolesType::ROLES), ); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.create_role(body).await; if let Ok(value) = 
resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create role with a permission returns "OK" response ``` /** * Create role with a permission returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "permission" in the system const PERMISSION_ID = process.env.PERMISSION_ID as string; const params: v2.RolesApiCreateRoleRequest = { body: { data: { type: "roles", attributes: { name: "Example-Role", }, relationships: { permissions: { data: [ { id: PERMISSION_ID, type: "permissions", }, ], }, }, }, }, }; apiInstance .createRole(params) .then((data: v2.RoleCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a role](https://docs.datadoghq.com/api/latest/roles/#get-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#get-a-role-v2) GET https://api.ap1.datadoghq.com/api/v2/roles/{role_id}https://api.ap2.datadoghq.com/api/v2/roles/{role_id}https://api.datadoghq.eu/api/v2/roles/{role_id}https://api.ddog-gov.com/api/v2/roles/{role_id}https://api.datadoghq.com/api/v2/roles/{role_id}https://api.us3.datadoghq.com/api/v2/roles/{role_id}https://api.us5.datadoghq.com/api/v2/roles/{role_id} ### Overview Get a role in the organization specified by the role’s `role_id`. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#GetRole-200-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#GetRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#GetRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#GetRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about a single role. Field Type Description data object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. 
id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "user_count": "integer" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### Get a role Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/${role_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a role ``` """ Get a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.get_role( role_id=ROLE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a role ``` # Get a role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] p api_instance.get_role(ROLE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a role ``` // Get a role returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.GetRole(ctx, RoleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.GetRole`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.GetRole`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a role ``` // Get a role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.RoleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); try { RoleResponse result = apiInstance.getRole(ROLE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#getRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a role ``` // Get a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.get_role(role_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" cargo run ``` ##### Get a role ``` /** * Get a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.RolesApiGetRoleRequest = { roleId: ROLE_DATA_ID, }; apiInstance .getRole(params) .then((data: v2.RoleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a role](https://docs.datadoghq.com/api/latest/roles/#update-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#update-a-role-v2) PATCH https://api.ap1.datadoghq.com/api/v2/roles/{role_id}https://api.ap2.datadoghq.com/api/v2/roles/{role_id}https://api.datadoghq.eu/api/v2/roles/{role_id}https://api.ddog-gov.com/api/v2/roles/{role_id}https://api.datadoghq.com/api/v2/roles/{role_id}https://api.us3.datadoghq.com/api/v2/roles/{role_id}https://api.us5.datadoghq.com/api/v2/roles/{role_id} ### Overview Edit a role. Can only be used with application keys belonging to administrators. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data [_required_] object Data related to the update of a role. attributes [_required_] object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string Name of the role. user_count int32 The user count. id [_required_] string The unique identifier of the role. relationships object Relationships of the role object. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "id": "string", "type": "roles", "attributes": { "name": "developers-updated" }, "relationships": { "permissions": { "data": [ { "id": "f2a8beb4-91f8-962d-b6d9-60215cda2214", "type": "permissions" } ] } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-404-v2) * [422](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-422-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#UpdateRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about an updated role. Field Type Description data object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string Name of the role. user_count int32 The user count. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "user_count": "integer" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### Update a role Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/${role_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": {}, "id": "00000000-0000-1111-0000-000000000000", "type": "roles" } } EOF ``` ##### Update a role ``` """ Update a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.permissions_type import PermissionsType from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData from datadog_api_client.v2.model.relationship_to_permissions import RelationshipToPermissions from datadog_api_client.v2.model.role_relationships import RoleRelationships from datadog_api_client.v2.model.role_update_attributes import RoleUpdateAttributes from datadog_api_client.v2.model.role_update_data import RoleUpdateData from datadog_api_client.v2.model.role_update_request import RoleUpdateRequest from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "role" in the system ROLE_DATA_ATTRIBUTES_NAME = environ["ROLE_DATA_ATTRIBUTES_NAME"] ROLE_DATA_ID = environ["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = environ["PERMISSION_ID"] body = RoleUpdateRequest( data=RoleUpdateData( id=ROLE_DATA_ID, type=RolesType.ROLES, attributes=RoleUpdateAttributes( name="developers-updated", ), relationships=RoleRelationships( permissions=RelationshipToPermissions( data=[ RelationshipToPermissionData( id=PERMISSION_ID, type=PermissionsType.PERMISSIONS, ), ], ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.update_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a role ``` # Update a role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ATTRIBUTES_NAME = ENV["ROLE_DATA_ATTRIBUTES_NAME"] ROLE_DATA_ID = ENV["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = ENV["PERMISSION_ID"] body = DatadogAPIClient::V2::RoleUpdateRequest.new({ data: 
DatadogAPIClient::V2::RoleUpdateData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, attributes: DatadogAPIClient::V2::RoleUpdateAttributes.new({ name: "developers-updated", }), relationships: DatadogAPIClient::V2::RoleRelationships.new({ permissions: DatadogAPIClient::V2::RelationshipToPermissions.new({ data: [ DatadogAPIClient::V2::RelationshipToPermissionData.new({ id: PERMISSION_ID, type: DatadogAPIClient::V2::PermissionsType::PERMISSIONS, }), ], }), }), }), }) p api_instance.update_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a role ``` // Update a role returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") // there is a valid "permission" in the system PermissionID := os.Getenv("PERMISSION_ID") body := datadogV2.RoleUpdateRequest{ Data: datadogV2.RoleUpdateData{ Id: RoleDataID, Type: datadogV2.ROLESTYPE_ROLES, Attributes: datadogV2.RoleUpdateAttributes{ Name: datadog.PtrString("developers-updated"), }, Relationships: &datadogV2.RoleRelationships{ Permissions: &datadogV2.RelationshipToPermissions{ Data: []datadogV2.RelationshipToPermissionData{ { Id: datadog.PtrString(PermissionID), Type: datadogV2.PERMISSIONSTYPE_PERMISSIONS.Ptr(), }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.UpdateRole(ctx, RoleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.UpdateRole`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.UpdateRole`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a role ``` // Update a role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsType; import com.datadog.api.client.v2.model.RelationshipToPermissionData; import com.datadog.api.client.v2.model.RelationshipToPermissions; import com.datadog.api.client.v2.model.RoleRelationships; import com.datadog.api.client.v2.model.RoleUpdateAttributes; import com.datadog.api.client.v2.model.RoleUpdateData; import com.datadog.api.client.v2.model.RoleUpdateRequest; import com.datadog.api.client.v2.model.RoleUpdateResponse; import com.datadog.api.client.v2.model.RolesType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ATTRIBUTES_NAME = System.getenv("ROLE_DATA_ATTRIBUTES_NAME"); String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); // there is a valid "permission" in the system String PERMISSION_ID = System.getenv("PERMISSION_ID"); RoleUpdateRequest body = new RoleUpdateRequest() .data( new RoleUpdateData() .id(ROLE_DATA_ID) .type(RolesType.ROLES) .attributes(new RoleUpdateAttributes().name("developers-updated")) .relationships( new RoleRelationships() .permissions( new RelationshipToPermissions() .data( Collections.singletonList( new RelationshipToPermissionData() .id(PERMISSION_ID) .type(PermissionsType.PERMISSIONS)))))); try { RoleUpdateResponse result = apiInstance.updateRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#updateRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a role ``` // Update a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::PermissionsType; use datadog_api_client::datadogV2::model::RelationshipToPermissionData; use datadog_api_client::datadogV2::model::RelationshipToPermissions; use datadog_api_client::datadogV2::model::RoleRelationships; use datadog_api_client::datadogV2::model::RoleUpdateAttributes; use datadog_api_client::datadogV2::model::RoleUpdateData; use datadog_api_client::datadogV2::model::RoleUpdateRequest; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); // there is a valid "permission" in the system let permission_id = std::env::var("PERMISSION_ID").unwrap(); let body = RoleUpdateRequest::new( RoleUpdateData::new( RoleUpdateAttributes::new().name("developers-updated".to_string()), role_data_id.clone(), RolesType::ROLES, ) .relationships(RoleRelationships::new().permissions( RelationshipToPermissions::new().data(vec![ RelationshipToPermissionData::new() .id(permission_id.clone()) .type_(PermissionsType::PERMISSIONS) ]), )), ); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.update_role(role_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a role ``` /** * Update a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; 
const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; // there is a valid "permission" in the system const PERMISSION_ID = process.env.PERMISSION_ID as string; const params: v2.RolesApiUpdateRoleRequest = { body: { data: { id: ROLE_DATA_ID, type: "roles", attributes: { name: "developers-updated", }, relationships: { permissions: { data: [ { id: PERMISSION_ID, type: "permissions", }, ], }, }, }, }, roleId: ROLE_DATA_ID, }; apiInstance .updateRole(params) .then((data: v2.RoleUpdateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Delete role](https://docs.datadoghq.com/api/latest/roles/#delete-role)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#delete-role-v2)

DELETE https://api.datadoghq.com/api/v2/roles/{role_id} (replace `api.datadoghq.com` with the API host for your Datadog site: api.ap1.datadoghq.com, api.ap2.datadoghq.com, api.datadoghq.eu, api.ddog-gov.com, api.us3.datadoghq.com, or api.us5.datadoghq.com)

### Overview

Disables a role. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description role_id [_required_] string The unique identifier of the role.

### Response

* [204](https://docs.datadoghq.com/api/latest/roles/#DeleteRole-204-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#DeleteRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#DeleteRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#DeleteRole-429-v2)

OK

Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```
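The 403, 404, and 429 responses above share one error envelope: a JSON object whose `errors` field is a list of strings. The official examples do not show error handling, so the block below is a minimal sketch of surfacing that envelope with the Python client, assuming the client raises `ApiException` (from `datadog_api_client.exceptions`) with the HTTP status and raw response body attached.

```
"""
Delete a role and surface the documented error envelope.

Minimal sketch, not an official example: assumes ApiException carries
the HTTP status and raw response body; adjust to your client version.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException  # assumed import path
from datadog_api_client.v2.api.roles_api import RolesApi

ROLE_DATA_ID = environ["ROLE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    try:
        api_instance.delete_role(role_id=ROLE_DATA_ID)
        print(f"Role {ROLE_DATA_ID} disabled")
    except ApiException as e:
        # 403, 404, and 429 all return {"errors": ["..."]} per the models above
        print(f"delete_role failed with HTTP {e.status}: {e.body}")
```

The same pattern applies to the other Roles endpoints on this page, since they return the identical error envelope.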
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript)

##### Delete role

```
# Path parameters
export role_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site)
curl -X DELETE "https://api.datadoghq.com/api/v2/roles/${role_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete role

```
"""
Delete role returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi

# there is a valid "role" in the system
ROLE_DATA_ID = environ["ROLE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    api_instance.delete_role(
        role_id=ROLE_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete role

```
# Delete role returns "OK" response
require "datadog_api_client"

api_instance = DatadogAPIClient::V2::RolesAPI.new

# there is a valid "role" in the system
ROLE_DATA_ID = ENV["ROLE_DATA_ID"]

api_instance.delete_role(ROLE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete role

```
// Delete role returns "OK" response
package main

import (
    "context"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "role" in the system
    RoleDataID := os.Getenv("ROLE_DATA_ID")

    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewRolesApi(apiClient)
    r, err := api.DeleteRole(ctx, RoleDataID)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.DeleteRole`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Delete role ``` // Delete role returns "OK" response import
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); try { apiInstance.deleteRole(ROLE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#deleteRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete role ``` // Delete role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.delete_role(role_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete role ``` /** * Delete role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.RolesApiDeleteRoleRequest = { roleId: ROLE_DATA_ID, }; apiInstance .deleteRole(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List permissions for a role](https://docs.datadoghq.com/api/latest/roles/#list-permissions-for-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#list-permissions-for-a-role-v2) GET https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.eu/api/v2/roles/{role_id}/permissionshttps://api.ddog-gov.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/permissions ### Overview Returns a list of all permissions for a single role. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#ListRolePermissions-200-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#ListRolePermissions-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#ListRolePermissions-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#ListRolePermissions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Payload with API-returned permissions. Field Type Description data [object] Array of permissions. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "description": "string", "display_name": "string", "display_type": "string", "group_name": "string", "name": "string", "restricted": false }, "id": "string", "type": "permissions" } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### List permissions for a role Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/${role_id}/permissions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List permissions for a role ``` """ List permissions for a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.list_role_permissions( role_id=ROLE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List permissions for a role ``` # List permissions for a role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] p api_instance.list_role_permissions(ROLE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List permissions for a role ``` // List permissions for a role returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.ListRolePermissions(ctx, RoleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.ListRolePermissions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`RolesApi.ListRolePermissions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List permissions for a role ``` // List permissions for a role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); try { PermissionsResponse result = apiInstance.listRolePermissions(ROLE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#listRolePermissions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List permissions for a role ``` // List permissions for a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.list_role_permissions(role_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List permissions for a role ``` /** * List permissions for a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.RolesApiListRolePermissionsRequest = { roleId: ROLE_DATA_ID, }; apiInstance .listRolePermissions(params) .then((data: v2.PermissionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Grant permission to a role](https://docs.datadoghq.com/api/latest/roles/#grant-permission-to-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#grant-permission-to-a-role-v2) POST https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.eu/api/v2/roles/{role_id}/permissionshttps://api.ddog-gov.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/permissions ### Overview Adds a permission to a role. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data object Relationship to permission object. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": { "id": "f2a8beb4-91f8-962d-b6d9-60215cda2214", "type": "permissions" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#AddPermissionToRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#AddPermissionToRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#AddPermissionToRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#AddPermissionToRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#AddPermissionToRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Payload with API-returned permissions. Field Type Description data [object] Array of permissions. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "description": "string", "display_name": "string", "display_type": "string", "group_name": "string", "name": "string", "restricted": false }, "id": "string", "type": "permissions" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. 
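The grant request body takes a permission ID rather than a name. As a rough sketch (not taken from the official examples) of resolving a permission name to its ID and then granting it with the Python client: it assumes `RolesApi.list_permissions()` wraps `GET /api/v2/permissions`, that permission objects expose `id` and `attributes.name` as in the response model above, and `PERMISSION_NAME` is only an example value.

```
"""
Grant a permission to a role by permission name.

Minimal sketch under the assumptions stated above.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi
from datadog_api_client.v2.model.permissions_type import PermissionsType
from datadog_api_client.v2.model.relationship_to_permission import RelationshipToPermission
from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData

ROLE_DATA_ID = environ["ROLE_DATA_ID"]
PERMISSION_NAME = "monitors_write"  # example; any permission name in your org

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)

    # Resolve the permission name to its ID (assumed helper endpoint).
    permissions = api_instance.list_permissions()
    permission_id = next(
        p.id for p in permissions.data if p.attributes.name == PERMISSION_NAME
    )

    body = RelationshipToPermission(
        data=RelationshipToPermissionData(
            id=permission_id,
            type=PermissionsType.PERMISSIONS,
        ),
    )
    response = api_instance.add_permission_to_role(role_id=ROLE_DATA_ID, body=body)
    print(response)
```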
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### Grant permission to a role returns "OK" response Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/${role_id}/permissions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "f2a8beb4-91f8-962d-b6d9-60215cda2214", "type": "permissions" } } EOF ``` ##### Grant permission to a role returns "OK" response ``` // Grant permission to a role returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") // there is a valid "permission" in the system PermissionID := os.Getenv("PERMISSION_ID") body := datadogV2.RelationshipToPermission{ Data: &datadogV2.RelationshipToPermissionData{ Id: datadog.PtrString(PermissionID), Type: datadogV2.PERMISSIONSTYPE_PERMISSIONS.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.AddPermissionToRole(ctx, RoleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.AddPermissionToRole`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.AddPermissionToRole`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Grant permission to a role returns "OK" response ``` // Grant permission to a role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsResponse; import com.datadog.api.client.v2.model.PermissionsType; import com.datadog.api.client.v2.model.RelationshipToPermission; import com.datadog.api.client.v2.model.RelationshipToPermissionData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); // there is a valid "permission" in the system String PERMISSION_ID = System.getenv("PERMISSION_ID"); RelationshipToPermission body = new RelationshipToPermission() .data( new RelationshipToPermissionData() .id(PERMISSION_ID) .type(PermissionsType.PERMISSIONS)); try { PermissionsResponse result = apiInstance.addPermissionToRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#addPermissionToRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Grant permission to a role returns "OK" response ``` """ Grant permission to a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.permissions_type import PermissionsType from datadog_api_client.v2.model.relationship_to_permission import RelationshipToPermission from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = environ["PERMISSION_ID"] body = RelationshipToPermission( data=RelationshipToPermissionData( id=PERMISSION_ID, type=PermissionsType.PERMISSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.add_permission_to_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Grant permission to a role returns "OK" response ``` # Grant permission to a role returns "OK" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = ENV["PERMISSION_ID"] body = DatadogAPIClient::V2::RelationshipToPermission.new({ data: DatadogAPIClient::V2::RelationshipToPermissionData.new({ id: PERMISSION_ID, type: DatadogAPIClient::V2::PermissionsType::PERMISSIONS, }), }) p api_instance.add_permission_to_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Grant permission to a role returns "OK" response ``` // Grant permission to a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::PermissionsType; use datadog_api_client::datadogV2::model::RelationshipToPermission; use datadog_api_client::datadogV2::model::RelationshipToPermissionData; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); // there is a valid "permission" in the system let permission_id = std::env::var("PERMISSION_ID").unwrap(); let body = RelationshipToPermission::new().data( RelationshipToPermissionData::new() .id(permission_id.clone()) .type_(PermissionsType::PERMISSIONS), ); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.add_permission_to_role(role_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Grant permission to a role returns "OK" response ``` /** * Grant permission to a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; // there is a valid "permission" in the system const PERMISSION_ID = process.env.PERMISSION_ID as string; const params: v2.RolesApiAddPermissionToRoleRequest = { body: { data: { id: PERMISSION_ID, type: "permissions", }, }, roleId: ROLE_DATA_ID, }; apiInstance .addPermissionToRole(params) .then((data: v2.PermissionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Revoke permission](https://docs.datadoghq.com/api/latest/roles/#revoke-permission) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#revoke-permission-v2) DELETE https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.eu/api/v2/roles/{role_id}/permissionshttps://api.ddog-gov.com/api/v2/roles/{role_id}/permissionshttps://api.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/permissionshttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/permissions ### Overview Removes a permission from a role. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data object Relationship to permission object. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": { "id": "f2a8beb4-91f8-962d-b6d9-60215cda2214", "type": "permissions" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#RemovePermissionFromRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#RemovePermissionFromRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#RemovePermissionFromRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#RemovePermissionFromRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#RemovePermissionFromRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Payload with API-returned permissions. Field Type Description data [object] Array of permissions. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "description": "string", "display_name": "string", "display_type": "string", "group_name": "string", "name": "string", "restricted": false }, "id": "string", "type": "permissions" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. 
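Revocation takes the same relationship body as granting. The sketch below trims a role down to an allow-list of permission names by combining the documented `list_role_permissions` and `remove_permission_from_role` calls; field access mirrors the response models above, the names in `ALLOWED` are only illustrative, and permissions flagged `restricted` in the model are left untouched.

```
"""
Revoke every non-restricted permission on a role that is not in an allow-list.

Minimal sketch, not an official example.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi
from datadog_api_client.v2.model.permissions_type import PermissionsType
from datadog_api_client.v2.model.relationship_to_permission import RelationshipToPermission
from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData

ROLE_DATA_ID = environ["ROLE_DATA_ID"]
ALLOWED = {"monitors_read", "dashboards_read"}  # example permission names

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    current = api_instance.list_role_permissions(role_id=ROLE_DATA_ID)

    for permission in current.data:
        # Leave restricted permissions and the allow-list untouched.
        if permission.attributes.restricted or permission.attributes.name in ALLOWED:
            continue
        body = RelationshipToPermission(
            data=RelationshipToPermissionData(
                id=permission.id,
                type=PermissionsType.PERMISSIONS,
            ),
        )
        api_instance.remove_permission_from_role(role_id=ROLE_DATA_ID, body=body)
        print(f"Revoked {permission.attributes.name} from role {ROLE_DATA_ID}")
```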
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### Revoke permission returns "OK" response Copy ``` # Path parameters export role_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/${role_id}/permissions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "f2a8beb4-91f8-962d-b6d9-60215cda2214", "type": "permissions" } } EOF ``` ##### Revoke permission returns "OK" response ``` // Revoke permission returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") // there is a valid "permission" in the system PermissionID := os.Getenv("PERMISSION_ID") body := datadogV2.RelationshipToPermission{ Data: &datadogV2.RelationshipToPermissionData{ Id: datadog.PtrString(PermissionID), Type: datadogV2.PERMISSIONSTYPE_PERMISSIONS.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.RemovePermissionFromRole(ctx, RoleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.RemovePermissionFromRole`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.RemovePermissionFromRole`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Revoke permission returns "OK" response ``` // Revoke permission returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.PermissionsResponse; import com.datadog.api.client.v2.model.PermissionsType; import com.datadog.api.client.v2.model.RelationshipToPermission; import com.datadog.api.client.v2.model.RelationshipToPermissionData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); // there is a valid "permission" in the system String PERMISSION_ID = System.getenv("PERMISSION_ID"); RelationshipToPermission body = new RelationshipToPermission() .data( new RelationshipToPermissionData() .id(PERMISSION_ID) .type(PermissionsType.PERMISSIONS)); try { PermissionsResponse result = apiInstance.removePermissionFromRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#removePermissionFromRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Revoke permission returns "OK" response ``` """ Revoke permission returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.permissions_type import PermissionsType from datadog_api_client.v2.model.relationship_to_permission import RelationshipToPermission from datadog_api_client.v2.model.relationship_to_permission_data import RelationshipToPermissionData # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = environ["PERMISSION_ID"] body = RelationshipToPermission( data=RelationshipToPermissionData( id=PERMISSION_ID, type=PermissionsType.PERMISSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.remove_permission_from_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Revoke permission returns "OK" response ``` # Revoke permission returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in 
the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] # there is a valid "permission" in the system PERMISSION_ID = ENV["PERMISSION_ID"] body = DatadogAPIClient::V2::RelationshipToPermission.new({ data: DatadogAPIClient::V2::RelationshipToPermissionData.new({ id: PERMISSION_ID, type: DatadogAPIClient::V2::PermissionsType::PERMISSIONS, }), }) p api_instance.remove_permission_from_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Revoke permission returns "OK" response ``` // Revoke permission returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::PermissionsType; use datadog_api_client::datadogV2::model::RelationshipToPermission; use datadog_api_client::datadogV2::model::RelationshipToPermissionData; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); // there is a valid "permission" in the system let permission_id = std::env::var("PERMISSION_ID").unwrap(); let body = RelationshipToPermission::new().data( RelationshipToPermissionData::new() .id(permission_id.clone()) .type_(PermissionsType::PERMISSIONS), ); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api .remove_permission_from_role(role_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Revoke permission returns "OK" response ``` /** * Revoke permission returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; // there is a valid "permission" in the system const PERMISSION_ID = process.env.PERMISSION_ID as string; const params: v2.RolesApiRemovePermissionFromRoleRequest = { body: { data: { id: PERMISSION_ID, type: "permissions", }, }, roleId: ROLE_DATA_ID, }; apiInstance .removePermissionFromRole(params) .then((data: v2.PermissionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all users of a role](https://docs.datadoghq.com/api/latest/roles/#get-all-users-of-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#get-all-users-of-a-role-v2) GET https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.eu/api/v2/roles/{role_id}/usershttps://api.ddog-gov.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/users ### Overview Gets all users of a role. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort string User attribute to order results by. Sort order is **ascending** by default. Sort order is **descending** if the field is prefixed by a negative sign, for example `sort=-name`. Options: `name`, `email`, `status`. filter string Filter all users by the given string. Defaults to no filtering. ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#ListRoleUsers-200-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#ListRoleUsers-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#ListRoleUsers-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#ListRoleUsers-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about multiple users. Field Type Description data [object] Array of returned users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. 
type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the users. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. 
``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
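The query strings documented above (`page[size]`, `page[number]`, `sort`, `filter`) are not exercised by the official examples. The sketch below pages through a role's members with the Python client, assuming the client exposes those query strings as `page_size`, `page_number`, `sort`, and `filter` keyword arguments on `list_role_users` and that user objects expose `attributes.handle` as in the response model above.

```
"""
Page through all users of a role, sorted by name.

Minimal sketch under the assumptions stated above; adjust to your client version.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi

ROLE_DATA_ID = environ["ROLE_DATA_ID"]
PAGE_SIZE = 25  # page[size]; the API allows at most 100

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    page_number = 0
    while True:
        response = api_instance.list_role_users(
            role_id=ROLE_DATA_ID,
            page_size=PAGE_SIZE,
            page_number=page_number,  # page[number]
            sort="name",              # prefix with "-" for descending order
        )
        for user in response.data:
            print(user.attributes.handle)
        if len(response.data) < PAGE_SIZE:
            break
        page_number += 1
```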
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript)

##### Get all users of a role

```
# Path parameters
export role_id="CHANGE_ME"

# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com.
curl -X GET "https://api.datadoghq.com/api/v2/roles/${role_id}/users" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get all users of a role

```
"""
Get all users of a role returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi

# there is a valid "role" in the system
ROLE_DATA_ID = environ["ROLE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = RolesApi(api_client)
    response = api_instance.list_role_users(
        role_id=ROLE_DATA_ID,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get all users of a role

```
# Get all users of a role returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::RolesAPI.new

# there is a valid "role" in the system
ROLE_DATA_ID = ENV["ROLE_DATA_ID"]
p api_instance.list_role_users(ROLE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get all users of a role

```
// Get all users of a role returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "role" in the system
    RoleDataID := os.Getenv("ROLE_DATA_ID")

    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewRolesApi(apiClient)
    resp, r, err := api.ListRoleUsers(ctx, RoleDataID, *datadogV2.NewListRoleUsersOptionalParameters())

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.ListRoleUsers`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", "  ")
    fmt.Fprintf(os.Stdout, "Response from `RolesApi.ListRoleUsers`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all users of a role ``` // Get all users of a role returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.UsersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RolesApi apiInstance = new RolesApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); try { UsersResponse result = apiInstance.listRoleUsers(ROLE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#listRoleUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all users of a role ``` // Get all users of a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::ListRoleUsersOptionalParams; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api .list_role_users(role_data_id.clone(), ListRoleUsersOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all users of a role ``` /** * Get all users of a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.RolesApiListRoleUsersRequest = { roleId: ROLE_DATA_ID, }; apiInstance .listRoleUsers(params) .then((data: v2.UsersResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add a user to a role](https://docs.datadoghq.com/api/latest/roles/#add-a-user-to-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#add-a-user-to-a-role-v2) POST https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.eu/api/v2/roles/{role_id}/usershttps://api.ddog-gov.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/users ### Overview Adds a user to a role. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "id": "c1d4eb9e-8bb0-974d-85a5-a7dd9db46bee", "type": "users" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#AddUserToRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#AddUserToRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#AddUserToRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#AddUserToRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#AddUserToRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about multiple users. Field Type Description data [object] Array of returned users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. 
data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the users. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. 
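Because the `200` response is the same `UsersResponse` shape described above, one way to confirm the assignment took effect is to check that the user ID you sent appears in the returned `data` array. A minimal sketch in Python, assuming the `ROLE_DATA_ID` and `USER_DATA_ID` environment variables used in the code examples below:

```
"""
Add a user to a role, then verify the user appears in the UsersResponse (sketch).
Assumes ROLE_DATA_ID and USER_DATA_ID refer to an existing role and user.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.roles_api import RolesApi
from datadog_api_client.v2.model.relationship_to_user import RelationshipToUser
from datadog_api_client.v2.model.relationship_to_user_data import RelationshipToUserData
from datadog_api_client.v2.model.users_type import UsersType

ROLE_DATA_ID = environ["ROLE_DATA_ID"]
USER_DATA_ID = environ["USER_DATA_ID"]

body = RelationshipToUser(
    data=RelationshipToUserData(id=USER_DATA_ID, type=UsersType.USERS),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = RolesApi(api_client)
    resp = api.add_user_to_role(role_id=ROLE_DATA_ID, body=body)
    # The response lists the users attached to the role after the change.
    member_ids = {user.id for user in (getattr(resp, "data", None) or [])}
    if USER_DATA_ID in member_ids:
        print(f"User {USER_DATA_ID} is now a member of role {ROLE_DATA_ID}")
```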
``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript)

##### Add a user to a role returns "OK" response

```
# Path parameters
export role_id="CHANGE_ME"

# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com.
curl -X POST "https://api.datadoghq.com/api/v2/roles/${role_id}/users" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "c1d4eb9e-8bb0-974d-85a5-a7dd9db46bee",
    "type": "users"
  }
}
EOF
```

##### Add a user to a role returns "OK" response

```
// Add a user to a role returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "role" in the system
    RoleDataID := os.Getenv("ROLE_DATA_ID")

    // there is a valid "user" in the system
    UserDataID := os.Getenv("USER_DATA_ID")

    body := datadogV2.RelationshipToUser{
        Data: datadogV2.RelationshipToUserData{
            Id:   UserDataID,
            Type: datadogV2.USERSTYPE_USERS,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewRolesApi(apiClient)
    resp, r, err := api.AddUserToRole(ctx, RoleDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.AddUserToRole`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", "  ")
    fmt.Fprintf(os.Stdout, "Response from `RolesApi.AddUserToRole`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Add a user to a role returns "OK" response

```
// Add a user to a role returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.RolesApi;
import com.datadog.api.client.v2.model.RelationshipToUser;
import com.datadog.api.client.v2.model.RelationshipToUserData;
import com.datadog.api.client.v2.model.UsersResponse;
import com.datadog.api.client.v2.model.UsersType;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    RolesApi apiInstance = new RolesApi(defaultClient);

    // there is a valid "role" in the system
    String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID");

    // there is a valid "user" in the system
    String USER_DATA_ID = System.getenv("USER_DATA_ID");

    RelationshipToUser body = new RelationshipToUser().data(new
RelationshipToUserData().id(USER_DATA_ID).type(UsersType.USERS)); try { UsersResponse result = apiInstance.addUserToRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#addUserToRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add a user to a role returns "OK" response ``` """ Add a user to a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.relationship_to_user import RelationshipToUser from datadog_api_client.v2.model.relationship_to_user_data import RelationshipToUserData from datadog_api_client.v2.model.users_type import UsersType # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = RelationshipToUser( data=RelationshipToUserData( id=USER_DATA_ID, type=UsersType.USERS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.add_user_to_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add a user to a role returns "OK" response ``` # Add a user to a role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::RelationshipToUser.new({ data: DatadogAPIClient::V2::RelationshipToUserData.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::UsersType::USERS, }), }) p api_instance.add_user_to_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add a user to a role returns "OK" response ``` // Add a user to a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::RelationshipToUser; use datadog_api_client::datadogV2::model::RelationshipToUserData; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = 
std::env::var("ROLE_DATA_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = RelationshipToUser::new(RelationshipToUserData::new( user_data_id.clone(), UsersType::USERS, )); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.add_user_to_role(role_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add a user to a role returns "OK" response ``` /** * Add a user to a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.RolesApiAddUserToRoleRequest = { body: { data: { id: USER_DATA_ID, type: "users", }, }, roleId: ROLE_DATA_ID, }; apiInstance .addUserToRole(params) .then((data: v2.UsersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a user from a role](https://docs.datadoghq.com/api/latest/roles/#remove-a-user-from-a-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#remove-a-user-from-a-role-v2) DELETE https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.eu/api/v2/roles/{role_id}/usershttps://api.ddog-gov.com/api/v2/roles/{role_id}/usershttps://api.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/usershttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/users ### Overview Removes a user from a role. This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "id": "c1d4eb9e-8bb0-974d-85a5-a7dd9db46bee", "type": "users" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#RemoveUserFromRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#RemoveUserFromRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#RemoveUserFromRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#RemoveUserFromRole-404-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#RemoveUserFromRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about multiple users. Field Type Description data [object] Array of returned users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the users. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. 
group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript)

##### Remove a user from a role returns "OK" response

```
# Path parameters
export role_id="CHANGE_ME"

# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com.
curl -X DELETE "https://api.datadoghq.com/api/v2/roles/${role_id}/users" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "c1d4eb9e-8bb0-974d-85a5-a7dd9db46bee",
    "type": "users"
  }
}
EOF
```

##### Remove a user from a role returns "OK" response

```
// Remove a user from a role returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "role" in the system
    RoleDataID := os.Getenv("ROLE_DATA_ID")

    // there is a valid "user" in the system
    UserDataID := os.Getenv("USER_DATA_ID")

    body := datadogV2.RelationshipToUser{
        Data: datadogV2.RelationshipToUserData{
            Id:   UserDataID,
            Type: datadogV2.USERSTYPE_USERS,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewRolesApi(apiClient)
    resp, r, err := api.RemoveUserFromRole(ctx, RoleDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.RemoveUserFromRole`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", "  ")
    fmt.Fprintf(os.Stdout, "Response from `RolesApi.RemoveUserFromRole`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Remove a user from a role returns "OK" response

```
// Remove a user from a role returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.RolesApi;
import com.datadog.api.client.v2.model.RelationshipToUser;
import com.datadog.api.client.v2.model.RelationshipToUserData;
import com.datadog.api.client.v2.model.UsersResponse;
import com.datadog.api.client.v2.model.UsersType;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    RolesApi apiInstance = new RolesApi(defaultClient);

    // there is a valid "role" in the system
    String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID");

    // there is a valid "user" in the system
    String USER_DATA_ID = System.getenv("USER_DATA_ID");

    RelationshipToUser body =
new RelationshipToUser() .data(new RelationshipToUserData().id(USER_DATA_ID).type(UsersType.USERS)); try { UsersResponse result = apiInstance.removeUserFromRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#removeUserFromRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a user from a role returns "OK" response ``` """ Remove a user from a role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.relationship_to_user import RelationshipToUser from datadog_api_client.v2.model.relationship_to_user_data import RelationshipToUserData from datadog_api_client.v2.model.users_type import UsersType # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = RelationshipToUser( data=RelationshipToUserData( id=USER_DATA_ID, type=UsersType.USERS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.remove_user_from_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a user from a role returns "OK" response ``` # Remove a user from a role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::RelationshipToUser.new({ data: DatadogAPIClient::V2::RelationshipToUserData.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::UsersType::USERS, }), }) p api_instance.remove_user_from_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a user from a role returns "OK" response ``` // Remove a user from a role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::RelationshipToUser; use datadog_api_client::datadogV2::model::RelationshipToUserData; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn 
main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = RelationshipToUser::new(RelationshipToUserData::new( user_data_id.clone(), UsersType::USERS, )); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.remove_user_from_role(role_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a user from a role returns "OK" response ``` /** * Remove a user from a role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.RolesApiRemoveUserFromRoleRequest = { body: { data: { id: USER_DATA_ID, type: "users", }, }, roleId: ROLE_DATA_ID, }; apiInstance .removeUserFromRole(params) .then((data: v2.UsersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new role by cloning an existing role](https://docs.datadoghq.com/api/latest/roles/#create-a-new-role-by-cloning-an-existing-role) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#create-a-new-role-by-cloning-an-existing-role-v2) POST https://api.ap1.datadoghq.com/api/v2/roles/{role_id}/clonehttps://api.ap2.datadoghq.com/api/v2/roles/{role_id}/clonehttps://api.datadoghq.eu/api/v2/roles/{role_id}/clonehttps://api.ddog-gov.com/api/v2/roles/{role_id}/clonehttps://api.datadoghq.com/api/v2/roles/{role_id}/clonehttps://api.us3.datadoghq.com/api/v2/roles/{role_id}/clonehttps://api.us5.datadoghq.com/api/v2/roles/{role_id}/clone ### Overview Clone an existing role This endpoint requires the `user_access_manage` permission. OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Arguments #### Path Parameters Name Type Description role_id [_required_] string The unique identifier of the role. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Field Type Description data [_required_] object Data for the clone role request. attributes [_required_] object Attributes required to create a new role by cloning an existing one. 
name [_required_] string Name of the new role that is cloned. type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "name": "Example-Role clone" }, "type": "roles" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#CloneRole-200-v2) * [400](https://docs.datadoghq.com/api/latest/roles/#CloneRole-400-v2) * [403](https://docs.datadoghq.com/api/latest/roles/#CloneRole-403-v2) * [404](https://docs.datadoghq.com/api/latest/roles/#CloneRole-404-v2) * [409](https://docs.datadoghq.com/api/latest/roles/#CloneRole-409-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#CloneRole-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) Response containing information about a single role. Field Type Description data object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "user_count": "integer" }, "id": "string", "relationships": { "permissions": { "data": [ { "id": "string", "type": "permissions" } ] } }, "type": "roles" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript)

##### Create a new role by cloning an existing role returns "OK" response

```
# Path parameters
export role_id="CHANGE_ME"

# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com.
curl -X POST "https://api.datadoghq.com/api/v2/roles/${role_id}/clone" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "name": "Example-Role clone"
    },
    "type": "roles"
  }
}
EOF
```

##### Create a new role by cloning an existing role returns "OK" response

```
// Create a new role by cloning an existing role returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "role" in the system
    RoleDataID := os.Getenv("ROLE_DATA_ID")

    body := datadogV2.RoleCloneRequest{
        Data: datadogV2.RoleClone{
            Attributes: datadogV2.RoleCloneAttributes{
                Name: "Example-Role clone",
            },
            Type: datadogV2.ROLESTYPE_ROLES,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewRolesApi(apiClient)
    resp, r, err := api.CloneRole(ctx, RoleDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `RolesApi.CloneRole`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", "  ")
    fmt.Fprintf(os.Stdout, "Response from `RolesApi.CloneRole`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Create a new role by cloning an existing role returns "OK" response

```
// Create a new role by cloning an existing role returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.RolesApi;
import com.datadog.api.client.v2.model.RoleClone;
import com.datadog.api.client.v2.model.RoleCloneAttributes;
import com.datadog.api.client.v2.model.RoleCloneRequest;
import com.datadog.api.client.v2.model.RoleResponse;
import com.datadog.api.client.v2.model.RolesType;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    RolesApi apiInstance = new RolesApi(defaultClient);

    // there is a valid "role" in the system
    String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID");

    RoleCloneRequest body = new
RoleCloneRequest() .data( new RoleClone() .attributes(new RoleCloneAttributes().name("Example-Role clone")) .type(RolesType.ROLES)); try { RoleResponse result = apiInstance.cloneRole(ROLE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#cloneRole"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new role by cloning an existing role returns "OK" response ``` """ Create a new role by cloning an existing role returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi from datadog_api_client.v2.model.role_clone import RoleClone from datadog_api_client.v2.model.role_clone_attributes import RoleCloneAttributes from datadog_api_client.v2.model.role_clone_request import RoleCloneRequest from datadog_api_client.v2.model.roles_type import RolesType # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = RoleCloneRequest( data=RoleClone( attributes=RoleCloneAttributes( name="Example-Role clone", ), type=RolesType.ROLES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.clone_role(role_id=ROLE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new role by cloning an existing role returns "OK" response ``` # Create a new role by cloning an existing role returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RolesAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::RoleCloneRequest.new({ data: DatadogAPIClient::V2::RoleClone.new({ attributes: DatadogAPIClient::V2::RoleCloneAttributes.new({ name: "Example-Role clone", }), type: DatadogAPIClient::V2::RolesType::ROLES, }), }) p api_instance.clone_role(ROLE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new role by cloning an existing role returns "OK" response ``` // Create a new role by cloning an existing role returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; use datadog_api_client::datadogV2::model::RoleClone; use 
datadog_api_client::datadogV2::model::RoleCloneAttributes; use datadog_api_client::datadogV2::model::RoleCloneRequest; use datadog_api_client::datadogV2::model::RolesType; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = RoleCloneRequest::new(RoleClone::new( RoleCloneAttributes::new("Example-Role clone".to_string()), RolesType::ROLES, )); let configuration = datadog::Configuration::new(); let api = RolesAPI::with_config(configuration); let resp = api.clone_role(role_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new role by cloning an existing role returns "OK" response ``` /** * Create a new role by cloning an existing role returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RolesApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.RolesApiCloneRoleRequest = { body: { data: { attributes: { name: "Example-Role clone", }, type: "roles", }, }, roleId: ROLE_DATA_ID, }; apiInstance .cloneRole(params) .then((data: v2.RoleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List role templates](https://docs.datadoghq.com/api/latest/roles/#list-role-templates) * [v2 (latest)](https://docs.datadoghq.com/api/latest/roles/#list-role-templates-v2) **Note** : This endpoint may be subject to changes. GET https://api.ap1.datadoghq.com/api/v2/roles/templateshttps://api.ap2.datadoghq.com/api/v2/roles/templateshttps://api.datadoghq.eu/api/v2/roles/templateshttps://api.ddog-gov.com/api/v2/roles/templateshttps://api.datadoghq.com/api/v2/roles/templateshttps://api.us3.datadoghq.com/api/v2/roles/templateshttps://api.us5.datadoghq.com/api/v2/roles/templates ### Overview List all role templates OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#roles) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/roles/#ListRoleTemplates-200-v2) * [429](https://docs.datadoghq.com/api/latest/roles/#ListRoleTemplates-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) The definition of `RoleTemplateArray` object. Field Type Description data [_required_] [object] The `RoleTemplateArray` `data`. attributes object The definition of `RoleTemplateDataAttributes` object. description string The `attributes` `description`. name string The `attributes` `name`. 
id string The `RoleTemplateData` `id`. type [_required_] enum Roles resource type. Allowed enum values: `roles` default: `roles` ``` { "data": [ { "attributes": { "description": "string", "name": "string" }, "id": "string", "type": "roles" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/roles/) * [Example](https://docs.datadoghq.com/api/latest/roles/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/roles/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/roles/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/roles/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/roles/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/roles/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/roles/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/roles/?code-lang=typescript) ##### List role templates Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/roles/templates" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List role templates ``` """ List role templates returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.roles_api import RolesApi configuration = Configuration() configuration.unstable_operations["list_role_templates"] = True with ApiClient(configuration) as api_client: api_instance = RolesApi(api_client) response = api_instance.list_role_templates() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List role templates ``` # List role templates returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_role_templates".to_sym] = true end api_instance = DatadogAPIClient::V2::RolesAPI.new p api_instance.list_role_templates() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List role templates ``` // List role templates returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListRoleTemplates", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRolesApi(apiClient) resp, r, err := api.ListRoleTemplates(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`RolesApi.ListRoleTemplates`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RolesApi.ListRoleTemplates`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List role templates ``` // List role templates returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RolesApi; import com.datadog.api.client.v2.model.RoleTemplateArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listRoleTemplates", true); RolesApi apiInstance = new RolesApi(defaultClient); try { RoleTemplateArray result = apiInstance.listRoleTemplates(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RolesApi#listRoleTemplates"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List role templates ``` // List role templates returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_roles::RolesAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListRoleTemplates", true); let api = RolesAPI::with_config(configuration); let resp = api.list_role_templates().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List role templates ``` /** * List role templates returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listRoleTemplates"] = true; const apiInstance = new v2.RolesApi(configuration); apiInstance .listRoleTemplates() .then((data: v2.RoleTemplateArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=b32158b3-8c4c-4824-ad91-504395903cf4&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=b0502f15-e333-4ef8-95ec-8555b757071f&pt=Roles&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Froles%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=b32158b3-8c4c-4824-ad91-504395903cf4&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=b0502f15-e333-4ef8-95ec-8555b757071f&pt=Roles&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Froles%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=343fae6d-f6a0-4dbe-a801-aa7f75ba9b1f&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Roles&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Froles%2F&r=<=21004&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=996563) --- # Source: https://docs.datadoghq.com/api/latest/rum-audience-management/ # Rum Audience Management Auto-generated tag Rum Audience Management ## [Query accounts](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-accounts) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-accounts-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/product-analytics/accounts/queryhttps://api.ap2.datadoghq.com/api/v2/product-analytics/accounts/queryhttps://api.datadoghq.eu/api/v2/product-analytics/accounts/queryhttps://api.ddog-gov.com/api/v2/product-analytics/accounts/queryhttps://api.datadoghq.com/api/v2/product-analytics/accounts/queryhttps://api.us3.datadoghq.com/api/v2/product-analytics/accounts/queryhttps://api.us5.datadoghq.com/api/v2/product-analytics/accounts/query ### Overview Query accounts with flexible filtering by account properties ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object limit int64 query string select_columns [string] sort object field string order string wildcard_search_term string id string type [_required_] enum Query account request resource type. 
Allowed enum values: `query_account_request` default: `query_account_request` ``` { "data": { "attributes": { "limit": "integer", "query": "string", "select_columns": [], "sort": { "field": "string", "order": "string" }, "wildcard_search_term": "string" }, "id": "string", "type": "query_account_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryAccounts-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryAccounts-429-v2) Successful response with account data * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object hits [] total int64 id string type [_required_] enum Query response resource type. Allowed enum values: `query_response` default: `query_response` ``` { "data": { "attributes": { "hits": [ { "first_browser_name": "Chrome", "first_city": "San Francisco", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T06:45:12.142Z", "session_count": 47, "user_created": "2024-12-15T08:42:33.287Z", "user_email": "john.smith@techcorp.com", "user_id": "150847", "user_name": "John Smith", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Austin", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T05:22:08.951Z", "session_count": 89, "user_created": "2024-11-28T14:17:45.634Z", "user_email": "john.williams@techcorp.com", "user_id": "150848", "user_name": "John Williams", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Seattle", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T04:18:34.726Z", "session_count": 23, "user_created": "2025-01-03T16:33:21.445Z", "user_email": "john.jones@techcorp.com", "user_id": "150849", "user_name": "John Jones", "user_org_id": "5001" } ], "total": 147 }, "id": "query_response", "type": "query_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Query accounts Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/product-analytics/accounts/query" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "query_account_request" } } EOF ``` ##### Query accounts ``` """ Query accounts returns "Successful response with account data" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.query_account_request import QueryAccountRequest from datadog_api_client.v2.model.query_account_request_data import QueryAccountRequestData from datadog_api_client.v2.model.query_account_request_data_attributes import QueryAccountRequestDataAttributes from datadog_api_client.v2.model.query_account_request_data_attributes_sort import QueryAccountRequestDataAttributesSort from datadog_api_client.v2.model.query_account_request_data_type import QueryAccountRequestDataType body = QueryAccountRequest( data=QueryAccountRequestData( attributes=QueryAccountRequestDataAttributes( limit=20, query="plan_type:enterprise AND user_count:>100 AND subscription_status:active", select_columns=[ "account_id", "account_name", "user_count", "plan_type", "subscription_status", "created_at", "mrr", "industry", ], sort=QueryAccountRequestDataAttributesSort( field="user_count", order="DESC", ), wildcard_search_term="tech", ), id="query_account_request", type=QueryAccountRequestDataType.QUERY_ACCOUNT_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["query_accounts"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.query_accounts(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Query accounts ``` # Query accounts returns "Successful response with account data" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.query_accounts".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::QueryAccountRequest.new({ data: DatadogAPIClient::V2::QueryAccountRequestData.new({ attributes: 
DatadogAPIClient::V2::QueryAccountRequestDataAttributes.new({ limit: 20, query: "plan_type:enterprise AND user_count:>100 AND subscription_status:active", select_columns: [ "account_id", "account_name", "user_count", "plan_type", "subscription_status", "created_at", "mrr", "industry", ], sort: DatadogAPIClient::V2::QueryAccountRequestDataAttributesSort.new({ field: "user_count", order: "DESC", }), wildcard_search_term: "tech", }), id: "query_account_request", type: DatadogAPIClient::V2::QueryAccountRequestDataType::QUERY_ACCOUNT_REQUEST, }), }) p api_instance.query_accounts(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Query accounts ``` // Query accounts returns "Successful response with account data" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.QueryAccountRequest{ Data: &datadogV2.QueryAccountRequestData{ Attributes: &datadogV2.QueryAccountRequestDataAttributes{ Limit: datadog.PtrInt64(20), Query: datadog.PtrString("plan_type:enterprise AND user_count:>100 AND subscription_status:active"), SelectColumns: []string{ "account_id", "account_name", "user_count", "plan_type", "subscription_status", "created_at", "mrr", "industry", }, Sort: &datadogV2.QueryAccountRequestDataAttributesSort{ Field: datadog.PtrString("user_count"), Order: datadog.PtrString("DESC"), }, WildcardSearchTerm: datadog.PtrString("tech"), }, Id: datadog.PtrString("query_account_request"), Type: datadogV2.QUERYACCOUNTREQUESTDATATYPE_QUERY_ACCOUNT_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.QueryAccounts", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.QueryAccounts(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.QueryAccounts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.QueryAccounts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Query accounts ``` // Query accounts returns "Successful response with account data" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.QueryAccountRequest; import com.datadog.api.client.v2.model.QueryAccountRequestData; import com.datadog.api.client.v2.model.QueryAccountRequestDataAttributes; import com.datadog.api.client.v2.model.QueryAccountRequestDataAttributesSort; import 
com.datadog.api.client.v2.model.QueryAccountRequestDataType; import com.datadog.api.client.v2.model.QueryResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.queryAccounts", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); QueryAccountRequest body = new QueryAccountRequest() .data( new QueryAccountRequestData() .attributes( new QueryAccountRequestDataAttributes() .limit(20L) .query( "plan_type:enterprise AND user_count:>100 AND" + " subscription_status:active") .selectColumns( Arrays.asList( "account_id", "account_name", "user_count", "plan_type", "subscription_status", "created_at", "mrr", "industry")) .sort( new QueryAccountRequestDataAttributesSort() .field("user_count") .order("DESC")) .wildcardSearchTerm("tech")) .id("query_account_request") .type(QueryAccountRequestDataType.QUERY_ACCOUNT_REQUEST)); try { QueryResponse result = apiInstance.queryAccounts(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#queryAccounts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Query accounts ``` // Query accounts returns "Successful response with account data" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::QueryAccountRequest; use datadog_api_client::datadogV2::model::QueryAccountRequestData; use datadog_api_client::datadogV2::model::QueryAccountRequestDataAttributes; use datadog_api_client::datadogV2::model::QueryAccountRequestDataAttributesSort; use datadog_api_client::datadogV2::model::QueryAccountRequestDataType; #[tokio::main] async fn main() { let body = QueryAccountRequest::new().data( QueryAccountRequestData::new(QueryAccountRequestDataType::QUERY_ACCOUNT_REQUEST) .attributes( QueryAccountRequestDataAttributes::new() .limit(20) .query( "plan_type:enterprise AND user_count:>100 AND subscription_status:active" .to_string(), ) .select_columns(vec![ "account_id".to_string(), "account_name".to_string(), "user_count".to_string(), "plan_type".to_string(), "subscription_status".to_string(), "created_at".to_string(), "mrr".to_string(), "industry".to_string(), ]) .sort( QueryAccountRequestDataAttributesSort::new() .field("user_count".to_string()) .order("DESC".to_string()), ) .wildcard_search_term("tech".to_string()), ) .id("query_account_request".to_string()), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.QueryAccounts", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.query_accounts(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Query accounts ``` /** * Query accounts returns "Successful response with account data" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.queryAccounts"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiQueryAccountsRequest = { body: { data: { attributes: { limit: 20, query: "plan_type:enterprise AND user_count:>100 AND subscription_status:active", selectColumns: [ "account_id", "account_name", "user_count", "plan_type", "subscription_status", "created_at", "mrr", "industry", ], sort: { field: "user_count", order: "DESC", }, wildcardSearchTerm: "tech", }, id: "query_account_request", type: "query_account_request", }, }, }; apiInstance .queryAccounts(params) .then((data: v2.QueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create connection](https://docs.datadoghq.com/api/latest/rum-audience-management/#create-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#create-connection-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.ap2.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.datadoghq.eu/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.ddog-gov.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.us3.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.us5.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection ### Overview Create a new data connection and its fields for an entity ### Arguments #### Path Parameters Name Type Description entity [_required_] string The entity for which to create the connection ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object fields [object] description string display_name string groups [string] id [_required_] string source_name [_required_] string type [_required_] string join_attribute [_required_] string join_type [_required_] string metadata object string type [_required_] string id string type [_required_] enum Connection id resource type. 
Allowed enum values: `connection_id` default: `connection_id` ``` { "data": { "attributes": { "fields": [ { "description": "string", "display_name": "string", "groups": [], "id": "", "source_name": "", "type": "" } ], "join_attribute": "", "join_type": "", "metadata": { "": "string" }, "type": "" }, "id": "string", "type": "connection_id" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/rum-audience-management/#CreateConnection-201-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#CreateConnection-429-v2) Connection created successfully Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Create connection Copy ``` # Path parameters export entity="users" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/product-analytics/${entity}/mapping/connection" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "fields": [ { "id": "", "source_name": "", "type": "" } ], "join_attribute": "", "join_type": "", "type": "" }, "type": "connection_id" } } EOF ``` ##### Create connection ``` """ Create connection returns "Connection created successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.create_connection_request import CreateConnectionRequest from datadog_api_client.v2.model.create_connection_request_data import CreateConnectionRequestData from datadog_api_client.v2.model.create_connection_request_data_attributes import CreateConnectionRequestDataAttributes from datadog_api_client.v2.model.create_connection_request_data_attributes_fields_items import ( CreateConnectionRequestDataAttributesFieldsItems, ) from datadog_api_client.v2.model.update_connection_request_data_type import UpdateConnectionRequestDataType body = CreateConnectionRequest( data=CreateConnectionRequestData( attributes=CreateConnectionRequestDataAttributes( fields=[ CreateConnectionRequestDataAttributesFieldsItems( description="Customer subscription tier from `CRM`", display_name="Customer Tier", id="customer_tier", source_name="subscription_tier", type="string", ), CreateConnectionRequestDataAttributesFieldsItems( description="Customer lifetime value in `USD`", display_name="Lifetime Value", id="lifetime_value", source_name="ltv", 
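                    # Editorial note (not part of the generated example): each field's "type" is the data type of the mapped column (here "string" or "number"); the separate type="ref_table" set just below is the connection's own type.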
type="number", ), ], join_attribute="user_email", join_type="email", type="ref_table", ), id="crm-integration", type=UpdateConnectionRequestDataType.CONNECTION_ID, ), ) configuration = Configuration() configuration.unstable_operations["create_connection"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) api_instance.create_connection(entity="users", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create connection ``` # Create connection returns "Connection created successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_connection".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::CreateConnectionRequest.new({ data: DatadogAPIClient::V2::CreateConnectionRequestData.new({ attributes: DatadogAPIClient::V2::CreateConnectionRequestDataAttributes.new({ fields: [ DatadogAPIClient::V2::CreateConnectionRequestDataAttributesFieldsItems.new({ description: "Customer subscription tier from `CRM`", display_name: "Customer Tier", id: "customer_tier", source_name: "subscription_tier", type: "string", }), DatadogAPIClient::V2::CreateConnectionRequestDataAttributesFieldsItems.new({ description: "Customer lifetime value in `USD`", display_name: "Lifetime Value", id: "lifetime_value", source_name: "ltv", type: "number", }), ], join_attribute: "user_email", join_type: "email", type: "ref_table", }), id: "crm-integration", type: DatadogAPIClient::V2::UpdateConnectionRequestDataType::CONNECTION_ID, }), }) p api_instance.create_connection("users", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create connection ``` // Create connection returns "Connection created successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateConnectionRequest{ Data: &datadogV2.CreateConnectionRequestData{ Attributes: &datadogV2.CreateConnectionRequestDataAttributes{ Fields: []datadogV2.CreateConnectionRequestDataAttributesFieldsItems{ { Description: datadog.PtrString(`Customer subscription tier from ` + "`" + `CRM` + "`"), DisplayName: datadog.PtrString("Customer Tier"), Id: "customer_tier", SourceName: "subscription_tier", Type: "string", }, { Description: datadog.PtrString(`Customer lifetime value in ` + "`" + `USD` + "`"), DisplayName: datadog.PtrString("Lifetime Value"), Id: "lifetime_value", SourceName: "ltv", Type: "number", }, }, JoinAttribute: "user_email", JoinType: "email", Type: "ref_table", }, Id: datadog.PtrString("crm-integration"), Type: datadogV2.UPDATECONNECTIONREQUESTDATATYPE_CONNECTION_ID, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
configuration.SetUnstableOperationEnabled("v2.CreateConnection", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) r, err := api.CreateConnection(ctx, "users", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.CreateConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create connection ``` // Create connection returns "Connection created successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.CreateConnectionRequest; import com.datadog.api.client.v2.model.CreateConnectionRequestData; import com.datadog.api.client.v2.model.CreateConnectionRequestDataAttributes; import com.datadog.api.client.v2.model.CreateConnectionRequestDataAttributesFieldsItems; import com.datadog.api.client.v2.model.UpdateConnectionRequestDataType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createConnection", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); CreateConnectionRequest body = new CreateConnectionRequest() .data( new CreateConnectionRequestData() .attributes( new CreateConnectionRequestDataAttributes() .fields( Arrays.asList( new CreateConnectionRequestDataAttributesFieldsItems() .description("Customer subscription tier from `CRM`") .displayName("Customer Tier") .id("customer_tier") .sourceName("subscription_tier") .type("string"), new CreateConnectionRequestDataAttributesFieldsItems() .description("Customer lifetime value in `USD`") .displayName("Lifetime Value") .id("lifetime_value") .sourceName("ltv") .type("number"))) .joinAttribute("user_email") .joinType("email") .type("ref_table")) .id("crm-integration") .type(UpdateConnectionRequestDataType.CONNECTION_ID)); try { apiInstance.createConnection("users", body); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#createConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create connection ``` // Create connection returns "Connection created successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::CreateConnectionRequest; use datadog_api_client::datadogV2::model::CreateConnectionRequestData; use 
datadog_api_client::datadogV2::model::CreateConnectionRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateConnectionRequestDataAttributesFieldsItems; use datadog_api_client::datadogV2::model::UpdateConnectionRequestDataType; #[tokio::main] async fn main() { let body = CreateConnectionRequest::new().data( CreateConnectionRequestData::new(UpdateConnectionRequestDataType::CONNECTION_ID) .attributes( CreateConnectionRequestDataAttributes::new( "user_email".to_string(), "email".to_string(), "ref_table".to_string(), ) .fields(vec![ CreateConnectionRequestDataAttributesFieldsItems::new( "customer_tier".to_string(), "subscription_tier".to_string(), "string".to_string(), ) .description(r#"Customer subscription tier from `CRM`"#.to_string()) .display_name("Customer Tier".to_string()), CreateConnectionRequestDataAttributesFieldsItems::new( "lifetime_value".to_string(), "ltv".to_string(), "number".to_string(), ) .description(r#"Customer lifetime value in `USD`"#.to_string()) .display_name("Lifetime Value".to_string()), ]), ) .id("crm-integration".to_string()), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateConnection", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.create_connection("users".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create connection ``` /** * Create connection returns "Connection created successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createConnection"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiCreateConnectionRequest = { body: { data: { attributes: { fields: [ { description: `Customer subscription tier from ` + "`" + `CRM` + "`", displayName: "Customer Tier", id: "customer_tier", sourceName: "subscription_tier", type: "string", }, { description: `Customer lifetime value in ` + "`" + `USD` + "`", displayName: "Lifetime Value", id: "lifetime_value", sourceName: "ltv", type: "number", }, ], joinAttribute: "user_email", joinType: "email", type: "ref_table", }, id: "crm-integration", type: "connection_id", }, }, entity: "users", }; apiInstance .createConnection(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update connection](https://docs.datadoghq.com/api/latest/rum-audience-management/#update-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#update-connection-v2) **Note** : This endpoint may be subject to changes. PUT https://api.ap1.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.ap2.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.datadoghq.eu/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.ddog-gov.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.us3.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionhttps://api.us5.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection ### Overview Update an existing data connection by adding, updating, or deleting fields ### Arguments #### Path Parameters Name Type Description entity [_required_] string The entity for which to update the connection ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object fields_to_add [object] description string display_name string groups [string] id [_required_] string source_name [_required_] string type [_required_] string fields_to_delete [string] fields_to_update [object] field_id [_required_] string updated_description string updated_display_name string updated_field_id string updated_groups [string] id [_required_] string type [_required_] enum Connection id resource type. Allowed enum values: `connection_id` default: `connection_id` ``` { "data": { "attributes": { "fields_to_add": [ { "description": "string", "display_name": "string", "groups": [], "id": "", "source_name": "", "type": "" } ], "fields_to_delete": [], "fields_to_update": [ { "field_id": "", "updated_description": "string", "updated_display_name": "string", "updated_field_id": "string", "updated_groups": [] } ] }, "id": "", "type": "connection_id" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#UpdateConnection-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#UpdateConnection-429-v2) Connection updated successfully Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Update connection Copy ``` # Path parameters export entity="users" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/product-analytics/${entity}/mapping/connection" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "fields_to_add": [ { "id": "", "source_name": "", "type": "" } ], "fields_to_update": [ { "field_id": "" } ] }, "id": "", "type": "connection_id" } } EOF ``` ##### Update connection ``` """ Update connection returns "Connection updated successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.create_connection_request_data_attributes_fields_items import ( CreateConnectionRequestDataAttributesFieldsItems, ) from datadog_api_client.v2.model.update_connection_request import UpdateConnectionRequest from datadog_api_client.v2.model.update_connection_request_data import UpdateConnectionRequestData from datadog_api_client.v2.model.update_connection_request_data_attributes import UpdateConnectionRequestDataAttributes from datadog_api_client.v2.model.update_connection_request_data_attributes_fields_to_update_items import ( UpdateConnectionRequestDataAttributesFieldsToUpdateItems, ) from datadog_api_client.v2.model.update_connection_request_data_type import UpdateConnectionRequestDataType body = UpdateConnectionRequest( data=UpdateConnectionRequestData( attributes=UpdateConnectionRequestDataAttributes( fields_to_add=[ CreateConnectionRequestDataAttributesFieldsItems( description="Net Promoter Score from customer surveys", display_name="NPS Score", groups=[ "Satisfaction", "Metrics", ], id="nps_score", source_name="net_promoter_score", type="number", ), ], fields_to_delete=[ "old_revenue_field", ], fields_to_update=[ UpdateConnectionRequestDataAttributesFieldsToUpdateItems( field_id="lifetime_value", updated_display_name="Customer Lifetime Value (`USD`)", updated_groups=[ "Financial", "Metrics", ], ), ], ), id="crm-integration", type=UpdateConnectionRequestDataType.CONNECTION_ID, ), ) configuration = Configuration() configuration.unstable_operations["update_connection"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) api_instance.update_connection(entity="users", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update connection ``` # Update connection returns "Connection updated successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_connection".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::UpdateConnectionRequest.new({ data: DatadogAPIClient::V2::UpdateConnectionRequestData.new({ attributes: DatadogAPIClient::V2::UpdateConnectionRequestDataAttributes.new({ fields_to_add: [ DatadogAPIClient::V2::CreateConnectionRequestDataAttributesFieldsItems.new({ description: "Net Promoter Score from customer surveys", display_name: "NPS Score", groups: [ "Satisfaction", "Metrics", ], id: "nps_score", source_name: "net_promoter_score", type: "number", }), ], fields_to_delete: [ "old_revenue_field", ], fields_to_update: [ DatadogAPIClient::V2::UpdateConnectionRequestDataAttributesFieldsToUpdateItems.new({ field_id: "lifetime_value", updated_display_name: "Customer Lifetime Value (`USD`)", updated_groups: [ "Financial", "Metrics", ], }), ], }), id: "crm-integration", type: DatadogAPIClient::V2::UpdateConnectionRequestDataType::CONNECTION_ID, }), }) p api_instance.update_connection("users", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update connection ``` // Update connection returns "Connection updated successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpdateConnectionRequest{ Data: &datadogV2.UpdateConnectionRequestData{ Attributes: &datadogV2.UpdateConnectionRequestDataAttributes{ FieldsToAdd: []datadogV2.CreateConnectionRequestDataAttributesFieldsItems{ { Description: datadog.PtrString("Net Promoter Score from customer surveys"), DisplayName: datadog.PtrString("NPS Score"), Groups: []string{ "Satisfaction", "Metrics", }, Id: "nps_score", SourceName: "net_promoter_score", Type: "number", }, }, FieldsToDelete: []string{ "old_revenue_field", }, FieldsToUpdate: []datadogV2.UpdateConnectionRequestDataAttributesFieldsToUpdateItems{ { FieldId: "lifetime_value", UpdatedDisplayName: datadog.PtrString(`Customer Lifetime Value (` + "`" + `USD` + "`" + `)`), UpdatedGroups: []string{ "Financial", "Metrics", }, }, }, }, Id: "crm-integration", Type: datadogV2.UPDATECONNECTIONREQUESTDATATYPE_CONNECTION_ID, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateConnection", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) r, err := api.UpdateConnection(ctx, "users", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.UpdateConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update connection ``` // Update connection returns "Connection updated successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.CreateConnectionRequestDataAttributesFieldsItems; import com.datadog.api.client.v2.model.UpdateConnectionRequest; import com.datadog.api.client.v2.model.UpdateConnectionRequestData; import com.datadog.api.client.v2.model.UpdateConnectionRequestDataAttributes; import com.datadog.api.client.v2.model.UpdateConnectionRequestDataAttributesFieldsToUpdateItems; import com.datadog.api.client.v2.model.UpdateConnectionRequestDataType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateConnection", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); UpdateConnectionRequest body = new UpdateConnectionRequest() .data( new UpdateConnectionRequestData() .attributes( new UpdateConnectionRequestDataAttributes() .fieldsToAdd( Collections.singletonList( new CreateConnectionRequestDataAttributesFieldsItems() .description("Net Promoter Score from customer surveys") .displayName("NPS Score") .groups(Arrays.asList("Satisfaction", "Metrics")) .id("nps_score") .sourceName("net_promoter_score") .type("number"))) .fieldsToDelete(Collections.singletonList("old_revenue_field")) .fieldsToUpdate( Collections.singletonList( new UpdateConnectionRequestDataAttributesFieldsToUpdateItems() .fieldId("lifetime_value") .updatedDisplayName("Customer Lifetime Value (`USD`)") .updatedGroups(Arrays.asList("Financial", "Metrics"))))) .id("crm-integration") .type(UpdateConnectionRequestDataType.CONNECTION_ID)); try { apiInstance.updateConnection("users", body); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#updateConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update connection ``` // Update connection returns "Connection updated successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::CreateConnectionRequestDataAttributesFieldsItems; use datadog_api_client::datadogV2::model::UpdateConnectionRequest; use datadog_api_client::datadogV2::model::UpdateConnectionRequestData; use datadog_api_client::datadogV2::model::UpdateConnectionRequestDataAttributes; use 
datadog_api_client::datadogV2::model::UpdateConnectionRequestDataAttributesFieldsToUpdateItems; use datadog_api_client::datadogV2::model::UpdateConnectionRequestDataType; #[tokio::main] async fn main() { let body = UpdateConnectionRequest::new().data( UpdateConnectionRequestData::new( "crm-integration".to_string(), UpdateConnectionRequestDataType::CONNECTION_ID, ) .attributes( UpdateConnectionRequestDataAttributes::new() .fields_to_add(vec![CreateConnectionRequestDataAttributesFieldsItems::new( "nps_score".to_string(), "net_promoter_score".to_string(), "number".to_string(), ) .description("Net Promoter Score from customer surveys".to_string()) .display_name("NPS Score".to_string()) .groups(vec!["Satisfaction".to_string(), "Metrics".to_string()])]) .fields_to_delete(vec!["old_revenue_field".to_string()]) .fields_to_update(vec![ UpdateConnectionRequestDataAttributesFieldsToUpdateItems::new( "lifetime_value".to_string(), ) .updated_display_name(r#"Customer Lifetime Value (`USD`)"#.to_string()) .updated_groups(vec!["Financial".to_string(), "Metrics".to_string()]), ]), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateConnection", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.update_connection("users".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update connection ``` /** * Update connection returns "Connection updated successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateConnection"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiUpdateConnectionRequest = { body: { data: { attributes: { fieldsToAdd: [ { description: "Net Promoter Score from customer surveys", displayName: "NPS Score", groups: ["Satisfaction", "Metrics"], id: "nps_score", sourceName: "net_promoter_score", type: "number", }, ], fieldsToDelete: ["old_revenue_field"], fieldsToUpdate: [ { fieldId: "lifetime_value", updatedDisplayName: `Customer Lifetime Value (` + "`" + `USD` + "`" + `)`, updatedGroups: ["Financial", "Metrics"], }, ], }, id: "crm-integration", type: "connection_id", }, }, entity: "users", }; apiInstance .updateConnection(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Query event filtered users](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-event-filtered-users) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-event-filtered-users-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/product-analytics/users/event_filtered_queryhttps://api.ap2.datadoghq.com/api/v2/product-analytics/users/event_filtered_queryhttps://api.datadoghq.eu/api/v2/product-analytics/users/event_filtered_queryhttps://api.ddog-gov.com/api/v2/product-analytics/users/event_filtered_queryhttps://api.datadoghq.com/api/v2/product-analytics/users/event_filtered_queryhttps://api.us3.datadoghq.com/api/v2/product-analytics/users/event_filtered_queryhttps://api.us5.datadoghq.com/api/v2/product-analytics/users/event_filtered_query ### Overview Query users filtered by both user properties and event platform data ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object event_query object query string time_frame object end int64 start int64 include_row_count boolean limit int64 query string select_columns [string] id string type [_required_] enum Query event filtered users request resource type. Allowed enum values: `query_event_filtered_users_request` default: `query_event_filtered_users_request` ``` { "data": { "attributes": { "event_query": { "query": "string", "time_frame": { "end": "integer", "start": "integer" } }, "include_row_count": false, "limit": "integer", "query": "string", "select_columns": [] }, "id": "string", "type": "query_event_filtered_users_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryEventFilteredUsers-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryEventFilteredUsers-429-v2) Successful response with filtered user data * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object hits [] total int64 id string type [_required_] enum Query response resource type. 
Allowed enum values: `query_response` default: `query_response` ``` { "data": { "attributes": { "hits": [ { "first_browser_name": "Chrome", "first_city": "San Francisco", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T06:45:12.142Z", "session_count": 47, "user_created": "2024-12-15T08:42:33.287Z", "user_email": "john.smith@techcorp.com", "user_id": "150847", "user_name": "John Smith", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Austin", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T05:22:08.951Z", "session_count": 89, "user_created": "2024-11-28T14:17:45.634Z", "user_email": "john.williams@techcorp.com", "user_id": "150848", "user_name": "John Williams", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Seattle", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T04:18:34.726Z", "session_count": 23, "user_created": "2025-01-03T16:33:21.445Z", "user_email": "john.jones@techcorp.com", "user_id": "150849", "user_name": "John Jones", "user_org_id": "5001" } ], "total": 147 }, "id": "query_response", "type": "query_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Query event filtered users Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/product-analytics/users/event_filtered_query" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "query_event_filtered_users_request" } } EOF ``` ##### Query event filtered users ``` """ Query event filtered users returns "Successful response with filtered user data" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.query_event_filtered_users_request import QueryEventFilteredUsersRequest from datadog_api_client.v2.model.query_event_filtered_users_request_data import QueryEventFilteredUsersRequestData from datadog_api_client.v2.model.query_event_filtered_users_request_data_attributes import ( QueryEventFilteredUsersRequestDataAttributes, ) from datadog_api_client.v2.model.query_event_filtered_users_request_data_attributes_event_query import ( QueryEventFilteredUsersRequestDataAttributesEventQuery, ) from
datadog_api_client.v2.model.query_event_filtered_users_request_data_attributes_event_query_time_frame import ( QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame, ) from datadog_api_client.v2.model.query_event_filtered_users_request_data_type import ( QueryEventFilteredUsersRequestDataType, ) body = QueryEventFilteredUsersRequest( data=QueryEventFilteredUsersRequestData( attributes=QueryEventFilteredUsersRequestDataAttributes( event_query=QueryEventFilteredUsersRequestDataAttributesEventQuery( query="@type:view AND @view.loading_time:>3000 AND @application.name:ecommerce-platform", time_frame=QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame( end=1761309676, start=1760100076, ), ), include_row_count=True, limit=25, query="user_org_id:5001 AND first_country_code:US AND first_browser_name:Chrome", select_columns=[ "user_id", "user_email", "first_country_code", "first_browser_name", "events_count", "session_count", "error_count", "avg_loading_time", ], ), id="query_event_filtered_users_request", type=QueryEventFilteredUsersRequestDataType.QUERY_EVENT_FILTERED_USERS_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["query_event_filtered_users"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.query_event_filtered_users(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Query event filtered users ``` # Query event filtered users returns "Successful response with filtered user data" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.query_event_filtered_users".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::QueryEventFilteredUsersRequest.new({ data: DatadogAPIClient::V2::QueryEventFilteredUsersRequestData.new({ attributes: DatadogAPIClient::V2::QueryEventFilteredUsersRequestDataAttributes.new({ event_query: DatadogAPIClient::V2::QueryEventFilteredUsersRequestDataAttributesEventQuery.new({ query: "@type:view AND @view.loading_time:>3000 AND @application.name:ecommerce-platform", time_frame: DatadogAPIClient::V2::QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame.new({ _end: 1761309676, start: 1760100076, }), }), include_row_count: true, limit: 25, query: "user_org_id:5001 AND first_country_code:US AND first_browser_name:Chrome", select_columns: [ "user_id", "user_email", "first_country_code", "first_browser_name", "events_count", "session_count", "error_count", "avg_loading_time", ], }), id: "query_event_filtered_users_request", type: DatadogAPIClient::V2::QueryEventFilteredUsersRequestDataType::QUERY_EVENT_FILTERED_USERS_REQUEST, }), }) p api_instance.query_event_filtered_users(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Query event filtered users 
``` // Query event filtered users returns "Successful response with filtered user data" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.QueryEventFilteredUsersRequest{ Data: &datadogV2.QueryEventFilteredUsersRequestData{ Attributes: &datadogV2.QueryEventFilteredUsersRequestDataAttributes{ EventQuery: &datadogV2.QueryEventFilteredUsersRequestDataAttributesEventQuery{ Query: datadog.PtrString("@type:view AND @view.loading_time:>3000 AND @application.name:ecommerce-platform"), TimeFrame: &datadogV2.QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame{ End: datadog.PtrInt64(1761309676), Start: datadog.PtrInt64(1760100076), }, }, IncludeRowCount: datadog.PtrBool(true), Limit: datadog.PtrInt64(25), Query: datadog.PtrString("user_org_id:5001 AND first_country_code:US AND first_browser_name:Chrome"), SelectColumns: []string{ "user_id", "user_email", "first_country_code", "first_browser_name", "events_count", "session_count", "error_count", "avg_loading_time", }, }, Id: datadog.PtrString("query_event_filtered_users_request"), Type: datadogV2.QUERYEVENTFILTEREDUSERSREQUESTDATATYPE_QUERY_EVENT_FILTERED_USERS_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.QueryEventFilteredUsers", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.QueryEventFilteredUsers(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.QueryEventFilteredUsers`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.QueryEventFilteredUsers`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Query event filtered users ``` // Query event filtered users returns "Successful response with filtered user data" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequest; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequestData; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequestDataAttributes; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequestDataAttributesEventQuery; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame; import com.datadog.api.client.v2.model.QueryEventFilteredUsersRequestDataType; import com.datadog.api.client.v2.model.QueryResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.queryEventFilteredUsers", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); QueryEventFilteredUsersRequest body 
= new QueryEventFilteredUsersRequest() .data( new QueryEventFilteredUsersRequestData() .attributes( new QueryEventFilteredUsersRequestDataAttributes() .eventQuery( new QueryEventFilteredUsersRequestDataAttributesEventQuery() .query( "@type:view AND @view.loading_time:>3000 AND" + " @application.name:ecommerce-platform") .timeFrame( new QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame() .end(1761309676L) .start(1760100076L))) .includeRowCount(true) .limit(25L) .query( "user_org_id:5001 AND first_country_code:US AND" + " first_browser_name:Chrome") .selectColumns( Arrays.asList( "user_id", "user_email", "first_country_code", "first_browser_name", "events_count", "session_count", "error_count", "avg_loading_time"))) .id("query_event_filtered_users_request") .type( QueryEventFilteredUsersRequestDataType.QUERY_EVENT_FILTERED_USERS_REQUEST)); try { QueryResponse result = apiInstance.queryEventFilteredUsers(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#queryEventFilteredUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Query event filtered users ``` // Query event filtered users returns "Successful response with filtered user // data" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequest; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequestData; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequestDataAttributes; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequestDataAttributesEventQuery; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame; use datadog_api_client::datadogV2::model::QueryEventFilteredUsersRequestDataType; #[tokio::main] async fn main() { let body = QueryEventFilteredUsersRequest ::new().data( QueryEventFilteredUsersRequestData::new( QueryEventFilteredUsersRequestDataType::QUERY_EVENT_FILTERED_USERS_REQUEST, ) .attributes( QueryEventFilteredUsersRequestDataAttributes::new() .event_query( QueryEventFilteredUsersRequestDataAttributesEventQuery::new() .query( "@type:view AND @view.loading_time:>3000 AND @application.name:ecommerce-platform".to_string(), ) .time_frame( QueryEventFilteredUsersRequestDataAttributesEventQueryTimeFrame::new() .end(1761309676) .start(1760100076), ), ) .include_row_count(true) .limit(25) .query("user_org_id:5001 AND first_country_code:US AND first_browser_name:Chrome".to_string()) .select_columns( vec![ "user_id".to_string(), "user_email".to_string(), "first_country_code".to_string(), "first_browser_name".to_string(), "events_count".to_string(), "session_count".to_string(), "error_count".to_string(), "avg_loading_time".to_string() ], ), ) .id("query_event_filtered_users_request".to_string()), ); let mut configuration = datadog::Configuration::new(); 
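// QueryEventFilteredUsers is an unstable operation, so it must be explicitly enabled on the configuration before the client will send the request.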
configuration.set_unstable_operation_enabled("v2.QueryEventFilteredUsers", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.query_event_filtered_users(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Query event filtered users ``` /** * Query event filtered users returns "Successful response with filtered user data" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.queryEventFilteredUsers"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiQueryEventFilteredUsersRequest = { body: { data: { attributes: { eventQuery: { query: "@type:view AND @view.loading_time:>3000 AND @application.name:ecommerce-platform", timeFrame: { end: 1761309676, start: 1760100076, }, }, includeRowCount: true, limit: 25, query: "user_org_id:5001 AND first_country_code:US AND first_browser_name:Chrome", selectColumns: [ "user_id", "user_email", "first_country_code", "first_browser_name", "events_count", "session_count", "error_count", "avg_loading_time", ], }, id: "query_event_filtered_users_request", type: "query_event_filtered_users_request", }, }, }; apiInstance .queryEventFilteredUsers(params) .then((data: v2.QueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get account facet info](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-account-facet-info) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-account-facet-info-v2) **Note** : This endpoint may be subject to changes. 
POST https://api.ap1.datadoghq.com/api/v2/product-analytics/accounts/facet_infohttps://api.ap2.datadoghq.com/api/v2/product-analytics/accounts/facet_infohttps://api.datadoghq.eu/api/v2/product-analytics/accounts/facet_infohttps://api.ddog-gov.com/api/v2/product-analytics/accounts/facet_infohttps://api.datadoghq.com/api/v2/product-analytics/accounts/facet_infohttps://api.us3.datadoghq.com/api/v2/product-analytics/accounts/facet_infohttps://api.us5.datadoghq.com/api/v2/product-analytics/accounts/facet_info ### Overview Get facet information for account attributes including possible values and counts ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object facet_id [_required_] string limit [_required_] int64 search object query string term_search object value string id string type [_required_] enum Users facet info request resource type. Allowed enum values: `users_facet_info_request` default: `users_facet_info_request` ``` { "data": { "attributes": { "facet_id": "", "limit": 0, "search": { "query": "string" }, "term_search": { "value": "string" } }, "id": "string", "type": "users_facet_info_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetAccountFacetInfo-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetAccountFacetInfo-429-v2) Successful response with facet information * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object result object range object max object min object values [object] count int64 value string id string type [_required_] enum Users facet info resource type. Allowed enum values: `users_facet_info` default: `users_facet_info` ``` { "data": { "attributes": { "result": { "values": [ { "count": 4892, "value": "Chrome" }, { "count": 2341, "value": "Safari" }, { "count": 1567, "value": "Firefox" }, { "count": 892, "value": "Edge" }, { "count": 234, "value": "Opera" } ] } }, "id": "facet_info_response", "type": "users_facet_info" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Get account facet info Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/product-analytics/accounts/facet_info" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "facet_id": "", "limit": 0 }, "type": "users_facet_info_request" } } EOF ``` ##### Get account facet info ``` """ Get account facet info returns "Successful response with facet information" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.facet_info_request import FacetInfoRequest from datadog_api_client.v2.model.facet_info_request_data import FacetInfoRequestData from datadog_api_client.v2.model.facet_info_request_data_attributes import FacetInfoRequestDataAttributes from datadog_api_client.v2.model.facet_info_request_data_attributes_search import FacetInfoRequestDataAttributesSearch from datadog_api_client.v2.model.facet_info_request_data_attributes_term_search import ( FacetInfoRequestDataAttributesTermSearch, ) from datadog_api_client.v2.model.facet_info_request_data_type import FacetInfoRequestDataType body = FacetInfoRequest( data=FacetInfoRequestData( attributes=FacetInfoRequestDataAttributes( facet_id="first_browser_name", limit=10, search=FacetInfoRequestDataAttributesSearch( query="user_org_id:5001 AND first_country_code:US", ), term_search=FacetInfoRequestDataAttributesTermSearch( value="Chrome", ), ), id="facet_info_request", type=FacetInfoRequestDataType.USERS_FACET_INFO_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["get_account_facet_info"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.get_account_facet_info(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get account facet info ``` # Get account facet info returns "Successful response with facet information" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_account_facet_info".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body =
DatadogAPIClient::V2::FacetInfoRequest.new({ data: DatadogAPIClient::V2::FacetInfoRequestData.new({ attributes: DatadogAPIClient::V2::FacetInfoRequestDataAttributes.new({ facet_id: "first_browser_name", limit: 10, search: DatadogAPIClient::V2::FacetInfoRequestDataAttributesSearch.new({ query: "user_org_id:5001 AND first_country_code:US", }), term_search: DatadogAPIClient::V2::FacetInfoRequestDataAttributesTermSearch.new({ value: "Chrome", }), }), id: "facet_info_request", type: DatadogAPIClient::V2::FacetInfoRequestDataType::USERS_FACET_INFO_REQUEST, }), }) p api_instance.get_account_facet_info(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get account facet info ``` // Get account facet info returns "Successful response with facet information" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FacetInfoRequest{ Data: &datadogV2.FacetInfoRequestData{ Attributes: &datadogV2.FacetInfoRequestDataAttributes{ FacetId: "first_browser_name", Limit: 10, Search: &datadogV2.FacetInfoRequestDataAttributesSearch{ Query: datadog.PtrString("user_org_id:5001 AND first_country_code:US"), }, TermSearch: &datadogV2.FacetInfoRequestDataAttributesTermSearch{ Value: datadog.PtrString("Chrome"), }, }, Id: datadog.PtrString("facet_info_request"), Type: datadogV2.FACETINFOREQUESTDATATYPE_USERS_FACET_INFO_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetAccountFacetInfo", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.GetAccountFacetInfo(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.GetAccountFacetInfo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.GetAccountFacetInfo`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get account facet info ``` // Get account facet info returns "Successful response with facet information" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.FacetInfoRequest; import com.datadog.api.client.v2.model.FacetInfoRequestData; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributes; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributesSearch; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributesTermSearch; import com.datadog.api.client.v2.model.FacetInfoRequestDataType; import 
com.datadog.api.client.v2.model.FacetInfoResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getAccountFacetInfo", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); FacetInfoRequest body = new FacetInfoRequest() .data( new FacetInfoRequestData() .attributes( new FacetInfoRequestDataAttributes() .facetId("first_browser_name") .limit(10L) .search( new FacetInfoRequestDataAttributesSearch() .query("user_org_id:5001 AND first_country_code:US")) .termSearch( new FacetInfoRequestDataAttributesTermSearch().value("Chrome"))) .id("facet_info_request") .type(FacetInfoRequestDataType.USERS_FACET_INFO_REQUEST)); try { FacetInfoResponse result = apiInstance.getAccountFacetInfo(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#getAccountFacetInfo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get account facet info ``` // Get account facet info returns "Successful response with facet information" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::FacetInfoRequest; use datadog_api_client::datadogV2::model::FacetInfoRequestData; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributes; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributesSearch; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributesTermSearch; use datadog_api_client::datadogV2::model::FacetInfoRequestDataType; #[tokio::main] async fn main() { let body = FacetInfoRequest::new().data( FacetInfoRequestData::new(FacetInfoRequestDataType::USERS_FACET_INFO_REQUEST) .attributes( FacetInfoRequestDataAttributes::new("first_browser_name".to_string(), 10) .search( FacetInfoRequestDataAttributesSearch::new() .query("user_org_id:5001 AND first_country_code:US".to_string()), ) .term_search( FacetInfoRequestDataAttributesTermSearch::new().value("Chrome".to_string()), ), ) .id("facet_info_request".to_string()), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetAccountFacetInfo", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.get_account_facet_info(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get account facet info ``` /** * Get account facet info returns "Successful response with 
facet information" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getAccountFacetInfo"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiGetAccountFacetInfoRequest = { body: { data: { attributes: { facetId: "first_browser_name", limit: 10, search: { query: "user_org_id:5001 AND first_country_code:US", }, termSearch: { value: "Chrome", }, }, id: "facet_info_request", type: "users_facet_info_request", }, }, }; apiInstance .getAccountFacetInfo(params) .then((data: v2.FacetInfoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete connection](https://docs.datadoghq.com/api/latest/rum-audience-management/#delete-connection) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#delete-connection-v2) **Note** : This endpoint may be subject to changes. DELETE https://api.ap1.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.ap2.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.datadoghq.eu/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.ddog-gov.com/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.us3.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection/{id}https://api.us5.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connection/{id} ### Overview Delete an existing data connection for an entity ### Arguments #### Path Parameters Name Type Description id [_required_] string The connection ID to delete entity [_required_] string The entity for which to delete the connection ### Response * [204](https://docs.datadoghq.com/api/latest/rum-audience-management/#DeleteConnection-204-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#DeleteConnection-429-v2) Connection deleted successfully Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Delete connection Copy ``` # Path parameters export id="connection-id-123" export entity="users" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v2/product-analytics/${entity}/mapping/connection/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete connection ``` """ Delete connection returns "Connection deleted successfully" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi configuration = Configuration() configuration.unstable_operations["delete_connection"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) api_instance.delete_connection( id="connection-id-123", entity="users", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete connection ``` # Delete connection returns "Connection deleted successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_connection".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new api_instance.delete_connection("connection-id-123", "users") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete connection ``` // Delete connection returns "Connection deleted successfully" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteConnection", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) r, err := api.DeleteConnection(ctx, "connection-id-123", "users") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.DeleteConnection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP
response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete connection ``` // Delete connection returns "Connection deleted successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteConnection", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); try { apiInstance.deleteConnection("connection-id-123", "users"); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#deleteConnection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete connection ``` // Delete connection returns "Connection deleted successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteConnection", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api .delete_connection("connection-id-123".to_string(), "users".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete connection ``` /** * Delete connection returns "Connection deleted successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteConnection"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiDeleteConnectionRequest = { id: "connection-id-123", entity: "users", }; apiInstance .deleteConnection(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List connections](https://docs.datadoghq.com/api/latest/rum-audience-management/#list-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#list-connections-v2) **Note** : This endpoint may be subject to changes. GET https://api.ap1.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.ap2.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.datadoghq.eu/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.ddog-gov.com/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.us3.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connectionshttps://api.us5.datadoghq.com/api/v2/product-analytics/{entity}/mapping/connections ### Overview List all data connections for an entity ### Arguments #### Path Parameters Name Type Description entity [_required_] string The entity for which to list connections ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#ListConnections-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#ListConnections-429-v2) Successful response with list of connections * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object connections [object] created_at date-time created_by string fields [object] description string display_name string groups [string] id [_required_] string source_name [_required_] string type [_required_] string id string join object attribute string type string metadata object string type string updated_at date-time updated_by string id string type [_required_] enum List connections response resource type. Allowed enum values: `list_connections_response` default: `list_connections_response` ``` { "data": { "attributes": { "connections": [ { "created_at": "0001-01-01T00:00:00Z", "created_by": "00000000-0000-0000-0000-000000000000", "fields": [ { "description": "Customer subscription tier", "display_name": "Customer Tier", "groups": [ "Business", "Subscription" ], "id": "customer_tier", "source_name": "subscription_tier", "type": "string" }, { "description": "Channel through which user signed up", "display_name": "Signup Source", "groups": [ "Marketing", "Attribution" ], "id": "signup_source", "source_name": "acquisition_channel", "type": "string" } ], "id": "user-profiles-connection", "join": { "attribute": "user_email", "type": "email" }, "type": "ref_table", "updated_at": "0001-01-01T00:00:00Z", "updated_by": "00000000-0000-0000-0000-000000000000" } ] }, "id": "list_connections_response", "type": "list_connections_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### List connections Copy ``` # Path parameters export entity="users" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/product-analytics/${entity}/mapping/connections" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List connections ``` """ List connections returns "Successful response with list of connections" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi configuration = Configuration() configuration.unstable_operations["list_connections"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.list_connections( entity="users", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List connections ``` # List connections returns "Successful response with list of connections" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_connections".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new p api_instance.list_connections("users") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List connections ``` // List connections returns "Successful response with list of connections" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListConnections", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.ListConnections(ctx, "users") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling
`RumAudienceManagementApi.ListConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.ListConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List connections ``` // List connections returns "Successful response with list of connections" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.ListConnectionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listConnections", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); try { ListConnectionsResponse result = apiInstance.listConnections("users"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#listConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List connections ``` // List connections returns "Successful response with list of connections" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListConnections", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.list_connections("users".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List connections ``` /** * List connections returns "Successful response with list of connections" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listConnections"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiListConnectionsRequest = { entity: "users", }; apiInstance .listConnections(params) .then((data: v2.ListConnectionsResponse) => { 
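// On success, data is the ListConnectionsResponse for the "users" entity; it is logged as JSON below.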
console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Query users](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-users) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#query-users-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/product-analytics/users/queryhttps://api.ap2.datadoghq.com/api/v2/product-analytics/users/queryhttps://api.datadoghq.eu/api/v2/product-analytics/users/queryhttps://api.ddog-gov.com/api/v2/product-analytics/users/queryhttps://api.datadoghq.com/api/v2/product-analytics/users/queryhttps://api.us3.datadoghq.com/api/v2/product-analytics/users/queryhttps://api.us5.datadoghq.com/api/v2/product-analytics/users/query ### Overview Query users with flexible filtering by user properties, with optional wildcard search ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object limit int64 query string select_columns [string] sort object field string order string wildcard_search_term string id string type [_required_] enum Query users request resource type. Allowed enum values: `query_users_request` default: `query_users_request` ``` { "data": { "attributes": { "limit": "integer", "query": "string", "select_columns": [], "sort": { "field": "string", "order": "string" }, "wildcard_search_term": "string" }, "id": "string", "type": "query_users_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryUsers-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#QueryUsers-429-v2) Successful response with user data * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object hits [] total int64 id string type [_required_] enum Query response resource type. 
Allowed enum values: `query_response` default: `query_response` ``` { "data": { "attributes": { "hits": [ { "first_browser_name": "Chrome", "first_city": "San Francisco", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T06:45:12.142Z", "session_count": 47, "user_created": "2024-12-15T08:42:33.287Z", "user_email": "john.smith@techcorp.com", "user_id": "150847", "user_name": "John Smith", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Austin", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T05:22:08.951Z", "session_count": 89, "user_created": "2024-11-28T14:17:45.634Z", "user_email": "john.williams@techcorp.com", "user_id": "150848", "user_name": "John Williams", "user_org_id": "5001" }, { "first_browser_name": "Chrome", "first_city": "Seattle", "first_country_code": "US", "first_device_type": "Desktop", "last_seen": "2025-08-14T04:18:34.726Z", "session_count": 23, "user_created": "2025-01-03T16:33:21.445Z", "user_email": "john.jones@techcorp.com", "user_id": "150849", "user_name": "John Jones", "user_org_id": "5001" } ], "total": 147 }, "id": "query_response", "type": "query_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Query users Copy ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/product-analytics/users/query" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "query_users_request" } } EOF ``` ##### Query users ``` """ Query users returns "Successful response with user data" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.query_users_request import QueryUsersRequest from datadog_api_client.v2.model.query_users_request_data import QueryUsersRequestData from datadog_api_client.v2.model.query_users_request_data_attributes import QueryUsersRequestDataAttributes from datadog_api_client.v2.model.query_users_request_data_attributes_sort import QueryUsersRequestDataAttributesSort from datadog_api_client.v2.model.query_users_request_data_type import QueryUsersRequestDataType body = QueryUsersRequest( data=QueryUsersRequestData( attributes=QueryUsersRequestDataAttributes( limit=25, query="user_email:*@techcorp.com AND
first_country_code:US AND first_browser_name:Chrome", select_columns=[ "user_id", "user_email", "user_name", "user_org_id", "first_country_code", "first_browser_name", "first_device_type", "last_seen", ], sort=QueryUsersRequestDataAttributesSort( field="first_seen", order="DESC", ), wildcard_search_term="john", ), id="query_users_request", type=QueryUsersRequestDataType.QUERY_USERS_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["query_users"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.query_users(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Query users ``` # Query users returns "Successful response with user data" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.query_users".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::QueryUsersRequest.new({ data: DatadogAPIClient::V2::QueryUsersRequestData.new({ attributes: DatadogAPIClient::V2::QueryUsersRequestDataAttributes.new({ limit: 25, query: "user_email:*@techcorp.com AND first_country_code:US AND first_browser_name:Chrome", select_columns: [ "user_id", "user_email", "user_name", "user_org_id", "first_country_code", "first_browser_name", "first_device_type", "last_seen", ], sort: DatadogAPIClient::V2::QueryUsersRequestDataAttributesSort.new({ field: "first_seen", order: "DESC", }), wildcard_search_term: "john", }), id: "query_users_request", type: DatadogAPIClient::V2::QueryUsersRequestDataType::QUERY_USERS_REQUEST, }), }) p api_instance.query_users(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Query users ``` // Query users returns "Successful response with user data" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.QueryUsersRequest{ Data: &datadogV2.QueryUsersRequestData{ Attributes: &datadogV2.QueryUsersRequestDataAttributes{ Limit: datadog.PtrInt64(25), Query: datadog.PtrString("user_email:*@techcorp.com AND first_country_code:US AND first_browser_name:Chrome"), SelectColumns: []string{ "user_id", "user_email", "user_name", "user_org_id", "first_country_code", "first_browser_name", "first_device_type", "last_seen", }, Sort: &datadogV2.QueryUsersRequestDataAttributesSort{ Field: datadog.PtrString("first_seen"), Order: datadog.PtrString("DESC"), }, WildcardSearchTerm: datadog.PtrString("john"), }, Id: datadog.PtrString("query_users_request"), Type: datadogV2.QUERYUSERSREQUESTDATATYPE_QUERY_USERS_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
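// QueryUsers is an unstable operation and must be enabled on the configuration before the client will call it.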
configuration.SetUnstableOperationEnabled("v2.QueryUsers", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.QueryUsers(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.QueryUsers`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.QueryUsers`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Query users ``` // Query users returns "Successful response with user data" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.QueryResponse; import com.datadog.api.client.v2.model.QueryUsersRequest; import com.datadog.api.client.v2.model.QueryUsersRequestData; import com.datadog.api.client.v2.model.QueryUsersRequestDataAttributes; import com.datadog.api.client.v2.model.QueryUsersRequestDataAttributesSort; import com.datadog.api.client.v2.model.QueryUsersRequestDataType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.queryUsers", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); QueryUsersRequest body = new QueryUsersRequest() .data( new QueryUsersRequestData() .attributes( new QueryUsersRequestDataAttributes() .limit(25L) .query( "user_email:*@techcorp.com AND first_country_code:US AND" + " first_browser_name:Chrome") .selectColumns( Arrays.asList( "user_id", "user_email", "user_name", "user_org_id", "first_country_code", "first_browser_name", "first_device_type", "last_seen")) .sort( new QueryUsersRequestDataAttributesSort() .field("first_seen") .order("DESC")) .wildcardSearchTerm("john")) .id("query_users_request") .type(QueryUsersRequestDataType.QUERY_USERS_REQUEST)); try { QueryResponse result = apiInstance.queryUsers(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#queryUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Query users ``` // Query users returns "Successful response with user data" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::QueryUsersRequest; use datadog_api_client::datadogV2::model::QueryUsersRequestData; use 
datadog_api_client::datadogV2::model::QueryUsersRequestDataAttributes; use datadog_api_client::datadogV2::model::QueryUsersRequestDataAttributesSort; use datadog_api_client::datadogV2::model::QueryUsersRequestDataType; #[tokio::main] async fn main() { let body = QueryUsersRequest ::new().data( QueryUsersRequestData::new(QueryUsersRequestDataType::QUERY_USERS_REQUEST) .attributes( QueryUsersRequestDataAttributes::new() .limit(25) .query( "user_email:*@techcorp.com AND first_country_code:US AND first_browser_name:Chrome".to_string(), ) .select_columns( vec![ "user_id".to_string(), "user_email".to_string(), "user_name".to_string(), "user_org_id".to_string(), "first_country_code".to_string(), "first_browser_name".to_string(), "first_device_type".to_string(), "last_seen".to_string() ], ) .sort( QueryUsersRequestDataAttributesSort::new() .field("first_seen".to_string()) .order("DESC".to_string()), ) .wildcard_search_term("john".to_string()), ) .id("query_users_request".to_string()), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.QueryUsers", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.query_users(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Query users ``` /** * Query users returns "Successful response with user data" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.queryUsers"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiQueryUsersRequest = { body: { data: { attributes: { limit: 25, query: "user_email:*@techcorp.com AND first_country_code:US AND first_browser_name:Chrome", selectColumns: [ "user_id", "user_email", "user_name", "user_org_id", "first_country_code", "first_browser_name", "first_device_type", "last_seen", ], sort: { field: "first_seen", order: "DESC", }, wildcardSearchTerm: "john", }, id: "query_users_request", type: "query_users_request", }, }, }; apiInstance .queryUsers(params) .then((data: v2.QueryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get user facet info](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-user-facet-info) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-user-facet-info-v2) **Note** : This endpoint may be subject to changes. 
POST https://api.ap1.datadoghq.com/api/v2/product-analytics/users/facet_infohttps://api.ap2.datadoghq.com/api/v2/product-analytics/users/facet_infohttps://api.datadoghq.eu/api/v2/product-analytics/users/facet_infohttps://api.ddog-gov.com/api/v2/product-analytics/users/facet_infohttps://api.datadoghq.com/api/v2/product-analytics/users/facet_infohttps://api.us3.datadoghq.com/api/v2/product-analytics/users/facet_infohttps://api.us5.datadoghq.com/api/v2/product-analytics/users/facet_info ### Overview Get facet information for user attributes including possible values and counts ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object facet_id [_required_] string limit [_required_] int64 search object query string term_search object value string id string type [_required_] enum Users facet info request resource type. Allowed enum values: `users_facet_info_request` default: `users_facet_info_request` ``` { "data": { "attributes": { "facet_id": "", "limit": 0, "search": { "query": "string" }, "term_search": { "value": "string" } }, "id": "string", "type": "users_facet_info_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetUserFacetInfo-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetUserFacetInfo-429-v2) Successful response with facet information * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object result object range object max object min object values [object] count int64 value string id string type [_required_] enum Users facet info resource type. Allowed enum values: `users_facet_info` default: `users_facet_info` ``` { "data": { "attributes": { "result": { "values": [ { "count": 4892, "value": "Chrome" }, { "count": 2341, "value": "Safari" }, { "count": 1567, "value": "Firefox" }, { "count": 892, "value": "Edge" }, { "count": 234, "value": "Opera" } ] } }, "id": "facet_info_response", "type": "users_facet_info" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Get user facet info Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/product-analytics/users/facet_info" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "facet_id": "", "limit": 0 }, "type": "users_facet_info_request" } } EOF ``` ##### Get user facet info ``` """ Get user facet info returns "Successful response with facet information" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi from datadog_api_client.v2.model.facet_info_request import FacetInfoRequest from datadog_api_client.v2.model.facet_info_request_data import FacetInfoRequestData from datadog_api_client.v2.model.facet_info_request_data_attributes import FacetInfoRequestDataAttributes from datadog_api_client.v2.model.facet_info_request_data_attributes_search import FacetInfoRequestDataAttributesSearch from datadog_api_client.v2.model.facet_info_request_data_attributes_term_search import ( FacetInfoRequestDataAttributesTermSearch, ) from datadog_api_client.v2.model.facet_info_request_data_type import FacetInfoRequestDataType body = FacetInfoRequest( data=FacetInfoRequestData( attributes=FacetInfoRequestDataAttributes( facet_id="first_browser_name", limit=10, search=FacetInfoRequestDataAttributesSearch( query="user_org_id:5001 AND first_country_code:US", ), term_search=FacetInfoRequestDataAttributesTermSearch( value="Chrome", ), ), id="facet_info_request", type=FacetInfoRequestDataType.USERS_FACET_INFO_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["get_user_facet_info"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.get_user_facet_info(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get user facet info ``` # Get user facet info returns "Successful response with facet information" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_user_facet_info".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new body = DatadogAPIClient::V2::FacetInfoRequest.new({ data: 
DatadogAPIClient::V2::FacetInfoRequestData.new({ attributes: DatadogAPIClient::V2::FacetInfoRequestDataAttributes.new({ facet_id: "first_browser_name", limit: 10, search: DatadogAPIClient::V2::FacetInfoRequestDataAttributesSearch.new({ query: "user_org_id:5001 AND first_country_code:US", }), term_search: DatadogAPIClient::V2::FacetInfoRequestDataAttributesTermSearch.new({ value: "Chrome", }), }), id: "facet_info_request", type: DatadogAPIClient::V2::FacetInfoRequestDataType::USERS_FACET_INFO_REQUEST, }), }) p api_instance.get_user_facet_info(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get user facet info ``` // Get user facet info returns "Successful response with facet information" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FacetInfoRequest{ Data: &datadogV2.FacetInfoRequestData{ Attributes: &datadogV2.FacetInfoRequestDataAttributes{ FacetId: "first_browser_name", Limit: 10, Search: &datadogV2.FacetInfoRequestDataAttributesSearch{ Query: datadog.PtrString("user_org_id:5001 AND first_country_code:US"), }, TermSearch: &datadogV2.FacetInfoRequestDataAttributesTermSearch{ Value: datadog.PtrString("Chrome"), }, }, Id: datadog.PtrString("facet_info_request"), Type: datadogV2.FACETINFOREQUESTDATATYPE_USERS_FACET_INFO_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetUserFacetInfo", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.GetUserFacetInfo(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.GetUserFacetInfo`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.GetUserFacetInfo`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get user facet info ``` // Get user facet info returns "Successful response with facet information" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.FacetInfoRequest; import com.datadog.api.client.v2.model.FacetInfoRequestData; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributes; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributesSearch; import com.datadog.api.client.v2.model.FacetInfoRequestDataAttributesTermSearch; import com.datadog.api.client.v2.model.FacetInfoRequestDataType; import com.datadog.api.client.v2.model.FacetInfoResponse; public class Example { public static 
void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getUserFacetInfo", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); FacetInfoRequest body = new FacetInfoRequest() .data( new FacetInfoRequestData() .attributes( new FacetInfoRequestDataAttributes() .facetId("first_browser_name") .limit(10L) .search( new FacetInfoRequestDataAttributesSearch() .query("user_org_id:5001 AND first_country_code:US")) .termSearch( new FacetInfoRequestDataAttributesTermSearch().value("Chrome"))) .id("facet_info_request") .type(FacetInfoRequestDataType.USERS_FACET_INFO_REQUEST)); try { FacetInfoResponse result = apiInstance.getUserFacetInfo(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#getUserFacetInfo"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get user facet info ``` // Get user facet info returns "Successful response with facet information" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; use datadog_api_client::datadogV2::model::FacetInfoRequest; use datadog_api_client::datadogV2::model::FacetInfoRequestData; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributes; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributesSearch; use datadog_api_client::datadogV2::model::FacetInfoRequestDataAttributesTermSearch; use datadog_api_client::datadogV2::model::FacetInfoRequestDataType; #[tokio::main] async fn main() { let body = FacetInfoRequest::new().data( FacetInfoRequestData::new(FacetInfoRequestDataType::USERS_FACET_INFO_REQUEST) .attributes( FacetInfoRequestDataAttributes::new("first_browser_name".to_string(), 10) .search( FacetInfoRequestDataAttributesSearch::new() .query("user_org_id:5001 AND first_country_code:US".to_string()), ) .term_search( FacetInfoRequestDataAttributesTermSearch::new().value("Chrome".to_string()), ), ) .id("facet_info_request".to_string()), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetUserFacetInfo", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = api.get_user_facet_info(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get user facet info ``` /** * Get user facet info returns "Successful response with facet information" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); configuration.unstableOperations["v2.getUserFacetInfo"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiGetUserFacetInfoRequest = { body: { data: { attributes: { facetId: "first_browser_name", limit: 10, search: { query: "user_org_id:5001 AND first_country_code:US", }, termSearch: { value: "Chrome", }, }, id: "facet_info_request", type: "users_facet_info_request", }, }, }; apiInstance .getUserFacetInfo(params) .then((data: v2.FacetInfoResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get mapping](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-mapping) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-audience-management/#get-mapping-v2) **Note** : This endpoint may be subject to changes. GET https://api.ap1.datadoghq.com/api/v2/product-analytics/{entity}/mappinghttps://api.ap2.datadoghq.com/api/v2/product-analytics/{entity}/mappinghttps://api.datadoghq.eu/api/v2/product-analytics/{entity}/mappinghttps://api.ddog-gov.com/api/v2/product-analytics/{entity}/mappinghttps://api.datadoghq.com/api/v2/product-analytics/{entity}/mappinghttps://api.us3.datadoghq.com/api/v2/product-analytics/{entity}/mappinghttps://api.us5.datadoghq.com/api/v2/product-analytics/{entity}/mapping ### Overview Get entity mapping configuration including all available attributes and their properties ### Arguments #### Path Parameters Name Type Description entity [_required_] string The entity for which to get the mapping ### Response * [200](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetMapping-200-v2) * [429](https://docs.datadoghq.com/api/latest/rum-audience-management/#GetMapping-429-v2) Successful response with entity mapping configuration * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) Field Type Description data object attributes object attributes [object] attribute string description string display_name string groups [string] is_custom boolean type string id string type [_required_] enum Get mappings response resource type. 
Allowed enum values: `get_mappings_response` default: `get_mappings_response` ``` { "data": { "attributes": { "attributes": [ { "attribute": "user_id", "description": "Unique user identifier", "display_name": "User ID", "groups": [ "Identity" ], "is_custom": false, "type": "string" }, { "attribute": "user_email", "description": "User email address", "display_name": "Email Address", "groups": [ "Identity", "Contact" ], "is_custom": false, "type": "string" }, { "attribute": "first_country_code", "description": "The ISO code of the country for the user's first session", "display_name": "First Country Code", "groups": [ "Geography" ], "is_custom": false, "type": "string" }, { "attribute": "@customer_tier", "description": "Customer subscription tier", "display_name": "Customer Tier", "groups": [ "Business" ], "is_custom": true, "type": "string" } ] }, "id": "get_mappings_response", "type": "get_mappings_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-audience-management/) * [Example](https://docs.datadoghq.com/api/latest/rum-audience-management/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-audience-management/?code-lang=typescript) ##### Get mapping Copy ``` # Path parameters export entity="users" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/product-analytics/${entity}/mapping" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get mapping ``` """ Get mapping returns "Successful response with entity mapping configuration" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_audience_management_api import RumAudienceManagementApi configuration = Configuration() configuration.unstable_operations["get_mapping"] = True with ApiClient(configuration) as api_client: api_instance = RumAudienceManagementApi(api_client) response = api_instance.get_mapping( entity="users", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get mapping ``` # Get mapping returns "Successful response with entity mapping configuration" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_mapping".to_sym] = true end api_instance = DatadogAPIClient::V2::RumAudienceManagementAPI.new p 
api_instance.get_mapping("users") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get mapping ``` // Get mapping returns "Successful response with entity mapping configuration" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetMapping", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumAudienceManagementApi(apiClient) resp, r, err := api.GetMapping(ctx, "users") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumAudienceManagementApi.GetMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumAudienceManagementApi.GetMapping`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get mapping ``` // Get mapping returns "Successful response with entity mapping configuration" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumAudienceManagementApi; import com.datadog.api.client.v2.model.GetMappingResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getMapping", true); RumAudienceManagementApi apiInstance = new RumAudienceManagementApi(defaultClient); try { GetMappingResponse result = apiInstance.getMapping("users"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumAudienceManagementApi#getMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get mapping ``` // Get mapping returns "Successful response with entity mapping configuration" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_audience_management::RumAudienceManagementAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetMapping", true); let api = RumAudienceManagementAPI::with_config(configuration); let resp = 
api.get_mapping("users".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get mapping ``` /** * Get mapping returns "Successful response with entity mapping configuration" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getMapping"] = true; const apiInstance = new v2.RumAudienceManagementApi(configuration); const params: v2.RumAudienceManagementApiGetMappingRequest = { entity: "users", }; apiInstance .getMapping(params) .then((data: v2.GetMappingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=113a9dda-d806-43f3-a88d-34921e01a4d8&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=162a6464-53a5-4c26-bb7b-d5a157ce54ea&pt=Rum%20Audience%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-audience-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=113a9dda-d806-43f3-a88d-34921e01a4d8&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=162a6464-53a5-4c26-bb7b-d5a157ce54ea&pt=Rum%20Audience%20Management&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-audience-management%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) Feedback ## Was this page helpful? Yes 🎉 No 👎 Next ![](https://survey-images.hotjar.com/surveys/logo/90f40352a7464c849f5ce82ccd0e758d) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=5232cc43-b1b4-4c95-9614-9629e9d47498&bo=2&sid=f729f770f0bf11f0aa19994c0c4b86b5&vid=f72a14f0f0bf11f0b889879a61d8cd17&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Rum%20Audience%20Management&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-audience-management%2F&r=<=2023&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=832261) --- # Source: https://docs.datadoghq.com/api/latest/rum-metrics/ # Rum Metrics Manage configuration of [rum-based metrics](https://app.datadoghq.com/rum/generate-metrics) for your organization. 
## [Get all rum-based metrics](https://docs.datadoghq.com/api/latest/rum-metrics/#get-all-rum-based-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-metrics/#get-all-rum-based-metrics-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/config/metricshttps://api.ap2.datadoghq.com/api/v2/rum/config/metricshttps://api.datadoghq.eu/api/v2/rum/config/metricshttps://api.ddog-gov.com/api/v2/rum/config/metricshttps://api.datadoghq.com/api/v2/rum/config/metricshttps://api.us3.datadoghq.com/api/v2/rum/config/metricshttps://api.us5.datadoghq.com/api/v2/rum/config/metrics ### Overview Get the list of configured rum-based metrics with their definitions. ### Response * [200](https://docs.datadoghq.com/api/latest/rum-metrics/#ListRumMetrics-200-v2) * [403](https://docs.datadoghq.com/api/latest/rum-metrics/#ListRumMetrics-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum-metrics/#ListRumMetrics-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) All the available rum-based metric objects. Field Type Description data [object] A list of rum-based metric objects. attributes object The object describing a Datadog rum-based metric. compute object The compute rule to compute the rum-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. path string The path to the value the rum-based metric will aggregate on. Only present when `aggregation_type` is `distribution`. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` filter object The rum-based metric filter. RUM events matching this filter will be aggregated in this metric. query string The search query - following the RUM search syntax. group_by [object] The rules for the group by. path string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, `path` is used as the tag name. uniqueness object The rule to count updatable events. Is only set if `event_type` is `session` or `view`. when enum When to count updatable events. `match` when the event is first seen, or `end` when the event is complete. Allowed enum values: `match,end` id string The name of the rum-based metric. type enum The type of the resource. The value should always be rum_metrics. Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": [ { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ], "uniqueness": { "when": "match" } }, "id": "rum.sessions.webui.count", "type": "rum_metrics" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=typescript) ##### Get all rum-based metrics Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/config/metrics" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all rum-based metrics ``` """ Get all rum-based metrics returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_metrics_api import RumMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumMetricsApi(api_client) response = api_instance.list_rum_metrics() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all rum-based metrics ``` # Get all rum-based metrics returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumMetricsAPI.new p api_instance.list_rum_metrics() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all rum-based metrics ``` // Get all rum-based metrics returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumMetricsApi(apiClient) resp, r, err := api.ListRumMetrics(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumMetricsApi.ListRumMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumMetricsApi.ListRumMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all rum-based metrics ``` // Get all rum-based metrics returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumMetricsApi; import com.datadog.api.client.v2.model.RumMetricsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumMetricsApi apiInstance = new RumMetricsApi(defaultClient); try { RumMetricsResponse result = apiInstance.listRumMetrics(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumMetricsApi#listRumMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all rum-based metrics ``` // Get all rum-based metrics returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_metrics::RumMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RumMetricsAPI::with_config(configuration); let resp = api.list_rum_metrics().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all rum-based metrics ``` /** * Get all rum-based metrics returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumMetricsApi(configuration); apiInstance .listRumMetrics() .then((data: v2.RumMetricsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a rum-based metric](https://docs.datadoghq.com/api/latest/rum-metrics/#create-a-rum-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-metrics/#create-a-rum-based-metric-v2) POST https://api.ap1.datadoghq.com/api/v2/rum/config/metricshttps://api.ap2.datadoghq.com/api/v2/rum/config/metricshttps://api.datadoghq.eu/api/v2/rum/config/metricshttps://api.ddog-gov.com/api/v2/rum/config/metricshttps://api.datadoghq.com/api/v2/rum/config/metricshttps://api.us3.datadoghq.com/api/v2/rum/config/metricshttps://api.us5.datadoghq.com/api/v2/rum/config/metrics ### Overview Create a metric based on your organization’s RUM data. Returns the rum-based metric object from the request body when the request is successful. ### Request #### Body Data (required) The definition of the new rum-based metric. * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) Field Type Description data [_required_] object The new rum-based metric properties. attributes [_required_] object The object describing the Datadog rum-based metric to create. compute [_required_] object The compute rule to compute the rum-based metric. aggregation_type [_required_] enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. path string The path to the value the rum-based metric will aggregate on. Only present when `aggregation_type` is `distribution`. event_type [_required_] enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` filter object The rum-based metric filter. Events matching this filter will be aggregated in this metric. query [_required_] string The search query - following the RUM search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, `path` is used as the tag name. uniqueness object The rule to count updatable events. Is only set if `event_type` is `sessions` or `views`. when [_required_] enum When to count updatable events. `match` when the event is first seen, or `end` when the event is complete. Allowed enum values: `match,end` id [_required_] string The name of the rum-based metric. type [_required_] enum The type of the resource. The value should always be rum_metrics. 
Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "@service:web-ui" }, "group_by": [ { "path": "@browser.name", "tag_name": "browser_name" } ], "uniqueness": { "when": "match" } }, "id": "examplerummetric", "type": "rum_metrics" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/rum-metrics/#CreateRumMetric-201-v2) * [400](https://docs.datadoghq.com/api/latest/rum-metrics/#CreateRumMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum-metrics/#CreateRumMetric-403-v2) * [409](https://docs.datadoghq.com/api/latest/rum-metrics/#CreateRumMetric-409-v2) * [429](https://docs.datadoghq.com/api/latest/rum-metrics/#CreateRumMetric-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) The rum-based metric object. Field Type Description data object The rum-based metric properties. attributes object The object describing a Datadog rum-based metric. compute object The compute rule to compute the rum-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. path string The path to the value the rum-based metric will aggregate on. Only present when `aggregation_type` is `distribution`. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` filter object The rum-based metric filter. RUM events matching this filter will be aggregated in this metric. query string The search query - following the RUM search syntax. group_by [object] The rules for the group by. path string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, `path` is used as the tag name. uniqueness object The rule to count updatable events. Is only set if `event_type` is `session` or `view`. when enum When to count updatable events. `match` when the event is first seen, or `end` when the event is complete. Allowed enum values: `match,end` id string The name of the rum-based metric. type enum The type of the resource. The value should always be rum_metrics. Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ], "uniqueness": { "when": "match" } }, "id": "rum.sessions.webui.count", "type": "rum_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=typescript) ##### Create a rum-based metric returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/config/metrics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "@service:web-ui" }, "group_by": [ { "path": "@browser.name", "tag_name": "browser_name" } ], "uniqueness": { "when": "match" } }, "id": "examplerummetric", "type": "rum_metrics" } } EOF ``` ##### Create a rum-based metric returns "Created" response ``` // Create a rum-based metric returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RumMetricCreateRequest{ Data: datadogV2.RumMetricCreateData{ Attributes: datadogV2.RumMetricCreateAttributes{ Compute: datadogV2.RumMetricCompute{ AggregationType: datadogV2.RUMMETRICCOMPUTEAGGREGATIONTYPE_DISTRIBUTION, IncludePercentiles: datadog.PtrBool(true), Path: datadog.PtrString("@duration"), }, EventType: datadogV2.RUMMETRICEVENTTYPE_SESSION, Filter: &datadogV2.RumMetricFilter{ Query: "@service:web-ui", }, GroupBy: []datadogV2.RumMetricGroupBy{ { Path: "@browser.name", TagName: datadog.PtrString("browser_name"), }, }, Uniqueness: &datadogV2.RumMetricUniqueness{ When: datadogV2.RUMMETRICUNIQUENESSWHEN_WHEN_MATCH, }, }, Id: "examplerummetric", Type: datadogV2.RUMMETRICTYPE_RUM_METRICS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumMetricsApi(apiClient) resp, r, err := api.CreateRumMetric(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumMetricsApi.CreateRumMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`RumMetricsApi.CreateRumMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a rum-based metric returns "Created" response ``` // Create a rum-based metric returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumMetricsApi; import com.datadog.api.client.v2.model.RumMetricCompute; import com.datadog.api.client.v2.model.RumMetricComputeAggregationType; import com.datadog.api.client.v2.model.RumMetricCreateAttributes; import com.datadog.api.client.v2.model.RumMetricCreateData; import com.datadog.api.client.v2.model.RumMetricCreateRequest; import com.datadog.api.client.v2.model.RumMetricEventType; import com.datadog.api.client.v2.model.RumMetricFilter; import com.datadog.api.client.v2.model.RumMetricGroupBy; import com.datadog.api.client.v2.model.RumMetricResponse; import com.datadog.api.client.v2.model.RumMetricType; import com.datadog.api.client.v2.model.RumMetricUniqueness; import com.datadog.api.client.v2.model.RumMetricUniquenessWhen; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumMetricsApi apiInstance = new RumMetricsApi(defaultClient); RumMetricCreateRequest body = new RumMetricCreateRequest() .data( new RumMetricCreateData() .attributes( new RumMetricCreateAttributes() .compute( new RumMetricCompute() .aggregationType(RumMetricComputeAggregationType.DISTRIBUTION) .includePercentiles(true) .path("@duration")) .eventType(RumMetricEventType.SESSION) .filter(new RumMetricFilter().query("@service:web-ui")) .groupBy( Collections.singletonList( new RumMetricGroupBy() .path("@browser.name") .tagName("browser_name"))) .uniqueness( new RumMetricUniqueness().when(RumMetricUniquenessWhen.WHEN_MATCH))) .id("examplerummetric") .type(RumMetricType.RUM_METRICS)); try { RumMetricResponse result = apiInstance.createRumMetric(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumMetricsApi#createRumMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a rum-based metric returns "Created" response ``` """ Create a rum-based metric returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_metrics_api import RumMetricsApi from datadog_api_client.v2.model.rum_metric_compute import RumMetricCompute from datadog_api_client.v2.model.rum_metric_compute_aggregation_type import RumMetricComputeAggregationType from datadog_api_client.v2.model.rum_metric_create_attributes import RumMetricCreateAttributes from 
datadog_api_client.v2.model.rum_metric_create_data import RumMetricCreateData from datadog_api_client.v2.model.rum_metric_create_request import RumMetricCreateRequest from datadog_api_client.v2.model.rum_metric_event_type import RumMetricEventType from datadog_api_client.v2.model.rum_metric_filter import RumMetricFilter from datadog_api_client.v2.model.rum_metric_group_by import RumMetricGroupBy from datadog_api_client.v2.model.rum_metric_type import RumMetricType from datadog_api_client.v2.model.rum_metric_uniqueness import RumMetricUniqueness from datadog_api_client.v2.model.rum_metric_uniqueness_when import RumMetricUniquenessWhen body = RumMetricCreateRequest( data=RumMetricCreateData( attributes=RumMetricCreateAttributes( compute=RumMetricCompute( aggregation_type=RumMetricComputeAggregationType.DISTRIBUTION, include_percentiles=True, path="@duration", ), event_type=RumMetricEventType.SESSION, filter=RumMetricFilter( query="@service:web-ui", ), group_by=[ RumMetricGroupBy( path="@browser.name", tag_name="browser_name", ), ], uniqueness=RumMetricUniqueness( when=RumMetricUniquenessWhen.WHEN_MATCH, ), ), id="examplerummetric", type=RumMetricType.RUM_METRICS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumMetricsApi(api_client) response = api_instance.create_rum_metric(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a rum-based metric returns "Created" response ``` # Create a rum-based metric returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumMetricsAPI.new body = DatadogAPIClient::V2::RumMetricCreateRequest.new({ data: DatadogAPIClient::V2::RumMetricCreateData.new({ attributes: DatadogAPIClient::V2::RumMetricCreateAttributes.new({ compute: DatadogAPIClient::V2::RumMetricCompute.new({ aggregation_type: DatadogAPIClient::V2::RumMetricComputeAggregationType::DISTRIBUTION, include_percentiles: true, path: "@duration", }), event_type: DatadogAPIClient::V2::RumMetricEventType::SESSION, filter: DatadogAPIClient::V2::RumMetricFilter.new({ query: "@service:web-ui", }), group_by: [ DatadogAPIClient::V2::RumMetricGroupBy.new({ path: "@browser.name", tag_name: "browser_name", }), ], uniqueness: DatadogAPIClient::V2::RumMetricUniqueness.new({ _when: DatadogAPIClient::V2::RumMetricUniquenessWhen::WHEN_MATCH, }), }), id: "examplerummetric", type: DatadogAPIClient::V2::RumMetricType::RUM_METRICS, }), }) p api_instance.create_rum_metric(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a rum-based metric returns "Created" response ``` // Create a rum-based metric returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_metrics::RumMetricsAPI; use datadog_api_client::datadogV2::model::RumMetricCompute; use 
datadog_api_client::datadogV2::model::RumMetricComputeAggregationType; use datadog_api_client::datadogV2::model::RumMetricCreateAttributes; use datadog_api_client::datadogV2::model::RumMetricCreateData; use datadog_api_client::datadogV2::model::RumMetricCreateRequest; use datadog_api_client::datadogV2::model::RumMetricEventType; use datadog_api_client::datadogV2::model::RumMetricFilter; use datadog_api_client::datadogV2::model::RumMetricGroupBy; use datadog_api_client::datadogV2::model::RumMetricType; use datadog_api_client::datadogV2::model::RumMetricUniqueness; use datadog_api_client::datadogV2::model::RumMetricUniquenessWhen; #[tokio::main] async fn main() { let body = RumMetricCreateRequest::new(RumMetricCreateData::new( RumMetricCreateAttributes::new( RumMetricCompute::new(RumMetricComputeAggregationType::DISTRIBUTION) .include_percentiles(true) .path("@duration".to_string()), RumMetricEventType::SESSION, ) .filter(RumMetricFilter::new("@service:web-ui".to_string())) .group_by(vec![ RumMetricGroupBy::new("@browser.name".to_string()).tag_name("browser_name".to_string()) ]) .uniqueness(RumMetricUniqueness::new( RumMetricUniquenessWhen::WHEN_MATCH, )), "examplerummetric".to_string(), RumMetricType::RUM_METRICS, )); let configuration = datadog::Configuration::new(); let api = RumMetricsAPI::with_config(configuration); let resp = api.create_rum_metric(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a rum-based metric returns "Created" response ``` /** * Create a rum-based metric returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumMetricsApi(configuration); const params: v2.RumMetricsApiCreateRumMetricRequest = { body: { data: { attributes: { compute: { aggregationType: "distribution", includePercentiles: true, path: "@duration", }, eventType: "session", filter: { query: "@service:web-ui", }, groupBy: [ { path: "@browser.name", tagName: "browser_name", }, ], uniqueness: { when: "match", }, }, id: "examplerummetric", type: "rum_metrics", }, }, }; apiInstance .createRumMetric(params) .then((data: v2.RumMetricResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a rum-based metric](https://docs.datadoghq.com/api/latest/rum-metrics/#get-a-rum-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-metrics/#get-a-rum-based-metric-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/rum/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/rum/config/metrics/{metric_id} ### Overview Get a specific rum-based metric from your organization. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the rum-based metric. ### Response * [200](https://docs.datadoghq.com/api/latest/rum-metrics/#GetRumMetric-200-v2) * [403](https://docs.datadoghq.com/api/latest/rum-metrics/#GetRumMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-metrics/#GetRumMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum-metrics/#GetRumMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) The rum-based metric object. Field Type Description data object The rum-based metric properties. attributes object The object describing a Datadog rum-based metric. compute object The compute rule to compute the rum-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. path string The path to the value the rum-based metric will aggregate on. Only present when `aggregation_type` is `distribution`. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` filter object The rum-based metric filter. RUM events matching this filter will be aggregated in this metric. query string The search query - following the RUM search syntax. group_by [object] The rules for the group by. path string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, `path` is used as the tag name. uniqueness object The rule to count updatable events. Is only set if `event_type` is `session` or `view`. when enum When to count updatable events. `match` when the event is first seen, or `end` when the event is complete. Allowed enum values: `match,end` id string The name of the rum-based metric. type enum The type of the resource. The value should always be rum_metrics. 
Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ], "uniqueness": { "when": "match" } }, "id": "rum.sessions.webui.count", "type": "rum_metrics" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=typescript) ##### Get a rum-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a rum-based metric ``` """ Get a rum-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_metrics_api import RumMetricsApi # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = environ["RUM_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumMetricsApi(api_client) response = api_instance.get_rum_metric( metric_id=RUM_METRIC_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a rum-based metric ``` # Get a rum-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumMetricsAPI.new # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = ENV["RUM_METRIC_DATA_ID"] p 
api_instance.get_rum_metric(RUM_METRIC_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a rum-based metric ``` // Get a rum-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_metric" in the system RumMetricDataID := os.Getenv("RUM_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumMetricsApi(apiClient) resp, r, err := api.GetRumMetric(ctx, RumMetricDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumMetricsApi.GetRumMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumMetricsApi.GetRumMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a rum-based metric ``` // Get a rum-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumMetricsApi; import com.datadog.api.client.v2.model.RumMetricResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumMetricsApi apiInstance = new RumMetricsApi(defaultClient); // there is a valid "rum_metric" in the system String RUM_METRIC_DATA_ID = System.getenv("RUM_METRIC_DATA_ID"); try { RumMetricResponse result = apiInstance.getRumMetric(RUM_METRIC_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumMetricsApi#getRumMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a rum-based metric ``` // Get a rum-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_metrics::RumMetricsAPI; #[tokio::main] async fn main() { // there is a valid "rum_metric" in the system let rum_metric_data_id = std::env::var("RUM_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RumMetricsAPI::with_config(configuration); let resp = 
api.get_rum_metric(rum_metric_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a rum-based metric ``` /** * Get a rum-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumMetricsApi(configuration); // there is a valid "rum_metric" in the system const RUM_METRIC_DATA_ID = process.env.RUM_METRIC_DATA_ID as string; const params: v2.RumMetricsApiGetRumMetricRequest = { metricId: RUM_METRIC_DATA_ID, }; apiInstance .getRumMetric(params) .then((data: v2.RumMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a rum-based metric](https://docs.datadoghq.com/api/latest/rum-metrics/#update-a-rum-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-metrics/#update-a-rum-based-metric-v2) PATCH https://api.ap1.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/rum/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/rum/config/metrics/{metric_id} ### Overview Update a specific rum-based metric from your organization. Returns the rum-based metric object from the request body when the request is successful. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the rum-based metric. ### Request #### Body Data (required) New definition of the rum-based metric. * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) Field Type Description data [_required_] object The new rum-based metric properties. attributes [_required_] object The rum-based metric properties that will be updated. compute object The compute rule to compute the rum-based metric. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. filter object The rum-based metric filter. Events matching this filter will be aggregated in this metric. query [_required_] string The search query - following the RUM search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. 
By default, `path` is used as the tag name. id string The name of the rum-based metric. type [_required_] enum The type of the resource. The value should always be rum_metrics. Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": { "id": "rum.sessions.webui.count", "type": "rum_metrics", "attributes": { "compute": { "include_percentiles": false }, "filter": { "query": "@service:rum-config" }, "group_by": [ { "path": "@browser.version", "tag_name": "browser_version" } ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-404-v2) * [409](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-409-v2) * [429](https://docs.datadoghq.com/api/latest/rum-metrics/#UpdateRumMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) The rum-based metric object. Field Type Description data object The rum-based metric properties. attributes object The object describing a Datadog rum-based metric. compute object The compute rule to compute the rum-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when `aggregation_type` is `distribution`. path string The path to the value the rum-based metric will aggregate on. Only present when `aggregation_type` is `distribution`. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` filter object The rum-based metric filter. RUM events matching this filter will be aggregated in this metric. query string The search query - following the RUM search syntax. group_by [object] The rules for the group by. path string The path to the value the rum-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, `path` is used as the tag name. uniqueness object The rule to count updatable events. Is only set if `event_type` is `session` or `view`. when enum When to count updatable events. `match` when the event is first seen, or `end` when the event is complete. Allowed enum values: `match,end` id string The name of the rum-based metric. type enum The type of the resource. The value should always be rum_metrics. Allowed enum values: `rum_metrics` default: `rum_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": true, "path": "@duration" }, "event_type": "session", "filter": { "query": "service:web* AND @http.status_code:[200 TO 299]" }, "group_by": [ { "path": "@http.status_code", "tag_name": "status_code" } ], "uniqueness": { "when": "match" } }, "id": "rum.sessions.webui.count", "type": "rum_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=typescript) ##### Update a rum-based metric returns "OK" response Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "rum.sessions.webui.count", "type": "rum_metrics", "attributes": { "compute": { "include_percentiles": false }, "filter": { "query": "@service:rum-config" }, "group_by": [ { "path": "@browser.version", "tag_name": "browser_version" } ] } } } EOF ``` ##### Update a rum-based metric returns "OK" response ``` // Update a rum-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_metric" in the system RumMetricDataID := os.Getenv("RUM_METRIC_DATA_ID") body := datadogV2.RumMetricUpdateRequest{ Data: datadogV2.RumMetricUpdateData{ Id: datadog.PtrString(RumMetricDataID), Type: datadogV2.RUMMETRICTYPE_RUM_METRICS, Attributes: datadogV2.RumMetricUpdateAttributes{ Compute: &datadogV2.RumMetricUpdateCompute{ IncludePercentiles: datadog.PtrBool(false), }, Filter: &datadogV2.RumMetricFilter{ Query: "@service:rum-config", }, GroupBy: []datadogV2.RumMetricGroupBy{ { Path: "@browser.version", TagName: datadog.PtrString("browser_version"), }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumMetricsApi(apiClient) resp, r, err := api.UpdateRumMetric(ctx, RumMetricDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumMetricsApi.UpdateRumMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumMetricsApi.UpdateRumMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a rum-based metric returns "OK" response ``` // Update a rum-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumMetricsApi; import com.datadog.api.client.v2.model.RumMetricFilter; import com.datadog.api.client.v2.model.RumMetricGroupBy; import com.datadog.api.client.v2.model.RumMetricResponse; import com.datadog.api.client.v2.model.RumMetricType; import com.datadog.api.client.v2.model.RumMetricUpdateAttributes; import com.datadog.api.client.v2.model.RumMetricUpdateCompute; import com.datadog.api.client.v2.model.RumMetricUpdateData; import com.datadog.api.client.v2.model.RumMetricUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumMetricsApi apiInstance = new RumMetricsApi(defaultClient); // there is a valid "rum_metric" in the system String RUM_METRIC_DATA_ID = System.getenv("RUM_METRIC_DATA_ID"); RumMetricUpdateRequest body = new RumMetricUpdateRequest() .data( new RumMetricUpdateData() .id(RUM_METRIC_DATA_ID) .type(RumMetricType.RUM_METRICS) .attributes( new RumMetricUpdateAttributes() .compute(new RumMetricUpdateCompute().includePercentiles(false)) .filter(new RumMetricFilter().query("@service:rum-config")) .groupBy( Collections.singletonList( new RumMetricGroupBy() .path("@browser.version") .tagName("browser_version"))))); try { RumMetricResponse result = apiInstance.updateRumMetric(RUM_METRIC_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumMetricsApi#updateRumMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a rum-based metric returns "OK" response ``` """ Update a rum-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_metrics_api import RumMetricsApi from datadog_api_client.v2.model.rum_metric_filter import RumMetricFilter from datadog_api_client.v2.model.rum_metric_group_by import RumMetricGroupBy from 
datadog_api_client.v2.model.rum_metric_type import RumMetricType from datadog_api_client.v2.model.rum_metric_update_attributes import RumMetricUpdateAttributes from datadog_api_client.v2.model.rum_metric_update_compute import RumMetricUpdateCompute from datadog_api_client.v2.model.rum_metric_update_data import RumMetricUpdateData from datadog_api_client.v2.model.rum_metric_update_request import RumMetricUpdateRequest # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = environ["RUM_METRIC_DATA_ID"] body = RumMetricUpdateRequest( data=RumMetricUpdateData( id=RUM_METRIC_DATA_ID, type=RumMetricType.RUM_METRICS, attributes=RumMetricUpdateAttributes( compute=RumMetricUpdateCompute( include_percentiles=False, ), filter=RumMetricFilter( query="@service:rum-config", ), group_by=[ RumMetricGroupBy( path="@browser.version", tag_name="browser_version", ), ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumMetricsApi(api_client) response = api_instance.update_rum_metric(metric_id=RUM_METRIC_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a rum-based metric returns "OK" response ``` # Update a rum-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumMetricsAPI.new # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = ENV["RUM_METRIC_DATA_ID"] body = DatadogAPIClient::V2::RumMetricUpdateRequest.new({ data: DatadogAPIClient::V2::RumMetricUpdateData.new({ id: RUM_METRIC_DATA_ID, type: DatadogAPIClient::V2::RumMetricType::RUM_METRICS, attributes: DatadogAPIClient::V2::RumMetricUpdateAttributes.new({ compute: DatadogAPIClient::V2::RumMetricUpdateCompute.new({ include_percentiles: false, }), filter: DatadogAPIClient::V2::RumMetricFilter.new({ query: "@service:rum-config", }), group_by: [ DatadogAPIClient::V2::RumMetricGroupBy.new({ path: "@browser.version", tag_name: "browser_version", }), ], }), }), }) p api_instance.update_rum_metric(RUM_METRIC_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a rum-based metric returns "OK" response ``` // Update a rum-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_metrics::RumMetricsAPI; use datadog_api_client::datadogV2::model::RumMetricFilter; use datadog_api_client::datadogV2::model::RumMetricGroupBy; use datadog_api_client::datadogV2::model::RumMetricType; use datadog_api_client::datadogV2::model::RumMetricUpdateAttributes; use datadog_api_client::datadogV2::model::RumMetricUpdateCompute; use datadog_api_client::datadogV2::model::RumMetricUpdateData; use datadog_api_client::datadogV2::model::RumMetricUpdateRequest; #[tokio::main] async fn main() { // there is a valid "rum_metric" in the system let rum_metric_data_id = std::env::var("RUM_METRIC_DATA_ID").unwrap(); 
let body = RumMetricUpdateRequest::new( RumMetricUpdateData::new( RumMetricUpdateAttributes::new() .compute(RumMetricUpdateCompute::new().include_percentiles(false)) .filter(RumMetricFilter::new("@service:rum-config".to_string())) .group_by(vec![RumMetricGroupBy::new("@browser.version".to_string()) .tag_name("browser_version".to_string())]), RumMetricType::RUM_METRICS, ) .id(rum_metric_data_id.clone()), ); let configuration = datadog::Configuration::new(); let api = RumMetricsAPI::with_config(configuration); let resp = api .update_rum_metric(rum_metric_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a rum-based metric returns "OK" response ``` /** * Update a rum-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumMetricsApi(configuration); // there is a valid "rum_metric" in the system const RUM_METRIC_DATA_ID = process.env.RUM_METRIC_DATA_ID as string; const params: v2.RumMetricsApiUpdateRumMetricRequest = { body: { data: { id: RUM_METRIC_DATA_ID, type: "rum_metrics", attributes: { compute: { includePercentiles: false, }, filter: { query: "@service:rum-config", }, groupBy: [ { path: "@browser.version", tagName: "browser_version", }, ], }, }, }, metricId: RUM_METRIC_DATA_ID, }; apiInstance .updateRumMetric(params) .then((data: v2.RumMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a rum-based metric](https://docs.datadoghq.com/api/latest/rum-metrics/#delete-a-rum-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-metrics/#delete-a-rum-based-metric-v2) DELETE https://api.ap1.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/rum/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/rum/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/rum/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/rum/config/metrics/{metric_id} ### Overview Delete a specific rum-based metric from your organization. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the rum-based metric. 
### Response * [204](https://docs.datadoghq.com/api/latest/rum-metrics/#DeleteRumMetric-204-v2) * [403](https://docs.datadoghq.com/api/latest/rum-metrics/#DeleteRumMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-metrics/#DeleteRumMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum-metrics/#DeleteRumMetric-429-v2) No Content Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-metrics/) * [Example](https://docs.datadoghq.com/api/latest/rum-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-metrics/?code-lang=typescript) ##### Delete a rum-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/config/metrics/${metric_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a rum-based metric ``` """ Delete a rum-based metric returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_metrics_api import RumMetricsApi # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = environ["RUM_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumMetricsApi(api_client) api_instance.delete_rum_metric( metric_id=RUM_METRIC_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a rum-based metric ``` # Delete a rum-based metric returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumMetricsAPI.new # there is a valid "rum_metric" in the system RUM_METRIC_DATA_ID = ENV["RUM_METRIC_DATA_ID"] api_instance.delete_rum_metric(RUM_METRIC_DATA_ID) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a rum-based metric ``` // Delete a rum-based metric returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_metric" in the system RumMetricDataID := os.Getenv("RUM_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumMetricsApi(apiClient) r, err := api.DeleteRumMetric(ctx, RumMetricDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumMetricsApi.DeleteRumMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a rum-based metric ``` // Delete a rum-based metric returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumMetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumMetricsApi apiInstance = new RumMetricsApi(defaultClient); // there is a valid "rum_metric" in the system String RUM_METRIC_DATA_ID = System.getenv("RUM_METRIC_DATA_ID"); try { apiInstance.deleteRumMetric(RUM_METRIC_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling RumMetricsApi#deleteRumMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a rum-based metric ``` // Delete a rum-based metric returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_metrics::RumMetricsAPI; #[tokio::main] async fn main() { // there is a valid "rum_metric" in the system let rum_metric_data_id = std::env::var("RUM_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RumMetricsAPI::with_config(configuration); let resp = api.delete_rum_metric(rum_metric_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
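# Note: set DD_SITE to the single Datadog site that hosts your organization
# (one of datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu,
# ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com); the value below shows the
# available sites run together.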
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a rum-based metric ``` /** * Delete a rum-based metric returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumMetricsApi(configuration); // there is a valid "rum_metric" in the system const RUM_METRIC_DATA_ID = process.env.RUM_METRIC_DATA_ID as string; const params: v2.RumMetricsApiDeleteRumMetricRequest = { metricId: RUM_METRIC_DATA_ID, }; apiInstance .deleteRumMetric(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d9cab3b9-4caa-4ab0-852a-afde53d2cd1d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=ac74284f-4fdf-4261-b335-684f3439101b&pt=Rum%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d9cab3b9-4caa-4ab0-852a-afde53d2cd1d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=ac74284f-4fdf-4261-b335-684f3439101b&pt=Rum%20Metrics&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-metrics%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=5b07697a-4657-4b2f-af6b-f9b19fa13833&bo=2&sid=6ba625b0f0c011f08f1691493ba2b986&vid=6ba6b460f0c011f0ac91d1b1b7b38028&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Rum%20Metrics&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum-metrics%2F&r=<=1558&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=663809) --- # Source: https://docs.datadoghq.com/api/latest/rum-retention-filters/ # Rum Retention Filters Manage retention filters through [Manage Applications](https://app.datadoghq.com/rum/list) of RUM for your organization. 
## [Get all RUM retention filters](https://docs.datadoghq.com/api/latest/rum-retention-filters/#get-all-rum-retention-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#get-all-rum-retention-filters-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.datadoghq.eu/api/v2/rum/applications/{app_id}/retention_filtershttps://api.ddog-gov.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters ### Overview Get the list of RUM retention filters for a RUM application. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. ### Response * [200](https://docs.datadoghq.com/api/latest/rum-retention-filters/#ListRetentionFilters-200-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#ListRetentionFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#ListRetentionFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) All RUM retention filters for a RUM application. Field Type Description data [object] A list of RUM retention filters. attributes object The object describing attributes of a RUM retention filter. enabled boolean Whether the retention filter is enabled. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate int64 The sample rate for a RUM retention filter, between 0 and 100. id string ID of retention filter in UUID. type enum The type of the resource. The value should always be retention_filters. Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": [ { "attributes": { "enabled": true, "event_type": "session", "name": "Retention filter for session", "query": "@session.has_replay:true", "sample_rate": 25 }, "id": "051601eb-54a0-abc0-03f9-cc02efa18892", "type": "retention_filters" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Get all RUM retention filters Copy ``` # Path parameters export app_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/retention_filters" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all RUM retention filters ``` """ Get all RUM retention filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) response = api_instance.list_retention_filters( app_id="1d4b9c34-7ac4-423a-91cf-9902d926e9b3", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all RUM retention filters ``` # Get all RUM retention filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new p api_instance.list_retention_filters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all RUM retention filters ``` // Get all RUM retention filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) resp, r, err := api.ListRetentionFilters(ctx, "1d4b9c34-7ac4-423a-91cf-9902d926e9b3") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.ListRetentionFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumRetentionFiltersApi.ListRetentionFilters`:\n%s\n", 
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all RUM retention filters ``` // Get all RUM retention filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; import com.datadog.api.client.v2.model.RumRetentionFiltersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); try { RumRetentionFiltersResponse result = apiInstance.listRetentionFilters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#listRetentionFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all RUM retention filters ``` // Get all RUM retention filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .list_retention_filters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all RUM retention filters ``` /** * Get all RUM retention filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumRetentionFiltersApi(configuration); const params: v2.RumRetentionFiltersApiListRetentionFiltersRequest = { appId: "1d4b9c34-7ac4-423a-91cf-9902d926e9b3", }; apiInstance .listRetentionFilters(params) .then((data: v2.RumRetentionFiltersResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a RUM retention filter](https://docs.datadoghq.com/api/latest/rum-retention-filters/#get-a-rum-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#get-a-rum-retention-filter-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.eu/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ddog-gov.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id} ### Overview Get a RUM retention filter for a RUM application. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. rf_id [_required_] string Retention filter ID. ### Response * [200](https://docs.datadoghq.com/api/latest/rum-retention-filters/#GetRetentionFilter-200-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#GetRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-retention-filters/#GetRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#GetRetentionFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) The RUM retention filter object. Field Type Description data object The RUM retention filter. attributes object The object describing attributes of a RUM retention filter. enabled boolean Whether the retention filter is enabled. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate int64 The sample rate for a RUM retention filter, between 0 and 100. id string ID of retention filter in UUID. type enum The type of the resource. The value should always be retention_filters. Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": { "attributes": { "enabled": true, "event_type": "session", "name": "Retention filter for session", "query": "@session.has_replay:true", "sample_rate": 25 }, "id": "051601eb-54a0-abc0-03f9-cc02efa18892", "type": "retention_filters" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Get a RUM retention filter Copy ``` # Path parameters export app_id="CHANGE_ME" export rf_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/retention_filters/${rf_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a RUM retention filter ``` """ Get a RUM retention filter returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) response = api_instance.get_retention_filter( app_id="a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rf_id="4b95d361-f65d-4515-9824-c9aaeba5ac2a", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a RUM retention filter ``` # Get a RUM retention filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new p api_instance.get_retention_filter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a RUM retention filter ``` // Get a RUM retention filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) resp, r, err := api.GetRetentionFilter(ctx, "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.GetRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumRetentionFiltersApi.GetRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a RUM retention filter ``` // Get a RUM retention filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; import com.datadog.api.client.v2.model.RumRetentionFilterResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); try { RumRetentionFilterResponse result = apiInstance.getRetentionFilter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#getRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a RUM retention filter ``` // Get a RUM retention filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .get_retention_filter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690".to_string(), "4b95d361-f65d-4515-9824-c9aaeba5ac2a".to_string(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a RUM retention filter ``` /** * Get a RUM retention filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumRetentionFiltersApi(configuration); const params: 
v2.RumRetentionFiltersApiGetRetentionFilterRequest = { appId: "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rfId: "4b95d361-f65d-4515-9824-c9aaeba5ac2a", }; apiInstance .getRetentionFilter(params) .then((data: v2.RumRetentionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a RUM retention filter](https://docs.datadoghq.com/api/latest/rum-retention-filters/#create-a-rum-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#create-a-rum-retention-filter-v2) POST https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.datadoghq.eu/api/v2/rum/applications/{app_id}/retention_filtershttps://api.ddog-gov.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filtershttps://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters ### Overview Create a RUM retention filter for a RUM application. Returns RUM retention filter objects from the request body when the request is successful. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. ### Request #### Body Data (required) The definition of the new RUM retention filter. * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) Field Type Description data [_required_] object The new RUM retention filter properties to create. attributes [_required_] object The object describing attributes of a RUM retention filter to create. enabled boolean Whether the retention filter is enabled. event_type [_required_] enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name [_required_] string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate [_required_] int64 The sample rate for a RUM retention filter, between 0 and 100. type [_required_] enum The type of the resource. The value should always be retention_filters. 
Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": { "type": "retention_filters", "attributes": { "name": "Test creating retention filter", "event_type": "session", "query": "custom_query", "sample_rate": 50, "enabled": true } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/rum-retention-filters/#CreateRetentionFilter-201-v2) * [400](https://docs.datadoghq.com/api/latest/rum-retention-filters/#CreateRetentionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#CreateRetentionFilter-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#CreateRetentionFilter-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) The RUM retention filter object. Field Type Description data object The RUM retention filter. attributes object The object describing attributes of a RUM retention filter. enabled boolean Whether the retention filter is enabled. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate int64 The sample rate for a RUM retention filter, between 0 and 100. id string ID of retention filter in UUID. type enum The type of the resource. The value should always be retention_filters. Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": { "attributes": { "enabled": true, "event_type": "session", "name": "Retention filter for session", "query": "@session.has_replay:true", "sample_rate": 25 }, "id": "051601eb-54a0-abc0-03f9-cc02efa18892", "type": "retention_filters" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Create a RUM retention filter returns "Created" response Copy ``` # Path parameters export app_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/retention_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "retention_filters", "attributes": { "name": "Test creating retention filter", "event_type": "session", "query": "custom_query", "sample_rate": 50, "enabled": true } } } EOF ``` ##### Create a RUM retention filter returns "Created" response ``` // Create a RUM retention filter returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RumRetentionFilterCreateRequest{ Data: datadogV2.RumRetentionFilterCreateData{ Type: datadogV2.RUMRETENTIONFILTERTYPE_RETENTION_FILTERS, Attributes: datadogV2.RumRetentionFilterCreateAttributes{ Name: "Test creating retention filter", EventType: datadogV2.RUMRETENTIONFILTEREVENTTYPE_SESSION, Query: datadog.PtrString("custom_query"), SampleRate: 50, Enabled: datadog.PtrBool(true), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) resp, r, err := api.CreateRetentionFilter(ctx, "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.CreateRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumRetentionFiltersApi.CreateRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a RUM retention filter returns "Created" response ``` // Create a RUM retention filter returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; import com.datadog.api.client.v2.model.RumRetentionFilterCreateAttributes; import 
com.datadog.api.client.v2.model.RumRetentionFilterCreateData; import com.datadog.api.client.v2.model.RumRetentionFilterCreateRequest; import com.datadog.api.client.v2.model.RumRetentionFilterEventType; import com.datadog.api.client.v2.model.RumRetentionFilterResponse; import com.datadog.api.client.v2.model.RumRetentionFilterType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); RumRetentionFilterCreateRequest body = new RumRetentionFilterCreateRequest() .data( new RumRetentionFilterCreateData() .type(RumRetentionFilterType.RETENTION_FILTERS) .attributes( new RumRetentionFilterCreateAttributes() .name("Test creating retention filter") .eventType(RumRetentionFilterEventType.SESSION) .query("custom_query") .sampleRate(50L) .enabled(true))); try { RumRetentionFilterResponse result = apiInstance.createRetentionFilter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#createRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a RUM retention filter returns "Created" response ``` """ Create a RUM retention filter returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi from datadog_api_client.v2.model.rum_retention_filter_create_attributes import RumRetentionFilterCreateAttributes from datadog_api_client.v2.model.rum_retention_filter_create_data import RumRetentionFilterCreateData from datadog_api_client.v2.model.rum_retention_filter_create_request import RumRetentionFilterCreateRequest from datadog_api_client.v2.model.rum_retention_filter_event_type import RumRetentionFilterEventType from datadog_api_client.v2.model.rum_retention_filter_type import RumRetentionFilterType body = RumRetentionFilterCreateRequest( data=RumRetentionFilterCreateData( type=RumRetentionFilterType.RETENTION_FILTERS, attributes=RumRetentionFilterCreateAttributes( name="Test creating retention filter", event_type=RumRetentionFilterEventType.SESSION, query="custom_query", sample_rate=50, enabled=True, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) response = api_instance.create_retention_filter(app_id="a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a RUM retention filter returns "Created" response ``` # Create a 
RUM retention filter returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new body = DatadogAPIClient::V2::RumRetentionFilterCreateRequest.new({ data: DatadogAPIClient::V2::RumRetentionFilterCreateData.new({ type: DatadogAPIClient::V2::RumRetentionFilterType::RETENTION_FILTERS, attributes: DatadogAPIClient::V2::RumRetentionFilterCreateAttributes.new({ name: "Test creating retention filter", event_type: DatadogAPIClient::V2::RumRetentionFilterEventType::SESSION, query: "custom_query", sample_rate: 50, enabled: true, }), }), }) p api_instance.create_retention_filter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a RUM retention filter returns "Created" response ``` // Create a RUM retention filter returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; use datadog_api_client::datadogV2::model::RumRetentionFilterCreateAttributes; use datadog_api_client::datadogV2::model::RumRetentionFilterCreateData; use datadog_api_client::datadogV2::model::RumRetentionFilterCreateRequest; use datadog_api_client::datadogV2::model::RumRetentionFilterEventType; use datadog_api_client::datadogV2::model::RumRetentionFilterType; #[tokio::main] async fn main() { let body = RumRetentionFilterCreateRequest::new(RumRetentionFilterCreateData::new( RumRetentionFilterCreateAttributes::new( RumRetentionFilterEventType::SESSION, "Test creating retention filter".to_string(), 50, ) .enabled(true) .query("custom_query".to_string()), RumRetentionFilterType::RETENTION_FILTERS, )); let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .create_retention_filter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a RUM retention filter returns "Created" response ``` /** * Create a RUM retention filter returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumRetentionFiltersApi(configuration); const params: v2.RumRetentionFiltersApiCreateRetentionFilterRequest = { body: { data: { type: "retention_filters", attributes: { name: "Test creating retention filter", eventType: "session", query: "custom_query", sampleRate: 50, enabled: true, }, }, }, appId: "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", }; apiInstance .createRetentionFilter(params) .then((data: v2.RumRetentionFilterResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a RUM retention filter](https://docs.datadoghq.com/api/latest/rum-retention-filters/#update-a-rum-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#update-a-rum-retention-filter-v2) PATCH https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.eu/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ddog-gov.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id} ### Overview Update a RUM retention filter for a RUM application. Returns RUM retention filter objects from the request body when the request is successful. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. rf_id [_required_] string Retention filter ID. ### Request #### Body Data (required) New definition of the RUM retention filter. * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) Field Type Description data [_required_] object The new RUM retention filter properties to update. attributes [_required_] object The object describing attributes of a RUM retention filter to update. enabled boolean Whether the retention filter is enabled. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate int64 The sample rate for a RUM retention filter, between 0 and 100. id [_required_] string ID of retention filter in UUID. type [_required_] enum The type of the resource. The value should always be retention_filters. 
Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": { "id": "4b95d361-f65d-4515-9824-c9aaeba5ac2a", "type": "retention_filters", "attributes": { "name": "Test updating retention filter", "event_type": "view", "query": "view_query", "sample_rate": 100, "enabled": true } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-retention-filters/#UpdateRetentionFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum-retention-filters/#UpdateRetentionFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#UpdateRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-retention-filters/#UpdateRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#UpdateRetentionFilter-429-v2) Updated * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) The RUM retention filter object. Field Type Description data object The RUM retention filter. attributes object The object describing attributes of a RUM retention filter. enabled boolean Whether the retention filter is enabled. event_type enum The type of RUM events to filter on. Allowed enum values: `session,view,action,error,resource,long_task,vital` name string The name of a RUM retention filter. query string The query string for a RUM retention filter. sample_rate int64 The sample rate for a RUM retention filter, between 0 and 100. id string ID of retention filter in UUID. type enum The type of the resource. The value should always be retention_filters. Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": { "attributes": { "enabled": true, "event_type": "session", "name": "Retention filter for session", "query": "@session.has_replay:true", "sample_rate": 25 }, "id": "051601eb-54a0-abc0-03f9-cc02efa18892", "type": "retention_filters" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Update a RUM retention filter returns "Updated" response Copy ``` # Path parameters export app_id="CHANGE_ME" export rf_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/retention_filters/${rf_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "4b95d361-f65d-4515-9824-c9aaeba5ac2a", "type": "retention_filters", "attributes": { "name": "Test updating retention filter", "event_type": "view", "query": "view_query", "sample_rate": 100, "enabled": true } } } EOF ``` ##### Update a RUM retention filter returns "Updated" response ``` // Update a RUM retention filter returns "Updated" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RumRetentionFilterUpdateRequest{ Data: datadogV2.RumRetentionFilterUpdateData{ Id: "4b95d361-f65d-4515-9824-c9aaeba5ac2a", Type: datadogV2.RUMRETENTIONFILTERTYPE_RETENTION_FILTERS, Attributes: datadogV2.RumRetentionFilterUpdateAttributes{ Name: datadog.PtrString("Test updating retention filter"), EventType: datadogV2.RUMRETENTIONFILTEREVENTTYPE_VIEW.Ptr(), Query: datadog.PtrString("view_query"), SampleRate: datadog.PtrInt64(100), Enabled: datadog.PtrBool(true), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) resp, r, err := api.UpdateRetentionFilter(ctx, "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.UpdateRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumRetentionFiltersApi.UpdateRetentionFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a RUM retention filter returns "Updated" response ``` // Update a RUM retention filter returns "Updated" response import com.datadog.api.client.ApiClient; import 
com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; import com.datadog.api.client.v2.model.RumRetentionFilterEventType; import com.datadog.api.client.v2.model.RumRetentionFilterResponse; import com.datadog.api.client.v2.model.RumRetentionFilterType; import com.datadog.api.client.v2.model.RumRetentionFilterUpdateAttributes; import com.datadog.api.client.v2.model.RumRetentionFilterUpdateData; import com.datadog.api.client.v2.model.RumRetentionFilterUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); RumRetentionFilterUpdateRequest body = new RumRetentionFilterUpdateRequest() .data( new RumRetentionFilterUpdateData() .id("4b95d361-f65d-4515-9824-c9aaeba5ac2a") .type(RumRetentionFilterType.RETENTION_FILTERS) .attributes( new RumRetentionFilterUpdateAttributes() .name("Test updating retention filter") .eventType(RumRetentionFilterEventType.VIEW) .query("view_query") .sampleRate(100L) .enabled(true))); try { RumRetentionFilterResponse result = apiInstance.updateRetentionFilter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#updateRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a RUM retention filter returns "Updated" response ``` """ Update a RUM retention filter returns "Updated" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi from datadog_api_client.v2.model.rum_retention_filter_event_type import RumRetentionFilterEventType from datadog_api_client.v2.model.rum_retention_filter_type import RumRetentionFilterType from datadog_api_client.v2.model.rum_retention_filter_update_attributes import RumRetentionFilterUpdateAttributes from datadog_api_client.v2.model.rum_retention_filter_update_data import RumRetentionFilterUpdateData from datadog_api_client.v2.model.rum_retention_filter_update_request import RumRetentionFilterUpdateRequest body = RumRetentionFilterUpdateRequest( data=RumRetentionFilterUpdateData( id="4b95d361-f65d-4515-9824-c9aaeba5ac2a", type=RumRetentionFilterType.RETENTION_FILTERS, attributes=RumRetentionFilterUpdateAttributes( name="Test updating retention filter", event_type=RumRetentionFilterEventType.VIEW, query="view_query", sample_rate=100, enabled=True, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) response = api_instance.update_retention_filter( app_id="a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rf_id="4b95d361-f65d-4515-9824-c9aaeba5ac2a", body=body ) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a RUM retention filter returns "Updated" response ``` # Update a RUM retention filter returns "Updated" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new body = DatadogAPIClient::V2::RumRetentionFilterUpdateRequest.new({ data: DatadogAPIClient::V2::RumRetentionFilterUpdateData.new({ id: "4b95d361-f65d-4515-9824-c9aaeba5ac2a", type: DatadogAPIClient::V2::RumRetentionFilterType::RETENTION_FILTERS, attributes: DatadogAPIClient::V2::RumRetentionFilterUpdateAttributes.new({ name: "Test updating retention filter", event_type: DatadogAPIClient::V2::RumRetentionFilterEventType::VIEW, query: "view_query", sample_rate: 100, enabled: true, }), }), }) p api_instance.update_retention_filter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "4b95d361-f65d-4515-9824-c9aaeba5ac2a", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a RUM retention filter returns "Updated" response ``` // Update a RUM retention filter returns "Updated" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; use datadog_api_client::datadogV2::model::RumRetentionFilterEventType; use datadog_api_client::datadogV2::model::RumRetentionFilterType; use datadog_api_client::datadogV2::model::RumRetentionFilterUpdateAttributes; use datadog_api_client::datadogV2::model::RumRetentionFilterUpdateData; use datadog_api_client::datadogV2::model::RumRetentionFilterUpdateRequest; #[tokio::main] async fn main() { let body = RumRetentionFilterUpdateRequest::new(RumRetentionFilterUpdateData::new( RumRetentionFilterUpdateAttributes::new() .enabled(true) .event_type(RumRetentionFilterEventType::VIEW) .name("Test updating retention filter".to_string()) .query("view_query".to_string()) .sample_rate(100), "4b95d361-f65d-4515-9824-c9aaeba5ac2a".to_string(), RumRetentionFilterType::RETENTION_FILTERS, )); let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .update_retention_filter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690".to_string(), "4b95d361-f65d-4515-9824-c9aaeba5ac2a".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a RUM retention filter returns "Updated" response ``` /** * Update a RUM retention filter returns "Updated" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance 
= new v2.RumRetentionFiltersApi(configuration); const params: v2.RumRetentionFiltersApiUpdateRetentionFilterRequest = { body: { data: { id: "4b95d361-f65d-4515-9824-c9aaeba5ac2a", type: "retention_filters", attributes: { name: "Test updating retention filter", eventType: "view", query: "view_query", sampleRate: 100, enabled: true, }, }, }, appId: "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rfId: "4b95d361-f65d-4515-9824-c9aaeba5ac2a", }; apiInstance .updateRetentionFilter(params) .then((data: v2.RumRetentionFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a RUM retention filter](https://docs.datadoghq.com/api/latest/rum-retention-filters/#delete-a-rum-retention-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#delete-a-rum-retention-filter-v2) DELETE https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.eu/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.ddog-gov.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id}https://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/retention_filters/{rf_id} ### Overview Delete a RUM retention filter for a RUM application. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. rf_id [_required_] string Retention filter ID. ### Response * [204](https://docs.datadoghq.com/api/latest/rum-retention-filters/#DeleteRetentionFilter-204-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#DeleteRetentionFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/rum-retention-filters/#DeleteRetentionFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#DeleteRetentionFilter-429-v2) No Content Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Delete a RUM retention filter Copy ``` # Path parameters export app_id="CHANGE_ME" export rf_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/retention_filters/${rf_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a RUM retention filter ``` """ Delete a RUM retention filter returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) api_instance.delete_retention_filter( app_id="a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rf_id="fe34ee09-14cf-4976-9362-08044c0dea80", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a RUM retention filter ``` # Delete a RUM retention filter returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new api_instance.delete_retention_filter("a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "fe34ee09-14cf-4976-9362-08044c0dea80") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a RUM retention filter ``` // Delete a RUM retention filter returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) r, err := api.DeleteRetentionFilter(ctx, "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "fe34ee09-14cf-4976-9362-08044c0dea80") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.DeleteRetentionFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a RUM retention filter ``` // Delete a RUM retention filter returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); try { apiInstance.deleteRetentionFilter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", "fe34ee09-14cf-4976-9362-08044c0dea80"); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#deleteRetentionFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a RUM retention filter ``` // Delete a RUM retention filter returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .delete_retention_filter( "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690".to_string(), "fe34ee09-14cf-4976-9362-08044c0dea80".to_string(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a RUM retention filter ``` /** * Delete a RUM retention filter returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumRetentionFiltersApi(configuration); const params: v2.RumRetentionFiltersApiDeleteRetentionFilterRequest = { appId: "a33671aa-24fd-4dcd-ba4b-5bbdbafe7690", rfId: "fe34ee09-14cf-4976-9362-08044c0dea80", }; apiInstance .deleteRetentionFilter(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Order RUM retention filters](https://docs.datadoghq.com/api/latest/rum-retention-filters/#order-rum-retention-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum-retention-filters/#order-rum-retention-filters-v2) PATCH https://api.ap1.datadoghq.com/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.ap2.datadoghq.com/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.datadoghq.eu/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.ddog-gov.com/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.datadoghq.com/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.us3.datadoghq.com/api/v2/rum/applications/{app_id}/relationships/retention_filtershttps://api.us5.datadoghq.com/api/v2/rum/applications/{app_id}/relationships/retention_filters ### Overview Order RUM retention filters for a RUM application. Returns RUM retention filter objects without attributes from the request body when the request is successful. ### Arguments #### Path Parameters Name Type Description app_id [_required_] string RUM application ID. ### Request #### Body Data (required) New definition of the RUM retention filter. * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) Field Type Description data [object] A list of RUM retention filter IDs along with type. id [_required_] string ID of retention filter in UUID. type [_required_] enum The type of the resource. The value should always be retention_filters. Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": [ { "type": "retention_filters", "id": "325631eb-94c9-49c0-93f9-ab7e4fd24529" }, { "type": "retention_filters", "id": "42d89430-5b80-426e-a44b-ba3b417ece25" }, { "type": "retention_filters", "id": "bff0bc34-99e9-4c16-adce-f47e71948c23" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum-retention-filters/#OrderRetentionFilters-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum-retention-filters/#OrderRetentionFilters-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum-retention-filters/#OrderRetentionFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum-retention-filters/#OrderRetentionFilters-429-v2) Ordered * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) The list of RUM retention filter IDs along with type. Field Type Description data [object] A list of RUM retention filter IDs along with type. id [_required_] string ID of retention filter in UUID. type [_required_] enum The type of the resource. The value should always be retention_filters. 
Allowed enum values: `retention_filters` default: `retention_filters` ``` { "data": [ { "id": "051601eb-54a0-abc0-03f9-cc02efa18892", "type": "retention_filters" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum-retention-filters/) * [Example](https://docs.datadoghq.com/api/latest/rum-retention-filters/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum-retention-filters/?code-lang=typescript) ##### Order RUM retention filters returns "Ordered" response Copy ``` # Path parameters export app_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${app_id}/relationships/retention_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "type": "retention_filters", "id": "325631eb-94c9-49c0-93f9-ab7e4fd24529" }, { "type": "retention_filters", "id": "42d89430-5b80-426e-a44b-ba3b417ece25" }, { "type": "retention_filters", "id": "bff0bc34-99e9-4c16-adce-f47e71948c23" } ] } EOF ``` ##### Order RUM retention filters returns "Ordered" response ``` // Order RUM retention filters returns "Ordered" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RumRetentionFiltersOrderRequest{ Data: []datadogV2.RumRetentionFiltersOrderData{ { Type: datadogV2.RUMRETENTIONFILTERTYPE_RETENTION_FILTERS, Id: "325631eb-94c9-49c0-93f9-ab7e4fd24529", }, { Type: datadogV2.RUMRETENTIONFILTERTYPE_RETENTION_FILTERS, Id: "42d89430-5b80-426e-a44b-ba3b417ece25", }, { Type: datadogV2.RUMRETENTIONFILTERTYPE_RETENTION_FILTERS, Id: "bff0bc34-99e9-4c16-adce-f47e71948c23", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRumRetentionFiltersApi(apiClient) resp, r, err := api.OrderRetentionFilters(ctx, 
"1d4b9c34-7ac4-423a-91cf-9902d926e9b3", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RumRetentionFiltersApi.OrderRetentionFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RumRetentionFiltersApi.OrderRetentionFilters`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Order RUM retention filters returns "Ordered" response ``` // Order RUM retention filters returns "Ordered" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumRetentionFiltersApi; import com.datadog.api.client.v2.model.RumRetentionFilterType; import com.datadog.api.client.v2.model.RumRetentionFiltersOrderData; import com.datadog.api.client.v2.model.RumRetentionFiltersOrderRequest; import com.datadog.api.client.v2.model.RumRetentionFiltersOrderResponse; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumRetentionFiltersApi apiInstance = new RumRetentionFiltersApi(defaultClient); RumRetentionFiltersOrderRequest body = new RumRetentionFiltersOrderRequest() .data( Arrays.asList( new RumRetentionFiltersOrderData() .type(RumRetentionFilterType.RETENTION_FILTERS) .id("325631eb-94c9-49c0-93f9-ab7e4fd24529"), new RumRetentionFiltersOrderData() .type(RumRetentionFilterType.RETENTION_FILTERS) .id("42d89430-5b80-426e-a44b-ba3b417ece25"), new RumRetentionFiltersOrderData() .type(RumRetentionFilterType.RETENTION_FILTERS) .id("bff0bc34-99e9-4c16-adce-f47e71948c23"))); try { RumRetentionFiltersOrderResponse result = apiInstance.orderRetentionFilters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumRetentionFiltersApi#orderRetentionFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Order RUM retention filters returns "Ordered" response ``` """ Order RUM retention filters returns "Ordered" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_retention_filters_api import RumRetentionFiltersApi from datadog_api_client.v2.model.rum_retention_filter_type import RumRetentionFilterType from datadog_api_client.v2.model.rum_retention_filters_order_data import RumRetentionFiltersOrderData from datadog_api_client.v2.model.rum_retention_filters_order_request import RumRetentionFiltersOrderRequest body = RumRetentionFiltersOrderRequest( data=[ RumRetentionFiltersOrderData( 
type=RumRetentionFilterType.RETENTION_FILTERS, id="325631eb-94c9-49c0-93f9-ab7e4fd24529", ), RumRetentionFiltersOrderData( type=RumRetentionFilterType.RETENTION_FILTERS, id="42d89430-5b80-426e-a44b-ba3b417ece25", ), RumRetentionFiltersOrderData( type=RumRetentionFilterType.RETENTION_FILTERS, id="bff0bc34-99e9-4c16-adce-f47e71948c23", ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RumRetentionFiltersApi(api_client) response = api_instance.order_retention_filters(app_id="1d4b9c34-7ac4-423a-91cf-9902d926e9b3", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Order RUM retention filters returns "Ordered" response ``` # Order RUM retention filters returns "Ordered" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RumRetentionFiltersAPI.new body = DatadogAPIClient::V2::RumRetentionFiltersOrderRequest.new({ data: [ DatadogAPIClient::V2::RumRetentionFiltersOrderData.new({ type: DatadogAPIClient::V2::RumRetentionFilterType::RETENTION_FILTERS, id: "325631eb-94c9-49c0-93f9-ab7e4fd24529", }), DatadogAPIClient::V2::RumRetentionFiltersOrderData.new({ type: DatadogAPIClient::V2::RumRetentionFilterType::RETENTION_FILTERS, id: "42d89430-5b80-426e-a44b-ba3b417ece25", }), DatadogAPIClient::V2::RumRetentionFiltersOrderData.new({ type: DatadogAPIClient::V2::RumRetentionFilterType::RETENTION_FILTERS, id: "bff0bc34-99e9-4c16-adce-f47e71948c23", }), ], }) p api_instance.order_retention_filters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Order RUM retention filters returns "Ordered" response ``` // Order RUM retention filters returns "Ordered" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum_retention_filters::RumRetentionFiltersAPI; use datadog_api_client::datadogV2::model::RumRetentionFilterType; use datadog_api_client::datadogV2::model::RumRetentionFiltersOrderData; use datadog_api_client::datadogV2::model::RumRetentionFiltersOrderRequest; #[tokio::main] async fn main() { let body = RumRetentionFiltersOrderRequest::new().data(vec![ RumRetentionFiltersOrderData::new( "325631eb-94c9-49c0-93f9-ab7e4fd24529".to_string(), RumRetentionFilterType::RETENTION_FILTERS, ), RumRetentionFiltersOrderData::new( "42d89430-5b80-426e-a44b-ba3b417ece25".to_string(), RumRetentionFilterType::RETENTION_FILTERS, ), RumRetentionFiltersOrderData::new( "bff0bc34-99e9-4c16-adce-f47e71948c23".to_string(), RumRetentionFilterType::RETENTION_FILTERS, ), ]); let configuration = datadog::Configuration::new(); let api = RumRetentionFiltersAPI::with_config(configuration); let resp = api .order_retention_filters("1d4b9c34-7ac4-423a-91cf-9902d926e9b3".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Order RUM retention filters returns "Ordered" response ``` /** * Order RUM retention filters returns "Ordered" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RumRetentionFiltersApi(configuration); const params: v2.RumRetentionFiltersApiOrderRetentionFiltersRequest = { body: { data: [ { type: "retention_filters", id: "325631eb-94c9-49c0-93f9-ab7e4fd24529", }, { type: "retention_filters", id: "42d89430-5b80-426e-a44b-ba3b417ece25", }, { type: "retention_filters", id: "bff0bc34-99e9-4c16-adce-f47e71948c23", }, ], }, appId: "1d4b9c34-7ac4-423a-91cf-9902d926e9b3", }; apiInstance .orderRetentionFilters(params) .then((data: v2.RumRetentionFiltersOrderResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/rum/ # RUM Manage your Real User Monitoring (RUM) applications, and search or aggregate your RUM events over HTTP.
See the [RUM & Session Replay page](https://docs.datadoghq.com/real_user_monitoring/) for more information ## [Search RUM events](https://docs.datadoghq.com/api/latest/rum/#search-rum-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#search-rum-events-v2) POST https://api.ap1.datadoghq.com/api/v2/rum/events/searchhttps://api.ap2.datadoghq.com/api/v2/rum/events/searchhttps://api.datadoghq.eu/api/v2/rum/events/searchhttps://api.ddog-gov.com/api/v2/rum/events/searchhttps://api.datadoghq.com/api/v2/rum/events/searchhttps://api.us3.datadoghq.com/api/v2/rum/events/searchhttps://api.us5.datadoghq.com/api/v2/rum/events/search ### Overview List endpoint returns RUM events that match a RUM search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to build complex RUM events filtering and search. This endpoint requires the `rum_apps_read` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Field Type Description filter object The search and filter query settings. from string The minimum time for the requested events; supports date (in [ISO 8601](https://www.w3.org/TR/NOTE-datetime) format with full date, hours, minutes, and the `Z` UTC indicator - seconds and fractional seconds are optional), math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the RUM search syntax. default: `*` to string The maximum time for the requested events; supports date (in [ISO 8601](https://www.w3.org/TR/NOTE-datetime) format with full date, hours, minutes, and the `Z` UTC indicator - seconds and fractional seconds are optional), math, and regular timestamps (in milliseconds). default: `now` options object Global query options that are used during the query. Note: Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing events. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of events in the response. default: `10` sort enum Sort parameters when querying events. Allowed enum values: `timestamp,-timestamp` ##### Search RUM events returns "OK" response ``` { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } ``` Copy ##### Search RUM events returns "OK" response with pagination ``` { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#SearchRUMEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum/#SearchRUMEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum/#SearchRUMEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#SearchRUMEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Response object with all events matching the request and pagination information. 
Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from RUM events. service string The name of the application or service generating RUM events. It is used to switch from RUM to APM, so make sure you define the same value when you use both products. tags [string] Array of tags associated with your event. timestamp date-time Timestamp of your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `rum` default: `rum` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "service": "web-app", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "rum" } ], "links": { "next": "https://app.datadoghq.com/api/v2/rum/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Search RUM events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" } EOF ``` ##### Search RUM events returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "filter": { "from": "now-15m", "query": "@type:session AND @session.type:user", "to": "now" }, "options": { "time_offset": 0, "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" } EOF ``` ##### Search RUM events returns "OK" response ``` // Search RUM events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RUMSearchEventsRequest{ Filter: &datadogV2.RUMQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@type:session AND @session.type:user"), To: datadog.PtrString("now"), }, Options: &datadogV2.RUMQueryOptions{ TimeOffset: datadog.PtrInt64(0), Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.RUMQueryPageOptions{ Limit: datadog.PtrInt32(25), }, Sort: datadogV2.RUMSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.SearchRUMEvents(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.SearchRUMEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.SearchRUMEvents`:\n%s\n", responseContent) } ``` Copy ##### Search RUM events returns "OK" response with pagination ``` // Search RUM events returns "OK" response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RUMSearchEventsRequest{ Filter: 
&datadogV2.RUMQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("@type:session AND @session.type:user"), To: datadog.PtrString("now"), }, Options: &datadogV2.RUMQueryOptions{ TimeOffset: datadog.PtrInt64(0), Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.RUMQueryPageOptions{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.RUMSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, _ := api.SearchRUMEventsWithPagination(ctx, body) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.SearchRUMEvents`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search RUM events returns "OK" response ``` // Search RUM events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMEventsResponse; import com.datadog.api.client.v2.model.RUMQueryFilter; import com.datadog.api.client.v2.model.RUMQueryOptions; import com.datadog.api.client.v2.model.RUMQueryPageOptions; import com.datadog.api.client.v2.model.RUMSearchEventsRequest; import com.datadog.api.client.v2.model.RUMSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); RUMSearchEventsRequest body = new RUMSearchEventsRequest() .filter( new RUMQueryFilter() .from("now-15m") .query("@type:session AND @session.type:user") .to("now")) .options(new RUMQueryOptions().timeOffset(0L).timezone("GMT")) .page(new RUMQueryPageOptions().limit(25)) .sort(RUMSort.TIMESTAMP_ASCENDING); try { RUMEventsResponse result = apiInstance.searchRUMEvents(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#searchRUMEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search RUM events returns "OK" response with pagination ``` // Search RUM events returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMEvent; import com.datadog.api.client.v2.model.RUMQueryFilter; import com.datadog.api.client.v2.model.RUMQueryOptions; import com.datadog.api.client.v2.model.RUMQueryPageOptions; import com.datadog.api.client.v2.model.RUMSearchEventsRequest; import com.datadog.api.client.v2.model.RUMSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); RUMSearchEventsRequest body = new 
RUMSearchEventsRequest() .filter( new RUMQueryFilter() .from("now-15m") .query("@type:session AND @session.type:user") .to("now")) .options(new RUMQueryOptions().timeOffset(0L).timezone("GMT")) .page(new RUMQueryPageOptions().limit(2)) .sort(RUMSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchRUMEventsWithPagination(body); for (RUMEvent item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println("Exception when calling RumApi#searchRUMEventsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search RUM events returns "OK" response ``` """ Search RUM events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_query_filter import RUMQueryFilter from datadog_api_client.v2.model.rum_query_options import RUMQueryOptions from datadog_api_client.v2.model.rum_query_page_options import RUMQueryPageOptions from datadog_api_client.v2.model.rum_search_events_request import RUMSearchEventsRequest from datadog_api_client.v2.model.rum_sort import RUMSort body = RUMSearchEventsRequest( filter=RUMQueryFilter( _from="now-15m", query="@type:session AND @session.type:user", to="now", ), options=RUMQueryOptions( time_offset=0, timezone="GMT", ), page=RUMQueryPageOptions( limit=25, ), sort=RUMSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.search_rum_events(body=body) print(response) ``` Copy ##### Search RUM events returns "OK" response with pagination ``` """ Search RUM events returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_query_filter import RUMQueryFilter from datadog_api_client.v2.model.rum_query_options import RUMQueryOptions from datadog_api_client.v2.model.rum_query_page_options import RUMQueryPageOptions from datadog_api_client.v2.model.rum_search_events_request import RUMSearchEventsRequest from datadog_api_client.v2.model.rum_sort import RUMSort body = RUMSearchEventsRequest( filter=RUMQueryFilter( _from="now-15m", query="@type:session AND @session.type:user", to="now", ), options=RUMQueryOptions( time_offset=0, timezone="GMT", ), page=RUMQueryPageOptions( limit=2, ), sort=RUMSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) items = api_instance.search_rum_events_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search RUM events returns "OK" response ``` # Search RUM events 
returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new body = DatadogAPIClient::V2::RUMSearchEventsRequest.new({ filter: DatadogAPIClient::V2::RUMQueryFilter.new({ from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }), options: DatadogAPIClient::V2::RUMQueryOptions.new({ time_offset: 0, timezone: "GMT", }), page: DatadogAPIClient::V2::RUMQueryPageOptions.new({ limit: 25, }), sort: DatadogAPIClient::V2::RUMSort::TIMESTAMP_ASCENDING, }) p api_instance.search_rum_events(body) ``` Copy ##### Search RUM events returns "OK" response with pagination ``` # Search RUM events returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new body = DatadogAPIClient::V2::RUMSearchEventsRequest.new({ filter: DatadogAPIClient::V2::RUMQueryFilter.new({ from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }), options: DatadogAPIClient::V2::RUMQueryOptions.new({ time_offset: 0, timezone: "GMT", }), page: DatadogAPIClient::V2::RUMQueryPageOptions.new({ limit: 2, }), sort: DatadogAPIClient::V2::RUMSort::TIMESTAMP_ASCENDING, }) api_instance.search_rum_events_with_pagination(body) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search RUM events returns "OK" response ``` // Search RUM events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMQueryFilter; use datadog_api_client::datadogV2::model::RUMQueryOptions; use datadog_api_client::datadogV2::model::RUMQueryPageOptions; use datadog_api_client::datadogV2::model::RUMSearchEventsRequest; use datadog_api_client::datadogV2::model::RUMSort; #[tokio::main] async fn main() { let body = RUMSearchEventsRequest::new() .filter( RUMQueryFilter::new() .from("now-15m".to_string()) .query("@type:session AND @session.type:user".to_string()) .to("now".to_string()), ) .options( RUMQueryOptions::new() .time_offset(0) .timezone("GMT".to_string()), ) .page(RUMQueryPageOptions::new().limit(25)) .sort(RUMSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api.search_rum_events(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search RUM events returns "OK" response with pagination ``` // Search RUM events returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMQueryFilter; use datadog_api_client::datadogV2::model::RUMQueryOptions; use datadog_api_client::datadogV2::model::RUMQueryPageOptions; use datadog_api_client::datadogV2::model::RUMSearchEventsRequest; use datadog_api_client::datadogV2::model::RUMSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = RUMSearchEventsRequest::new() .filter( RUMQueryFilter::new() .from("now-15m".to_string()) .query("@type:session AND @session.type:user".to_string()) .to("now".to_string()), ) .options( RUMQueryOptions::new() 
.time_offset(0) .timezone("GMT".to_string()), ) .page(RUMQueryPageOptions::new().limit(2)) .sort(RUMSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let response = api.search_rum_events_with_pagination(body); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search RUM events returns "OK" response ``` /** * Search RUM events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); const params: v2.RUMApiSearchRUMEventsRequest = { body: { filter: { from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }, options: { timeOffset: 0, timezone: "GMT", }, page: { limit: 25, }, sort: "timestamp", }, }; apiInstance .searchRUMEvents(params) .then((data: v2.RUMEventsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search RUM events returns "OK" response with pagination ``` /** * Search RUM events returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); const params: v2.RUMApiSearchRUMEventsRequest = { body: { filter: { from: "now-15m", query: "@type:session AND @session.type:user", to: "now", }, options: { timeOffset: 0, timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchRUMEventsWithPagination( params )) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of RUM events](https://docs.datadoghq.com/api/latest/rum/#get-a-list-of-rum-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#get-a-list-of-rum-events-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/eventshttps://api.ap2.datadoghq.com/api/v2/rum/eventshttps://api.datadoghq.eu/api/v2/rum/eventshttps://api.ddog-gov.com/api/v2/rum/eventshttps://api.datadoghq.com/api/v2/rum/eventshttps://api.us3.datadoghq.com/api/v2/rum/eventshttps://api.us5.datadoghq.com/api/v2/rum/events ### Overview List endpoint returns events that match a RUM search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination). Use this endpoint to see your latest RUM events. This endpoint requires the `rum_apps_read` permission. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following RUM syntax. 
filter[from] string Minimum timestamp for requested events. filter[to] string Maximum timestamp for requested events. sort enum Order of events in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of events in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#ListRUMEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum/#ListRUMEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum/#ListRUMEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#ListRUMEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Response object with all events matching the request and pagination information. Field Type Description data [object] Array of events matching the request. attributes object JSON object containing all event attributes and their associated values. attributes object JSON object of attributes from RUM events. service string The name of the application or service generating RUM events. It is used to switch from RUM to APM, so make sure you define the same value when you use both products. tags [string] Array of tags associated with your event. timestamp date-time Timestamp of your event. id string Unique ID of the event. type enum Type of the event. Allowed enum values: `rum` default: `rum` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "service": "web-app", "tags": [ "team:A" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "rum" } ], "links": { "next": "https://app.datadoghq.com/api/v2/rum/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Get a list of RUM events Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of RUM events ``` """ Get a list of RUM events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.list_rum_events() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of RUM events ``` # Get a list of RUM events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new p api_instance.list_rum_events() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of RUM events ``` // Get a list of RUM events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.ListRUMEvents(ctx, *datadogV2.NewListRUMEventsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.ListRUMEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
fmt.Fprintf(os.Stdout, "Response from `RUMApi.ListRUMEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of RUM events ``` // Get a list of RUM events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMEventsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); try { RUMEventsResponse result = apiInstance.listRUMEvents(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#listRUMEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of RUM events ``` // Get a list of RUM events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::ListRUMEventsOptionalParams; use datadog_api_client::datadogV2::api_rum::RUMAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api .list_rum_events(ListRUMEventsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of RUM events ``` /** * Get a list of RUM events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); apiInstance .listRUMEvents() .then((data: v2.RUMEventsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Aggregate RUM events](https://docs.datadoghq.com/api/latest/rum/#aggregate-rum-events) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#aggregate-rum-events-v2) POST https://api.ap1.datadoghq.com/api/v2/rum/analytics/aggregatehttps://api.ap2.datadoghq.com/api/v2/rum/analytics/aggregatehttps://api.datadoghq.eu/api/v2/rum/analytics/aggregatehttps://api.ddog-gov.com/api/v2/rum/analytics/aggregatehttps://api.datadoghq.com/api/v2/rum/analytics/aggregatehttps://api.us3.datadoghq.com/api/v2/rum/analytics/aggregatehttps://api.us5.datadoghq.com/api/v2/rum/analytics/aggregate ### Overview The API endpoint to aggregate RUM events into buckets of computed metrics and timeseries. This endpoint requires the `rum_apps_read` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Field Type Description compute [object] The list of metrics or timeseries to compute for the retrieved buckets. aggregation [_required_] enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` interval string The time buckets' size (only used for type=timeseries) Defaults to a resolution of 150 points. metric string The metric to use. type enum The type of compute. Allowed enum values: `timeseries,total` default: `total` filter object The search and filter query settings. from string The minimum time for the requested events; supports date (in [ISO 8601](https://www.w3.org/TR/NOTE-datetime) format with full date, hours, minutes, and the `Z` UTC indicator - seconds and fractional seconds are optional), math, and regular timestamps (in milliseconds). default: `now-15m` query string The search query following the RUM search syntax. default: `*` to string The maximum time for the requested events; supports date (in [ISO 8601](https://www.w3.org/TR/NOTE-datetime) format with full date, hours, minutes, and the `Z` UTC indicator - seconds and fractional seconds are optional), math, and regular timestamps (in milliseconds). default: `now` group_by [object] The rules for the group by. facet [_required_] string The name of the facet to use (required). histogram object Used to perform a histogram computation (only for measure facets). Note: At most 100 buckets are allowed, the number of buckets is (max - min)/interval. interval [_required_] double The bin size of the histogram buckets. max [_required_] double The maximum value for the measure used in the histogram (values greater than this one are filtered out). min [_required_] double The minimum value for the measure used in the histogram (values smaller than this one are filtered out). limit int64 The maximum buckets to return for this group-by. default: `10` missing The value to use for logs that don't have the facet used to group by. Option 1 string The missing value to use if there is string valued facet. Option 2 double The missing value to use if there is a number valued facet. sort object A sort rule. 
aggregation enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` metric string The metric to sort by (only used for `type=measure`). order enum The order to use, ascending or descending. Allowed enum values: `asc,desc` type enum The type of sorting algorithm. Allowed enum values: `alphabetical,measure` default: `alphabetical` total A resulting object to put the given computes in over all the matching records. Option 1 boolean If set to true, creates an additional bucket labeled "$facet_total". Option 2 string A string to use as the key value for the total bucket. Option 3 double A number to use as the key value for the total bucket. options object Global query options that are used during the query. Note: Only supply timezone or time offset, not both. Otherwise, the query fails. time_offset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing events. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of events in the response. default: `10` ``` { "compute": [ { "aggregation": "pc90", "metric": "@view.time_spent", "type": "total" } ], "filter": { "from": "now-15m", "query": "@type:view AND @session.type:user", "to": "now" }, "group_by": [ { "facet": "@view.time_spent", "limit": 10, "total": false } ], "options": { "timezone": "GMT" }, "page": { "limit": 25 } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#AggregateRUMEvents-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum/#AggregateRUMEvents-400-v2) * [403](https://docs.datadoghq.com/api/latest/rum/#AggregateRUMEvents-403-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#AggregateRUMEvents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) The response object for the RUM events aggregate API endpoint. Field Type Description data object The query results. buckets [object] The list of matching buckets, one item per bucket. by object The key-value pairs for each group-by. string The values for each group-by. computes object A map of the metric name to value for regular compute, or a list of values for a timeseries. A bucket value, can be either a timeseries or a single value. Option 1 string A single string value. Option 2 double A single number value. Option 3 [object] A timeseries array. time date-time The time value for this point. value double The value for this point. links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non-fatal errors) encountered. Partial results may return if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. 
title string A short human-readable summary of the warning. ``` { "data": { "buckets": [ { "by": { "": "string" }, "computes": { "": { "description": "undefined", "type": "undefined" } } } ] }, "links": { "next": "https://app.datadoghq.com/api/v2/rum/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Aggregate RUM events returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "compute": [ { "aggregation": "pc90", "metric": "@view.time_spent", "type": "total" } ], "filter": { "from": "now-15m", "query": "@type:view AND @session.type:user", "to": "now" }, "group_by": [ { "facet": "@view.time_spent", "limit": 10, "total": false } ], "options": { "timezone": "GMT" }, "page": { "limit": 25 } } EOF ``` ##### Aggregate RUM events returns "OK" response ``` // Aggregate RUM events returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RUMAggregateRequest{ Compute: []datadogV2.RUMCompute{ { Aggregation: datadogV2.RUMAGGREGATIONFUNCTION_PERCENTILE_90, Metric: datadog.PtrString("@view.time_spent"), Type: datadogV2.RUMCOMPUTETYPE_TOTAL.Ptr(), }, }, Filter: &datadogV2.RUMQueryFilter{ From: datadog.PtrString("now-15m"), Query: 
datadog.PtrString("@type:view AND @session.type:user"), To: datadog.PtrString("now"), }, GroupBy: []datadogV2.RUMGroupBy{ { Facet: "@view.time_spent", Limit: datadog.PtrInt64(10), Total: &datadogV2.RUMGroupByTotal{ RUMGroupByTotalBoolean: datadog.PtrBool(false)}, }, }, Options: &datadogV2.RUMQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.RUMQueryPageOptions{ Limit: datadog.PtrInt32(25), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.AggregateRUMEvents(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.AggregateRUMEvents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.AggregateRUMEvents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Aggregate RUM events returns "OK" response ``` // Aggregate RUM events returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMAggregateRequest; import com.datadog.api.client.v2.model.RUMAggregationFunction; import com.datadog.api.client.v2.model.RUMAnalyticsAggregateResponse; import com.datadog.api.client.v2.model.RUMCompute; import com.datadog.api.client.v2.model.RUMComputeType; import com.datadog.api.client.v2.model.RUMGroupBy; import com.datadog.api.client.v2.model.RUMGroupByTotal; import com.datadog.api.client.v2.model.RUMQueryFilter; import com.datadog.api.client.v2.model.RUMQueryOptions; import com.datadog.api.client.v2.model.RUMQueryPageOptions; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); RUMAggregateRequest body = new RUMAggregateRequest() .compute( Collections.singletonList( new RUMCompute() .aggregation(RUMAggregationFunction.PERCENTILE_90) .metric("@view.time_spent") .type(RUMComputeType.TOTAL))) .filter( new RUMQueryFilter() .from("now-15m") .query("@type:view AND @session.type:user") .to("now")) .groupBy( Collections.singletonList( new RUMGroupBy() .facet("@view.time_spent") .limit(10L) .total(new RUMGroupByTotal(false)))) .options(new RUMQueryOptions().timezone("GMT")) .page(new RUMQueryPageOptions().limit(25)); try { RUMAnalyticsAggregateResponse result = apiInstance.aggregateRUMEvents(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#aggregateRUMEvents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Aggregate RUM events returns "OK" response ``` """ Aggregate RUM events returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_aggregate_request import RUMAggregateRequest from datadog_api_client.v2.model.rum_aggregation_function import RUMAggregationFunction from datadog_api_client.v2.model.rum_compute import RUMCompute from datadog_api_client.v2.model.rum_compute_type import RUMComputeType from datadog_api_client.v2.model.rum_group_by import RUMGroupBy from datadog_api_client.v2.model.rum_query_filter import RUMQueryFilter from datadog_api_client.v2.model.rum_query_options import RUMQueryOptions from datadog_api_client.v2.model.rum_query_page_options import RUMQueryPageOptions body = RUMAggregateRequest( compute=[ RUMCompute( aggregation=RUMAggregationFunction.PERCENTILE_90, metric="@view.time_spent", type=RUMComputeType.TOTAL, ), ], filter=RUMQueryFilter( _from="now-15m", query="@type:view AND @session.type:user", to="now", ), group_by=[ RUMGroupBy( facet="@view.time_spent", limit=10, total=False, ), ], options=RUMQueryOptions( timezone="GMT", ), page=RUMQueryPageOptions( limit=25, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.aggregate_rum_events(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Aggregate RUM events returns "OK" response ``` # Aggregate RUM events returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new body = DatadogAPIClient::V2::RUMAggregateRequest.new({ compute: [ DatadogAPIClient::V2::RUMCompute.new({ aggregation: DatadogAPIClient::V2::RUMAggregationFunction::PERCENTILE_90, metric: "@view.time_spent", type: DatadogAPIClient::V2::RUMComputeType::TOTAL, }), ], filter: DatadogAPIClient::V2::RUMQueryFilter.new({ from: "now-15m", query: "@type:view AND @session.type:user", to: "now", }), group_by: [ DatadogAPIClient::V2::RUMGroupBy.new({ facet: "@view.time_spent", limit: 10, total: false, }), ], options: DatadogAPIClient::V2::RUMQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::RUMQueryPageOptions.new({ limit: 25, }), }) p api_instance.aggregate_rum_events(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Aggregate RUM events returns "OK" response ``` // Aggregate RUM events returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMAggregateRequest; use datadog_api_client::datadogV2::model::RUMAggregationFunction; use 
datadog_api_client::datadogV2::model::RUMCompute; use datadog_api_client::datadogV2::model::RUMComputeType; use datadog_api_client::datadogV2::model::RUMGroupBy; use datadog_api_client::datadogV2::model::RUMGroupByTotal; use datadog_api_client::datadogV2::model::RUMQueryFilter; use datadog_api_client::datadogV2::model::RUMQueryOptions; use datadog_api_client::datadogV2::model::RUMQueryPageOptions; #[tokio::main] async fn main() { let body = RUMAggregateRequest::new() .compute(vec![RUMCompute::new(RUMAggregationFunction::PERCENTILE_90) .metric("@view.time_spent".to_string()) .type_(RUMComputeType::TOTAL)]) .filter( RUMQueryFilter::new() .from("now-15m".to_string()) .query("@type:view AND @session.type:user".to_string()) .to("now".to_string()), ) .group_by(vec![RUMGroupBy::new("@view.time_spent".to_string()) .limit(10) .total(RUMGroupByTotal::RUMGroupByTotalBoolean(false))]) .options(RUMQueryOptions::new().timezone("GMT".to_string())) .page(RUMQueryPageOptions::new().limit(25)); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api.aggregate_rum_events(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Aggregate RUM events returns "OK" response ``` /** * Aggregate RUM events returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); const params: v2.RUMApiAggregateRUMEventsRequest = { body: { compute: [ { aggregation: "pc90", metric: "@view.time_spent", type: "total", }, ], filter: { from: "now-15m", query: "@type:view AND @session.type:user", to: "now", }, groupBy: [ { facet: "@view.time_spent", limit: 10, total: false, }, ], options: { timezone: "GMT", }, page: { limit: 25, }, }, }; apiInstance .aggregateRUMEvents(params) .then((data: v2.RUMAnalyticsAggregateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a RUM application](https://docs.datadoghq.com/api/latest/rum/#update-a-rum-application) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#update-a-rum-application-v2) PATCH https://api.ap1.datadoghq.com/api/v2/rum/applications/{id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{id}https://api.datadoghq.eu/api/v2/rum/applications/{id}https://api.ddog-gov.com/api/v2/rum/applications/{id}https://api.datadoghq.com/api/v2/rum/applications/{id}https://api.us3.datadoghq.com/api/v2/rum/applications/{id}https://api.us5.datadoghq.com/api/v2/rum/applications/{id} ### Overview Update the RUM application with given ID in your organization. This endpoint requires the `rum_apps_write` permission. 
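Before the full argument and schema reference below, here is a minimal sketch of this PATCH call made directly over HTTP with Python's `requests` package. It is not the official client example: the application ID and attribute values are the placeholders used elsewhere on this page, the `api.datadoghq.com` site is an assumption, and the request body mirrors the `rum_application_update` payload documented under Request below.

```python
# Minimal sketch (not the official client example) of the
# "Update a RUM application" PATCH request using the requests package.
# The application ID, attribute values, and the api.datadoghq.com site are
# placeholders/assumptions; substitute your own application and Datadog site.
import os

import requests

app_id = "abcd1234-0000-0000-abcd-1234abcd5678"  # placeholder RUM application ID

url = f"https://api.datadoghq.com/api/v2/rum/applications/{app_id}"
headers = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
payload = {
    "data": {
        "attributes": {
            "name": "updated_name_for_my_existing_rum_application",
            "type": "browser",
        },
        "id": app_id,
        "type": "rum_application_update",
    }
}

response = requests.patch(url, headers=headers, json=payload)
response.raise_for_status()
print(response.json())
```

The same body also accepts the `product_analytics_retention_state` and `rum_event_processing_state` attributes shown in the second request example below.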
### Arguments #### Path Parameters Name Type Description id [_required_] string RUM application ID. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Field Type Description data [_required_] object RUM application update. attributes object RUM application update attributes. name string Name of the RUM application. product_analytics_retention_state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_state enum Configures which RUM events are processed and stored for the application. Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. id [_required_] string RUM application ID. type [_required_] enum RUM application update type. Allowed enum values: `rum_application_update` default: `rum_application_update` ##### Update a RUM application returns "OK" response ``` { "data": { "attributes": { "name": "updated_name_for_my_existing_rum_application", "type": "browser" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application_update" } } ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` { "data": { "attributes": { "name": "updated_rum_with_product_scales", "rum_event_processing_state": "ALL", "product_analytics_retention_state": "MAX" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application_update" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#UpdateRUMApplication-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum/#UpdateRUMApplication-400-v2) * [404](https://docs.datadoghq.com/api/latest/rum/#UpdateRUMApplication-404-v2) * [422](https://docs.datadoghq.com/api/latest/rum/#UpdateRUMApplication-422-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#UpdateRUMApplication-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) RUM application response. Field Type Description data object RUM application. attributes [_required_] object RUM application attributes. api_key_id int32 ID of the API key associated with the application. application_id [_required_] string ID of the RUM application. client_token [_required_] string Client token of the RUM application. created_at [_required_] int64 Timestamp in ms of the creation date. created_by_handle [_required_] string Handle of the creator user. hash string Hash of the RUM application. Optional. is_active boolean Indicates if the RUM application is active. name [_required_] string Name of the RUM application. org_id [_required_] int32 Org ID of the RUM application. product_scales object Product Scales configuration for the RUM application. product_analytics_retention_scale object Product Analytics retention scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_scale object RUM event processing scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Configures which RUM events are processed and stored for the application. 
Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type [_required_] string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. updated_at [_required_] int64 Timestamp in ms of the last update date. updated_by_handle [_required_] string Handle of the updater user. id [_required_] string RUM application ID. type [_required_] enum RUM application response type. Allowed enum values: `rum_application` default: `rum_application` ``` { "data": { "attributes": { "api_key_id": 123456789, "application_id": "abcd1234-0000-0000-abcd-1234abcd5678", "client_token": "abcd1234efgh5678ijkl90abcd1234efgh0", "created_at": 1659479836169, "created_by_handle": "john.doe", "hash": "string", "is_active": true, "name": "my_rum_application", "org_id": 999, "product_scales": { "product_analytics_retention_scale": { "last_modified_at": 1747922145974, "state": "MAX" }, "rum_event_processing_scale": { "last_modified_at": 1721897494108, "state": "ALL" } }, "type": "browser", "updated_at": 1659479836169, "updated_by_handle": "jane.doe" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity. * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Update a RUM application returns "OK" response Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "updated_name_for_my_existing_rum_application", "type": "browser" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application_update" } } EOF ``` ##### Update a RUM application with Product Scales returns "OK" response Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "updated_rum_with_product_scales", "rum_event_processing_state": "ALL", "product_analytics_retention_state": "MAX" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application_update" } } EOF ``` ##### Update a RUM application returns "OK" response ``` // Update a RUM application returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_application" in the system RumApplicationDataID := os.Getenv("RUM_APPLICATION_DATA_ID") body := datadogV2.RUMApplicationUpdateRequest{ Data: datadogV2.RUMApplicationUpdate{ Attributes: &datadogV2.RUMApplicationUpdateAttributes{ Name: datadog.PtrString("updated_name_for_my_existing_rum_application"), Type: datadog.PtrString("browser"), }, Id: RumApplicationDataID, Type: datadogV2.RUMAPPLICATIONUPDATETYPE_RUM_APPLICATION_UPDATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.UpdateRUMApplication(ctx, RumApplicationDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.UpdateRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.UpdateRUMApplication`:\n%s\n", responseContent) } ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` // Update a RUM application with Product Scales returns "OK" response package main import 
( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_application" in the system RumApplicationDataID := os.Getenv("RUM_APPLICATION_DATA_ID") body := datadogV2.RUMApplicationUpdateRequest{ Data: datadogV2.RUMApplicationUpdate{ Attributes: &datadogV2.RUMApplicationUpdateAttributes{ Name: datadog.PtrString("updated_rum_with_product_scales"), RumEventProcessingState: datadogV2.RUMEVENTPROCESSINGSTATE_ALL.Ptr(), ProductAnalyticsRetentionState: datadogV2.RUMPRODUCTANALYTICSRETENTIONSTATE_MAX.Ptr(), }, Id: RumApplicationDataID, Type: datadogV2.RUMAPPLICATIONUPDATETYPE_RUM_APPLICATION_UPDATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.UpdateRUMApplication(ctx, RumApplicationDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.UpdateRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.UpdateRUMApplication`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a RUM application returns "OK" response ``` // Update a RUM application returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationResponse; import com.datadog.api.client.v2.model.RUMApplicationUpdate; import com.datadog.api.client.v2.model.RUMApplicationUpdateAttributes; import com.datadog.api.client.v2.model.RUMApplicationUpdateRequest; import com.datadog.api.client.v2.model.RUMApplicationUpdateType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); // there is a valid "rum_application" in the system String RUM_APPLICATION_DATA_ID = System.getenv("RUM_APPLICATION_DATA_ID"); RUMApplicationUpdateRequest body = new RUMApplicationUpdateRequest() .data( new RUMApplicationUpdate() .attributes( new RUMApplicationUpdateAttributes() .name("updated_name_for_my_existing_rum_application") .type("browser")) .id(RUM_APPLICATION_DATA_ID) .type(RUMApplicationUpdateType.RUM_APPLICATION_UPDATE)); try { RUMApplicationResponse result = apiInstance.updateRUMApplication(RUM_APPLICATION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#updateRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` // Update a RUM application with Product Scales returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationResponse; import com.datadog.api.client.v2.model.RUMApplicationUpdate; import com.datadog.api.client.v2.model.RUMApplicationUpdateAttributes; import com.datadog.api.client.v2.model.RUMApplicationUpdateRequest; import com.datadog.api.client.v2.model.RUMApplicationUpdateType; import com.datadog.api.client.v2.model.RUMEventProcessingState; import com.datadog.api.client.v2.model.RUMProductAnalyticsRetentionState; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); // there is a valid "rum_application" in the system String RUM_APPLICATION_DATA_ID = System.getenv("RUM_APPLICATION_DATA_ID"); RUMApplicationUpdateRequest body = new RUMApplicationUpdateRequest() .data( new RUMApplicationUpdate() .attributes( new RUMApplicationUpdateAttributes() .name("updated_rum_with_product_scales") .rumEventProcessingState(RUMEventProcessingState.ALL) .productAnalyticsRetentionState(RUMProductAnalyticsRetentionState.MAX)) .id(RUM_APPLICATION_DATA_ID) .type(RUMApplicationUpdateType.RUM_APPLICATION_UPDATE)); try { RUMApplicationResponse result = apiInstance.updateRUMApplication(RUM_APPLICATION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#updateRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a RUM application returns "OK" response ``` """ Update a RUM application returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_application_update import RUMApplicationUpdate from datadog_api_client.v2.model.rum_application_update_attributes import RUMApplicationUpdateAttributes from datadog_api_client.v2.model.rum_application_update_request import RUMApplicationUpdateRequest from datadog_api_client.v2.model.rum_application_update_type import RUMApplicationUpdateType # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = environ["RUM_APPLICATION_DATA_ID"] body = RUMApplicationUpdateRequest( data=RUMApplicationUpdate( attributes=RUMApplicationUpdateAttributes( name="updated_name_for_my_existing_rum_application", type="browser", ), id=RUM_APPLICATION_DATA_ID, type=RUMApplicationUpdateType.RUM_APPLICATION_UPDATE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.update_rum_application(id=RUM_APPLICATION_DATA_ID, body=body) print(response) ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` """ Update a RUM application with Product Scales returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from 
datadog_api_client.v2.model.rum_application_update import RUMApplicationUpdate from datadog_api_client.v2.model.rum_application_update_attributes import RUMApplicationUpdateAttributes from datadog_api_client.v2.model.rum_application_update_request import RUMApplicationUpdateRequest from datadog_api_client.v2.model.rum_application_update_type import RUMApplicationUpdateType from datadog_api_client.v2.model.rum_event_processing_state import RUMEventProcessingState from datadog_api_client.v2.model.rum_product_analytics_retention_state import RUMProductAnalyticsRetentionState # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = environ["RUM_APPLICATION_DATA_ID"] body = RUMApplicationUpdateRequest( data=RUMApplicationUpdate( attributes=RUMApplicationUpdateAttributes( name="updated_rum_with_product_scales", rum_event_processing_state=RUMEventProcessingState.ALL, product_analytics_retention_state=RUMProductAnalyticsRetentionState.MAX, ), id=RUM_APPLICATION_DATA_ID, type=RUMApplicationUpdateType.RUM_APPLICATION_UPDATE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.update_rum_application(id=RUM_APPLICATION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a RUM application returns "OK" response ``` # Update a RUM application returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = ENV["RUM_APPLICATION_DATA_ID"] body = DatadogAPIClient::V2::RUMApplicationUpdateRequest.new({ data: DatadogAPIClient::V2::RUMApplicationUpdate.new({ attributes: DatadogAPIClient::V2::RUMApplicationUpdateAttributes.new({ name: "updated_name_for_my_existing_rum_application", type: "browser", }), id: RUM_APPLICATION_DATA_ID, type: DatadogAPIClient::V2::RUMApplicationUpdateType::RUM_APPLICATION_UPDATE, }), }) p api_instance.update_rum_application(RUM_APPLICATION_DATA_ID, body) ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` # Update a RUM application with Product Scales returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = ENV["RUM_APPLICATION_DATA_ID"] body = DatadogAPIClient::V2::RUMApplicationUpdateRequest.new({ data: DatadogAPIClient::V2::RUMApplicationUpdate.new({ attributes: DatadogAPIClient::V2::RUMApplicationUpdateAttributes.new({ name: "updated_rum_with_product_scales", rum_event_processing_state: DatadogAPIClient::V2::RUMEventProcessingState::ALL, product_analytics_retention_state: DatadogAPIClient::V2::RUMProductAnalyticsRetentionState::MAX, }), id: RUM_APPLICATION_DATA_ID, type: DatadogAPIClient::V2::RUMApplicationUpdateType::RUM_APPLICATION_UPDATE, }), }) p api_instance.update_rum_application(RUM_APPLICATION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a RUM application returns "OK" response ``` // Update a RUM application returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMApplicationUpdate; use datadog_api_client::datadogV2::model::RUMApplicationUpdateAttributes; use datadog_api_client::datadogV2::model::RUMApplicationUpdateRequest; use datadog_api_client::datadogV2::model::RUMApplicationUpdateType; #[tokio::main] async fn main() { // there is a valid "rum_application" in the system let rum_application_data_id = std::env::var("RUM_APPLICATION_DATA_ID").unwrap(); let body = RUMApplicationUpdateRequest::new( RUMApplicationUpdate::new( rum_application_data_id.clone(), RUMApplicationUpdateType::RUM_APPLICATION_UPDATE, ) .attributes( RUMApplicationUpdateAttributes::new() .name("updated_name_for_my_existing_rum_application".to_string()) .type_("browser".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api .update_rum_application(rum_application_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` // Update a RUM application with Product Scales returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMApplicationUpdate; use datadog_api_client::datadogV2::model::RUMApplicationUpdateAttributes; use datadog_api_client::datadogV2::model::RUMApplicationUpdateRequest; use datadog_api_client::datadogV2::model::RUMApplicationUpdateType; use datadog_api_client::datadogV2::model::RUMEventProcessingState; use datadog_api_client::datadogV2::model::RUMProductAnalyticsRetentionState; #[tokio::main] async fn main() { // there is a valid "rum_application" in the system let rum_application_data_id = std::env::var("RUM_APPLICATION_DATA_ID").unwrap(); let body = RUMApplicationUpdateRequest::new( RUMApplicationUpdate::new( rum_application_data_id.clone(), RUMApplicationUpdateType::RUM_APPLICATION_UPDATE, ) .attributes( RUMApplicationUpdateAttributes::new() .name("updated_rum_with_product_scales".to_string()) .product_analytics_retention_state(RUMProductAnalyticsRetentionState::MAX) .rum_event_processing_state(RUMEventProcessingState::ALL), ), ); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api .update_rum_application(rum_application_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a RUM application returns "OK" response ``` /** * Update a RUM application returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); // 
there is a valid "rum_application" in the system const RUM_APPLICATION_DATA_ID = process.env.RUM_APPLICATION_DATA_ID as string; const params: v2.RUMApiUpdateRUMApplicationRequest = { body: { data: { attributes: { name: "updated_name_for_my_existing_rum_application", type: "browser", }, id: RUM_APPLICATION_DATA_ID, type: "rum_application_update", }, }, id: RUM_APPLICATION_DATA_ID, }; apiInstance .updateRUMApplication(params) .then((data: v2.RUMApplicationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update a RUM application with Product Scales returns "OK" response ``` /** * Update a RUM application with Product Scales returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); // there is a valid "rum_application" in the system const RUM_APPLICATION_DATA_ID = process.env.RUM_APPLICATION_DATA_ID as string; const params: v2.RUMApiUpdateRUMApplicationRequest = { body: { data: { attributes: { name: "updated_rum_with_product_scales", rumEventProcessingState: "ALL", productAnalyticsRetentionState: "MAX", }, id: RUM_APPLICATION_DATA_ID, type: "rum_application_update", }, }, id: RUM_APPLICATION_DATA_ID, }; apiInstance .updateRUMApplication(params) .then((data: v2.RUMApplicationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a RUM application](https://docs.datadoghq.com/api/latest/rum/#get-a-rum-application) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#get-a-rum-application-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/applications/{id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{id}https://api.datadoghq.eu/api/v2/rum/applications/{id}https://api.ddog-gov.com/api/v2/rum/applications/{id}https://api.datadoghq.com/api/v2/rum/applications/{id}https://api.us3.datadoghq.com/api/v2/rum/applications/{id}https://api.us5.datadoghq.com/api/v2/rum/applications/{id} ### Overview Get the RUM application with given ID in your organization. This endpoint requires the `rum_apps_read` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string RUM application ID. ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplication-200-v2) * [404](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplication-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplication-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) RUM application response. Field Type Description data object RUM application. attributes [_required_] object RUM application attributes. api_key_id int32 ID of the API key associated with the application. application_id [_required_] string ID of the RUM application. client_token [_required_] string Client token of the RUM application. created_at [_required_] int64 Timestamp in ms of the creation date. 
created_by_handle [_required_] string Handle of the creator user. hash string Hash of the RUM application. Optional. is_active boolean Indicates if the RUM application is active. name [_required_] string Name of the RUM application. org_id [_required_] int32 Org ID of the RUM application. product_scales object Product Scales configuration for the RUM application. product_analytics_retention_scale object Product Analytics retention scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_scale object RUM event processing scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Configures which RUM events are processed and stored for the application. Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type [_required_] string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. updated_at [_required_] int64 Timestamp in ms of the last update date. updated_by_handle [_required_] string Handle of the updater user. id [_required_] string RUM application ID. type [_required_] enum RUM application response type. Allowed enum values: `rum_application` default: `rum_application` ``` { "data": { "attributes": { "api_key_id": 123456789, "application_id": "abcd1234-0000-0000-abcd-1234abcd5678", "client_token": "abcd1234efgh5678ijkl90abcd1234efgh0", "created_at": 1659479836169, "created_by_handle": "john.doe", "hash": "string", "is_active": true, "name": "my_rum_application", "org_id": 999, "product_scales": { "product_analytics_retention_scale": { "last_modified_at": 1747922145974, "state": "MAX" }, "rum_event_processing_scale": { "last_modified_at": 1721897494108, "state": "ALL" } }, "type": "browser", "updated_at": 1659479836169, "updated_by_handle": "jane.doe" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application" } } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
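Given the response model above, the nested Product Scales state lives under `data.attributes.product_scales`. The sketch below pulls those fields out of a GET response using Python's `requests` package rather than the official clients; the `api.datadoghq.com` host and the placeholder application ID are assumptions, and `product_scales` is optional, so the sketch guards against its absence.

```
# Minimal sketch: fetch one RUM application and read the nested Product Scales
# fields described in the response model above. Raw HTTP via requests; the
# host and the application ID are assumptions.
import os
import requests

app_id = "abcd1234-0000-0000-abcd-1234abcd5678"  # placeholder RUM application ID
resp = requests.get(
    f"https://api.datadoghq.com/api/v2/rum/applications/{app_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()

attrs = resp.json()["data"]["attributes"]
print("name:", attrs["name"])

# product_scales is optional in the response, so guard before reading it.
scales = attrs.get("product_scales", {})
print("product_analytics_retention_state:",
      scales.get("product_analytics_retention_scale", {}).get("state"))
print("rum_event_processing_state:",
      scales.get("rum_event_processing_scale", {}).get("state"))
```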
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Get a RUM application Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a RUM application ``` """ Get a RUM application returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = environ["RUM_APPLICATION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.get_rum_application( id=RUM_APPLICATION_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a RUM application ``` # Get a RUM application returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = ENV["RUM_APPLICATION_DATA_ID"] p api_instance.get_rum_application(RUM_APPLICATION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a RUM application ``` // Get a RUM application returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_application" in the system RumApplicationDataID := os.Getenv("RUM_APPLICATION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.GetRUMApplication(ctx, RumApplicationDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.GetRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.GetRUMApplication`:\n%s\n", responseContent) } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a RUM application ``` // Get a RUM application returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); // there is a valid "rum_application" in the system String RUM_APPLICATION_DATA_ID = System.getenv("RUM_APPLICATION_DATA_ID"); try { RUMApplicationResponse result = apiInstance.getRUMApplication(RUM_APPLICATION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#getRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a RUM application ``` // Get a RUM application returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; #[tokio::main] async fn main() { // there is a valid "rum_application" in the system let rum_application_data_id = std::env::var("RUM_APPLICATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api .get_rum_application(rum_application_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a RUM application ``` /** * Get a RUM application returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); // there is a valid "rum_application" in the system const RUM_APPLICATION_DATA_ID = process.env.RUM_APPLICATION_DATA_ID as string; const params: v2.RUMApiGetRUMApplicationRequest = { id: RUM_APPLICATION_DATA_ID, }; apiInstance .getRUMApplication(params) .then((data: v2.RUMApplicationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a RUM application](https://docs.datadoghq.com/api/latest/rum/#delete-a-rum-application) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#delete-a-rum-application-v2) DELETE https://api.ap1.datadoghq.com/api/v2/rum/applications/{id}https://api.ap2.datadoghq.com/api/v2/rum/applications/{id}https://api.datadoghq.eu/api/v2/rum/applications/{id}https://api.ddog-gov.com/api/v2/rum/applications/{id}https://api.datadoghq.com/api/v2/rum/applications/{id}https://api.us3.datadoghq.com/api/v2/rum/applications/{id}https://api.us5.datadoghq.com/api/v2/rum/applications/{id} ### Overview Delete an existing RUM application in your organization. This endpoint requires the `rum_apps_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string RUM application ID. ### Response * [204](https://docs.datadoghq.com/api/latest/rum/#DeleteRUMApplication-204-v2) * [404](https://docs.datadoghq.com/api/latest/rum/#DeleteRUMApplication-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#DeleteRUMApplication-429-v2) No Content Not Found * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Delete a RUM application Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications/${id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a RUM application ``` """ Delete a RUM application returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = environ["RUM_APPLICATION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) api_instance.delete_rum_application( id=RUM_APPLICATION_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a RUM application ``` # Delete a RUM application returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new # there is a valid "rum_application" in the system RUM_APPLICATION_DATA_ID = ENV["RUM_APPLICATION_DATA_ID"] api_instance.delete_rum_application(RUM_APPLICATION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a RUM application ``` // Delete a RUM application returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "rum_application" in the system RumApplicationDataID := os.Getenv("RUM_APPLICATION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) r, err := api.DeleteRUMApplication(ctx, RumApplicationDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.DeleteRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a RUM application ``` // Delete a RUM application returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); // there is a valid "rum_application" in the system String RUM_APPLICATION_DATA_ID = System.getenv("RUM_APPLICATION_DATA_ID"); try { apiInstance.deleteRUMApplication(RUM_APPLICATION_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling RumApi#deleteRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a RUM application ``` // Delete a RUM application returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; #[tokio::main] async fn main() { // there is a valid "rum_application" in the system let rum_application_data_id = std::env::var("RUM_APPLICATION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api .delete_rum_application(rum_application_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a RUM application ``` /** * Delete a RUM application returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); // there is a valid "rum_application" in the system const RUM_APPLICATION_DATA_ID = process.env.RUM_APPLICATION_DATA_ID as string; const params: v2.RUMApiDeleteRUMApplicationRequest = { id: RUM_APPLICATION_DATA_ID, }; apiInstance .deleteRUMApplication(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new RUM application](https://docs.datadoghq.com/api/latest/rum/#create-a-new-rum-application) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#create-a-new-rum-application-v2) POST https://api.ap1.datadoghq.com/api/v2/rum/applicationshttps://api.ap2.datadoghq.com/api/v2/rum/applicationshttps://api.datadoghq.eu/api/v2/rum/applicationshttps://api.ddog-gov.com/api/v2/rum/applicationshttps://api.datadoghq.com/api/v2/rum/applicationshttps://api.us3.datadoghq.com/api/v2/rum/applicationshttps://api.us5.datadoghq.com/api/v2/rum/applications ### Overview Create a new RUM application in your organization. This endpoint requires the `rum_apps_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) Field Type Description data [_required_] object RUM application creation. attributes [_required_] object RUM application creation attributes. name [_required_] string Name of the RUM application. product_analytics_retention_state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_state enum Configures which RUM events are processed and stored for the application. Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. type [_required_] enum RUM application creation type. Allowed enum values: `rum_application_create` default: `rum_application_create` ##### Create a new RUM application returns "OK" response ``` { "data": { "attributes": { "name": "test-rum-5c67ebb32077e1d9", "type": "ios" }, "type": "rum_application_create" } } ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` { "data": { "attributes": { "name": "test-rum-with-product-scales-5c67ebb32077e1d9", "type": "browser", "rum_event_processing_state": "ERROR_FOCUSED_MODE", "product_analytics_retention_state": "NONE" }, "type": "rum_application_create" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#CreateRUMApplication-200-v2) * [400](https://docs.datadoghq.com/api/latest/rum/#CreateRUMApplication-400-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#CreateRUMApplication-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) RUM application response. Field Type Description data object RUM application. attributes [_required_] object RUM application attributes. api_key_id int32 ID of the API key associated with the application. application_id [_required_] string ID of the RUM application. client_token [_required_] string Client token of the RUM application. created_at [_required_] int64 Timestamp in ms of the creation date. created_by_handle [_required_] string Handle of the creator user. hash string Hash of the RUM application. Optional. 
is_active boolean Indicates if the RUM application is active. name [_required_] string Name of the RUM application. org_id [_required_] int32 Org ID of the RUM application. product_scales object Product Scales configuration for the RUM application. product_analytics_retention_scale object Product Analytics retention scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_scale object RUM event processing scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Configures which RUM events are processed and stored for the application. Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type [_required_] string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. updated_at [_required_] int64 Timestamp in ms of the last update date. updated_by_handle [_required_] string Handle of the updater user. id [_required_] string RUM application ID. type [_required_] enum RUM application response type. Allowed enum values: `rum_application` default: `rum_application` ``` { "data": { "attributes": { "api_key_id": 123456789, "application_id": "abcd1234-0000-0000-abcd-1234abcd5678", "client_token": "abcd1234efgh5678ijkl90abcd1234efgh0", "created_at": 1659479836169, "created_by_handle": "john.doe", "hash": "string", "is_active": true, "name": "my_rum_application", "org_id": 999, "product_scales": { "product_analytics_retention_scale": { "last_modified_at": 1747922145974, "state": "MAX" }, "rum_event_processing_scale": { "last_modified_at": 1721897494108, "state": "ALL" } }, "type": "browser", "updated_at": 1659479836169, "updated_by_handle": "jane.doe" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
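The fields most callers need from the 200 response above are `application_id` and `client_token`, which are typically what you use to configure the RUM SDK for the new application. The sketch below creates a browser application and captures them, using Python's `requests` package rather than the official clients; the `api.datadoghq.com` host and the application name are placeholders, not values from this reference.

```
# Minimal sketch: create a browser RUM application and capture the identifiers
# returned in the response (application_id and client_token). Raw HTTP via
# requests; the host and the application name are assumptions.
import os
import requests

body = {
    "data": {
        "attributes": {"name": "my-new-rum-application", "type": "browser"},  # placeholder name
        "type": "rum_application_create",
    }
}

resp = requests.post(
    "https://api.datadoghq.com/api/v2/rum/applications",
    json=body,
    headers={
        "Accept": "application/json",
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()

attrs = resp.json()["data"]["attributes"]
print("application_id:", attrs["application_id"])
print("client_token:", attrs["client_token"])
```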
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### Create a new RUM application returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "test-rum-5c67ebb32077e1d9", "type": "ios" }, "type": "rum_application_create" } } EOF ``` ##### Create a new RUM application with Product Scales returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "name": "test-rum-with-product-scales-5c67ebb32077e1d9", "type": "browser", "rum_event_processing_state": "ERROR_FOCUSED_MODE", "product_analytics_retention_state": "NONE" }, "type": "rum_application_create" } } EOF ``` ##### Create a new RUM application returns "OK" response ``` // Create a new RUM application returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RUMApplicationCreateRequest{ Data: datadogV2.RUMApplicationCreate{ Attributes: datadogV2.RUMApplicationCreateAttributes{ Name: "test-rum-5c67ebb32077e1d9", Type: datadog.PtrString("ios"), }, Type: datadogV2.RUMAPPLICATIONCREATETYPE_RUM_APPLICATION_CREATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.CreateRUMApplication(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.CreateRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.CreateRUMApplication`:\n%s\n", responseContent) } ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` // Create a new RUM application with Product Scales returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RUMApplicationCreateRequest{ Data: datadogV2.RUMApplicationCreate{ Attributes: datadogV2.RUMApplicationCreateAttributes{ Name: 
"test-rum-with-product-scales-5c67ebb32077e1d9", Type: datadog.PtrString("browser"), RumEventProcessingState: datadogV2.RUMEVENTPROCESSINGSTATE_ERROR_FOCUSED_MODE.Ptr(), ProductAnalyticsRetentionState: datadogV2.RUMPRODUCTANALYTICSRETENTIONSTATE_NONE.Ptr(), }, Type: datadogV2.RUMAPPLICATIONCREATETYPE_RUM_APPLICATION_CREATE, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.CreateRUMApplication(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.CreateRUMApplication`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.CreateRUMApplication`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new RUM application returns "OK" response ``` // Create a new RUM application returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationCreate; import com.datadog.api.client.v2.model.RUMApplicationCreateAttributes; import com.datadog.api.client.v2.model.RUMApplicationCreateRequest; import com.datadog.api.client.v2.model.RUMApplicationCreateType; import com.datadog.api.client.v2.model.RUMApplicationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); RUMApplicationCreateRequest body = new RUMApplicationCreateRequest() .data( new RUMApplicationCreate() .attributes( new RUMApplicationCreateAttributes() .name("test-rum-5c67ebb32077e1d9") .type("ios")) .type(RUMApplicationCreateType.RUM_APPLICATION_CREATE)); try { RUMApplicationResponse result = apiInstance.createRUMApplication(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#createRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` // Create a new RUM application with Product Scales returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationCreate; import com.datadog.api.client.v2.model.RUMApplicationCreateAttributes; import com.datadog.api.client.v2.model.RUMApplicationCreateRequest; import com.datadog.api.client.v2.model.RUMApplicationCreateType; import com.datadog.api.client.v2.model.RUMApplicationResponse; import com.datadog.api.client.v2.model.RUMEventProcessingState; import com.datadog.api.client.v2.model.RUMProductAnalyticsRetentionState; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance 
= new RumApi(defaultClient); RUMApplicationCreateRequest body = new RUMApplicationCreateRequest() .data( new RUMApplicationCreate() .attributes( new RUMApplicationCreateAttributes() .name("test-rum-with-product-scales-5c67ebb32077e1d9") .type("browser") .rumEventProcessingState(RUMEventProcessingState.ERROR_FOCUSED_MODE) .productAnalyticsRetentionState(RUMProductAnalyticsRetentionState.NONE)) .type(RUMApplicationCreateType.RUM_APPLICATION_CREATE)); try { RUMApplicationResponse result = apiInstance.createRUMApplication(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#createRUMApplication"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new RUM application returns "OK" response ``` """ Create a new RUM application returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_application_create import RUMApplicationCreate from datadog_api_client.v2.model.rum_application_create_attributes import RUMApplicationCreateAttributes from datadog_api_client.v2.model.rum_application_create_request import RUMApplicationCreateRequest from datadog_api_client.v2.model.rum_application_create_type import RUMApplicationCreateType body = RUMApplicationCreateRequest( data=RUMApplicationCreate( attributes=RUMApplicationCreateAttributes( name="test-rum-5c67ebb32077e1d9", type="ios", ), type=RUMApplicationCreateType.RUM_APPLICATION_CREATE, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.create_rum_application(body=body) print(response) ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` """ Create a new RUM application with Product Scales returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi from datadog_api_client.v2.model.rum_application_create import RUMApplicationCreate from datadog_api_client.v2.model.rum_application_create_attributes import RUMApplicationCreateAttributes from datadog_api_client.v2.model.rum_application_create_request import RUMApplicationCreateRequest from datadog_api_client.v2.model.rum_application_create_type import RUMApplicationCreateType from datadog_api_client.v2.model.rum_event_processing_state import RUMEventProcessingState from datadog_api_client.v2.model.rum_product_analytics_retention_state import RUMProductAnalyticsRetentionState body = RUMApplicationCreateRequest( data=RUMApplicationCreate( attributes=RUMApplicationCreateAttributes( name="test-rum-with-product-scales-5c67ebb32077e1d9", type="browser", rum_event_processing_state=RUMEventProcessingState.ERROR_FOCUSED_MODE, product_analytics_retention_state=RUMProductAnalyticsRetentionState.NONE, ), type=RUMApplicationCreateType.RUM_APPLICATION_CREATE, ), ) configuration = Configuration() with ApiClient(configuration) as 
api_client: api_instance = RUMApi(api_client) response = api_instance.create_rum_application(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new RUM application returns "OK" response ``` # Create a new RUM application returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new body = DatadogAPIClient::V2::RUMApplicationCreateRequest.new({ data: DatadogAPIClient::V2::RUMApplicationCreate.new({ attributes: DatadogAPIClient::V2::RUMApplicationCreateAttributes.new({ name: "test-rum-5c67ebb32077e1d9", type: "ios", }), type: DatadogAPIClient::V2::RUMApplicationCreateType::RUM_APPLICATION_CREATE, }), }) p api_instance.create_rum_application(body) ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` # Create a new RUM application with Product Scales returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new body = DatadogAPIClient::V2::RUMApplicationCreateRequest.new({ data: DatadogAPIClient::V2::RUMApplicationCreate.new({ attributes: DatadogAPIClient::V2::RUMApplicationCreateAttributes.new({ name: "test-rum-with-product-scales-5c67ebb32077e1d9", type: "browser", rum_event_processing_state: DatadogAPIClient::V2::RUMEventProcessingState::ERROR_FOCUSED_MODE, product_analytics_retention_state: DatadogAPIClient::V2::RUMProductAnalyticsRetentionState::NONE, }), type: DatadogAPIClient::V2::RUMApplicationCreateType::RUM_APPLICATION_CREATE, }), }) p api_instance.create_rum_application(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new RUM application returns "OK" response ``` // Create a new RUM application returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use datadog_api_client::datadogV2::model::RUMApplicationCreate; use datadog_api_client::datadogV2::model::RUMApplicationCreateAttributes; use datadog_api_client::datadogV2::model::RUMApplicationCreateRequest; use datadog_api_client::datadogV2::model::RUMApplicationCreateType; #[tokio::main] async fn main() { let body = RUMApplicationCreateRequest::new(RUMApplicationCreate::new( RUMApplicationCreateAttributes::new("test-rum-5c67ebb32077e1d9".to_string()) .type_("ios".to_string()), RUMApplicationCreateType::RUM_APPLICATION_CREATE, )); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api.create_rum_application(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` // Create a new RUM application with Product Scales returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; use 
datadog_api_client::datadogV2::model::RUMApplicationCreate; use datadog_api_client::datadogV2::model::RUMApplicationCreateAttributes; use datadog_api_client::datadogV2::model::RUMApplicationCreateRequest; use datadog_api_client::datadogV2::model::RUMApplicationCreateType; use datadog_api_client::datadogV2::model::RUMEventProcessingState; use datadog_api_client::datadogV2::model::RUMProductAnalyticsRetentionState; #[tokio::main] async fn main() { let body = RUMApplicationCreateRequest::new(RUMApplicationCreate::new( RUMApplicationCreateAttributes::new( "test-rum-with-product-scales-5c67ebb32077e1d9".to_string(), ) .product_analytics_retention_state(RUMProductAnalyticsRetentionState::NONE) .rum_event_processing_state(RUMEventProcessingState::ERROR_FOCUSED_MODE) .type_("browser".to_string()), RUMApplicationCreateType::RUM_APPLICATION_CREATE, )); let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api.create_rum_application(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new RUM application returns "OK" response ``` /** * Create a new RUM application returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); const params: v2.RUMApiCreateRUMApplicationRequest = { body: { data: { attributes: { name: "test-rum-5c67ebb32077e1d9", type: "ios", }, type: "rum_application_create", }, }, }; apiInstance .createRUMApplication(params) .then((data: v2.RUMApplicationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a new RUM application with Product Scales returns "OK" response ``` /** * Create a new RUM application with Product Scales returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); const params: v2.RUMApiCreateRUMApplicationRequest = { body: { data: { attributes: { name: "test-rum-with-product-scales-5c67ebb32077e1d9", type: "browser", rumEventProcessingState: "ERROR_FOCUSED_MODE", productAnalyticsRetentionState: "NONE", }, type: "rum_application_create", }, }, }; apiInstance .createRUMApplication(params) .then((data: v2.RUMApplicationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all the RUM applications](https://docs.datadoghq.com/api/latest/rum/#list-all-the-rum-applications) * [v2 (latest)](https://docs.datadoghq.com/api/latest/rum/#list-all-the-rum-applications-v2) GET https://api.ap1.datadoghq.com/api/v2/rum/applicationshttps://api.ap2.datadoghq.com/api/v2/rum/applicationshttps://api.datadoghq.eu/api/v2/rum/applicationshttps://api.ddog-gov.com/api/v2/rum/applicationshttps://api.datadoghq.com/api/v2/rum/applicationshttps://api.us3.datadoghq.com/api/v2/rum/applicationshttps://api.us5.datadoghq.com/api/v2/rum/applications ### Overview List all the RUM applications in your organization. This endpoint requires the `rum_apps_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplications-200-v2) * [404](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplications-404-v2) * [429](https://docs.datadoghq.com/api/latest/rum/#GetRUMApplications-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) RUM applications response. Field Type Description data [object] RUM applications array response. attributes [_required_] object RUM application list attributes. application_id [_required_] string ID of the RUM application. created_at [_required_] int64 Timestamp in ms of the creation date. created_by_handle [_required_] string Handle of the creator user. hash string Hash of the RUM application. Optional. is_active boolean Indicates if the RUM application is active. name [_required_] string Name of the RUM application. org_id [_required_] int32 Org ID of the RUM application. product_scales object Product Scales configuration for the RUM application. product_analytics_retention_scale object Product Analytics retention scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Controls the retention policy for Product Analytics data derived from RUM events. Allowed enum values: `MAX,NONE` rum_event_processing_scale object RUM event processing scale configuration. last_modified_at int64 Timestamp in milliseconds when this scale was last modified. state enum Configures which RUM events are processed and stored for the application. Allowed enum values: `ALL,ERROR_FOCUSED_MODE,NONE` type [_required_] string Type of the RUM application. Supported values are `browser`, `ios`, `android`, `react-native`, `flutter`, `roku`, `electron`, `unity`, `kotlin-multiplatform`. updated_at [_required_] int64 Timestamp in ms of the last update date. updated_by_handle [_required_] string Handle of the updater user. id string RUM application ID. type [_required_] enum RUM application list type. 
Allowed enum values: `rum_application` default: `rum_application` ``` { "data": [ { "attributes": { "application_id": "abcd1234-0000-0000-abcd-1234abcd5678", "created_at": 1659479836169, "created_by_handle": "john.doe", "hash": "string", "is_active": true, "name": "my_rum_application", "org_id": 999, "product_scales": { "product_analytics_retention_scale": { "last_modified_at": 1747922145974, "state": "MAX" }, "rum_event_processing_scale": { "last_modified_at": 1721897494108, "state": "ALL" } }, "type": "browser", "updated_at": 1659479836169, "updated_by_handle": "jane.doe" }, "id": "abcd1234-0000-0000-abcd-1234abcd5678", "type": "rum_application" } ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/rum/) * [Example](https://docs.datadoghq.com/api/latest/rum/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/rum/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/rum/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/rum/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/rum/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/rum/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/rum/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/rum/?code-lang=typescript) ##### List all the RUM applications Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/rum/applications" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all the RUM applications ``` """ List all the RUM applications returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.rum_api import RUMApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = RUMApi(api_client) response = api_instance.get_rum_applications() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all the RUM applications ``` # List all the RUM applications returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::RUMAPI.new p api_instance.get_rum_applications() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all the RUM applications ``` // List all the RUM applications returns "OK" response 
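// No credentials are hard-coded in this example: the client reads DD_API_KEY,
// DD_APP_KEY, and DD_SITE from the environment (see the run instructions that
// follow this code block).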
package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewRUMApi(apiClient) resp, r, err := api.GetRUMApplications(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `RUMApi.GetRUMApplications`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `RUMApi.GetRUMApplications`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all the RUM applications ``` // List all the RUM applications returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.RumApi; import com.datadog.api.client.v2.model.RUMApplicationsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); RumApi apiInstance = new RumApi(defaultClient); try { RUMApplicationsResponse result = apiInstance.getRUMApplications(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling RumApi#getRUMApplications"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all the RUM applications ``` // List all the RUM applications returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_rum::RUMAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = RUMAPI::with_config(configuration); let resp = api.get_rum_applications().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all the RUM applications ``` /** * List all the RUM applications returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.RUMApi(configuration); apiInstance .getRUMApplications() .then((data: v2.RUMApplicationsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=45a89a38-7fb0-4446-9360-45e1e06b3d0d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=2186c6e5-97a3-4653-9890-2bf447d9431b&pt=RUM&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=45a89a38-7fb0-4446-9360-45e1e06b3d0d&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=2186c6e5-97a3-4653-9890-2bf447d9431b&pt=RUM&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=268f7ec2-fdf4-40ed-88f2-0f9f8342e2ad&bo=2&sid=c4ba0d50f0bf11f09afde9c0412ba014&vid=c4ba67e0f0bf11f099fcdb9b6c7a6b3e&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=RUM&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Frum%2F&r=<=1963&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=287360) --- # Source: https://docs.datadoghq.com/api/latest/scim/ # SCIM Provision Datadog users and teams using SCIM APIs. Note: SCIM provisioning for Datadog teams is only available for select organizations at this point. Request access by contacting Datadog support, or see the [SCIM page](https://docs.datadoghq.com/account_management/scim/) for more information. ## [List users](https://docs.datadoghq.com/api/latest/scim/#list-users) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#list-users-v2) GET https://api.ap1.datadoghq.com/api/v2/scim/Usershttps://api.ap2.datadoghq.com/api/v2/scim/Usershttps://api.datadoghq.eu/api/v2/scim/Usershttps://api.ddog-gov.com/api/v2/scim/Usershttps://api.datadoghq.com/api/v2/scim/Usershttps://api.us3.datadoghq.com/api/v2/scim/Usershttps://api.us5.datadoghq.com/api/v2/scim/Users ### Overview List users in the organization. Results are paginated by `startIndex` and `count` parameters. Results can be narrowed down by the `filter` parameter. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Query Strings Name Type Description startIndex integer Specifies the start index to fetch the results (1-indexed). count integer Specifies the number of users to be returned. filter string Specifies the url encoded filter to use to narrow down the results. Filter should be of the form `userName eq `. ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#ListSCIMUsers-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#ListSCIMUsers-400-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#ListSCIMUsers-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) List users response object. 
Field Type Description Resources [object] List of users matching the request criteria. active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. itemsPerPage int64 Number of users returned per page. schemas [string] List response JSON Schemas. startIndex int64 Starting index of the users for this page (1-indexed). totalResults int64 Total number of users matching the request criteria. ``` { "Resources": [ { "active": true, "emails": [ { "primary": true, "type": "work", "value": "john.doe@datadoghq.com" } ], "id": "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "meta": { "created": "2024-11-22T15:05:52.055138963Z", "lastModified": "2024-11-22T15:05:52.055139017Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "resourceType": "User" }, "name": { "formatted": "John Doe" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "Mr.", "userName": "john.doe@datadoghq.com" }, { "active": true, "emails": [ { "primary": true, "type": "work", "value": "jane.doe@datadoghq.com" } ], "id": "79ef0d28-f257-4829-97e6-d23d2a26cb1a", "meta": { "created": "2024-11-22T15:05:52.055139748Z", "lastModified": "2024-11-22T15:05:52.055139813Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/79ef0d28-f257-4829-97e6-d23d2a26cb1a", "resourceType": "User" }, "name": { "formatted": "Jane Doe" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "Mrs.", "userName": "jane.doe@datadoghq.com" } ], "itemsPerPage": 2, "schemas": [ "urn:ietf:params:scim:api:messages:2.0:ListResponse" ], "startIndex": 1, "totalResults": 2 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### List users Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scim/Users" \ -H "Accept: application/json" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" ``` * * * ## [Create user](https://docs.datadoghq.com/api/latest/scim/#create-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#create-user-v2) POST https://api.ap1.datadoghq.com/api/v2/scim/Usershttps://api.ap2.datadoghq.com/api/v2/scim/Usershttps://api.datadoghq.eu/api/v2/scim/Usershttps://api.ddog-gov.com/api/v2/scim/Usershttps://api.datadoghq.com/api/v2/scim/Usershttps://api.us3.datadoghq.com/api/v2/scim/Usershttps://api.us5.datadoghq.com/api/v2/scim/Users ### Overview Create a new user. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMUser-201-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMUser-400-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMUser-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Definition of a user. Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. 
lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### Create user Copy ``` ## json-request-body # # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scim/Users" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \ -d @- << EOF { "active": true, "emails": [ { "primary": true, "type": "work", "value": "john.doe@datadoghq.com" } ], "name": { "formatted": "John Doe" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "Mr.", "userName": "john.doe@datadoghq.com" } EOF ``` * * * ## [Get user](https://docs.datadoghq.com/api/latest/scim/#get-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#get-user-v2) GET https://api.ap1.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.ap2.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.eu/api/v2/scim/Users/{user_uuid}https://api.ddog-gov.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us3.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us5.datadoghq.com/api/v2/scim/Users/{user_uuid} ### Overview Get a single user using the `user_uuid`. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Path Parameters Name Type Description user_uuid [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#GetSCIMUser-200-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#GetSCIMUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#GetSCIMUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Definition of a user. Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. 
primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### Get user Copy ``` # Path parameters export user_uuid="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scim/Users/${user_uuid}" \ -H "Accept: application/json" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" ``` * * * ## [Update user](https://docs.datadoghq.com/api/latest/scim/#update-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#update-user-v2) PUT https://api.ap1.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.ap2.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.eu/api/v2/scim/Users/{user_uuid}https://api.ddog-gov.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us3.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us5.datadoghq.com/api/v2/scim/Users/{user_uuid} ### Overview Update the user with the given `user_uuid`. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Path Parameters Name Type Description user_uuid [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. 
Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMUser-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMUser-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Definition of a user. Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### Update user Copy ``` ## json-request-body # # Path parameters export user_uuid="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scim/Users/${user_uuid}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \ -d @- << EOF { "active": true, "emails": [ { "primary": true, "type": "work", "value": "john.doe@datadoghq.com" } ], "id": "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "name": { "formatted": "John Doe" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "Mr.", "userName": "john.doe@datadoghq.com" } EOF ``` * * * ## [Patch user](https://docs.datadoghq.com/api/latest/scim/#patch-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#patch-user-v2) PATCH https://api.ap1.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.ap2.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.eu/api/v2/scim/Users/{user_uuid}https://api.ddog-gov.com/api/v2/scim/Users/{user_uuid}https://api.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us3.datadoghq.com/api/v2/scim/Users/{user_uuid}https://api.us5.datadoghq.com/api/v2/scim/Users/{user_uuid} ### Overview Patch the user with the given `user_uuid`. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Path Parameters Name Type Description user_uuid [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Field Type Description Operations [object] A list of update operations to be performed on a user. op enum The operation to be performed. Allowed enum values: `add,replace` path string An attribute path describing the target of the operation. value New value to use for the patch operation. schemas [string] Input JSON Schemas ``` { "Operations": [ { "op": "string", "path": "title", "value": "undefined" } ], "schemas": [ "urn:ietf:params:scim:api:messages:2.0:PatchOp" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMUser-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMUser-400-v2) * [403](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMUser-403-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Definition of a user. Field Type Description active boolean A Boolean value indicating the User's administrative status. emails [object] Email addresses for the user. primary boolean Boolean indicating if this email is the primary email address. type enum The type of email. Allowed enum values: `work` value string Email addresses for the user. id string The identifier of the resource. 
Not required when creating a user. meta object Metadata associated with a user. created date-time The date and time the user was created. lastModified date-time The date and time the user was last changed. location string URL identifying the resource. resourceType string Type of resource. name object The components of user's real name formatted string The full name, including all middle names, titles, and suffixes as appropriate, formatted for display. schemas [string] User JSON Schemas. title string The user's title. userName string Unique identifier for the User. ``` { "active": false, "emails": [ { "primary": false, "type": "string", "value": "string" } ], "id": "string", "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Users/13a95654-b76d-478d-8636-157a7e461d7c", "resourceType": "User" }, "name": { "formatted": "string" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:User" ], "title": "string", "userName": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### Patch user Copy ``` ## json-request-body # # Path parameters export user_uuid="CHANGE_ME" # Curl command curl -X PATCH "https://api.datadoghq.com/api/v2/scim/Users/${user_uuid}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \ -d @- << EOF { "Operations": [ { "op": "replace", "path": "title", "value": "CEO" }, { "op": "replace", "value": { "name": { "formatted": "John Doe" } } } ], "schemas": [ "urn:ietf:params:scim:api:messages:2.0:PatchOp" ] } EOF ``` * * * ## [Delete user](https://docs.datadoghq.com/api/latest/scim/#delete-user) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#delete-user-v2) DELETE https://api.ap1.datadoghq.com/api/v2/scim/Users/{user_uuid} https://api.ap2.datadoghq.com/api/v2/scim/Users/{user_uuid} https://api.datadoghq.eu/api/v2/scim/Users/{user_uuid} https://api.ddog-gov.com/api/v2/scim/Users/{user_uuid} https://api.datadoghq.com/api/v2/scim/Users/{user_uuid} https://api.us3.datadoghq.com/api/v2/scim/Users/{user_uuid} https://api.us5.datadoghq.com/api/v2/scim/Users/{user_uuid} ### Overview Delete the user with the given `user_uuid`.
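The documented code example for this endpoint is curl-only. As a rough illustration (not an official client example), a minimal Python sketch using the `requests` library could issue the same call; it assumes the SCIM bearer token is available in `DD_BEARER_TOKEN`, as in the curl commands in this section, and that `DD_SITE` defaults to `datadoghq.com`:

```
# Hypothetical sketch (not from the Datadog docs): delete a SCIM user with `requests`.
import os

import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
DD_BEARER_TOKEN = os.environ["DD_BEARER_TOKEN"]  # SCIM bearer token


def delete_scim_user(user_uuid: str) -> None:
    """Call DELETE /api/v2/scim/Users/{user_uuid}; a 204 status means the user was deleted."""
    resp = requests.delete(
        f"https://api.{DD_SITE}/api/v2/scim/Users/{user_uuid}",
        headers={"Authorization": f"Bearer {DD_BEARER_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()  # 400/404/429 surface as HTTPError


if __name__ == "__main__":
    delete_scim_user("CHANGE_ME")  # replace with a real user UUID
```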
This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Path Parameters Name Type Description user_uuid [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMUser-204-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMUser-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMUser-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl) ##### Delete user Copy ``` # Path parameters export user_uuid="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scim/Users/${user_uuid}" \ -H "Authorization: Bearer ${DD_BEARER_TOKEN}" ``` * * * ## [List groups](https://docs.datadoghq.com/api/latest/scim/#list-groups) * [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#list-groups-v2) GET https://api.ap1.datadoghq.com/api/v2/scim/Groupshttps://api.ap2.datadoghq.com/api/v2/scim/Groupshttps://api.datadoghq.eu/api/v2/scim/Groupshttps://api.ddog-gov.com/api/v2/scim/Groupshttps://api.datadoghq.com/api/v2/scim/Groupshttps://api.us3.datadoghq.com/api/v2/scim/Groupshttps://api.us5.datadoghq.com/api/v2/scim/Groups ### Overview List groups in the organization. Results are paginated by `startIndex` and `count` parameters. Results can be narrowed down by the `filter` parameter. This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Query Strings Name Type Description startIndex integer Specifies the start index to fetch the results (1-indexed). count integer Specifies the number of groups to be returned. filter string Specifies the url encoded filter to use to narrow down the results. Filters can be in the form of `displayName eq ` or `externalId eq ` or `id eq and members eq ` or `members eq and id eq `. ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#ListSCIMGroups-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#ListSCIMGroups-400-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#ListSCIMGroups-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) List groups response object. Field Type Description Resources [object] List of groups matching the request criteria. displayName string A human-readable name for the group. 
externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] A list of members belonging to the team. $ref string The URI corresponding to a resource that is a member of this group. display string Full name of the user. type string A label indicating the type of resource. Only supported value is "User". value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Group JSON Schemas. itemsPerPage int64 Number of groups returned per page. schemas [string] List response JSON Schemas. startIndex int64 Starting index of the groups for this page (1-indexed). totalResults int64 Total number of groups matching the request criteria. ``` { "Resources": [ { "displayName": "Group 1", "externalId": "group1", "id": "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/d34a5f93-5690-4d3f-a293-f2ad5c7a82a4", "display": "John Doe", "type": "User", "value": "d34a5f93-5690-4d3f-a293-f2ad5c7a82a4" }, { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "Jane Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-11-22T15:05:52.055138963Z", "lastModified": "2024-11-22T15:05:52.055139017Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }, { "displayName": "Group 2", "externalId": "group2", "id": "79ef0d28-f257-4829-97e6-d23d2a26cb1a", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/29da9fb7-8fca-4e87-bf58-86652253deae", "display": "Alice Smith", "type": "User", "value": "29da9fb7-8fca-4e87-bf58-86652253deae" }, { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/f85e3868-ad7b-47e3-a8a9-ff1eade2bbf9", "display": "Bob Smith", "type": "User", "value": "f85e3868-ad7b-47e3-a8a9-ff1eade2bbf9" } ], "meta": { "created": "2024-11-22T15:05:52.055139748Z", "lastModified": "2024-11-22T15:05:52.055139813Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/79ef0d28-f257-4829-97e6-d23d2a26cb1a", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] } ], "itemsPerPage": 2, "schemas": [ "urn:ietf:params:scim:api:messages:2.0:ListResponse" ], "startIndex": 1, "totalResults": 2 } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### List groups

```
# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v2/scim/Groups" \
  -H "Accept: application/json" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}"
```

* * *

## [Create group](https://docs.datadoghq.com/api/latest/scim/#create-group)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#create-group-v2)

POST https://api.ap1.datadoghq.com/api/v2/scim/Groups
https://api.ap2.datadoghq.com/api/v2/scim/Groups
https://api.datadoghq.eu/api/v2/scim/Groups
https://api.ddog-gov.com/api/v2/scim/Groups
https://api.datadoghq.com/api/v2/scim/Groups
https://api.us3.datadoghq.com/api/v2/scim/Groups
https://api.us5.datadoghq.com/api/v2/scim/Groups

### Overview
Create a new group. The group may contain members. This endpoint requires all of the following permissions:
* `user_access_invite`
* `user_access_manage`

### Request
#### Body Data (required)
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Field Type Description displayName string A human-readable name for the group. externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group. $ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas.

```
{ "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }
```

### Response
* [201](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMGroup-201-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMGroup-400-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#CreateSCIMGroup-429-v2)

Created
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Definition of a group. Field Type Description displayName string A human-readable name for the group. externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group. $ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas.

```
{ "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }
```

Bad Request
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### Create group

```
# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X POST "https://api.datadoghq.com/api/v2/scim/Groups" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \
  -d @- << 'EOF'
{ "displayName": "Group 1", "externalId": "group1", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/d34a5f93-5690-4d3f-a293-f2ad5c7a82a4", "display": "John Doe", "type": "User", "value": "d34a5f93-5690-4d3f-a293-f2ad5c7a82a4" }, { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "Jane Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }
EOF
```

* * *

## [Get group](https://docs.datadoghq.com/api/latest/scim/#get-group)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#get-group-v2)

GET https://api.ap1.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.ap2.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.eu/api/v2/scim/Groups/{group_id}
https://api.ddog-gov.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us3.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us5.datadoghq.com/api/v2/scim/Groups/{group_id}

### Overview
Get a single group using the `group_id`.
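As a quick reference, here is a minimal Python sketch of this request. This is an illustration added alongside the official curl example, not part of Datadog's generated client samples: it assumes the `requests` package and the same bearer token used in the curl examples, exported as `DD_BEARER_TOKEN`, and it prints the group's `displayName` and member identifiers from the response fields documented on this page. The required permissions and the full response schema for this endpoint follow the sketch.

```
# Minimal Python sketch (illustrative): GET /api/v2/scim/Groups/{group_id}
# Assumes the `requests` package and a SCIM-enabled bearer token in DD_BEARER_TOKEN.
import os
import requests

DD_API_BASE = "https://api.datadoghq.com"  # use the api.* host for your Datadog site
group_id = "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad"  # example group ID from the responses above

resp = requests.get(
    f"{DD_API_BASE}/api/v2/scim/Groups/{group_id}",
    headers={
        "Accept": "application/json",
        "Authorization": f"Bearer {os.environ['DD_BEARER_TOKEN']}",
    },
    timeout=10,
)
resp.raise_for_status()
group = resp.json()

# The response is a SCIM Group: displayName, members[].display, members[].value, meta, schemas.
print(group.get("displayName"))
for member in group.get("members", []):
    print(member.get("display"), member.get("value"))
```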
This endpoint requires all of the following permissions: * `user_access_invite` * `user_access_manage` ### Arguments #### Path Parameters Name Type Description group_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/scim/#GetSCIMGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#GetSCIMGroup-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#GetSCIMGroup-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#GetSCIMGroup-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) Definition of a group. Field Type Description displayName string A human-readable name for the group. externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group. $ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas. ``` { "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### Get group

```
# Path parameters
export group_id="CHANGE_ME"

# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X GET "https://api.datadoghq.com/api/v2/scim/Groups/${group_id}" \
  -H "Accept: application/json" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}"
```

* * *

## [Update group](https://docs.datadoghq.com/api/latest/scim/#update-group)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#update-group-v2)

PUT https://api.ap1.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.ap2.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.eu/api/v2/scim/Groups/{group_id}
https://api.ddog-gov.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us3.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us5.datadoghq.com/api/v2/scim/Groups/{group_id}

### Overview
Update the group with the given `group_id`. This endpoint requires all of the following permissions:
* `user_access_invite`
* `user_access_manage`

### Arguments
#### Path Parameters
Name Type Description group_id [_required_] string None

### Request
#### Body Data (required)
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Field Type Description displayName string A human-readable name for the group. externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group. $ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas.

```
{ "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }
```

### Response
* [200](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMGroup-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMGroup-404-v2) * [409](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMGroup-409-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#UpdateSCIMGroup-429-v2)

OK
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Definition of a group. Field Type Description displayName string A human-readable name for the group.
externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group. $ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas. ``` { "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### Update group

```
# Path parameters
export group_id="CHANGE_ME"

# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X PUT "https://api.datadoghq.com/api/v2/scim/Groups/${group_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \
  -d @- << 'EOF'
{ "displayName": "Group 1", "externalId": "group1", "id": "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/d34a5f93-5690-4d3f-a293-f2ad5c7a82a4", "display": "John Doe", "type": "User", "value": "d34a5f93-5690-4d3f-a293-f2ad5c7a82a4" }, { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "Jane Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] }
EOF
```

* * *

## [Patch group](https://docs.datadoghq.com/api/latest/scim/#patch-group)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#patch-group-v2)

PATCH https://api.ap1.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.ap2.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.eu/api/v2/scim/Groups/{group_id}
https://api.ddog-gov.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us3.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us5.datadoghq.com/api/v2/scim/Groups/{group_id}

### Overview
Patch the group with the given `group_id`. This endpoint requires all of the following permissions:
* `user_access_invite`
* `user_access_manage`

### Arguments
#### Path Parameters
Name Type Description group_id [_required_] string None

### Request
#### Body Data (required)
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Field Type Description Operations [object] A list of update operations to be performed on a group. op enum The operation to be performed. Allowed enum values: `add,replace,remove` path string An attribute path describing the target of the operation. value JSON element containing the target values required to apply the patch operation. schemas [string] Input JSON Schemas

```
{ "Operations": [ { "op": "string", "path": "members", "value": "{\n \"displayName\": \"Real new group\",\n \"id\": \"80df3a9b-24f5-4ebf-9ba0-714455453621\"\n}" } ], "schemas": [ "urn:ietf:params:scim:api:messages:2.0:PatchOp" ] }
```

### Response
* [200](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMGroup-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMGroup-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#PatchSCIMGroup-429-v2)

OK
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

Definition of a group. Field Type Description displayName string A human-readable name for the group. externalId string An identifier for the resource as defined by the provisioning client. id string The identifier of the resource. Not required when creating a group. members [object] Members of the group.
$ref string The URI corresponding to a SCIM resource that is a member of this group. display string A human-readable name for the group member. type string A label indicating the type of resource. value string The identifier of the member of this group. meta object Metadata associated with a group. created date-time The date and time the group was created. lastModified date-time The date and time the group was last changed. location string URL identifying the resource. resourceType string Type of resource. schemas [string] Input JSON Schemas. ``` { "displayName": "string", "externalId": "string", "id": "string", "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "display": "John Doe", "type": "User", "value": "429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6" } ], "meta": { "created": "2024-10-17T12:53:35.793Z", "lastModified": "2024-10-19T12:53:35.793Z", "location": "https://app.datadoghq.com/api/scim/v2/Groups/429ebce5-8ed3-4da9-9f1e-662f2dbc2fe6", "resourceType": "Group" }, "schemas": [ "urn:ietf:params:scim:schemas:core:2.0:Group" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### Patch group

```
# Path parameters
export group_id="CHANGE_ME"

# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/scim/Groups/${group_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}" \
  -d @- << 'EOF'
{ "Operations": [ { "op": "replace", "value": { "displayName": "Real new group", "id": "e43536e9-33fe-43f8-90b8-d3e39a7dd6ad" } }, { "op": "add", "value": { "members": [ { "$ref": "https://app.datadoghq.com/api/scim/v2/Users/f85e3868-ad7b-47e3-a8a9-ff1eade2bbf9", "displayName": "Bob Smith", "value": "f85e3868-ad7b-47e3-a8a9-ff1eade2bbf9" } ] } }, { "op": "remove", "path": "members[value eq \"fddf0cf2-9b60-11ef-ad4b-d6754a54a839\"]", "value": null } ], "schemas": [ "urn:ietf:params:scim:api:messages:2.0:PatchOp" ] }
EOF
```

* * *

## [Delete group](https://docs.datadoghq.com/api/latest/scim/#delete-group)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/scim/#delete-group-v2)

DELETE https://api.ap1.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.ap2.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.eu/api/v2/scim/Groups/{group_id}
https://api.ddog-gov.com/api/v2/scim/Groups/{group_id}
https://api.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us3.datadoghq.com/api/v2/scim/Groups/{group_id}
https://api.us5.datadoghq.com/api/v2/scim/Groups/{group_id}

### Overview
Delete the group with the given `group_id`. This endpoint requires all of the following permissions:
* `user_access_invite`
* `user_access_manage`

### Arguments
#### Path Parameters
Name Type Description group_id [_required_] string None

### Response
* [204](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMGroup-204-v2) * [400](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMGroup-400-v2) * [404](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMGroup-404-v2) * [429](https://docs.datadoghq.com/api/latest/scim/#DeleteSCIMGroup-429-v2)

No Content

Bad Request
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Not Found
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests
* [Model](https://docs.datadoghq.com/api/latest/scim/) * [Example](https://docs.datadoghq.com/api/latest/scim/)

API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```

### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/scim/?code-lang=curl)

##### Delete group

```
# Path parameters
export group_id="CHANGE_ME"

# Curl command (use the api.* host for your Datadog site, for example
# api.ap1.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, or api.us5.datadoghq.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/scim/Groups/${group_id}" \
  -H "Authorization: Bearer ${DD_BEARER_TOKEN}"
```

* * *

---

# Source: https://docs.datadoghq.com/api/latest/scopes

## [Authorization scopes for OAuth clients](https://docs.datadoghq.com/api/latest/scopes/#authorization-scopes-for-oauth-clients)

Scopes are an authorization mechanism that allows you to limit and define the specific access applications have to an organization’s Datadog data. When authorized to access data on behalf of a user or service account, applications can only access the information explicitly permitted by their assigned scopes. This page lists only the authorization scopes that can be assigned to OAuth clients. To view the full list of assignable permissions for scoped application keys, see [Datadog Role Permissions](https://docs.datadoghq.com/account_management/rbac/permissions/#permissions-list).
* **OAuth clients** → Can only be assigned authorization scopes (limited set).
* **Scoped application keys** → Can be assigned any Datadog permission.

The best practice for scoping applications is to follow the principle of least privilege. Assign only the minimum scopes necessary for an application to function as intended. This enhances security and provides visibility into how applications interact with your organization’s data. For example, a third-party application that only reads dashboards does not need permissions to delete or manage users. You can use authorization scopes with OAuth2 clients for your [Datadog Apps](https://docs.datadoghq.com/developers/datadog_apps/#oauth-api-access).

#### API Management, Synthetics
Scope name Description Endpoints that require this scope apm_api_catalog_read View API catalog and API definitions.
[Get all global variables](https://docs.datadoghq.com/api/latest/synthetics/#get-all-global-variables) [List APIs](https://docs.datadoghq.com/api/latest/api-management/#list-apis) [Get an API](https://docs.datadoghq.com/api/latest/api-management/#get-an-api) apm_api_catalog_write Add, modify, and delete API catalog definitions. [Delete an API](https://docs.datadoghq.com/api/latest/api-management/#delete-an-api) [Update an API](https://docs.datadoghq.com/api/latest/api-management/#update-an-api) [Create a new API](https://docs.datadoghq.com/api/latest/api-management/#create-a-new-api) synthetics_global_variable_read View, search, and use Synthetics global variables. [Get all global variables](https://docs.datadoghq.com/api/latest/synthetics/#get-all-global-variables) [Get a global variable](https://docs.datadoghq.com/api/latest/synthetics/#get-a-global-variable) synthetics_global_variable_write Create, edit, and delete global variables for Synthetics. [Create a global variable](https://docs.datadoghq.com/api/latest/synthetics/#create-a-global-variable) [Delete a global variable](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-global-variable) [Edit a global variable](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-global-variable) synthetics_private_location_read View, search, and use Synthetics private locations. [Get all locations (public and private)](https://docs.datadoghq.com/api/latest/synthetics/#get-all-locations-public-and-private) [Get a private location](https://docs.datadoghq.com/api/latest/synthetics/#get-a-private-location) synthetics_private_location_write Create and delete private locations in addition to having access to the associated installation guidelines. [Create a private location](https://docs.datadoghq.com/api/latest/synthetics/#create-a-private-location) [Delete a private location](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-private-location) [Edit a private location](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-private-location) synthetics_read List and view configured Synthetic tests and test results. [Get details of batch](https://docs.datadoghq.com/api/latest/synthetics/#get-details-of-batch) [Get the list of all Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#get-the-list-of-all-synthetic-tests) [Get an API test](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test) [Get a browser test](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test) [Get a browser test's latest results summaries](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-tests-latest-results-summaries) [Get a browser test result](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test-result) [Get a Mobile test](https://docs.datadoghq.com/api/latest/synthetics/#get-a-mobile-test) [Search Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#search-synthetic-tests) [Fetch uptime for multiple tests](https://docs.datadoghq.com/api/latest/synthetics/#fetch-uptime-for-multiple-tests) [Get a test configuration](https://docs.datadoghq.com/api/latest/synthetics/#get-a-test-configuration) [Get an API test's latest results summaries](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-tests-latest-results-summaries) [Get an API test result](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test-result) synthetics_write Create, edit, and delete Synthetic tests. 
[Create a test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-test) [Create an API test](https://docs.datadoghq.com/api/latest/synthetics/#create-an-api-test) [Edit an API test](https://docs.datadoghq.com/api/latest/synthetics/#edit-an-api-test) [Create a browser test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-browser-test) [Edit a browser test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-browser-test) [Delete tests](https://docs.datadoghq.com/api/latest/synthetics/#delete-tests) [Create a mobile test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-mobile-test) [Edit a Mobile test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-mobile-test) [Trigger Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#trigger-synthetic-tests) [Trigger tests from CI/CD pipelines](https://docs.datadoghq.com/api/latest/synthetics/#trigger-tests-from-ci/cd-pipelines) [Patch a Synthetic test](https://docs.datadoghq.com/api/latest/synthetics/#patch-a-synthetic-test) [Edit a test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-test) [Pause or start a test](https://docs.datadoghq.com/api/latest/synthetics/#pause-or-start-a-test) #### APM, Spans Scope name Description Endpoints that require this scope apm_read Read and query APM and Trace Analytics. [Get service list](https://docs.datadoghq.com/api/latest/apm/#get-service-list) [Aggregate spans](https://docs.datadoghq.com/api/latest/spans/#aggregate-spans) [Get a list of spans](https://docs.datadoghq.com/api/latest/spans/#get-a-list-of-spans) [Search spans](https://docs.datadoghq.com/api/latest/spans/#search-spans) #### Agentless Scanning, Domain Allowlist, Downtimes, IP Allowlist, Monitors Scope name Description Endpoints that require this scope org_management Edit org configurations, including authentication and certain security preferences such as configuring SAML, renaming an org, configuring allowed login methods, creating child orgs, subscribing & unsubscribing from apps in the marketplace, and enabling & disabling Remote Configuration for the entire organization. 
[Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) security_monitoring_findings_read View a list of findings that include both misconfigurations and identity risks. [List AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options) [Get AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options) [List Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options) [Get Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options) [List GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options) [Get GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options) [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks) [Get AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task) [List findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings) [Get a finding](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding) [List security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings) [Search security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings) monitors_write Edit, delete, and resolve individual monitors. 
[Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) monitors_downtime Set downtimes to suppress alerts from any monitor in an organization. Mute and unmute monitors. The ability to write monitors is not required to set downtimes. [Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel downtimes by scope](https://docs.datadoghq.com/api/latest/downtimes/#cancel-downtimes-by-scope) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) monitors_read View monitors. [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Get all monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors) [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted) [Monitors group search](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search) [Monitors search](https://docs.datadoghq.com/api/latest/monitors/#monitors-search) [Validate a monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor) [Get a monitor's details](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) [Validate an existing monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor) [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules) [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule) [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies) [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy) [Get all monitor user templates](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates) [Get a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template) #### Agentless Scanning, Domain Allowlist, IP Allowlist, Monitors, Security Monitoring Scope name Description 
Endpoints that require this scope org_management Edit org configurations, including authentication and certain security preferences such as configuring SAML, renaming an org, configuring allowed login methods, creating child orgs, subscribing & unsubscribing from apps in the marketplace, and enabling & disabling Remote Configuration for the entire organization. [Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) security_monitoring_findings_read View a list of findings that include both misconfigurations and identity risks. [List AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options) [Get AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options) [List Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options) [Get Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options) [List GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options) [Get GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options) [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks) [Get AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task) [List findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings) [Get a finding](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding) [List security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings) [Search security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings) monitors_write Edit, delete, and resolve individual monitors. 
[Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) monitors_read View monitors. [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Get all monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors) [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted) [Monitors group search](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search) [Monitors search](https://docs.datadoghq.com/api/latest/monitors/#monitors-search) [Validate a monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor) [Get a monitor's details](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) [Validate an existing monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor) [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules) [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule) [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies) [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy) [Get all monitor user templates](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates) [Get a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template) code_analysis_read View Code Analysis. [Post dependencies for analysis](https://docs.datadoghq.com/api/latest/static-analysis/#post-dependencies-for-analysis) [POST request to resolve vulnerable symbols](https://docs.datadoghq.com/api/latest/static-analysis/#post-request-to-resolve-vulnerable-symbols) [Ruleset get multiple](https://docs.datadoghq.com/api/latest/security-monitoring/#ruleset-get-multiple) [Returns a list of Secrets rules](https://docs.datadoghq.com/api/latest/security-monitoring/#returns-a-list-of-secrets-rules) security_monitoring_filters_read Read Security Filters. [List resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#list-resource-filters) [Get all security filters](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-security-filters) [Get a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-security-filter) security_monitoring_filters_write Create, edit, and delete Security Filters. 
[Update resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#update-resource-filters) [Create a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-security-filter) [Delete a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-security-filter) [Update a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-security-filter) security_monitoring_rules_read Read Detection Rules. [Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Get a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [List rules](https://docs.datadoghq.com/api/latest/security-monitoring/#list-rules) [Get a rule's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-details) [Convert an existing rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-an-existing-rule-from-json-to-terraform) [Get a job's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-details) security_monitoring_rules_write Create and edit Detection Rules. [Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [Create a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-detection-rule) [Convert a rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-rule-from-json-to-terraform) [Test a rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-a-rule) [Validate a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-detection-rule) [Delete an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-rule) [Update an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-an-existing-rule) [Test an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-an-existing-rule) [Run a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#run-a-threat-hunting-job) [Cancel a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#cancel-a-threat-hunting-job) security_monitoring_signals_read View Security Signals. 
[Get a quick list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-quick-list-of-security-signals) [Get a list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-list-of-security-signals) [Get a signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-signals-details) [List hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#list-hist-signals) [Search hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#search-hist-signals) [Get a hist signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-hist-signals-details) [Get a job's hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-hist-signals) security_monitoring_suppressions_read Read Rule Suppressions. [Get all suppression rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-suppression-rules) [Get suppressions affecting future rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-future-rule) [Get suppressions affecting a specific rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-a-specific-rule) [Get a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppression-rule) [Get a suppression's version history](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppressions-version-history) security_monitoring_suppressions_write Write Rule Suppressions. [Create a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-suppression-rule) [Validate a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-suppression-rule) [Delete a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-suppression-rule) [Update a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-suppression-rule) #### Agentless Scanning, Domain Allowlist, IP Allowlist, Security Monitoring, Static Analysis Scope name Description Endpoints that require this scope org_management Edit org configurations, including authentication and certain security preferences such as configuring SAML, renaming an org, configuring allowed login methods, creating child orgs, subscribing & unsubscribing from apps in the marketplace, and enabling & disabling Remote Configuration for the entire organization. 
[Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) security_monitoring_findings_read View a list of findings that include both misconfigurations and identity risks. [List AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options) [Get AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options) [List Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options) [Get Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options) [List GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options) [Get GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options) [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks) [Get AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task) [List findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings) [Get a finding](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding) [List security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings) [Search security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings) monitors_write Edit, delete, and resolve individual monitors. 
[Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) code_analysis_read View Code Analysis. [Post dependencies for analysis](https://docs.datadoghq.com/api/latest/static-analysis/#post-dependencies-for-analysis) [POST request to resolve vulnerable symbols](https://docs.datadoghq.com/api/latest/static-analysis/#post-request-to-resolve-vulnerable-symbols) [Ruleset get multiple](https://docs.datadoghq.com/api/latest/security-monitoring/#ruleset-get-multiple) [Returns a list of Secrets rules](https://docs.datadoghq.com/api/latest/security-monitoring/#returns-a-list-of-secrets-rules) security_monitoring_filters_read Read Security Filters. [List resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#list-resource-filters) [Get all security filters](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-security-filters) [Get a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-security-filter) security_monitoring_filters_write Create, edit, and delete Security Filters. [Update resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#update-resource-filters) [Create a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-security-filter) [Delete a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-security-filter) [Update a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-security-filter) security_monitoring_rules_read Read Detection Rules. [Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Get a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [List rules](https://docs.datadoghq.com/api/latest/security-monitoring/#list-rules) [Get a rule's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-details) [Convert an existing rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-an-existing-rule-from-json-to-terraform) [Get a job's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-details) security_monitoring_rules_write Create and edit Detection Rules. 
[Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [Create a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-detection-rule) [Convert a rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-rule-from-json-to-terraform) [Test a rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-a-rule) [Validate a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-detection-rule) [Delete an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-rule) [Update an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-an-existing-rule) [Test an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-an-existing-rule) [Run a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#run-a-threat-hunting-job) [Cancel a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#cancel-a-threat-hunting-job) security_monitoring_signals_read View Security Signals. [Get a quick list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-quick-list-of-security-signals) [Get a list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-list-of-security-signals) [Get a signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-signals-details) [List hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#list-hist-signals) [Search hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#search-hist-signals) [Get a hist signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-hist-signals-details) [Get a job's hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-hist-signals) security_monitoring_suppressions_read Read Rule Suppressions. [Get all suppression rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-suppression-rules) [Get suppressions affecting future rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-future-rule) [Get suppressions affecting a specific rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-a-specific-rule) [Get a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppression-rule) [Get a suppression's version history](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppressions-version-history) security_monitoring_suppressions_write Write Rule Suppressions. 
[Create a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-suppression-rule) [Validate a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-suppression-rule) [Delete a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-suppression-rule) [Update a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-suppression-rule) #### Agentless Scanning, Security Monitoring, Static Analysis Scope name Description Endpoints that require this scope org_management Edit org configurations, including authentication and certain security preferences such as configuring SAML, renaming an org, configuring allowed login methods, creating child orgs, subscribing & unsubscribing from apps in the marketplace, and enabling & disabling Remote Configuration for the entire organization. [Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) security_monitoring_findings_read View a list of findings that include both misconfigurations and identity risks. 
[List AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-scan-options) [Get AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-scan-options) [List Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-azure-scan-options) [Get Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-azure-scan-options) [List GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-gcp-scan-options) [Get GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-gcp-scan-options) [List AWS on demand tasks](https://docs.datadoghq.com/api/latest/agentless-scanning/#list-aws-on-demand-tasks) [Get AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#get-aws-on-demand-task) [List findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings) [Get a finding](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding) [List security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings) [Search security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings) code_analysis_read View Code Analysis. [Post dependencies for analysis](https://docs.datadoghq.com/api/latest/static-analysis/#post-dependencies-for-analysis) [POST request to resolve vulnerable symbols](https://docs.datadoghq.com/api/latest/static-analysis/#post-request-to-resolve-vulnerable-symbols) [Ruleset get multiple](https://docs.datadoghq.com/api/latest/security-monitoring/#ruleset-get-multiple) [Returns a list of Secrets rules](https://docs.datadoghq.com/api/latest/security-monitoring/#returns-a-list-of-secrets-rules) security_monitoring_filters_read Read Security Filters. [List resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#list-resource-filters) [Get all security filters](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-security-filters) [Get a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-security-filter) security_monitoring_filters_write Create, edit, and delete Security Filters. [Update resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#update-resource-filters) [Create a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-security-filter) [Delete a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-security-filter) [Update a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-security-filter) security_monitoring_rules_read Read Detection Rules. 
[Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Get a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [List rules](https://docs.datadoghq.com/api/latest/security-monitoring/#list-rules) [Get a rule's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-details) [Convert an existing rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-an-existing-rule-from-json-to-terraform) [Get a job's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-details) security_monitoring_rules_write Create and edit Detection Rules. [Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) [Create a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-detection-rule) [Convert a rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-rule-from-json-to-terraform) [Test a rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-a-rule) [Validate a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-detection-rule) [Delete an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-rule) [Update an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-an-existing-rule) [Test an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-an-existing-rule) [Run a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#run-a-threat-hunting-job) [Cancel a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#cancel-a-threat-hunting-job) security_monitoring_signals_read View Security Signals. [Get a quick list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-quick-list-of-security-signals) [Get a list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-list-of-security-signals) [Get a signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-signals-details) [List hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#list-hist-signals) [Search hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#search-hist-signals) [Get a hist signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-hist-signals-details) [Get a job's hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-hist-signals) security_monitoring_suppressions_read Read Rule Suppressions. 
[Get all suppression rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-suppression-rules) [Get suppressions affecting future rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-future-rule) [Get suppressions affecting a specific rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-a-specific-rule) [Get a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppression-rule) [Get a suppression's version history](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppressions-version-history) security_monitoring_suppressions_write Write Rule Suppressions. [Create a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-suppression-rule) [Validate a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-suppression-rule) [Delete a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-suppression-rule) [Update a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-suppression-rule) #### CI Visibility Pipelines, CI Visibility Tests, Test Optimization Scope name Description Endpoints that require this scope ci_visibility_read View CI Visibility. [Aggregate pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#aggregate-pipelines-events) [Get a list of pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#get-a-list-of-pipelines-events) [Search pipelines events](https://docs.datadoghq.com/api/latest/ci-visibility-pipelines/#search-pipelines-events) [Aggregate tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#aggregate-tests-events) [Get a list of tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#get-a-list-of-tests-events) [Search tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#search-tests-events) test_optimization_read View Test Optimization. [Aggregate tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#aggregate-tests-events) [Get a list of tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#get-a-list-of-tests-events) [Search tests events](https://docs.datadoghq.com/api/latest/ci-visibility-tests/#search-tests-events) [Search flaky tests](https://docs.datadoghq.com/api/latest/test-optimization/#search-flaky-tests) #### Case Management, Error Tracking Scope name Description Endpoints that require this scope cases_read View Cases. [Search cases](https://docs.datadoghq.com/api/latest/case-management/#search-cases) [Get all projects](https://docs.datadoghq.com/api/latest/case-management/#get-all-projects) [Get the details of a project](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-project) [Get the details of a case](https://docs.datadoghq.com/api/latest/case-management/#get-the-details-of-a-case) [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue) [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue) cases_write Create and update cases. 
[Create a case](https://docs.datadoghq.com/api/latest/case-management/#create-a-case) [Create a project](https://docs.datadoghq.com/api/latest/case-management/#create-a-project) [Remove a project](https://docs.datadoghq.com/api/latest/case-management/#remove-a-project) [Archive case](https://docs.datadoghq.com/api/latest/case-management/#archive-case) [Assign case](https://docs.datadoghq.com/api/latest/case-management/#assign-case) [Update case attributes](https://docs.datadoghq.com/api/latest/case-management/#update-case-attributes) [Delete custom attribute from case](https://docs.datadoghq.com/api/latest/case-management/#delete-custom-attribute-from-case) [Update case custom attribute](https://docs.datadoghq.com/api/latest/case-management/#update-case-custom-attribute) [Update case description](https://docs.datadoghq.com/api/latest/case-management/#update-case-description) [Update case priority](https://docs.datadoghq.com/api/latest/case-management/#update-case-priority) [Update case status](https://docs.datadoghq.com/api/latest/case-management/#update-case-status) [Update case title](https://docs.datadoghq.com/api/latest/case-management/#update-case-title) [Unarchive case](https://docs.datadoghq.com/api/latest/case-management/#unarchive-case) [Unassign case](https://docs.datadoghq.com/api/latest/case-management/#unassign-case) [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue) [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue) error_tracking_read Read Error Tracking data. [Search error tracking issues](https://docs.datadoghq.com/api/latest/error-tracking/#search-error-tracking-issues) [Get the details of an error tracking issue](https://docs.datadoghq.com/api/latest/error-tracking/#get-the-details-of-an-error-tracking-issue) [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue) [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue) [Update the state of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-state-of-an-issue) error_tracking_write Edit Error Tracking issues. [Remove the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#remove-the-assignee-of-an-issue) [Update the assignee of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-assignee-of-an-issue) [Update the state of an issue](https://docs.datadoghq.com/api/latest/error-tracking/#update-the-state-of-an-issue) #### Cloud Cost Management Scope name Description Endpoints that require this scope cloud_cost_management_read View Cloud Cost pages and the cloud cost data source in dashboards and notebooks. For more details, see the Cloud Cost Management docs. 
[List custom allocation rules](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-allocation-rules) [Get custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-allocation-rule) [List Cloud Cost Management AWS CUR configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-aws-cur-configs) [Get cost AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-aws-cur-config) [List Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-cloud-cost-management-azure-configs) [Get cost Azure UC config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-cost-azure-uc-config) [Get a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-budget) [List budgets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-budgets) [List Custom Costs files](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-custom-costs-files) [Get Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-custom-costs-file) [List Google Cloud Usage Cost configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-google-cloud-usage-cost-configs) [Get Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-google-cloud-usage-cost-config) [List tag pipeline rulesets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#list-tag-pipeline-rulesets) [Validate query](https://docs.datadoghq.com/api/latest/cloud-cost-management/#validate-query) [Get a tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#get-a-tag-pipeline-ruleset) cloud_cost_management_write Configure cloud cost accounts and global customizations. For more details, see the Cloud Cost Management docs. 
[Create custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-custom-allocation-rule) [Reorder custom allocation rules](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-custom-allocation-rules) [Delete custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-allocation-rule) [Update custom allocation rule](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-custom-allocation-rule) [Create Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-aws-cur-config) [Delete Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-aws-cur-config) [Update Cloud Cost Management AWS CUR config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-aws-cur-config) [Create Cloud Cost Management Azure configs](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-cloud-cost-management-azure-configs) [Delete Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-cloud-cost-management-azure-config) [Update Cloud Cost Management Azure config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-cloud-cost-management-azure-config) [Create or update a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-or-update-a-budget) [Delete a budget](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-a-budget) [Upload Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#upload-custom-costs-file) [Delete Custom Costs file](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-custom-costs-file) [Create Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-google-cloud-usage-cost-config) [Delete Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-google-cloud-usage-cost-config) [Update Google Cloud Usage Cost config](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-google-cloud-usage-cost-config) [Create tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#create-tag-pipeline-ruleset) [Reorder tag pipeline rulesets](https://docs.datadoghq.com/api/latest/cloud-cost-management/#reorder-tag-pipeline-rulesets) [Delete tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#delete-tag-pipeline-ruleset) [Update tag pipeline ruleset](https://docs.datadoghq.com/api/latest/cloud-cost-management/#update-tag-pipeline-ruleset) #### Dashboard Lists, Dashboards, Powerpack Scope name Description Endpoints that require this scope dashboards_read View dashboards. 
[Get all dashboards](https://docs.datadoghq.com/api/latest/dashboards/#get-all-dashboards) [Get all dashboard lists](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-all-dashboard-lists) [Get a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-a-dashboard-list) [Get a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-a-shared-dashboard) [Get a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-a-dashboard) [Get items of a Dashboard List](https://docs.datadoghq.com/api/latest/dashboard-lists/#get-items-of-a-dashboard-list) [Get all powerpacks](https://docs.datadoghq.com/api/latest/powerpack/#get-all-powerpacks) [Get a Powerpack](https://docs.datadoghq.com/api/latest/powerpack/#get-a-powerpack) dashboards_write Create and change dashboards. [Delete dashboards](https://docs.datadoghq.com/api/latest/dashboards/#delete-dashboards) [Restore deleted dashboards](https://docs.datadoghq.com/api/latest/dashboards/#restore-deleted-dashboards) [Create a new dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-new-dashboard) [Create a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#create-a-dashboard-list) [Delete a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#delete-a-dashboard-list) [Update a dashboard list](https://docs.datadoghq.com/api/latest/dashboard-lists/#update-a-dashboard-list) [Delete a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#delete-a-dashboard) [Update a dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-dashboard) [Create a new powerpack](https://docs.datadoghq.com/api/latest/powerpack/#create-a-new-powerpack) [Delete a powerpack](https://docs.datadoghq.com/api/latest/powerpack/#delete-a-powerpack) [Update a powerpack](https://docs.datadoghq.com/api/latest/powerpack/#update-a-powerpack) dashboards_embed_share Create, modify, and delete shared dashboards with share type 'embed'. [Create a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-shared-dashboard) [Revoke a shared dashboard URL](https://docs.datadoghq.com/api/latest/dashboards/#revoke-a-shared-dashboard-url) [Update a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-shared-dashboard) dashboards_invite_share Create, modify, and delete shared dashboards with share type 'invite'. [Create a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-shared-dashboard) [Revoke a shared dashboard URL](https://docs.datadoghq.com/api/latest/dashboards/#revoke-a-shared-dashboard-url) [Update a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-shared-dashboard) [Revoke shared dashboard invitations](https://docs.datadoghq.com/api/latest/dashboards/#revoke-shared-dashboard-invitations) [Get all invitations for a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#get-all-invitations-for-a-shared-dashboard) [Send shared dashboard invitation email](https://docs.datadoghq.com/api/latest/dashboards/#send-shared-dashboard-invitation-email) dashboards_public_share Generate public and authenticated links to share dashboards or embeddable graphs externally. 
[Create a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#create-a-shared-dashboard) [Revoke a shared dashboard URL](https://docs.datadoghq.com/api/latest/dashboards/#revoke-a-shared-dashboard-url) [Update a shared dashboard](https://docs.datadoghq.com/api/latest/dashboards/#update-a-shared-dashboard) #### Datasets, Roles, Users Scope name Description Endpoints that require this scope user_access_manage Disable users, manage user roles, manage SAML-to-role mappings, and configure logs restriction queries. [Create a dataset](https://docs.datadoghq.com/api/latest/datasets/#create-a-dataset) [Delete a dataset](https://docs.datadoghq.com/api/latest/datasets/#delete-a-dataset) [Edit a dataset](https://docs.datadoghq.com/api/latest/datasets/#edit-a-dataset) [Create role](https://docs.datadoghq.com/api/latest/roles/#create-role) [Delete role](https://docs.datadoghq.com/api/latest/roles/#delete-role) [Update a role](https://docs.datadoghq.com/api/latest/roles/#update-a-role) [Create a new role by cloning an existing role](https://docs.datadoghq.com/api/latest/roles/#create-a-new-role-by-cloning-an-existing-role) [Revoke permission](https://docs.datadoghq.com/api/latest/roles/#revoke-permission) [Grant permission to a role](https://docs.datadoghq.com/api/latest/roles/#grant-permission-to-a-role) [Remove a user from a role](https://docs.datadoghq.com/api/latest/roles/#remove-a-user-from-a-role) [Add a user to a role](https://docs.datadoghq.com/api/latest/roles/#add-a-user-to-a-role) [Disable a user](https://docs.datadoghq.com/api/latest/users/#disable-a-user) [Update a user](https://docs.datadoghq.com/api/latest/users/#update-a-user) user_access_read View users and their roles and settings. [List all users](https://docs.datadoghq.com/api/latest/users/#list-all-users) [Get all datasets](https://docs.datadoghq.com/api/latest/datasets/#get-all-datasets) [Get a single dataset by ID](https://docs.datadoghq.com/api/latest/datasets/#get-a-single-dataset-by-id) [List permissions](https://docs.datadoghq.com/api/latest/roles/#list-permissions) [List roles](https://docs.datadoghq.com/api/latest/roles/#list-roles) [List role templates](https://docs.datadoghq.com/api/latest/roles/#list-role-templates) [Get a role](https://docs.datadoghq.com/api/latest/roles/#get-a-role) [List permissions for a role](https://docs.datadoghq.com/api/latest/roles/#list-permissions-for-a-role) [Get all users of a role](https://docs.datadoghq.com/api/latest/roles/#get-all-users-of-a-role) [List all users](https://docs.datadoghq.com/api/latest/users/#list-all-users) [Get user details](https://docs.datadoghq.com/api/latest/users/#get-user-details) [Get a user permissions](https://docs.datadoghq.com/api/latest/users/#get-a-user-permissions) user_access_invite Invite other users to your organization. [Send invitation emails](https://docs.datadoghq.com/api/latest/users/#send-invitation-emails) [Get a user invitation](https://docs.datadoghq.com/api/latest/users/#get-a-user-invitation) [Create a user](https://docs.datadoghq.com/api/latest/users/#create-a-user) #### Domain Allowlist, Downtimes, Monitors Scope name Description Endpoints that require this scope monitors_write Edit, delete, and resolve individual monitors. 
[Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) org_management Edit org configurations, including authentication and certain security preferences such as configuring SAML, renaming an org, configuring allowed login methods, creating child orgs, subscribing & unsubscribing from apps in the marketplace, and enabling & disabling Remote Configuration for the entire organization. [Create AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-scan-options) [Delete AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-aws-scan-options) [Update AWS scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-aws-scan-options) [Create Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-azure-scan-options) [Delete Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-azure-scan-options) [Update Azure scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-azure-scan-options) [Create GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-gcp-scan-options) [Delete GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#delete-gcp-scan-options) [Update GCP scan options](https://docs.datadoghq.com/api/latest/agentless-scanning/#update-gcp-scan-options) [Create AWS on demand task](https://docs.datadoghq.com/api/latest/agentless-scanning/#create-aws-on-demand-task) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) [Get IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#get-ip-allowlist) [Update IP Allowlist](https://docs.datadoghq.com/api/latest/ip-allowlist/#update-ip-allowlist) monitors_downtime Set downtimes to suppress alerts from any monitor in an organization. Mute and unmute monitors. The ability to write monitors is not required to set downtimes. 
[Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel downtimes by scope](https://docs.datadoghq.com/api/latest/downtimes/#cancel-downtimes-by-scope) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) monitors_read View monitors. [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Get all monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors) [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted) [Monitors group search](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search) [Monitors search](https://docs.datadoghq.com/api/latest/monitors/#monitors-search) [Validate a monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor) [Get a monitor's details](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) [Validate an existing monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor) [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules) [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule) [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies) [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy) [Get all monitor user templates](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates) [Get a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template) #### Downtimes, Monitors Scope name Description Endpoints that require this scope monitors_downtime Set downtimes to suppress alerts from any monitor in an organization. Mute and unmute monitors. The ability to write monitors is not required to set downtimes. 
[Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel downtimes by scope](https://docs.datadoghq.com/api/latest/downtimes/#cancel-downtimes-by-scope) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Schedule a downtime](https://docs.datadoghq.com/api/latest/downtimes/#schedule-a-downtime) [Cancel a downtime](https://docs.datadoghq.com/api/latest/downtimes/#cancel-a-downtime) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Update a downtime](https://docs.datadoghq.com/api/latest/downtimes/#update-a-downtime) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) monitors_read View monitors. [Get all downtimes](https://docs.datadoghq.com/api/latest/downtimes/#get-all-downtimes) [Get a downtime](https://docs.datadoghq.com/api/latest/downtimes/#get-a-downtime) [Get all monitors](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitors) [Check if a monitor can be deleted](https://docs.datadoghq.com/api/latest/monitors/#check-if-a-monitor-can-be-deleted) [Monitors group search](https://docs.datadoghq.com/api/latest/monitors/#monitors-group-search) [Monitors search](https://docs.datadoghq.com/api/latest/monitors/#monitors-search) [Validate a monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-a-monitor) [Get a monitor's details](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitors-details) [Get active downtimes for a monitor](https://docs.datadoghq.com/api/latest/downtimes/#get-active-downtimes-for-a-monitor) [Validate an existing monitor](https://docs.datadoghq.com/api/latest/monitors/#validate-an-existing-monitor) [Get all monitor notification rules](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-notification-rules) [Get a monitor notification rule](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-notification-rule) [Get all monitor configuration policies](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-configuration-policies) [Get a monitor configuration policy](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-configuration-policy) [Get all monitor user templates](https://docs.datadoghq.com/api/latest/monitors/#get-all-monitor-user-templates) [Get a monitor user template](https://docs.datadoghq.com/api/latest/monitors/#get-a-monitor-user-template) monitors_write Edit, delete, and resolve individual monitors. [Create a monitor](https://docs.datadoghq.com/api/latest/monitors/#create-a-monitor) [Delete a monitor](https://docs.datadoghq.com/api/latest/monitors/#delete-a-monitor) [Edit a monitor](https://docs.datadoghq.com/api/latest/monitors/#edit-a-monitor) [Mute a monitor](https://docs.datadoghq.com/api/latest/monitors/#mute-a-monitor) [Unmute a monitor](https://docs.datadoghq.com/api/latest/monitors/#unmute-a-monitor) [Get Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#get-domain-allowlist) [Sets Domain Allowlist](https://docs.datadoghq.com/api/latest/domain-allowlist/#sets-domain-allowlist) #### Events Scope name Description Endpoints that require this scope events_read Read Events data. 
[Get a list of events](https://docs.datadoghq.com/api/latest/events/#get-a-list-of-events) [Get an event](https://docs.datadoghq.com/api/latest/events/#get-an-event) [Get a list of events](https://docs.datadoghq.com/api/latest/events/#get-a-list-of-events) [Get an event](https://docs.datadoghq.com/api/latest/events/#get-an-event) #### Hosts Scope name Description Endpoints that require this scope hosts_read List hosts and their attributes. [Get all hosts for your organization](https://docs.datadoghq.com/api/latest/hosts/#get-all-hosts-for-your-organization) [Get the total number of active hosts](https://docs.datadoghq.com/api/latest/hosts/#get-the-total-number-of-active-hosts) #### Incident Services, Incident Teams, Incidents Scope name Description Endpoints that require this scope incident_read View incidents in Datadog. [Get a list of incidents](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incidents) [Get incident notification template](https://docs.datadoghq.com/api/latest/incidents/#get-incident-notification-template) [Get a list of incident types](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-incident-types) [Get incident type details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-type-details) [Search for incidents](https://docs.datadoghq.com/api/latest/incidents/#search-for-incidents) [Get the details of an incident](https://docs.datadoghq.com/api/latest/incidents/#get-the-details-of-an-incident) [List an incident's impacts](https://docs.datadoghq.com/api/latest/incidents/#list-an-incidents-impacts) [Get a list of an incident's integration metadata](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-integration-metadata) [Get incident integration metadata details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-integration-metadata-details) [Get a list of an incident's todos](https://docs.datadoghq.com/api/latest/incidents/#get-a-list-of-an-incidents-todos) [Get incident todo details](https://docs.datadoghq.com/api/latest/incidents/#get-incident-todo-details) [Get a list of all incident services](https://docs.datadoghq.com/api/latest/incident-services/#get-a-list-of-all-incident-services) [Get details of an incident service](https://docs.datadoghq.com/api/latest/incident-services/#get-details-of-an-incident-service) [Get a list of all incident teams](https://docs.datadoghq.com/api/latest/incident-teams/#get-a-list-of-all-incident-teams) [Get details of an incident team](https://docs.datadoghq.com/api/latest/incident-teams/#get-details-of-an-incident-team) incident_settings_write Configure Incident Settings. 
[Create an incident type](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-type) [Delete an incident type](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-type) [Update an incident type](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-type) [Create a new incident service](https://docs.datadoghq.com/api/latest/incident-services/#create-a-new-incident-service) [Delete an existing incident service](https://docs.datadoghq.com/api/latest/incident-services/#delete-an-existing-incident-service) [Update an existing incident service](https://docs.datadoghq.com/api/latest/incident-services/#update-an-existing-incident-service) [Create a new incident team](https://docs.datadoghq.com/api/latest/incident-teams/#create-a-new-incident-team) [Delete an existing incident team](https://docs.datadoghq.com/api/latest/incident-teams/#delete-an-existing-incident-team) [Update an existing incident team](https://docs.datadoghq.com/api/latest/incident-teams/#update-an-existing-incident-team) incident_notification_settings_read View Incident Notification Rule Settings. [Get an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#get-an-incident-notification-rule) incident_notification_settings_write Configure Incidents Notification Rule settings. [Create an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-notification-rule) [Delete an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-notification-rule) [Update an incident notification rule](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-notification-rule) [Create incident notification template](https://docs.datadoghq.com/api/latest/incidents/#create-incident-notification-template) [Delete a notification template](https://docs.datadoghq.com/api/latest/incidents/#delete-a-notification-template) [Update incident notification template](https://docs.datadoghq.com/api/latest/incidents/#update-incident-notification-template) incident_write Create, view, and manage incidents in Datadog. 
[Create an incident](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident) [Get incident notification template](https://docs.datadoghq.com/api/latest/incidents/#get-incident-notification-template) [Delete an existing incident](https://docs.datadoghq.com/api/latest/incidents/#delete-an-existing-incident) [Update an existing incident](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident) [Create incident attachment](https://docs.datadoghq.com/api/latest/incidents/#create-incident-attachment) [Delete incident attachment](https://docs.datadoghq.com/api/latest/incidents/#delete-incident-attachment) [Update incident attachment](https://docs.datadoghq.com/api/latest/incidents/#update-incident-attachment) [Create an incident impact](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-impact) [Delete an incident impact](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-impact) [Create an incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-integration-metadata) [Delete an incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-integration-metadata) [Update an existing incident integration metadata](https://docs.datadoghq.com/api/latest/incidents/#update-an-existing-incident-integration-metadata) [Create an incident todo](https://docs.datadoghq.com/api/latest/incidents/#create-an-incident-todo) [Delete an incident todo](https://docs.datadoghq.com/api/latest/incidents/#delete-an-incident-todo) [Update an incident todo](https://docs.datadoghq.com/api/latest/incidents/#update-an-incident-todo) #### Metrics Scope name Description Endpoints that require this scope metrics_read View custom metrics. [Get active metrics list](https://docs.datadoghq.com/api/latest/metrics/#get-active-metrics-list) [Get metric metadata](https://docs.datadoghq.com/api/latest/metrics/#get-metric-metadata) [Search metrics](https://docs.datadoghq.com/api/latest/metrics/#search-metrics) [Get a list of metrics](https://docs.datadoghq.com/api/latest/metrics/#get-a-list-of-metrics) [List tags by metric name](https://docs.datadoghq.com/api/latest/metrics/#list-tags-by-metric-name) [List tag configuration by name](https://docs.datadoghq.com/api/latest/metrics/#list-tag-configuration-by-name) timeseries_query Query Timeseries data. [Query timeseries points](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-points) [Query scalar data across multiple products](https://docs.datadoghq.com/api/latest/metrics/#query-scalar-data-across-multiple-products) [Query timeseries data across multiple products](https://docs.datadoghq.com/api/latest/metrics/#query-timeseries-data-across-multiple-products) #### Org Connections Scope name Description Endpoints that require this scope org_connections_read Read cross organization connections. [List Org Connections](https://docs.datadoghq.com/api/latest/org-connections/#list-org-connections) org_connections_write Create, edit, and delete cross organization connections. 
[Create Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#create-org-connection) [Delete Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#delete-org-connection) [Update Org Connection](https://docs.datadoghq.com/api/latest/org-connections/#update-org-connection) #### Service Definition, Service Scorecards, Software Catalog Scope name Description Endpoints that require this scope apm_service_catalog_read View service catalog and service definitions. [Get a list of entities](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entities) [Preview catalog entities](https://docs.datadoghq.com/api/latest/software-catalog/#preview-catalog-entities) [Get a list of entity kinds](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-kinds) [Get a list of entity relations](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-relations) [List all rule outcomes](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rule-outcomes) [List all rules](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rules) [Get all service definitions](https://docs.datadoghq.com/api/latest/service-definition/#get-all-service-definitions) [Get a single service definition](https://docs.datadoghq.com/api/latest/service-definition/#get-a-single-service-definition) apm_service_catalog_write Add, modify, and delete service catalog definitions when those definitions are maintained by Datadog. [Create or update entities](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-entities) [Delete a single entity](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-entity) [Create or update kinds](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-kinds) [Delete a single kind](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-kind) [Update Scorecard outcomes asynchronously](https://docs.datadoghq.com/api/latest/service-scorecards/#update-scorecard-outcomes-asynchronously) [Create outcomes batch](https://docs.datadoghq.com/api/latest/service-scorecards/#create-outcomes-batch) [Create a new rule](https://docs.datadoghq.com/api/latest/service-scorecards/#create-a-new-rule) [Delete a rule](https://docs.datadoghq.com/api/latest/service-scorecards/#delete-a-rule) [Update an existing rule](https://docs.datadoghq.com/api/latest/service-scorecards/#update-an-existing-rule) [Create or update service definition](https://docs.datadoghq.com/api/latest/service-definition/#create-or-update-service-definition) [Delete a single service definition](https://docs.datadoghq.com/api/latest/service-definition/#delete-a-single-service-definition) #### Service Level Objective Corrections, Service Level Objectives Scope name Description Endpoints that require this scope slos_corrections Apply, edit, and delete SLO status corrections. A user with this permission can make status corrections, even if they do not have permission to edit those SLOs. [Create an SLO correction](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#create-an-slo-correction) slos_read View SLOs and status corrections. 
[Get all SLOs](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos) [Check if SLOs can be safely deleted](https://docs.datadoghq.com/api/latest/service-level-objectives/#check-if-slos-can-be-safely-deleted) [Get all SLO corrections](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#get-all-slo-corrections) [Search for SLOs](https://docs.datadoghq.com/api/latest/service-level-objectives/#search-for-slos) [Get an SLO's details](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-details) [Get Corrections For an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-corrections-for-an-slo) [Get an SLO's history](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-history) [Create a new SLO report](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-a-new-slo-report) [Get SLO report](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report) [Get SLO report status](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report-status) slos_write Create, edit, and delete SLOs. [Create an SLO object](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-an-slo-object) [Bulk Delete SLO Timeframes](https://docs.datadoghq.com/api/latest/service-level-objectives/#bulk-delete-slo-timeframes) [Delete an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#delete-an-slo) [Update an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#update-an-slo) #### Teams Scope name Description Endpoints that require this scope teams_manage Manage Teams. Create, delete, rename, and edit metadata of all Teams. To control Team membership across all Teams, use the User Access Manage permission. [Create a team](https://docs.datadoghq.com/api/latest/teams/#create-a-team) [Create a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#create-a-team-hierarchy-link) [Remove a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-hierarchy-link) [Link Teams with GitHub Teams](https://docs.datadoghq.com/api/latest/teams/#link-teams-with-github-teams) [Remove a team](https://docs.datadoghq.com/api/latest/teams/#remove-a-team) teams_read Read Teams data. A User with this permission can view Team names, metadata, and which Users are on each Team. 
[Get all teams](https://docs.datadoghq.com/api/latest/teams/#get-all-teams) [Create a team](https://docs.datadoghq.com/api/latest/teams/#create-a-team) [Get team hierarchy links](https://docs.datadoghq.com/api/latest/teams/#get-team-hierarchy-links) [Create a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#create-a-team-hierarchy-link) [Remove a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-hierarchy-link) [Get a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#get-a-team-hierarchy-link) [Delete team connections](https://docs.datadoghq.com/api/latest/teams/#delete-team-connections) [List team connections](https://docs.datadoghq.com/api/latest/teams/#list-team-connections) [Create team connections](https://docs.datadoghq.com/api/latest/teams/#create-team-connections) [Get team sync configurations](https://docs.datadoghq.com/api/latest/teams/#get-team-sync-configurations) [Get all member teams](https://docs.datadoghq.com/api/latest/teams/#get-all-member-teams) [Add a member team](https://docs.datadoghq.com/api/latest/teams/#add-a-member-team) [Remove a member team](https://docs.datadoghq.com/api/latest/teams/#remove-a-member-team) [Remove a team](https://docs.datadoghq.com/api/latest/teams/#remove-a-team) [Get a team](https://docs.datadoghq.com/api/latest/teams/#get-a-team) [Update a team](https://docs.datadoghq.com/api/latest/teams/#update-a-team) [Get links for a team](https://docs.datadoghq.com/api/latest/teams/#get-links-for-a-team) [Create a team link](https://docs.datadoghq.com/api/latest/teams/#create-a-team-link) [Remove a team link](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-link) [Get a team link](https://docs.datadoghq.com/api/latest/teams/#get-a-team-link) [Update a team link](https://docs.datadoghq.com/api/latest/teams/#update-a-team-link) [Get team memberships](https://docs.datadoghq.com/api/latest/teams/#get-team-memberships) [Add a user to a team](https://docs.datadoghq.com/api/latest/teams/#add-a-user-to-a-team) [Remove a user from a team](https://docs.datadoghq.com/api/latest/teams/#remove-a-user-from-a-team) [Update a user's membership attributes on a team](https://docs.datadoghq.com/api/latest/teams/#update-a-users-membership-attributes-on-a-team) [Get team notification rules](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rules) [Create team notification rule](https://docs.datadoghq.com/api/latest/teams/#create-team-notification-rule) [Delete team notification rule](https://docs.datadoghq.com/api/latest/teams/#delete-team-notification-rule) [Get team notification rule](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rule) [Update team notification rule](https://docs.datadoghq.com/api/latest/teams/#update-team-notification-rule) [Get permission settings for a team](https://docs.datadoghq.com/api/latest/teams/#get-permission-settings-for-a-team) [Update permission setting for team](https://docs.datadoghq.com/api/latest/teams/#update-permission-setting-for-team) [Get user memberships](https://docs.datadoghq.com/api/latest/teams/#get-user-memberships) #### Usage Metering Scope name Description Endpoints that require this scope billing_read View your organization's billing information. 
[Get Monthly Cost Attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-cost-attribution) [Get cost across multi-org account](https://docs.datadoghq.com/api/latest/usage-metering/#get-cost-across-multi-org-account) [Get estimated cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-estimated-cost-across-your-account) [Get historical cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-historical-cost-across-your-account) [Get projected cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-projected-cost-across-your-account) usage_read View your organization's usage and usage attribution. [Get hourly usage for analyzed logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-analyzed-logs) [Get hourly usage for audit logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-audit-logs) [Get hourly usage for Lambda](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda) [Get billable usage across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-billable-usage-across-your-account) [Get hourly usage for CI visibility](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ci-visibility) [Get hourly usage for CSM Pro](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-csm-pro) [Get hourly usage for cloud workload security](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-cloud-workload-security) [Get hourly usage for database monitoring](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-database-monitoring) [Get hourly usage for Fargate](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-fargate) [Get hourly usage for hosts and containers](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-hosts-and-containers) [Get hourly usage attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-attribution) [Get hourly usage for incident management](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-incident-management) [Get hourly usage for indexed spans](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-indexed-spans) [Get hourly usage for ingested spans](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ingested-spans) [Get hourly usage for IoT](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-iot) [Get hourly usage for logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs) [Get hourly logs usage by retention](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-logs-usage-by-retention) [Get hourly usage for logs by index](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs-by-index) [Get monthly usage attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-usage-attribution) [get hourly usage for network flows](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-flows) [Get hourly usage for network hosts](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-hosts) [Get hourly usage for online archive](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-online-archive) [Get hourly usage for profiled 
hosts](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-profiled-hosts) [Get hourly usage for RUM units](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-units) [Get hourly usage for RUM sessions](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-sessions) [Get hourly usage for sensitive data scanner](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-sensitive-data-scanner) [Get hourly usage for SNMP devices](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-snmp-devices) [Get usage across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-usage-across-your-account) [Get hourly usage for synthetics checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-checks) [Get hourly usage for synthetics API checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-api-checks) [Get hourly usage for synthetics browser checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-browser-checks) [Get hourly usage for custom metrics](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-custom-metrics) [Get all custom metrics by hourly average](https://docs.datadoghq.com/api/latest/usage-metering/#get-all-custom-metrics-by-hourly-average) [Get active billing dimensions for cost attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-active-billing-dimensions-for-cost-attribution) [Get Monthly Cost Attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-cost-attribution) [Get hourly usage for application security](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-application-security) [Get billing dimension mapping for usage endpoints](https://docs.datadoghq.com/api/latest/usage-metering/#get-billing-dimension-mapping-for-usage-endpoints) [Get cost across multi-org account](https://docs.datadoghq.com/api/latest/usage-metering/#get-cost-across-multi-org-account) [Get estimated cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-estimated-cost-across-your-account) [Get historical cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-historical-cost-across-your-account) [Get hourly usage by product family](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family) [Get hourly usage for Lambda traced invocations](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda-traced-invocations) [Get hourly usage for observability pipelines](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-observability-pipelines) [Get projected cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-projected-cost-across-your-account) #### Webhooks Integration Scope name Description Endpoints that require this scope create_webhooks Create webhooks integrations. [Create a webhooks integration](https://docs.datadoghq.com/api/latest/webhooks-integration/#create-a-webhooks-integration) #### Workflow Automation Scope name Description Endpoints that require this scope workflows_read View workflows. 
[List workflow instances](https://docs.datadoghq.com/api/latest/workflow-automation/#list-workflow-instances) [Get a workflow instance](https://docs.datadoghq.com/api/latest/workflow-automation/#get-a-workflow-instance) workflows_run Run workflows. [Execute a workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#execute-a-workflow) --- # Source: https://docs.datadoghq.com/api/latest/screenboards/ # Screenboards This endpoint is outdated. Use the [new Dashboard endpoint](https://docs.datadoghq.com/api/latest/dashboards/) instead. --- # Source: https://docs.datadoghq.com/api/latest/security-monitoring # Security Monitoring Create and manage your security rules, signals, filters, and more. See the [Datadog Security page](https://docs.datadoghq.com/security/) for more information.
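Requests to Security Monitoring endpoints are authenticated with the `DD-API-KEY` and `DD-APPLICATION-KEY` headers, or with an OAuth token that carries the relevant scopes from the list above (for suppression rules, `security_monitoring_suppressions_read` to read and `security_monitoring_suppressions_write` to create or edit). The sketch below is a minimal illustration, assuming the `requests` library, `DD_API_KEY`/`DD_APP_KEY` environment variables (hypothetical names), and that "Get all suppression rules" is a GET on the same collection URL used by the Create endpoint documented next; swap the base URL for your Datadog site (ap1, ap2, eu, gov, us3, us5) as needed.

```
# Minimal sketch (not the official client): list suppression rules and treat a
# 403 as a missing key or missing authorization scope. Assumes the `requests`
# library and DD_API_KEY / DD_APP_KEY environment variables (hypothetical names).
import os

import requests

BASE_URL = "https://api.datadoghq.com"  # adjust for your Datadog site
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

resp = requests.get(
    f"{BASE_URL}/api/v2/security_monitoring/configuration/suppressions",
    headers=HEADERS,
)
if resp.status_code == 403:
    # For OAuth apps this usually means the token lacks
    # security_monitoring_suppressions_read; for key auth, check the application key.
    raise SystemExit("403 Forbidden: check credentials and authorization scopes")
resp.raise_for_status()

for rule in resp.json().get("data", []):
    attrs = rule.get("attributes", {})
    print(rule.get("id"), attrs.get("name"), "enabled:", attrs.get("enabled"))
```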
## [Create a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-suppression-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-suppression-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressionshttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressionshttps://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions ### Overview Create a new suppression rule. OAuth apps require the `security_monitoring_suppressions_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) The definition of the new suppression rule. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Object for a single suppression rule. attributes [_required_] object Object containing the attributes of the suppression rule to be created. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. enabled [_required_] boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name [_required_] string The name of the suppression rule. rule_query [_required_] string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and is not triggered. It uses the same syntax as the queries to search signals in the Signals Explorer. tags [string] List of tags associated with the suppression rule. type [_required_] enum The type of the resource. The value should always be `suppressions`. 
Allowed enum values: `suppressions` default: `suppressions` ##### Create a suppression rule returns "OK" response ``` { "data": { "attributes": { "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "start_date": 1637493071000, "expiration_date": 1638443471000, "name": "Example-Security-Monitoring", "rule_query": "type:log_detection source:cloudtrail", "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ] }, "type": "suppressions" } } ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` { "data": { "attributes": { "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "start_date": 1637493071000, "expiration_date": 1638443471000, "name": "Example-Security-Monitoring", "rule_query": "type:log_detection source:cloudtrail", "data_exclusion_query": "account_id:12345" }, "type": "suppressions" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringSuppression-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringSuppression-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringSuppression-403-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringSuppression-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringSuppression-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing a single suppression rule. Field Type Description data object The suppression rule's properties. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp given the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp given the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. 
version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`. Allowed enum values: `suppressions` default: `suppressions` ``` { "data": { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create a suppression rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "start_date": 1637493071000, "expiration_date": 1638443471000, "name": "Example-Security-Monitoring", "rule_query": "type:log_detection source:cloudtrail", "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ] }, "type": "suppressions" } } EOF ``` ##### Create a suppression rule with an exclusion query returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "start_date": 1637493071000, "expiration_date": 1638443471000, "name": "Example-Security-Monitoring", "rule_query": "type:log_detection source:cloudtrail", "data_exclusion_query": "account_id:12345" }, "type": "suppressions" } } EOF ``` ##### Create a suppression rule returns "OK" response ``` // Create a suppression rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSuppressionCreateRequest{ Data: datadogV2.SecurityMonitoringSuppressionCreateData{ Attributes: datadogV2.SecurityMonitoringSuppressionCreateAttributes{ Description: datadog.PtrString("This rule suppresses low-severity signals in staging environments."), Enabled: true, StartDate: datadog.PtrInt64(1637493071000), ExpirationDate: datadog.PtrInt64(1638443471000), Name: "Example-Security-Monitoring", RuleQuery: "type:log_detection source:cloudtrail", SuppressionQuery: datadog.PtrString("env:staging status:low"), Tags: []string{ "technique:T1110-brute-force", "source:cloudtrail", }, }, Type: datadogV2.SECURITYMONITORINGSUPPRESSIONTYPE_SUPPRESSIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration 
:= datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityMonitoringSuppression(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityMonitoringSuppression`:\n%s\n", responseContent) } ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` // Create a suppression rule with an exclusion query returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSuppressionCreateRequest{ Data: datadogV2.SecurityMonitoringSuppressionCreateData{ Attributes: datadogV2.SecurityMonitoringSuppressionCreateAttributes{ Description: datadog.PtrString("This rule suppresses low-severity signals in staging environments."), Enabled: true, StartDate: datadog.PtrInt64(1637493071000), ExpirationDate: datadog.PtrInt64(1638443471000), Name: "Example-Security-Monitoring", RuleQuery: "type:log_detection source:cloudtrail", DataExclusionQuery: datadog.PtrString("account_id:12345"), }, Type: datadogV2.SECURITYMONITORINGSUPPRESSIONTYPE_SUPPRESSIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityMonitoringSuppression(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityMonitoringSuppression`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a suppression rule returns "OK" response ``` // Create a suppression rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateData; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionResponse; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSuppressionCreateRequest body = new SecurityMonitoringSuppressionCreateRequest() .data( new 
SecurityMonitoringSuppressionCreateData() .attributes( new SecurityMonitoringSuppressionCreateAttributes() .description( "This rule suppresses low-severity signals in staging" + " environments.") .enabled(true) .startDate(1637493071000L) .expirationDate(1638443471000L) .name("Example-Security-Monitoring") .ruleQuery("type:log_detection source:cloudtrail") .suppressionQuery("env:staging status:low") .tags( Arrays.asList("technique:T1110-brute-force", "source:cloudtrail"))) .type(SecurityMonitoringSuppressionType.SUPPRESSIONS)); try { SecurityMonitoringSuppressionResponse result = apiInstance.createSecurityMonitoringSuppression(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` // Create a suppression rule with an exclusion query returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateData; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionResponse; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSuppressionCreateRequest body = new SecurityMonitoringSuppressionCreateRequest() .data( new SecurityMonitoringSuppressionCreateData() .attributes( new SecurityMonitoringSuppressionCreateAttributes() .description( "This rule suppresses low-severity signals in staging" + " environments.") .enabled(true) .startDate(1637493071000L) .expirationDate(1638443471000L) .name("Example-Security-Monitoring") .ruleQuery("type:log_detection source:cloudtrail") .dataExclusionQuery("account_id:12345")) .type(SecurityMonitoringSuppressionType.SUPPRESSIONS)); try { SecurityMonitoringSuppressionResponse result = apiInstance.createSecurityMonitoringSuppression(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a suppression rule returns "OK" response ``` """ Create a suppression rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import 
SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_suppression_create_attributes import ( SecurityMonitoringSuppressionCreateAttributes, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_data import ( SecurityMonitoringSuppressionCreateData, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_request import ( SecurityMonitoringSuppressionCreateRequest, ) from datadog_api_client.v2.model.security_monitoring_suppression_type import SecurityMonitoringSuppressionType body = SecurityMonitoringSuppressionCreateRequest( data=SecurityMonitoringSuppressionCreateData( attributes=SecurityMonitoringSuppressionCreateAttributes( description="This rule suppresses low-severity signals in staging environments.", enabled=True, start_date=1637493071000, expiration_date=1638443471000, name="Example-Security-Monitoring", rule_query="type:log_detection source:cloudtrail", suppression_query="env:staging status:low", tags=[ "technique:T1110-brute-force", "source:cloudtrail", ], ), type=SecurityMonitoringSuppressionType.SUPPRESSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_monitoring_suppression(body=body) print(response) ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` """ Create a suppression rule with an exclusion query returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_suppression_create_attributes import ( SecurityMonitoringSuppressionCreateAttributes, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_data import ( SecurityMonitoringSuppressionCreateData, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_request import ( SecurityMonitoringSuppressionCreateRequest, ) from datadog_api_client.v2.model.security_monitoring_suppression_type import SecurityMonitoringSuppressionType body = SecurityMonitoringSuppressionCreateRequest( data=SecurityMonitoringSuppressionCreateData( attributes=SecurityMonitoringSuppressionCreateAttributes( description="This rule suppresses low-severity signals in staging environments.", enabled=True, start_date=1637493071000, expiration_date=1638443471000, name="Example-Security-Monitoring", rule_query="type:log_detection source:cloudtrail", data_exclusion_query="account_id:12345", ), type=SecurityMonitoringSuppressionType.SUPPRESSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_monitoring_suppression(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a suppression rule returns "OK" response ``` # Create a suppression rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateRequest.new({ data: 
DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateAttributes.new({ description: "This rule suppresses low-severity signals in staging environments.", enabled: true, start_date: 1637493071000, expiration_date: 1638443471000, name: "Example-Security-Monitoring", rule_query: "type:log_detection source:cloudtrail", suppression_query: "env:staging status:low", tags: [ "technique:T1110-brute-force", "source:cloudtrail", ], }), type: DatadogAPIClient::V2::SecurityMonitoringSuppressionType::SUPPRESSIONS, }), }) p api_instance.create_security_monitoring_suppression(body) ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` # Create a suppression rule with an exclusion query returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateAttributes.new({ description: "This rule suppresses low-severity signals in staging environments.", enabled: true, start_date: 1637493071000, expiration_date: 1638443471000, name: "Example-Security-Monitoring", rule_query: "type:log_detection source:cloudtrail", data_exclusion_query: "account_id:12345", }), type: DatadogAPIClient::V2::SecurityMonitoringSuppressionType::SUPPRESSIONS, }), }) p api_instance.create_security_monitoring_suppression(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a suppression rule returns "OK" response ``` // Create a suppression rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionType; #[tokio::main] async fn main() { let body = SecurityMonitoringSuppressionCreateRequest::new( SecurityMonitoringSuppressionCreateData::new( SecurityMonitoringSuppressionCreateAttributes::new( true, "Example-Security-Monitoring".to_string(), "type:log_detection source:cloudtrail".to_string(), ) .description( "This rule suppresses low-severity signals in staging environments.".to_string(), ) .expiration_date(1638443471000) .start_date(1637493071000) .suppression_query("env:staging status:low".to_string()) .tags(vec![ "technique:T1110-brute-force".to_string(), "source:cloudtrail".to_string(), ]), SecurityMonitoringSuppressionType::SUPPRESSIONS, ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_monitoring_suppression(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` // Create 
a suppression rule with an exclusion query returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionType; #[tokio::main] async fn main() { let body = SecurityMonitoringSuppressionCreateRequest::new( SecurityMonitoringSuppressionCreateData::new( SecurityMonitoringSuppressionCreateAttributes::new( true, "Example-Security-Monitoring".to_string(), "type:log_detection source:cloudtrail".to_string(), ) .data_exclusion_query("account_id:12345".to_string()) .description( "This rule suppresses low-severity signals in staging environments.".to_string(), ) .expiration_date(1638443471000) .start_date(1637493071000), SecurityMonitoringSuppressionType::SUPPRESSIONS, ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_monitoring_suppression(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a suppression rule returns "OK" response ``` /** * Create a suppression rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSecurityMonitoringSuppressionRequest = { body: { data: { attributes: { description: "This rule suppresses low-severity signals in staging environments.", enabled: true, startDate: 1637493071000, expirationDate: 1638443471000, name: "Example-Security-Monitoring", ruleQuery: "type:log_detection source:cloudtrail", suppressionQuery: "env:staging status:low", tags: ["technique:T1110-brute-force", "source:cloudtrail"], }, type: "suppressions", }, }, }; apiInstance .createSecurityMonitoringSuppression(params) .then((data: v2.SecurityMonitoringSuppressionResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a suppression rule with an exclusion query returns "OK" response ``` /** * Create a suppression rule with an exclusion query returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSecurityMonitoringSuppressionRequest = { body: { data: { attributes: { description: "This rule suppresses low-severity signals in staging environments.", enabled: true, startDate: 1637493071000, expirationDate: 1638443471000, name: "Example-Security-Monitoring", ruleQuery: "type:log_detection source:cloudtrail", dataExclusionQuery: "account_id:12345", }, type: "suppressions", }, }, }; apiInstance .createSecurityMonitoringSuppression(params) .then((data: v2.SecurityMonitoringSuppressionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-suppression-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-suppression-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id} ### Overview Delete a specific suppression rule. OAuth apps require the `security_monitoring_suppressions_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description suppression_id [_required_] string The ID of the suppression rule ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringSuppression-204-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringSuppression-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringSuppression-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringSuppression-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Delete a suppression rule Copy ``` # Path parameters export suppression_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/${suppression_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a suppression rule ``` """ Delete a suppression rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = environ["SUPPRESSION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.delete_security_monitoring_suppression( suppression_id=SUPPRESSION_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a suppression rule ``` # Delete a suppression rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = ENV["SUPPRESSION_DATA_ID"] api_instance.delete_security_monitoring_suppression(SUPPRESSION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a suppression rule ``` // Delete a suppression rule returns "OK" response package main import ( "context" "fmt" "os" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "suppression" in the system SuppressionDataID := os.Getenv("SUPPRESSION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DeleteSecurityMonitoringSuppression(ctx, SuppressionDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a suppression rule ``` // Delete a suppression rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "suppression" in the system String SUPPRESSION_DATA_ID = System.getenv("SUPPRESSION_DATA_ID"); try { apiInstance.deleteSecurityMonitoringSuppression(SUPPRESSION_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#deleteSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a suppression rule ``` // Delete a suppression rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "suppression" in the system let suppression_data_id = std::env::var("SUPPRESSION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_security_monitoring_suppression(suppression_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a suppression rule ``` /** * Delete a suppression rule returns "OK" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "suppression" in the system const SUPPRESSION_DATA_ID = process.env.SUPPRESSION_DATA_ID as string; const params: v2.SecurityMonitoringApiDeleteSecurityMonitoringSuppressionRequest = { suppressionId: SUPPRESSION_DATA_ID, }; apiInstance .deleteSecurityMonitoringSuppression(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppression-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppression-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id} ### Overview Get the details of a specific suppression rule. OAuth apps require the `security_monitoring_suppressions_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description suppression_id [_required_] string The ID of the suppression rule ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSuppression-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSuppression-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSuppression-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSuppression-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing a single suppression rule. Field Type Description data object The suppression rule's properties. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp given the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. 
editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp given the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`. Allowed enum values: `suppressions` default: `suppressions` ``` { "data": { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a suppression rule Copy ``` # Path parameters export suppression_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/${suppression_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a suppression rule ``` """ Get a suppression rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = environ["SUPPRESSION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_monitoring_suppression( suppression_id=SUPPRESSION_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a suppression rule ``` # Get a suppression rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = ENV["SUPPRESSION_DATA_ID"] p api_instance.get_security_monitoring_suppression(SUPPRESSION_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a suppression rule ``` // Get a suppression rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "suppression" in the system String SUPPRESSION_DATA_ID = System.getenv("SUPPRESSION_DATA_ID"); try { SecurityMonitoringSuppressionResponse result = 
apiInstance.getSecurityMonitoringSuppression(SUPPRESSION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a suppression rule ``` // Get a suppression rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "suppression" in the system SuppressionDataID := os.Getenv("SUPPRESSION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityMonitoringSuppression(ctx, SuppressionDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityMonitoringSuppression`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a suppression rule ``` // Get a suppression rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "suppression" in the system let suppression_data_id = std::env::var("SUPPRESSION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_monitoring_suppression(suppression_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a suppression rule ``` /** * Get a suppression rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "suppression" in the system const SUPPRESSION_DATA_ID = 
process.env.SUPPRESSION_DATA_ID as string; const params: v2.SecurityMonitoringApiGetSecurityMonitoringSuppressionRequest = { suppressionId: SUPPRESSION_DATA_ID, }; apiInstance .getSecurityMonitoringSuppression(params) .then((data: v2.SecurityMonitoringSuppressionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a suppression's version history](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppressions-version-history) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-suppressions-version-history-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_historyhttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}/version_history ### Overview Get a suppression’s version history. OAuth apps require the `security_monitoring_suppressions_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description suppression_id [_required_] string The ID of the suppression rule #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionVersionHistory-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionVersionHistory-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionVersionHistory-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionVersionHistory-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response for getting the suppression version history. Field Type Description data object Data for the suppression version history. attributes object Response object containing the version history of a suppression. count int32 The number of suppression versions. data object The version history of a suppression. object A suppression version with a list of updates. changes [object] A list of changes. change string The new value of the field. field string The field that was changed. type enum The type of change. Allowed enum values: `create,update,delete` suppression object The attributes of the suppression rule. 
creation_date int64 A Unix millisecond timestamp given the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp given the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string ID of the suppression. type enum Type of data. Allowed enum values: `suppression_version_history` ``` { "data": { "attributes": { "count": "integer", "data": { "": { "changes": [ { "change": "cloud_provider:aws", "field": "Tags", "type": "string" } ], "suppression": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 } } } }, "id": "string", "type": "string" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a suppression's version history ``` # Path parameters export suppression_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/${suppression_id}/version_history" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a suppression's version history ``` """ Get a suppression's version history returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = environ["SUPPRESSION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_suppression_version_history( suppression_id=SUPPRESSION_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a suppression's version history ``` # Get a suppression's version history returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = ENV["SUPPRESSION_DATA_ID"] p api_instance.get_suppression_version_history(SUPPRESSION_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ```
##### Get a suppression's version history ``` // Get a suppression's version history returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "suppression" in the system SuppressionDataID := os.Getenv("SUPPRESSION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSuppressionVersionHistory(ctx, SuppressionDataID, *datadogV2.NewGetSuppressionVersionHistoryOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSuppressionVersionHistory`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSuppressionVersionHistory`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a suppression's version history ``` // Get a suppression's version history returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.GetSuppressionVersionHistoryResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "suppression" in the system String SUPPRESSION_DATA_ID = System.getenv("SUPPRESSION_DATA_ID"); try { GetSuppressionVersionHistoryResponse result = apiInstance.getSuppressionVersionHistory(SUPPRESSION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSuppressionVersionHistory"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a suppression's version history ``` // Get a suppression's version history returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::GetSuppressionVersionHistoryOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "suppression" in the system let suppression_data_id = std::env::var("SUPPRESSION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_suppression_version_history( suppression_data_id.clone(), GetSuppressionVersionHistoryOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a suppression's version history ``` /** * Get a suppression's version history returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "suppression" in the system const SUPPRESSION_DATA_ID = process.env.SUPPRESSION_DATA_ID as string; const params: v2.SecurityMonitoringApiGetSuppressionVersionHistoryRequest = { suppressionId: SUPPRESSION_DATA_ID, }; apiInstance .getSuppressionVersionHistory(params) .then((data: v2.GetSuppressionVersionHistoryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
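The `page[size]` and `page[number]` query strings documented above are not exercised by the client examples. The following is a minimal sketch of paging through a suppression's version history against the plain HTTP endpoint with the Python `requests` library; it assumes the default `datadoghq.com` site, the `DD_API_KEY`, `DD_APP_KEY`, and `SUPPRESSION_DATA_ID` environment variables, zero-based page numbers, and that an empty `data` map marks the last page (the stopping condition is an assumption, not documented behavior).

```
# Sketch: page through a suppression's version history with the raw HTTP API.
# Assumptions: default datadoghq.com site, zero-based page numbers, and an empty
# "data" map signalling the last page. Prefer the official clients shown above.
import os

import requests

API_ROOT = "https://api.datadoghq.com"  # adjust for your Datadog site
SUPPRESSION_ID = os.environ["SUPPRESSION_DATA_ID"]

headers = {
    "Accept": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
url = (
    f"{API_ROOT}/api/v2/security_monitoring/configuration/suppressions/"
    f"{SUPPRESSION_ID}/version_history"
)

page_number = 0
while True:
    resp = requests.get(
        url,
        headers=headers,
        params={"page[size]": 25, "page[number]": page_number},  # page[size] max is 100
        timeout=30,
    )
    resp.raise_for_status()
    versions = resp.json()["data"]["attributes"].get("data") or {}
    if not versions:
        break
    for version, entry in versions.items():
        changed_fields = [change.get("field") for change in entry.get("changes", [])]
        print(f"version {version}: changed {changed_fields}")
    page_number += 1
```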
* * * ## [Get all suppression rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-suppression-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-suppression-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressionshttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressionshttps://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressionshttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions ### Overview Get the list of all suppression rules. OAuth apps require the `security_monitoring_suppressions_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description query string Query string. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSuppressions-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSuppressions-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSuppressions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing the available suppression rules. Field Type Description data [object] A list of suppression objects. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp giving the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled.
expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp given the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`. Allowed enum values: `suppressions` default: `suppressions` ``` { "data": [ { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get all suppression rules ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all suppression rules ``` """ Get all suppression rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_security_monitoring_suppressions() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all suppression rules ``` # Get all suppression rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_security_monitoring_suppressions() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ```
##### Get all suppression rules ``` // Get all suppression rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListSecurityMonitoringSuppressions(ctx, *datadogV2.NewListSecurityMonitoringSuppressionsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListSecurityMonitoringSuppressions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListSecurityMonitoringSuppressions`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all suppression rules ``` // Get all suppression rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSuppressionsResponse result = apiInstance.listSecurityMonitoringSuppressions(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#listSecurityMonitoringSuppressions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all suppression rules ``` // Get all suppression rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListSecurityMonitoringSuppressionsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_security_monitoring_suppressions( ListSecurityMonitoringSuppressionsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all suppression rules ``` /** * Get all suppression rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listSecurityMonitoringSuppressions() .then((data: v2.SecurityMonitoringSuppressionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
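The listing above returns every suppression rule with its full attributes, so simple audits can be done client-side. The sketch below extends the Python example for this endpoint to flag rules that are still enabled but already past their `expiration_date`; the attribute-style access on the response follows the documented model and the generated Python client's usual conventions, so treat the exact accessors as assumptions and adapt them to the client version you have installed.

```
# Sketch: list all suppression rules and report enabled ones whose expiration
# date has already passed. Field access mirrors the documented response model
# (data[].attributes.enabled / expiration_date) and may need adapting to your
# client version.
import time

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

now_ms = int(time.time() * 1000)  # expiration_date is a Unix millisecond timestamp

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.list_security_monitoring_suppressions()

for suppression in response.data or []:
    attrs = suppression.attributes
    expiration = getattr(attrs, "expiration_date", None)
    if attrs.enabled and expiration and expiration < now_ms:
        print(f"{suppression.id}: '{attrs.name}' is enabled but expired")
```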
* * * ## [Get suppressions affecting a specific rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-a-specific-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-a-specific-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/{rule_id} ### Overview Get the list of suppressions that affect a specific existing rule by its ID. OAuth apps require the `security_monitoring_suppressions_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing the available suppression rules. Field Type Description data [object] A list of suppression objects. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp giving the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules.
start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp given the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`. Allowed enum values: `suppressions` default: `suppressions` ``` { "data": [ { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get suppressions affecting a specific rule ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules/${rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get suppressions affecting a specific rule ``` """ Get suppressions affecting a specific rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "security_rule" in the system SECURITY_RULE_ID = environ["SECURITY_RULE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_suppressions_affecting_rule( rule_id=SECURITY_RULE_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get suppressions affecting a specific rule ``` # Get suppressions affecting a specific rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_rule" in the system SECURITY_RULE_ID = ENV["SECURITY_RULE_ID"] p api_instance.get_suppressions_affecting_rule(SECURITY_RULE_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ```
##### Get suppressions affecting a specific rule ``` // Get suppressions affecting a specific rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_rule" in the system SecurityRuleID := os.Getenv("SECURITY_RULE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSuppressionsAffectingRule(ctx, SecurityRuleID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSuppressionsAffectingRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSuppressionsAffectingRule`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get suppressions affecting a specific rule ``` // Get suppressions affecting a specific rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_rule" in the system String SECURITY_RULE_ID = System.getenv("SECURITY_RULE_ID"); try { SecurityMonitoringSuppressionsResponse result = apiInstance.getSuppressionsAffectingRule(SECURITY_RULE_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSuppressionsAffectingRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get suppressions affecting a specific rule ``` // Get suppressions affecting a specific rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "security_rule" in the system let security_rule_id = std::env::var("SECURITY_RULE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_suppressions_affecting_rule(security_rule_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get suppressions affecting a specific rule ``` /** * Get suppressions affecting a specific rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_rule" in the system const SECURITY_RULE_ID = process.env.SECURITY_RULE_ID as string; const params: v2.SecurityMonitoringApiGetSuppressionsAffectingRuleRequest = { ruleId: SECURITY_RULE_ID, }; apiInstance .getSuppressionsAffectingRule(params) .then((data: v2.SecurityMonitoringSuppressionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
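Because the response reuses the suppression model shown above, this endpoint also works as a quick way to see why a detection rule's signals might be muted. The sketch below builds on the Python example for this endpoint and prints each affecting suppression's name and `suppression_query`; as with the previous sketch, the attribute accessors follow the documented model and should be treated as assumptions.

```
# Sketch: print which suppression rules currently affect a detection rule.
# Builds on the "Get suppressions affecting a specific rule" Python example;
# attribute names follow the documented response model.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

RULE_ID = environ["SECURITY_RULE_ID"]  # a valid detection rule ID

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.get_suppressions_affecting_rule(rule_id=RULE_ID)

suppressions = response.data or []
if not suppressions:
    print(f"No suppression rules affect rule {RULE_ID}")
for suppression in suppressions:
    attrs = suppression.attributes
    state = "enabled" if attrs.enabled else "disabled"
    print(f"{attrs.name} ({state}): suppresses signals matching '{attrs.suppression_query}'")
```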
* * * ## [Get suppressions affecting future rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-future-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-suppressions-affecting-future-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/ruleshttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules ### Overview Get the list of suppressions that would affect a rule. OAuth apps require the `security_monitoring_suppressions_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description Option 1 object Create a new rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action. duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal.
Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. 
This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. 
Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `api_security,application_security,log_detection,workload_security` Option 2 object Create a new signal correlation rule. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. 
Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. 
For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. 
Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting signals which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to group by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId [_required_] string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` Option 3 object Create a new cloud configuration rule. cases [_required_] [object] Description of generated findings and signals (severity and channels to be notified in case of a signal). Must contain exactly one item. notifications [string] Notification targets for each rule case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions [_required_] object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. filters [object] Additional queries to filter matched events before they are processed. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message in markdown format for generated findings and signals. name [_required_] string The name of the rule. options [_required_] object Options on cloud configuration rules. complianceRuleOptions [_required_] object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. tags [string] Tags for generated findings and signals. type enum The rule type. 
Allowed enum values: `cloud_configuration` ``` { "name": "Example-Security-Monitoring", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metrics": [] } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection" } ``` ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingFutureRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingFutureRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingFutureRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSuppressionsAffectingFutureRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing the available suppression rules. Field Type Description data [object] A list of suppression objects. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp giving the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp giving the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`.
Allowed enum values: `suppressions` default: `suppressions` ``` { "data": [ { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get suppressions affecting future rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Security-Monitoring", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metrics": [] } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection" } EOF ``` ##### Get suppressions affecting future rule returns "OK" response ``` // Get suppressions affecting future rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
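	// datadogV2 exposes the v2 SecurityMonitoringApi client and the request models used in this example.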
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleCreatePayload{ SecurityMonitoringStandardRuleCreatePayload: &datadogV2.SecurityMonitoringStandardRuleCreatePayload{ Name: "Example-Security-Monitoring", Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("@test:true"), Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), GroupByFields: []string{}, DistinctFields: []string{}, Metrics: []string{}, }, }, Filters: []datadogV2.SecurityMonitoringFilter{}, Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Condition: datadog.PtrString("a > 0"), Notifications: []string{}, }, }, Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ONE_HOUR.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ONE_DAY.Ptr(), }, Message: "Test rule", Tags: []string{}, IsEnabled: true, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSuppressionsAffectingFutureRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSuppressionsAffectingFutureRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSuppressionsAffectingFutureRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get suppressions affecting future rule returns "OK" response ``` // Get suppressions affecting future rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionsResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleCreatePayload body = new SecurityMonitoringRuleCreatePayload( new SecurityMonitoringStandardRuleCreatePayload() .name("Example-Security-Monitoring") .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("@test:true") .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT))) .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.ONE_HOUR) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.ONE_DAY)) .message("Test rule") .isEnabled(true) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION)); try { SecurityMonitoringSuppressionsResponse result = apiInstance.getSuppressionsAffectingFutureRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSuppressionsAffectingFutureRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get suppressions affecting future rule returns "OK" response ``` """ Get suppressions affecting future rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_rule_create_payload import ( SecurityMonitoringStandardRuleCreatePayload, ) from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRuleCreatePayload( name="Example-Security-Monitoring", queries=[ SecurityMonitoringStandardRuleQuery( query="@test:true", aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, group_by_fields=[], distinct_fields=[], metrics=[], ), ], filters=[], cases=[ SecurityMonitoringRuleCaseCreate( 
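        # Each case pairs a condition on the rule queries (for example "a > 0") with the severity of the signal it raises.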
name="", status=SecurityMonitoringRuleSeverity.INFO, condition="a > 0", notifications=[], ), ], options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.ONE_HOUR, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ONE_DAY, ), message="Test rule", tags=[], is_enabled=True, type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_suppressions_affecting_future_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get suppressions affecting future rule returns "OK" response ``` # Get suppressions affecting future rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRuleCreatePayload.new({ name: "Example-Security-Monitoring", queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "@test:true", aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, group_by_fields: [], distinct_fields: [], metrics: [], }), ], filters: [], cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, condition: "a > 0", notifications: [], }), ], options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ONE_HOUR, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ONE_DAY, }), message: "Test rule", tags: [], is_enabled: true, type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, }) p api_instance.get_suppressions_affecting_future_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get suppressions affecting future rule returns "OK" response ``` // Get suppressions affecting future rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use 
datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleCreatePayload::SecurityMonitoringStandardRuleCreatePayload(Box::new( SecurityMonitoringStandardRuleCreatePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "Test rule".to_string(), "Example-Security-Monitoring".to_string(), SecurityMonitoringRuleOptions::new() .evaluation_window(SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::ONE_HOUR) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ONE_DAY), vec![SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .metrics(vec![]) .query("@test:true".to_string())], ) .filters(vec![]) .tags(vec![]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.get_suppressions_affecting_future_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get suppressions affecting future rule returns "OK" response ``` /** * Get suppressions affecting future rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetSuppressionsAffectingFutureRuleRequest = { body: { name: "Example-Security-Monitoring", queries: [ { query: "@test:true", aggregation: "count", groupByFields: [], distinctFields: [], metrics: [], }, ], filters: [], cases: [ { name: "", status: "info", condition: "a > 0", notifications: [], }, ], options: { evaluationWindow: 900, keepAlive: 3600, maxSignalDuration: 86400, }, message: "Test rule", tags: [], isEnabled: true, type: "log_detection", }, }; apiInstance .getSuppressionsAffectingFutureRule(params) .then((data: v2.SecurityMonitoringSuppressionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-suppression-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-suppression-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/{suppression_id} ### Overview Update a specific suppression rule. OAuth apps require the `security_monitoring_suppressions_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description suppression_id [_required_] string The ID of the suppression rule ### Request #### Body Data (required) New definition of the suppression rule. Supports partial updates. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object The new suppression properties; partial updates are supported. attributes [_required_] object The suppression rule properties to be updated. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. If unset, the expiration date of the suppression rule is left untouched. If set to `null`, the expiration date is removed. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. If unset, the start date of the suppression rule is left untouched. If set to `null`, the start date is removed. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. 
version int32 The current version of the suppression rule. This is optional, but it can help prevent concurrent modifications. type [_required_] enum The type of the resource. The value should always be `suppressions`. Allowed enum values: `suppressions` default: `suppressions` ``` { "data": { "attributes": { "suppression_query": "env:staging status:low" }, "type": "suppressions" } } ``` ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-404-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringSuppression-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object containing a single suppression rule. Field Type Description data object The suppression rule's properties. attributes object The attributes of the suppression rule. creation_date int64 A Unix millisecond timestamp giving the creation date of the suppression rule. creator object A user. handle string The handle of the user. name string The name of the user. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. editable boolean Whether the suppression rule is editable. enabled boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name string The name of the suppression rule. rule_query string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and not triggered. Same syntax as the queries to search signals in the signal explorer. tags [string] List of tags associated with the suppression rule. update_date int64 A Unix millisecond timestamp giving the update date of the suppression rule. updater object A user. handle string The handle of the user. name string The name of the user. version int32 The version of the suppression rule; it starts at 1, and is incremented at each update. id string The ID of the suppression rule. type enum The type of the resource. The value should always be `suppressions`.
Allowed enum values: `suppressions` default: `suppressions` ``` { "data": { "attributes": { "creation_date": "integer", "creator": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "editable": true, "enabled": true, "expiration_date": 1703187336000, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail", "start_date": 1703187336000, "suppression_query": "env:staging status:low", "tags": [ "technique:T1110-brute-force", "source:cloudtrail" ], "update_date": "integer", "updater": { "handle": "john.doe@datadoghq.com", "name": "John Doe" }, "version": 42 }, "id": "3dd-0uc-h1s", "type": "suppressions" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Update a suppression rule returns "OK" response Copy ``` # Path parameters export suppression_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/${suppression_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "suppression_query": "env:staging status:low" }, "type": "suppressions" } } EOF ``` ##### Update a suppression rule returns "OK" response ``` // Update a suppression rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "suppression" in the system SuppressionDataID := os.Getenv("SUPPRESSION_DATA_ID") body := datadogV2.SecurityMonitoringSuppressionUpdateRequest{ Data: datadogV2.SecurityMonitoringSuppressionUpdateData{ Attributes: datadogV2.SecurityMonitoringSuppressionUpdateAttributes{ SuppressionQuery: datadog.PtrString("env:staging status:low"), }, Type: datadogV2.SECURITYMONITORINGSUPPRESSIONTYPE_SUPPRESSIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateSecurityMonitoringSuppression(ctx, SuppressionDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateSecurityMonitoringSuppression`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a suppression rule returns "OK" response ``` // Update a suppression rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionResponse; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionType; import 
com.datadog.api.client.v2.model.SecurityMonitoringSuppressionUpdateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionUpdateData; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "suppression" in the system String SUPPRESSION_DATA_ID = System.getenv("SUPPRESSION_DATA_ID"); SecurityMonitoringSuppressionUpdateRequest body = new SecurityMonitoringSuppressionUpdateRequest() .data( new SecurityMonitoringSuppressionUpdateData() .attributes( new SecurityMonitoringSuppressionUpdateAttributes() .suppressionQuery("env:staging status:low")) .type(SecurityMonitoringSuppressionType.SUPPRESSIONS)); try { SecurityMonitoringSuppressionResponse result = apiInstance.updateSecurityMonitoringSuppression(SUPPRESSION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#updateSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a suppression rule returns "OK" response ``` """ Update a suppression rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_suppression_type import SecurityMonitoringSuppressionType from datadog_api_client.v2.model.security_monitoring_suppression_update_attributes import ( SecurityMonitoringSuppressionUpdateAttributes, ) from datadog_api_client.v2.model.security_monitoring_suppression_update_data import ( SecurityMonitoringSuppressionUpdateData, ) from datadog_api_client.v2.model.security_monitoring_suppression_update_request import ( SecurityMonitoringSuppressionUpdateRequest, ) # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = environ["SUPPRESSION_DATA_ID"] body = SecurityMonitoringSuppressionUpdateRequest( data=SecurityMonitoringSuppressionUpdateData( attributes=SecurityMonitoringSuppressionUpdateAttributes( suppression_query="env:staging status:low", ), type=SecurityMonitoringSuppressionType.SUPPRESSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_security_monitoring_suppression(suppression_id=SUPPRESSION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a suppression rule 
returns "OK" response ``` # Update a suppression rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "suppression" in the system SUPPRESSION_DATA_ID = ENV["SUPPRESSION_DATA_ID"] body = DatadogAPIClient::V2::SecurityMonitoringSuppressionUpdateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSuppressionUpdateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSuppressionUpdateAttributes.new({ suppression_query: "env:staging status:low", }), type: DatadogAPIClient::V2::SecurityMonitoringSuppressionType::SUPPRESSIONS, }), }) p api_instance.update_security_monitoring_suppression(SUPPRESSION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a suppression rule returns "OK" response ``` // Update a suppression rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionType; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionUpdateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionUpdateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionUpdateRequest; #[tokio::main] async fn main() { // there is a valid "suppression" in the system let suppression_data_id = std::env::var("SUPPRESSION_DATA_ID").unwrap(); let body = SecurityMonitoringSuppressionUpdateRequest::new( SecurityMonitoringSuppressionUpdateData::new( SecurityMonitoringSuppressionUpdateAttributes::new() .suppression_query("env:staging status:low".to_string()), SecurityMonitoringSuppressionType::SUPPRESSIONS, ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .update_security_monitoring_suppression(suppression_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a suppression rule returns "OK" response ``` /** * Update a suppression rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "suppression" in the system const SUPPRESSION_DATA_ID = process.env.SUPPRESSION_DATA_ID as string; const params: v2.SecurityMonitoringApiUpdateSecurityMonitoringSuppressionRequest = { body: { data: { attributes: { suppressionQuery: "env:staging status:low", }, type: "suppressions", }, }, suppressionId: SUPPRESSION_DATA_ID, }; apiInstance .updateSecurityMonitoringSuppression(params) .then((data: v2.SecurityMonitoringSuppressionResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate a suppression rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-suppression-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-suppression-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validationhttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validation ### Overview Validate a suppression rule. This endpoint requires the `security_monitoring_suppressions_write` permission. OAuth apps require the `security_monitoring_suppressions_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Object for a single suppression rule. attributes [_required_] object Object containing the attributes of the suppression rule to be created. data_exclusion_query string An exclusion query on the input data of the security rules, which could be logs, Agent events, or other types of data based on the security rule. Events matching this query are ignored by any detection rules referenced in the suppression rule. description string A description for the suppression rule. enabled [_required_] boolean Whether the suppression rule is enabled. expiration_date int64 A Unix millisecond timestamp giving an expiration date for the suppression rule. After this date, it won't suppress signals anymore. name [_required_] string The name of the suppression rule. rule_query [_required_] string The rule query of the suppression rule, with the same syntax as the search bar for detection rules. start_date int64 A Unix millisecond timestamp giving the start date for the suppression rule. After this date, it starts suppressing signals. suppression_query string The suppression query of the suppression rule. If a signal matches this query, it is suppressed and is not triggered. It uses the same syntax as the queries to search signals in the Signals Explorer. tags [string] List of tags associated with the suppression rule. type [_required_] enum The type of the resource. The value should always be `suppressions`. 
Allowed enum values: `suppressions` default: `suppressions` ``` { "data": { "attributes": { "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail" }, "type": "suppressions" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringSuppression-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringSuppression-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringSuppression-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringSuppression-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Validate a suppression rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/suppressions/validation" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "data_exclusion_query": "source:cloudtrail account_id:12345", "description": "This rule suppresses low-severity signals in staging environments.", "enabled": true, "name": "Custom suppression", "rule_query": "type:log_detection source:cloudtrail" }, "type": "suppressions" } } EOF ``` ##### Validate a suppression rule returns "OK" response ``` // Validate a suppression rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSuppressionCreateRequest{ Data: 
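		// Per the request schema, enabled, name, and rule_query are the required attributes; data_exclusion_query and description below are optional.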
datadogV2.SecurityMonitoringSuppressionCreateData{ Attributes: datadogV2.SecurityMonitoringSuppressionCreateAttributes{ DataExclusionQuery: datadog.PtrString("source:cloudtrail account_id:12345"), Description: datadog.PtrString("This rule suppresses low-severity signals in staging environments."), Enabled: true, Name: "Custom suppression", RuleQuery: "type:log_detection source:cloudtrail", }, Type: datadogV2.SECURITYMONITORINGSUPPRESSIONTYPE_SUPPRESSIONS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.ValidateSecurityMonitoringSuppression(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ValidateSecurityMonitoringSuppression`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate a suppression rule returns "OK" response ``` // Validate a suppression rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateData; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionCreateRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSuppressionType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSuppressionCreateRequest body = new SecurityMonitoringSuppressionCreateRequest() .data( new SecurityMonitoringSuppressionCreateData() .attributes( new SecurityMonitoringSuppressionCreateAttributes() .dataExclusionQuery("source:cloudtrail account_id:12345") .description( "This rule suppresses low-severity signals in staging" + " environments.") .enabled(true) .name("Custom suppression") .ruleQuery("type:log_detection source:cloudtrail")) .type(SecurityMonitoringSuppressionType.SUPPRESSIONS)); try { apiInstance.validateSecurityMonitoringSuppression(body); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#validateSecurityMonitoringSuppression"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate a suppression rule returns "OK" response ``` """ Validate a suppression rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from 
datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_suppression_create_attributes import ( SecurityMonitoringSuppressionCreateAttributes, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_data import ( SecurityMonitoringSuppressionCreateData, ) from datadog_api_client.v2.model.security_monitoring_suppression_create_request import ( SecurityMonitoringSuppressionCreateRequest, ) from datadog_api_client.v2.model.security_monitoring_suppression_type import SecurityMonitoringSuppressionType body = SecurityMonitoringSuppressionCreateRequest( data=SecurityMonitoringSuppressionCreateData( attributes=SecurityMonitoringSuppressionCreateAttributes( data_exclusion_query="source:cloudtrail account_id:12345", description="This rule suppresses low-severity signals in staging environments.", enabled=True, name="Custom suppression", rule_query="type:log_detection source:cloudtrail", ), type=SecurityMonitoringSuppressionType.SUPPRESSIONS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.validate_security_monitoring_suppression(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate a suppression rule returns "OK" response ``` # Validate a suppression rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSuppressionCreateAttributes.new({ data_exclusion_query: "source:cloudtrail account_id:12345", description: "This rule suppresses low-severity signals in staging environments.", enabled: true, name: "Custom suppression", rule_query: "type:log_detection source:cloudtrail", }), type: DatadogAPIClient::V2::SecurityMonitoringSuppressionType::SUPPRESSIONS, }), }) api_instance.validate_security_monitoring_suppression(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate a suppression rule returns "OK" response ``` // Validate a suppression rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionCreateRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringSuppressionType; #[tokio::main] async fn main() { let body = SecurityMonitoringSuppressionCreateRequest::new( SecurityMonitoringSuppressionCreateData::new( 
SecurityMonitoringSuppressionCreateAttributes::new( true, "Custom suppression".to_string(), "type:log_detection source:cloudtrail".to_string(), ) .data_exclusion_query("source:cloudtrail account_id:12345".to_string()) .description( "This rule suppresses low-severity signals in staging environments.".to_string(), ), SecurityMonitoringSuppressionType::SUPPRESSIONS, ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.validate_security_monitoring_suppression(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate a suppression rule returns "OK" response ``` /** * Validate a suppression rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiValidateSecurityMonitoringSuppressionRequest = { body: { data: { attributes: { dataExclusionQuery: "source:cloudtrail account_id:12345", description: "This rule suppresses low-severity signals in staging environments.", enabled: true, name: "Custom suppression", ruleQuery: "type:log_detection source:cloudtrail", }, type: "suppressions", }, }, }; apiInstance .validateSecurityMonitoringSuppression(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-findings-v2) **Note** : This endpoint uses the legacy security findings data model and is planned for deprecation. Use the [search security findings endpoint](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings), which is based on the [new security findings schema](https://docs.datadoghq.com/security/guide/findings-schema/), to search security findings. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/posture_management/findingshttps://api.ap2.datadoghq.com/api/v2/posture_management/findingshttps://api.datadoghq.eu/api/v2/posture_management/findingshttps://api.ddog-gov.com/api/v2/posture_management/findingshttps://api.datadoghq.com/api/v2/posture_management/findingshttps://api.us3.datadoghq.com/api/v2/posture_management/findingshttps://api.us5.datadoghq.com/api/v2/posture_management/findings ### Overview Get a list of findings. These include both misconfigurations and identity risks. 
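Before the filter reference below, here is a minimal sketch of querying this endpoint with a few of the documented query parameters. It is an illustration rather than one of the official client examples: the Python `requests` package, the default `datadoghq.com` site, and the specific filter values are assumptions.

```
# Minimal sketch: list findings with a few documented filters.
# Assumes the default datadoghq.com site and the third-party `requests` package;
# DD_API_KEY and DD_APP_KEY follow the same convention as the other examples on this page.
import os

import requests

response = requests.get(
    "https://api.datadoghq.com/api/v2/posture_management/findings",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        "filter[evaluation]": "fail",  # only failing findings (pass or fail)
        "filter[status]": "critical",  # one of: critical, high, medium, low, info
        "page[limit]": 100,            # must be <= 1000
        "detailed_findings": "true",   # adds description, datadog_link, and other extension fields
    },
)
response.raise_for_status()
body = response.json()

print(body["meta"]["page"]["total_filtered_count"], "findings match the filters")
for finding in body["data"]:
    attributes = finding["attributes"]
    print(finding["id"], attributes.get("evaluation"), attributes.get("resource"))
```

The comparison operators and tag filters described in the Filtering section below can be added to the same `params` mapping, for example `"filter[evaluation_changed_at]": ">=1678809373257"`.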
**Note** : To filter and return only identity risks, add the following query parameter: `?filter[tags]=dd_rule_type:ciem` ### [Filtering](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) Filters can be applied by appending query parameters to the URL. * Using a single filter: `?filter[attribute_key]=attribute_value` * Chaining filters: `?filter[attribute_key]=attribute_value&filter[attribute_key]=attribute_value...` * Filtering on tags: `?filter[tags]=tag_key:tag_value&filter[tags]=tag_key_2:tag_value_2` Here, `attribute_key` can be any of the filter keys described further below. Query parameters of type `integer` support comparison operators (`>`, `>=`, `<`, `<=`). This is particularly useful when filtering by `evaluation_changed_at` or `resource_discovery_timestamp`. For example: `?filter[evaluation_changed_at]=>20123123121`. You can also use the negation operator on strings. For example, use `filter[resource_type]=-aws*` to filter for any non-AWS resources. The operator must come after the equal sign. For example, to filter with the `>=` operator, add the operator after the equal sign: `filter[evaluation_changed_at]=>=1678809373257`. Query parameters must be only among the documented ones and with values of correct types. Duplicated query parameters (e.g. `filter[status]=low&filter[status]=info`) are not allowed. ### [Additional extension fields](https://docs.datadoghq.com/api/latest/security-monitoring/#additional-extension-fields) Additional extension fields are available for some findings. The data is available when you include the query parameter `?detailed_findings=true` in the request. The following fields are available for findings: * `external_id`: The resource external ID related to the finding. * `description`: The description and remediation steps for the finding. * `datadog_link`: The Datadog relative link for the finding. * `ip_addresses`: The list of private IP addresses for the resource related to the finding. ### [Response](https://docs.datadoghq.com/api/latest/security-monitoring/#response) The response includes an array of finding objects, pagination metadata, and a count of items that match the query. Each finding object contains the following: * The finding ID that can be used in a `GetFinding` request to retrieve the full finding details. * Core attributes, including status, evaluation, high-level resource details, muted state, and rule details. * `evaluation_changed_at` and `resource_discovery_date` time stamps. * An array of associated tags. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[limit] integer Limit the number of findings returned. Must be <= 1000. snapshot_timestamp integer Return findings for a given snapshot of time (Unix ms). page[cursor] string Return the next page of findings pointed to by the cursor. filter[tags] string Return findings that have these associated tags (repeatable). filter[evaluation_changed_at] string Return findings that have changed from pass to fail or vice versa on a specified date (Unix ms) or date range (using comparison operators). filter[muted] boolean Set to `true` to return findings that are muted. Set to `false` to return unmuted findings. filter[rule_id] string Return findings for the specified rule ID. filter[rule_name] string Return findings for the specified rule. 
filter[resource_type] string Return only findings for the specified resource type. filter[@resource_id] string Return only findings for the specified resource id. filter[discovery_timestamp] string Return findings that were found on a specified date (Unix ms) or date range (using comparison operators). filter[evaluation] enum Return only `pass` or `fail` findings. Allowed enum values: `pass, fail` filter[status] enum Return only findings with the specified status. Allowed enum values: `critical, high, medium, low, info` filter[vulnerability_type] array Return findings that match the selected vulnerability types (repeatable). detailed_findings boolean Return additional fields for some findings. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListFindings-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListFindings-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListFindings-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListFindings-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListFindings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing findings. Field Type Description data [_required_] [object] Array of findings. attributes object The JSON:API attributes of the finding. datadog_link string The Datadog relative link for this finding. description string The description and remediation steps for this finding. evaluation enum The evaluation of the finding. Allowed enum values: `pass,fail` evaluation_changed_at int64 The date on which the evaluation for this finding changed (Unix ms). external_id string The cloud-based ID for the resource related to the finding. mute object Information about the mute status of this finding. description string Additional information about the reason why this finding is muted or unmuted. expiration_date int64 The expiration date of the mute or unmute action (Unix ms). muted boolean Whether this finding is muted or unmuted. reason enum The reason why this finding is muted or unmuted. Allowed enum values: `PENDING_FIX,FALSE_POSITIVE,ACCEPTED_RISK,NO_PENDING_FIX,HUMAN_ERROR,NO_LONGER_ACCEPTED_RISK,OTHER` start_date int64 The start of the mute period. uuid string The ID of the user who muted or unmuted this finding. resource string The resource name of this finding. resource_discovery_date int64 The date on which the resource was discovered (Unix ms). resource_type string The resource type of this finding. rule object The rule that triggered this finding. id string The ID of the rule that triggered this finding. name string The name of the rule that triggered this finding. status enum The status of the finding. Allowed enum values: `critical,high,medium,low,info` tags [string] The tags associated with this finding. vulnerability_type enum The vulnerability type of the finding. Allowed enum values: `misconfiguration,attack_path,identity_risk,api_security` id string The unique ID for this finding. type enum The JSON:API type for findings. Allowed enum values: `finding` default: `finding` meta [_required_] object Metadata for pagination. page object Pagination and findings count information. cursor string The cursor used to paginate requests. total_filtered_count int64 The total count of findings after the filter has been applied. 
snapshot_timestamp int64 The point in time corresponding to the listed findings. ``` { "data": [ { "attributes": { "datadog_link": "/security/compliance?panels=cpfinding%7Cevent%7CruleId%3Adef-000-u5t%7CresourceId%3Ae8c9ab7c52ebd7bf2fdb4db641082d7d%7CtabId%3Aoverview", "description": "## Remediation\n\n1. In the console, go to **Storage Account**.\n2. For each Storage Account, navigate to **Data Protection**.\n3. Select **Set soft delete enabled** and enter the number of days to retain soft deleted data.", "evaluation": "pass", "evaluation_changed_at": 1678721573794, "external_id": "arn:aws:s3:::my-example-bucket", "mute": { "description": "To be resolved later", "expiration_date": 1778721573794, "muted": true, "reason": "ACCEPTED_RISK", "start_date": 1678721573794, "uuid": "e51c9744-d158-11ec-ad23-da7ad0900002" }, "resource": "my_resource_name", "resource_discovery_date": 1678721573794, "resource_type": "azure_storage_account", "rule": { "id": "dv2-jzf-41i", "name": "Soft delete is enabled for Azure Storage" }, "status": "critical", "tags": [ "cloud_provider:aws", "myTag:myValue" ], "vulnerability_type": "misconfiguration" }, "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "finding" } ], "meta": { "page": { "cursor": "eyJhZnRlciI6IkFRQUFBWWJiaEJXQS1OY1dqUUFBQUFCQldXSmlhRUpYUVVGQlJFSktkbTlDTUdaWFRVbDNRVUUiLCJ2YWx1ZXMiOlsiY3JpdGljYWwiXX0=", "total_filtered_count": 213 }, "snapshot_timestamp": 1678721573794 } } ``` Copy Bad Request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found: The requested finding cannot be found. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests: The rate limit set by the API has been exceeded. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List findings Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/posture_management/findings" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List findings ``` """ List findings returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["list_findings"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_findings() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List findings ``` # List findings returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_findings".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_findings() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List findings ``` // List findings returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListFindings", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListFindings(ctx, *datadogV2.NewListFindingsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListFindings`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, 
"", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListFindings`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List findings ``` // List findings returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ListFindingsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listFindings", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ListFindingsResponse result = apiInstance.listFindings(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listFindings"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List findings ``` // List findings returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListFindingsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListFindings", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_findings(ListFindingsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List findings ``` /** * List findings returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listFindings"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listFindings() .then((data: v2.ListFindingsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-security-findings-v2) GET https://api.ap1.datadoghq.com/api/v2/security/findingshttps://api.ap2.datadoghq.com/api/v2/security/findingshttps://api.datadoghq.eu/api/v2/security/findingshttps://api.ddog-gov.com/api/v2/security/findingshttps://api.datadoghq.com/api/v2/security/findingshttps://api.us3.datadoghq.com/api/v2/security/findingshttps://api.us5.datadoghq.com/api/v2/security/findings ### Overview Get a list of security findings that match a search query. [See the schema for security findings](https://docs.datadoghq.com/security/guide/findings-schema/). ### [Query Syntax](https://docs.datadoghq.com/api/latest/security-monitoring/#query-syntax) This endpoint uses the logs query syntax. Findings attributes (living in the attributes.attributes. namespace) are prefixed by @ when queried. Tags are queried without a prefix. Example: `@severity:(critical OR high) @status:open team:platform` This endpoint requires any of the following permissions: * `security_monitoring_findings_read` * `appsec_vm_read` OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string The search query following log search syntax. page[cursor] string Get the next page of results with a cursor provided in the previous query. page[limit] integer The maximum number of findings in the response. sort enum Sorts by @detection_changed_at. Allowed enum values: `@detection_changed_at, -@detection_changed_at` ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFindings-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFindings-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFindings-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFindings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing security findings. Field Type Description data [object] Array of security findings matching the search query. attributes object The JSON object containing all attributes of the security finding. attributes object The custom attributes of the security finding. tags [string] List of tags associated with the security finding. timestamp int64 The Unix timestamp at which the detection changed for the resource. Same value as @detection_changed_at. id string The unique ID of the security finding. type enum The type of the security finding resource. Allowed enum values: `finding` default: `finding` links object Links for pagination. next string Link for the next page of results. 
Note that paginated requests can also be made using the POST endpoint. meta object Metadata about the response. elapsed int64 The time elapsed in milliseconds. page object Pagination information. after string The cursor used to get the next page of results. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` ``` { "data": [ { "attributes": { "attributes": { "severity": "high", "status": "open" }, "tags": [ "team:platform", "env:prod" ], "timestamp": 1765901760 }, "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "finding" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security/findings?page[cursor]=eyJhZnRlciI6IkF3QUFBWnPcm1pd0FBQUJbVlBQUKBa1pqRTVdZUzSTBNemN0YWiIsLTE3Mjk0MzYwMjFdfQ==\u0026page[limit]=25" }, "meta": { "elapsed": 548, "page": { "after": "eyJhZnRlciI6IkFRQUFBWWJiaEJXQS1OY1dqUUFBQUFCQldXSmlhRUpYUVVGQlJFSktkbTlDTUdaWFRVbDNRVUUiLCJ2YWx1ZXMiOlsiY3JpdGljYWwiXX0=" }, "request_id": "pddv1ChZwVlMxMUdYRFRMQ1lyb3B4MGNYbFlnIi0KHQu35LDbucx", "status": "done" } } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) ##### List security findings ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security/findings" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Get a finding](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-finding-v2) **Note** : This endpoint uses the legacy security findings data model and is planned for deprecation. Use the [search security findings endpoint](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings), which is based on the [new security findings schema](https://docs.datadoghq.com/security/guide/findings-schema/), to search security findings. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/).
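Since the note above points to the search endpoint, here is a hedged curl sketch of such a search request against the List security findings endpoint described earlier (the query string is the documented example; the page limit and sort values are illustrative, and `DD_API_KEY`/`DD_APP_KEY` are assumed to be exported):

```
# Hypothetical search request using the documented query syntax example.
curl -G "https://api.datadoghq.com/api/v2/security/findings" \
  --data-urlencode "filter[query]=@severity:(critical OR high) @status:open team:platform" \
  --data-urlencode "page[limit]=25" \
  --data-urlencode "sort=-@detection_changed_at" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```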
GET https://api.ap1.datadoghq.com/api/v2/posture_management/findings/{finding_id}https://api.ap2.datadoghq.com/api/v2/posture_management/findings/{finding_id}https://api.datadoghq.eu/api/v2/posture_management/findings/{finding_id}https://api.ddog-gov.com/api/v2/posture_management/findings/{finding_id}https://api.datadoghq.com/api/v2/posture_management/findings/{finding_id}https://api.us3.datadoghq.com/api/v2/posture_management/findings/{finding_id}https://api.us5.datadoghq.com/api/v2/posture_management/findings/{finding_id} ### Overview Returns a single finding with message and resource configuration. OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description finding_id [_required_] string The ID of the finding. #### Query Strings Name Type Description snapshot_timestamp integer Return the finding for a given snapshot of time (Unix ms). ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetFinding-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetFinding-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetFinding-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetFinding-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetFinding-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when getting a finding. Field Type Description data [_required_] object A single finding with message and resource configuration. attributes object The JSON:API attributes of the detailed finding. evaluation enum The evaluation of the finding. Allowed enum values: `pass,fail` evaluation_changed_at int64 The date on which the evaluation for this finding changed (Unix ms). message string The remediation message for this finding. mute object Information about the mute status of this finding. description string Additional information about the reason why this finding is muted or unmuted. expiration_date int64 The expiration date of the mute or unmute action (Unix ms). muted boolean Whether this finding is muted or unmuted. reason enum The reason why this finding is muted or unmuted. Allowed enum values: `PENDING_FIX,FALSE_POSITIVE,ACCEPTED_RISK,NO_PENDING_FIX,HUMAN_ERROR,NO_LONGER_ACCEPTED_RISK,OTHER` start_date int64 The start of the mute period. uuid string The ID of the user who muted or unmuted this finding. resource string The resource name of this finding. resource_configuration object The resource configuration for this finding. resource_discovery_date int64 The date on which the resource was discovered (Unix ms). resource_type string The resource type of this finding. rule object The rule that triggered this finding. id string The ID of the rule that triggered this finding. name string The name of the rule that triggered this finding. status enum The status of the finding. Allowed enum values: `critical,high,medium,low,info` tags [string] The tags associated with this finding. id string The unique ID for this finding. type enum The JSON:API type for findings that have the message and resource configuration.
Allowed enum values: `detailed_finding` default: `detailed_finding` ``` { "data": { "attributes": { "evaluation": "pass", "evaluation_changed_at": 1678721573794, "message": "## Remediation\n\n### From the console\n\n1. Go to Storage Account\n2. For each Storage Account, navigate to Data Protection\n3. Select Set soft delete enabled and enter the number of days to retain soft deleted data.", "mute": { "description": "To be resolved later", "expiration_date": 1778721573794, "muted": true, "reason": "ACCEPTED_RISK", "start_date": 1678721573794, "uuid": "e51c9744-d158-11ec-ad23-da7ad0900002" }, "resource": "my_resource_name", "resource_configuration": {}, "resource_discovery_date": 1678721573794, "resource_type": "azure_storage_account", "rule": { "id": "dv2-jzf-41i", "name": "Soft delete is enabled for Azure Storage" }, "status": "critical", "tags": [ "cloud_provider:aws", "myTag:myValue" ] }, "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "detailed_finding" } } ``` Copy Bad Request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found: The requested finding cannot be found. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a finding ``` # Path parameters export finding_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/posture_management/findings/${finding_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a finding ``` """ Get a finding returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["get_finding"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_finding( finding_id="AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz", ) print(response) ``` #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a finding ``` # Get a finding returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_finding".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_finding("AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a finding ``` // Get a finding returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetFinding", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetFinding(ctx, "AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz", *datadogV2.NewGetFindingOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetFinding`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetFinding`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a finding ``` // Get a finding returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.GetFindingResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getFinding", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { GetFindingResponse result = apiInstance.getFinding( "AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getFinding"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); 
e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a finding ``` // Get a finding returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::GetFindingOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetFinding", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_finding( "AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz".to_string(), GetFindingOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a finding ``` /** * Get a finding returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getFinding"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetFindingRequest = { findingId: "AgAAAYd59gjghzF52gAAAAAAAAAYAAAAAEFZZDU5Z2pnQUFCRTRvV1lFeEo4SlFBQQAAACQAAAAAMDE4NzdhMDEtMDRiYS00NTZlLWFmMzMtNTIxNmNkNjVlNDMz", }; apiInstance .getFinding(params) .then((data: v2.GetFindingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Mute or unmute a batch of findings](https://docs.datadoghq.com/api/latest/security-monitoring/#mute-or-unmute-a-batch-of-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#mute-or-unmute-a-batch-of-findings-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PATCH https://api.ap1.datadoghq.com/api/v2/posture_management/findingshttps://api.ap2.datadoghq.com/api/v2/posture_management/findingshttps://api.datadoghq.eu/api/v2/posture_management/findingshttps://api.ddog-gov.com/api/v2/posture_management/findingshttps://api.datadoghq.com/api/v2/posture_management/findingshttps://api.us3.datadoghq.com/api/v2/posture_management/findingshttps://api.us5.datadoghq.com/api/v2/posture_management/findings ### Overview Mute or unmute findings. 
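As a complement to the mute example in the Code Example section further below, here is a hedged curl sketch of the reverse operation, unmuting a batch of findings (the request UUID and finding ID are placeholders, and the allowed unmute reasons are listed under Attributes below):

```
# Hypothetical unmute request; replace the placeholder request UUID and finding ID with real values.
curl -X PATCH "https://api.datadoghq.com/api/v2/posture_management/findings" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "mute": {
        "muted": false,
        "reason": "NO_LONGER_ACCEPTED_RISK"
      }
    },
    "id": "CHANGE_ME_REQUEST_UUID",
    "meta": {
      "findings": [
        { "finding_id": "CHANGE_ME_FINDING_ID" }
      ]
    },
    "type": "finding"
  }
}
EOF
```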
This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Request #### Body Data (required) ### Attributes All findings are updated with the same attributes. The request body must include at least two attributes: `muted` and `reason`. The allowed reasons depend on whether the finding is being muted or unmuted: * To mute a finding: `PENDING_FIX`, `FALSE_POSITIVE`, `ACCEPTED_RISK`, `OTHER`. * To unmute a finding : `NO_PENDING_FIX`, `HUMAN_ERROR`, `NO_LONGER_ACCEPTED_RISK`, `OTHER`. ### Meta The request body must include a list of the finding IDs to be updated. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Data object containing the new bulk mute properties of the finding. attributes [_required_] object The mute properties to be updated. mute [_required_] object Object containing the new mute properties of the findings. description string Additional information about the reason why those findings are muted or unmuted. This field has a maximum limit of 280 characters. expiration_date int64 The expiration date of the mute or unmute action (Unix ms). It must be set to a value greater than the current timestamp. If this field is not provided, the finding will be muted or unmuted indefinitely, which is equivalent to setting the expiration date to 9999999999999. muted [_required_] boolean Whether those findings should be muted or unmuted. reason [_required_] enum The reason why this finding is muted or unmuted. Allowed enum values: `PENDING_FIX,FALSE_POSITIVE,ACCEPTED_RISK,NO_PENDING_FIX,HUMAN_ERROR,NO_LONGER_ACCEPTED_RISK,OTHER` id [_required_] string UUID to identify the request meta [_required_] object Meta object containing the findings to be updated. findings [object] Array of findings. finding_id string The unique ID for this finding. type [_required_] enum The JSON:API type for findings. Allowed enum values: `finding` default: `finding` ``` { "data": { "attributes": { "mute": { "expiration_date": 1778721573794, "muted": true, "reason": "ACCEPTED_RISK" } }, "id": "dbe5f567-192b-4404-b908-29b70e1c9f76", "meta": { "findings": [ { "finding_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==" } ] }, "type": "finding" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-404-v2) * [422](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-422-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#MuteFindings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema. Field Type Description data [_required_] object Data object containing the ID of the request that was updated. id string UUID used to identify the request type enum The JSON:API type for findings. Allowed enum values: `finding` default: `finding` ``` { "data": { "id": "93bfeb70-af47-424d-908a-948d3f08e37f", "type": "finding" } } ``` Copy Bad Request: The server cannot process the request due to invalid syntax in the request. 
* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not Found: The requested finding cannot be found. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Invalid Request: The server understands the request syntax but cannot process it due to invalid data. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Too many requests: The rate limit set by the API has been exceeded. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Mute or unmute a batch of findings returns "OK" response ``` # Curl command curl -X PATCH "https://api.datadoghq.com/api/v2/posture_management/findings" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "mute": { "expiration_date": 1778721573794, "muted": true, "reason": "ACCEPTED_RISK" } }, "id": "dbe5f567-192b-4404-b908-29b70e1c9f76", "meta": { "findings": [ { "finding_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==" } ] }, "type": "finding" } } EOF ``` ##### Mute or unmute a batch of findings returns "OK" response ``` // Mute or unmute a batch of findings returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.BulkMuteFindingsRequest{ Data: datadogV2.BulkMuteFindingsRequestData{ Attributes: datadogV2.BulkMuteFindingsRequestAttributes{ Mute: datadogV2.BulkMuteFindingsRequestProperties{ ExpirationDate: datadog.PtrInt64(1778721573794), Muted: true, Reason: datadogV2.FINDINGMUTEREASON_ACCEPTED_RISK, }, }, Id: "dbe5f567-192b-4404-b908-29b70e1c9f76", Meta: datadogV2.BulkMuteFindingsRequestMeta{ Findings: []datadogV2.BulkMuteFindingsRequestMetaFindings{ { FindingId: datadog.PtrString("ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw=="), }, }, }, Type: datadogV2.FINDINGTYPE_FINDING, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.MuteFindings", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.MuteFindings(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.MuteFindings`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.MuteFindings`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Mute or unmute a batch of findings returns "OK" response ``` // Mute or unmute a batch of findings returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.BulkMuteFindingsRequest; import com.datadog.api.client.v2.model.BulkMuteFindingsRequestAttributes; import com.datadog.api.client.v2.model.BulkMuteFindingsRequestData; import com.datadog.api.client.v2.model.BulkMuteFindingsRequestMeta; import com.datadog.api.client.v2.model.BulkMuteFindingsRequestMetaFindings; import com.datadog.api.client.v2.model.BulkMuteFindingsRequestProperties; import com.datadog.api.client.v2.model.BulkMuteFindingsResponse; import com.datadog.api.client.v2.model.FindingMuteReason; import com.datadog.api.client.v2.model.FindingType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.muteFindings", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); BulkMuteFindingsRequest body = new BulkMuteFindingsRequest() .data( new BulkMuteFindingsRequestData() .attributes( new BulkMuteFindingsRequestAttributes() .mute( new BulkMuteFindingsRequestProperties() .expirationDate(1778721573794L) .muted(true) .reason(FindingMuteReason.ACCEPTED_RISK))) .id("dbe5f567-192b-4404-b908-29b70e1c9f76") .meta( new BulkMuteFindingsRequestMeta() .findings( Collections.singletonList( new BulkMuteFindingsRequestMetaFindings() .findingId( "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==")))) .type(FindingType.FINDING)); try { BulkMuteFindingsResponse result = apiInstance.muteFindings(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when 
calling SecurityMonitoringApi#muteFindings"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Mute or unmute a batch of findings returns "OK" response ``` """ Mute or unmute a batch of findings returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.bulk_mute_findings_request import BulkMuteFindingsRequest from datadog_api_client.v2.model.bulk_mute_findings_request_attributes import BulkMuteFindingsRequestAttributes from datadog_api_client.v2.model.bulk_mute_findings_request_data import BulkMuteFindingsRequestData from datadog_api_client.v2.model.bulk_mute_findings_request_meta import BulkMuteFindingsRequestMeta from datadog_api_client.v2.model.bulk_mute_findings_request_meta_findings import BulkMuteFindingsRequestMetaFindings from datadog_api_client.v2.model.bulk_mute_findings_request_properties import BulkMuteFindingsRequestProperties from datadog_api_client.v2.model.finding_mute_reason import FindingMuteReason from datadog_api_client.v2.model.finding_type import FindingType body = BulkMuteFindingsRequest( data=BulkMuteFindingsRequestData( attributes=BulkMuteFindingsRequestAttributes( mute=BulkMuteFindingsRequestProperties( expiration_date=1778721573794, muted=True, reason=FindingMuteReason.ACCEPTED_RISK, ), ), id="dbe5f567-192b-4404-b908-29b70e1c9f76", meta=BulkMuteFindingsRequestMeta( findings=[ BulkMuteFindingsRequestMetaFindings( finding_id="ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", ), ], ), type=FindingType.FINDING, ), ) configuration = Configuration() configuration.unstable_operations["mute_findings"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.mute_findings(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Mute or unmute a batch of findings returns "OK" response ``` # Mute or unmute a batch of findings returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.mute_findings".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::BulkMuteFindingsRequest.new({ data: DatadogAPIClient::V2::BulkMuteFindingsRequestData.new({ attributes: DatadogAPIClient::V2::BulkMuteFindingsRequestAttributes.new({ mute: DatadogAPIClient::V2::BulkMuteFindingsRequestProperties.new({ expiration_date: 1778721573794, muted: true, reason: DatadogAPIClient::V2::FindingMuteReason::ACCEPTED_RISK, }), }), id: "dbe5f567-192b-4404-b908-29b70e1c9f76", meta: 
DatadogAPIClient::V2::BulkMuteFindingsRequestMeta.new({ findings: [ DatadogAPIClient::V2::BulkMuteFindingsRequestMetaFindings.new({ finding_id: "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", }), ], }), type: DatadogAPIClient::V2::FindingType::FINDING, }), }) p api_instance.mute_findings(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Mute or unmute a batch of findings returns "OK" response ``` // Mute or unmute a batch of findings returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequest; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequestAttributes; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequestData; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequestMeta; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequestMetaFindings; use datadog_api_client::datadogV2::model::BulkMuteFindingsRequestProperties; use datadog_api_client::datadogV2::model::FindingMuteReason; use datadog_api_client::datadogV2::model::FindingType; #[tokio::main] async fn main() { let body = BulkMuteFindingsRequest::new(BulkMuteFindingsRequestData::new( BulkMuteFindingsRequestAttributes::new( BulkMuteFindingsRequestProperties::new(true, FindingMuteReason::ACCEPTED_RISK) .expiration_date(1778721573794), ), "dbe5f567-192b-4404-b908-29b70e1c9f76".to_string(), BulkMuteFindingsRequestMeta::new() .findings(vec![BulkMuteFindingsRequestMetaFindings::new().finding_id( "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==".to_string(), )]), FindingType::FINDING, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.MuteFindings", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.mute_findings(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Mute or unmute a batch of findings returns "OK" response ``` /** * Mute or unmute a batch of findings returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.muteFindings"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiMuteFindingsRequest = { body: { data: { attributes: { mute: { expirationDate: 1778721573794, muted: true, reason: "ACCEPTED_RISK", }, }, id: "dbe5f567-192b-4404-b908-29b70e1c9f76", meta: { findings: [ { findingId: "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", }, ], }, type: "finding", }, }, }; apiInstance .muteFindings(params) .then((data: v2.BulkMuteFindingsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#search-security-findings-v2) POST https://api.ap1.datadoghq.com/api/v2/security/findings/searchhttps://api.ap2.datadoghq.com/api/v2/security/findings/searchhttps://api.datadoghq.eu/api/v2/security/findings/searchhttps://api.ddog-gov.com/api/v2/security/findings/searchhttps://api.datadoghq.com/api/v2/security/findings/searchhttps://api.us3.datadoghq.com/api/v2/security/findings/searchhttps://api.us5.datadoghq.com/api/v2/security/findings/search ### Overview Get a list of security findings that match a search query. [See the schema for security findings](https://docs.datadoghq.com/security/guide/findings-schema/). ### [Query Syntax](https://docs.datadoghq.com/api/latest/security-monitoring/#query-syntax) The API uses the logs query syntax. Findings attributes (living in the attributes.attributes. namespace) are prefixed by @ when queried. Tags are queried without a prefix. Example: `@severity:(critical OR high) @status:open team:platform` This endpoint requires any of the following permissions: * `security_monitoring_findings_read` * `appsec_vm_read` OAuth apps require the `security_monitoring_findings_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Request data for searching security findings. attributes object Request attributes for searching security findings. filter string The search query following log search syntax. default: `*` page object Pagination attributes for the search request. cursor string Get the next page of results with a cursor provided in the previous query. limit int64 The maximum number of security findings in the response. default: `10` sort enum The sort parameters when querying security findings. 
Allowed enum values: `@detection_changed_at,-@detection_changed_at` default: `-@detection_changed_at` ##### Search security findings returns "OK" response ``` { "data": { "attributes": { "filter": "@severity:(critical OR high)" } } } ``` Copy ##### Search security findings returns "OK" response with pagination ``` { "data": { "attributes": { "filter": "@severity:(critical OR high)", "page": { "limit": 1 } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityFindings-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityFindings-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityFindings-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityFindings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing security findings. Field Type Description data [object] Array of security findings matching the search query. attributes object The JSON object containing all attributes of the security finding. attributes object The custom attributes of the security finding. tags [string] List of tags associated with the security finding. timestamp int64 The Unix timestamp at which the detection changed for the resource. Same value as @detection_changed_at. id string The unique ID of the security finding. type enum The type of the security finding resource. Allowed enum values: `finding` default: `finding` links object Links for pagination. next string Link for the next page of results. Note that paginated requests can also be made using the POST endpoint. meta object Metadata about the response. elapsed int64 The time elapsed in milliseconds. page object Pagination information. after string The cursor used to get the next page of results. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` ``` { "data": [ { "attributes": { "attributes": { "severity": "high", "status": "open" }, "tags": [ "team:platform", "env:prod" ], "timestamp": 1765901760 }, "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "finding" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security/findings?page[cursor]=eyJhZnRlciI6IkF3QUFBWnPcm1pd0FBQUJbVlBQUKBa1pqRTVdZUzSTBNemN0YWiIsLTE3Mjk0MzYwMjFdfQ==\u0026page[limit]=25" }, "meta": { "elapsed": 548, "page": { "after": "eyJhZnRlciI6IkFRQUFBWWJiaEJXQS1OY1dqUUFBQUFCQldXSmlhRUpYUVVGQlJFSktkbTlDTUdaWFRVbDNRVUUiLCJ2YWx1ZXMiOlsiY3JpdGljYWwiXX0=" }, "request_id": "pddv1ChZwVlMxMUdYRFRMQ1lyb3B4MGNYbFlnIi0KHQu35LDbucx", "status": "done" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) ##### Search security findings returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": "@severity:(critical OR high)" } } } EOF ``` ##### Search security findings returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": "@severity:(critical OR high)", "page": { "limit": 1 } } } } EOF ``` * * * ## [Add a security signal to an incident](https://docs.datadoghq.com/api/latest/security-monitoring/#add-a-security-signal-to-an-incident) * [v1 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#add-a-security-signal-to-an-incident-v1) PATCH https://api.ap1.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.ap2.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.datadoghq.eu/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.ddog-gov.com/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.us3.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/add_to_incidenthttps://api.us5.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/add_to_incident ### Overview Add a security signal to an incident. This makes it possible to search for signals by incident within the signal explorer and to view the signals on the incident timeline. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Expand All Field Type Description add_to_signal_timeline boolean Whether to post the signal on the incident timeline. incident_id [_required_] int64 Public ID attribute of the incident to which the signal will be added. version int64 Version of the updated signal. If server side version is higher, update will be rejected. 
``` { "incident_id": 2609 } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#AddSecurityMonitoringSignalToIncident-200-v1) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#AddSecurityMonitoringSignalToIncident-400-v1) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#AddSecurityMonitoringSignalToIncident-403-v1) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#AddSecurityMonitoringSignalToIncident-404-v1) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#AddSecurityMonitoringSignalToIncident-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Updated signal data following a successfully performed update. Expand All Field Type Description status string Status of the response. ``` { "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Add a security signal to an incident returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/security_analytics/signals/${signal_id}/add_to_incident" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "incident_id": 2609 } EOF ``` ##### Add a security signal to an incident returns "OK" response ``` // Add a security signal to an incident returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.AddSignalToIncidentRequest{ IncidentId: 2609, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSecurityMonitoringApi(apiClient) resp, r, err := api.AddSecurityMonitoringSignalToIncident(ctx, "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.AddSecurityMonitoringSignalToIncident`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.AddSecurityMonitoringSignalToIncident`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add a security signal to an incident returns "OK" response ``` // Add a security signal to an incident returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SecurityMonitoringApi; import com.datadog.api.client.v1.model.AddSignalToIncidentRequest; import com.datadog.api.client.v1.model.SuccessfulSignalUpdateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); AddSignalToIncidentRequest body = new AddSignalToIncidentRequest().incidentId(2609L); try { SuccessfulSignalUpdateResponse result = apiInstance.addSecurityMonitoringSignalToIncident( 
"AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#addSecurityMonitoringSignalToIncident"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add a security signal to an incident returns "OK" response ``` """ Add a security signal to an incident returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v1.model.add_signal_to_incident_request import AddSignalToIncidentRequest body = AddSignalToIncidentRequest( incident_id=2609, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.add_security_monitoring_signal_to_incident( signal_id="AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add a security signal to an incident returns "OK" response ``` # Add a security signal to an incident returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SecurityMonitoringAPI.new body = DatadogAPIClient::V1::AddSignalToIncidentRequest.new({ incident_id: 2609, }) p api_instance.add_security_monitoring_signal_to_incident("AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add a security signal to an incident returns "OK" response ``` // Add a security signal to an incident returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV1::model::AddSignalToIncidentRequest; #[tokio::main] async fn main() { let body = AddSignalToIncidentRequest::new(2609); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .add_security_monitoring_signal_to_incident( "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add a security signal to an incident returns "OK" response ``` /** * Add a security signal to an incident returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SecurityMonitoringApi(configuration); const params: v1.SecurityMonitoringApiAddSecurityMonitoringSignalToIncidentRequest = { body: { incidentId: 2609, }, signalId: "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", }; apiInstance .addSecurityMonitoringSignalToIncident(params) .then((data: v1.SuccessfulSignalUpdateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Change the triage state of a security signal](https://docs.datadoghq.com/api/latest/security-monitoring/#change-the-triage-state-of-a-security-signal) * [v1](https://docs.datadoghq.com/api/latest/security-monitoring/#change-the-triage-state-of-a-security-signal-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#change-the-triage-state-of-a-security-signal-v2) PATCH https://api.ap1.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/statehttps://api.ap2.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/statehttps://api.datadoghq.eu/api/v1/security_analytics/signals/{signal_id}/statehttps://api.ddog-gov.com/api/v1/security_analytics/signals/{signal_id}/statehttps://api.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/statehttps://api.us3.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/statehttps://api.us5.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/state ### Overview This endpoint is deprecated - Change the triage state of a security signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Expand All Field Type Description archiveComment string Optional comment to explain why a signal is being archived. archiveReason enum Reason why a signal has been archived. Allowed enum values: `none,false_positive,testing_or_maintenance,investigated_case_opened,true_positive_benign,true_positive_malicious,other` state [_required_] enum The new triage state of the signal. Allowed enum values: `open,archived,under_review` version int64 Version of the updated signal. If server side version is higher, update will be rejected. 
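Since the examples below all reopen a signal (`state: open`, `archiveReason: none`), here is a complementary Python sketch that archives one instead, including the optional comment. It relies on the same `datadog_api_client` v1 client used later on this page and assumes `archiveComment` follows the same snake_case keyword mapping (`archive_comment`) that the documented example uses for `archiveReason`, and that the enum constants follow the client's usual uppercase naming. The signal ID is a placeholder, and the minimal request body appears in the example that follows.

```python
# Hedged sketch: archive a signal as a false positive with an explanatory comment
# through this (deprecated) v1 endpoint. Assumes datadog_api_client is installed
# and DD_API_KEY / DD_APP_KEY are set; "<SIGNAL_ID>" is a placeholder.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v1.model.signal_archive_reason import SignalArchiveReason
from datadog_api_client.v1.model.signal_state_update_request import SignalStateUpdateRequest
from datadog_api_client.v1.model.signal_triage_state import SignalTriageState

body = SignalStateUpdateRequest(
    archive_comment="Expected traffic from an approved load test",  # optional comment
    archive_reason=SignalArchiveReason.FALSE_POSITIVE,
    state=SignalTriageState.ARCHIVED,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = SecurityMonitoringApi(api_client)
    print(api.edit_security_monitoring_signal_state(signal_id="<SIGNAL_ID>", body=body))
```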
``` { "archiveReason": "none", "state": "open" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-200-v1) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-400-v1) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-403-v1) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-404-v1) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Updated signal data following a successfully performed update. Expand All Field Type Description status string Status of the response. ``` { "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Change the triage state of a security signal returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/security_analytics/signals/${signal_id}/state" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "archiveReason": "none", "state": "open" } EOF ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SignalStateUpdateRequest{ ArchiveReason: datadogV1.SIGNALARCHIVEREASON_NONE.Ptr(), State: datadogV1.SIGNALTRIAGESTATE_OPEN, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSecurityMonitoringApi(apiClient) resp, r, err := api.EditSecurityMonitoringSignalState(ctx, "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.EditSecurityMonitoringSignalState`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.EditSecurityMonitoringSignalState`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SecurityMonitoringApi; import com.datadog.api.client.v1.model.SignalArchiveReason; import com.datadog.api.client.v1.model.SignalStateUpdateRequest; import com.datadog.api.client.v1.model.SignalTriageState; import com.datadog.api.client.v1.model.SuccessfulSignalUpdateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new 
SecurityMonitoringApi(defaultClient); SignalStateUpdateRequest body = new SignalStateUpdateRequest() .archiveReason(SignalArchiveReason.NONE) .state(SignalTriageState.OPEN); try { SuccessfulSignalUpdateResponse result = apiInstance.editSecurityMonitoringSignalState( "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#editSecurityMonitoringSignalState"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Change the triage state of a security signal returns "OK" response ``` """ Change the triage state of a security signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v1.model.signal_archive_reason import SignalArchiveReason from datadog_api_client.v1.model.signal_state_update_request import SignalStateUpdateRequest from datadog_api_client.v1.model.signal_triage_state import SignalTriageState body = SignalStateUpdateRequest( archive_reason=SignalArchiveReason.NONE, state=SignalTriageState.OPEN, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.edit_security_monitoring_signal_state( signal_id="AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Change the triage state of a security signal returns "OK" response ``` # Change the triage state of a security signal returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SecurityMonitoringAPI.new body = DatadogAPIClient::V1::SignalStateUpdateRequest.new({ archive_reason: DatadogAPIClient::V1::SignalArchiveReason::NONE, state: DatadogAPIClient::V1::SignalTriageState::OPEN, }) p api_instance.edit_security_monitoring_signal_state("AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_security_monitoring::SecurityMonitoringAPI; use 
datadog_api_client::datadogV1::model::SignalArchiveReason; use datadog_api_client::datadogV1::model::SignalStateUpdateRequest; use datadog_api_client::datadogV1::model::SignalTriageState; #[tokio::main] async fn main() { let body = SignalStateUpdateRequest::new(SignalTriageState::OPEN) .archive_reason(SignalArchiveReason::NONE); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .edit_security_monitoring_signal_state( "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Change the triage state of a security signal returns "OK" response ``` /** * Change the triage state of a security signal returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SecurityMonitoringApi(configuration); const params: v1.SecurityMonitoringApiEditSecurityMonitoringSignalStateRequest = { body: { archiveReason: "none", state: "open", }, signalId: "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", }; apiInstance .editSecurityMonitoringSignalState(params) .then((data: v1.SuccessfulSignalUpdateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.ap2.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.datadoghq.eu/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.ddog-gov.com/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.us3.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/statehttps://api.us5.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/state ### Overview Change the triage state of a security signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Data containing the patch for changing the state of a signal. attributes [_required_] object Attributes describing the change of state of a security signal. archive_comment string Optional comment to display on archived signals. 
archive_reason enum Reason a signal is archived. Allowed enum values: `none,false_positive,testing_or_maintenance,investigated_case_opened,true_positive_benign,true_positive_malicious,other` state [_required_] enum The new triage state of the signal. Allowed enum values: `open,archived,under_review` version int64 Version of the updated signal. If server side version is higher, update will be rejected. id The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal_metadata` default: `signal_metadata` ``` { "data": { "attributes": { "archive_reason": "none", "state": "open" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalState-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response returned after all triage operations, containing the updated signal triage data. Field Type Description data [_required_] object Data containing the updated triage attributes of the signal. attributes object Attributes describing a triage state update operation over a security signal. archive_comment string Optional comment to display on archived signals. archive_comment_timestamp int64 Timestamp of the last edit to the comment. archive_comment_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. archive_reason enum Reason a signal is archived. Allowed enum values: `none,false_positive,testing_or_maintenance,investigated_case_opened,true_positive_benign,true_positive_malicious,other` assignee [_required_] object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. incident_ids [_required_] [integer] Array of incidents that are associated with this signal. state [_required_] enum The new triage state of the signal. Allowed enum values: `open,archived,under_review` state_update_timestamp int64 Timestamp of the last update to the signal state. state_update_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. id string The unique ID of the security signal. type enum The type of event. 
Allowed enum values: `signal_metadata` default: `signal_metadata` ``` { "data": { "attributes": { "archive_comment": "string", "archive_comment_timestamp": "integer", "archive_comment_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "archive_reason": "string", "assignee": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "incident_ids": [ 2066 ], "state": "open", "state_update_timestamp": "integer", "state_update_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } }, "id": "string", "type": "signal_metadata" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Change the triage state of a security signal returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/signals/${signal_id}/state" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "archive_reason": "none", "state": "open" } } } EOF ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSignalStateUpdateRequest{ Data: datadogV2.SecurityMonitoringSignalStateUpdateData{ Attributes: datadogV2.SecurityMonitoringSignalStateUpdateAttributes{ ArchiveReason: datadogV2.SECURITYMONITORINGSIGNALARCHIVEREASON_NONE.Ptr(), State: datadogV2.SECURITYMONITORINGSIGNALSTATE_OPEN, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.EditSecurityMonitoringSignalState(ctx, "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.EditSecurityMonitoringSignalState`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.EditSecurityMonitoringSignalState`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalArchiveReason; import com.datadog.api.client.v2.model.SecurityMonitoringSignalState; import 
com.datadog.api.client.v2.model.SecurityMonitoringSignalStateUpdateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSignalStateUpdateData; import com.datadog.api.client.v2.model.SecurityMonitoringSignalStateUpdateRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSignalTriageUpdateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSignalStateUpdateRequest body = new SecurityMonitoringSignalStateUpdateRequest() .data( new SecurityMonitoringSignalStateUpdateData() .attributes( new SecurityMonitoringSignalStateUpdateAttributes() .archiveReason(SecurityMonitoringSignalArchiveReason.NONE) .state(SecurityMonitoringSignalState.OPEN))); try { SecurityMonitoringSignalTriageUpdateResponse result = apiInstance.editSecurityMonitoringSignalState( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#editSecurityMonitoringSignalState"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Change the triage state of a security signal returns "OK" response ``` """ Change the triage state of a security signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_signal_archive_reason import SecurityMonitoringSignalArchiveReason from datadog_api_client.v2.model.security_monitoring_signal_state import SecurityMonitoringSignalState from datadog_api_client.v2.model.security_monitoring_signal_state_update_attributes import ( SecurityMonitoringSignalStateUpdateAttributes, ) from datadog_api_client.v2.model.security_monitoring_signal_state_update_data import ( SecurityMonitoringSignalStateUpdateData, ) from datadog_api_client.v2.model.security_monitoring_signal_state_update_request import ( SecurityMonitoringSignalStateUpdateRequest, ) body = SecurityMonitoringSignalStateUpdateRequest( data=SecurityMonitoringSignalStateUpdateData( attributes=SecurityMonitoringSignalStateUpdateAttributes( archive_reason=SecurityMonitoringSignalArchiveReason.NONE, state=SecurityMonitoringSignalState.OPEN, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.edit_security_monitoring_signal_state( signal_id="AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Change the triage state of a security signal returns "OK" response ``` # Change the triage state of a security signal returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSignalStateUpdateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSignalStateUpdateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSignalStateUpdateAttributes.new({ archive_reason: DatadogAPIClient::V2::SecurityMonitoringSignalArchiveReason::NONE, state: DatadogAPIClient::V2::SecurityMonitoringSignalState::OPEN, }), }), }) p api_instance.edit_security_monitoring_signal_state("AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Change the triage state of a security signal returns "OK" response ``` // Change the triage state of a security signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalArchiveReason; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalState; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalStateUpdateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalStateUpdateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalStateUpdateRequest; #[tokio::main] async fn main() { let body = SecurityMonitoringSignalStateUpdateRequest::new( SecurityMonitoringSignalStateUpdateData::new( SecurityMonitoringSignalStateUpdateAttributes::new(SecurityMonitoringSignalState::OPEN) .archive_reason(SecurityMonitoringSignalArchiveReason::NONE), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .edit_security_monitoring_signal_state( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Change the triage state of a security signal returns "OK" response ``` /** * Change the triage state of a security signal returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiEditSecurityMonitoringSignalStateRequest = { body: { data: { attributes: { archiveReason: "none", state: "open", }, }, }, signalId: "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", }; apiInstance .editSecurityMonitoringSignalState(params) .then((data: v2.SecurityMonitoringSignalTriageUpdateResponse) => { 
console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site, for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-custom-framework-v2) POST https://api.ap1.datadoghq.com/api/v2/cloud_security_management/custom_frameworks https://api.ap2.datadoghq.com/api/v2/cloud_security_management/custom_frameworks https://api.datadoghq.eu/api/v2/cloud_security_management/custom_frameworks https://api.ddog-gov.com/api/v2/cloud_security_management/custom_frameworks https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks https://api.us3.datadoghq.com/api/v2/cloud_security_management/custom_frameworks https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks ### Overview Create a custom framework. This endpoint requires all of the following permissions: * `security_monitoring_rules_read` * `security_monitoring_rules_write` OAuth apps require the `security_monitoring_rules_read, security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Contains type and attributes for custom frameworks. attributes [_required_] object Framework Data Attributes. description string Framework Description handle [_required_] string Framework Handle icon_url string Framework Icon URL name [_required_] string Framework Name requirements [_required_] [object] Framework Requirements controls [_required_] [object] Requirement Controls. name [_required_] string Control Name. rules_id [_required_] [string] Rule IDs. name [_required_] string Requirement Name. version [_required_] string Framework Version type [_required_] enum The type of the resource. The value must be `custom_framework`.
Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "type": "custom_framework", "attributes": { "name": "name", "handle": "create-framework-new", "version": "10", "icon_url": "test-url", "requirements": [ { "name": "requirement", "controls": [ { "name": "control", "rules_id": [ "def-000-be9" ] } ] } ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCustomFramework-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCustomFramework-400-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCustomFramework-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCustomFramework-429-v2) * [500](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCustomFramework-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object to create a custom framework. Field Type Description data [_required_] object Contains type and attributes for custom frameworks. attributes [_required_] object Framework Handle and Version. handle string Framework Handle version string Framework Version id [_required_] string The ID of the custom framework. type [_required_] enum The type of the resource. The value must be `custom_framework`. Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "attributes": { "handle": "sec2", "version": "2" }, "id": "handle-version", "type": "custom_framework" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Internal Server Error * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create a custom framework returns "OK" response Copy ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X POST "https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "custom_framework", "attributes": { "name": "name", "handle": "create-framework-new", "version": "10", "icon_url": "test-url", "requirements": [ { "name": "requirement", "controls": [ { "name": "control", "rules_id": [ "def-000-be9" ] } ] } ] } } } EOF ``` ##### Create a custom framework returns "OK" response ``` // Create a custom framework returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateCustomFrameworkRequest{ Data: datadogV2.CustomFrameworkData{ Type: datadogV2.CUSTOMFRAMEWORKTYPE_CUSTOM_FRAMEWORK, Attributes: datadogV2.CustomFrameworkDataAttributes{ Name: "name", Handle: "create-framework-new", Version: "10", IconUrl: datadog.PtrString("test-url"), Requirements: []datadogV2.CustomFrameworkRequirement{ { Name: "requirement", Controls: []datadogV2.CustomFrameworkControl{ { Name: "control", RulesId: []string{ "def-000-be9", }, }, }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateCustomFramework(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateCustomFramework`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateCustomFramework`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site, for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a custom framework returns "OK" response ``` // Create a custom framework returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CreateCustomFrameworkRequest; import
com.datadog.api.client.v2.model.CreateCustomFrameworkResponse; import com.datadog.api.client.v2.model.CustomFrameworkControl; import com.datadog.api.client.v2.model.CustomFrameworkData; import com.datadog.api.client.v2.model.CustomFrameworkDataAttributes; import com.datadog.api.client.v2.model.CustomFrameworkRequirement; import com.datadog.api.client.v2.model.CustomFrameworkType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateCustomFrameworkRequest body = new CreateCustomFrameworkRequest() .data( new CustomFrameworkData() .type(CustomFrameworkType.CUSTOM_FRAMEWORK) .attributes( new CustomFrameworkDataAttributes() .name("name") .handle("create-framework-new") .version("10") .iconUrl("test-url") .requirements( Collections.singletonList( new CustomFrameworkRequirement() .name("requirement") .controls( Collections.singletonList( new CustomFrameworkControl() .name("control") .rulesId( Collections.singletonList( "def-000-be9")))))))); try { CreateCustomFrameworkResponse result = apiInstance.createCustomFramework(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#createCustomFramework"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a custom framework returns "OK" response ``` """ Create a custom framework returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.create_custom_framework_request import CreateCustomFrameworkRequest from datadog_api_client.v2.model.custom_framework_control import CustomFrameworkControl from datadog_api_client.v2.model.custom_framework_data import CustomFrameworkData from datadog_api_client.v2.model.custom_framework_data_attributes import CustomFrameworkDataAttributes from datadog_api_client.v2.model.custom_framework_requirement import CustomFrameworkRequirement from datadog_api_client.v2.model.custom_framework_type import CustomFrameworkType body = CreateCustomFrameworkRequest( data=CustomFrameworkData( type=CustomFrameworkType.CUSTOM_FRAMEWORK, attributes=CustomFrameworkDataAttributes( name="name", handle="create-framework-new", version="10", icon_url="test-url", requirements=[ CustomFrameworkRequirement( name="requirement", controls=[ CustomFrameworkControl( name="control", rules_id=[ "def-000-be9", ], ), ], ), ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_custom_framework(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
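# Set DD_SITE to your Datadog site before running, for example datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com.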
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a custom framework returns "OK" response ``` # Create a custom framework returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateCustomFrameworkRequest.new({ data: DatadogAPIClient::V2::CustomFrameworkData.new({ type: DatadogAPIClient::V2::CustomFrameworkType::CUSTOM_FRAMEWORK, attributes: DatadogAPIClient::V2::CustomFrameworkDataAttributes.new({ name: "name", handle: "create-framework-new", version: "10", icon_url: "test-url", requirements: [ DatadogAPIClient::V2::CustomFrameworkRequirement.new({ name: "requirement", controls: [ DatadogAPIClient::V2::CustomFrameworkControl.new({ name: "control", rules_id: [ "def-000-be9", ], }), ], }), ], }), }), }) p api_instance.create_custom_framework(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a custom framework returns "OK" response ``` // Create a custom framework returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CreateCustomFrameworkRequest; use datadog_api_client::datadogV2::model::CustomFrameworkControl; use datadog_api_client::datadogV2::model::CustomFrameworkData; use datadog_api_client::datadogV2::model::CustomFrameworkDataAttributes; use datadog_api_client::datadogV2::model::CustomFrameworkRequirement; use datadog_api_client::datadogV2::model::CustomFrameworkType; #[tokio::main] async fn main() { let body = CreateCustomFrameworkRequest::new(CustomFrameworkData::new( CustomFrameworkDataAttributes::new( "create-framework-new".to_string(), "name".to_string(), vec![CustomFrameworkRequirement::new( vec![CustomFrameworkControl::new( "control".to_string(), vec!["def-000-be9".to_string()], )], "requirement".to_string(), )], "10".to_string(), ) .icon_url("test-url".to_string()), CustomFrameworkType::CUSTOM_FRAMEWORK, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_custom_framework(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a custom framework returns "OK" response ``` /** * Create a custom framework returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateCustomFrameworkRequest = { body: { data: { type: "custom_framework", attributes: { name: "name", handle: "create-framework-new", version: "10", iconUrl: 
"test-url", requirements: [ { name: "requirement", controls: [ { name: "control", rulesId: ["def-000-be9"], }, ], }, ], }, }, }, }; apiInstance .createCustomFramework(params) .then((data: v2.CreateCustomFrameworkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-detection-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-detection-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.datadoghq.eu/api/v2/security_monitoring/ruleshttps://api.ddog-gov.com/api/v2/security_monitoring/ruleshttps://api.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.us3.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules ### Overview Create a detection rule. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description Option 1 object Create a new rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. 
groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. 
Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `api_security,application_security,log_detection,workload_security` Option 2 object Create a new signal correlation rule. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. 
Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. 
Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. 
signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting signals which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to group by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId [_required_] string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` Option 3 object Create a new cloud configuration rule. cases [_required_] [object] Description of generated findings and signals (severity and channels to be notified in case of a signal). Must contain exactly one item. notifications [string] Notification targets for each rule case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions [_required_] object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. filters [object] Additional queries to filter matched events before they are processed. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message in markdown format for generated findings and signals. name [_required_] string The name of the rule. options [_required_] object Options on cloud configuration rules. complianceRuleOptions [_required_] object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. tags [string] Tags for generated findings and signals. type enum The rule type. Allowed enum values: `cloud_configuration`
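The request samples below cover cloud configuration, log detection, and anomaly detection rules, but none of them exercise the scheduled-rule fields described above (`schedulingOptions`, and the `indexes` field on queries; `calculatedFields`, which is also limited to scheduled rules, is omitted here). The following is a minimal, hypothetical sketch of how those fields could fit into a log detection request body; the rule name, query, index name, RRULE, start date, and time zone are placeholder values, not taken from the reference above.

```
{
  "name": "Example scheduled detection rule (placeholder)",
  "type": "log_detection",
  "isEnabled": true,
  "message": "Scheduled rule example",
  "queries": [
    {
      "query": "@test:true",
      "aggregation": "count",
      "dataSource": "logs",
      "groupByFields": [],
      "distinctFields": [],
      "indexes": [ "main" ],
      "name": "a"
    }
  ],
  "cases": [
    { "name": "", "status": "info", "condition": "a > 0", "notifications": [] }
  ],
  "options": {
    "detectionMethod": "threshold",
    "evaluationWindow": 900,
    "keepAlive": 3600,
    "maxSignalDuration": 86400
  },
  "schedulingOptions": {
    "rrule": "FREQ=DAILY;INTERVAL=1",
    "start": "2024-01-01T00:00:00",
    "timezone": "America/New_York"
  },
  "tags": [],
  "filters": []
}
```

As documented above, when `schedulingOptions` is present the rule runs on the schedule instead of in real time on ingested logs.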
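Similarly, the samples below do not show the signal correlation variant (Option 2 above). Here is a minimal, hypothetical request body for a `signal_correlation` rule; the rule IDs (`rule-id-a`, `rule-id-b`), names, and field values are illustrative placeholders only.

```
{
  "name": "Example signal correlation rule (placeholder)",
  "type": "signal_correlation",
  "isEnabled": true,
  "message": "Correlated signals detected",
  "queries": [
    { "ruleId": "rule-id-a", "correlatedByFields": [ "@usr.email" ], "name": "a" },
    { "ruleId": "rule-id-b", "correlatedByFields": [ "@usr.email" ], "name": "b" }
  ],
  "cases": [
    { "name": "", "status": "high", "condition": "a > 0 && b > 0", "notifications": [] }
  ],
  "options": {
    "evaluationWindow": 900,
    "keepAlive": 3600,
    "maxSignalDuration": 86400
  },
  "tags": []
}
```

Each entry in `queries` references an existing detection rule through `ruleId`, and the case condition combines their matches using the logical operators listed above.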
##### Create a cloud_configuration rule returns "OK" response ``` { "type": "cloud_configuration", "name": "Example-Security-Monitoring_cloud", "isEnabled": false, "cases": [ { "status": "info", "notifications": [ "channel" ] } ], "options": { "complianceRuleOptions": { "resourceType": "gcp_compute_disk", "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = \"skip\" if {\n\tiam_service_account_key.disabled\n} else = \"pass\" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = \"fail\"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n", "resourceTypes": [ "gcp_compute_disk" ] } } }, "message": "ddd", "tags": [ "my:tag" ], "complianceSignalOptions": { "userActivationStatus": true, "userGroupByFields": [ "@account_id" ] }, "filters": [ { "action": "require", "query": "resource_id:helo*" }, { "action": "suppress", "query": "control:helo*" } ] } ``` Copy ##### Create a detection rule returns "OK" response ``` { "name": "Example-Security-Monitoring", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metric": "" } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection", "referenceTables": [ { "tableName": "synthetics_test_reference_table_dont_delete", "columnName": "value", "logFieldPath": "testtag", "checkPresence": true, "ruleQueryName": "a" } ] } ``` Copy ##### Create a detection rule with detection method 'anomaly_detection' returns "OK" response ``` { "name": "Example-Security-Monitoring", "type": "log_detection", "isEnabled": true, "queries": [ { "aggregation": "count", "dataSource": "logs", "distinctFields": [], "groupByFields": [ "@usr.email", "@network.client.ip" ], "hasOptionalGroupByFields": false, "name": "", "query": "service:app status:error" } ], "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "a > 0.995" } ], "message": "An anomaly detection rule", "options": { "detectionMethod": "anomaly_detection", "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400, "anomalyDetectionOptions": { "bucketDuration": 300, "learningDuration": 24, "detectionTolerance": 3, "learningPeriodBaseline": 10 } }, "tags": [], "filters": [] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Create a new rule.
Field Type Description Option 1 object Rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). defaultTags [string] Default Tags for default rules (included in tags) deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. 
Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. 
If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. 
metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection,infrastructure_configuration,workload_security,cloud_configuration,application_security,api_security` updateAuthorId int64 User ID of the user who updated the rule. updatedAt int64 The date the rule was last updated, in milliseconds. version int64 The version of the rule. Option 2 object Rule. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). 
customName string Custom/Overridden name of the rule (used in case of Default rule update). deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy, written in `rego`. resourceTypes [_required_] [string] List of resource types the rule is evaluated against. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours.
In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. 
Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to correlate by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. defaultRuleId string Default Rule ID to match on signals. distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` updateAuthorId int64 User ID of the user who updated the rule. version int64 The version of the rule.

```
{ "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "customStatus": "critical", "name": "string", "notifications": [], "status": "critical" } ], "complianceSignalOptions": { "defaultActivationStatus": false, "defaultGroupByFields": [], "userActivationStatus": false, "userGroupByFields": [] }, "createdAt": "integer", "creationAuthorId": "integer", "customMessage": "string", "customName": "string", "defaultTags": [ "security:attacks" ], "deprecationDate": "integer", "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": false, "id": "string", "isDefault": false, "isDeleted": false, "isEnabled": false, "message": "string", "name": "string", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [], "thirdPartyCases": [ { "customStatus": "critical", "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string", "updateAuthorId": "integer", "updatedAt": "integer", "version": "integer" }
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Not Authorized

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Create a cloud_configuration rule returns "OK" response

```
# Curl command (replace api.datadoghq.com with your site's endpoint:
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "type": "cloud_configuration", "name": "Example-Security-Monitoring_cloud", "isEnabled": false, "cases": [ { "status": "info", "notifications": [ "channel" ] } ], "options": { "complianceRuleOptions": { "resourceType": "gcp_compute_disk", "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = \"skip\" if {\n\tiam_service_account_key.disabled\n} else = \"pass\" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = \"fail\"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n", "resourceTypes": [ "gcp_compute_disk" ] } } }, "message": "ddd", "tags": [ "my:tag" ], "complianceSignalOptions": { "userActivationStatus": true, "userGroupByFields": [ "@account_id" ] }, "filters": [ { "action": "require", "query": "resource_id:helo*" }, { "action": "suppress", "query": "control:helo*" } ] }
EOF
```

##### Create a detection rule returns "OK" response

```
# Curl command (replace api.datadoghq.com with your site's endpoint if needed)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "name": "Example-Security-Monitoring", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metric": "" } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection", "referenceTables": [ { "tableName": "synthetics_test_reference_table_dont_delete", "columnName": "value", "logFieldPath": "testtag", "checkPresence": true, "ruleQueryName": "a" } ] }
EOF
```

##### Create a detection rule with detection method 'anomaly_detection' returns "OK" response

```
# Curl command (replace api.datadoghq.com with your site's endpoint if needed)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "name": "Example-Security-Monitoring", "type": "log_detection", "isEnabled": true, "queries": [ { "aggregation": "count", "dataSource": "logs", "distinctFields": [], "groupByFields": [ "@usr.email", "@network.client.ip" ], "hasOptionalGroupByFields": false, "name": "", "query": "service:app status:error" } ], "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "a > 0.995" } ], "message": "An anomaly detection rule", "options": { "detectionMethod": "anomaly_detection", "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400, "anomalyDetectionOptions": { "bucketDuration": 300, "learningDuration": 24, "detectionTolerance": 3, "learningPeriodBaseline": 10 } }, "tags": [], "filters": [] }
EOF
```

##### Create a cloud_configuration rule returns "OK" response

```
// Create a cloud_configuration rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleCreatePayload{ CloudConfigurationRuleCreatePayload: &datadogV2.CloudConfigurationRuleCreatePayload{ Type: datadogV2.CLOUDCONFIGURATIONRULETYPE_CLOUD_CONFIGURATION.Ptr(), Name: "Example-Security-Monitoring_cloud", IsEnabled: false, Cases: []datadogV2.CloudConfigurationRuleCaseCreate{ { Status:
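// The single rule case: the severity of generated signals and their notification target ("channel").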
datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{ "channel", }, }, }, Options: datadogV2.CloudConfigurationRuleOptions{ ComplianceRuleOptions: datadogV2.CloudConfigurationComplianceRuleOptions{ ResourceType: datadog.PtrString("gcp_compute_disk"), ComplexRule: datadog.PtrBool(false), RegoRule: &datadogV2.CloudConfigurationRegoRule{ Policy: `package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } `, ResourceTypes: []string{ "gcp_compute_disk", }, }, }, }, Message: "ddd", Tags: []string{ "my:tag", }, ComplianceSignalOptions: datadogV2.CloudConfigurationRuleComplianceSignalOptions{ UserActivationStatus: *datadog.NewNullableBool(datadog.PtrBool(true)), UserGroupByFields: *datadog.NewNullableList(&[]string{ "@account_id", }), }, Filters: []datadogV2.SecurityMonitoringFilter{ { Action: datadogV2.SECURITYMONITORINGFILTERACTION_REQUIRE.Ptr(), Query: datadog.PtrString("resource_id:helo*"), }, { Action: datadogV2.SECURITYMONITORINGFILTERACTION_SUPPRESS.Ptr(), Query: datadog.PtrString("control:helo*"), }, }, }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy ##### Create a detection rule returns "OK" response ``` // Create a detection rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleCreatePayload{ SecurityMonitoringStandardRuleCreatePayload: &datadogV2.SecurityMonitoringStandardRuleCreatePayload{ Name: "Example-Security-Monitoring", Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("@test:true"), Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), GroupByFields: []string{}, DistinctFields: []string{}, Metric: datadog.PtrString(""), }, }, Filters: []datadogV2.SecurityMonitoringFilter{}, Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Condition: datadog.PtrString("a > 0"), Notifications: []string{}, }, }, Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ONE_HOUR.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ONE_DAY.Ptr(), }, 
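// Sliding 15-minute evaluation window; a signal stays open while the case keeps matching within 1 hour, and closes after at most 1 day.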
Message: "Test rule", Tags: []string{}, IsEnabled: true, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), ReferenceTables: []datadogV2.SecurityMonitoringReferenceTable{ { TableName: datadog.PtrString("synthetics_test_reference_table_dont_delete"), ColumnName: datadog.PtrString("value"), LogFieldPath: datadog.PtrString("testtag"), CheckPresence: datadog.PtrBool(true), RuleQueryName: datadog.PtrString("a"), }, }, }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Create a detection rule with detection method 'sequence_detection' returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleCreatePayload{ SecurityMonitoringStandardRuleCreatePayload: &datadogV2.SecurityMonitoringStandardRuleCreatePayload{ Name: "Example-Security-Monitoring", Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), IsEnabled: true, Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), DataSource: datadogV2.SECURITYMONITORINGSTANDARDDATASOURCE_LOGS.Ptr(), DistinctFields: []string{}, GroupByFields: []string{}, HasOptionalGroupByFields: datadog.PtrBool(false), Name: datadog.PtrString(""), Query: datadog.PtrString("service:logs-rule-reducer source:paul test2"), }, { Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), DataSource: datadogV2.SECURITYMONITORINGSTANDARDDATASOURCE_LOGS.Ptr(), DistinctFields: []string{}, GroupByFields: []string{}, HasOptionalGroupByFields: datadog.PtrBool(false), Name: datadog.PtrString(""), Query: datadog.PtrString("service:logs-rule-reducer source:paul test1"), }, }, Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, Condition: datadog.PtrString("step_b > 0"), }, }, Message: "Logs and signals asdf", Options: datadogV2.SecurityMonitoringRuleOptions{ DetectionMethod: datadogV2.SECURITYMONITORINGRULEDETECTIONMETHOD_SEQUENCE_DETECTION.Ptr(), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ZERO_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_FIVE_MINUTES.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_TEN_MINUTES.Ptr(), SequenceDetectionOptions: &datadogV2.SecurityMonitoringRuleSequenceDetectionOptions{ StepTransitions: []datadogV2.SecurityMonitoringRuleSequenceDetectionStepTransition{ { Child: datadog.PtrString("step_b"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), Parent: datadog.PtrString("step_a"), }, }, Steps: []datadogV2.SecurityMonitoringRuleSequenceDetectionStep{ { 
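// Ordered sequence steps: "step_a" (condition "a > 0") must be followed by "step_b" (condition "b > 0"), each evaluated over a 1-minute window.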
Condition: datadog.PtrString("a > 0"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ONE_MINUTE.Ptr(), Name: datadog.PtrString("step_a"), }, { Condition: datadog.PtrString("b > 0"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ONE_MINUTE.Ptr(), Name: datadog.PtrString("step_b"), }, }, }, }, Tags: []string{}, }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a cloud_configuration rule returns "OK" response ``` // Create a cloud_configuration rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CloudConfigurationComplianceRuleOptions; import com.datadog.api.client.v2.model.CloudConfigurationRegoRule; import com.datadog.api.client.v2.model.CloudConfigurationRuleCaseCreate; import com.datadog.api.client.v2.model.CloudConfigurationRuleComplianceSignalOptions; import com.datadog.api.client.v2.model.CloudConfigurationRuleCreatePayload; import com.datadog.api.client.v2.model.CloudConfigurationRuleOptions; import com.datadog.api.client.v2.model.CloudConfigurationRuleType; import com.datadog.api.client.v2.model.SecurityMonitoringFilter; import com.datadog.api.client.v2.model.SecurityMonitoringFilterAction; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleCreatePayload body = new SecurityMonitoringRuleCreatePayload( new CloudConfigurationRuleCreatePayload() .type(CloudConfigurationRuleType.CLOUD_CONFIGURATION) .name("Example-Security-Monitoring_cloud") .isEnabled(false) .cases( Collections.singletonList( new CloudConfigurationRuleCaseCreate() .status(SecurityMonitoringRuleSeverity.INFO) .notifications(Collections.singletonList("channel")))) .options( new CloudConfigurationRuleOptions() .complianceRuleOptions( new CloudConfigurationComplianceRuleOptions() .resourceType("gcp_compute_disk") .complexRule(false) .regoRule( new CloudConfigurationRegoRule() .policy( """ package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 
60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } """) .resourceTypes( Collections.singletonList("gcp_compute_disk"))))) .message("ddd") .tags(Collections.singletonList("my:tag")) .complianceSignalOptions( new CloudConfigurationRuleComplianceSignalOptions() .userActivationStatus(true) .userGroupByFields(Collections.singletonList("@account_id"))) .filters( Arrays.asList( new SecurityMonitoringFilter() .action(SecurityMonitoringFilterAction.REQUIRE) .query("resource_id:helo*"), new SecurityMonitoringFilter() .action(SecurityMonitoringFilterAction.SUPPRESS) .query("control:helo*")))); try { SecurityMonitoringRuleResponse result = apiInstance.createSecurityMonitoringRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a detection rule returns "OK" response ``` // Create a detection rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringReferenceTable; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleCreatePayload body = new SecurityMonitoringRuleCreatePayload( new SecurityMonitoringStandardRuleCreatePayload() .name("Example-Security-Monitoring") .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("@test:true") .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .metric(""))) .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES) 
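// 15-minute sliding evaluation window; keepAlive and maxSignalDuration are chained below.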
.keepAlive(SecurityMonitoringRuleKeepAlive.ONE_HOUR) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.ONE_DAY)) .message("Test rule") .isEnabled(true) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION) .referenceTables( Collections.singletonList( new SecurityMonitoringReferenceTable() .tableName("synthetics_test_reference_table_dont_delete") .columnName("value") .logFieldPath("testtag") .checkPresence(true) .ruleQueryName("a")))); try { SecurityMonitoringRuleResponse result = apiInstance.createSecurityMonitoringRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Create a detection rule with detection method 'sequence_detection' returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleDetectionMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionStep; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionStepTransition; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringStandardDataSource; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleCreatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleCreatePayload body = new SecurityMonitoringRuleCreatePayload( new SecurityMonitoringStandardRuleCreatePayload() .name("Example-Security-Monitoring") .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION) .isEnabled(true) .queries( Arrays.asList( new SecurityMonitoringStandardRuleQuery() .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .dataSource(SecurityMonitoringStandardDataSource.LOGS) .hasOptionalGroupByFields(false) .name("") .query("service:logs-rule-reducer source:paul test2"), new SecurityMonitoringStandardRuleQuery() .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .dataSource(SecurityMonitoringStandardDataSource.LOGS) .hasOptionalGroupByFields(false) .name("") 
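// Queries left unnamed are referenced positionally ("a", "b", ...) by the sequence step conditions defined below.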
.query("service:logs-rule-reducer source:paul test1"))) .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("step_b > 0"))) .message("Logs and signals asdf") .options( new SecurityMonitoringRuleOptions() .detectionMethod(SecurityMonitoringRuleDetectionMethod.SEQUENCE_DETECTION) .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.FIVE_MINUTES) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES) .sequenceDetectionOptions( new SecurityMonitoringRuleSequenceDetectionOptions() .stepTransitions( Collections.singletonList( new SecurityMonitoringRuleSequenceDetectionStepTransition() .child("step_b") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow .FIFTEEN_MINUTES) .parent("step_a"))) .steps( Arrays.asList( new SecurityMonitoringRuleSequenceDetectionStep() .condition("a > 0") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE) .name("step_a"), new SecurityMonitoringRuleSequenceDetectionStep() .condition("b > 0") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE) .name("step_b")))))); try { SecurityMonitoringRuleResponse result = apiInstance.createSecurityMonitoringRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a cloud_configuration rule returns "OK" response ``` """ Create a cloud_configuration rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.cloud_configuration_compliance_rule_options import ( CloudConfigurationComplianceRuleOptions, ) from datadog_api_client.v2.model.cloud_configuration_rego_rule import CloudConfigurationRegoRule from datadog_api_client.v2.model.cloud_configuration_rule_case_create import CloudConfigurationRuleCaseCreate from datadog_api_client.v2.model.cloud_configuration_rule_compliance_signal_options import ( CloudConfigurationRuleComplianceSignalOptions, ) from datadog_api_client.v2.model.cloud_configuration_rule_create_payload import CloudConfigurationRuleCreatePayload from datadog_api_client.v2.model.cloud_configuration_rule_options import CloudConfigurationRuleOptions from datadog_api_client.v2.model.cloud_configuration_rule_type import CloudConfigurationRuleType from datadog_api_client.v2.model.security_monitoring_filter import SecurityMonitoringFilter from datadog_api_client.v2.model.security_monitoring_filter_action import SecurityMonitoringFilterAction from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity body = CloudConfigurationRuleCreatePayload( type=CloudConfigurationRuleType.CLOUD_CONFIGURATION, name="Example-Security-Monitoring_cloud", is_enabled=False, 
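# The rule is created disabled (is_enabled=False); cases, options, compliance signal options, and filters follow.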
cases=[ CloudConfigurationRuleCaseCreate( status=SecurityMonitoringRuleSeverity.INFO, notifications=[ "channel", ], ), ], options=CloudConfigurationRuleOptions( compliance_rule_options=CloudConfigurationComplianceRuleOptions( resource_type="gcp_compute_disk", complex_rule=False, rego_rule=CloudConfigurationRegoRule( policy='package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = "skip" if {\n\tiam_service_account_key.disabled\n} else = "pass" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = "fail"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n', resource_types=[ "gcp_compute_disk", ], ), ), ), message="ddd", tags=[ "my:tag", ], compliance_signal_options=CloudConfigurationRuleComplianceSignalOptions( user_activation_status=True, user_group_by_fields=[ "@account_id", ], ), filters=[ SecurityMonitoringFilter( action=SecurityMonitoringFilterAction.REQUIRE, query="resource_id:helo*", ), SecurityMonitoringFilter( action=SecurityMonitoringFilterAction.SUPPRESS, query="control:helo*", ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_monitoring_rule(body=body) print(response) ``` Copy ##### Create a detection rule returns "OK" response ``` """ Create a detection rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_reference_table import SecurityMonitoringReferenceTable from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_rule_create_payload import ( SecurityMonitoringStandardRuleCreatePayload, ) from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRuleCreatePayload( name="Example-Security-Monitoring", queries=[ SecurityMonitoringStandardRuleQuery( query="@test:true", aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, group_by_fields=[], distinct_fields=[], metric="", ), ], filters=[], cases=[ SecurityMonitoringRuleCaseCreate( name="", 
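# Single case: emit an INFO signal whenever the count query matches (condition "a > 0").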
status=SecurityMonitoringRuleSeverity.INFO, condition="a > 0", notifications=[], ), ], options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.ONE_HOUR, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ONE_DAY, ), message="Test rule", tags=[], is_enabled=True, type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, reference_tables=[ SecurityMonitoringReferenceTable( table_name="synthetics_test_reference_table_dont_delete", column_name="value", log_field_path="testtag", check_presence=True, rule_query_name="a", ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_monitoring_rule(body=body) print(response) ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` """ Create a detection rule with detection method 'sequence_detection' returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_options import ( SecurityMonitoringRuleSequenceDetectionOptions, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_step import ( SecurityMonitoringRuleSequenceDetectionStep, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_step_transition import ( SecurityMonitoringRuleSequenceDetectionStepTransition, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_data_source import SecurityMonitoringStandardDataSource from datadog_api_client.v2.model.security_monitoring_standard_rule_create_payload import ( SecurityMonitoringStandardRuleCreatePayload, ) from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRuleCreatePayload( name="Example-Security-Monitoring", type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, is_enabled=True, queries=[ SecurityMonitoringStandardRuleQuery( aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, data_source=SecurityMonitoringStandardDataSource.LOGS, distinct_fields=[], group_by_fields=[], has_optional_group_by_fields=False, name="", query="service:logs-rule-reducer source:paul test2", ), SecurityMonitoringStandardRuleQuery( 
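# Second sequence query, referenced positionally as "b" by the step conditions.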
aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, data_source=SecurityMonitoringStandardDataSource.LOGS, distinct_fields=[], group_by_fields=[], has_optional_group_by_fields=False, name="", query="service:logs-rule-reducer source:paul test1", ), ], cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], condition="step_b > 0", ), ], message="Logs and signals asdf", options=SecurityMonitoringRuleOptions( detection_method=SecurityMonitoringRuleDetectionMethod.SEQUENCE_DETECTION, evaluation_window=SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.FIVE_MINUTES, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES, sequence_detection_options=SecurityMonitoringRuleSequenceDetectionOptions( step_transitions=[ SecurityMonitoringRuleSequenceDetectionStepTransition( child="step_b", evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, parent="step_a", ), ], steps=[ SecurityMonitoringRuleSequenceDetectionStep( condition="a > 0", evaluation_window=SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE, name="step_a", ), SecurityMonitoringRuleSequenceDetectionStep( condition="b > 0", evaluation_window=SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE, name="step_b", ), ], ), ), tags=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_monitoring_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a cloud_configuration rule returns "OK" response ``` # Create a cloud_configuration rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CloudConfigurationRuleCreatePayload.new({ type: DatadogAPIClient::V2::CloudConfigurationRuleType::CLOUD_CONFIGURATION, name: "Example-Security-Monitoring_cloud", is_enabled: false, cases: [ DatadogAPIClient::V2::CloudConfigurationRuleCaseCreate.new({ status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [ "channel", ], }), ], options: DatadogAPIClient::V2::CloudConfigurationRuleOptions.new({ compliance_rule_options: DatadogAPIClient::V2::CloudConfigurationComplianceRuleOptions.new({ resource_type: "gcp_compute_disk", complex_rule: false, rego_rule: DatadogAPIClient::V2::CloudConfigurationRegoRule.new({ policy: 'package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = "skip" if {\n\tiam_service_account_key.disabled\n} else = "pass" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = "fail"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n', resource_types: [ "gcp_compute_disk", ], }), }), }), message: 
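# Message attached to generated signals; tags, compliance signal options, and filters follow.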
"ddd", tags: [ "my:tag", ], compliance_signal_options: DatadogAPIClient::V2::CloudConfigurationRuleComplianceSignalOptions.new({ user_activation_status: true, user_group_by_fields: [ "@account_id", ], }), filters: [ DatadogAPIClient::V2::SecurityMonitoringFilter.new({ action: DatadogAPIClient::V2::SecurityMonitoringFilterAction::REQUIRE, query: "resource_id:helo*", }), DatadogAPIClient::V2::SecurityMonitoringFilter.new({ action: DatadogAPIClient::V2::SecurityMonitoringFilterAction::SUPPRESS, query: "control:helo*", }), ], }) p api_instance.create_security_monitoring_rule(body) ``` Copy ##### Create a detection rule returns "OK" response ``` # Create a detection rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRuleCreatePayload.new({ name: "Example-Security-Monitoring", queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "@test:true", aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, group_by_fields: [], distinct_fields: [], metric: "", }), ], filters: [], cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, condition: "a > 0", notifications: [], }), ], options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ONE_HOUR, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ONE_DAY, }), message: "Test rule", tags: [], is_enabled: true, type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, reference_tables: [ DatadogAPIClient::V2::SecurityMonitoringReferenceTable.new({ table_name: "synthetics_test_reference_table_dont_delete", column_name: "value", log_field_path: "testtag", check_presence: true, rule_query_name: "a", }), ], }) p api_instance.create_security_monitoring_rule(body) ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` # Create a detection rule with detection method 'sequence_detection' returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRuleCreatePayload.new({ name: "Example-Security-Monitoring", type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, is_enabled: true, queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, data_source: DatadogAPIClient::V2::SecurityMonitoringStandardDataSource::LOGS, distinct_fields: [], group_by_fields: [], has_optional_group_by_fields: false, name: "", query: "service:logs-rule-reducer source:paul test2", }), DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, data_source: DatadogAPIClient::V2::SecurityMonitoringStandardDataSource::LOGS, distinct_fields: [], group_by_fields: [], has_optional_group_by_fields: false, name: "", query: "service:logs-rule-reducer source:paul test1", }), ], cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], condition: 
"step_b > 0", }), ], message: "Logs and signals asdf", options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ detection_method: DatadogAPIClient::V2::SecurityMonitoringRuleDetectionMethod::SEQUENCE_DETECTION, evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::FIVE_MINUTES, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES, sequence_detection_options: DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionOptions.new({ step_transitions: [ DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStepTransition.new({ child: "step_b", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, parent: "step_a", }), ], steps: [ DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStep.new({ condition: "a > 0", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, name: "step_a", }), DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStep.new({ condition: "b > 0", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, name: "step_b", }), ], }), }), tags: [], }) p api_instance.create_security_monitoring_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a cloud_configuration rule returns "OK" response ``` // Create a cloud_configuration rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CloudConfigurationComplianceRuleOptions; use datadog_api_client::datadogV2::model::CloudConfigurationRegoRule; use datadog_api_client::datadogV2::model::CloudConfigurationRuleCaseCreate; use datadog_api_client::datadogV2::model::CloudConfigurationRuleComplianceSignalOptions; use datadog_api_client::datadogV2::model::CloudConfigurationRuleCreatePayload; use datadog_api_client::datadogV2::model::CloudConfigurationRuleOptions; use datadog_api_client::datadogV2::model::CloudConfigurationRuleType; use datadog_api_client::datadogV2::model::SecurityMonitoringFilter; use datadog_api_client::datadogV2::model::SecurityMonitoringFilterAction; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleCreatePayload::CloudConfigurationRuleCreatePayload( Box::new( CloudConfigurationRuleCreatePayload::new( vec![ CloudConfigurationRuleCaseCreate::new( SecurityMonitoringRuleSeverity::INFO, ).notifications(vec!["channel".to_string()]) ], CloudConfigurationRuleComplianceSignalOptions::new() .user_activation_status(Some(true)) .user_group_by_fields(Some(vec!["@account_id".to_string()])), false, "ddd".to_string(), "Example-Security-Monitoring_cloud".to_string(), CloudConfigurationRuleOptions::new( CloudConfigurationComplianceRuleOptions::new() .complex_rule(false) .rego_rule( CloudConfigurationRegoRule::new( r#"package datadog import data.datadog.output as dd_output import future.keywords.contains import 
future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } "#.to_string(), vec!["gcp_compute_disk".to_string()], ), ) .resource_type("gcp_compute_disk".to_string()), ), ) .filters( vec![ SecurityMonitoringFilter::new() .action(SecurityMonitoringFilterAction::REQUIRE) .query("resource_id:helo*".to_string()), SecurityMonitoringFilter::new() .action(SecurityMonitoringFilterAction::SUPPRESS) .query("control:helo*".to_string()) ], ) .tags(vec!["my:tag".to_string()]) .type_(CloudConfigurationRuleType::CLOUD_CONFIGURATION), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a detection rule returns "OK" response ``` // Create a detection rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringReferenceTable; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleCreatePayload::SecurityMonitoringStandardRuleCreatePayload(Box::new( SecurityMonitoringStandardRuleCreatePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "Test rule".to_string(), "Example-Security-Monitoring".to_string(), SecurityMonitoringRuleOptions::new() .evaluation_window(SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::ONE_HOUR) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ONE_DAY), vec![SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .metric("".to_string()) .query("@test:true".to_string())], ) .filters(vec![]) .reference_tables(vec![SecurityMonitoringReferenceTable::new() .check_presence(true) .column_name("value".to_string()) .log_field_path("testtag".to_string()) .rule_query_name("a".to_string()) 
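// Reference table lookup for query "a": the log field "testtag" is checked against the table's "value" column.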
.table_name("synthetics_test_reference_table_dont_delete".to_string())]) .tags(vec![]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Create a detection rule with detection method 'sequence_detection' returns "OK" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleDetectionMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionStep; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionStepTransition; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardDataSource; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleCreatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleCreatePayload::SecurityMonitoringStandardRuleCreatePayload(Box::new( SecurityMonitoringStandardRuleCreatePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("step_b > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "Logs and signals asdf".to_string(), "Example-Security-Monitoring".to_string(), SecurityMonitoringRuleOptions::new() .detection_method(SecurityMonitoringRuleDetectionMethod::SEQUENCE_DETECTION) .evaluation_window(SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::FIVE_MINUTES) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES) .sequence_detection_options( SecurityMonitoringRuleSequenceDetectionOptions::new() .step_transitions(vec![ SecurityMonitoringRuleSequenceDetectionStepTransition::new() .child("step_b".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, ) .parent("step_a".to_string()), ]) .steps(vec![ SecurityMonitoringRuleSequenceDetectionStep::new() .condition("a > 0".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, ) .name("step_a".to_string()), SecurityMonitoringRuleSequenceDetectionStep::new() .condition("b > 0".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, ) .name("step_b".to_string()), ]), ), vec![ SecurityMonitoringStandardRuleQuery::new() 
.aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .data_source(SecurityMonitoringStandardDataSource::LOGS) .distinct_fields(vec![]) .group_by_fields(vec![]) .has_optional_group_by_fields(false) .name("".to_string()) .query("service:logs-rule-reducer source:paul test2".to_string()), SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .data_source(SecurityMonitoringStandardDataSource::LOGS) .distinct_fields(vec![]) .group_by_fields(vec![]) .has_optional_group_by_fields(false) .name("".to_string()) .query("service:logs-rule-reducer source:paul test1".to_string()), ], ) .tags(vec![]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a cloud_configuration rule returns "OK" response ``` /** * Create a cloud_configuration rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSecurityMonitoringRuleRequest = { body: { type: "cloud_configuration", name: "Example-Security-Monitoring_cloud", isEnabled: false, cases: [ { status: "info", notifications: ["channel"], }, ], options: { complianceRuleOptions: { resourceType: "gcp_compute_disk", complexRule: false, regoRule: { policy: `package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } `, resourceTypes: ["gcp_compute_disk"], }, }, }, message: "ddd", tags: ["my:tag"], complianceSignalOptions: { userActivationStatus: true, userGroupByFields: ["@account_id"], }, filters: [ { action: "require", query: "resource_id:helo*", }, { action: "suppress", query: "control:helo*", }, ], }, }; apiInstance .createSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a detection rule returns "OK" response ``` /** * Create a detection rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSecurityMonitoringRuleRequest = { body: { name: "Example-Security-Monitoring", queries: [ { query: "@test:true", aggregation: "count", groupByFields: [], distinctFields: [], metric: "", }, ], filters: [], cases: [ { name: "", status: "info", condition: "a > 0", notifications: [], }, ], options: { evaluationWindow: 900, keepAlive: 3600, maxSignalDuration: 86400, }, message: "Test rule", tags: [], isEnabled: true, type: "log_detection", referenceTables: [ { tableName: "synthetics_test_reference_table_dont_delete", columnName: "value", logFieldPath: "testtag", checkPresence: true, ruleQueryName: "a", }, ], }, }; apiInstance .createSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a detection rule with detection method 'sequence_detection' returns "OK" response ``` /** * Create a detection rule with detection method 'sequence_detection' returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSecurityMonitoringRuleRequest = { body: { name: "Example-Security-Monitoring", type: "log_detection", isEnabled: true, queries: [ { aggregation: "count", dataSource: "logs", distinctFields: [], groupByFields: [], hasOptionalGroupByFields: false, name: "", query: "service:logs-rule-reducer source:paul test2", }, { aggregation: "count", dataSource: "logs", distinctFields: [], groupByFields: [], hasOptionalGroupByFields: false, name: "", query: "service:logs-rule-reducer source:paul test1", }, ], cases: [ { name: "", status: "info", notifications: [], condition: "step_b > 0", }, ], message: "Logs and signals asdf", options: { detectionMethod: "sequence_detection", evaluationWindow: 0, keepAlive: 300, maxSignalDuration: 600, sequenceDetectionOptions: { stepTransitions: [ { child: "step_b", evaluationWindow: 900, parent: "step_a", }, ], steps: [ { condition: "a > 0", evaluationWindow: 60, name: "step_a", }, { condition: "b > 0", evaluationWindow: 60, name: "step_b", }, ], }, }, tags: [], }, }; apiInstance .createSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-custom-framework-v2) DELETE https://api.ap1.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ap2.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.eu/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ddog-gov.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us3.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version} ### Overview Delete a custom framework. This endpoint requires all of the following permissions: * `security_monitoring_rules_read` * `security_monitoring_rules_write` OAuth apps require the `security_monitoring_rules_read, security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description handle [_required_] string The framework handle version [_required_] string The framework version ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteCustomFramework-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteCustomFramework-400-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteCustomFramework-429-v2) * [500](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteCustomFramework-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object to delete a custom framework. Field Type Description data [_required_] object Metadata for custom frameworks. attributes object Framework without requirements. description string Framework Description handle [_required_] string Framework Handle icon_url string Framework Icon URL name [_required_] string Framework Name version [_required_] string Framework Version id string The ID of the custom framework. type enum The type of the resource. The value must be `custom_framework`. Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "attributes": { "description": "this is a security description", "handle": "sec2", "icon_url": "https://example.com/icon.png", "name": "security-framework", "version": "2" }, "id": "handle-version", "type": "custom_framework" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Delete a custom framework Copy ``` # Path parameters export handle="CHANGE_ME" export version="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/${handle}/${version}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a custom framework ``` """ Delete a custom framework returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.delete_custom_framework( handle="create-framework-new", version="10", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a custom framework ``` # Delete a custom framework returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.delete_custom_framework("create-framework-new", "10") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a custom framework ``` // Delete a custom framework returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.DeleteCustomFramework(ctx, "create-framework-new", "10") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteCustomFramework`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.DeleteCustomFramework`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a custom framework ``` // Delete a custom framework returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.DeleteCustomFrameworkResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { DeleteCustomFrameworkResponse result = apiInstance.deleteCustomFramework("create-framework-new", "10"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#deleteCustomFramework"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a custom framework ``` // Delete a custom framework returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_custom_framework("create-framework-new".to_string(), "10".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a custom framework ``` /** * Delete a custom framework returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: 
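// handle and version are the path parameters that identify the custom framework to delete.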
v2.SecurityMonitoringApiDeleteCustomFrameworkRequest = { handle: "create-framework-new", version: "10", }; apiInstance .deleteCustomFramework(params) .then((data: v2.DeleteCustomFrameworkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-custom-framework) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-custom-framework-v2) GET https://api.ap1.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ap2.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.eu/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ddog-gov.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us3.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version} ### Overview Get a custom framework. This endpoint requires the `security_monitoring_rules_read` permission. OAuth apps require the `security_monitoring_rules_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description handle [_required_] string The framework handle version [_required_] string The framework version ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetCustomFramework-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetCustomFramework-400-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetCustomFramework-429-v2) * [500](https://docs.datadoghq.com/api/latest/security-monitoring/#GetCustomFramework-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object to get a custom framework. Field Type Description data [_required_] object Contains type and attributes for custom frameworks. attributes [_required_] object Full Framework Data Attributes. handle [_required_] string Framework Handle icon_url string Framework Icon URL name [_required_] string Framework Name requirements [_required_] [object] Framework Requirements controls [_required_] [object] Requirement Controls. name [_required_] string Control Name. rules_id [_required_] [string] Rule IDs. name [_required_] string Requirement Name. version [_required_] string Framework Version id [_required_] string The ID of the custom framework. type [_required_] enum The type of the resource. The value must be `custom_framework`. 
Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "attributes": { "handle": "sec2", "icon_url": "https://example.com/icon.png", "name": "security-framework", "requirements": [ { "controls": [ { "name": "A1.2", "rules_id": [ "def-000-abc" ] } ], "name": "criteria" } ], "version": "2" }, "id": "handle-version", "type": "custom_framework" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a custom framework Copy ``` # Path parameters export handle="CHANGE_ME" export version="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/${handle}/${version}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a custom framework ``` """ Get a custom framework returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_custom_framework( handle="create-framework-new", version="10", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a custom framework ``` # Get a custom framework returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_custom_framework("create-framework-new", "10") ``` Copy #### Instructions First [install the library 
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a custom framework ``` // Get a custom framework returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetCustomFramework(ctx, "create-framework-new", "10") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetCustomFramework`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetCustomFramework`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a custom framework ``` // Get a custom framework returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.GetCustomFrameworkResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { GetCustomFrameworkResponse result = apiInstance.getCustomFramework("create-framework-new", "10"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getCustomFramework"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a custom framework ``` // Get a custom framework returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_custom_framework("create-framework-new".to_string(), "10".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) 
and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a custom framework ``` /** * Get a custom framework returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetCustomFrameworkRequest = { handle: "create-framework-new", version: "10", }; apiInstance .getCustomFramework(params) .then((data: v2.GetCustomFrameworkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List rules](https://docs.datadoghq.com/api/latest/security-monitoring/#list-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.datadoghq.eu/api/v2/security_monitoring/ruleshttps://api.ddog-gov.com/api/v2/security_monitoring/ruleshttps://api.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.us3.datadoghq.com/api/v2/security_monitoring/ruleshttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules ### Overview List rules. This endpoint requires the `security_monitoring_rules_read` permission. OAuth apps require the `security_monitoring_rules_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringRules-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringRules-400-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) List of rules. Field Type Description data [ ] Array containing the list of rules. Option 1 object Rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. 
The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). defaultTags [string] Default Tags for default rules (included in tags) deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. 
policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. 
This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. 
When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection,infrastructure_configuration,workload_security,cloud_configuration,application_security,api_security` updateAuthorId int64 User ID of the user who updated the rule. updatedAt int64 The date the rule was last updated, in milliseconds. version int64 The version of the rule. Option 2 object Rule. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. 
isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. 
forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to correlate by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. defaultRuleId string Default Rule ID to match on signals. distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` updateAuthorId int64 User ID of the user who updated the rule. version int64 The version of the rule. 
meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "customStatus": "critical", "name": "string", "notifications": [], "status": "critical" } ], "complianceSignalOptions": { "defaultActivationStatus": false, "defaultGroupByFields": [], "userActivationStatus": false, "userGroupByFields": [] }, "createdAt": "integer", "creationAuthorId": "integer", "customMessage": "string", "customName": "string", "defaultTags": [ "security:attacks" ], "deprecationDate": "integer", "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": false, "id": "string", "isDefault": false, "isDeleted": false, "isEnabled": false, "message": "string", "name": "string", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [], "thirdPartyCases": [ { 
"customStatus": "critical", "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string", "updateAuthorId": "integer", "updatedAt": "integer", "version": "integer" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List rules ``` """ List rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_security_monitoring_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List rules ``` # List rules returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_security_monitoring_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List rules ``` // List rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListSecurityMonitoringRules(ctx, *datadogV2.NewListSecurityMonitoringRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListSecurityMonitoringRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListSecurityMonitoringRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List rules ``` // List rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringListRulesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringListRulesResponse result = apiInstance.listSecurityMonitoringRules(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#listSecurityMonitoringRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List rules ``` // List rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListSecurityMonitoringRulesOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_security_monitoring_rules(ListSecurityMonitoringRulesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List rules ``` /** * List rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new 
v2.SecurityMonitoringApi(configuration); apiInstance .listSecurityMonitoringRules() .then((data: v2.SecurityMonitoringListRulesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a custom framework](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-custom-framework-v2) PUT https://api.ap1.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ap2.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.eu/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.ddog-gov.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us3.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version}https://api.us5.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/{handle}/{version} ### Overview Update a custom framework. This endpoint requires all of the following permissions: * `security_monitoring_rules_read` * `security_monitoring_rules_write` OAuth apps require the `security_monitoring_rules_read, security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description handle [_required_] string The framework handle version [_required_] string The framework version ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Contains type and attributes for custom frameworks. attributes [_required_] object Framework Data Attributes. description string Framework Description handle [_required_] string Framework Handle icon_url string Framework Icon URL name [_required_] string Framework Name requirements [_required_] [object] Framework Requirements controls [_required_] [object] Requirement Controls. name [_required_] string Control Name. rules_id [_required_] [string] Rule IDs. name [_required_] string Requirement Name. version [_required_] string Framework Version type [_required_] enum The type of the resource. The value must be `custom_framework`. 
Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "type": "custom_framework", "attributes": { "name": "name", "handle": "create-framework-new", "version": "10", "icon_url": "test-url", "requirements": [ { "name": "requirement", "controls": [ { "name": "control", "rules_id": [ "def-000-be9" ] } ] } ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateCustomFramework-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateCustomFramework-400-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateCustomFramework-429-v2) * [500](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateCustomFramework-500-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object to update a custom framework. Field Type Description data [_required_] object Contains type and attributes for custom frameworks. attributes [_required_] object Framework Handle and Version. handle string Framework Handle version string Framework Version id [_required_] string The ID of the custom framework. type [_required_] enum The type of the resource. The value must be `custom_framework`. Allowed enum values: `custom_framework` default: `custom_framework` ``` { "data": { "attributes": { "handle": "sec2", "version": "2" }, "id": "handle-version", "type": "custom_framework" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Update a custom framework returns "OK" response Copy ``` # Path parameters export handle="CHANGE_ME" export version="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v2/cloud_security_management/custom_frameworks/${handle}/${version}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "custom_framework", "attributes": { "name": "name", "handle": "create-framework-new", "version": "10", "icon_url": "test-url", "requirements": [ { "name": "requirement", "controls": [ { "name": "control", "rules_id": [ "def-000-be9" ] } ] } ] } } } EOF ``` ##### Update a custom framework returns "OK" response ``` // Update a custom framework returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpdateCustomFrameworkRequest{ Data: datadogV2.CustomFrameworkData{ Type: datadogV2.CUSTOMFRAMEWORKTYPE_CUSTOM_FRAMEWORK, Attributes: datadogV2.CustomFrameworkDataAttributes{ Name: "name", Handle: "create-framework-new", Version: "10", IconUrl: datadog.PtrString("test-url"), Requirements: []datadogV2.CustomFrameworkRequirement{ { Name: "requirement", Controls: []datadogV2.CustomFrameworkControl{ { Name: "control", RulesId: []string{ "def-000-be9", }, }, }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateCustomFramework(ctx, "create-framework-new", "10", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateCustomFramework`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateCustomFramework`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a custom framework returns "OK" response ``` // Update a custom framework returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import
com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CustomFrameworkControl; import com.datadog.api.client.v2.model.CustomFrameworkData; import com.datadog.api.client.v2.model.CustomFrameworkDataAttributes; import com.datadog.api.client.v2.model.CustomFrameworkRequirement; import com.datadog.api.client.v2.model.CustomFrameworkType; import com.datadog.api.client.v2.model.UpdateCustomFrameworkRequest; import com.datadog.api.client.v2.model.UpdateCustomFrameworkResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); UpdateCustomFrameworkRequest body = new UpdateCustomFrameworkRequest() .data( new CustomFrameworkData() .type(CustomFrameworkType.CUSTOM_FRAMEWORK) .attributes( new CustomFrameworkDataAttributes() .name("name") .handle("create-framework-new") .version("10") .iconUrl("test-url") .requirements( Collections.singletonList( new CustomFrameworkRequirement() .name("requirement") .controls( Collections.singletonList( new CustomFrameworkControl() .name("control") .rulesId( Collections.singletonList( "def-000-be9")))))))); try { UpdateCustomFrameworkResponse result = apiInstance.updateCustomFramework("create-framework-new", "10", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#updateCustomFramework"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a custom framework returns "OK" response ``` """ Update a custom framework returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.custom_framework_control import CustomFrameworkControl from datadog_api_client.v2.model.custom_framework_data import CustomFrameworkData from datadog_api_client.v2.model.custom_framework_data_attributes import CustomFrameworkDataAttributes from datadog_api_client.v2.model.custom_framework_requirement import CustomFrameworkRequirement from datadog_api_client.v2.model.custom_framework_type import CustomFrameworkType from datadog_api_client.v2.model.update_custom_framework_request import UpdateCustomFrameworkRequest body = UpdateCustomFrameworkRequest( data=CustomFrameworkData( type=CustomFrameworkType.CUSTOM_FRAMEWORK, attributes=CustomFrameworkDataAttributes( name="name", handle="create-framework-new", version="10", icon_url="test-url", requirements=[ CustomFrameworkRequirement( name="requirement", controls=[ CustomFrameworkControl( name="control", rules_id=[ "def-000-be9", ], ), ], ), ], ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_custom_framework(handle="create-framework-new", version="10", body=body) print(response) ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a custom framework returns "OK" response ``` # Update a custom framework returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::UpdateCustomFrameworkRequest.new({ data: DatadogAPIClient::V2::CustomFrameworkData.new({ type: DatadogAPIClient::V2::CustomFrameworkType::CUSTOM_FRAMEWORK, attributes: DatadogAPIClient::V2::CustomFrameworkDataAttributes.new({ name: "name", handle: "create-framework-new", version: "10", icon_url: "test-url", requirements: [ DatadogAPIClient::V2::CustomFrameworkRequirement.new({ name: "requirement", controls: [ DatadogAPIClient::V2::CustomFrameworkControl.new({ name: "control", rules_id: [ "def-000-be9", ], }), ], }), ], }), }), }) p api_instance.update_custom_framework("create-framework-new", "10", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a custom framework returns "OK" response ``` // Update a custom framework returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CustomFrameworkControl; use datadog_api_client::datadogV2::model::CustomFrameworkData; use datadog_api_client::datadogV2::model::CustomFrameworkDataAttributes; use datadog_api_client::datadogV2::model::CustomFrameworkRequirement; use datadog_api_client::datadogV2::model::CustomFrameworkType; use datadog_api_client::datadogV2::model::UpdateCustomFrameworkRequest; #[tokio::main] async fn main() { let body = UpdateCustomFrameworkRequest::new(CustomFrameworkData::new( CustomFrameworkDataAttributes::new( "create-framework-new".to_string(), "name".to_string(), vec![CustomFrameworkRequirement::new( vec![CustomFrameworkControl::new( "control".to_string(), vec!["def-000-be9".to_string()], )], "requirement".to_string(), )], "10".to_string(), ) .icon_url("test-url".to_string()), CustomFrameworkType::CUSTOM_FRAMEWORK, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .update_custom_framework("create-framework-new".to_string(), "10".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a custom framework returns "OK" response ``` /** * Update a custom framework returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiUpdateCustomFrameworkRequest = { body: { data: { type: "custom_framework", attributes: { name: "name", handle: "create-framework-new", version: "10", iconUrl: "test-url", requirements: [ { name: "requirement", controls: [ { name: "control", rulesId: ["def-000-be9"], }, ], }, ], }, }, }, handle: "create-framework-new", version: "10", }; apiInstance .updateCustomFramework(params) .then((data: v2.UpdateCustomFrameworkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a rule's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-details-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id} ### Overview Get a rule’s details. This endpoint requires the `security_monitoring_rules_read` permission. OAuth apps require the `security_monitoring_rules_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringRule-200-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Create a new rule. Field Type Description Option 1 object Rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. 
Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). defaultTags [string] Default Tags for default rules (included in tags) deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. 
policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. 
This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. 
When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection,infrastructure_configuration,workload_security,cloud_configuration,application_security,api_security` updateAuthorId int64 User ID of the user who updated the rule. updatedAt int64 The date the rule was last updated, in milliseconds. version int64 The version of the rule. Option 2 object Rule. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. 
isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. 
forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to correlate by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. defaultRuleId string Default Rule ID to match on signals. distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` updateAuthorId int64 User ID of the user who updated the rule. version int64 The version of the rule. 
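The 200 response takes one of the two shapes described above: a standard detection rule (Option 1) or a signal correlation rule (Option 2). As a rough, illustrative sketch (not part of the official client libraries; field names are taken from the model above), you can branch on the `type` field of the decoded body to tell them apart:

```
# Illustrative helper, not part of the Datadog client libraries.
# `rule` is assumed to be the decoded JSON body of the 200 response.
def describe_rule(rule):
    if rule.get("type") == "signal_correlation":
        # Option 2: queries reference other rules through ruleId / correlatedByFields
        rule_ids = [q.get("ruleId") for q in rule.get("queries", [])]
        return "signal correlation rule {!r} over rules {}".format(rule.get("name"), rule_ids)
    # Option 1: standard detection rule (log_detection, cloud_configuration, and so on)
    method = rule.get("options", {}).get("detectionMethod")
    return "{} rule {!r} using detection method {}".format(rule.get("type"), rule.get("name"), method)
```

The documented example payload below shows the standard (Option 1) shape.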
``` { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "customStatus": "critical", "name": "string", "notifications": [], "status": "critical" } ], "complianceSignalOptions": { "defaultActivationStatus": false, "defaultGroupByFields": [], "userActivationStatus": false, "userGroupByFields": [] }, "createdAt": "integer", "creationAuthorId": "integer", "customMessage": "string", "customName": "string", "defaultTags": [ "security:attacks" ], "deprecationDate": "integer", "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": false, "id": "string", "isDefault": false, "isDeleted": false, "isEnabled": false, "message": "string", "name": "string", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [], "thirdPartyCases": [ { "customStatus": "critical", "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string", "updateAuthorId": "integer", "updatedAt": "integer", "version": "integer" } ``` Copy 
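In addition to the official client examples in the Code Example section below, the same request can be sketched with plain HTTP. The snippet is illustrative only: it assumes the third-party `requests` package, reuses the documented `DD-API-KEY`/`DD-APPLICATION-KEY` headers and the `SECURITY_RULE_ID` environment variable from the client examples, and checks for the Not Found and Too many requests responses described next.

```
# Illustrative plain-HTTP sketch; assumes the `requests` package is installed.
import os
import requests

site = os.environ.get("DD_SITE", "datadoghq.com")  # for example datadoghq.eu or us3.datadoghq.com
rule_id = os.environ["SECURITY_RULE_ID"]           # a valid rule ID, as in the client examples

resp = requests.get(
    f"https://api.{site}/api/v2/security_monitoring/rules/{rule_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)

if resp.status_code == 200:
    rule = resp.json()
    print(rule.get("name"), rule.get("type"), rule.get("isEnabled"))
elif resp.status_code in (404, 429):
    # Error bodies carry an "errors" array, as shown in the models below.
    print(resp.status_code, resp.json().get("errors"))
else:
    resp.raise_for_status()
```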
Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a rule's details Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a rule's details ``` """ Get a rule's details returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "security_rule" in the system SECURITY_RULE_ID = environ["SECURITY_RULE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_monitoring_rule( rule_id=SECURITY_RULE_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a rule's details ``` # Get a rule's details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_rule" in the system SECURITY_RULE_ID = ENV["SECURITY_RULE_ID"] p api_instance.get_security_monitoring_rule(SECURITY_RULE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a rule's details ``` // Get a rule's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_rule" in the system SecurityRuleID := os.Getenv("SECURITY_RULE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityMonitoringRule(ctx, SecurityRuleID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a rule's details ``` // Get a rule's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_rule" in the system String SECURITY_RULE_ID = System.getenv("SECURITY_RULE_ID"); try { SecurityMonitoringRuleResponse result = apiInstance.getSecurityMonitoringRule(SECURITY_RULE_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a rule's details ``` // Get a rule's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "security_rule" in the system let security_rule_id = std::env::var("SECURITY_RULE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_monitoring_rule(security_rule_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a rule's details ``` /** * Get a rule's details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_rule" in the system const SECURITY_RULE_ID = process.env.SECURITY_RULE_ID as string; const params: v2.SecurityMonitoringApiGetSecurityMonitoringRuleRequest = { ruleId: SECURITY_RULE_ID, }; apiInstance .getSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Modify the triage assignee of a security signal](https://docs.datadoghq.com/api/latest/security-monitoring/#modify-the-triage-assignee-of-a-security-signal) * [v1](https://docs.datadoghq.com/api/latest/security-monitoring/#modify-the-triage-assignee-of-a-security-signal-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#modify-the-triage-assignee-of-a-security-signal-v2) PATCH https://api.ap1.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.ap2.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.datadoghq.eu/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.ddog-gov.com/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.us3.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/assigneehttps://api.us5.datadoghq.com/api/v1/security_analytics/signals/{signal_id}/assignee ### Overview This endpoint is deprecated - Modify the triage assignee of a security signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Expand All Field Type Description assignee [_required_] string The UUID of the user being assigned. Use empty string to return signal to unassigned. version int64 Version of the updated signal. If server side version is higher, update will be rejected. 
``` { "assignee": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-200-v1) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-400-v1) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-403-v1) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-404-v1) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Updated signal data following a successfully performed update. Expand All Field Type Description status string Status of the response. ``` { "status": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Modify the triage assignee of a security signal returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/security_analytics/signals/${signal_id}/assignee" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "assignee": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } EOF ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SignalAssigneeUpdateRequest{ Assignee: "773b045d-ccf8-4808-bd3b-955ef6a8c940", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSecurityMonitoringApi(apiClient) resp, r, err := api.EditSecurityMonitoringSignalAssignee(ctx, "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.EditSecurityMonitoringSignalAssignee`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.EditSecurityMonitoringSignalAssignee`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SecurityMonitoringApi; import com.datadog.api.client.v1.model.SignalAssigneeUpdateRequest; import com.datadog.api.client.v1.model.SuccessfulSignalUpdateResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SignalAssigneeUpdateRequest body = new 
SignalAssigneeUpdateRequest().assignee("773b045d-ccf8-4808-bd3b-955ef6a8c940"); try { SuccessfulSignalUpdateResponse result = apiInstance.editSecurityMonitoringSignalAssignee( "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#editSecurityMonitoringSignalAssignee"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` """ Modify the triage assignee of a security signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v1.model.signal_assignee_update_request import SignalAssigneeUpdateRequest body = SignalAssigneeUpdateRequest( assignee="773b045d-ccf8-4808-bd3b-955ef6a8c940", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.edit_security_monitoring_signal_assignee( signal_id="AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` # Modify the triage assignee of a security signal returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SecurityMonitoringAPI.new body = DatadogAPIClient::V1::SignalAssigneeUpdateRequest.new({ assignee: "773b045d-ccf8-4808-bd3b-955ef6a8c940", }) p api_instance.edit_security_monitoring_signal_assignee("AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV1::model::SignalAssigneeUpdateRequest; #[tokio::main] async fn main() { let body = SignalAssigneeUpdateRequest::new("773b045d-ccf8-4808-bd3b-955ef6a8c940".to_string()); let configuration = datadog::Configuration::new(); let api = 
SecurityMonitoringAPI::with_config(configuration); let resp = api .edit_security_monitoring_signal_assignee( "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` /** * Modify the triage assignee of a security signal returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SecurityMonitoringApi(configuration); const params: v1.SecurityMonitoringApiEditSecurityMonitoringSignalAssigneeRequest = { body: { assignee: "773b045d-ccf8-4808-bd3b-955ef6a8c940", }, signalId: "AQAAAYDiB_Ol8PbzFAAAAABBWURpQl9PbEFBQU0yeXhGTG9ZV2JnQUE", }; apiInstance .editSecurityMonitoringSignalAssignee(params) .then((data: v1.SuccessfulSignalUpdateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.ap2.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.datadoghq.eu/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.ddog-gov.com/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.us3.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/assigneehttps://api.us5.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/assignee ### Overview Modify the triage assignee of a security signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Data containing the patch for changing the assignee of a signal. attributes [_required_] object Attributes describing the new assignee of a security signal. assignee [_required_] object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. version int64 Version of the updated signal. If server side version is higher, update will be rejected. 
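The example payload and code samples below pass an empty `uuid`, which presumably returns the signal to the unassigned state, mirroring the documented v1 behavior. To assign a specific user, supply that user's UUID instead. A minimal Python sketch with a placeholder signal ID and UUID, reusing the v2 models from the code examples further down:

```
"""
Sketch: assign a security signal to a specific user (v2).
The signal ID and UUID are placeholders; the models mirror the v2 Python
code example in this section.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_attributes import (
    SecurityMonitoringSignalAssigneeUpdateAttributes,
)
from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_data import (
    SecurityMonitoringSignalAssigneeUpdateData,
)
from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_request import (
    SecurityMonitoringSignalAssigneeUpdateRequest,
)
from datadog_api_client.v2.model.security_monitoring_triage_user import SecurityMonitoringTriageUser

body = SecurityMonitoringSignalAssigneeUpdateRequest(
    data=SecurityMonitoringSignalAssigneeUpdateData(
        attributes=SecurityMonitoringSignalAssigneeUpdateAttributes(
            # Placeholder UUID of the user the signal should be assigned to.
            assignee=SecurityMonitoringTriageUser(uuid="773b045d-ccf8-4808-bd3b-955ef6a8c940"),
        ),
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.edit_security_monitoring_signal_assignee(
        signal_id="CHANGE_ME",  # placeholder signal ID
        body=body,
    )
    print(response)
```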
``` { "data": { "attributes": { "assignee": { "uuid": "" } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalAssignee-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response returned after all triage operations, containing the updated signal triage data. Field Type Description data [_required_] object Data containing the updated triage attributes of the signal. attributes object Attributes describing a triage state update operation over a security signal. archive_comment string Optional comment to display on archived signals. archive_comment_timestamp int64 Timestamp of the last edit to the comment. archive_comment_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. archive_reason enum Reason a signal is archived. Allowed enum values: `none,false_positive,testing_or_maintenance,investigated_case_opened,true_positive_benign,true_positive_malicious,other` assignee [_required_] object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. incident_ids [_required_] [integer] Array of incidents that are associated with this signal. state [_required_] enum The new triage state of the signal. Allowed enum values: `open,archived,under_review` state_update_timestamp int64 Timestamp of the last update to the signal state. state_update_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. id string The unique ID of the security signal. type enum The type of event. 
Allowed enum values: `signal_metadata` default: `signal_metadata` ``` { "data": { "attributes": { "archive_comment": "string", "archive_comment_timestamp": "integer", "archive_comment_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "archive_reason": "string", "assignee": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "incident_ids": [ 2066 ], "state": "open", "state_update_timestamp": "integer", "state_update_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } }, "id": "string", "type": "signal_metadata" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Modify the triage assignee of a security signal returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/signals/${signal_id}/assignee" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "assignee": { "uuid": "" } } } } EOF ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSignalAssigneeUpdateRequest{ Data: datadogV2.SecurityMonitoringSignalAssigneeUpdateData{ Attributes: datadogV2.SecurityMonitoringSignalAssigneeUpdateAttributes{ Assignee: datadogV2.SecurityMonitoringTriageUser{ Uuid: "", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.EditSecurityMonitoringSignalAssignee(ctx, "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.EditSecurityMonitoringSignalAssignee`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.EditSecurityMonitoringSignalAssignee`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalAssigneeUpdateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSignalAssigneeUpdateData; import com.datadog.api.client.v2.model.SecurityMonitoringSignalAssigneeUpdateRequest; import 
com.datadog.api.client.v2.model.SecurityMonitoringSignalTriageUpdateResponse; import com.datadog.api.client.v2.model.SecurityMonitoringTriageUser; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSignalAssigneeUpdateRequest body = new SecurityMonitoringSignalAssigneeUpdateRequest() .data( new SecurityMonitoringSignalAssigneeUpdateData() .attributes( new SecurityMonitoringSignalAssigneeUpdateAttributes() .assignee(new SecurityMonitoringTriageUser().uuid("")))); try { SecurityMonitoringSignalTriageUpdateResponse result = apiInstance.editSecurityMonitoringSignalAssignee( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#editSecurityMonitoringSignalAssignee"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` """ Modify the triage assignee of a security signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_attributes import ( SecurityMonitoringSignalAssigneeUpdateAttributes, ) from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_data import ( SecurityMonitoringSignalAssigneeUpdateData, ) from datadog_api_client.v2.model.security_monitoring_signal_assignee_update_request import ( SecurityMonitoringSignalAssigneeUpdateRequest, ) from datadog_api_client.v2.model.security_monitoring_triage_user import SecurityMonitoringTriageUser body = SecurityMonitoringSignalAssigneeUpdateRequest( data=SecurityMonitoringSignalAssigneeUpdateData( attributes=SecurityMonitoringSignalAssigneeUpdateAttributes( assignee=SecurityMonitoringTriageUser( uuid="", ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.edit_security_monitoring_signal_assignee( signal_id="AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` # Modify the triage assignee of a security signal returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = 
DatadogAPIClient::V2::SecurityMonitoringSignalAssigneeUpdateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSignalAssigneeUpdateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSignalAssigneeUpdateAttributes.new({ assignee: DatadogAPIClient::V2::SecurityMonitoringTriageUser.new({ uuid: "", }), }), }), }) p api_instance.edit_security_monitoring_signal_assignee("AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` // Modify the triage assignee of a security signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalAssigneeUpdateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalAssigneeUpdateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalAssigneeUpdateRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringTriageUser; #[tokio::main] async fn main() { let body = SecurityMonitoringSignalAssigneeUpdateRequest::new( SecurityMonitoringSignalAssigneeUpdateData::new( SecurityMonitoringSignalAssigneeUpdateAttributes::new( SecurityMonitoringTriageUser::new("".to_string()), ), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .edit_security_monitoring_signal_assignee( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Modify the triage assignee of a security signal returns "OK" response ``` /** * Modify the triage assignee of a security signal returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiEditSecurityMonitoringSignalAssigneeRequest = { body: { data: { attributes: { assignee: { uuid: "", }, }, }, }, signalId: "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", }; apiInstance .editSecurityMonitoringSignalAssignee(params) .then((data: v2.SecurityMonitoringSignalTriageUpdateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#update-an-existing-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#update-an-existing-rule-v2) PUT https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id} ### Overview Update an existing rule. When updating `cases`, `queries` or `options`, the whole field must be included. For example, when modifying a query all queries must be included. Default rules can only be updated to be enabled, to change notifications, or to update the tags (default tags cannot be removed). This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. 
defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. customMessage string Custom/Overridden Message for generated signals (used in case of Default rule update). customName string Custom/Overridden name (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string Name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. 
Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [ ] Queries for selecting logs which are part of the rule. Option 1 object Query for matching rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. Option 2 object Query for matching rule on signals. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to group by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId [_required_] string Rule ID to match on signals. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. 
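Note that because `cases`, `queries`, and `options` are replaced wholesale on update (see the Overview above), the usual pattern is read-modify-write: fetch the current rule, change what you need, and resend each modified field in full. A hedged Python sketch of that pattern follows; the rule ID is a placeholder, and it assumes the v2 Python client's `update_security_monitoring_rule` method and `SecurityMonitoringRuleUpdatePayload` model.

```
"""
Sketch: read-modify-write update of a detection rule (v2).
Assumes the Python client's update_security_monitoring_rule method and the
SecurityMonitoringRuleUpdatePayload model; the rule ID is a placeholder.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v2.model.security_monitoring_rule_update_payload import (
    SecurityMonitoringRuleUpdatePayload,
)

RULE_ID = "CHANGE_ME"  # placeholder rule ID

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)

    # Fetch the current definition first: cases, queries, and options are
    # replaced wholesale, so any change to them must resend the entire field.
    current = api_instance.get_security_monitoring_rule(rule_id=RULE_ID)
    print(current)  # inspect the existing queries, cases, and options

    body = SecurityMonitoringRuleUpdatePayload(
        is_enabled=True,  # the only change in this sketch
        # queries=[...],  # if modifying one query, include *all* queries here
        # cases=[...],    # likewise for cases and options
    )
    response = api_instance.update_security_monitoring_rule(rule_id=RULE_ID, body=body)
    print(response)
```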
thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` version int32 The version of the rule being updated. ##### Update a cloud configuration rule's details returns "OK" response ``` { "name": "Example-Security-Monitoring_cloud_updated", "isEnabled": false, "cases": [ { "status": "info", "notifications": [] } ], "options": { "complianceRuleOptions": { "resourceType": "gcp_compute_disk", "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = \"skip\" if {\n\tiam_service_account_key.disabled\n} else = \"pass\" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = \"fail\"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n", "resourceTypes": [ "gcp_compute_disk" ] } } }, "message": "ddd", "tags": [], "complianceSignalOptions": { "userActivationStatus": false, "userGroupByFields": [] } } ``` Copy ##### Update an existing rule returns "OK" response ``` { "name": "Example-Security-Monitoring-Updated", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metrics": [] } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Create a new rule. Field Type Description Option 1 object Rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. 
The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). defaultTags [string] Default Tags for default rules (included in tags) deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. 
Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. 
Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. 
checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection,infrastructure_configuration,workload_security,cloud_configuration,application_security,api_security` updateAuthorId int64 User ID of the user who updated the rule. updatedAt int64 The date the rule was last updated, in milliseconds. version int64 The version of the rule. Option 2 object Rule. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. 
Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to correlate by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. defaultRuleId string Default Rule ID to match on signals. distinctFields [string] Field for which the cardinality is measured. 
Sent as an array. groupByFields [string] Fields to group by. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` updateAuthorId int64 User ID of the user who updated the rule. version int64 The version of the rule. ``` { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "customStatus": "critical", "name": "string", "notifications": [], "status": "critical" } ], "complianceSignalOptions": { "defaultActivationStatus": false, "defaultGroupByFields": [], "userActivationStatus": false, "userGroupByFields": [] }, "createdAt": "integer", "creationAuthorId": "integer", "customMessage": "string", "customName": "string", "defaultTags": [ "security:attacks" ], "deprecationDate": "integer", "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": false, "id": "string", "isDefault": false, "isDeleted": false, "isEnabled": false, "message": "string", "name": "string", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": 
"string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [], "thirdPartyCases": [ { "customStatus": "critical", "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string", "updateAuthorId": "integer", "updatedAt": "integer", "version": "integer" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Update a cloud configuration rule's details returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Security-Monitoring_cloud_updated", "isEnabled": false, "cases": [ { "status": "info", "notifications": [] } ], "options": { "complianceRuleOptions": { "resourceType": "gcp_compute_disk", "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = \"skip\" if {\n\tiam_service_account_key.disabled\n} else = \"pass\" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = \"fail\"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n", "resourceTypes": [ "gcp_compute_disk" ] } } }, "message": "ddd", "tags": [], "complianceSignalOptions": { "userActivationStatus": false, "userGroupByFields": [] } } EOF ``` ##### Update an existing rule returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Security-Monitoring-Updated", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metrics": [] } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true } EOF ``` ##### Update a cloud configuration rule's details returns "OK" response ``` // Update a cloud configuration rule's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "cloud_configuration_rule" in the system CloudConfigurationRuleID := os.Getenv("CLOUD_CONFIGURATION_RULE_ID") body := datadogV2.SecurityMonitoringRuleUpdatePayload{ Name: datadog.PtrString("Example-Security-Monitoring_cloud_updated"), IsEnabled: datadog.PtrBool(false), Cases: []datadogV2.SecurityMonitoringRuleCase{ { Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO.Ptr(), Notifications: []string{}, }, }, Options: &datadogV2.SecurityMonitoringRuleOptions{ ComplianceRuleOptions: &datadogV2.CloudConfigurationComplianceRuleOptions{ ResourceType: datadog.PtrString("gcp_compute_disk"), RegoRule: &datadogV2.CloudConfigurationRegoRule{ Policy: `package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } `, ResourceTypes: []string{ "gcp_compute_disk", }, }, }, }, Message: datadog.PtrString("ddd"), Tags: []string{}, ComplianceSignalOptions: &datadogV2.CloudConfigurationRuleComplianceSignalOptions{ UserActivationStatus: *datadog.NewNullableBool(datadog.PtrBool(false)), UserGroupByFields: *datadog.NewNullableList(&[]string{}), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateSecurityMonitoringRule(ctx, CloudConfigurationRuleID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy ##### Update an existing rule returns "OK" response ``` // Update an existing rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_rule" in the system SecurityRuleID := os.Getenv("SECURITY_RULE_ID") body := datadogV2.SecurityMonitoringRuleUpdatePayload{ Name: datadog.PtrString("Example-Security-Monitoring-Updated"), Queries: []datadogV2.SecurityMonitoringRuleQuery{ datadogV2.SecurityMonitoringRuleQuery{ SecurityMonitoringStandardRuleQuery: &datadogV2.SecurityMonitoringStandardRuleQuery{ Query: datadog.PtrString("@test:true"), Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), GroupByFields: []string{}, DistinctFields: []string{}, Metrics: []string{}, }}, }, Filters: []datadogV2.SecurityMonitoringFilter{}, Cases: []datadogV2.SecurityMonitoringRuleCase{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO.Ptr(), Condition: datadog.PtrString("a > 0"), Notifications: []string{}, }, }, Options: 
&datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ONE_HOUR.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ONE_DAY.Ptr(), }, Message: datadog.PtrString("Test rule"), Tags: []string{}, IsEnabled: datadog.PtrBool(true), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateSecurityMonitoringRule(ctx, SecurityRuleID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a cloud configuration rule's details returns "OK" response ``` // Update a cloud configuration rule's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CloudConfigurationComplianceRuleOptions; import com.datadog.api.client.v2.model.CloudConfigurationRegoRule; import com.datadog.api.client.v2.model.CloudConfigurationRuleComplianceSignalOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCase; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleUpdatePayload; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "cloud_configuration_rule" in the system String CLOUD_CONFIGURATION_RULE_ID = System.getenv("CLOUD_CONFIGURATION_RULE_ID"); SecurityMonitoringRuleUpdatePayload body = new SecurityMonitoringRuleUpdatePayload() .name("Example-Security-Monitoring_cloud_updated") .isEnabled(false) .cases( Collections.singletonList( new SecurityMonitoringRuleCase().status(SecurityMonitoringRuleSeverity.INFO))) .options( new SecurityMonitoringRuleOptions() .complianceRuleOptions( new CloudConfigurationComplianceRuleOptions() .resourceType("gcp_compute_disk") .regoRule( new CloudConfigurationRegoRule() .policy( """ package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" 
# This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } """) .resourceTypes(Collections.singletonList("gcp_compute_disk"))))) .message("ddd") .complianceSignalOptions( new CloudConfigurationRuleComplianceSignalOptions().userActivationStatus(false)); try { SecurityMonitoringRuleResponse result = apiInstance.updateSecurityMonitoringRule(CLOUD_CONFIGURATION_RULE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#updateSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Update an existing rule returns "OK" response ``` // Update an existing rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCase; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQuery; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleUpdatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_rule" in the system String SECURITY_RULE_ID = System.getenv("SECURITY_RULE_ID"); SecurityMonitoringRuleUpdatePayload body = new SecurityMonitoringRuleUpdatePayload() .name("Example-Security-Monitoring-Updated") .queries( Collections.singletonList( new SecurityMonitoringRuleQuery( new SecurityMonitoringStandardRuleQuery() .query("@test:true") .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT)))) .cases( Collections.singletonList( new SecurityMonitoringRuleCase() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.ONE_HOUR) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.ONE_DAY)) .message("Test rule") .isEnabled(true); try { SecurityMonitoringRuleResponse result = apiInstance.updateSecurityMonitoringRule(SECURITY_RULE_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#updateSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a cloud configuration rule's details returns "OK" response ``` """ Update a cloud configuration rule's details returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.cloud_configuration_compliance_rule_options import ( CloudConfigurationComplianceRuleOptions, ) from datadog_api_client.v2.model.cloud_configuration_rego_rule import CloudConfigurationRegoRule from datadog_api_client.v2.model.cloud_configuration_rule_compliance_signal_options import ( CloudConfigurationRuleComplianceSignalOptions, ) from datadog_api_client.v2.model.security_monitoring_rule_case import SecurityMonitoringRuleCase from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_update_payload import SecurityMonitoringRuleUpdatePayload # there is a valid "cloud_configuration_rule" in the system CLOUD_CONFIGURATION_RULE_ID = environ["CLOUD_CONFIGURATION_RULE_ID"] body = SecurityMonitoringRuleUpdatePayload( name="Example-Security-Monitoring_cloud_updated", is_enabled=False, cases=[ SecurityMonitoringRuleCase( status=SecurityMonitoringRuleSeverity.INFO, notifications=[], ), ], options=SecurityMonitoringRuleOptions( compliance_rule_options=CloudConfigurationComplianceRuleOptions( resource_type="gcp_compute_disk", rego_rule=CloudConfigurationRegoRule( policy='package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = "skip" if {\n\tiam_service_account_key.disabled\n} else = "pass" if {\n\t(iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = "fail"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n', resource_types=[ "gcp_compute_disk", ], ), ), ), message="ddd", tags=[], compliance_signal_options=CloudConfigurationRuleComplianceSignalOptions( user_activation_status=False, user_group_by_fields=[], ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_security_monitoring_rule(rule_id=CLOUD_CONFIGURATION_RULE_ID, body=body) print(response) ``` Copy ##### Update an existing rule returns "OK" response ``` """ Update an existing rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case import SecurityMonitoringRuleCase from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( 
SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_update_payload import SecurityMonitoringRuleUpdatePayload from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery # there is a valid "security_rule" in the system SECURITY_RULE_ID = environ["SECURITY_RULE_ID"] body = SecurityMonitoringRuleUpdatePayload( name="Example-Security-Monitoring-Updated", queries=[ SecurityMonitoringStandardRuleQuery( query="@test:true", aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, group_by_fields=[], distinct_fields=[], metrics=[], ), ], filters=[], cases=[ SecurityMonitoringRuleCase( name="", status=SecurityMonitoringRuleSeverity.INFO, condition="a > 0", notifications=[], ), ], options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.ONE_HOUR, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ONE_DAY, ), message="Test rule", tags=[], is_enabled=True, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_security_monitoring_rule(rule_id=SECURITY_RULE_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a cloud configuration rule's details returns "OK" response ``` # Update a cloud configuration rule's details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "cloud_configuration_rule" in the system CLOUD_CONFIGURATION_RULE_ID = ENV["CLOUD_CONFIGURATION_RULE_ID"] body = DatadogAPIClient::V2::SecurityMonitoringRuleUpdatePayload.new({ name: "Example-Security-Monitoring_cloud_updated", is_enabled: false, cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCase.new({ status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], }), ], options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ compliance_rule_options: DatadogAPIClient::V2::CloudConfigurationComplianceRuleOptions.new({ resource_type: "gcp_compute_disk", rego_rule: DatadogAPIClient::V2::CloudConfigurationRegoRule.new({ policy: 'package datadog\n\nimport data.datadog.output as dd_output\n\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\nmilliseconds_in_a_day := ((1000 * 60) * 60) * 24\n\neval(iam_service_account_key) = "skip" if {\n\tiam_service_account_key.disabled\n} else = "pass" if {\n\t(iam_service_account_key.resource_seen_at / 
milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90\n} else = "fail"\n\n# This part remains unchanged for all rules\nresults contains result if {\n\tsome resource in input.resources[input.main_resource_type]\n\tresult := dd_output.format(resource, eval(resource))\n}\n', resource_types: [ "gcp_compute_disk", ], }), }), }), message: "ddd", tags: [], compliance_signal_options: DatadogAPIClient::V2::CloudConfigurationRuleComplianceSignalOptions.new({ user_activation_status: false, user_group_by_fields: [], }), }) p api_instance.update_security_monitoring_rule(CLOUD_CONFIGURATION_RULE_ID, body) ``` Copy ##### Update an existing rule returns "OK" response ``` # Update an existing rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_rule" in the system SECURITY_RULE_ID = ENV["SECURITY_RULE_ID"] body = DatadogAPIClient::V2::SecurityMonitoringRuleUpdatePayload.new({ name: "Example-Security-Monitoring-Updated", queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "@test:true", aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, group_by_fields: [], distinct_fields: [], metrics: [], }), ], filters: [], cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCase.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, condition: "a > 0", notifications: [], }), ], options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ONE_HOUR, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ONE_DAY, }), message: "Test rule", tags: [], is_enabled: true, }) p api_instance.update_security_monitoring_rule(SECURITY_RULE_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a cloud configuration rule's details returns "OK" response ``` // Update a cloud configuration rule's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CloudConfigurationComplianceRuleOptions; use datadog_api_client::datadogV2::model::CloudConfigurationRegoRule; use datadog_api_client::datadogV2::model::CloudConfigurationRuleComplianceSignalOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCase; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleUpdatePayload; #[tokio::main] async fn main() { // there is a valid "cloud_configuration_rule" in the system let cloud_configuration_rule_id = std::env::var("CLOUD_CONFIGURATION_RULE_ID").unwrap(); let body = SecurityMonitoringRuleUpdatePayload::new() .cases( vec![ SecurityMonitoringRuleCase::new() .notifications(vec![]) .status(SecurityMonitoringRuleSeverity::INFO) ], ) .compliance_signal_options( 
CloudConfigurationRuleComplianceSignalOptions::new() .user_activation_status(Some(false)) .user_group_by_fields(Some(vec![])), ) .is_enabled(false) .message("ddd".to_string()) .name("Example-Security-Monitoring_cloud_updated".to_string()) .options( SecurityMonitoringRuleOptions ::new().compliance_rule_options( CloudConfigurationComplianceRuleOptions::new() .rego_rule( CloudConfigurationRegoRule::new( r#"package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } "#.to_string(), vec!["gcp_compute_disk".to_string()], ), ) .resource_type("gcp_compute_disk".to_string()), ), ) .tags(vec![]); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .update_security_monitoring_rule(cloud_configuration_rule_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Update an existing rule returns "OK" response ``` // Update an existing rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCase; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQuery; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleUpdatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { // there is a valid "security_rule" in the system let security_rule_id = std::env::var("SECURITY_RULE_ID").unwrap(); let body = SecurityMonitoringRuleUpdatePayload::new() .cases(vec![SecurityMonitoringRuleCase::new() .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![]) .status(SecurityMonitoringRuleSeverity::INFO)]) .filters(vec![]) .is_enabled(true) .message("Test rule".to_string()) .name("Example-Security-Monitoring-Updated".to_string()) .options( SecurityMonitoringRuleOptions::new() .evaluation_window(SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::ONE_HOUR) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ONE_DAY), ) .queries(vec![ SecurityMonitoringRuleQuery::SecurityMonitoringStandardRuleQuery(Box::new( SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .metrics(vec![]) .query("@test:true".to_string()), )), ]) .tags(vec![]); let 
configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .update_security_monitoring_rule(security_rule_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a cloud configuration rule's details returns "OK" response ``` /** * Update a cloud configuration rule's details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "cloud_configuration_rule" in the system const CLOUD_CONFIGURATION_RULE_ID = process.env .CLOUD_CONFIGURATION_RULE_ID as string; const params: v2.SecurityMonitoringApiUpdateSecurityMonitoringRuleRequest = { body: { name: "Example-Security-Monitoring_cloud_updated", isEnabled: false, cases: [ { status: "info", notifications: [], }, ], options: { complianceRuleOptions: { resourceType: "gcp_compute_disk", regoRule: { policy: `package datadog import data.datadog.output as dd_output import future.keywords.contains import future.keywords.if import future.keywords.in milliseconds_in_a_day := ((1000 * 60) * 60) * 24 eval(iam_service_account_key) = "skip" if { iam_service_account_key.disabled } else = "pass" if { (iam_service_account_key.resource_seen_at / milliseconds_in_a_day) - (iam_service_account_key.valid_after_time / milliseconds_in_a_day) <= 90 } else = "fail" # This part remains unchanged for all rules results contains result if { some resource in input.resources[input.main_resource_type] result := dd_output.format(resource, eval(resource)) } `, resourceTypes: ["gcp_compute_disk"], }, }, }, message: "ddd", tags: [], complianceSignalOptions: { userActivationStatus: false, userGroupByFields: [], }, }, ruleId: CLOUD_CONFIGURATION_RULE_ID, }; apiInstance .updateSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Update an existing rule returns "OK" response ``` /** * Update an existing rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_rule" in the system const SECURITY_RULE_ID = process.env.SECURITY_RULE_ID as string; const params: v2.SecurityMonitoringApiUpdateSecurityMonitoringRuleRequest = { body: { name: "Example-Security-Monitoring-Updated", queries: [ { query: "@test:true", aggregation: "count", groupByFields: [], distinctFields: [], metrics: [], }, ], filters: [], cases: [ { name: "", status: "info", condition: "a > 0", notifications: [], }, ], options: { evaluationWindow: 900, keepAlive: 3600, maxSignalDuration: 86400, }, message: "Test rule", tags: [], isEnabled: true, }, ruleId: SECURITY_RULE_ID, }; apiInstance .updateSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}https://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}https://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id} ### Overview Delete an existing rule. Default rules cannot be deleted. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityMonitoringRule-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Delete an existing rule Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing rule ``` """ Delete an existing rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "security_rule" in the system SECURITY_RULE_ID = environ["SECURITY_RULE_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.delete_security_monitoring_rule( rule_id=SECURITY_RULE_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing rule ``` # Delete an existing rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_rule" in the system SECURITY_RULE_ID = ENV["SECURITY_RULE_ID"] api_instance.delete_security_monitoring_rule(SECURITY_RULE_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an existing rule ``` // Delete an existing rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_rule" in the system SecurityRuleID := os.Getenv("SECURITY_RULE_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DeleteSecurityMonitoringRule(ctx, SecurityRuleID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing rule ``` // Delete an existing rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_rule" in the system String SECURITY_RULE_ID = System.getenv("SECURITY_RULE_ID"); try { apiInstance.deleteSecurityMonitoringRule(SECURITY_RULE_ID); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#deleteSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing rule ``` // Delete an existing rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "security_rule" in the system let security_rule_id = std::env::var("SECURITY_RULE_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_security_monitoring_rule(security_rule_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing rule ``` /** * Delete an existing rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_rule" in the system const SECURITY_RULE_ID = process.env.SECURITY_RULE_ID as string; const params: v2.SecurityMonitoringApiDeleteSecurityMonitoringRuleRequest = { ruleId: SECURITY_RULE_ID, }; apiInstance .deleteSecurityMonitoringRule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Test an existing rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-an-existing-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#test-an-existing-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/testhttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/test ### Overview Test an existing rule. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description rule Test a rule. Option 1 object The payload of a rule to test calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. 
groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. 
Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection` ruleQueryPayloads [object] Data payloads used to test rules query with the expected result. expectedResult boolean Expected result of the test. index int64 Index of the query under test. payload object Payload used to test the rule query. ddsource string Source of the payload. ddtags string Tags associated with your data. hostname string The name of the originating host of the log. message string The message of the payload. service string The name of the application or service generating the data. 
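The `ruleQueryPayloads` pair one-to-one with the `results` array in the 200 response: each boolean reports whether the evaluation of the referenced query matched `expectedResult`. The following is a minimal sketch of that round trip using the Python client classes shown in the examples below; the `make_payload` helper, the sample log messages, and the attribute access on the response object are illustrative assumptions rather than part of the reference.

```
"""
Minimal sketch: submit two test payloads against an existing rule and report,
per payload, whether the rule evaluation matched the expected result.
Assumes SECURITY_RULE_ID names an existing rule and that the response object
exposes the `results` list as an attribute.
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v2.model.security_monitoring_rule_query_payload import SecurityMonitoringRuleQueryPayload
from datadog_api_client.v2.model.security_monitoring_rule_query_payload_data import (
    SecurityMonitoringRuleQueryPayloadData,
)
from datadog_api_client.v2.model.security_monitoring_rule_test_request import SecurityMonitoringRuleTestRequest


def make_payload(index, expected, message):
    # Each payload is a synthetic log; `expected_result` is what you expect the
    # rule query at position `index` to return for it. Helper name and log
    # contents are illustrative only.
    return SecurityMonitoringRuleQueryPayload(
        expected_result=expected,
        index=index,
        payload=SecurityMonitoringRuleQueryPayloadData(
            ddsource="nginx",
            ddtags="env:staging,version:5.1",
            hostname="i-012345678",
            message=message,
            service="payment",
        ),
    )


body = SecurityMonitoringRuleTestRequest(
    rule_query_payloads=[
        make_payload(0, True, "log line that should match the rule query"),
        make_payload(0, False, "log line that should not match the rule query"),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.test_existing_security_monitoring_rule(
        rule_id=environ["SECURITY_RULE_ID"], body=body
    )
    # `results` is ordered like `rule_query_payloads`: True means the evaluation
    # matched `expected_result`, False means it did not.
    for payload, matched in zip(body.rule_query_payloads, response.results):
        print(f"payload index={payload.index} expected={payload.expected_result} matched={matched}")
```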
``` { "rule": { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "name": "string", "notifications": [], "status": "critical" } ], "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": true, "isEnabled": true, "message": "", "name": "My security monitoring rule.", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [ "env:prod", "team:security" ], "thirdPartyCases": [ { "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string" }, "ruleQueryPayloads": [ { "expectedResult": true, "index": 0, "payload": { "ddsource": "nginx", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment" } } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-200-v2) * 
[400](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#TestExistingSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Result of the test of the rule queries. Expand All Field Type Description results [boolean] Assert results are returned in the same order as the rule query payloads. For each payload, it returns True if the result matched the expected result, False otherwise. ``` { "results": [] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Test an existing rule

```
# Path parameters
export rule_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the API endpoint for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}/test" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "rule": {
    "calculatedFields": [
      {
        "expression": "@request_end_timestamp - @request_start_timestamp",
        "name": "response_time"
      }
    ],
    "cases": [
      {
        "status": "critical"
      }
    ],
    "options": {
      "complianceRuleOptions": {
        "regoRule": {
          "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}",
          "resourceTypes": [
            "gcp_iam_service_account",
            "gcp_iam_policy"
          ]
        }
      }
    },
    "thirdPartyCases": [
      {
        "status": "critical"
      }
    ]
  }
}
EOF
```

##### Test an existing rule

```
"""
Test an existing rule returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v2.model.security_monitoring_rule_query_payload import SecurityMonitoringRuleQueryPayload
from datadog_api_client.v2.model.security_monitoring_rule_query_payload_data import (
    SecurityMonitoringRuleQueryPayloadData,
)
from datadog_api_client.v2.model.security_monitoring_rule_test_request import SecurityMonitoringRuleTestRequest

body = SecurityMonitoringRuleTestRequest(
    rule_query_payloads=[
        SecurityMonitoringRuleQueryPayload(
            expected_result=True,
            index=0,
            payload=SecurityMonitoringRuleQueryPayloadData(
                ddsource="nginx",
                ddtags="env:staging,version:5.1",
                hostname="i-012345678",
                message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
                service="payment",
            ),
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.test_existing_security_monitoring_rule(rule_id="rule_id", body=body)
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following
commands:

```
# Set DD_SITE to your Datadog site
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Test an existing rule

```
# Test an existing rule returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new

body = DatadogAPIClient::V2::SecurityMonitoringRuleTestRequest.new({
  rule_query_payloads: [
    DatadogAPIClient::V2::SecurityMonitoringRuleQueryPayload.new({
      expected_result: true,
      index: 0,
      payload: DatadogAPIClient::V2::SecurityMonitoringRuleQueryPayloadData.new({
        ddsource: "nginx",
        ddtags: "env:staging,version:5.1",
        hostname: "i-012345678",
        message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
        service: "payment",
      }),
    }),
  ],
})
p api_instance.test_existing_security_monitoring_rule("rule_id", body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Test an existing rule

```
// Test an existing rule returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.SecurityMonitoringRuleTestRequest{
		RuleQueryPayloads: []datadogV2.SecurityMonitoringRuleQueryPayload{
			{
				ExpectedResult: datadog.PtrBool(true),
				Index:          datadog.PtrInt64(0),
				Payload: &datadogV2.SecurityMonitoringRuleQueryPayloadData{
					Ddsource: datadog.PtrString("nginx"),
					Ddtags:   datadog.PtrString("env:staging,version:5.1"),
					Hostname: datadog.PtrString("i-012345678"),
					Message:  datadog.PtrString("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World"),
					Service:  datadog.PtrString("payment"),
				},
			},
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewSecurityMonitoringApi(apiClient)
	resp, r, err := api.TestExistingSecurityMonitoringRule(ctx, "rule_id", body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.TestExistingSecurityMonitoringRule`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.TestExistingSecurityMonitoringRule`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Test an existing rule

```
// Test an existing rule returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.SecurityMonitoringApi;
import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryPayload;
import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryPayloadData;
import com.datadog.api.client.v2.model.SecurityMonitoringRuleTestRequest;
import
com.datadog.api.client.v2.model.SecurityMonitoringRuleTestResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleTestRequest body = new SecurityMonitoringRuleTestRequest() .ruleQueryPayloads( Collections.singletonList( new SecurityMonitoringRuleQueryPayload() .expectedResult(true) .index(0L) .payload( new SecurityMonitoringRuleQueryPayloadData() .ddsource("nginx") .ddtags("env:staging,version:5.1") .hostname("i-012345678") .message( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello" + " World") .service("payment")))); try { SecurityMonitoringRuleTestResponse result = apiInstance.testExistingSecurityMonitoringRule("rule_id", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#testExistingSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Test an existing rule ``` // Test an existing rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryPayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryPayloadData; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTestRequest; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleTestRequest::new().rule_query_payloads(vec![ SecurityMonitoringRuleQueryPayload::new() .expected_result(true) .index(0) .payload( SecurityMonitoringRuleQueryPayloadData::new() .ddsource("nginx".to_string()) .ddtags("env:staging,version:5.1".to_string()) .hostname("i-012345678".to_string()) .message( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World" .to_string(), ) .service("payment".to_string()), ), ]); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .test_existing_security_monitoring_rule("rule_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Test an existing rule ``` /** * Test an existing rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiTestExistingSecurityMonitoringRuleRequest = { body: { ruleQueryPayloads: [ { 
expectedResult: true, index: 0, payload: { ddsource: "nginx", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }, }, ], }, ruleId: "rule_id", }; apiInstance .testExistingSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleTestResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Test a rule](https://docs.datadoghq.com/api/latest/security-monitoring/#test-a-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#test-a-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/testhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/testhttps://api.datadoghq.eu/api/v2/security_monitoring/rules/testhttps://api.ddog-gov.com/api/v2/security_monitoring/rules/testhttps://api.datadoghq.com/api/v2/security_monitoring/rules/testhttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/testhttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/test ### Overview Test a rule. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description rule Test a rule. Option 1 object The payload of a rule to test calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. 
Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. 
Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection` ruleQueryPayloads [object] Data payloads used to test rules query with the expected result. expectedResult boolean Expected result of the test. index int64 Index of the query under test. payload object Payload used to test the rule query. ddsource string Source of the payload. ddtags string Tags associated with your data. hostname string The name of the originating host of the log. message string The message of the payload. service string The name of the application or service generating the data. 
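A case `condition` refers to queries by their `name`: for example, `a > 0 && b > 0` only triggers when the queries named `a` and `b` both produced at least one match in the evaluation window. As a rough sketch of that wiring with the Python client models used in the examples further down, the following builds a two-query rule and tests it against a single payload. The query names (`a`, `b`), query strings, rule name, and sample log content are illustrative assumptions, not values from the reference.

```
"""
Minimal sketch: test a rule definition (not yet saved) whose case condition
combines two named queries. Query names and sample data are illustrative.
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi
from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate
from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod
from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import (
    SecurityMonitoringRuleEvaluationWindow,
)
from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive
from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import (
    SecurityMonitoringRuleMaxSignalDuration,
)
from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions
from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import (
    SecurityMonitoringRuleQueryAggregation,
)
from datadog_api_client.v2.model.security_monitoring_rule_query_payload import SecurityMonitoringRuleQueryPayload
from datadog_api_client.v2.model.security_monitoring_rule_query_payload_data import (
    SecurityMonitoringRuleQueryPayloadData,
)
from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity
from datadog_api_client.v2.model.security_monitoring_rule_test_request import SecurityMonitoringRuleTestRequest
from datadog_api_client.v2.model.security_monitoring_rule_type_test import SecurityMonitoringRuleTypeTest
from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery
from datadog_api_client.v2.model.security_monitoring_standard_rule_test_payload import (
    SecurityMonitoringStandardRuleTestPayload,
)

body = SecurityMonitoringRuleTestRequest(
    rule=SecurityMonitoringStandardRuleTestPayload(
        cases=[
            # The condition references the query names defined in `queries` below.
            SecurityMonitoringRuleCaseCreate(
                name="both queries matched",
                status=SecurityMonitoringRuleSeverity.INFO,
                condition="a > 0 && b > 0",
            ),
        ],
        is_enabled=True,
        message="Both patterns were seen in the evaluation window.",
        name="Two-query test rule (sketch)",
        options=SecurityMonitoringRuleOptions(
            detection_method=SecurityMonitoringRuleDetectionMethod.THRESHOLD,
            evaluation_window=SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES,
            keep_alive=SecurityMonitoringRuleKeepAlive.ZERO_MINUTES,
            max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ZERO_MINUTES,
        ),
        queries=[
            SecurityMonitoringStandardRuleQuery(
                name="a",
                query="source:source_here pattern_one",
                aggregation=SecurityMonitoringRuleQueryAggregation.COUNT,
            ),
            SecurityMonitoringStandardRuleQuery(
                name="b",
                query="source:source_here pattern_two",
                aggregation=SecurityMonitoringRuleQueryAggregation.COUNT,
            ),
        ],
        type=SecurityMonitoringRuleTypeTest.LOG_DETECTION,
    ),
    rule_query_payloads=[
        # This payload is evaluated against query index 0 ("a"); True means a match is expected.
        SecurityMonitoringRuleQueryPayload(
            expected_result=True,
            index=0,
            payload=SecurityMonitoringRuleQueryPayloadData(
                ddsource="source_here",
                message="pattern_one observed in this synthetic log line",
                service="payment",
            ),
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    print(api_instance.test_security_monitoring_rule(body=body))
```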
``` { "rule": { "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "a > 0" } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule message.", "name": "My security monitoring rule.", "options": { "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "threshold", "evaluationWindow": 0, "keepAlive": 0, "maxSignalDuration": 0 }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "aggregation": "count", "name": "" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" }, "ruleQueryPayloads": [ { "expectedResult": true, "index": 0, "payload": { "ddsource": "source_here", "ddtags": "env:staging,version:5.1", "hostname": "i-012345678", "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", "service": "payment", "userIdentity": { "assumed_role": "fake assumed_role" } } } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#TestSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Result of the test of the rule queries. Expand All Field Type Description results [boolean] Assert results are returned in the same order as the rule query payloads. For each payload, it returns True if the result matched the expected result, False otherwise. ``` { "results": [] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Test a rule returns "OK" response

```
# Curl command (replace api.datadoghq.com with the API endpoint for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules/test" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "rule": {
    "cases": [
      {
        "name": "",
        "status": "info",
        "notifications": [],
        "condition": "a > 0"
      }
    ],
    "hasExtendedTitle": true,
    "isEnabled": true,
    "message": "My security monitoring rule message.",
    "name": "My security monitoring rule.",
    "options": {
      "decreaseCriticalityBasedOnEnv": false,
      "detectionMethod": "threshold",
      "evaluationWindow": 0,
      "keepAlive": 0,
      "maxSignalDuration": 0
    },
    "queries": [
      {
        "query": "source:source_here",
        "groupByFields": [
          "@userIdentity.assumed_role"
        ],
        "distinctFields": [],
        "aggregation": "count",
        "name": ""
      }
    ],
    "tags": [
      "env:prod",
      "team:security"
    ],
    "type": "log_detection"
  },
  "ruleQueryPayloads": [
    {
      "expectedResult": true,
      "index": 0,
      "payload": {
        "ddsource": "source_here",
        "ddtags": "env:staging,version:5.1",
        "hostname": "i-012345678",
        "message": "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World",
        "service": "payment",
        "userIdentity": {
          "assumed_role": "fake assumed_role"
        }
      }
    }
  ]
}
EOF
```

##### Test a rule returns "OK" response

```
// Test a rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleTestRequest{ Rule: &datadogV2.SecurityMonitoringRuleTestPayload{ SecurityMonitoringStandardRuleTestPayload: &datadogV2.SecurityMonitoringStandardRuleTestPayload{ Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, Condition: datadog.PtrString("a > 0"), }, }, HasExtendedTitle: datadog.PtrBool(true), IsEnabled: true, Message: "My security monitoring rule message.", Name: "My security monitoring rule.", Options: datadogV2.SecurityMonitoringRuleOptions{ DecreaseCriticalityBasedOnEnv: datadog.PtrBool(false), DetectionMethod: datadogV2.SECURITYMONITORINGRULEDETECTIONMETHOD_THRESHOLD.Ptr(), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ZERO_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ZERO_MINUTES.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ZERO_MINUTES.Ptr(), }, Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("source:source_here"), GroupByFields: []string{ "@userIdentity.assumed_role", }, DistinctFields: []string{},
Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), Name: datadog.PtrString(""), }, }, Tags: []string{ "env:prod", "team:security", }, Type: datadogV2.SECURITYMONITORINGRULETYPETEST_LOG_DETECTION.Ptr(), }}, RuleQueryPayloads: []datadogV2.SecurityMonitoringRuleQueryPayload{ { ExpectedResult: datadog.PtrBool(true), Index: datadog.PtrInt64(0), Payload: &datadogV2.SecurityMonitoringRuleQueryPayloadData{ Ddsource: datadog.PtrString("source_here"), Ddtags: datadog.PtrString("env:staging,version:5.1"), Hostname: datadog.PtrString("i-012345678"), Message: datadog.PtrString("2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World"), Service: datadog.PtrString("payment"), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.TestSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.TestSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.TestSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Test a rule returns "OK" response ``` // Test a rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleDetectionMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryPayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryPayloadData; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTestPayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTestRequest; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTestResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeTest; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleTestPayload; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleTestRequest body = new SecurityMonitoringRuleTestRequest() .rule( new SecurityMonitoringRuleTestPayload( new SecurityMonitoringStandardRuleTestPayload() 
.cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .hasExtendedTitle(true) .isEnabled(true) .message("My security monitoring rule message.") .name("My security monitoring rule.") .options( new SecurityMonitoringRuleOptions() .decreaseCriticalityBasedOnEnv(false) .detectionMethod(SecurityMonitoringRuleDetectionMethod.THRESHOLD) .evaluationWindow( SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.ZERO_MINUTES) .maxSignalDuration( SecurityMonitoringRuleMaxSignalDuration.ZERO_MINUTES)) .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("source:source_here") .groupByFields( Collections.singletonList("@userIdentity.assumed_role")) .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .name(""))) .tags(Arrays.asList("env:prod", "team:security")) .type(SecurityMonitoringRuleTypeTest.LOG_DETECTION))) .ruleQueryPayloads( Collections.singletonList( new SecurityMonitoringRuleQueryPayload() .expectedResult(true) .index(0L) .payload( new SecurityMonitoringRuleQueryPayloadData() .ddsource("source_here") .ddtags("env:staging,version:5.1") .hostname("i-012345678") .message( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello" + " World") .service("payment")))); try { SecurityMonitoringRuleTestResponse result = apiInstance.testSecurityMonitoringRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#testSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Test a rule returns "OK" response ``` """ Test a rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_query_payload import SecurityMonitoringRuleQueryPayload from datadog_api_client.v2.model.security_monitoring_rule_query_payload_data import ( SecurityMonitoringRuleQueryPayloadData, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import 
SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_test_request import SecurityMonitoringRuleTestRequest from datadog_api_client.v2.model.security_monitoring_rule_type_test import SecurityMonitoringRuleTypeTest from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery from datadog_api_client.v2.model.security_monitoring_standard_rule_test_payload import ( SecurityMonitoringStandardRuleTestPayload, ) body = SecurityMonitoringRuleTestRequest( rule=SecurityMonitoringStandardRuleTestPayload( cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], condition="a > 0", ), ], has_extended_title=True, is_enabled=True, message="My security monitoring rule message.", name="My security monitoring rule.", options=SecurityMonitoringRuleOptions( decrease_criticality_based_on_env=False, detection_method=SecurityMonitoringRuleDetectionMethod.THRESHOLD, evaluation_window=SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.ZERO_MINUTES, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ZERO_MINUTES, ), queries=[ SecurityMonitoringStandardRuleQuery( query="source:source_here", group_by_fields=[ "@userIdentity.assumed_role", ], distinct_fields=[], aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, name="", ), ], tags=[ "env:prod", "team:security", ], type=SecurityMonitoringRuleTypeTest.LOG_DETECTION, ), rule_query_payloads=[ SecurityMonitoringRuleQueryPayload( expected_result=True, index=0, payload=SecurityMonitoringRuleQueryPayloadData( ddsource="source_here", ddtags="env:staging,version:5.1", hostname="i-012345678", message="2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service="payment", user_identity=dict([("assumed_role", "fake assumed_role")]), ), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.test_security_monitoring_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Test a rule returns "OK" response ``` # Test a rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringRuleTestRequest.new({ rule: DatadogAPIClient::V2::SecurityMonitoringStandardRuleTestPayload.new({ cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], condition: "a > 0", }), ], has_extended_title: true, is_enabled: true, message: "My security monitoring rule message.", name: "My security monitoring rule.", options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ decrease_criticality_based_on_env: false, detection_method: DatadogAPIClient::V2::SecurityMonitoringRuleDetectionMethod::THRESHOLD, evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ZERO_MINUTES, max_signal_duration: 
DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ZERO_MINUTES, }), queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "source:source_here", group_by_fields: [ "@userIdentity.assumed_role", ], distinct_fields: [], aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, name: "", }), ], tags: [ "env:prod", "team:security", ], type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeTest::LOG_DETECTION, }), rule_query_payloads: [ DatadogAPIClient::V2::SecurityMonitoringRuleQueryPayload.new({ expected_result: true, index: 0, payload: DatadogAPIClient::V2::SecurityMonitoringRuleQueryPayloadData.new({ ddsource: "source_here", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }), }), ], }) p api_instance.test_security_monitoring_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Test a rule returns "OK" response ``` // Test a rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleDetectionMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryPayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryPayloadData; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTestPayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTestRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeTest; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleTestPayload; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleTestRequest::new() .rule( SecurityMonitoringRuleTestPayload::SecurityMonitoringStandardRuleTestPayload(Box::new( SecurityMonitoringStandardRuleTestPayload::new( vec![SecurityMonitoringRuleCaseCreate::new( SecurityMonitoringRuleSeverity::INFO, ) .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![])], true, "My security monitoring rule message.".to_string(), "My security monitoring rule.".to_string(), SecurityMonitoringRuleOptions::new() .decrease_criticality_based_on_env(false) .detection_method(SecurityMonitoringRuleDetectionMethod::THRESHOLD) .evaluation_window(SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::ZERO_MINUTES) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ZERO_MINUTES), 
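                // Next positional argument: the rule queries. The case condition above ("a > 0")
                // refers to the first query, since unnamed queries are referenced by position as a, b, c, ...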
vec![SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec!["@userIdentity.assumed_role".to_string()]) .name("".to_string()) .query("source:source_here".to_string())], ) .has_extended_title(true) .tags(vec!["env:prod".to_string(), "team:security".to_string()]) .type_(SecurityMonitoringRuleTypeTest::LOG_DETECTION), )), ) .rule_query_payloads(vec![SecurityMonitoringRuleQueryPayload::new() .expected_result(true) .index(0) .payload( SecurityMonitoringRuleQueryPayloadData::new() .ddsource("source_here".to_string()) .ddtags("env:staging,version:5.1".to_string()) .hostname("i-012345678".to_string()) .message( "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World" .to_string(), ) .service("payment".to_string()), )]); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.test_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Test a rule returns "OK" response ``` /** * Test a rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiTestSecurityMonitoringRuleRequest = { body: { rule: { cases: [ { name: "", status: "info", notifications: [], condition: "a > 0", }, ], hasExtendedTitle: true, isEnabled: true, message: "My security monitoring rule message.", name: "My security monitoring rule.", options: { decreaseCriticalityBasedOnEnv: false, detectionMethod: "threshold", evaluationWindow: 0, keepAlive: 0, maxSignalDuration: 0, }, queries: [ { query: "source:source_here", groupByFields: ["@userIdentity.assumed_role"], distinctFields: [], aggregation: "count", name: "", }, ], tags: ["env:prod", "team:security"], type: "log_detection", }, ruleQueryPayloads: [ { expectedResult: true, index: 0, payload: { ddsource: "source_here", ddtags: "env:staging,version:5.1", hostname: "i-012345678", message: "2019-11-19T14:37:58,995 INFO [process.name][20081] Hello World", service: "payment", }, }, ], }, }; apiInstance .testSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleTestResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Validate a detection rule](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-detection-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#validate-a-detection-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/validationhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/validationhttps://api.datadoghq.eu/api/v2/security_monitoring/rules/validationhttps://api.ddog-gov.com/api/v2/security_monitoring/rules/validationhttps://api.datadoghq.com/api/v2/security_monitoring/rules/validationhttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/validationhttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/validation ### Overview Validate a detection rule. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description Option 1 object The payload of a rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. 
groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. 
Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. 
Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `api_security,application_security,log_detection,workload_security` Option 2 object The payload of a signal correlation rule. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. 
Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. 
For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. 
Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting signals which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to group by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId [_required_] string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` Option 3 object The payload of a cloud configuration rule. cases [_required_] [object] Description of generated findings and signals (severity and channels to be notified in case of a signal). Must contain exactly one item. notifications [string] Notification targets for each rule case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions [_required_] object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message in markdown format for generated findings and signals. name [_required_] string The name of the rule. options [_required_] object Options on cloud configuration rules. complianceRuleOptions [_required_] object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. tags [string] Tags for generated findings and signals. type enum The rule type. 
Allowed enum values: `cloud_configuration` ##### Validate a detection rule returns "OK" response ``` { "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "a > 0" } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 1800, "keepAlive": 1800, "maxSignalDuration": 1800, "detectionMethod": "threshold" }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "aggregation": "count", "name": "" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" } ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` { "cases": [ { "name": "", "status": "info", "notifications": [] } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 0, "keepAlive": 300, "maxSignalDuration": 600, "detectionMethod": "new_value", "newValueOptions": { "forgetAfter": 7, "instantaneousBaseline": true, "learningDuration": 1, "learningThreshold": 0, "learningMethod": "duration" } }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "metric": "name", "metrics": [ "name" ], "aggregation": "new_value", "name": "", "dataSource": "logs" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" } ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` { "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "step_b > 0" } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 0, "keepAlive": 300, "maxSignalDuration": 600, "detectionMethod": "sequence_detection", "sequenceDetectionOptions": { "stepTransitions": [ { "child": "step_b", "evaluationWindow": 900, "parent": "step_a" } ], "steps": [ { "condition": "a > 0", "evaluationWindow": 60, "name": "step_a" }, { "condition": "b > 0", "evaluationWindow": 60, "name": "step_b" } ] } }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "aggregation": "count", "name": "" }, { "query": "source:source_here2", "groupByFields": [], "distinctFields": [], "aggregation": "count", "name": "" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringRule-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ValidateSecurityMonitoringRule-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Validate a detection rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/validation" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "a > 0" } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 1800, "keepAlive": 1800, "maxSignalDuration": 1800, "detectionMethod": "threshold" }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "aggregation": "count", "name": "" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" } EOF ``` ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/validation" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "cases": [ { "name": "", "status": "info", "notifications": [] } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 0, "keepAlive": 300, "maxSignalDuration": 600, "detectionMethod": "new_value", "newValueOptions": { "forgetAfter": 7, "instantaneousBaseline": true, "learningDuration": 1, "learningThreshold": 0, "learningMethod": "duration" } }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "metric": "name", "metrics": [ "name" ], "aggregation": "new_value", "name": "", "dataSource": "logs" } ], "tags": [ "env:prod", 
"team:security" ], "type": "log_detection" } EOF ``` ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/validation" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "cases": [ { "name": "", "status": "info", "notifications": [], "condition": "step_b > 0" } ], "hasExtendedTitle": true, "isEnabled": true, "message": "My security monitoring rule", "name": "My security monitoring rule", "options": { "evaluationWindow": 0, "keepAlive": 300, "maxSignalDuration": 600, "detectionMethod": "sequence_detection", "sequenceDetectionOptions": { "stepTransitions": [ { "child": "step_b", "evaluationWindow": 900, "parent": "step_a" } ], "steps": [ { "condition": "a > 0", "evaluationWindow": 60, "name": "step_a" }, { "condition": "b > 0", "evaluationWindow": 60, "name": "step_b" } ] } }, "queries": [ { "query": "source:source_here", "groupByFields": [ "@userIdentity.assumed_role" ], "distinctFields": [], "aggregation": "count", "name": "" }, { "query": "source:source_here2", "groupByFields": [], "distinctFields": [], "aggregation": "count", "name": "" } ], "tags": [ "env:prod", "team:security" ], "type": "log_detection" } EOF ``` ##### Validate a detection rule returns "OK" response ``` // Validate a detection rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleValidatePayload{ SecurityMonitoringStandardRulePayload: &datadogV2.SecurityMonitoringStandardRulePayload{ Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, Condition: datadog.PtrString("a > 0"), }, }, HasExtendedTitle: datadog.PtrBool(true), IsEnabled: true, Message: "My security monitoring rule", Name: "My security monitoring rule", Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_THIRTY_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_THIRTY_MINUTES.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_THIRTY_MINUTES.Ptr(), DetectionMethod: datadogV2.SECURITYMONITORINGRULEDETECTIONMETHOD_THRESHOLD.Ptr(), }, Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("source:source_here"), GroupByFields: []string{ "@userIdentity.assumed_role", }, DistinctFields: []string{}, Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), Name: datadog.PtrString(""), }, }, Tags: []string{ "env:prod", "team:security", }, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.ValidateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ValidateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) 
} } ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` // Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" // response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleValidatePayload{ SecurityMonitoringStandardRulePayload: &datadogV2.SecurityMonitoringStandardRulePayload{ Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, }, }, HasExtendedTitle: datadog.PtrBool(true), IsEnabled: true, Message: "My security monitoring rule", Name: "My security monitoring rule", Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ZERO_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_FIVE_MINUTES.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_TEN_MINUTES.Ptr(), DetectionMethod: datadogV2.SECURITYMONITORINGRULEDETECTIONMETHOD_NEW_VALUE.Ptr(), NewValueOptions: &datadogV2.SecurityMonitoringRuleNewValueOptions{ ForgetAfter: datadogV2.SECURITYMONITORINGRULENEWVALUEOPTIONSFORGETAFTER_ONE_WEEK.Ptr(), InstantaneousBaseline: datadog.PtrBool(true), LearningDuration: datadogV2.SECURITYMONITORINGRULENEWVALUEOPTIONSLEARNINGDURATION_ONE_DAY.Ptr(), LearningThreshold: datadogV2.SECURITYMONITORINGRULENEWVALUEOPTIONSLEARNINGTHRESHOLD_ZERO_OCCURRENCES.Ptr(), LearningMethod: datadogV2.SECURITYMONITORINGRULENEWVALUEOPTIONSLEARNINGMETHOD_DURATION.Ptr(), }, }, Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("source:source_here"), GroupByFields: []string{ "@userIdentity.assumed_role", }, DistinctFields: []string{}, Metric: datadog.PtrString("name"), Metrics: []string{ "name", }, Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_NEW_VALUE.Ptr(), Name: datadog.PtrString(""), DataSource: datadogV2.SECURITYMONITORINGSTANDARDDATASOURCE_LOGS.Ptr(), }, }, Tags: []string{ "env:prod", "team:security", }, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.ValidateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ValidateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Validate a detection rule with detection method 'sequence_detection' returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleValidatePayload{ SecurityMonitoringStandardRulePayload: &datadogV2.SecurityMonitoringStandardRulePayload{ Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, Condition: datadog.PtrString("step_b > 0"), }, }, 
HasExtendedTitle: datadog.PtrBool(true), IsEnabled: true, Message: "My security monitoring rule", Name: "My security monitoring rule", Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ZERO_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_FIVE_MINUTES.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_TEN_MINUTES.Ptr(), DetectionMethod: datadogV2.SECURITYMONITORINGRULEDETECTIONMETHOD_SEQUENCE_DETECTION.Ptr(), SequenceDetectionOptions: &datadogV2.SecurityMonitoringRuleSequenceDetectionOptions{ StepTransitions: []datadogV2.SecurityMonitoringRuleSequenceDetectionStepTransition{ { Child: datadog.PtrString("step_b"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), Parent: datadog.PtrString("step_a"), }, }, Steps: []datadogV2.SecurityMonitoringRuleSequenceDetectionStep{ { Condition: datadog.PtrString("a > 0"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ONE_MINUTE.Ptr(), Name: datadog.PtrString("step_a"), }, { Condition: datadog.PtrString("b > 0"), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_ONE_MINUTE.Ptr(), Name: datadog.PtrString("step_b"), }, }, }, }, Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("source:source_here"), GroupByFields: []string{ "@userIdentity.assumed_role", }, DistinctFields: []string{}, Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), Name: datadog.PtrString(""), }, { Query: datadog.PtrString("source:source_here2"), GroupByFields: []string{}, DistinctFields: []string{}, Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), Name: datadog.PtrString(""), }, }, Tags: []string{ "env:prod", "team:security", }, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.ValidateSecurityMonitoringRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ValidateSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Validate a detection rule returns "OK" response ``` // Validate a detection rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleDetectionMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import 
com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleValidatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRulePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleValidatePayload body = new SecurityMonitoringRuleValidatePayload( new SecurityMonitoringStandardRulePayload() .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .hasExtendedTitle(true) .isEnabled(true) .message("My security monitoring rule") .name("My security monitoring rule") .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.THIRTY_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.THIRTY_MINUTES) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.THIRTY_MINUTES) .detectionMethod(SecurityMonitoringRuleDetectionMethod.THRESHOLD)) .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("source:source_here") .groupByFields(Collections.singletonList("@userIdentity.assumed_role")) .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .name(""))) .tags(Arrays.asList("env:prod", "team:security")) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION)); try { apiInstance.validateSecurityMonitoringRule(body); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#validateSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` // Validate a detection rule with detection method 'new_value' with enabled feature // 'instantaneousBaseline' returns "OK" // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleDetectionMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleNewValueOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleNewValueOptionsForgetAfter; import com.datadog.api.client.v2.model.SecurityMonitoringRuleNewValueOptionsLearningDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleNewValueOptionsLearningMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleNewValueOptionsLearningThreshold; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import 
com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleValidatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardDataSource; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRulePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleValidatePayload body = new SecurityMonitoringRuleValidatePayload( new SecurityMonitoringStandardRulePayload() .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO))) .hasExtendedTitle(true) .isEnabled(true) .message("My security monitoring rule") .name("My security monitoring rule") .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.FIVE_MINUTES) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES) .detectionMethod(SecurityMonitoringRuleDetectionMethod.NEW_VALUE) .newValueOptions( new SecurityMonitoringRuleNewValueOptions() .forgetAfter( SecurityMonitoringRuleNewValueOptionsForgetAfter.ONE_WEEK) .instantaneousBaseline(true) .learningDuration( SecurityMonitoringRuleNewValueOptionsLearningDuration.ONE_DAY) .learningThreshold( SecurityMonitoringRuleNewValueOptionsLearningThreshold .ZERO_OCCURRENCES) .learningMethod( SecurityMonitoringRuleNewValueOptionsLearningMethod.DURATION))) .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("source:source_here") .groupByFields(Collections.singletonList("@userIdentity.assumed_role")) .metric("name") .metrics(Collections.singletonList("name")) .aggregation(SecurityMonitoringRuleQueryAggregation.NEW_VALUE) .name("") .dataSource(SecurityMonitoringStandardDataSource.LOGS))) .tags(Arrays.asList("env:prod", "team:security")) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION)); try { apiInstance.validateSecurityMonitoringRule(body); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#validateSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Validate a detection rule with detection method 'sequence_detection' returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleDetectionMethod; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; 
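// Note: the per-step and step-transition evaluation windows below reuse the
// SecurityMonitoringRuleEvaluationWindow enum (ONE_MINUTE for the steps, FIFTEEN_MINUTES for the transition).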
import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionStep; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSequenceDetectionStepTransition; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleValidatePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRulePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleValidatePayload body = new SecurityMonitoringRuleValidatePayload( new SecurityMonitoringStandardRulePayload() .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("step_b > 0"))) .hasExtendedTitle(true) .isEnabled(true) .message("My security monitoring rule") .name("My security monitoring rule") .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.FIVE_MINUTES) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES) .detectionMethod(SecurityMonitoringRuleDetectionMethod.SEQUENCE_DETECTION) .sequenceDetectionOptions( new SecurityMonitoringRuleSequenceDetectionOptions() .stepTransitions( Collections.singletonList( new SecurityMonitoringRuleSequenceDetectionStepTransition() .child("step_b") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow .FIFTEEN_MINUTES) .parent("step_a"))) .steps( Arrays.asList( new SecurityMonitoringRuleSequenceDetectionStep() .condition("a > 0") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE) .name("step_a"), new SecurityMonitoringRuleSequenceDetectionStep() .condition("b > 0") .evaluationWindow( SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE) .name("step_b"))))) .queries( Arrays.asList( new SecurityMonitoringStandardRuleQuery() .query("source:source_here") .groupByFields(Collections.singletonList("@userIdentity.assumed_role")) .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .name(""), new SecurityMonitoringStandardRuleQuery() .query("source:source_here2") .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .name(""))) .tags(Arrays.asList("env:prod", "team:security")) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION)); try { apiInstance.validateSecurityMonitoringRule(body); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#validateSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Validate a detection rule returns "OK" 
response ``` """ Validate a detection rule returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_rule_payload import SecurityMonitoringStandardRulePayload from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRulePayload( cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], condition="a > 0", ), ], has_extended_title=True, is_enabled=True, message="My security monitoring rule", name="My security monitoring rule", options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.THIRTY_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.THIRTY_MINUTES, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.THIRTY_MINUTES, detection_method=SecurityMonitoringRuleDetectionMethod.THRESHOLD, ), queries=[ SecurityMonitoringStandardRuleQuery( query="source:source_here", group_by_fields=[ "@userIdentity.assumed_role", ], distinct_fields=[], aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, name="", ), ], tags=[ "env:prod", "team:security", ], type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.validate_security_monitoring_rule(body=body) ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` """ Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from 
datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_new_value_options import SecurityMonitoringRuleNewValueOptions from datadog_api_client.v2.model.security_monitoring_rule_new_value_options_forget_after import ( SecurityMonitoringRuleNewValueOptionsForgetAfter, ) from datadog_api_client.v2.model.security_monitoring_rule_new_value_options_learning_duration import ( SecurityMonitoringRuleNewValueOptionsLearningDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_new_value_options_learning_method import ( SecurityMonitoringRuleNewValueOptionsLearningMethod, ) from datadog_api_client.v2.model.security_monitoring_rule_new_value_options_learning_threshold import ( SecurityMonitoringRuleNewValueOptionsLearningThreshold, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_data_source import SecurityMonitoringStandardDataSource from datadog_api_client.v2.model.security_monitoring_standard_rule_payload import SecurityMonitoringStandardRulePayload from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRulePayload( cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], ), ], has_extended_title=True, is_enabled=True, message="My security monitoring rule", name="My security monitoring rule", options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.FIVE_MINUTES, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES, detection_method=SecurityMonitoringRuleDetectionMethod.NEW_VALUE, new_value_options=SecurityMonitoringRuleNewValueOptions( forget_after=SecurityMonitoringRuleNewValueOptionsForgetAfter.ONE_WEEK, instantaneous_baseline=True, learning_duration=SecurityMonitoringRuleNewValueOptionsLearningDuration.ONE_DAY, learning_threshold=SecurityMonitoringRuleNewValueOptionsLearningThreshold.ZERO_OCCURRENCES, learning_method=SecurityMonitoringRuleNewValueOptionsLearningMethod.DURATION, ), ), queries=[ SecurityMonitoringStandardRuleQuery( query="source:source_here", group_by_fields=[ "@userIdentity.assumed_role", ], distinct_fields=[], metric="name", metrics=[ "name", ], aggregation=SecurityMonitoringRuleQueryAggregation.NEW_VALUE, name="", data_source=SecurityMonitoringStandardDataSource.LOGS, ), ], tags=[ "env:prod", "team:security", ], type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.validate_security_monitoring_rule(body=body) ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` """ Validate a detection rule with detection method 'sequence_detection' returns "OK" response """ from datadog_api_client import ApiClient, Configuration from 
datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_detection_method import SecurityMonitoringRuleDetectionMethod from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_options import ( SecurityMonitoringRuleSequenceDetectionOptions, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_step import ( SecurityMonitoringRuleSequenceDetectionStep, ) from datadog_api_client.v2.model.security_monitoring_rule_sequence_detection_step_transition import ( SecurityMonitoringRuleSequenceDetectionStepTransition, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_rule_payload import SecurityMonitoringStandardRulePayload from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRulePayload( cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], condition="step_b > 0", ), ], has_extended_title=True, is_enabled=True, message="My security monitoring rule", name="My security monitoring rule", options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.ZERO_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.FIVE_MINUTES, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.TEN_MINUTES, detection_method=SecurityMonitoringRuleDetectionMethod.SEQUENCE_DETECTION, sequence_detection_options=SecurityMonitoringRuleSequenceDetectionOptions( step_transitions=[ SecurityMonitoringRuleSequenceDetectionStepTransition( child="step_b", evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, parent="step_a", ), ], steps=[ SecurityMonitoringRuleSequenceDetectionStep( condition="a > 0", evaluation_window=SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE, name="step_a", ), SecurityMonitoringRuleSequenceDetectionStep( condition="b > 0", evaluation_window=SecurityMonitoringRuleEvaluationWindow.ONE_MINUTE, name="step_b", ), ], ), ), queries=[ SecurityMonitoringStandardRuleQuery( query="source:source_here", group_by_fields=[ "@userIdentity.assumed_role", ], distinct_fields=[], aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, name="", ), SecurityMonitoringStandardRuleQuery( query="source:source_here2", group_by_fields=[], distinct_fields=[], aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, name="", ), ], tags=[ "env:prod", "team:security", ], type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, ) configuration = Configuration() with ApiClient(configuration) as 
api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.validate_security_monitoring_rule(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Validate a detection rule returns "OK" response ``` # Validate a detection rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRulePayload.new({ cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], condition: "a > 0", }), ], has_extended_title: true, is_enabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::THIRTY_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::THIRTY_MINUTES, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::THIRTY_MINUTES, detection_method: DatadogAPIClient::V2::SecurityMonitoringRuleDetectionMethod::THRESHOLD, }), queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "source:source_here", group_by_fields: [ "@userIdentity.assumed_role", ], distinct_fields: [], aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, name: "", }), ], tags: [ "env:prod", "team:security", ], type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, }) api_instance.validate_security_monitoring_rule(body) ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` # Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRulePayload.new({ cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], }), ], has_extended_title: true, is_enabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::FIVE_MINUTES, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES, detection_method: DatadogAPIClient::V2::SecurityMonitoringRuleDetectionMethod::NEW_VALUE, new_value_options: DatadogAPIClient::V2::SecurityMonitoringRuleNewValueOptions.new({ forget_after: DatadogAPIClient::V2::SecurityMonitoringRuleNewValueOptionsForgetAfter::ONE_WEEK, instantaneous_baseline: true, learning_duration: DatadogAPIClient::V2::SecurityMonitoringRuleNewValueOptionsLearningDuration::ONE_DAY, learning_threshold: 
DatadogAPIClient::V2::SecurityMonitoringRuleNewValueOptionsLearningThreshold::ZERO_OCCURRENCES, learning_method: DatadogAPIClient::V2::SecurityMonitoringRuleNewValueOptionsLearningMethod::DURATION, }), }), queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "source:source_here", group_by_fields: [ "@userIdentity.assumed_role", ], distinct_fields: [], metric: "name", metrics: [ "name", ], aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::NEW_VALUE, name: "", data_source: DatadogAPIClient::V2::SecurityMonitoringStandardDataSource::LOGS, }), ], tags: [ "env:prod", "team:security", ], type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, }) api_instance.validate_security_monitoring_rule(body) ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` # Validate a detection rule with detection method 'sequence_detection' returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRulePayload.new({ cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], condition: "step_b > 0", }), ], has_extended_title: true, is_enabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::FIVE_MINUTES, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES, detection_method: DatadogAPIClient::V2::SecurityMonitoringRuleDetectionMethod::SEQUENCE_DETECTION, sequence_detection_options: DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionOptions.new({ step_transitions: [ DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStepTransition.new({ child: "step_b", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, parent: "step_a", }), ], steps: [ DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStep.new({ condition: "a > 0", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, name: "step_a", }), DatadogAPIClient::V2::SecurityMonitoringRuleSequenceDetectionStep.new({ condition: "b > 0", evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, name: "step_b", }), ], }), }), queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "source:source_here", group_by_fields: [ "@userIdentity.assumed_role", ], distinct_fields: [], aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, name: "", }), DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "source:source_here2", group_by_fields: [], distinct_fields: [], aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, name: "", }), ], tags: [ "env:prod", "team:security", ], type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, }) api_instance.validate_security_monitoring_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Validate a detection rule returns "OK" response ``` // Validate a detection rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleDetectionMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleValidatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRulePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleValidatePayload::SecurityMonitoringStandardRulePayload(Box::new( SecurityMonitoringStandardRulePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "My security monitoring rule".to_string(), "My security monitoring rule".to_string(), SecurityMonitoringRuleOptions::new() .detection_method(SecurityMonitoringRuleDetectionMethod::THRESHOLD) .evaluation_window(SecurityMonitoringRuleEvaluationWindow::THIRTY_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::THIRTY_MINUTES) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::THIRTY_MINUTES), vec![SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec!["@userIdentity.assumed_role".to_string()]) .name("".to_string()) .query("source:source_here".to_string())], ) .has_extended_title(true) .tags(vec!["env:prod".to_string(), "team:security".to_string()]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.validate_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` // Validate a detection rule with detection method 'new_value' with enabled // feature 'instantaneousBaseline' returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleDetectionMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use 
datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleNewValueOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleNewValueOptionsForgetAfter; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleNewValueOptionsLearningDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleNewValueOptionsLearningMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleNewValueOptionsLearningThreshold; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleValidatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardDataSource; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRulePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleValidatePayload::SecurityMonitoringStandardRulePayload( Box::new( SecurityMonitoringStandardRulePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .name("".to_string()) .notifications(vec![]) ], true, "My security monitoring rule".to_string(), "My security monitoring rule".to_string(), SecurityMonitoringRuleOptions::new() .detection_method(SecurityMonitoringRuleDetectionMethod::NEW_VALUE) .evaluation_window(SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::FIVE_MINUTES) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES) .new_value_options( SecurityMonitoringRuleNewValueOptions::new() .forget_after(SecurityMonitoringRuleNewValueOptionsForgetAfter::ONE_WEEK) .instantaneous_baseline(true) .learning_duration(SecurityMonitoringRuleNewValueOptionsLearningDuration::ONE_DAY) .learning_method(SecurityMonitoringRuleNewValueOptionsLearningMethod::DURATION) .learning_threshold( SecurityMonitoringRuleNewValueOptionsLearningThreshold::ZERO_OCCURRENCES, ), ), vec![ SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::NEW_VALUE) .data_source(SecurityMonitoringStandardDataSource::LOGS) .distinct_fields(vec![]) .group_by_fields(vec!["@userIdentity.assumed_role".to_string()]) .metric("name".to_string()) .metrics(vec!["name".to_string()]) .name("".to_string()) .query("source:source_here".to_string()) ], ) .has_extended_title(true) .tags(vec!["env:prod".to_string(), "team:security".to_string()]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.validate_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` // Validate a detection rule with detection method 'sequence_detection' returns // "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use 
datadog_api_client::datadogV2::model::SecurityMonitoringRuleDetectionMethod; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionStep; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSequenceDetectionStepTransition; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleValidatePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRulePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleValidatePayload::SecurityMonitoringStandardRulePayload(Box::new( SecurityMonitoringStandardRulePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("step_b > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "My security monitoring rule".to_string(), "My security monitoring rule".to_string(), SecurityMonitoringRuleOptions::new() .detection_method(SecurityMonitoringRuleDetectionMethod::SEQUENCE_DETECTION) .evaluation_window(SecurityMonitoringRuleEvaluationWindow::ZERO_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::FIVE_MINUTES) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::TEN_MINUTES) .sequence_detection_options( SecurityMonitoringRuleSequenceDetectionOptions::new() .step_transitions(vec![ SecurityMonitoringRuleSequenceDetectionStepTransition::new() .child("step_b".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, ) .parent("step_a".to_string()), ]) .steps(vec![ SecurityMonitoringRuleSequenceDetectionStep::new() .condition("a > 0".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, ) .name("step_a".to_string()), SecurityMonitoringRuleSequenceDetectionStep::new() .condition("b > 0".to_string()) .evaluation_window( SecurityMonitoringRuleEvaluationWindow::ONE_MINUTE, ) .name("step_b".to_string()), ]), ), vec![ SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec!["@userIdentity.assumed_role".to_string()]) .name("".to_string()) .query("source:source_here".to_string()), SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .name("".to_string()) .query("source:source_here2".to_string()), ], ) .has_extended_title(true) .tags(vec!["env:prod".to_string(), "team:security".to_string()]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.validate_security_monitoring_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Validate a detection rule returns "OK" response ``` /** * Validate a detection rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiValidateSecurityMonitoringRuleRequest = { body: { cases: [ { name: "", status: "info", notifications: [], condition: "a > 0", }, ], hasExtendedTitle: true, isEnabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: { evaluationWindow: 1800, keepAlive: 1800, maxSignalDuration: 1800, detectionMethod: "threshold", }, queries: [ { query: "source:source_here", groupByFields: ["@userIdentity.assumed_role"], distinctFields: [], aggregation: "count", name: "", }, ], tags: ["env:prod", "team:security"], type: "log_detection", }, }; apiInstance .validateSecurityMonitoringRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" response ``` /** * Validate a detection rule with detection method 'new_value' with enabled feature 'instantaneousBaseline' returns "OK" * response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiValidateSecurityMonitoringRuleRequest = { body: { cases: [ { name: "", status: "info", notifications: [], }, ], hasExtendedTitle: true, isEnabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: { evaluationWindow: 0, keepAlive: 300, maxSignalDuration: 600, detectionMethod: "new_value", newValueOptions: { forgetAfter: 7, instantaneousBaseline: true, learningDuration: 1, learningThreshold: 0, learningMethod: "duration", }, }, queries: [ { query: "source:source_here", groupByFields: ["@userIdentity.assumed_role"], distinctFields: [], metric: "name", metrics: ["name"], aggregation: "new_value", name: "", dataSource: "logs", }, ], tags: ["env:prod", "team:security"], type: "log_detection", }, }; apiInstance .validateSecurityMonitoringRule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Validate a detection rule with detection method 'sequence_detection' returns "OK" response ``` /** * Validate a detection rule with detection method 'sequence_detection' returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiValidateSecurityMonitoringRuleRequest = { body: { cases: [ { name: "", status: "info", notifications: [], condition: "step_b > 0", }, ], hasExtendedTitle: true, isEnabled: true, message: "My security monitoring rule", name: "My security monitoring rule", options: { evaluationWindow: 0, keepAlive: 300, maxSignalDuration: 600, detectionMethod: "sequence_detection", sequenceDetectionOptions: { stepTransitions: [ { child: "step_b", evaluationWindow: 900, parent: "step_a", }, ], steps: [ { condition: "a > 0", evaluationWindow: 60, name: "step_a", }, { condition: "b > 0", evaluationWindow: 60, name: "step_b", }, ], }, }, queries: [ { query: "source:source_here", groupByFields: ["@userIdentity.assumed_role"], distinctFields: [], aggregation: "count", name: "", }, { query: "source:source_here2", groupByFields: [], distinctFields: [], aggregation: "count", name: "", }, ], tags: ["env:prod", "team:security"], type: "log_detection", }, }; apiInstance .validateSecurityMonitoringRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Change the related incidents of a security signal](https://docs.datadoghq.com/api/latest/security-monitoring/#change-the-related-incidents-of-a-security-signal) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#change-the-related-incidents-of-a-security-signal-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.datadoghq.eu/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.ddog-gov.com/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.us3.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/incidentshttps://api.us5.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}/incidents ### Overview Change the related incidents for a security signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Request #### Body Data (required) Attributes describing the signal update. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Data containing the patch for changing the related incidents of a signal. 
attributes [_required_] object Attributes describing the new list of related signals for a security signal. incident_ids [_required_] [integer] Array of incidents that are associated with this signal. version int64 Version of the updated signal. If server side version is higher, update will be rejected. ``` { "data": { "attributes": { "incident_ids": [ 2066 ] } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalIncidents-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalIncidents-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalIncidents-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalIncidents-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#EditSecurityMonitoringSignalIncidents-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response returned after all triage operations, containing the updated signal triage data. Field Type Description data [_required_] object Data containing the updated triage attributes of the signal. attributes object Attributes describing a triage state update operation over a security signal. archive_comment string Optional comment to display on archived signals. archive_comment_timestamp int64 Timestamp of the last edit to the comment. archive_comment_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. archive_reason enum Reason a signal is archived. Allowed enum values: `none,false_positive,testing_or_maintenance,investigated_case_opened,true_positive_benign,true_positive_malicious,other` assignee [_required_] object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. incident_ids [_required_] [integer] Array of incidents that are associated with this signal. state [_required_] enum The new triage state of the signal. Allowed enum values: `open,archived,under_review` state_update_timestamp int64 Timestamp of the last update to the signal state. state_update_user object Object representing a given user entity. handle string The handle for this user account. icon string Gravatar icon associated to the user. id int64 Numerical ID assigned by Datadog to this user account. name string The name for this user account. uuid [_required_] string UUID assigned by Datadog to this user account. id string The unique ID of the security signal. type enum The type of event. 
Allowed enum values: `signal_metadata` default: `signal_metadata` ``` { "data": { "attributes": { "archive_comment": "string", "archive_comment_timestamp": "integer", "archive_comment_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "archive_reason": "string", "assignee": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" }, "incident_ids": [ 2066 ], "state": "open", "state_update_timestamp": "integer", "state_update_user": { "handle": "string", "icon": "/path/to/matching/gravatar/icon", "id": "integer", "name": "string", "uuid": "773b045d-ccf8-4808-bd3b-955ef6a8c940" } }, "id": "string", "type": "signal_metadata" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Change the related incidents of a security signal returns "OK" response Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X PATCH "https://api.datadoghq.com/api/v2/security_monitoring/signals/${signal_id}/incidents" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "incident_ids": [ 2066 ] } } } EOF ``` ##### Change the related incidents of a security signal returns "OK" response ``` // Change the related incidents of a security signal returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSignalIncidentsUpdateRequest{ Data: datadogV2.SecurityMonitoringSignalIncidentsUpdateData{ Attributes: datadogV2.SecurityMonitoringSignalIncidentsUpdateAttributes{ IncidentIds: []int64{ 2066, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.EditSecurityMonitoringSignalIncidents(ctx, "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.EditSecurityMonitoringSignalIncidents`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.EditSecurityMonitoringSignalIncidents`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, where `DD_SITE` is your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Change the related incidents of a security signal returns "OK" response ``` // Change the related incidents of a security signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalIncidentsUpdateAttributes; import com.datadog.api.client.v2.model.SecurityMonitoringSignalIncidentsUpdateData; import com.datadog.api.client.v2.model.SecurityMonitoringSignalIncidentsUpdateRequest; import
com.datadog.api.client.v2.model.SecurityMonitoringSignalTriageUpdateResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSignalIncidentsUpdateRequest body = new SecurityMonitoringSignalIncidentsUpdateRequest() .data( new SecurityMonitoringSignalIncidentsUpdateData() .attributes( new SecurityMonitoringSignalIncidentsUpdateAttributes() .incidentIds(Collections.singletonList(2066L)))); try { SecurityMonitoringSignalTriageUpdateResponse result = apiInstance.editSecurityMonitoringSignalIncidents( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#editSecurityMonitoringSignalIncidents"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Change the related incidents of a security signal returns "OK" response ``` """ Change the related incidents of a security signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_signal_incident_ids import SecurityMonitoringSignalIncidentIds from datadog_api_client.v2.model.security_monitoring_signal_incidents_update_attributes import ( SecurityMonitoringSignalIncidentsUpdateAttributes, ) from datadog_api_client.v2.model.security_monitoring_signal_incidents_update_data import ( SecurityMonitoringSignalIncidentsUpdateData, ) from datadog_api_client.v2.model.security_monitoring_signal_incidents_update_request import ( SecurityMonitoringSignalIncidentsUpdateRequest, ) body = SecurityMonitoringSignalIncidentsUpdateRequest( data=SecurityMonitoringSignalIncidentsUpdateData( attributes=SecurityMonitoringSignalIncidentsUpdateAttributes( incident_ids=SecurityMonitoringSignalIncidentIds( [ 2066, ] ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.edit_security_monitoring_signal_incidents( signal_id="AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Change the related incidents of a security signal returns "OK" response ``` # Change the related incidents of a security signal returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = 
DatadogAPIClient::V2::SecurityMonitoringSignalIncidentsUpdateRequest.new({ data: DatadogAPIClient::V2::SecurityMonitoringSignalIncidentsUpdateData.new({ attributes: DatadogAPIClient::V2::SecurityMonitoringSignalIncidentsUpdateAttributes.new({ incident_ids: [ 2066, ], }), }), }) p api_instance.edit_security_monitoring_signal_incidents("AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Change the related incidents of a security signal returns "OK" response ``` // Change the related incidents of a security signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalIncidentsUpdateAttributes; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalIncidentsUpdateData; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalIncidentsUpdateRequest; #[tokio::main] async fn main() { let body = SecurityMonitoringSignalIncidentsUpdateRequest::new( SecurityMonitoringSignalIncidentsUpdateData::new( SecurityMonitoringSignalIncidentsUpdateAttributes::new(vec![2066]), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .edit_security_monitoring_signal_incidents( "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Change the related incidents of a security signal returns "OK" response ``` /** * Change the related incidents of a security signal returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiEditSecurityMonitoringSignalIncidentsRequest = { body: { data: { attributes: { incidentIds: [2066], }, }, }, signalId: "AQAAAYG1bl5K4HuUewAAAABBWUcxYmw1S0FBQmt2RmhRN0V4ZUVnQUE", }; apiInstance .editSecurityMonitoringSignalIncidents(params) .then((data: v2.SecurityMonitoringSignalTriageUpdateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Convert an existing rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-an-existing-rule-from-json-to-terraform) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-an-existing-rule-from-json-to-terraform-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/converthttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/convert ### Overview Convert an existing rule from JSON to Terraform for Datadog provider resource `datadog_security_monitoring_rule`. You can do so for the following rule types: * App and API Protection * Cloud SIEM (log detection and signal correlation) * Workload Protection You can convert Cloud Security configuration rules using Terraform’s [Datadog Cloud Configuration Rule resource](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/cloud_configuration_rule). This endpoint requires the `security_monitoring_rules_read` permission. OAuth apps require the `security_monitoring_rules_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertExistingSecurityMonitoringRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertExistingSecurityMonitoringRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertExistingSecurityMonitoringRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertExistingSecurityMonitoringRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertExistingSecurityMonitoringRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Result of the convert rule request containing Terraform content. Expand All Field Type Description ruleId string the ID of the rule. terraformContent string Terraform string as a result of converting the rule from JSON. ``` { "ruleId": "string", "terraformContent": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Convert an existing rule from JSON to Terraform Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com) curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}/convert" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Convert an existing rule from JSON to Terraform ``` """ Convert an existing rule from JSON to Terraform returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "security_rule_hash" in the system SECURITY_RULE_HASH_ID = environ["SECURITY_RULE_HASH_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.convert_existing_security_monitoring_rule( rule_id=SECURITY_RULE_HASH_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, where `DD_SITE` is your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Convert an existing rule from JSON to Terraform ``` # Convert an existing rule from JSON to Terraform returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_rule_hash" in the system SECURITY_RULE_HASH_ID = ENV["SECURITY_RULE_HASH_ID"] p api_instance.convert_existing_security_monitoring_rule(SECURITY_RULE_HASH_ID) ``` Copy #### Instructions First [install the library
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Convert an existing rule from JSON to Terraform ``` // Convert an existing rule from JSON to Terraform returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_rule_hash" in the system SecurityRuleHashID := os.Getenv("SECURITY_RULE_HASH_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ConvertExistingSecurityMonitoringRule(ctx, SecurityRuleHashID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ConvertExistingSecurityMonitoringRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ConvertExistingSecurityMonitoringRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Convert an existing rule from JSON to Terraform ``` // Convert an existing rule from JSON to Terraform returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleConvertResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_rule_hash" in the system String SECURITY_RULE_HASH_ID = System.getenv("SECURITY_RULE_HASH_ID"); try { SecurityMonitoringRuleConvertResponse result = apiInstance.convertExistingSecurityMonitoringRule(SECURITY_RULE_HASH_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#convertExistingSecurityMonitoringRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Convert an existing rule from JSON to Terraform ``` // Convert an existing rule from JSON to Terraform returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "security_rule_hash" in the system let security_rule_hash_id = std::env::var("SECURITY_RULE_HASH_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .convert_existing_security_monitoring_rule(security_rule_hash_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Convert an existing rule from JSON to Terraform ``` /** * Convert an existing rule from JSON to Terraform returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_rule_hash" in the system const SECURITY_RULE_HASH_ID = process.env.SECURITY_RULE_HASH_ID as string; const params: v2.SecurityMonitoringApiConvertExistingSecurityMonitoringRuleRequest = { ruleId: SECURITY_RULE_HASH_ID, }; apiInstance .convertExistingSecurityMonitoringRule(params) .then((data: v2.SecurityMonitoringRuleConvertResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Convert a rule from JSON to Terraform](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-rule-from-json-to-terraform) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-rule-from-json-to-terraform-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/converthttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/converthttps://api.datadoghq.eu/api/v2/security_monitoring/rules/converthttps://api.ddog-gov.com/api/v2/security_monitoring/rules/converthttps://api.datadoghq.com/api/v2/security_monitoring/rules/converthttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/converthttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/convert ### Overview Convert a rule that doesn’t (yet) exist from JSON to Terraform for Datadog provider resource `datadog_security_monitoring_rule`. You can do so for the following rule types: * App and API Protection * Cloud SIEM (log detection and signal correlation) * Workload Protection You can convert Cloud Security configuration rules using Terraform’s [Datadog Cloud Configuration Rule resource](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/cloud_configuration_rule). This endpoint requires the `security_monitoring_rules_write` permission. 
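As a quick end-to-end sketch (separate from the official per-language examples below), the command below posts a rule definition and writes the `terraformContent` string from the 200 response straight to a `.tf` file. It assumes the rule payload shown in this section has been saved to a local file named `rule.json` and that `jq` is installed; the file names are illustrative, and the API host should be replaced with the one for your Datadog site.

```
# Illustrative sketch: convert a local rule definition (rule.json) to Terraform
# and save the generated configuration. Assumes DD_API_KEY and DD_APP_KEY are set,
# `jq` is installed, and rule.json contains a rule payload such as the request body
# example shown below.
curl -s -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules/convert" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @rule.json | jq -r '.terraformContent' > datadog_security_monitoring_rule.tf
```

The resulting file can then be reviewed and added to a Terraform configuration that manages the `datadog_security_monitoring_rule` resource.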
OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description Option 1 object The payload of a rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. 
Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. 
Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. 
checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `api_security,application_security,log_detection,workload_security` Option 2 object The payload of a signal correlation rule. cases [_required_] [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. isEnabled [_required_] boolean Whether the rule is enabled. message [_required_] string Message for generated signals. name [_required_] string The name of the rule. options [_required_] object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. 
Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. 
Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting signals which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to group by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId [_required_] string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. 
Allowed enum values: `signal_correlation` ``` { "name": "_49768568946de993", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metric": "" } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertSecurityMonitoringRuleFromJSONToTerraform-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Result of the convert rule request containing Terraform content. Expand All Field Type Description ruleId string the ID of the rule. terraformContent string Terraform string as a result of converting the rule from JSON. ``` { "ruleId": "string", "terraformContent": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Convert a rule from JSON to Terraform returns "OK" response ``` # Curl command (replace https://api.datadoghq.com with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/rules/convert" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "_49768568946de993", "queries": [ { "query": "@test:true", "aggregation": "count", "groupByFields": [], "distinctFields": [], "metric": "" } ], "filters": [], "cases": [ { "name": "", "status": "info", "condition": "a > 0", "notifications": [] } ], "options": { "evaluationWindow": 900, "keepAlive": 3600, "maxSignalDuration": 86400 }, "message": "Test rule", "tags": [], "isEnabled": true, "type": "log_detection" } EOF ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` // Convert a rule from JSON to Terraform returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringRuleConvertPayload{ SecurityMonitoringStandardRulePayload: &datadogV2.SecurityMonitoringStandardRulePayload{ Name: "_49768568946de993", Queries: []datadogV2.SecurityMonitoringStandardRuleQuery{ { Query: datadog.PtrString("@test:true"), Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), GroupByFields: []string{}, DistinctFields: []string{}, Metric: datadog.PtrString(""), }, }, Filters: []datadogV2.SecurityMonitoringFilter{}, Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString(""), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Condition: datadog.PtrString("a > 0"), Notifications: []string{}, }, }, Options: datadogV2.SecurityMonitoringRuleOptions{ EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ONE_HOUR.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ONE_DAY.Ptr(), }, Message: "Test rule", Tags: []string{}, IsEnabled: true, Type: datadogV2.SECURITYMONITORINGRULETYPECREATE_LOG_DETECTION.Ptr(), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ConvertSecurityMonitoringRuleFromJSONToTerraform(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ConvertSecurityMonitoringRuleFromJSONToTerraform`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ :=
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ConvertSecurityMonitoringRuleFromJSONToTerraform`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` // Convert a rule from JSON to Terraform returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleConvertPayload; import com.datadog.api.client.v2.model.SecurityMonitoringRuleConvertResponse; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleOptions; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.SecurityMonitoringRuleTypeCreate; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRulePayload; import com.datadog.api.client.v2.model.SecurityMonitoringStandardRuleQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringRuleConvertPayload body = new SecurityMonitoringRuleConvertPayload( new SecurityMonitoringStandardRulePayload() .name("_49768568946de993") .queries( Collections.singletonList( new SecurityMonitoringStandardRuleQuery() .query("@test:true") .aggregation(SecurityMonitoringRuleQueryAggregation.COUNT) .metric(""))) .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 0"))) .options( new SecurityMonitoringRuleOptions() .evaluationWindow(SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES) .keepAlive(SecurityMonitoringRuleKeepAlive.ONE_HOUR) .maxSignalDuration(SecurityMonitoringRuleMaxSignalDuration.ONE_DAY)) .message("Test rule") .isEnabled(true) .type(SecurityMonitoringRuleTypeCreate.LOG_DETECTION)); try { SecurityMonitoringRuleConvertResponse result = apiInstance.convertSecurityMonitoringRuleFromJSONToTerraform(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling" + " SecurityMonitoringApi#convertSecurityMonitoringRuleFromJSONToTerraform"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` """ Convert a rule from JSON to Terraform returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_options import SecurityMonitoringRuleOptions from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.security_monitoring_rule_type_create import SecurityMonitoringRuleTypeCreate from datadog_api_client.v2.model.security_monitoring_standard_rule_payload import SecurityMonitoringStandardRulePayload from datadog_api_client.v2.model.security_monitoring_standard_rule_query import SecurityMonitoringStandardRuleQuery body = SecurityMonitoringStandardRulePayload( name="_49768568946de993", queries=[ SecurityMonitoringStandardRuleQuery( query="@test:true", aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, group_by_fields=[], distinct_fields=[], metric="", ), ], filters=[], cases=[ SecurityMonitoringRuleCaseCreate( name="", status=SecurityMonitoringRuleSeverity.INFO, condition="a > 0", notifications=[], ), ], options=SecurityMonitoringRuleOptions( evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, keep_alive=SecurityMonitoringRuleKeepAlive.ONE_HOUR, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ONE_DAY, ), message="Test rule", tags=[], is_enabled=True, type=SecurityMonitoringRuleTypeCreate.LOG_DETECTION, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.convert_security_monitoring_rule_from_json_to_terraform(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` # Convert a rule from JSON to Terraform returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringStandardRulePayload.new({ name: "_49768568946de993", queries: [ DatadogAPIClient::V2::SecurityMonitoringStandardRuleQuery.new({ query: "@test:true", aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, group_by_fields: [], distinct_fields: [], metric: "", }), ], filters: [], cases: [ 
DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, condition: "a > 0", notifications: [], }), ], options: DatadogAPIClient::V2::SecurityMonitoringRuleOptions.new({ evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ONE_HOUR, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ONE_DAY, }), message: "Test rule", tags: [], is_enabled: true, type: DatadogAPIClient::V2::SecurityMonitoringRuleTypeCreate::LOG_DETECTION, }) p api_instance.convert_security_monitoring_rule_from_json_to_terraform(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` // Convert a rule from JSON to Terraform returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleConvertPayload; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleOptions; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleTypeCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRulePayload; use datadog_api_client::datadogV2::model::SecurityMonitoringStandardRuleQuery; #[tokio::main] async fn main() { let body = SecurityMonitoringRuleConvertPayload::SecurityMonitoringStandardRulePayload(Box::new( SecurityMonitoringStandardRulePayload::new( vec![ SecurityMonitoringRuleCaseCreate::new(SecurityMonitoringRuleSeverity::INFO) .condition("a > 0".to_string()) .name("".to_string()) .notifications(vec![]), ], true, "Test rule".to_string(), "_49768568946de993".to_string(), SecurityMonitoringRuleOptions::new() .evaluation_window(SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES) .keep_alive(SecurityMonitoringRuleKeepAlive::ONE_HOUR) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ONE_DAY), vec![SecurityMonitoringStandardRuleQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .metric("".to_string()) .query("@test:true".to_string())], ) .filters(vec![]) .tags(vec![]) .type_(SecurityMonitoringRuleTypeCreate::LOG_DETECTION), )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .convert_security_monitoring_rule_from_json_to_terraform(body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Convert a rule from JSON to Terraform returns "OK" response ``` /** * Convert a rule from JSON to Terraform returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiConvertSecurityMonitoringRuleFromJSONToTerraformRequest = { body: { name: "_49768568946de993", queries: [ { query: "@test:true", aggregation: "count", groupByFields: [], distinctFields: [], metric: "", }, ], filters: [], cases: [ { name: "", status: "info", condition: "a > 0", notifications: [], }, ], options: { evaluationWindow: 900, keepAlive: 3600, maxSignalDuration: 86400, }, message: "Test rule", tags: [], isEnabled: true, type: "log_detection", }, }; apiInstance .convertSecurityMonitoringRuleFromJSONToTerraform(params) .then((data: v2.SecurityMonitoringRuleConvertResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-list-of-security-signals) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-list-of-security-signals-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/signals/searchhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/signals/searchhttps://api.datadoghq.eu/api/v2/security_monitoring/signals/searchhttps://api.ddog-gov.com/api/v2/security_monitoring/signals/searchhttps://api.datadoghq.com/api/v2/security_monitoring/signals/searchhttps://api.us3.datadoghq.com/api/v2/security_monitoring/signals/searchhttps://api.us5.datadoghq.com/api/v2/security_monitoring/signals/search ### Overview Returns security signals that match a search query. Both this endpoint and the GET endpoint can be used interchangeably for listing security signals. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description filter object Search filters for listing security signals. from date-time The minimum timestamp for requested security signals. query string Search query for listing security signals. to date-time The maximum timestamp for requested security signals. page object The paging attributes for listing security signals. cursor string A list of results using the cursor provided in the previous query. limit int32 The maximum number of security signals in the response. 
default: `10` sort enum The sort parameters used for querying security signals. Allowed enum values: `timestamp,-timestamp` ``` { "filter": { "from": "2021-11-11T10:56:11+00:00", "query": "security:attack status:high", "to": "2021-11-11T11:11:11+00:00" }, "page": { "limit": 2 }, "sort": "timestamp" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringSignals-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringSignals-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringSignals-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringSignals-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response object with all security signals matching the request and pagination information. Field Type Description data [object] An array of security signals matching the request. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal` default: `signal` links object Links attributes. next string The link for the next set of results. **Note** : The request can also be made using the POST endpoint. meta object Meta attributes. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. ``` { "data": [ { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security_monitoring/signals?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Get a list of security signals returns "OK" response with pagination

```
# Curl command (replace https://api.datadoghq.com with the API host for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/signals/search" \
 -H "Accept: application/json" \
 -H "Content-Type: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
 -d @- << EOF
{
  "filter": {
    "from": "2021-11-11T10:56:11+00:00",
    "query": "security:attack status:high",
    "to": "2021-11-11T11:11:11+00:00"
  },
  "page": {
    "limit": 2
  },
  "sort": "timestamp"
}
EOF
```

##### Get a list of security signals returns "OK" response with pagination

```
// Get a list of security signals returns "OK" response with pagination

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"
	"time"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.SecurityMonitoringSignalListRequest{
		Filter: &datadogV2.SecurityMonitoringSignalListRequestFilter{
			From:  datadog.PtrTime(time.Now().Add(time.Minute * -15)),
			Query: datadog.PtrString("security:attack status:high"),
			To:    datadog.PtrTime(time.Now()),
		},
		Page: &datadogV2.SecurityMonitoringSignalListRequestPage{
			Limit: datadog.PtrInt32(2),
		},
		Sort: datadogV2.SECURITYMONITORINGSIGNALSSORT_TIMESTAMP_ASCENDING.Ptr(),
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewSecurityMonitoringApi(apiClient)
	resp, _ := api.SearchSecurityMonitoringSignalsWithPagination(ctx, *datadogV2.NewSearchSecurityMonitoringSignalsOptionalParameters().WithBody(body))

	for paginationResult := range resp {
		if paginationResult.Error != nil {
			fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.SearchSecurityMonitoringSignals`: %v\n", paginationResult.Error)
		}
		responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ")
		fmt.Fprintf(os.Stdout, "%s\n", responseContent)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a list of security signals returns "OK" response with
pagination ``` // Get a list of security signals returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.SearchSecurityMonitoringSignalsOptionalParameters; import com.datadog.api.client.v2.model.SecurityMonitoringSignal; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequestFilter; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequestPage; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsSort; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSignalListRequest body = new SecurityMonitoringSignalListRequest() .filter( new SecurityMonitoringSignalListRequestFilter() .from(OffsetDateTime.now().plusMinutes(-15)) .query("security:attack status:high") .to(OffsetDateTime.now())) .page(new SecurityMonitoringSignalListRequestPage().limit(2)) .sort(SecurityMonitoringSignalsSort.TIMESTAMP_ASCENDING); try { PaginationIterable iterable = apiInstance.searchSecurityMonitoringSignalsWithPagination( new SearchSecurityMonitoringSignalsOptionalParameters().body(body)); for (SecurityMonitoringSignal item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println( "Exception when calling" + " SecurityMonitoringApi#searchSecurityMonitoringSignalsWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of security signals returns "OK" response with pagination ``` """ Get a list of security signals returns "OK" response with pagination """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_signal_list_request import SecurityMonitoringSignalListRequest from datadog_api_client.v2.model.security_monitoring_signal_list_request_filter import ( SecurityMonitoringSignalListRequestFilter, ) from datadog_api_client.v2.model.security_monitoring_signal_list_request_page import ( SecurityMonitoringSignalListRequestPage, ) from datadog_api_client.v2.model.security_monitoring_signals_sort import SecurityMonitoringSignalsSort body = SecurityMonitoringSignalListRequest( filter=SecurityMonitoringSignalListRequestFilter( _from=(datetime.now() + relativedelta(minutes=-15)), query="security:attack status:high", to=datetime.now(), ), page=SecurityMonitoringSignalListRequestPage( limit=2, ), sort=SecurityMonitoringSignalsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) items = 
api_instance.search_security_monitoring_signals_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of security signals returns "OK" response with pagination ``` # Get a list of security signals returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSignalListRequest.new({ filter: DatadogAPIClient::V2::SecurityMonitoringSignalListRequestFilter.new({ from: (Time.now + -15 * 60), query: "security:attack status:high", to: Time.now, }), page: DatadogAPIClient::V2::SecurityMonitoringSignalListRequestPage.new({ limit: 2, }), sort: DatadogAPIClient::V2::SecurityMonitoringSignalsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } api_instance.search_security_monitoring_signals_with_pagination(opts) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of security signals returns "OK" response with pagination ``` // Get a list of security signals returns "OK" response with pagination use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SearchSecurityMonitoringSignalsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequestFilter; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequestPage; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalsSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = SecurityMonitoringSignalListRequest::new() .filter( SecurityMonitoringSignalListRequestFilter::new() .from( DateTime::parse_from_rfc3339("2021-11-11T10:56:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .query("security:attack status:high".to_string()) .to(DateTime::parse_from_rfc3339("2021-11-11T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc)), ) .page(SecurityMonitoringSignalListRequestPage::new().limit(2)) .sort(SecurityMonitoringSignalsSort::TIMESTAMP_ASCENDING); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let response = api.search_security_monitoring_signals_with_pagination( SearchSecurityMonitoringSignalsOptionalParams::default().body(body), ); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to 
`src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of security signals returns "OK" response with pagination ``` /** * Get a list of security signals returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiSearchSecurityMonitoringSignalsRequest = { body: { filter: { from: new Date(new Date().getTime() + -15 * 60 * 1000), query: "security:attack status:high", to: new Date(), }, page: { limit: 2, }, sort: "timestamp", }, }; (async () => { try { for await (const item of apiInstance.searchSecurityMonitoringSignalsWithPagination( params )) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-signals-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-signals-details-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}https://api.datadoghq.eu/api/v2/security_monitoring/signals/{signal_id}https://api.ddog-gov.com/api/v2/security_monitoring/signals/{signal_id}https://api.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/signals/{signal_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/signals/{signal_id} ### Overview Get a signal’s details. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description signal_id [_required_] string The ID of the signal. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSignal-200-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSignal-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringSignal-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Security Signal response data object. Field Type Description data object Object description of a security signal. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. 
Allowed enum values: `signal` default: `signal` ``` { "data": { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a signal's details Copy ``` # Path parameters export signal_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/signals/${signal_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a signal's details ``` """ Get a signal's details returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_monitoring_signal( signal_id="AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a signal's details ``` # Get a signal's details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_security_monitoring_signal("AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a signal's details ``` // Get a signal's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityMonitoringSignal(ctx, "AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSecurityMonitoringSignal`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityMonitoringSignal`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a signal's details ``` // Get a signal's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSignalResponse result = apiInstance.getSecurityMonitoringSignal( "AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSecurityMonitoringSignal"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a signal's details ``` // Get a signal's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_monitoring_signal( "AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE".to_string(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a signal's details ``` /** * Get a signal's details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetSecurityMonitoringSignalRequest = { signalId: "AQAAAYNqUBVU4-rffwAAAABBWU5xVUJWVUFBQjJBd3ptMDdQUnF3QUE", }; apiInstance .getSecurityMonitoringSignal(params) .then((data: v2.SecurityMonitoringSignalResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-security-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-security-filter-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id} ### Overview Delete a specific security filter. This endpoint requires the `security_monitoring_filters_write` permission. OAuth apps require the `security_monitoring_filters_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description security_filter_id [_required_] string The ID of the security filter. ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityFilter-204-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSecurityFilter-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Delete a security filter

```
# Path parameters
export security_filter_id="CHANGE_ME"
# Curl command
# Replace api.datadoghq.com with the endpoint for your Datadog site if needed
# (api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
#  api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com).
curl -X DELETE "https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/${security_filter_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a security filter

```
"""
Delete a security filter returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    api_instance.delete_security_filter(
        security_filter_id="security_filter_id",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

Set `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`).

##### Delete a security filter

```
# Delete a security filter returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new
api_instance.delete_security_filter("security_filter_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete a security filter

``` // Delete a security filter returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient :=
datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DeleteSecurityFilter(ctx, "security_filter_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteSecurityFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a security filter ``` // Delete a security filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { apiInstance.deleteSecurityFilter("security_filter_id"); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#deleteSecurityFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a security filter ``` // Delete a security filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_security_filter("security_filter_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a security filter ``` /** * Delete a security filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiDeleteSecurityFilterRequest = { securityFilterId: "security_filter_id", }; apiInstance .deleteSecurityFilter(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a quick list of security signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-quick-list-of-security-signals) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-quick-list-of-security-signals-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/signalshttps://api.ap2.datadoghq.com/api/v2/security_monitoring/signalshttps://api.datadoghq.eu/api/v2/security_monitoring/signalshttps://api.ddog-gov.com/api/v2/security_monitoring/signalshttps://api.datadoghq.com/api/v2/security_monitoring/signalshttps://api.us3.datadoghq.com/api/v2/security_monitoring/signalshttps://api.us5.datadoghq.com/api/v2/security_monitoring/signals ### Overview The list endpoint returns security signals that match a search query. Both this endpoint and the POST endpoint can be used interchangeably when listing security signals. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string The search query for security signals. filter[from] string The minimum timestamp for requested security signals. filter[to] string The maximum timestamp for requested security signals. sort enum The order of the security signals in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string A list of results using the cursor provided in the previous query. page[limit] integer The maximum number of security signals in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSignals-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSignals-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSignals-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringSignals-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response object with all security signals matching the request and pagination information. Field Type Description data [object] An array of security signals matching the request. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal` default: `signal` links object Links attributes. next string The link for the next set of results. **Note** : The request can also be made using the POST endpoint. 
meta object Meta attributes. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. ``` { "data": [ { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security_monitoring/signals?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Get a quick list of security signals

```
# Curl command
# Replace api.datadoghq.com with the endpoint for your Datadog site if needed
# (api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
#  api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com).
curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/signals" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a quick list of security signals

```
"""
Get a quick list of security signals returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.list_security_monitoring_signals()
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

Set `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`).

##### Get a quick list of security signals

```
# Get a quick list of security signals returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new
p api_instance.list_security_monitoring_signals()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a quick list of security signals

```
// Get a quick list of security signals returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewSecurityMonitoringApi(apiClient)
	resp, r, err := api.ListSecurityMonitoringSignals(ctx, *datadogV2.NewListSecurityMonitoringSignalsOptionalParameters())
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListSecurityMonitoringSignals`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListSecurityMonitoringSignals`:\n%s\n", responseContent)
}
```

#### Instructions

First
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a quick list of security signals ``` // Get a quick list of security signals returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSignalsListResponse result = apiInstance.listSecurityMonitoringSignals(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#listSecurityMonitoringSignals"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a quick list of security signals ``` // Get a quick list of security signals returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListSecurityMonitoringSignalsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_security_monitoring_signals(ListSecurityMonitoringSignalsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a quick list of security signals ``` /** * Get a quick list of security signals returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listSecurityMonitoringSignals() .then((data: v2.SecurityMonitoringSignalsListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of signal-based notification rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-the-list-of-signal-based-notification-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-the-list-of-signal-based-notification-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.ap2.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.datadoghq.eu/api/v2/security/signals/notification_ruleshttps://api.ddog-gov.com/api/v2/security/signals/notification_ruleshttps://api.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.us3.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.us5.datadoghq.com/api/v2/security/signals/notification_rules ### Overview Returns the list of notification rules for security signals. This endpoint requires the `security_monitoring_notification_profiles_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRules-429-v2) The list of notification rules. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [object] attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. 
Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": [ { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get the list of signal-based notification rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/signals/notification_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of signal-based notification rules ``` """ Get the list of signal-based notification rules returns "The list of notification rules." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_signal_notification_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of signal-based notification rules ``` # Get the list of signal-based notification rules returns "The list of notification rules." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_signal_notification_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of signal-based notification rules ``` // Get the list of signal-based notification rules returns "The list of notification rules." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSignalNotificationRules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSignalNotificationRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSignalNotificationRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of signal-based notification rules ``` // Get the list of signal-based notification rules returns "The list of notification rules." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { apiInstance.getSignalNotificationRules(); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSignalNotificationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of signal-based notification rules ``` // Get the list of signal-based notification rules returns "The list of // notification rules." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.get_signal_notification_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of signal-based notification rules ``` /** * Get the list of signal-based notification rules returns "The list of notification rules." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .getSignalNotificationRules() .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of vulnerability notification rules](https://docs.datadoghq.com/api/latest/security-monitoring/#get-the-list-of-vulnerability-notification-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-the-list-of-vulnerability-notification-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.ap2.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.datadoghq.eu/api/v2/security/vulnerabilities/notification_ruleshttps://api.ddog-gov.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.us3.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules ### Overview Returns the list of notification rules for security vulnerabilities. This endpoint requires the `security_monitoring_notification_profiles_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRules-429-v2) The list of notification rules. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [object] attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". 
Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": [ { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get the list of vulnerability notification rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of vulnerability notification rules ``` """ Get the list of vulnerability notification rules returns "The list of notification rules." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_vulnerability_notification_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of vulnerability notification rules ``` # Get the list of vulnerability notification rules returns "The list of notification rules." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_vulnerability_notification_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of vulnerability notification rules ``` // Get the list of vulnerability notification rules returns "The list of notification rules." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetVulnerabilityNotificationRules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetVulnerabilityNotificationRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetVulnerabilityNotificationRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of vulnerability notification rules ``` // Get the list of vulnerability notification rules returns "The list of notification rules." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { apiInstance.getVulnerabilityNotificationRules(); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getVulnerabilityNotificationRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of vulnerability notification rules ``` // Get the list of vulnerability notification rules returns "The list of // notification rules." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.get_vulnerability_notification_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of vulnerability notification rules ``` /** * Get the list of vulnerability notification rules returns "The list of notification rules." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .getVulnerabilityNotificationRules() .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-security-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#update-a-security-filter-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id} ### Overview Update a specific security filter. Returns the security filter object when the request is successful. This endpoint requires the `security_monitoring_filters_write` permission. OAuth apps require the `security_monitoring_filters_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description security_filter_id [_required_] string The ID of the security filter. ### Request #### Body Data (required) New definition of the security filter. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object The new security filter properties. attributes [_required_] object The security filters properties to be updated. exclusion_filters [object] Exclusion filters to exclude some logs from the security filter. name [_required_] string Exclusion filter name. query [_required_] string Exclusion filter query. Logs that match this query are excluded from the security filter. filtered_data_type enum The filtered data type. Allowed enum values: `logs` is_enabled boolean Whether the security filter is enabled. name string The name of the security filter. query string The query of the security filter. version int32 The version of the security filter to update. type [_required_] enum The type of the resource. The value should always be `security_filters`. 
Allowed enum values: `security_filters` default: `security_filters` ``` { "data": { "attributes": { "exclusion_filters": [], "filtered_data_type": "logs", "is_enabled": true, "name": "Example-Security-Monitoring", "query": "service:ExampleSecurityMonitoring", "version": 1 }, "type": "security_filters" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-404-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateSecurityFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a single security filter. Field Type Description data object The security filter's properties. attributes object The object describing a security filter. exclusion_filters [object] The list of exclusion filters applied in this security filter. name string The exclusion filter name. query string The exclusion filter query. filtered_data_type enum The filtered data type. Allowed enum values: `logs` is_builtin boolean Whether the security filter is the built-in filter. is_enabled boolean Whether the security filter is enabled. name string The security filter name. query string The security filter query. Logs accepted by this query will be accepted by this filter. version int32 The version of the security filter. id string The ID of the security filter. type enum The type of the resource. The value should always be `security_filters`. Allowed enum values: `security_filters` default: `security_filters` meta object Optional metadata associated to the response. warning string A warning message. ``` { "data": { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_builtin": false, "is_enabled": false, "name": "Custom security filter", "query": "service:api", "version": 1 }, "id": "3dd-0uc-h1s", "type": "security_filters" }, "meta": { "warning": "All the security filters are disabled. As a result, no logs are being analyzed." } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Update a security filter returns "OK" response Copy ``` # Path parameters export security_filter_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/${security_filter_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "exclusion_filters": [], "filtered_data_type": "logs", "is_enabled": true, "name": "Example-Security-Monitoring", "query": "service:ExampleSecurityMonitoring", "version": 1 }, "type": "security_filters" } } EOF ``` ##### Update a security filter returns "OK" response ``` // Update a security filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_filter" in the system SecurityFilterDataID := os.Getenv("SECURITY_FILTER_DATA_ID") body := datadogV2.SecurityFilterUpdateRequest{ Data: datadogV2.SecurityFilterUpdateData{ Attributes: datadogV2.SecurityFilterUpdateAttributes{ ExclusionFilters: []datadogV2.SecurityFilterExclusionFilter{}, FilteredDataType: datadogV2.SECURITYFILTERFILTEREDDATATYPE_LOGS.Ptr(), IsEnabled: datadog.PtrBool(true), Name: datadog.PtrString("Example-Security-Monitoring"), Query: datadog.PtrString("service:ExampleSecurityMonitoring"), Version: datadog.PtrInt32(1), }, Type: datadogV2.SECURITYFILTERTYPE_SECURITY_FILTERS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateSecurityFilter(ctx, SecurityFilterDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateSecurityFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateSecurityFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a security filter returns "OK" response ``` // Update a security filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityFilterFilteredDataType; import com.datadog.api.client.v2.model.SecurityFilterResponse; import com.datadog.api.client.v2.model.SecurityFilterType; import com.datadog.api.client.v2.model.SecurityFilterUpdateAttributes; import com.datadog.api.client.v2.model.SecurityFilterUpdateData; import com.datadog.api.client.v2.model.SecurityFilterUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_filter" in the system String SECURITY_FILTER_DATA_ID = System.getenv("SECURITY_FILTER_DATA_ID"); SecurityFilterUpdateRequest body = new SecurityFilterUpdateRequest() .data( new SecurityFilterUpdateData() .attributes( new SecurityFilterUpdateAttributes() .filteredDataType(SecurityFilterFilteredDataType.LOGS) .isEnabled(true) .name("Example-Security-Monitoring") .query("service:ExampleSecurityMonitoring") .version(1)) .type(SecurityFilterType.SECURITY_FILTERS)); try { SecurityFilterResponse result = apiInstance.updateSecurityFilter(SECURITY_FILTER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#updateSecurityFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a security filter returns "OK" response ``` """ Update a security filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_filter_filtered_data_type import SecurityFilterFilteredDataType from datadog_api_client.v2.model.security_filter_type import SecurityFilterType from datadog_api_client.v2.model.security_filter_update_attributes import SecurityFilterUpdateAttributes from datadog_api_client.v2.model.security_filter_update_data import SecurityFilterUpdateData from datadog_api_client.v2.model.security_filter_update_request import SecurityFilterUpdateRequest # there is a valid "security_filter" in the system SECURITY_FILTER_DATA_ID = environ["SECURITY_FILTER_DATA_ID"] body = SecurityFilterUpdateRequest( 
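    # Added note, not part of the upstream example: the `version` passed below must
    # match the filter's current version, otherwise the API answers with
    # 409 "Concurrent Modification". The current version can be read first, for
    # example inside the `with ApiClient(...)` block:
    #   current = api_instance.get_security_filter(security_filter_id=SECURITY_FILTER_DATA_ID)
    #   current_version = current.data.attributes.version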
data=SecurityFilterUpdateData( attributes=SecurityFilterUpdateAttributes( exclusion_filters=[], filtered_data_type=SecurityFilterFilteredDataType.LOGS, is_enabled=True, name="Example-Security-Monitoring", query="service:ExampleSecurityMonitoring", version=1, ), type=SecurityFilterType.SECURITY_FILTERS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_security_filter(security_filter_id=SECURITY_FILTER_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a security filter returns "OK" response ``` # Update a security filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_filter" in the system SECURITY_FILTER_DATA_ID = ENV["SECURITY_FILTER_DATA_ID"] body = DatadogAPIClient::V2::SecurityFilterUpdateRequest.new({ data: DatadogAPIClient::V2::SecurityFilterUpdateData.new({ attributes: DatadogAPIClient::V2::SecurityFilterUpdateAttributes.new({ exclusion_filters: [], filtered_data_type: DatadogAPIClient::V2::SecurityFilterFilteredDataType::LOGS, is_enabled: true, name: "Example-Security-Monitoring", query: "service:ExampleSecurityMonitoring", version: 1, }), type: DatadogAPIClient::V2::SecurityFilterType::SECURITY_FILTERS, }), }) p api_instance.update_security_filter(SECURITY_FILTER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a security filter returns "OK" response ``` // Update a security filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityFilterFilteredDataType; use datadog_api_client::datadogV2::model::SecurityFilterType; use datadog_api_client::datadogV2::model::SecurityFilterUpdateAttributes; use datadog_api_client::datadogV2::model::SecurityFilterUpdateData; use datadog_api_client::datadogV2::model::SecurityFilterUpdateRequest; #[tokio::main] async fn main() { // there is a valid "security_filter" in the system let security_filter_data_id = std::env::var("SECURITY_FILTER_DATA_ID").unwrap(); let body = SecurityFilterUpdateRequest::new(SecurityFilterUpdateData::new( SecurityFilterUpdateAttributes::new() .exclusion_filters(vec![]) .filtered_data_type(SecurityFilterFilteredDataType::LOGS) .is_enabled(true) .name("Example-Security-Monitoring".to_string()) .query("service:ExampleSecurityMonitoring".to_string()) .version(1), SecurityFilterType::SECURITY_FILTERS, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .update_security_filter(security_filter_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", 
resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a security filter returns "OK" response ``` /** * Update a security filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_filter" in the system const SECURITY_FILTER_DATA_ID = process.env.SECURITY_FILTER_DATA_ID as string; const params: v2.SecurityMonitoringApiUpdateSecurityFilterRequest = { body: { data: { attributes: { exclusionFilters: [], filteredDataType: "logs", isEnabled: true, name: "Example-Security-Monitoring", query: "service:ExampleSecurityMonitoring", version: 1, }, type: "security_filters", }, }, securityFilterId: SECURITY_FILTER_DATA_ID, }; apiInstance .updateSecurityFilter(params) .then((data: v2.SecurityFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new signal-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-new-signal-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-new-signal-based-notification-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.ap2.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.datadoghq.eu/api/v2/security/signals/notification_ruleshttps://api.ddog-gov.com/api/v2/security/signals/notification_ruleshttps://api.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.us3.datadoghq.com/api/v2/security/signals/notification_ruleshttps://api.us5.datadoghq.com/api/v2/security/signals/notification_rules ### Overview Create a new notification rule for security signals and return the created rule. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Request #### Body Data (required) The body of the create notification rule request is composed of the rule type and the rule attributes: the rule name, the selectors, the notification targets, and the rule enabled status. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the notification rule create request: the rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule create request. enabled boolean Field used to enable or disable the rule. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. 
Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400 }, "type": "notification_rules" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSignalNotificationRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSignalNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSignalNotificationRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSignalNotificationRule-429-v2) Successfully created the notification rule. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. 
modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` # Replace api.datadoghq.com with the endpoint for your Datadog site (for example api.us5.datadoghq.com or api.datadoghq.eu) # Curl command curl -X POST "https://api.datadoghq.com/api/v2/security/signals/notification_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400 }, "type": "notification_rules" } } EOF ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` // Create a new signal-based notification rule returns "Successfully created the notification rule."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateNotificationRuleParameters{ Data: &datadogV2.CreateNotificationRuleParametersData{ Attributes: datadogV2.CreateNotificationRuleParametersDataAttributes{ Enabled: datadog.PtrBool(true), Name: "Rule 1", Selectors: datadogV2.Selectors{ Query: datadog.PtrString("(source:production_service OR env:prod)"), RuleTypes: []datadogV2.RuleTypesItems{ datadogV2.RULETYPESITEMS_MISCONFIGURATION, datadogV2.RULETYPESITEMS_ATTACK_PATH, }, Severities: []datadogV2.RuleSeverity{ datadogV2.RULESEVERITY_CRITICAL, }, TriggerSource: datadogV2.TRIGGERSOURCE_SECURITY_FINDINGS, }, Targets: []string{ "@john.doe@email.com", }, TimeAggregation: datadog.PtrInt64(86400), }, Type: datadogV2.NOTIFICATIONRULESTYPE_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSignalNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSignalNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSignalNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` // Create a new signal-based notification rule returns "Successfully created the notification rule." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CreateNotificationRuleParameters; import com.datadog.api.client.v2.model.CreateNotificationRuleParametersData; import com.datadog.api.client.v2.model.CreateNotificationRuleParametersDataAttributes; import com.datadog.api.client.v2.model.NotificationRuleResponse; import com.datadog.api.client.v2.model.NotificationRulesType; import com.datadog.api.client.v2.model.RuleSeverity; import com.datadog.api.client.v2.model.RuleTypesItems; import com.datadog.api.client.v2.model.Selectors; import com.datadog.api.client.v2.model.TriggerSource; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateNotificationRuleParameters body = new CreateNotificationRuleParameters() .data( new CreateNotificationRuleParametersData() .attributes( new CreateNotificationRuleParametersDataAttributes() .enabled(true) .name("Rule 1") .selectors( new Selectors() .query("(source:production_service OR env:prod)") .ruleTypes( Arrays.asList( RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH)) .severities(Collections.singletonList(RuleSeverity.CRITICAL)) .triggerSource(TriggerSource.SECURITY_FINDINGS)) .targets(Collections.singletonList("@john.doe@email.com")) .timeAggregation(86400L)) .type(NotificationRulesType.NOTIFICATION_RULES)); try { NotificationRuleResponse result = apiInstance.createSignalNotificationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createSignalNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` """ Create a new signal-based notification rule returns "Successfully created the notification rule." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.create_notification_rule_parameters import CreateNotificationRuleParameters from datadog_api_client.v2.model.create_notification_rule_parameters_data import CreateNotificationRuleParametersData from datadog_api_client.v2.model.create_notification_rule_parameters_data_attributes import ( CreateNotificationRuleParametersDataAttributes, ) from datadog_api_client.v2.model.notification_rules_type import NotificationRulesType from datadog_api_client.v2.model.rule_severity import RuleSeverity from datadog_api_client.v2.model.rule_types_items import RuleTypesItems from datadog_api_client.v2.model.selectors import Selectors from datadog_api_client.v2.model.trigger_source import TriggerSource body = CreateNotificationRuleParameters( data=CreateNotificationRuleParametersData( attributes=CreateNotificationRuleParametersDataAttributes( enabled=True, name="Rule 1", selectors=Selectors( query="(source:production_service OR env:prod)", rule_types=[ RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH, ], severities=[ RuleSeverity.CRITICAL, ], trigger_source=TriggerSource.SECURITY_FINDINGS, ), targets=[ "@john.doe@email.com", ], time_aggregation=86400, ), type=NotificationRulesType.NOTIFICATION_RULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_signal_notification_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` # Create a new signal-based notification rule returns "Successfully created the notification rule." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateNotificationRuleParameters.new({ data: DatadogAPIClient::V2::CreateNotificationRuleParametersData.new({ attributes: DatadogAPIClient::V2::CreateNotificationRuleParametersDataAttributes.new({ enabled: true, name: "Rule 1", selectors: DatadogAPIClient::V2::Selectors.new({ query: "(source:production_service OR env:prod)", rule_types: [ DatadogAPIClient::V2::RuleTypesItems::MISCONFIGURATION, DatadogAPIClient::V2::RuleTypesItems::ATTACK_PATH, ], severities: [ DatadogAPIClient::V2::RuleSeverity::CRITICAL, ], trigger_source: DatadogAPIClient::V2::TriggerSource::SECURITY_FINDINGS, }), targets: [ "@john.doe@email.com", ], time_aggregation: 86400, }), type: DatadogAPIClient::V2::NotificationRulesType::NOTIFICATION_RULES, }), }) p api_instance.create_signal_notification_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." response ``` // Create a new signal-based notification rule returns "Successfully created the // notification rule." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CreateNotificationRuleParameters; use datadog_api_client::datadogV2::model::CreateNotificationRuleParametersData; use datadog_api_client::datadogV2::model::CreateNotificationRuleParametersDataAttributes; use datadog_api_client::datadogV2::model::NotificationRulesType; use datadog_api_client::datadogV2::model::RuleSeverity; use datadog_api_client::datadogV2::model::RuleTypesItems; use datadog_api_client::datadogV2::model::Selectors; use datadog_api_client::datadogV2::model::TriggerSource; #[tokio::main] async fn main() { let body = CreateNotificationRuleParameters::new().data(CreateNotificationRuleParametersData::new( CreateNotificationRuleParametersDataAttributes::new( "Rule 1".to_string(), Selectors::new(TriggerSource::SECURITY_FINDINGS) .query("(source:production_service OR env:prod)".to_string()) .rule_types(vec![ RuleTypesItems::MISCONFIGURATION, RuleTypesItems::ATTACK_PATH, ]) .severities(vec![RuleSeverity::CRITICAL]), vec!["@john.doe@email.com".to_string()], ) .enabled(true) .time_aggregation(86400), NotificationRulesType::NOTIFICATION_RULES, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_signal_notification_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new signal-based notification rule returns "Successfully created the notification rule." 
response ``` /** * Create a new signal-based notification rule returns "Successfully created the notification rule." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateSignalNotificationRuleRequest = { body: { data: { attributes: { enabled: true, name: "Rule 1", selectors: { query: "(source:production_service OR env:prod)", ruleTypes: ["misconfiguration", "attack_path"], severities: ["critical"], triggerSource: "security_findings", }, targets: ["@john.doe@email.com"], timeAggregation: 86400, }, type: "notification_rules", }, }, }; apiInstance .createSignalNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new vulnerability-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-new-vulnerability-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-new-vulnerability-based-notification-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.ap2.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.datadoghq.eu/api/v2/security/vulnerabilities/notification_ruleshttps://api.ddog-gov.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.us3.datadoghq.com/api/v2/security/vulnerabilities/notification_ruleshttps://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules ### Overview Create a new notification rule for security vulnerabilities and return the created rule. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Request #### Body Data (required) The body of the create notification rule request is composed of the rule type and the rule attributes: the rule name, the selectors, the notification targets, and the rule enabled status. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the notification rule create request: the rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule create request. enabled boolean Field used to enable or disable the rule. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. 
rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400 }, "type": "notification_rules" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateVulnerabilityNotificationRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateVulnerabilityNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateVulnerabilityNotificationRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateVulnerabilityNotificationRule-429-v2) Successfully created the notification rule. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. 
selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400 }, "type": "notification_rules" } } EOF ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response ``` // Create a new vulnerability-based notification rule returns "Successfully created the notification rule." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateNotificationRuleParameters{ Data: &datadogV2.CreateNotificationRuleParametersData{ Attributes: datadogV2.CreateNotificationRuleParametersDataAttributes{ Enabled: datadog.PtrBool(true), Name: "Rule 1", Selectors: datadogV2.Selectors{ Query: datadog.PtrString("(source:production_service OR env:prod)"), RuleTypes: []datadogV2.RuleTypesItems{ datadogV2.RULETYPESITEMS_MISCONFIGURATION, datadogV2.RULETYPESITEMS_ATTACK_PATH, }, Severities: []datadogV2.RuleSeverity{ datadogV2.RULESEVERITY_CRITICAL, }, TriggerSource: datadogV2.TRIGGERSOURCE_SECURITY_FINDINGS, }, Targets: []string{ "@john.doe@email.com", }, TimeAggregation: datadog.PtrInt64(86400), }, Type: datadogV2.NOTIFICATIONRULESTYPE_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateVulnerabilityNotificationRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateVulnerabilityNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateVulnerabilityNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response ``` // Create a new vulnerability-based notification rule returns "Successfully created the notification // rule." 
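// Added note, not part of the upstream example: timeAggregation(86400L) aggregates rule
// evaluation results over a 24-hour rolling window, and notifications are only sent for
// new findings discovered during that window; per the API reference, time aggregation
// applies only to vulnerability-based notification rules.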
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CreateNotificationRuleParameters; import com.datadog.api.client.v2.model.CreateNotificationRuleParametersData; import com.datadog.api.client.v2.model.CreateNotificationRuleParametersDataAttributes; import com.datadog.api.client.v2.model.NotificationRuleResponse; import com.datadog.api.client.v2.model.NotificationRulesType; import com.datadog.api.client.v2.model.RuleSeverity; import com.datadog.api.client.v2.model.RuleTypesItems; import com.datadog.api.client.v2.model.Selectors; import com.datadog.api.client.v2.model.TriggerSource; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateNotificationRuleParameters body = new CreateNotificationRuleParameters() .data( new CreateNotificationRuleParametersData() .attributes( new CreateNotificationRuleParametersDataAttributes() .enabled(true) .name("Rule 1") .selectors( new Selectors() .query("(source:production_service OR env:prod)") .ruleTypes( Arrays.asList( RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH)) .severities(Collections.singletonList(RuleSeverity.CRITICAL)) .triggerSource(TriggerSource.SECURITY_FINDINGS)) .targets(Collections.singletonList("@john.doe@email.com")) .timeAggregation(86400L)) .type(NotificationRulesType.NOTIFICATION_RULES)); try { NotificationRuleResponse result = apiInstance.createVulnerabilityNotificationRule(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#createVulnerabilityNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response ``` """ Create a new vulnerability-based notification rule returns "Successfully created the notification rule." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.create_notification_rule_parameters import CreateNotificationRuleParameters from datadog_api_client.v2.model.create_notification_rule_parameters_data import CreateNotificationRuleParametersData from datadog_api_client.v2.model.create_notification_rule_parameters_data_attributes import ( CreateNotificationRuleParametersDataAttributes, ) from datadog_api_client.v2.model.notification_rules_type import NotificationRulesType from datadog_api_client.v2.model.rule_severity import RuleSeverity from datadog_api_client.v2.model.rule_types_items import RuleTypesItems from datadog_api_client.v2.model.selectors import Selectors from datadog_api_client.v2.model.trigger_source import TriggerSource body = CreateNotificationRuleParameters( data=CreateNotificationRuleParametersData( attributes=CreateNotificationRuleParametersDataAttributes( enabled=True, name="Rule 1", selectors=Selectors( query="(source:production_service OR env:prod)", rule_types=[ RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH, ], severities=[ RuleSeverity.CRITICAL, ], trigger_source=TriggerSource.SECURITY_FINDINGS, ), targets=[ "@john.doe@email.com", ], time_aggregation=86400, ), type=NotificationRulesType.NOTIFICATION_RULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_vulnerability_notification_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response ``` # Create a new vulnerability-based notification rule returns "Successfully created the notification rule." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateNotificationRuleParameters.new({ data: DatadogAPIClient::V2::CreateNotificationRuleParametersData.new({ attributes: DatadogAPIClient::V2::CreateNotificationRuleParametersDataAttributes.new({ enabled: true, name: "Rule 1", selectors: DatadogAPIClient::V2::Selectors.new({ query: "(source:production_service OR env:prod)", rule_types: [ DatadogAPIClient::V2::RuleTypesItems::MISCONFIGURATION, DatadogAPIClient::V2::RuleTypesItems::ATTACK_PATH, ], severities: [ DatadogAPIClient::V2::RuleSeverity::CRITICAL, ], trigger_source: DatadogAPIClient::V2::TriggerSource::SECURITY_FINDINGS, }), targets: [ "@john.doe@email.com", ], time_aggregation: 86400, }), type: DatadogAPIClient::V2::NotificationRulesType::NOTIFICATION_RULES, }), }) p api_instance.create_vulnerability_notification_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response ``` // Create a new vulnerability-based notification rule returns "Successfully // created the notification rule." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CreateNotificationRuleParameters; use datadog_api_client::datadogV2::model::CreateNotificationRuleParametersData; use datadog_api_client::datadogV2::model::CreateNotificationRuleParametersDataAttributes; use datadog_api_client::datadogV2::model::NotificationRulesType; use datadog_api_client::datadogV2::model::RuleSeverity; use datadog_api_client::datadogV2::model::RuleTypesItems; use datadog_api_client::datadogV2::model::Selectors; use datadog_api_client::datadogV2::model::TriggerSource; #[tokio::main] async fn main() { let body = CreateNotificationRuleParameters::new().data(CreateNotificationRuleParametersData::new( CreateNotificationRuleParametersDataAttributes::new( "Rule 1".to_string(), Selectors::new(TriggerSource::SECURITY_FINDINGS) .query("(source:production_service OR env:prod)".to_string()) .rule_types(vec![ RuleTypesItems::MISCONFIGURATION, RuleTypesItems::ATTACK_PATH, ]) .severities(vec![RuleSeverity::CRITICAL]), vec!["@john.doe@email.com".to_string()], ) .enabled(true) .time_aggregation(86400), NotificationRulesType::NOTIFICATION_RULES, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_vulnerability_notification_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new vulnerability-based notification rule returns "Successfully created the notification rule." 
response ``` /** * Create a new vulnerability-based notification rule returns "Successfully created the notification rule." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateVulnerabilityNotificationRuleRequest = { body: { data: { attributes: { enabled: true, name: "Rule 1", selectors: { query: "(source:production_service OR env:prod)", ruleTypes: ["misconfiguration", "attack_path"], severities: ["critical"], triggerSource: "security_findings", }, targets: ["@john.doe@email.com"], timeAggregation: 86400, }, type: "notification_rules", }, }, }; apiInstance .createVulnerabilityNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-security-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-security-filter-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.eu/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.ddog-gov.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id}https://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/{security_filter_id} ### Overview Get the details of a specific security filter. See the [security filter guide](https://docs.datadoghq.com/security_platform/guide/how-to-setup-security-filters-using-security-monitoring-api/) for more examples. This endpoint requires the `security_monitoring_filters_read` permission. OAuth apps require the `security_monitoring_filters_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description security_filter_id [_required_] string The ID of the security filter. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityFilter-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityFilter-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityFilter-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a single security filter. 
Field Type Description data object The security filter's properties. attributes object The object describing a security filter. exclusion_filters [object] The list of exclusion filters applied in this security filter. name string The exclusion filter name. query string The exclusion filter query. filtered_data_type enum The filtered data type. Allowed enum values: `logs` is_builtin boolean Whether the security filter is the built-in filter. is_enabled boolean Whether the security filter is enabled. name string The security filter name. query string The security filter query. Logs accepted by this query will be accepted by this filter. version int32 The version of the security filter. id string The ID of the security filter. type enum The type of the resource. The value should always be `security_filters`. Allowed enum values: `security_filters` default: `security_filters` meta object Optional metadata associated to the response. warning string A warning message. ``` { "data": { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_builtin": false, "is_enabled": false, "name": "Custom security filter", "query": "service:api", "version": 1 }, "id": "3dd-0uc-h1s", "type": "security_filters" }, "meta": { "warning": "All the security filters are disabled. As a result, no logs are being analyzed." } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a security filter ``` # Path parameters export security_filter_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters/${security_filter_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a security filter ``` """ Get a security filter returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "security_filter" in the system SECURITY_FILTER_DATA_ID = environ["SECURITY_FILTER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_filter( security_filter_id=SECURITY_FILTER_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a security filter ``` # Get a security filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "security_filter" in the system SECURITY_FILTER_DATA_ID = ENV["SECURITY_FILTER_DATA_ID"] p api_instance.get_security_filter(SECURITY_FILTER_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a security filter ``` // Get a security filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "security_filter" in the system SecurityFilterDataID := os.Getenv("SECURITY_FILTER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityFilter(ctx, SecurityFilterDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling
`SecurityMonitoringApi.GetSecurityFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityFilter`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a security filter ``` // Get a security filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityFilterResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "security_filter" in the system String SECURITY_FILTER_DATA_ID = System.getenv("SECURITY_FILTER_DATA_ID"); try { SecurityFilterResponse result = apiInstance.getSecurityFilter(SECURITY_FILTER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSecurityFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a security filter ``` // Get a security filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "security_filter" in the system let security_filter_data_id = std::env::var("SECURITY_FILTER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_filter(security_filter_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a security filter ``` /** * Get a security filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "security_filter" in the system const SECURITY_FILTER_DATA_ID = process.env.SECURITY_FILTER_DATA_ID as string; const params: v2.SecurityMonitoringApiGetSecurityFilterRequest = { 
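// `securityFilterId` maps to the {security_filter_id} path parameter; here it is read from the SECURITY_FILTER_DATA_ID environment variable above.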
securityFilterId: SECURITY_FILTER_DATA_ID, }; apiInstance .getSecurityFilter(params) .then((data: v2.SecurityFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a security filter](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-security-filter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-a-security-filter-v2) POST https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/security_filtershttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/security_filtershttps://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters ### Overview Create a security filter. See the [security filter guide](https://docs.datadoghq.com/security_platform/guide/how-to-setup-security-filters-using-security-monitoring-api/) for more examples. This endpoint requires the `security_monitoring_filters_write` permission. OAuth apps require the `security_monitoring_filters_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) The definition of the new security filter. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object Object for a single security filter. attributes [_required_] object Object containing the attributes of the security filter to be created. exclusion_filters [_required_] [object] Exclusion filters to exclude some logs from the security filter. name [_required_] string Exclusion filter name. query [_required_] string Exclusion filter query. Logs that match this query are excluded from the security filter. filtered_data_type [_required_] enum The filtered data type. Allowed enum values: `logs` is_enabled [_required_] boolean Whether the security filter is enabled. name [_required_] string The name of the security filter. query [_required_] string The query of the security filter. type [_required_] enum The type of the resource. The value should always be `security_filters`. 
Allowed enum values: `security_filters` default: `security_filters` ``` { "data": { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_enabled": true, "name": "Example-Security-Monitoring", "query": "service:ExampleSecurityMonitoring" }, "type": "security_filters" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityFilter-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityFilter-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityFilter-403-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityFilter-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateSecurityFilter-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a single security filter. Field Type Description data object The security filter's properties. attributes object The object describing a security filter. exclusion_filters [object] The list of exclusion filters applied in this security filter. name string The exclusion filter name. query string The exclusion filter query. filtered_data_type enum The filtered data type. Allowed enum values: `logs` is_builtin boolean Whether the security filter is the built-in filter. is_enabled boolean Whether the security filter is enabled. name string The security filter name. query string The security filter query. Logs accepted by this query will be accepted by this filter. version int32 The version of the security filter. id string The ID of the security filter. type enum The type of the resource. The value should always be `security_filters`. Allowed enum values: `security_filters` default: `security_filters` meta object Optional metadata associated to the response. warning string A warning message. ``` { "data": { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_builtin": false, "is_enabled": false, "name": "Custom security filter", "query": "service:api", "version": 1 }, "id": "3dd-0uc-h1s", "type": "security_filters" }, "meta": { "warning": "All the security filters are disabled. As a result, no logs are being analyzed." } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create a security filter returns "OK" response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_enabled": true, "name": "Example-Security-Monitoring", "query": "service:ExampleSecurityMonitoring" }, "type": "security_filters" } } EOF ``` ##### Create a security filter returns "OK" response ``` // Create a security filter returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityFilterCreateRequest{ Data: datadogV2.SecurityFilterCreateData{ Attributes: datadogV2.SecurityFilterCreateAttributes{ ExclusionFilters: []datadogV2.SecurityFilterExclusionFilter{ { Name: "Exclude staging", Query: "source:staging", }, }, FilteredDataType: datadogV2.SECURITYFILTERFILTEREDDATATYPE_LOGS, IsEnabled: true, Name: "Example-Security-Monitoring", Query: "service:ExampleSecurityMonitoring", }, Type: datadogV2.SECURITYFILTERTYPE_SECURITY_FILTERS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateSecurityFilter(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateSecurityFilter`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateSecurityFilter`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a security filter returns "OK" response ``` // Create a security filter returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import
com.datadog.api.client.v2.model.SecurityFilterCreateAttributes; import com.datadog.api.client.v2.model.SecurityFilterCreateData; import com.datadog.api.client.v2.model.SecurityFilterCreateRequest; import com.datadog.api.client.v2.model.SecurityFilterExclusionFilter; import com.datadog.api.client.v2.model.SecurityFilterFilteredDataType; import com.datadog.api.client.v2.model.SecurityFilterResponse; import com.datadog.api.client.v2.model.SecurityFilterType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityFilterCreateRequest body = new SecurityFilterCreateRequest() .data( new SecurityFilterCreateData() .attributes( new SecurityFilterCreateAttributes() .exclusionFilters( Collections.singletonList( new SecurityFilterExclusionFilter() .name("Exclude staging") .query("source:staging"))) .filteredDataType(SecurityFilterFilteredDataType.LOGS) .isEnabled(true) .name("Example-Security-Monitoring") .query("service:ExampleSecurityMonitoring")) .type(SecurityFilterType.SECURITY_FILTERS)); try { SecurityFilterResponse result = apiInstance.createSecurityFilter(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#createSecurityFilter"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a security filter returns "OK" response ``` """ Create a security filter returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_filter_create_attributes import SecurityFilterCreateAttributes from datadog_api_client.v2.model.security_filter_create_data import SecurityFilterCreateData from datadog_api_client.v2.model.security_filter_create_request import SecurityFilterCreateRequest from datadog_api_client.v2.model.security_filter_exclusion_filter import SecurityFilterExclusionFilter from datadog_api_client.v2.model.security_filter_filtered_data_type import SecurityFilterFilteredDataType from datadog_api_client.v2.model.security_filter_type import SecurityFilterType body = SecurityFilterCreateRequest( data=SecurityFilterCreateData( attributes=SecurityFilterCreateAttributes( exclusion_filters=[ SecurityFilterExclusionFilter( name="Exclude staging", query="source:staging", ), ], filtered_data_type=SecurityFilterFilteredDataType.LOGS, is_enabled=True, name="Example-Security-Monitoring", query="service:ExampleSecurityMonitoring", ), type=SecurityFilterType.SECURITY_FILTERS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_security_filter(body=body) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a security filter returns "OK" response ``` # Create a security filter returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityFilterCreateRequest.new({ data: DatadogAPIClient::V2::SecurityFilterCreateData.new({ attributes: DatadogAPIClient::V2::SecurityFilterCreateAttributes.new({ exclusion_filters: [ DatadogAPIClient::V2::SecurityFilterExclusionFilter.new({ name: "Exclude staging", query: "source:staging", }), ], filtered_data_type: DatadogAPIClient::V2::SecurityFilterFilteredDataType::LOGS, is_enabled: true, name: "Example-Security-Monitoring", query: "service:ExampleSecurityMonitoring", }), type: DatadogAPIClient::V2::SecurityFilterType::SECURITY_FILTERS, }), }) p api_instance.create_security_filter(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a security filter returns "OK" response ``` // Create a security filter returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityFilterCreateAttributes; use datadog_api_client::datadogV2::model::SecurityFilterCreateData; use datadog_api_client::datadogV2::model::SecurityFilterCreateRequest; use datadog_api_client::datadogV2::model::SecurityFilterExclusionFilter; use datadog_api_client::datadogV2::model::SecurityFilterFilteredDataType; use datadog_api_client::datadogV2::model::SecurityFilterType; #[tokio::main] async fn main() { let body = SecurityFilterCreateRequest::new(SecurityFilterCreateData::new( SecurityFilterCreateAttributes::new( vec![SecurityFilterExclusionFilter::new( "Exclude staging".to_string(), "source:staging".to_string(), )], SecurityFilterFilteredDataType::LOGS, true, "Example-Security-Monitoring".to_string(), "service:ExampleSecurityMonitoring".to_string(), ), SecurityFilterType::SECURITY_FILTERS, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_security_filter(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a security filter returns "OK" response ``` /** * Create a security filter returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: 
v2.SecurityMonitoringApiCreateSecurityFilterRequest = { body: { data: { attributes: { exclusionFilters: [ { name: "Exclude staging", query: "source:staging", }, ], filteredDataType: "logs", isEnabled: true, name: "Example-Security-Monitoring", query: "service:ExampleSecurityMonitoring", }, type: "security_filters", }, }, }; apiInstance .createSecurityFilter(params) .then((data: v2.SecurityFilterResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all security filters](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-security-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-all-security-filters-v2) GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.ap2.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.datadoghq.eu/api/v2/security_monitoring/configuration/security_filtershttps://api.ddog-gov.com/api/v2/security_monitoring/configuration/security_filtershttps://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.us3.datadoghq.com/api/v2/security_monitoring/configuration/security_filtershttps://api.us5.datadoghq.com/api/v2/security_monitoring/configuration/security_filters ### Overview Get the list of configured security filters with their definitions. This endpoint requires the `security_monitoring_filters_read` permission. OAuth apps require the `security_monitoring_filters_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFilters-200-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) All the available security filters objects. Field Type Description data [object] A list of security filters objects. attributes object The object describing a security filter. exclusion_filters [object] The list of exclusion filters applied in this security filter. name string The exclusion filter name. query string The exclusion filter query. filtered_data_type enum The filtered data type. Allowed enum values: `logs` is_builtin boolean Whether the security filter is the built-in filter. is_enabled boolean Whether the security filter is enabled. name string The security filter name. query string The security filter query. Logs accepted by this query will be accepted by this filter. version int32 The version of the security filter. id string The ID of the security filter. type enum The type of the resource. The value should always be `security_filters`. Allowed enum values: `security_filters` default: `security_filters` meta object Optional metadata associated to the response. warning string A warning message. 
``` { "data": [ { "attributes": { "exclusion_filters": [ { "name": "Exclude staging", "query": "source:staging" } ], "filtered_data_type": "logs", "is_builtin": false, "is_enabled": false, "name": "Custom security filter", "query": "service:api", "version": 1 }, "id": "3dd-0uc-h1s", "type": "security_filters" } ], "meta": { "warning": "All the security filters are disabled. As a result, no logs are being analyzed." } } ``` Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get all security filters ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/configuration/security_filters" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all security filters ``` """ Get all security filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_security_filters() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all security filters ``` # Get all security filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_security_filters() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all security filters ``` // Get all security filters returns "OK"
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListSecurityFilters(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListSecurityFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListSecurityFilters`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all security filters ``` // Get all security filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityFiltersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityFiltersResponse result = apiInstance.listSecurityFilters(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listSecurityFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all security filters ``` // Get all security filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.list_security_filters().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all security filters ``` /** * Get all security filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new 
v2.SecurityMonitoringApi(configuration); apiInstance .listSecurityFilters() .then((data: v2.SecurityFiltersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get details of a signal-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-details-of-a-signal-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-details-of-a-signal-based-notification-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/signals/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/signals/notification_rules/{id} ### Overview Get the details of a notification rule for security signals. This endpoint requires the `security_monitoring_notification_profiles_read` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSignalNotificationRule-429-v2) Notification rule details. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. 
selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get details of a signal-based notification rule ``` # Path parameters export id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v2/security/signals/notification_rules/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get details of a signal-based notification rule ``` """ Get details of a signal-based notification rule returns "Notification rule details." response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "valid_signal_notification_rule" in the system VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = environ["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_signal_notification_rule( id=VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get details of a signal-based notification rule ``` # Get details of a signal-based notification rule returns "Notification rule details." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "valid_signal_notification_rule" in the system VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = ENV["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"] p api_instance.get_signal_notification_rule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get details of a signal-based notification rule ``` // Get details of a signal-based notification rule returns "Notification rule details."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_signal_notification_rule" in the system ValidSignalNotificationRuleDataID := os.Getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSignalNotificationRule(ctx, ValidSignalNotificationRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSignalNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSignalNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get details of a signal-based notification rule ``` // Get details of a signal-based notification rule returns "Notification rule details." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.NotificationRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_signal_notification_rule" in the system String VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"); try { NotificationRuleResponse result = apiInstance.getSignalNotificationRule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSignalNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get details of a signal-based notification rule ``` // Get details of a signal-based notification rule returns "Notification rule // details." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "valid_signal_notification_rule" in the system let valid_signal_notification_rule_data_id = std::env::var("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_signal_notification_rule(valid_signal_notification_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get details of a signal-based notification rule ``` /** * Get details of a signal-based notification rule returns "Notification rule details." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_signal_notification_rule" in the system const VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = process.env .VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiGetSignalNotificationRuleRequest = { id: VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, }; apiInstance .getSignalNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get details of a vulnerability notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#get-details-of-a-vulnerability-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-details-of-a-vulnerability-notification-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id} ### Overview Get the details of a notification rule for security vulnerabilities. This endpoint requires the `security_monitoring_notification_profiles_read` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. 
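If you are not using one of the official client libraries, this call is a plain HTTP GET with the `id` path parameter and the standard Datadog auth headers shown in the examples on this page. The sketch below uses only the Python standard library; the `RULE_ID` and `DD_API_HOST` environment variables and the default `api.datadoghq.com` host are illustrative assumptions, not part of the official examples.

```
# Minimal raw-HTTP sketch (standard library only); swap the host for your site's API endpoint.
import json
import os
import urllib.request

api_host = os.environ.get("DD_API_HOST", "https://api.datadoghq.com")  # assumption: US1 by default
rule_id = os.environ["RULE_ID"]  # ID of the notification rule to fetch (hypothetical variable name)

req = urllib.request.Request(
    f"{api_host}/api/v2/security/vulnerabilities/notification_rules/{rule_id}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
with urllib.request.urlopen(req) as resp:
    rule = json.load(resp)

# Vulnerability-based rules use the "security_findings" trigger source
# (see the response schema below).
print(rule["data"]["attributes"]["selectors"]["trigger_source"])
```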
### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetVulnerabilityNotificationRule-429-v2) Notification rule details. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. 
It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get details of a vulnerability notification rule Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/${id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get details of a vulnerability notification rule ``` """ Get details of a vulnerability notification rule returns "Notification rule details." 
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = environ["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_vulnerability_notification_rule( id=VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get details of a vulnerability notification rule ``` # Get details of a vulnerability notification rule returns "Notification rule details." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = ENV["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] p api_instance.get_vulnerability_notification_rule(VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get details of a vulnerability notification rule ``` // Get details of a vulnerability notification rule returns "Notification rule details." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_vulnerability_notification_rule" in the system ValidVulnerabilityNotificationRuleDataID := os.Getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetVulnerabilityNotificationRule(ctx, ValidVulnerabilityNotificationRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetVulnerabilityNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetVulnerabilityNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get details of a vulnerability notification rule ``` // Get details of a vulnerability notification rule returns "Notification rule details." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.NotificationRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_vulnerability_notification_rule" in the system String VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"); try { NotificationRuleResponse result = apiInstance.getVulnerabilityNotificationRule( VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getVulnerabilityNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get details of a vulnerability notification rule ``` // Get details of a vulnerability notification rule returns "Notification rule // details." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "valid_vulnerability_notification_rule" in the system let valid_vulnerability_notification_rule_data_id = std::env::var("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_vulnerability_notification_rule(valid_vulnerability_notification_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get details of a vulnerability notification rule ``` /** * Get details of a vulnerability notification rule returns "Notification rule details." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_vulnerability_notification_rule" in the system const VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = process.env .VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiGetVulnerabilityNotificationRuleRequest = { id: VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, }; apiInstance .getVulnerabilityNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Run a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#run-a-threat-hunting-job) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#run-a-threat-hunting-job-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. POST https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.datadoghq.eu/api/v2/siem-threat-hunting/jobshttps://api.ddog-gov.com/api/v2/siem-threat-hunting/jobshttps://api.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs ### Overview Run a threat hunting job. This endpoint requires the `security_monitoring_rules_write` permission. OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. 
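The request body documented below requires `from` and `to` start and end times for the data analyzed by the job. Judging from the example payload (values such as 1730387522611), these appear to be Unix timestamps in milliseconds. The following sketch, with an illustrative `window_ms` helper, shows one way to compute a recent analysis window in that format; it is a convenience sketch, not part of the official examples.

```
# Compute a "last N hours" window as epoch-millisecond timestamps for
# jobDefinition.from / jobDefinition.to (assumption: milliseconds, based on
# the 13-digit example values on this page).
from datetime import datetime, timedelta, timezone

def window_ms(hours_back: float = 1.0) -> tuple[int, int]:
    """Return (from, to) as epoch milliseconds covering the last `hours_back` hours."""
    to_dt = datetime.now(timezone.utc)
    from_dt = to_dt - timedelta(hours=hours_back)
    return int(from_dt.timestamp() * 1000), int(to_dt.timestamp() * 1000)

job_from, job_to = window_ms(1.0)
print(job_from, job_to)  # plug these into the request body's from / to fields
```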
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data for running a threat hunting job request. attributes object Run a threat hunting job request. fromRule object Definition of a threat hunting job based on a security monitoring rule. from [_required_] int64 Starting time of data analyzed by the job. id [_required_] string ID of the detection rule used to create the job. index [_required_] string Index used to load the data. notifications [string] Notifications sent when the job is completed. to [_required_] int64 Ending time of data analyzed by the job. id string Request ID. jobDefinition object Definition of a threat hunting job. calculatedFields [object] Calculated fields. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases used for generating job results. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` from [_required_] int64 Starting time of data analyzed by the job. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. index [_required_] string Index used to load the data. message [_required_] string Message for generated results. name [_required_] string Job name. options object Job options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. 
For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. 
groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs analyzed by the job. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the query. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables used in the queries. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating results from third-party detection method. Only available for third-party detection method. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` to [_required_] int64 Ending time of data analyzed by the job. type string Job type. type enum Type of data. 
Allowed enum values: `historicalDetectionsJobCreate` ``` { "data": { "type": "historicalDetectionsJobCreate", "attributes": { "jobDefinition": { "type": "log_detection", "name": "Excessive number of failed attempts.", "queries": [ { "query": "source:non_existing_src_weekend", "aggregation": "count", "groupByFields": [], "distinctFields": [] } ], "cases": [ { "name": "Condition 1", "status": "info", "notifications": [], "condition": "a > 1" } ], "options": { "keepAlive": 3600, "maxSignalDuration": 86400, "evaluationWindow": 900 }, "message": "A large number of failed login attempts.", "tags": [], "from": 1730387522611, "to": 1730387532611, "index": "main" } } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#RunThreatHuntingJob-429-v2) Status created * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Run a threat hunting job response. Field Type Description data object The definition of `JobCreateResponseData` object. id string ID of the created job. type enum Type of payload. Allowed enum values: `historicalDetectionsJob` ``` { "data": { "id": "string", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Run a threat hunting job returns "Status created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "historicalDetectionsJobCreate", "attributes": { "jobDefinition": { "type": "log_detection", "name": "Excessive number of failed attempts.", "queries": [ { "query": "source:non_existing_src_weekend", "aggregation": "count", "groupByFields": [], "distinctFields": [] } ], "cases": [ { "name": "Condition 1", "status": "info", "notifications": [], "condition": "a > 1" } ], "options": { "keepAlive": 3600, "maxSignalDuration": 86400, "evaluationWindow": 900 }, "message": "A large number of failed login attempts.", "tags": [], "from": 1730387522611, "to": 1730387532611, "index": "main" } } } } EOF ``` ##### Run a threat hunting job returns "Status created" response ``` // Run a threat hunting job returns "Status created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.RunThreatHuntingJobRequest{ Data: &datadogV2.RunThreatHuntingJobRequestData{ Type: datadogV2.RUNTHREATHUNTINGJOBREQUESTDATATYPE_HISTORICALDETECTIONSJOBCREATE.Ptr(), Attributes: &datadogV2.RunThreatHuntingJobRequestAttributes{ JobDefinition: &datadogV2.JobDefinition{ Type: datadog.PtrString("log_detection"), Name: "Excessive number of failed attempts.", Queries: []datadogV2.ThreatHuntingJobQuery{ { Query: datadog.PtrString("source:non_existing_src_weekend"), Aggregation: datadogV2.SECURITYMONITORINGRULEQUERYAGGREGATION_COUNT.Ptr(), GroupByFields: []string{}, DistinctFields: []string{}, }, }, Cases: []datadogV2.SecurityMonitoringRuleCaseCreate{ { Name: datadog.PtrString("Condition 1"), Status: datadogV2.SECURITYMONITORINGRULESEVERITY_INFO, Notifications: []string{}, Condition: datadog.PtrString("a > 1"), }, }, Options: &datadogV2.ThreatHuntingJobOptions{ KeepAlive: datadogV2.SECURITYMONITORINGRULEKEEPALIVE_ONE_HOUR.Ptr(), MaxSignalDuration: datadogV2.SECURITYMONITORINGRULEMAXSIGNALDURATION_ONE_DAY.Ptr(), EvaluationWindow: datadogV2.SECURITYMONITORINGRULEEVALUATIONWINDOW_FIFTEEN_MINUTES.Ptr(), }, Message: "A large number of failed login attempts.", Tags: []string{}, From: 1730387522611, To: 1730387532611, Index: "main", }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.RunThreatHuntingJob", true) apiClient := 
datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.RunThreatHuntingJob(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.RunThreatHuntingJob`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.RunThreatHuntingJob`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Run a threat hunting job returns "Status created" response ``` // Run a threat hunting job returns "Status created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.JobCreateResponse; import com.datadog.api.client.v2.model.JobDefinition; import com.datadog.api.client.v2.model.RunThreatHuntingJobRequest; import com.datadog.api.client.v2.model.RunThreatHuntingJobRequestAttributes; import com.datadog.api.client.v2.model.RunThreatHuntingJobRequestData; import com.datadog.api.client.v2.model.RunThreatHuntingJobRequestDataType; import com.datadog.api.client.v2.model.SecurityMonitoringRuleCaseCreate; import com.datadog.api.client.v2.model.SecurityMonitoringRuleEvaluationWindow; import com.datadog.api.client.v2.model.SecurityMonitoringRuleKeepAlive; import com.datadog.api.client.v2.model.SecurityMonitoringRuleMaxSignalDuration; import com.datadog.api.client.v2.model.SecurityMonitoringRuleQueryAggregation; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import com.datadog.api.client.v2.model.ThreatHuntingJobOptions; import com.datadog.api.client.v2.model.ThreatHuntingJobQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.runThreatHuntingJob", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); RunThreatHuntingJobRequest body = new RunThreatHuntingJobRequest() .data( new RunThreatHuntingJobRequestData() .type(RunThreatHuntingJobRequestDataType.HISTORICALDETECTIONSJOBCREATE) .attributes( new RunThreatHuntingJobRequestAttributes() .jobDefinition( new JobDefinition() .type("log_detection") .name("Excessive number of failed attempts.") .queries( Collections.singletonList( new ThreatHuntingJobQuery() .query("source:non_existing_src_weekend") .aggregation( SecurityMonitoringRuleQueryAggregation.COUNT))) .cases( Collections.singletonList( new SecurityMonitoringRuleCaseCreate() .name("Condition 1") .status(SecurityMonitoringRuleSeverity.INFO) .condition("a > 1"))) .options( new ThreatHuntingJobOptions() .keepAlive(SecurityMonitoringRuleKeepAlive.ONE_HOUR) .maxSignalDuration( SecurityMonitoringRuleMaxSignalDuration.ONE_DAY) .evaluationWindow( SecurityMonitoringRuleEvaluationWindow .FIFTEEN_MINUTES)) .message("A large number of failed login attempts.") .from(1730387522611L) .to(1730387532611L) .index("main")))); try { JobCreateResponse result = apiInstance.runThreatHuntingJob(body); System.out.println(result); } catch 
(ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#runThreatHuntingJob"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Run a threat hunting job returns "Status created" response ``` """ Run a threat hunting job returns "Status created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.job_definition import JobDefinition from datadog_api_client.v2.model.run_threat_hunting_job_request import RunThreatHuntingJobRequest from datadog_api_client.v2.model.run_threat_hunting_job_request_attributes import RunThreatHuntingJobRequestAttributes from datadog_api_client.v2.model.run_threat_hunting_job_request_data import RunThreatHuntingJobRequestData from datadog_api_client.v2.model.run_threat_hunting_job_request_data_type import RunThreatHuntingJobRequestDataType from datadog_api_client.v2.model.security_monitoring_rule_case_create import SecurityMonitoringRuleCaseCreate from datadog_api_client.v2.model.security_monitoring_rule_evaluation_window import ( SecurityMonitoringRuleEvaluationWindow, ) from datadog_api_client.v2.model.security_monitoring_rule_keep_alive import SecurityMonitoringRuleKeepAlive from datadog_api_client.v2.model.security_monitoring_rule_max_signal_duration import ( SecurityMonitoringRuleMaxSignalDuration, ) from datadog_api_client.v2.model.security_monitoring_rule_query_aggregation import ( SecurityMonitoringRuleQueryAggregation, ) from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity from datadog_api_client.v2.model.threat_hunting_job_options import ThreatHuntingJobOptions from datadog_api_client.v2.model.threat_hunting_job_query import ThreatHuntingJobQuery body = RunThreatHuntingJobRequest( data=RunThreatHuntingJobRequestData( type=RunThreatHuntingJobRequestDataType.HISTORICALDETECTIONSJOBCREATE, attributes=RunThreatHuntingJobRequestAttributes( job_definition=JobDefinition( type="log_detection", name="Excessive number of failed attempts.", queries=[ ThreatHuntingJobQuery( query="source:non_existing_src_weekend", aggregation=SecurityMonitoringRuleQueryAggregation.COUNT, group_by_fields=[], distinct_fields=[], ), ], cases=[ SecurityMonitoringRuleCaseCreate( name="Condition 1", status=SecurityMonitoringRuleSeverity.INFO, notifications=[], condition="a > 1", ), ], options=ThreatHuntingJobOptions( keep_alive=SecurityMonitoringRuleKeepAlive.ONE_HOUR, max_signal_duration=SecurityMonitoringRuleMaxSignalDuration.ONE_DAY, evaluation_window=SecurityMonitoringRuleEvaluationWindow.FIFTEEN_MINUTES, ), message="A large number of failed login attempts.", tags=[], _from=1730387522611, to=1730387532611, index="main", ), ), ), ) configuration = Configuration() configuration.unstable_operations["run_threat_hunting_job"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = 
api_instance.run_threat_hunting_job(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Run a threat hunting job returns "Status created" response ``` # Run a threat hunting job returns "Status created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.run_threat_hunting_job".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::RunThreatHuntingJobRequest.new({ data: DatadogAPIClient::V2::RunThreatHuntingJobRequestData.new({ type: DatadogAPIClient::V2::RunThreatHuntingJobRequestDataType::HISTORICALDETECTIONSJOBCREATE, attributes: DatadogAPIClient::V2::RunThreatHuntingJobRequestAttributes.new({ job_definition: DatadogAPIClient::V2::JobDefinition.new({ type: "log_detection", name: "Excessive number of failed attempts.", queries: [ DatadogAPIClient::V2::ThreatHuntingJobQuery.new({ query: "source:non_existing_src_weekend", aggregation: DatadogAPIClient::V2::SecurityMonitoringRuleQueryAggregation::COUNT, group_by_fields: [], distinct_fields: [], }), ], cases: [ DatadogAPIClient::V2::SecurityMonitoringRuleCaseCreate.new({ name: "Condition 1", status: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::INFO, notifications: [], condition: "a > 1", }), ], options: DatadogAPIClient::V2::ThreatHuntingJobOptions.new({ keep_alive: DatadogAPIClient::V2::SecurityMonitoringRuleKeepAlive::ONE_HOUR, max_signal_duration: DatadogAPIClient::V2::SecurityMonitoringRuleMaxSignalDuration::ONE_DAY, evaluation_window: DatadogAPIClient::V2::SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, }), message: "A large number of failed login attempts.", tags: [], from: 1730387522611, to: 1730387532611, index: "main", }), }), }), }) p api_instance.run_threat_hunting_job(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Run a threat hunting job returns "Status created" response ``` // Run a threat hunting job returns "Status created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::JobDefinition; use datadog_api_client::datadogV2::model::RunThreatHuntingJobRequest; use datadog_api_client::datadogV2::model::RunThreatHuntingJobRequestAttributes; use datadog_api_client::datadogV2::model::RunThreatHuntingJobRequestData; use datadog_api_client::datadogV2::model::RunThreatHuntingJobRequestDataType; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleCaseCreate; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleEvaluationWindow; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleKeepAlive; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleMaxSignalDuration; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleQueryAggregation; use 
datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; use datadog_api_client::datadogV2::model::ThreatHuntingJobOptions; use datadog_api_client::datadogV2::model::ThreatHuntingJobQuery; #[tokio::main] async fn main() { let body = RunThreatHuntingJobRequest::new().data( RunThreatHuntingJobRequestData::new() .attributes( RunThreatHuntingJobRequestAttributes::new().job_definition( JobDefinition::new( vec![SecurityMonitoringRuleCaseCreate::new( SecurityMonitoringRuleSeverity::INFO, ) .condition("a > 1".to_string()) .name("Condition 1".to_string()) .notifications(vec![])], 1730387522611, "main".to_string(), "A large number of failed login attempts.".to_string(), "Excessive number of failed attempts.".to_string(), vec![ThreatHuntingJobQuery::new() .aggregation(SecurityMonitoringRuleQueryAggregation::COUNT) .distinct_fields(vec![]) .group_by_fields(vec![]) .query("source:non_existing_src_weekend".to_string())], 1730387532611, ) .options( ThreatHuntingJobOptions::new() .evaluation_window( SecurityMonitoringRuleEvaluationWindow::FIFTEEN_MINUTES, ) .keep_alive(SecurityMonitoringRuleKeepAlive::ONE_HOUR) .max_signal_duration(SecurityMonitoringRuleMaxSignalDuration::ONE_DAY), ) .tags(vec![]) .type_("log_detection".to_string()), ), ) .type_(RunThreatHuntingJobRequestDataType::HISTORICALDETECTIONSJOBCREATE), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.RunThreatHuntingJob", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.run_threat_hunting_job(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Run a threat hunting job returns "Status created" response ``` /** * Run a threat hunting job returns "Status created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.runThreatHuntingJob"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiRunThreatHuntingJobRequest = { body: { data: { type: "historicalDetectionsJobCreate", attributes: { jobDefinition: { type: "log_detection", name: "Excessive number of failed attempts.", queries: [ { query: "source:non_existing_src_weekend", aggregation: "count", groupByFields: [], distinctFields: [], }, ], cases: [ { name: "Condition 1", status: "info", notifications: [], condition: "a > 1", }, ], options: { keepAlive: 3600, maxSignalDuration: 86400, evaluationWindow: 900, }, message: "A large number of failed login attempts.", tags: [], from: 1730387522611, to: 1730387532611, index: "main", }, }, }, }, }; apiInstance .runThreatHuntingJob(params) .then((data: v2.JobCreateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List threat hunting jobs](https://docs.datadoghq.com/api/latest/security-monitoring/#list-threat-hunting-jobs) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-threat-hunting-jobs-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.datadoghq.eu/api/v2/siem-threat-hunting/jobshttps://api.ddog-gov.com/api/v2/siem-threat-hunting/jobshttps://api.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobshttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs ### Overview List threat hunting jobs. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort string The order of the jobs in results. filter[query] string Query used to filter items from the fetched list. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListThreatHuntingJobs-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListThreatHuntingJobs-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListThreatHuntingJobs-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListThreatHuntingJobs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) List of threat hunting jobs. Field Type Description data [object] Array containing the list of threat hunting jobs. attributes object Threat hunting job attributes. createdAt string Time when the job was created. createdByHandle string The handle of the user who created the job. createdByName string The name of the user who created the job. createdFromRuleId string ID of the rule used to create the job (if it is created from a rule). jobDefinition object Definition of a threat hunting job. calculatedFields [object] Calculated fields. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases used for generating job results. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. 
Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` from [_required_] int64 Starting time of data analyzed by the job. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. index [_required_] string Index used to load the data. message [_required_] string Message for generated results. name [_required_] string Job name. options object Job options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. 
If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs analyzed by the job. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the query. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables used in the queries. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. 
tableName string The name of the reference table. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating results from third-party detection method. Only available for third-party detection method. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` to [_required_] int64 Ending time of data analyzed by the job. type string Job type. jobName string Job name. jobStatus string Job status. modifiedAt string Last modification time of the job. signalOutput boolean Whether the job outputs signals. id string ID of the job. type enum Type of payload. Allowed enum values: `historicalDetectionsJob` meta object Metadata about the list of jobs. totalCount int32 Number of jobs in the list. ``` { "data": [ { "attributes": { "createdAt": "string", "createdByHandle": "string", "createdByName": "string", "createdFromRuleId": "string", "jobDefinition": { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "name": "string", "notifications": [], "status": "critical" } ], "from": 1729843470000, "groupSignalsBy": [ "service" ], "index": "cloud_siem", "message": "A large number of failed login attempts.", "name": "Excessive number of failed attempts.", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "detectionMethod": "string", "evaluationWindow": "integer", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "tags": [], "thirdPartyCases": [ { "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "to": 1729847070000, "type": "string" }, "jobName": "string", "jobStatus": "string", "modifiedAt": "string", "signalOutput": false }, "id": "string", "type": "string" } ], "meta": { "totalCount": "integer" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List threat hunting jobs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List threat hunting jobs ``` """ List threat hunting jobs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["list_threat_hunting_jobs"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_threat_hunting_jobs() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List threat hunting jobs ``` # List threat hunting jobs returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_threat_hunting_jobs".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_threat_hunting_jobs() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List threat hunting jobs ``` // List threat hunting jobs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListThreatHuntingJobs", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListThreatHuntingJobs(ctx, *datadogV2.NewListThreatHuntingJobsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListThreatHuntingJobs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListThreatHuntingJobs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List threat hunting jobs ``` // List threat hunting jobs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ListThreatHuntingJobsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listThreatHuntingJobs", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ListThreatHuntingJobsResponse result = apiInstance.listThreatHuntingJobs(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listThreatHuntingJobs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List threat hunting jobs ``` // List threat hunting jobs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListThreatHuntingJobsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListThreatHuntingJobs", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_threat_hunting_jobs(ListThreatHuntingJobsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` 
##### List threat hunting jobs ``` /** * List threat hunting jobs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listThreatHuntingJobs"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listThreatHuntingJobs() .then((data: v2.ListThreatHuntingJobsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Patch a signal-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#patch-a-signal-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#patch-a-signal-based-notification-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/signals/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/signals/notification_rules/{id} ### Overview Partially update the notification rule. All fields are optional; if a field is not provided, it is not updated. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the notification rule patch request: the rule ID, the rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule patch request. It is required to update the version of the rule when patching it. enabled boolean Field used to enable or disable the rule. name string Name of the notification rule. selectors object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". 
Allowed enum values: `security_findings,security_signals` targets [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-404-v2) * [422](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-422-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchSignalNotificationRule-429-v2) Notification rule successfully patched. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. 
Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy The server cannot process the request because it contains invalid data. 
* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error. source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` # Path parameters export id="CHANGE_ME" # Curl command (api.datadoghq.com shown; substitute your site's API host if needed) curl -X PATCH "https://api.datadoghq.com/api/v2/security/signals/notification_rules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } EOF ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` // Patch a signal-based notification rule returns "Notification rule successfully patched."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_signal_notification_rule" in the system ValidSignalNotificationRuleDataID := os.Getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID") body := datadogV2.PatchNotificationRuleParameters{ Data: &datadogV2.PatchNotificationRuleParametersData{ Attributes: datadogV2.PatchNotificationRuleParametersDataAttributes{ Enabled: datadog.PtrBool(true), Name: datadog.PtrString("Rule 1"), Selectors: &datadogV2.Selectors{ Query: datadog.PtrString("(source:production_service OR env:prod)"), RuleTypes: []datadogV2.RuleTypesItems{ datadogV2.RULETYPESITEMS_MISCONFIGURATION, datadogV2.RULETYPESITEMS_ATTACK_PATH, }, Severities: []datadogV2.RuleSeverity{ datadogV2.RULESEVERITY_CRITICAL, }, TriggerSource: datadogV2.TRIGGERSOURCE_SECURITY_FINDINGS, }, Targets: []string{ "@john.doe@email.com", }, TimeAggregation: datadog.PtrInt64(86400), Version: datadog.PtrInt64(1), }, Id: ValidSignalNotificationRuleDataID, Type: datadogV2.NOTIFICATIONRULESTYPE_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.PatchSignalNotificationRule(ctx, ValidSignalNotificationRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.PatchSignalNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.PatchSignalNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` // Patch a signal-based notification rule returns "Notification rule successfully patched." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.NotificationRuleResponse; import com.datadog.api.client.v2.model.NotificationRulesType; import com.datadog.api.client.v2.model.PatchNotificationRuleParameters; import com.datadog.api.client.v2.model.PatchNotificationRuleParametersData; import com.datadog.api.client.v2.model.PatchNotificationRuleParametersDataAttributes; import com.datadog.api.client.v2.model.RuleSeverity; import com.datadog.api.client.v2.model.RuleTypesItems; import com.datadog.api.client.v2.model.Selectors; import com.datadog.api.client.v2.model.TriggerSource; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_signal_notification_rule" in the system String VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"); PatchNotificationRuleParameters body = new PatchNotificationRuleParameters() .data( new PatchNotificationRuleParametersData() .attributes( new PatchNotificationRuleParametersDataAttributes() .enabled(true) .name("Rule 1") .selectors( new Selectors() .query("(source:production_service OR env:prod)") .ruleTypes( Arrays.asList( RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH)) .severities(Collections.singletonList(RuleSeverity.CRITICAL)) .triggerSource(TriggerSource.SECURITY_FINDINGS)) .targets(Collections.singletonList("@john.doe@email.com")) .timeAggregation(86400L) .version(1L)) .id(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID) .type(NotificationRulesType.NOTIFICATION_RULES)); try { NotificationRuleResponse result = apiInstance.patchSignalNotificationRule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#patchSignalNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` """ Patch a signal-based notification rule returns "Notification rule successfully patched." 
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.notification_rules_type import NotificationRulesType from datadog_api_client.v2.model.patch_notification_rule_parameters import PatchNotificationRuleParameters from datadog_api_client.v2.model.patch_notification_rule_parameters_data import PatchNotificationRuleParametersData from datadog_api_client.v2.model.patch_notification_rule_parameters_data_attributes import ( PatchNotificationRuleParametersDataAttributes, ) from datadog_api_client.v2.model.rule_severity import RuleSeverity from datadog_api_client.v2.model.rule_types_items import RuleTypesItems from datadog_api_client.v2.model.selectors import Selectors from datadog_api_client.v2.model.trigger_source import TriggerSource # there is a valid "valid_signal_notification_rule" in the system VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = environ["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"] body = PatchNotificationRuleParameters( data=PatchNotificationRuleParametersData( attributes=PatchNotificationRuleParametersDataAttributes( enabled=True, name="Rule 1", selectors=Selectors( query="(source:production_service OR env:prod)", rule_types=[ RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH, ], severities=[ RuleSeverity.CRITICAL, ], trigger_source=TriggerSource.SECURITY_FINDINGS, ), targets=[ "@john.doe@email.com", ], time_aggregation=86400, version=1, ), id=VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, type=NotificationRulesType.NOTIFICATION_RULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.patch_signal_notification_rule(id=VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` # Patch a signal-based notification rule returns "Notification rule successfully patched." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "valid_signal_notification_rule" in the system VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = ENV["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"] body = DatadogAPIClient::V2::PatchNotificationRuleParameters.new({ data: DatadogAPIClient::V2::PatchNotificationRuleParametersData.new({ attributes: DatadogAPIClient::V2::PatchNotificationRuleParametersDataAttributes.new({ enabled: true, name: "Rule 1", selectors: DatadogAPIClient::V2::Selectors.new({ query: "(source:production_service OR env:prod)", rule_types: [ DatadogAPIClient::V2::RuleTypesItems::MISCONFIGURATION, DatadogAPIClient::V2::RuleTypesItems::ATTACK_PATH, ], severities: [ DatadogAPIClient::V2::RuleSeverity::CRITICAL, ], trigger_source: DatadogAPIClient::V2::TriggerSource::SECURITY_FINDINGS, }), targets: [ "@john.doe@email.com", ], time_aggregation: 86400, version: 1, }), id: VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::NotificationRulesType::NOTIFICATION_RULES, }), }) p api_instance.patch_signal_notification_rule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` // Patch a signal-based notification rule returns "Notification rule successfully // patched." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::NotificationRulesType; use datadog_api_client::datadogV2::model::PatchNotificationRuleParameters; use datadog_api_client::datadogV2::model::PatchNotificationRuleParametersData; use datadog_api_client::datadogV2::model::PatchNotificationRuleParametersDataAttributes; use datadog_api_client::datadogV2::model::RuleSeverity; use datadog_api_client::datadogV2::model::RuleTypesItems; use datadog_api_client::datadogV2::model::Selectors; use datadog_api_client::datadogV2::model::TriggerSource; #[tokio::main] async fn main() { // there is a valid "valid_signal_notification_rule" in the system let valid_signal_notification_rule_data_id = std::env::var("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID").unwrap(); let body = PatchNotificationRuleParameters::new().data(PatchNotificationRuleParametersData::new( PatchNotificationRuleParametersDataAttributes::new() .enabled(true) .name("Rule 1".to_string()) .selectors( Selectors::new(TriggerSource::SECURITY_FINDINGS) .query("(source:production_service OR env:prod)".to_string()) .rule_types(vec![ RuleTypesItems::MISCONFIGURATION, RuleTypesItems::ATTACK_PATH, ]) .severities(vec![RuleSeverity::CRITICAL]), ) .targets(vec!["@john.doe@email.com".to_string()]) .time_aggregation(86400) .version(1), valid_signal_notification_rule_data_id.clone(), NotificationRulesType::NOTIFICATION_RULES, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .patch_signal_notification_rule(valid_signal_notification_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Patch a signal-based notification rule returns "Notification rule successfully patched." response ``` /** * Patch a signal-based notification rule returns "Notification rule successfully patched." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_signal_notification_rule" in the system const VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = process.env .VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiPatchSignalNotificationRuleRequest = { body: { data: { attributes: { enabled: true, name: "Rule 1", selectors: { query: "(source:production_service OR env:prod)", ruleTypes: ["misconfiguration", "attack_path"], severities: ["critical"], triggerSource: "security_findings", }, targets: ["@john.doe@email.com"], timeAggregation: 86400, version: 1, }, id: VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, type: "notification_rules", }, }, id: VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, }; apiInstance .patchSignalNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Patch a vulnerability-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#patch-a-vulnerability-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#patch-a-vulnerability-based-notification-rule-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id} ### Overview Partially update the notification rule. All fields are optional; if a field is not provided, it is not updated. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. 
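Because this endpoint applies a partial update, the request body only needs to contain the attributes you want to change, together with the rule `id`, the `type`, and the rule's current `version` (which, per the attribute notes below, must be supplied when patching). The following curl sketch is illustrative rather than part of the reference models: it disables a rule without touching its selectors or targets, and assumes the default `api.datadoghq.com` host and a placeholder rule ID.

```
# Path parameter (placeholder)
export id="CHANGE_ME"

# Minimal partial update: only "enabled" is changed; omitted attributes keep their current values
curl -X PATCH "https://api.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/${id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "enabled": false,
      "version": 1
    },
    "id": "${id}",
    "type": "notification_rules"
  }
}
EOF
```

The full request and response models, along with complete per-language client examples, follow below.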
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the notification rule patch request: the rule ID, the rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule patch request. It is required to update the version of the rule when patching it. enabled boolean Field used to enable or disable the rule. name string Name of the notification rule. selectors object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. 
Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-404-v2) * [422](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-422-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#PatchVulnerabilityNotificationRule-429-v2) Notification rule successfully patched. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response object which includes a notification rule. Field Type Description data object Notification rules allow full control over notifications generated by the various Datadog security products. They allow users to define the conditions under which a notification should be generated (based on rule severities, rule types, rule tags, and so on), and the targets to notify. A notification rule is composed of a rule ID, a rule type, and the rule attributes. All fields are required. attributes [_required_] object Attributes of the notification rule. created_at [_required_] int64 Date as Unix timestamp in milliseconds. created_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. enabled [_required_] boolean Field used to enable or disable the rule. modified_at [_required_] int64 Date as Unix timestamp in milliseconds. modified_by [_required_] object User creating or modifying a rule. handle string The user handle. name string The user name. name [_required_] string Name of the notification rule. selectors [_required_] object Selectors are used to filter security issues for which notifications should be generated. Users can specify rule severities, rule types, a query to filter security issues on tags and attributes, and the trigger source. Only the trigger_source field is required. query string The query is composed of one or several key:value pairs, which can be used to filter security issues on tags and attributes. rule_types [string] Security rule types used as filters in security rules. severities [string] The security rules severities to consider. trigger_source [_required_] enum The type of security issues on which the rule applies. Notification rules based on security signals need to use the trigger source "security_signals", while notification rules based on security vulnerabilities need to use the trigger source "security_findings". Allowed enum values: `security_findings,security_signals` targets [_required_] [string] List of recipients to notify when a notification rule is triggered. Many different target types are supported, such as email addresses, Slack channels, and PagerDuty services. 
The appropriate integrations need to be properly configured to send notifications to the specified targets. time_aggregation int64 Time aggregation period (in seconds) is used to aggregate the results of the notification rule evaluation. Results are aggregated over a selected time frame using a rolling window, which updates with each new evaluation. Notifications are only sent for new issues discovered during the window. Time aggregation is only available for vulnerability-based notification rules. When omitted or set to 0, no aggregation is done. version [_required_] int64 Version of the notification rule. It is updated when the rule is modified. id [_required_] string The ID of a notification rule. type [_required_] enum The rule type associated to notification rules. Allowed enum values: `notification_rules` ``` { "data": { "attributes": { "created_at": 1722439510282, "created_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "enabled": true, "modified_at": 1722439510282, "modified_by": { "handle": "john.doe@domain.com", "name": "John Doe" }, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy The server cannot process the request because it contains invalid data. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response Copy ``` # Path parameters export id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/${id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Rule 1", "selectors": { "query": "(source:production_service OR env:prod)", "rule_types": [ "misconfiguration", "attack_path" ], "severities": [ "critical" ], "trigger_source": "security_findings" }, "targets": [ "@john.doe@email.com" ], "time_aggregation": 86400, "version": 1 }, "id": "aaa-bbb-ccc", "type": "notification_rules" } } EOF ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` // Patch a vulnerability-based notification rule returns "Notification rule successfully patched." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_vulnerability_notification_rule" in the system ValidVulnerabilityNotificationRuleDataID := os.Getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID") body := datadogV2.PatchNotificationRuleParameters{ Data: &datadogV2.PatchNotificationRuleParametersData{ Attributes: datadogV2.PatchNotificationRuleParametersDataAttributes{ Enabled: datadog.PtrBool(true), Name: datadog.PtrString("Rule 1"), Selectors: &datadogV2.Selectors{ Query: datadog.PtrString("(source:production_service OR env:prod)"), RuleTypes: []datadogV2.RuleTypesItems{ datadogV2.RULETYPESITEMS_MISCONFIGURATION, datadogV2.RULETYPESITEMS_ATTACK_PATH, }, Severities: []datadogV2.RuleSeverity{ datadogV2.RULESEVERITY_CRITICAL, }, TriggerSource: datadogV2.TRIGGERSOURCE_SECURITY_FINDINGS, }, Targets: []string{ "@john.doe@email.com", }, TimeAggregation: datadog.PtrInt64(86400), Version: datadog.PtrInt64(1), }, Id: ValidVulnerabilityNotificationRuleDataID, Type: datadogV2.NOTIFICATIONRULESTYPE_NOTIFICATION_RULES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.PatchVulnerabilityNotificationRule(ctx, ValidVulnerabilityNotificationRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.PatchVulnerabilityNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.PatchVulnerabilityNotificationRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` // Patch a vulnerability-based notification rule returns "Notification rule successfully patched." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.NotificationRuleResponse; import com.datadog.api.client.v2.model.NotificationRulesType; import com.datadog.api.client.v2.model.PatchNotificationRuleParameters; import com.datadog.api.client.v2.model.PatchNotificationRuleParametersData; import com.datadog.api.client.v2.model.PatchNotificationRuleParametersDataAttributes; import com.datadog.api.client.v2.model.RuleSeverity; import com.datadog.api.client.v2.model.RuleTypesItems; import com.datadog.api.client.v2.model.Selectors; import com.datadog.api.client.v2.model.TriggerSource; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_vulnerability_notification_rule" in the system String VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"); PatchNotificationRuleParameters body = new PatchNotificationRuleParameters() .data( new PatchNotificationRuleParametersData() .attributes( new PatchNotificationRuleParametersDataAttributes() .enabled(true) .name("Rule 1") .selectors( new Selectors() .query("(source:production_service OR env:prod)") .ruleTypes( Arrays.asList( RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH)) .severities(Collections.singletonList(RuleSeverity.CRITICAL)) .triggerSource(TriggerSource.SECURITY_FINDINGS)) .targets(Collections.singletonList("@john.doe@email.com")) .timeAggregation(86400L) .version(1L)) .id(VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID) .type(NotificationRulesType.NOTIFICATION_RULES)); try { NotificationRuleResponse result = apiInstance.patchVulnerabilityNotificationRule( VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#patchVulnerabilityNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` """ Patch a vulnerability-based notification rule returns "Notification rule successfully patched." 
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.notification_rules_type import NotificationRulesType from datadog_api_client.v2.model.patch_notification_rule_parameters import PatchNotificationRuleParameters from datadog_api_client.v2.model.patch_notification_rule_parameters_data import PatchNotificationRuleParametersData from datadog_api_client.v2.model.patch_notification_rule_parameters_data_attributes import ( PatchNotificationRuleParametersDataAttributes, ) from datadog_api_client.v2.model.rule_severity import RuleSeverity from datadog_api_client.v2.model.rule_types_items import RuleTypesItems from datadog_api_client.v2.model.selectors import Selectors from datadog_api_client.v2.model.trigger_source import TriggerSource # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = environ["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] body = PatchNotificationRuleParameters( data=PatchNotificationRuleParametersData( attributes=PatchNotificationRuleParametersDataAttributes( enabled=True, name="Rule 1", selectors=Selectors( query="(source:production_service OR env:prod)", rule_types=[ RuleTypesItems.MISCONFIGURATION, RuleTypesItems.ATTACK_PATH, ], severities=[ RuleSeverity.CRITICAL, ], trigger_source=TriggerSource.SECURITY_FINDINGS, ), targets=[ "@john.doe@email.com", ], time_aggregation=86400, version=1, ), id=VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, type=NotificationRulesType.NOTIFICATION_RULES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.patch_vulnerability_notification_rule( id=VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` # Patch a vulnerability-based notification rule returns "Notification rule successfully patched." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = ENV["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] body = DatadogAPIClient::V2::PatchNotificationRuleParameters.new({ data: DatadogAPIClient::V2::PatchNotificationRuleParametersData.new({ attributes: DatadogAPIClient::V2::PatchNotificationRuleParametersDataAttributes.new({ enabled: true, name: "Rule 1", selectors: DatadogAPIClient::V2::Selectors.new({ query: "(source:production_service OR env:prod)", rule_types: [ DatadogAPIClient::V2::RuleTypesItems::MISCONFIGURATION, DatadogAPIClient::V2::RuleTypesItems::ATTACK_PATH, ], severities: [ DatadogAPIClient::V2::RuleSeverity::CRITICAL, ], trigger_source: DatadogAPIClient::V2::TriggerSource::SECURITY_FINDINGS, }), targets: [ "@john.doe@email.com", ], time_aggregation: 86400, version: 1, }), id: VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, type: DatadogAPIClient::V2::NotificationRulesType::NOTIFICATION_RULES, }), }) p api_instance.patch_vulnerability_notification_rule(VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` // Patch a vulnerability-based notification rule returns "Notification rule // successfully patched." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::NotificationRulesType; use datadog_api_client::datadogV2::model::PatchNotificationRuleParameters; use datadog_api_client::datadogV2::model::PatchNotificationRuleParametersData; use datadog_api_client::datadogV2::model::PatchNotificationRuleParametersDataAttributes; use datadog_api_client::datadogV2::model::RuleSeverity; use datadog_api_client::datadogV2::model::RuleTypesItems; use datadog_api_client::datadogV2::model::Selectors; use datadog_api_client::datadogV2::model::TriggerSource; #[tokio::main] async fn main() { // there is a valid "valid_vulnerability_notification_rule" in the system let valid_vulnerability_notification_rule_data_id = std::env::var("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID").unwrap(); let body = PatchNotificationRuleParameters::new().data(PatchNotificationRuleParametersData::new( PatchNotificationRuleParametersDataAttributes::new() .enabled(true) .name("Rule 1".to_string()) .selectors( Selectors::new(TriggerSource::SECURITY_FINDINGS) .query("(source:production_service OR env:prod)".to_string()) .rule_types(vec![ RuleTypesItems::MISCONFIGURATION, RuleTypesItems::ATTACK_PATH, ]) .severities(vec![RuleSeverity::CRITICAL]), ) .targets(vec!["@john.doe@email.com".to_string()]) .time_aggregation(86400) .version(1), valid_vulnerability_notification_rule_data_id.clone(), NotificationRulesType::NOTIFICATION_RULES, )); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .patch_vulnerability_notification_rule( valid_vulnerability_notification_rule_data_id.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response ``` /** * Patch a vulnerability-based notification rule returns "Notification rule successfully patched." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_vulnerability_notification_rule" in the system const VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = process.env .VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiPatchVulnerabilityNotificationRuleRequest = { body: { data: { attributes: { enabled: true, name: "Rule 1", selectors: { query: "(source:production_service OR env:prod)", ruleTypes: ["misconfiguration", "attack_path"], severities: ["critical"], triggerSource: "security_findings", }, targets: ["@john.doe@email.com"], timeAggregation: 86400, version: 1, }, id: VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, type: "notification_rules", }, }, id: VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, }; apiInstance .patchVulnerabilityNotificationRule(params) .then((data: v2.NotificationRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a signal-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-signal-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-signal-based-notification-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/signals/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/signals/notification_rules/{id}https://api.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/signals/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/signals/notification_rules/{id} ### Overview Delete a notification rule for security signals. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSignalNotificationRule-204-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSignalNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSignalNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteSignalNotificationRule-429-v2) Rule successfully deleted. Forbidden * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Delete a signal-based notification rule

```
# Path parameters
export id="CHANGE_ME"
# Curl command (use the endpoint for your site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/security/signals/notification_rules/${id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a signal-based notification rule

```
"""
Delete a signal-based notification rule returns "Rule successfully deleted." response
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

# there is a valid "valid_signal_notification_rule" in the system
VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = environ["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    api_instance.delete_signal_notification_rule(
        id=VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete a signal-based notification rule

```
# Delete a signal-based notification rule returns "Rule successfully deleted." response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new

# there is a valid "valid_signal_notification_rule" in the system
VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = ENV["VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"]

api_instance.delete_signal_notification_rule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete a signal-based notification rule

```
// Delete a signal-based notification rule returns "Rule successfully deleted."
response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_signal_notification_rule" in the system ValidSignalNotificationRuleDataID := os.Getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DeleteSignalNotificationRule(ctx, ValidSignalNotificationRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteSignalNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a signal-based notification rule ``` // Delete a signal-based notification rule returns "Rule successfully deleted." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_signal_notification_rule" in the system String VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID"); try { apiInstance.deleteSignalNotificationRule(VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#deleteSignalNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a signal-based notification rule ``` // Delete a signal-based notification rule returns "Rule successfully deleted." 
// response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "valid_signal_notification_rule" in the system let valid_signal_notification_rule_data_id = std::env::var("VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_signal_notification_rule(valid_signal_notification_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a signal-based notification rule ``` /** * Delete a signal-based notification rule returns "Rule successfully deleted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_signal_notification_rule" in the system const VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID = process.env .VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiDeleteSignalNotificationRuleRequest = { id: VALID_SIGNAL_NOTIFICATION_RULE_DATA_ID, }; apiInstance .deleteSignalNotificationRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a vulnerability-based notification rule](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-vulnerability-based-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-a-vulnerability-based-notification-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ap2.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.eu/api/v2/security/vulnerabilities/notification_rules/{id}https://api.ddog-gov.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us3.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id}https://api.us5.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/{id} ### Overview Delete a notification rule for security vulnerabilities. This endpoint requires the `security_monitoring_notification_profiles_write` permission. ### Arguments #### Path Parameters Name Type Description id [_required_] string ID of the notification rule. 
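As a practical illustration of the `id` path parameter and the error statuses documented below, here is a minimal, unofficial sketch (not one of the generated client examples) that deletes a rule with the Python client; it assumes the typed exceptions `ForbiddenException` and `NotFoundException` exposed by `datadog_api_client.exceptions`.

```
"""
Sketch: delete a vulnerability-based notification rule and report the
documented error statuses. Assumes ForbiddenException / NotFoundException
are available in datadog_api_client.exceptions.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ForbiddenException, NotFoundException
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

RULE_ID = environ["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    try:
        # A successful call returns no body (204 Rule successfully deleted).
        api_instance.delete_vulnerability_notification_rule(id=RULE_ID)
        print(f"Deleted notification rule {RULE_ID}")
    except ForbiddenException:
        # 403: requires the security_monitoring_notification_profiles_write permission.
        print("Forbidden: check the application key's permissions")
    except NotFoundException:
        # 404: no vulnerability notification rule with this ID.
        print(f"Notification rule {RULE_ID} not found")
```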
### Response

* [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteVulnerabilityNotificationRule-204-v2)
* [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteVulnerabilityNotificationRule-403-v2)
* [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteVulnerabilityNotificationRule-404-v2)
* [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteVulnerabilityNotificationRule-429-v2)

Rule successfully deleted.

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Delete a vulnerability-based notification rule

```
# Path parameters
export id="CHANGE_ME"
# Curl command (use the endpoint for your site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/security/vulnerabilities/notification_rules/${id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a vulnerability-based notification rule

```
""" Delete a vulnerability-based notification rule returns "Rule successfully deleted."
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = environ["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.delete_vulnerability_notification_rule( id=VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a vulnerability-based notification rule ``` # Delete a vulnerability-based notification rule returns "Rule successfully deleted." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new # there is a valid "valid_vulnerability_notification_rule" in the system VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = ENV["VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"] api_instance.delete_vulnerability_notification_rule(VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a vulnerability-based notification rule ``` // Delete a vulnerability-based notification rule returns "Rule successfully deleted." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "valid_vulnerability_notification_rule" in the system ValidVulnerabilityNotificationRuleDataID := os.Getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DeleteVulnerabilityNotificationRule(ctx, ValidVulnerabilityNotificationRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteVulnerabilityNotificationRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a vulnerability-based notification rule ``` // Delete a vulnerability-based notification rule returns "Rule successfully deleted." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "valid_vulnerability_notification_rule" in the system String VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = System.getenv("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID"); try { apiInstance.deleteVulnerabilityNotificationRule( VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#deleteVulnerabilityNotificationRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a vulnerability-based notification rule ``` // Delete a vulnerability-based notification rule returns "Rule successfully // deleted." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "valid_vulnerability_notification_rule" in the system let valid_vulnerability_notification_rule_data_id = std::env::var("VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .delete_vulnerability_notification_rule( valid_vulnerability_notification_rule_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a vulnerability-based notification rule ``` /** * Delete a vulnerability-based notification rule returns "Rule successfully deleted." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "valid_vulnerability_notification_rule" in the system const VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID = process.env .VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID as string; const params: v2.SecurityMonitoringApiDeleteVulnerabilityNotificationRuleRequest = { id: VALID_VULNERABILITY_NOTIFICATION_RULE_DATA_ID, }; apiInstance .deleteVulnerabilityNotificationRule(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a job's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-details-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.datadoghq.eu/api/v2/siem-threat-hunting/jobs/{job_id}https://api.ddog-gov.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id} ### Overview Get a job’s details. This endpoint requires the `security_monitoring_rules_read` permission. OAuth apps require the `security_monitoring_rules_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description job_id [_required_] string The ID of the job. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetThreatHuntingJob-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetThreatHuntingJob-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetThreatHuntingJob-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetThreatHuntingJob-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetThreatHuntingJob-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Threat hunting job response. Field Type Description data object Threat hunting job response data. attributes object Threat hunting job attributes. createdAt string Time when the job was created. createdByHandle string The handle of the user who created the job. createdByName string The name of the user who created the job. createdFromRuleId string ID of the rule used to create the job (if it is created from a rule). jobDefinition object Definition of a threat hunting job. calculatedFields [object] Calculated fields. expression [_required_] string Expression. name [_required_] string Field name. cases [_required_] [object] Cases used for generating job results. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. 
Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. name string Name of the case. notifications [string] Notification targets. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` from [_required_] int64 Starting time of data analyzed by the job. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. index [_required_] string Index used to load the data. message [_required_] string Message for generated results. name [_required_] string Job name. options object Job options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. 
If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [_required_] [object] Queries for selecting logs analyzed by the job. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the query. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables used in the queries. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. 
tableName string The name of the reference table. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating results from third-party detection method. Only available for third-party detection method. name string Name of the case. notifications [string] Notification targets for each case. query string A query to map a third party event to this case. status [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` to [_required_] int64 Ending time of data analyzed by the job. type string Job type. jobName string Job name. jobStatus string Job status. modifiedAt string Last modification time of the job. signalOutput boolean Whether the job outputs signals. id string ID of the job. type enum Type of payload. Allowed enum values: `historicalDetectionsJob` ``` { "data": { "attributes": { "createdAt": "string", "createdByHandle": "string", "createdByName": "string", "createdFromRuleId": "string", "jobDefinition": { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "name": "string", "notifications": [], "status": "critical" } ], "from": 1729843470000, "groupSignalsBy": [ "service" ], "index": "cloud_siem", "message": "A large number of failed login attempts.", "name": "Excessive number of failed attempts.", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "detectionMethod": "string", "evaluationWindow": "integer", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "tags": [], "thirdPartyCases": [ { "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "to": 1729847070000, "type": "string" }, "jobName": "string", "jobStatus": "string", "modifiedAt": "string", "signalOutput": false }, "id": "string", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Authorized

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/security-monitoring/)
* [Example](https://docs.datadoghq.com/api/latest/security-monitoring/)

API error response.

Field Type Description
errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Get a job's details

```
# Path parameters
export job_id="CHANGE_ME"
# Curl command (use the endpoint for your site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/${job_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a job's details

```
"""
Get a job's details returns "OK" response
"""

from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

# there is a valid "threat_hunting_job" in the system
THREAT_HUNTING_JOB_DATA_ID = environ["THREAT_HUNTING_JOB_DATA_ID"]

configuration = Configuration()
configuration.unstable_operations["get_threat_hunting_job"] = True
configuration.unstable_operations["run_threat_hunting_job"] = True
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    response = api_instance.get_threat_hunting_job(
        job_id=THREAT_HUNTING_JOB_DATA_ID,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a job's details

```
# Get a job's details returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.get_threat_hunting_job".to_sym] = true
  config.unstable_operations["v2.run_threat_hunting_job".to_sym] = true
end

api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new

# there is a valid "threat_hunting_job" in the system
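# NOTE: get_threat_hunting_job is a beta (unstable) operation; it must be
# enabled through config.unstable_operations (as done above), otherwise the
# client refuses to run the call.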
THREAT_HUNTING_JOB_DATA_ID = ENV["THREAT_HUNTING_JOB_DATA_ID"] p api_instance.get_threat_hunting_job(THREAT_HUNTING_JOB_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a job's details ``` // Get a job's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "threat_hunting_job" in the system ThreatHuntingJobDataID := os.Getenv("THREAT_HUNTING_JOB_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetThreatHuntingJob", true) configuration.SetUnstableOperationEnabled("v2.RunThreatHuntingJob", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetThreatHuntingJob(ctx, ThreatHuntingJobDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetThreatHuntingJob`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetThreatHuntingJob`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a job's details ``` // Get a job's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ThreatHuntingJobResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getThreatHuntingJob", true); defaultClient.setUnstableOperationEnabled("v2.runThreatHuntingJob", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); // there is a valid "threat_hunting_job" in the system String THREAT_HUNTING_JOB_DATA_ID = System.getenv("THREAT_HUNTING_JOB_DATA_ID"); try { ThreatHuntingJobResponse result = apiInstance.getThreatHuntingJob(THREAT_HUNTING_JOB_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getThreatHuntingJob"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a job's details ``` // Get a job's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { // there is a valid "threat_hunting_job" in the system let threat_hunting_job_data_id = std::env::var("THREAT_HUNTING_JOB_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetThreatHuntingJob", true); configuration.set_unstable_operation_enabled("v2.RunThreatHuntingJob", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_threat_hunting_job(threat_hunting_job_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a job's details ``` /** * Get a job's details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getThreatHuntingJob"] = true; configuration.unstableOperations["v2.runThreatHuntingJob"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); // there is a valid "threat_hunting_job" in the system const THREAT_HUNTING_JOB_DATA_ID = process.env .THREAT_HUNTING_JOB_DATA_ID as string; const params: v2.SecurityMonitoringApiGetThreatHuntingJobRequest = { jobId: THREAT_HUNTING_JOB_DATA_ID, }; apiInstance .getThreatHuntingJob(params) .then((data: v2.ThreatHuntingJobResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Cancel a threat hunting job](https://docs.datadoghq.com/api/latest/security-monitoring/#cancel-a-threat-hunting-job) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#cancel-a-threat-hunting-job-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. PATCH https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.datadoghq.eu/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.ddog-gov.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancelhttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/cancel ### Overview Cancel a threat hunting job. This endpoint requires the `security_monitoring_rules_write` permission. 
OAuth apps require the `security_monitoring_rules_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description job_id [_required_] string The ID of the job. ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-404-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CancelThreatHuntingJob-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript)

##### Cancel a threat hunting job

```
# Path parameters
export job_id="CHANGE_ME"
# Curl command (use the endpoint for your site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/${job_id}/cancel" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Cancel a threat hunting job

```
"""
Cancel a threat hunting job returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

configuration = Configuration()
configuration.unstable_operations["cancel_threat_hunting_job"] = True
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    api_instance.cancel_threat_hunting_job(
        job_id="job_id",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Cancel a threat hunting job

```
# Cancel a threat hunting job returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.cancel_threat_hunting_job".to_sym] = true
end

api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new
api_instance.cancel_threat_hunting_job("job_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Cancel a threat hunting job

```
// Cancel a threat hunting job returns "OK" response

package main

import (
    "context"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    configuration.SetUnstableOperationEnabled("v2.CancelThreatHuntingJob", true)
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewSecurityMonitoringApi(apiClient)
    r, err := api.CancelThreatHuntingJob(ctx, "job_id")

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CancelThreatHuntingJob`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }
}
```

#### Instructions

First [install the library and its
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Cancel a threat hunting job
```
// Cancel a threat hunting job returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.SecurityMonitoringApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.cancelThreatHuntingJob", true);
    SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient);

    try {
      apiInstance.cancelThreatHuntingJob("job_id");
    } catch (ApiException e) {
      System.err.println("Exception when calling SecurityMonitoringApi#cancelThreatHuntingJob");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
##### Cancel a threat hunting job
```
// Cancel a threat hunting job returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI;

#[tokio::main]
async fn main() {
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.CancelThreatHuntingJob", true);
    let api = SecurityMonitoringAPI::with_config(configuration);
    let resp = api.cancel_threat_hunting_job("job_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Cancel a threat hunting job
```
/**
 * Cancel a threat hunting job returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.cancelThreatHuntingJob"] = true;
const apiInstance = new v2.SecurityMonitoringApi(configuration);

const params: v2.SecurityMonitoringApiCancelThreatHuntingJobRequest = {
  jobId: "job_id",
};

apiInstance
  .cancelThreatHuntingJob(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing job](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-job) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#delete-an-existing-job-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. DELETE https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.datadoghq.eu/api/v2/siem-threat-hunting/jobs/{job_id}https://api.ddog-gov.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id} ### Overview Delete an existing job. ### Arguments #### Path Parameters Name Type Description job_id [_required_] string The ID of the job. ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-404-v2) * [409](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-409-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DeleteThreatHuntingJob-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Delete an existing job
```
# Path parameters
export job_id="CHANGE_ME"
# Curl command (datadoghq.com site shown; substitute your own site)
curl -X DELETE "https://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/${job_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
##### Delete an existing job
```
"""
Delete an existing job returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi

configuration = Configuration()
configuration.unstable_operations["delete_threat_hunting_job"] = True
with ApiClient(configuration) as api_client:
    api_instance = SecurityMonitoringApi(api_client)
    api_instance.delete_threat_hunting_job(
        job_id="job_id",
    )
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Delete an existing job
```
# Delete an existing job returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.delete_threat_hunting_job".to_sym] = true
end

api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new
api_instance.delete_threat_hunting_job("job_id")
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Delete an existing job
```
// Delete an existing job returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.DeleteThreatHuntingJob", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewSecurityMonitoringApi(apiClient)
	r, err := api.DeleteThreatHuntingJob(ctx, "job_id")
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DeleteThreatHuntingJob`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```
##### Delete an existing job
```
// Delete an existing job returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.SecurityMonitoringApi;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    defaultClient.setUnstableOperationEnabled("v2.deleteThreatHuntingJob", true);
    SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient);

    try {
      apiInstance.deleteThreatHuntingJob("job_id");
    } catch (ApiException e) {
      System.err.println("Exception when calling SecurityMonitoringApi#deleteThreatHuntingJob");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```
##### Delete an existing job
```
// Delete an existing job returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI;

#[tokio::main]
async fn main() {
    let mut configuration = datadog::Configuration::new();
    configuration.set_unstable_operation_enabled("v2.DeleteThreatHuntingJob", true);
    let api = SecurityMonitoringAPI::with_config(configuration);
    let resp = api.delete_threat_hunting_job("job_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Delete an existing job
```
/**
 * Delete an existing job returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.deleteThreatHuntingJob"] = true;
const apiInstance = new v2.SecurityMonitoringApi(configuration);

const params: v2.SecurityMonitoringApiDeleteThreatHuntingJobRequest = {
  jobId: "job_id",
};

apiInstance
  .deleteThreatHuntingJob(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Convert a job result to a signal](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-job-result-to-a-signal) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#convert-a-job-result-to-a-signal-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. POST https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.datadoghq.eu/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.ddog-gov.com/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_converthttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_convert ### Overview Convert a job result to a signal. This endpoint requires the `security_monitoring_signals_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data for converting threat hunting job results to signals. attributes object Attributes for converting threat hunting job results to signals. id string Request ID. jobResultIds [_required_] [string] Job result IDs. notifications [_required_] [string] Notifications sent. signalMessage [_required_] string Message of generated signals. signalSeverity [_required_] enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum Type of payload. Allowed enum values: `historicalDetectionsJobResultSignalConversion` ``` { "data": { "attributes": { "id": "string", "jobResultIds": [ "" ], "notifications": [ "" ], "signalMessage": "A large number of failed login attempts.", "signalSeverity": "critical" }, "type": "string" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-400-v2) * [401](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-401-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ConvertJobResultToSignal-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Concurrent Modification * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Convert a job result to a signal Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/signal_convert" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "jobResultIds": [ "" ], "notifications": [ "" ], "signalMessage": "A large number of failed login attempts.", "signalSeverity": "critical" } } } EOF ``` ##### Convert a job result to a signal ``` """ Convert a job result to a signal returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.convert_job_results_to_signals_attributes import ConvertJobResultsToSignalsAttributes from datadog_api_client.v2.model.convert_job_results_to_signals_data import ConvertJobResultsToSignalsData from datadog_api_client.v2.model.convert_job_results_to_signals_data_type import ConvertJobResultsToSignalsDataType from datadog_api_client.v2.model.convert_job_results_to_signals_request import ConvertJobResultsToSignalsRequest from datadog_api_client.v2.model.security_monitoring_rule_severity import SecurityMonitoringRuleSeverity body = ConvertJobResultsToSignalsRequest( data=ConvertJobResultsToSignalsData( attributes=ConvertJobResultsToSignalsAttributes( job_result_ids=[ "", ], notifications=[ "", ], signal_message="A large number of failed login attempts.", 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
##### Convert a job result to a signal
```
# Convert a job result to a signal returns "OK" response

require "datadog_api_client"

DatadogAPIClient.configure do |config|
  config.unstable_operations["v2.convert_job_result_to_signal".to_sym] = true
end

api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new

body = DatadogAPIClient::V2::ConvertJobResultsToSignalsRequest.new({
  data: DatadogAPIClient::V2::ConvertJobResultsToSignalsData.new({
    attributes: DatadogAPIClient::V2::ConvertJobResultsToSignalsAttributes.new({
      job_result_ids: [
        "",
      ],
      notifications: [
        "",
      ],
      signal_message: "A large number of failed login attempts.",
      signal_severity: DatadogAPIClient::V2::SecurityMonitoringRuleSeverity::CRITICAL,
    }),
    type: DatadogAPIClient::V2::ConvertJobResultsToSignalsDataType::HISTORICALDETECTIONSJOBRESULTSIGNALCONVERSION,
  }),
})
api_instance.convert_job_result_to_signal(body)
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```
##### Convert a job result to a signal
```
// Convert a job result to a signal returns "OK" response

package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.ConvertJobResultsToSignalsRequest{
		Data: &datadogV2.ConvertJobResultsToSignalsData{
			Attributes: &datadogV2.ConvertJobResultsToSignalsAttributes{
				JobResultIds: []string{
					"",
				},
				Notifications: []string{
					"",
				},
				SignalMessage:  "A large number of failed login attempts.",
				SignalSeverity: datadogV2.SECURITYMONITORINGRULESEVERITY_CRITICAL,
			},
			Type: datadogV2.CONVERTJOBRESULTSTOSIGNALSDATATYPE_HISTORICALDETECTIONSJOBRESULTSIGNALCONVERSION.Ptr(),
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.ConvertJobResultToSignal", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewSecurityMonitoringApi(apiClient)
	r, err := api.ConvertJobResultToSignal(ctx, body)
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ConvertJobResultToSignal`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Convert a job result to a signal ``` // Convert a job result to a signal returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ConvertJobResultsToSignalsAttributes; import com.datadog.api.client.v2.model.ConvertJobResultsToSignalsData; import com.datadog.api.client.v2.model.ConvertJobResultsToSignalsDataType; import com.datadog.api.client.v2.model.ConvertJobResultsToSignalsRequest; import com.datadog.api.client.v2.model.SecurityMonitoringRuleSeverity; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.convertJobResultToSignal", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); ConvertJobResultsToSignalsRequest body = new ConvertJobResultsToSignalsRequest() .data( new ConvertJobResultsToSignalsData() .attributes( new ConvertJobResultsToSignalsAttributes() .jobResultIds(Collections.singletonList("")) .notifications(Collections.singletonList("")) .signalMessage("A large number of failed login attempts.") .signalSeverity(SecurityMonitoringRuleSeverity.CRITICAL)) .type( ConvertJobResultsToSignalsDataType .HISTORICALDETECTIONSJOBRESULTSIGNALCONVERSION)); try { apiInstance.convertJobResultToSignal(body); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#convertJobResultToSignal"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Convert a job result to a signal ``` // Convert a job result to a signal returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::ConvertJobResultsToSignalsAttributes; use datadog_api_client::datadogV2::model::ConvertJobResultsToSignalsData; use datadog_api_client::datadogV2::model::ConvertJobResultsToSignalsDataType; use datadog_api_client::datadogV2::model::ConvertJobResultsToSignalsRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringRuleSeverity; #[tokio::main] async fn main() { let body = ConvertJobResultsToSignalsRequest::new().data( ConvertJobResultsToSignalsData::new() .attributes(ConvertJobResultsToSignalsAttributes::new( vec!["".to_string()], vec!["".to_string()], "A large number of failed login attempts.".to_string(), SecurityMonitoringRuleSeverity::CRITICAL, )) .type_( ConvertJobResultsToSignalsDataType::HISTORICALDETECTIONSJOBRESULTSIGNALCONVERSION, ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ConvertJobResultToSignal", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = 
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Convert a job result to a signal
```
/**
 * Convert a job result to a signal returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.convertJobResultToSignal"] = true;
const apiInstance = new v2.SecurityMonitoringApi(configuration);

const params: v2.SecurityMonitoringApiConvertJobResultToSignalRequest = {
  body: {
    data: {
      attributes: {
        jobResultIds: [""],
        notifications: [""],
        signalMessage: "A large number of failed login attempts.",
        signalSeverity: "critical",
      },
      type: "historicalDetectionsJobResultSignalConversion",
    },
  },
};

apiInstance
  .convertJobResultToSignal(params)
  .then((data: any) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
* * * ## [Get a rule's version history](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-version-history) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-rules-version-history-v2) **Note**: This endpoint is in beta and may be subject to changes. GET https://api.ap1.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.ap2.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.datadoghq.eu/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.ddog-gov.com/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.us3.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/version_historyhttps://api.us5.datadoghq.com/api/v2/security_monitoring/rules/{rule_id}/version_history ### Overview Get a rule’s version history. This endpoint requires the `security_monitoring_rules_read` permission. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. A minimal example of passing these query strings with curl is shown below.
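The pagination query strings are not shown in the generated code examples further down, so the following is a minimal curl sketch (not one of the official examples; the `datadoghq.com` site and the page values are illustrative assumptions). The brackets in `page[size]` and `page[number]` are percent-encoded so that the shell and curl treat them literally:
```
# Hypothetical example: fetch a page of 10 rule versions.
# page%5Bsize%5D and page%5Bnumber%5D are the URL-encoded forms of
# page[size] and page[number]; the page values here are illustrative only.
export rule_id="CHANGE_ME"
curl -X GET "https://api.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}/version_history?page%5Bsize%5D=10&page%5Bnumber%5D=1" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
The language clients below should accept the same parameters, typically as optional arguments to the version-history call; check the generated client documentation for the exact parameter names.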
### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetRuleVersionHistory-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetRuleVersionHistory-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetRuleVersionHistory-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetRuleVersionHistory-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetRuleVersionHistory-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Response for getting the rule version history. Field Type Description data object Data for the rule version history. attributes object Response object containing the version history of a rule. count int32 The number of rule versions. data object The `RuleVersionHistory` `data`. object A rule version with a list of updates. changes [object] A list of changes. change string The new value of the field. field string The field that was changed. type enum The type of change. Allowed enum values: `create,update,delete` rule Create a new rule. Option 1 object Rule. calculatedFields [object] Calculated fields. Only allowed for scheduled rules - in other words, when schedulingOptions is also defined. expression [_required_] string Expression. name [_required_] string Field name. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` complianceSignalOptions object How to generate compliance signals. Useful for cloud_configuration rules only. defaultActivationStatus boolean The default activation status. defaultGroupByFields [string] The default group by fields. userActivationStatus boolean Whether signals will be sent. userGroupByFields [string] Fields to use to group findings by when sending signals. createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). defaultTags [string] Default Tags for default rules (included in tags) deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. 
This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. groupSignalsBy [string] Additional grouping to perform on top of the existing groups in the query section. Must be a subset of the existing groups. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. 
keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` customQueryExtension string Query extension to append to the logs query. 
dataSource enum Source of events, either logs, audit trail, or Datadog events. Allowed enum values: `logs,audit,app_sec_spans,spans,security_runtime,network,events` default: `logs` distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. hasOptionalGroupByFields boolean When false, events without a group-by value are ignored by the rule. When true, events with missing group-by fields are processed with `N/A`, replacing the missing values. index string **This field is currently unstable and might be removed in a minor version upgrade.** The index to run the query on, if the `dataSource` is `logs`. Only used for scheduled rules - in other words, when the `schedulingOptions` field is present in the rule payload. indexes [string] List of indexes to query when the `dataSource` is `logs`. Only used for scheduled rules, such as when the `schedulingOptions` field is present in the rule payload. metric string **DEPRECATED** : (Deprecated) The target field to aggregate over when using the sum or max aggregations. `metrics` field should be used instead. metrics [string] Group of target fields to aggregate over when using the sum, max, geo data, or new value aggregations. The sum, max, and geo data aggregations only accept one value in this list, whereas the new value aggregation accepts up to five values. name string Name of the query. query string Query to run on logs. referenceTables [object] Reference tables for the rule. checkPresence boolean Whether to include or exclude the matched values. columnName string The name of the column in the reference table. logFieldPath string The field in the log to match against the reference table. ruleQueryName string The name of the query to apply the reference table to. tableName string The name of the reference table. schedulingOptions object Options for scheduled rules. When this field is present, the rule runs based on the schedule. When absent, it runs real-time on ingested logs. rrule string Schedule for the rule queries, written in RRULE syntax. See [RFC](https://icalendar.org/iCalendar-RFC-5545/3-8-5-3-recurrence-rule.html) for syntax reference. start string Start date for the schedule, in ISO 8601 format without timezone. timezone string Time zone of the start date, in the [tz database](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) format. tags [string] Tags for generated signals. thirdPartyCases [object] Cases for generating signals from third-party rules. Only available for third-party rules. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. query string A query to map a third party event to this case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` type enum The rule type. Allowed enum values: `log_detection,infrastructure_configuration,workload_security,cloud_configuration,application_security,api_security` updateAuthorId int64 User ID of the user who updated the rule. updatedAt int64 The date the rule was last updated, in milliseconds. version int64 The version of the rule. Option 2 object Rule. cases [object] Cases for generating signals. actions [object] Action to perform for each rule case. options object Options for the rule action duration int64 Duration of the action in seconds. 0 indicates no expiration. 
flaggedIPType enum Used with the case action of type 'flag_ip'. The value specified in this field is applied as a flag to the IP addresses. Allowed enum values: `SUSPICIOUS,FLAGGED` userBehaviorName string Used with the case action of type 'user_behavior'. The value specified in this field is applied as a risk tag to all users affected by the rule. type enum The action type. Allowed enum values: `block_ip,block_user,user_behavior,flag_ip` condition string A rule case contains logical operations (`>`,`>=`, `&&`, `||`) to determine if a signal should be generated based on the event counts in the previously defined queries. customStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` name string Name of the case. notifications [string] Notification targets for each rule case. status enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` createdAt int64 When the rule was created, timestamp in milliseconds. creationAuthorId int64 User ID of the user who created the rule. customMessage string Custom/Overridden message for generated signals (used in case of Default rule update). customName string Custom/Overridden name of the rule (used in case of Default rule update). deprecationDate int64 When the rule will be deprecated, timestamp in milliseconds. filters [object] Additional queries to filter matched events before they are processed. This field is deprecated for log detection, signal correlation, and workload security rules. action enum The type of filtering action. Allowed enum values: `require,suppress` query string Query for selecting logs to apply the filtering action. hasExtendedTitle boolean Whether the notifications include the triggering group-by values in their title. id string The ID of the rule. isDefault boolean Whether the rule is included by default. isDeleted boolean Whether the rule has been deleted. isEnabled boolean Whether the rule is enabled. message string Message for generated signals. name string The name of the rule. options object Options. anomalyDetectionOptions object Options on anomaly detection method. bucketDuration enum Duration in seconds of the time buckets used to aggregate events matched by the rule. Must be greater than or equal to 300. Allowed enum values: `300,600,900,1800,3600,10800` detectionTolerance enum An optional parameter that sets how permissive anomaly detection is. Higher values require higher deviations before triggering a signal. Allowed enum values: `1,2,3,4,5` learningDuration enum Learning duration in hours. Anomaly detection waits for at least this amount of historical data before it starts evaluating. Allowed enum values: `1,6,12,24,48,168,336` learningPeriodBaseline int64 An optional override baseline to apply while the rule is in the learning period. Must be greater than or equal to 0. complianceRuleOptions object Options for cloud_configuration rules. Fields `resourceType` and `regoRule` are mandatory when managing custom `cloud_configuration` rules. complexRule boolean Whether the rule is a complex one. Must be set to true if `regoRule.resourceTypes` contains more than one item. Defaults to false. regoRule object Rule details. policy [_required_] string The policy written in `rego`, see: resourceTypes [_required_] [string] List of resource types that will be evaluated upon. Must have at least one element. resourceType string Main resource type to be checked by the rule. It should be specified again in `regoRule.resourceTypes`. 
decreaseCriticalityBasedOnEnv boolean If true, signals in non-production environments have a lower severity than what is defined by the rule case, which can reduce signal noise. The severity is decreased by one level: `CRITICAL` in production becomes `HIGH` in non-production, `HIGH` becomes `MEDIUM` and so on. `INFO` remains `INFO`. The decrement is applied when the environment tag of the signal starts with `staging`, `test` or `dev`. detectionMethod enum The detection method. Allowed enum values: `threshold,new_value,anomaly_detection,impossible_travel,hardcoded,third_party,anomaly_threshold,sequence_detection` evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` hardcodedEvaluatorType enum Hardcoded evaluator type. Allowed enum values: `log4shell` impossibleTravelOptions object Options on impossible travel detection method. baselineUserLocations boolean If true, signals are suppressed for the first 24 hours. In that time, Datadog learns the user's regular access locations. This can be helpful to reduce noise and infer VPN usage or credentialed API access. keepAlive enum Once a signal is generated, the signal will remain "open" if a case is matched at least once within this keep alive window. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` maxSignalDuration enum A signal will "close" regardless of the query being matched once the time exceeds the maximum duration. This time is calculated from the first seen timestamp. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` newValueOptions object Options on new value detection method. forgetAfter enum The duration in days after which a learned value is forgotten. Allowed enum values: `1,2,7,14,21,28` instantaneousBaseline boolean When set to true, Datadog uses previous values that fall within the defined learning window to construct the baseline, enabling the system to establish an accurate baseline more rapidly rather than relying solely on gradual learning over time. learningDuration enum The duration in days during which values are learned, and after which signals will be generated for values that weren't learned. If set to 0, a signal will be generated for all new values after the first value is learned. Allowed enum values: `0,1,7` learningMethod enum The learning method used to determine when signals should be generated for values that weren't learned. Allowed enum values: `duration,threshold` default: `duration` learningThreshold enum A number of occurrences after which signals will be generated for values that weren't learned. Allowed enum values: `0,1` sequenceDetectionOptions object Options on sequence detection method. stepTransitions [object] Transitions defining the allowed order of steps and their evaluation windows. child string Name of the child step. evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` parent string Name of the parent step. steps [object] Steps that define the conditions to be matched in sequence. 
condition string Condition referencing rule queries (e.g., `a > 0`). evaluationWindow enum A time window is specified to match when at least one of the cases matches true. This is a sliding window and evaluates in real time. For third party detection method, this field is not used. Allowed enum values: `0,60,300,600,900,1800,3600,7200,10800,21600,43200,86400` name string Unique name identifying the step. thirdPartyRuleOptions object Options on third party detection method. defaultNotifications [string] Notification targets for the logs that do not correspond to any of the cases. defaultStatus enum Severity of the Security Signal. Allowed enum values: `info,low,medium,high,critical` rootQueries [object] Queries to be combined with third party case queries. Each of them can have different group by fields, to aggregate differently based on the type of alert. groupByFields [string] Fields to group by. query string Query to run on logs. signalTitleTemplate string A template for the signal title; if omitted, the title is generated based on the case name. queries [object] Queries for selecting logs which are part of the rule. aggregation enum The aggregation type. Allowed enum values: `count,cardinality,sum,max,new_value,geo_data,event_count,none` correlatedByFields [string] Fields to correlate by. correlatedQueryIndex int32 Index of the rule query used to retrieve the correlated field. defaultRuleId string Default Rule ID to match on signals. distinctFields [string] Field for which the cardinality is measured. Sent as an array. groupByFields [string] Fields to group by. metrics [string] Group of target fields to aggregate over. name string Name of the query. ruleId string Rule ID to match on signals. tags [string] Tags for generated signals. type enum The rule type. Allowed enum values: `signal_correlation` updateAuthorId int64 User ID of the user who updated the rule. version int64 The version of the rule. id string ID of the rule. type enum Type of data. 
Allowed enum values: `GetRuleVersionHistoryResponse` ``` { "data": { "attributes": { "count": "integer", "data": { "": { "changes": [ { "change": "cloud_provider:aws", "field": "Tags", "type": "string" } ], "rule": { "calculatedFields": [ { "expression": "@request_end_timestamp - @request_start_timestamp", "name": "response_time" } ], "cases": [ { "actions": [ { "options": { "duration": 0, "flaggedIPType": "FLAGGED", "userBehaviorName": "string" }, "type": "string" } ], "condition": "string", "customStatus": "critical", "name": "string", "notifications": [], "status": "critical" } ], "complianceSignalOptions": { "defaultActivationStatus": false, "defaultGroupByFields": [], "userActivationStatus": false, "userGroupByFields": [] }, "createdAt": "integer", "creationAuthorId": "integer", "customMessage": "string", "customName": "string", "defaultTags": [ "security:attacks" ], "deprecationDate": "integer", "filters": [ { "action": "string", "query": "string" } ], "groupSignalsBy": [ "service" ], "hasExtendedTitle": false, "id": "string", "isDefault": false, "isDeleted": false, "isEnabled": false, "message": "string", "name": "string", "options": { "anomalyDetectionOptions": { "bucketDuration": 300, "detectionTolerance": 5, "learningDuration": "integer", "learningPeriodBaseline": "integer" }, "complianceRuleOptions": { "complexRule": false, "regoRule": { "policy": "package datadog\n\nimport data.datadog.output as dd_output\nimport future.keywords.contains\nimport future.keywords.if\nimport future.keywords.in\n\neval(resource) = \"skip\" if {\n # Logic that evaluates to true if the resource should be skipped\n true\n} else = \"pass\" {\n # Logic that evaluates to true if the resource is compliant\n true\n} else = \"fail\" {\n # Logic that evaluates to true if the resource is not compliant\n true\n}\n\n# This part remains unchanged for all rules\nresults contains result if {\n some resource in input.resources[input.main_resource_type]\n result := dd_output.format(resource, eval(resource))\n}", "resourceTypes": [ "gcp_iam_service_account", "gcp_iam_policy" ] }, "resourceType": "aws_acm" }, "decreaseCriticalityBasedOnEnv": false, "detectionMethod": "string", "evaluationWindow": "integer", "hardcodedEvaluatorType": "string", "impossibleTravelOptions": { "baselineUserLocations": true }, "keepAlive": "integer", "maxSignalDuration": "integer", "newValueOptions": { "forgetAfter": "integer", "instantaneousBaseline": false, "learningDuration": "integer", "learningMethod": "string", "learningThreshold": "integer" }, "sequenceDetectionOptions": { "stepTransitions": [ { "child": "string", "evaluationWindow": "integer", "parent": "string" } ], "steps": [ { "condition": "string", "evaluationWindow": "integer", "name": "string" } ] }, "thirdPartyRuleOptions": { "defaultNotifications": [], "defaultStatus": "critical", "rootQueries": [ { "groupByFields": [], "query": "source:cloudtrail" } ], "signalTitleTemplate": "string" } }, "queries": [ { "aggregation": "string", "customQueryExtension": "a > 3", "dataSource": "logs", "distinctFields": [], "groupByFields": [], "hasOptionalGroupByFields": false, "index": "string", "indexes": [], "metric": "string", "metrics": [], "name": "string", "query": "a > 3" } ], "referenceTables": [ { "checkPresence": false, "columnName": "string", "logFieldPath": "string", "ruleQueryName": "string", "tableName": "string" } ], "schedulingOptions": { "rrule": "FREQ=HOURLY;INTERVAL=1;", "start": "2025-07-14T12:00:00", "timezone": "America/New_York" }, "tags": [], "thirdPartyCases": [ { 
"customStatus": "critical", "name": "string", "notifications": [], "query": "string", "status": "critical" } ], "type": "string", "updateAuthorId": "integer", "updatedAt": "integer", "version": "integer" } } } }, "id": "string", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a rule's version history Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security_monitoring/rules/${rule_id}/version_history" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a rule's version history ``` """ Get a rule's version history returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["get_rule_version_history"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_rule_version_history( rule_id="rule_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a rule's version history ``` # Get a rule's version history returns "OK" response 
require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_rule_version_history".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_rule_version_history("rule_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a rule's version history ``` // Get a rule's version history returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetRuleVersionHistory", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetRuleVersionHistory(ctx, "rule_id", *datadogV2.NewGetRuleVersionHistoryOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetRuleVersionHistory`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetRuleVersionHistory`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a rule's version history ``` // Get a rule's version history returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.GetRuleVersionHistoryResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getRuleVersionHistory", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { GetRuleVersionHistoryResponse result = apiInstance.getRuleVersionHistory("rule_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getRuleVersionHistory"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a rule's version history ``` // Get a rule's version history returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_security_monitoring::GetRuleVersionHistoryOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetRuleVersionHistory", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_rule_version_history( "rule_id".to_string(), GetRuleVersionHistoryOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a rule's version history ``` /** * Get a rule's version history returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getRuleVersionHistory"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetRuleVersionHistoryRequest = { ruleId: "rule_id", }; apiInstance .getRuleVersionHistory(params) .then((data: v2.GetRuleVersionHistoryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List vulnerabilities](https://docs.datadoghq.com/api/latest/security-monitoring/#list-vulnerabilities) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-vulnerabilities-v2) **Note** : This endpoint is a private preview. If you are interested in accessing this API, [fill out this form](https://forms.gle/kMYC1sDr6WDUBDsx9). GET https://api.ap1.datadoghq.com/api/v2/security/vulnerabilitieshttps://api.ap2.datadoghq.com/api/v2/security/vulnerabilitieshttps://api.datadoghq.eu/api/v2/security/vulnerabilitieshttps://api.ddog-gov.com/api/v2/security/vulnerabilitieshttps://api.datadoghq.com/api/v2/security/vulnerabilitieshttps://api.us3.datadoghq.com/api/v2/security/vulnerabilitieshttps://api.us5.datadoghq.com/api/v2/security/vulnerabilities ### Overview Get a list of vulnerabilities. ### [Pagination](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) Pagination is enabled by default in both `vulnerabilities` and `assets`. The size of the page varies depending on the endpoint and cannot be modified. To automate the request of the next page, you can use the links section in the response. This endpoint will return paginated responses. 
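As a rough sketch of that flow (not an official client example; it assumes the third-party `requests` package, API and application keys in the environment, and the `links` and `meta` fields documented below), automated paging could look like this:

```python
# Minimal paging sketch: keep following `links.next` until it is empty.
# Assumes the `requests` package and DD_API_KEY / DD_APP_KEY in the environment.
import os

import requests

HEADERS = {
    "Accept": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

url = "https://api.datadoghq.com/api/v2/security/vulnerabilities"
params = {"filter[tool]": "Infra"}  # optional filters apply to the first request only

vulnerabilities = []
while url:
    response = requests.get(url, headers=HEADERS, params=params)
    response.raise_for_status()
    body = response.json()
    vulnerabilities.extend(body.get("data", []))
    # `links.next` is a fully qualified URL that already carries page[token]
    # and page[number]; it is empty on the last page, which ends the loop.
    url = body.get("links", {}).get("next")
    params = None

print(f"Fetched {len(vulnerabilities)} vulnerabilities")
```

To fan requests out in parallel instead, you can take `meta.token` from the first response and issue your own requests with explicit `page[token]` and `page[number]` parameters, as described under Subsequent requests below.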
The pages are stored in the links section of the response: ``` { "data": [...], "meta": {...}, "links": { "self": "https://.../api/v2/security/vulnerabilities", "first": "https://.../api/v2/security/vulnerabilities?page[number]=1&page[token]=abc", "last": "https://.../api/v2/security/vulnerabilities?page[number]=43&page[token]=abc", "next": "https://.../api/v2/security/vulnerabilities?page[number]=2&page[token]=abc" } } ``` Copy * `links.previous` is empty if the first page is requested. * `links.next` is empty if the last page is requested. #### [Token](https://docs.datadoghq.com/api/latest/security-monitoring/#token) Vulnerabilities can be created, updated or deleted at any point in time. Upon the first request, a token is created to ensure consistency across subsequent paginated requests. A token is valid only for 24 hours. #### [First request](https://docs.datadoghq.com/api/latest/security-monitoring/#first-request) We consider a request to be the first request when there is no `page[token]` parameter. The response of this first request contains the newly created token in the `links` section. This token can then be used in the subsequent paginated requests. _Note: The first request may take longer to complete than subsequent requests._ #### [Subsequent requests](https://docs.datadoghq.com/api/latest/security-monitoring/#subsequent-requests) Any request containing valid `page[token]` and `page[number]` parameters will be considered a subsequent request. If the `token` is invalid, a `404` response will be returned. If the page `number` is invalid, a `400` response will be returned. The returned `token` is valid for all requests in the pagination sequence. To send paginated requests in parallel, reuse the same `token` and change only the `page[number]` parameter. ### [Filtering](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) The request can include some filter parameters to filter the data to be retrieved. The format of the filter parameters follows the [JSON:API format](https://jsonapi.org/format/#fetching-filtering): `filter[$prop_name]`, where `prop_name` is the property name in the entity being filtered by. All filters can include multiple values, where data will be filtered with an OR clause: `filter[title]=Title1,Title2` will filter all vulnerabilities where title is equal to `Title1` OR `Title2`. String filters are case sensitive. Boolean filters accept `true` or `false` as values. Number filters must include an operator as a second filter input: `filter[$prop_name][$operator]`. For example, for the vulnerabilities endpoint: `filter[cvss.base.score][lte]=8`. Available operators are: `eq` (==), `lt` (<), `lte` (<=), `gt` (>) and `gte` (>=). ### [Metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) Following [JSON:API format](https://jsonapi.org/format/#document-meta), object including non-standard meta-information. This endpoint includes the meta member in the response. For more details on each of the properties included in this section, check the endpoints response tables. ``` { "data": [...], "meta": { "total": 1500, "count": 18732, "token": "some_token" }, "links": {...} } ``` Copy ### [Extensions](https://docs.datadoghq.com/api/latest/security-monitoring/#extensions) Requests may include extensions to modify the behavior of the requested endpoint. 
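As a concrete illustration of the Filtering rules above (hypothetical values), a request combining an OR filter, a numeric operator, and a boolean filter would send query parameters such as the following sketch:

```python
# Hypothetical query parameters combining the documented filter forms.
params = {
    # OR clause: title equal to "Title1" OR "Title2" (string filters are case sensitive)
    "filter[title]": "Title1,Title2",
    # Numeric filter with an operator suffix: CVSS base score <= 8
    "filter[cvss.base.score][lte]": 8,
    # Boolean filter
    "filter[fix_available]": "true",
}
```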
The filter parameters follow the [JSON:API format](https://jsonapi.org/extensions/#extensions) format: `ext:$extension_name`, where `extension_name` is the name of the modifier that is being applied. Extensions can only include one value: `ext:modifier=value`. This endpoint requires the `appsec_vm_read` permission. ### Arguments #### Query Strings Name Type Description page[token] string Its value must come from the `links` section of the response of the first request. Do not manually edit it. page[number] integer The page number to be retrieved. It should be equal or greater than `1` filter[type] enum Filter by vulnerability type. Allowed enum values: `AdminConsoleActive, CodeInjection, CommandInjection, ComponentWithKnownVulnerability, DangerousWorkflows, DefaultAppDeployed, DefaultHtmlEscapeInvalid, DirectoryListingLeak, EmailHtmlInjection, EndOfLife, HardcodedPassword, HardcodedSecret, HeaderInjection, HstsHeaderMissing, InsecureAuthProtocol, InsecureCookie, InsecureJspLayout, LdapInjection, MaliciousPackage, MandatoryRemediation, NoHttpOnlyCookie, NoSameSiteCookie, NoSqlMongoDbInjection, PathTraversal, ReflectionInjection, RiskyLicense, SessionRewriting, SessionTimeout, SqlInjection, Ssrf, StackTraceLeak, TrustBoundaryViolation, Unmaintained, UntrustedDeserialization, UnvalidatedRedirect, VerbTampering, WeakCipher, WeakHash, WeakRandomness, XContentTypeHeaderMissing, XPathInjection, Xss` filter[cvss.base.score][`$op`] number Filter by vulnerability base (i.e. from the original advisory) severity score. filter[cvss.base.severity] enum Filter by vulnerability base severity. Allowed enum values: `Unknown, None, Low, Medium, High, Critical` filter[cvss.base.vector] string Filter by vulnerability base CVSS vector. filter[cvss.datadog.score][`$op`] number Filter by vulnerability Datadog severity score. filter[cvss.datadog.severity] enum Filter by vulnerability Datadog severity. Allowed enum values: `Unknown, None, Low, Medium, High, Critical` filter[cvss.datadog.vector] string Filter by vulnerability Datadog CVSS vector. filter[status] enum Filter by the status of the vulnerability. Allowed enum values: `Open, Muted, Remediated, InProgress, AutoClosed` filter[tool] enum Filter by the tool of the vulnerability. Allowed enum values: `IAST, SCA, Infra, SAST` filter[library.name] string Filter by library name. filter[library.version] string Filter by library version. filter[advisory.id] string Filter by advisory ID. filter[risks.exploitation_probability] boolean Filter by exploitation probability. filter[risks.poc_exploit_available] boolean Filter by POC exploit availability. filter[risks.exploit_available] boolean Filter by public exploit availability. filter[risks.epss.score][`$op`] number Filter by vulnerability [EPSS](https://www.first.org/epss/) severity score. filter[risks.epss.severity] enum Filter by vulnerability [EPSS](https://www.first.org/epss/) severity. Allowed enum values: `Unknown, None, Low, Medium, High, Critical` filter[language] string Filter by language. filter[ecosystem] enum Filter by ecosystem. Allowed enum values: `PyPI, Maven, NuGet, Npm, RubyGems, Go, Packagist, Deb, Rpm, Apk, Windows, Generic, MacOs, Oci, BottleRocket, None` filter[code_location.location] string Filter by vulnerability location. filter[code_location.file_path] string Filter by vulnerability file path. filter[code_location.method] string Filter by method. filter[fix_available] boolean Filter by fix availability. 
filter[repo_digests] string Filter by vulnerability `repo_digest` (when the vulnerability is related to an `Image` asset). filter[origin] string Filter by origin. filter[running_kernel] boolean Filter for whether the vulnerability affects a running kernel (for vulnerabilities related to a `Host` asset). filter[asset.name] string Filter by asset name. This field supports the usage of wildcards (*). filter[asset.type] enum Filter by asset type. Allowed enum values: `Repository, Service, Host, HostImage, Image` filter[asset.version.first] string Filter by the first version of the asset this vulnerability has been detected on. filter[asset.version.last] string Filter by the last version of the asset this vulnerability has been detected on. filter[asset.repository_url] string Filter by the repository URL associated with the asset. filter[asset.risks.in_production] boolean Filter whether the asset is in production or not. filter[asset.risks.under_attack] boolean Filter whether the asset is under attack or not. filter[asset.risks.is_publicly_accessible] boolean Filter whether the asset is publicly accessible or not. filter[asset.risks.has_privileged_access] boolean Filter whether the asset has privileged access or not. filter[asset.risks.has_access_to_sensitive_data] boolean Filter whether the asset has access to sensitive data or not. filter[asset.environments] string Filter by asset environments. filter[asset.teams] string Filter by asset teams. filter[asset.arch] string Filter by asset architecture. filter[asset.operating_system.name] string Filter by asset operating system name. filter[asset.operating_system.version] string Filter by asset operating system version. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerabilities-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerabilities-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerabilities-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerabilities-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerabilities-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing vulnerabilities. Field Type Description data [_required_] [object] List of vulnerabilities. attributes [_required_] object The JSON:API attributes of the vulnerability. advisory object Advisory associated with the vulnerability. id [_required_] string Vulnerability advisory ID. last_modification_date string Vulnerability advisory last modification date. publish_date string Vulnerability advisory publish date. advisory_id string Vulnerability advisory ID. code_location object Code vulnerability location. file_path string Vulnerability location file path. location [_required_] string Vulnerability extracted location. method string Vulnerability location method. cve_list [_required_] [string] Vulnerability CVE list. cvss [_required_] object Vulnerability severities. base [_required_] object Vulnerability severity. score [_required_] double Vulnerability severity score. severity [_required_] enum The vulnerability severity. Allowed enum values: `Unknown,None,Low,Medium,High,Critical` vector [_required_] string Vulnerability CVSS vector. datadog [_required_] object Vulnerability severity. score [_required_] double Vulnerability severity score.
severity [_required_] enum The vulnerability severity. Allowed enum values: `Unknown,None,Low,Medium,High,Critical` vector [_required_] string Vulnerability CVSS vector. dependency_locations object Static library vulnerability location. block [_required_] object Static library vulnerability location. column_end [_required_] int64 Location column end. column_start [_required_] int64 Location column start. file_name [_required_] string Location file name. line_end [_required_] int64 Location line end. line_start [_required_] int64 Location line start. name object Static library vulnerability location. column_end [_required_] int64 Location column end. column_start [_required_] int64 Location column start. file_name [_required_] string Location file name. line_end [_required_] int64 Location line end. line_start [_required_] int64 Location line start. version object Static library vulnerability location. column_end [_required_] int64 Location column end. column_start [_required_] int64 Location column start. file_name [_required_] string Location file name. line_end [_required_] int64 Location line end. line_start [_required_] int64 Location line start. description [_required_] string Vulnerability description. ecosystem enum The related vulnerability asset ecosystem. Allowed enum values: `PyPI,Maven,NuGet,Npm,RubyGems,Go,Packagist,Deb,Rpm,Apk,Windows,Generic,MacOs,Oci,BottleRocket,None` exposure_time [_required_] int64 Vulnerability exposure time in seconds. first_detection [_required_] string First detection of the vulnerability in [RFC 3339](https://datatracker.ietf.org/doc/html/rfc3339) format fix_available [_required_] boolean Whether the vulnerability has a remediation or not. language [_required_] string Vulnerability language. last_detection [_required_] string Last detection of the vulnerability in [RFC 3339](https://datatracker.ietf.org/doc/html/rfc3339) format library object Vulnerability library. additional_names [string] Related library or package names (such as child packages or affected binary paths). name [_required_] string Vulnerability library name. version string Vulnerability library version. origin [_required_] [string] Vulnerability origin. remediations [_required_] [object] List of remediations. auto_solvable [_required_] boolean Whether the vulnerability can be resolved when recompiling the package or not. avoided_advisories [_required_] [object] Avoided advisories. base_severity [_required_] string Advisory base severity. id [_required_] string Advisory id. severity string Advisory Datadog severity. fixed_advisories [_required_] [object] Remediation fixed advisories. base_severity [_required_] string Advisory base severity. id [_required_] string Advisory id. severity string Advisory Datadog severity. library_name [_required_] string Library name remediating the vulnerability. library_version [_required_] string Library version remediating the vulnerability. new_advisories [_required_] [object] New advisories. base_severity [_required_] string Advisory base severity. id [_required_] string Advisory id. severity string Advisory Datadog severity. remaining_advisories [_required_] [object] Remaining advisories. base_severity [_required_] string Advisory base severity. id [_required_] string Advisory id. severity string Advisory Datadog severity. type [_required_] string Remediation type. repo_digests [string] Vulnerability `repo_digest` list (when the vulnerability is related to `Image` asset). risks [_required_] object Vulnerability risks. 
epss object Vulnerability EPSS severity. score [_required_] double Vulnerability EPSS severity score. severity [_required_] enum The vulnerability severity. Allowed enum values: `Unknown,None,Low,Medium,High,Critical` exploit_available [_required_] boolean Vulnerability public exploit availability. exploit_sources [_required_] [string] Vulnerability exploit sources. exploitation_probability [_required_] boolean Vulnerability exploitation probability. poc_exploit_available [_required_] boolean Vulnerability POC exploit availability. running_kernel boolean True if the vulnerability affects a package in the host’s running kernel, false if it affects a non-running kernel, and omit if it is not kernel-related. status [_required_] enum The vulnerability status. Allowed enum values: `Open,Muted,Remediated,InProgress,AutoClosed` title [_required_] string Vulnerability title. tool [_required_] enum The vulnerability tool. Allowed enum values: `IAST,SCA,Infra,SAST` type [_required_] enum The vulnerability type. Allowed enum values: `AdminConsoleActive,CodeInjection,CommandInjection,ComponentWithKnownVulnerability,DangerousWorkflows,DefaultAppDeployed,DefaultHtmlEscapeInvalid,DirectoryListingLeak,EmailHtmlInjection,EndOfLife,HardcodedPassword,HardcodedSecret,HeaderInjection,HstsHeaderMissing,InsecureAuthProtocol,InsecureCookie,InsecureJspLayout,LdapInjection,MaliciousPackage,MandatoryRemediation,NoHttpOnlyCookie,NoSameSiteCookie,NoSqlMongoDbInjection,PathTraversal,ReflectionInjection,RiskyLicense,SessionRewriting,SessionTimeout,SqlInjection,Ssrf,StackTraceLeak,TrustBoundaryViolation,Unmaintained,UntrustedDeserialization,UnvalidatedRedirect,VerbTampering,WeakCipher,WeakHash,WeakRandomness,XContentTypeHeaderMissing,XPathInjection,Xss` id [_required_] string The unique ID for this vulnerability. relationships [_required_] object Related entities object. affects [_required_] object Relationship type. data [_required_] object Asset affected by this vulnerability. id [_required_] string The unique ID for this related asset. type [_required_] enum The JSON:API type. Allowed enum values: `assets` type [_required_] enum The JSON:API type. Allowed enum values: `vulnerabilities` links object The JSON:API links related to pagination. first [_required_] string First page link. last [_required_] string Last page link. next string Next page link. previous string Previous page link. self [_required_] string Request link. meta object The metadata related to this request. count [_required_] int64 Number of entities included in the response. token [_required_] string The token that identifies the request. total [_required_] int64 Total number of entities across all pages. 
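The example payload below shows these fields in context. As a minimal sketch of consuming them (assuming a decoded JSON response body held as a Python dict):

```python
# Sketch: print a one-line summary per vulnerability from a decoded response body.
def summarize(body: dict) -> None:
    for vuln in body.get("data", []):
        attrs = vuln["attributes"]
        affected_asset = vuln["relationships"]["affects"]["data"]["id"]
        print(
            f"{attrs['title']} "
            f"(Datadog severity: {attrs['cvss']['datadog']['severity']}, "
            f"status: {attrs['status']}) affects {affected_asset}"
        )
```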
``` { "data": [ { "attributes": { "advisory": { "id": "TRIVY-CVE-2023-0615", "last_modification_date": "2024-09-19T21:23:08.000Z", "publish_date": "2024-09-19T21:23:08.000Z" }, "advisory_id": "TRIVY-CVE-2023-0615", "code_location": { "file_path": "src/Class.java:100", "location": "com.example.Class:100", "method": "FooBar" }, "cve_list": [ "CVE-2023-0615" ], "cvss": { "base": { "score": 4.5, "severity": "Medium", "vector": "CVSS:3.0/AV:L/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H" }, "datadog": { "score": 4.5, "severity": "Medium", "vector": "CVSS:3.0/AV:L/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H" } }, "dependency_locations": { "block": { "column_end": 140, "column_start": 5, "file_name": "src/go.mod", "line_end": 10, "line_start": 1 }, "name": { "column_end": 140, "column_start": 5, "file_name": "src/go.mod", "line_end": 10, "line_start": 1 }, "version": { "column_end": 140, "column_start": 5, "file_name": "src/go.mod", "line_end": 10, "line_start": 1 } }, "description": "LDAP Injection is a security vulnerability that occurs when untrusted user input is improperly handled and directly incorporated into LDAP queries without appropriate sanitization or validation. This vulnerability enables attackers to manipulate LDAP queries and potentially gain unauthorized access, modify data, or extract sensitive information from the directory server. By exploiting the LDAP injection vulnerability, attackers can execute malicious commands, bypass authentication mechanisms, and perform unauthorized actions within the directory service.", "ecosystem": "string", "exposure_time": 5618604, "first_detection": "2024-09-19T21:23:08.000Z", "fix_available": false, "language": "ubuntu", "last_detection": "2024-09-01T21:23:08.000Z", "library": { "additional_names": [ "linux-tools-common" ], "name": "linux-aws-5.15", "version": "5.15.0" }, "origin": [ "agentless-scanner" ], "remediations": [ { "auto_solvable": false, "avoided_advisories": [ { "base_severity": "Critical", "id": "GHSA-4wrc-f8pq-fpqp", "severity": "Medium" } ], "fixed_advisories": [ { "base_severity": "Critical", "id": "GHSA-4wrc-f8pq-fpqp", "severity": "Medium" } ], "library_name": "stdlib", "library_version": "Upgrade to a version >= 1.20.0", "new_advisories": [ { "base_severity": "Critical", "id": "GHSA-4wrc-f8pq-fpqp", "severity": "Medium" } ], "remaining_advisories": [ { "base_severity": "Critical", "id": "GHSA-4wrc-f8pq-fpqp", "severity": "Medium" } ], "type": "text" } ], "repo_digests": [ "sha256:0ae7da091191787229d321e3638e39c319a97d6e20f927d465b519d699215bf7" ], "risks": { "epss": { "score": 0.2, "severity": "Medium" }, "exploit_available": false, "exploit_sources": [ "NIST" ], "exploitation_probability": false, "poc_exploit_available": false }, "running_kernel": true, "status": "Open", "title": "LDAP Injection", "tool": "SCA", "type": "WeakCipher" }, "id": "3ecdfea798f2ce8f6e964805a344945f", "relationships": { "affects": { "data": { "id": "Repository|github.com/DataDog/datadog-agent.git", "type": "assets" } } }, "type": "vulnerabilities" } ], "links": { "first": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=1\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "last": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=15\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "next": 
"https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=16\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "previous": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=14\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "self": "https://api.datadoghq.com/api/v2/security/vulnerabilities?filter%5Btool%5D=Infra" }, "meta": { "count": 150, "token": "b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "total": 152431 } } ``` Copy Bad request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found: There is no request associated with the provided token. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. 
parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List vulnerabilities Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/vulnerabilities" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List vulnerabilities ``` """ List vulnerabilities returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.asset_type import AssetType from datadog_api_client.v2.model.vulnerability_severity import VulnerabilitySeverity from datadog_api_client.v2.model.vulnerability_tool import VulnerabilityTool configuration = Configuration() configuration.unstable_operations["list_vulnerabilities"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_vulnerabilities( filter_cvss_base_severity=VulnerabilitySeverity.HIGH, filter_tool=VulnerabilityTool.INFRA, filter_asset_type=AssetType.SERVICE, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List vulnerabilities ``` # List vulnerabilities returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_vulnerabilities".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new opts = { filter_cvss_base_severity: VulnerabilitySeverity::HIGH, filter_asset_type: AssetType::SERVICE, filter_tool: VulnerabilityTool::INFRA, } p api_instance.list_vulnerabilities(opts) ``` Copy #### Instructions First [install the library 
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List vulnerabilities ``` // List vulnerabilities returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListVulnerabilities", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListVulnerabilities(ctx, *datadogV2.NewListVulnerabilitiesOptionalParameters().WithFilterCvssBaseSeverity(datadogV2.VULNERABILITYSEVERITY_HIGH).WithFilterAssetType(datadogV2.ASSETTYPE_SERVICE).WithFilterTool(datadogV2.VULNERABILITYTOOL_INFRA)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListVulnerabilities`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListVulnerabilities`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List vulnerabilities ``` // List vulnerabilities returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.ListVulnerabilitiesOptionalParameters; import com.datadog.api.client.v2.model.AssetType; import com.datadog.api.client.v2.model.ListVulnerabilitiesResponse; import com.datadog.api.client.v2.model.VulnerabilitySeverity; import com.datadog.api.client.v2.model.VulnerabilityTool; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listVulnerabilities", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ListVulnerabilitiesResponse result = apiInstance.listVulnerabilities( new ListVulnerabilitiesOptionalParameters() .filterCvssBaseSeverity(VulnerabilitySeverity.HIGH) .filterAssetType(AssetType.SERVICE) .filterTool(VulnerabilityTool.INFRA)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listVulnerabilities"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List vulnerabilities ``` // List vulnerabilities returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListVulnerabilitiesOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AssetType; use datadog_api_client::datadogV2::model::VulnerabilitySeverity; use datadog_api_client::datadogV2::model::VulnerabilityTool; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListVulnerabilities", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_vulnerabilities( ListVulnerabilitiesOptionalParams::default() .filter_cvss_base_severity(VulnerabilitySeverity::HIGH) .filter_asset_type(AssetType::SERVICE) .filter_tool(VulnerabilityTool::INFRA), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List vulnerabilities ``` /** * List vulnerabilities returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listVulnerabilities"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiListVulnerabilitiesRequest = { filterCvssBaseSeverity: "High", filterTool: "Infra", filterAssetType: "Service", }; apiInstance .listVulnerabilities(params) .then((data: v2.ListVulnerabilitiesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#list-resource-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-resource-filters-v2) GET https://api.ap1.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.ap2.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.datadoghq.eu/api/v2/cloud_security_management/resource_filtershttps://api.ddog-gov.com/api/v2/cloud_security_management/resource_filtershttps://api.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.us3.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.us5.datadoghq.com/api/v2/cloud_security_management/resource_filters ### Overview List resource filters. This endpoint requires the `security_monitoring_filters_read` permission. 
OAuth apps require the `security_monitoring_filters_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description cloud_provider string Filter resource filters by cloud provider (e.g. aws, gcp, azure). account_id string Filter resource filters by cloud provider account ID. This parameter is only valid when provider is specified. skip_cache boolean Skip cache for resource filters. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetResourceEvaluationFilters-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetResourceEvaluationFilters-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetResourceEvaluationFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetResourceEvaluationFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The definition of `GetResourceEvaluationFiltersResponse` object. Field Type Description data [_required_] object The definition of `GetResourceFilterResponseData` object. attributes object Attributes of a resource filter. cloud_provider [_required_] object A map of cloud provider names (e.g., "aws", "gcp", "azure") to a map of account/resource IDs and their associated tag filters. object [string] uuid string The UUID of the resource filter. id string The `data` `id`. type enum Constant string to identify the request type. Allowed enum values: `csm_resource_filter` ``` { "data": { "attributes": { "cloud_provider": { "": { "": [ "environment:production" ] } }, "uuid": "string" }, "id": "csm_resource_filter", "type": "csm_resource_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List resource filters Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cloud_security_management/resource_filters" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List resource filters ``` """ List resource filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_resource_evaluation_filters( cloud_provider="aws", account_id="123456789", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List resource filters ``` # List resource filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new opts = { cloud_provider: "aws", account_id: "123456789", } p api_instance.get_resource_evaluation_filters(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List resource filters ``` // List resource filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetResourceEvaluationFilters(ctx, *datadogV2.NewGetResourceEvaluationFiltersOptionalParameters().WithCloudProvider("aws").WithAccountId("123456789")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetResourceEvaluationFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`SecurityMonitoringApi.GetResourceEvaluationFilters`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List resource filters ``` // List resource filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.GetResourceEvaluationFiltersOptionalParameters; import com.datadog.api.client.v2.model.GetResourceEvaluationFiltersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { GetResourceEvaluationFiltersResponse result = apiInstance.getResourceEvaluationFilters( new GetResourceEvaluationFiltersOptionalParameters() .cloudProvider("aws") .accountId("123456789")); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getResourceEvaluationFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List resource filters ``` // List resource filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::GetResourceEvaluationFiltersOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_resource_evaluation_filters( GetResourceEvaluationFiltersOptionalParams::default() .cloud_provider("aws".to_string()) .account_id("123456789".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List resource filters ``` /** * List resource filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetResourceEvaluationFiltersRequest = { cloudProvider: "aws", accountId: "123456789", }; apiInstance .getResourceEvaluationFilters(params) .then((data: 
v2.GetResourceEvaluationFiltersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List vulnerable assets](https://docs.datadoghq.com/api/latest/security-monitoring/#list-vulnerable-assets) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-vulnerable-assets-v2) **Note** : This endpoint is a private preview. If you are interested in accessing this API, [fill out this form](https://forms.gle/kMYC1sDr6WDUBDsx9). GET https://api.ap1.datadoghq.com/api/v2/security/vulnerable-assetshttps://api.ap2.datadoghq.com/api/v2/security/vulnerable-assetshttps://api.datadoghq.eu/api/v2/security/vulnerable-assetshttps://api.ddog-gov.com/api/v2/security/vulnerable-assetshttps://api.datadoghq.com/api/v2/security/vulnerable-assetshttps://api.us3.datadoghq.com/api/v2/security/vulnerable-assetshttps://api.us5.datadoghq.com/api/v2/security/vulnerable-assets ### Overview Get a list of vulnerable assets. ### [Pagination](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) Please review the [Pagination section for the “List Vulnerabilities”](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) endpoint. ### [Filtering](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) Please review the [Filtering section for the “List Vulnerabilities”](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) endpoint. ### [Metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) Please review the [Metadata section for the “List Vulnerabilities”](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) endpoint. This endpoint requires the `appsec_vm_read` permission. ### Arguments #### Query Strings Name Type Description page[token] string Its value must come from the `links` section of the response of the first request. Do not manually edit it. page[number] integer The page number to be retrieved. It should be equal or greater than `1` filter[name] string Filter by name. This field supports the usage of wildcards (*). filter[type] enum Filter by type. Allowed enum values: `Repository, Service, Host, HostImage, Image` filter[version.first] string Filter by the first version of the asset since it has been vulnerable. filter[version.last] string Filter by the last detected version of the asset. filter[repository_url] string Filter by the repository url associated to the asset. filter[risks.in_production] boolean Filter whether the asset is in production or not. filter[risks.under_attack] boolean Filter whether the asset (Service) is under attack or not. filter[risks.is_publicly_accessible] boolean Filter whether the asset (Host) is publicly accessible or not. filter[risks.has_privileged_access] boolean Filter whether the asset (Host) has privileged access or not. filter[risks.has_access_to_sensitive_data] boolean Filter whether the asset (Host) has access to sensitive data or not. filter[environments] string Filter by environment. filter[teams] string Filter by teams. filter[arch] string Filter by architecture. 
filter[operating_system.name] string Filter by operating system name. filter[operating_system.version] string Filter by operating system version. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerableAssets-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerableAssets-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerableAssets-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerableAssets-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListVulnerableAssets-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing vulnerable assets. Field Type Description data [_required_] [object] List of vulnerable assets. attributes [_required_] object The JSON:API attributes of the asset. arch string Asset architecture. environments [_required_] [string] List of environments where the asset is deployed. name [_required_] string Asset name. operating_system object Asset operating system. description string Operating system version. name [_required_] string Operating system name. risks [_required_] object Asset risks. has_access_to_sensitive_data boolean Whether the asset has access to sensitive data or not. has_privileged_access boolean Whether the asset has privileged access or not. in_production [_required_] boolean Whether the asset is in production or not. is_publicly_accessible boolean Whether the asset is publicly accessible or not. under_attack boolean Whether the asset is under attack or not. teams [string] List of teams that own the asset. type [_required_] enum The asset type Allowed enum values: `Repository,Service,Host,HostImage,Image` version object Asset version. first string Asset first version. last string Asset last version. id [_required_] string The unique ID for this asset. type [_required_] enum The JSON:API type. Allowed enum values: `assets` links object The JSON:API links related to pagination. first [_required_] string First page link. last [_required_] string Last page link. next string Next page link. previous string Previous page link. self [_required_] string Request link. meta object The metadata related to this request. count [_required_] int64 Number of entities included in the response. token [_required_] string The token that identifies the request. total [_required_] int64 Total number of entities across all pages. 
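The `links` and `meta` objects above are what drive pagination, and the query strings listed under Arguments can be combined freely. As a sketch (the filter values are illustrative and the `page[token]` value must be copied from `links.next` of a previous response, not typed by hand), a filtered, paginated request could look like this:

```
# Hypothetical example: production Hosts running Ubuntu, second page of results.
# The page[token] value comes from links.next of the first response.
curl -G "https://api.datadoghq.com/api/v2/security/vulnerable-assets" \
  --data-urlencode "filter[type]=Host" \
  --data-urlencode "filter[risks.in_production]=true" \
  --data-urlencode "filter[operating_system.name]=ubuntu" \
  --data-urlencode "page[number]=2" \
  --data-urlencode "page[token]=TOKEN_FROM_LINKS_NEXT" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The 200 example that follows shows the shape of the returned `links` and `meta` objects.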
``` { "data": [ { "attributes": { "arch": "arm64", "environments": [ "staging" ], "name": "github.com/DataDog/datadog-agent.git", "operating_system": { "description": "24.04", "name": "ubuntu" }, "risks": { "has_access_to_sensitive_data": false, "has_privileged_access": false, "in_production": false, "is_publicly_accessible": false, "under_attack": false }, "teams": [ "compute" ], "type": "Repository", "version": { "first": "_latest", "last": "_latest" } }, "id": "Repository|github.com/DataDog/datadog-agent.git", "type": "assets" } ], "links": { "first": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=1\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "last": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=15\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "next": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=16\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "previous": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=14\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "self": "https://api.datadoghq.com/api/v2/security/vulnerabilities?filter%5Btool%5D=Infra" }, "meta": { "count": 150, "token": "b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "total": 152431 } } ``` Copy Bad request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. 
pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found: There is no request associated with the provided token. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
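Any of these endpoints can answer with 429 when the request rate limit is exhausted; the payload is the short error list shown in the example that follows. A minimal shell sketch of a retry loop (the three attempts and the fixed five-second backoff are arbitrary choices, not an API guarantee):

```
# Retry a request a few times when the API answers 429 (rate limited).
url="https://api.datadoghq.com/api/v2/security/vulnerable-assets"
for attempt in 1 2 3; do
  status=$(curl -s -o response.json -w "%{http_code}" "$url" \
    -H "DD-API-KEY: ${DD_API_KEY}" -H "DD-APPLICATION-KEY: ${DD_APP_KEY}")
  [ "$status" != "429" ] && break   # success or a non-rate-limit error: stop retrying
  sleep 5                           # arbitrary backoff before the next attempt
done
cat response.json
```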
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List vulnerable assets ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security/vulnerable-assets" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List vulnerable assets ``` """ List vulnerable assets returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.asset_type import AssetType configuration = Configuration() configuration.unstable_operations["list_vulnerable_assets"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_vulnerable_assets( filter_type=AssetType.HOST, filter_repository_url="github.com/datadog/dd-go", filter_risks_in_production=True, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List vulnerable assets ``` # List vulnerable assets returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_vulnerable_assets".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new opts = { filter_type: AssetType::HOST, filter_repository_url: "github.com/datadog/dd-go", filter_risks_in_production: true, } p api_instance.list_vulnerable_assets(opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List vulnerable assets ``` // List vulnerable assets returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListVulnerableAssets", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListVulnerableAssets(ctx,
*datadogV2.NewListVulnerableAssetsOptionalParameters().WithFilterType(datadogV2.ASSETTYPE_HOST).WithFilterRepositoryUrl("github.com/datadog/dd-go").WithFilterRisksInProduction(true)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListVulnerableAssets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListVulnerableAssets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List vulnerable assets ``` // List vulnerable assets returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.ListVulnerableAssetsOptionalParameters; import com.datadog.api.client.v2.model.AssetType; import com.datadog.api.client.v2.model.ListVulnerableAssetsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listVulnerableAssets", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ListVulnerableAssetsResponse result = apiInstance.listVulnerableAssets( new ListVulnerableAssetsOptionalParameters() .filterType(AssetType.HOST) .filterRepositoryUrl("github.com/datadog/dd-go") .filterRisksInProduction(true)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listVulnerableAssets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List vulnerable assets ``` // List vulnerable assets returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListVulnerableAssetsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AssetType; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListVulnerableAssets", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_vulnerable_assets( ListVulnerableAssetsOptionalParams::default() .filter_type(AssetType::HOST) .filter_repository_url("github.com/datadog/dd-go".to_string()) .filter_risks_in_production(true), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List vulnerable assets ``` /** * List vulnerable assets returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listVulnerableAssets"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiListVulnerableAssetsRequest = { filterType: "Host", filterRepositoryUrl: "github.com/datadog/dd-go", filterRisksInProduction: true, }; apiInstance .listVulnerableAssets(params) .then((data: v2.ListVulnerableAssetsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get SBOM](https://docs.datadoghq.com/api/latest/security-monitoring/#get-sbom) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-sbom-v2) **Note** : This endpoint is a private preview. If you are interested in accessing this API, [fill out this form](https://forms.gle/kMYC1sDr6WDUBDsx9). GET https://api.ap1.datadoghq.com/api/v2/security/sboms/{asset_type}https://api.ap2.datadoghq.com/api/v2/security/sboms/{asset_type}https://api.datadoghq.eu/api/v2/security/sboms/{asset_type}https://api.ddog-gov.com/api/v2/security/sboms/{asset_type}https://api.datadoghq.com/api/v2/security/sboms/{asset_type}https://api.us3.datadoghq.com/api/v2/security/sboms/{asset_type}https://api.us5.datadoghq.com/api/v2/security/sboms/{asset_type} ### Overview Get a single SBOM related to an asset by its type and name. This endpoint requires the `appsec_vm_read` permission. ### Arguments #### Path Parameters Name Type Description asset_type [_required_] string The type of the asset for the SBOM request. #### Query Strings Name Type Description filter[asset_name] [_required_] string The name of the asset for the SBOM request. filter[repo_digest] string The container image `repo_digest` for the SBOM request. When the requested asset type is ‘Image’, this filter is mandatory. ext:format enum The standard of the SBOM. Allowed enum values: `CycloneDX, SPDX` ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSBOM-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSBOM-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSBOM-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSBOM-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSBOM-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when getting an SBOM. Field Type Description data [_required_] object A single SBOM attributes object The JSON:API attributes of the SBOM. 
bomFormat [_required_] string Specifies the format of the BOM. This helps to identify the file as CycloneDX since BOMs do not have a filename convention nor does JSON schema support namespaces. This value MUST be `CycloneDX`. components [_required_] [object] A list of software and hardware components. bom-ref string An optional identifier that can be used to reference the component elsewhere in the BOM. licenses [object] The software licenses of the SBOM component. license [_required_] object The software license of the component of the SBOM. name [_required_] string The name of the software license of the component of the SBOM. name [_required_] string The name of the component. This will often be a shortened, single name of the component. properties [object] The custom properties of the component of the SBOM. name [_required_] string The name of the custom property of the component of the SBOM. value [_required_] string The value of the custom property of the component of the SBOM. purl string Specifies the package-url (purl). The purl, if specified, MUST be valid and conform to the [specification](https://github.com/package-url/purl-spec). supplier [_required_] object The supplier of the component. name [_required_] string Identifier of the supplier of the component. type [_required_] enum The SBOM component type. Allowed enum values: `application,container,data,device,device-driver,file,firmware,framework,library,machine-learning-model,operating-system,platform` version [_required_] string The component version. dependencies [_required_] [object] List of dependencies between components of the SBOM. dependsOn [string] The components that are dependencies of the ref component. ref string The identifier for the related component. metadata [_required_] object Provides additional information about a BOM. authors [object] List of authors of the SBOM. name string The identifier of the Author of the SBOM. component object The component that the BOM describes. name string The name of the component. This will often be a shortened, single name of the component. type string Specifies the type of the component. timestamp string The timestamp of the SBOM creation. serialNumber [_required_] string Every BOM generated has a unique serial number, even if the contents of the BOM have not changed over time. The serial number follows [RFC-4122](https://datatracker.ietf.org/doc/html/rfc4122). specVersion [_required_] enum The version of the CycloneDX specification a BOM conforms to. Allowed enum values: `1.0,1.1,1.2,1.3,1.4,1.5` version [_required_] int64 It increments when a BOM is modified. The default value is 1. id string The unique ID for this SBOM (it is equivalent to the `asset_name`, or `asset_name@repo_digest` for Image assets). type enum The JSON:API type. Allowed enum values: `sboms` ``` { "data": { "attributes": { "bomFormat": "CycloneDX", "components": [ { "bom-ref": "pkg:golang/google.golang.org/grpc@1.68.1", "licenses": [ { "license": { "name": "MIT" } } ], "name": "google.golang.org/grpc", "properties": [ { "name": "license_type", "value": "permissive" } ], "purl": "pkg:golang/google.golang.org/grpc@1.68.1", "supplier": { "name": "https://go.dev" }, "type": "application", "version": "1.68.1" } ], "dependencies": [ { "dependsOn": [ "pkg:golang/google.golang.org/grpc@1.68.1" ], "ref": "Repository|github.com/datadog/datadog-agent" } ], "metadata": { "authors": [ { "name": "Datadog, Inc."
} ], "component": { "name": "github.com/datadog/datadog-agent", "type": "application" }, "timestamp": "2025-07-08T07:24:53Z" }, "serialNumber": "urn:uuid:f7119d2f-1vgh-24b5-91f0-12010db72da7", "specVersion": "1.5", "version": 1 }, "id": "github.com/datadog/datadog-agent", "type": "sboms" } } ``` Copy Bad request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found: asset not found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
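For context alongside these error payloads: the Query Strings above note that `filter[repo_digest]` is mandatory whenever the requested asset type is Image, and that `ext:format` selects the SBOM standard. A hedged curl sketch (the image name and digest are illustrative placeholders, not real assets):

```
# Hypothetical Image asset: filter[repo_digest] is mandatory for this asset type,
# and ext:format=SPDX requests the SPDX standard rather than CycloneDX.
curl -G "https://api.datadoghq.com/api/v2/security/sboms/Image" \
  --data-urlencode "filter[asset_name]=my-registry/my-image" \
  --data-urlencode "filter[repo_digest]=sha256:0000000000000000000000000000000000000000000000000000000000000000" \
  --data-urlencode "ext:format=SPDX" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```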
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get SBOM ``` # Path parameters export asset_type="Repository" # Required query arguments export filter_asset_name="github.com/datadog/datadog-agent" # Curl command curl -X GET "https://api.datadoghq.com/api/v2/security/sboms/${asset_type}?filter[asset_name]=${filter_asset_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get SBOM ``` """ Get SBOM returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.asset_type import AssetType configuration = Configuration() configuration.unstable_operations["get_sbom"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_sbom( asset_type=AssetType.REPOSITORY, filter_asset_name="github.com/datadog/datadog-agent", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get SBOM ``` # Get SBOM returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_sbom".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_sbom(AssetType::REPOSITORY, "github.com/datadog/datadog-agent") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get SBOM ``` // Get SBOM returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSBOM", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSBOM(ctx, datadogV2.ASSETTYPE_REPOSITORY, "github.com/datadog/datadog-agent", *datadogV2.NewGetSBOMOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSBOM`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSBOM`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get SBOM ``` // Get SBOM returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.AssetType; import com.datadog.api.client.v2.model.GetSBOMResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSBOM", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { GetSBOMResponse result = apiInstance.getSBOM(AssetType.REPOSITORY, "github.com/datadog/datadog-agent"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSBOM"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get SBOM ``` // Get SBOM returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::GetSBOMOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AssetType; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSBOM", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_sbom( AssetType::REPOSITORY, "github.com/datadog/datadog-agent".to_string(), GetSBOMOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get SBOM ``` /** * Get SBOM returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSBOM"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetSBOMRequest = { assetType: "Repository", filterAssetName: "github.com/datadog/datadog-agent", }; apiInstance .getSBOM(params) .then((data: v2.GetSBOMResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update resource filters](https://docs.datadoghq.com/api/latest/security-monitoring/#update-resource-filters) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#update-resource-filters-v2) PUT https://api.ap1.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.ap2.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.datadoghq.eu/api/v2/cloud_security_management/resource_filtershttps://api.ddog-gov.com/api/v2/cloud_security_management/resource_filtershttps://api.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.us3.datadoghq.com/api/v2/cloud_security_management/resource_filtershttps://api.us5.datadoghq.com/api/v2/cloud_security_management/resource_filters ### Overview Update resource filters. This endpoint requires the `security_monitoring_filters_write` permission. OAuth apps require the `security_monitoring_filters_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] object The definition of `UpdateResourceFilterRequestData` object. attributes [_required_] object Attributes of a resource filter. cloud_provider [_required_] object A map of cloud provider names (e.g., "aws", "gcp", "azure") to a map of account/resource IDs and their associated tag filters. object [string] uuid string The UUID of the resource filter. id string The `UpdateResourceEvaluationFiltersRequestData` `id`. type [_required_] enum Constant string to identify the request type. 
Allowed enum values: `csm_resource_filter` ``` { "data": { "attributes": { "cloud_provider": { "aws": { "aws_account_id": [ "tag1:v1" ] } } }, "id": "csm_resource_filter", "type": "csm_resource_filter" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateResourceEvaluationFilters-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateResourceEvaluationFilters-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateResourceEvaluationFilters-403-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#UpdateResourceEvaluationFilters-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The definition of `UpdateResourceEvaluationFiltersResponse` object. Field Type Description data [_required_] object The definition of `UpdateResourceFilterResponseData` object. attributes [_required_] object Attributes of a resource filter. cloud_provider [_required_] object A map of cloud provider names (e.g., "aws", "gcp", "azure") to a map of account/resource IDs and their associated tag filters. object [string] uuid string The UUID of the resource filter. id string The `data` `id`. type [_required_] enum Constant string to identify the request type. Allowed enum values: `csm_resource_filter` ``` { "data": { "attributes": { "cloud_provider": { "": { "": [ "environment:production" ] } }, "uuid": "string" }, "id": "csm_resource_filter", "type": "csm_resource_filter" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
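A practical pattern for this endpoint is read-modify-write: fetch the current filters with the List resource filters endpoint (the `GetResourceEvaluationFilters` examples shown earlier on this page), adjust the `cloud_provider` map, and PUT the result back. A sketch assuming `jq` is installed and that the read side is a GET on the same `/api/v2/cloud_security_management/resource_filters` path; the account ID and tag are illustrative:

```
# 1. Read the current resource filters and inspect the cloud_provider map.
curl -s "https://api.datadoghq.com/api/v2/cloud_security_management/resource_filters" \
  -H "DD-API-KEY: ${DD_API_KEY}" -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  | jq '.data.attributes.cloud_provider'

# 2. Write back an updated map for one AWS account (account ID and tag are placeholders).
curl -X PUT "https://api.datadoghq.com/api/v2/cloud_security_management/resource_filters" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d '{"data":{"attributes":{"cloud_provider":{"aws":{"123456789012":["env:production"]}}},"id":"csm_resource_filter","type":"csm_resource_filter"}}'
```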
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Update resource filters returns "OK" response ``` # Curl command curl -X PUT "https://api.datadoghq.com/api/v2/cloud_security_management/resource_filters" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "cloud_provider": { "aws": { "aws_account_id": [ "tag1:v1" ] } } }, "id": "csm_resource_filter", "type": "csm_resource_filter" } } EOF ``` ##### Update resource filters returns "OK" response ``` // Update resource filters returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpdateResourceEvaluationFiltersRequest{ Data: datadogV2.UpdateResourceEvaluationFiltersRequestData{ Attributes: datadogV2.ResourceFilterAttributes{ CloudProvider: map[string]map[string][]string{ "aws": map[string][]string{ "aws_account_id": []string{ "tag1:v1", }, }, }, }, Id: datadog.PtrString("csm_resource_filter"), Type: datadogV2.RESOURCEFILTERREQUESTTYPE_CSM_RESOURCE_FILTER, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.UpdateResourceEvaluationFilters(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.UpdateResourceEvaluationFilters`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.UpdateResourceEvaluationFilters`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update resource filters returns "OK" response ``` // Update resource filters returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ResourceFilterAttributes; import com.datadog.api.client.v2.model.ResourceFilterRequestType; import com.datadog.api.client.v2.model.UpdateResourceEvaluationFiltersRequest; import
com.datadog.api.client.v2.model.UpdateResourceEvaluationFiltersRequestData; import com.datadog.api.client.v2.model.UpdateResourceEvaluationFiltersResponse; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); UpdateResourceEvaluationFiltersRequest body = new UpdateResourceEvaluationFiltersRequest() .data( new UpdateResourceEvaluationFiltersRequestData() .attributes( new ResourceFilterAttributes() .cloudProvider( Map.ofEntries( Map.entry( "aws", Map.ofEntries( Map.entry( "aws_account_id", Collections.singletonList("tag1:v1"))))))) .id("csm_resource_filter") .type(ResourceFilterRequestType.CSM_RESOURCE_FILTER)); try { UpdateResourceEvaluationFiltersResponse result = apiInstance.updateResourceEvaluationFilters(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#updateResourceEvaluationFilters"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update resource filters returns "OK" response ``` """ Update resource filters returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.resource_filter_attributes import ResourceFilterAttributes from datadog_api_client.v2.model.resource_filter_request_type import ResourceFilterRequestType from datadog_api_client.v2.model.update_resource_evaluation_filters_request import ( UpdateResourceEvaluationFiltersRequest, ) from datadog_api_client.v2.model.update_resource_evaluation_filters_request_data import ( UpdateResourceEvaluationFiltersRequestData, ) body = UpdateResourceEvaluationFiltersRequest( data=UpdateResourceEvaluationFiltersRequestData( attributes=ResourceFilterAttributes( cloud_provider=dict( aws=dict( aws_account_id=[ "tag1:v1", ], ), ), ), id="csm_resource_filter", type=ResourceFilterRequestType.CSM_RESOURCE_FILTER, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.update_resource_evaluation_filters(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update resource filters returns "OK" response ``` # Update resource filters returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::UpdateResourceEvaluationFiltersRequest.new({ data: 
DatadogAPIClient::V2::UpdateResourceEvaluationFiltersRequestData.new({ attributes: DatadogAPIClient::V2::ResourceFilterAttributes.new({ cloud_provider: { aws: { aws_account_id: [ "tag1:v1", ], }, }, }), id: "csm_resource_filter", type: DatadogAPIClient::V2::ResourceFilterRequestType::CSM_RESOURCE_FILTER, }), }) p api_instance.update_resource_evaluation_filters(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update resource filters returns "OK" response ``` // Update resource filters returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::ResourceFilterAttributes; use datadog_api_client::datadogV2::model::ResourceFilterRequestType; use datadog_api_client::datadogV2::model::UpdateResourceEvaluationFiltersRequest; use datadog_api_client::datadogV2::model::UpdateResourceEvaluationFiltersRequestData; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = UpdateResourceEvaluationFiltersRequest::new( UpdateResourceEvaluationFiltersRequestData::new( ResourceFilterAttributes::new(BTreeMap::from([( "aws".to_string(), BTreeMap::from([("aws_account_id".to_string(), vec!["tag1:v1".to_string()])]), )])), ResourceFilterRequestType::CSM_RESOURCE_FILTER, ) .id("csm_resource_filter".to_string()), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.update_resource_evaluation_filters(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update resource filters returns "OK" response ``` /** * Update resource filters returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiUpdateResourceEvaluationFiltersRequest = { body: { data: { attributes: { cloudProvider: { aws: { aws_account_id: ["tag1:v1"], }, }, }, id: "csm_resource_filter", type: "csm_resource_filter", }, }, }; apiInstance .updateResourceEvaluationFilters(params) .then((data: v2.UpdateResourceEvaluationFiltersResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List assets SBOMs](https://docs.datadoghq.com/api/latest/security-monitoring/#list-assets-sboms) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-assets-sboms-v2) **Note** : This endpoint is a private preview. If you are interested in accessing this API, [fill out this form](https://forms.gle/kMYC1sDr6WDUBDsx9). GET https://api.ap1.datadoghq.com/api/v2/security/sbomshttps://api.ap2.datadoghq.com/api/v2/security/sbomshttps://api.datadoghq.eu/api/v2/security/sbomshttps://api.ddog-gov.com/api/v2/security/sbomshttps://api.datadoghq.com/api/v2/security/sbomshttps://api.us3.datadoghq.com/api/v2/security/sbomshttps://api.us5.datadoghq.com/api/v2/security/sboms ### Overview Get a list of assets SBOMs for an organization. ### [Pagination](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) Please review the [Pagination section](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) for the “List Vulnerabilities” endpoint. ### [Filtering](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) Please review the [Filtering section](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) for the “List Vulnerabilities” endpoint. ### [Metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) Please review the [Metadata section](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) for the “List Vulnerabilities” endpoint. This endpoint requires the `appsec_vm_read` permission. ### Arguments #### Query Strings Name Type Description page[token] string Its value must come from the `links` section of the response of the first request. Do not manually edit it. page[number] integer The page number to be retrieved. It should be equal to or greater than 1. filter[asset_type] enum The type of the assets for the SBOM request. Allowed enum values: `Repository, Service, Host, HostImage, Image` filter[asset_name] string The name of the asset for the SBOM request. filter[package_name] string The name of the component that is a dependency of an asset. filter[package_version] string The version of the component that is a dependency of an asset. filter[license_name] string The software license name of the component that is a dependency of an asset. filter[license_type] enum The software license type of the component that is a dependency of an asset. 
Allowed enum values: `network_strong_copyleft, non_standard_copyleft, other_non_free, other_non_standard, permissive, public_domain, strong_copyleft, weak_copyleft` ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListAssetsSBOMs-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListAssetsSBOMs-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListAssetsSBOMs-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListAssetsSBOMs-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListAssetsSBOMs-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing assets SBOMs. Field Type Description data [_required_] [object] List of assets SBOMs. attributes object The JSON:API attributes of the SBOM. bomFormat [_required_] string Specifies the format of the BOM. This helps to identify the file as CycloneDX since BOM do not have a filename convention nor does JSON schema support namespaces. This value MUST be `CycloneDX`. components [_required_] [object] A list of software and hardware components. bom-ref string An optional identifier that can be used to reference the component elsewhere in the BOM. licenses [object] The software licenses of the SBOM component. license [_required_] object The software license of the component of the SBOM. name [_required_] string The name of the software license of the component of the SBOM. name [_required_] string The name of the component. This will often be a shortened, single name of the component. properties [object] The custom properties of the component of the SBOM. name [_required_] string The name of the custom property of the component of the SBOM. value [_required_] string The value of the custom property of the component of the SBOM. purl string Specifies the package-url (purl). The purl, if specified, MUST be valid and conform to the [specification](https://github.com/package-url/purl-spec). supplier [_required_] object The supplier of the component. name [_required_] string Identifier of the supplier of the component. type [_required_] enum The SBOM component type Allowed enum values: `application,container,data,device,device-driver,file,firmware,framework,library,machine-learning-model,operating-system,platform` version [_required_] string The component version. dependencies [_required_] [object] List of dependencies between components of the SBOM. dependsOn [string] The components that are dependencies of the ref component. ref string The identifier for the related component. metadata [_required_] object Provides additional information about a BOM. authors [object] List of authors of the SBOM. name string The identifier of the Author of the SBOM. component object The component that the BOM describes. name string The name of the component. This will often be a shortened, single name of the component. type string Specifies the type of the component. timestamp string The timestamp of the SBOM creation. serialNumber [_required_] string Every BOM generated has a unique serial number, even if the contents of the BOM have not changed overt time. The serial number follows [RFC-4122](https://datatracker.ietf.org/doc/html/rfc4122) specVersion [_required_] enum The version of the CycloneDX specification a BOM conforms to. 
Allowed enum values: `1.0,1.1,1.2,1.3,1.4,1.5` version [_required_] int64 It increments when a BOM is modified. The default value is 1. id string The unique ID for this SBOM (it is equivalent to the `asset_name` or `asset_name@repo_digest` (Image) type enum The JSON:API type. Allowed enum values: `sboms` links object The JSON:API links related to pagination. first [_required_] string First page link. last [_required_] string Last page link. next string Next page link. previous string Previous page link. self [_required_] string Request link. meta object The metadata related to this request. count [_required_] int64 Number of entities included in the response. token [_required_] string The token that identifies the request. total [_required_] int64 Total number of entities across all pages. ``` { "data": [ { "attributes": { "bomFormat": "CycloneDX", "components": [ { "bom-ref": "pkg:golang/google.golang.org/grpc@1.68.1", "licenses": [ { "license": { "name": "MIT" } } ], "name": "google.golang.org/grpc", "properties": [ { "name": "license_type", "value": "permissive" } ], "purl": "pkg:golang/google.golang.org/grpc@1.68.1", "supplier": { "name": "https://go.dev" }, "type": "application", "version": "1.68.1" } ], "dependencies": [ { "dependsOn": [ "pkg:golang/google.golang.org/grpc@1.68.1" ], "ref": "Repository|github.com/datadog/datadog-agent" } ], "metadata": { "authors": [ { "name": "Datadog, Inc." } ], "component": { "name": "github.com/datadog/datadog-agent", "type": "application" }, "timestamp": "2025-07-08T07:24:53Z" }, "serialNumber": "urn:uuid:f7119d2f-1vgh-24b5-91f0-12010db72da7", "specVersion": "1.5", "version": 1 }, "id": "github.com/datadog/datadog-agent", "type": "sboms" } ], "links": { "first": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=1\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "last": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=15\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "next": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=16\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "previous": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=14\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "self": "https://api.datadoghq.com/api/v2/security/vulnerabilities?filter%5Btool%5D=Infra" }, "meta": { "count": 150, "token": "b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "total": 152431 } } ``` Copy Bad request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. 
parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found: asset not found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List assets SBOMs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/sboms" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List assets SBOMs ``` """ List assets SBOMs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.asset_type import AssetType configuration = Configuration() configuration.unstable_operations["list_assets_sbo_ms"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_assets_sbo_ms( filter_asset_type=AssetType.SERVICE, filter_package_name="pandas", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List assets SBOMs ``` # List assets SBOMs returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_assets_sbo_ms".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new opts = { filter_package_name: "pandas", filter_asset_type: AssetType::SERVICE, } p api_instance.list_assets_sbo_ms(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List assets SBOMs ``` // List assets SBOMs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListAssetsSBOMs", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListAssetsSBOMs(ctx, *datadogV2.NewListAssetsSBOMsOptionalParameters().WithFilterPackageName("pandas").WithFilterAssetType(datadogV2.ASSETTYPE_SERVICE)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`SecurityMonitoringApi.ListAssetsSBOMs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListAssetsSBOMs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List assets SBOMs ``` // List assets SBOMs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.ListAssetsSBOMsOptionalParameters; import com.datadog.api.client.v2.model.AssetType; import com.datadog.api.client.v2.model.ListAssetsSBOMsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listAssetsSBOMs", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ListAssetsSBOMsResponse result = apiInstance.listAssetsSBOMs( new ListAssetsSBOMsOptionalParameters() .filterPackageName("pandas") .filterAssetType(AssetType.SERVICE)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listAssetsSBOMs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List assets SBOMs ``` // List assets SBOMs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListAssetsSBOMsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AssetType; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListAssetsSBOMs", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_assets_sbo_ms( ListAssetsSBOMsOptionalParams::default() .filter_package_name("pandas".to_string()) .filter_asset_type(AssetType::SERVICE), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List assets SBOMs ``` /** * List assets SBOMs returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const 
configuration = client.createConfiguration(); configuration.unstableOperations["v2.listAssetsSBOMs"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiListAssetsSBOMsRequest = { filterAssetType: "Service", filterPackageName: "pandas", }; apiInstance .listAssetsSBOMs(params) .then((data: v2.ListAssetsSBOMsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List scanned assets metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#list-scanned-assets-metadata) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-scanned-assets-metadata-v2) **Note** : This endpoint is a private preview. If you are interested in accessing this API, [fill out this form](https://forms.gle/kMYC1sDr6WDUBDsx9). GET https://api.ap1.datadoghq.com/api/v2/security/scanned-assets-metadatahttps://api.ap2.datadoghq.com/api/v2/security/scanned-assets-metadatahttps://api.datadoghq.eu/api/v2/security/scanned-assets-metadatahttps://api.ddog-gov.com/api/v2/security/scanned-assets-metadatahttps://api.datadoghq.com/api/v2/security/scanned-assets-metadatahttps://api.us3.datadoghq.com/api/v2/security/scanned-assets-metadatahttps://api.us5.datadoghq.com/api/v2/security/scanned-assets-metadata ### Overview Get a list of security scanned assets metadata for an organization. ### [Pagination](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination) For the “List Vulnerabilities” endpoint, see the [Pagination section](https://docs.datadoghq.com/api/latest/security-monitoring/#pagination). ### [Filtering](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering) For the “List Vulnerabilities” endpoint, see the [Filtering section](https://docs.datadoghq.com/api/latest/security-monitoring/#filtering). ### [Metadata](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata) For the “List Vulnerabilities” endpoint, see the [Metadata section](https://docs.datadoghq.com/api/latest/security-monitoring/#metadata). ### [Related endpoints](https://docs.datadoghq.com/api/latest/security-monitoring/#related-endpoints) This endpoint returns additional metadata for cloud resources that is not available from the standard resource endpoints. To access a richer dataset, call this endpoint together with the relevant resource endpoint(s) and merge (join) their results using the resource identifier. 
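For example, the following minimal Python sketch (illustrative only: it is not part of the official client examples, it uses plain HTTP calls through the `requests` library instead of a Datadog SDK, and it reads the same `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` values as the examples below) enriches host data by joining the two responses on the Hosts join keys listed in the tables that follow:

```
# Illustrative sketch (not from the official examples): join the Hosts endpoint
# with the scanned-assets-metadata endpoint on the key fields documented below.
import os

import requests

site = os.environ.get("DD_SITE", "datadoghq.com")
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

# 1. List hosts (v1) and index them by host_list.host_name.
hosts = requests.get(f"https://api.{site}/api/v1/hosts", headers=headers).json()
hosts_by_name = {h["host_name"]: h for h in hosts.get("host_list", [])}

# 2. List scanned assets metadata (v2), restricted to hosts. Follow links.next
#    for additional pages if the result set spans more than one page.
metadata = requests.get(
    f"https://api.{site}/api/v2/security/scanned-assets-metadata",
    headers=headers,
    params={"filter[asset.type]": "Host"},
).json()

# 3. Join on data.attributes.asset.name == host_list.host_name.
for item in metadata.get("data", []):
    asset = item["attributes"]["asset"]
    host = hosts_by_name.get(asset["name"])
    if host is not None:
        last_scan = item["attributes"]["last_success"]["timestamp"]
        print(f"{asset['name']}: last successful scan at {last_scan}")
```

The same pattern applies to host images and container images; only the join key on the resource-endpoint side changes, as the tables below show.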
**Hosts** To enrich host data, join the response from the [Hosts](https://docs.datadoghq.com/api/latest/hosts/) endpoint with the response from the scanned-assets-metadata endpoint on the following key fields:

ENDPOINT | JOIN KEY | TYPE
---|---|---
[/api/v1/hosts](https://docs.datadoghq.com/api/latest/hosts/) | host_list.host_name | string
/api/v2/security/scanned-assets-metadata | data.attributes.asset.name | string

**Host Images** To enrich host image data, join the response from the [Hosts](https://docs.datadoghq.com/api/latest/hosts/) endpoint with the response from the scanned-assets-metadata endpoint on the following key fields:

ENDPOINT | JOIN KEY | TYPE
---|---|---
[/api/v1/hosts](https://docs.datadoghq.com/api/latest/hosts/) | host_list.tags_by_source["Amazon Web Services"]["image"] | string
/api/v2/security/scanned-assets-metadata | data.attributes.asset.name | string

**Container Images** To enrich container image data, join the response from the [Container Images](https://docs.datadoghq.com/api/latest/container-images/) endpoint with the response from the scanned-assets-metadata endpoint on the following key fields:

ENDPOINT | JOIN KEY | TYPE
---|---|---
[/api/v2/container_images](https://docs.datadoghq.com/api/latest/container-images/) | `data.attributes.name`@`data.attributes.repo_digest` | string
/api/v2/security/scanned-assets-metadata | data.attributes.asset.name | string

This endpoint requires the `appsec_vm_read` permission. ### Arguments #### Query Strings Name Type Description page[token] string Its value must come from the `links` section of the response of the first request. Do not manually edit it. page[number] integer The page number to be retrieved. It should be equal to or greater than 1. filter[asset.type] enum The type of the scanned asset. Allowed enum values: `Host, HostImage, Image` filter[asset.name] string The name of the scanned asset. filter[last_success.origin] string The origin of the last success scan. filter[last_success.env] string The environment of the last success scan. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListScannedAssetsMetadata-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListScannedAssetsMetadata-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListScannedAssetsMetadata-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListScannedAssetsMetadata-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListScannedAssetsMetadata-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The expected response schema when listing scanned assets metadata. Field Type Description data [_required_] [object] List of scanned assets metadata. attributes [_required_] object The attributes of a scanned asset metadata. asset [_required_] object The asset of a scanned asset metadata. name [_required_] string The name of the asset. type [_required_] enum The cloud asset type. Allowed enum values: `Host,HostImage,Image` first_success_timestamp [_required_] string The timestamp when the scan of the asset was performed for the first time. last_success [_required_] object Metadata for the last successful scan of an asset. env string The environment of the last success scan of the asset. origin [string] The list of origins of the last success scan of the asset.
timestamp [_required_] string The timestamp of the last success scan of the asset. id [_required_] string The ID of the scanned asset metadata. links object The JSON:API links related to pagination. first [_required_] string First page link. last [_required_] string Last page link. next string Next page link. previous string Previous page link. self [_required_] string Request link. meta object The metadata related to this request. count [_required_] int64 Number of entities included in the response. token [_required_] string The token that identifies the request. total [_required_] int64 Total number of entities across all pages. ``` { "data": [ { "attributes": { "asset": { "name": "i-0fc7edef1ab26d7ef", "type": "Host" }, "first_success_timestamp": "2025-07-08T07:24:53Z", "last_success": { "env": "prod", "origin": [ "production" ], "timestamp": "2025-07-08T07:24:53Z" } }, "id": "Host|i-0fc7edef1ab26d7ef" } ], "links": { "first": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=1\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "last": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=15\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "next": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=16\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "previous": "https://api.datadoghq.com/api/v2/security/vulnerabilities?page%5Bnumber%5D=14\u0026page%5Btoken%5D=b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "self": "https://api.datadoghq.com/api/v2/security/vulnerabilities?filter%5Btool%5D=Infra" }, "meta": { "count": 150, "token": "b82cef018aab81ed1d4bb4xb35xxfc065da7efa685fbcecdbd338f3015e3afabbbfa3a911b4984_721ee28a-zecb-4e45-9960-c42065b574f4", "total": 152431 } } ``` Copy Bad request: The server cannot process the request due to invalid syntax in the request. * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found: asset not found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List scanned assets metadata Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/scanned-assets-metadata" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List scanned assets metadata ``` """ List scanned assets metadata returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["list_scanned_assets_metadata"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_scanned_assets_metadata() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List scanned assets metadata ``` # List scanned assets metadata returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_scanned_assets_metadata".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_scanned_assets_metadata() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List scanned assets metadata ``` // List scanned assets metadata returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListScannedAssetsMetadata", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListScannedAssetsMetadata(ctx, *datadogV2.NewListScannedAssetsMetadataOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListScannedAssetsMetadata`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListScannedAssetsMetadata`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List scanned assets metadata ``` // List scanned assets metadata returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.ScannedAssetsMetadata; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listScannedAssetsMetadata", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { ScannedAssetsMetadata result = apiInstance.listScannedAssetsMetadata(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listScannedAssetsMetadata"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List scanned assets metadata ``` // List scanned assets metadata returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListScannedAssetsMetadataOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListScannedAssetsMetadata", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_scanned_assets_metadata(ListScannedAssetsMetadataOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List scanned assets metadata ``` /** * List scanned assets metadata returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listScannedAssetsMetadata"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listScannedAssetsMetadata() .then((data: v2.ScannedAssetsMetadata) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Returns a list of Secrets rules](https://docs.datadoghq.com/api/latest/security-monitoring/#returns-a-list-of-secrets-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#returns-a-list-of-secrets-rules-v2) **Note** : This endpoint may be subject to changes. GET https://api.ap1.datadoghq.com/api/v2/static-analysis/secrets/ruleshttps://api.ap2.datadoghq.com/api/v2/static-analysis/secrets/ruleshttps://api.datadoghq.eu/api/v2/static-analysis/secrets/ruleshttps://api.ddog-gov.com/api/v2/static-analysis/secrets/ruleshttps://api.datadoghq.com/api/v2/static-analysis/secrets/ruleshttps://api.us3.datadoghq.com/api/v2/static-analysis/secrets/ruleshttps://api.us5.datadoghq.com/api/v2/static-analysis/secrets/rules ### Overview Returns a list of Secrets rules with ID, Pattern, Description, Priority, and SDS ID. OAuth apps require the `code_analysis_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecretsRules-200-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecretsRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] [object] attributes object default_included_keywords [string] description string license string match_validation object endpoint string hosts [string] http_method string invalid_http_status_code [object] end int64 start int64 request_headers object string timeout_seconds int64 type string valid_http_status_code [object] end int64 start int64 name string pattern string priority string sds_id string validators [string] id string type [_required_] enum Secret rule resource type. Allowed enum values: `secret_rule` default: `secret_rule` ``` { "data": [ { "attributes": { "default_included_keywords": [], "description": "string", "license": "string", "match_validation": { "endpoint": "string", "hosts": [], "http_method": "string", "invalid_http_status_code": [ { "end": "integer", "start": "integer" } ], "request_headers": { "": "string" }, "timeout_seconds": "integer", "type": "string", "valid_http_status_code": [ { "end": "integer", "start": "integer" } ] }, "name": "string", "pattern": "string", "priority": "string", "sds_id": "string", "validators": [] }, "id": "string", "type": "secret_rule" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Returns a list of Secrets rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/static-analysis/secrets/rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Returns a list of Secrets rules ``` """ Returns a list of Secrets rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["get_secrets_rules"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_secrets_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Returns a list of Secrets rules ``` # Returns a list of Secrets rules returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_secrets_rules".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_secrets_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Returns a list of Secrets rules ``` // Returns a list of Secrets rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSecretsRules", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecretsRules(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSecretsRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`SecurityMonitoringApi.GetSecretsRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Returns a list of Secrets rules ``` // Returns a list of Secrets rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecretRuleArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSecretsRules", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecretRuleArray result = apiInstance.getSecretsRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#getSecretsRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Returns a list of Secrets rules ``` // Returns a list of Secrets rules returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSecretsRules", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.get_secrets_rules().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Returns a list of Secrets rules ``` /** * Returns a list of Secrets rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSecretsRules"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .getSecretsRules() .then((data: v2.SecretRuleArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Ruleset get multiple](https://docs.datadoghq.com/api/latest/security-monitoring/#ruleset-get-multiple) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#ruleset-get-multiple-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/static-analysis/rulesetshttps://api.ap2.datadoghq.com/api/v2/static-analysis/rulesetshttps://api.datadoghq.eu/api/v2/static-analysis/rulesetshttps://api.ddog-gov.com/api/v2/static-analysis/rulesetshttps://api.datadoghq.com/api/v2/static-analysis/rulesetshttps://api.us3.datadoghq.com/api/v2/static-analysis/rulesetshttps://api.us5.datadoghq.com/api/v2/static-analysis/rulesets ### Overview Get rules for multiple rulesets in batch. OAuth apps require the `code_analysis_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object attributes object include_testing_rules boolean include_tests boolean rulesets [string] id string type [_required_] enum Get multiple rulesets request resource type. Allowed enum values: `get_multiple_rulesets_request` default: `get_multiple_rulesets_request` ``` { "data": { "attributes": { "include_testing_rules": false, "include_tests": false, "rulesets": [] }, "id": "string", "type": "get_multiple_rulesets_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListMultipleRulesets-200-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListMultipleRulesets-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object attributes object rulesets [object] data [_required_] object id string type [_required_] enum Rulesets resource type. Allowed enum values: `rulesets` default: `rulesets` description string name string rules [object] arguments [object] description string name string category string checksum string code string created_at date-time created_by string cve string cwe string data [_required_] object id string type [_required_] enum Rules resource type. Allowed enum values: `rules` default: `rules` description string documentation_url string entity_checked string is_published boolean is_testing boolean language string last_updated_at date-time last_updated_by string name string regex string severity string short_description string should_use_ai_fix boolean tests [object] annotation_count int64 code string filename string tree_sitter_query string type string short_description string id string type [_required_] enum Get multiple rulesets response resource type. 
Allowed enum values: `get_multiple_rulesets_response` default: `get_multiple_rulesets_response` ``` { "data": { "attributes": { "rulesets": [ { "data": { "id": "string", "type": "rulesets" }, "description": "string", "name": "string", "rules": [ { "arguments": [ { "description": "string", "name": "string" } ], "category": "string", "checksum": "string", "code": "string", "created_at": "2019-09-19T10:00:00.000Z", "created_by": "string", "cve": "string", "cwe": "string", "data": { "id": "string", "type": "rules" }, "description": "string", "documentation_url": "string", "entity_checked": "string", "is_published": false, "is_testing": false, "language": "string", "last_updated_at": "2019-09-19T10:00:00.000Z", "last_updated_by": "string", "name": "string", "regex": "string", "severity": "string", "short_description": "string", "should_use_ai_fix": false, "tests": [ { "annotation_count": "integer", "code": "string", "filename": "string" } ], "tree_sitter_query": "string", "type": "string" } ], "short_description": "string" } ] }, "id": "string", "type": "get_multiple_rulesets_response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Ruleset get multiple Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/static-analysis/rulesets" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "get_multiple_rulesets_request" } } EOF ``` ##### Ruleset get multiple ``` """ Ruleset get multiple returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.get_multiple_rulesets_request import GetMultipleRulesetsRequest from datadog_api_client.v2.model.get_multiple_rulesets_request_data import GetMultipleRulesetsRequestData from datadog_api_client.v2.model.get_multiple_rulesets_request_data_attributes import ( GetMultipleRulesetsRequestDataAttributes, ) from datadog_api_client.v2.model.get_multiple_rulesets_request_data_type import GetMultipleRulesetsRequestDataType body = GetMultipleRulesetsRequest( data=GetMultipleRulesetsRequestData( attributes=GetMultipleRulesetsRequestDataAttributes( rulesets=[], ), type=GetMultipleRulesetsRequestDataType.GET_MULTIPLE_RULESETS_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["list_multiple_rulesets"] = True with 
ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_multiple_rulesets(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Ruleset get multiple ``` # Ruleset get multiple returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_multiple_rulesets".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::GetMultipleRulesetsRequest.new({ data: DatadogAPIClient::V2::GetMultipleRulesetsRequestData.new({ attributes: DatadogAPIClient::V2::GetMultipleRulesetsRequestDataAttributes.new({ rulesets: [], }), type: DatadogAPIClient::V2::GetMultipleRulesetsRequestDataType::GET_MULTIPLE_RULESETS_REQUEST, }), }) p api_instance.list_multiple_rulesets(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Ruleset get multiple ``` // Ruleset get multiple returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.GetMultipleRulesetsRequest{ Data: &datadogV2.GetMultipleRulesetsRequestData{ Attributes: &datadogV2.GetMultipleRulesetsRequestDataAttributes{ Rulesets: []string{}, }, Type: datadogV2.GETMULTIPLERULESETSREQUESTDATATYPE_GET_MULTIPLE_RULESETS_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListMultipleRulesets", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListMultipleRulesets(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListMultipleRulesets`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListMultipleRulesets`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Ruleset get multiple ``` // Ruleset get multiple returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.GetMultipleRulesetsRequest; import com.datadog.api.client.v2.model.GetMultipleRulesetsRequestData; import 
com.datadog.api.client.v2.model.GetMultipleRulesetsRequestDataAttributes; import com.datadog.api.client.v2.model.GetMultipleRulesetsRequestDataType; import com.datadog.api.client.v2.model.GetMultipleRulesetsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listMultipleRulesets", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); GetMultipleRulesetsRequest body = new GetMultipleRulesetsRequest() .data( new GetMultipleRulesetsRequestData() .attributes(new GetMultipleRulesetsRequestDataAttributes()) .type(GetMultipleRulesetsRequestDataType.GET_MULTIPLE_RULESETS_REQUEST)); try { GetMultipleRulesetsResponse result = apiInstance.listMultipleRulesets(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#listMultipleRulesets"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Ruleset get multiple ``` // Ruleset get multiple returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::GetMultipleRulesetsRequest; use datadog_api_client::datadogV2::model::GetMultipleRulesetsRequestData; use datadog_api_client::datadogV2::model::GetMultipleRulesetsRequestDataAttributes; use datadog_api_client::datadogV2::model::GetMultipleRulesetsRequestDataType; #[tokio::main] async fn main() { let body = GetMultipleRulesetsRequest::new().data( GetMultipleRulesetsRequestData::new( GetMultipleRulesetsRequestDataType::GET_MULTIPLE_RULESETS_REQUEST, ) .attributes(GetMultipleRulesetsRequestDataAttributes::new().rulesets(vec![])), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListMultipleRulesets", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.list_multiple_rulesets(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Ruleset get multiple ``` /** * Ruleset get multiple returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listMultipleRulesets"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiListMultipleRulesetsRequest = { body: { data: { attributes: { rulesets: [], }, type: "get_multiple_rulesets_request", }, }, }; apiInstance 
.listMultipleRulesets(params) .then((data: v2.GetMultipleRulesetsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#search-hist-signals) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#search-hist-signals-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/histsignals/searchhttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/histsignals/searchhttps://api.datadoghq.eu/api/v2/siem-threat-hunting/histsignals/searchhttps://api.ddog-gov.com/api/v2/siem-threat-hunting/histsignals/searchhttps://api.datadoghq.com/api/v2/siem-threat-hunting/histsignals/searchhttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/histsignals/searchhttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals/search ### Overview Search hist signals. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description filter object Search filters for listing security signals. from date-time The minimum timestamp for requested security signals. query string Search query for listing security signals. to date-time The maximum timestamp for requested security signals. page object The paging attributes for listing security signals. cursor string A list of results using the cursor provided in the previous query. limit int32 The maximum number of security signals in the response. default: `10` sort enum The sort parameters used for querying security signals. 
Allowed enum values: `timestamp,-timestamp` ``` { "filter": { "from": "2019-01-02T09:42:36.320Z", "query": "security:attack status:high", "to": "2019-01-03T09:42:36.320Z" }, "page": { "cursor": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", "limit": 25 }, "sort": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringHistsignals-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringHistsignals-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringHistsignals-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringHistsignals-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#SearchSecurityMonitoringHistsignals-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response object with all security signals matching the request and pagination information. Field Type Description data [object] An array of security signals matching the request. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal` default: `signal` links object Links attributes. next string The link for the next set of results. **Note** : The request can also be made using the POST endpoint. meta object Meta attributes. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. ``` { "data": [ { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security_monitoring/signals?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Search hist signals Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Search hist signals ``` """ Search hist signals returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.security_monitoring_signal_list_request import SecurityMonitoringSignalListRequest from datadog_api_client.v2.model.security_monitoring_signal_list_request_filter import ( SecurityMonitoringSignalListRequestFilter, ) from datadog_api_client.v2.model.security_monitoring_signal_list_request_page import ( SecurityMonitoringSignalListRequestPage, ) from datadog_api_client.v2.model.security_monitoring_signals_sort import SecurityMonitoringSignalsSort from datetime import datetime from dateutil.tz import tzutc body = SecurityMonitoringSignalListRequest( filter=SecurityMonitoringSignalListRequestFilter( _from=datetime(2019, 1, 2, 9, 42, 36, 320000, tzinfo=tzutc()), query="security:attack status:high", to=datetime(2019, 1, 3, 9, 42, 36, 320000, tzinfo=tzutc()), ), page=SecurityMonitoringSignalListRequestPage( cursor="eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", limit=25, ), sort=SecurityMonitoringSignalsSort.TIMESTAMP_ASCENDING, ) configuration = Configuration() configuration.unstable_operations["search_security_monitoring_histsignals"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.search_security_monitoring_histsignals(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search hist signals ``` # Search hist signals returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.search_security_monitoring_histsignals".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::SecurityMonitoringSignalListRequest.new({ filter: DatadogAPIClient::V2::SecurityMonitoringSignalListRequestFilter.new({ from: "2019-01-02T09:42:36.320Z", query: "security:attack status:high", to: "2019-01-03T09:42:36.320Z", }), page: DatadogAPIClient::V2::SecurityMonitoringSignalListRequestPage.new({ cursor: "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", limit: 25, }), sort: DatadogAPIClient::V2::SecurityMonitoringSignalsSort::TIMESTAMP_ASCENDING, }) opts = { body: body, } p api_instance.search_security_monitoring_histsignals(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search hist signals ``` // Search hist signals returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SecurityMonitoringSignalListRequest{ Filter: &datadogV2.SecurityMonitoringSignalListRequestFilter{ From: datadog.PtrTime(time.Date(2019, 1, 2, 9, 42, 36, 320000, time.UTC)), Query: datadog.PtrString("security:attack status:high"), To: datadog.PtrTime(time.Date(2019, 1, 3, 9, 42, 36, 320000, time.UTC)), }, Page: &datadogV2.SecurityMonitoringSignalListRequestPage{ Cursor: datadog.PtrString("eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ=="), Limit: datadog.PtrInt32(25), }, Sort: datadogV2.SECURITYMONITORINGSIGNALSSORT_TIMESTAMP_ASCENDING.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.SearchSecurityMonitoringHistsignals", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.SearchSecurityMonitoringHistsignals(ctx, *datadogV2.NewSearchSecurityMonitoringHistsignalsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.SearchSecurityMonitoringHistsignals`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.SearchSecurityMonitoringHistsignals`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search hist signals ``` // Search hist signals 
returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.api.SecurityMonitoringApi.SearchSecurityMonitoringHistsignalsOptionalParameters; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequest; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequestFilter; import com.datadog.api.client.v2.model.SecurityMonitoringSignalListRequestPage; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsListResponse; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsSort; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.searchSecurityMonitoringHistsignals", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); SecurityMonitoringSignalListRequest body = new SecurityMonitoringSignalListRequest() .filter( new SecurityMonitoringSignalListRequestFilter() .from(OffsetDateTime.parse("2019-01-02T09:42:36.320Z")) .query("security:attack status:high") .to(OffsetDateTime.parse("2019-01-03T09:42:36.320Z"))) .page( new SecurityMonitoringSignalListRequestPage() .cursor( "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==") .limit(25)) .sort(SecurityMonitoringSignalsSort.TIMESTAMP_ASCENDING); try { SecurityMonitoringSignalsListResponse result = apiInstance.searchSecurityMonitoringHistsignals( new SearchSecurityMonitoringHistsignalsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#searchSecurityMonitoringHistsignals"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search hist signals ``` // Search hist signals returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SearchSecurityMonitoringHistsignalsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequest; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequestFilter; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalListRequestPage; use datadog_api_client::datadogV2::model::SecurityMonitoringSignalsSort; #[tokio::main] async fn main() { let body = SecurityMonitoringSignalListRequest::new() .filter( SecurityMonitoringSignalListRequestFilter::new() .from( DateTime::parse_from_rfc3339("2019-01-02T09:42:36.320000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ) .query("security:attack status:high".to_string()) .to( DateTime::parse_from_rfc3339("2019-01-03T09:42:36.320000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .page( 
SecurityMonitoringSignalListRequestPage::new() .cursor( "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==".to_string(), ) .limit(25), ) .sort(SecurityMonitoringSignalsSort::TIMESTAMP_ASCENDING); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.SearchSecurityMonitoringHistsignals", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .search_security_monitoring_histsignals( SearchSecurityMonitoringHistsignalsOptionalParams::default().body(body), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search hist signals ``` /** * Search hist signals returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.searchSecurityMonitoringHistsignals"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiSearchSecurityMonitoringHistsignalsRequest = { body: { filter: { from: new Date("2019-01-02T09:42:36.320Z"), query: "security:attack status:high", to: new Date("2019-01-03T09:42:36.320Z"), }, page: { cursor: "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", limit: 25, }, sort: "timestamp", }, }; apiInstance .searchSecurityMonitoringHistsignals(params) .then((data: v2.SecurityMonitoringSignalsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a hist signal's details](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-hist-signals-details) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-hist-signals-details-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.datadoghq.eu/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.ddog-gov.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.datadoghq.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.us3.datadoghq.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id}https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals/{histsignal_id} ### Overview Get a hist signal’s details. This endpoint requires the `security_monitoring_signals_read` permission.
OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description histsignal_id [_required_] string The ID of the threat hunting signal. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignal-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignal-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignal-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignal-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignal-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Security Signal response data object. Field Type Description data object Object description of a security signal. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal` default: `signal` ``` { "data": { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a hist signal's details Copy ``` # Path parameters export histsignal_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals/${histsignal_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a hist signal's details ``` """ Get a hist signal's details returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["get_security_monitoring_histsignal"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_monitoring_histsignal( histsignal_id="histsignal_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a hist signal's details ``` # Get a hist signal's details returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_security_monitoring_histsignal".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_security_monitoring_histsignal("histsignal_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a hist signal's details ``` // Get a hist signal's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSecurityMonitoringHistsignal", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityMonitoringHistsignal(ctx, "histsignal_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`SecurityMonitoringApi.GetSecurityMonitoringHistsignal`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityMonitoringHistsignal`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a hist signal's details ``` // Get a hist signal's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSecurityMonitoringHistsignal", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSignalResponse result = apiInstance.getSecurityMonitoringHistsignal("histsignal_id"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSecurityMonitoringHistsignal"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a hist signal's details ``` // Get a hist signal's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSecurityMonitoringHistsignal", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_monitoring_histsignal("histsignal_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a hist signal's details ``` /** * Get a hist signal's details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSecurityMonitoringHistsignal"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetSecurityMonitoringHistsignalRequest = { histsignalId: 
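// Placeholder value: replace with the ID of an existing hist signal,
// for example one returned by the list or search hist signals endpoints.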
"histsignal_id", }; apiInstance .getSecurityMonitoringHistsignal(params) .then((data: v2.SecurityMonitoringSignalResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#list-hist-signals) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#list-hist-signals-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/histsignalshttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/histsignalshttps://api.datadoghq.eu/api/v2/siem-threat-hunting/histsignalshttps://api.ddog-gov.com/api/v2/siem-threat-hunting/histsignalshttps://api.datadoghq.com/api/v2/siem-threat-hunting/histsignalshttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/histsignalshttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals ### Overview List hist signals. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string The search query for security signals. filter[from] string The minimum timestamp for requested security signals. filter[to] string The maximum timestamp for requested security signals. sort enum The order of the security signals in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string A list of results using the cursor provided in the previous query. page[limit] integer The maximum number of security signals in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringHistsignals-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringHistsignals-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringHistsignals-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringHistsignals-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#ListSecurityMonitoringHistsignals-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response object with all security signals matching the request and pagination information. Field Type Description data [object] An array of security signals matching the request. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. 
type enum The type of event. Allowed enum values: `signal` default: `signal` links object Links attributes. next string The link for the next set of results. **Note** : The request can also be made using the POST endpoint. meta object Meta attributes. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. ``` { "data": [ { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security_monitoring/signals?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### List hist signals Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/histsignals" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List hist signals ``` """ List hist signals returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["list_security_monitoring_histsignals"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.list_security_monitoring_histsignals() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List hist signals ``` # List hist signals returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_security_monitoring_histsignals".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.list_security_monitoring_histsignals() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List hist signals ``` // List hist signals returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListSecurityMonitoringHistsignals", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.ListSecurityMonitoringHistsignals(ctx, *datadogV2.NewListSecurityMonitoringHistsignalsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.ListSecurityMonitoringHistsignals`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.ListSecurityMonitoringHistsignals`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List hist signals ``` // List hist signals returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listSecurityMonitoringHistsignals", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSignalsListResponse result = apiInstance.listSecurityMonitoringHistsignals(); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#listSecurityMonitoringHistsignals"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List hist signals ``` // List hist signals returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::ListSecurityMonitoringHistsignalsOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListSecurityMonitoringHistsignals", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .list_security_monitoring_histsignals( ListSecurityMonitoringHistsignalsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List hist signals ``` /** * List hist signals returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listSecurityMonitoringHistsignals"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); apiInstance .listSecurityMonitoringHistsignals() .then((data: v2.SecurityMonitoringSignalsListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a job's hist signals](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-hist-signals) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#get-a-jobs-hist-signals-v2) **Note** : This endpoint is in beta and may be subject to changes. Please check the documentation regularly for updates. GET https://api.ap1.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.ap2.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.datadoghq.eu/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.ddog-gov.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.us3.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignalshttps://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/{job_id}/histsignals ### Overview Get a job’s hist signals. This endpoint requires the `security_monitoring_signals_read` permission. OAuth apps require the `security_monitoring_signals_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#security-monitoring) to access this endpoint. ### Arguments #### Path Parameters Name Type Description job_id [_required_] string The ID of the job. #### Query Strings Name Type Description filter[query] string The search query for security signals. filter[from] string The minimum timestamp for requested security signals. filter[to] string The maximum timestamp for requested security signals. sort enum The order of the security signals in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string A list of results using the cursor provided in the previous query. page[limit] integer The maximum number of security signals in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignalsByJobId-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignalsByJobId-400-v2) * [403](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignalsByJobId-403-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignalsByJobId-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#GetSecurityMonitoringHistsignalsByJobId-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) The response object with all security signals matching the request and pagination information. Field Type Description data [object] An array of security signals matching the request. attributes object The object containing all signal attributes and their associated values. custom object A JSON object of attributes in the security signal. message string The message in the security signal defined by the rule that generated the signal. tags [string] An array of tags associated with the security signal. 
timestamp date-time The timestamp of the security signal. id string The unique ID of the security signal. type enum The type of event. Allowed enum values: `signal` default: `signal` links object Links attributes. next string The link for the next set of results. **Note** : The request can also be made using the POST endpoint. meta object Meta attributes. page object Paging attributes. after string The cursor used to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. ``` { "data": [ { "attributes": { "custom": { "workflow": { "first_seen": "2020-06-23T14:46:01.000Z", "last_seen": "2020-06-23T14:46:49.000Z", "rule": { "id": "0f5-e0c-805", "name": "Brute Force Attack Grouped By User ", "version": 12 } } }, "message": "Detect Account Take Over (ATO) through brute force attempts", "tags": [ "security:attack", "technique:T1110-brute-force" ], "timestamp": "2019-01-02T09:42:36.320Z" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "signal" } ], "links": { "next": "https://app.datadoghq.com/api/v2/security_monitoring/signals?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Get a job's hist signals Copy ``` # Path parameters export job_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/siem-threat-hunting/jobs/${job_id}/histsignals" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a job's hist signals ``` """ Get a job's hist signals returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi configuration = Configuration() configuration.unstable_operations["get_security_monitoring_histsignals_by_job_id"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.get_security_monitoring_histsignals_by_job_id( job_id="job_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a job's hist signals ``` # Get a job's hist signals returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_security_monitoring_histsignals_by_job_id".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new p api_instance.get_security_monitoring_histsignals_by_job_id("job_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a job's hist signals ``` // Get a job's hist signals returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSecurityMonitoringHistsignalsByJobId", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.GetSecurityMonitoringHistsignalsByJobId(ctx, "job_id", *datadogV2.NewGetSecurityMonitoringHistsignalsByJobIdOptionalParameters()) if err != nil { 
fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.GetSecurityMonitoringHistsignalsByJobId`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.GetSecurityMonitoringHistsignalsByJobId`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a job's hist signals ``` // Get a job's hist signals returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.SecurityMonitoringSignalsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSecurityMonitoringHistsignalsByJobId", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); try { SecurityMonitoringSignalsListResponse result = apiInstance.getSecurityMonitoringHistsignalsByJobId("job_id"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SecurityMonitoringApi#getSecurityMonitoringHistsignalsByJobId"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a job's hist signals ``` // Get a job's hist signals returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::GetSecurityMonitoringHistsignalsByJobIdOptionalParams; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration .set_unstable_operation_enabled("v2.GetSecurityMonitoringHistsignalsByJobId", true); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .get_security_monitoring_histsignals_by_job_id( "job_id".to_string(), GetSecurityMonitoringHistsignalsByJobIdOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a job's hist signals ``` /** * Get a job's hist signals returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = 
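// createConfiguration() picks up DD_API_KEY, DD_APP_KEY, and DD_SITE from the
// environment, as set in the run instructions below.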
client.createConfiguration(); configuration.unstableOperations["v2.getSecurityMonitoringHistsignalsByJobId"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiGetSecurityMonitoringHistsignalsByJobIdRequest = { jobId: "job_id", }; apiInstance .getSecurityMonitoringHistsignalsByJobId(params) .then((data: v2.SecurityMonitoringSignalsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create cases for security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#create-cases-for-security-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-cases-for-security-findings-v2) POST https://api.ap1.datadoghq.com/api/v2/security/findings/caseshttps://api.ap2.datadoghq.com/api/v2/security/findings/caseshttps://api.datadoghq.eu/api/v2/security/findings/caseshttps://api.ddog-gov.com/api/v2/security/findings/caseshttps://api.datadoghq.com/api/v2/security/findings/caseshttps://api.us3.datadoghq.com/api/v2/security/findings/caseshttps://api.us5.datadoghq.com/api/v2/security/findings/cases ### Overview Create cases for security findings. You can create up to 50 cases per request and associate up to 50 security findings per case. Security findings that are already attached to another case will be detached from their previous case and attached to the newly created case. This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] [object] attributes object Attributes of the case to create. assignee_id string Unique identifier of the user assigned to the case. description string Description of the case. If not provided, the description will be automatically generated. priority enum Priority of the case. If not provided, the priority will be automatically set to "NOT_DEFINED". Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` title string Title of the case. If not provided, the title will be automatically generated. relationships object Relationships of the case to create. findings [_required_] object Security findings to create a case for. data [object] id [_required_] string Unique identifier of the security finding. type [_required_] enum Security findings resource type. Allowed enum values: `findings` default: `findings` project [_required_] object Case management project in which the case will be created. data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Cases resource type. 
Allowed enum values: `cases` default: `cases` ##### Create case for security finding returns "Created" response ``` { "data": [ { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } ``` Copy ##### Create case for security findings returns "Created" response ``` { "data": [ { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", "type": "findings" }, { "id": "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } ``` Copy ##### Create cases for security findings returns "Created" response ``` { "data": [ { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" }, { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCases-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCases-400-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCases-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateCases-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) List of case responses. Field Type Description data [_required_] [object] attributes object Attributes of the case. archived_at date-time Timestamp of when the case was archived. assigned_to object User assigned to the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` attributes object [string] closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. resource_id string Unique identifier of the resource. For example, the unique identifier of a security finding. type string Type of the resource. For example, the type of a security finding is "SECURITY_FINDING". jira_issue object Jira issue associated with the case. error_message string Error message if the Jira issue creation failed. result object Result of the Jira issue creation. account_id string Account ID of the Jira issue. 
issue_id string Unique identifier of the Jira issue. issue_key string Key of the Jira issue. issue_url string URL of the Jira issue. status string Status of the Jira issue creation. Can be "COMPLETED" if the Jira issue was created successfully, or "FAILED" if the Jira issue creation failed. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority string Priority of the case. status string Status of the case. status_group string Status group of the case. status_name string Status name of the case. title string Title of the case. type string Type of the case. For security cases, this is always "SECURITY". id string Unique identifier of the case. relationships object Relationships of the case. created_by object User who created the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object User who last modified the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` project object Project in which the case was created. data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Cases resource type. Allowed enum values: `cases` default: `cases` ``` { "data": [ { "attributes": { "archived_at": "2025-01-01T00:00:00.000Z", "assigned_to": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "attributes": { "": [] }, "closed_at": "2025-01-01T00:00:00.000Z", "created_at": "2025-01-01T00:00:00.000Z", "creation_source": "CS_SECURITY_FINDING", "description": "A description of the case.", "due_date": "2025-01-01", "insights": [ { "ref": "/security/appsec/vm/library/vulnerability/dfa027f7c037b2f77159adc027fecb56?detection=static", "resource_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "SECURITY_FINDING" } ], "jira_issue": { "error_message": "{\"errorMessages\":[\"An error occured.\"],\"errors\":{}}", "result": { "account_id": "463a8631-680e-455c-bfd3-3ed04d326eb7", "issue_id": "2871276", "issue_key": "PROJ-123", "issue_url": "https://domain.atlassian.net/browse/PROJ-123" }, "status": "COMPLETED" }, "key": "PROJ-123", "modified_at": "2025-01-01T00:00:00.000Z", "priority": "P4", "status": "OPEN", "status_group": "SG_OPEN", "status_name": "Open", "title": "A title for the case.", "type": "SECURITY" }, "id": "c1234567-89ab-cdef-0123-456789abcdef", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "cases" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
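Before the per-language client examples below, the following is a minimal, illustrative sketch of the request and error handling described above: it batches finding IDs so that no case holds more than 50 findings and no single request creates more than 50 cases (the documented limits), and it surfaces the `errors` array returned by the 400/404/429 responses. It is not an official Datadog client example; it assumes plain Python `requests` plus the `DD_API_KEY` and `DD_APP_KEY` environment variables used throughout this page, and the `chunk`/`create_cases` helper names and the single retry on 429 are hypothetical choices for illustration only.

```python
"""Illustrative sketch (not an official example): create security-finding cases in
batches that respect the documented limits (at most 50 findings per case, at most
50 cases per request), using plain `requests` against the documented endpoint."""
import os
import time
import requests

API_ROOT = "https://api.datadoghq.com"  # use the Datadog site that matches your org
HEADERS = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}


def chunk(items, size):
    """Yield consecutive slices of at most `size` elements."""
    for i in range(0, len(items), size):
        yield items[i:i + size]


def create_cases(finding_ids, project_id):
    """Create one case per group of up to 50 findings, at most 50 cases per request."""
    cases = [
        {
            # `attributes` is optional; title and description are auto-generated if omitted.
            "relationships": {
                "findings": {"data": [{"id": fid, "type": "findings"} for fid in group]},
                "project": {"data": {"id": project_id, "type": "projects"}},
            },
            "type": "cases",
        }
        for group in chunk(finding_ids, 50)  # <= 50 findings per case
    ]
    responses = []
    for batch in chunk(cases, 50):  # <= 50 cases per request
        resp = requests.post(
            f"{API_ROOT}/api/v2/security/findings/cases",
            headers=HEADERS,
            json={"data": batch},
        )
        if resp.status_code == 429:
            # Too many requests: back off briefly and retry once (illustrative only).
            time.sleep(5)
            resp = requests.post(
                f"{API_ROOT}/api/v2/security/findings/cases",
                headers=HEADERS,
                json={"data": batch},
            )
        if resp.status_code != 201:
            # 400 and 404 responses carry an `errors` array, as shown in the examples.
            raise RuntimeError(f"{resp.status_code}: {resp.json().get('errors')}")
        responses.append(resp.json())
    return responses
```

A call such as `create_cases(finding_ids, "959a6f71-bac8-4027-b1d3-2264f569296f")` produces the same request body shape as the curl examples in the Code Example section below.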
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create case for security finding returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } EOF ``` ##### Create case for security findings returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", "type": "findings" }, { "id": "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } EOF ``` ##### Create cases for security findings returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { 
"attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" }, { "attributes": { "title": "A title", "description": "A description" }, "relationships": { "findings": { "data": [ { "id": "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "cases" } ] } EOF ``` ##### Create case for security finding returns "Created" response ``` // Create case for security finding returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateCaseRequestArray{ Data: []datadogV2.CreateCaseRequestData{ { Attributes: &datadogV2.CreateCaseRequestDataAttributes{ Title: datadog.PtrString("A title"), Description: datadog.PtrString("A description"), }, Relationships: &datadogV2.CreateCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "959a6f71-bac8-4027-b1d3-2264f569296f", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateCases(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateCases`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateCases`:\n%s\n", responseContent) } ``` Copy ##### Create case for security findings returns "Created" response ``` // Create case for security findings returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateCaseRequestArray{ Data: []datadogV2.CreateCaseRequestData{ { Attributes: &datadogV2.CreateCaseRequestDataAttributes{ Title: datadog.PtrString("A title"), Description: datadog.PtrString("A description"), }, Relationships: &datadogV2.CreateCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, { Id: "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "959a6f71-bac8-4027-b1d3-2264f569296f", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateCases(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateCases`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateCases`:\n%s\n", responseContent) } ``` Copy ##### Create cases for security findings returns "Created" response ``` // Create cases for security findings returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateCaseRequestArray{ Data: []datadogV2.CreateCaseRequestData{ { Attributes: &datadogV2.CreateCaseRequestDataAttributes{ Title: datadog.PtrString("A title"), Description: datadog.PtrString("A description"), }, Relationships: &datadogV2.CreateCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "959a6f71-bac8-4027-b1d3-2264f569296f", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, { Attributes: &datadogV2.CreateCaseRequestDataAttributes{ Title: datadog.PtrString("A title"), Description: datadog.PtrString("A description"), }, Relationships: &datadogV2.CreateCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "959a6f71-bac8-4027-b1d3-2264f569296f", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateCases(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateCases`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateCases`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create case for security finding returns "Created" response ``` // Create case for security finding returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.CaseManagementProject; 
import com.datadog.api.client.v2.model.CaseManagementProjectData; import com.datadog.api.client.v2.model.CaseManagementProjectDataType; import com.datadog.api.client.v2.model.CreateCaseRequestArray; import com.datadog.api.client.v2.model.CreateCaseRequestData; import com.datadog.api.client.v2.model.CreateCaseRequestDataAttributes; import com.datadog.api.client.v2.model.CreateCaseRequestDataRelationships; import com.datadog.api.client.v2.model.FindingCaseResponseArray; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateCaseRequestArray body = new CreateCaseRequestArray() .data( Collections.singletonList( new CreateCaseRequestData() .attributes( new CreateCaseRequestDataAttributes() .title("A title") .description("A description")) .relationships( new CreateCaseRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("959a6f71-bac8-4027-b1d3-2264f569296f") .type(CaseManagementProjectDataType.PROJECTS)))) .type(CaseDataType.CASES))); try { FindingCaseResponseArray result = apiInstance.createCases(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#createCases"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create case for security findings returns "Created" response ``` // Create case for security findings returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.CaseManagementProject; import com.datadog.api.client.v2.model.CaseManagementProjectData; import com.datadog.api.client.v2.model.CaseManagementProjectDataType; import com.datadog.api.client.v2.model.CreateCaseRequestArray; import com.datadog.api.client.v2.model.CreateCaseRequestData; import com.datadog.api.client.v2.model.CreateCaseRequestDataAttributes; import com.datadog.api.client.v2.model.CreateCaseRequestDataRelationships; import com.datadog.api.client.v2.model.FindingCaseResponseArray; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateCaseRequestArray body = new CreateCaseRequestArray() .data( Collections.singletonList( new CreateCaseRequestData() .attributes( new CreateCaseRequestDataAttributes() .title("A title") .description("A description")) .relationships( new CreateCaseRequestDataRelationships() .findings( new Findings() .data( 
Arrays.asList( new FindingData() .id( "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==") .type(FindingDataType.FINDINGS), new FindingData() .id( "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("959a6f71-bac8-4027-b1d3-2264f569296f") .type(CaseManagementProjectDataType.PROJECTS)))) .type(CaseDataType.CASES))); try { FindingCaseResponseArray result = apiInstance.createCases(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#createCases"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create cases for security findings returns "Created" response ``` // Create cases for security findings returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.CaseManagementProject; import com.datadog.api.client.v2.model.CaseManagementProjectData; import com.datadog.api.client.v2.model.CaseManagementProjectDataType; import com.datadog.api.client.v2.model.CreateCaseRequestArray; import com.datadog.api.client.v2.model.CreateCaseRequestData; import com.datadog.api.client.v2.model.CreateCaseRequestDataAttributes; import com.datadog.api.client.v2.model.CreateCaseRequestDataRelationships; import com.datadog.api.client.v2.model.FindingCaseResponseArray; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateCaseRequestArray body = new CreateCaseRequestArray() .data( Arrays.asList( new CreateCaseRequestData() .attributes( new CreateCaseRequestDataAttributes() .title("A title") .description("A description")) .relationships( new CreateCaseRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("959a6f71-bac8-4027-b1d3-2264f569296f") .type(CaseManagementProjectDataType.PROJECTS)))) .type(CaseDataType.CASES), new CreateCaseRequestData() .attributes( new CreateCaseRequestDataAttributes() .title("A title") .description("A description")) .relationships( new CreateCaseRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("959a6f71-bac8-4027-b1d3-2264f569296f") .type(CaseManagementProjectDataType.PROJECTS)))) .type(CaseDataType.CASES))); try { FindingCaseResponseArray result = apiInstance.createCases(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when 
calling SecurityMonitoringApi#createCases"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create case for security finding returns "Created" response ``` """ Create case for security finding returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.case_management_project import CaseManagementProject from datadog_api_client.v2.model.case_management_project_data import CaseManagementProjectData from datadog_api_client.v2.model.case_management_project_data_type import CaseManagementProjectDataType from datadog_api_client.v2.model.create_case_request_array import CreateCaseRequestArray from datadog_api_client.v2.model.create_case_request_data import CreateCaseRequestData from datadog_api_client.v2.model.create_case_request_data_attributes import CreateCaseRequestDataAttributes from datadog_api_client.v2.model.create_case_request_data_relationships import CreateCaseRequestDataRelationships from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = CreateCaseRequestArray( data=[ CreateCaseRequestData( attributes=CreateCaseRequestDataAttributes( title="A title", description="A description", ), relationships=CreateCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="959a6f71-bac8-4027-b1d3-2264f569296f", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=CaseDataType.CASES, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_cases(body=body) print(response) ``` Copy ##### Create case for security findings returns "Created" response ``` """ Create case for security findings returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.case_management_project import CaseManagementProject from datadog_api_client.v2.model.case_management_project_data import CaseManagementProjectData from datadog_api_client.v2.model.case_management_project_data_type import CaseManagementProjectDataType from datadog_api_client.v2.model.create_case_request_array import CreateCaseRequestArray from datadog_api_client.v2.model.create_case_request_data import CreateCaseRequestData from datadog_api_client.v2.model.create_case_request_data_attributes import CreateCaseRequestDataAttributes from 
datadog_api_client.v2.model.create_case_request_data_relationships import CreateCaseRequestDataRelationships from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = CreateCaseRequestArray( data=[ CreateCaseRequestData( attributes=CreateCaseRequestDataAttributes( title="A title", description="A description", ), relationships=CreateCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", type=FindingDataType.FINDINGS, ), FindingData( id="c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="959a6f71-bac8-4027-b1d3-2264f569296f", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=CaseDataType.CASES, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_cases(body=body) print(response) ``` Copy ##### Create cases for security findings returns "Created" response ``` """ Create cases for security findings returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.case_management_project import CaseManagementProject from datadog_api_client.v2.model.case_management_project_data import CaseManagementProjectData from datadog_api_client.v2.model.case_management_project_data_type import CaseManagementProjectDataType from datadog_api_client.v2.model.create_case_request_array import CreateCaseRequestArray from datadog_api_client.v2.model.create_case_request_data import CreateCaseRequestData from datadog_api_client.v2.model.create_case_request_data_attributes import CreateCaseRequestDataAttributes from datadog_api_client.v2.model.create_case_request_data_relationships import CreateCaseRequestDataRelationships from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = CreateCaseRequestArray( data=[ CreateCaseRequestData( attributes=CreateCaseRequestDataAttributes( title="A title", description="A description", ), relationships=CreateCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="959a6f71-bac8-4027-b1d3-2264f569296f", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=CaseDataType.CASES, ), CreateCaseRequestData( attributes=CreateCaseRequestDataAttributes( title="A title", description="A description", ), relationships=CreateCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="959a6f71-bac8-4027-b1d3-2264f569296f", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=CaseDataType.CASES, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = 
SecurityMonitoringApi(api_client) response = api_instance.create_cases(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create case for security finding returns "Created" response ``` # Create case for security finding returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateCaseRequestArray.new({ data: [ DatadogAPIClient::V2::CreateCaseRequestData.new({ attributes: DatadogAPIClient::V2::CreateCaseRequestDataAttributes.new({ title: "A title", description: "A description", }), relationships: DatadogAPIClient::V2::CreateCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), ], }) p api_instance.create_cases(body) ``` Copy ##### Create case for security findings returns "Created" response ``` # Create case for security findings returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateCaseRequestArray.new({ data: [ DatadogAPIClient::V2::CreateCaseRequestData.new({ attributes: DatadogAPIClient::V2::CreateCaseRequestDataAttributes.new({ title: "A title", description: "A description", }), relationships: DatadogAPIClient::V2::CreateCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), DatadogAPIClient::V2::FindingData.new({ id: "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), ], }) p api_instance.create_cases(body) ``` Copy ##### Create cases for security findings returns "Created" response ``` # Create cases for security findings returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateCaseRequestArray.new({ data: [ DatadogAPIClient::V2::CreateCaseRequestData.new({ attributes: DatadogAPIClient::V2::CreateCaseRequestDataAttributes.new({ title: "A title", description: "A description", }), relationships: DatadogAPIClient::V2::CreateCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: 
"YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), DatadogAPIClient::V2::CreateCaseRequestData.new({ attributes: DatadogAPIClient::V2::CreateCaseRequestDataAttributes.new({ title: "A title", description: "A description", }), relationships: DatadogAPIClient::V2::CreateCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), ], }) p api_instance.create_cases(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create case for security finding returns "Created" response ``` // Create case for security finding returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::CaseManagementProject; use datadog_api_client::datadogV2::model::CaseManagementProjectData; use datadog_api_client::datadogV2::model::CaseManagementProjectDataType; use datadog_api_client::datadogV2::model::CreateCaseRequestArray; use datadog_api_client::datadogV2::model::CreateCaseRequestData; use datadog_api_client::datadogV2::model::CreateCaseRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = CreateCaseRequestArray::new( vec![ CreateCaseRequestData::new(CaseDataType::CASES) .attributes( CreateCaseRequestDataAttributes::new() .description("A description".to_string()) .title("A title".to_string()), ) .relationships( CreateCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=".to_string(), FindingDataType::FINDINGS, ) ], ), CaseManagementProject::new( CaseManagementProjectData::new( "959a6f71-bac8-4027-b1d3-2264f569296f".to_string(), CaseManagementProjectDataType::PROJECTS, ), ), ), ) ], ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_cases(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } 
``` Copy ##### Create case for security findings returns "Created" response ``` // Create case for security findings returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::CaseManagementProject; use datadog_api_client::datadogV2::model::CaseManagementProjectData; use datadog_api_client::datadogV2::model::CaseManagementProjectDataType; use datadog_api_client::datadogV2::model::CreateCaseRequestArray; use datadog_api_client::datadogV2::model::CreateCaseRequestData; use datadog_api_client::datadogV2::model::CreateCaseRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = CreateCaseRequestArray::new(vec![CreateCaseRequestData::new(CaseDataType::CASES) .attributes( CreateCaseRequestDataAttributes::new() .description("A description".to_string()) .title("A title".to_string()), ) .relationships(CreateCaseRequestDataRelationships::new( Findings::new().data(vec![ FindingData::new( "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==".to_string(), FindingDataType::FINDINGS, ), FindingData::new( "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==".to_string(), FindingDataType::FINDINGS, ), ]), CaseManagementProject::new(CaseManagementProjectData::new( "959a6f71-bac8-4027-b1d3-2264f569296f".to_string(), CaseManagementProjectDataType::PROJECTS, )), ))]); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_cases(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create cases for security findings returns "Created" response ``` // Create cases for security findings returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::CaseManagementProject; use datadog_api_client::datadogV2::model::CaseManagementProjectData; use datadog_api_client::datadogV2::model::CaseManagementProjectDataType; use datadog_api_client::datadogV2::model::CreateCaseRequestArray; use datadog_api_client::datadogV2::model::CreateCaseRequestData; use datadog_api_client::datadogV2::model::CreateCaseRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = CreateCaseRequestArray::new( vec![ CreateCaseRequestData::new(CaseDataType::CASES) .attributes( CreateCaseRequestDataAttributes::new() .description("A description".to_string()) .title("A title".to_string()), ) .relationships( CreateCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=".to_string(), FindingDataType::FINDINGS, ) ], ), CaseManagementProject::new( CaseManagementProjectData::new( "959a6f71-bac8-4027-b1d3-2264f569296f".to_string(), 
CaseManagementProjectDataType::PROJECTS, ), ), ), ), CreateCaseRequestData::new(CaseDataType::CASES) .attributes( CreateCaseRequestDataAttributes::new() .description("A description".to_string()) .title("A title".to_string()), ) .relationships( CreateCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=".to_string(), FindingDataType::FINDINGS, ) ], ), CaseManagementProject::new( CaseManagementProjectData::new( "959a6f71-bac8-4027-b1d3-2264f569296f".to_string(), CaseManagementProjectDataType::PROJECTS, ), ), ), ) ], ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.create_cases(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create case for security finding returns "Created" response ``` /** * Create case for security finding returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateCasesRequest = { body: { data: [ { attributes: { title: "A title", description: "A description", }, relationships: { findings: { data: [ { id: "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type: "findings", }, ], }, project: { data: { id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: "projects", }, }, }, type: "cases", }, ], }, }; apiInstance .createCases(params) .then((data: v2.FindingCaseResponseArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create case for security findings returns "Created" response ``` /** * Create case for security findings returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateCasesRequest = { body: { data: [ { attributes: { title: "A title", description: "A description", }, relationships: { findings: { data: [ { id: "ZTd5LWNuYi1seWV-aS0wMjI2NGZjZjRmZWQ5ODMyMg==", type: "findings", }, { id: "c2FuLXhyaS1kZnN-aS0wODM3MjVhMTM2MDExNzNkOQ==", type: "findings", }, ], }, project: { data: { id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: "projects", }, }, }, type: "cases", }, ], }, }; apiInstance .createCases(params) .then((data: v2.FindingCaseResponseArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create cases for security findings returns "Created" response ``` /** * Create cases for security findings returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateCasesRequest = { body: { data: [ { attributes: { title: "A title", description: "A description", }, relationships: { findings: { data: [ { id: "YjdhNDM3N2QyNTFjYmUwYTY3NDdhMTg0YTk2Yjg5MDl-ZjNmMzAwOTFkZDNhNGQzYzI0MzgxNTk4MjRjZmE2NzE=", type: "findings", }, ], }, project: { data: { id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: "projects", }, }, }, type: "cases", }, { attributes: { title: "A title", description: "A description", }, relationships: { findings: { data: [ { id: "OGRlMDIwYzk4MjFmZTZiNTQwMzk2ZjUxNzg0MDc0NjR-MTk3Yjk4MDI4ZDQ4YzI2ZGZiMWJmMTNhNDEwZGZkYWI=", type: "findings", }, ], }, project: { data: { id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: "projects", }, }, }, type: "cases", }, ], }, }; apiInstance .createCases(params) .then((data: v2.FindingCaseResponseArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Attach security findings to a case](https://docs.datadoghq.com/api/latest/security-monitoring/#attach-security-findings-to-a-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#attach-security-findings-to-a-case-v2) PATCH https://api.ap1.datadoghq.com/api/v2/security/findings/cases/{case_id}https://api.ap2.datadoghq.com/api/v2/security/findings/cases/{case_id}https://api.datadoghq.eu/api/v2/security/findings/cases/{case_id}https://api.ddog-gov.com/api/v2/security/findings/cases/{case_id}https://api.datadoghq.com/api/v2/security/findings/cases/{case_id}https://api.us3.datadoghq.com/api/v2/security/findings/cases/{case_id}https://api.us5.datadoghq.com/api/v2/security/findings/cases/{case_id} ### Overview Attach security findings to a case. You can attach up to 50 security findings per case. Security findings that are already attached to another case will be detached from their previous case and attached to the specified case. This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Arguments #### Path Parameters Name Type Description case_id [_required_] string Unique identifier of the case to attach security findings to ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the case to attach security findings to. id [_required_] string Unique identifier of the case. relationships object Relationships of the case to attach security findings to. findings [_required_] object Security findings to attach to the case. data [object] id [_required_] string Unique identifier of the security finding. 
type [_required_] enum Security findings resource type. Allowed enum values: `findings` default: `findings` type [_required_] enum Cases resource type. Allowed enum values: `cases` default: `cases` ##### Attach security finding to a case returns "OK" response ``` { "data": { "id": "7d16945b-baf8-411e-ab2a-20fe43af1ea3", "relationships": { "findings": { "data": [ { "id": "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", "type": "findings" } ] } }, "type": "cases" } } ``` Copy ##### Attach security findings to a case returns "OK" response ``` { "data": { "id": "7d16945b-baf8-411e-ab2a-20fe43af1ea3", "relationships": { "findings": { "data": [ { "id": "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", "type": "findings" }, { "id": "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", "type": "findings" } ] } }, "type": "cases" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachCase-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachCase-400-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachCase-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Case response. Field Type Description data object Data of the case. attributes object Attributes of the case. archived_at date-time Timestamp of when the case was archived. assigned_to object User assigned to the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` attributes object [string] closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. resource_id string Unique identifier of the resource. For example, the unique identifier of a security finding. type string Type of the resource. For example, the type of a security finding is "SECURITY_FINDING". jira_issue object Jira issue associated with the case. error_message string Error message if the Jira issue creation failed. result object Result of the Jira issue creation. account_id string Account ID of the Jira issue. issue_id string Unique identifier of the Jira issue. issue_key string Key of the Jira issue. issue_url string URL of the Jira issue. status string Status of the Jira issue creation. Can be "COMPLETED" if the Jira issue was created successfully, or "FAILED" if the Jira issue creation failed. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority string Priority of the case. status string Status of the case. status_group string Status group of the case. status_name string Status name of the case. title string Title of the case. type string Type of the case. For security cases, this is always "SECURITY". id string Unique identifier of the case. relationships object Relationships of the case. created_by object User who created the case. data [_required_] object Relationship to user object. 
id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object User who last modified the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` project object Project in which the case was created. data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Cases resource type. Allowed enum values: `cases` default: `cases` ``` { "data": { "attributes": { "archived_at": "2025-01-01T00:00:00.000Z", "assigned_to": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "attributes": { "": [] }, "closed_at": "2025-01-01T00:00:00.000Z", "created_at": "2025-01-01T00:00:00.000Z", "creation_source": "CS_SECURITY_FINDING", "description": "A description of the case.", "due_date": "2025-01-01", "insights": [ { "ref": "/security/appsec/vm/library/vulnerability/dfa027f7c037b2f77159adc027fecb56?detection=static", "resource_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "SECURITY_FINDING" } ], "jira_issue": { "error_message": "{\"errorMessages\":[\"An error occured.\"],\"errors\":{}}", "result": { "account_id": "463a8631-680e-455c-bfd3-3ed04d326eb7", "issue_id": "2871276", "issue_key": "PROJ-123", "issue_url": "https://domain.atlassian.net/browse/PROJ-123" }, "status": "COMPLETED" }, "key": "PROJ-123", "modified_at": "2025-01-01T00:00:00.000Z", "priority": "P4", "status": "OPEN", "status_group": "SG_OPEN", "status_name": "Open", "title": "A title for the case.", "type": "SECURITY" }, "id": "c1234567-89ab-cdef-0123-456789abcdef", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "cases" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
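As with case creation, the following is a minimal, illustrative sketch (not an official client example) of the PATCH request described above, again assuming plain Python `requests` and the `DD_API_KEY`/`DD_APP_KEY` environment variables used on this page; the `attach_findings` helper name and the client-side guard on the documented 50-finding limit are assumptions added for illustration.

```python
"""Illustrative sketch (not an official example): attach up to 50 security findings
to an existing case via the documented PATCH endpoint, using plain `requests`."""
import os
import requests

API_ROOT = "https://api.datadoghq.com"  # use the Datadog site that matches your org
HEADERS = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}


def attach_findings(case_id, finding_ids):
    """PATCH /api/v2/security/findings/cases/{case_id} with a findings relationship."""
    if len(finding_ids) > 50:
        # The endpoint documents a limit of 50 security findings per case.
        raise ValueError("A case can hold at most 50 security findings")
    payload = {
        "data": {
            "id": case_id,
            "relationships": {
                "findings": {
                    "data": [{"id": fid, "type": "findings"} for fid in finding_ids]
                }
            },
            "type": "cases",
        }
    }
    resp = requests.patch(
        f"{API_ROOT}/api/v2/security/findings/cases/{case_id}",
        headers=HEADERS,
        json=payload,
    )
    if resp.status_code != 200:
        # 400, 404, and 429 responses carry an `errors` array, as shown in the examples.
        raise RuntimeError(f"{resp.status_code}: {resp.json().get('errors')}")
    return resp.json()
```

Calling `attach_findings("7d16945b-baf8-411e-ab2a-20fe43af1ea3", [finding_id])` mirrors the first curl example in the Code Example section that follows.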
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Attach security finding to a case returns "OK" response Copy ``` # Path parameters export case_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases/${case_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "7d16945b-baf8-411e-ab2a-20fe43af1ea3", "relationships": { "findings": { "data": [ { "id": "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", "type": "findings" } ] } }, "type": "cases" } } EOF ``` ##### Attach security findings to a case returns "OK" response Copy ``` # Path parameters export case_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases/${case_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "7d16945b-baf8-411e-ab2a-20fe43af1ea3", "relationships": { "findings": { "data": [ { "id": "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", "type": "findings" }, { "id": "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", "type": "findings" } ] } }, "type": "cases" } } EOF ``` ##### Attach security finding to a case returns "OK" response ``` // Attach security finding to a case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AttachCaseRequest{ Data: &datadogV2.AttachCaseRequestData{ Id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", Relationships: &datadogV2.AttachCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.AttachCase(ctx, "7d16945b-baf8-411e-ab2a-20fe43af1ea3", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.AttachCase`: %v\n", err) fmt.Fprintf(os.Stderr, 
"Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.AttachCase`:\n%s\n", responseContent) } ``` Copy ##### Attach security findings to a case returns "OK" response ``` // Attach security findings to a case returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AttachCaseRequest{ Data: &datadogV2.AttachCaseRequestData{ Id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", Relationships: &datadogV2.AttachCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, { Id: "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.AttachCase(ctx, "7d16945b-baf8-411e-ab2a-20fe43af1ea3", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.AttachCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.AttachCase`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Attach security finding to a case returns "OK" response ``` // Attach security finding to a case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.AttachCaseRequest; import com.datadog.api.client.v2.model.AttachCaseRequestData; import com.datadog.api.client.v2.model.AttachCaseRequestDataRelationships; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.FindingCaseResponse; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); AttachCaseRequest body = new AttachCaseRequest() .data( new AttachCaseRequestData() .id("7d16945b-baf8-411e-ab2a-20fe43af1ea3") .relationships( new AttachCaseRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=") .type(FindingDataType.FINDINGS))))) .type(CaseDataType.CASES)); try { FindingCaseResponse result = apiInstance.attachCase("7d16945b-baf8-411e-ab2a-20fe43af1ea3", 
body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#attachCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Attach security findings to a case returns "OK" response ``` // Attach security findings to a case returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.AttachCaseRequest; import com.datadog.api.client.v2.model.AttachCaseRequestData; import com.datadog.api.client.v2.model.AttachCaseRequestDataRelationships; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.FindingCaseResponse; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); AttachCaseRequest body = new AttachCaseRequest() .data( new AttachCaseRequestData() .id("7d16945b-baf8-411e-ab2a-20fe43af1ea3") .relationships( new AttachCaseRequestDataRelationships() .findings( new Findings() .data( Arrays.asList( new FindingData() .id( "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=") .type(FindingDataType.FINDINGS), new FindingData() .id( "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=") .type(FindingDataType.FINDINGS))))) .type(CaseDataType.CASES)); try { FindingCaseResponse result = apiInstance.attachCase("7d16945b-baf8-411e-ab2a-20fe43af1ea3", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#attachCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Attach security finding to a case returns "OK" response ``` """ Attach security finding to a case returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.attach_case_request import AttachCaseRequest from datadog_api_client.v2.model.attach_case_request_data import AttachCaseRequestData from datadog_api_client.v2.model.attach_case_request_data_relationships import AttachCaseRequestDataRelationships from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = AttachCaseRequest( data=AttachCaseRequestData( 
id="7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships=AttachCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type=FindingDataType.FINDINGS, ), ], ), ), type=CaseDataType.CASES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.attach_case(case_id="7d16945b-baf8-411e-ab2a-20fe43af1ea3", body=body) print(response) ``` Copy ##### Attach security findings to a case returns "OK" response ``` """ Attach security findings to a case returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.attach_case_request import AttachCaseRequest from datadog_api_client.v2.model.attach_case_request_data import AttachCaseRequestData from datadog_api_client.v2.model.attach_case_request_data_relationships import AttachCaseRequestDataRelationships from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = AttachCaseRequest( data=AttachCaseRequestData( id="7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships=AttachCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type=FindingDataType.FINDINGS, ), FindingData( id="MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", type=FindingDataType.FINDINGS, ), ], ), ), type=CaseDataType.CASES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.attach_case(case_id="7d16945b-baf8-411e-ab2a-20fe43af1ea3", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Attach security finding to a case returns "OK" response ``` # Attach security finding to a case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::AttachCaseRequest.new({ data: DatadogAPIClient::V2::AttachCaseRequestData.new({ id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships: DatadogAPIClient::V2::AttachCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), }) p api_instance.attach_case("7d16945b-baf8-411e-ab2a-20fe43af1ea3", body) ``` Copy ##### Attach security findings to a case returns "OK" response ``` # Attach security findings to a case returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = 
DatadogAPIClient::V2::AttachCaseRequest.new({ data: DatadogAPIClient::V2::AttachCaseRequestData.new({ id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships: DatadogAPIClient::V2::AttachCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), DatadogAPIClient::V2::FindingData.new({ id: "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), }) p api_instance.attach_case("7d16945b-baf8-411e-ab2a-20fe43af1ea3", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Attach security finding to a case returns "OK" response ``` // Attach security finding to a case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AttachCaseRequest; use datadog_api_client::datadogV2::model::AttachCaseRequestData; use datadog_api_client::datadogV2::model::AttachCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = AttachCaseRequest ::new().data( AttachCaseRequestData::new( "7d16945b-baf8-411e-ab2a-20fe43af1ea3".to_string(), CaseDataType::CASES, ).relationships( AttachCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=".to_string(), FindingDataType::FINDINGS, ) ], ), ), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .attach_case("7d16945b-baf8-411e-ab2a-20fe43af1ea3".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Attach security findings to a case returns "OK" response ``` // Attach security findings to a case returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AttachCaseRequest; use datadog_api_client::datadogV2::model::AttachCaseRequestData; use datadog_api_client::datadogV2::model::AttachCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = AttachCaseRequest ::new().data( AttachCaseRequestData::new( "7d16945b-baf8-411e-ab2a-20fe43af1ea3".to_string(), CaseDataType::CASES, ).relationships( AttachCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( 
"ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=".to_string(), FindingDataType::FINDINGS, ), FindingData::new( "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=".to_string(), FindingDataType::FINDINGS, ) ], ), ), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api .attach_case("7d16945b-baf8-411e-ab2a-20fe43af1ea3".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Attach security finding to a case returns "OK" response ``` /** * Attach security finding to a case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiAttachCaseRequest = { body: { data: { id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships: { findings: { data: [ { id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type: "findings", }, ], }, }, type: "cases", }, }, caseId: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", }; apiInstance .attachCase(params) .then((data: v2.FindingCaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Attach security findings to a case returns "OK" response ``` /** * Attach security findings to a case returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiAttachCaseRequest = { body: { data: { id: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", relationships: { findings: { data: [ { id: "ZGZhMDI3ZjdjMDM3YjJmNzcxNTlhZGMwMjdmZWNiNTZ-MTVlYTNmYWU3NjNlOTNlYTE2YjM4N2JmZmI4Yjk5N2Y=", type: "findings", }, { id: "MmUzMzZkODQ2YTI3NDU0OTk4NDk3NzhkOTY5YjU2Zjh-YWJjZGI1ODI4OTYzNWM3ZmUwZTBlOWRkYTRiMGUyOGQ=", type: "findings", }, ], }, }, type: "cases", }, }, caseId: "7d16945b-baf8-411e-ab2a-20fe43af1ea3", }; apiInstance .attachCase(params) .then((data: v2.FindingCaseResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Detach security findings from their case](https://docs.datadoghq.com/api/latest/security-monitoring/#detach-security-findings-from-their-case) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#detach-security-findings-from-their-case-v2) DELETE https://api.ap1.datadoghq.com/api/v2/security/findings/caseshttps://api.ap2.datadoghq.com/api/v2/security/findings/caseshttps://api.datadoghq.eu/api/v2/security/findings/caseshttps://api.ddog-gov.com/api/v2/security/findings/caseshttps://api.datadoghq.com/api/v2/security/findings/caseshttps://api.us3.datadoghq.com/api/v2/security/findings/caseshttps://api.us5.datadoghq.com/api/v2/security/findings/cases ### Overview Detach security findings from their case. This operation dissociates security findings from their associated cases without deleting the cases themselves. You can detach security findings from multiple different cases in a single request, with a limit of 50 security findings per request. Security findings that are not currently attached to any case will be ignored. This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data for detaching security findings from their case. relationships object Relationships detaching security findings from their case. findings [_required_] object Security findings to detach from their case. data [object] id [_required_] string Unique identifier of the security finding. type [_required_] enum Security findings resource type. Allowed enum values: `findings` default: `findings` type [_required_] enum Cases resource type. Allowed enum values: `cases` default: `cases` ``` { "data": { "relationships": { "findings": { "data": [ { "id": "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", "type": "findings" } ] } }, "type": "cases" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/security-monitoring/#DetachCase-204-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#DetachCase-400-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#DetachCase-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#DetachCase-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Detach security findings from their case returns "No Content" response Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/cases" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "relationships": { "findings": { "data": [ { "id": "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", "type": "findings" } ] } }, "type": "cases" } } EOF ``` ##### Detach security findings from their case returns "No Content" response ``` // Detach security findings from their case returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.DetachCaseRequest{ Data: &datadogV2.DetachCaseRequestData{ Relationships: &datadogV2.DetachCaseRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, }, Type: datadogV2.CASEDATATYPE_CASES, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) r, err := api.DetachCase(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.DetachCase`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Detach security findings from their case returns "No Content" response ``` // Detach security findings from their case returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CaseDataType; import com.datadog.api.client.v2.model.DetachCaseRequest; 
import com.datadog.api.client.v2.model.DetachCaseRequestData; import com.datadog.api.client.v2.model.DetachCaseRequestDataRelationships; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); DetachCaseRequest body = new DetachCaseRequest() .data( new DetachCaseRequestData() .relationships( new DetachCaseRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=") .type(FindingDataType.FINDINGS))))) .type(CaseDataType.CASES)); try { apiInstance.detachCase(body); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#detachCase"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Detach security findings from their case returns "No Content" response ``` """ Detach security findings from their case returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.case_data_type import CaseDataType from datadog_api_client.v2.model.detach_case_request import DetachCaseRequest from datadog_api_client.v2.model.detach_case_request_data import DetachCaseRequestData from datadog_api_client.v2.model.detach_case_request_data_relationships import DetachCaseRequestDataRelationships from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings body = DetachCaseRequest( data=DetachCaseRequestData( relationships=DetachCaseRequestDataRelationships( findings=Findings( data=[ FindingData( id="YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", type=FindingDataType.FINDINGS, ), ], ), ), type=CaseDataType.CASES, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) api_instance.detach_case(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Detach security findings from their case returns "No Content" response ``` # Detach security findings from their case returns "No Content" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::DetachCaseRequest.new({ data: DatadogAPIClient::V2::DetachCaseRequestData.new({ relationships: DatadogAPIClient::V2::DetachCaseRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), }), type: DatadogAPIClient::V2::CaseDataType::CASES, }), }) api_instance.detach_case(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Detach security findings from their case returns "No Content" response ``` // Detach security findings from their case returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CaseDataType; use datadog_api_client::datadogV2::model::DetachCaseRequest; use datadog_api_client::datadogV2::model::DetachCaseRequestData; use datadog_api_client::datadogV2::model::DetachCaseRequestDataRelationships; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; #[tokio::main] async fn main() { let body = DetachCaseRequest ::new().data( DetachCaseRequestData::new( CaseDataType::CASES, ).relationships( DetachCaseRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=".to_string(), FindingDataType::FINDINGS, ) ], ), ), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.detach_case(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Detach security findings from their case returns "No Content" response ``` /** * Detach security findings from their case returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiDetachCaseRequest = { body: { data: { relationships: { findings: { data: [ { id: "YzM2MTFjYzcyNmY0Zjg4MTAxZmRlNjQ1MWU1ZGQwYzR-YzI5NzE5Y2Y4MzU4ZjliNzhkNjYxNTY0ODIzZDQ2YTM=", type: "findings", }, ], }, }, type: "cases", }, }, }; apiInstance .detachCase(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Jira issues for security findings](https://docs.datadoghq.com/api/latest/security-monitoring/#create-jira-issues-for-security-findings) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#create-jira-issues-for-security-findings-v2) **Note** : This endpoint is in beta and is subject to change. Please check the documentation regularly for updates. POST https://api.ap1.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.ap2.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.datadoghq.eu/api/v2/security/findings/jira_issueshttps://api.ddog-gov.com/api/v2/security/findings/jira_issueshttps://api.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.us3.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.us5.datadoghq.com/api/v2/security/findings/jira_issues ### Overview Create Jira issues for security findings. This operation creates a case in Datadog and a Jira issue linked to that case for bidirectional sync between Datadog and Jira. To configure the Jira integration, see [Bidirectional ticket syncing with Jira](https://docs.datadoghq.com/security/ticketing_integrations/#bidirectional-ticket-syncing-with-jira). You can create up to 50 Jira issues per request and associate up to 50 security findings per Jira issue. Security findings that are already attached to another Jira issue will be detached from their previous Jira issue and attached to the newly created Jira issue. This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data [_required_] [object] attributes object Attributes of the Jira issue to create. assignee_id string Unique identifier of the user assigned to the Jira issue. description string Description of the Jira issue. If not provided, the description will be automatically generated. fields object Custom fields of the Jira issue to create. For the list of available fields, see [Jira documentation](https://developer.atlassian.com/cloud/jira/platform/rest/v2/api-group-issues/#api-rest-api-2-issue-createmeta-projectidorkey-issuetypes-issuetypeid-get). priority enum Priority of the Jira issue. If not provided, the priority will be automatically set to "NOT_DEFINED". Allowed enum values: `NOT_DEFINED,P1,P2,P3,P4,P5` default: `NOT_DEFINED` title string Title of the Jira issue. If not provided, the title will be automatically generated. relationships object Relationships of the Jira issue to create. findings [_required_] object Security findings to create a Jira issue for. data [object] id [_required_] string Unique identifier of the security finding. type [_required_] enum Security findings resource type. Allowed enum values: `findings` default: `findings` project [_required_] object Case management project configured with the Jira integration. It is used to create the Jira issue. 
To configure the Jira integration, see [Bidirectional ticket syncing with Jira](https://docs.datadoghq.com/security/ticketing_integrations/#bidirectional-ticket-syncing-with-jira). data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Jira issues resource type. Allowed enum values: `jira_issues` default: `jira_issues` ``` { "data": [ { "type": "jira_issues", "attributes": {}, "relationships": { "case": { "data": { "type": "cases", "id": "53e242c6-a7d6-46ad-9680-b8d14753f716" } } } }, { "type": "jira_issues", "attributes": {}, "relationships": { "case": { "data": { "type": "cases", "id": "195772b2-1f53-41d2-b81e-48c8e6c21d33" } } } } ], "included": [ { "type": "cases", "attributes": { "title": "A title", "description": "A description" }, "relationships": { "project": { "data": { "type": "projects", "id": "959a6f71-bac8-4027-b1d3-2264f569296f" } }, "findings": { "data": [ { "type": "findings", "id": "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=" } ] } }, "id": "53e242c6-a7d6-46ad-9680-b8d14753f716" }, { "type": "cases", "attributes": { "title": "A title", "description": "A description" }, "relationships": { "project": { "data": { "type": "projects", "id": "959a6f71-bac8-4027-b1d3-2264f569296f" } }, "findings": { "data": [ { "type": "findings", "id": "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=" } ] } }, "id": "195772b2-1f53-41d2-b81e-48c8e6c21d33" }, { "type": "projects", "id": "959a6f71-bac8-4027-b1d3-2264f569296f" }, { "type": "findings", "id": "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=" }, { "type": "findings", "id": "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=" } ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateJiraIssues-201-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateJiraIssues-400-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateJiraIssues-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#CreateJiraIssues-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) List of case responses. Field Type Description data [_required_] [object] attributes object Attributes of the case. archived_at date-time Timestamp of when the case was archived. assigned_to object User assigned to the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` attributes object [string] closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. resource_id string Unique identifier of the resource. For example, the unique identifier of a security finding. type string Type of the resource. For example, the type of a security finding is "SECURITY_FINDING". jira_issue object Jira issue associated with the case. 
error_message string Error message if the Jira issue creation failed. result object Result of the Jira issue creation. account_id string Account ID of the Jira issue. issue_id string Unique identifier of the Jira issue. issue_key string Key of the Jira issue. issue_url string URL of the Jira issue. status string Status of the Jira issue creation. Can be "COMPLETED" if the Jira issue was created successfully, or "FAILED" if the Jira issue creation failed. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority string Priority of the case. status string Status of the case. status_group string Status group of the case. status_name string Status name of the case. title string Title of the case. type string Type of the case. For security cases, this is always "SECURITY". id string Unique identifier of the case. relationships object Relationships of the case. created_by object User who created the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object User who last modified the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` project object Project in which the case was created. data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Cases resource type. Allowed enum values: `cases` default: `cases` ``` { "data": [ { "attributes": { "archived_at": "2025-01-01T00:00:00.000Z", "assigned_to": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "attributes": { "": [] }, "closed_at": "2025-01-01T00:00:00.000Z", "created_at": "2025-01-01T00:00:00.000Z", "creation_source": "CS_SECURITY_FINDING", "description": "A description of the case.", "due_date": "2025-01-01", "insights": [ { "ref": "/security/appsec/vm/library/vulnerability/dfa027f7c037b2f77159adc027fecb56?detection=static", "resource_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "SECURITY_FINDING" } ], "jira_issue": { "error_message": "{\"errorMessages\":[\"An error occured.\"],\"errors\":{}}", "result": { "account_id": "463a8631-680e-455c-bfd3-3ed04d326eb7", "issue_id": "2871276", "issue_key": "PROJ-123", "issue_url": "https://domain.atlassian.net/browse/PROJ-123" }, "status": "COMPLETED" }, "key": "PROJ-123", "modified_at": "2025-01-01T00:00:00.000Z", "priority": "P4", "status": "OPEN", "status_group": "SG_OPEN", "status_name": "Open", "title": "A title for the case.", "type": "SECURITY" }, "id": "c1234567-89ab-cdef-0123-456789abcdef", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "cases" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Create Jira issues for security findings Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/jira_issues" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "relationships": { "findings": { "data": [ { "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "findings" } ] }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "jira_issues" } ] } EOF ``` ##### Create Jira issues for security findings ``` """ Create Jira issues for security findings returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.case_management_project import CaseManagementProject from datadog_api_client.v2.model.case_management_project_data import CaseManagementProjectData from datadog_api_client.v2.model.case_management_project_data_type import CaseManagementProjectDataType from datadog_api_client.v2.model.case_priority import CasePriority from datadog_api_client.v2.model.create_jira_issue_request_array import CreateJiraIssueRequestArray from datadog_api_client.v2.model.create_jira_issue_request_data import CreateJiraIssueRequestData from datadog_api_client.v2.model.create_jira_issue_request_data_attributes import CreateJiraIssueRequestDataAttributes from datadog_api_client.v2.model.create_jira_issue_request_data_relationships import ( CreateJiraIssueRequestDataRelationships, ) from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings from datadog_api_client.v2.model.jira_issues_data_type import JiraIssuesDataType body = CreateJiraIssueRequestArray( data=[ CreateJiraIssueRequestData( attributes=CreateJiraIssueRequestDataAttributes( assignee_id="f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0", description="A 
description of the Jira issue.", fields=dict([("key1", "value"), ("key2", "['value']"), ("key3", "{'key4': 'value'}")]), priority=CasePriority.NOT_DEFINED, title="A title for the Jira issue.", ), relationships=CreateJiraIssueRequestDataRelationships( findings=Findings( data=[ FindingData( id="ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="aeadc05e-98a8-11ec-ac2c-da7ad0900001", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=JiraIssuesDataType.JIRA_ISSUES, ), ], ) configuration = Configuration() configuration.unstable_operations["create_jira_issues"] = True with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.create_jira_issues(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Jira issues for security findings ``` # Create Jira issues for security findings returns "Created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_jira_issues".to_sym] = true end api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::CreateJiraIssueRequestArray.new({ data: [ DatadogAPIClient::V2::CreateJiraIssueRequestData.new({ attributes: DatadogAPIClient::V2::CreateJiraIssueRequestDataAttributes.new({ assignee_id: "f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0", description: "A description of the Jira issue.", fields: { "key1": "value", "key2": "['value']", "key3": "{'key4': 'value'}", }, priority: DatadogAPIClient::V2::CasePriority::NOT_DEFINED, title: "A title for the Jira issue.", }), relationships: DatadogAPIClient::V2::CreateJiraIssueRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::JiraIssuesDataType::JIRA_ISSUES, }), ], }) p api_instance.create_jira_issues(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Jira issues for security findings ``` // Create Jira issues for security findings returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateJiraIssueRequestArray{ Data: []datadogV2.CreateJiraIssueRequestData{ { Attributes: &datadogV2.CreateJiraIssueRequestDataAttributes{ AssigneeId: 
datadog.PtrString("f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0"), Description: datadog.PtrString("A description of the Jira issue."), Fields: map[string]interface{}{ "key1": "value", "key2": "['value']", "key3": "{'key4': 'value'}", }, Priority: datadogV2.CASEPRIORITY_NOT_DEFINED.Ptr(), Title: datadog.PtrString("A title for the Jira issue."), }, Relationships: &datadogV2.CreateJiraIssueRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.JIRAISSUESDATATYPE_JIRA_ISSUES, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateJiraIssues", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.CreateJiraIssues(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.CreateJiraIssues`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.CreateJiraIssues`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Jira issues for security findings ``` // Create Jira issues for security findings returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.CaseManagementProject; import com.datadog.api.client.v2.model.CaseManagementProjectData; import com.datadog.api.client.v2.model.CaseManagementProjectDataType; import com.datadog.api.client.v2.model.CasePriority; import com.datadog.api.client.v2.model.CreateJiraIssueRequestArray; import com.datadog.api.client.v2.model.CreateJiraIssueRequestData; import com.datadog.api.client.v2.model.CreateJiraIssueRequestDataAttributes; import com.datadog.api.client.v2.model.CreateJiraIssueRequestDataRelationships; import com.datadog.api.client.v2.model.FindingCaseResponseArray; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import com.datadog.api.client.v2.model.JiraIssuesDataType; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createJiraIssues", true); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); CreateJiraIssueRequestArray body = new CreateJiraIssueRequestArray() .data( Collections.singletonList( new CreateJiraIssueRequestData() .attributes( new CreateJiraIssueRequestDataAttributes() .assigneeId("f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0") .description("A description of the Jira issue.") 
.fields( Map.ofEntries( Map.entry("key1", "value"), Map.entry("key2", "['value']"), Map.entry("key3", "{'key4': 'value'}"))) .priority(CasePriority.NOT_DEFINED) .title("A title for the Jira issue.")) .relationships( new CreateJiraIssueRequestDataRelationships() .findings( new Findings() .data( Collections.singletonList( new FindingData() .id( "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("aeadc05e-98a8-11ec-ac2c-da7ad0900001") .type(CaseManagementProjectDataType.PROJECTS)))) .type(JiraIssuesDataType.JIRA_ISSUES))); try { FindingCaseResponseArray result = apiInstance.createJiraIssues(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#createJiraIssues"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Jira issues for security findings ``` // Create Jira issues for security findings returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::CaseManagementProject; use datadog_api_client::datadogV2::model::CaseManagementProjectData; use datadog_api_client::datadogV2::model::CaseManagementProjectDataType; use datadog_api_client::datadogV2::model::CasePriority; use datadog_api_client::datadogV2::model::CreateJiraIssueRequestArray; use datadog_api_client::datadogV2::model::CreateJiraIssueRequestData; use datadog_api_client::datadogV2::model::CreateJiraIssueRequestDataAttributes; use datadog_api_client::datadogV2::model::CreateJiraIssueRequestDataRelationships; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; use datadog_api_client::datadogV2::model::JiraIssuesDataType; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = CreateJiraIssueRequestArray::new(vec![CreateJiraIssueRequestData::new( JiraIssuesDataType::JIRA_ISSUES, ) .attributes( CreateJiraIssueRequestDataAttributes::new() .assignee_id("f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0".to_string()) .description("A description of the Jira issue.".to_string()) .fields(BTreeMap::from([("key1".to_string(), Value::from("value"))])) .priority(CasePriority::NOT_DEFINED) .title("A title for the Jira issue.".to_string()), ) .relationships(CreateJiraIssueRequestDataRelationships::new( Findings::new().data(vec![FindingData::new( "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==".to_string(), FindingDataType::FINDINGS, )]), CaseManagementProject::new(CaseManagementProjectData::new( "aeadc05e-98a8-11ec-ac2c-da7ad0900001".to_string(), CaseManagementProjectDataType::PROJECTS, )), ))]); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateJiraIssues", true); let api = 
SecurityMonitoringAPI::with_config(configuration); let resp = api.create_jira_issues(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Jira issues for security findings ``` /** * Create Jira issues for security findings returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createJiraIssues"] = true; const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiCreateJiraIssuesRequest = { body: { data: [ { attributes: { assigneeId: "f315bdaf-9ee7-4808-a9c1-99c15bf0f4d0", description: "A description of the Jira issue.", fields: { key1: "value", key2: "['value']", key3: "{'key4': 'value'}", }, priority: "NOT_DEFINED", title: "A title for the Jira issue.", }, relationships: { findings: { data: [ { id: "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", type: "findings", }, ], }, project: { data: { id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", type: "projects", }, }, }, type: "jira_issues", }, ], }, }; apiInstance .createJiraIssues(params) .then((data: v2.FindingCaseResponseArray) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Attach security findings to a Jira issue](https://docs.datadoghq.com/api/latest/security-monitoring/#attach-security-findings-to-a-jira-issue) * [v2 (latest)](https://docs.datadoghq.com/api/latest/security-monitoring/#attach-security-findings-to-a-jira-issue-v2) **Note** : This endpoint is in beta and is subject to change. Please check the documentation regularly for updates. PATCH https://api.ap1.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.ap2.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.datadoghq.eu/api/v2/security/findings/jira_issueshttps://api.ddog-gov.com/api/v2/security/findings/jira_issueshttps://api.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.us3.datadoghq.com/api/v2/security/findings/jira_issueshttps://api.us5.datadoghq.com/api/v2/security/findings/jira_issues ### Overview Attach security findings to a Jira issue by providing the Jira issue URL. You can attach up to 50 security findings per Jira issue. If the Jira issue is not linked to any case, this operation will create a case for the security findings and link the Jira issue to the newly created case. To configure the Jira integration, see [Bidirectional ticket syncing with Jira](https://docs.datadoghq.com/security/ticketing_integrations/#bidirectional-ticket-syncing-with-jira). 
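For illustration only, and separate from the per-language client examples in this documentation, the following is a minimal sketch of sending this request directly over HTTP from Python with the `requests` package. The finding ID, case management project ID, and Jira issue URL are placeholders, and the `DD_API_KEY` and `DD_APP_KEY` environment variables are assumed to be set as in the other examples on this page.

```
# Minimal raw-HTTP sketch of the PATCH request described above (illustrative only;
# the IDs and the Jira issue URL below are placeholders).
import os

import requests

site = os.environ.get("DD_SITE", "datadoghq.com")
url = f"https://api.{site}/api/v2/security/findings/jira_issues"

body = {
    "data": {
        "attributes": {
            # URL of the Jira issue to attach the findings to (placeholder).
            "jira_issue_url": "https://example.atlassian.net/browse/PROJ-123"
        },
        "relationships": {
            "findings": {
                # Up to 50 security findings per Jira issue.
                "data": [{"id": "<finding-id>", "type": "findings"}]
            },
            "project": {
                # Case management project with the Jira integration configured.
                "data": {"id": "<case-management-project-id>", "type": "projects"}
            },
        },
        "type": "jira_issues",
    }
}

response = requests.patch(
    url,
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=body,
)
response.raise_for_status()
print(response.json())
```

Because this sketch bypasses the client libraries, no unstable-operation flag is involved; note that the endpoint is still in beta and may change.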
Security findings that are already attached to another Jira issue will be detached from their previous Jira issue and attached to the specified Jira issue. This endpoint requires any of the following permissions: * `security_monitoring_findings_write` * `appsec_vm_write` ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Field Type Description data object Data of the Jira issue to attach security findings to. attributes object Attributes of the Jira issue to attach security findings to. jira_issue_url [_required_] string URL of the Jira issue to attach security findings to. relationships object Relationships of the Jira issue to attach security findings to. findings [_required_] object Security findings to attach to the Jira issue. data [object] id [_required_] string Unique identifier of the security finding. type [_required_] enum Security findings resource type. Allowed enum values: `findings` default: `findings` project [_required_] object Case management project with Jira integration configured. It is used to attach security findings to the Jira issue. To configure the integration, see [Bidirectional ticket syncing with Jira](https://docs.datadoghq.com/security/ticketing_integrations/#bidirectional-ticket-syncing-with-jira). data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Jira issues resource type. Allowed enum values: `jira_issues` default: `jira_issues` ``` { "data": { "attributes": { "jira_issue_url": "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476" }, "relationships": { "findings": { "data": [ { "id": "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=", "type": "findings" }, { "id": "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=", "type": "findings" } ] }, "project": { "data": { "id": "959a6f71-bac8-4027-b1d3-2264f569296f", "type": "projects" } } }, "type": "jira_issues" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachJiraIssue-200-v2) * [400](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachJiraIssue-400-v2) * [404](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachJiraIssue-404-v2) * [429](https://docs.datadoghq.com/api/latest/security-monitoring/#AttachJiraIssue-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) Case response. Field Type Description data object Data of the case. attributes object Attributes of the case. archived_at date-time Timestamp of when the case was archived. assigned_to object User assigned to the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` attributes object [string] closed_at date-time Timestamp of when the case was closed. created_at date-time Timestamp of when the case was created. creation_source string Source of the case creation. description string Description of the case. due_date string Due date of the case. insights [object] Insights of the case. ref string Reference of the insight. 
resource_id string Unique identifier of the resource. For example, the unique identifier of a security finding. type string Type of the resource. For example, the type of a security finding is "SECURITY_FINDING". jira_issue object Jira issue associated with the case. error_message string Error message if the Jira issue creation failed. result object Result of the Jira issue creation. account_id string Account ID of the Jira issue. issue_id string Unique identifier of the Jira issue. issue_key string Key of the Jira issue. issue_url string URL of the Jira issue. status string Status of the Jira issue creation. Can be "COMPLETED" if the Jira issue was created successfully, or "FAILED" if the Jira issue creation failed. key string Key of the case. modified_at date-time Timestamp of when the case was last modified. priority string Priority of the case. status string Status of the case. status_group string Status group of the case. status_name string Status name of the case. title string Title of the case. type string Type of the case. For security cases, this is always "SECURITY". id string Unique identifier of the case. relationships object Relationships of the case. created_by object User who created the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` modified_by object User who last modified the case. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` project object Project in which the case was created. data [_required_] object id [_required_] string Unique identifier of the case management project. type [_required_] enum Projects resource type. Allowed enum values: `projects` default: `projects` type [_required_] enum Cases resource type. 
Allowed enum values: `cases` default: `cases` ``` { "data": { "attributes": { "archived_at": "2025-01-01T00:00:00.000Z", "assigned_to": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "attributes": { "": [] }, "closed_at": "2025-01-01T00:00:00.000Z", "created_at": "2025-01-01T00:00:00.000Z", "creation_source": "CS_SECURITY_FINDING", "description": "A description of the case.", "due_date": "2025-01-01", "insights": [ { "ref": "/security/appsec/vm/library/vulnerability/dfa027f7c037b2f77159adc027fecb56?detection=static", "resource_id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "SECURITY_FINDING" } ], "jira_issue": { "error_message": "{\"errorMessages\":[\"An error occured.\"],\"errors\":{}}", "result": { "account_id": "463a8631-680e-455c-bfd3-3ed04d326eb7", "issue_id": "2871276", "issue_key": "PROJ-123", "issue_url": "https://domain.atlassian.net/browse/PROJ-123" }, "status": "COMPLETED" }, "key": "PROJ-123", "modified_at": "2025-01-01T00:00:00.000Z", "priority": "P4", "status": "OPEN", "status_group": "SG_OPEN", "status_name": "Open", "title": "A title for the case.", "type": "SECURITY" }, "id": "c1234567-89ab-cdef-0123-456789abcdef", "relationships": { "created_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "modified_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "cases" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/security-monitoring/) * [Example](https://docs.datadoghq.com/api/latest/security-monitoring/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/security-monitoring/?code-lang=typescript) ##### Attach security findings to a Jira issue Copy ``` # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/security/findings/jira_issues" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "jira_issue_url": "https://domain.atlassian.net/browse/PROJ-123" }, "relationships": { "findings": { "data": [ { "id": "ZGVmLTAwcC1pZXJ-aS0wZjhjNjMyZDNmMzRlZTgzNw==", "type": "findings" } ] }, "project": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "projects" } } }, "type": "jira_issues" } } EOF ``` ##### Attach security findings to a Jira issue ``` """ Attach security findings to a Jira issue returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.security_monitoring_api import SecurityMonitoringApi from datadog_api_client.v2.model.attach_jira_issue_request import AttachJiraIssueRequest from datadog_api_client.v2.model.attach_jira_issue_request_data import AttachJiraIssueRequestData from datadog_api_client.v2.model.attach_jira_issue_request_data_attributes import AttachJiraIssueRequestDataAttributes from datadog_api_client.v2.model.attach_jira_issue_request_data_relationships import ( AttachJiraIssueRequestDataRelationships, ) from datadog_api_client.v2.model.case_management_project import CaseManagementProject from datadog_api_client.v2.model.case_management_project_data import CaseManagementProjectData from datadog_api_client.v2.model.case_management_project_data_type import CaseManagementProjectDataType from datadog_api_client.v2.model.finding_data import FindingData from datadog_api_client.v2.model.finding_data_type import FindingDataType from datadog_api_client.v2.model.findings import Findings from datadog_api_client.v2.model.jira_issues_data_type import JiraIssuesDataType body = AttachJiraIssueRequest( data=AttachJiraIssueRequestData( attributes=AttachJiraIssueRequestDataAttributes( jira_issue_url="https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476", ), relationships=AttachJiraIssueRequestDataRelationships( findings=Findings( data=[ FindingData( id="OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=", type=FindingDataType.FINDINGS, ), FindingData( id="MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=", type=FindingDataType.FINDINGS, ), ], ), project=CaseManagementProject( data=CaseManagementProjectData( id="959a6f71-bac8-4027-b1d3-2264f569296f", type=CaseManagementProjectDataType.PROJECTS, ), ), ), type=JiraIssuesDataType.JIRA_ISSUES, ), ) configuration = 
Configuration() with ApiClient(configuration) as api_client: api_instance = SecurityMonitoringApi(api_client) response = api_instance.attach_jira_issue(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Attach security findings to a Jira issue ``` # Attach security findings to a Jira issue returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SecurityMonitoringAPI.new body = DatadogAPIClient::V2::AttachJiraIssueRequest.new({ data: DatadogAPIClient::V2::AttachJiraIssueRequestData.new({ attributes: DatadogAPIClient::V2::AttachJiraIssueRequestDataAttributes.new({ jira_issue_url: "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476", }), relationships: DatadogAPIClient::V2::AttachJiraIssueRequestDataRelationships.new({ findings: DatadogAPIClient::V2::Findings.new({ data: [ DatadogAPIClient::V2::FindingData.new({ id: "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), DatadogAPIClient::V2::FindingData.new({ id: "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=", type: DatadogAPIClient::V2::FindingDataType::FINDINGS, }), ], }), project: DatadogAPIClient::V2::CaseManagementProject.new({ data: DatadogAPIClient::V2::CaseManagementProjectData.new({ id: "959a6f71-bac8-4027-b1d3-2264f569296f", type: DatadogAPIClient::V2::CaseManagementProjectDataType::PROJECTS, }), }), }), type: DatadogAPIClient::V2::JiraIssuesDataType::JIRA_ISSUES, }), }) p api_instance.attach_jira_issue(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Attach security findings to a Jira issue ``` // Attach security findings to a Jira issue returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AttachJiraIssueRequest{ Data: &datadogV2.AttachJiraIssueRequestData{ Attributes: &datadogV2.AttachJiraIssueRequestDataAttributes{ JiraIssueUrl: "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476", }, Relationships: &datadogV2.AttachJiraIssueRequestDataRelationships{ Findings: datadogV2.Findings{ Data: []datadogV2.FindingData{ { Id: "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, { Id: "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=", Type: datadogV2.FINDINGDATATYPE_FINDINGS, }, }, }, Project: datadogV2.CaseManagementProject{ Data: datadogV2.CaseManagementProjectData{ Id: "959a6f71-bac8-4027-b1d3-2264f569296f", Type: datadogV2.CASEMANAGEMENTPROJECTDATATYPE_PROJECTS, }, }, }, Type: datadogV2.JIRAISSUESDATATYPE_JIRA_ISSUES, }, } ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSecurityMonitoringApi(apiClient) resp, r, err := api.AttachJiraIssue(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SecurityMonitoringApi.AttachJiraIssue`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SecurityMonitoringApi.AttachJiraIssue`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Attach security findings to a Jira issue ``` // Attach security findings to a Jira issue returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SecurityMonitoringApi; import com.datadog.api.client.v2.model.AttachJiraIssueRequest; import com.datadog.api.client.v2.model.AttachJiraIssueRequestData; import com.datadog.api.client.v2.model.AttachJiraIssueRequestDataAttributes; import com.datadog.api.client.v2.model.AttachJiraIssueRequestDataRelationships; import com.datadog.api.client.v2.model.CaseManagementProject; import com.datadog.api.client.v2.model.CaseManagementProjectData; import com.datadog.api.client.v2.model.CaseManagementProjectDataType; import com.datadog.api.client.v2.model.FindingCaseResponse; import com.datadog.api.client.v2.model.FindingData; import com.datadog.api.client.v2.model.FindingDataType; import com.datadog.api.client.v2.model.Findings; import com.datadog.api.client.v2.model.JiraIssuesDataType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SecurityMonitoringApi apiInstance = new SecurityMonitoringApi(defaultClient); AttachJiraIssueRequest body = new AttachJiraIssueRequest() .data( new AttachJiraIssueRequestData() .attributes( new AttachJiraIssueRequestDataAttributes() .jiraIssueUrl( "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476")) .relationships( new AttachJiraIssueRequestDataRelationships() .findings( new Findings() .data( Arrays.asList( new FindingData() .id( "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=") .type(FindingDataType.FINDINGS), new FindingData() .id( "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=") .type(FindingDataType.FINDINGS)))) .project( new CaseManagementProject() .data( new CaseManagementProjectData() .id("959a6f71-bac8-4027-b1d3-2264f569296f") .type(CaseManagementProjectDataType.PROJECTS)))) .type(JiraIssuesDataType.JIRA_ISSUES)); try { FindingCaseResponse result = apiInstance.attachJiraIssue(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SecurityMonitoringApi#attachJiraIssue"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Attach security findings to a Jira issue ``` // Attach security findings to a Jira issue returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_security_monitoring::SecurityMonitoringAPI; use datadog_api_client::datadogV2::model::AttachJiraIssueRequest; use datadog_api_client::datadogV2::model::AttachJiraIssueRequestData; use datadog_api_client::datadogV2::model::AttachJiraIssueRequestDataAttributes; use datadog_api_client::datadogV2::model::AttachJiraIssueRequestDataRelationships; use datadog_api_client::datadogV2::model::CaseManagementProject; use datadog_api_client::datadogV2::model::CaseManagementProjectData; use datadog_api_client::datadogV2::model::CaseManagementProjectDataType; use datadog_api_client::datadogV2::model::FindingData; use datadog_api_client::datadogV2::model::FindingDataType; use datadog_api_client::datadogV2::model::Findings; use datadog_api_client::datadogV2::model::JiraIssuesDataType; #[tokio::main] async fn main() { let body = AttachJiraIssueRequest ::new().data( AttachJiraIssueRequestData::new(JiraIssuesDataType::JIRA_ISSUES) .attributes( AttachJiraIssueRequestDataAttributes::new( "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476".to_string(), ), ) .relationships( AttachJiraIssueRequestDataRelationships::new( Findings ::new().data( vec![ FindingData::new( "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=".to_string(), FindingDataType::FINDINGS, ), FindingData::new( "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=".to_string(), FindingDataType::FINDINGS, ) ], ), CaseManagementProject::new( CaseManagementProjectData::new( "959a6f71-bac8-4027-b1d3-2264f569296f".to_string(), CaseManagementProjectDataType::PROJECTS, ), ), ), ), ); let configuration = datadog::Configuration::new(); let api = SecurityMonitoringAPI::with_config(configuration); let resp = api.attach_jira_issue(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Attach security findings to a Jira issue ``` /** * Attach security findings to a Jira issue returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SecurityMonitoringApi(configuration); const params: v2.SecurityMonitoringApiAttachJiraIssueRequest = { body: { data: { attributes: { jiraIssueUrl: "https://datadoghq-sandbox-538.atlassian.net/browse/CSMSEC-105476", }, relationships: { findings: { data: [ { id: "OTQ3NjJkMmYwMTIzMzMxNTc1Y2Q4MTA5NWU0NTBmMDl-ZjE3NjMxZWVkYzBjZGI1NDY2NWY2OGQxZDk4MDY4MmI=", type: "findings", }, { id: "MTNjN2ZmYWMzMDIxYmU1ZDFiZDRjNWUwN2I1NzVmY2F-YTA3MzllMTUzNWM3NmEyZjdiNzEzOWM5YmViZTMzOGM=", type: "findings", }, ], }, project: { data: { id: 
"959a6f71-bac8-4027-b1d3-2264f569296f", type: "projects", }, }, }, type: "jira_issues", }, }, }; apiInstance .attachJiraIssue(params) .then((data: v2.FindingCaseResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=de362410-a01a-4247-bc54-fad9cdfabd01&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d29d820c-0dd4-40e8-a80c-eb23c14f0b57&pt=Security%20Monitoring&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsecurity-monitoring%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=de362410-a01a-4247-bc54-fad9cdfabd01&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d29d820c-0dd4-40e8-a80c-eb23c14f0b57&pt=Security%20Monitoring&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsecurity-monitoring%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=15cca7fd-277f-4fcc-9469-97e45a716cd6&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Security%20Monitoring&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsecurity-monitoring%2F&r=<=43669&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=639919) --- # Source: https://docs.datadoghq.com/api/latest/sensitive-data-scanner/ # Sensitive Data Scanner Create, update, delete, and retrieve sensitive data scanner groups and rules. See the [Sensitive Data Scanner page](https://docs.datadoghq.com/sensitive_data_scanner/) for more information. ## [List Scanning Groups](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#list-scanning-groups) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#list-scanning-groups-v2) GET https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.datadoghq.eu/api/v2/sensitive-data-scanner/confighttps://api.ddog-gov.com/api/v2/sensitive-data-scanner/confighttps://api.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config ### Overview List all the Scanning groups in your organization. This endpoint requires the `data_scanner_read` permission. 
### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListScanningGroups-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListScanningGroups-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListScanningGroups-403-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListScanningGroups-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Get all groups response. Field Type Description data object Response data related to the scanning groups. attributes object Attributes of the Sensitive Data configuration. id string ID of the configuration. relationships object Relationships of the configuration. groups object List of groups, ordered. data [object] List of groups. The order is important. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` included [ ] Included objects from relationships. Option 1 object A Scanning Rule included item. attributes object Attributes of the Sensitive Data Scanner rule. description string Description of the rule. excluded_namespaces [string] Attributes excluded from the scan. If namespaces is provided, it has to be a sub-path of the namespaces array. included_keyword_configuration object Object defining a set of keywords and a number of characters that help reduce noise. You can provide a list of keywords you would like to check within a defined proximity of the matching pattern. If any of the keywords are found within the proximity check, the match is kept. If none are found, the match is discarded. character_count [_required_] int64 The number of characters behind a match detected by Sensitive Data Scanner to look for the keywords defined. `character_count` should be greater than the maximum length of a keyword defined for a rule. keywords [_required_] [string] Keyword list that will be checked during scanning in order to validate a match. The number of keywords in the list must be less than or equal to 30. use_recommended_keywords boolean Should the rule use the underlying standard pattern keyword configuration. If set to `true`, the rule must be tied to a standard pattern. If set to `false`, the specified keywords and `character_count` are applied. is_enabled boolean Whether or not the rule is enabled. name string Name of the rule. namespaces [string] Attributes included in the scan. If namespaces is empty or missing, all attributes except excluded_namespaces are scanned. If both are missing the whole event is scanned. pattern string Not included if there is a relationship to a standard pattern. priority int64 Integer from 1 (high) to 5 (low) indicating rule issue severity. tags [string] List of tags. text_replacement object Object describing how the scanned event will be replaced. number_of_chars int64 Required if type == 'partial_replacement_from_beginning' or 'partial_replacement_from_end'. It must be > 0. replacement_string string Required if type == 'replacement_string'. should_save_match boolean Only valid when type == `replacement_string`. When enabled, matches can be unmasked in logs by users with ‘Data Scanner Unmask’ permission. 
As a security best practice, avoid masking for highly-sensitive, long-lived data. type enum Type of the replacement text. None means no replacement. hash means the data will be stubbed. replacement_string means that one can chose a text to replace the data. partial_replacement_from_beginning allows a user to partially replace the data from the beginning, and partial_replacement_from_end on the other hand, allows to replace data from the end. Allowed enum values: `none,hash,replacement_string,partial_replacement_from_beginning,partial_replacement_from_end` default: `none` id string ID of the rule. relationships object Relationships of a scanning rule. group object A scanning group data. data object A scanning group. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` standard_pattern object A standard pattern. data object Data containing the standard pattern id. id string ID of the standard pattern. type enum Sensitive Data Scanner standard pattern type. Allowed enum values: `sensitive_data_scanner_standard_pattern` default: `sensitive_data_scanner_standard_pattern` type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` Option 2 object A Scanning Group included item. attributes object Attributes of the Sensitive Data Scanner group. description string Description of the group. filter object Filter for the Scanning Group. query string Query to filter the events. is_enabled boolean Whether or not the group is enabled. name string Name of the group. product_list [string] List of products the scanning group applies. samplings [object] List of sampling rates per product type. product enum Datadog product onto which Sensitive Data Scanner can be activated. Allowed enum values: `logs,rum,events,apm` default: `logs` rate double Rate at which data in product type will be scanned, as a percentage. id string ID of the group. relationships object Relationships of the group. configuration object A Sensitive Data Scanner configuration data. data object A Sensitive Data Scanner configuration. id string ID of the configuration. type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` rules object Rules included in the group. data [object] Rules included in the group. The order is important. id string ID of the rule. type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` meta object Meta response containing information about the API. count_limit int64 Maximum number of scanning rules allowed for the org. group_count_limit int64 Maximum number of scanning groups allowed for the org. has_highlight_enabled boolean **DEPRECATED** : (Deprecated) Whether or not scanned events are highlighted in Logs or RUM for the org. default: `true` has_multi_pass_enabled boolean **DEPRECATED** : (Deprecated) Whether or not scanned events have multi-pass enabled. is_pci_compliant boolean Whether or not the org is compliant to the payment card industry standard. version int64 Version of the API. 
``` { "data": { "attributes": {}, "id": "string", "relationships": { "groups": { "data": [ { "id": "string", "type": "sensitive_data_scanner_group" } ] } }, "type": "sensitive_data_scanner_configuration" }, "included": [ { "attributes": { "description": "string", "excluded_namespaces": [ "admin.name" ], "included_keyword_configuration": { "character_count": 30, "keywords": [ "credit card", "cc" ], "use_recommended_keywords": false }, "is_enabled": false, "name": "string", "namespaces": [ "admin" ], "pattern": "string", "priority": "integer", "tags": [], "text_replacement": { "number_of_chars": "integer", "replacement_string": "string", "should_save_match": false, "type": "string" } }, "id": "string", "relationships": { "group": { "data": { "id": "string", "type": "sensitive_data_scanner_group" } }, "standard_pattern": { "data": { "id": "string", "type": "sensitive_data_scanner_standard_pattern" } } }, "type": "sensitive_data_scanner_rule" } ], "meta": { "count_limit": "integer", "group_count_limit": "integer", "has_highlight_enabled": false, "has_multi_pass_enabled": false, "is_pci_compliant": false, "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### List Scanning Groups Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List Scanning Groups ``` """ List Scanning Groups returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.list_scanning_groups() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List Scanning Groups ``` # List Scanning Groups returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new p api_instance.list_scanning_groups() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List Scanning Groups ``` // List Scanning Groups returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.ListScanningGroups(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.ListScanningGroups`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.ListScanningGroups`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List Scanning Groups ``` // List Scanning Groups returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerGetConfigResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); try { SensitiveDataScannerGetConfigResponse result = apiInstance.listScanningGroups(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#listScanningGroups"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List Scanning Groups ``` // List Scanning Groups returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.list_scanning_groups().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List Scanning Groups ``` /** * List Scanning Groups returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); apiInstance .listScanningGroups() .then((data: v2.SensitiveDataScannerGetConfigResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Reorder Groups](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#reorder-groups) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#reorder-groups-v2) PATCH https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.datadoghq.eu/api/v2/sensitive-data-scanner/confighttps://api.ddog-gov.com/api/v2/sensitive-data-scanner/confighttps://api.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/confighttps://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config ### Overview Reorder the list of groups. This endpoint requires the `data_scanner_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description data [_required_] object Data related to the reordering of scanning groups. id string ID of the configuration. relationships object Relationships of the configuration. groups object List of groups, ordered. data [object] List of groups. The order is important. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` meta [_required_] object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "data": { "relationships": { "groups": { "data": [ { "type": "sensitive_data_scanner_group", "id": "string" } ] } }, "type": "sensitive_data_scanner_configuration", "id": "55482444-d71c-c45c-7d1f-31984f64e6d2" }, "meta": {} } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ReorderScanningGroups-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ReorderScanningGroups-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ReorderScanningGroups-403-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ReorderScanningGroups-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Group reorder response. Field Type Description meta object Meta response containing information about the API. count_limit int64 Maximum number of scanning rules allowed for the org. group_count_limit int64 Maximum number of scanning groups allowed for the org. has_highlight_enabled boolean **DEPRECATED** : (Deprecated) Whether or not scanned events are highlighted in Logs or RUM for the org. default: `true` has_multi_pass_enabled boolean **DEPRECATED** : (Deprecated) Whether or not scanned events have multi-pass enabled. 
is_pci_compliant boolean Whether or not the org is compliant to the payment card industry standard. version int64 Version of the API. ``` { "meta": { "count_limit": "integer", "group_count_limit": "integer", "has_highlight_enabled": false, "has_multi_pass_enabled": false, "is_pci_compliant": false, "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Reorder Groups returns "OK" response Copy ``` # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "relationships": { "groups": { "data": [ { "type": "sensitive_data_scanner_group", "id": "string" } ] } }, "type": "sensitive_data_scanner_configuration", "id": "55482444-d71c-c45c-7d1f-31984f64e6d2" }, "meta": {} } EOF ``` ##### Reorder Groups returns "OK" response ``` // Reorder Groups returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "scanning_group" in the system GroupDataID := os.Getenv("GROUP_DATA_ID") // a valid "configuration" in the system ConfigurationDataID := os.Getenv("CONFIGURATION_DATA_ID") body := datadogV2.SensitiveDataScannerConfigRequest{ Data: datadogV2.SensitiveDataScannerReorderConfig{ Relationships: &datadogV2.SensitiveDataScannerConfigurationRelationships{ Groups: &datadogV2.SensitiveDataScannerGroupList{ Data: []datadogV2.SensitiveDataScannerGroupItem{ { Type: datadogV2.SENSITIVEDATASCANNERGROUPTYPE_SENSITIVE_DATA_SCANNER_GROUP.Ptr(), Id: datadog.PtrString(GroupDataID), }, }, }, }, Type: 
datadogV2.SENSITIVEDATASCANNERCONFIGURATIONTYPE_SENSITIVE_DATA_SCANNER_CONFIGURATIONS.Ptr(), Id: datadog.PtrString(ConfigurationDataID), }, Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.ReorderScanningGroups(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.ReorderScanningGroups`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.ReorderScanningGroups`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Reorder Groups returns "OK" response ``` // Reorder Groups returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationRelationships; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationType; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupItem; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupList; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupType; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerReorderConfig; import com.datadog.api.client.v2.model.SensitiveDataScannerReorderGroupsResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // there is a valid "scanning_group" in the system String GROUP_DATA_ID = System.getenv("GROUP_DATA_ID"); // a valid "configuration" in the system String CONFIGURATION_DATA_ID = System.getenv("CONFIGURATION_DATA_ID"); SensitiveDataScannerConfigRequest body = new SensitiveDataScannerConfigRequest() .data( new SensitiveDataScannerReorderConfig() .relationships( new SensitiveDataScannerConfigurationRelationships() .groups( new SensitiveDataScannerGroupList() .data( Collections.singletonList( new SensitiveDataScannerGroupItem() .type( SensitiveDataScannerGroupType .SENSITIVE_DATA_SCANNER_GROUP) .id(GROUP_DATA_ID))))) .type( SensitiveDataScannerConfigurationType.SENSITIVE_DATA_SCANNER_CONFIGURATIONS) .id(CONFIGURATION_DATA_ID)) .meta(new SensitiveDataScannerMetaVersionOnly()); try { SensitiveDataScannerReorderGroupsResponse result = apiInstance.reorderScanningGroups(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#reorderScanningGroups"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Reorder Groups returns "OK" response ``` """ Reorder Groups returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_config_request import SensitiveDataScannerConfigRequest from datadog_api_client.v2.model.sensitive_data_scanner_configuration_relationships import ( SensitiveDataScannerConfigurationRelationships, ) from datadog_api_client.v2.model.sensitive_data_scanner_configuration_type import SensitiveDataScannerConfigurationType from datadog_api_client.v2.model.sensitive_data_scanner_group_item import SensitiveDataScannerGroupItem from datadog_api_client.v2.model.sensitive_data_scanner_group_list import SensitiveDataScannerGroupList from datadog_api_client.v2.model.sensitive_data_scanner_group_type import SensitiveDataScannerGroupType from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_reorder_config import SensitiveDataScannerReorderConfig # there is a valid "scanning_group" in the system GROUP_DATA_ID = environ["GROUP_DATA_ID"] # a valid "configuration" in the system CONFIGURATION_DATA_ID = environ["CONFIGURATION_DATA_ID"] body = SensitiveDataScannerConfigRequest( data=SensitiveDataScannerReorderConfig( relationships=SensitiveDataScannerConfigurationRelationships( groups=SensitiveDataScannerGroupList( data=[ SensitiveDataScannerGroupItem( type=SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP, id=GROUP_DATA_ID, ), ], ), ), type=SensitiveDataScannerConfigurationType.SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id=CONFIGURATION_DATA_ID, ), meta=SensitiveDataScannerMetaVersionOnly(), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.reorder_scanning_groups(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Reorder Groups returns "OK" response ``` # Reorder Groups returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # there is a valid "scanning_group" in the system GROUP_DATA_ID = ENV["GROUP_DATA_ID"] # a valid "configuration" in the system CONFIGURATION_DATA_ID = ENV["CONFIGURATION_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerConfigRequest.new({ data: DatadogAPIClient::V2::SensitiveDataScannerReorderConfig.new({ relationships: DatadogAPIClient::V2::SensitiveDataScannerConfigurationRelationships.new({ groups: DatadogAPIClient::V2::SensitiveDataScannerGroupList.new({ data: [ DatadogAPIClient::V2::SensitiveDataScannerGroupItem.new({ type: 
DatadogAPIClient::V2::SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, id: GROUP_DATA_ID, }), ], }), }), type: DatadogAPIClient::V2::SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id: CONFIGURATION_DATA_ID, }), meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), }) p api_instance.reorder_scanning_groups(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Reorder Groups returns "OK" response ``` // Reorder Groups returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationRelationships; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationType; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupItem; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupList; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupType; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerReorderConfig; #[tokio::main] async fn main() { // there is a valid "scanning_group" in the system let group_data_id = std::env::var("GROUP_DATA_ID").unwrap(); // a valid "configuration" in the system let configuration_data_id = std::env::var("CONFIGURATION_DATA_ID").unwrap(); let body = SensitiveDataScannerConfigRequest::new( SensitiveDataScannerReorderConfig::new() .id(configuration_data_id.clone()) .relationships( SensitiveDataScannerConfigurationRelationships::new().groups( SensitiveDataScannerGroupList::new().data(vec![ SensitiveDataScannerGroupItem::new() .id(group_data_id.clone()) .type_(SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP), ]), ), ) .type_(SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS), SensitiveDataScannerMetaVersionOnly::new(), ); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.reorder_scanning_groups(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Reorder Groups returns "OK" response ``` /** * Reorder Groups returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); // there is a valid "scanning_group" in the system const GROUP_DATA_ID = process.env.GROUP_DATA_ID as string; // a valid "configuration" in the system const CONFIGURATION_DATA_ID = process.env.CONFIGURATION_DATA_ID as string; const params: 
v2.SensitiveDataScannerApiReorderScanningGroupsRequest = { body: { data: { relationships: { groups: { data: [ { type: "sensitive_data_scanner_group", id: GROUP_DATA_ID, }, ], }, }, type: "sensitive_data_scanner_configuration", id: CONFIGURATION_DATA_ID, }, meta: {}, }, }; apiInstance .reorderScanningGroups(params) .then((data: v2.SensitiveDataScannerReorderGroupsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List standard patterns](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#list-standard-patterns) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#list-standard-patterns-v2) GET https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patternshttps://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patterns ### Overview Returns all standard patterns. This endpoint requires the `data_scanner_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListStandardPatterns-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListStandardPatterns-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListStandardPatterns-403-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#ListStandardPatterns-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) List Standard patterns response data. Field Type Description data [object] List Standard patterns response. attributes object Attributes of the Sensitive Data Scanner standard pattern. description string Description of the standard pattern. included_keywords [string] List of included keywords. name string Name of the standard pattern. pattern string **DEPRECATED** : (Deprecated) Regex to match, optionally documented for older standard rules. Refer to the `description` field to understand what the rule does. priority int64 Integer from 1 (high) to 5 (low) indicating standard pattern issue severity. tags [string] List of tags. id string ID of the standard pattern. type enum Sensitive Data Scanner standard pattern type. 
Allowed enum values: `sensitive_data_scanner_standard_pattern` default: `sensitive_data_scanner_standard_pattern` ``` { "data": [ { "attributes": { "description": "string", "included_keywords": [], "name": "string", "pattern": "string", "priority": "integer", "tags": [] }, "id": "string", "type": "sensitive_data_scanner_standard_pattern" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### List standard patterns Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/standard-patterns" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List standard patterns ``` """ List standard patterns returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.list_standard_patterns() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List standard patterns ``` # List standard patterns returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new p api_instance.list_standard_patterns() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List standard patterns ``` // List standard patterns returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.ListStandardPatterns(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.ListStandardPatterns`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.ListStandardPatterns`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List standard patterns ``` // List standard patterns returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerStandardPatternsResponseData; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); try { SensitiveDataScannerStandardPatternsResponseData result = apiInstance.listStandardPatterns(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#listStandardPatterns"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List standard patterns ``` // List standard patterns returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.list_standard_patterns().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List standard patterns ``` /** * List standard patterns returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); apiInstance .listStandardPatterns() .then((data: v2.SensitiveDataScannerStandardPatternsResponseData) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create Scanning Group](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#create-scanning-group) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#create-scanning-group-v2) POST https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/groupshttps://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/groupshttps://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/groupshttps://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/groupshttps://api.datadoghq.com/api/v2/sensitive-data-scanner/config/groupshttps://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/groupshttps://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups ### Overview Create a scanning group. The request MAY include a configuration relationship. A rules relationship can be omitted entirely, but if it is included it MUST be null or an empty array (rules cannot be created at the same time). The new group will be ordered last within the configuration. This endpoint requires the `data_scanner_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description data object Data related to the creation of a group. attributes [_required_] object Attributes of the Sensitive Data Scanner group. description string Description of the group. filter object Filter for the Scanning Group. query string Query to filter the events. is_enabled boolean Whether or not the group is enabled. name string Name of the group. product_list [string] List of products the scanning group applies. samplings [object] List of sampling rates per product type. product enum Datadog product onto which Sensitive Data Scanner can be activated. Allowed enum values: `logs,rum,events,apm` default: `logs` rate double Rate at which data in product type will be scanned, as a percentage. relationships object Relationships of the group. configuration object A Sensitive Data Scanner configuration data. data object A Sensitive Data Scanner configuration. id string ID of the configuration. type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` rules object Rules included in the group. data [object] Rules included in the group. The order is important. id string ID of the rule. 
type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` type [_required_] enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": {}, "data": { "type": "sensitive_data_scanner_group", "attributes": { "name": "Example-Sensitive-Data-Scanner", "is_enabled": false, "product_list": [ "logs" ], "filter": { "query": "*" } }, "relationships": { "configuration": { "data": { "type": "sensitive_data_scanner_configuration", "id": "string" } }, "rules": { "data": [] } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningGroup-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningGroup-403-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningGroup-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Create group response. Field Type Description data object Response data related to the creation of a group. attributes object Attributes of the Sensitive Data Scanner group. description string Description of the group. filter object Filter for the Scanning Group. query string Query to filter the events. is_enabled boolean Whether or not the group is enabled. name string Name of the group. product_list [string] List of products the scanning group applies. samplings [object] List of sampling rates per product type. product enum Datadog product onto which Sensitive Data Scanner can be activated. Allowed enum values: `logs,rum,events,apm` default: `logs` rate double Rate at which data in product type will be scanned, as a percentage. id string ID of the group. relationships object Relationships of the group. configuration object A Sensitive Data Scanner configuration data. data object A Sensitive Data Scanner configuration. id string ID of the configuration. type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` rules object Rules included in the group. data [object] Rules included in the group. The order is important. id string ID of the rule. type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` meta object Meta payload containing information about the API. version int64 Version of the API (optional). 
``` { "data": { "attributes": { "description": "string", "filter": { "query": "string" }, "is_enabled": false, "name": "string", "product_list": [], "samplings": [ { "product": "string", "rate": 100 } ] }, "id": "string", "relationships": { "configuration": { "data": { "id": "string", "type": "sensitive_data_scanner_configuration" } }, "rules": { "data": [ { "id": "string", "type": "sensitive_data_scanner_rule" } ] } }, "type": "sensitive_data_scanner_group" }, "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Create Scanning Group returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {}, "data": { "type": "sensitive_data_scanner_group", "attributes": { "name": "Example-Sensitive-Data-Scanner", "is_enabled": false, "product_list": [ "logs" ], "filter": { "query": "*" } }, "relationships": { "configuration": { "data": { "type": "sensitive_data_scanner_configuration", "id": "string" } }, "rules": { "data": [] } } } } EOF ``` ##### Create Scanning Group returns "OK" response ``` // Create Scanning Group returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // a valid "configuration" in the system ConfigurationDataID := os.Getenv("CONFIGURATION_DATA_ID") body := datadogV2.SensitiveDataScannerGroupCreateRequest{ Meta: &datadogV2.SensitiveDataScannerMetaVersionOnly{}, Data: &datadogV2.SensitiveDataScannerGroupCreate{ Type: datadogV2.SENSITIVEDATASCANNERGROUPTYPE_SENSITIVE_DATA_SCANNER_GROUP, 
Attributes: datadogV2.SensitiveDataScannerGroupAttributes{ Name: datadog.PtrString("Example-Sensitive-Data-Scanner"), IsEnabled: datadog.PtrBool(false), ProductList: []datadogV2.SensitiveDataScannerProduct{ datadogV2.SENSITIVEDATASCANNERPRODUCT_LOGS, }, Filter: &datadogV2.SensitiveDataScannerFilter{ Query: datadog.PtrString("*"), }, }, Relationships: &datadogV2.SensitiveDataScannerGroupRelationships{ Configuration: &datadogV2.SensitiveDataScannerConfigurationData{ Data: &datadogV2.SensitiveDataScannerConfiguration{ Type: datadogV2.SENSITIVEDATASCANNERCONFIGURATIONTYPE_SENSITIVE_DATA_SCANNER_CONFIGURATIONS.Ptr(), Id: datadog.PtrString(ConfigurationDataID), }, }, Rules: &datadogV2.SensitiveDataScannerRuleData{ Data: []datadogV2.SensitiveDataScannerRule{}, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.CreateScanningGroup(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.CreateScanningGroup`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.CreateScanningGroup`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Scanning Group returns "OK" response ``` // Create Scanning Group returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerConfiguration; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationData; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationType; import com.datadog.api.client.v2.model.SensitiveDataScannerCreateGroupResponse; import com.datadog.api.client.v2.model.SensitiveDataScannerFilter; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupAttributes; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupCreate; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupCreateRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupRelationships; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupType; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerProduct; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // a valid "configuration" in the system String CONFIGURATION_DATA_ID = System.getenv("CONFIGURATION_DATA_ID"); SensitiveDataScannerGroupCreateRequest body = new SensitiveDataScannerGroupCreateRequest() .meta(new SensitiveDataScannerMetaVersionOnly()) .data( new SensitiveDataScannerGroupCreate() .type(SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP) 
.attributes( new SensitiveDataScannerGroupAttributes() .name("Example-Sensitive-Data-Scanner") .isEnabled(false) .productList( Collections.singletonList(SensitiveDataScannerProduct.LOGS)) .filter(new SensitiveDataScannerFilter().query("*"))) .relationships( new SensitiveDataScannerGroupRelationships() .configuration( new SensitiveDataScannerConfigurationData() .data( new SensitiveDataScannerConfiguration() .type( SensitiveDataScannerConfigurationType .SENSITIVE_DATA_SCANNER_CONFIGURATIONS) .id(CONFIGURATION_DATA_ID))) .rules(new SensitiveDataScannerRuleData()))); try { SensitiveDataScannerCreateGroupResponse result = apiInstance.createScanningGroup(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#createScanningGroup"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Scanning Group returns "OK" response ``` """ Create Scanning Group returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_configuration import SensitiveDataScannerConfiguration from datadog_api_client.v2.model.sensitive_data_scanner_configuration_data import SensitiveDataScannerConfigurationData from datadog_api_client.v2.model.sensitive_data_scanner_configuration_type import SensitiveDataScannerConfigurationType from datadog_api_client.v2.model.sensitive_data_scanner_filter import SensitiveDataScannerFilter from datadog_api_client.v2.model.sensitive_data_scanner_group_attributes import SensitiveDataScannerGroupAttributes from datadog_api_client.v2.model.sensitive_data_scanner_group_create import SensitiveDataScannerGroupCreate from datadog_api_client.v2.model.sensitive_data_scanner_group_create_request import ( SensitiveDataScannerGroupCreateRequest, ) from datadog_api_client.v2.model.sensitive_data_scanner_group_relationships import ( SensitiveDataScannerGroupRelationships, ) from datadog_api_client.v2.model.sensitive_data_scanner_group_type import SensitiveDataScannerGroupType from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_product import SensitiveDataScannerProduct from datadog_api_client.v2.model.sensitive_data_scanner_rule_data import SensitiveDataScannerRuleData # a valid "configuration" in the system CONFIGURATION_DATA_ID = environ["CONFIGURATION_DATA_ID"] body = SensitiveDataScannerGroupCreateRequest( meta=SensitiveDataScannerMetaVersionOnly(), data=SensitiveDataScannerGroupCreate( type=SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP, attributes=SensitiveDataScannerGroupAttributes( name="Example-Sensitive-Data-Scanner", is_enabled=False, product_list=[ SensitiveDataScannerProduct.LOGS, ], filter=SensitiveDataScannerFilter( query="*", ), ), 
relationships=SensitiveDataScannerGroupRelationships( configuration=SensitiveDataScannerConfigurationData( data=SensitiveDataScannerConfiguration( type=SensitiveDataScannerConfigurationType.SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id=CONFIGURATION_DATA_ID, ), ), rules=SensitiveDataScannerRuleData( data=[], ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.create_scanning_group(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Scanning Group returns "OK" response ``` # Create Scanning Group returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # a valid "configuration" in the system CONFIGURATION_DATA_ID = ENV["CONFIGURATION_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerGroupCreateRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), data: DatadogAPIClient::V2::SensitiveDataScannerGroupCreate.new({ type: DatadogAPIClient::V2::SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, attributes: DatadogAPIClient::V2::SensitiveDataScannerGroupAttributes.new({ name: "Example-Sensitive-Data-Scanner", is_enabled: false, product_list: [ DatadogAPIClient::V2::SensitiveDataScannerProduct::LOGS, ], filter: DatadogAPIClient::V2::SensitiveDataScannerFilter.new({ query: "*", }), }), relationships: DatadogAPIClient::V2::SensitiveDataScannerGroupRelationships.new({ configuration: DatadogAPIClient::V2::SensitiveDataScannerConfigurationData.new({ data: DatadogAPIClient::V2::SensitiveDataScannerConfiguration.new({ type: DatadogAPIClient::V2::SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id: CONFIGURATION_DATA_ID, }), }), rules: DatadogAPIClient::V2::SensitiveDataScannerRuleData.new({ data: [], }), }), }), }) p api_instance.create_scanning_group(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Scanning Group returns "OK" response ``` // Create Scanning Group returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfiguration; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationData; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationType; use datadog_api_client::datadogV2::model::SensitiveDataScannerFilter; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupAttributes; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupCreate; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupCreateRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupRelationships; use 
datadog_api_client::datadogV2::model::SensitiveDataScannerGroupType; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerProduct; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleData; #[tokio::main] async fn main() { // a valid "configuration" in the system let configuration_data_id = std::env::var("CONFIGURATION_DATA_ID").unwrap(); let body = SensitiveDataScannerGroupCreateRequest::new() .data( SensitiveDataScannerGroupCreate::new( SensitiveDataScannerGroupAttributes::new() .filter(SensitiveDataScannerFilter::new().query("*".to_string())) .is_enabled(false) .name("Example-Sensitive-Data-Scanner".to_string()) .product_list(vec![SensitiveDataScannerProduct::LOGS]), SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, ).relationships( SensitiveDataScannerGroupRelationships::new() .configuration( SensitiveDataScannerConfigurationData ::new().data( SensitiveDataScannerConfiguration::new() .id(configuration_data_id.clone()) .type_( SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS, ), ), ) .rules(SensitiveDataScannerRuleData::new().data(vec![])), ), ) .meta(SensitiveDataScannerMetaVersionOnly::new()); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.create_scanning_group(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create Scanning Group returns "OK" response ``` /** * Create Scanning Group returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); // a valid "configuration" in the system const CONFIGURATION_DATA_ID = process.env.CONFIGURATION_DATA_ID as string; const params: v2.SensitiveDataScannerApiCreateScanningGroupRequest = { body: { meta: {}, data: { type: "sensitive_data_scanner_group", attributes: { name: "Example-Sensitive-Data-Scanner", isEnabled: false, productList: ["logs"], filter: { query: "*", }, }, relationships: { configuration: { data: { type: "sensitive_data_scanner_configuration", id: CONFIGURATION_DATA_ID, }, }, rules: { data: [], }, }, }, }, }; apiInstance .createScanningGroup(params) .then((data: v2.SensitiveDataScannerCreateGroupResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Scanning Group](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#update-scanning-group) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#update-scanning-group-v2) PATCH https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id} ### Overview Update a group, including the order of the rules. Rules within the group are reordered by including a rules relationship. If the rules relationship is present, its data section MUST contain linkages for all of the rules currently in the group, and MUST NOT contain any others. This endpoint requires the `data_scanner_write` permission. ### Arguments #### Path Parameters Name Type Description group_id [_required_] string The ID of a group of rules. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description data [_required_] object Data related to the update of a group. attributes object Attributes of the Sensitive Data Scanner group. description string Description of the group. filter object Filter for the Scanning Group. query string Query to filter the events. is_enabled boolean Whether or not the group is enabled. name string Name of the group. product_list [string] List of products the scanning group applies. samplings [object] List of sampling rates per product type. product enum Datadog product onto which Sensitive Data Scanner can be activated. Allowed enum values: `logs,rum,events,apm` default: `logs` rate double Rate at which data in product type will be scanned, as a percentage. id string ID of the group. relationships object Relationships of the group. configuration object A Sensitive Data Scanner configuration data. data object A Sensitive Data Scanner configuration. id string ID of the configuration. type enum Sensitive Data Scanner configuration type. Allowed enum values: `sensitive_data_scanner_configuration` default: `sensitive_data_scanner_configuration` rules object Rules included in the group. data [object] Rules included in the group. The order is important. id string ID of the rule. type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` meta [_required_] object Meta payload containing information about the API. 
version int64 Version of the API (optional). ``` { "meta": {}, "data": { "id": "string", "type": "sensitive_data_scanner_group", "attributes": { "name": "Example-Sensitive-Data-Scanner", "is_enabled": false, "product_list": [ "logs" ], "filter": { "query": "*" } }, "relationships": { "configuration": { "data": { "type": "sensitive_data_scanner_configuration", "id": "55482444-d71c-c45c-7d1f-31984f64e6d2" } }, "rules": { "data": [] } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningGroup-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningGroup-403-v2) * [404](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningGroup-404-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningGroup-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Update group response. Field Type Description meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Update Scanning Group returns "OK" response Copy ``` # Path parameters export group_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/${group_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {}, "data": { "id": "string", "type": "sensitive_data_scanner_group", "attributes": { "name": "Example-Sensitive-Data-Scanner", "is_enabled": false, "product_list": [ "logs" ], "filter": { "query": "*" } }, "relationships": { "configuration": { "data": { "type": "sensitive_data_scanner_configuration", "id": "55482444-d71c-c45c-7d1f-31984f64e6d2" } }, "rules": { "data": [] } } } } EOF ``` ##### Update Scanning Group returns "OK" response ``` // Update Scanning Group returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "scanning_group" in the system GroupDataID := os.Getenv("GROUP_DATA_ID") // a valid "configuration" in the system ConfigurationDataID := os.Getenv("CONFIGURATION_DATA_ID") body := datadogV2.SensitiveDataScannerGroupUpdateRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, Data: datadogV2.SensitiveDataScannerGroupUpdate{ Id: datadog.PtrString(GroupDataID), Type: datadogV2.SENSITIVEDATASCANNERGROUPTYPE_SENSITIVE_DATA_SCANNER_GROUP.Ptr(), Attributes: &datadogV2.SensitiveDataScannerGroupAttributes{ Name: datadog.PtrString("Example-Sensitive-Data-Scanner"), IsEnabled: datadog.PtrBool(false), ProductList: []datadogV2.SensitiveDataScannerProduct{ datadogV2.SENSITIVEDATASCANNERPRODUCT_LOGS, }, Filter: &datadogV2.SensitiveDataScannerFilter{ Query: datadog.PtrString("*"), }, }, Relationships: &datadogV2.SensitiveDataScannerGroupRelationships{ Configuration: &datadogV2.SensitiveDataScannerConfigurationData{ Data: &datadogV2.SensitiveDataScannerConfiguration{ Type: datadogV2.SENSITIVEDATASCANNERCONFIGURATIONTYPE_SENSITIVE_DATA_SCANNER_CONFIGURATIONS.Ptr(), Id: datadog.PtrString(ConfigurationDataID), }, }, Rules: &datadogV2.SensitiveDataScannerRuleData{ Data: []datadogV2.SensitiveDataScannerRule{}, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.UpdateScanningGroup(ctx, GroupDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.UpdateScanningGroup`: %v\n", 
err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.UpdateScanningGroup`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Scanning Group returns "OK" response ``` // Update Scanning Group returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerConfiguration; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationData; import com.datadog.api.client.v2.model.SensitiveDataScannerConfigurationType; import com.datadog.api.client.v2.model.SensitiveDataScannerFilter; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupAttributes; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupRelationships; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupType; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupUpdate; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupUpdateRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupUpdateResponse; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerProduct; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleData; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // there is a valid "scanning_group" in the system String GROUP_DATA_ID = System.getenv("GROUP_DATA_ID"); // a valid "configuration" in the system String CONFIGURATION_DATA_ID = System.getenv("CONFIGURATION_DATA_ID"); SensitiveDataScannerGroupUpdateRequest body = new SensitiveDataScannerGroupUpdateRequest() .meta(new SensitiveDataScannerMetaVersionOnly()) .data( new SensitiveDataScannerGroupUpdate() .id(GROUP_DATA_ID) .type(SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP) .attributes( new SensitiveDataScannerGroupAttributes() .name("Example-Sensitive-Data-Scanner") .isEnabled(false) .productList( Collections.singletonList(SensitiveDataScannerProduct.LOGS)) .filter(new SensitiveDataScannerFilter().query("*"))) .relationships( new SensitiveDataScannerGroupRelationships() .configuration( new SensitiveDataScannerConfigurationData() .data( new SensitiveDataScannerConfiguration() .type( SensitiveDataScannerConfigurationType .SENSITIVE_DATA_SCANNER_CONFIGURATIONS) .id(CONFIGURATION_DATA_ID))) .rules(new SensitiveDataScannerRuleData()))); try { SensitiveDataScannerGroupUpdateResponse result = apiInstance.updateScanningGroup(GROUP_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#updateScanningGroup"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } 
``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Scanning Group returns "OK" response ``` """ Update Scanning Group returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_configuration import SensitiveDataScannerConfiguration from datadog_api_client.v2.model.sensitive_data_scanner_configuration_data import SensitiveDataScannerConfigurationData from datadog_api_client.v2.model.sensitive_data_scanner_configuration_type import SensitiveDataScannerConfigurationType from datadog_api_client.v2.model.sensitive_data_scanner_filter import SensitiveDataScannerFilter from datadog_api_client.v2.model.sensitive_data_scanner_group_attributes import SensitiveDataScannerGroupAttributes from datadog_api_client.v2.model.sensitive_data_scanner_group_relationships import ( SensitiveDataScannerGroupRelationships, ) from datadog_api_client.v2.model.sensitive_data_scanner_group_type import SensitiveDataScannerGroupType from datadog_api_client.v2.model.sensitive_data_scanner_group_update import SensitiveDataScannerGroupUpdate from datadog_api_client.v2.model.sensitive_data_scanner_group_update_request import ( SensitiveDataScannerGroupUpdateRequest, ) from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_product import SensitiveDataScannerProduct from datadog_api_client.v2.model.sensitive_data_scanner_rule_data import SensitiveDataScannerRuleData # there is a valid "scanning_group" in the system GROUP_DATA_ID = environ["GROUP_DATA_ID"] # a valid "configuration" in the system CONFIGURATION_DATA_ID = environ["CONFIGURATION_DATA_ID"] body = SensitiveDataScannerGroupUpdateRequest( meta=SensitiveDataScannerMetaVersionOnly(), data=SensitiveDataScannerGroupUpdate( id=GROUP_DATA_ID, type=SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP, attributes=SensitiveDataScannerGroupAttributes( name="Example-Sensitive-Data-Scanner", is_enabled=False, product_list=[ SensitiveDataScannerProduct.LOGS, ], filter=SensitiveDataScannerFilter( query="*", ), ), relationships=SensitiveDataScannerGroupRelationships( configuration=SensitiveDataScannerConfigurationData( data=SensitiveDataScannerConfiguration( type=SensitiveDataScannerConfigurationType.SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id=CONFIGURATION_DATA_ID, ), ), rules=SensitiveDataScannerRuleData( data=[], ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.update_scanning_group(group_id=GROUP_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Scanning Group 
returns "OK" response ``` # Update Scanning Group returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # there is a valid "scanning_group" in the system GROUP_DATA_ID = ENV["GROUP_DATA_ID"] # a valid "configuration" in the system CONFIGURATION_DATA_ID = ENV["CONFIGURATION_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerGroupUpdateRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), data: DatadogAPIClient::V2::SensitiveDataScannerGroupUpdate.new({ id: GROUP_DATA_ID, type: DatadogAPIClient::V2::SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, attributes: DatadogAPIClient::V2::SensitiveDataScannerGroupAttributes.new({ name: "Example-Sensitive-Data-Scanner", is_enabled: false, product_list: [ DatadogAPIClient::V2::SensitiveDataScannerProduct::LOGS, ], filter: DatadogAPIClient::V2::SensitiveDataScannerFilter.new({ query: "*", }), }), relationships: DatadogAPIClient::V2::SensitiveDataScannerGroupRelationships.new({ configuration: DatadogAPIClient::V2::SensitiveDataScannerConfigurationData.new({ data: DatadogAPIClient::V2::SensitiveDataScannerConfiguration.new({ type: DatadogAPIClient::V2::SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS, id: CONFIGURATION_DATA_ID, }), }), rules: DatadogAPIClient::V2::SensitiveDataScannerRuleData.new({ data: [], }), }), }), }) p api_instance.update_scanning_group(GROUP_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Scanning Group returns "OK" response ``` // Update Scanning Group returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfiguration; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationData; use datadog_api_client::datadogV2::model::SensitiveDataScannerConfigurationType; use datadog_api_client::datadogV2::model::SensitiveDataScannerFilter; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupAttributes; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupRelationships; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupType; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupUpdate; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupUpdateRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerProduct; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleData; #[tokio::main] async fn main() { // there is a valid "scanning_group" in the system let group_data_id = std::env::var("GROUP_DATA_ID").unwrap(); // a valid "configuration" in the system let configuration_data_id = std::env::var("CONFIGURATION_DATA_ID").unwrap(); let body = SensitiveDataScannerGroupUpdateRequest::new( SensitiveDataScannerGroupUpdate::new() .attributes( SensitiveDataScannerGroupAttributes::new() .filter(SensitiveDataScannerFilter::new().query("*".to_string())) .is_enabled(false) .name("Example-Sensitive-Data-Scanner".to_string()) 
.product_list(vec![SensitiveDataScannerProduct::LOGS]), ) .id(group_data_id.clone()) .relationships( SensitiveDataScannerGroupRelationships::new() .configuration( SensitiveDataScannerConfigurationData ::new().data( SensitiveDataScannerConfiguration::new() .id(configuration_data_id.clone()) .type_( SensitiveDataScannerConfigurationType::SENSITIVE_DATA_SCANNER_CONFIGURATIONS, ), ), ) .rules(SensitiveDataScannerRuleData::new().data(vec![])), ) .type_(SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP), SensitiveDataScannerMetaVersionOnly::new(), ); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.update_scanning_group(group_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Scanning Group returns "OK" response ``` /** * Update Scanning Group returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); // there is a valid "scanning_group" in the system const GROUP_DATA_ID = process.env.GROUP_DATA_ID as string; // a valid "configuration" in the system const CONFIGURATION_DATA_ID = process.env.CONFIGURATION_DATA_ID as string; const params: v2.SensitiveDataScannerApiUpdateScanningGroupRequest = { body: { meta: {}, data: { id: GROUP_DATA_ID, type: "sensitive_data_scanner_group", attributes: { name: "Example-Sensitive-Data-Scanner", isEnabled: false, productList: ["logs"], filter: { query: "*", }, }, relationships: { configuration: { data: { type: "sensitive_data_scanner_configuration", id: CONFIGURATION_DATA_ID, }, }, rules: { data: [], }, }, }, }, groupId: GROUP_DATA_ID, }; apiInstance .updateScanningGroup(params) .then((data: v2.SensitiveDataScannerGroupUpdateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Scanning Group](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#delete-scanning-group) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#delete-scanning-group-v2) DELETE https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id}https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/{group_id} ### Overview Delete a given group. This endpoint requires the `data_scanner_write` permission. ### Arguments #### Path Parameters Name Type Description group_id [_required_] string The ID of a group of rules. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description meta [_required_] object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": {} } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningGroup-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningGroup-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningGroup-403-v2) * [404](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningGroup-404-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningGroup-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Delete group response. Field Type Description meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Delete Scanning Group returns "OK" response Copy ``` # Path parameters export group_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/groups/${group_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {} } EOF ``` ##### Delete Scanning Group returns "OK" response ``` // Delete Scanning Group returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "scanning_group" in the system GroupDataID := os.Getenv("GROUP_DATA_ID") body := datadogV2.SensitiveDataScannerGroupDeleteRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.DeleteScanningGroup(ctx, GroupDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.DeleteScanningGroup`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.DeleteScanningGroup`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Scanning Group returns "OK" response ``` // Delete Scanning Group returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupDeleteRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupDeleteResponse; import 
com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // there is a valid "scanning_group" in the system String GROUP_DATA_ID = System.getenv("GROUP_DATA_ID"); SensitiveDataScannerGroupDeleteRequest body = new SensitiveDataScannerGroupDeleteRequest() .meta(new SensitiveDataScannerMetaVersionOnly()); try { SensitiveDataScannerGroupDeleteResponse result = apiInstance.deleteScanningGroup(GROUP_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#deleteScanningGroup"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Scanning Group returns "OK" response ``` """ Delete Scanning Group returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_group_delete_request import ( SensitiveDataScannerGroupDeleteRequest, ) from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly # there is a valid "scanning_group" in the system GROUP_DATA_ID = environ["GROUP_DATA_ID"] body = SensitiveDataScannerGroupDeleteRequest( meta=SensitiveDataScannerMetaVersionOnly(), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.delete_scanning_group(group_id=GROUP_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Scanning Group returns "OK" response ``` # Delete Scanning Group returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # there is a valid "scanning_group" in the system GROUP_DATA_ID = ENV["GROUP_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerGroupDeleteRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), }) p api_instance.delete_scanning_group(GROUP_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Scanning Group returns "OK" 
response

```
// Delete Scanning Group returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI;
use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupDeleteRequest;
use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly;

#[tokio::main]
async fn main() {
    // there is a valid "scanning_group" in the system
    let group_data_id = std::env::var("GROUP_DATA_ID").unwrap();
    let body =
        SensitiveDataScannerGroupDeleteRequest::new(SensitiveDataScannerMetaVersionOnly::new());
    let configuration = datadog::Configuration::new();
    let api = SensitiveDataScannerAPI::with_config(configuration);
    let resp = api.delete_scanning_group(group_data_id.clone(), body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands:

```
# DD_SITE is one of: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Delete Scanning Group returns "OK" response

```
/**
 * Delete Scanning Group returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.SensitiveDataScannerApi(configuration);

// there is a valid "scanning_group" in the system
const GROUP_DATA_ID = process.env.GROUP_DATA_ID as string;

const params: v2.SensitiveDataScannerApiDeleteScanningGroupRequest = {
  body: {
    meta: {},
  },
  groupId: GROUP_DATA_ID,
};

apiInstance
  .deleteScanningGroup(params)
  .then((data: v2.SensitiveDataScannerGroupDeleteResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
# DD_SITE is one of: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Create Scanning Rule](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#create-scanning-rule)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#create-scanning-rule-v2)

POST
https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/rules
https://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/rules
https://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/rules
https://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/rules
https://api.datadoghq.com/api/v2/sensitive-data-scanner/config/rules
https://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/rules
https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules

### Overview

Create a scanning rule in a sensitive data scanner group, ordered last. The posted rule MUST include a group relationship. It MUST include either a standard_pattern relationship or a regex attribute, but not both. If included_attributes is empty or missing, we will scan all attributes except excluded_attributes. If both are missing, we will scan the whole event.

This endpoint requires the `data_scanner_write` permission.
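All of the client examples in this section create a rule from a custom regex `pattern`. The other allowed shape described above, a rule tied to a standard pattern, instead omits the `pattern` attribute and adds a `standard_pattern` relationship next to the required `group` relationship. The snippet below is a minimal sketch of that variant; it calls the endpoint directly with the Python `requests` package rather than an official client, and the rule name plus the `GROUP_ID` and `STANDARD_PATTERN_ID` environment variables are illustrative placeholders, not values from this reference.

```
import os

import requests

# Sketch: create a scanning rule that reuses a standard pattern instead of a
# custom regex. Per the overview above, the body carries a standard_pattern
# relationship and no "pattern" attribute (never both).
# GROUP_ID and STANDARD_PATTERN_ID are placeholder environment variables for
# IDs that already exist in your org; DD_SITE defaults to datadoghq.com here.
site = os.environ.get("DD_SITE", "datadoghq.com")
body = {
    "meta": {},
    "data": {
        "type": "sensitive_data_scanner_rule",
        "attributes": {
            "name": "Example-Standard-Pattern-Rule",  # illustrative name
            "is_enabled": True,
            "priority": 1,
            "tags": ["sensitive_data:true"],
        },
        "relationships": {
            "group": {
                "data": {
                    "type": "sensitive_data_scanner_group",
                    "id": os.environ["GROUP_ID"],
                }
            },
            "standard_pattern": {
                "data": {
                    "type": "sensitive_data_scanner_standard_pattern",
                    "id": os.environ["STANDARD_PATTERN_ID"],
                }
            },
        },
    },
}

response = requests.post(
    f"https://api.{site}/api/v2/sensitive-data-scanner/config/rules",
    headers={
        "Accept": "application/json",
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=body,
)
response.raise_for_status()
print(response.json())
```

Aside from the body shape, the call is identical to the regex-based examples that follow: same endpoint, same headers, and the same `data_scanner_write` permission.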
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description data [_required_] object Data related to the creation of a rule. attributes [_required_] object Attributes of the Sensitive Data Scanner rule. description string Description of the rule. excluded_namespaces [string] Attributes excluded from the scan. If namespaces is provided, it has to be a sub-path of the namespaces array. included_keyword_configuration object Object defining a set of keywords and a number of characters that help reduce noise. You can provide a list of keywords you would like to check within a defined proximity of the matching pattern. If any of the keywords are found within the proximity check, the match is kept. If none are found, the match is discarded. character_count [_required_] int64 The number of characters behind a match detected by Sensitive Data Scanner to look for the keywords defined. `character_count` should be greater than the maximum length of a keyword defined for a rule. keywords [_required_] [string] Keyword list that will be checked during scanning in order to validate a match. The number of keywords in the list must be less than or equal to 30. use_recommended_keywords boolean Should the rule use the underlying standard pattern keyword configuration. If set to `true`, the rule must be tied to a standard pattern. If set to `false`, the specified keywords and `character_count` are applied. is_enabled boolean Whether or not the rule is enabled. name string Name of the rule. namespaces [string] Attributes included in the scan. If namespaces is empty or missing, all attributes except excluded_namespaces are scanned. If both are missing the whole event is scanned. pattern string Not included if there is a relationship to a standard pattern. priority int64 Integer from 1 (high) to 5 (low) indicating rule issue severity. tags [string] List of tags. text_replacement object Object describing how the scanned event will be replaced. number_of_chars int64 Required if type == 'partial_replacement_from_beginning' or 'partial_replacement_from_end'. It must be > 0. replacement_string string Required if type == 'replacement_string'. should_save_match boolean Only valid when type == `replacement_string`. When enabled, matches can be unmasked in logs by users with ‘Data Scanner Unmask’ permission. As a security best practice, avoid masking for highly-sensitive, long-lived data. type enum Type of the replacement text. None means no replacement. hash means the data will be stubbed. replacement_string means that one can chose a text to replace the data. partial_replacement_from_beginning allows a user to partially replace the data from the beginning, and partial_replacement_from_end on the other hand, allows to replace data from the end. Allowed enum values: `none,hash,replacement_string,partial_replacement_from_beginning,partial_replacement_from_end` default: `none` relationships [_required_] object Relationships of a scanning rule. group object A scanning group data. data object A scanning group. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` standard_pattern object A standard pattern. data object Data containing the standard pattern id. id string ID of the standard pattern. type enum Sensitive Data Scanner standard pattern type. 
Allowed enum values: `sensitive_data_scanner_standard_pattern` default: `sensitive_data_scanner_standard_pattern` type [_required_] enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` meta [_required_] object Meta payload containing information about the API. version int64 Version of the API (optional). ##### Create Scanning Rule returns "OK" response ``` { "meta": {}, "data": { "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "namespaces": [ "admin" ], "excluded_namespaces": [ "admin.name" ], "text_replacement": { "type": "none" }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 1, "included_keyword_configuration": { "keywords": [ "credit card" ], "character_count": 35 } }, "relationships": { "group": { "data": { "type": "sensitive_data_scanner_group", "id": "string" } } } } } ``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` { "meta": {}, "data": { "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "text_replacement": { "type": "replacement_string", "replacement_string": "REDACTED", "should_save_match": true }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 1 }, "relationships": { "group": { "data": { "type": "sensitive_data_scanner_group", "id": "string" } } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#CreateScanningRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Create rule response. Field Type Description data object Response data related to the creation of a rule. attributes object Attributes of the Sensitive Data Scanner rule. description string Description of the rule. excluded_namespaces [string] Attributes excluded from the scan. If namespaces is provided, it has to be a sub-path of the namespaces array. included_keyword_configuration object Object defining a set of keywords and a number of characters that help reduce noise. You can provide a list of keywords you would like to check within a defined proximity of the matching pattern. If any of the keywords are found within the proximity check, the match is kept. If none are found, the match is discarded. character_count [_required_] int64 The number of characters behind a match detected by Sensitive Data Scanner to look for the keywords defined. `character_count` should be greater than the maximum length of a keyword defined for a rule. keywords [_required_] [string] Keyword list that will be checked during scanning in order to validate a match. The number of keywords in the list must be less than or equal to 30. use_recommended_keywords boolean Should the rule use the underlying standard pattern keyword configuration. If set to `true`, the rule must be tied to a standard pattern. If set to `false`, the specified keywords and `character_count` are applied. is_enabled boolean Whether or not the rule is enabled. name string Name of the rule. namespaces [string] Attributes included in the scan. 
If namespaces is empty or missing, all attributes except excluded_namespaces are scanned. If both are missing the whole event is scanned. pattern string Not included if there is a relationship to a standard pattern. priority int64 Integer from 1 (high) to 5 (low) indicating rule issue severity. tags [string] List of tags. text_replacement object Object describing how the scanned event will be replaced. number_of_chars int64 Required if type == 'partial_replacement_from_beginning' or 'partial_replacement_from_end'. It must be > 0. replacement_string string Required if type == 'replacement_string'. should_save_match boolean Only valid when type == `replacement_string`. When enabled, matches can be unmasked in logs by users with ‘Data Scanner Unmask’ permission. As a security best practice, avoid masking for highly-sensitive, long-lived data. type enum Type of the replacement text. None means no replacement. hash means the data will be stubbed. replacement_string means that one can chose a text to replace the data. partial_replacement_from_beginning allows a user to partially replace the data from the beginning, and partial_replacement_from_end on the other hand, allows to replace data from the end. Allowed enum values: `none,hash,replacement_string,partial_replacement_from_beginning,partial_replacement_from_end` default: `none` id string ID of the rule. relationships object Relationships of a scanning rule. group object A scanning group data. data object A scanning group. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` standard_pattern object A standard pattern. data object Data containing the standard pattern id. id string ID of the standard pattern. type enum Sensitive Data Scanner standard pattern type. Allowed enum values: `sensitive_data_scanner_standard_pattern` default: `sensitive_data_scanner_standard_pattern` type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "data": { "attributes": { "description": "string", "excluded_namespaces": [ "admin.name" ], "included_keyword_configuration": { "character_count": 30, "keywords": [ "credit card", "cc" ], "use_recommended_keywords": false }, "is_enabled": false, "name": "string", "namespaces": [ "admin" ], "pattern": "string", "priority": "integer", "tags": [], "text_replacement": { "number_of_chars": "integer", "replacement_string": "string", "should_save_match": false, "type": "string" } }, "id": "string", "relationships": { "group": { "data": { "id": "string", "type": "sensitive_data_scanner_group" } }, "standard_pattern": { "data": { "id": "string", "type": "sensitive_data_scanner_standard_pattern" } } }, "type": "sensitive_data_scanner_rule" }, "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Create Scanning Rule returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {}, "data": { "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "namespaces": [ "admin" ], "excluded_namespaces": [ "admin.name" ], "text_replacement": { "type": "none" }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 1, "included_keyword_configuration": { "keywords": [ "credit card" ], "character_count": 35 } }, "relationships": { "group": { "data": { "type": "sensitive_data_scanner_group", "id": "string" } } } } } EOF ``` ##### Create Scanning Rule with should_save_match returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {}, "data": { "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "text_replacement": { "type": "replacement_string", "replacement_string": "REDACTED", "should_save_match": true }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 1 }, "relationships": { "group": { "data": { "type": "sensitive_data_scanner_group", "id": "string" } } } } } EOF ``` ##### Create Scanning Rule returns "OK" response ``` // Create Scanning Rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "scanning_group" in the system GroupDataID := os.Getenv("GROUP_DATA_ID") body := datadogV2.SensitiveDataScannerRuleCreateRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, Data: 
datadogV2.SensitiveDataScannerRuleCreate{ Type: datadogV2.SENSITIVEDATASCANNERRULETYPE_SENSITIVE_DATA_SCANNER_RULE, Attributes: datadogV2.SensitiveDataScannerRuleAttributes{ Name: datadog.PtrString("Example-Sensitive-Data-Scanner"), Pattern: datadog.PtrString("pattern"), Namespaces: []string{ "admin", }, ExcludedNamespaces: []string{ "admin.name", }, TextReplacement: &datadogV2.SensitiveDataScannerTextReplacement{ Type: datadogV2.SENSITIVEDATASCANNERTEXTREPLACEMENTTYPE_NONE.Ptr(), }, Tags: []string{ "sensitive_data:true", }, IsEnabled: datadog.PtrBool(true), Priority: datadog.PtrInt64(1), IncludedKeywordConfiguration: &datadogV2.SensitiveDataScannerIncludedKeywordConfiguration{ Keywords: []string{ "credit card", }, CharacterCount: 35, }, }, Relationships: datadogV2.SensitiveDataScannerRuleRelationships{ Group: &datadogV2.SensitiveDataScannerGroupData{ Data: &datadogV2.SensitiveDataScannerGroup{ Type: datadogV2.SENSITIVEDATASCANNERGROUPTYPE_SENSITIVE_DATA_SCANNER_GROUP.Ptr(), Id: datadog.PtrString(GroupDataID), }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.CreateScanningRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.CreateScanningRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.CreateScanningRule`:\n%s\n", responseContent) } ``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` // Create Scanning Rule with should_save_match returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "scanning_group" in the system GroupDataID := os.Getenv("GROUP_DATA_ID") body := datadogV2.SensitiveDataScannerRuleCreateRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, Data: datadogV2.SensitiveDataScannerRuleCreate{ Type: datadogV2.SENSITIVEDATASCANNERRULETYPE_SENSITIVE_DATA_SCANNER_RULE, Attributes: datadogV2.SensitiveDataScannerRuleAttributes{ Name: datadog.PtrString("Example-Sensitive-Data-Scanner"), Pattern: datadog.PtrString("pattern"), TextReplacement: &datadogV2.SensitiveDataScannerTextReplacement{ Type: datadogV2.SENSITIVEDATASCANNERTEXTREPLACEMENTTYPE_REPLACEMENT_STRING.Ptr(), ReplacementString: datadog.PtrString("REDACTED"), ShouldSaveMatch: datadog.PtrBool(true), }, Tags: []string{ "sensitive_data:true", }, IsEnabled: datadog.PtrBool(true), Priority: datadog.PtrInt64(1), }, Relationships: datadogV2.SensitiveDataScannerRuleRelationships{ Group: &datadogV2.SensitiveDataScannerGroupData{ Data: &datadogV2.SensitiveDataScannerGroup{ Type: datadogV2.SENSITIVEDATASCANNERGROUPTYPE_SENSITIVE_DATA_SCANNER_GROUP.Ptr(), Id: datadog.PtrString(GroupDataID), }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.CreateScanningRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.CreateScanningRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.CreateScanningRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create Scanning Rule returns "OK" response ``` // Create Scanning Rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerCreateRuleResponse; import com.datadog.api.client.v2.model.SensitiveDataScannerGroup; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupData; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupType; import com.datadog.api.client.v2.model.SensitiveDataScannerIncludedKeywordConfiguration; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleAttributes; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleCreate; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleCreateRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleRelationships; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleType; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacement; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacementType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // there is a valid "scanning_group" in the system String GROUP_DATA_ID = System.getenv("GROUP_DATA_ID"); SensitiveDataScannerRuleCreateRequest body = new SensitiveDataScannerRuleCreateRequest() .meta(new SensitiveDataScannerMetaVersionOnly()) .data( new SensitiveDataScannerRuleCreate() .type(SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE) .attributes( new SensitiveDataScannerRuleAttributes() .name("Example-Sensitive-Data-Scanner") .pattern("pattern") .namespaces(Collections.singletonList("admin")) .excludedNamespaces(Collections.singletonList("admin.name")) .textReplacement( new SensitiveDataScannerTextReplacement() .type(SensitiveDataScannerTextReplacementType.NONE)) .tags(Collections.singletonList("sensitive_data:true")) .isEnabled(true) .priority(1L) .includedKeywordConfiguration( new SensitiveDataScannerIncludedKeywordConfiguration() .keywords(Collections.singletonList("credit card")) .characterCount(35L))) .relationships( new SensitiveDataScannerRuleRelationships() .group( new SensitiveDataScannerGroupData() .data( new SensitiveDataScannerGroup() .type( SensitiveDataScannerGroupType .SENSITIVE_DATA_SCANNER_GROUP) .id(GROUP_DATA_ID))))); try { SensitiveDataScannerCreateRuleResponse result = apiInstance.createScanningRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#createScanningRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); 
e.printStackTrace(); } } } ``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` // Create Scanning Rule with should_save_match returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerCreateRuleResponse; import com.datadog.api.client.v2.model.SensitiveDataScannerGroup; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupData; import com.datadog.api.client.v2.model.SensitiveDataScannerGroupType; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleAttributes; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleCreate; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleCreateRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleRelationships; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleType; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacement; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacementType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // there is a valid "scanning_group" in the system String GROUP_DATA_ID = System.getenv("GROUP_DATA_ID"); SensitiveDataScannerRuleCreateRequest body = new SensitiveDataScannerRuleCreateRequest() .meta(new SensitiveDataScannerMetaVersionOnly()) .data( new SensitiveDataScannerRuleCreate() .type(SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE) .attributes( new SensitiveDataScannerRuleAttributes() .name("Example-Sensitive-Data-Scanner") .pattern("pattern") .textReplacement( new SensitiveDataScannerTextReplacement() .type( SensitiveDataScannerTextReplacementType.REPLACEMENT_STRING) .replacementString("REDACTED") .shouldSaveMatch(true)) .tags(Collections.singletonList("sensitive_data:true")) .isEnabled(true) .priority(1L)) .relationships( new SensitiveDataScannerRuleRelationships() .group( new SensitiveDataScannerGroupData() .data( new SensitiveDataScannerGroup() .type( SensitiveDataScannerGroupType .SENSITIVE_DATA_SCANNER_GROUP) .id(GROUP_DATA_ID))))); try { SensitiveDataScannerCreateRuleResponse result = apiInstance.createScanningRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#createScanningRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create Scanning Rule returns "OK" response ``` """ Create Scanning Rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from 
datadog_api_client.v2.model.sensitive_data_scanner_group import SensitiveDataScannerGroup from datadog_api_client.v2.model.sensitive_data_scanner_group_data import SensitiveDataScannerGroupData from datadog_api_client.v2.model.sensitive_data_scanner_group_type import SensitiveDataScannerGroupType from datadog_api_client.v2.model.sensitive_data_scanner_included_keyword_configuration import ( SensitiveDataScannerIncludedKeywordConfiguration, ) from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_rule_attributes import SensitiveDataScannerRuleAttributes from datadog_api_client.v2.model.sensitive_data_scanner_rule_create import SensitiveDataScannerRuleCreate from datadog_api_client.v2.model.sensitive_data_scanner_rule_create_request import SensitiveDataScannerRuleCreateRequest from datadog_api_client.v2.model.sensitive_data_scanner_rule_relationships import SensitiveDataScannerRuleRelationships from datadog_api_client.v2.model.sensitive_data_scanner_rule_type import SensitiveDataScannerRuleType from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement import SensitiveDataScannerTextReplacement from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement_type import ( SensitiveDataScannerTextReplacementType, ) # there is a valid "scanning_group" in the system GROUP_DATA_ID = environ["GROUP_DATA_ID"] body = SensitiveDataScannerRuleCreateRequest( meta=SensitiveDataScannerMetaVersionOnly(), data=SensitiveDataScannerRuleCreate( type=SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE, attributes=SensitiveDataScannerRuleAttributes( name="Example-Sensitive-Data-Scanner", pattern="pattern", namespaces=[ "admin", ], excluded_namespaces=[ "admin.name", ], text_replacement=SensitiveDataScannerTextReplacement( type=SensitiveDataScannerTextReplacementType.NONE, ), tags=[ "sensitive_data:true", ], is_enabled=True, priority=1, included_keyword_configuration=SensitiveDataScannerIncludedKeywordConfiguration( keywords=[ "credit card", ], character_count=35, ), ), relationships=SensitiveDataScannerRuleRelationships( group=SensitiveDataScannerGroupData( data=SensitiveDataScannerGroup( type=SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP, id=GROUP_DATA_ID, ), ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.create_scanning_rule(body=body) print(response) ``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` """ Create Scanning Rule with should_save_match returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_group import SensitiveDataScannerGroup from datadog_api_client.v2.model.sensitive_data_scanner_group_data import SensitiveDataScannerGroupData from datadog_api_client.v2.model.sensitive_data_scanner_group_type import SensitiveDataScannerGroupType from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_rule_attributes import SensitiveDataScannerRuleAttributes from datadog_api_client.v2.model.sensitive_data_scanner_rule_create import SensitiveDataScannerRuleCreate from 
datadog_api_client.v2.model.sensitive_data_scanner_rule_create_request import SensitiveDataScannerRuleCreateRequest from datadog_api_client.v2.model.sensitive_data_scanner_rule_relationships import SensitiveDataScannerRuleRelationships from datadog_api_client.v2.model.sensitive_data_scanner_rule_type import SensitiveDataScannerRuleType from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement import SensitiveDataScannerTextReplacement from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement_type import ( SensitiveDataScannerTextReplacementType, ) # there is a valid "scanning_group" in the system GROUP_DATA_ID = environ["GROUP_DATA_ID"] body = SensitiveDataScannerRuleCreateRequest( meta=SensitiveDataScannerMetaVersionOnly(), data=SensitiveDataScannerRuleCreate( type=SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE, attributes=SensitiveDataScannerRuleAttributes( name="Example-Sensitive-Data-Scanner", pattern="pattern", text_replacement=SensitiveDataScannerTextReplacement( type=SensitiveDataScannerTextReplacementType.REPLACEMENT_STRING, replacement_string="REDACTED", should_save_match=True, ), tags=[ "sensitive_data:true", ], is_enabled=True, priority=1, ), relationships=SensitiveDataScannerRuleRelationships( group=SensitiveDataScannerGroupData( data=SensitiveDataScannerGroup( type=SensitiveDataScannerGroupType.SENSITIVE_DATA_SCANNER_GROUP, id=GROUP_DATA_ID, ), ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.create_scanning_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create Scanning Rule returns "OK" response ``` # Create Scanning Rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # there is a valid "scanning_group" in the system GROUP_DATA_ID = ENV["GROUP_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerRuleCreateRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), data: DatadogAPIClient::V2::SensitiveDataScannerRuleCreate.new({ type: DatadogAPIClient::V2::SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE, attributes: DatadogAPIClient::V2::SensitiveDataScannerRuleAttributes.new({ name: "Example-Sensitive-Data-Scanner", pattern: "pattern", namespaces: [ "admin", ], excluded_namespaces: [ "admin.name", ], text_replacement: DatadogAPIClient::V2::SensitiveDataScannerTextReplacement.new({ type: DatadogAPIClient::V2::SensitiveDataScannerTextReplacementType::NONE, }), tags: [ "sensitive_data:true", ], is_enabled: true, priority: 1, included_keyword_configuration: DatadogAPIClient::V2::SensitiveDataScannerIncludedKeywordConfiguration.new({ keywords: [ "credit card", ], character_count: 35, }), }), relationships: DatadogAPIClient::V2::SensitiveDataScannerRuleRelationships.new({ group: DatadogAPIClient::V2::SensitiveDataScannerGroupData.new({ data: DatadogAPIClient::V2::SensitiveDataScannerGroup.new({ type: DatadogAPIClient::V2::SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, id: GROUP_DATA_ID, }), }), }), }), }) p api_instance.create_scanning_rule(body) 
``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` # Create Scanning Rule with should_save_match returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # there is a valid "scanning_group" in the system GROUP_DATA_ID = ENV["GROUP_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerRuleCreateRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), data: DatadogAPIClient::V2::SensitiveDataScannerRuleCreate.new({ type: DatadogAPIClient::V2::SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE, attributes: DatadogAPIClient::V2::SensitiveDataScannerRuleAttributes.new({ name: "Example-Sensitive-Data-Scanner", pattern: "pattern", text_replacement: DatadogAPIClient::V2::SensitiveDataScannerTextReplacement.new({ type: DatadogAPIClient::V2::SensitiveDataScannerTextReplacementType::REPLACEMENT_STRING, replacement_string: "REDACTED", should_save_match: true, }), tags: [ "sensitive_data:true", ], is_enabled: true, priority: 1, }), relationships: DatadogAPIClient::V2::SensitiveDataScannerRuleRelationships.new({ group: DatadogAPIClient::V2::SensitiveDataScannerGroupData.new({ data: DatadogAPIClient::V2::SensitiveDataScannerGroup.new({ type: DatadogAPIClient::V2::SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP, id: GROUP_DATA_ID, }), }), }), }), }) p api_instance.create_scanning_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create Scanning Rule returns "OK" response ``` // Create Scanning Rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroup; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupData; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupType; use datadog_api_client::datadogV2::model::SensitiveDataScannerIncludedKeywordConfiguration; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleAttributes; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleCreate; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleCreateRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleRelationships; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleType; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacement; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacementType; #[tokio::main] async fn main() { // there is a valid "scanning_group" in the system let group_data_id = std::env::var("GROUP_DATA_ID").unwrap(); let body = SensitiveDataScannerRuleCreateRequest::new( SensitiveDataScannerRuleCreate::new( SensitiveDataScannerRuleAttributes::new() .excluded_namespaces(vec!["admin.name".to_string()]) .included_keyword_configuration( SensitiveDataScannerIncludedKeywordConfiguration::new( 35, vec!["credit card".to_string()], ), ) .is_enabled(true) .name("Example-Sensitive-Data-Scanner".to_string()) .namespaces(vec!["admin".to_string()]) 
.pattern("pattern".to_string()) .priority(1) .tags(vec!["sensitive_data:true".to_string()]) .text_replacement( SensitiveDataScannerTextReplacement::new() .type_(SensitiveDataScannerTextReplacementType::NONE), ), SensitiveDataScannerRuleRelationships::new().group( SensitiveDataScannerGroupData::new().data( SensitiveDataScannerGroup::new() .id(group_data_id.clone()) .type_(SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP), ), ), SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE, ), SensitiveDataScannerMetaVersionOnly::new(), ); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.create_scanning_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create Scanning Rule with should_save_match returns "OK" response ``` // Create Scanning Rule with should_save_match returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroup; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupData; use datadog_api_client::datadogV2::model::SensitiveDataScannerGroupType; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleAttributes; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleCreate; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleCreateRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleRelationships; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleType; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacement; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacementType; #[tokio::main] async fn main() { // there is a valid "scanning_group" in the system let group_data_id = std::env::var("GROUP_DATA_ID").unwrap(); let body = SensitiveDataScannerRuleCreateRequest::new( SensitiveDataScannerRuleCreate::new( SensitiveDataScannerRuleAttributes::new() .is_enabled(true) .name("Example-Sensitive-Data-Scanner".to_string()) .pattern("pattern".to_string()) .priority(1) .tags(vec!["sensitive_data:true".to_string()]) .text_replacement( SensitiveDataScannerTextReplacement::new() .replacement_string("REDACTED".to_string()) .should_save_match(true) .type_(SensitiveDataScannerTextReplacementType::REPLACEMENT_STRING), ), SensitiveDataScannerRuleRelationships::new().group( SensitiveDataScannerGroupData::new().data( SensitiveDataScannerGroup::new() .id(group_data_id.clone()) .type_(SensitiveDataScannerGroupType::SENSITIVE_DATA_SCANNER_GROUP), ), ), SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE, ), SensitiveDataScannerMetaVersionOnly::new(), ); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.create_scanning_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### 
Create Scanning Rule returns "OK" response

```
/**
 * Create Scanning Rule returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.SensitiveDataScannerApi(configuration);

// there is a valid "scanning_group" in the system
const GROUP_DATA_ID = process.env.GROUP_DATA_ID as string;

const params: v2.SensitiveDataScannerApiCreateScanningRuleRequest = {
  body: {
    meta: {},
    data: {
      type: "sensitive_data_scanner_rule",
      attributes: {
        name: "Example-Sensitive-Data-Scanner",
        pattern: "pattern",
        namespaces: ["admin"],
        excludedNamespaces: ["admin.name"],
        textReplacement: {
          type: "none",
        },
        tags: ["sensitive_data:true"],
        isEnabled: true,
        priority: 1,
        includedKeywordConfiguration: {
          keywords: ["credit card"],
          characterCount: 35,
        },
      },
      relationships: {
        group: {
          data: {
            type: "sensitive_data_scanner_group",
            id: GROUP_DATA_ID,
          },
        },
      },
    },
  },
};

apiInstance
  .createScanningRule(params)
  .then((data: v2.SensitiveDataScannerCreateRuleResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

##### Create Scanning Rule with should_save_match returns "OK" response

```
/**
 * Create Scanning Rule with should_save_match returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.SensitiveDataScannerApi(configuration);

// there is a valid "scanning_group" in the system
const GROUP_DATA_ID = process.env.GROUP_DATA_ID as string;

const params: v2.SensitiveDataScannerApiCreateScanningRuleRequest = {
  body: {
    meta: {},
    data: {
      type: "sensitive_data_scanner_rule",
      attributes: {
        name: "Example-Sensitive-Data-Scanner",
        pattern: "pattern",
        textReplacement: {
          type: "replacement_string",
          replacementString: "REDACTED",
          shouldSaveMatch: true,
        },
        tags: ["sensitive_data:true"],
        isEnabled: true,
        priority: 1,
      },
      relationships: {
        group: {
          data: {
            type: "sensitive_data_scanner_group",
            id: GROUP_DATA_ID,
          },
        },
      },
    },
  },
};

apiInstance
  .createScanningRule(params)
  .then((data: v2.SensitiveDataScannerCreateRuleResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
# DD_SITE is one of: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Update Scanning Rule](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#update-scanning-rule)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#update-scanning-rule-v2)

PATCH
https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}
https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}

### Overview

Update a scanning rule.
The request body MUST NOT include a standard_pattern relationship, as that relationship is non-editable. Trying to edit the regex attribute of a rule with a standard_pattern relationship will also result in an error. This endpoint requires the `data_scanner_write` permission. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description data [_required_] object Data related to the update of a rule. attributes object Attributes of the Sensitive Data Scanner rule. description string Description of the rule. excluded_namespaces [string] Attributes excluded from the scan. If namespaces is provided, it has to be a sub-path of the namespaces array. included_keyword_configuration object Object defining a set of keywords and a number of characters that help reduce noise. You can provide a list of keywords you would like to check within a defined proximity of the matching pattern. If any of the keywords are found within the proximity check, the match is kept. If none are found, the match is discarded. character_count [_required_] int64 The number of characters behind a match detected by Sensitive Data Scanner to look for the keywords defined. `character_count` should be greater than the maximum length of a keyword defined for a rule. keywords [_required_] [string] Keyword list that will be checked during scanning in order to validate a match. The number of keywords in the list must be less than or equal to 30. use_recommended_keywords boolean Should the rule use the underlying standard pattern keyword configuration. If set to `true`, the rule must be tied to a standard pattern. If set to `false`, the specified keywords and `character_count` are applied. is_enabled boolean Whether or not the rule is enabled. name string Name of the rule. namespaces [string] Attributes included in the scan. If namespaces is empty or missing, all attributes except excluded_namespaces are scanned. If both are missing the whole event is scanned. pattern string Not included if there is a relationship to a standard pattern. priority int64 Integer from 1 (high) to 5 (low) indicating rule issue severity. tags [string] List of tags. text_replacement object Object describing how the scanned event will be replaced. number_of_chars int64 Required if type == 'partial_replacement_from_beginning' or 'partial_replacement_from_end'. It must be > 0. replacement_string string Required if type == 'replacement_string'. should_save_match boolean Only valid when type == `replacement_string`. When enabled, matches can be unmasked in logs by users with ‘Data Scanner Unmask’ permission. As a security best practice, avoid masking for highly-sensitive, long-lived data. type enum Type of the replacement text. None means no replacement. hash means the data will be stubbed. replacement_string means that one can chose a text to replace the data. partial_replacement_from_beginning allows a user to partially replace the data from the beginning, and partial_replacement_from_end on the other hand, allows to replace data from the end. Allowed enum values: `none,hash,replacement_string,partial_replacement_from_beginning,partial_replacement_from_end` default: `none` id string ID of the rule. relationships object Relationships of a scanning rule. group object A scanning group data. 
data object A scanning group. id string ID of the group. type enum Sensitive Data Scanner group type. Allowed enum values: `sensitive_data_scanner_group` default: `sensitive_data_scanner_group` standard_pattern object A standard pattern. data object Data containing the standard pattern id. id string ID of the standard pattern. type enum Sensitive Data Scanner standard pattern type. Allowed enum values: `sensitive_data_scanner_standard_pattern` default: `sensitive_data_scanner_standard_pattern` type enum Sensitive Data Scanner rule type. Allowed enum values: `sensitive_data_scanner_rule` default: `sensitive_data_scanner_rule` meta [_required_] object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": {}, "data": { "id": "string", "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "text_replacement": { "type": "none" }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 5, "included_keyword_configuration": { "keywords": [ "credit card", "cc" ], "character_count": 35 } } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#UpdateScanningRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Update rule response. Field Type Description meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Update Scanning Rule returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {}, "data": { "id": "string", "type": "sensitive_data_scanner_rule", "attributes": { "name": "Example-Sensitive-Data-Scanner", "pattern": "pattern", "text_replacement": { "type": "none" }, "tags": [ "sensitive_data:true" ], "is_enabled": true, "priority": 5, "included_keyword_configuration": { "keywords": [ "credit card", "cc" ], "character_count": 35 } } } } EOF ``` ##### Update Scanning Rule returns "OK" response ``` // Update Scanning Rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // the "scanning_group" has a "scanning_rule" RuleDataID := os.Getenv("RULE_DATA_ID") body := datadogV2.SensitiveDataScannerRuleUpdateRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, Data: datadogV2.SensitiveDataScannerRuleUpdate{ Id: datadog.PtrString(RuleDataID), Type: datadogV2.SENSITIVEDATASCANNERRULETYPE_SENSITIVE_DATA_SCANNER_RULE.Ptr(), Attributes: &datadogV2.SensitiveDataScannerRuleAttributes{ Name: datadog.PtrString("Example-Sensitive-Data-Scanner"), Pattern: datadog.PtrString("pattern"), TextReplacement: &datadogV2.SensitiveDataScannerTextReplacement{ Type: datadogV2.SENSITIVEDATASCANNERTEXTREPLACEMENTTYPE_NONE.Ptr(), }, Tags: []string{ "sensitive_data:true", }, IsEnabled: datadog.PtrBool(true), Priority: datadog.PtrInt64(5), IncludedKeywordConfiguration: &datadogV2.SensitiveDataScannerIncludedKeywordConfiguration{ Keywords: []string{ "credit card", "cc", }, CharacterCount: 35, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.UpdateScanningRule(ctx, RuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.UpdateScanningRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.UpdateScanningRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Scanning Rule returns "OK" response ``` // Update Scanning Rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerIncludedKeywordConfiguration; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleAttributes; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleType; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleUpdate; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleUpdateRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleUpdateResponse; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacement; import com.datadog.api.client.v2.model.SensitiveDataScannerTextReplacementType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // the "scanning_group" has a "scanning_rule" String RULE_DATA_ID = System.getenv("RULE_DATA_ID"); SensitiveDataScannerRuleUpdateRequest body = new SensitiveDataScannerRuleUpdateRequest() .meta(new SensitiveDataScannerMetaVersionOnly()) .data( new SensitiveDataScannerRuleUpdate() .id(RULE_DATA_ID) .type(SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE) .attributes( new SensitiveDataScannerRuleAttributes() .name("Example-Sensitive-Data-Scanner") .pattern("pattern") .textReplacement( new SensitiveDataScannerTextReplacement() .type(SensitiveDataScannerTextReplacementType.NONE)) .tags(Collections.singletonList("sensitive_data:true")) .isEnabled(true) .priority(5L) .includedKeywordConfiguration( new SensitiveDataScannerIncludedKeywordConfiguration() .keywords(Arrays.asList("credit card", "cc")) .characterCount(35L)))); try { SensitiveDataScannerRuleUpdateResponse result = apiInstance.updateScanningRule(RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#updateScanningRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Scanning Rule returns "OK" response ``` """ Update Scanning Rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_included_keyword_configuration import ( 
SensitiveDataScannerIncludedKeywordConfiguration, ) from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_rule_attributes import SensitiveDataScannerRuleAttributes from datadog_api_client.v2.model.sensitive_data_scanner_rule_type import SensitiveDataScannerRuleType from datadog_api_client.v2.model.sensitive_data_scanner_rule_update import SensitiveDataScannerRuleUpdate from datadog_api_client.v2.model.sensitive_data_scanner_rule_update_request import SensitiveDataScannerRuleUpdateRequest from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement import SensitiveDataScannerTextReplacement from datadog_api_client.v2.model.sensitive_data_scanner_text_replacement_type import ( SensitiveDataScannerTextReplacementType, ) # the "scanning_group" has a "scanning_rule" RULE_DATA_ID = environ["RULE_DATA_ID"] body = SensitiveDataScannerRuleUpdateRequest( meta=SensitiveDataScannerMetaVersionOnly(), data=SensitiveDataScannerRuleUpdate( id=RULE_DATA_ID, type=SensitiveDataScannerRuleType.SENSITIVE_DATA_SCANNER_RULE, attributes=SensitiveDataScannerRuleAttributes( name="Example-Sensitive-Data-Scanner", pattern="pattern", text_replacement=SensitiveDataScannerTextReplacement( type=SensitiveDataScannerTextReplacementType.NONE, ), tags=[ "sensitive_data:true", ], is_enabled=True, priority=5, included_keyword_configuration=SensitiveDataScannerIncludedKeywordConfiguration( keywords=[ "credit card", "cc", ], character_count=35, ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.update_scanning_rule(rule_id=RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Scanning Rule returns "OK" response ``` # Update Scanning Rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # the "scanning_group" has a "scanning_rule" RULE_DATA_ID = ENV["RULE_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerRuleUpdateRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), data: DatadogAPIClient::V2::SensitiveDataScannerRuleUpdate.new({ id: RULE_DATA_ID, type: DatadogAPIClient::V2::SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE, attributes: DatadogAPIClient::V2::SensitiveDataScannerRuleAttributes.new({ name: "Example-Sensitive-Data-Scanner", pattern: "pattern", text_replacement: DatadogAPIClient::V2::SensitiveDataScannerTextReplacement.new({ type: DatadogAPIClient::V2::SensitiveDataScannerTextReplacementType::NONE, }), tags: [ "sensitive_data:true", ], is_enabled: true, priority: 5, included_keyword_configuration: DatadogAPIClient::V2::SensitiveDataScannerIncludedKeywordConfiguration.new({ keywords: [ "credit card", "cc", ], character_count: 35, }), }), }), }) p api_instance.update_scanning_rule(RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Scanning Rule returns "OK" response ``` // Update Scanning Rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerIncludedKeywordConfiguration; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleAttributes; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleType; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleUpdate; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleUpdateRequest; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacement; use datadog_api_client::datadogV2::model::SensitiveDataScannerTextReplacementType; #[tokio::main] async fn main() { // the "scanning_group" has a "scanning_rule" let rule_data_id = std::env::var("RULE_DATA_ID").unwrap(); let body = SensitiveDataScannerRuleUpdateRequest::new( SensitiveDataScannerRuleUpdate::new() .attributes( SensitiveDataScannerRuleAttributes::new() .included_keyword_configuration( SensitiveDataScannerIncludedKeywordConfiguration::new( 35, vec!["credit card".to_string(), "cc".to_string()], ), ) .is_enabled(true) .name("Example-Sensitive-Data-Scanner".to_string()) .pattern("pattern".to_string()) .priority(5) .tags(vec!["sensitive_data:true".to_string()]) .text_replacement( SensitiveDataScannerTextReplacement::new() .type_(SensitiveDataScannerTextReplacementType::NONE), ), ) .id(rule_data_id.clone()) .type_(SensitiveDataScannerRuleType::SENSITIVE_DATA_SCANNER_RULE), SensitiveDataScannerMetaVersionOnly::new(), ); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.update_scanning_rule(rule_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Scanning Rule returns "OK" response ``` /** * Update Scanning Rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); // the "scanning_group" has a "scanning_rule" const RULE_DATA_ID = process.env.RULE_DATA_ID as string; const params: v2.SensitiveDataScannerApiUpdateScanningRuleRequest = { body: { meta: {}, data: { id: RULE_DATA_ID, type: "sensitive_data_scanner_rule", attributes: { name: "Example-Sensitive-Data-Scanner", pattern: "pattern", textReplacement: { type: "none", }, tags: ["sensitive_data:true"], isEnabled: true, priority: 5, includedKeywordConfiguration: { keywords: ["credit card", "cc"], characterCount: 35, }, }, }, }, ruleId: RULE_DATA_ID, }; apiInstance .updateScanningRule(params) .then((data: v2.SensitiveDataScannerRuleUpdateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete Scanning Rule](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#delete-scanning-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#delete-scanning-rule-v2) DELETE https://api.ap1.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.datadoghq.eu/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.ddog-gov.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/{rule_id} ### Overview Delete a given rule. This endpoint requires the `data_scanner_write` permission. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Field Type Description meta [_required_] object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": {} } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/#DeleteScanningRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) Delete rule response. Field Type Description meta object Meta payload containing information about the API. version int64 Version of the API (optional). ``` { "meta": { "version": 0 } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) * [Example](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/sensitive-data-scanner/?code-lang=typescript) ##### Delete Scanning Rule returns "OK" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/sensitive-data-scanner/config/rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": {} } EOF ``` ##### Delete Scanning Rule returns "OK" response ``` // Delete Scanning Rule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // the "scanning_group" has a "scanning_rule" RuleDataID := os.Getenv("RULE_DATA_ID") body := datadogV2.SensitiveDataScannerRuleDeleteRequest{ Meta: datadogV2.SensitiveDataScannerMetaVersionOnly{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSensitiveDataScannerApi(apiClient) resp, r, err := api.DeleteScanningRule(ctx, RuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SensitiveDataScannerApi.DeleteScanningRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SensitiveDataScannerApi.DeleteScanningRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete Scanning Rule returns "OK" response ``` // Delete Scanning Rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SensitiveDataScannerApi; import com.datadog.api.client.v2.model.SensitiveDataScannerMetaVersionOnly; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleDeleteRequest; import com.datadog.api.client.v2.model.SensitiveDataScannerRuleDeleteResponse; public class Example { public static void 
main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SensitiveDataScannerApi apiInstance = new SensitiveDataScannerApi(defaultClient); // the "scanning_group" has a "scanning_rule" String RULE_DATA_ID = System.getenv("RULE_DATA_ID"); SensitiveDataScannerRuleDeleteRequest body = new SensitiveDataScannerRuleDeleteRequest().meta(new SensitiveDataScannerMetaVersionOnly()); try { SensitiveDataScannerRuleDeleteResponse result = apiInstance.deleteScanningRule(RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SensitiveDataScannerApi#deleteScanningRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete Scanning Rule returns "OK" response ``` """ Delete Scanning Rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.sensitive_data_scanner_api import SensitiveDataScannerApi from datadog_api_client.v2.model.sensitive_data_scanner_meta_version_only import SensitiveDataScannerMetaVersionOnly from datadog_api_client.v2.model.sensitive_data_scanner_rule_delete_request import SensitiveDataScannerRuleDeleteRequest # the "scanning_group" has a "scanning_rule" RULE_DATA_ID = environ["RULE_DATA_ID"] body = SensitiveDataScannerRuleDeleteRequest( meta=SensitiveDataScannerMetaVersionOnly(), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SensitiveDataScannerApi(api_client) response = api_instance.delete_scanning_rule(rule_id=RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete Scanning Rule returns "OK" response ``` # Delete Scanning Rule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SensitiveDataScannerAPI.new # the "scanning_group" has a "scanning_rule" RULE_DATA_ID = ENV["RULE_DATA_ID"] body = DatadogAPIClient::V2::SensitiveDataScannerRuleDeleteRequest.new({ meta: DatadogAPIClient::V2::SensitiveDataScannerMetaVersionOnly.new({}), }) p api_instance.delete_scanning_rule(RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete Scanning Rule returns "OK" response ``` // Delete Scanning Rule returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_sensitive_data_scanner::SensitiveDataScannerAPI; use datadog_api_client::datadogV2::model::SensitiveDataScannerMetaVersionOnly; use datadog_api_client::datadogV2::model::SensitiveDataScannerRuleDeleteRequest; #[tokio::main] async fn main() { // the "scanning_group" has a "scanning_rule" let rule_data_id = std::env::var("RULE_DATA_ID").unwrap(); let body = SensitiveDataScannerRuleDeleteRequest::new(SensitiveDataScannerMetaVersionOnly::new()); let configuration = datadog::Configuration::new(); let api = SensitiveDataScannerAPI::with_config(configuration); let resp = api.delete_scanning_rule(rule_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete Scanning Rule returns "OK" response ``` /** * Delete Scanning Rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SensitiveDataScannerApi(configuration); // the "scanning_group" has a "scanning_rule" const RULE_DATA_ID = process.env.RULE_DATA_ID as string; const params: v2.SensitiveDataScannerApiDeleteScanningRuleRequest = { body: { meta: {}, }, ruleId: RULE_DATA_ID, }; apiInstance .deleteScanningRule(params) .then((data: v2.SensitiveDataScannerRuleDeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
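##### Delete Scanning Rule over plain HTTP (sketch)

The client-library examples above are the supported path. For environments where only HTTP is available, the following is a minimal sketch of the same Delete Scanning Rule call using the `requests` package, assuming the default `datadoghq.com` site and the `DD_API_KEY`/`DD_APP_KEY` environment variables used elsewhere on this page; the `RULE_ID` environment variable is illustrative, not part of the API.

```python
# Minimal sketch (not an official client): delete a Sensitive Data Scanner rule.
# Assumes DD_API_KEY / DD_APP_KEY are set and the caller has the
# data_scanner_write permission; RULE_ID is a hypothetical variable
# holding the ID of the rule to delete.
import os

import requests

DD_SITE = os.environ.get("DD_SITE", "datadoghq.com")
RULE_ID = os.environ["RULE_ID"]

url = f"https://api.{DD_SITE}/api/v2/sensitive-data-scanner/config/rules/{RULE_ID}"
headers = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

# The endpoint requires a request body containing a `meta` object, even on DELETE.
response = requests.delete(url, headers=headers, json={"meta": {}})
response.raise_for_status()

# A 200 response returns a `meta` payload (optionally including the config version).
print(response.json())
```

The same pattern applies to the Update Scanning Rule endpoint shown earlier, swapping the HTTP method for PATCH and supplying the documented `data` payload in the request body.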
--- # Source: https://docs.datadoghq.com/api/latest/service-accounts/ # Service Accounts Create, edit, and disable service accounts. See the [Service Accounts page](https://docs.datadoghq.com/account_management/org_settings/service_accounts/) for more information. ## [Create a service account](https://docs.datadoghq.com/api/latest/service-accounts/#create-a-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#create-a-service-account-v2) POST https://api.ap1.datadoghq.com/api/v2/service_accountshttps://api.ap2.datadoghq.com/api/v2/service_accountshttps://api.datadoghq.eu/api/v2/service_accountshttps://api.ddog-gov.com/api/v2/service_accountshttps://api.datadoghq.com/api/v2/service_accountshttps://api.us3.datadoghq.com/api/v2/service_accountshttps://api.us5.datadoghq.com/api/v2/service_accounts ### Overview Create a service account for your organization. This endpoint requires the `service_account_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Field Type Description data [_required_] object Object to create a service account User. attributes [_required_] object Attributes of the created user. email [_required_] string The email of the user. name string The name of the user. service_account [_required_] boolean Whether the user is a service account. Must be true. title string The title of the user. relationships object Relationships of the user object. roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "type": "users", "attributes": { "name": "Test API Client", "email": "Example-Service-Account@datadoghq.com", "service_account": true }, "relationships": { "roles": { "data": [ { "id": "string", "type": "roles" } ] } } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccount-201-v2) * [400](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccount-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccount-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccount-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Response containing information about a single user. Field Type Description data object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled.
modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the user. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript) ##### Create a service account returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/service_accounts" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "users", "attributes": { "name": "Test API Client", "email": "Example-Service-Account@datadoghq.com", "service_account": true }, "relationships": { "roles": { "data": [ { "id": "string", "type": "roles" } ] } } } } EOF ``` ##### Create a service account returns "OK" response ``` // Create a service account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV2.ServiceAccountCreateRequest{ Data: datadogV2.ServiceAccountCreateData{ Type: datadogV2.USERSTYPE_USERS, Attributes: datadogV2.ServiceAccountCreateAttributes{ Name: datadog.PtrString("Test API Client"), Email: "Example-Service-Account@datadoghq.com", ServiceAccount: true, }, Relationships: &datadogV2.UserRelationships{ Roles: &datadogV2.RelationshipToRoles{ Data: []datadogV2.RelationshipToRoleData{ { Id: datadog.PtrString(RoleDataID), Type: datadogV2.ROLESTYPE_ROLES.Ptr(), }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceAccountsApi(apiClient) resp, r, err := api.CreateServiceAccount(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.CreateServiceAccount`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.CreateServiceAccount`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a service account returns "OK" response ``` // Create a service account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.RelationshipToRoleData; import 
com.datadog.api.client.v2.model.RelationshipToRoles; import com.datadog.api.client.v2.model.RolesType; import com.datadog.api.client.v2.model.ServiceAccountCreateAttributes; import com.datadog.api.client.v2.model.ServiceAccountCreateData; import com.datadog.api.client.v2.model.ServiceAccountCreateRequest; import com.datadog.api.client.v2.model.UserRelationships; import com.datadog.api.client.v2.model.UserResponse; import com.datadog.api.client.v2.model.UsersType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); ServiceAccountCreateRequest body = new ServiceAccountCreateRequest() .data( new ServiceAccountCreateData() .type(UsersType.USERS) .attributes( new ServiceAccountCreateAttributes() .name("Test API Client") .email("Example-Service-Account@datadoghq.com") .serviceAccount(true)) .relationships( new UserRelationships() .roles( new RelationshipToRoles() .data( Collections.singletonList( new RelationshipToRoleData() .id(ROLE_DATA_ID) .type(RolesType.ROLES)))))); try { UserResponse result = apiInstance.createServiceAccount(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceAccountsApi#createServiceAccount"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a service account returns "OK" response ``` """ Create a service account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi from datadog_api_client.v2.model.relationship_to_role_data import RelationshipToRoleData from datadog_api_client.v2.model.relationship_to_roles import RelationshipToRoles from datadog_api_client.v2.model.roles_type import RolesType from datadog_api_client.v2.model.service_account_create_attributes import ServiceAccountCreateAttributes from datadog_api_client.v2.model.service_account_create_data import ServiceAccountCreateData from datadog_api_client.v2.model.service_account_create_request import ServiceAccountCreateRequest from datadog_api_client.v2.model.user_relationships import UserRelationships from datadog_api_client.v2.model.users_type import UsersType # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = ServiceAccountCreateRequest( data=ServiceAccountCreateData( type=UsersType.USERS, attributes=ServiceAccountCreateAttributes( name="Test API Client", email="Example-Service-Account@datadoghq.com", service_account=True, ), relationships=UserRelationships( roles=RelationshipToRoles( data=[ RelationshipToRoleData( id=ROLE_DATA_ID, type=RolesType.ROLES, ), ], ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = 
api_instance.create_service_account(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a service account returns "OK" response ``` # Create a service account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V2::ServiceAccountCreateRequest.new({ data: DatadogAPIClient::V2::ServiceAccountCreateData.new({ type: DatadogAPIClient::V2::UsersType::USERS, attributes: DatadogAPIClient::V2::ServiceAccountCreateAttributes.new({ name: "Test API Client", email: "Example-Service-Account@datadoghq.com", service_account: true, }), relationships: DatadogAPIClient::V2::UserRelationships.new({ roles: DatadogAPIClient::V2::RelationshipToRoles.new({ data: [ DatadogAPIClient::V2::RelationshipToRoleData.new({ id: ROLE_DATA_ID, type: DatadogAPIClient::V2::RolesType::ROLES, }), ], }), }), }), }) p api_instance.create_service_account(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a service account returns "OK" response ``` // Create a service account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; use datadog_api_client::datadogV2::model::RelationshipToRoleData; use datadog_api_client::datadogV2::model::RelationshipToRoles; use datadog_api_client::datadogV2::model::RolesType; use datadog_api_client::datadogV2::model::ServiceAccountCreateAttributes; use datadog_api_client::datadogV2::model::ServiceAccountCreateData; use datadog_api_client::datadogV2::model::ServiceAccountCreateRequest; use datadog_api_client::datadogV2::model::UserRelationships; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = ServiceAccountCreateRequest::new( ServiceAccountCreateData::new( ServiceAccountCreateAttributes::new( "Example-Service-Account@datadoghq.com".to_string(), true, ) .name("Test API Client".to_string()), UsersType::USERS, ) .relationships(UserRelationships::new().roles( RelationshipToRoles::new().data(vec![RelationshipToRoleData::new() .id(role_data_id.clone()) .type_(RolesType::ROLES)]), )), ); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api.create_service_account(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a service account returns "OK" response ``` /** * Create a service account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v2.ServiceAccountsApiCreateServiceAccountRequest = { body: { data: { type: "users", attributes: { name: "Test API Client", email: "Example-Service-Account@datadoghq.com", serviceAccount: true, }, relationships: { roles: { data: [ { id: ROLE_DATA_ID, type: "roles", }, ], }, }, }, }, }; apiInstance .createServiceAccount(params) .then((data: v2.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get one application key for this service account](https://docs.datadoghq.com/api/latest/service-accounts/#get-one-application-key-for-this-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#get-one-application-key-for-this-service-account-v2) GET https://api.ap1.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id} ### Overview Get an application key owned by this service account. This endpoint requires the `service_account_write` permission. ### Arguments #### Path Parameters Name Type Description service_account_id [_required_] string The ID of the service account. app_key_id [_required_] string The ID of the application key. ### Response * [200](https://docs.datadoghq.com/api/latest/service-accounts/#GetServiceAccountApplicationKey-200-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#GetServiceAccountApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-accounts/#GetServiceAccountApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#GetServiceAccountApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Response for retrieving a partial application key. Field Type Description data object Partial Datadog application key. attributes object Attributes of a partial application key. 
created_at string Creation date of the application key. last4 string The last four characters of the application key. last_used_at string Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. 
attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript) ##### Get one application key for this service account Copy ``` # Path parameters export service_account_id="00000000-0000-1234-0000-000000000000" export app_key_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys/${app_key_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get one application key for this service account ``` """ Get one application key for this service account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"] # there is a valid "service_account_application_key" for "service_account_user" SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = environ["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = api_instance.get_service_account_application_key( service_account_id=SERVICE_ACCOUNT_USER_DATA_ID, app_key_id=SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get one application key for this service account ``` # Get one application key for this service account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"] # there is a valid "service_account_application_key" for "service_account_user" SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = ENV["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"] p api_instance.get_service_account_application_key(SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get one application key for this service 
account ``` // Get one application key for this service account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service_account_user" in the system ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID") // there is a valid "service_account_application_key" for "service_account_user" ServiceAccountApplicationKeyDataID := os.Getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceAccountsApi(apiClient) resp, r, err := api.GetServiceAccountApplicationKey(ctx, ServiceAccountUserDataID, ServiceAccountApplicationKeyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.GetServiceAccountApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.GetServiceAccountApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get one application key for this service account ``` // Get one application key for this service account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.PartialApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); // there is a valid "service_account_application_key" for "service_account_user" String SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = System.getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"); try { PartialApplicationKeyResponse result = apiInstance.getServiceAccountApplicationKey( SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#getServiceAccountApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get one application key for this service account ``` // Get one application key for this service account returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); // there is a valid "service_account_application_key" for "service_account_user" let service_account_application_key_data_id = std::env::var("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .get_service_account_application_key( service_account_user_data_id.clone(), service_account_application_key_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get one application key for this service account ``` /** * Get one application key for this service account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; // there is a valid "service_account_application_key" for "service_account_user" const SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = process.env .SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID as string; const params: v2.ServiceAccountsApiGetServiceAccountApplicationKeyRequest = { serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, appKeyId: SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, }; apiInstance .getServiceAccountApplicationKey(params) .then((data: v2.PartialApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an application key for this service account](https://docs.datadoghq.com/api/latest/service-accounts/#edit-an-application-key-for-this-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#edit-an-application-key-for-this-service-account-v2) PATCH https://api.ap1.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id} ### Overview Edit an application key owned by this service account. This endpoint requires the `service_account_write` permission. ### Arguments #### Path Parameters Name Type Description service_account_id [_required_] string The ID of the service account. app_key_id [_required_] string The ID of the application key. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Field Type Description data [_required_] object Object used to update an application key. attributes [_required_] object Attributes used to update an application Key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id [_required_] string ID of the application key. type [_required_] enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` ``` { "data": { "id": "67963b57-d67c-dfa7-b180-62ee9301d2f6", "type": "application_keys", "attributes": { "name": "Application Key for managing dashboards-updated" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-accounts/#UpdateServiceAccountApplicationKey-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-accounts/#UpdateServiceAccountApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#UpdateServiceAccountApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-accounts/#UpdateServiceAccountApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#UpdateServiceAccountApplicationKey-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Response for retrieving a partial application key. Field Type Description data object Partial Datadog application key. attributes object Attributes of a partial application key. created_at string Creation date of the application key. 
last4 string The last four characters of the application key. last_used_at string Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. 
date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript)

##### Edit an application key for this service account returns "OK" response

```
# Path parameters
export service_account_id="00000000-0000-1234-0000-000000000000"
export app_key_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site, for example api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys/${app_key_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "id": "67963b57-d67c-dfa7-b180-62ee9301d2f6",
    "type": "application_keys",
    "attributes": {
      "name": "Application Key for managing dashboards-updated"
    }
  }
}
EOF
```

##### Edit an application key for this service account returns "OK" response

```
// Edit an application key for this service account returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "service_account_user" in the system
    ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID")

    // there is a valid "service_account_application_key" for "service_account_user"
    ServiceAccountApplicationKeyDataID := os.Getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID")

    body := datadogV2.ApplicationKeyUpdateRequest{
        Data: datadogV2.ApplicationKeyUpdateData{
            Id:   ServiceAccountApplicationKeyDataID,
            Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS,
            Attributes: datadogV2.ApplicationKeyUpdateAttributes{
                Name: datadog.PtrString("Application Key for managing dashboards-updated"),
            },
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewServiceAccountsApi(apiClient)
    resp, r, err := api.UpdateServiceAccountApplicationKey(ctx, ServiceAccountUserDataID, ServiceAccountApplicationKeyDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.UpdateServiceAccountApplicationKey`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.UpdateServiceAccountApplicationKey`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Edit an application key for this service account
returns "OK" response ``` // Edit an application key for this service account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.ApplicationKeyUpdateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyUpdateData; import com.datadog.api.client.v2.model.ApplicationKeyUpdateRequest; import com.datadog.api.client.v2.model.ApplicationKeysType; import com.datadog.api.client.v2.model.PartialApplicationKeyResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); // there is a valid "service_account_application_key" for "service_account_user" String SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME = System.getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME"); String SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = System.getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"); ApplicationKeyUpdateRequest body = new ApplicationKeyUpdateRequest() .data( new ApplicationKeyUpdateData() .id(SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID) .type(ApplicationKeysType.APPLICATION_KEYS) .attributes( new ApplicationKeyUpdateAttributes() .name("Application Key for managing dashboards-updated"))); try { PartialApplicationKeyResponse result = apiInstance.updateServiceAccountApplicationKey( SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#updateServiceAccountApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an application key for this service account returns "OK" response ``` """ Edit an application key for this service account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi from datadog_api_client.v2.model.application_key_update_attributes import ApplicationKeyUpdateAttributes from datadog_api_client.v2.model.application_key_update_data import ApplicationKeyUpdateData from datadog_api_client.v2.model.application_key_update_request import ApplicationKeyUpdateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"] # there is a valid "service_account_application_key" for "service_account_user" SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME = environ["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = 
environ["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"] body = ApplicationKeyUpdateRequest( data=ApplicationKeyUpdateData( id=SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, type=ApplicationKeysType.APPLICATION_KEYS, attributes=ApplicationKeyUpdateAttributes( name="Application Key for managing dashboards-updated", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = api_instance.update_service_account_application_key( service_account_id=SERVICE_ACCOUNT_USER_DATA_ID, app_key_id=SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an application key for this service account returns "OK" response ``` # Edit an application key for this service account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"] # there is a valid "service_account_application_key" for "service_account_user" SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME = ENV["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ATTRIBUTES_NAME"] SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = ENV["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"] body = DatadogAPIClient::V2::ApplicationKeyUpdateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyUpdateData.new({ id: SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, attributes: DatadogAPIClient::V2::ApplicationKeyUpdateAttributes.new({ name: "Application Key for managing dashboards-updated", }), }), }) p api_instance.update_service_account_application_key(SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an application key for this service account returns "OK" response ``` // Edit an application key for this service account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateData; use datadog_api_client::datadogV2::model::ApplicationKeyUpdateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); // there is a valid "service_account_application_key" for "service_account_user" let service_account_application_key_data_id = std::env::var("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID").unwrap(); let body = ApplicationKeyUpdateRequest::new(ApplicationKeyUpdateData::new( 
ApplicationKeyUpdateAttributes::new() .name("Application Key for managing dashboards-updated".to_string()), service_account_application_key_data_id.clone(), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .update_service_account_application_key( service_account_user_data_id.clone(), service_account_application_key_data_id.clone(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an application key for this service account returns "OK" response ``` /** * Edit an application key for this service account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; // there is a valid "service_account_application_key" for "service_account_user" const SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = process.env .SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID as string; const params: v2.ServiceAccountsApiUpdateServiceAccountApplicationKeyRequest = { body: { data: { id: SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, type: "application_keys", attributes: { name: "Application Key for managing dashboards-updated", }, }, }, serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, appKeyId: SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, }; apiInstance .updateServiceAccountApplicationKey(params) .then((data: v2.PartialApplicationKeyResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an application key for this service account](https://docs.datadoghq.com/api/latest/service-accounts/#delete-an-application-key-for-this-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#delete-an-application-key-for-this-service-account-v2) DELETE https://api.ap1.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ap2.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.eu/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.ddog-gov.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us3.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id}https://api.us5.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys/{app_key_id} ### Overview Delete an application key owned by this service account. This endpoint requires the `service_account_write` permission. ### Arguments #### Path Parameters Name Type Description service_account_id [_required_] string The ID of the service account. app_key_id [_required_] string The ID of the application key. ### Response * [204](https://docs.datadoghq.com/api/latest/service-accounts/#DeleteServiceAccountApplicationKey-204-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#DeleteServiceAccountApplicationKey-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-accounts/#DeleteServiceAccountApplicationKey-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#DeleteServiceAccountApplicationKey-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript)

##### Delete an application key for this service account

```
# Path parameters
export service_account_id="00000000-0000-1234-0000-000000000000"
export app_key_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site, for example api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys/${app_key_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete an application key for this service account

```
"""
Delete an application key for this service account returns "No Content" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi

# there is a valid "service_account_user" in the system
SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"]

# there is a valid "service_account_application_key" for "service_account_user"
SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = environ["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ServiceAccountsApi(api_client)
    api_instance.delete_service_account_application_key(
        service_account_id=SERVICE_ACCOUNT_USER_DATA_ID,
        app_key_id=SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID,
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Delete an application key for this service account

```
# Delete an application key for this service account returns "No Content" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new

# there is a valid "service_account_user" in the system
SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"]

# there is a valid "service_account_application_key" for "service_account_user"
SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = ENV["SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"]

api_instance.delete_service_account_application_key(SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Delete an application key for this service account

```
// Delete
an application key for this service account returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service_account_user" in the system ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID") // there is a valid "service_account_application_key" for "service_account_user" ServiceAccountApplicationKeyDataID := os.Getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceAccountsApi(apiClient) r, err := api.DeleteServiceAccountApplicationKey(ctx, ServiceAccountUserDataID, ServiceAccountApplicationKeyDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.DeleteServiceAccountApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an application key for this service account ``` // Delete an application key for this service account returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); // there is a valid "service_account_application_key" for "service_account_user" String SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = System.getenv("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID"); try { apiInstance.deleteServiceAccountApplicationKey( SERVICE_ACCOUNT_USER_DATA_ID, SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#deleteServiceAccountApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an application key for this service account ``` // Delete an application key for this service account returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); // there is a valid "service_account_application_key" for 
"service_account_user" let service_account_application_key_data_id = std::env::var("SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .delete_service_account_application_key( service_account_user_data_id.clone(), service_account_application_key_data_id.clone(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an application key for this service account ``` /** * Delete an application key for this service account returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; // there is a valid "service_account_application_key" for "service_account_user" const SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID = process.env .SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID as string; const params: v2.ServiceAccountsApiDeleteServiceAccountApplicationKeyRequest = { serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, appKeyId: SERVICE_ACCOUNT_APPLICATION_KEY_DATA_ID, }; apiInstance .deleteServiceAccountApplicationKey(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create an application key for this service account](https://docs.datadoghq.com/api/latest/service-accounts/#create-an-application-key-for-this-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#create-an-application-key-for-this-service-account-v2) POST https://api.ap1.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.ap2.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.datadoghq.eu/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.ddog-gov.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.us3.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.us5.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys ### Overview Create an application key for this service account. This endpoint requires the `service_account_write` permission. ### Arguments #### Path Parameters Name Type Description service_account_id [_required_] string The ID of the service account. 
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Field Type Description data [_required_] object Object used to create an application key. attributes [_required_] object Attributes used to create an application Key. name [_required_] string Name of the application key. scopes [string] Array of scopes to grant the application key. type [_required_] enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` ##### Create an application key for this service account returns "Created" response ``` { "data": { "attributes": { "name": "Example-Service-Account" }, "type": "application_keys" } } ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` { "data": { "attributes": { "name": "Example-Service-Account", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "type": "application_keys" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccountApplicationKey-201-v2) * [400](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccountApplicationKey-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccountApplicationKey-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#CreateServiceAccountApplicationKey-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Response for retrieving an application key. Field Type Description data object Datadog application key. attributes object Attributes of a full application key. created_at date-time Creation date of the application key. key string The application key. last4 string The last four characters of the application key. last_used_at date-time Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. 
id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. 
Allowed enum values: `leaked_keys` default: `leaked_keys` ``` { "data": { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "key": "string", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript)

##### Create an application key for this service account returns "Created" response

```
# Path parameters
export service_account_id="00000000-0000-1234-0000-000000000000"
# Curl command (use the API host for your Datadog site, for example api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "name": "Example-Service-Account"
    },
    "type": "application_keys"
  }
}
EOF
```

##### Create an application key with scopes for this service account returns "Created" response

```
# Path parameters
export service_account_id="00000000-0000-1234-0000-000000000000"
# Curl command (use the API host for your Datadog site, for example api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "name": "Example-Service-Account",
      "scopes": [
        "dashboards_read",
        "dashboards_write",
        "dashboards_public_share"
      ]
    },
    "type": "application_keys"
  }
}
EOF
```

##### Create an application key for this service account returns "Created" response

```
// Create an application key for this service account returns "Created" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    // there is a valid "service_account_user" in the system
    ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID")

    body := datadogV2.ApplicationKeyCreateRequest{
        Data: datadogV2.ApplicationKeyCreateData{
            Attributes: datadogV2.ApplicationKeyCreateAttributes{
                Name: "Example-Service-Account",
            },
            Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS,
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewServiceAccountsApi(apiClient)
    resp, r, err := api.CreateServiceAccountApplicationKey(ctx, ServiceAccountUserDataID, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.CreateServiceAccountApplicationKey`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.CreateServiceAccountApplicationKey`:\n%s\n",
responseContent) } ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` // Create an application key with scopes for this service account returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service_account_user" in the system ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID") body := datadogV2.ApplicationKeyCreateRequest{ Data: datadogV2.ApplicationKeyCreateData{ Attributes: datadogV2.ApplicationKeyCreateAttributes{ Name: "Example-Service-Account", Scopes: *datadog.NewNullableList(&[]string{ "dashboards_read", "dashboards_write", "dashboards_public_share", }), }, Type: datadogV2.APPLICATIONKEYSTYPE_APPLICATION_KEYS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceAccountsApi(apiClient) resp, r, err := api.CreateServiceAccountApplicationKey(ctx, ServiceAccountUserDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.CreateServiceAccountApplicationKey`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.CreateServiceAccountApplicationKey`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an application key for this service account returns "Created" response ``` // Create an application key for this service account returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.ApplicationKeyCreateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyCreateData; import com.datadog.api.client.v2.model.ApplicationKeyCreateRequest; import com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeysType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); ApplicationKeyCreateRequest body = new ApplicationKeyCreateRequest() .data( new ApplicationKeyCreateData() .attributes( new ApplicationKeyCreateAttributes().name("Example-Service-Account")) .type(ApplicationKeysType.APPLICATION_KEYS)); try { ApplicationKeyResponse result = apiInstance.createServiceAccountApplicationKey(SERVICE_ACCOUNT_USER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#createServiceAccountApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + 
e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` // Create an application key with scopes for this service account returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.ApplicationKeyCreateAttributes; import com.datadog.api.client.v2.model.ApplicationKeyCreateData; import com.datadog.api.client.v2.model.ApplicationKeyCreateRequest; import com.datadog.api.client.v2.model.ApplicationKeyResponse; import com.datadog.api.client.v2.model.ApplicationKeysType; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); ApplicationKeyCreateRequest body = new ApplicationKeyCreateRequest() .data( new ApplicationKeyCreateData() .attributes( new ApplicationKeyCreateAttributes() .name("Example-Service-Account") .scopes( Arrays.asList( "dashboards_read", "dashboards_write", "dashboards_public_share"))) .type(ApplicationKeysType.APPLICATION_KEYS)); try { ApplicationKeyResponse result = apiInstance.createServiceAccountApplicationKey(SERVICE_ACCOUNT_USER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#createServiceAccountApplicationKey"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an application key for this service account returns "Created" response ``` """ Create an application key for this service account returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi from datadog_api_client.v2.model.application_key_create_attributes import ApplicationKeyCreateAttributes from datadog_api_client.v2.model.application_key_create_data import ApplicationKeyCreateData from datadog_api_client.v2.model.application_key_create_request import ApplicationKeyCreateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"] body = ApplicationKeyCreateRequest( data=ApplicationKeyCreateData( attributes=ApplicationKeyCreateAttributes( name="Example-Service-Account", ), type=ApplicationKeysType.APPLICATION_KEYS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = api_instance.create_service_account_application_key( service_account_id=SERVICE_ACCOUNT_USER_DATA_ID, body=body ) 
print(response) ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` """ Create an application key with scopes for this service account returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi from datadog_api_client.v2.model.application_key_create_attributes import ApplicationKeyCreateAttributes from datadog_api_client.v2.model.application_key_create_data import ApplicationKeyCreateData from datadog_api_client.v2.model.application_key_create_request import ApplicationKeyCreateRequest from datadog_api_client.v2.model.application_keys_type import ApplicationKeysType # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"] body = ApplicationKeyCreateRequest( data=ApplicationKeyCreateData( attributes=ApplicationKeyCreateAttributes( name="Example-Service-Account", scopes=[ "dashboards_read", "dashboards_write", "dashboards_public_share", ], ), type=ApplicationKeysType.APPLICATION_KEYS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = api_instance.create_service_account_application_key( service_account_id=SERVICE_ACCOUNT_USER_DATA_ID, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an application key for this service account returns "Created" response ``` # Create an application key for this service account returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"] body = DatadogAPIClient::V2::ApplicationKeyCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyCreateData.new({ attributes: DatadogAPIClient::V2::ApplicationKeyCreateAttributes.new({ name: "Example-Service-Account", }), type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, }), }) p api_instance.create_service_account_application_key(SERVICE_ACCOUNT_USER_DATA_ID, body) ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` # Create an application key with scopes for this service account returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"] body = DatadogAPIClient::V2::ApplicationKeyCreateRequest.new({ data: DatadogAPIClient::V2::ApplicationKeyCreateData.new({ attributes: DatadogAPIClient::V2::ApplicationKeyCreateAttributes.new({ name: "Example-Service-Account", scopes: [ "dashboards_read", "dashboards_write", "dashboards_public_share", ], }), type: DatadogAPIClient::V2::ApplicationKeysType::APPLICATION_KEYS, }), }) p api_instance.create_service_account_application_key(SERVICE_ACCOUNT_USER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an application key for this service account returns "Created" response ``` // Create an application key for this service account returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; use datadog_api_client::datadogV2::model::ApplicationKeyCreateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyCreateData; use datadog_api_client::datadogV2::model::ApplicationKeyCreateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); let body = ApplicationKeyCreateRequest::new(ApplicationKeyCreateData::new( ApplicationKeyCreateAttributes::new("Example-Service-Account".to_string()), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .create_service_account_application_key(service_account_user_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` // Create an application key with scopes for this service account returns // "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; use datadog_api_client::datadogV2::model::ApplicationKeyCreateAttributes; use datadog_api_client::datadogV2::model::ApplicationKeyCreateData; use datadog_api_client::datadogV2::model::ApplicationKeyCreateRequest; use datadog_api_client::datadogV2::model::ApplicationKeysType; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); let body = ApplicationKeyCreateRequest::new(ApplicationKeyCreateData::new( ApplicationKeyCreateAttributes::new("Example-Service-Account".to_string()).scopes(Some( vec![ "dashboards_read".to_string(), "dashboards_write".to_string(), "dashboards_public_share".to_string(), ], )), ApplicationKeysType::APPLICATION_KEYS, )); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .create_service_account_application_key(service_account_user_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an application key for this service account returns "Created" response ``` /** * Create an application key for this service account returns "Created" response */ import { client, v2 } from 
"@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; const params: v2.ServiceAccountsApiCreateServiceAccountApplicationKeyRequest = { body: { data: { attributes: { name: "Example-Service-Account", }, type: "application_keys", }, }, serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, }; apiInstance .createServiceAccountApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an application key with scopes for this service account returns "Created" response ``` /** * Create an application key with scopes for this service account returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; const params: v2.ServiceAccountsApiCreateServiceAccountApplicationKeyRequest = { body: { data: { attributes: { name: "Example-Service-Account", scopes: [ "dashboards_read", "dashboards_write", "dashboards_public_share", ], }, type: "application_keys", }, }, serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, }; apiInstance .createServiceAccountApplicationKey(params) .then((data: v2.ApplicationKeyResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List application keys for this service account](https://docs.datadoghq.com/api/latest/service-accounts/#list-application-keys-for-this-service-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-accounts/#list-application-keys-for-this-service-account-v2) GET https://api.ap1.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.ap2.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.datadoghq.eu/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.ddog-gov.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.us3.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keyshttps://api.us5.datadoghq.com/api/v2/service_accounts/{service_account_id}/application_keys ### Overview List all application keys available for this service account. This endpoint requires the `service_account_write` permission. ### Arguments #### Path Parameters Name Type Description service_account_id [_required_] string The ID of the service account. #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. 
sort enum Application key attribute used to sort results. Sort order is ascending by default. In order to specify a descending sort, prefix the attribute with a minus sign. Allowed enum values: `created_at, -created_at, last4, -last4, name, -name` filter string Filter application keys by the specified string. filter[created_at][start] string Only include application keys created on or after the specified date. filter[created_at][end] string Only include application keys created on or before the specified date. ### Response * [200](https://docs.datadoghq.com/api/latest/service-accounts/#ListServiceAccountApplicationKeys-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-accounts/#ListServiceAccountApplicationKeys-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-accounts/#ListServiceAccountApplicationKeys-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-accounts/#ListServiceAccountApplicationKeys-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-accounts/#ListServiceAccountApplicationKeys-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) Response for a list of application keys. Field Type Description data [object] Array of application keys. attributes object Attributes of a partial application key. created_at string Creation date of the application key. last4 string The last four characters of the application key. last_used_at string Last usage timestamp of the application key. name string Name of the application key. scopes [string] Array of scopes to grant the application key. id string ID of the application key. relationships object Resources related to the application key. owned_by object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum Application Keys resource type. Allowed enum values: `application_keys` default: `application_keys` included [ ] Array of objects related to the application key. Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. 
data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` Option 3 object The definition of LeakedKey object. attributes [_required_] object The definition of LeakedKeyAttributes object. date [_required_] date-time The LeakedKeyAttributes date. leak_source string The LeakedKeyAttributes leak_source. id [_required_] string The LeakedKey id. type [_required_] enum The definition of LeakedKeyType object. Allowed enum values: `leaked_keys` default: `leaked_keys` meta object Additional information related to the application key response. max_allowed_per_user int64 Max allowed number of application keys per user. page object Additional information related to the application key response. total_filtered_count int64 Total filtered application key count. ``` { "data": [ { "attributes": { "created_at": "2020-11-23T10:00:00.000Z", "last4": "abcd", "last_used_at": "2020-12-20T10:00:00.000Z", "name": "Application Key for managing dashboards", "scopes": [ "dashboards_read", "dashboards_write", "dashboards_public_share" ] }, "id": "string", "relationships": { "owned_by": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "application_keys" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "meta": { "max_allowed_per_user": "integer", "page": { "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-accounts/) * [Example](https://docs.datadoghq.com/api/latest/service-accounts/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-accounts/?code-lang=typescript) ##### List application keys for this service account Copy ``` # Path parameters export service_account_id="00000000-0000-1234-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List application keys for this service account ``` """ List application keys for this service account returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_accounts_api import ServiceAccountsApi # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = environ["SERVICE_ACCOUNT_USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceAccountsApi(api_client) response = api_instance.list_service_account_application_keys( service_account_id=SERVICE_ACCOUNT_USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List application keys for this service account ``` # List application keys for this service account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceAccountsAPI.new # there is a valid "service_account_user" in the system SERVICE_ACCOUNT_USER_DATA_ID = ENV["SERVICE_ACCOUNT_USER_DATA_ID"] p 
api_instance.list_service_account_application_keys(SERVICE_ACCOUNT_USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List application keys for this service account ``` // List application keys for this service account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "service_account_user" in the system ServiceAccountUserDataID := os.Getenv("SERVICE_ACCOUNT_USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceAccountsApi(apiClient) resp, r, err := api.ListServiceAccountApplicationKeys(ctx, ServiceAccountUserDataID, *datadogV2.NewListServiceAccountApplicationKeysOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceAccountsApi.ListServiceAccountApplicationKeys`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceAccountsApi.ListServiceAccountApplicationKeys`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List application keys for this service account ``` // List application keys for this service account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceAccountsApi; import com.datadog.api.client.v2.model.ListApplicationKeysResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceAccountsApi apiInstance = new ServiceAccountsApi(defaultClient); // there is a valid "service_account_user" in the system String SERVICE_ACCOUNT_USER_DATA_ID = System.getenv("SERVICE_ACCOUNT_USER_DATA_ID"); try { ListApplicationKeysResponse result = apiInstance.listServiceAccountApplicationKeys(SERVICE_ACCOUNT_USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceAccountsApi#listServiceAccountApplicationKeys"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List application keys for this service account 
``` // List application keys for this service account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_accounts::ListServiceAccountApplicationKeysOptionalParams; use datadog_api_client::datadogV2::api_service_accounts::ServiceAccountsAPI; #[tokio::main] async fn main() { // there is a valid "service_account_user" in the system let service_account_user_data_id = std::env::var("SERVICE_ACCOUNT_USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceAccountsAPI::with_config(configuration); let resp = api .list_service_account_application_keys( service_account_user_data_id.clone(), ListServiceAccountApplicationKeysOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List application keys for this service account ``` /** * List application keys for this service account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceAccountsApi(configuration); // there is a valid "service_account_user" in the system const SERVICE_ACCOUNT_USER_DATA_ID = process.env .SERVICE_ACCOUNT_USER_DATA_ID as string; const params: v2.ServiceAccountsApiListServiceAccountApplicationKeysRequest = { serviceAccountId: SERVICE_ACCOUNT_USER_DATA_ID, }; apiInstance .listServiceAccountApplicationKeys(params) .then((data: v2.ListApplicationKeysResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * *
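None of the examples above for the List application keys endpoint exercise its query strings. As a minimal sketch, assuming the `api.datadoghq.com` site and illustrative values for the page size, sort order, and creation-date filter (the accepted date format is not stated above, so treat it as a placeholder), the query strings from the table above can be combined on a single request like this:

```
# Path parameters
export service_account_id="00000000-0000-1234-0000-000000000000"

# List this service account's application keys, 10 per page, newest first,
# keeping only keys created on or after 2024-01-01 (illustrative values).
curl -G "https://api.datadoghq.com/api/v2/service_accounts/${service_account_id}/application_keys" \
    --data-urlencode "page[size]=10" \
    --data-urlencode "sort=-created_at" \
    --data-urlencode "filter[created_at][start]=2024-01-01" \
    -H "Accept: application/json" \
    -H "DD-API-KEY: ${DD_API_KEY}" \
    -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

The client libraries expose the same options through their optional-parameter types (for example, `NewListServiceAccountApplicationKeysOptionalParameters` in Go and `ListServiceAccountApplicationKeysOptionalParams` in Rust, shown above); the curl form is used here because the parameter names match the query-string table verbatim.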
--- # Source: https://docs.datadoghq.com/api/latest/service-checks/ # Service Checks The service check endpoint allows you to post check statuses for use with monitors. Service check messages are limited to 500 characters. If a check is posted with a message containing more than 500 characters, only the first 500 characters are displayed. Messages are limited for checks with a Critical or Warning status; they are dropped for checks with an OK status. * [Read more about Service Check monitors](https://docs.datadoghq.com/monitors/types/service_check/). * [Read more about Process Check monitors](https://docs.datadoghq.com/monitors/create/types/process_check/?tab=checkalert). * [Read more about Network monitors](https://docs.datadoghq.com/monitors/create/types/network/?tab=checkalert). * [Read more about Custom Check monitors](https://docs.datadoghq.com/monitors/create/types/custom_check/?tab=checkalert). * [Read more about Service Checks and status codes](https://docs.datadoghq.com/developers/service_checks/). ## [Submit a Service Check](https://docs.datadoghq.com/api/latest/service-checks/#submit-a-service-check) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-checks/#submit-a-service-check-v1) POST https://api.ap1.datadoghq.com/api/v1/check_runhttps://api.ap2.datadoghq.com/api/v1/check_runhttps://api.datadoghq.eu/api/v1/check_runhttps://api.ddog-gov.com/api/v1/check_runhttps://api.datadoghq.com/api/v1/check_runhttps://api.us3.datadoghq.com/api/v1/check_runhttps://api.us5.datadoghq.com/api/v1/check_run ### Overview Submit a list of Service Checks. **Notes**: * A valid API key is required. * Service checks can be submitted up to 10 minutes in the past. ### Request #### Body Data (required) Service Check request body. * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Expand All Field Type Description check string The check. host_name string The host name correlated with the check. message string Message containing check status. status enum The status of a service check. Set to `0` for OK, `1` for warning, `2` for critical, and `3` for unknown. Allowed enum values: `0,1,2,3` tags [string] Tags related to a check. timestamp int64 Time of check.
``` [ { "check": "app.ok", "host_name": "host", "status": 0, "tags": [ "test:ExampleServiceCheck" ] } ] ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-202-v1) * [400](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-403-v1) * [408](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-408-v1) * [413](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-413-v1) * [429](https://docs.datadoghq.com/api/latest/service-checks/#SubmitServiceCheck-429-v1) Payload accepted * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) The payload accepted for intake. Expand All Field Type Description status string The status of the intake payload. ``` { "status": "ok" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Request timeout * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Payload too large * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-checks/) * [Example](https://docs.datadoghq.com/api/latest/service-checks/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-checks/?code-lang=ruby-legacy) ##### Submit a Service Check returns "Payload accepted" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/check_run" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -d @- << EOF [ { "check": "app.ok", "host_name": "host", "status": 0, "tags": [ "test:ExampleServiceCheck" ] } ] EOF ``` ##### Submit a Service Check returns "Payload accepted" response ``` // Submit a Service Check returns "Payload accepted" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := []datadogV1.ServiceCheck{ { Check: "app.ok", HostName: "host", Status: datadogV1.SERVICECHECKSTATUS_OK, Tags: []string{ "test:ExampleServiceCheck", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceChecksApi(apiClient) resp, r, err := api.SubmitServiceCheck(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceChecksApi.SubmitServiceCheck`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceChecksApi.SubmitServiceCheck`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" go run "main.go" ``` ##### Submit a Service Check returns "Payload accepted" response ``` // Submit a Service Check returns "Payload accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceChecksApi; import com.datadog.api.client.v1.model.IntakePayloadAccepted; import com.datadog.api.client.v1.model.ServiceCheck; import com.datadog.api.client.v1.model.ServiceCheckStatus; import java.util.Collections; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceChecksApi apiInstance = new ServiceChecksApi(defaultClient); List body = Collections.singletonList( new ServiceCheck() .check("app.ok") 
.hostName("host") .status(ServiceCheckStatus.OK) .tags(Collections.singletonList("test:ExampleServiceCheck"))); try { IntakePayloadAccepted result = apiInstance.submitServiceCheck(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceChecksApi#submitServiceCheck"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" java "Example.java" ``` ##### Submit a Service Check returns "Payload accepted" response ``` from datadog import initialize, api from datadog.api.constants import CheckStatus options = {'api_key': '', 'app_key': ''} initialize(**options) check = 'app.ok' host = 'app1' status = CheckStatus.OK # equals 0 tags = ['env:test'] api.ServiceCheck.check(check=check, host_name=host, status=status, message='Response: 200 OK', tags=tags) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python "example.py" ``` ##### Submit a Service Check returns "Payload accepted" response ``` """ Submit a Service Check returns "Payload accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_checks_api import ServiceChecksApi from datadog_api_client.v1.model.service_check import ServiceCheck from datadog_api_client.v1.model.service_check_status import ServiceCheckStatus from datadog_api_client.v1.model.service_checks import ServiceChecks body = ServiceChecks( [ ServiceCheck( check="app.ok", host_name="host", status=ServiceCheckStatus.OK, tags=[ "test:ExampleServiceCheck", ], ), ] ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceChecksApi(api_client) response = api_instance.submit_service_check(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" python3 "example.py" ``` ##### Submit a Service Check returns "Payload accepted" response ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # submitting a check doesn't require an app_key dog = Dogapi::Client.new(api_key) dog.service_check('app.is_ok', 'app1', 0, :message => 'Response: 200 OK', :tags => ['env:test']) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" 
``` ##### Submit a Service Check returns "Payload accepted" response ``` # Submit a Service Check returns "Payload accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceChecksAPI.new body = [ DatadogAPIClient::V1::ServiceCheck.new({ check: "app.ok", host_name: "host", status: DatadogAPIClient::V1::ServiceCheckStatus::OK, tags: [ "test:ExampleServiceCheck", ], }), ] p api_instance.submit_service_check(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" rb "example.rb" ``` ##### Submit a Service Check returns "Payload accepted" response ``` // Submit a Service Check returns "Payload accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_checks::ServiceChecksAPI; use datadog_api_client::datadogV1::model::ServiceCheck; use datadog_api_client::datadogV1::model::ServiceCheckStatus; #[tokio::main] async fn main() { let body = vec![ServiceCheck::new( "app.ok".to_string(), "host".to_string(), ServiceCheckStatus::OK, vec!["test:ExampleServiceCheck".to_string()], )]; let configuration = datadog::Configuration::new(); let api = ServiceChecksAPI::with_config(configuration); let resp = api.submit_service_check(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" cargo run ``` ##### Submit a Service Check returns "Payload accepted" response ``` /** * Submit a Service Check returns "Payload accepted" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceChecksApi(configuration); const params: v1.ServiceChecksApiSubmitServiceCheckRequest = { body: [ { check: "app.ok", hostName: "host", status: 0, tags: ["test:ExampleServiceCheck"], }, ], }; apiInstance .submitServiceCheck(params) .then((data: v1.IntakePayloadAccepted) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/service-definition # Service Definition API to create, update, retrieve and delete service definitions. Note: Service Catalog [v3.0 schema](https://docs.datadoghq.com/service_catalog/service_definitions/v3-0/) has new API endpoints documented under [Software Catalog](https://docs.datadoghq.com/api/latest/software-catalog/). Use the following Service Definition endpoints for v2.2 and earlier. ## [Get all service definitions](https://docs.datadoghq.com/api/latest/service-definition/#get-all-service-definitions) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-definition/#get-all-service-definitions-v2) GET https://api.ap1.datadoghq.com/api/v2/services/definitionshttps://api.ap2.datadoghq.com/api/v2/services/definitionshttps://api.datadoghq.eu/api/v2/services/definitionshttps://api.ddog-gov.com/api/v2/services/definitionshttps://api.datadoghq.com/api/v2/services/definitionshttps://api.us3.datadoghq.com/api/v2/services/definitionshttps://api.us5.datadoghq.com/api/v2/services/definitions ### Overview Get a list of all service definitions from the Datadog Service Catalog. This endpoint requires the `apm_service_catalog_read` permission. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-definition) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. schema_version enum The schema version desired in the response.
Allowed enum values: `v1, v2, v2.1, v2.2` ### Response * [200](https://docs.datadoghq.com/api/latest/service-definition/#ListServiceDefinitions-200-v2) * [403](https://docs.datadoghq.com/api/latest/service-definition/#ListServiceDefinitions-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-definition/#ListServiceDefinitions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) Create service definitions response. Field Type Description data [object] Data representing service definitions. attributes object Service definition attributes. meta object Metadata about a service definition. github-html-url string GitHub HTML URL. ingested-schema-version string Ingestion schema version. ingestion-source string Ingestion source of the service definition. last-modified-time string Last modified time of the service definition. origin string User defined origin of the service definition. origin-detail string User defined origin's detail of the service definition. warnings [object] A list of schema validation warnings. instance-location string The warning instance location. keyword-location string The warning keyword location. message string The warning message. schema Service definition schema. Option 1 object **DEPRECATED** : Deprecated - Service definition V1 for providing additional service metadata and integrations. contact object Contact information about the service. email string Service owner’s email. slack string Service owner’s Slack channel. extensions object Extensions to V1 schema. external-resources [object] A list of external links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. info [_required_] object Basic information about a service. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. display-name string A friendly name of the service. service-tier string Service tier. integrations object Third party integrations that Datadog supports. pagerduty string PagerDuty service URL for the service. org object Org related information about the service. application string App feature this service supports. team string Team that owns the service. schema-version [_required_] enum Schema version being used. Allowed enum values: `v1` default: `v1` tags [string] A set of custom tags. Option 2 object Service definition V2 for providing service metadata and integrations. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. dd-team string Experimental feature. 
A Team handle that matches a Team in the Datadog Teams product. docs [object] A list of documentation related to the services. name [_required_] string Document name. provider string Document provider. url [_required_] string Document URL. extensions object Extensions to V2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty string PagerDuty service URL for the service. links [object] A list of links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. repos [object] A list of code repositories related to the services. name [_required_] string Repository name. provider string Repository provider. url [_required_] string Repository URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2` default: `v2` tags [string] A set of custom tags. team string Team that owns the service. Option 3 object Service definition v2.1 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.1 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] enum Link type. Allowed enum values: `doc,repo,runbook,dashboard,other` url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.1` default: `v2.1` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. Option 4 object Service definition v2.2 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. ci-pipeline-fingerprints [string] A set of CI fingerprints. contacts [object] A list of contacts related to the services. 
contact [_required_] string Contact value. name string Contact Name. type [_required_] string Contact type. Datadog recognizes the following types: `email`, `slack`, and `microsoft-teams`. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. languages [string] The service's programming language. Datadog recognizes the following languages: `dotnet`, `go`, `java`, `js`, `php`, `python`, `ruby`, and `c++`. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. Datadog recognizes the following types: `runbook`, `doc`, `repo`, `dashboard`, and `other`. url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.2` default: `v2.2` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. type string The type of service. id string Service definition id. type string Service definition type. ``` { "data": [ { "attributes": { "meta": { "github-html-url": "string", "ingested-schema-version": "string", "ingestion-source": "string", "last-modified-time": "string", "origin": "string", "origin-detail": "string", "warnings": [ { "instance-location": "string", "keyword-location": "string", "message": "string" } ] }, "schema": { "contact": { "email": "contact@datadoghq.com", "slack": "https://yourcompany.slack.com/archives/channel123" }, "extensions": { "myorg/extension": "extensionValue" }, "external-resources": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" } ], "info": { "dd-service": "myservice", "description": "A shopping cart service", "display-name": "My Service", "service-tier": "Tier 1" }, "integrations": { "pagerduty": "https://my-org.pagerduty.com/service-directory/PMyService" }, "org": { "application": "E-Commerce", "team": "my-team" }, "schema-version": "v1", "tags": [ "my:tag", "service:tag" ] } }, "id": "string", "type": "string" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=typescript) ##### Get all service definitions Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/definitions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all service definitions ``` """ Get all service definitions returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi from datadog_api_client.v2.model.service_definition_schema_versions import ServiceDefinitionSchemaVersions configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) response = api_instance.list_service_definitions( schema_version=ServiceDefinitionSchemaVersions.V2_1, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all service definitions ``` # Get all service definitions returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new opts = { schema_version: ServiceDefinitionSchemaVersions::V2_1, } p api_instance.list_service_definitions(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all service definitions ``` // Get all service definitions returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) resp, r, err := api.ListServiceDefinitions(ctx, *datadogV2.NewListServiceDefinitionsOptionalParameters().WithSchemaVersion(datadogV2.SERVICEDEFINITIONSCHEMAVERSIONS_V2_1)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.ListServiceDefinitions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := 
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceDefinitionApi.ListServiceDefinitions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all service definitions ``` // Get all service definitions returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; import com.datadog.api.client.v2.api.ServiceDefinitionApi.ListServiceDefinitionsOptionalParameters; import com.datadog.api.client.v2.model.ServiceDefinitionSchemaVersions; import com.datadog.api.client.v2.model.ServiceDefinitionsListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); try { ServiceDefinitionsListResponse result = apiInstance.listServiceDefinitions( new ListServiceDefinitionsOptionalParameters() .schemaVersion(ServiceDefinitionSchemaVersions.V2_1)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceDefinitionApi#listServiceDefinitions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all service definitions ``` // Get all service definitions returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::ListServiceDefinitionsOptionalParams; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; use datadog_api_client::datadogV2::model::ServiceDefinitionSchemaVersions; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api .list_service_definitions( ListServiceDefinitionsOptionalParams::default() .schema_version(ServiceDefinitionSchemaVersions::V2_1), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all service definitions ``` /** * Get all service definitions returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: 
v2.ServiceDefinitionApiListServiceDefinitionsRequest = { schemaVersion: "v2.1", }; apiInstance .listServiceDefinitions(params) .then((data: v2.ServiceDefinitionsListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create or update service definition](https://docs.datadoghq.com/api/latest/service-definition/#create-or-update-service-definition) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-definition/#create-or-update-service-definition-v2) POST https://api.ap1.datadoghq.com/api/v2/services/definitions https://api.ap2.datadoghq.com/api/v2/services/definitions https://api.datadoghq.eu/api/v2/services/definitions https://api.ddog-gov.com/api/v2/services/definitions https://api.datadoghq.com/api/v2/services/definitions https://api.us3.datadoghq.com/api/v2/services/definitions https://api.us5.datadoghq.com/api/v2/services/definitions ### Overview Create or update a service definition in the Datadog Service Catalog. This endpoint requires the `apm_service_catalog_write` permission. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-definition) to access this endpoint. ### Request #### Body Data (required) Service Definition YAML/JSON. * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) Field Type Description Option 1 object Service definition v2.2 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. ci-pipeline-fingerprints [string] A set of CI fingerprints. contacts [object] A list of contacts related to the services. contact [_required_] string Contact value. name string Contact Name. type [_required_] string Contact type. Datadog recognizes the following types: `email`, `slack`, and `microsoft-teams`. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. languages [string] The service's programming language. Datadog recognizes the following languages: `dotnet`, `go`, `java`, `js`, `php`, `python`, `ruby`, and `c++`. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. Datadog recognizes the following types: `runbook`, `doc`, `repo`, `dashboard`, and `other`. url [_required_] string Link URL.
schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.2` default: `v2.2` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. type string The type of service. Option 2 object Service definition v2.1 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.1 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] enum Link type. Allowed enum values: `doc,repo,runbook,dashboard,other` url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.1` default: `v2.1` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. Option 3 object Service definition V2 for providing service metadata and integrations. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. dd-team string Experimental feature. A Team handle that matches a Team in the Datadog Teams product. docs [object] A list of documentation related to the services. name [_required_] string Document name. provider string Document provider. url [_required_] string Document URL. extensions object Extensions to V2 schema. 
integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty string PagerDuty service URL for the service. links [object] A list of links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. repos [object] A list of code repositories related to the services. name [_required_] string Repository name. provider string Repository provider. url [_required_] string Repository URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2` default: `v2` tags [string] A set of custom tags. team string Team that owns the service. Option 4 string Service Definition in raw JSON/YAML representation. ##### Create or update service definition using schema v2 returns "CREATED" response ``` { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "dd-team": "my-team", "docs": [ { "name": "Architecture", "provider": "google drive", "url": "https://gdrive/mydoc" } ], "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": "https://my-org.pagerduty.com/service-directory/PMyService" }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" } ], "repos": [ { "name": "Source Code", "provider": "GitHub", "url": "https://github.com/DataDog/schema" } ], "schema-version": "v2", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": { "service-url": "https://my-org.pagerduty.com/service-directory/PMyService" } }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" }, { "name": "Source Code", "type": "repo", "provider": "GitHub", "url": "https://github.com/DataDog/schema" }, { "name": "Architecture", "type": "doc", "provider": "Gigoogle drivetHub", "url": "https://my-runbook" } ], "schema-version": "v2.1", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": { "service-url": "https://my-org.pagerduty.com/service-directory/PMyService" } }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" }, { "name": "Source Code", "type": "repo", "provider": "GitHub", "url": "https://github.com/DataDog/schema" }, { "name": "Architecture", "type": "doc", 
"provider": "Gigoogle drivetHub", "url": "https://my-runbook" } ], "schema-version": "v2.2", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-definition/#CreateOrUpdateServiceDefinitions-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-definition/#CreateOrUpdateServiceDefinitions-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-definition/#CreateOrUpdateServiceDefinitions-403-v2) * [409](https://docs.datadoghq.com/api/latest/service-definition/#CreateOrUpdateServiceDefinitions-409-v2) * [429](https://docs.datadoghq.com/api/latest/service-definition/#CreateOrUpdateServiceDefinitions-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) Create service definitions response. Field Type Description data [object] Create service definitions response payload. attributes object Service definition attributes. meta object Metadata about a service definition. github-html-url string GitHub HTML URL. ingested-schema-version string Ingestion schema version. ingestion-source string Ingestion source of the service definition. last-modified-time string Last modified time of the service definition. origin string User defined origin of the service definition. origin-detail string User defined origin's detail of the service definition. warnings [object] A list of schema validation warnings. instance-location string The warning instance location. keyword-location string The warning keyword location. message string The warning message. schema Service definition schema. Option 1 object **DEPRECATED** : Deprecated - Service definition V1 for providing additional service metadata and integrations. contact object Contact information about the service. email string Service owner’s email. slack string Service owner’s Slack channel. extensions object Extensions to V1 schema. external-resources [object] A list of external links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. info [_required_] object Basic information about a service. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. display-name string A friendly name of the service. service-tier string Service tier. integrations object Third party integrations that Datadog supports. pagerduty string PagerDuty service URL for the service. org object Org related information about the service. application string App feature this service supports. team string Team that owns the service. schema-version [_required_] enum Schema version being used. Allowed enum values: `v1` default: `v1` tags [string] A set of custom tags. Option 2 object Service definition V2 for providing service metadata and integrations. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. 
contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. dd-team string Experimental feature. A Team handle that matches a Team in the Datadog Teams product. docs [object] A list of documentation related to the services. name [_required_] string Document name. provider string Document provider. url [_required_] string Document URL. extensions object Extensions to V2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty string PagerDuty service URL for the service. links [object] A list of links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. repos [object] A list of code repositories related to the services. name [_required_] string Repository name. provider string Repository provider. url [_required_] string Repository URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2` default: `v2` tags [string] A set of custom tags. team string Team that owns the service. Option 3 object Service definition v2.1 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.1 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] enum Link type. Allowed enum values: `doc,repo,runbook,dashboard,other` url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.1` default: `v2.1` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. 
Option 4 object Service definition v2.2 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. ci-pipeline-fingerprints [string] A set of CI fingerprints. contacts [object] A list of contacts related to the services. contact [_required_] string Contact value. name string Contact Name. type [_required_] string Contact type. Datadog recognizes the following types: `email`, `slack`, and `microsoft-teams`. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. languages [string] The service's programming language. Datadog recognizes the following languages: `dotnet`, `go`, `java`, `js`, `php`, `python`, `ruby`, and `c++`. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. Datadog recognizes the following types: `runbook`, `doc`, `repo`, `dashboard`, and `other`. url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.2` default: `v2.2` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. type string The type of service. id string Service definition id. type string Service definition type. ``` { "data": [ { "attributes": { "meta": { "github-html-url": "string", "ingested-schema-version": "string", "ingestion-source": "string", "last-modified-time": "string", "origin": "string", "origin-detail": "string", "warnings": [ { "instance-location": "string", "keyword-location": "string", "message": "string" } ] }, "schema": { "contact": { "email": "contact@datadoghq.com", "slack": "https://yourcompany.slack.com/archives/channel123" }, "extensions": { "myorg/extension": "extensionValue" }, "external-resources": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" } ], "info": { "dd-service": "myservice", "description": "A shopping cart service", "display-name": "My Service", "service-tier": "Tier 1" }, "integrations": { "pagerduty": "https://my-org.pagerduty.com/service-directory/PMyService" }, "org": { "application": "E-Commerce", "team": "my-team" }, "schema-version": "v1", "tags": [ "my:tag", "service:tag" ] } }, "id": "string", "type": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Conflict * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=typescript) ##### Create or update service definition using schema v2 returns "CREATED" response ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/services/definitions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "dd-team": "my-team", "docs": [ { "name": "Architecture", "provider": "google drive", "url": "https://gdrive/mydoc" } ], "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": "https://my-org.pagerduty.com/service-directory/PMyService" }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" } ], "repos": [ { "name": "Source Code", "provider": "GitHub", "url": "https://github.com/DataDog/schema" } ], "schema-version": "v2", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } EOF ``` ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/services/definitions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": { "service-url":
"https://my-org.pagerduty.com/service-directory/PMyService" } }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" }, { "name": "Source Code", "type": "repo", "provider": "GitHub", "url": "https://github.com/DataDog/schema" }, { "name": "Architecture", "type": "doc", "provider": "Gigoogle drivetHub", "url": "https://my-runbook" } ], "schema-version": "v2.1", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } EOF ``` ##### Create or update service definition using schema v2-2 returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/definitions" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "contacts": [ { "contact": "contact@datadoghq.com", "name": "Team Email", "type": "email" } ], "dd-service": "service-exampleservicedefinition", "extensions": { "myorgextension": "extensionvalue" }, "integrations": { "opsgenie": { "region": "US", "service-url": "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" }, "pagerduty": { "service-url": "https://my-org.pagerduty.com/service-directory/PMyService" } }, "links": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" }, { "name": "Source Code", "type": "repo", "provider": "GitHub", "url": "https://github.com/DataDog/schema" }, { "name": "Architecture", "type": "doc", "provider": "Gigoogle drivetHub", "url": "https://my-runbook" } ], "schema-version": "v2.2", "tags": [ "my:tag", "service:tag" ], "team": "my-team" } EOF ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` // Create or update service definition using schema v2 returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ServiceDefinitionsCreateRequest{ ServiceDefinitionV2: &datadogV2.ServiceDefinitionV2{ Contacts: []datadogV2.ServiceDefinitionV2Contact{ datadogV2.ServiceDefinitionV2Contact{ ServiceDefinitionV2Email: &datadogV2.ServiceDefinitionV2Email{ Contact: "contact@datadoghq.com", Name: datadog.PtrString("Team Email"), Type: datadogV2.SERVICEDEFINITIONV2EMAILTYPE_EMAIL, }}, }, DdService: "service-exampleservicedefinition", DdTeam: datadog.PtrString("my-team"), Docs: []datadogV2.ServiceDefinitionV2Doc{ { Name: "Architecture", Provider: datadog.PtrString("google drive"), Url: "https://gdrive/mydoc", }, }, Extensions: map[string]interface{}{ "myorgextension": "extensionvalue", }, Integrations: &datadogV2.ServiceDefinitionV2Integrations{ Opsgenie: &datadogV2.ServiceDefinitionV2Opsgenie{ Region: datadogV2.SERVICEDEFINITIONV2OPSGENIEREGION_US.Ptr(), ServiceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, Pagerduty: datadog.PtrString("https://my-org.pagerduty.com/service-directory/PMyService"), }, Links: []datadogV2.ServiceDefinitionV2Link{ { Name: "Runbook", Type: datadogV2.SERVICEDEFINITIONV2LINKTYPE_RUNBOOK, Url: "https://my-runbook", }, }, Repos: []datadogV2.ServiceDefinitionV2Repo{ { Name: "Source Code", Provider: datadog.PtrString("GitHub"), Url: "https://github.com/DataDog/schema", }, }, SchemaVersion: 
datadogV2.SERVICEDEFINITIONV2VERSION_V2, Tags: []string{ "my:tag", "service:tag", }, Team: datadog.PtrString("my-team"), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) resp, r, err := api.CreateOrUpdateServiceDefinitions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`:\n%s\n", responseContent) } ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` // Create or update service definition using schema v2-1 returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ServiceDefinitionsCreateRequest{ ServiceDefinitionV2Dot1: &datadogV2.ServiceDefinitionV2Dot1{ Contacts: []datadogV2.ServiceDefinitionV2Dot1Contact{ datadogV2.ServiceDefinitionV2Dot1Contact{ ServiceDefinitionV2Dot1Email: &datadogV2.ServiceDefinitionV2Dot1Email{ Contact: "contact@datadoghq.com", Name: datadog.PtrString("Team Email"), Type: datadogV2.SERVICEDEFINITIONV2DOT1EMAILTYPE_EMAIL, }}, }, DdService: "service-exampleservicedefinition", Extensions: map[string]interface{}{ "myorgextension": "extensionvalue", }, Integrations: &datadogV2.ServiceDefinitionV2Dot1Integrations{ Opsgenie: &datadogV2.ServiceDefinitionV2Dot1Opsgenie{ Region: datadogV2.SERVICEDEFINITIONV2DOT1OPSGENIEREGION_US.Ptr(), ServiceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, Pagerduty: &datadogV2.ServiceDefinitionV2Dot1Pagerduty{ ServiceUrl: datadog.PtrString("https://my-org.pagerduty.com/service-directory/PMyService"), }, }, Links: []datadogV2.ServiceDefinitionV2Dot1Link{ { Name: "Runbook", Type: datadogV2.SERVICEDEFINITIONV2DOT1LINKTYPE_RUNBOOK, Url: "https://my-runbook", }, { Name: "Source Code", Type: datadogV2.SERVICEDEFINITIONV2DOT1LINKTYPE_REPO, Provider: datadog.PtrString("GitHub"), Url: "https://github.com/DataDog/schema", }, { Name: "Architecture", Type: datadogV2.SERVICEDEFINITIONV2DOT1LINKTYPE_DOC, Provider: datadog.PtrString("Gigoogle drivetHub"), Url: "https://my-runbook", }, }, SchemaVersion: datadogV2.SERVICEDEFINITIONV2DOT1VERSION_V2_1, Tags: []string{ "my:tag", "service:tag", }, Team: datadog.PtrString("my-team"), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) resp, r, err := api.CreateOrUpdateServiceDefinitions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`:\n%s\n", responseContent) } ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` // Create or update service definition using schema v2-2 returns "CREATED" response package main import ( "context" 
"encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ServiceDefinitionsCreateRequest{ ServiceDefinitionV2Dot2: &datadogV2.ServiceDefinitionV2Dot2{ Contacts: []datadogV2.ServiceDefinitionV2Dot2Contact{ { Contact: "contact@datadoghq.com", Name: datadog.PtrString("Team Email"), Type: "email", }, }, DdService: "service-exampleservicedefinition", Extensions: map[string]interface{}{ "myorgextension": "extensionvalue", }, Integrations: &datadogV2.ServiceDefinitionV2Dot2Integrations{ Opsgenie: &datadogV2.ServiceDefinitionV2Dot2Opsgenie{ Region: datadogV2.SERVICEDEFINITIONV2DOT2OPSGENIEREGION_US.Ptr(), ServiceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, Pagerduty: &datadogV2.ServiceDefinitionV2Dot2Pagerduty{ ServiceUrl: datadog.PtrString("https://my-org.pagerduty.com/service-directory/PMyService"), }, }, Links: []datadogV2.ServiceDefinitionV2Dot2Link{ { Name: "Runbook", Type: "runbook", Url: "https://my-runbook", }, { Name: "Source Code", Type: "repo", Provider: datadog.PtrString("GitHub"), Url: "https://github.com/DataDog/schema", }, { Name: "Architecture", Type: "doc", Provider: datadog.PtrString("Gigoogle drivetHub"), Url: "https://my-runbook", }, }, SchemaVersion: datadogV2.SERVICEDEFINITIONV2DOT2VERSION_V2_2, Tags: []string{ "my:tag", "service:tag", }, Team: datadog.PtrString("my-team"), }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) resp, r, err := api.CreateOrUpdateServiceDefinitions(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceDefinitionApi.CreateOrUpdateServiceDefinitions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` // Create or update service definition using schema v2 returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; import com.datadog.api.client.v2.model.ServiceDefinitionCreateResponse; import com.datadog.api.client.v2.model.ServiceDefinitionV2; import com.datadog.api.client.v2.model.ServiceDefinitionV2Contact; import com.datadog.api.client.v2.model.ServiceDefinitionV2Doc; import com.datadog.api.client.v2.model.ServiceDefinitionV2Email; import com.datadog.api.client.v2.model.ServiceDefinitionV2EmailType; import com.datadog.api.client.v2.model.ServiceDefinitionV2Integrations; import com.datadog.api.client.v2.model.ServiceDefinitionV2Link; import com.datadog.api.client.v2.model.ServiceDefinitionV2LinkType; import com.datadog.api.client.v2.model.ServiceDefinitionV2Opsgenie; import com.datadog.api.client.v2.model.ServiceDefinitionV2OpsgenieRegion; import 
com.datadog.api.client.v2.model.ServiceDefinitionV2Repo; import com.datadog.api.client.v2.model.ServiceDefinitionV2Version; import com.datadog.api.client.v2.model.ServiceDefinitionsCreateRequest; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); ServiceDefinitionsCreateRequest body = new ServiceDefinitionsCreateRequest( new ServiceDefinitionV2() .contacts( Collections.singletonList( new ServiceDefinitionV2Contact( new ServiceDefinitionV2Email() .contact("contact@datadoghq.com") .name("Team Email") .type(ServiceDefinitionV2EmailType.EMAIL)))) .ddService("service-exampleservicedefinition") .ddTeam("my-team") .docs( Collections.singletonList( new ServiceDefinitionV2Doc() .name("Architecture") .provider("google drive") .url("https://gdrive/mydoc"))) .extensions(Map.ofEntries(Map.entry("myorgextension", "extensionvalue"))) .integrations( new ServiceDefinitionV2Integrations() .opsgenie( new ServiceDefinitionV2Opsgenie() .region(ServiceDefinitionV2OpsgenieRegion.US) .serviceUrl( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000")) .pagerduty("https://my-org.pagerduty.com/service-directory/PMyService")) .links( Collections.singletonList( new ServiceDefinitionV2Link() .name("Runbook") .type(ServiceDefinitionV2LinkType.RUNBOOK) .url("https://my-runbook"))) .repos( Collections.singletonList( new ServiceDefinitionV2Repo() .name("Source Code") .provider("GitHub") .url("https://github.com/DataDog/schema"))) .schemaVersion(ServiceDefinitionV2Version.V2) .tags(Arrays.asList("my:tag", "service:tag")) .team("my-team")); try { ServiceDefinitionCreateResponse result = apiInstance.createOrUpdateServiceDefinitions(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceDefinitionApi#createOrUpdateServiceDefinitions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` // Create or update service definition using schema v2-1 returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; import com.datadog.api.client.v2.model.ServiceDefinitionCreateResponse; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Contact; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Email; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1EmailType; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Integrations; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Link; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1LinkType; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Opsgenie; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1OpsgenieRegion; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Pagerduty; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot1Version; import com.datadog.api.client.v2.model.ServiceDefinitionsCreateRequest; import java.util.Arrays; import java.util.Collections; import 
java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); ServiceDefinitionsCreateRequest body = new ServiceDefinitionsCreateRequest( new ServiceDefinitionV2Dot1() .contacts( Collections.singletonList( new ServiceDefinitionV2Dot1Contact( new ServiceDefinitionV2Dot1Email() .contact("contact@datadoghq.com") .name("Team Email") .type(ServiceDefinitionV2Dot1EmailType.EMAIL)))) .ddService("service-exampleservicedefinition") .extensions(Map.ofEntries(Map.entry("myorgextension", "extensionvalue"))) .integrations( new ServiceDefinitionV2Dot1Integrations() .opsgenie( new ServiceDefinitionV2Dot1Opsgenie() .region(ServiceDefinitionV2Dot1OpsgenieRegion.US) .serviceUrl( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000")) .pagerduty( new ServiceDefinitionV2Dot1Pagerduty() .serviceUrl( "https://my-org.pagerduty.com/service-directory/PMyService"))) .links( Arrays.asList( new ServiceDefinitionV2Dot1Link() .name("Runbook") .type(ServiceDefinitionV2Dot1LinkType.RUNBOOK) .url("https://my-runbook"), new ServiceDefinitionV2Dot1Link() .name("Source Code") .type(ServiceDefinitionV2Dot1LinkType.REPO) .provider("GitHub") .url("https://github.com/DataDog/schema"), new ServiceDefinitionV2Dot1Link() .name("Architecture") .type(ServiceDefinitionV2Dot1LinkType.DOC) .provider("Gigoogle drivetHub") .url("https://my-runbook"))) .schemaVersion(ServiceDefinitionV2Dot1Version.V2_1) .tags(Arrays.asList("my:tag", "service:tag")) .team("my-team")); try { ServiceDefinitionCreateResponse result = apiInstance.createOrUpdateServiceDefinitions(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceDefinitionApi#createOrUpdateServiceDefinitions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` // Create or update service definition using schema v2-2 returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; import com.datadog.api.client.v2.model.ServiceDefinitionCreateResponse; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Contact; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Integrations; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Link; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Opsgenie; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2OpsgenieRegion; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Pagerduty; import com.datadog.api.client.v2.model.ServiceDefinitionV2Dot2Version; import com.datadog.api.client.v2.model.ServiceDefinitionsCreateRequest; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); ServiceDefinitionsCreateRequest body = new ServiceDefinitionsCreateRequest( new ServiceDefinitionV2Dot2() .contacts( Collections.singletonList( new 
ServiceDefinitionV2Dot2Contact() .contact("contact@datadoghq.com") .name("Team Email") .type("email"))) .ddService("service-exampleservicedefinition") .extensions(Map.ofEntries(Map.entry("myorgextension", "extensionvalue"))) .integrations( new ServiceDefinitionV2Dot2Integrations() .opsgenie( new ServiceDefinitionV2Dot2Opsgenie() .region(ServiceDefinitionV2Dot2OpsgenieRegion.US) .serviceUrl( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000")) .pagerduty( new ServiceDefinitionV2Dot2Pagerduty() .serviceUrl( "https://my-org.pagerduty.com/service-directory/PMyService"))) .links( Arrays.asList( new ServiceDefinitionV2Dot2Link() .name("Runbook") .type("runbook") .url("https://my-runbook"), new ServiceDefinitionV2Dot2Link() .name("Source Code") .type("repo") .provider("GitHub") .url("https://github.com/DataDog/schema"), new ServiceDefinitionV2Dot2Link() .name("Architecture") .type("doc") .provider("Gigoogle drivetHub") .url("https://my-runbook"))) .schemaVersion(ServiceDefinitionV2Dot2Version.V2_2) .tags(Arrays.asList("my:tag", "service:tag")) .team("my-team")); try { ServiceDefinitionCreateResponse result = apiInstance.createOrUpdateServiceDefinitions(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceDefinitionApi#createOrUpdateServiceDefinitions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` """ Create or update service definition using schema v2 returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi from datadog_api_client.v2.model.service_definition_v2 import ServiceDefinitionV2 from datadog_api_client.v2.model.service_definition_v2_doc import ServiceDefinitionV2Doc from datadog_api_client.v2.model.service_definition_v2_email import ServiceDefinitionV2Email from datadog_api_client.v2.model.service_definition_v2_email_type import ServiceDefinitionV2EmailType from datadog_api_client.v2.model.service_definition_v2_integrations import ServiceDefinitionV2Integrations from datadog_api_client.v2.model.service_definition_v2_link import ServiceDefinitionV2Link from datadog_api_client.v2.model.service_definition_v2_link_type import ServiceDefinitionV2LinkType from datadog_api_client.v2.model.service_definition_v2_opsgenie import ServiceDefinitionV2Opsgenie from datadog_api_client.v2.model.service_definition_v2_opsgenie_region import ServiceDefinitionV2OpsgenieRegion from datadog_api_client.v2.model.service_definition_v2_repo import ServiceDefinitionV2Repo from datadog_api_client.v2.model.service_definition_v2_version import ServiceDefinitionV2Version body = ServiceDefinitionV2( contacts=[ ServiceDefinitionV2Email( contact="contact@datadoghq.com", name="Team Email", type=ServiceDefinitionV2EmailType.EMAIL, ), ], dd_service="service-exampleservicedefinition", dd_team="my-team", docs=[ 
ServiceDefinitionV2Doc( name="Architecture", provider="google drive", url="https://gdrive/mydoc", ), ], extensions=dict([("myorgextension", "extensionvalue")]), integrations=ServiceDefinitionV2Integrations( opsgenie=ServiceDefinitionV2Opsgenie( region=ServiceDefinitionV2OpsgenieRegion.US, service_url="https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", ), pagerduty="https://my-org.pagerduty.com/service-directory/PMyService", ), links=[ ServiceDefinitionV2Link( name="Runbook", type=ServiceDefinitionV2LinkType.RUNBOOK, url="https://my-runbook", ), ], repos=[ ServiceDefinitionV2Repo( name="Source Code", provider="GitHub", url="https://github.com/DataDog/schema", ), ], schema_version=ServiceDefinitionV2Version.V2, tags=[ "my:tag", "service:tag", ], team="my-team", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) response = api_instance.create_or_update_service_definitions(body=body) print(response) ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` """ Create or update service definition using schema v2-1 returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi from datadog_api_client.v2.model.service_definition_v2_dot1 import ServiceDefinitionV2Dot1 from datadog_api_client.v2.model.service_definition_v2_dot1_email import ServiceDefinitionV2Dot1Email from datadog_api_client.v2.model.service_definition_v2_dot1_email_type import ServiceDefinitionV2Dot1EmailType from datadog_api_client.v2.model.service_definition_v2_dot1_integrations import ServiceDefinitionV2Dot1Integrations from datadog_api_client.v2.model.service_definition_v2_dot1_link import ServiceDefinitionV2Dot1Link from datadog_api_client.v2.model.service_definition_v2_dot1_link_type import ServiceDefinitionV2Dot1LinkType from datadog_api_client.v2.model.service_definition_v2_dot1_opsgenie import ServiceDefinitionV2Dot1Opsgenie from datadog_api_client.v2.model.service_definition_v2_dot1_opsgenie_region import ServiceDefinitionV2Dot1OpsgenieRegion from datadog_api_client.v2.model.service_definition_v2_dot1_pagerduty import ServiceDefinitionV2Dot1Pagerduty from datadog_api_client.v2.model.service_definition_v2_dot1_version import ServiceDefinitionV2Dot1Version body = ServiceDefinitionV2Dot1( contacts=[ ServiceDefinitionV2Dot1Email( contact="contact@datadoghq.com", name="Team Email", type=ServiceDefinitionV2Dot1EmailType.EMAIL, ), ], dd_service="service-exampleservicedefinition", extensions=dict([("myorgextension", "extensionvalue")]), integrations=ServiceDefinitionV2Dot1Integrations( opsgenie=ServiceDefinitionV2Dot1Opsgenie( region=ServiceDefinitionV2Dot1OpsgenieRegion.US, service_url="https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", ), pagerduty=ServiceDefinitionV2Dot1Pagerduty( service_url="https://my-org.pagerduty.com/service-directory/PMyService", ), ), links=[ ServiceDefinitionV2Dot1Link( name="Runbook", type=ServiceDefinitionV2Dot1LinkType.RUNBOOK, url="https://my-runbook", ), ServiceDefinitionV2Dot1Link( name="Source Code", type=ServiceDefinitionV2Dot1LinkType.REPO, provider="GitHub", url="https://github.com/DataDog/schema", ), ServiceDefinitionV2Dot1Link( name="Architecture", type=ServiceDefinitionV2Dot1LinkType.DOC, provider="Gigoogle drivetHub", url="https://my-runbook", ), ], schema_version=ServiceDefinitionV2Dot1Version.V2_1, tags=[ "my:tag", 
"service:tag", ], team="my-team", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) response = api_instance.create_or_update_service_definitions(body=body) print(response) ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` """ Create or update service definition using schema v2-2 returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi from datadog_api_client.v2.model.service_definition_v2_dot2 import ServiceDefinitionV2Dot2 from datadog_api_client.v2.model.service_definition_v2_dot2_contact import ServiceDefinitionV2Dot2Contact from datadog_api_client.v2.model.service_definition_v2_dot2_integrations import ServiceDefinitionV2Dot2Integrations from datadog_api_client.v2.model.service_definition_v2_dot2_link import ServiceDefinitionV2Dot2Link from datadog_api_client.v2.model.service_definition_v2_dot2_opsgenie import ServiceDefinitionV2Dot2Opsgenie from datadog_api_client.v2.model.service_definition_v2_dot2_opsgenie_region import ServiceDefinitionV2Dot2OpsgenieRegion from datadog_api_client.v2.model.service_definition_v2_dot2_pagerduty import ServiceDefinitionV2Dot2Pagerduty from datadog_api_client.v2.model.service_definition_v2_dot2_version import ServiceDefinitionV2Dot2Version body = ServiceDefinitionV2Dot2( contacts=[ ServiceDefinitionV2Dot2Contact( contact="contact@datadoghq.com", name="Team Email", type="email", ), ], dd_service="service-exampleservicedefinition", extensions=dict([("myorgextension", "extensionvalue")]), integrations=ServiceDefinitionV2Dot2Integrations( opsgenie=ServiceDefinitionV2Dot2Opsgenie( region=ServiceDefinitionV2Dot2OpsgenieRegion.US, service_url="https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", ), pagerduty=ServiceDefinitionV2Dot2Pagerduty( service_url="https://my-org.pagerduty.com/service-directory/PMyService", ), ), links=[ ServiceDefinitionV2Dot2Link( name="Runbook", type="runbook", url="https://my-runbook", ), ServiceDefinitionV2Dot2Link( name="Source Code", type="repo", provider="GitHub", url="https://github.com/DataDog/schema", ), ServiceDefinitionV2Dot2Link( name="Architecture", type="doc", provider="Gigoogle drivetHub", url="https://my-runbook", ), ], schema_version=ServiceDefinitionV2Dot2Version.V2_2, tags=[ "my:tag", "service:tag", ], team="my-team", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) response = api_instance.create_or_update_service_definitions(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` # Create or update service definition using schema v2 returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new body = DatadogAPIClient::V2::ServiceDefinitionV2.new({ contacts: [ DatadogAPIClient::V2::ServiceDefinitionV2Email.new({ contact: "contact@datadoghq.com", name: "Team Email", type: 
DatadogAPIClient::V2::ServiceDefinitionV2EmailType::EMAIL, }), ], dd_service: "service-exampleservicedefinition", dd_team: "my-team", docs: [ DatadogAPIClient::V2::ServiceDefinitionV2Doc.new({ name: "Architecture", provider: "google drive", url: "https://gdrive/mydoc", }), ], extensions: { "myorgextension": "extensionvalue", }, integrations: DatadogAPIClient::V2::ServiceDefinitionV2Integrations.new({ opsgenie: DatadogAPIClient::V2::ServiceDefinitionV2Opsgenie.new({ region: DatadogAPIClient::V2::ServiceDefinitionV2OpsgenieRegion::US, service_url: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }), pagerduty: "https://my-org.pagerduty.com/service-directory/PMyService", }), links: [ DatadogAPIClient::V2::ServiceDefinitionV2Link.new({ name: "Runbook", type: DatadogAPIClient::V2::ServiceDefinitionV2LinkType::RUNBOOK, url: "https://my-runbook", }), ], repos: [ DatadogAPIClient::V2::ServiceDefinitionV2Repo.new({ name: "Source Code", provider: "GitHub", url: "https://github.com/DataDog/schema", }), ], schema_version: DatadogAPIClient::V2::ServiceDefinitionV2Version::V2, tags: [ "my:tag", "service:tag", ], team: "my-team", }) p api_instance.create_or_update_service_definitions(body) ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` # Create or update service definition using schema v2-1 returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new body = DatadogAPIClient::V2::ServiceDefinitionV2Dot1.new({ contacts: [ DatadogAPIClient::V2::ServiceDefinitionV2Dot1Email.new({ contact: "contact@datadoghq.com", name: "Team Email", type: DatadogAPIClient::V2::ServiceDefinitionV2Dot1EmailType::EMAIL, }), ], dd_service: "service-exampleservicedefinition", extensions: { "myorgextension": "extensionvalue", }, integrations: DatadogAPIClient::V2::ServiceDefinitionV2Dot1Integrations.new({ opsgenie: DatadogAPIClient::V2::ServiceDefinitionV2Dot1Opsgenie.new({ region: DatadogAPIClient::V2::ServiceDefinitionV2Dot1OpsgenieRegion::US, service_url: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }), pagerduty: DatadogAPIClient::V2::ServiceDefinitionV2Dot1Pagerduty.new({ service_url: "https://my-org.pagerduty.com/service-directory/PMyService", }), }), links: [ DatadogAPIClient::V2::ServiceDefinitionV2Dot1Link.new({ name: "Runbook", type: DatadogAPIClient::V2::ServiceDefinitionV2Dot1LinkType::RUNBOOK, url: "https://my-runbook", }), DatadogAPIClient::V2::ServiceDefinitionV2Dot1Link.new({ name: "Source Code", type: DatadogAPIClient::V2::ServiceDefinitionV2Dot1LinkType::REPO, provider: "GitHub", url: "https://github.com/DataDog/schema", }), DatadogAPIClient::V2::ServiceDefinitionV2Dot1Link.new({ name: "Architecture", type: DatadogAPIClient::V2::ServiceDefinitionV2Dot1LinkType::DOC, provider: "Gigoogle drivetHub", url: "https://my-runbook", }), ], schema_version: DatadogAPIClient::V2::ServiceDefinitionV2Dot1Version::V2_1, tags: [ "my:tag", "service:tag", ], team: "my-team", }) p api_instance.create_or_update_service_definitions(body) ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` # Create or update service definition using schema v2-2 returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new body = DatadogAPIClient::V2::ServiceDefinitionV2Dot2.new({ contacts: [ DatadogAPIClient::V2::ServiceDefinitionV2Dot2Contact.new({ contact: "contact@datadoghq.com", 
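# In schema v2.2 the contact type is a free-form string; Datadog recognizes "email", "slack", and "microsoft-teams".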
name: "Team Email", type: "email", }), ], dd_service: "service-exampleservicedefinition", extensions: { "myorgextension": "extensionvalue", }, integrations: DatadogAPIClient::V2::ServiceDefinitionV2Dot2Integrations.new({ opsgenie: DatadogAPIClient::V2::ServiceDefinitionV2Dot2Opsgenie.new({ region: DatadogAPIClient::V2::ServiceDefinitionV2Dot2OpsgenieRegion::US, service_url: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }), pagerduty: DatadogAPIClient::V2::ServiceDefinitionV2Dot2Pagerduty.new({ service_url: "https://my-org.pagerduty.com/service-directory/PMyService", }), }), links: [ DatadogAPIClient::V2::ServiceDefinitionV2Dot2Link.new({ name: "Runbook", type: "runbook", url: "https://my-runbook", }), DatadogAPIClient::V2::ServiceDefinitionV2Dot2Link.new({ name: "Source Code", type: "repo", provider: "GitHub", url: "https://github.com/DataDog/schema", }), DatadogAPIClient::V2::ServiceDefinitionV2Dot2Link.new({ name: "Architecture", type: "doc", provider: "Gigoogle drivetHub", url: "https://my-runbook", }), ], schema_version: DatadogAPIClient::V2::ServiceDefinitionV2Dot2Version::V2_2, tags: [ "my:tag", "service:tag", ], team: "my-team", }) p api_instance.create_or_update_service_definitions(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` // Create or update service definition using schema v2 returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; use datadog_api_client::datadogV2::model::ServiceDefinitionV2; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Contact; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Doc; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Email; use datadog_api_client::datadogV2::model::ServiceDefinitionV2EmailType; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Integrations; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Link; use datadog_api_client::datadogV2::model::ServiceDefinitionV2LinkType; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Opsgenie; use datadog_api_client::datadogV2::model::ServiceDefinitionV2OpsgenieRegion; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Repo; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Version; use datadog_api_client::datadogV2::model::ServiceDefinitionsCreateRequest; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = ServiceDefinitionsCreateRequest::ServiceDefinitionV2(Box::new( ServiceDefinitionV2::new( "service-exampleservicedefinition".to_string(), ServiceDefinitionV2Version::V2, ) .contacts(vec![ServiceDefinitionV2Contact::ServiceDefinitionV2Email( Box::new( ServiceDefinitionV2Email::new( "contact@datadoghq.com".to_string(), ServiceDefinitionV2EmailType::EMAIL, ) .name("Team Email".to_string()), ), )]) .dd_team("my-team".to_string()) .docs(vec![ServiceDefinitionV2Doc::new( "Architecture".to_string(), "https://gdrive/mydoc".to_string(), ) .provider("google drive".to_string())]) .extensions(BTreeMap::from([( "myorgextension".to_string(), 
Value::from("extensionvalue"), )])) .integrations( ServiceDefinitionV2Integrations::new() .opsgenie( ServiceDefinitionV2Opsgenie::new( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" .to_string(), ) .region(ServiceDefinitionV2OpsgenieRegion::US), ) .pagerduty("https://my-org.pagerduty.com/service-directory/PMyService".to_string()), ) .links(vec![ServiceDefinitionV2Link::new( "Runbook".to_string(), ServiceDefinitionV2LinkType::RUNBOOK, "https://my-runbook".to_string(), )]) .repos(vec![ServiceDefinitionV2Repo::new( "Source Code".to_string(), "https://github.com/DataDog/schema".to_string(), ) .provider("GitHub".to_string())]) .tags(vec!["my:tag".to_string(), "service:tag".to_string()]) .team("my-team".to_string()), )); let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api.create_or_update_service_definitions(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` // Create or update service definition using schema v2-1 returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Contact; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Email; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1EmailType; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Integrations; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Link; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1LinkType; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Opsgenie; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1OpsgenieRegion; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Pagerduty; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot1Version; use datadog_api_client::datadogV2::model::ServiceDefinitionsCreateRequest; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = ServiceDefinitionsCreateRequest::ServiceDefinitionV2Dot1(Box::new( ServiceDefinitionV2Dot1::new( "service-exampleservicedefinition".to_string(), ServiceDefinitionV2Dot1Version::V2_1, ) .contacts(vec![ ServiceDefinitionV2Dot1Contact::ServiceDefinitionV2Dot1Email(Box::new( ServiceDefinitionV2Dot1Email::new( "contact@datadoghq.com".to_string(), ServiceDefinitionV2Dot1EmailType::EMAIL, ) .name("Team Email".to_string()), )), ]) .extensions(BTreeMap::from([( "myorgextension".to_string(), Value::from("extensionvalue"), )])) .integrations( ServiceDefinitionV2Dot1Integrations::new() .opsgenie( ServiceDefinitionV2Dot1Opsgenie::new( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" .to_string(), ) .region(ServiceDefinitionV2Dot1OpsgenieRegion::US), ) .pagerduty(ServiceDefinitionV2Dot1Pagerduty::new().service_url( "https://my-org.pagerduty.com/service-directory/PMyService".to_string(), )), ) .links(vec![ ServiceDefinitionV2Dot1Link::new( "Runbook".to_string(), ServiceDefinitionV2Dot1LinkType::RUNBOOK, "https://my-runbook".to_string(), ), ServiceDefinitionV2Dot1Link::new( "Source Code".to_string(), ServiceDefinitionV2Dot1LinkType::REPO, "https://github.com/DataDog/schema".to_string(), ) 
.provider("GitHub".to_string()), ServiceDefinitionV2Dot1Link::new( "Architecture".to_string(), ServiceDefinitionV2Dot1LinkType::DOC, "https://my-runbook".to_string(), ) .provider("Gigoogle drivetHub".to_string()), ]) .tags(vec!["my:tag".to_string(), "service:tag".to_string()]) .team("my-team".to_string()), )); let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api.create_or_update_service_definitions(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` // Create or update service definition using schema v2-2 returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Contact; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Integrations; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Link; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Opsgenie; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2OpsgenieRegion; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Pagerduty; use datadog_api_client::datadogV2::model::ServiceDefinitionV2Dot2Version; use datadog_api_client::datadogV2::model::ServiceDefinitionsCreateRequest; use serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = ServiceDefinitionsCreateRequest::ServiceDefinitionV2Dot2(Box::new( ServiceDefinitionV2Dot2::new( "service-exampleservicedefinition".to_string(), ServiceDefinitionV2Dot2Version::V2_2, ) .contacts(vec![ServiceDefinitionV2Dot2Contact::new( "contact@datadoghq.com".to_string(), "email".to_string(), ) .name("Team Email".to_string())]) .extensions(BTreeMap::from([( "myorgextension".to_string(), Value::from("extensionvalue"), )])) .integrations( ServiceDefinitionV2Dot2Integrations::new() .opsgenie( ServiceDefinitionV2Dot2Opsgenie::new( "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000" .to_string(), ) .region(ServiceDefinitionV2Dot2OpsgenieRegion::US), ) .pagerduty(ServiceDefinitionV2Dot2Pagerduty::new().service_url( "https://my-org.pagerduty.com/service-directory/PMyService".to_string(), )), ) .links(vec![ ServiceDefinitionV2Dot2Link::new( "Runbook".to_string(), "runbook".to_string(), "https://my-runbook".to_string(), ), ServiceDefinitionV2Dot2Link::new( "Source Code".to_string(), "repo".to_string(), "https://github.com/DataDog/schema".to_string(), ) .provider("GitHub".to_string()), ServiceDefinitionV2Dot2Link::new( "Architecture".to_string(), "doc".to_string(), "https://my-runbook".to_string(), ) .provider("Gigoogle drivetHub".to_string()), ]) .tags(vec!["my:tag".to_string(), "service:tag".to_string()]) .team("my-team".to_string()), )); let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api.create_or_update_service_definitions(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create or update service definition using schema v2 returns "CREATED" response ``` /** * Create or update service definition using schema v2 returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: v2.ServiceDefinitionApiCreateOrUpdateServiceDefinitionsRequest = { body: { contacts: [ { contact: "contact@datadoghq.com", name: "Team Email", type: "email", }, ], ddService: "service-exampleservicedefinition", ddTeam: "my-team", docs: [ { name: "Architecture", provider: "google drive", url: "https://gdrive/mydoc", }, ], extensions: { myorgextension: "extensionvalue", }, integrations: { opsgenie: { region: "US", serviceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, pagerduty: "https://my-org.pagerduty.com/service-directory/PMyService", }, links: [ { name: "Runbook", type: "runbook", url: "https://my-runbook", }, ], repos: [ { name: "Source Code", provider: "GitHub", url: "https://github.com/DataDog/schema", }, ], schemaVersion: "v2", tags: ["my:tag", "service:tag"], team: "my-team", }, }; apiInstance .createOrUpdateServiceDefinitions(params) .then((data: v2.ServiceDefinitionCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create or update service definition using schema v2-1 returns "CREATED" response ``` /** * Create or update service definition using schema v2-1 returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: v2.ServiceDefinitionApiCreateOrUpdateServiceDefinitionsRequest = { body: { contacts: [ { contact: "contact@datadoghq.com", name: "Team Email", type: "email", }, ], ddService: "service-exampleservicedefinition", extensions: { myorgextension: "extensionvalue", }, integrations: { opsgenie: { region: "US", serviceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, pagerduty: { serviceUrl: "https://my-org.pagerduty.com/service-directory/PMyService", }, }, links: [ { name: "Runbook", type: "runbook", url: "https://my-runbook", }, { name: "Source Code", type: "repo", provider: "GitHub", url: "https://github.com/DataDog/schema", }, { name: "Architecture", type: "doc", provider: "Gigoogle drivetHub", url: "https://my-runbook", }, ], schemaVersion: "v2.1", tags: ["my:tag", "service:tag"], team: "my-team", }, }; apiInstance .createOrUpdateServiceDefinitions(params) .then((data: v2.ServiceDefinitionCreateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create or update service definition using schema v2-2 returns "CREATED" response ``` /** * Create or update service definition using schema v2-2 returns "CREATED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: v2.ServiceDefinitionApiCreateOrUpdateServiceDefinitionsRequest = { body: { contacts: [ { contact: "contact@datadoghq.com", name: "Team Email", type: "email", }, ], ddService: "service-exampleservicedefinition", extensions: { myorgextension: "extensionvalue", }, integrations: { opsgenie: { region: "US", serviceUrl: "https://my-org.opsgenie.com/service/123e4567-e89b-12d3-a456-426614174000", }, pagerduty: { serviceUrl: "https://my-org.pagerduty.com/service-directory/PMyService", }, }, links: [ { name: "Runbook", type: "runbook", url: "https://my-runbook", }, { name: "Source Code", type: "repo", provider: "GitHub", url: "https://github.com/DataDog/schema", }, { name: "Architecture", type: "doc", provider: "Gigoogle drivetHub", url: "https://my-runbook", }, ], schemaVersion: "v2.2", tags: ["my:tag", "service:tag"], team: "my-team", }, }; apiInstance .createOrUpdateServiceDefinitions(params) .then((data: v2.ServiceDefinitionCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a single service definition](https://docs.datadoghq.com/api/latest/service-definition/#get-a-single-service-definition) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-definition/#get-a-single-service-definition-v2) GET https://api.ap1.datadoghq.com/api/v2/services/definitions/{service_name}https://api.ap2.datadoghq.com/api/v2/services/definitions/{service_name}https://api.datadoghq.eu/api/v2/services/definitions/{service_name}https://api.ddog-gov.com/api/v2/services/definitions/{service_name}https://api.datadoghq.com/api/v2/services/definitions/{service_name}https://api.us3.datadoghq.com/api/v2/services/definitions/{service_name}https://api.us5.datadoghq.com/api/v2/services/definitions/{service_name} ### Overview Get a single service definition from the Datadog Service Catalog. This endpoint requires the `apm_service_catalog_read` permission. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-definition) to access this endpoint. ### Arguments #### Path Parameters Name Type Description service_name [_required_] string The name of the service. #### Query Strings Name Type Description schema_version enum The schema version desired in the response. 
Allowed enum values: `v1, v2, v2.1, v2.2` ### Response * [200](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-404-v2) * [409](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-409-v2) * [429](https://docs.datadoghq.com/api/latest/service-definition/#GetServiceDefinition-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) Get service definition response. Field Type Description data object Service definition data. attributes object Service definition attributes. meta object Metadata about a service definition. github-html-url string GitHub HTML URL. ingested-schema-version string Ingestion schema version. ingestion-source string Ingestion source of the service definition. last-modified-time string Last modified time of the service definition. origin string User defined origin of the service definition. origin-detail string User defined origin's detail of the service definition. warnings [object] A list of schema validation warnings. instance-location string The warning instance location. keyword-location string The warning keyword location. message string The warning message. schema Service definition schema. Option 1 object **DEPRECATED** : Deprecated - Service definition V1 for providing additional service metadata and integrations. contact object Contact information about the service. email string Service owner’s email. slack string Service owner’s Slack channel. extensions object Extensions to V1 schema. external-resources [object] A list of external links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. info [_required_] object Basic information about a service. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. display-name string A friendly name of the service. service-tier string Service tier. integrations object Third party integrations that Datadog supports. pagerduty string PagerDuty service URL for the service. org object Org related information about the service. application string App feature this service supports. team string Team that owns the service. schema-version [_required_] enum Schema version being used. Allowed enum values: `v1` default: `v1` tags [string] A set of custom tags. Option 2 object Service definition V2 for providing service metadata and integrations. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. 
type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. dd-team string Experimental feature. A Team handle that matches a Team in the Datadog Teams product. docs [object] A list of documentation related to the services. name [_required_] string Document name. provider string Document provider. url [_required_] string Document URL. extensions object Extensions to V2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty string PagerDuty service URL for the service. links [object] A list of links related to the services. name [_required_] string Link name. type [_required_] enum Link type. Allowed enum values: `doc,wiki,runbook,url,repo,dashboard,oncall,code,link` url [_required_] string Link URL. repos [object] A list of code repositories related to the services. name [_required_] string Repository name. provider string Repository provider. url [_required_] string Repository URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2` default: `v2` tags [string] A set of custom tags. team string Team that owns the service. Option 3 object Service definition v2.1 for providing service metadata and integrations. application string Identifier for a group of related services serving a product feature, which the service is a part of. contacts [ ] A list of contacts related to the services. Option 1 object Service owner's email. contact [_required_] string Contact value. name string Contact email. type [_required_] enum Contact type. Allowed enum values: `email` Option 2 object Service owner's Slack channel. contact [_required_] string Slack Channel. name string Contact Slack. type [_required_] enum Contact type. Allowed enum values: `slack` Option 3 object Service owner's Microsoft Teams. contact [_required_] string Contact value. name string Contact Microsoft Teams. type [_required_] enum Contact type. Allowed enum values: `microsoft-teams` dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.1 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] enum Link type. Allowed enum values: `doc,repo,runbook,dashboard,other` url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.1` default: `v2.1` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. Option 4 object Service definition v2.2 for providing service metadata and integrations. 
application string Identifier for a group of related services serving a product feature, which the service is a part of. ci-pipeline-fingerprints [string] A set of CI fingerprints. contacts [object] A list of contacts related to the services. contact [_required_] string Contact value. name string Contact Name. type [_required_] string Contact type. Datadog recognizes the following types: `email`, `slack`, and `microsoft-teams`. dd-service [_required_] string Unique identifier of the service. Must be unique across all services and is used to match with a service in Datadog. description string A short description of the service. extensions object Extensions to v2.2 schema. integrations object Third party integrations that Datadog supports. opsgenie object Opsgenie integration for the service. region enum Opsgenie instance region. Allowed enum values: `US,EU` service-url [_required_] string Opsgenie service url. pagerduty object PagerDuty integration for the service. service-url string PagerDuty service url. languages [string] The service's programming language. Datadog recognizes the following languages: `dotnet`, `go`, `java`, `js`, `php`, `python`, `ruby`, and `c++`. lifecycle string The current life cycle phase of the service. links [object] A list of links related to the services. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. Datadog recognizes the following types: `runbook`, `doc`, `repo`, `dashboard`, and `other`. url [_required_] string Link URL. schema-version [_required_] enum Schema version being used. Allowed enum values: `v2.2` default: `v2.2` tags [string] A set of custom tags. team string Team that owns the service. It is used to locate a team defined in Datadog Teams if it exists. tier string Importance of the service. type string The type of service. id string Service definition id. type string Service definition type. ``` { "data": { "attributes": { "meta": { "github-html-url": "string", "ingested-schema-version": "string", "ingestion-source": "string", "last-modified-time": "string", "origin": "string", "origin-detail": "string", "warnings": [ { "instance-location": "string", "keyword-location": "string", "message": "string" } ] }, "schema": { "contact": { "email": "contact@datadoghq.com", "slack": "https://yourcompany.slack.com/archives/channel123" }, "extensions": { "myorg/extension": "extensionValue" }, "external-resources": [ { "name": "Runbook", "type": "runbook", "url": "https://my-runbook" } ], "info": { "dd-service": "myservice", "description": "A shopping cart service", "display-name": "My Service", "service-tier": "Tier 1" }, "integrations": { "pagerduty": "https://my-org.pagerduty.com/service-directory/PMyService" }, "org": { "application": "E-Commerce", "team": "my-team" }, "schema-version": "v1", "tags": [ "my:tag", "service:tag" ] } }, "id": "string", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=typescript) ##### Get a single service definition Copy ``` # Path parameters export service_name="my-service" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/definitions/${service_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a single service definition ``` """ Get a single service definition returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi from datadog_api_client.v2.model.service_definition_schema_versions import ServiceDefinitionSchemaVersions configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) response = api_instance.get_service_definition( service_name="service-definition-test", schema_version=ServiceDefinitionSchemaVersions.V2_1, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a single service definition ``` # Get a single service definition returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new opts = { schema_version: ServiceDefinitionSchemaVersions::V2_1, } p api_instance.get_service_definition("service-definition-test", opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following 
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a single service definition ``` // Get a single service definition returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) resp, r, err := api.GetServiceDefinition(ctx, "service-definition-test", *datadogV2.NewGetServiceDefinitionOptionalParameters().WithSchemaVersion(datadogV2.SERVICEDEFINITIONSCHEMAVERSIONS_V2_1)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.GetServiceDefinition`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceDefinitionApi.GetServiceDefinition`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a single service definition ``` // Get a single service definition returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; import com.datadog.api.client.v2.api.ServiceDefinitionApi.GetServiceDefinitionOptionalParameters; import com.datadog.api.client.v2.model.ServiceDefinitionGetResponse; import com.datadog.api.client.v2.model.ServiceDefinitionSchemaVersions; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); try { ServiceDefinitionGetResponse result = apiInstance.getServiceDefinition( "service-definition-test", new GetServiceDefinitionOptionalParameters() .schemaVersion(ServiceDefinitionSchemaVersions.V2_1)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceDefinitionApi#getServiceDefinition"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a single service definition ``` // Get a single service definition returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::GetServiceDefinitionOptionalParams; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; use datadog_api_client::datadogV2::model::ServiceDefinitionSchemaVersions; 
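// No credentials are hard-coded here: the client picks up DD_API_KEY, DD_APP_KEY, and DD_SITE
// from the environment, as shown in the run instructions that follow this example.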
#[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api .get_service_definition( "service-definition-test".to_string(), GetServiceDefinitionOptionalParams::default() .schema_version(ServiceDefinitionSchemaVersions::V2_1), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a single service definition ``` /** * Get a single service definition returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: v2.ServiceDefinitionApiGetServiceDefinitionRequest = { serviceName: "service-definition-test", schemaVersion: "v2.1", }; apiInstance .getServiceDefinition(params) .then((data: v2.ServiceDefinitionGetResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a single service definition](https://docs.datadoghq.com/api/latest/service-definition/#delete-a-single-service-definition) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-definition/#delete-a-single-service-definition-v2) DELETE https://api.ap1.datadoghq.com/api/v2/services/definitions/{service_name}https://api.ap2.datadoghq.com/api/v2/services/definitions/{service_name}https://api.datadoghq.eu/api/v2/services/definitions/{service_name}https://api.ddog-gov.com/api/v2/services/definitions/{service_name}https://api.datadoghq.com/api/v2/services/definitions/{service_name}https://api.us3.datadoghq.com/api/v2/services/definitions/{service_name}https://api.us5.datadoghq.com/api/v2/services/definitions/{service_name} ### Overview Delete a single service definition in the Datadog Service Catalog. This endpoint requires the `apm_service_catalog_write` permission. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-definition) to access this endpoint. ### Arguments #### Path Parameters Name Type Description service_name [_required_] string The name of the service. 
### Response * [204](https://docs.datadoghq.com/api/latest/service-definition/#DeleteServiceDefinition-204-v2) * [400](https://docs.datadoghq.com/api/latest/service-definition/#DeleteServiceDefinition-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-definition/#DeleteServiceDefinition-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-definition/#DeleteServiceDefinition-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-definition/#DeleteServiceDefinition-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-definition/) * [Example](https://docs.datadoghq.com/api/latest/service-definition/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-definition/?code-lang=typescript) ##### Delete a single service definition Copy ``` # Path parameters export service_name="my-service" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/services/definitions/${service_name}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a single service definition ``` """ Delete a single service definition returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_definition_api import ServiceDefinitionApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceDefinitionApi(api_client) api_instance.delete_service_definition( service_name="service-definition-test", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python3 "example.py" ``` ##### Delete a single service definition ``` # Delete a single service definition returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::ServiceDefinitionAPI.new api_instance.delete_service_definition("service-definition-test") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a single service definition ``` // Delete a single service definition returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceDefinitionApi(apiClient) r, err := api.DeleteServiceDefinition(ctx, "service-definition-test") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceDefinitionApi.DeleteServiceDefinition`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a single service definition ``` // Delete a single service definition returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceDefinitionApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceDefinitionApi apiInstance = new ServiceDefinitionApi(defaultClient); try { apiInstance.deleteServiceDefinition("service-definition-test"); } catch (ApiException e) { System.err.println("Exception when calling ServiceDefinitionApi#deleteServiceDefinition"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a single service definition ``` // Delete a single service definition returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_definition::ServiceDefinitionAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceDefinitionAPI::with_config(configuration); let resp = api .delete_service_definition("service-definition-test".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a single service definition ``` /** * Delete a single service definition returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.ServiceDefinitionApi(configuration); const params: v2.ServiceDefinitionApiDeleteServiceDefinitionRequest = { serviceName: "service-definition-test", }; apiInstance .deleteServiceDefinition(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/service-dependencies/ # Service Dependencies APM Service Map API. For more information, visit the [Service Map page](https://docs.datadoghq.com/tracing/visualization/services_map/). ## [Get all APM service dependencies](https://docs.datadoghq.com/api/latest/service-dependencies/#get-all-apm-service-dependencies) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-dependencies/#get-all-apm-service-dependencies-v1) **Note: This endpoint is in public beta.
If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/).** GET https://api.ap1.datadoghq.com/api/v1/service_dependencieshttps://api.ap2.datadoghq.com/api/v1/service_dependencieshttps://api.datadoghq.eu/api/v1/service_dependencieshttps://api.ddog-gov.com/api/v1/service_dependencieshttps://api.datadoghq.com/api/v1/service_dependencieshttps://api.us3.datadoghq.com/api/v1/service_dependencieshttps://api.us5.datadoghq.com/api/v1/service_dependencies ### Overview Get a list of services from APM and their dependencies. The services retrieved are filtered by environment and a primary tag, if one is defined. ### Arguments #### Query Strings Name Type Description env [_required_] string Specify what APM environment to query service dependencies by. primary_tag string Specify what primary tag to query service dependencies by. start integer Specify the start of the timeframe in epoch seconds to query for. (defaults to 1 hour before end parameter) end integer Specify the end of the timeframe in epoch seconds to query for. (defaults to current time) ### Response * [200](https://docs.datadoghq.com/api/latest/service-dependencies/#ListServiceDependencies-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-dependencies/#ListServiceDependencies-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-dependencies/#ListServiceDependencies-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-dependencies/#ListServiceDependencies-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) An object containing a list of APM services and their dependencies. Field Type Description object An object containing an APM service's dependencies. calls [_required_] [string] A list of dependencies. ``` { "service_a": { "calls": [ "service_b", "service_c" ] }, "service_b": { "calls": [ "service_o" ] }, "service_c": { "calls": [ "service_o" ] }, "service_o": { "calls": [] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-dependencies/?code-lang=curl) ##### Get all APM service dependencies Copy ``` # Required query arguments export env="prod" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/service_dependencies?env=${env}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Get one APM service's dependencies](https://docs.datadoghq.com/api/latest/service-dependencies/#get-one-apm-services-dependencies) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-dependencies/#get-one-apm-services-dependencies-v1) **Note: This endpoint is in public beta. If you have any feedback, contact[Datadog support](https://docs.datadoghq.com/help/).** GET https://api.ap1.datadoghq.com/api/v1/service_dependencies/{service}https://api.ap2.datadoghq.com/api/v1/service_dependencies/{service}https://api.datadoghq.eu/api/v1/service_dependencies/{service}https://api.ddog-gov.com/api/v1/service_dependencies/{service}https://api.datadoghq.com/api/v1/service_dependencies/{service}https://api.us3.datadoghq.com/api/v1/service_dependencies/{service}https://api.us5.datadoghq.com/api/v1/service_dependencies/{service} ### Overview Get a specific service’s immediate upstream and downstream services. The services retrieved are filtered by environment and a primary tag, if one is defined. ### Arguments #### Path Parameters Name Type Description service [_required_] string The name of the service go get dependencies for. #### Query Strings Name Type Description env [_required_] string Specify what APM environment to query service dependencies by. primary_tag string Specify what primary tag to query service dependencies by. start integer Specify the start of the timeframe in epoch seconds to query for. (defaults to 1 hour before end parameter) end integer Specify the end of the timeframe in epoch seconds to query for. (defaults to current time) ### Response * [200](https://docs.datadoghq.com/api/latest/service-dependencies/#ListSingleServiceDependencies-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-dependencies/#ListSingleServiceDependencies-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-dependencies/#ListSingleServiceDependencies-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-dependencies/#ListSingleServiceDependencies-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) An object with information on APM services that call, and are called by a given service. Expand All Field Type Description called_by [string] List of service names that call the given service. calls [string] List of service names called by the given service. name string Name of the APM service being searched for. ``` { "called_by": [ "service-a", "service-b" ], "calls": [ "service-d", "service-e" ], "name": "service-c" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Authentication Error * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-dependencies/) * [Example](https://docs.datadoghq.com/api/latest/service-dependencies/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-dependencies/?code-lang=curl) ##### Get one APM service's dependencies Copy ``` # Path parameters export service="service-c" # Required query arguments export env="prod" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/service_dependencies/${service}?env=${env}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=449953a2-ded0-4cfd-8128-f6afaec2f2f4&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=1cfc1407-1ec0-4cb1-b7a6-766f3c572316&pt=Service%20Dependencies&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-dependencies%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=449953a2-ded0-4cfd-8128-f6afaec2f2f4&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=1cfc1407-1ec0-4cb1-b7a6-766f3c572316&pt=Service%20Dependencies&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-dependencies%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=4526f9cc-6e3b-4592-9a51-c2b58a472b99&bo=2&sid=8e9f03d0f0c011f0b53019aafced3101&vid=8e9f5e80f0c011f0a7890579f054930f&vids=1&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Service%20Dependencies&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-dependencies%2F&r=<=901&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=40741) --- # Source: https://docs.datadoghq.com/api/latest/service-level-objective-corrections # Service Level Objective Corrections SLO Status Corrections allow you to prevent specific time periods from negatively impacting your SLO’s status and error budget. You can use Status Corrections for various purposes, such as removing planned maintenance windows, non-business hours, or other time periods that do not correspond to genuine issues. See [SLO status corrections](https://docs.datadoghq.com/service_management/service_level_objectives/#slo-status-corrections) for more information. 
## [Create an SLO correction](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#create-an-slo-correction) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#create-an-slo-correction-v1) POST https://api.ap1.datadoghq.com/api/v1/slo/correctionhttps://api.ap2.datadoghq.com/api/v1/slo/correctionhttps://api.datadoghq.eu/api/v1/slo/correctionhttps://api.ddog-gov.com/api/v1/slo/correctionhttps://api.datadoghq.com/api/v1/slo/correctionhttps://api.us3.datadoghq.com/api/v1/slo/correctionhttps://api.us5.datadoghq.com/api/v1/slo/correction ### Overview Create an SLO Correction. This endpoint requires the `slos_corrections` permission. OAuth apps require the `slos_corrections` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objective-corrections) to access this endpoint. ### Request #### Body Data (required) Create an SLO Correction * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Field Type Description data object The data object associated with the SLO correction to be created. attributes object The attribute object associated with the SLO correction to be created. category [_required_] enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id [_required_] string ID of the SLO that this correction applies to. start [_required_] int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). type [_required_] enum SLO correction resource type. 
Allowed enum values: `correction` default: `correction` ##### Create an SLO correction returns "OK" response ``` { "data": { "attributes": { "category": "Scheduled Maintenance", "description": "Example-Service-Level-Objective-Correction", "end": 1636632671, "slo_id": "string", "start": 1636629071, "timezone": "UTC" }, "type": "correction" } } ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` { "data": { "attributes": { "category": "Scheduled Maintenance", "description": "Example-Service-Level-Objective-Correction", "slo_id": "string", "start": 1636629071, "duration": 3600, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "timezone": "UTC" }, "type": "correction" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#CreateSLOCorrection-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#CreateSLOCorrection-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#CreateSLOCorrection-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#CreateSLOCorrection-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#CreateSLOCorrection-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) The response object of an SLO correction. Field Type Description data object The response object of a list of SLO corrections. attributes object The attribute object associated with the SLO correction. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` created_at int64 The epoch timestamp of when the correction was created at. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. modified_at int64 The epoch timestamp of when the correction was modified at. modifier object Modifier of the object. email string Email of the Modifier. handle string Handle of the Modifier. name string Name of the Modifier. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id string ID of the SLO that this correction applies to. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). id string The ID of the SLO correction. type enum SLO correction resource type. 
Allowed enum values: `correction` default: `correction` ``` { "data": { "attributes": { "category": "Scheduled Maintenance", "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "duration": 3600, "end": "integer", "modified_at": "integer", "modifier": { "email": "string", "handle": "string", "name": "string" }, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "slo_id": "string", "start": "integer", "timezone": "string" }, "id": "string", "type": "correction" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy SLO Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=typescript) ##### Create an SLO correction returns "OK" response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X POST "https://api.datadoghq.com/api/v1/slo/correction" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "Scheduled Maintenance", "description": "Example-Service-Level-Objective-Correction", "end": 1636632671, "slo_id": "string", "start": 1636629071, "timezone": "UTC" }, "type": "correction" } } EOF ``` ##### Create an SLO correction with rrule returns "OK" response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X POST "https://api.datadoghq.com/api/v1/slo/correction" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "Scheduled Maintenance", "description": "Example-Service-Level-Objective-Correction", "slo_id": "string", "start": 1636629071, "duration": 3600, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "timezone": "UTC" }, "type": "correction" } } EOF ``` ##### Create an SLO correction returns "OK" response ``` // Create an SLO correction returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") body := datadogV1.SLOCorrectionCreateRequest{ Data: &datadogV1.SLOCorrectionCreateData{ Attributes: &datadogV1.SLOCorrectionCreateRequestAttributes{ Category: datadogV1.SLOCORRECTIONCATEGORY_SCHEDULED_MAINTENANCE, Description: datadog.PtrString("Example-Service-Level-Objective-Correction"), End: datadog.PtrInt64(time.Now().Add(time.Hour * 1).Unix()), SloId: SloData0ID, Start: time.Now().Unix(), Timezone: datadog.PtrString("UTC"), }, Type: datadogV1.SLOCORRECTIONTYPE_CORRECTION, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) resp, r, err := api.CreateSLOCorrection(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.CreateSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectiveCorrectionsApi.CreateSLOCorrection`:\n%s\n", responseContent) } ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` // Create an SLO correction with rrule returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") body := datadogV1.SLOCorrectionCreateRequest{ Data: &datadogV1.SLOCorrectionCreateData{ Attributes: &datadogV1.SLOCorrectionCreateRequestAttributes{ Category: datadogV1.SLOCORRECTIONCATEGORY_SCHEDULED_MAINTENANCE, Description: datadog.PtrString("Example-Service-Level-Objective-Correction"), SloId: SloData0ID, Start: time.Now().Unix(), Duration: datadog.PtrInt64(3600), Rrule: datadog.PtrString("FREQ=DAILY;INTERVAL=10;COUNT=5"), Timezone: datadog.PtrString("UTC"), }, Type: datadogV1.SLOCORRECTIONTYPE_CORRECTION, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) resp, r, err := api.CreateSLOCorrection(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.CreateSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectiveCorrectionsApi.CreateSLOCorrection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create an SLO correction returns "OK" response ``` // Create an SLO correction returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; import com.datadog.api.client.v1.model.SLOCorrectionCategory; import com.datadog.api.client.v1.model.SLOCorrectionCreateData; import com.datadog.api.client.v1.model.SLOCorrectionCreateRequest; import com.datadog.api.client.v1.model.SLOCorrectionCreateRequestAttributes; import com.datadog.api.client.v1.model.SLOCorrectionResponse; import com.datadog.api.client.v1.model.SLOCorrectionType; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); SLOCorrectionCreateRequest body = new SLOCorrectionCreateRequest() .data( new SLOCorrectionCreateData() .attributes( new SLOCorrectionCreateRequestAttributes() .category(SLOCorrectionCategory.SCHEDULED_MAINTENANCE) .description("Example-Service-Level-Objective-Correction") .end(OffsetDateTime.now().plusHours(1).toInstant().getEpochSecond()) .sloId(SLO_DATA_0_ID) 
.start(OffsetDateTime.now().toInstant().getEpochSecond()) .timezone("UTC")) .type(SLOCorrectionType.CORRECTION)); try { SLOCorrectionResponse result = apiInstance.createSLOCorrection(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#createSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` // Create an SLO correction with rrule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; import com.datadog.api.client.v1.model.SLOCorrectionCategory; import com.datadog.api.client.v1.model.SLOCorrectionCreateData; import com.datadog.api.client.v1.model.SLOCorrectionCreateRequest; import com.datadog.api.client.v1.model.SLOCorrectionCreateRequestAttributes; import com.datadog.api.client.v1.model.SLOCorrectionResponse; import com.datadog.api.client.v1.model.SLOCorrectionType; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); SLOCorrectionCreateRequest body = new SLOCorrectionCreateRequest() .data( new SLOCorrectionCreateData() .attributes( new SLOCorrectionCreateRequestAttributes() .category(SLOCorrectionCategory.SCHEDULED_MAINTENANCE) .description("Example-Service-Level-Objective-Correction") .sloId(SLO_DATA_0_ID) .start(OffsetDateTime.now().toInstant().getEpochSecond()) .duration(3600L) .rrule("FREQ=DAILY;INTERVAL=10;COUNT=5") .timezone("UTC")) .type(SLOCorrectionType.CORRECTION)); try { SLOCorrectionResponse result = apiInstance.createSLOCorrection(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#createSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an SLO correction returns "OK" response ``` """ Create an SLO correction returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi from datadog_api_client.v1.model.slo_correction_category import SLOCorrectionCategory from datadog_api_client.v1.model.slo_correction_create_data import SLOCorrectionCreateData from datadog_api_client.v1.model.slo_correction_create_request import SLOCorrectionCreateRequest from 
datadog_api_client.v1.model.slo_correction_create_request_attributes import SLOCorrectionCreateRequestAttributes from datadog_api_client.v1.model.slo_correction_type import SLOCorrectionType # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] body = SLOCorrectionCreateRequest( data=SLOCorrectionCreateData( attributes=SLOCorrectionCreateRequestAttributes( category=SLOCorrectionCategory.SCHEDULED_MAINTENANCE, description="Example-Service-Level-Objective-Correction", end=int((datetime.now() + relativedelta(hours=1)).timestamp()), slo_id=SLO_DATA_0_ID, start=int(datetime.now().timestamp()), timezone="UTC", ), type=SLOCorrectionType.CORRECTION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) response = api_instance.create_slo_correction(body=body) print(response) ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` """ Create an SLO correction with rrule returns "OK" response """ from datetime import datetime from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi from datadog_api_client.v1.model.slo_correction_category import SLOCorrectionCategory from datadog_api_client.v1.model.slo_correction_create_data import SLOCorrectionCreateData from datadog_api_client.v1.model.slo_correction_create_request import SLOCorrectionCreateRequest from datadog_api_client.v1.model.slo_correction_create_request_attributes import SLOCorrectionCreateRequestAttributes from datadog_api_client.v1.model.slo_correction_type import SLOCorrectionType # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] body = SLOCorrectionCreateRequest( data=SLOCorrectionCreateData( attributes=SLOCorrectionCreateRequestAttributes( category=SLOCorrectionCategory.SCHEDULED_MAINTENANCE, description="Example-Service-Level-Objective-Correction", slo_id=SLO_DATA_0_ID, start=int(datetime.now().timestamp()), duration=3600, rrule="FREQ=DAILY;INTERVAL=10;COUNT=5", timezone="UTC", ), type=SLOCorrectionType.CORRECTION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) response = api_instance.create_slo_correction(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an SLO correction returns "OK" response ``` # Create an SLO correction returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] body = DatadogAPIClient::V1::SLOCorrectionCreateRequest.new({ data: DatadogAPIClient::V1::SLOCorrectionCreateData.new({ attributes: DatadogAPIClient::V1::SLOCorrectionCreateRequestAttributes.new({ category: DatadogAPIClient::V1::SLOCorrectionCategory::SCHEDULED_MAINTENANCE, description: "Example-Service-Level-Objective-Correction", _end: (Time.now + 1 * 3600).to_i, slo_id: SLO_DATA_0_ID, start: Time.now.to_i, timezone: "UTC", }), type: 
DatadogAPIClient::V1::SLOCorrectionType::CORRECTION, }), }) p api_instance.create_slo_correction(body) ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` # Create an SLO correction with rrule returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] body = DatadogAPIClient::V1::SLOCorrectionCreateRequest.new({ data: DatadogAPIClient::V1::SLOCorrectionCreateData.new({ attributes: DatadogAPIClient::V1::SLOCorrectionCreateRequestAttributes.new({ category: DatadogAPIClient::V1::SLOCorrectionCategory::SCHEDULED_MAINTENANCE, description: "Example-Service-Level-Objective-Correction", slo_id: SLO_DATA_0_ID, start: Time.now.to_i, duration: 3600, rrule: "FREQ=DAILY;INTERVAL=10;COUNT=5", timezone: "UTC", }), type: DatadogAPIClient::V1::SLOCorrectionType::CORRECTION, }), }) p api_instance.create_slo_correction(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create an SLO correction returns "OK" response ``` // Create an SLO correction returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; use datadog_api_client::datadogV1::model::SLOCorrectionCategory; use datadog_api_client::datadogV1::model::SLOCorrectionCreateData; use datadog_api_client::datadogV1::model::SLOCorrectionCreateRequest; use datadog_api_client::datadogV1::model::SLOCorrectionCreateRequestAttributes; use datadog_api_client::datadogV1::model::SLOCorrectionType; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let body = SLOCorrectionCreateRequest::new().data( SLOCorrectionCreateData::new(SLOCorrectionType::CORRECTION).attributes( SLOCorrectionCreateRequestAttributes::new( SLOCorrectionCategory::SCHEDULED_MAINTENANCE, slo_data_0_id.clone(), 1636629071, ) .description("Example-Service-Level-Objective-Correction".to_string()) .end(1636632671) .timezone("UTC".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api.create_slo_correction(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` // Create an SLO correction with rrule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; use datadog_api_client::datadogV1::model::SLOCorrectionCategory; use datadog_api_client::datadogV1::model::SLOCorrectionCreateData; use datadog_api_client::datadogV1::model::SLOCorrectionCreateRequest; use datadog_api_client::datadogV1::model::SLOCorrectionCreateRequestAttributes; use datadog_api_client::datadogV1::model::SLOCorrectionType; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let body = SLOCorrectionCreateRequest::new().data( 
SLOCorrectionCreateData::new(SLOCorrectionType::CORRECTION).attributes( SLOCorrectionCreateRequestAttributes::new( SLOCorrectionCategory::SCHEDULED_MAINTENANCE, slo_data_0_id.clone(), 1636629071, ) .description("Example-Service-Level-Objective-Correction".to_string()) .duration(3600) .rrule("FREQ=DAILY;INTERVAL=10;COUNT=5".to_string()) .timezone("UTC".to_string()), ), ); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api.create_slo_correction(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create an SLO correction returns "OK" response ``` /** * Create an SLO correction returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectiveCorrectionsApiCreateSLOCorrectionRequest = { body: { data: { attributes: { category: "Scheduled Maintenance", description: "Example-Service-Level-Objective-Correction", end: Math.round( new Date(new Date().getTime() + 1 * 3600 * 1000).getTime() / 1000 ), sloId: SLO_DATA_0_ID, start: Math.round(new Date().getTime() / 1000), timezone: "UTC", }, type: "correction", }, }, }; apiInstance .createSLOCorrection(params) .then((data: v1.SLOCorrectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an SLO correction with rrule returns "OK" response ``` /** * Create an SLO correction with rrule returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectiveCorrectionsApiCreateSLOCorrectionRequest = { body: { data: { attributes: { category: "Scheduled Maintenance", description: "Example-Service-Level-Objective-Correction", sloId: SLO_DATA_0_ID, start: Math.round(new Date().getTime() / 1000), duration: 3600, rrule: "FREQ=DAILY;INTERVAL=10;COUNT=5", timezone: "UTC", }, type: "correction", }, }, }; apiInstance .createSLOCorrection(params) .then((data: v1.SLOCorrectionResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all SLO corrections](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#get-all-slo-corrections) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#get-all-slo-corrections-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/correctionhttps://api.ap2.datadoghq.com/api/v1/slo/correctionhttps://api.datadoghq.eu/api/v1/slo/correctionhttps://api.ddog-gov.com/api/v1/slo/correctionhttps://api.datadoghq.com/api/v1/slo/correctionhttps://api.us3.datadoghq.com/api/v1/slo/correctionhttps://api.us5.datadoghq.com/api/v1/slo/correction ### Overview Get all Service Level Objective corrections. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objective-corrections) to access this endpoint. ### Arguments #### Query Strings Name Type Description offset integer The specific offset to use as the beginning of the returned response. limit integer The number of SLO corrections to return in the response. Default is 25. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#ListSLOCorrection-200-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#ListSLOCorrection-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#ListSLOCorrection-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) A list of SLO correction objects. Field Type Description data [object] The list of SLO corrections objects. attributes object The attribute object associated with the SLO correction. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` created_at int64 The epoch timestamp of when the correction was created at. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. modified_at int64 The epoch timestamp of when the correction was modified at. modifier object Modifier of the object. email string Email of the Modifier. handle string Handle of the Modifier. name string Name of the Modifier. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id string ID of the SLO that this correction applies to. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). id string The ID of the SLO correction. 
type enum SLO correction resource type. Allowed enum values: `correction` default: `correction` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "category": "Scheduled Maintenance", "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "duration": 3600, "end": "integer", "modified_at": "integer", "modifier": { "email": "string", "handle": "string", "name": "string" }, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "slo_id": "string", "start": "integer", "timezone": "string" }, "id": "string", "type": "correction" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=typescript) ##### Get all SLO corrections ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X GET "https://api.datadoghq.com/api/v1/slo/correction" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all SLO corrections ``` """ Get all SLO corrections returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) response = api_instance.list_slo_correction( offset=1, limit=1, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
``` ##### Get all SLO corrections ``` # Get all SLO corrections returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new opts = { offset: 1, limit: 1, } p api_instance.list_slo_correction(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all SLO corrections ``` // Get all SLO corrections returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) resp, r, err := api.ListSLOCorrection(ctx, *datadogV1.NewListSLOCorrectionOptionalParameters().WithOffset(1).WithLimit(1)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.ListSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectiveCorrectionsApi.ListSLOCorrection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all SLO corrections ``` // Get all SLO corrections returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi.ListSLOCorrectionOptionalParameters; import com.datadog.api.client.v1.model.SLOCorrectionListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); try { SLOCorrectionListResponse result = apiInstance.listSLOCorrection( new ListSLOCorrectionOptionalParameters().offset(1L).limit(1L)); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#listSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all SLO corrections ``` // Get all SLO corrections 
returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ListSLOCorrectionOptionalParams; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api .list_slo_correction( ListSLOCorrectionOptionalParams::default() .offset(1) .limit(1), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all SLO corrections ``` /** * Get all SLO corrections returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); const params: v1.ServiceLevelObjectiveCorrectionsApiListSLOCorrectionRequest = { offset: 1, limit: 1, }; apiInstance .listSLOCorrection(params) .then((data: v1.SLOCorrectionListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an SLO correction for an SLO](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#get-an-slo-correction-for-an-slo) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#get-an-slo-correction-for-an-slo-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.ap2.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.eu/api/v1/slo/correction/{slo_correction_id}https://api.ddog-gov.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us3.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us5.datadoghq.com/api/v1/slo/correction/{slo_correction_id} ### Overview Get an SLO correction. ### Arguments #### Path Parameters Name Type Description slo_correction_id [_required_] string The ID of the SLO correction object. 
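The stock examples that follow simply print the raw response. Because a correction can be either a one-time window (`start`/`end`) or a recurring schedule (`rrule` plus `duration`), it can help to branch on those attributes after fetching. The sketch below is a minimal illustration using the Python client shown elsewhere on this page; the `CORRECTION_DATA_ID` environment variable and the printed messages are assumptions for the example, not part of the API.

```
"""
Minimal sketch (not the official example): fetch one SLO correction and report
whether it is a one-time window or a recurring schedule.
Assumes the datadog-api-client Python package and a valid correction ID in
the CORRECTION_DATA_ID environment variable.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi

CORRECTION_DATA_ID = environ["CORRECTION_DATA_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ServiceLevelObjectiveCorrectionsApi(api_client)
    response = api_instance.get_slo_correction(slo_correction_id=CORRECTION_DATA_ID)

attrs = response.data.attributes
# rrule and duration are only set for recurring corrections; end is only set for one-time ones.
rrule = getattr(attrs, "rrule", None)
if rrule:
    print(f"Recurring correction on SLO {attrs.slo_id}: {rrule}, "
          f"{getattr(attrs, 'duration', None)} seconds per occurrence")
else:
    print(f"One-time correction on SLO {attrs.slo_id}: "
          f"{attrs.start} to {getattr(attrs, 'end', None)} (epoch seconds)")
```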
### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#GetSLOCorrection-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#GetSLOCorrection-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#GetSLOCorrection-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#GetSLOCorrection-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) The response object of an SLO correction. Field Type Description data object The response object of a list of SLO corrections. attributes object The attribute object associated with the SLO correction. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` created_at int64 The epoch timestamp of when the correction was created at. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. modified_at int64 The epoch timestamp of when the correction was modified at. modifier object Modifier of the object. email string Email of the Modifier. handle string Handle of the Modifier. name string Name of the Modifier. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id string ID of the SLO that this correction applies to. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). id string The ID of the SLO correction. type enum SLO correction resource type. Allowed enum values: `correction` default: `correction` ``` { "data": { "attributes": { "category": "Scheduled Maintenance", "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "duration": 3600, "end": "integer", "modified_at": "integer", "modifier": { "email": "string", "handle": "string", "name": "string" }, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "slo_id": "string", "start": "integer", "timezone": "string" }, "id": "string", "type": "correction" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=typescript) ##### Get an SLO correction for an SLO ``` # Path parameters export slo_correction_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X GET "https://api.datadoghq.com/api/v1/slo/correction/${slo_correction_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an SLO correction for an SLO ``` """ Get an SLO correction for an SLO returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi # there is a valid "correction" for "slo" CORRECTION_DATA_ID = environ["CORRECTION_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) response = api_instance.get_slo_correction( slo_correction_id=CORRECTION_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an SLO correction for an SLO ``` # Get an SLO correction for an SLO returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new # there is a valid "correction" for "slo" CORRECTION_DATA_ID = ENV["CORRECTION_DATA_ID"] p api_instance.get_slo_correction(CORRECTION_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site (for example `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get an SLO correction for an SLO ``` // Get an SLO correction for an SLO returns "OK" response package main import ( "context" "encoding/json" "fmt" "os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "correction" for "slo" CorrectionDataID := os.Getenv("CORRECTION_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) resp, r, err := api.GetSLOCorrection(ctx, CorrectionDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.GetSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectiveCorrectionsApi.GetSLOCorrection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an SLO correction for an SLO ``` // Get an SLO correction for an SLO returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; import com.datadog.api.client.v1.model.SLOCorrectionResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); // there is a valid "correction" for "slo" String CORRECTION_DATA_ID = System.getenv("CORRECTION_DATA_ID"); try { SLOCorrectionResponse result = apiInstance.getSLOCorrection(CORRECTION_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#getSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an SLO correction for an SLO ``` // Get an SLO correction for an SLO returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; #[tokio::main] async fn main() { // there is a valid "correction" for "slo" let correction_data_id = std::env::var("CORRECTION_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api.get_slo_correction(correction_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example 
to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an SLO correction for an SLO ``` /** * Get an SLO correction for an SLO returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); // there is a valid "correction" for "slo" const CORRECTION_DATA_ID = process.env.CORRECTION_DATA_ID as string; const params: v1.ServiceLevelObjectiveCorrectionsApiGetSLOCorrectionRequest = { sloCorrectionId: CORRECTION_DATA_ID, }; apiInstance .getSLOCorrection(params) .then((data: v1.SLOCorrectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an SLO correction](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#update-an-slo-correction) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#update-an-slo-correction-v1) PATCH https://api.ap1.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.ap2.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.eu/api/v1/slo/correction/{slo_correction_id}https://api.ddog-gov.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us3.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us5.datadoghq.com/api/v1/slo/correction/{slo_correction_id} ### Overview Update the specified SLO correction object. ### Arguments #### Path Parameters Name Type Description slo_correction_id [_required_] string The ID of the SLO correction object. ### Request #### Body Data (required) The edited SLO correction object. * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Field Type Description data object The data object associated with the SLO correction to be updated. attributes object The attribute object associated with the SLO correction to be updated. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). type enum SLO correction resource type. 
Allowed enum values: `correction` default: `correction` ``` { "data": { "attributes": { "category": "Deployment", "description": "Example-Service-Level-Objective-Correction", "end": 1636632671, "start": 1636629071, "timezone": "UTC" }, "type": "correction" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#UpdateSLOCorrection-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#UpdateSLOCorrection-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#UpdateSLOCorrection-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#UpdateSLOCorrection-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#UpdateSLOCorrection-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) The response object of an SLO correction. Field Type Description data object The response object of a list of SLO corrections. attributes object The attribute object associated with the SLO correction. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` created_at int64 The epoch timestamp of when the correction was created at. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. modified_at int64 The epoch timestamp of when the correction was modified at. modifier object Modifier of the object. email string Email of the Modifier. handle string Handle of the Modifier. name string Name of the Modifier. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id string ID of the SLO that this correction applies to. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). id string The ID of the SLO correction. type enum SLO correction resource type. Allowed enum values: `correction` default: `correction` ``` { "data": { "attributes": { "category": "Scheduled Maintenance", "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "duration": 3600, "end": "integer", "modified_at": "integer", "modifier": { "email": "string", "handle": "string", "name": "string" }, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "slo_id": "string", "start": "integer", "timezone": "string" }, "id": "string", "type": "correction" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=typescript) ##### Update an SLO correction returns "OK" response ``` # Path parameters export slo_correction_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site if needed) curl -X PATCH "https://api.datadoghq.com/api/v1/slo/correction/${slo_correction_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "category": "Deployment", "description": "Example-Service-Level-Objective-Correction", "end": 1636632671, "start": 1636629071, "timezone": "UTC" }, "type": "correction" } } EOF ``` ##### Update an SLO correction returns "OK" response ``` // Update an SLO correction returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "correction" for "slo" CorrectionDataID := os.Getenv("CORRECTION_DATA_ID") body := datadogV1.SLOCorrectionUpdateRequest{ Data: &datadogV1.SLOCorrectionUpdateData{ Attributes: &datadogV1.SLOCorrectionUpdateRequestAttributes{ Category: datadogV1.SLOCORRECTIONCATEGORY_DEPLOYMENT.Ptr(), Description: datadog.PtrString("Example-Service-Level-Objective-Correction"), End: datadog.PtrInt64(time.Now().Add(time.Hour * 1).Unix()), Start: datadog.PtrInt64(time.Now().Unix()), Timezone: datadog.PtrString("UTC"), }, Type: datadogV1.SLOCORRECTIONTYPE_CORRECTION.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration :=
datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) resp, r, err := api.UpdateSLOCorrection(ctx, CorrectionDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.UpdateSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectiveCorrectionsApi.UpdateSLOCorrection`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an SLO correction returns "OK" response ``` // Update an SLO correction returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; import com.datadog.api.client.v1.model.SLOCorrectionCategory; import com.datadog.api.client.v1.model.SLOCorrectionResponse; import com.datadog.api.client.v1.model.SLOCorrectionType; import com.datadog.api.client.v1.model.SLOCorrectionUpdateData; import com.datadog.api.client.v1.model.SLOCorrectionUpdateRequest; import com.datadog.api.client.v1.model.SLOCorrectionUpdateRequestAttributes; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); // there is a valid "correction" for "slo" String CORRECTION_DATA_ID = System.getenv("CORRECTION_DATA_ID"); SLOCorrectionUpdateRequest body = new SLOCorrectionUpdateRequest() .data( new SLOCorrectionUpdateData() .attributes( new SLOCorrectionUpdateRequestAttributes() .category(SLOCorrectionCategory.DEPLOYMENT) .description("Example-Service-Level-Objective-Correction") .end(OffsetDateTime.now().plusHours(1).toInstant().getEpochSecond()) .start(OffsetDateTime.now().toInstant().getEpochSecond()) .timezone("UTC")) .type(SLOCorrectionType.CORRECTION)); try { SLOCorrectionResponse result = apiInstance.updateSLOCorrection(CORRECTION_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#updateSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an SLO correction returns "OK" response ``` """ Update an SLO correction returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from os import environ from datadog_api_client import ApiClient, Configuration from 
datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi from datadog_api_client.v1.model.slo_correction_category import SLOCorrectionCategory from datadog_api_client.v1.model.slo_correction_type import SLOCorrectionType from datadog_api_client.v1.model.slo_correction_update_data import SLOCorrectionUpdateData from datadog_api_client.v1.model.slo_correction_update_request import SLOCorrectionUpdateRequest from datadog_api_client.v1.model.slo_correction_update_request_attributes import SLOCorrectionUpdateRequestAttributes # there is a valid "correction" for "slo" CORRECTION_DATA_ID = environ["CORRECTION_DATA_ID"] body = SLOCorrectionUpdateRequest( data=SLOCorrectionUpdateData( attributes=SLOCorrectionUpdateRequestAttributes( category=SLOCorrectionCategory.DEPLOYMENT, description="Example-Service-Level-Objective-Correction", end=int((datetime.now() + relativedelta(hours=1)).timestamp()), start=int(datetime.now().timestamp()), timezone="UTC", ), type=SLOCorrectionType.CORRECTION, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) response = api_instance.update_slo_correction(slo_correction_id=CORRECTION_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an SLO correction returns "OK" response ``` # Update an SLO correction returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new # there is a valid "correction" for "slo" CORRECTION_DATA_ID = ENV["CORRECTION_DATA_ID"] body = DatadogAPIClient::V1::SLOCorrectionUpdateRequest.new({ data: DatadogAPIClient::V1::SLOCorrectionUpdateData.new({ attributes: DatadogAPIClient::V1::SLOCorrectionUpdateRequestAttributes.new({ category: DatadogAPIClient::V1::SLOCorrectionCategory::DEPLOYMENT, description: "Example-Service-Level-Objective-Correction", _end: (Time.now + 1 * 3600).to_i, start: Time.now.to_i, timezone: "UTC", }), type: DatadogAPIClient::V1::SLOCorrectionType::CORRECTION, }), }) p api_instance.update_slo_correction(CORRECTION_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an SLO correction returns "OK" response ``` // Update an SLO correction returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; use datadog_api_client::datadogV1::model::SLOCorrectionCategory; use datadog_api_client::datadogV1::model::SLOCorrectionType; use datadog_api_client::datadogV1::model::SLOCorrectionUpdateData; use datadog_api_client::datadogV1::model::SLOCorrectionUpdateRequest; use datadog_api_client::datadogV1::model::SLOCorrectionUpdateRequestAttributes; #[tokio::main] async fn main() { // there is a valid "correction" for "slo" let 
correction_data_id = std::env::var("CORRECTION_DATA_ID").unwrap(); let body = SLOCorrectionUpdateRequest::new().data( SLOCorrectionUpdateData::new() .attributes( SLOCorrectionUpdateRequestAttributes::new() .category(SLOCorrectionCategory::DEPLOYMENT) .description("Example-Service-Level-Objective-Correction".to_string()) .end(1636632671) .start(1636629071) .timezone("UTC".to_string()), ) .type_(SLOCorrectionType::CORRECTION), ); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api .update_slo_correction(correction_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an SLO correction returns "OK" response ``` /** * Update an SLO correction returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); // there is a valid "correction" for "slo" const CORRECTION_DATA_ID = process.env.CORRECTION_DATA_ID as string; const params: v1.ServiceLevelObjectiveCorrectionsApiUpdateSLOCorrectionRequest = { body: { data: { attributes: { category: "Deployment", description: "Example-Service-Level-Objective-Correction", end: Math.round( new Date(new Date().getTime() + 1 * 3600 * 1000).getTime() / 1000 ), start: Math.round(new Date().getTime() / 1000), timezone: "UTC", }, type: "correction", }, }, sloCorrectionId: CORRECTION_DATA_ID, }; apiInstance .updateSLOCorrection(params) .then((data: v1.SLOCorrectionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an SLO correction](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#delete-an-slo-correction) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#delete-an-slo-correction-v1) DELETE https://api.ap1.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.ap2.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.eu/api/v1/slo/correction/{slo_correction_id}https://api.ddog-gov.com/api/v1/slo/correction/{slo_correction_id}https://api.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us3.datadoghq.com/api/v1/slo/correction/{slo_correction_id}https://api.us5.datadoghq.com/api/v1/slo/correction/{slo_correction_id} ### Overview Permanently delete the specified SLO correction object. ### Arguments #### Path Parameters Name Type Description slo_correction_id [_required_] string The ID of the SLO correction object. 
### Response * [204](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#DeleteSLOCorrection-204-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#DeleteSLOCorrection-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#DeleteSLOCorrection-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/#DeleteSLOCorrection-429-v1) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Not found * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objective-corrections/?code-lang=typescript) ##### Delete an SLO correction ``` # Path parameters export slo_correction_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v1/slo/correction/${slo_correction_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an SLO correction ``` """ Delete an SLO correction returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objective_corrections_api import ServiceLevelObjectiveCorrectionsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectiveCorrectionsApi(api_client) api_instance.delete_slo_correction( slo_correction_id="slo_correction_id", ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ```
##### Delete an SLO correction ``` # Delete an SLO correction returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectiveCorrectionsAPI.new api_instance.delete_slo_correction("slo_correction_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an SLO correction ``` // Delete an SLO correction returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectiveCorrectionsApi(apiClient) r, err := api.DeleteSLOCorrection(ctx, "slo_correction_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectiveCorrectionsApi.DeleteSLOCorrection`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an SLO correction ``` // Delete an SLO correction returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectiveCorrectionsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectiveCorrectionsApi apiInstance = new ServiceLevelObjectiveCorrectionsApi(defaultClient); try { apiInstance.deleteSLOCorrection("slo_correction_id"); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectiveCorrectionsApi#deleteSLOCorrection"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an SLO correction ``` // Delete an SLO correction returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objective_corrections::ServiceLevelObjectiveCorrectionsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectiveCorrectionsAPI::with_config(configuration); let resp = api .delete_slo_correction("slo_correction_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First 
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an SLO correction ``` /** * Delete an SLO correction returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectiveCorrectionsApi(configuration); const params: v1.ServiceLevelObjectiveCorrectionsApiDeleteSLOCorrectionRequest = { sloCorrectionId: "slo_correction_id", }; apiInstance .deleteSLOCorrection(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/service-level-objectives # Service Level Objectives [Service Level Objectives](https://docs.datadoghq.com/monitors/service_level_objectives/#configuration) (or SLOs) are a key part of the site reliability engineering toolkit. SLOs provide a framework for defining clear targets around application performance, which ultimately help teams provide a consistent customer experience, balance feature development with platform stability, and improve communication with internal and external users.
## [Create an SLO object](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-an-slo-object) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-an-slo-object-v1) POST https://api.ap1.datadoghq.com/api/v1/slohttps://api.ap2.datadoghq.com/api/v1/slohttps://api.datadoghq.eu/api/v1/slohttps://api.ddog-gov.com/api/v1/slohttps://api.datadoghq.com/api/v1/slohttps://api.us3.datadoghq.com/api/v1/slohttps://api.us5.datadoghq.com/api/v1/slo ### Overview Create a service level objective object. This endpoint requires the `slos_write` permission. OAuth apps require the `slos_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Request #### Body Data (required) Service level objective request object. * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Field Type Description description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. Optional in create/update requests for monitor service level objectives, but may only be used when then length of the `monitor_ids` field is one. monitor_ids [integer] A list of monitor IDs that defines the scope of a monitor service level objective. **Required if type is`monitor`**. name [_required_] string The name of the service level objective object. query object A metric-based SLO. **Required if type is`metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. 
semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes). Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [_required_] [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type [_required_] enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. 
##### Create a time-slice SLO object returns "OK" response ``` { "type": "time_slice", "description": "string", "name": "Example-Service-Level-Objective", "sli_specification": { "time_slice": { "query": { "formulas": [ { "formula": "query1" } ], "queries": [ { "data_source": "metrics", "name": "query1", "query": "trace.servlet.request{env:prod}" } ] }, "comparator": ">", "threshold": 5 } }, "tags": [ "env:prod" ], "thresholds": [ { "target": 97.0, "target_display": "97.0", "timeframe": "7d", "warning": 98, "warning_display": "98.0" } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98 } ``` Copy ##### Create an SLO object returns "OK" response ``` { "type": "metric", "description": "string", "groups": [ "env:test", "role:mysql" ], "monitor_ids": [], "name": "Example-Service-Level-Objective", "query": { "denominator": "sum:httpservice.hits{!code:3xx}.as_count()", "numerator": "sum:httpservice.hits{code:2xx}.as_count()" }, "tags": [ "env:prod", "app:core" ], "thresholds": [ { "target": 97.0, "target_display": "97.0", "timeframe": "7d", "warning": 98, "warning_display": "98.0" } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98 } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLO-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLO-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLO-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A response with one or more service level objective. Field Type Description data [object] An array of service level objective objects. created_at int64 Creation timestamp (UNIX time in seconds) Always included in service level objective responses. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. Optional in create/update requests for monitor service level objectives, but may only be used when then length of the `monitor_ids` field is one. id string A unique identifier for the service level objective object. Always included in service level objective responses. modified_at int64 Modification timestamp (UNIX time in seconds) Always included in service level objective responses. monitor_ids [integer] A list of monitor ids that defines the scope of a monitor service level objective. **Required if type is`monitor`**. monitor_tags [string] The union of monitor tags for all monitors referenced by the `monitor_ids` field. Always included in service level objective responses for monitor-based service level objectives (but may be empty). Ignored in create/update requests. Does not affect which monitors are included in the service level objective (that is determined entirely by the `monitor_ids` field). name [_required_] string The name of the service level objective object. query object A metric-based SLO. 
**Required if type is`metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes). Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [_required_] [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. 
warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type [_required_] enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. errors [string] An array of error messages. Each endpoint documents how/whether this field is used. metadata object The metadata object containing additional information about the list of SLOs. page object The object containing information about the pages of the list of SLOs. total_count int64 The total number of resources that could be retrieved ignoring the parameters and filters in the request. total_filtered_count int64 The total number of resources that match the parameters and filters in the request. This attribute can be used by a client to determine the total number of pages. ``` { "data": [ { "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "groups": [ "env:prod", "role:mysql" ], "id": "string", "modified_at": "integer", "monitor_ids": [], "monitor_tags": [], "name": "Custom Metric SLO", "query": { "denominator": "sum:my.custom.metric{*}.as_count()", "numerator": "sum:my.custom.metric{type:good}.as_count()" }, "sli_specification": { "time_slice": { "comparator": ">", "query": { "formulas": [ { "formula": "query1 - default_zero(query2)" } ], "queries": [ [] ] }, "query_interval_seconds": 300, "threshold": 5 } }, "tags": [ "env:prod", "app:core" ], "target_threshold": 99.9, "thresholds": [ { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } ], "timeframe": "30d", "type": "metric", "warning_threshold": 99.95 } ], "errors": [], "metadata": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) ##### Create a time-slice SLO object returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "type": "time_slice", "description": "string", "name": "Example-Service-Level-Objective", "sli_specification": { "time_slice": { "query": { "formulas": [ { "formula": "query1" } ], "queries": [ { "data_source": "metrics", "name": "query1", "query": "trace.servlet.request{env:prod}" } ] }, "comparator": ">", "threshold": 5 } }, "tags": [ "env:prod" ], "thresholds": [ { "target": 97.0, "target_display": "97.0", "timeframe": "7d", "warning": 98, "warning_display": "98.0" } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98 } EOF ``` ##### Create an SLO object returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "type": "metric", "description": "string", "groups": [ "env:test", "role:mysql" ], "monitor_ids": [], "name": "Example-Service-Level-Objective", "query": { "denominator": "sum:httpservice.hits{!code:3xx}.as_count()", "numerator": "sum:httpservice.hits{code:2xx}.as_count()" }, "tags": [ "env:prod", "app:core" ], "thresholds": [ { "target": 97.0, "target_display": "97.0", "timeframe": "7d", "warning": 98, "warning_display": "98.0" } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98 } EOF ``` ##### Create a time-slice SLO object returns "OK" response ``` // Create a time-slice SLO object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ServiceLevelObjectiveRequest{ Type: datadogV1.SLOTYPE_TIME_SLICE, Description: *datadog.NewNullableString(datadog.PtrString("string")), Name: "Example-Service-Level-Objective", SliSpecification: &datadogV1.SLOSliSpec{ SLOTimeSliceSpec: &datadogV1.SLOTimeSliceSpec{ TimeSlice: datadogV1.SLOTimeSliceCondition{ Query: 
datadogV1.SLOTimeSliceQuery{ Formulas: []datadogV1.SLOFormula{ { Formula: "query1", }, }, Queries: []datadogV1.SLODataSourceQueryDefinition{ datadogV1.SLODataSourceQueryDefinition{ FormulaAndFunctionMetricQueryDefinition: &datadogV1.FormulaAndFunctionMetricQueryDefinition{ DataSource: datadogV1.FORMULAANDFUNCTIONMETRICDATASOURCE_METRICS, Name: "query1", Query: "trace.servlet.request{env:prod}", }}, }, }, Comparator: datadogV1.SLOTIMESLICECOMPARATOR_GREATER, Threshold: 5, }, }}, Tags: []string{ "env:prod", }, Thresholds: []datadogV1.SLOThreshold{ { Target: 97.0, TargetDisplay: datadog.PtrString("97.0"), Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS, Warning: datadog.PtrFloat64(98), WarningDisplay: datadog.PtrString("98.0"), }, }, Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS.Ptr(), TargetThreshold: datadog.PtrFloat64(97.0), WarningThreshold: datadog.PtrFloat64(98), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.CreateSLO(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.CreateSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.CreateSLO`:\n%s\n", responseContent) } ``` Copy ##### Create an SLO object returns "OK" response ``` // Create an SLO object returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.ServiceLevelObjectiveRequest{ Type: datadogV1.SLOTYPE_METRIC, Description: *datadog.NewNullableString(datadog.PtrString("string")), Groups: []string{ "env:test", "role:mysql", }, MonitorIds: []int64{}, Name: "Example-Service-Level-Objective", Query: &datadogV1.ServiceLevelObjectiveQuery{ Denominator: "sum:httpservice.hits{!code:3xx}.as_count()", Numerator: "sum:httpservice.hits{code:2xx}.as_count()", }, Tags: []string{ "env:prod", "app:core", }, Thresholds: []datadogV1.SLOThreshold{ { Target: 97.0, TargetDisplay: datadog.PtrString("97.0"), Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS, Warning: datadog.PtrFloat64(98), WarningDisplay: datadog.PtrString("98.0"), }, }, Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS.Ptr(), TargetThreshold: datadog.PtrFloat64(97.0), WarningThreshold: datadog.PtrFloat64(98), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.CreateSLO(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.CreateSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.CreateSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a time-slice SLO object 
returns "OK" response ``` // Create a time-slice SLO object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.FormulaAndFunctionMetricDataSource; import com.datadog.api.client.v1.model.FormulaAndFunctionMetricQueryDefinition; import com.datadog.api.client.v1.model.SLODataSourceQueryDefinition; import com.datadog.api.client.v1.model.SLOFormula; import com.datadog.api.client.v1.model.SLOListResponse; import com.datadog.api.client.v1.model.SLOSliSpec; import com.datadog.api.client.v1.model.SLOThreshold; import com.datadog.api.client.v1.model.SLOTimeSliceComparator; import com.datadog.api.client.v1.model.SLOTimeSliceCondition; import com.datadog.api.client.v1.model.SLOTimeSliceQuery; import com.datadog.api.client.v1.model.SLOTimeSliceSpec; import com.datadog.api.client.v1.model.SLOTimeframe; import com.datadog.api.client.v1.model.SLOType; import com.datadog.api.client.v1.model.ServiceLevelObjectiveRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); ServiceLevelObjectiveRequest body = new ServiceLevelObjectiveRequest() .type(SLOType.TIME_SLICE) .description("string") .name("Example-Service-Level-Objective") .sliSpecification( new SLOSliSpec( new SLOTimeSliceSpec() .timeSlice( new SLOTimeSliceCondition() .query( new SLOTimeSliceQuery() .formulas( Collections.singletonList( new SLOFormula().formula("query1"))) .queries( Collections.singletonList( new SLODataSourceQueryDefinition( new FormulaAndFunctionMetricQueryDefinition() .dataSource( FormulaAndFunctionMetricDataSource .METRICS) .name("query1") .query( "trace.servlet.request{env:prod}"))))) .comparator(SLOTimeSliceComparator.GREATER) .threshold(5.0)))) .tags(Collections.singletonList("env:prod")) .thresholds( Collections.singletonList( new SLOThreshold() .target(97.0) .targetDisplay("97.0") .timeframe(SLOTimeframe.SEVEN_DAYS) .warning(98.0) .warningDisplay("98.0"))) .timeframe(SLOTimeframe.SEVEN_DAYS) .targetThreshold(97.0) .warningThreshold(98.0); try { SLOListResponse result = apiInstance.createSLO(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#createSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an SLO object returns "OK" response ``` // Create an SLO object returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOListResponse; import com.datadog.api.client.v1.model.SLOThreshold; import com.datadog.api.client.v1.model.SLOTimeframe; import com.datadog.api.client.v1.model.SLOType; import com.datadog.api.client.v1.model.ServiceLevelObjectiveQuery; import com.datadog.api.client.v1.model.ServiceLevelObjectiveRequest; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); 
ServiceLevelObjectiveRequest body = new ServiceLevelObjectiveRequest() .type(SLOType.METRIC) .description("string") .groups(Arrays.asList("env:test", "role:mysql")) .name("Example-Service-Level-Objective") .query( new ServiceLevelObjectiveQuery() .denominator("sum:httpservice.hits{!code:3xx}.as_count()") .numerator("sum:httpservice.hits{code:2xx}.as_count()")) .tags(Arrays.asList("env:prod", "app:core")) .thresholds( Collections.singletonList( new SLOThreshold() .target(97.0) .targetDisplay("97.0") .timeframe(SLOTimeframe.SEVEN_DAYS) .warning(98.0) .warningDisplay("98.0"))) .timeframe(SLOTimeframe.SEVEN_DAYS) .targetThreshold(97.0) .warningThreshold(98.0); try { SLOListResponse result = apiInstance.createSLO(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#createSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create an SLO object returns "OK" response ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Create a new SLO thresholds = [ {"timeframe": "7d", "target": 95}, {"timeframe": "30d", "target": 95, "warning": 97}, ] tags = ["app:webserver", "frontend"] api.ServiceLevelObjective.create( type="metric", name="Custom Metric SLO", description="SLO tracking custom service SLO", query={ "numerator": "sum:my.custom.metric{type:good}.as_count()", "denominator": "sum:my.custom.metric{*}.as_count()" }, tags=tags, thresholds=thresholds ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Create a time-slice SLO object returns "OK" response ``` """ Create a time-slice SLO object returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi from datadog_api_client.v1.model.formula_and_function_metric_data_source import FormulaAndFunctionMetricDataSource from datadog_api_client.v1.model.formula_and_function_metric_query_definition import ( FormulaAndFunctionMetricQueryDefinition, ) from datadog_api_client.v1.model.service_level_objective_request import ServiceLevelObjectiveRequest from datadog_api_client.v1.model.slo_formula import SLOFormula from datadog_api_client.v1.model.slo_threshold import SLOThreshold from datadog_api_client.v1.model.slo_time_slice_comparator import SLOTimeSliceComparator from datadog_api_client.v1.model.slo_time_slice_condition import SLOTimeSliceCondition from datadog_api_client.v1.model.slo_time_slice_query import SLOTimeSliceQuery from datadog_api_client.v1.model.slo_time_slice_spec import SLOTimeSliceSpec from datadog_api_client.v1.model.slo_timeframe import 
SLOTimeframe from datadog_api_client.v1.model.slo_type import SLOType body = ServiceLevelObjectiveRequest( type=SLOType.TIME_SLICE, description="string", name="Example-Service-Level-Objective", sli_specification=SLOTimeSliceSpec( time_slice=SLOTimeSliceCondition( query=SLOTimeSliceQuery( formulas=[ SLOFormula( formula="query1", ), ], queries=[ FormulaAndFunctionMetricQueryDefinition( data_source=FormulaAndFunctionMetricDataSource.METRICS, name="query1", query="trace.servlet.request{env:prod}", ), ], ), comparator=SLOTimeSliceComparator.GREATER, threshold=5.0, ), ), tags=[ "env:prod", ], thresholds=[ SLOThreshold( target=97.0, target_display="97.0", timeframe=SLOTimeframe.SEVEN_DAYS, warning=98.0, warning_display="98.0", ), ], timeframe=SLOTimeframe.SEVEN_DAYS, target_threshold=97.0, warning_threshold=98.0, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.create_slo(body=body) print(response) ``` Copy ##### Create an SLO object returns "OK" response ``` """ Create an SLO object returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi from datadog_api_client.v1.model.service_level_objective_query import ServiceLevelObjectiveQuery from datadog_api_client.v1.model.service_level_objective_request import ServiceLevelObjectiveRequest from datadog_api_client.v1.model.slo_threshold import SLOThreshold from datadog_api_client.v1.model.slo_timeframe import SLOTimeframe from datadog_api_client.v1.model.slo_type import SLOType body = ServiceLevelObjectiveRequest( type=SLOType.METRIC, description="string", groups=[ "env:test", "role:mysql", ], monitor_ids=[], name="Example-Service-Level-Objective", query=ServiceLevelObjectiveQuery( denominator="sum:httpservice.hits{!code:3xx}.as_count()", numerator="sum:httpservice.hits{code:2xx}.as_count()", ), tags=[ "env:prod", "app:core", ], thresholds=[ SLOThreshold( target=97.0, target_display="97.0", timeframe=SLOTimeframe.SEVEN_DAYS, warning=98.0, warning_display="98.0", ), ], timeframe=SLOTimeframe.SEVEN_DAYS, target_threshold=97.0, warning_threshold=98.0, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.create_slo(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create an SLO object returns "OK" response ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Create a new SLO thresholds = [ { timeframe: '7d', target: 95 }, { timeframe: '30d', target: 95, warning: 97 } ] tags = ['app:webserver', 'frontend'] dog.create_service_level_objective( type: 'metric', name: 'Custom Metric SLO', description: 'SLO tracking custom service SLO', numerator: 'sum:my.custom.metric{type:good}.as_count()', denominator: 'sum:my.custom.metric{*}.as_count()', tags: tags, thresholds: thresholds ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a time-slice SLO object returns "OK" response ``` # Create a time-slice SLO object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new body = DatadogAPIClient::V1::ServiceLevelObjectiveRequest.new({ type: DatadogAPIClient::V1::SLOType::TIME_SLICE, description: "string", name: "Example-Service-Level-Objective", sli_specification: DatadogAPIClient::V1::SLOTimeSliceSpec.new({ time_slice: DatadogAPIClient::V1::SLOTimeSliceCondition.new({ query: DatadogAPIClient::V1::SLOTimeSliceQuery.new({ formulas: [ DatadogAPIClient::V1::SLOFormula.new({ formula: "query1", }), ], queries: [ DatadogAPIClient::V1::FormulaAndFunctionMetricQueryDefinition.new({ data_source: DatadogAPIClient::V1::FormulaAndFunctionMetricDataSource::METRICS, name: "query1", query: "trace.servlet.request{env:prod}", }), ], }), comparator: DatadogAPIClient::V1::SLOTimeSliceComparator::GREATER, threshold: 5, }), }), tags: [ "env:prod", ], thresholds: [ DatadogAPIClient::V1::SLOThreshold.new({ target: 97.0, target_display: "97.0", timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, warning: 98, warning_display: "98.0", }), ], timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, target_threshold: 97.0, warning_threshold: 98, }) p api_instance.create_slo(body) ``` Copy ##### Create an SLO object returns "OK" response ``` # Create an SLO object returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new body = DatadogAPIClient::V1::ServiceLevelObjectiveRequest.new({ type: DatadogAPIClient::V1::SLOType::METRIC, description: "string", groups: [ "env:test", "role:mysql", ], monitor_ids: [], name: "Example-Service-Level-Objective", query: DatadogAPIClient::V1::ServiceLevelObjectiveQuery.new({ denominator: "sum:httpservice.hits{!code:3xx}.as_count()", numerator: "sum:httpservice.hits{code:2xx}.as_count()", }), tags: [ "env:prod", "app:core", ], thresholds: [ DatadogAPIClient::V1::SLOThreshold.new({ target: 97.0, target_display: "97.0", timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, warning: 98, warning_display: "98.0", }), ], timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, target_threshold: 97.0, warning_threshold: 98, }) p api_instance.create_slo(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a time-slice SLO object returns "OK" response ``` // Create a time-slice SLO object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; use datadog_api_client::datadogV1::model::FormulaAndFunctionMetricDataSource; use datadog_api_client::datadogV1::model::FormulaAndFunctionMetricQueryDefinition; use datadog_api_client::datadogV1::model::SLODataSourceQueryDefinition; use datadog_api_client::datadogV1::model::SLOFormula; use datadog_api_client::datadogV1::model::SLOSliSpec; use datadog_api_client::datadogV1::model::SLOThreshold; use 
datadog_api_client::datadogV1::model::SLOTimeSliceComparator; use datadog_api_client::datadogV1::model::SLOTimeSliceCondition; use datadog_api_client::datadogV1::model::SLOTimeSliceQuery; use datadog_api_client::datadogV1::model::SLOTimeSliceSpec; use datadog_api_client::datadogV1::model::SLOTimeframe; use datadog_api_client::datadogV1::model::SLOType; use datadog_api_client::datadogV1::model::ServiceLevelObjectiveRequest; #[tokio::main] async fn main() { let body = ServiceLevelObjectiveRequest::new( "Example-Service-Level-Objective".to_string(), vec![SLOThreshold::new(97.0, SLOTimeframe::SEVEN_DAYS) .target_display("97.0".to_string()) .warning(98.0 as f64) .warning_display("98.0".to_string())], SLOType::TIME_SLICE, ) .description(Some("string".to_string())) .sli_specification(SLOSliSpec::SLOTimeSliceSpec(Box::new( SLOTimeSliceSpec::new(SLOTimeSliceCondition::new( SLOTimeSliceComparator::GREATER, SLOTimeSliceQuery::new( vec![SLOFormula::new("query1".to_string())], vec![ SLODataSourceQueryDefinition::FormulaAndFunctionMetricQueryDefinition( Box::new(FormulaAndFunctionMetricQueryDefinition::new( FormulaAndFunctionMetricDataSource::METRICS, "query1".to_string(), "trace.servlet.request{env:prod}".to_string(), )), ), ], ), 5.0, )), ))) .tags(vec!["env:prod".to_string()]) .target_threshold(97.0 as f64) .timeframe(SLOTimeframe::SEVEN_DAYS) .warning_threshold(98.0 as f64); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.create_slo(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an SLO object returns "OK" response ``` // Create an SLO object returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; use datadog_api_client::datadogV1::model::SLOThreshold; use datadog_api_client::datadogV1::model::SLOTimeframe; use datadog_api_client::datadogV1::model::SLOType; use datadog_api_client::datadogV1::model::ServiceLevelObjectiveQuery; use datadog_api_client::datadogV1::model::ServiceLevelObjectiveRequest; #[tokio::main] async fn main() { let body = ServiceLevelObjectiveRequest::new( "Example-Service-Level-Objective".to_string(), vec![SLOThreshold::new(97.0, SLOTimeframe::SEVEN_DAYS) .target_display("97.0".to_string()) .warning(98.0 as f64) .warning_display("98.0".to_string())], SLOType::METRIC, ) .description(Some("string".to_string())) .groups(vec!["env:test".to_string(), "role:mysql".to_string()]) .monitor_ids(vec![]) .query(ServiceLevelObjectiveQuery::new( "sum:httpservice.hits{!code:3xx}.as_count()".to_string(), "sum:httpservice.hits{code:2xx}.as_count()".to_string(), )) .tags(vec!["env:prod".to_string(), "app:core".to_string()]) .target_threshold(97.0 as f64) .timeframe(SLOTimeframe::SEVEN_DAYS) .warning_threshold(98.0 as f64); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.create_slo(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### 
Create a time-slice SLO object returns "OK" response ``` /** * Create a time-slice SLO object returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); const params: v1.ServiceLevelObjectivesApiCreateSLORequest = { body: { type: "time_slice", description: "string", name: "Example-Service-Level-Objective", sliSpecification: { timeSlice: { query: { formulas: [ { formula: "query1", }, ], queries: [ { dataSource: "metrics", name: "query1", query: "trace.servlet.request{env:prod}", }, ], }, comparator: ">", threshold: 5, }, }, tags: ["env:prod"], thresholds: [ { target: 97.0, targetDisplay: "97.0", timeframe: "7d", warning: 98, warningDisplay: "98.0", }, ], timeframe: "7d", targetThreshold: 97.0, warningThreshold: 98, }, }; apiInstance .createSLO(params) .then((data: v1.SLOListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an SLO object returns "OK" response ``` /** * Create an SLO object returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); const params: v1.ServiceLevelObjectivesApiCreateSLORequest = { body: { type: "metric", description: "string", groups: ["env:test", "role:mysql"], monitorIds: [], name: "Example-Service-Level-Objective", query: { denominator: "sum:httpservice.hits{!code:3xx}.as_count()", numerator: "sum:httpservice.hits{code:2xx}.as_count()", }, tags: ["env:prod", "app:core"], thresholds: [ { target: 97.0, targetDisplay: "97.0", timeframe: "7d", warning: 98, warningDisplay: "98.0", }, ], timeframe: "7d", targetThreshold: 97.0, warningThreshold: 98, }, }; apiInstance .createSLO(params) .then((data: v1.SLOListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Search for SLOs](https://docs.datadoghq.com/api/latest/service-level-objectives/#search-for-slos) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#search-for-slos-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/searchhttps://api.ap2.datadoghq.com/api/v1/slo/searchhttps://api.datadoghq.eu/api/v1/slo/searchhttps://api.ddog-gov.com/api/v1/slo/searchhttps://api.datadoghq.com/api/v1/slo/searchhttps://api.us3.datadoghq.com/api/v1/slo/searchhttps://api.us5.datadoghq.com/api/v1/slo/search ### Overview Get a list of service level objective objects for your organization. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Query Strings Name Type Description query string The query string to filter results based on SLO names. Some examples of queries include `service:` and ``. 
page[size] integer The number of SLOs to return per page in the response `[default=10]`. page[number] integer The number of the page to return. This parameter is used for the pagination feature `[default=0]`. include_facets boolean Whether or not to return facet information in the response `[default=false]`. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#SearchSLO-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#SearchSLO-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#SearchSLO-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#SearchSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A search SLO response containing results from the search query. Field Type Description data object Data from search SLO response. attributes object Attributes facets object Facets all_tags [object] All tags associated with an SLO. count int64 Count name string Facet creator_name [object] Creator of an SLO. count int64 Count name string Facet env_tags [object] Tags with the `env` tag key. count int64 Count name string Facet service_tags [object] Tags with the `service` tag key. count int64 Count name string Facet slo_type [object] Type of SLO. count int64 Count name double Facet target [object] SLO target. count int64 Count name double Facet team_tags [object] Tags with the `team` tag key. count int64 Count name string Facet timeframe [object] Timeframes of SLOs. count int64 Count name string Facet slos [object] SLOs data object A service level objective ID and attributes. attributes object A service level objective object includes a service level indicator, thresholds for one or more timeframes, and metadata (`name`, `description`, and `tags`). all_tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). created_at int64 Creation timestamp (UNIX time in seconds) Always included in service level objective responses. creator object The creator of the SLO. email string Email of the creator. id int64 User ID of the creator. name string Name of the creator. description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. env_tags [string] Tags with the `env` tag key. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. modified_at int64 Modification timestamp (UNIX time in seconds) Always included in service level objective responses. monitor_ids [integer] A list of monitor ids that defines the scope of a monitor service level objective. name string The name of the service level objective object. overall_status [object] Calculated status and error budget remaining. error string Error message if SLO status or error budget could not be calculated. error_budget_remaining double Remaining error budget of the SLO in percentage. indexed_at int64 Timestamp (UNIX time in seconds) of when the SLO status and error budget were calculated. raw_error_budget_remaining object Error budget remaining for an SLO. unit string Error budget remaining unit. value double Error budget remaining value. 
span_precision int64 The number of decimal places the SLI value is accurate to. state enum State of the SLO. Allowed enum values: `breached,warning,ok,no_data` status double The status of the SLO. target double The target of the SLO. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` query object A metric-based SLO. **Required if type is `metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator string A Datadog metric query for total (valid) events. metrics [string] Metric names used in the query's numerator and denominator. This field will return null and will be implemented in the next version of this endpoint. numerator string A Datadog metric query for good events. service_tags [string] Tags with the `service` tag key. slo_type enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` status object Status of the SLO's primary timeframe. calculation_error string Error message if SLO status or error budget could not be calculated. error_budget_remaining double Remaining error budget of the SLO in percentage. indexed_at int64 Timestamp (UNIX time in seconds) of when the SLO status and error budget were calculated. raw_error_budget_remaining object Error budget remaining for an SLO. unit string Error budget remaining unit. value double Error budget remaining value. sli double The current service level indicator (SLI) of the SLO, also known as 'status'. This is a percentage value from 0-100 (inclusive). span_precision int64 The number of decimal places the SLI value is accurate to. state enum State of the SLO. Allowed enum values: `breached,warning,ok,no_data` team_tags [string] Tags with the `team` tag key. thresholds [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Allowed enum values: `7d,30d,90d` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. id string A unique identifier for the service level objective object. Always included in service level objective responses. type string The type of the object, must be `slo`. type string Type of service level objective result. links object Pagination links. first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to the previous page. self string Link to the current page. meta object Search metadata returned by the API. pagination object Pagination metadata returned by the API. first_number int64 The first page number. last_number int64 The last page number. next_number int64 The next page number. number int64 The current page number. 
prev_number int64 The previous page number. size int64 The size of the response. total int64 The total number of SLOs in the response. type string Type of pagination. ``` { "data": { "attributes": { "facets": { "all_tags": [ { "count": "integer", "name": "string" } ], "creator_name": [ { "count": "integer", "name": "string" } ], "env_tags": [ { "count": "integer", "name": "string" } ], "service_tags": [ { "count": "integer", "name": "string" } ], "slo_type": [ { "count": "integer", "name": "number" } ], "target": [ { "count": "integer", "name": "number" } ], "team_tags": [ { "count": "integer", "name": "string" } ], "timeframe": [ { "count": "integer", "name": "string" } ] }, "slos": [ { "data": { "attributes": { "all_tags": [ "env:prod", "app:core" ], "created_at": "integer", "creator": { "email": "string", "id": "integer", "name": "string" }, "description": "string", "env_tags": [], "groups": [ "env:prod", "role:mysql" ], "modified_at": "integer", "monitor_ids": [], "name": "Custom Metric SLO", "overall_status": [ { "error": "string", "error_budget_remaining": 100, "indexed_at": 1662496260, "raw_error_budget_remaining": { "unit": "requests", "value": 60 }, "span_precision": 2, "state": "ok", "status": 100, "target": 99, "timeframe": "30d" } ], "query": { "denominator": "sum:my.custom.metric{*}.as_count()", "metrics": [ "my.custom.metric", "my.other.custom.metric" ], "numerator": "sum:my.custom.metric{type:good}.as_count()" }, "service_tags": [], "slo_type": "metric", "status": { "calculation_error": "string", "error_budget_remaining": 100, "indexed_at": 1662496260, "raw_error_budget_remaining": { "unit": "requests", "value": 60 }, "sli": 100, "span_precision": 2, "state": "ok" }, "team_tags": [], "thresholds": [ { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } ] }, "id": "string", "type": "string" } } ] }, "type": "" }, "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "first_number": "integer", "last_number": "integer", "next_number": "integer", "number": "integer", "prev_number": "integer", "size": "integer", "total": "integer", "type": "string" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Search for SLOs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo/search" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search for SLOs ``` """ Search for SLOs returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_NAME = environ["SLO_DATA_0_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.search_slo( query=SLO_DATA_0_NAME, page_size=20, page_number=0, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search for SLOs ``` # Search for SLOs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_NAME = ENV["SLO_DATA_0_NAME"] opts = { query: SLO_DATA_0_NAME, page_size: 20, page_number: 0, } p api_instance.search_slo(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search for SLOs ``` // Search for SLOs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0Name := os.Getenv("SLO_DATA_0_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.SearchSLO(ctx, *datadogV1.NewSearchSLOOptionalParameters().WithQuery(SloData0Name).WithPageSize(20).WithPageNumber(0)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.SearchSLO`: %v\n", err) fmt.Fprintf(os.Stderr, 
"Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.SearchSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search for SLOs ``` // Search for SLOs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi.SearchSLOOptionalParameters; import com.datadog.api.client.v1.model.SearchSLOResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_NAME = System.getenv("SLO_DATA_0_NAME"); try { SearchSLOResponse result = apiInstance.searchSLO( new SearchSLOOptionalParameters() .query(SLO_DATA_0_NAME) .pageSize(20L) .pageNumber(0L)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#searchSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search for SLOs ``` // Search for SLOs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::SearchSLOOptionalParams; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_name = std::env::var("SLO_DATA_0_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .search_slo( SearchSLOOptionalParams::default() .query(slo_data_0_name.clone()) .page_size(20) .page_number(0), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search for SLOs ``` /** * Search for SLOs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_NAME = 
process.env.SLO_DATA_0_NAME as string; const params: v1.ServiceLevelObjectivesApiSearchSLORequest = { query: SLO_DATA_0_NAME, pageSize: 20, pageNumber: 0, }; apiInstance .searchSLO(params) .then((data: v1.SearchSLOResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all SLOs](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-all-slos-v1) GET https://api.ap1.datadoghq.com/api/v1/slohttps://api.ap2.datadoghq.com/api/v1/slohttps://api.datadoghq.eu/api/v1/slohttps://api.ddog-gov.com/api/v1/slohttps://api.datadoghq.com/api/v1/slohttps://api.us3.datadoghq.com/api/v1/slohttps://api.us5.datadoghq.com/api/v1/slo ### Overview Get a list of service level objective objects for your organization. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Query Strings Name Type Description ids string A comma separated list of the IDs of the service level objectives objects. query string The query string to filter results based on SLO names. tags_query string The query string to filter results based on a single SLO tag. metrics_query string The query string to filter results based on SLO numerator and denominator. limit integer The number of SLOs to return in the response. offset integer The specific offset to use as the beginning of the returned response. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#ListSLOs-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#ListSLOs-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#ListSLOs-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#ListSLOs-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#ListSLOs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A response with one or more service level objective. Field Type Description data [object] An array of service level objective objects. created_at int64 Creation timestamp (UNIX time in seconds) Always included in service level objective responses. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. 
Optional in create/update requests for monitor service level objectives, but may only be used when then length of the `monitor_ids` field is one. id string A unique identifier for the service level objective object. Always included in service level objective responses. modified_at int64 Modification timestamp (UNIX time in seconds) Always included in service level objective responses. monitor_ids [integer] A list of monitor ids that defines the scope of a monitor service level objective. **Required if type is`monitor`**. monitor_tags [string] The union of monitor tags for all monitors referenced by the `monitor_ids` field. Always included in service level objective responses for monitor-based service level objectives (but may be empty). Ignored in create/update requests. Does not affect which monitors are included in the service level objective (that is determined entirely by the `monitor_ids` field). name [_required_] string The name of the service level objective object. query object A metric-based SLO. **Required if type is`metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes). Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. 
target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [_required_] [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type [_required_] enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. errors [string] An array of error messages. Each endpoint documents how/whether this field is used. metadata object The metadata object containing additional information about the list of SLOs. page object The object containing information about the pages of the list of SLOs. total_count int64 The total number of resources that could be retrieved ignoring the parameters and filters in the request. total_filtered_count int64 The total number of resources that match the parameters and filters in the request. This attribute can be used by a client to determine the total number of pages. 
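The `metadata.page` counters make client-side pagination straightforward: request pages with `limit` and `offset`, then keep advancing the offset until you have collected `total_filtered_count` results. Below is a minimal TypeScript sketch of that loop using the same `@datadog/datadog-api-client` package as the other examples on this page; the camelCase accessors (`limit`, `offset`, `metadata.page.totalFilteredCount`) are assumed to mirror the JSON field names above rather than taken verbatim from the client reference. An example response body follows.

```
/**
 * Sketch: collect every SLO by paging through the "Get all SLOs" endpoint.
 * Assumes the TypeScript client exposes `limit`/`offset` request fields and
 * a `metadata.page.totalFilteredCount` response field matching the model above.
 */
import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.ServiceLevelObjectivesApi(configuration);

async function listAllSlos(pageSize = 100): Promise<v1.ServiceLevelObjective[]> {
  const all: v1.ServiceLevelObjective[] = [];
  let offset = 0;
  for (;;) {
    const resp = await apiInstance.listSLOs({ limit: pageSize, offset });
    const page = resp.data ?? [];
    all.push(...page);
    const totalFiltered = resp.metadata?.page?.totalFilteredCount ?? all.length;
    offset += pageSize;
    // Stop when every filtered SLO has been fetched or the page came back empty.
    if (page.length === 0 || all.length >= totalFiltered) {
      return all;
    }
  }
}

listAllSlos().then((slos) => console.log(`Fetched ${slos.length} SLOs`));
```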
``` { "data": [ { "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "groups": [ "env:prod", "role:mysql" ], "id": "string", "modified_at": "integer", "monitor_ids": [], "monitor_tags": [], "name": "Custom Metric SLO", "query": { "denominator": "sum:my.custom.metric{*}.as_count()", "numerator": "sum:my.custom.metric{type:good}.as_count()" }, "sli_specification": { "time_slice": { "comparator": ">", "query": { "formulas": [ { "formula": "query1 - default_zero(query2)" } ], "queries": [ [] ] }, "query_interval_seconds": 300, "threshold": 5 } }, "tags": [ "env:prod", "app:core" ], "target_threshold": 99.9, "thresholds": [ { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } ], "timeframe": "30d", "type": "metric", "warning_threshold": 99.95 } ], "errors": [], "metadata": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Get all SLOs Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all SLOs ``` """ Get all SLOs returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.list_slos( ids=SLO_DATA_0_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all SLOs ``` # Get all SLOs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] opts = { ids: SLO_DATA_0_ID, } p api_instance.list_slos(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all SLOs ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) # Search with a list of IDs slo_ids = [''] dog.search_service_level_objective(slo_ids: slo_ids, offset: 0) # Search with a query on your SLO Name. 
query = 'my team' dog.search_service_level_objective(query: query, offset: 0) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all SLOs ``` // Get all SLOs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.ListSLOs(ctx, *datadogV1.NewListSLOsOptionalParameters().WithIds(SloData0ID)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.ListSLOs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.ListSLOs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all SLOs ``` // Get all SLOs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi.ListSLOsOptionalParameters; import com.datadog.api.client.v1.model.SLOListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); try { SLOListResponse result = apiInstance.listSLOs(new ListSLOsOptionalParameters().ids(SLO_DATA_0_ID)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#listSLOs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all SLOs ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) # Search with a list of IDs slo_ids = ["", ""] api.ServiceLevelObjective.get_all(ids=slo_ids, offset=0) # Search with a query on your SLO Name. 
query = "my team" api.ServiceLevelObjective.get_all(query=query, offset=0) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get all SLOs ``` // Get all SLOs returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ListSLOsOptionalParams; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .list_slos(ListSLOsOptionalParams::default().ids(slo_data_0_id.clone())) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all SLOs ``` /** * Get all SLOs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectivesApiListSLOsRequest = { ids: SLO_DATA_0_ID, }; apiInstance .listSLOs(params) .then((data: v1.SLOListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#update-an-slo) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#update-an-slo-v1) PUT https://api.ap1.datadoghq.com/api/v1/slo/{slo_id}https://api.ap2.datadoghq.com/api/v1/slo/{slo_id}https://api.datadoghq.eu/api/v1/slo/{slo_id}https://api.ddog-gov.com/api/v1/slo/{slo_id}https://api.datadoghq.com/api/v1/slo/{slo_id}https://api.us3.datadoghq.com/api/v1/slo/{slo_id}https://api.us5.datadoghq.com/api/v1/slo/{slo_id} ### Overview Update the specified service level objective object. This endpoint requires the `slos_write` permission. OAuth apps require the `slos_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description slo_id [_required_] string The ID of the service level objective object. 
### Request #### Body Data (required) The edited service level objective request object. * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Field Type Description created_at int64 Creation timestamp (UNIX time in seconds) Always included in service level objective responses. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. Optional in create/update requests for monitor service level objectives, but may only be used when then length of the `monitor_ids` field is one. id string A unique identifier for the service level objective object. Always included in service level objective responses. modified_at int64 Modification timestamp (UNIX time in seconds) Always included in service level objective responses. monitor_ids [integer] A list of monitor ids that defines the scope of a monitor service level objective. **Required if type is`monitor`**. monitor_tags [string] The union of monitor tags for all monitors referenced by the `monitor_ids` field. Always included in service level objective responses for monitor-based service level objectives (but may be empty). Ignored in create/update requests. Does not affect which monitors are included in the service level objective (that is determined entirely by the `monitor_ids` field). name [_required_] string The name of the service level objective object. query object A metric-based SLO. **Required if type is`metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. 
Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes). Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [_required_] [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type [_required_] enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. 
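For time-slice SLOs, the same update body carries an `sli_specification` (described above) in place of the metric `query`. The snippet below sketches such a body, assembled from the field list above and the time-slice create examples earlier on this page; the camelCase names and the `v1.ServiceLevelObjective` body type are assumptions for illustration only. The metric-SLO example body follows.

```
/**
 * Sketch: an update body for a time-slice SLO, built from the fields above.
 * Values (comparator, threshold, queryIntervalSeconds) are illustrative only.
 */
import { v1 } from "@datadog/datadog-api-client";

const timeSliceBody: v1.ServiceLevelObjective = {
  type: "time_slice",
  name: "Example time-slice SLO",
  sliSpecification: {
    timeSlice: {
      query: {
        // Exactly one formula, referencing the named metrics query below.
        formulas: [{ formula: "query1" }],
        queries: [
          {
            dataSource: "metrics",
            name: "query1",
            query: "trace.servlet.request{env:prod}",
          },
        ],
      },
      comparator: ">",
      threshold: 5,
      queryIntervalSeconds: 300, // 60 or 300; defaults to 300 when omitted
    },
  },
  thresholds: [{ target: 97.0, timeframe: "7d", warning: 98.0 }],
  timeframe: "7d",
  targetThreshold: 97.0,
  warningThreshold: 98.0,
};
```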
``` { "type": "metric", "name": "Custom Metric SLO", "thresholds": [ { "target": 97.0, "timeframe": "7d", "warning": 98.0 } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98, "query": { "numerator": "sum:httpservice.hits{code:2xx}.as_count()", "denominator": "sum:httpservice.hits{!code:3xx}.as_count()" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#UpdateSLO-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#UpdateSLO-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#UpdateSLO-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#UpdateSLO-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#UpdateSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A response with one or more service level objective. Field Type Description data [object] An array of service level objective objects. created_at int64 Creation timestamp (UNIX time in seconds) Always included in service level objective responses. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 100) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. Optional in create/update requests for monitor service level objectives, but may only be used when then length of the `monitor_ids` field is one. id string A unique identifier for the service level objective object. Always included in service level objective responses. modified_at int64 Modification timestamp (UNIX time in seconds) Always included in service level objective responses. monitor_ids [integer] A list of monitor ids that defines the scope of a monitor service level objective. **Required if type is`monitor`**. monitor_tags [string] The union of monitor tags for all monitors referenced by the `monitor_ids` field. Always included in service level objective responses for monitor-based service level objectives (but may be empty). Ignored in create/update requests. Does not affect which monitors are included in the service level objective (that is determined entirely by the `monitor_ids` field). name [_required_] string The name of the service level objective object. query object A metric-based SLO. **Required if type is`metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. 
comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes). Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [_required_] [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type [_required_] enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. 
errors [string] An array of error messages. Each endpoint documents how/whether this field is used. metadata object The metadata object containing additional information about the list of SLOs. page object The object containing information about the pages of the list of SLOs. total_count int64 The total number of resources that could be retrieved ignoring the parameters and filters in the request. total_filtered_count int64 The total number of resources that match the parameters and filters in the request. This attribute can be used by a client to determine the total number of pages. ``` { "data": [ { "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "groups": [ "env:prod", "role:mysql" ], "id": "string", "modified_at": "integer", "monitor_ids": [], "monitor_tags": [], "name": "Custom Metric SLO", "query": { "denominator": "sum:my.custom.metric{*}.as_count()", "numerator": "sum:my.custom.metric{type:good}.as_count()" }, "sli_specification": { "time_slice": { "comparator": ">", "query": { "formulas": [ { "formula": "query1 - default_zero(query2)" } ], "queries": [ [] ] }, "query_interval_seconds": 300, "threshold": 5 } }, "tags": [ "env:prod", "app:core" ], "target_threshold": 99.9, "thresholds": [ { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } ], "timeframe": "30d", "type": "metric", "warning_threshold": 99.95 } ], "errors": [], "metadata": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Update an SLO returns "OK" response Copy ``` # Path parameters export slo_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.datadoghq.eu) curl -X PUT "https://api.datadoghq.com/api/v1/slo/${slo_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "type": "metric", "name": "Custom Metric SLO", "thresholds": [ { "target": 97.0, "timeframe": "7d", "warning": 98.0 } ], "timeframe": "7d", "target_threshold": 97.0, "warning_threshold": 98, "query": { "numerator": "sum:httpservice.hits{code:2xx}.as_count()", "denominator": "sum:httpservice.hits{!code:3xx}.as_count()" } } EOF ``` ##### Update an SLO returns "OK" response ``` // Update an SLO returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") SloData0Name := os.Getenv("SLO_DATA_0_NAME") body := datadogV1.ServiceLevelObjective{ Type: datadogV1.SLOTYPE_METRIC, Name: SloData0Name, Thresholds: []datadogV1.SLOThreshold{ { Target: 97.0, Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS, Warning: datadog.PtrFloat64(98.0), }, }, Timeframe: datadogV1.SLOTIMEFRAME_SEVEN_DAYS.Ptr(), TargetThreshold: datadog.PtrFloat64(97.0), WarningThreshold: datadog.PtrFloat64(98), Query: &datadogV1.ServiceLevelObjectiveQuery{ Numerator: "sum:httpservice.hits{code:2xx}.as_count()", Denominator: "sum:httpservice.hits{!code:3xx}.as_count()", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.UpdateSLO(ctx, SloData0ID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.UpdateSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.UpdateSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an SLO returns "OK" response ``` // Update an SLO returns "OK" response import com.datadog.api.client.ApiClient; import
com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOListResponse; import com.datadog.api.client.v1.model.SLOThreshold; import com.datadog.api.client.v1.model.SLOTimeframe; import com.datadog.api.client.v1.model.SLOType; import com.datadog.api.client.v1.model.ServiceLevelObjective; import com.datadog.api.client.v1.model.ServiceLevelObjectiveQuery; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); String SLO_DATA_0_NAME = System.getenv("SLO_DATA_0_NAME"); ServiceLevelObjective body = new ServiceLevelObjective() .type(SLOType.METRIC) .name(SLO_DATA_0_NAME) .thresholds( Collections.singletonList( new SLOThreshold() .target(97.0) .timeframe(SLOTimeframe.SEVEN_DAYS) .warning(98.0))) .timeframe(SLOTimeframe.SEVEN_DAYS) .targetThreshold(97.0) .warningThreshold(98.0) .query( new ServiceLevelObjectiveQuery() .numerator("sum:httpservice.hits{code:2xx}.as_count()") .denominator("sum:httpservice.hits{!code:3xx}.as_count()")); try { SLOListResponse result = apiInstance.updateSLO(SLO_DATA_0_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#updateSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an SLO returns "OK" response ``` """ Update an SLO returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi from datadog_api_client.v1.model.service_level_objective import ServiceLevelObjective from datadog_api_client.v1.model.service_level_objective_query import ServiceLevelObjectiveQuery from datadog_api_client.v1.model.slo_threshold import SLOThreshold from datadog_api_client.v1.model.slo_timeframe import SLOTimeframe from datadog_api_client.v1.model.slo_type import SLOType # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] SLO_DATA_0_NAME = environ["SLO_DATA_0_NAME"] body = ServiceLevelObjective( type=SLOType.METRIC, name=SLO_DATA_0_NAME, thresholds=[ SLOThreshold( target=97.0, timeframe=SLOTimeframe.SEVEN_DAYS, warning=98.0, ), ], timeframe=SLOTimeframe.SEVEN_DAYS, target_threshold=97.0, warning_threshold=98.0, query=ServiceLevelObjectiveQuery( numerator="sum:httpservice.hits{code:2xx}.as_count()", denominator="sum:httpservice.hits{!code:3xx}.as_count()", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.update_slo(slo_id=SLO_DATA_0_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an SLO returns "OK" response ``` # Update an SLO returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] SLO_DATA_0_NAME = ENV["SLO_DATA_0_NAME"] body = DatadogAPIClient::V1::ServiceLevelObjective.new({ type: DatadogAPIClient::V1::SLOType::METRIC, name: SLO_DATA_0_NAME, thresholds: [ DatadogAPIClient::V1::SLOThreshold.new({ target: 97.0, timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, warning: 98.0, }), ], timeframe: DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, target_threshold: 97.0, warning_threshold: 98, query: DatadogAPIClient::V1::ServiceLevelObjectiveQuery.new({ numerator: "sum:httpservice.hits{code:2xx}.as_count()", denominator: "sum:httpservice.hits{!code:3xx}.as_count()", }), }) p api_instance.update_slo(SLO_DATA_0_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an SLO returns "OK" response ``` // Update an SLO returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; use datadog_api_client::datadogV1::model::SLOThreshold; use datadog_api_client::datadogV1::model::SLOTimeframe; use datadog_api_client::datadogV1::model::SLOType; use datadog_api_client::datadogV1::model::ServiceLevelObjective; use datadog_api_client::datadogV1::model::ServiceLevelObjectiveQuery; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let slo_data_0_name = std::env::var("SLO_DATA_0_NAME").unwrap(); let body = ServiceLevelObjective::new( slo_data_0_name.clone(), vec![SLOThreshold::new(97.0, SLOTimeframe::SEVEN_DAYS).warning(98.0 as f64)], SLOType::METRIC, ) .query(ServiceLevelObjectiveQuery::new( "sum:httpservice.hits{!code:3xx}.as_count()".to_string(), "sum:httpservice.hits{code:2xx}.as_count()".to_string(), )) .target_threshold(97.0 as f64) .timeframe(SLOTimeframe::SEVEN_DAYS) .warning_threshold(98.0 as f64); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.update_slo(slo_data_0_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an SLO returns "OK" response ``` /** * Update an SLO returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = 
client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const SLO_DATA_0_NAME = process.env.SLO_DATA_0_NAME as string; const params: v1.ServiceLevelObjectivesApiUpdateSLORequest = { body: { type: "metric", name: SLO_DATA_0_NAME, thresholds: [ { target: 97.0, timeframe: "7d", warning: 98.0, }, ], timeframe: "7d", targetThreshold: 97.0, warningThreshold: 98, query: { numerator: "sum:httpservice.hits{code:2xx}.as_count()", denominator: "sum:httpservice.hits{!code:3xx}.as_count()", }, }, sloId: SLO_DATA_0_ID, }; apiInstance .updateSLO(params) .then((data: v1.SLOListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an SLO's details](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-details) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-details-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/{slo_id}https://api.ap2.datadoghq.com/api/v1/slo/{slo_id}https://api.datadoghq.eu/api/v1/slo/{slo_id}https://api.ddog-gov.com/api/v1/slo/{slo_id}https://api.datadoghq.com/api/v1/slo/{slo_id}https://api.us3.datadoghq.com/api/v1/slo/{slo_id}https://api.us5.datadoghq.com/api/v1/slo/{slo_id} ### Overview Get a service level objective object. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description slo_id [_required_] string The ID of the service level objective object. #### Query Strings Name Type Description with_configured_alert_ids boolean Get the IDs of SLO monitors that reference this SLO. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLO-200-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLO-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLO-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A service level objective response containing a single service level objective. Field Type Description data object A service level objective object includes a service level indicator, thresholds for one or more timeframes, and metadata (`name`, `description`, `tags`, etc.). configured_alert_ids [integer] A list of SLO monitor IDs that reference this SLO. This field is returned only when the `with_configured_alert_ids` query parameter is true. created_at int64 Creation timestamp (UNIX time in seconds). Always included in service level objective responses. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator.
description string A user-defined description of the service level objective. Always included in service level objective responses (but may be `null`). Optional in create/update requests. groups [string] A list of (up to 20) monitor groups that narrow the scope of a monitor service level objective. Included in service level objective responses if it is not empty. Optional in create/update requests for monitor service level objectives, but may only be used when the length of the `monitor_ids` field is one. id string A unique identifier for the service level objective object. Always included in service level objective responses. modified_at int64 Modification timestamp (UNIX time in seconds). Always included in service level objective responses. monitor_ids [integer] A list of monitor IDs that defines the scope of a monitor service level objective. **Required if type is `monitor`**. monitor_tags [string] The union of monitor tags for all monitors referenced by the `monitor_ids` field. Always included in service level objective responses for monitor service level objectives (but may be empty). Ignored in create/update requests. Does not affect which monitors are included in the service level objective (that is determined entirely by the `monitor_ids` field). name string The name of the service level objective object. query object A metric-based SLO. **Required if type is `metric`**. Note that Datadog only allows the sum by aggregator to be used because this will sum up all request counts instead of averaging them, or taking the max or min of all of those requests. denominator [_required_] string A Datadog metric query for total (valid) events. numerator [_required_] string A Datadog metric query for good events. sli_specification A generic SLI specification. This is currently used for time-slice SLOs only. Option 1 object A time-slice SLI specification. time_slice [_required_] object The time-slice condition, composed of 3 parts: 1. the metric timeseries query, 2. the comparator, and 3. the threshold. Optionally, a fourth part, the query interval, can be provided. comparator [_required_] enum The comparator used to compare the SLI value to the threshold. Allowed enum values: `>,>=,<,<=` query [_required_] object The queries and formula used to calculate the SLI value. formulas [_required_] [object] A list that contains exactly one formula, as only a single formula may be used in a time-slice SLO. formula [_required_] string The formula string, which is an expression involving named queries. queries [_required_] [ ] A list of queries that are used to calculate the SLI value. Option 1 object A formula and functions metrics query. aggregator enum The aggregation methods available for metrics queries. Allowed enum values: `avg,min,max,sum,last,area,l2norm,percentile` cross_org_uuids [string] The source organization UUID for cross organization queries. Feature in Private Beta. data_source [_required_] enum Data source for metrics queries. Allowed enum values: `metrics` name [_required_] string Name of the query for use in formulas. query [_required_] string Metrics query definition. semantic_mode enum Semantic mode for metrics queries. This determines how metrics from different sources are combined or displayed. Allowed enum values: `combined,native` query_interval_seconds enum The interval used when querying data, which defines the size of a time slice. Two values are allowed: 60 (1 minute) and 300 (5 minutes). If not provided, the value defaults to 300 (5 minutes).
Allowed enum values: `60,300` threshold [_required_] double The threshold value to which each SLI value will be compared. tags [string] A list of tags associated with this service level objective. Always included in service level objective responses (but may be empty). Optional in create/update requests. target_threshold double The target threshold such that when the service level indicator is above this threshold over the given timeframe, the objective is being met. thresholds [object] The thresholds (timeframes and associated targets) for this service level objective object. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. timeframe enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` type enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` warning_threshold double The optional warning threshold such that when the service level indicator is below this value for the given threshold, but above the target threshold, the objective appears in a "warning" state. This value must be greater than the target threshold. errors [string] An array of error messages. Each endpoint documents how/whether this field is used. ``` { "data": { "configured_alert_ids": [ 123, 456, 789 ], "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "groups": [ "env:prod", "role:mysql" ], "id": "string", "modified_at": "integer", "monitor_ids": [], "monitor_tags": [], "name": "Custom Metric SLO", "query": { "denominator": "sum:my.custom.metric{*}.as_count()", "numerator": "sum:my.custom.metric{type:good}.as_count()" }, "sli_specification": { "time_slice": { "comparator": ">", "query": { "formulas": [ { "formula": "query1 - default_zero(query2)" } ], "queries": [ [] ] }, "query_interval_seconds": 300, "threshold": 5 } }, "tags": [ "env:prod", "app:core" ], "target_threshold": 99.9, "thresholds": [ { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } ], "timeframe": "30d", "type": "metric", "warning_threshold": 99.95 }, "errors": [] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Get an SLO's details Copy ``` # Path parameters export slo_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.datadoghq.eu) curl -X GET "https://api.datadoghq.com/api/v1/slo/${slo_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an SLO's details ``` """ Get an SLO's details returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.get_slo( slo_id=SLO_DATA_0_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an SLO's details ``` # Get an SLO's details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] p api_instance.get_slo(SLO_DATA_0_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY=""
DD_APP_KEY="" rb "example.rb" ``` ##### Get an SLO's details ``` # This is not currently available for the Ruby API. ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an SLO's details ``` // Get an SLO's details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.GetSLO(ctx, SloData0ID, *datadogV1.NewGetSLOOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.GetSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.GetSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an SLO's details ``` // Get an SLO's details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); try { SLOResponse result = apiInstance.getSLO(SLO_DATA_0_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#getSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an SLO's details ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } slo_id = '' initialize(**options) api.ServiceLevelObjective.get(slo_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following 
commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get an SLO's details ``` // Get an SLO's details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::GetSLOOptionalParams; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .get_slo(slo_data_0_id.clone(), GetSLOOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an SLO's details ``` /** * Get an SLO's details returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectivesApiGetSLORequest = { sloId: SLO_DATA_0_ID, }; apiInstance .getSLO(params) .then((data: v1.SLOResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#delete-an-slo) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#delete-an-slo-v1) DELETE https://api.ap1.datadoghq.com/api/v1/slo/{slo_id}https://api.ap2.datadoghq.com/api/v1/slo/{slo_id}https://api.datadoghq.eu/api/v1/slo/{slo_id}https://api.ddog-gov.com/api/v1/slo/{slo_id}https://api.datadoghq.com/api/v1/slo/{slo_id}https://api.us3.datadoghq.com/api/v1/slo/{slo_id}https://api.us5.datadoghq.com/api/v1/slo/{slo_id} ### Overview Permanently delete the specified service level objective object. If the SLO is referenced in a dashboard, the `DELETE /v1/slo/` endpoint returns a 409 conflict error. This endpoint requires the `slos_write` permission. OAuth apps require the `slos_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description slo_id [_required_] string The ID of the service level objective. #### Query Strings Name Type Description force string Delete the SLO even if it's referenced by other resources (for example, a dashboard).
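The code examples below do not exercise the `force` parameter. A minimal sketch of a forced deletion, assuming the Python client (`datadog-api-client`) exposes `force` as an optional keyword argument on `delete_slo` that mirrors the query string above; `SLO_ID` is a hypothetical environment variable:

```
"""
Hypothetical sketch: force-delete an SLO that is still referenced elsewhere.
Assumes the Python client forwards `force` as the query string described above.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi

SLO_ID = environ["SLO_ID"]  # hypothetical: the ID of the SLO to delete

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = ServiceLevelObjectivesApi(api_client)
    # Without force, deleting an SLO that is referenced in a dashboard
    # returns the 409 Conflict response documented below.
    response = api_instance.delete_slo(slo_id=SLO_ID, force="true")
    print(response)
```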
### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLO-200-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLO-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLO-404-v1) * [409](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLO-409-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A response list of all deleted service level objectives. Field Type Description data [string] An array containing the IDs of the deleted service level objective objects. errors object A dictionary containing the ID of the SLO as key and a deletion error as value. string Error preventing the SLO deletion. ``` { "data": [], "errors": { "": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A response list of all deleted service level objectives. Field Type Description data [string] An array containing the IDs of the deleted service level objective objects. errors object A dictionary containing the ID of the SLO as key and a deletion error as value. string Error preventing the SLO deletion. ``` { "data": [], "errors": { "": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Delete an SLO Copy ``` # Path parameters export slo_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with the API host for your Datadog site, for example api.datadoghq.eu) curl -X DELETE "https://api.datadoghq.com/api/v1/slo/${slo_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an SLO ``` """ Delete an SLO returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.delete_slo( slo_id=SLO_DATA_0_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an SLO ``` # Delete an SLO returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] p api_instance.delete_slo(SLO_DATA_0_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete an SLO ``` require 'dogapi' api_key = '' app_key = '' slo_id = '' dog = Dogapi::Client.new(api_key, app_key) dog.delete_service_level_objective(slo_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Delete an SLO ``` // Delete an SLO returns "OK" response package main import ( "context" "encoding/json" "fmt" "os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.DeleteSLO(ctx, SloData0ID, *datadogV1.NewDeleteSLOOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.DeleteSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.DeleteSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an SLO ``` // Delete an SLO returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLODeleteResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); try { SLODeleteResponse result = apiInstance.deleteSLO(SLO_DATA_0_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#deleteSLO"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an SLO ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } slo_id = '' initialize(**options) api.ServiceLevelObjective.delete(slo_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete an SLO ``` // Delete an SLO returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::DeleteSLOOptionalParams; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let configuration = 
datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .delete_slo(slo_data_0_id.clone(), DeleteSLOOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an SLO ``` /** * Delete an SLO returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectivesApiDeleteSLORequest = { sloId: SLO_DATA_0_ID, }; apiInstance .deleteSLO(params) .then((data: v1.SLODeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an SLO's history](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-history) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-an-slos-history-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/{slo_id}/historyhttps://api.ap2.datadoghq.com/api/v1/slo/{slo_id}/historyhttps://api.datadoghq.eu/api/v1/slo/{slo_id}/historyhttps://api.ddog-gov.com/api/v1/slo/{slo_id}/historyhttps://api.datadoghq.com/api/v1/slo/{slo_id}/historyhttps://api.us3.datadoghq.com/api/v1/slo/{slo_id}/historyhttps://api.us5.datadoghq.com/api/v1/slo/{slo_id}/history ### Overview Get a specific SLO’s history, regardless of its SLO type. The detailed history data is structured according to the source data type. For example, metric data is included for event SLOs that use the metric source, and monitor SLO types include the monitor transition history. **Note:** There are different response formats for event based and time based SLOs. Examples of both are shown. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description slo_id [_required_] string The ID of the service level objective object. #### Query Strings Name Type Description from_ts [_required_] integer The `from` timestamp for the query window in epoch seconds. to_ts [_required_] integer The `to` timestamp for the query window in epoch seconds. target number The SLO target. If `target` is passed in, the response will include the remaining error budget and a timeframe value of `custom`. apply_correction boolean Defaults to `true`. 
If any SLO corrections are applied and this parameter is set to `false`, then the corrections will not be applied and the SLI values will not be affected. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOHistory-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOHistory-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOHistory-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOHistory-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOHistory-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A service level objective history response. Field Type Description data object An array of service level objective objects. from_ts int64 The `from` timestamp in epoch seconds. group_by [string] For `metric` based SLOs where the query includes a group-by clause, this represents the list of grouping parameters. This is not included in responses for `monitor` based SLOs. groups [object] For grouped SLOs, this represents SLI data for specific groups. This is not included in the responses for `metric` based SLOs. error_budget_remaining object A mapping of threshold `timeframe` to the remaining error budget. double Remaining error budget. errors [object] An array of error objects returned while querying the history data for the service level objective. error_message [_required_] string A message with more details about the error. error_type [_required_] string Type of the error. group string For groups in a grouped SLO, this is the group name. history [array] The state transition history for the monitor. It is represented as an array of pairs. Each pair is an array containing the timestamp of the transition as an integer in Unix epoch format in the first element, and the state as an integer in the second element. An integer value of `0` for state means uptime, `1` means downtime, and `2` means no data. Periods of no data are counted either as uptime or downtime depending on monitor settings. See [SLO documentation](https://docs.datadoghq.com/service_management/service_level_objectives/monitor/#missing-data) for detailed information. monitor_modified int64 For `monitor` based SLOs, this is the last modified timestamp in epoch seconds of the monitor. monitor_type string For `monitor` based SLOs, this describes the type of monitor. name string For groups in a grouped SLO, this is the group name. For monitors in a multi-monitor SLO, this is the monitor name. precision double **DEPRECATED** : The amount of decimal places the SLI value is accurate to for the given from `&&` to timestamp. Use `span_precision` instead. preview boolean For `monitor` based SLOs, when `true` this indicates that a replay is in progress to give an accurate uptime calculation. sli_value double The current SLI value of the SLO over the history window. span_precision double The amount of decimal places the SLI value is accurate to for the given from `&&` to timestamp. uptime double **DEPRECATED** : Use `sli_value` instead. monitors [object] For multi-monitor SLOs, this represents SLI data for specific monitors. This is not included in the responses for `metric` based SLOs. error_budget_remaining object A mapping of threshold `timeframe` to the remaining error budget. double Remaining error budget. 
errors [object] An array of error objects returned while querying the history data for the service level objective. error_message [_required_] string A message with more details about the error. error_type [_required_] string Type of the error. group string For groups in a grouped SLO, this is the group name. history [array] The state transition history for the monitor. It is represented as an array of pairs. Each pair is an array containing the timestamp of the transition as an integer in Unix epoch format in the first element, and the state as an integer in the second element. An integer value of `0` for state means uptime, `1` means downtime, and `2` means no data. Periods of no data are counted either as uptime or downtime depending on monitor settings. See [SLO documentation](https://docs.datadoghq.com/service_management/service_level_objectives/monitor/#missing-data) for detailed information. monitor_modified int64 For `monitor` based SLOs, this is the last modified timestamp in epoch seconds of the monitor. monitor_type string For `monitor` based SLOs, this describes the type of monitor. name string For groups in a grouped SLO, this is the group name. For monitors in a multi-monitor SLO, this is the monitor name. precision double **DEPRECATED** : The amount of decimal places the SLI value is accurate to for the given from `&&` to timestamp. Use `span_precision` instead. preview boolean For `monitor` based SLOs, when `true` this indicates that a replay is in progress to give an accurate uptime calculation. sli_value double The current SLI value of the SLO over the history window. span_precision double The amount of decimal places the SLI value is accurate to for the given from `&&` to timestamp. uptime double **DEPRECATED** : Use `sli_value` instead. overall object An object that holds an SLI value and its associated data. It can represent an SLO's overall SLI value. This can also represent the SLI value for a specific monitor in multi-monitor SLOs, or a group in grouped SLOs. error_budget_remaining object A mapping of threshold `timeframe` to the remaining error budget. double Remaining error budget. errors [object] An array of error objects returned while querying the history data for the service level objective. error_message [_required_] string A message with more details about the error. error_type [_required_] string Type of the error. group string For groups in a grouped SLO, this is the group name. history [array] The state transition history for `monitor` or `time-slice` SLOs. It is represented as an array of pairs. Each pair is an array containing the timestamp of the transition as an integer in Unix epoch format in the first element, and the state as an integer in the second element. An integer value of `0` for state means uptime, `1` means downtime, and `2` means no data. Periods of no data count as uptime in time-slice SLOs, while for monitor SLOs, no data is counted either as uptime or downtime depending on monitor settings. See [SLO documentation](https://docs.datadoghq.com/service_management/service_level_objectives/monitor/#missing-data) for detailed information. monitor_modified int64 For `monitor` based SLOs, this is the last modified timestamp in epoch seconds of the monitor. monitor_type string For `monitor` based SLOs, this describes the type of monitor. name string For groups in a grouped SLO, this is the group name. For monitors in a multi-monitor SLO, this is the monitor name. 
precision object A mapping of threshold `timeframe` to number of accurate decimals, regardless of the from && to timestamp. double The number of accurate decimals. preview boolean For `monitor` based SLOs, when `true` this indicates that a replay is in progress to give an accurate uptime calculation. sli_value double The current SLI value of the SLO over the history window. span_precision double The amount of decimal places the SLI value is accurate to for the given from `&&` to timestamp. uptime double **DEPRECATED** : Use `sli_value` instead. series object A `metric` based SLO history response. This is not included in responses for `monitor` based SLOs. denominator [_required_] object A representation of `metric` based SLO timeseries for the provided queries. This is the same response type from `batch_query` endpoint. count [_required_] int64 Count of submitted metrics. metadata object Query metadata. aggr string **DEPRECATED** : Query aggregator function. expression string **DEPRECATED** : Query expression. metric string **DEPRECATED** : Query metric used. query_index int64 **DEPRECATED** : Query index from original combined query. scope string **DEPRECATED** : Query scope. unit [object] An array of metric units that contains up to two unit objects. For example, bytes represents one unit object and bytes per second represents two unit objects. If a metric query only has one unit object, the second array element is null. family string The family of metric unit, for example `bytes` is the family for `kibibyte`, `byte`, and `bit` units. id int64 The ID of the metric unit. name string The unit of the metric, for instance `byte`. plural string The plural Unit of metric, for instance `bytes`. scale_factor double The scale factor of metric unit, for instance `1.0`. short_name string A shorter and abbreviated version of the metric unit, for instance `B`. sum [_required_] double Total sum of the query. values [_required_] [number] The query values for each metric. interval [_required_] int64 The aggregated query interval for the series data. It's implicit based on the query time window. message string Optional message if there are specific query issues/warnings. numerator [_required_] object A representation of `metric` based SLO timeseries for the provided queries. This is the same response type from `batch_query` endpoint. count [_required_] int64 Count of submitted metrics. metadata object Query metadata. aggr string **DEPRECATED** : Query aggregator function. expression string **DEPRECATED** : Query expression. metric string **DEPRECATED** : Query metric used. query_index int64 **DEPRECATED** : Query index from original combined query. scope string **DEPRECATED** : Query scope. unit [object] An array of metric units that contains up to two unit objects. For example, bytes represents one unit object and bytes per second represents two unit objects. If a metric query only has one unit object, the second array element is null. family string The family of metric unit, for example `bytes` is the family for `kibibyte`, `byte`, and `bit` units. id int64 The ID of the metric unit. name string The unit of the metric, for instance `byte`. plural string The plural Unit of metric, for instance `bytes`. scale_factor double The scale factor of metric unit, for instance `1.0`. short_name string A shorter and abbreviated version of the metric unit, for instance `B`. sum [_required_] double Total sum of the query. values [_required_] [number] The query values for each metric. 
query [_required_] string The combined numerator and denominator query CSV. res_type [_required_] string The series result type. This mimics `batch_query` response type. resp_version [_required_] int64 The series response version type. This mimics `batch_query` response type. times [_required_] [number] An array of query timestamps in EPOCH milliseconds. thresholds object mapping of string timeframe to the SLO threshold. object SLO thresholds (target and optionally warning) for a single time window. target [_required_] double The target value for the service level indicator within the corresponding timeframe. target_display string A string representation of the target that indicates its precision. It uses trailing zeros to show significant decimal places (for example `98.00`). Always included in service level objective responses. Ignored in create/update requests. timeframe [_required_] enum The SLO time window options. Note that "custom" is not a valid option for creating or updating SLOs. It is only used when querying SLO history over custom timeframes. Allowed enum values: `7d,30d,90d,custom` warning double The warning value for the service level objective. warning_display string A string representation of the warning target (see the description of the `target_display` field for details). Included in service level objective responses if a warning target exists. Ignored in create/update requests. to_ts int64 The `to` timestamp in epoch seconds. type enum The type of the service level objective. Allowed enum values: `metric,monitor,time_slice` type_id enum A numeric representation of the type of the service level objective (`0` for monitor, `1` for metric). Always included in service level objective responses. Ignored in create/update requests. Allowed enum values: `0,1,2` errors [object] A list of errors while querying the history data for the service level objective. error string Human readable error. 
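To make the `history` encoding concrete before the example payload below: for monitor-based and time-slice SLOs, each entry is a `[timestamp, state]` pair where `0` means uptime, `1` means downtime, and `2` means no data. The following is a minimal sketch of reading those pairs and the overall SLI value from a raw JSON payload (for example, the body returned by the curl example further down); the payload used here is illustrative only, and a typed client response would expose the same fields as model attributes rather than dictionary keys:

```
# Illustrative only: summarize the `overall` section of an SLO history payload.
# State codes in each [timestamp, state] pair: 0 = uptime, 1 = downtime, 2 = no data.
STATE_LABELS = {0: "uptime", 1: "downtime", 2: "no data"}


def summarize_history(payload: dict) -> None:
    overall = payload["data"]["overall"]
    print(f"SLI over the window: {overall['sli_value']}")
    for timestamp, state in overall.get("history", []):
        print(f"{timestamp}: {STATE_LABELS.get(state, 'unknown')}")


# Hand-written payload shaped like the response model above.
summarize_history(
    {"data": {"overall": {"sli_value": 99.99, "history": [[1579212382, 0], [1579212682, 1]]}}}
)
```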
``` { "data": { "from_ts": 1615323990, "group_by": [], "groups": [ { "error_budget_remaining": { "": "number" }, "errors": [ { "error_message": "", "error_type": "" } ], "group": "name", "history": [ [ 1579212382, 0 ] ], "monitor_modified": 1615867200, "monitor_type": "string", "name": "string", "precision": 2, "preview": true, "sli_value": 99.99, "span_precision": 2, "uptime": 99.99 } ], "monitors": [ { "error_budget_remaining": { "": "number" }, "errors": [ { "error_message": "", "error_type": "" } ], "group": "name", "history": [ [ 1579212382, 0 ] ], "monitor_modified": 1615867200, "monitor_type": "string", "name": "string", "precision": 2, "preview": true, "sli_value": 99.99, "span_precision": 2, "uptime": 99.99 } ], "overall": { "error_budget_remaining": { "": "number" }, "errors": [ { "error_message": "", "error_type": "" } ], "group": "name", "history": [ [ 1579212382, 0 ] ], "monitor_modified": 1615867200, "monitor_type": "string", "name": "string", "precision": { "": "number" }, "preview": true, "sli_value": 99.99, "span_precision": 2, "uptime": 99.99 }, "series": { "denominator": { "count": 0, "metadata": { "aggr": "string", "expression": "string", "metric": "string", "query_index": "integer", "scope": "string", "unit": [ { "family": "bytes", "id": 2, "name": "byte", "plural": "bytes", "scale_factor": 1, "short_name": "B" } ] }, "sum": 0, "values": [ [] ] }, "interval": 0, "message": "", "numerator": { "count": 0, "metadata": { "aggr": "string", "expression": "string", "metric": "string", "query_index": "integer", "scope": "string", "unit": [ { "family": "bytes", "id": 2, "name": "byte", "plural": "bytes", "scale_factor": 1, "short_name": "B" } ] }, "sum": 0, "values": [ [] ] }, "query": "", "res_type": "", "resp_version": 0, "times": [ [] ] }, "thresholds": { "": { "target": 99.9, "target_display": "99.9", "timeframe": "30d", "warning": 90, "warning_display": "90.0" } }, "to_ts": 1615928790, "type": "metric", "type_id": 0 }, "errors": [ { "error": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Get an SLO's history Copy ``` # Path parameters export slo_id="CHANGE_ME" # Required query arguments export from_ts="CHANGE_ME" export to_ts="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo/${slo_id}/history?from_ts=${from_ts}&to_ts=${to_ts}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an SLO's history ``` """ Get an SLO's history returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.get_slo_history( slo_id=SLO_DATA_0_ID, from_ts=int((datetime.now() + relativedelta(days=-1)).timestamp()), to_ts=int(datetime.now().timestamp()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an SLO's history ``` # Get an SLO's history returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] p api_instance.get_slo_history(SLO_DATA_0_ID, (Time.now + -1 * 86400).to_i, Time.now.to_i) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an SLO's history ``` require 'dogapi' api_key = '' app_key = '' slo_id = '' dog = Dogapi::Client.new(api_key, app_key) to_ts = 1_571_320_613 from_ts = to_ts - 60 * 60 * 24 * 30 dog.get_service_level_objective_history(slo_id, from_ts, to_ts) ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an SLO's history ``` // Get an SLO's history returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.GetSLOHistory(ctx, SloData0ID, time.Now().AddDate(0, 0, -1).Unix(), time.Now().Unix(), *datadogV1.NewGetSLOHistoryOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.GetSLOHistory`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.GetSLOHistory`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an SLO's history ``` // Get an SLO's history returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOHistoryResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); try { SLOHistoryResponse result = apiInstance.getSLOHistory( SLO_DATA_0_ID, OffsetDateTime.now().plusDays(-1).toInstant().getEpochSecond(), OffsetDateTime.now().toInstant().getEpochSecond()); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#getSLOHistory"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an SLO's history ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } slo_id = '' initialize(**options) to_ts = int(time.time()) from_ts = to_ts - 60*60*24*30 
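# from_ts and to_ts are epoch seconds; this window covers the trailing 30 days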
api.ServiceLevelObjective.history(slo_id, from_ts=from_ts, to_ts=to_ts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get an SLO's history ``` // Get an SLO's history returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::GetSLOHistoryOptionalParams; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .get_slo_history( slo_data_0_id.clone(), 1636542671, 1636629071, GetSLOHistoryOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an SLO's history ``` /** * Get an SLO's history returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectivesApiGetSLOHistoryRequest = { sloId: SLO_DATA_0_ID, fromTs: Math.round( new Date(new Date().getTime() + -1 * 86400 * 1000).getTime() / 1000 ), toTs: Math.round(new Date().getTime() / 1000), }; apiInstance .getSLOHistory(params) .then((data: v1.SLOHistoryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Corrections For an SLO](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-corrections-for-an-slo) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-corrections-for-an-slo-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/{slo_id}/correctionshttps://api.ap2.datadoghq.com/api/v1/slo/{slo_id}/correctionshttps://api.datadoghq.eu/api/v1/slo/{slo_id}/correctionshttps://api.ddog-gov.com/api/v1/slo/{slo_id}/correctionshttps://api.datadoghq.com/api/v1/slo/{slo_id}/correctionshttps://api.us3.datadoghq.com/api/v1/slo/{slo_id}/correctionshttps://api.us5.datadoghq.com/api/v1/slo/{slo_id}/corrections ### Overview Get corrections applied to an SLO This endpoint requires the `slos_read` permission. 
OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description slo_id [_required_] string The ID of the service level objective object. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOCorrections-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOCorrections-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOCorrections-403-v1) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOCorrections-404-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOCorrections-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A list of SLO correction objects. Field Type Description data [object] The list of SLO corrections objects. attributes object The attribute object associated with the SLO correction. category enum Category the SLO correction belongs to. Allowed enum values: `Scheduled Maintenance,Outside Business Hours,Deployment,Other` created_at int64 The epoch timestamp of when the correction was created at. creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. description string Description of the correction being made. duration int64 Length of time (in seconds) for a specified `rrule` recurring SLO correction. end int64 Ending time of the correction in epoch seconds. modified_at int64 The epoch timestamp of when the correction was modified at. modifier object Modifier of the object. email string Email of the Modifier. handle string Handle of the Modifier. name string Name of the Modifier. rrule string The recurrence rules as defined in the iCalendar RFC 5545. The supported rules for SLO corrections are `FREQ`, `INTERVAL`, `COUNT`, `UNTIL` and `BYDAY`. slo_id string ID of the SLO that this correction applies to. start int64 Starting time of the correction in epoch seconds. timezone string The timezone to display in the UI for the correction times (defaults to "UTC"). id string The ID of the SLO correction. type enum SLO correction resource type. Allowed enum values: `correction` default: `correction` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "category": "Scheduled Maintenance", "created_at": "integer", "creator": { "email": "string", "handle": "string", "name": "string" }, "description": "string", "duration": 3600, "end": "integer", "modified_at": "integer", "modifier": { "email": "string", "handle": "string", "name": "string" }, "rrule": "FREQ=DAILY;INTERVAL=10;COUNT=5", "slo_id": "string", "start": "integer", "timezone": "string" }, "id": "string", "type": "correction" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Get Corrections For an SLO Copy ``` # Path parameters export slo_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo/${slo_id}/corrections" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Corrections For an SLO ``` """ Get Corrections For an SLO returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "slo" in the system SLO_DATA_0_ID = environ["SLO_DATA_0_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.get_slo_corrections( slo_id=SLO_DATA_0_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Corrections For an SLO ``` # Get Corrections For an SLO returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new # there is a valid "slo" in the system SLO_DATA_0_ID = ENV["SLO_DATA_0_ID"] p api_instance.get_slo_corrections(SLO_DATA_0_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example 
to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Corrections For an SLO ``` // Get Corrections For an SLO returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "slo" in the system SloData0ID := os.Getenv("SLO_DATA_0_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.GetSLOCorrections(ctx, SloData0ID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.GetSLOCorrections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.GetSLOCorrections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Corrections For an SLO ``` // Get Corrections For an SLO returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOCorrectionListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "slo" in the system String SLO_DATA_0_ID = System.getenv("SLO_DATA_0_ID"); try { SLOCorrectionListResponse result = apiInstance.getSLOCorrections(SLO_DATA_0_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#getSLOCorrections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Corrections For an SLO ``` // Get Corrections For an SLO returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "slo" in the system let slo_data_0_id = std::env::var("SLO_DATA_0_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.get_slo_corrections(slo_data_0_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Corrections For an SLO ``` /** * Get Corrections For an SLO returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); // there is a valid "slo" in the system const SLO_DATA_0_ID = process.env.SLO_DATA_0_ID as string; const params: v1.ServiceLevelObjectivesApiGetSLOCorrectionsRequest = { sloId: SLO_DATA_0_ID, }; apiInstance .getSLOCorrections(params) .then((data: v1.SLOCorrectionListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Check if SLOs can be safely deleted](https://docs.datadoghq.com/api/latest/service-level-objectives/#check-if-slos-can-be-safely-deleted) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#check-if-slos-can-be-safely-deleted-v1) GET https://api.ap1.datadoghq.com/api/v1/slo/can_deletehttps://api.ap2.datadoghq.com/api/v1/slo/can_deletehttps://api.datadoghq.eu/api/v1/slo/can_deletehttps://api.ddog-gov.com/api/v1/slo/can_deletehttps://api.datadoghq.com/api/v1/slo/can_deletehttps://api.us3.datadoghq.com/api/v1/slo/can_deletehttps://api.us5.datadoghq.com/api/v1/slo/can_delete ### Overview Check if an SLO can be safely deleted. For example, assure an SLO can be deleted without disrupting a dashboard. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Query Strings Name Type Description ids [_required_] string A comma separated list of the IDs of the service level objectives objects. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#CheckCanDeleteSLO-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#CheckCanDeleteSLO-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#CheckCanDeleteSLO-403-v1) * [409](https://docs.datadoghq.com/api/latest/service-level-objectives/#CheckCanDeleteSLO-409-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#CheckCanDeleteSLO-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A service level objective response containing the requested object. Field Type Description data object An array of service level objective objects. ok [string] An array of SLO IDs that can be safely deleted. errors object A mapping of SLO id to it's current usages. string Description of the service level objective reference. 
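As a small illustration of how this response is typically consumed, the sketch below splits the result into SLOs that are safe to delete and SLOs that are still referenced elsewhere. It is plain Python over the parsed JSON body; the IDs and usage text are illustrative, not real values.

```
# Minimal sketch: partition a "can delete" response (parsed into a dict) into
# SLOs that are safe to delete and SLOs that are blocked by existing usages.
def partition_deletable(can_delete_response: dict):
    safe_ids = list((can_delete_response.get("data") or {}).get("ok") or [])
    # errors maps an SLO ID to a description of where it is still referenced,
    # for example a dashboard that would break if the SLO were deleted.
    blocked = dict(can_delete_response.get("errors") or {})
    return safe_ids, blocked


# Illustrative usage with the documented response shape.
response = {
    "data": {"ok": ["slo-id-1"]},
    "errors": {"slo-id-2": "referenced by a dashboard"},
}
safe, blocked = partition_deletable(response)
print("safe to delete:", safe)
for slo_id, usage in blocked.items():
    print(f"{slo_id} is still in use: {usage}")
```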
``` { "data": { "ok": [] }, "errors": { "": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) A service level objective response containing the requested object. Field Type Description data object An array of service level objective objects. ok [string] An array of SLO IDs that can be safely deleted. errors object A mapping of SLO id to it's current usages. string Description of the service level objective reference. ``` { "data": { "ok": [] }, "errors": { "": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Check if SLOs can be safely deleted Copy ``` # Required query arguments export ids="id1, id2, id3" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo/can_delete?ids=${ids}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Check if SLOs can be safely deleted ``` """ Check if SLOs can be safely deleted returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.check_can_delete_slo( ids="ids", ) print(response) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Check if SLOs can be safely deleted ``` # Check if SLOs can be safely deleted returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new p api_instance.check_can_delete_slo("ids") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check if SLOs can be safely deleted ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) slo_ids = [''] dog.can_delete_service_level_objective(slo_ids) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Check if SLOs can be safely deleted ``` // Check if SLOs can be safely deleted returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.CheckCanDeleteSLO(ctx, "ids") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.CheckCanDeleteSLO`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.CheckCanDeleteSLO`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Check if SLOs can be safely deleted ``` // Check if SLOs can be safely deleted returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.CheckCanDeleteSLOResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); try { CheckCanDeleteSLOResponse result = apiInstance.checkCanDeleteSLO("id1, id2, id3"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#checkCanDeleteSLO"); System.err.println("Status 
code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Check if SLOs can be safely deleted ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } slo_ids = [''] initialize(**options) api.ServiceLevelObjective.can_delete(slo_ids) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Check if SLOs can be safely deleted ``` // Check if SLOs can be safely deleted returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.check_can_delete_slo("ids".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Check if SLOs can be safely deleted ``` /** * Check if SLOs can be safely deleted returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); const params: v1.ServiceLevelObjectivesApiCheckCanDeleteSLORequest = { ids: "ids", }; apiInstance .checkCanDeleteSLO(params) .then((data: v1.CheckCanDeleteSLOResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Bulk Delete SLO Timeframes](https://docs.datadoghq.com/api/latest/service-level-objectives/#bulk-delete-slo-timeframes) * [v1 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#bulk-delete-slo-timeframes-v1) POST https://api.ap1.datadoghq.com/api/v1/slo/bulk_deletehttps://api.ap2.datadoghq.com/api/v1/slo/bulk_deletehttps://api.datadoghq.eu/api/v1/slo/bulk_deletehttps://api.ddog-gov.com/api/v1/slo/bulk_deletehttps://api.datadoghq.com/api/v1/slo/bulk_deletehttps://api.us3.datadoghq.com/api/v1/slo/bulk_deletehttps://api.us5.datadoghq.com/api/v1/slo/bulk_delete ### Overview Delete (or partially delete) multiple service level objective objects. This endpoint facilitates deletion of one or more thresholds for one or more service level objective objects. If all thresholds are deleted, the service level objective object is deleted as well. This endpoint requires the `slos_write` permission. OAuth apps require the `slos_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Request #### Body Data (required) Delete multiple service level objective objects request body. * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Expand All Field Type Description [string] For each SLO ID used as a key, an array of the timeframes to delete. ``` { "id1": [ "7d", "30d" ], "id2": [ "7d", "30d" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLOTimeframeInBulk-200-v1) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLOTimeframeInBulk-400-v1) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLOTimeframeInBulk-403-v1) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#DeleteSLOTimeframeInBulk-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) The bulk partial delete service level objective object endpoint response. This endpoint operates on multiple service level objective objects, so it may be partially successful. In such cases, the “data” and “error” fields in this response indicate which deletions succeeded and failed. Field Type Description data object An object listing the affected service level objective IDs. deleted [string] An array of service level objective object IDs that indicates which objects were completely deleted. updated [string] An array of service level objective object IDs that indicates which objects were modified (objects for which at least one threshold was deleted, but that were not completely deleted). errors [object] Array of error objects returned. id [_required_] string The ID of the service level objective object associated with this error. message [_required_] string The error message. timeframe [_required_] enum The timeframe of the threshold associated with this error or "all" if all thresholds are affected.
Allowed enum values: `7d,30d,90d,all` ``` { "data": { "deleted": [], "updated": [] }, "errors": [ { "id": "", "message": "", "timeframe": "30d" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python-legacy) ##### Bulk Delete SLO Timeframes Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/slo/bulk_delete" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "id1": [ "7d", "30d" ], "id2": [ "7d", "30d" ] } EOF ``` ##### Bulk Delete SLO Timeframes ``` """ Bulk Delete SLO Timeframes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.service_level_objectives_api import ServiceLevelObjectivesApi from datadog_api_client.v1.model.slo_bulk_delete import SLOBulkDelete from datadog_api_client.v1.model.slo_timeframe import SLOTimeframe body = SLOBulkDelete( id1=[ SLOTimeframe.SEVEN_DAYS, SLOTimeframe.THIRTY_DAYS, ], id2=[ SLOTimeframe.SEVEN_DAYS, SLOTimeframe.THIRTY_DAYS, ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.delete_slo_timeframe_in_bulk(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
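# Set DD_SITE to exactly one Datadog site, for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com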
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Bulk Delete SLO Timeframes ``` # Bulk Delete SLO Timeframes returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::ServiceLevelObjectivesAPI.new body = { id1: [ DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, DatadogAPIClient::V1::SLOTimeframe::THIRTY_DAYS, ], id2: [ DatadogAPIClient::V1::SLOTimeframe::SEVEN_DAYS, DatadogAPIClient::V1::SLOTimeframe::THIRTY_DAYS, ], } p api_instance.delete_slo_timeframe_in_bulk(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Bulk Delete SLO Timeframes ``` require 'dogapi' api_key = '' app_key = '' slo_id_one = ''.freeze slo_id_two = ''.freeze dog = Dogapi::Client.new(api_key, app_key) # Delete multiple timeframes thresholds = { slo_id_one => %w[7d], slo_id_two => %w[7d 30d] } dog.delete_timeframes_service_level_objective(thresholds) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Bulk Delete SLO Timeframes ``` // Bulk Delete SLO Timeframes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := map[string][]datadogV1.SLOTimeframe{ "id1": []datadogV1.SLOTimeframe{ datadogV1.SLOTIMEFRAME_SEVEN_DAYS, datadogV1.SLOTIMEFRAME_THIRTY_DAYS, }, "id2": []datadogV1.SLOTimeframe{ datadogV1.SLOTIMEFRAME_SEVEN_DAYS, datadogV1.SLOTIMEFRAME_THIRTY_DAYS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.DeleteSLOTimeframeInBulk(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.DeleteSLOTimeframeInBulk`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.DeleteSLOTimeframeInBulk`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Bulk Delete SLO Timeframes ``` // Bulk Delete SLO Timeframes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v1.model.SLOBulkDeleteResponse; import 
com.datadog.api.client.v1.model.SLOTimeframe; import java.util.Arrays; import java.util.List; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); Map<String, List<SLOTimeframe>> body = Map.ofEntries( Map.entry("id1", Arrays.asList(SLOTimeframe.SEVEN_DAYS, SLOTimeframe.THIRTY_DAYS)), Map.entry("id2", Arrays.asList(SLOTimeframe.SEVEN_DAYS, SLOTimeframe.THIRTY_DAYS))); try { SLOBulkDeleteResponse result = apiInstance.deleteSLOTimeframeInBulk(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceLevelObjectivesApi#deleteSLOTimeframeInBulk"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Bulk Delete SLO Timeframes ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } slo_id_1 = '' slo_id_2 = '' initialize(**options) delete_timeframes = { slo_id_1: ["7d"], slo_id_2: ["7d", "30d"] } api.ServiceLevelObjective.bulk_delete(delete_timeframes) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Bulk Delete SLO Timeframes ``` // Bulk Delete SLO Timeframes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_service_level_objectives::ServiceLevelObjectivesAPI; use datadog_api_client::datadogV1::model::SLOTimeframe; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = BTreeMap::from([ ( "id1".to_string(), vec![SLOTimeframe::SEVEN_DAYS, SLOTimeframe::THIRTY_DAYS], ), ( "id2".to_string(), vec![SLOTimeframe::SEVEN_DAYS, SLOTimeframe::THIRTY_DAYS], ), ]); let configuration = datadog::Configuration::new(); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.delete_slo_timeframe_in_bulk(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Bulk Delete SLO Timeframes ``` /** * Bulk Delete SLO Timeframes returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.ServiceLevelObjectivesApi(configuration); const params: v1.ServiceLevelObjectivesApiDeleteSLOTimeframeInBulkRequest = { body: { id1: ["7d", "30d"],
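    // each key is an SLO ID; the listed timeframes are the thresholds to delete for that SLO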
id2: ["7d", "30d"], }, }; apiInstance .deleteSLOTimeframeInBulk(params) .then((data: v1.SLOBulkDeleteResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a new SLO report](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-a-new-slo-report) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#create-a-new-slo-report-v2) **Note** : This feature is in private beta. To request access, use the request access form in the [Service Level Objectives](https://docs.datadoghq.com/service_management/service_level_objectives/#slo-csv-export) docs. POST https://api.ap1.datadoghq.com/api/v2/slo/reporthttps://api.ap2.datadoghq.com/api/v2/slo/reporthttps://api.datadoghq.eu/api/v2/slo/reporthttps://api.ddog-gov.com/api/v2/slo/reporthttps://api.datadoghq.com/api/v2/slo/reporthttps://api.us3.datadoghq.com/api/v2/slo/reporthttps://api.us5.datadoghq.com/api/v2/slo/report ### Overview Create a job to generate an SLO report. The report job is processed asynchronously and eventually results in a CSV report being available for download. Check the status of the job and download the CSV report using the returned `report_id`. This endpoint requires the `slos_read` permission. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Request #### Body Data (required) Create SLO report job request body. * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Field Type Description data [_required_] object The data portion of the SLO report request. attributes [_required_] object The attributes portion of the SLO report request. from_ts [_required_] int64 The `from` timestamp for the report in epoch seconds. interval enum The frequency at which report data is to be generated. Allowed enum values: `daily,weekly,monthly` query [_required_] string The query string used to filter SLO results. Some examples of queries include `service:` and `slo-name`. timezone string The timezone used to determine the start and end of each interval. For example, weekly intervals start at 12am on Sunday in the specified timezone. to_ts [_required_] int64 The `to` timestamp for the report in epoch seconds. 
``` { "data": { "attributes": { "from_ts": 1633173071, "to_ts": 1636629071, "query": "slo_type:metric \"SLO Reporting Test\"", "interval": "monthly", "timezone": "America/New_York" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLOReportJob-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLOReportJob-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLOReportJob-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#CreateSLOReportJob-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) The SLO report response. Field Type Description data object The data portion of the SLO report response. id string The ID of the report job. type string The type of ID. ``` { "data": { "id": "dc8d92aa-e0af-11ee-af21-1feeaccaa3a3", "type": "report_id" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Create a new SLO report returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/slo/report" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "from_ts": 1633173071, "to_ts": 1636629071, "query": "slo_type:metric \"SLO Reporting Test\"", "interval": "monthly", "timezone": "America/New_York" } } } EOF ``` ##### Create a new SLO report returns "OK" response ``` // Create a new SLO report returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SloReportCreateRequest{ Data: datadogV2.SloReportCreateRequestData{ Attributes: datadogV2.SloReportCreateRequestAttributes{ FromTs: time.Now().AddDate(0, 0, -40).Unix(), ToTs: time.Now().Unix(), Query: `slo_type:metric "SLO Reporting Test"`, Interval: datadogV2.SLOREPORTINTERVAL_MONTHLY.Ptr(), Timezone: datadog.PtrString("America/New_York"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateSLOReportJob", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.CreateSLOReportJob(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.CreateSLOReportJob`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.CreateSLOReportJob`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new SLO report returns "OK" response ``` // Create a new SLO report returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v2.model.SLOReportInterval; import com.datadog.api.client.v2.model.SLOReportPostResponse; import com.datadog.api.client.v2.model.SloReportCreateRequest; import 
com.datadog.api.client.v2.model.SloReportCreateRequestAttributes; import com.datadog.api.client.v2.model.SloReportCreateRequestData; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createSLOReportJob", true); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); SloReportCreateRequest body = new SloReportCreateRequest() .data( new SloReportCreateRequestData() .attributes( new SloReportCreateRequestAttributes() .fromTs(OffsetDateTime.now().plusDays(-40).toInstant().getEpochSecond()) .toTs(OffsetDateTime.now().toInstant().getEpochSecond()) .query(""" slo_type:metric "SLO Reporting Test" """) .interval(SLOReportInterval.MONTHLY) .timezone("America/New_York"))); try { SLOReportPostResponse result = apiInstance.createSLOReportJob(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#createSLOReportJob"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new SLO report returns "OK" response ``` """ Create a new SLO report returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_level_objectives_api import ServiceLevelObjectivesApi from datadog_api_client.v2.model.slo_report_create_request import SloReportCreateRequest from datadog_api_client.v2.model.slo_report_create_request_attributes import SloReportCreateRequestAttributes from datadog_api_client.v2.model.slo_report_create_request_data import SloReportCreateRequestData from datadog_api_client.v2.model.slo_report_interval import SLOReportInterval body = SloReportCreateRequest( data=SloReportCreateRequestData( attributes=SloReportCreateRequestAttributes( from_ts=int((datetime.now() + relativedelta(days=-40)).timestamp()), to_ts=int(datetime.now().timestamp()), query='slo_type:metric "SLO Reporting Test"', interval=SLOReportInterval.MONTHLY, timezone="America/New_York", ), ), ) configuration = Configuration() configuration.unstable_operations["create_slo_report_job"] = True with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.create_slo_report_job(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new SLO report returns "OK" response ``` # Create a new SLO report returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| 
config.unstable_operations["v2.create_slo_report_job".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceLevelObjectivesAPI.new body = DatadogAPIClient::V2::SloReportCreateRequest.new({ data: DatadogAPIClient::V2::SloReportCreateRequestData.new({ attributes: DatadogAPIClient::V2::SloReportCreateRequestAttributes.new({ from_ts: (Time.now + -40 * 86400).to_i, to_ts: Time.now.to_i, query: 'slo_type:metric "SLO Reporting Test"', interval: DatadogAPIClient::V2::SLOReportInterval::MONTHLY, timezone: "America/New_York", }), }), }) p api_instance.create_slo_report_job(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new SLO report returns "OK" response ``` // Create a new SLO report returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_level_objectives::ServiceLevelObjectivesAPI; use datadog_api_client::datadogV2::model::SLOReportInterval; use datadog_api_client::datadogV2::model::SloReportCreateRequest; use datadog_api_client::datadogV2::model::SloReportCreateRequestAttributes; use datadog_api_client::datadogV2::model::SloReportCreateRequestData; #[tokio::main] async fn main() { let body = SloReportCreateRequest::new(SloReportCreateRequestData::new( SloReportCreateRequestAttributes::new( 1633173071, r#"slo_type:metric "SLO Reporting Test""#.to_string(), 1636629071, ) .interval(SLOReportInterval::MONTHLY) .timezone("America/New_York".to_string()), )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateSLOReportJob", true); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api.create_slo_report_job(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new SLO report returns "OK" response ``` /** * Create a new SLO report returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createSLOReportJob"] = true; const apiInstance = new v2.ServiceLevelObjectivesApi(configuration); const params: v2.ServiceLevelObjectivesApiCreateSLOReportJobRequest = { body: { data: { attributes: { fromTs: Math.round( new Date(new Date().getTime() + -40 * 86400 * 1000).getTime() / 1000 ), toTs: Math.round(new Date().getTime() / 1000), query: `slo_type:metric "SLO Reporting Test"`, interval: "monthly", timezone: "America/New_York", }, }, }, }; apiInstance .createSLOReportJob(params) .then((data: v2.SLOReportPostResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get SLO report status](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report-status) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report-status-v2) **Note** : This feature is in private beta. To request access, use the request access form in the [Service Level Objectives](https://docs.datadoghq.com/service_management/service_level_objectives/#slo-csv-export) docs. GET https://api.ap1.datadoghq.com/api/v2/slo/report/{report_id}/statushttps://api.ap2.datadoghq.com/api/v2/slo/report/{report_id}/statushttps://api.datadoghq.eu/api/v2/slo/report/{report_id}/statushttps://api.ddog-gov.com/api/v2/slo/report/{report_id}/statushttps://api.datadoghq.com/api/v2/slo/report/{report_id}/statushttps://api.us3.datadoghq.com/api/v2/slo/report/{report_id}/statushttps://api.us5.datadoghq.com/api/v2/slo/report/{report_id}/status ### Overview Get the status of the SLO report job. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description report_id [_required_] string The ID of the report job. ### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReportJobStatus-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReportJobStatus-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReportJobStatus-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReportJobStatus-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReportJobStatus-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) The SLO report status response. Field Type Description data object The data portion of the SLO report status response. attributes object The attributes portion of the SLO report status response. status enum The status of the SLO report job. Allowed enum values: `in_progress,completed,completed_with_errors,failed` id string The ID of the report job. type string The type of ID. ``` { "data": { "attributes": { "status": "completed" }, "id": "dc8d92aa-e0af-11ee-af21-1feeaccaa3a3", "type": "report_id" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Get SLO report status Copy ``` # Path parameters export report_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/slo/report/${report_id}/status" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get SLO report status ``` """ Get SLO report status returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_level_objectives_api import ServiceLevelObjectivesApi # there is a valid "report" in the system REPORT_DATA_ID = environ["REPORT_DATA_ID"] configuration = Configuration() configuration.unstable_operations["get_slo_report_job_status"] = True with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.get_slo_report_job_status( report_id=REPORT_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get SLO report status ``` # Get SLO report status returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_slo_report_job_status".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceLevelObjectivesAPI.new # there is a valid "report" in the system REPORT_DATA_ID = ENV["REPORT_DATA_ID"] p api_instance.get_slo_report_job_status(REPORT_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" rb "example.rb" ``` ##### Get SLO report status ``` // Get SLO report status returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "report" in the system ReportDataID := os.Getenv("REPORT_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSLOReportJobStatus", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.GetSLOReportJobStatus(ctx, ReportDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.GetSLOReportJobStatus`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceLevelObjectivesApi.GetSLOReportJobStatus`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get SLO report status ``` // Get SLO report status returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceLevelObjectivesApi; import com.datadog.api.client.v2.model.SLOReportStatusGetResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSLOReportJobStatus", true); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); // there is a valid "report" in the system String REPORT_DATA_ID = System.getenv("REPORT_DATA_ID"); try { SLOReportStatusGetResponse result = apiInstance.getSLOReportJobStatus(REPORT_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#getSLOReportJobStatus"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get SLO report status ``` // Get SLO report status returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { // there is a valid "report" in the system let report_data_id = std::env::var("REPORT_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSLOReportJobStatus", true); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = 
api.get_slo_report_job_status(report_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get SLO report status ``` /** * Get SLO report status returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSLOReportJobStatus"] = true; const apiInstance = new v2.ServiceLevelObjectivesApi(configuration); // there is a valid "report" in the system const REPORT_DATA_ID = process.env.REPORT_DATA_ID as string; const params: v2.ServiceLevelObjectivesApiGetSLOReportJobStatusRequest = { reportId: REPORT_DATA_ID, }; apiInstance .getSLOReportJobStatus(params) .then((data: v2.SLOReportStatusGetResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get SLO report](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-level-objectives/#get-slo-report-v2) **Note** : This feature is in private beta. To request access, use the request access form in the [Service Level Objectives](https://docs.datadoghq.com/service_management/service_level_objectives/#slo-csv-export) docs. GET https://api.ap1.datadoghq.com/api/v2/slo/report/{report_id}/downloadhttps://api.ap2.datadoghq.com/api/v2/slo/report/{report_id}/downloadhttps://api.datadoghq.eu/api/v2/slo/report/{report_id}/downloadhttps://api.ddog-gov.com/api/v2/slo/report/{report_id}/downloadhttps://api.datadoghq.com/api/v2/slo/report/{report_id}/downloadhttps://api.us3.datadoghq.com/api/v2/slo/report/{report_id}/downloadhttps://api.us5.datadoghq.com/api/v2/slo/report/{report_id}/download ### Overview Download an SLO report. This can only be performed after the report job has completed. Reports are not guaranteed to exist indefinitely. Datadog recommends that you download the report as soon as it is available. OAuth apps require the `slos_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-level-objectives) to access this endpoint. ### Arguments #### Path Parameters Name Type Description report_id [_required_] string The ID of the report job. 
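The three SLO report endpoints on this page are meant to be used together: create a report job, poll its status until it reaches a terminal state (`completed`, `completed_with_errors`, or `failed`), and then download the CSV while it is still available. The sketch below chains those calls with the Python client used in the examples on this page. It is illustrative only: the 10-second poll interval, the `slo_report.csv` output filename, the attribute paths (`data.id`, `data.attributes.status`), the `str()` coercion of the status value, and the assumption that `get_slo_report` returns the report body as a string (as the Java and TypeScript examples suggest) are choices made for this sketch rather than documented behavior.
```
"""
Create an SLO report job, wait for it to finish, then download the CSV.
Minimal sketch built on the datadog_api_client examples from this page.
"""
import time
from datetime import datetime
from dateutil.relativedelta import relativedelta

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.service_level_objectives_api import ServiceLevelObjectivesApi
from datadog_api_client.v2.model.slo_report_create_request import SloReportCreateRequest
from datadog_api_client.v2.model.slo_report_create_request_attributes import SloReportCreateRequestAttributes
from datadog_api_client.v2.model.slo_report_create_request_data import SloReportCreateRequestData
from datadog_api_client.v2.model.slo_report_interval import SLOReportInterval

configuration = Configuration()
# All three SLO report operations are unstable and must be enabled explicitly.
configuration.unstable_operations["create_slo_report_job"] = True
configuration.unstable_operations["get_slo_report_job_status"] = True
configuration.unstable_operations["get_slo_report"] = True

body = SloReportCreateRequest(
    data=SloReportCreateRequestData(
        attributes=SloReportCreateRequestAttributes(
            from_ts=int((datetime.now() + relativedelta(days=-40)).timestamp()),
            to_ts=int(datetime.now().timestamp()),
            query='slo_type:metric "SLO Reporting Test"',
            interval=SLOReportInterval.MONTHLY,
            timezone="America/New_York",
        ),
    ),
)

with ApiClient(configuration) as api_client:
    api = ServiceLevelObjectivesApi(api_client)

    # 1. Create the report job and keep its ID for the follow-up calls.
    report_id = api.create_slo_report_job(body=body).data.id

    # 2. Poll until the job leaves the in_progress state. The substring check
    #    keeps the sketch independent of how the client wraps the status value.
    status = "in_progress"
    while "in_progress" in status:
        time.sleep(10)  # assumed poll interval; adjust as needed
        status = str(api.get_slo_report_job_status(report_id=report_id).data.attributes.status)
        print("SLO report job status:", status)

    # 3. Download promptly -- reports are not guaranteed to exist indefinitely.
    if "failed" not in status:
        report_csv = api.get_slo_report(report_id=report_id)
        with open("slo_report.csv", "w") as f:
            f.write(report_csv)
        print("Report written to slo_report.csv")
```
Run it like the other Python examples on this page, with `DD_SITE`, `DD_API_KEY`, and `DD_APP_KEY` set in the environment.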
### Response * [200](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReport-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReport-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReport-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReport-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-level-objectives/#GetSLOReport-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-level-objectives/) * [Example](https://docs.datadoghq.com/api/latest/service-level-objectives/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-level-objectives/?code-lang=typescript) ##### Get SLO report Copy ``` # Path parameters export report_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/slo/report/${report_id}/download" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get SLO report ``` """ Get SLO report returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_level_objectives_api import ServiceLevelObjectivesApi configuration = Configuration() configuration.unstable_operations["get_slo_report"] = True with ApiClient(configuration) as api_client: api_instance = ServiceLevelObjectivesApi(api_client) response = api_instance.get_slo_report( report_id="9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get SLO report ``` # Get SLO report returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_slo_report".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceLevelObjectivesAPI.new p api_instance.get_slo_report("9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get SLO report ``` // Get SLO report returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSLOReport", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceLevelObjectivesApi(apiClient) resp, r, err := api.GetSLOReport(ctx, "9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceLevelObjectivesApi.GetSLOReport`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } fmt.Fprintf(os.Stdout, "Response from 
`ServiceLevelObjectivesApi.GetSLOReport`:\n%s\n", resp) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get SLO report ``` // Get SLO report returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceLevelObjectivesApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSLOReport", true); ServiceLevelObjectivesApi apiInstance = new ServiceLevelObjectivesApi(defaultClient); try { String result = apiInstance.getSLOReport("9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceLevelObjectivesApi#getSLOReport"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get SLO report ``` // Get SLO report returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_level_objectives::ServiceLevelObjectivesAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSLOReport", true); let api = ServiceLevelObjectivesAPI::with_config(configuration); let resp = api .get_slo_report("9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get SLO report ``` /** * Get SLO report returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSLOReport"] = true; const apiInstance = new v2.ServiceLevelObjectivesApi(configuration); const params: v2.ServiceLevelObjectivesApiGetSLOReportRequest = { reportId: "9fb2dc2a-ead0-11ee-a174-9fe3a9d7627f", }; apiInstance .getSLOReport(params) .then((data: string) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=123ba612-2eed-4200-81dc-6ea11c312473&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=2ffdaa14-0623-445e-9f19-d0fec4c34cb3&pt=Service%20Level%20Objectives&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-level-objectives%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=123ba612-2eed-4200-81dc-6ea11c312473&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=2ffdaa14-0623-445e-9f19-d0fec4c34cb3&pt=Service%20Level%20Objectives&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-level-objectives%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=024537a3-b549-45e5-8a80-2116ef2e82c3&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Service%20Level%20Objectives&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fservice-level-objectives%2F&r=<=12673&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=842901) --- # Source: https://docs.datadoghq.com/api/latest/service-scorecards # Service Scorecards API to create and update scorecard rules and outcomes. See [Service Scorecards](https://docs.datadoghq.com/service_catalog/scorecards) for more information. This feature is currently in BETA. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). ## [Create a new rule](https://docs.datadoghq.com/api/latest/service-scorecards/#create-a-new-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#create-a-new-rule-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/scorecard/ruleshttps://api.ap2.datadoghq.com/api/v2/scorecard/ruleshttps://api.datadoghq.eu/api/v2/scorecard/ruleshttps://api.ddog-gov.com/api/v2/scorecard/ruleshttps://api.datadoghq.com/api/v2/scorecard/ruleshttps://api.us3.datadoghq.com/api/v2/scorecard/ruleshttps://api.us5.datadoghq.com/api/v2/scorecard/rules ### Overview Creates a new rule. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Request #### Body Data (required) Rule attributes. * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Field Type Description data object Scorecard create rule request data. attributes object Details of a rule. category string **DEPRECATED** : The scorecard name to which this rule must belong. 
created_at date-time Creation time of the rule outcome. custom boolean Defines if the rule is a custom rule. description string Explanation of the rule. enabled boolean If enabled, the rule is calculated as part of the score. level int32 The maturity level of the rule (1, 2, or 3). modified_at date-time Time of the last rule outcome modification. name string Name of the rule. owner string Owner of the rule. scorecard_name string The scorecard name to which this rule must belong. type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` ``` { "data": { "attributes": { "enabled": true, "name": "Example-Service-Scorecard", "scorecard_name": "Observability Best Practices" }, "type": "rule" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardRule-201-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardRule-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Created rule in response. Field Type Description data object Create rule response data. attributes object Details of a rule. category string **DEPRECATED** : The scorecard name to which this rule must belong. created_at date-time Creation time of the rule outcome. custom boolean Defines if the rule is a custom rule. description string Explanation of the rule. enabled boolean If enabled, the rule is calculated as part of the score. level int32 The maturity level of the rule (1, 2, or 3). modified_at date-time Time of the last rule outcome modification. name string Name of the rule. owner string Owner of the rule. scorecard_name string The scorecard name to which this rule must belong. id string The unique ID for a scorecard rule. relationships object Scorecard create rule response relationship. scorecard object Relationship data for a rule. data object Rule relationship data. id string The unique ID for a scorecard. type enum The JSON:API type for scorecard. Allowed enum values: `scorecard` default: `scorecard` type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` ``` { "data": { "attributes": { "category": "string", "created_at": "2019-09-19T10:00:00.000Z", "custom": false, "description": "string", "enabled": true, "level": 2, "modified_at": "2019-09-19T10:00:00.000Z", "name": "Team Defined", "owner": "string", "scorecard_name": "Deployments automated via Deployment Trains" }, "id": "q8MQxk8TCqrHnWkx", "relationships": { "scorecard": { "data": { "id": "q8MQxk8TCqrHnWkp", "type": "scorecard" } } }, "type": "rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### Create a new rule returns "Created" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scorecard/rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Example-Service-Scorecard", "scorecard_name": "Observability Best Practices" }, "type": "rule" } } EOF ``` ##### Create a new rule returns "Created" response ``` // Create a new rule returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateRuleRequest{ Data: &datadogV2.CreateRuleRequestData{ Attributes: &datadogV2.RuleAttributes{ Enabled: datadog.PtrBool(true), Name: datadog.PtrString("Example-Service-Scorecard"), ScorecardName: datadog.PtrString("Observability Best Practices"), }, Type: datadogV2.RULETYPE_RULE.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateScorecardRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) resp, r, err := api.CreateScorecardRule(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.CreateScorecardRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceScorecardsApi.CreateScorecardRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a new rule returns "Created" response ``` // Create a new rule returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.CreateRuleRequest; import 
com.datadog.api.client.v2.model.CreateRuleRequestData; import com.datadog.api.client.v2.model.CreateRuleResponse; import com.datadog.api.client.v2.model.RuleAttributes; import com.datadog.api.client.v2.model.RuleType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createScorecardRule", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); CreateRuleRequest body = new CreateRuleRequest() .data( new CreateRuleRequestData() .attributes( new RuleAttributes() .enabled(true) .name("Example-Service-Scorecard") .scorecardName("Observability Best Practices")) .type(RuleType.RULE)); try { CreateRuleResponse result = apiInstance.createScorecardRule(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceScorecardsApi#createScorecardRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a new rule returns "Created" response ``` """ Create a new rule returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi from datadog_api_client.v2.model.create_rule_request import CreateRuleRequest from datadog_api_client.v2.model.create_rule_request_data import CreateRuleRequestData from datadog_api_client.v2.model.rule_attributes import RuleAttributes from datadog_api_client.v2.model.rule_type import RuleType body = CreateRuleRequest( data=CreateRuleRequestData( attributes=RuleAttributes( enabled=True, name="Example-Service-Scorecard", scorecard_name="Observability Best Practices", ), type=RuleType.RULE, ), ) configuration = Configuration() configuration.unstable_operations["create_scorecard_rule"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) response = api_instance.create_scorecard_rule(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a new rule returns "Created" response ``` # Create a new rule returns "Created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_scorecard_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new body = DatadogAPIClient::V2::CreateRuleRequest.new({ data: DatadogAPIClient::V2::CreateRuleRequestData.new({ attributes: DatadogAPIClient::V2::RuleAttributes.new({ enabled: true, name: "Example-Service-Scorecard", scorecard_name: "Observability Best Practices", }), type: DatadogAPIClient::V2::RuleType::RULE, }), }) p 
api_instance.create_scorecard_rule(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a new rule returns "Created" response ``` // Create a new rule returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; use datadog_api_client::datadogV2::model::CreateRuleRequest; use datadog_api_client::datadogV2::model::CreateRuleRequestData; use datadog_api_client::datadogV2::model::RuleAttributes; use datadog_api_client::datadogV2::model::RuleType; #[tokio::main] async fn main() { let body = CreateRuleRequest::new().data( CreateRuleRequestData::new() .attributes( RuleAttributes::new() .enabled(true) .name("Example-Service-Scorecard".to_string()) .scorecard_name("Observability Best Practices".to_string()), ) .type_(RuleType::RULE), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateScorecardRule", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api.create_scorecard_rule(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a new rule returns "Created" response ``` /** * Create a new rule returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createScorecardRule"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); const params: v2.ServiceScorecardsApiCreateScorecardRuleRequest = { body: { data: { attributes: { enabled: true, name: "Example-Service-Scorecard", scorecardName: "Observability Best Practices", }, type: "rule", }, }, }; apiInstance .createScorecardRule(params) .then((data: v2.CreateRuleResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create outcomes batch](https://docs.datadoghq.com/api/latest/service-scorecards/#create-outcomes-batch) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#create-outcomes-batch-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
POST https://api.ap1.datadoghq.com/api/v2/scorecard/outcomes/batchhttps://api.ap2.datadoghq.com/api/v2/scorecard/outcomes/batchhttps://api.datadoghq.eu/api/v2/scorecard/outcomes/batchhttps://api.ddog-gov.com/api/v2/scorecard/outcomes/batchhttps://api.datadoghq.com/api/v2/scorecard/outcomes/batchhttps://api.us3.datadoghq.com/api/v2/scorecard/outcomes/batchhttps://api.us5.datadoghq.com/api/v2/scorecard/outcomes/batch ### Overview Sets multiple service-rule outcomes in a single batched request. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Request #### Body Data (required) Set of scorecard outcomes. * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Field Type Description data object Scorecard outcomes batch request data. attributes object The JSON:API attributes for a batched set of scorecard outcomes. results [object] Set of scorecard outcomes to update. remarks string Any remarks regarding the scorecard rule's evaluation, and supports HTML hyperlinks. rule_id [_required_] string The unique ID for a scorecard rule. service_name [_required_] string The unique name for a service in the catalog. state [_required_] enum The state of the rule evaluation. Allowed enum values: `pass,fail,skip` type enum The JSON:API type for scorecard outcomes. Allowed enum values: `batched-outcome` default: `batched-outcome` ``` { "data": { "attributes": { "results": [ { "remarks": "See: Services", "rule_id": "q8MQxk8TCqrHnWkx", "service_name": "my-service", "state": "pass" } ] }, "type": "batched-outcome" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardOutcomesBatch-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardOutcomesBatch-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardOutcomesBatch-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#CreateScorecardOutcomesBatch-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Scorecard outcomes batch response. Field Type Description data [_required_] [object] List of rule outcomes which were affected during the bulk operation. attributes object The JSON:API attributes for an outcome. created_at date-time Creation time of the rule outcome. modified_at date-time Time of last rule outcome modification. remarks string Any remarks regarding the scorecard rule's evaluation, and supports HTML hyperlinks. service_name string The unique name for a service in the catalog. state enum The state of the rule evaluation. Allowed enum values: `pass,fail,skip` id string The unique ID for a rule outcome. relationships object The JSON:API relationship to a scorecard rule. rule object The JSON:API relationship to a scorecard outcome. data object The JSON:API relationship to an outcome, which returns the related rule id. id string The unique ID for a scorecard rule. type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` type enum The JSON:API type for an outcome. Allowed enum values: `outcome` default: `outcome` meta [_required_] object Metadata pertaining to the bulk operation. total_received int64 Total number of scorecard results received during the bulk operation. 
total_updated int64 Total number of scorecard results modified during the bulk operation. ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "remarks": "See: Services", "service_name": "my-service", "state": "pass" }, "id": "string", "relationships": { "rule": { "data": { "id": "q8MQxk8TCqrHnWkx", "type": "rule" } } }, "type": "outcome" } ], "meta": { "total_received": "integer", "total_updated": "integer" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### Create outcomes batch returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scorecard/outcomes/batch" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "results": [ { "remarks": "See: Services", "rule_id": "q8MQxk8TCqrHnWkx", "service_name": "my-service", "state": "pass" } ] }, "type": "batched-outcome" } } EOF ``` ##### Create outcomes batch returns "OK" response ``` // Create outcomes batch returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "create_scorecard_rule" in the system CreateScorecardRuleDataID := os.Getenv("CREATE_SCORECARD_RULE_DATA_ID") body := datadogV2.OutcomesBatchRequest{ Data: &datadogV2.OutcomesBatchRequestData{ Attributes: &datadogV2.OutcomesBatchAttributes{ Results: []datadogV2.OutcomesBatchRequestItem{ { Remarks: datadog.PtrString(`See: Services`), RuleId: CreateScorecardRuleDataID, ServiceName: "my-service", State: datadogV2.STATE_PASS, }, }, }, Type: datadogV2.OUTCOMESBATCHTYPE_BATCHED_OUTCOME.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := 
datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateScorecardOutcomesBatch", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) resp, r, err := api.CreateScorecardOutcomesBatch(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.CreateScorecardOutcomesBatch`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceScorecardsApi.CreateScorecardOutcomesBatch`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create outcomes batch returns "OK" response ``` // Create outcomes batch returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.OutcomesBatchAttributes; import com.datadog.api.client.v2.model.OutcomesBatchRequest; import com.datadog.api.client.v2.model.OutcomesBatchRequestData; import com.datadog.api.client.v2.model.OutcomesBatchRequestItem; import com.datadog.api.client.v2.model.OutcomesBatchResponse; import com.datadog.api.client.v2.model.OutcomesBatchType; import com.datadog.api.client.v2.model.State; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createScorecardOutcomesBatch", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); // there is a valid "create_scorecard_rule" in the system String CREATE_SCORECARD_RULE_DATA_ID = System.getenv("CREATE_SCORECARD_RULE_DATA_ID"); OutcomesBatchRequest body = new OutcomesBatchRequest() .data( new OutcomesBatchRequestData() .attributes( new OutcomesBatchAttributes() .results( Collections.singletonList( new OutcomesBatchRequestItem() .remarks( """ See: Services """) .ruleId(CREATE_SCORECARD_RULE_DATA_ID) .serviceName("my-service") .state(State.PASS)))) .type(OutcomesBatchType.BATCHED_OUTCOME)); try { OutcomesBatchResponse result = apiInstance.createScorecardOutcomesBatch(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling ServiceScorecardsApi#createScorecardOutcomesBatch"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create outcomes batch returns "OK" response ``` """ Create outcomes batch returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import 
ServiceScorecardsApi from datadog_api_client.v2.model.outcomes_batch_attributes import OutcomesBatchAttributes from datadog_api_client.v2.model.outcomes_batch_request import OutcomesBatchRequest from datadog_api_client.v2.model.outcomes_batch_request_data import OutcomesBatchRequestData from datadog_api_client.v2.model.outcomes_batch_request_item import OutcomesBatchRequestItem from datadog_api_client.v2.model.outcomes_batch_type import OutcomesBatchType from datadog_api_client.v2.model.state import State # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = environ["CREATE_SCORECARD_RULE_DATA_ID"] body = OutcomesBatchRequest( data=OutcomesBatchRequestData( attributes=OutcomesBatchAttributes( results=[ OutcomesBatchRequestItem( remarks='See: Services', rule_id=CREATE_SCORECARD_RULE_DATA_ID, service_name="my-service", state=State.PASS, ), ], ), type=OutcomesBatchType.BATCHED_OUTCOME, ), ) configuration = Configuration() configuration.unstable_operations["create_scorecard_outcomes_batch"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) response = api_instance.create_scorecard_outcomes_batch(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create outcomes batch returns "OK" response ``` # Create outcomes batch returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_scorecard_outcomes_batch".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = ENV["CREATE_SCORECARD_RULE_DATA_ID"] body = DatadogAPIClient::V2::OutcomesBatchRequest.new({ data: DatadogAPIClient::V2::OutcomesBatchRequestData.new({ attributes: DatadogAPIClient::V2::OutcomesBatchAttributes.new({ results: [ DatadogAPIClient::V2::OutcomesBatchRequestItem.new({ remarks: 'See: Services', rule_id: CREATE_SCORECARD_RULE_DATA_ID, service_name: "my-service", state: DatadogAPIClient::V2::State::PASS, }), ], }), type: DatadogAPIClient::V2::OutcomesBatchType::BATCHED_OUTCOME, }), }) p api_instance.create_scorecard_outcomes_batch(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create outcomes batch returns "OK" response ``` // Create outcomes batch returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; use datadog_api_client::datadogV2::model::OutcomesBatchAttributes; use datadog_api_client::datadogV2::model::OutcomesBatchRequest; use datadog_api_client::datadogV2::model::OutcomesBatchRequestData; use datadog_api_client::datadogV2::model::OutcomesBatchRequestItem; use datadog_api_client::datadogV2::model::OutcomesBatchType; use datadog_api_client::datadogV2::model::State; #[tokio::main] async fn main() { // there is 
a valid "create_scorecard_rule" in the system let create_scorecard_rule_data_id = std::env::var("CREATE_SCORECARD_RULE_DATA_ID").unwrap(); let body = OutcomesBatchRequest::new().data( OutcomesBatchRequestData::new() .attributes( OutcomesBatchAttributes::new().results(vec![OutcomesBatchRequestItem::new( create_scorecard_rule_data_id.clone(), "my-service".to_string(), State::PASS, ) .remarks( r#"See: Services"#.to_string(), )]), ) .type_(OutcomesBatchType::BATCHED_OUTCOME), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateScorecardOutcomesBatch", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api.create_scorecard_outcomes_batch(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create outcomes batch returns "OK" response ``` /** * Create outcomes batch returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createScorecardOutcomesBatch"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); // there is a valid "create_scorecard_rule" in the system const CREATE_SCORECARD_RULE_DATA_ID = process.env .CREATE_SCORECARD_RULE_DATA_ID as string; const params: v2.ServiceScorecardsApiCreateScorecardOutcomesBatchRequest = { body: { data: { attributes: { results: [ { remarks: `See: Services`, ruleId: CREATE_SCORECARD_RULE_DATA_ID, serviceName: "my-service", state: "pass", }, ], }, type: "batched-outcome", }, }, }; apiInstance .createScorecardOutcomesBatch(params) .then((data: v2.OutcomesBatchResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update Scorecard outcomes asynchronously](https://docs.datadoghq.com/api/latest/service-scorecards/#update-scorecard-outcomes-asynchronously) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#update-scorecard-outcomes-asynchronously-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/scorecard/outcomeshttps://api.ap2.datadoghq.com/api/v2/scorecard/outcomeshttps://api.datadoghq.eu/api/v2/scorecard/outcomeshttps://api.ddog-gov.com/api/v2/scorecard/outcomeshttps://api.datadoghq.com/api/v2/scorecard/outcomeshttps://api.us3.datadoghq.com/api/v2/scorecard/outcomeshttps://api.us5.datadoghq.com/api/v2/scorecard/outcomes ### Overview Updates multiple scorecard rule outcomes in a single batched request. 
OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Request #### Body Data (required) Set of scorecard outcomes. * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Field Type Description data object Scorecard outcomes batch request data. attributes object The JSON:API attributes for a batched set of scorecard outcomes. results [object] Set of scorecard outcomes to update asynchronously. entity_reference [_required_] string The unique reference for an IDP entity. remarks string Any remarks regarding the scorecard rule's evaluation. Supports HTML hyperlinks. rule_id [_required_] string The unique ID for a scorecard rule. state [_required_] enum The state of the rule evaluation. Allowed enum values: `pass,fail,skip` type enum The JSON:API type for scorecard outcomes. Allowed enum values: `batched-outcome` default: `batched-outcome` ``` { "data": { "attributes": { "results": [ { "rule_id": "q8MQxk8TCqrHnWkx", "entity_reference": "service:my-service", "remarks": "See: Services", "state": "pass" } ] }, "type": "batched-outcome" } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardOutcomesAsync-202-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardOutcomesAsync-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardOutcomesAsync-403-v2) * [409](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardOutcomesAsync-409-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardOutcomesAsync-429-v2) Accepted Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### Update Scorecard outcomes asynchronously returns "Accepted" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scorecard/outcomes" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "results": [ { "rule_id": "q8MQxk8TCqrHnWkx", "entity_reference": "service:my-service", "remarks": "See: Services", "state": "pass" } ] }, "type": "batched-outcome" } } EOF ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` // Update Scorecard outcomes asynchronously returns "Accepted" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "create_scorecard_rule" in the system CreateScorecardRuleDataID := os.Getenv("CREATE_SCORECARD_RULE_DATA_ID") body := datadogV2.UpdateOutcomesAsyncRequest{ Data: &datadogV2.UpdateOutcomesAsyncRequestData{ Attributes: &datadogV2.UpdateOutcomesAsyncAttributes{ Results: []datadogV2.UpdateOutcomesAsyncRequestItem{ { RuleId: CreateScorecardRuleDataID, EntityReference: "service:my-service", Remarks: datadog.PtrString(`See: Services`), State: datadogV2.STATE_PASS, }, }, }, Type: datadogV2.UPDATEOUTCOMESASYNCTYPE_BATCHED_OUTCOME.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateScorecardOutcomesAsync", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) r, err := api.UpdateScorecardOutcomesAsync(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.UpdateScorecardOutcomesAsync`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` // Update Scorecard outcomes asynchronously returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.State; import com.datadog.api.client.v2.model.UpdateOutcomesAsyncAttributes; import 
com.datadog.api.client.v2.model.UpdateOutcomesAsyncRequest; import com.datadog.api.client.v2.model.UpdateOutcomesAsyncRequestData; import com.datadog.api.client.v2.model.UpdateOutcomesAsyncRequestItem; import com.datadog.api.client.v2.model.UpdateOutcomesAsyncType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateScorecardOutcomesAsync", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); // there is a valid "create_scorecard_rule" in the system String CREATE_SCORECARD_RULE_DATA_ID = System.getenv("CREATE_SCORECARD_RULE_DATA_ID"); UpdateOutcomesAsyncRequest body = new UpdateOutcomesAsyncRequest() .data( new UpdateOutcomesAsyncRequestData() .attributes( new UpdateOutcomesAsyncAttributes() .results( Collections.singletonList( new UpdateOutcomesAsyncRequestItem() .ruleId(CREATE_SCORECARD_RULE_DATA_ID) .entityReference("service:my-service") .remarks( """ See: Services """) .state(State.PASS)))) .type(UpdateOutcomesAsyncType.BATCHED_OUTCOME)); try { apiInstance.updateScorecardOutcomesAsync(body); } catch (ApiException e) { System.err.println( "Exception when calling ServiceScorecardsApi#updateScorecardOutcomesAsync"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` """ Update Scorecard outcomes asynchronously returns "Accepted" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi from datadog_api_client.v2.model.state import State from datadog_api_client.v2.model.update_outcomes_async_attributes import UpdateOutcomesAsyncAttributes from datadog_api_client.v2.model.update_outcomes_async_request import UpdateOutcomesAsyncRequest from datadog_api_client.v2.model.update_outcomes_async_request_data import UpdateOutcomesAsyncRequestData from datadog_api_client.v2.model.update_outcomes_async_request_item import UpdateOutcomesAsyncRequestItem from datadog_api_client.v2.model.update_outcomes_async_type import UpdateOutcomesAsyncType # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = environ["CREATE_SCORECARD_RULE_DATA_ID"] body = UpdateOutcomesAsyncRequest( data=UpdateOutcomesAsyncRequestData( attributes=UpdateOutcomesAsyncAttributes( results=[ UpdateOutcomesAsyncRequestItem( rule_id=CREATE_SCORECARD_RULE_DATA_ID, entity_reference="service:my-service", remarks='See: Services', state=State.PASS, ), ], ), type=UpdateOutcomesAsyncType.BATCHED_OUTCOME, ), ) configuration = Configuration() configuration.unstable_operations["update_scorecard_outcomes_async"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) api_instance.update_scorecard_outcomes_async(body=body) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` # Update Scorecard outcomes asynchronously returns "Accepted" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_scorecard_outcomes_async".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = ENV["CREATE_SCORECARD_RULE_DATA_ID"] body = DatadogAPIClient::V2::UpdateOutcomesAsyncRequest.new({ data: DatadogAPIClient::V2::UpdateOutcomesAsyncRequestData.new({ attributes: DatadogAPIClient::V2::UpdateOutcomesAsyncAttributes.new({ results: [ DatadogAPIClient::V2::UpdateOutcomesAsyncRequestItem.new({ rule_id: CREATE_SCORECARD_RULE_DATA_ID, entity_reference: "service:my-service", remarks: 'See: Services', state: DatadogAPIClient::V2::State::PASS, }), ], }), type: DatadogAPIClient::V2::UpdateOutcomesAsyncType::BATCHED_OUTCOME, }), }) p api_instance.update_scorecard_outcomes_async(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` // Update Scorecard outcomes asynchronously returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; use datadog_api_client::datadogV2::model::State; use datadog_api_client::datadogV2::model::UpdateOutcomesAsyncAttributes; use datadog_api_client::datadogV2::model::UpdateOutcomesAsyncRequest; use datadog_api_client::datadogV2::model::UpdateOutcomesAsyncRequestData; use datadog_api_client::datadogV2::model::UpdateOutcomesAsyncRequestItem; use datadog_api_client::datadogV2::model::UpdateOutcomesAsyncType; #[tokio::main] async fn main() { // there is a valid "create_scorecard_rule" in the system let create_scorecard_rule_data_id = std::env::var("CREATE_SCORECARD_RULE_DATA_ID").unwrap(); let body = UpdateOutcomesAsyncRequest::new().data( UpdateOutcomesAsyncRequestData::new() .attributes(UpdateOutcomesAsyncAttributes::new().results(vec![ UpdateOutcomesAsyncRequestItem::new( "service:my-service".to_string(), create_scorecard_rule_data_id.clone(), State::PASS, ) .remarks( r#"See: Services"# .to_string(), ), ])) .type_(UpdateOutcomesAsyncType::BATCHED_OUTCOME), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateScorecardOutcomesAsync", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api.update_scorecard_outcomes_async(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update Scorecard outcomes asynchronously returns "Accepted" response ``` /** * Update Scorecard outcomes asynchronously returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateScorecardOutcomesAsync"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); // there is a valid "create_scorecard_rule" in the system const CREATE_SCORECARD_RULE_DATA_ID = process.env .CREATE_SCORECARD_RULE_DATA_ID as string; const params: v2.ServiceScorecardsApiUpdateScorecardOutcomesAsyncRequest = { body: { data: { attributes: { results: [ { ruleId: CREATE_SCORECARD_RULE_DATA_ID, entityReference: "service:my-service", remarks: `See: Services`, state: "pass", }, ], }, type: "batched-outcome", }, }, }; apiInstance .updateScorecardOutcomesAsync(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all rule outcomes](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rule-outcomes) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rule-outcomes-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/scorecard/outcomeshttps://api.ap2.datadoghq.com/api/v2/scorecard/outcomeshttps://api.datadoghq.eu/api/v2/scorecard/outcomeshttps://api.ddog-gov.com/api/v2/scorecard/outcomeshttps://api.datadoghq.com/api/v2/scorecard/outcomeshttps://api.us3.datadoghq.com/api/v2/scorecard/outcomeshttps://api.us5.datadoghq.com/api/v2/scorecard/outcomes ### Overview Fetches all rule outcomes. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. include string Include related rule details in the response. fields[outcome] string Return only specified values in the outcome attributes. fields[rule] string Return only specified values in the included rule details. filter[outcome][service_name] string Filter the outcomes on a specific service name. filter[outcome][state] string Filter the outcomes by a specific state. filter[rule][enabled] boolean Filter outcomes on whether a rule is enabled/disabled. filter[rule][id] string Filter outcomes based on rule ID. filter[rule][name] string Filter outcomes based on rule name. 
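The query strings above drive pagination and filtering. As a rough sketch of how they can be combined with the Python client used in the code examples below, the following call requests the first page of failing outcomes for a single service and embeds the related rule details. The keyword-argument names (`page_size`, `filter_outcome_service_name`, and so on) are assumptions based on the client's usual snake_case mapping of the documented query strings, so confirm them against your installed client version.

```
"""
Sketch (not the official example): list rule outcomes with pagination and filters.
The keyword arguments are assumed to mirror the documented query strings.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi

configuration = Configuration()
# Scorecard endpoints are in public beta, so the operation must be enabled first.
configuration.unstable_operations["list_scorecard_outcomes"] = True

with ApiClient(configuration) as api_client:
    api_instance = ServiceScorecardsApi(api_client)
    response = api_instance.list_scorecard_outcomes(
        page_size=100,                             # maximum allowed page size
        page_offset=0,                             # start at the first page
        include="rule",                            # return rule details under "included"
        filter_outcome_service_name="my-service",  # outcomes for one service only
        filter_outcome_state="fail",               # only failing outcomes
    )
    for outcome in response.data:
        print(outcome.id, outcome.attributes.state)
```

When more results are available, the `links.next` value in the 200 response below carries the offset for the next page.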
### Response * [200](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardOutcomes-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardOutcomes-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardOutcomes-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardOutcomes-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Scorecard outcomes - the result of a rule for a service. Field Type Description data [object] List of rule outcomes. attributes object The JSON:API attributes for an outcome. created_at date-time Creation time of the rule outcome. modified_at date-time Time of last rule outcome modification. remarks string Any remarks regarding the scorecard rule's evaluation, and supports HTML hyperlinks. service_name string The unique name for a service in the catalog. state enum The state of the rule evaluation. Allowed enum values: `pass,fail,skip` id string The unique ID for a rule outcome. relationships object The JSON:API relationship to a scorecard rule. rule object The JSON:API relationship to a scorecard outcome. data object The JSON:API relationship to an outcome, which returns the related rule id. id string The unique ID for a scorecard rule. type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` type enum The JSON:API type for an outcome. Allowed enum values: `outcome` default: `outcome` included [object] Array of rule details. attributes object Details of a rule. name string Name of the rule. scorecard_name string The scorecard name to which this rule must belong. id string The unique ID for a scorecard rule. type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` links object Links attributes. next string Link for the next set of results. ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "modified_at": "2019-09-19T10:00:00.000Z", "remarks": "See: Services", "service_name": "my-service", "state": "pass" }, "id": "string", "relationships": { "rule": { "data": { "id": "q8MQxk8TCqrHnWkx", "type": "rule" } } }, "type": "outcome" } ], "included": [ { "attributes": { "name": "Team Defined", "scorecard_name": "Observability Best Practices" }, "id": "q8MQxk8TCqrHnWkx", "type": "rule" } ], "links": { "next": "/api/v2/scorecard/outcomes?include=rule\u0026page%5Blimit%5D=100\u0026page%5Boffset%5D=100" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### List all rule outcomes ``` # Curl command (use your site's endpoint, for example api.datadoghq.com) curl -X GET "https://api.datadoghq.com/api/v2/scorecard/outcomes" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all rule outcomes ``` """ List all rule outcomes returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi configuration = Configuration() configuration.unstable_operations["list_scorecard_outcomes"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) response = api_instance.list_scorecard_outcomes() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all rule outcomes ``` # List all rule outcomes returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_scorecard_outcomes".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new p api_instance.list_scorecard_outcomes() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List all rule outcomes ``` // List all rule outcomes returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListScorecardOutcomes", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) resp, r, err := api.ListScorecardOutcomes(ctx, *datadogV2.NewListScorecardOutcomesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.ListScorecardOutcomes`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`ServiceScorecardsApi.ListScorecardOutcomes`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all rule outcomes ``` // List all rule outcomes returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.OutcomesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listScorecardOutcomes", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); try { OutcomesResponse result = apiInstance.listScorecardOutcomes(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceScorecardsApi#listScorecardOutcomes"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all rule outcomes ``` // List all rule outcomes returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ListScorecardOutcomesOptionalParams; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListScorecardOutcomes", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api .list_scorecard_outcomes(ListScorecardOutcomesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all rule outcomes ``` /** * List all rule outcomes returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listScorecardOutcomes"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); apiInstance .listScorecardOutcomes() .then((data: v2.OutcomesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all rules](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#list-all-rules-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). GET https://api.ap1.datadoghq.com/api/v2/scorecard/ruleshttps://api.ap2.datadoghq.com/api/v2/scorecard/ruleshttps://api.datadoghq.eu/api/v2/scorecard/ruleshttps://api.ddog-gov.com/api/v2/scorecard/ruleshttps://api.datadoghq.com/api/v2/scorecard/ruleshttps://api.us3.datadoghq.com/api/v2/scorecard/ruleshttps://api.us5.datadoghq.com/api/v2/scorecard/rules ### Overview Fetch all rules. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[offset] integer Specific offset to use as the beginning of the returned page. include string Include related scorecard details in the response. filter[rule][id] string Filter the rules on a rule ID. filter[rule][enabled] boolean Filter for enabled rules only. filter[rule][custom] boolean Filter for custom rules only. filter[rule][name] string Filter rules on the rule name. filter[rule][description] string Filter rules on the rule description. fields[rule] string Return only specific fields in the response for rule attributes. fields[scorecard] string Return only specific fields in the included response for scorecard attributes. ### Response * [200](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardRules-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardRules-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardRules-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#ListScorecardRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Scorecard rules response. Field Type Description data [object] Array of rule details. attributes object Details of a rule. category string **DEPRECATED** : The scorecard name to which this rule must belong. created_at date-time Creation time of the rule outcome. custom boolean Defines if the rule is a custom rule. description string Explanation of the rule. enabled boolean If enabled, the rule is calculated as part of the score. level int32 The maturity level of the rule (1, 2, or 3). modified_at date-time Time of the last rule outcome modification. name string Name of the rule. owner string Owner of the rule. scorecard_name string The scorecard name to which this rule must belong. id string The unique ID for a scorecard rule. relationships object Scorecard create rule response relationship. scorecard object Relationship data for a rule. data object Rule relationship data. 
id string The unique ID for a scorecard. type enum The JSON:API type for scorecard. Allowed enum values: `scorecard` default: `scorecard` type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` links object Links attributes. next string Link for the next set of rules. ``` { "data": [ { "attributes": { "category": "string", "created_at": "2019-09-19T10:00:00.000Z", "custom": false, "description": "string", "enabled": true, "level": 2, "modified_at": "2019-09-19T10:00:00.000Z", "name": "Team Defined", "owner": "string", "scorecard_name": "Deployments automated via Deployment Trains" }, "id": "q8MQxk8TCqrHnWkx", "relationships": { "scorecard": { "data": { "id": "q8MQxk8TCqrHnWkp", "type": "scorecard" } } }, "type": "rule" } ], "links": { "next": "/api/v2/scorecard/rules?page%5Blimit%5D=2\u0026page%5Boffset%5D=2\u0026page%5Bsize%5D=2" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### List all rules Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scorecard/rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all rules ``` """ List all rules returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi configuration = Configuration() configuration.unstable_operations["list_scorecard_rules"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) response = api_instance.list_scorecard_rules() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all rules ``` # List all rules returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_scorecard_rules".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new p api_instance.list_scorecard_rules() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List all rules ``` // List all rules returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListScorecardRules", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) resp, r, err := api.ListScorecardRules(ctx, *datadogV2.NewListScorecardRulesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.ListScorecardRules`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceScorecardsApi.ListScorecardRules`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all rules ``` // List all rules returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.ListRulesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listScorecardRules", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); try { ListRulesResponse result = apiInstance.listScorecardRules(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceScorecardsApi#listScorecardRules"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all rules ``` // List all rules returns "OK" 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ListScorecardRulesOptionalParams; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListScorecardRules", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api .list_scorecard_rules(ListScorecardRulesOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all rules ``` /** * List all rules returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listScorecardRules"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); apiInstance .listScorecardRules() .then((data: v2.ListRulesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a rule](https://docs.datadoghq.com/api/latest/service-scorecards/#delete-a-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#delete-a-rule-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.datadoghq.eu/api/v2/scorecard/rules/{rule_id}https://api.ddog-gov.com/api/v2/scorecard/rules/{rule_id}https://api.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/scorecard/rules/{rule_id} ### Overview Deletes a single rule. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. 
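Because the rule ID is the only path parameter, a common pattern is to resolve it from the rule name with the list endpoint documented above and then issue the delete. The sketch below reuses the Python client calls shown in this page's examples; the `filter_rule_name` keyword argument is an assumption based on the client's usual snake_case mapping of the `filter[rule][name]` query string, so verify it against your installed client version.

```
"""
Sketch (not the official example): delete a rule after looking up its ID by name.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi

configuration = Configuration()
# Both operations are in public beta and must be enabled explicitly.
configuration.unstable_operations["list_scorecard_rules"] = True
configuration.unstable_operations["delete_scorecard_rule"] = True

with ApiClient(configuration) as api_client:
    api_instance = ServiceScorecardsApi(api_client)

    # Resolve rule IDs from the rule name (assumed keyword argument, see note above).
    rules = api_instance.list_scorecard_rules(filter_rule_name="Team Defined")
    for rule in rules.data:
        # A successful delete returns 204 No Content, so there is no body to print.
        api_instance.delete_scorecard_rule(rule_id=rule.id)
```

Inspect `rules.data` before running the loop if the name filter can match more than one rule.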
### Response * [204](https://docs.datadoghq.com/api/latest/service-scorecards/#DeleteScorecardRule-204-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#DeleteScorecardRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#DeleteScorecardRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/service-scorecards/#DeleteScorecardRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#DeleteScorecardRule-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Not Found * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### Delete a rule ``` # Path parameters export rule_id="CHANGE_ME" # Curl command (use your site's endpoint, for example api.datadoghq.com) curl -X DELETE "https://api.datadoghq.com/api/v2/scorecard/rules/${rule_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a rule ``` """ Delete a rule returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = environ["CREATE_SCORECARD_RULE_DATA_ID"] configuration = Configuration() configuration.unstable_operations["delete_scorecard_rule"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) api_instance.delete_scorecard_rule( rule_id=CREATE_SCORECARD_RULE_DATA_ID, ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a rule ``` # Delete a rule returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_scorecard_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ID = ENV["CREATE_SCORECARD_RULE_DATA_ID"] api_instance.delete_scorecard_rule(CREATE_SCORECARD_RULE_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a rule ``` // Delete a rule returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "create_scorecard_rule" in the system CreateScorecardRuleDataID := os.Getenv("CREATE_SCORECARD_RULE_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteScorecardRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) r, err := api.DeleteScorecardRule(ctx, CreateScorecardRuleDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.DeleteScorecardRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a rule ``` // Delete a rule returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteScorecardRule", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); // there is a valid "create_scorecard_rule" in the system String CREATE_SCORECARD_RULE_DATA_ID = System.getenv("CREATE_SCORECARD_RULE_DATA_ID"); try { apiInstance.deleteScorecardRule(CREATE_SCORECARD_RULE_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling ServiceScorecardsApi#deleteScorecardRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a rule ``` // Delete a rule returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; #[tokio::main] async fn main() { // there is a valid "create_scorecard_rule" in the system let create_scorecard_rule_data_id = std::env::var("CREATE_SCORECARD_RULE_DATA_ID").unwrap(); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteScorecardRule", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api .delete_scorecard_rule(create_scorecard_rule_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a rule ``` /** * Delete a rule returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteScorecardRule"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); // there is a valid "create_scorecard_rule" in the system const CREATE_SCORECARD_RULE_DATA_ID = process.env .CREATE_SCORECARD_RULE_DATA_ID as string; const params: v2.ServiceScorecardsApiDeleteScorecardRuleRequest = { ruleId: CREATE_SCORECARD_RULE_DATA_ID, }; apiInstance .deleteScorecardRule(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing rule](https://docs.datadoghq.com/api/latest/service-scorecards/#update-an-existing-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/service-scorecards/#update-an-existing-rule-v2) **Note** : This endpoint is in public beta. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). PUT https://api.ap1.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.datadoghq.eu/api/v2/scorecard/rules/{rule_id}https://api.ddog-gov.com/api/v2/scorecard/rules/{rule_id}https://api.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.us3.datadoghq.com/api/v2/scorecard/rules/{rule_id}https://api.us5.datadoghq.com/api/v2/scorecard/rules/{rule_id} ### Overview Updates an existing rule. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#service-scorecards) to access this endpoint. ### Arguments #### Path Parameters Name Type Description rule_id [_required_] string The ID of the rule. ### Request #### Body Data (required) Rule attributes. 
* [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) Field Type Description data object Data for the request to update a scorecard rule. attributes object Details of a rule. category string **DEPRECATED** : The scorecard name to which this rule must belong. created_at date-time Creation time of the rule outcome. custom boolean Defines if the rule is a custom rule. description string Explanation of the rule. enabled boolean If enabled, the rule is calculated as part of the score. level int32 The maturity level of the rule (1, 2, or 3). modified_at date-time Time of the last rule outcome modification. name string Name of the rule. owner string Owner of the rule. scorecard_name string The scorecard name to which this rule must belong. type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` ``` { "data": { "attributes": { "enabled": true, "name": "Team Defined", "scorecard_name": "Deployments automated via Deployment Trains", "description": "Updated description via test" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardRule-200-v2) * [400](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardRule-400-v2) * [403](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardRule-403-v2) * [429](https://docs.datadoghq.com/api/latest/service-scorecards/#UpdateScorecardRule-429-v2) Rule updated successfully * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) The response from a rule update request. Field Type Description data object The data for a rule update response. attributes object Details of a rule. category string **DEPRECATED** : The scorecard name to which this rule must belong. created_at date-time Creation time of the rule outcome. custom boolean Defines if the rule is a custom rule. description string Explanation of the rule. enabled boolean If enabled, the rule is calculated as part of the score. level int32 The maturity level of the rule (1, 2, or 3). modified_at date-time Time of the last rule outcome modification. name string Name of the rule. owner string Owner of the rule. scorecard_name string The scorecard name to which this rule must belong. id string The unique ID for a scorecard rule. relationships object Scorecard create rule response relationship. scorecard object Relationship data for a rule. data object Rule relationship data. id string The unique ID for a scorecard. type enum The JSON:API type for scorecard. Allowed enum values: `scorecard` default: `scorecard` type enum The JSON:API type for scorecard rules. Allowed enum values: `rule` default: `rule` ``` { "data": { "attributes": { "category": "string", "created_at": "2019-09-19T10:00:00.000Z", "custom": false, "description": "string", "enabled": true, "level": 2, "modified_at": "2019-09-19T10:00:00.000Z", "name": "Team Defined", "owner": "string", "scorecard_name": "Deployments automated via Deployment Trains" }, "id": "q8MQxk8TCqrHnWkx", "relationships": { "scorecard": { "data": { "id": "q8MQxk8TCqrHnWkp", "type": "scorecard" } } }, "type": "rule" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/service-scorecards/) * [Example](https://docs.datadoghq.com/api/latest/service-scorecards/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/service-scorecards/?code-lang=typescript) ##### Update an existing rule returns "Rule updated successfully" response Copy ``` # Path parameters export rule_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/scorecard/rules/${rule_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "enabled": true, "name": "Team Defined", "scorecard_name": "Deployments automated via Deployment Trains", "description": "Updated description via test" } } } EOF ``` ##### Update an existing rule returns "Rule updated successfully" response ``` // Update an existing rule returns "Rule updated successfully" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "create_scorecard_rule" in the system CreateScorecardRuleDataAttributesName := os.Getenv("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME") CreateScorecardRuleDataAttributesScorecardName := os.Getenv("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME") CreateScorecardRuleDataID := os.Getenv("CREATE_SCORECARD_RULE_DATA_ID") body := datadogV2.UpdateRuleRequest{ Data: &datadogV2.UpdateRuleRequestData{ Attributes: &datadogV2.RuleAttributes{ Enabled: datadog.PtrBool(true), Name: datadog.PtrString(CreateScorecardRuleDataAttributesName), ScorecardName: datadog.PtrString(CreateScorecardRuleDataAttributesScorecardName), Description: datadog.PtrString("Updated description via test"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.UpdateScorecardRule", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewServiceScorecardsApi(apiClient) resp, r, err := api.UpdateScorecardRule(ctx, CreateScorecardRuleDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `ServiceScorecardsApi.UpdateScorecardRule`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP 
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `ServiceScorecardsApi.UpdateScorecardRule`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing rule returns "Rule updated successfully" response ``` // Update an existing rule returns "Rule updated successfully" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.ServiceScorecardsApi; import com.datadog.api.client.v2.model.RuleAttributes; import com.datadog.api.client.v2.model.UpdateRuleRequest; import com.datadog.api.client.v2.model.UpdateRuleRequestData; import com.datadog.api.client.v2.model.UpdateRuleResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.updateScorecardRule", true); ServiceScorecardsApi apiInstance = new ServiceScorecardsApi(defaultClient); // there is a valid "create_scorecard_rule" in the system String CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME = System.getenv("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME"); String CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME = System.getenv("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME"); String CREATE_SCORECARD_RULE_DATA_ID = System.getenv("CREATE_SCORECARD_RULE_DATA_ID"); UpdateRuleRequest body = new UpdateRuleRequest() .data( new UpdateRuleRequestData() .attributes( new RuleAttributes() .enabled(true) .name(CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME) .scorecardName(CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME) .description("Updated description via test"))); try { UpdateRuleResponse result = apiInstance.updateScorecardRule(CREATE_SCORECARD_RULE_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling ServiceScorecardsApi#updateScorecardRule"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing rule returns "Rule updated successfully" response ``` """ Update an existing rule returns "Rule updated successfully" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.service_scorecards_api import ServiceScorecardsApi from datadog_api_client.v2.model.rule_attributes import RuleAttributes from datadog_api_client.v2.model.update_rule_request import UpdateRuleRequest from datadog_api_client.v2.model.update_rule_request_data import UpdateRuleRequestData # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME = 
environ["CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME"] CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME = environ["CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME"] CREATE_SCORECARD_RULE_DATA_ID = environ["CREATE_SCORECARD_RULE_DATA_ID"] body = UpdateRuleRequest( data=UpdateRuleRequestData( attributes=RuleAttributes( enabled=True, name=CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME, scorecard_name=CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME, description="Updated description via test", ), ), ) configuration = Configuration() configuration.unstable_operations["update_scorecard_rule"] = True with ApiClient(configuration) as api_client: api_instance = ServiceScorecardsApi(api_client) response = api_instance.update_scorecard_rule(rule_id=CREATE_SCORECARD_RULE_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing rule returns "Rule updated successfully" response ``` # Update an existing rule returns "Rule updated successfully" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.update_scorecard_rule".to_sym] = true end api_instance = DatadogAPIClient::V2::ServiceScorecardsAPI.new # there is a valid "create_scorecard_rule" in the system CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME = ENV["CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME"] CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME = ENV["CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME"] CREATE_SCORECARD_RULE_DATA_ID = ENV["CREATE_SCORECARD_RULE_DATA_ID"] body = DatadogAPIClient::V2::UpdateRuleRequest.new({ data: DatadogAPIClient::V2::UpdateRuleRequestData.new({ attributes: DatadogAPIClient::V2::RuleAttributes.new({ enabled: true, name: CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME, scorecard_name: CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME, description: "Updated description via test", }), }), }) p api_instance.update_scorecard_rule(CREATE_SCORECARD_RULE_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an existing rule returns "Rule updated successfully" response ``` // Update an existing rule returns "Rule updated successfully" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_service_scorecards::ServiceScorecardsAPI; use datadog_api_client::datadogV2::model::RuleAttributes; use datadog_api_client::datadogV2::model::UpdateRuleRequest; use datadog_api_client::datadogV2::model::UpdateRuleRequestData; #[tokio::main] async fn main() { // there is a valid "create_scorecard_rule" in the system let create_scorecard_rule_data_attributes_name = std::env::var("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME").unwrap(); let create_scorecard_rule_data_attributes_scorecard_name = std::env::var("CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME").unwrap(); let create_scorecard_rule_data_id = 
std::env::var("CREATE_SCORECARD_RULE_DATA_ID").unwrap(); let body = UpdateRuleRequest::new().data( UpdateRuleRequestData::new().attributes( RuleAttributes::new() .description("Updated description via test".to_string()) .enabled(true) .name(create_scorecard_rule_data_attributes_name.clone()) .scorecard_name(create_scorecard_rule_data_attributes_scorecard_name.clone()), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.UpdateScorecardRule", true); let api = ServiceScorecardsAPI::with_config(configuration); let resp = api .update_scorecard_rule(create_scorecard_rule_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing rule returns "Rule updated successfully" response ``` /** * Update an existing rule returns "Rule updated successfully" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.updateScorecardRule"] = true; const apiInstance = new v2.ServiceScorecardsApi(configuration); // there is a valid "create_scorecard_rule" in the system const CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME = process.env .CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME as string; const CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME = process.env .CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME as string; const CREATE_SCORECARD_RULE_DATA_ID = process.env .CREATE_SCORECARD_RULE_DATA_ID as string; const params: v2.ServiceScorecardsApiUpdateScorecardRuleRequest = { body: { data: { attributes: { enabled: true, name: CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_NAME, scorecardName: CREATE_SCORECARD_RULE_DATA_ATTRIBUTES_SCORECARD_NAME, description: "Updated description via test", }, }, }, ruleId: CREATE_SCORECARD_RULE_DATA_ID, }; apiInstance .updateScorecardRule(params) .then((data: v2.UpdateRuleResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/slack-integration/ # Slack Integration Configure your [Datadog-Slack integration](https://docs.datadoghq.com/integrations/slack) directly through the Datadog API. ## [Delete a Slack integration](https://docs.datadoghq.com/api/latest/slack-integration/#delete-a-slack-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/slack-integration/#delete-a-slack-integration-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/slackhttps://api.ap2.datadoghq.com/api/v1/integration/slackhttps://api.datadoghq.eu/api/v1/integration/slackhttps://api.ddog-gov.com/api/v1/integration/slackhttps://api.datadoghq.com/api/v1/integration/slackhttps://api.us3.datadoghq.com/api/v1/integration/slackhttps://api.us5.datadoghq.com/api/v1/integration/slack ### Overview Delete a Datadog-Slack integration. ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#DeleteSlackIntegration-200-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#DeleteSlackIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#DeleteSlackIntegration-429-v1) OK Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby-legacy) ##### Delete a Slack integration Copy ``` # Curl command (replace the host with the API host for your Datadog site) curl -X DELETE "https://api.datadoghq.com/api/v1/integration/slack" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a Slack integration ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.delete_integration('slack') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` * * * ## [Get all channels in a Slack integration](https://docs.datadoghq.com/api/latest/slack-integration/#get-all-channels-in-a-slack-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/slack-integration/#get-all-channels-in-a-slack-integration-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.ap2.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.datadoghq.eu/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.ddog-gov.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.us3.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels ### Overview Get a list of all channels configured for your Datadog-Slack integration. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_name [_required_] string Your Slack account name. ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannels-200-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannels-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannels-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannels-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannels-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) A list of configured Slack channels. Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. 
default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. ``` [ { "display": { "message": true, "mute_buttons": true, "notified": true, "snapshot": true, "tags": true }, "name": "#channel_name_main_account" }, { "display": { "message": true, "mute_buttons": true, "notified": true, "snapshot": false, "tags": true }, "name": "#channel_name_doghouse" } ] ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=typescript) ##### Get all channels in a Slack integration Copy ``` # Path parameters export account_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/${account_name}/channels" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all channels in a Slack integration ``` """ Get all channels in a Slack integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.slack_integration_api import SlackIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SlackIntegrationApi(api_client) response = api_instance.get_slack_integration_channels( account_name="account_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to 
`example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all channels in a Slack integration ``` # Get all channels in a Slack integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SlackIntegrationAPI.new p api_instance.get_slack_integration_channels("account_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all channels in a Slack integration ``` // Get all channels in a Slack integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSlackIntegrationApi(apiClient) resp, r, err := api.GetSlackIntegrationChannels(ctx, "account_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SlackIntegrationApi.GetSlackIntegrationChannels`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SlackIntegrationApi.GetSlackIntegrationChannels`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all channels in a Slack integration ``` // Get all channels in a Slack integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SlackIntegrationApi; import com.datadog.api.client.v1.model.SlackIntegrationChannel; import java.util.List; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SlackIntegrationApi apiInstance = new SlackIntegrationApi(defaultClient); try { List result = apiInstance.getSlackIntegrationChannels("account_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SlackIntegrationApi#getSlackIntegrationChannels"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all channels in a Slack integration ``` // Get all channels in 
a Slack integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_slack_integration::SlackIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SlackIntegrationAPI::with_config(configuration); let resp = api .get_slack_integration_channels("account_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all channels in a Slack integration ``` /** * Get all channels in a Slack integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SlackIntegrationApi(configuration); const params: v1.SlackIntegrationApiGetSlackIntegrationChannelsRequest = { accountName: "account_name", }; apiInstance .getSlackIntegrationChannels(params) .then((data: v1.SlackIntegrationChannel[]) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add channels to Slack integration](https://docs.datadoghq.com/api/latest/slack-integration/#add-channels-to-slack-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/slack-integration/#add-channels-to-slack-integration-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/slackhttps://api.ap2.datadoghq.com/api/v1/integration/slackhttps://api.datadoghq.eu/api/v1/integration/slackhttps://api.ddog-gov.com/api/v1/integration/slackhttps://api.datadoghq.com/api/v1/integration/slackhttps://api.us3.datadoghq.com/api/v1/integration/slackhttps://api.us5.datadoghq.com/api/v1/integration/slack ### Overview Add channels to your existing Datadog-Slack integration. This method updates your integration configuration by **replacing** your current configuration with the new one sent to your Datadog organization. ### Request #### Body Data (required) Update an existing Datadog-Slack integration request body. * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Field Type Description channels [object] An array of Slack channel configurations. account [_required_] string Account to which the channel belongs. channel_name [_required_] string Your channel name. transfer_all_user_comments boolean To be notified for every comment on a graph, set it to `true`. If set to `false`, use the `@slack-channel_name` syntax for comments to be posted to Slack. 
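Because this deprecated endpoint only ships a curl example further down, here is a minimal plain-HTTP sketch of the same replace-style update. It assumes Python's `requests` package and the `DD_API_KEY`/`DD_APP_KEY` environment variables; the account and channel values are placeholders, and the payload mirrors the example body that follows.

```
# Minimal sketch (not an official client example): replace the Slack channel
# configuration through the deprecated v1 endpoint, PUT /api/v1/integration/slack.
# Assumes the `requests` package; adjust the host for your Datadog site.
import os

import requests

payload = {
    "channels": [
        {
            "account": "jane.doe",  # placeholder Slack account name
            "channel_name": "#general",  # placeholder channel
            "transfer_all_user_comments": False,
        }
    ]
}

resp = requests.put(
    "https://api.datadoghq.com/api/v1/integration/slack",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=payload,
    timeout=10,
)
resp.raise_for_status()  # the endpoint returns 204 on success
print(resp.status_code)
```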
``` { "channels": [ { "account": "jane.doe", "channel_name": "#general", "transfer_all_user_comments": false } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegration-204-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegration-429-v1) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) ##### Add channels to Slack integration Copy ``` # Curl command (replace the host with the API host for your Datadog site) curl -X PUT "https://api.datadoghq.com/api/v1/integration/slack" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "channels": [ { "account": "jane.doe", "channel_name": "#general" } ] } EOF ``` * * * ## [Create a Slack integration channel](https://docs.datadoghq.com/api/latest/slack-integration/#create-a-slack-integration-channel) * [v1 (latest)](https://docs.datadoghq.com/api/latest/slack-integration/#create-a-slack-integration-channel-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.ap2.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.datadoghq.eu/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.ddog-gov.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.us3.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channelshttps://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels ### Overview Add a channel to your Datadog-Slack integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_name [_required_] string Your Slack account name. 
### Request #### Body Data (required) Payload describing Slack channel to be created * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. ``` { "display": { "message": false, "mute_buttons": false, "notified": false, "snapshot": false, "tags": false }, "name": "#general" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegrationChannel-200-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegrationChannel-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegrationChannel-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegrationChannel-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegrationChannel-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) The Slack channel configuration. Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. ``` { "display": { "message": false, "mute_buttons": false, "notified": false, "snapshot": false, "tags": false }, "name": "#general" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=typescript) ##### Create a Slack integration channel Copy ``` # Path parameters export account_name="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/${account_name}/channels" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create a Slack integration channel ``` """ Create a Slack integration channel returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.slack_integration_api import SlackIntegrationApi from datadog_api_client.v1.model.slack_integration_channel import SlackIntegrationChannel from datadog_api_client.v1.model.slack_integration_channel_display import SlackIntegrationChannelDisplay body = SlackIntegrationChannel( display=SlackIntegrationChannelDisplay( message=True, mute_buttons=False, notified=True, snapshot=True, tags=True, ), name="#general", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SlackIntegrationApi(api_client) response = api_instance.create_slack_integration_channel(account_name="account_name", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Slack integration channel ``` # Create a Slack integration channel returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SlackIntegrationAPI.new body = DatadogAPIClient::V1::SlackIntegrationChannel.new({ display: DatadogAPIClient::V1::SlackIntegrationChannelDisplay.new({ message: true, mute_buttons: false, notified: true, snapshot: true, tags: true, }), name: "#general", }) p api_instance.create_slack_integration_channel("account_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Slack integration channel ``` // Create a Slack integration channel returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SlackIntegrationChannel{ Display: &datadogV1.SlackIntegrationChannelDisplay{ Message: datadog.PtrBool(true), MuteButtons: datadog.PtrBool(false), Notified: datadog.PtrBool(true), Snapshot: datadog.PtrBool(true), Tags: datadog.PtrBool(true), }, Name: datadog.PtrString("#general"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSlackIntegrationApi(apiClient) resp, r, err := api.CreateSlackIntegrationChannel(ctx, "account_name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SlackIntegrationApi.CreateSlackIntegrationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SlackIntegrationApi.CreateSlackIntegrationChannel`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Slack integration channel ``` // Create a Slack integration channel returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SlackIntegrationApi; import com.datadog.api.client.v1.model.SlackIntegrationChannel; import com.datadog.api.client.v1.model.SlackIntegrationChannelDisplay; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SlackIntegrationApi apiInstance = new SlackIntegrationApi(defaultClient); SlackIntegrationChannel body = new SlackIntegrationChannel() .display( new SlackIntegrationChannelDisplay() .message(true) .muteButtons(false) .notified(true) .snapshot(true) .tags(true)) .name("#general"); try { SlackIntegrationChannel result = apiInstance.createSlackIntegrationChannel("account_name", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SlackIntegrationApi#createSlackIntegrationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Slack integration channel ``` // Create a Slack integration channel returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_slack_integration::SlackIntegrationAPI; use datadog_api_client::datadogV1::model::SlackIntegrationChannel; use datadog_api_client::datadogV1::model::SlackIntegrationChannelDisplay; #[tokio::main] async fn main() { let body = SlackIntegrationChannel::new() .display( SlackIntegrationChannelDisplay::new() .message(true) .mute_buttons(false) .notified(true) .snapshot(true) .tags(true), ) 
.name("#general".to_string()); let configuration = datadog::Configuration::new(); let api = SlackIntegrationAPI::with_config(configuration); let resp = api .create_slack_integration_channel("account_name".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Slack integration channel ``` /** * Create a Slack integration channel returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SlackIntegrationApi(configuration); const params: v1.SlackIntegrationApiCreateSlackIntegrationChannelRequest = { body: { display: { message: true, muteButtons: false, notified: true, snapshot: true, tags: true, }, name: "#general", }, accountName: "account_name", }; apiInstance .createSlackIntegrationChannel(params) .then((data: v1.SlackIntegrationChannel) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Slack integration](https://docs.datadoghq.com/api/latest/slack-integration/#create-a-slack-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/slack-integration/#create-a-slack-integration-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/slackhttps://api.ap2.datadoghq.com/api/v1/integration/slackhttps://api.datadoghq.eu/api/v1/integration/slackhttps://api.ddog-gov.com/api/v1/integration/slackhttps://api.datadoghq.com/api/v1/integration/slackhttps://api.us3.datadoghq.com/api/v1/integration/slackhttps://api.us5.datadoghq.com/api/v1/integration/slack ### Overview Create a Datadog-Slack integration. Once created, add a channel to it with the [Add channels to Slack integration endpoint](https://docs.datadoghq.com/api/?lang=bash#add-channels-to-slack-integration). This method updates your integration configuration by **adding** your new configuration to the existing one in your Datadog organization. ### Request #### Body Data (required) Create Datadog-Slack integration request body. * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Field Type Description service_hooks [object] The array of service hook objects. account [_required_] string Your Slack account name. url [_required_] string Your Slack service hook URL. 
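This deprecated endpoint also ships only curl and legacy Ruby client examples below, so the following is a minimal plain-HTTP sketch under the same assumptions (Python's `requests` package, `DD_API_KEY`/`DD_APP_KEY` in the environment); the account name and service hook URL are placeholders matching the example body that follows.

```
# Minimal sketch (not an official client example): create the Datadog-Slack
# integration by registering a service hook through the deprecated v1 endpoint,
# POST /api/v1/integration/slack. Adjust the host for your Datadog site.
import os

import requests

payload = {
    "service_hooks": [
        {
            "account": "joe.doe",  # placeholder Slack account name
            "url": "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX",
        }
    ]
}

resp = requests.post(
    "https://api.datadoghq.com/api/v1/integration/slack",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    json=payload,
    timeout=10,
)
resp.raise_for_status()  # a 204 response means the hook was accepted
print(resp.status_code)
```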
``` { "service_hooks": [ { "account": "joe.doe", "url": "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX" } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegration-204-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#CreateSlackIntegration-429-v1) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby-legacy) ##### Create a Slack integration Copy ``` # Curl command (replace the host with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v1/integration/slack" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "service_hooks": [ { "account": "joe.doe", "url": "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX" } ] } EOF ``` ##### Create a Slack integration ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' config= { "service_hooks": [ { "account": "Main_Account", "url": "https://hooks.slack.com/services/1/1/1" }, { "account": "doghouse", "url": "https://hooks.slack.com/services/2/2/2" } ] } dog = Dogapi::Client.new(api_key, app_key) dog.create_integration('slack', config) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` * * * ## [Get a Slack integration channel](https://docs.datadoghq.com/api/latest/slack-integration/#get-a-slack-integration-channel) * [v1 (latest)](https://docs.datadoghq.com/api/latest/slack-integration/#get-a-slack-integration-channel-v1) GET 
https://api.ap1.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ap2.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.eu/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ddog-gov.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us3.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name} ### Overview Get a channel configured for your Datadog-Slack integration. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description account_name [_required_] string Your Slack account name. channel_name [_required_] string The name of the Slack channel being operated on. ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannel-200-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannel-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannel-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannel-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegrationChannel-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) The Slack channel configuration. Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. ``` { "display": { "message": false, "mute_buttons": false, "notified": false, "snapshot": false, "tags": false }, "name": "#general" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=typescript) ##### Get a Slack integration channel Copy ``` # Path parameters export account_name="CHANGE_ME" export channel_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/${account_name}/channels/${channel_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a Slack integration channel ``` """ Get a Slack integration channel returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.slack_integration_api import SlackIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SlackIntegrationApi(api_client) response = api_instance.get_slack_integration_channel( account_name="account_name", channel_name="channel_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a Slack integration channel ``` # Get a Slack integration channel returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SlackIntegrationAPI.new p api_instance.get_slack_integration_channel("account_name", "channel_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a Slack integration channel ``` // Get a Slack integration channel returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSlackIntegrationApi(apiClient) resp, r, err := 
api.GetSlackIntegrationChannel(ctx, "account_name", "channel_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SlackIntegrationApi.GetSlackIntegrationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SlackIntegrationApi.GetSlackIntegrationChannel`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a Slack integration channel ``` // Get a Slack integration channel returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SlackIntegrationApi; import com.datadog.api.client.v1.model.SlackIntegrationChannel; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SlackIntegrationApi apiInstance = new SlackIntegrationApi(defaultClient); try { SlackIntegrationChannel result = apiInstance.getSlackIntegrationChannel("account_name", "channel_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SlackIntegrationApi#getSlackIntegrationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Slack integration channel ``` // Get a Slack integration channel returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_slack_integration::SlackIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SlackIntegrationAPI::with_config(configuration); let resp = api .get_slack_integration_channel("account_name".to_string(), "channel_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a Slack integration channel ``` /** * Get a Slack integration channel returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SlackIntegrationApi(configuration); const params: v1.SlackIntegrationApiGetSlackIntegrationChannelRequest = { accountName: "account_name", channelName: "channel_name", }; apiInstance .getSlackIntegrationChannel(params) .then((data: v1.SlackIntegrationChannel) => { 
console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get info about a Slack integration](https://docs.datadoghq.com/api/latest/slack-integration/#get-info-about-a-slack-integration) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/slack-integration/#get-info-about-a-slack-integration-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/slackhttps://api.ap2.datadoghq.com/api/v1/integration/slackhttps://api.datadoghq.eu/api/v1/integration/slackhttps://api.ddog-gov.com/api/v1/integration/slackhttps://api.datadoghq.com/api/v1/integration/slackhttps://api.us3.datadoghq.com/api/v1/integration/slackhttps://api.us5.datadoghq.com/api/v1/integration/slack ### Overview Get all information about your Datadog-Slack integration. ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegration-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegration-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#GetSlackIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Expand All Field Type Description No response body ``` {} ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby-legacy) ##### Get info about a Slack integration Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get info about a Slack integration ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.get_integration('slack') ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` * * * ## [Update a Slack integration channel](https://docs.datadoghq.com/api/latest/slack-integration/#update-a-slack-integration-channel) * [v1 (latest)](https://docs.datadoghq.com/api/latest/slack-integration/#update-a-slack-integration-channel-v1) PATCH https://api.ap1.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ap2.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.eu/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ddog-gov.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us3.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name} ### Overview Update a channel used in your Datadog-Slack integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_name [_required_] string Your Slack account name. channel_name [_required_] string The name of the Slack channel being operated on. ### Request #### Body Data (required) Payload describing fields and values to be updated. * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. 
``` { "display": { "message": false, "mute_buttons": false, "notified": false, "snapshot": false, "tags": false }, "name": "#general" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegrationChannel-200-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegrationChannel-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegrationChannel-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegrationChannel-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#UpdateSlackIntegrationChannel-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) The Slack channel configuration. Field Type Description display object Configuration options for what is shown in an alert event message. message boolean Show the main body of the alert event. default: `true` mute_buttons boolean Show interactive buttons to mute the alerting monitor. notified boolean Show the list of @-handles in the alert event. default: `true` snapshot boolean Show the alert event's snapshot image. default: `true` tags boolean Show the scopes on which the monitor alerted. default: `true` name string Your channel name. ``` { "display": { "message": false, "mute_buttons": false, "notified": false, "snapshot": false, "tags": false }, "name": "#general" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=typescript) ##### Update a Slack integration channel Copy ``` # Path parameters export account_name="CHANGE_ME" export channel_name="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/${account_name}/channels/${channel_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Update a Slack integration channel ``` """ Update a Slack integration channel returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.slack_integration_api import SlackIntegrationApi from datadog_api_client.v1.model.slack_integration_channel import SlackIntegrationChannel from datadog_api_client.v1.model.slack_integration_channel_display import SlackIntegrationChannelDisplay body = SlackIntegrationChannel( display=SlackIntegrationChannelDisplay( message=True, mute_buttons=False, notified=True, snapshot=True, tags=True, ), name="#general", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SlackIntegrationApi(api_client) response = api_instance.update_slack_integration_channel( account_name="account_name", channel_name="channel_name", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a Slack integration channel ``` # Update a Slack integration channel returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SlackIntegrationAPI.new body = DatadogAPIClient::V1::SlackIntegrationChannel.new({ display: DatadogAPIClient::V1::SlackIntegrationChannelDisplay.new({ message: true, mute_buttons: false, notified: true, snapshot: true, tags: true, }), name: "#general", }) p api_instance.update_slack_integration_channel("account_name", "channel_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a Slack integration channel ``` // Update a Slack integration channel returns "OK" response package main import ( "context" "encoding/json" 
"fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SlackIntegrationChannel{ Display: &datadogV1.SlackIntegrationChannelDisplay{ Message: datadog.PtrBool(true), MuteButtons: datadog.PtrBool(false), Notified: datadog.PtrBool(true), Snapshot: datadog.PtrBool(true), Tags: datadog.PtrBool(true), }, Name: datadog.PtrString("#general"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSlackIntegrationApi(apiClient) resp, r, err := api.UpdateSlackIntegrationChannel(ctx, "account_name", "channel_name", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SlackIntegrationApi.UpdateSlackIntegrationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SlackIntegrationApi.UpdateSlackIntegrationChannel`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a Slack integration channel ``` // Update a Slack integration channel returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SlackIntegrationApi; import com.datadog.api.client.v1.model.SlackIntegrationChannel; import com.datadog.api.client.v1.model.SlackIntegrationChannelDisplay; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SlackIntegrationApi apiInstance = new SlackIntegrationApi(defaultClient); SlackIntegrationChannel body = new SlackIntegrationChannel() .display( new SlackIntegrationChannelDisplay() .message(true) .muteButtons(false) .notified(true) .snapshot(true) .tags(true)) .name("#general"); try { SlackIntegrationChannel result = apiInstance.updateSlackIntegrationChannel("account_name", "channel_name", body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling SlackIntegrationApi#updateSlackIntegrationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a Slack integration channel ``` // Update a Slack integration channel returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_slack_integration::SlackIntegrationAPI; use datadog_api_client::datadogV1::model::SlackIntegrationChannel; use datadog_api_client::datadogV1::model::SlackIntegrationChannelDisplay; #[tokio::main] async fn main() { let body = SlackIntegrationChannel::new() .display( 
SlackIntegrationChannelDisplay::new() .message(true) .mute_buttons(false) .notified(true) .snapshot(true) .tags(true), ) .name("#general".to_string()); let configuration = datadog::Configuration::new(); let api = SlackIntegrationAPI::with_config(configuration); let resp = api .update_slack_integration_channel( "account_name".to_string(), "channel_name".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a Slack integration channel ``` /** * Update a Slack integration channel returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SlackIntegrationApi(configuration); const params: v1.SlackIntegrationApiUpdateSlackIntegrationChannelRequest = { body: { display: { message: true, muteButtons: false, notified: true, snapshot: true, tags: true, }, name: "#general", }, accountName: "account_name", channelName: "channel_name", }; apiInstance .updateSlackIntegrationChannel(params) .then((data: v1.SlackIntegrationChannel) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a Slack integration channel](https://docs.datadoghq.com/api/latest/slack-integration/#remove-a-slack-integration-channel) * [v1 (latest)](https://docs.datadoghq.com/api/latest/slack-integration/#remove-a-slack-integration-channel-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ap2.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.eu/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.ddog-gov.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us3.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name}https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/{account_name}/channels/{channel_name} ### Overview Remove a channel from your Datadog-Slack integration. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description account_name [_required_] string Your Slack account name. channel_name [_required_] string The name of the Slack channel being operated on. 
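The `account_name` and `channel_name` path parameters are substituted directly into the request URL in the curl examples on this page, and channel names in the example payloads include the leading `#` (for example `#general`). A literal `#` in a URL marks the start of the fragment, so if the value you pass contains one it must be percent-encoded before it is placed in the path. A minimal sketch of that encoding step, using only the Python standard library and placeholder values taken from examples elsewhere on this page:

```python
# Minimal sketch: percent-encode Slack path parameters before building the
# request URL. "Main_Account" and "#general" are placeholder values taken from
# the examples on this page, not required names.
from urllib.parse import quote

account_name = "Main_Account"
channel_name = "#general"

url = (
    "https://api.datadoghq.com/api/v1/integration/slack/configuration/accounts/"
    f"{quote(account_name, safe='')}/channels/{quote(channel_name, safe='')}"
)
# The "#" is encoded as %23, so the channel segment is sent as part of the path
# instead of being treated as a URL fragment. Send the request with the usual
# DD-API-KEY and DD-APPLICATION-KEY headers.
print(url)  # ...accounts/Main_Account/channels/%23general
```

The same encoding applies to the Get and Update channel endpoints above, which take the same path parameters.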
### Response * [204](https://docs.datadoghq.com/api/latest/slack-integration/#RemoveSlackIntegrationChannel-204-v1) * [400](https://docs.datadoghq.com/api/latest/slack-integration/#RemoveSlackIntegrationChannel-400-v1) * [403](https://docs.datadoghq.com/api/latest/slack-integration/#RemoveSlackIntegrationChannel-403-v1) * [404](https://docs.datadoghq.com/api/latest/slack-integration/#RemoveSlackIntegrationChannel-404-v1) * [429](https://docs.datadoghq.com/api/latest/slack-integration/#RemoveSlackIntegrationChannel-429-v1) The channel was removed successfully. Bad Request * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/slack-integration/) * [Example](https://docs.datadoghq.com/api/latest/slack-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/slack-integration/?code-lang=typescript) ##### Remove a Slack integration channel Copy ``` # Path parameters export account_name="CHANGE_ME" export channel_name="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/slack/configuration/accounts/${account_name}/channels/${channel_name}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a Slack integration channel ``` """ Remove a Slack integration channel returns "The channel was removed successfully." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.slack_integration_api import SlackIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SlackIntegrationApi(api_client) api_instance.remove_slack_integration_channel( account_name="account_name", channel_name="channel_name", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a Slack integration channel ``` # Remove a Slack integration channel returns "The channel was removed successfully." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SlackIntegrationAPI.new api_instance.remove_slack_integration_channel("account_name", "channel_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a Slack integration channel ``` // Remove a Slack integration channel returns "The channel was removed successfully." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSlackIntegrationApi(apiClient) r, err := api.RemoveSlackIntegrationChannel(ctx, "account_name", "channel_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SlackIntegrationApi.RemoveSlackIntegrationChannel`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a Slack integration channel ``` // Remove a Slack integration channel returns "The channel was removed successfully." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SlackIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SlackIntegrationApi apiInstance = new SlackIntegrationApi(defaultClient); try { apiInstance.removeSlackIntegrationChannel("account_name", "channel_name"); } catch (ApiException e) { System.err.println( "Exception when calling SlackIntegrationApi#removeSlackIntegrationChannel"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a Slack integration channel ``` // Remove a Slack integration channel returns "The channel was removed // successfully." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_slack_integration::SlackIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SlackIntegrationAPI::with_config(configuration); let resp = api .remove_slack_integration_channel("account_name".to_string(), "channel_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a Slack integration channel ``` /** * Remove a Slack integration channel returns "The channel was removed successfully." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SlackIntegrationApi(configuration); const params: v1.SlackIntegrationApiRemoveSlackIntegrationChannelRequest = { accountName: "account_name", channelName: "channel_name", }; apiInstance .removeSlackIntegrationChannel(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=02b90dc1-d04d-42fe-98f1-8f7ab7a82636&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=73736a74-ae36-4955-88aa-fb424771d23e&pt=Slack%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fslack-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=02b90dc1-d04d-42fe-98f1-8f7ab7a82636&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=73736a74-ae36-4955-88aa-fb424771d23e&pt=Slack%20Integration&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fslack-integration%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) Feedback ## Was this page helpful? Yes 🎉 No 👎 Next ![](https://survey-images.hotjar.com/surveys/logo/90f40352a7464c849f5ce82ccd0e758d) --- # Source: https://docs.datadoghq.com/api/latest/snapshots/ # Snapshots Take graph snapshots using the API. ## [Take graph snapshots](https://docs.datadoghq.com/api/latest/snapshots/#take-graph-snapshots) * [v1 (latest)](https://docs.datadoghq.com/api/latest/snapshots/#take-graph-snapshots-v1) GET https://api.ap1.datadoghq.com/api/v1/graph/snapshothttps://api.ap2.datadoghq.com/api/v1/graph/snapshothttps://api.datadoghq.eu/api/v1/graph/snapshothttps://api.ddog-gov.com/api/v1/graph/snapshothttps://api.datadoghq.com/api/v1/graph/snapshothttps://api.us3.datadoghq.com/api/v1/graph/snapshothttps://api.us5.datadoghq.com/api/v1/graph/snapshot ### Overview Take graph snapshots. Snapshots are PNG images generated by rendering a specified widget in a web page and capturing it once the data is available. The image is then uploaded to cloud storage. **Note** : When a snapshot is created, there is some delay before it is available. ### Arguments #### Query Strings Name Type Description metric_query string The metric query. start [_required_] integer The POSIX timestamp of the start of the query in seconds. end [_required_] integer The POSIX timestamp of the end of the query in seconds. event_query string A query that adds event bands to the graph. graph_def string A JSON document defining the graph. `graph_def` can be used instead of `metric_query`. The JSON document uses the [grammar defined here](https://docs.datadoghq.com/graphing/graphing_json/#grammar) and should be formatted to a single line then URL encoded. title string A title for the graph. If no title is specified, the graph does not have a title. height integer The height of the graph. If no height is specified, the graph’s original height is used. width integer The width of the graph. If no width is specified, the graph’s original width is used. 
### Response * [200](https://docs.datadoghq.com/api/latest/snapshots/#GetGraphSnapshot-200-v1) * [400](https://docs.datadoghq.com/api/latest/snapshots/#GetGraphSnapshot-400-v1) * [403](https://docs.datadoghq.com/api/latest/snapshots/#GetGraphSnapshot-403-v1) * [429](https://docs.datadoghq.com/api/latest/snapshots/#GetGraphSnapshot-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/snapshots/) * [Example](https://docs.datadoghq.com/api/latest/snapshots/) Object representing a graph snapshot. Expand All Field Type Description graph_def string A JSON document defining the graph. `graph_def` can be used instead of `metric_query`. The JSON document uses the [grammar defined here](https://docs.datadoghq.com/graphing/graphing_json/#grammar) and should be formatted to a single line then URL encoded. metric_query string The metric query. One of `metric_query` or `graph_def` is required. snapshot_url string URL of your [graph snapshot](https://docs.datadoghq.com/metrics/explorer/#snapshot). ``` { "graph_def": "string", "metric_query": "string", "snapshot_url": "https://app.datadoghq.com/s/f12345678/aaa-bbb-ccc" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/snapshots/) * [Example](https://docs.datadoghq.com/api/latest/snapshots/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/snapshots/) * [Example](https://docs.datadoghq.com/api/latest/snapshots/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/snapshots/) * [Example](https://docs.datadoghq.com/api/latest/snapshots/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/snapshots/?code-lang=python-legacy) ##### Take graph snapshots Copy ``` # Required query arguments export metric_query="CHANGE_ME" export start="CHANGE_ME" export end="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/graph/snapshot?metric_query=${metric_query}&start=${start}&end=${end}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Take graph snapshots ``` """ Take graph snapshots returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.snapshots_api import SnapshotsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SnapshotsApi(api_client) response = api_instance.get_graph_snapshot( metric_query="avg:system.load.1{*}", start=int((datetime.now() + relativedelta(days=-1)).timestamp()), end=int(datetime.now().timestamp()), title="System load", height=400, width=600, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Take graph snapshots ``` # Take graph snapshots returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SnapshotsAPI.new opts = { metric_query: "avg:system.load.1{*}", title: "System load", height: 400, width: 600, } p api_instance.get_graph_snapshot((Time.now + -1 * 86400).to_i, Time.now.to_i, opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Take graph snapshots ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) end_ts = Time.now().to_i start_ts = end_ts - (60 * 60) dog.graph_snapshot("system.load.1{*}", start_ts, end_ts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Take graph snapshots ``` // Take graph snapshots returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSnapshotsApi(apiClient) resp, r, err := api.GetGraphSnapshot(ctx, time.Now().AddDate(0, 0, -1).Unix(), time.Now().Unix(), *datadogV1.NewGetGraphSnapshotOptionalParameters().WithMetricQuery("avg:system.load.1{*}").WithTitle("System load").WithHeight(400).WithWidth(600)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SnapshotsApi.GetGraphSnapshot`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SnapshotsApi.GetGraphSnapshot`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Take graph snapshots ``` // Take graph snapshots returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SnapshotsApi; import com.datadog.api.client.v1.api.SnapshotsApi.GetGraphSnapshotOptionalParameters; import com.datadog.api.client.v1.model.GraphSnapshot; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SnapshotsApi apiInstance = new SnapshotsApi(defaultClient); try { GraphSnapshot result = apiInstance.getGraphSnapshot( OffsetDateTime.now().plusDays(-1).toInstant().getEpochSecond(), OffsetDateTime.now().toInstant().getEpochSecond(), new GetGraphSnapshotOptionalParameters() .metricQuery("avg:system.load.1{*}") .title("System load") .height(400L) .width(600L)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SnapshotsApi#getGraphSnapshot"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Take graph snapshots ``` from datadog import initialize, api import time options = { 'api_key': '', 'app_key': '' } initialize(**options) # Take a graph snapshot end = int(time.time()) start = end - (60 * 60) api.Graph.create( graph_def='{\ "viz": "timeseries", \ "requests": [ \ {"q": "avg:system.load.1{*}", "conditional_formats": [], "type": "line"},\ {"q": "avg:system.load.5{*}", "type": "line"}, \ {"q": 
"avg:system.load.15{*}", "type": "line"}\ ], \ "events": [\ {"q": "hosts:* ", "tags_execution": "and"}\ ]}', start=start, end=end ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Take graph snapshots ``` // Take graph snapshots returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_snapshots::GetGraphSnapshotOptionalParams; use datadog_api_client::datadogV1::api_snapshots::SnapshotsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SnapshotsAPI::with_config(configuration); let resp = api .get_graph_snapshot( 1636542671, 1636629071, GetGraphSnapshotOptionalParams::default() .metric_query("avg:system.load.1{*}".to_string()) .title("System load".to_string()) .height(400) .width(600), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Take graph snapshots ``` /** * Take graph snapshots returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SnapshotsApi(configuration); const params: v1.SnapshotsApiGetGraphSnapshotRequest = { metricQuery: "avg:system.load.1{*}", start: Math.round( new Date(new Date().getTime() + -1 * 86400 * 1000).getTime() / 1000 ), end: Math.round(new Date().getTime() / 1000), title: "System load", height: 400, width: 600, }; apiInstance .getGraphSnapshot(params) .then((data: v1.GraphSnapshot) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=8c7dd1a2-c228-492a-8e69-eb7d4bc02698&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8607fbf3-4de8-46cb-8238-56eec475dded&pt=Snapshots&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsnapshots%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=8c7dd1a2-c228-492a-8e69-eb7d4bc02698&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8607fbf3-4de8-46cb-8238-56eec475dded&pt=Snapshots&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsnapshots%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=81023550-3fb1-44e1-93b7-cae595a03133&bo=2&sid=ba46b8a0f0bf11f08761a3199a385bde&vid=ba46dde0f0bf11f0b26b6ba9935020b6&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Snapshots&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsnapshots%2F&r=<=1002&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=226282) --- # Source: https://docs.datadoghq.com/api/latest/software-catalog # Software Catalog API to create, update, retrieve, and delete Software Catalog entities. ## [Get a list of entities](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entities) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entities-v2) GET https://api.ap1.datadoghq.com/api/v2/catalog/entityhttps://api.ap2.datadoghq.com/api/v2/catalog/entityhttps://api.datadoghq.eu/api/v2/catalog/entityhttps://api.ddog-gov.com/api/v2/catalog/entityhttps://api.datadoghq.com/api/v2/catalog/entityhttps://api.us3.datadoghq.com/api/v2/catalog/entityhttps://api.us5.datadoghq.com/api/v2/catalog/entity ### Overview Get a list of entities from Software Catalog. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[offset] integer Specific offset to use as the beginning of the returned page. page[limit] integer Maximum number of entities in the response. filter[id] string Filter entities by UUID. filter[ref] string Filter entities by reference filter[name] string Filter entities by name. filter[kind] string Filter entities by kind. filter[owner] string Filter entities by owner. filter[relation][type] enum Filter entities by relation type. Allowed enum values: `RelationTypeOwns, RelationTypeOwnedBy, RelationTypeDependsOn, RelationTypeDependencyOf, RelationTypePartsOf, RelationTypeHasPart, RelationTypeOtherOwns, RelationTypeOtherOwnedBy, RelationTypeImplementedBy, RelationTypeImplements` filter[exclude_snapshot] string Filter entities by excluding snapshotted entities. 
include enum Include relationship data. Allowed enum values: `schema, raw_schema, oncall, incident, relation` includeDiscovered boolean If true, includes discovered services from APM and USM that do not have entity definitions. ### Response * [200](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogEntity-200-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogEntity-403-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogEntity-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) List entity response. Field Type Description data [object] List of entity data. attributes object Entity attributes. apiVersion string The API version. description string The description. displayName string The display name. kind string The kind. name string The name. namespace string The namespace. owner string The owner. tags [string] The tags. id string Entity ID. meta object Entity metadata. createdAt string The creation time. ingestionSource string The ingestion source. modifiedAt string The modification time. origin string The origin. relationships object Entity relationships. incidents object Entity to incidents relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. oncall object Entity to oncalls relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. rawSchema object Entity to raw schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. relatedEntities object Entity to related entities relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. schema object Entity to detail schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. type string Entity. included [ ] List entity response included. Option 1 object Included detail entity schema. attributes object Included schema. schema Entity schema v3. Option 1 object Schema for service entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. 
serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Service Kind object. Allowed enum values: `service` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Service Spec object. componentOf [string] A list of components the service is a part of dependsOn [string] A list of components the service depends on. languages [string] The service's programming language. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of service. Option 2 object Schema for datastore entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. This is the free-formed field to send client side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Datastore Kind object. Allowed enum values: `datastore` metadata [_required_] object The definition of Entity V3 Metadata object. 
additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Datastore Spec object. componentOf [string] A list of components the datastore is a part of lifecycle string The lifecycle state of the datastore. tier string The importance of the datastore. type string The type of datastore. Option 3 object Schema for queue entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Queue Kind object. Allowed enum values: `queue` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. 
id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Queue Spec object. componentOf [string] A list of components the queue is a part of lifecycle string The lifecycle state of the queue. tier string The importance of the queue. type string The type of queue. Option 4 object Schema for system entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 System Kind object. Allowed enum values: `system` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. 
managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 System Spec object. components [string] A list of components belongs to the system. lifecycle string The lifecycle state of the component. tier string An entity reference to the owner of the component. Option 5 object Schema for API entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the API entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 API Kind object. Allowed enum values: `api` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. 
namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 API Spec object. implementedBy [string] Services which implemented the API. interface The API definition. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of API. id string Entity ID. type enum Schema type. Allowed enum values: `schema` Option 2 object Included raw schema. attributes object Included raw schema attributes. rawSchema string Schema from user input in base64 encoding. id string Raw schema ID. type enum Raw schema type. Allowed enum values: `rawSchema` Option 3 object Included related entity. attributes object Related entity attributes. kind string Entity kind. name string Entity name. namespace string Entity namespace. type string Entity relation type to the associated entity. id string Entity UUID. meta object Included related entity meta. createdAt date-time Entity creation time. defined_by string Entity relation defined by. modifiedAt date-time Entity modification time. source string Entity relation source. type enum Related entity. Allowed enum values: `relatedEntity` Option 4 object Included oncall. attributes object Included related oncall attributes. escalations [object] Oncall escalations. email string Oncall email. escalationLevel int64 Oncall level. name string Oncall name. provider string Oncall provider. id string Oncall ID. type enum Oncall type. Allowed enum values: `oncall` Option 5 object Included incident. attributes object Incident attributes. createdAt date-time Incident creation time. htmlURL string Incident URL. provider string Incident provider. status string Incident status. title string Incident title. id string Incident ID. type enum Incident description. Allowed enum values: `incident` links object List entity response links. next string Next link. previous string Previous link. self string Current link. meta object Entity metadata. count int64 Total entities count. includeCount int64 Total included data count. 
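The query strings listed under Arguments can be combined on a single call. The following is a minimal curl sketch, not part of the official examples above: it filters by kind and owner, requests 50 entities per page, and includes detail schemas. The `service` and `team-shopping-cart` filter values are illustrative placeholders; use the `links.next` value from the response to fetch subsequent pages. An example response body is shown below.

```
# Hedged sketch: combine the documented query strings on one request.
# "service" and "team-shopping-cart" are placeholder filter values.
curl -G "https://api.datadoghq.com/api/v2/catalog/entity" \
  --data-urlencode "filter[kind]=service" \
  --data-urlencode "filter[owner]=team-shopping-cart" \
  --data-urlencode "page[limit]=50" \
  --data-urlencode "page[offset]=0" \
  --data-urlencode "include=schema" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```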
``` { "data": [ { "attributes": { "apiVersion": "string", "description": "string", "displayName": "string", "kind": "string", "name": "string", "namespace": "string", "owner": "string", "tags": [] }, "id": "string", "meta": { "createdAt": "string", "ingestionSource": "string", "modifiedAt": "string", "origin": "string" }, "relationships": { "incidents": { "data": [ { "id": "string", "type": "string" } ] }, "oncall": { "data": [ { "id": "string", "type": "string" } ] }, "rawSchema": { "data": { "id": "string", "type": "string" } }, "relatedEntities": { "data": [ { "id": "string", "type": "string" } ] }, "schema": { "data": { "id": "string", "type": "string" } } }, "type": "string" } ], "included": [ { "attributes": { "schema": { "apiVersion": "v3", "datadog": { "codeLocations": [ { "paths": [], "repositoryURL": "string" } ], "events": [ { "name": "string", "query": "string" } ], "logs": [ { "name": "string", "query": "string" } ], "performanceData": { "tags": [] }, "pipelines": { "fingerprints": [] } }, "extensions": {}, "integrations": { "opsgenie": { "region": "string", "serviceURL": "https://www.opsgenie.com/service/shopping-cart" }, "pagerduty": { "serviceURL": "https://www.pagerduty.com/service-directory/Pshopping-cart" } }, "kind": "service", "metadata": { "additionalOwners": [ { "name": "", "type": "string" } ], "contacts": [ { "contact": "https://slack/", "name": "string", "type": "slack" } ], "description": "string", "displayName": "string", "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "inheritFrom": "application:default/myapp", "links": [ { "name": "mylink", "provider": "string", "type": "link", "url": "https://mylink" } ], "managed": {}, "name": "myService", "namespace": "default", "owner": "string", "tags": [ "this:tag", "that:tag" ] }, "spec": { "componentOf": [], "dependsOn": [], "languages": [], "lifecycle": "string", "tier": "string", "type": "string" } } }, "id": "string", "type": "string" } ], "links": { "next": "string", "previous": "string", "self": "string" }, "meta": { "count": "integer", "includeCount": "integer" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Get a list of entities Copy ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/catalog/entity" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of entities ``` """ Get a list of entities returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.list_catalog_entity() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of entities ``` # Get a list of entities returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new p api_instance.list_catalog_entity() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get a list of entities ``` // Get a list of entities returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.ListCatalogEntity(ctx, *datadogV2.NewListCatalogEntityOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.ListCatalogEntity`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SoftwareCatalogApi.ListCatalogEntity`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of entities ``` // Get a list of entities returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.ListEntityCatalogResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { ListEntityCatalogResponse result = apiInstance.listCatalogEntity(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#listCatalogEntity"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of entities ``` // Get a list of entities returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::ListCatalogEntityOptionalParams; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api .list_catalog_entity(ListCatalogEntityOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of entities ``` /** * Get a list of entities returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); apiInstance .listCatalogEntity() .then((data: v2.ListEntityCatalogResponse) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create or update entities](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-entities) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-entities-v2) POST https://api.ap1.datadoghq.com/api/v2/catalog/entityhttps://api.ap2.datadoghq.com/api/v2/catalog/entityhttps://api.datadoghq.eu/api/v2/catalog/entityhttps://api.ddog-gov.com/api/v2/catalog/entityhttps://api.datadoghq.com/api/v2/catalog/entityhttps://api.us3.datadoghq.com/api/v2/catalog/entityhttps://api.us5.datadoghq.com/api/v2/catalog/entity ### Overview Create or update entities in Software Catalog. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Request #### Body Data (required) Entity YAML or JSON. * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) Field Type Description Option 1 Entity schema v3. Option 1 object Schema for service entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Service Kind object. Allowed enum values: `service` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value.
name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Service Spec object. componentOf [string] A list of components the service is a part of dependsOn [string] A list of components the service depends on. languages [string] The service's programming language. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of service. Option 2 object Schema for datastore entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. This is the free-formed field to send client side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Datastore Kind object. Allowed enum values: `datastore` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. 
inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Datastore Spec object. componentOf [string] A list of components the datastore is a part of lifecycle string The lifecycle state of the datastore. tier string The importance of the datastore. type string The type of datastore. Option 3 object Schema for queue entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Queue Kind object. Allowed enum values: `queue` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. 
owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Queue Spec object. componentOf [string] A list of components the queue is a part of lifecycle string The lifecycle state of the queue. tier string The importance of the queue. type string The type of queue. Option 4 object Schema for system entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 System Kind object. Allowed enum values: `system` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 System Spec object. components [string] A list of components belongs to the system. lifecycle string The lifecycle state of the component. tier string An entity reference to the owner of the component. Option 5 object Schema for API entities. 
apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the API entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 API Kind object. Allowed enum values: `api` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 API Spec object. implementedBy [string] Services which implemented the API. interface The API definition. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of API. Option 2 string Entity definition in raw JSON or YAML representation. 
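At the top level of the service schema above, only `apiVersion`, `kind`, and `metadata` (which needs at least a unique `name`) are marked required; every other field is optional. The following is a minimal curl sketch, not one of the official examples; the `shopping-cart` name and `team-sre` owner are placeholder values. The full example request body used throughout this section follows.

```
# Hedged sketch: upsert a bare-bones service entity (required fields plus an owner).
# "shopping-cart" and "team-sre" are placeholder values.
curl -X POST "https://api.datadoghq.com/api/v2/catalog/entity" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "apiVersion": "v3",
  "kind": "service",
  "metadata": {
    "name": "shopping-cart",
    "owner": "team-sre"
  }
}
EOF
```

A successful call returns 202 ACCEPTED with the stored entity data, as described under Response.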
``` { "apiVersion": "v3", "datadog": { "codeLocations": [ { "paths": [] } ], "events": [ {} ], "logs": [ {} ], "performanceData": { "tags": [] }, "pipelines": { "fingerprints": [] } }, "integrations": { "opsgenie": { "serviceURL": "https://www.opsgenie.com/service/shopping-cart" }, "pagerduty": { "serviceURL": "https://www.pagerduty.com/service-directory/Pshopping-cart" } }, "kind": "service", "metadata": { "additionalOwners": [], "contacts": [ { "contact": "https://slack/", "type": "slack" } ], "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "inheritFrom": "application:default/myapp", "links": [ { "name": "mylink", "type": "link", "url": "https://mylink" } ], "name": "service-examplesoftwarecatalog", "tags": [ "this:tag", "that:tag" ] }, "spec": { "dependsOn": [], "languages": [] } } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogEntity-202-v2) * [400](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogEntity-400-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogEntity-403-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogEntity-429-v2) ACCEPTED * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) Upsert entity response. Field Type Description data [object] List of entity data. attributes object Entity attributes. apiVersion string The API version. description string The description. displayName string The display name. kind string The kind. name string The name. namespace string The namespace. owner string The owner. tags [string] The tags. id string Entity ID. meta object Entity metadata. createdAt string The creation time. ingestionSource string The ingestion source. modifiedAt string The modification time. origin string The origin. relationships object Entity relationships. incidents object Entity to incidents relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. oncall object Entity to oncalls relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. rawSchema object Entity to raw schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. relatedEntities object Entity to related entities relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. schema object Entity to detail schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. type string Entity. included [ ] Upsert entity response included. Option 1 object Included detail entity schema. attributes object Included schema. schema Entity schema v3. Option 1 object Schema for service entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. 
logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Service Kind object. Allowed enum values: `service` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Service Spec object. componentOf [string] A list of components the service is a part of dependsOn [string] A list of components the service depends on. languages [string] The service's programming language. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of service. Option 2 object Schema for datastore entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. 
This is the free-formed field to send client side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Datastore Kind object. Allowed enum values: `datastore` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Datastore Spec object. componentOf [string] A list of components the datastore is a part of lifecycle string The lifecycle state of the datastore. tier string The importance of the datastore. type string The type of datastore. Option 3 object Schema for queue entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the datastore entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 Queue Kind object. 
Allowed enum values: `queue` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 Queue Spec object. componentOf [string] A list of components the queue is a part of lifecycle string The lifecycle state of the queue. tier string The importance of the queue. type string The type of queue. Option 4 object Schema for system entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the service entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 System Kind object. Allowed enum values: `system` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. 
The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 System Spec object. components [string] A list of components belongs to the system. lifecycle string The lifecycle state of the component. tier string An entity reference to the owner of the component. Option 5 object Schema for API entities. apiVersion [_required_] enum The version of the schema data that was used to populate this entity's data. This could be via the API, Terraform, or YAML file in a repository. The field is known as schema-version in the previous version. Allowed enum values: `v3,v2.2,v2.1,v2` datadog object Datadog product integrations for the API entity. codeLocations [object] Schema for mapping source code locations to an entity. paths [string] The paths (glob) to the source code of the service. repositoryURL string The repository path of the source code of the entity. events [object] Events associations. name string The name of the query. query string The query to run. logs [object] Logs association. name string The name of the query. query string The query to run. performanceData object Performance stats association. tags [string] A list of APM entity tags that associates the APM Stats data with the entity. pipelines object CI Pipelines association. fingerprints [string] A list of CI Fingerprints that associate CI Pipelines with the entity. extensions object Custom extensions. This is the free-formed field to send client-side metadata. No Datadog features are affected by this field. integrations object A base schema for defining third-party integrations. opsgenie object An Opsgenie integration schema. region string The region for the Opsgenie integration. serviceURL [_required_] string The service URL for the Opsgenie integration. pagerduty object A PagerDuty integration schema. serviceURL [_required_] string The service URL for the PagerDuty integration. kind [_required_] enum The definition of Entity V3 API Kind object. Allowed enum values: `api` metadata [_required_] object The definition of Entity V3 Metadata object. additionalOwners [object] The additional owners of the entity, usually a team. name [_required_] string Team name. type string Team type. contacts [object] A list of contacts for the entity. contact [_required_] string Contact value. name string Contact name. type [_required_] string Contact type. description string Short description of the entity. The UI can leverage the description for display. displayName string User friendly name of the entity. The UI can leverage the display name for display. id string A read-only globally unique identifier for the entity generated by Datadog. 
User supplied values are ignored. inheritFrom string The entity reference from which to inherit metadata links [object] A list of links for the entity. name [_required_] string Link name. provider string Link provider. type [_required_] string Link type. default: `other` url [_required_] string Link URL. managed object A read-only set of Datadog managed attributes generated by Datadog. User supplied values are ignored. name [_required_] string Unique name given to an entity under the kind/namespace. namespace string Namespace is a part of unique identifier. It has a default value of 'default'. owner string The owner of the entity, usually a team. tags [string] A set of custom tags. spec object The definition of Entity V3 API Spec object. implementedBy [string] Services which implemented the API. interface The API definition. lifecycle string The lifecycle state of the component. tier string The importance of the component. type string The type of API. id string Entity ID. type enum Schema type. Allowed enum values: `schema` meta object Entity metadata. count int64 Total entities count. includeCount int64 Total included data count. ``` { "data": [ { "attributes": { "apiVersion": "string", "description": "string", "displayName": "string", "kind": "string", "name": "string", "namespace": "string", "owner": "string", "tags": [] }, "id": "string", "meta": { "createdAt": "string", "ingestionSource": "string", "modifiedAt": "string", "origin": "string" }, "relationships": { "incidents": { "data": [ { "id": "string", "type": "string" } ] }, "oncall": { "data": [ { "id": "string", "type": "string" } ] }, "rawSchema": { "data": { "id": "string", "type": "string" } }, "relatedEntities": { "data": [ { "id": "string", "type": "string" } ] }, "schema": { "data": { "id": "string", "type": "string" } } }, "type": "string" } ], "included": [ { "attributes": { "schema": { "apiVersion": "v3", "datadog": { "codeLocations": [ { "paths": [], "repositoryURL": "string" } ], "events": [ { "name": "string", "query": "string" } ], "logs": [ { "name": "string", "query": "string" } ], "performanceData": { "tags": [] }, "pipelines": { "fingerprints": [] } }, "extensions": {}, "integrations": { "opsgenie": { "region": "string", "serviceURL": "https://www.opsgenie.com/service/shopping-cart" }, "pagerduty": { "serviceURL": "https://www.pagerduty.com/service-directory/Pshopping-cart" } }, "kind": "service", "metadata": { "additionalOwners": [ { "name": "", "type": "string" } ], "contacts": [ { "contact": "https://slack/", "name": "string", "type": "slack" } ], "description": "string", "displayName": "string", "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "inheritFrom": "application:default/myapp", "links": [ { "name": "mylink", "provider": "string", "type": "link", "url": "https://mylink" } ], "managed": {}, "name": "myService", "namespace": "default", "owner": "string", "tags": [ "this:tag", "that:tag" ] }, "spec": { "componentOf": [], "dependsOn": [], "languages": [], "lifecycle": "string", "tier": "string", "type": "string" } } }, "id": "string", "type": "string" } ], "meta": { "count": "integer", "includeCount": "integer" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/entity" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "apiVersion": "v3", "datadog": { "codeLocations": [ { "paths": [] } ], "events": [ {} ], "logs": [ {} ], "performanceData": { "tags": [] }, "pipelines": { "fingerprints": [] } }, "integrations": { "opsgenie": { "serviceURL": "https://www.opsgenie.com/service/shopping-cart" }, "pagerduty": { "serviceURL": "https://www.pagerduty.com/service-directory/Pshopping-cart" } }, "kind": "service", "metadata": { "additionalOwners": [], "contacts": [ { "contact": "https://slack/", "type": "slack" } ], "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "inheritFrom": "application:default/myapp", "links": [ { "name": "mylink", "type": "link", "url": "https://mylink" } ], "name": "service-examplesoftwarecatalog", "tags": [ "this:tag", "that:tag" ] }, "spec": { "dependsOn": [], "languages": [] } } EOF ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` // Create or update software catalog entity using schema v3 returns "ACCEPTED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpsertCatalogEntityRequest{ EntityV3: &datadogV2.EntityV3{ EntityV3Service: &datadogV2.EntityV3Service{ ApiVersion: datadogV2.ENTITYV3APIVERSION_V3, Datadog: &datadogV2.EntityV3ServiceDatadog{ CodeLocations: []datadogV2.EntityV3DatadogCodeLocationItem{ { Paths: []string{}, }, }, Events: []datadogV2.EntityV3DatadogEventItem{ {}, }, Logs: []datadogV2.EntityV3DatadogLogItem{ {}, }, PerformanceData: &datadogV2.EntityV3DatadogPerformance{ Tags: []string{}, }, Pipelines: &datadogV2.EntityV3DatadogPipelines{ Fingerprints: []string{}, }, }, Integrations: &datadogV2.EntityV3Integrations{ Opsgenie: 
&datadogV2.EntityV3DatadogIntegrationOpsgenie{ ServiceUrl: "https://www.opsgenie.com/service/shopping-cart", }, Pagerduty: &datadogV2.EntityV3DatadogIntegrationPagerduty{ ServiceUrl: "https://www.pagerduty.com/service-directory/Pshopping-cart", }, }, Kind: datadogV2.ENTITYV3SERVICEKIND_SERVICE, Metadata: datadogV2.EntityV3Metadata{ AdditionalOwners: []datadogV2.EntityV3MetadataAdditionalOwnersItems{}, Contacts: []datadogV2.EntityV3MetadataContactsItems{ { Contact: "https://slack/", Type: "slack", }, }, Id: datadog.PtrString("4b163705-23c0-4573-b2fb-f6cea2163fcb"), InheritFrom: datadog.PtrString("application:default/myapp"), Links: []datadogV2.EntityV3MetadataLinksItems{ { Name: "mylink", Type: "link", Url: "https://mylink", }, }, Name: "service-examplesoftwarecatalog", Tags: []string{ "this:tag", "that:tag", }, }, Spec: &datadogV2.EntityV3ServiceSpec{ DependsOn: []string{}, Languages: []string{}, }, }}} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.UpsertCatalogEntity(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.UpsertCatalogEntity`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SoftwareCatalogApi.UpsertCatalogEntity`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` // Create or update software catalog entity using schema v3 returns "ACCEPTED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.EntityV3; import com.datadog.api.client.v2.model.EntityV3APIVersion; import com.datadog.api.client.v2.model.EntityV3DatadogCodeLocationItem; import com.datadog.api.client.v2.model.EntityV3DatadogEventItem; import com.datadog.api.client.v2.model.EntityV3DatadogIntegrationOpsgenie; import com.datadog.api.client.v2.model.EntityV3DatadogIntegrationPagerduty; import com.datadog.api.client.v2.model.EntityV3DatadogLogItem; import com.datadog.api.client.v2.model.EntityV3DatadogPerformance; import com.datadog.api.client.v2.model.EntityV3DatadogPipelines; import com.datadog.api.client.v2.model.EntityV3Integrations; import com.datadog.api.client.v2.model.EntityV3Metadata; import com.datadog.api.client.v2.model.EntityV3MetadataContactsItems; import com.datadog.api.client.v2.model.EntityV3MetadataLinksItems; import com.datadog.api.client.v2.model.EntityV3Service; import com.datadog.api.client.v2.model.EntityV3ServiceDatadog; import com.datadog.api.client.v2.model.EntityV3ServiceKind; import com.datadog.api.client.v2.model.EntityV3ServiceSpec; import com.datadog.api.client.v2.model.UpsertCatalogEntityRequest; import com.datadog.api.client.v2.model.UpsertCatalogEntityResponse; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = 
ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); UpsertCatalogEntityRequest body = new UpsertCatalogEntityRequest( new EntityV3( new EntityV3Service() .apiVersion(EntityV3APIVersion.V3) .datadog( new EntityV3ServiceDatadog() .codeLocations( Collections.singletonList(new EntityV3DatadogCodeLocationItem())) .events(Collections.singletonList(new EntityV3DatadogEventItem())) .logs(Collections.singletonList(new EntityV3DatadogLogItem())) .performanceData(new EntityV3DatadogPerformance()) .pipelines(new EntityV3DatadogPipelines())) .integrations( new EntityV3Integrations() .opsgenie( new EntityV3DatadogIntegrationOpsgenie() .serviceUrl("https://www.opsgenie.com/service/shopping-cart")) .pagerduty( new EntityV3DatadogIntegrationPagerduty() .serviceUrl( "https://www.pagerduty.com/service-directory/Pshopping-cart"))) .kind(EntityV3ServiceKind.SERVICE) .metadata( new EntityV3Metadata() .contacts( Collections.singletonList( new EntityV3MetadataContactsItems() .contact("https://slack/") .type("slack"))) .id("4b163705-23c0-4573-b2fb-f6cea2163fcb") .inheritFrom("application:default/myapp") .links( Collections.singletonList( new EntityV3MetadataLinksItems() .name("mylink") .type("link") .url("https://mylink"))) .name("service-examplesoftwarecatalog") .tags(Arrays.asList("this:tag", "that:tag"))) .spec(new EntityV3ServiceSpec()))); try { UpsertCatalogEntityResponse result = apiInstance.upsertCatalogEntity(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#upsertCatalogEntity"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` """ Create or update software catalog entity using schema v3 returns "ACCEPTED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi from datadog_api_client.v2.model.entity_v3_api_version import EntityV3APIVersion from datadog_api_client.v2.model.entity_v3_datadog_code_location_item import EntityV3DatadogCodeLocationItem from datadog_api_client.v2.model.entity_v3_datadog_event_item import EntityV3DatadogEventItem from datadog_api_client.v2.model.entity_v3_datadog_integration_opsgenie import EntityV3DatadogIntegrationOpsgenie from datadog_api_client.v2.model.entity_v3_datadog_integration_pagerduty import EntityV3DatadogIntegrationPagerduty from datadog_api_client.v2.model.entity_v3_datadog_log_item import EntityV3DatadogLogItem from datadog_api_client.v2.model.entity_v3_datadog_performance import EntityV3DatadogPerformance from datadog_api_client.v2.model.entity_v3_datadog_pipelines import EntityV3DatadogPipelines from datadog_api_client.v2.model.entity_v3_integrations import EntityV3Integrations from datadog_api_client.v2.model.entity_v3_metadata import EntityV3Metadata from datadog_api_client.v2.model.entity_v3_metadata_contacts_items import 
EntityV3MetadataContactsItems from datadog_api_client.v2.model.entity_v3_metadata_links_items import EntityV3MetadataLinksItems from datadog_api_client.v2.model.entity_v3_service import EntityV3Service from datadog_api_client.v2.model.entity_v3_service_datadog import EntityV3ServiceDatadog from datadog_api_client.v2.model.entity_v3_service_kind import EntityV3ServiceKind from datadog_api_client.v2.model.entity_v3_service_spec import EntityV3ServiceSpec body = EntityV3Service( api_version=EntityV3APIVersion.V3, datadog=EntityV3ServiceDatadog( code_locations=[ EntityV3DatadogCodeLocationItem( paths=[], ), ], events=[ EntityV3DatadogEventItem(), ], logs=[ EntityV3DatadogLogItem(), ], performance_data=EntityV3DatadogPerformance( tags=[], ), pipelines=EntityV3DatadogPipelines( fingerprints=[], ), ), integrations=EntityV3Integrations( opsgenie=EntityV3DatadogIntegrationOpsgenie( service_url="https://www.opsgenie.com/service/shopping-cart", ), pagerduty=EntityV3DatadogIntegrationPagerduty( service_url="https://www.pagerduty.com/service-directory/Pshopping-cart", ), ), kind=EntityV3ServiceKind.SERVICE, metadata=EntityV3Metadata( additional_owners=[], contacts=[ EntityV3MetadataContactsItems( contact="https://slack/", type="slack", ), ], id="4b163705-23c0-4573-b2fb-f6cea2163fcb", inherit_from="application:default/myapp", links=[ EntityV3MetadataLinksItems( name="mylink", type="link", url="https://mylink", ), ], name="service-examplesoftwarecatalog", tags=[ "this:tag", "that:tag", ], ), spec=EntityV3ServiceSpec( depends_on=[], languages=[], ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.upsert_catalog_entity(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` # Create or update software catalog entity using schema v3 returns "ACCEPTED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new body = DatadogAPIClient::V2::EntityV3Service.new({ api_version: DatadogAPIClient::V2::EntityV3APIVersion::V3, datadog: DatadogAPIClient::V2::EntityV3ServiceDatadog.new({ code_locations: [ DatadogAPIClient::V2::EntityV3DatadogCodeLocationItem.new({ paths: [], }), ], events: [ DatadogAPIClient::V2::EntityV3DatadogEventItem.new({}), ], logs: [ DatadogAPIClient::V2::EntityV3DatadogLogItem.new({}), ], performance_data: DatadogAPIClient::V2::EntityV3DatadogPerformance.new({ tags: [], }), pipelines: DatadogAPIClient::V2::EntityV3DatadogPipelines.new({ fingerprints: [], }), }), integrations: DatadogAPIClient::V2::EntityV3Integrations.new({ opsgenie: DatadogAPIClient::V2::EntityV3DatadogIntegrationOpsgenie.new({ service_url: "https://www.opsgenie.com/service/shopping-cart", }), pagerduty: DatadogAPIClient::V2::EntityV3DatadogIntegrationPagerduty.new({ service_url: "https://www.pagerduty.com/service-directory/Pshopping-cart", }), }), kind: DatadogAPIClient::V2::EntityV3ServiceKind::SERVICE, metadata: DatadogAPIClient::V2::EntityV3Metadata.new({ additional_owners: [], contacts: [ DatadogAPIClient::V2::EntityV3MetadataContactsItems.new({ 
contact: "https://slack/", type: "slack", }), ], id: "4b163705-23c0-4573-b2fb-f6cea2163fcb", inherit_from: "application:default/myapp", links: [ DatadogAPIClient::V2::EntityV3MetadataLinksItems.new({ name: "mylink", type: "link", url: "https://mylink", }), ], name: "service-examplesoftwarecatalog", tags: [ "this:tag", "that:tag", ], }), spec: DatadogAPIClient::V2::EntityV3ServiceSpec.new({ depends_on: [], languages: [], }), }) p api_instance.upsert_catalog_entity(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` // Create or update software catalog entity using schema v3 returns "ACCEPTED" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; use datadog_api_client::datadogV2::model::EntityV3; use datadog_api_client::datadogV2::model::EntityV3APIVersion; use datadog_api_client::datadogV2::model::EntityV3DatadogCodeLocationItem; use datadog_api_client::datadogV2::model::EntityV3DatadogEventItem; use datadog_api_client::datadogV2::model::EntityV3DatadogIntegrationOpsgenie; use datadog_api_client::datadogV2::model::EntityV3DatadogIntegrationPagerduty; use datadog_api_client::datadogV2::model::EntityV3DatadogLogItem; use datadog_api_client::datadogV2::model::EntityV3DatadogPerformance; use datadog_api_client::datadogV2::model::EntityV3DatadogPipelines; use datadog_api_client::datadogV2::model::EntityV3Integrations; use datadog_api_client::datadogV2::model::EntityV3Metadata; use datadog_api_client::datadogV2::model::EntityV3MetadataContactsItems; use datadog_api_client::datadogV2::model::EntityV3MetadataLinksItems; use datadog_api_client::datadogV2::model::EntityV3Service; use datadog_api_client::datadogV2::model::EntityV3ServiceDatadog; use datadog_api_client::datadogV2::model::EntityV3ServiceKind; use datadog_api_client::datadogV2::model::EntityV3ServiceSpec; use datadog_api_client::datadogV2::model::UpsertCatalogEntityRequest; #[tokio::main] async fn main() { let body = UpsertCatalogEntityRequest::EntityV3(Box::new(EntityV3::EntityV3Service(Box::new( EntityV3Service::new( EntityV3APIVersion::V3, EntityV3ServiceKind::SERVICE, EntityV3Metadata::new("service-examplesoftwarecatalog".to_string()) .additional_owners(vec![]) .contacts(vec![EntityV3MetadataContactsItems::new( "https://slack/".to_string(), "slack".to_string(), )]) .id("4b163705-23c0-4573-b2fb-f6cea2163fcb".to_string()) .inherit_from("application:default/myapp".to_string()) .links(vec![EntityV3MetadataLinksItems::new( "mylink".to_string(), "link".to_string(), "https://mylink".to_string(), )]) .tags(vec!["this:tag".to_string(), "that:tag".to_string()]), ) .datadog( EntityV3ServiceDatadog::new() .code_locations(vec![EntityV3DatadogCodeLocationItem::new().paths(vec![])]) .events(vec![EntityV3DatadogEventItem::new()]) .logs(vec![EntityV3DatadogLogItem::new()]) .performance_data(EntityV3DatadogPerformance::new().tags(vec![])) .pipelines(EntityV3DatadogPipelines::new().fingerprints(vec![])), ) .integrations( EntityV3Integrations::new() .opsgenie(EntityV3DatadogIntegrationOpsgenie::new( "https://www.opsgenie.com/service/shopping-cart".to_string(), )) 
.pagerduty(EntityV3DatadogIntegrationPagerduty::new( "https://www.pagerduty.com/service-directory/Pshopping-cart".to_string(), )), ) .spec( EntityV3ServiceSpec::new() .depends_on(vec![]) .languages(vec![]), ), )))); let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api.upsert_catalog_entity(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create or update software catalog entity using schema v3 returns "ACCEPTED" response ``` /** * Create or update software catalog entity using schema v3 returns "ACCEPTED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); const params: v2.SoftwareCatalogApiUpsertCatalogEntityRequest = { body: { apiVersion: "v3", datadog: { codeLocations: [ { paths: [], }, ], events: [{}], logs: [{}], performanceData: { tags: [], }, pipelines: { fingerprints: [], }, }, integrations: { opsgenie: { serviceUrl: "https://www.opsgenie.com/service/shopping-cart", }, pagerduty: { serviceUrl: "https://www.pagerduty.com/service-directory/Pshopping-cart", }, }, kind: "service", metadata: { additionalOwners: [], contacts: [ { contact: "https://slack/", type: "slack", }, ], id: "4b163705-23c0-4573-b2fb-f6cea2163fcb", inheritFrom: "application:default/myapp", links: [ { name: "mylink", type: "link", url: "https://mylink", }, ], name: "service-examplesoftwarecatalog", tags: ["this:tag", "that:tag"], }, spec: { dependsOn: [], languages: [], }, }, }; apiInstance .upsertCatalogEntity(params) .then((data: v2.UpsertCatalogEntityResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a single entity](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-entity) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-entity-v2) DELETE https://api.ap1.datadoghq.com/api/v2/catalog/entity/{entity_id}https://api.ap2.datadoghq.com/api/v2/catalog/entity/{entity_id}https://api.datadoghq.eu/api/v2/catalog/entity/{entity_id}https://api.ddog-gov.com/api/v2/catalog/entity/{entity_id}https://api.datadoghq.com/api/v2/catalog/entity/{entity_id}https://api.us3.datadoghq.com/api/v2/catalog/entity/{entity_id}https://api.us5.datadoghq.com/api/v2/catalog/entity/{entity_id} ### Overview Delete a single entity in Software Catalog. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. 
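The `entity_id` path parameter accepts either an entity reference (such as `service:myservice` in the examples below) or the entity's Datadog-generated UUID. As a quick illustration, here is a minimal Python sketch using the `datadog_api_client` package from the examples that follow; the UUID value is only a placeholder, and credentials come from the `DD_API_KEY`, `DD_APP_KEY`, and `DD_SITE` environment variables described in the run instructions.

```
# Minimal sketch: delete a catalog entity by entity reference or by UUID.
# Assumes DD_API_KEY, DD_APP_KEY, and DD_SITE are set in the environment.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SoftwareCatalogApi(api_client)

    # Delete by entity reference (kind:name)
    api_instance.delete_catalog_entity(entity_id="service:myservice")

    # Or delete by the entity's UUID (placeholder value shown here)
    api_instance.delete_catalog_entity(entity_id="4b163705-23c0-4573-b2fb-f6cea2163fcb")
```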
### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| entity_id [_required_] | string | UUID or Entity Ref. |

### Response

* [204](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogEntity-204-v2): OK
* [400](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogEntity-400-v2): Bad Request
* [403](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogEntity-403-v2): Forbidden
* [404](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogEntity-404-v2): Not Found
* [429](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogEntity-429-v2): Too many requests

The 400, 403, 404, and 429 responses all use the API error response schema:

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript)

##### Delete a single entity

```
# Path parameters
export entity_id="service:myservice"

# Curl command (use the API host for your Datadog site, for example api.datadoghq.eu or api.us3.datadoghq.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/catalog/entity/${entity_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Delete a single entity

```
"""
Delete a single entity returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SoftwareCatalogApi(api_client)
    api_instance.delete_catalog_entity(
        entity_id="service:myservice",
    )
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site (for example `datadoghq.com`, `datadoghq.eu`, `us3.datadoghq.com`, `us5.datadoghq.com`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" \
python3 "example.py" ``` ##### Delete a single entity ``` # Delete a single entity returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new api_instance.delete_catalog_entity("service:myservice") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a single entity ``` // Delete a single entity returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) r, err := api.DeleteCatalogEntity(ctx, "service:myservice") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.DeleteCatalogEntity`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a single entity ``` // Delete a single entity returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { apiInstance.deleteCatalogEntity("service:myservice"); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#deleteCatalogEntity"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a single entity ``` // Delete a single entity returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api .delete_catalog_entity("service:myservice".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a single entity ``` /** * Delete a single entity returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); const params: v2.SoftwareCatalogApiDeleteCatalogEntityRequest = { entityId: "service:myservice", }; apiInstance .deleteCatalogEntity(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of entity relations](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-relations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-relations-v2) GET https://api.ap1.datadoghq.com/api/v2/catalog/relationhttps://api.ap2.datadoghq.com/api/v2/catalog/relationhttps://api.datadoghq.eu/api/v2/catalog/relationhttps://api.ddog-gov.com/api/v2/catalog/relationhttps://api.datadoghq.com/api/v2/catalog/relationhttps://api.us3.datadoghq.com/api/v2/catalog/relationhttps://api.us5.datadoghq.com/api/v2/catalog/relation ### Overview Get a list of entity relations from Software Catalog. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[offset] integer Specific offset to use as the beginning of the returned page. page[limit] integer Maximum number of relations in the response. filter[type] enum Filter relations by type. Allowed enum values: `RelationTypeOwns, RelationTypeOwnedBy, RelationTypeDependsOn, RelationTypeDependencyOf, RelationTypePartsOf, RelationTypeHasPart, RelationTypeOtherOwns, RelationTypeOtherOwnedBy, RelationTypeImplementedBy, RelationTypeImplements` filter[from_ref] string Filter relations by the reference of the first entity in the relation. filter[to_ref] string Filter relations by the reference of the second entity in the relation. include enum Include relationship data. Allowed enum values: `entity, schema` includeDiscovered boolean If true, includes relationships discovered by APM and USM. ### Response * [200](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogRelation-200-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogRelation-403-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogRelation-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) List entity relation response. Field Type Description data [object] Array of relation responses attributes object Relation attributes. from object Relation entity reference. kind string Entity kind. name string Entity name. namespace string Entity namespace. to object Relation entity reference. kind string Entity kind. name string Entity name. 
namespace string Entity namespace. type enum Supported relation types. Allowed enum values: `RelationTypeOwns,RelationTypeOwnedBy,RelationTypeDependsOn,RelationTypeDependencyOf,RelationTypePartsOf,RelationTypeHasPart,RelationTypeOtherOwns,RelationTypeOtherOwnedBy,RelationTypeImplementedBy,RelationTypeImplements` id string Relation ID. meta object Relation metadata. createdAt date-time Relation creation time. definedBy string Relation defined by. modifiedAt date-time Relation modification time. source string Relation source. relationships object Relation relationships. fromEntity object Relation to entity. data object Relationship entry. id string Associated data ID. type string Relationship type. meta object Entity metadata. createdAt string The creation time. ingestionSource string The ingestion source. modifiedAt string The modification time. origin string The origin. toEntity object Relation to entity. data object Relationship entry. id string Associated data ID. type string Relationship type. meta object Entity metadata. createdAt string The creation time. ingestionSource string The ingestion source. modifiedAt string The modification time. origin string The origin. subtype string Relation subtype. type enum Relation type. Allowed enum values: `relation` included [object] List relation response included entities. attributes object Entity attributes. apiVersion string The API version. description string The description. displayName string The display name. kind string The kind. name string The name. namespace string The namespace. owner string The owner. tags [string] The tags. id string Entity ID. meta object Entity metadata. createdAt string The creation time. ingestionSource string The ingestion source. modifiedAt string The modification time. origin string The origin. relationships object Entity relationships. incidents object Entity to incidents relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. oncall object Entity to oncalls relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. rawSchema object Entity to raw schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. relatedEntities object Entity to related entities relationship. data [object] Relationships. id string Associated data ID. type string Relationship type. schema object Entity to detail schema relationship. data object Relationship entry. id string Associated data ID. type string Relationship type. type string Entity. links object List relation response links. next string Next link. previous string Previous link. self string Current link. meta object Relation response metadata. count int64 Total relations count. includeCount int64 Total included data count. 
``` { "data": [ { "attributes": { "from": { "kind": "string", "name": "string", "namespace": "string" }, "to": { "kind": "string", "name": "string", "namespace": "string" }, "type": "string" }, "id": "string", "meta": { "createdAt": "2019-09-19T10:00:00.000Z", "definedBy": "string", "modifiedAt": "2019-09-19T10:00:00.000Z", "source": "string" }, "relationships": { "fromEntity": { "data": { "id": "string", "type": "string" }, "meta": { "createdAt": "string", "ingestionSource": "string", "modifiedAt": "string", "origin": "string" } }, "toEntity": { "data": { "id": "string", "type": "string" }, "meta": { "createdAt": "string", "ingestionSource": "string", "modifiedAt": "string", "origin": "string" } } }, "subtype": "string", "type": "string" } ], "included": [ { "attributes": { "apiVersion": "string", "description": "string", "displayName": "string", "kind": "string", "name": "string", "namespace": "string", "owner": "string", "tags": [] }, "id": "string", "meta": { "createdAt": "string", "ingestionSource": "string", "modifiedAt": "string", "origin": "string" }, "relationships": { "incidents": { "data": [ { "id": "string", "type": "string" } ] }, "oncall": { "data": [ { "id": "string", "type": "string" } ] }, "rawSchema": { "data": { "id": "string", "type": "string" } }, "relatedEntities": { "data": [ { "id": "string", "type": "string" } ] }, "schema": { "data": { "id": "string", "type": "string" } } }, "type": "string" } ], "links": { "next": "/api/v2/catalog/relation?filter[from_ref]=service:service-catalog\u0026include=entity\u0026page[limit]=2\u0026page[offset]=2", "previous": "string", "self": "/api/v2/catalog/relation?filter[from_ref]=service:service-catalog\u0026include=entity\u0026page[limit]=2\u0026page[offset]=0" }, "meta": { "count": "integer", "includeCount": "integer" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Get a list of entity relations Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/relation" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of entity relations ``` """ Get a list of entity relations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.list_catalog_relation() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of entity relations ``` # Get a list of entity relations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new p api_instance.list_catalog_relation() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of entity relations ``` // Get a list of entity relations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.ListCatalogRelation(ctx, *datadogV2.NewListCatalogRelationOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.ListCatalogRelation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SoftwareCatalogApi.ListCatalogRelation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of entity relations ``` // Get a list of entity relations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.ListRelationCatalogResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { ListRelationCatalogResponse result = apiInstance.listCatalogRelation(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#listCatalogRelation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of entity relations ``` // Get a list of entity relations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::ListCatalogRelationOptionalParams; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api .list_catalog_relation(ListCatalogRelationOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of entity relations ``` /** * Get a list of entity relations returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); apiInstance .listCatalogRelation() .then((data: v2.ListRelationCatalogResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Preview catalog entities](https://docs.datadoghq.com/api/latest/software-catalog/#preview-catalog-entities) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#preview-catalog-entities-v2) POST https://api.ap1.datadoghq.com/api/v2/catalog/entity/previewhttps://api.ap2.datadoghq.com/api/v2/catalog/entity/previewhttps://api.datadoghq.eu/api/v2/catalog/entity/previewhttps://api.ddog-gov.com/api/v2/catalog/entity/previewhttps://api.datadoghq.com/api/v2/catalog/entity/previewhttps://api.us3.datadoghq.com/api/v2/catalog/entity/previewhttps://api.us5.datadoghq.com/api/v2/catalog/entity/preview ### Overview OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Response * [202](https://docs.datadoghq.com/api/latest/software-catalog/#PreviewCatalogEntities-202-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#PreviewCatalogEntities-429-v2) Accepted * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) Field Type Description data [_required_] [object] attributes object apiVersion string description string displayName string kind string name string namespace string owner string properties object tags [string] id string relationships object incidents object data [object] id [_required_] string type [_required_] enum Incident resource type. Allowed enum values: `incident` default: `incident` oncalls object data [object] id [_required_] string type [_required_] enum Oncall resource type. Allowed enum values: `oncall` default: `oncall` rawSchema object data [_required_] object id [_required_] string type [_required_] enum Raw schema resource type. Allowed enum values: `rawSchema` default: `rawSchema` relatedEntities object data [object] id [_required_] string type [_required_] enum Related entity resource type. Allowed enum values: `relatedEntity` default: `relatedEntity` schema object data [_required_] object id [_required_] string type [_required_] enum Schema resource type. Allowed enum values: `schema` default: `schema` type [_required_] enum Entity resource type. Allowed enum values: `entity` default: `entity` ``` { "data": [ { "attributes": { "apiVersion": "string", "description": "string", "displayName": "string", "kind": "string", "name": "string", "namespace": "string", "owner": "string", "properties": {}, "tags": [] }, "id": "string", "relationships": { "incidents": { "data": [ { "id": "", "type": "incident" } ] }, "oncalls": { "data": [ { "id": "", "type": "oncall" } ] }, "rawSchema": { "data": { "id": "", "type": "rawSchema" } }, "relatedEntities": { "data": [ { "id": "", "type": "relatedEntity" } ] }, "schema": { "data": { "id": "", "type": "schema" } } }, "type": "entity" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Preview catalog entities Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/entity/preview" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Preview catalog entities ``` """ Preview catalog entities returns "Accepted" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.preview_catalog_entities() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Preview catalog entities ``` # Preview catalog entities returns "Accepted" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new p api_instance.preview_catalog_entities() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Preview catalog entities ``` // Preview catalog entities returns "Accepted" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.PreviewCatalogEntities(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.PreviewCatalogEntities`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SoftwareCatalogApi.PreviewCatalogEntities`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Preview catalog entities ``` // Preview catalog entities returns "Accepted" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.EntityResponseArray; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { EntityResponseArray result = apiInstance.previewCatalogEntities(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#previewCatalogEntities"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Preview catalog entities ``` // Preview catalog entities returns "Accepted" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api.preview_catalog_entities().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Preview catalog entities ``` /** * Preview catalog entities returns "Accepted" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); apiInstance .previewCatalogEntities() .then((data: v2.EntityResponseArray) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a list of entity kinds](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-kinds) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#get-a-list-of-entity-kinds-v2) GET https://api.ap1.datadoghq.com/api/v2/catalog/kindhttps://api.ap2.datadoghq.com/api/v2/catalog/kindhttps://api.datadoghq.eu/api/v2/catalog/kindhttps://api.ddog-gov.com/api/v2/catalog/kindhttps://api.datadoghq.com/api/v2/catalog/kindhttps://api.us3.datadoghq.com/api/v2/catalog/kindhttps://api.us5.datadoghq.com/api/v2/catalog/kind ### Overview Get a list of entity kinds from Software Catalog. OAuth apps require the `apm_service_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[offset] integer Specific offset to use as the beginning of the returned page. page[limit] integer Maximum number of kinds in the response. filter[id] string Filter entities by UUID. filter[name] string Filter entities by name. ### Response * [200](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogKind-200-v2) * [400](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogKind-400-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogKind-403-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#ListCatalogKind-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) List kind response. Field Type Description data [object] List of kind responses. attributes object Kind attributes. description string Short description of the kind. displayName string User friendly name of the kind. name string The kind name. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. meta object Kind metadata. createdAt string The creation time. modifiedAt string The modification time. type string Kind. meta object Kind response metadata. count int64 Total kinds count. ``` { "data": [ { "attributes": { "description": "string", "displayName": "string", "name": "my-job" }, "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "meta": { "createdAt": "string", "modifiedAt": "string" }, "type": "string" } ], "meta": { "count": "integer" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Get a list of entity kinds Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/kind" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of entity kinds ``` """ Get a list of entity kinds returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.list_catalog_kind() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of entity kinds ``` # Get a list of entity kinds returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new p api_instance.list_catalog_kind() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of entity kinds ``` // Get a list of entity kinds returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.ListCatalogKind(ctx, *datadogV2.NewListCatalogKindOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.ListCatalogKind`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`SoftwareCatalogApi.ListCatalogKind`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a list of entity kinds ``` // Get a list of entity kinds returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.ListKindCatalogResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { ListKindCatalogResponse result = apiInstance.listCatalogKind(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#listCatalogKind"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of entity kinds ``` // Get a list of entity kinds returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::ListCatalogKindOptionalParams; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api .list_catalog_kind(ListCatalogKindOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of entity kinds ``` /** * Get a list of entity kinds returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); apiInstance .listCatalogKind() .then((data: v2.ListKindCatalogResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create or update kinds](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-kinds) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#create-or-update-kinds-v2) POST https://api.ap1.datadoghq.com/api/v2/catalog/kindhttps://api.ap2.datadoghq.com/api/v2/catalog/kindhttps://api.datadoghq.eu/api/v2/catalog/kindhttps://api.ddog-gov.com/api/v2/catalog/kindhttps://api.datadoghq.com/api/v2/catalog/kindhttps://api.us3.datadoghq.com/api/v2/catalog/kindhttps://api.us5.datadoghq.com/api/v2/catalog/kind ### Overview Create or update kinds in Software Catalog. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Request #### Body Data (required) Kind YAML or JSON. * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) Field Type Description Option 1 object Schema for kind. description string Short description of the kind. displayName string The display name of the kind. Automatically generated if not provided. kind [_required_] string The name of the kind to create or update. This must be in kebab-case format. Option 2 string Kind definition in raw JSON or YAML representation. ``` { "description": "string", "displayName": "string", "kind": "my-job" } ``` Copy ### Response * [202](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogKind-202-v2) * [400](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogKind-400-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogKind-403-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#UpsertCatalogKind-429-v2) ACCEPTED * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) Upsert kind response. Field Type Description data [object] List of kind responses. attributes object Kind attributes. description string Short description of the kind. displayName string User friendly name of the kind. name string The kind name. id string A read-only globally unique identifier for the entity generated by Datadog. User supplied values are ignored. meta object Kind metadata. createdAt string The creation time. modifiedAt string The modification time. type string Kind. meta object Kind response metadata. count int64 Total kinds count. ``` { "data": [ { "attributes": { "description": "string", "displayName": "string", "name": "my-job" }, "id": "4b163705-23c0-4573-b2fb-f6cea2163fcb", "meta": { "createdAt": "string", "modifiedAt": "string" }, "type": "string" } ], "meta": { "count": "integer" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Create or update kinds Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/kind" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Create or update kinds ``` """ Create or update kinds returns "ACCEPTED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi from datadog_api_client.v2.model.kind_obj import KindObj body = KindObj( kind="my-job", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) response = api_instance.upsert_catalog_kind(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create or update kinds ``` # Create or update kinds returns "ACCEPTED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new body = DatadogAPIClient::V2::KindObj.new({ kind: "my-job", }) p api_instance.upsert_catalog_kind(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create or update kinds ``` // Create or update kinds returns "ACCEPTED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UpsertCatalogKindRequest{ KindObj: 
&datadogV2.KindObj{ Kind: "my-job", }} ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) resp, r, err := api.UpsertCatalogKind(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.UpsertCatalogKind`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SoftwareCatalogApi.UpsertCatalogKind`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create or update kinds ``` // Create or update kinds returns "ACCEPTED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; import com.datadog.api.client.v2.model.KindObj; import com.datadog.api.client.v2.model.UpsertCatalogKindRequest; import com.datadog.api.client.v2.model.UpsertCatalogKindResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); UpsertCatalogKindRequest body = new UpsertCatalogKindRequest(new KindObj().kind("my-job")); try { UpsertCatalogKindResponse result = apiInstance.upsertCatalogKind(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#upsertCatalogKind"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create or update kinds ``` // Create or update kinds returns "ACCEPTED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; use datadog_api_client::datadogV2::model::KindObj; use datadog_api_client::datadogV2::model::UpsertCatalogKindRequest; #[tokio::main] async fn main() { let body = UpsertCatalogKindRequest::KindObj(Box::new(KindObj::new("my-job".to_string()))); let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api.upsert_catalog_kind(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create or update 
kinds ``` /** * Create or update kinds returns "ACCEPTED" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); const params: v2.SoftwareCatalogApiUpsertCatalogKindRequest = { body: { kind: "my-job", }, }; apiInstance .upsertCatalogKind(params) .then((data: v2.UpsertCatalogKindResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a single kind](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-kind) * [v2 (latest)](https://docs.datadoghq.com/api/latest/software-catalog/#delete-a-single-kind-v2) DELETE https://api.ap1.datadoghq.com/api/v2/catalog/kind/{kind_id}https://api.ap2.datadoghq.com/api/v2/catalog/kind/{kind_id}https://api.datadoghq.eu/api/v2/catalog/kind/{kind_id}https://api.ddog-gov.com/api/v2/catalog/kind/{kind_id}https://api.datadoghq.com/api/v2/catalog/kind/{kind_id}https://api.us3.datadoghq.com/api/v2/catalog/kind/{kind_id}https://api.us5.datadoghq.com/api/v2/catalog/kind/{kind_id} ### Overview Delete a single kind in Software Catalog. OAuth apps require the `apm_service_catalog_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#software-catalog) to access this endpoint. ### Arguments #### Path Parameters Name Type Description kind_id [_required_] string Entity kind. ### Response * [204](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogKind-204-v2) * [400](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogKind-400-v2) * [403](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogKind-403-v2) * [404](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogKind-404-v2) * [429](https://docs.datadoghq.com/api/latest/software-catalog/#DeleteCatalogKind-429-v2) OK Bad Request * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/software-catalog/) * [Example](https://docs.datadoghq.com/api/latest/software-catalog/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/software-catalog/?code-lang=typescript) ##### Delete a single kind Copy ``` # Path parameters export kind_id="my-job" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/catalog/kind/${kind_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a single kind ``` """ Delete a single kind returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.software_catalog_api import SoftwareCatalogApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SoftwareCatalogApi(api_client) api_instance.delete_catalog_kind( kind_id="my-job", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a single kind ``` # Delete a single kind returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SoftwareCatalogAPI.new api_instance.delete_catalog_kind("my-job") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a single kind ``` // Delete a single kind returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSoftwareCatalogApi(apiClient) r, err := api.DeleteCatalogKind(ctx, "my-job") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SoftwareCatalogApi.DeleteCatalogKind`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a single kind ``` // Delete a single kind returns "OK" response import com.datadog.api.client.ApiClient; 
import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SoftwareCatalogApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SoftwareCatalogApi apiInstance = new SoftwareCatalogApi(defaultClient); try { apiInstance.deleteCatalogKind("my-job"); } catch (ApiException e) { System.err.println("Exception when calling SoftwareCatalogApi#deleteCatalogKind"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a single kind ``` // Delete a single kind returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_software_catalog::SoftwareCatalogAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SoftwareCatalogAPI::with_config(configuration); let resp = api.delete_catalog_kind("my-job".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a single kind ``` /** * Delete a single kind returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SoftwareCatalogApi(configuration); const params: v2.SoftwareCatalogApiDeleteCatalogKindRequest = { kindId: "my-job", }; apiInstance .deleteCatalogKind(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6ad15fea-7ced-4fb3-a43c-51efabf5cb5a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=03d0c8e6-6442-4f0a-a4a5-0bb3027479df&pt=Software%20Catalog&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsoftware-catalog%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=6ad15fea-7ced-4fb3-a43c-51efabf5cb5a&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=03d0c8e6-6442-4f0a-a4a5-0bb3027479df&pt=Software%20Catalog&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsoftware-catalog%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=4af5b44a-7737-4dee-9476-a81e3a252a57&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Software%20Catalog&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fsoftware-catalog%2F&r=<=13096&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=835151) --- # Source: https://docs.datadoghq.com/api/latest/spa/ # Spa SPA (Spark Pod Autosizing) API. Provides resource recommendations and cost insights to help optimize Spark job configurations. ## [Get SPA Recommendations with a shard parameter](https://docs.datadoghq.com/api/latest/spa/#get-spa-recommendations-with-a-shard-parameter) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spa/#get-spa-recommendations-with-a-shard-parameter-v2) **Note** : This endpoint is in preview and may change in the future. It is not yet recommended for production use. GET https://api.ap1.datadoghq.com/api/v2/spa/recommendations/{service}/{shard}https://api.ap2.datadoghq.com/api/v2/spa/recommendations/{service}/{shard}https://api.datadoghq.eu/api/v2/spa/recommendations/{service}/{shard}https://api.ddog-gov.com/api/v2/spa/recommendations/{service}/{shard}https://api.datadoghq.com/api/v2/spa/recommendations/{service}/{shard}https://api.us3.datadoghq.com/api/v2/spa/recommendations/{service}/{shard}https://api.us5.datadoghq.com/api/v2/spa/recommendations/{service}/{shard} ### Overview This endpoint is currently experimental and restricted to Datadog internal use only. Retrieve resource recommendations for a Spark job. The caller (Spark Gateway or DJM UI) provides a service name and shard identifier, and SPA returns structured recommendations for driver and executor resources. 
### Arguments #### Path Parameters Name Type Description shard [_required_] string The shard tag for a spark job, which differentiates jobs within the same service that have different resource needs service [_required_] string The service name for a spark job #### Query Strings Name Type Description bypass_cache string The recommendation service should not use its metrics cache. ### Response * [200](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendationsWithShard-200-v2) * [400](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendationsWithShard-400-v2) * [403](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendationsWithShard-403-v2) * [429](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendationsWithShard-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) Field Type Description data [_required_] object JSON:API resource object for SPA Recommendation. Includes type, optional ID, and resource attributes with structured recommendations. attributes [_required_] object Attributes of the SPA Recommendation resource. Contains recommendations for both driver and executor components. confidence_level double driver [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. executor [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). 
Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. id string Resource identifier for the recommendation. Optional in responses. type [_required_] enum JSON:API resource type for Spark Pod Autosizing recommendations. Identifies the Recommendation resource returned by SPA. Allowed enum values: `recommendation` default: `recommendation` ``` { "data": { "attributes": { "confidence_level": "number", "driver": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } }, "executor": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } } }, "id": "string", "type": "recommendation" } } ``` Copy * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) JSON:API document containing a single Recommendation resource. Returned by SPA when the Spark Gateway requests recommendations. Field Type Description data [_required_] object JSON:API resource object for SPA Recommendation. Includes type, optional ID, and resource attributes with structured recommendations. attributes [_required_] object Attributes of the SPA Recommendation resource. Contains recommendations for both driver and executor components. confidence_level double driver [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. executor [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). 
Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. id string Resource identifier for the recommendation. Optional in responses. type [_required_] enum JSON:API resource type for Spark Pod Autosizing recommendations. Identifies the Recommendation resource returned by SPA. Allowed enum values: `recommendation` default: `recommendation` ``` { "data": { "attributes": { "confidence_level": "number", "driver": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } }, "executor": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } } }, "id": "string", "type": "recommendation" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spa/?code-lang=curl) ##### Get SPA Recommendations with a shard parameter Copy ``` # Path parameters export shard="CHANGE_ME" export service="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spa/recommendations/${service}/${shard}" \ -H "Accept: application/json" ``` * * * ## [Get SPA Recommendations](https://docs.datadoghq.com/api/latest/spa/#get-spa-recommendations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spa/#get-spa-recommendations-v2) **Note** : This endpoint is in preview and may change in the future. It is not yet recommended for production use. GET https://api.ap1.datadoghq.com/api/v2/spa/recommendations/{service}https://api.ap2.datadoghq.com/api/v2/spa/recommendations/{service}https://api.datadoghq.eu/api/v2/spa/recommendations/{service}https://api.ddog-gov.com/api/v2/spa/recommendations/{service}https://api.datadoghq.com/api/v2/spa/recommendations/{service}https://api.us3.datadoghq.com/api/v2/spa/recommendations/{service}https://api.us5.datadoghq.com/api/v2/spa/recommendations/{service} ### Overview This endpoint is currently experimental and restricted to Datadog internal use only. Retrieve resource recommendations for a Spark job. 
The caller (Spark Gateway or DJM UI) provides a service name and SPA returns structured recommendations for driver and executor resources. The version with a shard should be preferred, where possible, as it gives more accurate results. ### Arguments #### Path Parameters Name Type Description service [_required_] string The service name for a spark job. #### Query Strings Name Type Description bypass_cache string The recommendation service should not use its metrics cache. ### Response * [200](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendations-200-v2) * [400](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendations-400-v2) * [403](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendations-403-v2) * [429](https://docs.datadoghq.com/api/latest/spa/#GetSPARecommendations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) Field Type Description data [_required_] object JSON:API resource object for SPA Recommendation. Includes type, optional ID, and resource attributes with structured recommendations. attributes [_required_] object Attributes of the SPA Recommendation resource. Contains recommendations for both driver and executor components. confidence_level double driver [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. executor [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). 
memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. id string Resource identifier for the recommendation. Optional in responses. type [_required_] enum JSON:API resource type for Spark Pod Autosizing recommendations. Identifies the Recommendation resource returned by SPA. Allowed enum values: `recommendation` default: `recommendation` ``` { "data": { "attributes": { "confidence_level": "number", "driver": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } }, "executor": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } } }, "id": "string", "type": "recommendation" } } ``` Copy * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) JSON:API document containing a single Recommendation resource. Returned by SPA when the Spark Gateway requests recommendations. Field Type Description data [_required_] object JSON:API resource object for SPA Recommendation. Includes type, optional ID, and resource attributes with structured recommendations. attributes [_required_] object Attributes of the SPA Recommendation resource. Contains recommendations for both driver and executor components. confidence_level double driver [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. executor [_required_] object Resource recommendation for a single Spark component (driver or executor). Contains estimation data used to patch Spark job specs. estimation [_required_] object Recommended resource values for a Spark driver or executor, derived from recent real usage metrics. Used by SPA to propose more efficient pod sizing. cpu object CPU usage statistics derived from historical Spark job metrics. Provides multiple estimates so users can choose between conservative and cost-saving risk profiles. max int64 Maximum CPU usage observed for the job, expressed in millicores. This represents the upper bound of usage. p75 int64 75th percentile of CPU usage (millicores). Represents a cost-saving configuration while covering most workloads. 
p95 int64 95th percentile of CPU usage (millicores). Balances performance and cost, providing a safer margin than p75. ephemeral_storage int64 Recommended ephemeral storage allocation (in MiB). Derived from job temporary storage patterns. heap int64 Recommended JVM heap size (in MiB). memory int64 Recommended total memory allocation (in MiB). Includes both heap and overhead. overhead int64 Recommended JVM overhead (in MiB). Computed as total memory - heap. id string Resource identifier for the recommendation. Optional in responses. type [_required_] enum JSON:API resource type for Spark Pod Autosizing recommendations. Identifies the Recommendation resource returned by SPA. Allowed enum values: `recommendation` default: `recommendation` ``` { "data": { "attributes": { "confidence_level": "number", "driver": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } }, "executor": { "estimation": { "cpu": { "max": "integer", "p75": "integer", "p95": "integer" }, "ephemeral_storage": "integer", "heap": "integer", "memory": "integer", "overhead": "integer" } } }, "id": "string", "type": "recommendation" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spa/) * [Example](https://docs.datadoghq.com/api/latest/spa/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spa/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/spa/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spa/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/spa/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spa/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/spa/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spa/?code-lang=typescript) ##### Get SPA Recommendations Copy ``` # Path parameters export service="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spa/recommendations/${service}" \ -H "Accept: application/json" ``` ##### Get SPA Recommendations ``` """ Get SPA Recommendations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spa_api import SpaApi configuration = Configuration() configuration.unstable_operations["get_spa_recommendations"] = True with ApiClient(configuration) as api_client: api_instance = SpaApi(api_client) response = api_instance.get_spa_recommendations( shard="shard", service="service", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" python3 "example.py" ``` ##### Get SPA Recommendations ``` # Get SPA Recommendations returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_spa_recommendations".to_sym] = true end api_instance = DatadogAPIClient::V2::SpaAPI.new p api_instance.get_spa_recommendations("shard", "service") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" rb "example.rb" ``` ##### Get SPA Recommendations ``` // Get SPA Recommendations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetSPARecommendations", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpaApi(apiClient) resp, r, err := api.GetSPARecommendations(ctx, "shard", "service") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpaApi.GetSPARecommendations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpaApi.GetSPARecommendations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" go run "main.go" ``` ##### Get SPA Recommendations ``` // Get SPA Recommendations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpaApi; import com.datadog.api.client.v2.model.RecommendationDocument; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getSPARecommendations", true); SpaApi apiInstance = new SpaApi(defaultClient); try { RecommendationDocument result = apiInstance.getSPARecommendations("shard", "service"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpaApi#getSPARecommendations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" java "Example.java" ``` ##### Get SPA Recommendations ``` // Get SPA Recommendations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spa::SpaAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetSPARecommendations", true); let api = SpaAPI::with_config(configuration); let resp = api .get_spa_recommendations("shard".to_string(), "service".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" cargo run ``` ##### Get SPA Recommendations ``` /** * Get SPA Recommendations returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getSPARecommendations"] = true; const apiInstance = new v2.SpaApi(configuration); const params: v2.SpaApiGetSPARecommendationsRequest = { shard: "shard", service: "service", }; apiInstance .getSPARecommendations(params) .then((data: v2.RecommendationDocument) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=f720dfad-c11f-418f-98d4-2c0a0edfee52&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8118000d-b769-4213-8994-2534835ccffb&pt=Spa&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspa%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=f720dfad-c11f-418f-98d4-2c0a0edfee52&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=8118000d-b769-4213-8994-2534835ccffb&pt=Spa&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspa%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=0c8ada2f-83c9-4be0-88e1-fc57b3f97189&bo=2&sid=ca97aff0f0bf11f0b225ebe7018368cd&vid=ca980060f0bf11f09385b797b877861f&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Spa&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspa%2F&r=<=1047&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=959197) --- # Source: https://docs.datadoghq.com/api/latest/spans-metrics/ # Spans Metrics Manage configuration of [span-based metrics](https://app.datadoghq.com/apm/traces/generate-metrics) for your organization. See [Generate Metrics from Spans](https://docs.datadoghq.com/tracing/trace_pipeline/generate_metrics/) for more information. ## [Get all span-based metrics](https://docs.datadoghq.com/api/latest/spans-metrics/#get-all-span-based-metrics) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans-metrics/#get-all-span-based-metrics-v2) GET https://api.ap1.datadoghq.com/api/v2/apm/config/metricshttps://api.ap2.datadoghq.com/api/v2/apm/config/metricshttps://api.datadoghq.eu/api/v2/apm/config/metricshttps://api.ddog-gov.com/api/v2/apm/config/metricshttps://api.datadoghq.com/api/v2/apm/config/metricshttps://api.us3.datadoghq.com/api/v2/apm/config/metricshttps://api.us5.datadoghq.com/api/v2/apm/config/metrics ### Overview Get the list of configured span-based metrics with their definitions. This endpoint requires the `apm_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/spans-metrics/#ListSpansMetrics-200-v2) * [403](https://docs.datadoghq.com/api/latest/spans-metrics/#ListSpansMetrics-403-v2) * [429](https://docs.datadoghq.com/api/latest/spans-metrics/#ListSpansMetrics-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) All the available span-based metric objects. Field Type Description data [object] A list of span-based metric objects. attributes object The object describing a Datadog span-based metric. compute object The compute rule to compute the span-based metric. aggregation_type enum The type of aggregation to use. 
Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the span-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. group_by [object] The rules for the group by. path string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the span-based metric. type enum The type of resource. The value should always be spans_metrics. Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": [ { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "my.metric", "type": "spans_metrics" } ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=typescript) ##### Get all span-based metrics Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/metrics" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all span-based metrics ``` """ Get all span-based metrics returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_metrics_api import SpansMetricsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansMetricsApi(api_client) response = api_instance.list_spans_metrics() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all span-based metrics ``` # Get all span-based metrics returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansMetricsAPI.new p api_instance.list_spans_metrics() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all span-based metrics ``` // Get all span-based metrics returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansMetricsApi(apiClient) resp, r, err := api.ListSpansMetrics(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansMetricsApi.ListSpansMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansMetricsApi.ListSpansMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" go run "main.go" ``` ##### Get all span-based metrics ``` // Get all span-based metrics returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansMetricsApi; import com.datadog.api.client.v2.model.SpansMetricsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansMetricsApi apiInstance = new SpansMetricsApi(defaultClient); try { SpansMetricsResponse result = apiInstance.listSpansMetrics(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansMetricsApi#listSpansMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all span-based metrics ``` // Get all span-based metrics returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans_metrics::SpansMetricsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SpansMetricsAPI::with_config(configuration); let resp = api.list_spans_metrics().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all span-based metrics ``` /** * Get all span-based metrics returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansMetricsApi(configuration); apiInstance .listSpansMetrics() .then((data: v2.SpansMetricsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a span-based metric](https://docs.datadoghq.com/api/latest/spans-metrics/#create-a-span-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans-metrics/#create-a-span-based-metric-v2) POST https://api.ap1.datadoghq.com/api/v2/apm/config/metricshttps://api.ap2.datadoghq.com/api/v2/apm/config/metricshttps://api.datadoghq.eu/api/v2/apm/config/metricshttps://api.ddog-gov.com/api/v2/apm/config/metricshttps://api.datadoghq.com/api/v2/apm/config/metricshttps://api.us3.datadoghq.com/api/v2/apm/config/metricshttps://api.us5.datadoghq.com/api/v2/apm/config/metrics ### Overview Create a metric based on your ingested spans in your organization. Returns the span-based metric object from the request body when the request is successful. This endpoint requires the `apm_generate_metrics` permission. ### Request #### Body Data (required) The definition of the new span-based metric. * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) Field Type Description data [_required_] object The new span-based metric properties. attributes [_required_] object The object describing the Datadog span-based metric to create. compute [_required_] object The compute rule to compute the span-based metric. aggregation_type [_required_] enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the span-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id [_required_] string The name of the span-based metric. type [_required_] enum The type of resource. The value should always be spans_metrics. 
Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "ExampleSpansMetric", "type": "spans_metrics" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/spans-metrics/#CreateSpansMetric-200-v2) * [400](https://docs.datadoghq.com/api/latest/spans-metrics/#CreateSpansMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/spans-metrics/#CreateSpansMetric-403-v2) * [409](https://docs.datadoghq.com/api/latest/spans-metrics/#CreateSpansMetric-409-v2) * [429](https://docs.datadoghq.com/api/latest/spans-metrics/#CreateSpansMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) The span-based metric object. Field Type Description data object The span-based metric properties. attributes object The object describing a Datadog span-based metric. compute object The compute rule to compute the span-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the span-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. group_by [object] The rules for the group by. path string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the span-based metric. type enum The type of resource. The value should always be spans_metrics. Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "my.metric", "type": "spans_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=typescript) ##### Create a span-based metric returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/metrics" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "ExampleSpansMetric", "type": "spans_metrics" } } EOF ``` ##### Create a span-based metric returns "OK" response ``` // Create a span-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SpansMetricCreateRequest{ Data: datadogV2.SpansMetricCreateData{ Attributes: datadogV2.SpansMetricCreateAttributes{ Compute: datadogV2.SpansMetricCompute{ AggregationType: datadogV2.SPANSMETRICCOMPUTEAGGREGATIONTYPE_DISTRIBUTION, IncludePercentiles: datadog.PtrBool(false), Path: datadog.PtrString("@duration"), }, Filter: &datadogV2.SpansMetricFilter{ Query: datadog.PtrString("@http.status_code:200 service:my-service"), }, GroupBy: []datadogV2.SpansMetricGroupBy{ { Path: "resource_name", TagName: datadog.PtrString("resource_name"), }, }, }, Id: "ExampleSpansMetric", Type: datadogV2.SPANSMETRICTYPE_SPANS_METRICS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansMetricsApi(apiClient) resp, r, err := api.CreateSpansMetric(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansMetricsApi.CreateSpansMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansMetricsApi.CreateSpansMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a span-based metric returns "OK" response ``` // Create a span-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansMetricsApi; import com.datadog.api.client.v2.model.SpansMetricCompute; import com.datadog.api.client.v2.model.SpansMetricComputeAggregationType; import com.datadog.api.client.v2.model.SpansMetricCreateAttributes; import com.datadog.api.client.v2.model.SpansMetricCreateData; import com.datadog.api.client.v2.model.SpansMetricCreateRequest; import com.datadog.api.client.v2.model.SpansMetricFilter; import com.datadog.api.client.v2.model.SpansMetricGroupBy; import com.datadog.api.client.v2.model.SpansMetricResponse; import com.datadog.api.client.v2.model.SpansMetricType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansMetricsApi apiInstance = new SpansMetricsApi(defaultClient); SpansMetricCreateRequest body = new SpansMetricCreateRequest() .data( new SpansMetricCreateData() .attributes( new SpansMetricCreateAttributes() .compute( new SpansMetricCompute() .aggregationType(SpansMetricComputeAggregationType.DISTRIBUTION) .includePercentiles(false) .path("@duration")) .filter( new SpansMetricFilter() .query("@http.status_code:200 service:my-service")) .groupBy( Collections.singletonList( new SpansMetricGroupBy() .path("resource_name") .tagName("resource_name")))) .id("ExampleSpansMetric") .type(SpansMetricType.SPANS_METRICS)); try { SpansMetricResponse result = apiInstance.createSpansMetric(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansMetricsApi#createSpansMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a span-based metric returns "OK" response ``` """ Create a span-based metric returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_metrics_api import SpansMetricsApi from datadog_api_client.v2.model.spans_metric_compute import SpansMetricCompute from datadog_api_client.v2.model.spans_metric_compute_aggregation_type import SpansMetricComputeAggregationType from datadog_api_client.v2.model.spans_metric_create_attributes import SpansMetricCreateAttributes from datadog_api_client.v2.model.spans_metric_create_data import SpansMetricCreateData from datadog_api_client.v2.model.spans_metric_create_request import SpansMetricCreateRequest from datadog_api_client.v2.model.spans_metric_filter import SpansMetricFilter from datadog_api_client.v2.model.spans_metric_group_by import SpansMetricGroupBy from datadog_api_client.v2.model.spans_metric_type import SpansMetricType body = SpansMetricCreateRequest( data=SpansMetricCreateData( attributes=SpansMetricCreateAttributes( compute=SpansMetricCompute( 
aggregation_type=SpansMetricComputeAggregationType.DISTRIBUTION, include_percentiles=False, path="@duration", ), filter=SpansMetricFilter( query="@http.status_code:200 service:my-service", ), group_by=[ SpansMetricGroupBy( path="resource_name", tag_name="resource_name", ), ], ), id="ExampleSpansMetric", type=SpansMetricType.SPANS_METRICS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansMetricsApi(api_client) response = api_instance.create_spans_metric(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a span-based metric returns "OK" response ``` # Create a span-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansMetricsAPI.new body = DatadogAPIClient::V2::SpansMetricCreateRequest.new({ data: DatadogAPIClient::V2::SpansMetricCreateData.new({ attributes: DatadogAPIClient::V2::SpansMetricCreateAttributes.new({ compute: DatadogAPIClient::V2::SpansMetricCompute.new({ aggregation_type: DatadogAPIClient::V2::SpansMetricComputeAggregationType::DISTRIBUTION, include_percentiles: false, path: "@duration", }), filter: DatadogAPIClient::V2::SpansMetricFilter.new({ query: "@http.status_code:200 service:my-service", }), group_by: [ DatadogAPIClient::V2::SpansMetricGroupBy.new({ path: "resource_name", tag_name: "resource_name", }), ], }), id: "ExampleSpansMetric", type: DatadogAPIClient::V2::SpansMetricType::SPANS_METRICS, }), }) p api_instance.create_spans_metric(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a span-based metric returns "OK" response ``` // Create a span-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans_metrics::SpansMetricsAPI; use datadog_api_client::datadogV2::model::SpansMetricCompute; use datadog_api_client::datadogV2::model::SpansMetricComputeAggregationType; use datadog_api_client::datadogV2::model::SpansMetricCreateAttributes; use datadog_api_client::datadogV2::model::SpansMetricCreateData; use datadog_api_client::datadogV2::model::SpansMetricCreateRequest; use datadog_api_client::datadogV2::model::SpansMetricFilter; use datadog_api_client::datadogV2::model::SpansMetricGroupBy; use datadog_api_client::datadogV2::model::SpansMetricType; #[tokio::main] async fn main() { let body = SpansMetricCreateRequest::new(SpansMetricCreateData::new( SpansMetricCreateAttributes::new( SpansMetricCompute::new(SpansMetricComputeAggregationType::DISTRIBUTION) .include_percentiles(false) .path("@duration".to_string()), ) .filter( SpansMetricFilter::new().query("@http.status_code:200 service:my-service".to_string()), ) .group_by(vec![SpansMetricGroupBy::new("resource_name".to_string()) .tag_name("resource_name".to_string())]), "ExampleSpansMetric".to_string(), SpansMetricType::SPANS_METRICS, )); let configuration = 
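// no credentials are set in code: the run instructions below export DD_API_KEY and DD_APP_KEY before `cargo run`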
datadog::Configuration::new(); let api = SpansMetricsAPI::with_config(configuration); let resp = api.create_spans_metric(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a span-based metric returns "OK" response ``` /** * Create a span-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansMetricsApi(configuration); const params: v2.SpansMetricsApiCreateSpansMetricRequest = { body: { data: { attributes: { compute: { aggregationType: "distribution", includePercentiles: false, path: "@duration", }, filter: { query: "@http.status_code:200 service:my-service", }, groupBy: [ { path: "resource_name", tagName: "resource_name", }, ], }, id: "ExampleSpansMetric", type: "spans_metrics", }, }, }; apiInstance .createSpansMetric(params) .then((data: v2.SpansMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a span-based metric](https://docs.datadoghq.com/api/latest/spans-metrics/#get-a-span-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans-metrics/#get-a-span-based-metric-v2) GET https://api.ap1.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/apm/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/apm/config/metrics/{metric_id} ### Overview Get a specific span-based metric from your organization. This endpoint requires the `apm_read` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the span-based metric. ### Response * [200](https://docs.datadoghq.com/api/latest/spans-metrics/#GetSpansMetric-200-v2) * [403](https://docs.datadoghq.com/api/latest/spans-metrics/#GetSpansMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/spans-metrics/#GetSpansMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/spans-metrics/#GetSpansMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) The span-based metric object. Field Type Description data object The span-based metric properties. attributes object The object describing a Datadog span-based metric. compute object The compute rule to compute the span-based metric. aggregation_type enum The type of aggregation to use. 
Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the span-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. group_by [object] The rules for the group by. path string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the span-based metric. type enum The type of resource. The value should always be spans_metrics. Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "my.metric", "type": "spans_metrics" } } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=typescript) ##### Get a span-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/metrics/${metric_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a span-based metric ``` """ Get a span-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_metrics_api import SpansMetricsApi # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ID = environ["SPANS_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansMetricsApi(api_client) response = api_instance.get_spans_metric( metric_id=SPANS_METRIC_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a span-based metric ``` # Get a span-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansMetricsAPI.new # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ID = ENV["SPANS_METRIC_DATA_ID"] p api_instance.get_spans_metric(SPANS_METRIC_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a span-based metric ``` // Get a span-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "spans_metric" in the system SpansMetricDataID := os.Getenv("SPANS_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansMetricsApi(apiClient) resp, r, err := api.GetSpansMetric(ctx, SpansMetricDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansMetricsApi.GetSpansMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
fmt.Fprintf(os.Stdout, "Response from `SpansMetricsApi.GetSpansMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a span-based metric ``` // Get a span-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansMetricsApi; import com.datadog.api.client.v2.model.SpansMetricResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansMetricsApi apiInstance = new SpansMetricsApi(defaultClient); // there is a valid "spans_metric" in the system String SPANS_METRIC_DATA_ID = System.getenv("SPANS_METRIC_DATA_ID"); try { SpansMetricResponse result = apiInstance.getSpansMetric(SPANS_METRIC_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansMetricsApi#getSpansMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a span-based metric ``` // Get a span-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans_metrics::SpansMetricsAPI; #[tokio::main] async fn main() { // there is a valid "spans_metric" in the system let spans_metric_data_id = std::env::var("SPANS_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SpansMetricsAPI::with_config(configuration); let resp = api.get_spans_metric(spans_metric_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a span-based metric ``` /** * Get a span-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansMetricsApi(configuration); // there is a valid "spans_metric" in the system const SPANS_METRIC_DATA_ID = process.env.SPANS_METRIC_DATA_ID as string; const params: v2.SpansMetricsApiGetSpansMetricRequest = { metricId: SPANS_METRIC_DATA_ID, }; apiInstance .getSpansMetric(params) .then((data: v2.SpansMetricResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a span-based metric](https://docs.datadoghq.com/api/latest/spans-metrics/#update-a-span-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans-metrics/#update-a-span-based-metric-v2) PATCH https://api.ap1.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/apm/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/apm/config/metrics/{metric_id} ### Overview Update a specific span-based metric from your organization. Returns the span-based metric object from the request body when the request is successful. This endpoint requires the `apm_generate_metrics` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the span-based metric. ### Request #### Body Data (required) New definition of the span-based metric. * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) Field Type Description data [_required_] object The new span-based metric properties. attributes [_required_] object The span-based metric properties that will be updated. compute object The compute rule to compute the span-based metric. include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. default: `*` group_by [object] The rules for the group by. path [_required_] string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. type [_required_] enum The type of resource. The value should always be spans_metrics. 
Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": { "attributes": { "compute": { "include_percentiles": false }, "filter": { "query": "@http.status_code:200 service:my-service-updated" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "type": "spans_metrics" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/spans-metrics/#UpdateSpansMetric-200-v2) * [400](https://docs.datadoghq.com/api/latest/spans-metrics/#UpdateSpansMetric-400-v2) * [403](https://docs.datadoghq.com/api/latest/spans-metrics/#UpdateSpansMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/spans-metrics/#UpdateSpansMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/spans-metrics/#UpdateSpansMetric-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) The span-based metric object. Field Type Description data object The span-based metric properties. attributes object The object describing a Datadog span-based metric. compute object The compute rule to compute the span-based metric. aggregation_type enum The type of aggregation to use. Allowed enum values: `count,distribution` include_percentiles boolean Toggle to include or exclude percentile aggregations for distribution metrics. Only present when the `aggregation_type` is `distribution`. path string The path to the value the span-based metric will aggregate on (only used if the aggregation type is a "distribution"). filter object The span-based metric filter. Spans matching this filter will be aggregated in this metric. query string The search query - following the span search syntax. group_by [object] The rules for the group by. path string The path to the value the span-based metric will be aggregated over. tag_name string Eventual name of the tag that gets created. By default, the path attribute is used as the tag name. id string The name of the span-based metric. type enum The type of resource. The value should always be spans_metrics. Allowed enum values: `spans_metrics` default: `spans_metrics` ``` { "data": { "attributes": { "compute": { "aggregation_type": "distribution", "include_percentiles": false, "path": "@duration" }, "filter": { "query": "@http.status_code:200 service:my-service" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "id": "my.metric", "type": "spans_metrics" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=typescript) ##### Update a span-based metric returns "OK" response Copy ```
# Path parameters
export metric_id="CHANGE_ME"
# Curl command. Replace api.datadoghq.com with your site's API endpoint: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X PATCH "https://api.datadoghq.com/api/v2/apm/config/metrics/${metric_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF { "data": { "attributes": { "compute": { "include_percentiles": false }, "filter": { "query": "@http.status_code:200 service:my-service-updated" }, "group_by": [ { "path": "resource_name", "tag_name": "resource_name" } ] }, "type": "spans_metrics" } } EOF
``` ##### Update a span-based metric returns "OK" response ``` // Update a span-based metric returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "spans_metric" in the system SpansMetricDataID := os.Getenv("SPANS_METRIC_DATA_ID") body := datadogV2.SpansMetricUpdateRequest{ Data: datadogV2.SpansMetricUpdateData{ Attributes: datadogV2.SpansMetricUpdateAttributes{ Compute: &datadogV2.SpansMetricUpdateCompute{ IncludePercentiles: datadog.PtrBool(false), }, Filter: &datadogV2.SpansMetricFilter{ Query: datadog.PtrString("@http.status_code:200 service:my-service-updated"), }, GroupBy: []datadogV2.SpansMetricGroupBy{ { Path: "resource_name", TagName: datadog.PtrString("resource_name"), }, }, }, Type: datadogV2.SPANSMETRICTYPE_SPANS_METRICS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansMetricsApi(apiClient) resp, r, err := api.UpdateSpansMetric(ctx, SpansMetricDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansMetricsApi.UpdateSpansMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansMetricsApi.UpdateSpansMetric`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ```
# Use your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
``` ##### Update a span-based metric returns "OK" response ``` // Update a span-based metric returns "OK" response import com.datadog.api.client.ApiClient; import
com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansMetricsApi; import com.datadog.api.client.v2.model.SpansMetricFilter; import com.datadog.api.client.v2.model.SpansMetricGroupBy; import com.datadog.api.client.v2.model.SpansMetricResponse; import com.datadog.api.client.v2.model.SpansMetricType; import com.datadog.api.client.v2.model.SpansMetricUpdateAttributes; import com.datadog.api.client.v2.model.SpansMetricUpdateCompute; import com.datadog.api.client.v2.model.SpansMetricUpdateData; import com.datadog.api.client.v2.model.SpansMetricUpdateRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansMetricsApi apiInstance = new SpansMetricsApi(defaultClient); // there is a valid "spans_metric" in the system String SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = System.getenv("SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"); String SPANS_METRIC_DATA_ID = System.getenv("SPANS_METRIC_DATA_ID"); SpansMetricUpdateRequest body = new SpansMetricUpdateRequest() .data( new SpansMetricUpdateData() .attributes( new SpansMetricUpdateAttributes() .compute(new SpansMetricUpdateCompute().includePercentiles(false)) .filter( new SpansMetricFilter() .query("@http.status_code:200 service:my-service-updated")) .groupBy( Collections.singletonList( new SpansMetricGroupBy() .path("resource_name") .tagName("resource_name")))) .type(SpansMetricType.SPANS_METRICS)); try { SpansMetricResponse result = apiInstance.updateSpansMetric(SPANS_METRIC_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansMetricsApi#updateSpansMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a span-based metric returns "OK" response ``` """ Update a span-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_metrics_api import SpansMetricsApi from datadog_api_client.v2.model.spans_metric_filter import SpansMetricFilter from datadog_api_client.v2.model.spans_metric_group_by import SpansMetricGroupBy from datadog_api_client.v2.model.spans_metric_type import SpansMetricType from datadog_api_client.v2.model.spans_metric_update_attributes import SpansMetricUpdateAttributes from datadog_api_client.v2.model.spans_metric_update_compute import SpansMetricUpdateCompute from datadog_api_client.v2.model.spans_metric_update_data import SpansMetricUpdateData from datadog_api_client.v2.model.spans_metric_update_request import SpansMetricUpdateRequest # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = environ["SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"] SPANS_METRIC_DATA_ID = environ["SPANS_METRIC_DATA_ID"] body = SpansMetricUpdateRequest( data=SpansMetricUpdateData( attributes=SpansMetricUpdateAttributes( compute=SpansMetricUpdateCompute( include_percentiles=False, ), 
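# only spans matching this filter query (span search syntax) are aggregated into the metric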
filter=SpansMetricFilter( query="@http.status_code:200 service:my-service-updated", ), group_by=[ SpansMetricGroupBy( path="resource_name", tag_name="resource_name", ), ], ), type=SpansMetricType.SPANS_METRICS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansMetricsApi(api_client) response = api_instance.update_spans_metric(metric_id=SPANS_METRIC_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a span-based metric returns "OK" response ``` # Update a span-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansMetricsAPI.new # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY = ENV["SPANS_METRIC_DATA_ATTRIBUTES_FILTER_QUERY"] SPANS_METRIC_DATA_ID = ENV["SPANS_METRIC_DATA_ID"] body = DatadogAPIClient::V2::SpansMetricUpdateRequest.new({ data: DatadogAPIClient::V2::SpansMetricUpdateData.new({ attributes: DatadogAPIClient::V2::SpansMetricUpdateAttributes.new({ compute: DatadogAPIClient::V2::SpansMetricUpdateCompute.new({ include_percentiles: false, }), filter: DatadogAPIClient::V2::SpansMetricFilter.new({ query: "@http.status_code:200 service:my-service-updated", }), group_by: [ DatadogAPIClient::V2::SpansMetricGroupBy.new({ path: "resource_name", tag_name: "resource_name", }), ], }), type: DatadogAPIClient::V2::SpansMetricType::SPANS_METRICS, }), }) p api_instance.update_spans_metric(SPANS_METRIC_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a span-based metric returns "OK" response ``` // Update a span-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans_metrics::SpansMetricsAPI; use datadog_api_client::datadogV2::model::SpansMetricFilter; use datadog_api_client::datadogV2::model::SpansMetricGroupBy; use datadog_api_client::datadogV2::model::SpansMetricType; use datadog_api_client::datadogV2::model::SpansMetricUpdateAttributes; use datadog_api_client::datadogV2::model::SpansMetricUpdateCompute; use datadog_api_client::datadogV2::model::SpansMetricUpdateData; use datadog_api_client::datadogV2::model::SpansMetricUpdateRequest; #[tokio::main] async fn main() { // there is a valid "spans_metric" in the system let spans_metric_data_id = std::env::var("SPANS_METRIC_DATA_ID").unwrap(); let body = SpansMetricUpdateRequest::new(SpansMetricUpdateData::new( SpansMetricUpdateAttributes::new() .compute(SpansMetricUpdateCompute::new().include_percentiles(false)) .filter( SpansMetricFilter::new() .query("@http.status_code:200 service:my-service-updated".to_string()), ) .group_by(vec![SpansMetricGroupBy::new("resource_name".to_string()) .tag_name("resource_name".to_string())]), SpansMetricType::SPANS_METRICS, )); let configuration = datadog::Configuration::new(); let api = 
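// build the Spans Metrics API client from the configuration above, then call update_spans_metric with the body assembled earlier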
SpansMetricsAPI::with_config(configuration); let resp = api .update_spans_metric(spans_metric_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a span-based metric returns "OK" response ``` /** * Update a span-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansMetricsApi(configuration); // there is a valid "spans_metric" in the system const SPANS_METRIC_DATA_ID = process.env.SPANS_METRIC_DATA_ID as string; const params: v2.SpansMetricsApiUpdateSpansMetricRequest = { body: { data: { attributes: { compute: { includePercentiles: false, }, filter: { query: "@http.status_code:200 service:my-service-updated", }, groupBy: [ { path: "resource_name", tagName: "resource_name", }, ], }, type: "spans_metrics", }, }, metricId: SPANS_METRIC_DATA_ID, }; apiInstance .updateSpansMetric(params) .then((data: v2.SpansMetricResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a span-based metric](https://docs.datadoghq.com/api/latest/spans-metrics/#delete-a-span-based-metric) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans-metrics/#delete-a-span-based-metric-v2) DELETE https://api.ap1.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.ap2.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.eu/api/v2/apm/config/metrics/{metric_id}https://api.ddog-gov.com/api/v2/apm/config/metrics/{metric_id}https://api.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us3.datadoghq.com/api/v2/apm/config/metrics/{metric_id}https://api.us5.datadoghq.com/api/v2/apm/config/metrics/{metric_id} ### Overview Delete a specific span-based metric from your organization. This endpoint requires the `apm_generate_metrics` permission. ### Arguments #### Path Parameters Name Type Description metric_id [_required_] string The name of the span-based metric. ### Response * [204](https://docs.datadoghq.com/api/latest/spans-metrics/#DeleteSpansMetric-204-v2) * [403](https://docs.datadoghq.com/api/latest/spans-metrics/#DeleteSpansMetric-403-v2) * [404](https://docs.datadoghq.com/api/latest/spans-metrics/#DeleteSpansMetric-404-v2) * [429](https://docs.datadoghq.com/api/latest/spans-metrics/#DeleteSpansMetric-429-v2) OK Not Authorized * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans-metrics/) * [Example](https://docs.datadoghq.com/api/latest/spans-metrics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans-metrics/?code-lang=typescript) ##### Delete a span-based metric Copy ``` # Path parameters export metric_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/apm/config/metrics/${metric_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a span-based metric ``` """ Delete a span-based metric returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_metrics_api import SpansMetricsApi # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ID = environ["SPANS_METRIC_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansMetricsApi(api_client) api_instance.delete_spans_metric( metric_id=SPANS_METRIC_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a span-based metric ``` # Delete a span-based metric returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansMetricsAPI.new # there is a valid "spans_metric" in the system SPANS_METRIC_DATA_ID = ENV["SPANS_METRIC_DATA_ID"] api_instance.delete_spans_metric(SPANS_METRIC_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a span-based metric ``` // Delete a span-based metric returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "spans_metric" in the 
system SpansMetricDataID := os.Getenv("SPANS_METRIC_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansMetricsApi(apiClient) r, err := api.DeleteSpansMetric(ctx, SpansMetricDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansMetricsApi.DeleteSpansMetric`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a span-based metric ``` // Delete a span-based metric returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansMetricsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansMetricsApi apiInstance = new SpansMetricsApi(defaultClient); // there is a valid "spans_metric" in the system String SPANS_METRIC_DATA_ID = System.getenv("SPANS_METRIC_DATA_ID"); try { apiInstance.deleteSpansMetric(SPANS_METRIC_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling SpansMetricsApi#deleteSpansMetric"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a span-based metric ``` // Delete a span-based metric returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans_metrics::SpansMetricsAPI; #[tokio::main] async fn main() { // there is a valid "spans_metric" in the system let spans_metric_data_id = std::env::var("SPANS_METRIC_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SpansMetricsAPI::with_config(configuration); let resp = api.delete_spans_metric(spans_metric_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a span-based metric ``` /** * Delete a span-based metric returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansMetricsApi(configuration); // there is a valid "spans_metric" in the system const SPANS_METRIC_DATA_ID = process.env.SPANS_METRIC_DATA_ID as string; const params: 
v2.SpansMetricsApiDeleteSpansMetricRequest = { metricId: SPANS_METRIC_DATA_ID, }; apiInstance .deleteSpansMetric(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/spans # Spans Search and aggregate your spans from your Datadog platform over HTTP. ## [Get a list of spans](https://docs.datadoghq.com/api/latest/spans/#get-a-list-of-spans) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans/#get-a-list-of-spans-v2) GET https://api.ap1.datadoghq.com/api/v2/spans/eventshttps://api.ap2.datadoghq.com/api/v2/spans/eventshttps://api.datadoghq.eu/api/v2/spans/eventshttps://api.ddog-gov.com/api/v2/spans/eventshttps://api.datadoghq.com/api/v2/spans/eventshttps://api.us3.datadoghq.com/api/v2/spans/eventshttps://api.us5.datadoghq.com/api/v2/spans/events ### Overview List endpoint returns spans that match a span search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination?tab=v2api). Use this endpoint to see your latest spans. This endpoint is rate limited to `300` requests per hour. OAuth apps require the `apm_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#spans) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[query] string Search query following spans syntax. filter[from] string Minimum timestamp for requested spans. Supports date-time ISO8601, date math, and regular timestamps (milliseconds). filter[to] string Maximum timestamp for requested spans.
Supports date-time ISO8601, date math, and regular timestamps (milliseconds). sort enum Order of spans in results. Allowed enum values: `timestamp, -timestamp` page[cursor] string List following results with a cursor provided in the previous query. page[limit] integer Maximum number of spans in the response. ### Response * [200](https://docs.datadoghq.com/api/latest/spans/#ListSpansGet-200-v2) * [400](https://docs.datadoghq.com/api/latest/spans/#ListSpansGet-400-v2) * [403](https://docs.datadoghq.com/api/latest/spans/#ListSpansGet-403-v2) * [422](https://docs.datadoghq.com/api/latest/spans/#ListSpansGet-422-v2) * [429](https://docs.datadoghq.com/api/latest/spans/#ListSpansGet-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) Response object with all spans matching the request and pagination information. Field Type Description data [object] Array of spans matching the request. attributes object JSON object containing all span attributes and their associated values. attributes object JSON object of attributes from your span. custom object JSON object of custom spans data. end_timestamp date-time End timestamp of your span. env string Name of the environment from where the spans are being sent. host string Name of the machine from where the spans are being sent. ingestion_reason string The reason why the span was ingested. parent_id string Id of the span that's parent of this span. resource_hash string Unique identifier of the resource. resource_name string The name of the resource. retained_by string The reason why the span was indexed. service string The name of the application or service generating the span events. It is used to switch from APM to Logs, so make sure you define the same value when you use both products. single_span boolean Whether or not the span was collected as a stand-alone span. Always associated to "single_span" ingestion_reason if true. span_id string Id of the span. start_timestamp date-time Start timestamp of your span. tags [string] Array of tags associated with your span. trace_id string Id of the trace to which the span belongs. type string The type of the span. id string Unique ID of the Span. type enum Type of the span. Allowed enum values: `spans` default: `spans` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. 
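The `links.next` URL and `meta.page.after` cursor in this model are what drive pagination: repeat the request with the same parameters plus `page[cursor]` set to the returned cursor until no cursor comes back. As an illustration only — a minimal sketch using the `requests` package against the raw endpoint rather than a Datadog client library, with a hypothetical search query and `DD_API_KEY`/`DD_APP_KEY` read from the environment — the loop looks like this:

```python
"""
Page through GET /api/v2/spans/events by following meta.page.after.
Illustrative sketch only: raw HTTP via `requests`, not a Datadog client library.
"""
import os

import requests

# Adjust the host for your Datadog site.
URL = "https://api.datadoghq.com/api/v2/spans/events"
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

params = {
    "filter[query]": "service:agent",  # hypothetical search query
    "filter[from]": "now-15m",
    "filter[to]": "now",
    "page[limit]": 100,
}

while True:
    resp = requests.get(URL, headers=HEADERS, params=params)
    resp.raise_for_status()
    body = resp.json()
    for span in body.get("data", []):
        print(span["id"])
    cursor = body.get("meta", {}).get("page", {}).get("after")
    if not cursor:
        break  # last page reached
    # Same parameters, with the addition of page[cursor].
    params["page[cursor]"] = cursor
```

Keep the endpoint's rate limit of `300` requests per hour in mind when paging through large result sets.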
``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "custom": {}, "end_timestamp": "2023-01-02T09:42:36.420Z", "env": "prod", "host": "i-0123", "ingestion_reason": "rule", "parent_id": "0", "resource_hash": "a12345678b91c23d", "resource_name": "agent", "retained_by": "retention_filter", "service": "agent", "single_span": true, "span_id": "1234567890987654321", "start_timestamp": "2023-01-02T09:42:36.320Z", "tags": [ "team:A" ], "trace_id": "1234567890987654321", "type": "web" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "spans" } ], "links": { "next": "https://app.datadoghq.com/api/v2/spans/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Unprocessable Entity. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. 
detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests: The rate limit set by the API has been exceeded. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/spans/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans/?code-lang=ruby) * [Java](https://docs.datadoghq.com/api/latest/spans/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/spans/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans/?code-lang=typescript) * [Go](https://docs.datadoghq.com/api/latest/spans/?code-lang=go) ##### Get a list of spans Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spans/events" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a list of spans ``` """ Get a list of spans returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_api import SpansApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansApi(api_client) response = api_instance.list_spans_get() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a list of spans ``` # Get a list 
of spans returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansAPI.new p api_instance.list_spans_get() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a list of spans ``` // Get a list of spans returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansApi; import com.datadog.api.client.v2.model.SpansListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansApi apiInstance = new SpansApi(defaultClient); try { SpansListResponse result = apiInstance.listSpansGet(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansApi#listSpansGet"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a list of spans ``` // Get a list of spans returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans::ListSpansGetOptionalParams; use datadog_api_client::datadogV2::api_spans::SpansAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SpansAPI::with_config(configuration); let resp = api .list_spans_get(ListSpansGetOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a list of spans ``` /** * Get a list of spans returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansApi(configuration); apiInstance .listSpansGet() .then((data: v2.SpansListResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` ##### Get a list of spans ``` // Get a list of spans returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansApi(apiClient) resp, r, err := api.ListSpansGet(ctx, *datadogV2.NewListSpansGetOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansApi.ListSpansGet`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansApi.ListSpansGet`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` * * * ## [Search spans](https://docs.datadoghq.com/api/latest/spans/#search-spans) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans/#search-spans-v2) POST https://api.ap1.datadoghq.com/api/v2/spans/events/searchhttps://api.ap2.datadoghq.com/api/v2/spans/events/searchhttps://api.datadoghq.eu/api/v2/spans/events/searchhttps://api.ddog-gov.com/api/v2/spans/events/searchhttps://api.datadoghq.com/api/v2/spans/events/searchhttps://api.us3.datadoghq.com/api/v2/spans/events/searchhttps://api.us5.datadoghq.com/api/v2/spans/events/search ### Overview List endpoint returns spans that match a span search query. [Results are paginated](https://docs.datadoghq.com/logs/guide/collect-multiple-logs-with-pagination?tab=v2api). Use this endpoint to build complex spans filtering and search. This endpoint is rate limited to `300` requests per hour. OAuth apps require the `apm_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#spans) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) Field Type Description data object The object containing the query content. attributes object The object containing all the query parameters. filter object The search and filter query settings. from string The minimum time for the requested spans, supports date-time ISO8601, date math, and regular timestamps (milliseconds). default: `now-15m` query string The search query - following the span search syntax. default: `*` to string The maximum time for the requested spans, supports date-time ISO8601, date math, and regular timestamps (milliseconds). default: `now` options object Global query options that are used during the query. Note: You should only supply timezone or time offset but not both otherwise the query will fail. 
timeOffset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` page object Paging attributes for listing spans. cursor string List following results with a cursor provided in the previous query. limit int32 Maximum number of spans in the response. default: `10` sort enum Sort parameters when querying spans. Allowed enum values: `timestamp,-timestamp` type enum The type of resource. The value should always be search_request. Allowed enum values: `search_request` default: `search_request` ##### Search spans returns "OK" response ``` { "data": { "attributes": { "filter": { "from": "now-15m", "query": "*", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" }, "type": "search_request" } } ``` Copy ##### Search spans returns "OK" response with pagination ``` { "data": { "attributes": { "filter": { "from": "now-15m", "query": "service:python*", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" }, "type": "search_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/spans/#ListSpans-200-v2) * [400](https://docs.datadoghq.com/api/latest/spans/#ListSpans-400-v2) * [403](https://docs.datadoghq.com/api/latest/spans/#ListSpans-403-v2) * [422](https://docs.datadoghq.com/api/latest/spans/#ListSpans-422-v2) * [429](https://docs.datadoghq.com/api/latest/spans/#ListSpans-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) Response object with all spans matching the request and pagination information. Field Type Description data [object] Array of spans matching the request. attributes object JSON object containing all span attributes and their associated values. attributes object JSON object of attributes from your span. custom object JSON object of custom spans data. end_timestamp date-time End timestamp of your span. env string Name of the environment from where the spans are being sent. host string Name of the machine from where the spans are being sent. ingestion_reason string The reason why the span was ingested. parent_id string Id of the span that's parent of this span. resource_hash string Unique identifier of the resource. resource_name string The name of the resource. retained_by string The reason why the span was indexed. service string The name of the application or service generating the span events. It is used to switch from APM to Logs, so make sure you define the same value when you use both products. single_span boolean Whether or not the span was collected as a stand-alone span. Always associated to "single_span" ingestion_reason if true. span_id string Id of the span. start_timestamp date-time Start timestamp of your span. tags [string] Array of tags associated with your span. trace_id string Id of the trace to which the span belongs. type string The type of the span. id string Unique ID of the Span. type enum Type of the span. Allowed enum values: `spans` default: `spans` links object Links attributes. next string Link for the next set of results. Note that the request can also be made using the POST endpoint. meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. page object Paging attributes. after string The cursor to use to get the next results, if any. 
To make the next request, use the same parameters with the addition of the `page[cursor]`. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "attributes": { "customAttribute": 123, "duration": 2345 }, "custom": {}, "end_timestamp": "2023-01-02T09:42:36.420Z", "env": "prod", "host": "i-0123", "ingestion_reason": "rule", "parent_id": "0", "resource_hash": "a12345678b91c23d", "resource_name": "agent", "retained_by": "retention_filter", "service": "agent", "single_span": true, "span_id": "1234567890987654321", "start_timestamp": "2023-01-02T09:42:36.320Z", "tags": [ "team:A" ], "trace_id": "1234567890987654321", "type": "web" }, "id": "AAAAAWgN8Xwgr1vKDQAAAABBV2dOOFh3ZzZobm1mWXJFYTR0OA", "type": "spans" } ], "links": { "next": "https://app.datadoghq.com/api/v2/spans/event?filter[query]=foo\u0026page[cursor]=eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "meta": { "elapsed": 132, "page": { "after": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==" }, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden: Access denied. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. 
title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Unprocessable Entity. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests: The rate limit set by the API has been exceeded. * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/spans/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/spans/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/spans/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans/?code-lang=typescript) ##### Search spans returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spans/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "from": "now-15m", "query": "*", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 25 }, "sort": "timestamp" }, "type": "search_request" } } EOF ``` ##### Search spans returns "OK" response with pagination Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spans/events/search" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "filter": { "from": "now-15m", "query": "service:python*", "to": "now" }, "options": { "timezone": "GMT" }, "page": { "limit": 2 }, "sort": "timestamp" }, "type": "search_request" } } EOF ``` ##### Search spans returns "OK" response ``` // Search spans returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SpansListRequest{ Data: &datadogV2.SpansListRequestData{ Attributes: &datadogV2.SpansListRequestAttributes{ Filter: &datadogV2.SpansQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("*"), To: datadog.PtrString("now"), }, Options: &datadogV2.SpansQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.SpansListRequestPage{ Limit: datadog.PtrInt32(25), }, Sort: datadogV2.SPANSSORT_TIMESTAMP_ASCENDING.Ptr(), }, Type: datadogV2.SPANSLISTREQUESTTYPE_SEARCH_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansApi(apiClient) resp, r, err := api.ListSpans(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansApi.ListSpans`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansApi.ListSpans`:\n%s\n", responseContent) } ``` Copy ##### Search spans returns "OK" response with pagination ``` // Search spans returns "OK" 
response with pagination package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SpansListRequest{ Data: &datadogV2.SpansListRequestData{ Attributes: &datadogV2.SpansListRequestAttributes{ Filter: &datadogV2.SpansQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("service:python*"), To: datadog.PtrString("now"), }, Options: &datadogV2.SpansQueryOptions{ Timezone: datadog.PtrString("GMT"), }, Page: &datadogV2.SpansListRequestPage{ Limit: datadog.PtrInt32(2), }, Sort: datadogV2.SPANSSORT_TIMESTAMP_ASCENDING.Ptr(), }, Type: datadogV2.SPANSLISTREQUESTTYPE_SEARCH_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansApi(apiClient) resp, _ := api.ListSpansWithPagination(ctx, body) for paginationResult := range resp { if paginationResult.Error != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansApi.ListSpans`: %v\n", paginationResult.Error) } responseContent, _ := json.MarshalIndent(paginationResult.Item, "", " ") fmt.Fprintf(os.Stdout, "%s\n", responseContent) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search spans returns "OK" response ``` // Search spans returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansApi; import com.datadog.api.client.v2.model.SpansListRequest; import com.datadog.api.client.v2.model.SpansListRequestAttributes; import com.datadog.api.client.v2.model.SpansListRequestData; import com.datadog.api.client.v2.model.SpansListRequestPage; import com.datadog.api.client.v2.model.SpansListRequestType; import com.datadog.api.client.v2.model.SpansListResponse; import com.datadog.api.client.v2.model.SpansQueryFilter; import com.datadog.api.client.v2.model.SpansQueryOptions; import com.datadog.api.client.v2.model.SpansSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansApi apiInstance = new SpansApi(defaultClient); SpansListRequest body = new SpansListRequest() .data( new SpansListRequestData() .attributes( new SpansListRequestAttributes() .filter(new SpansQueryFilter().from("now-15m").query("*").to("now")) .options(new SpansQueryOptions().timezone("GMT")) .page(new SpansListRequestPage().limit(25)) .sort(SpansSort.TIMESTAMP_ASCENDING)) .type(SpansListRequestType.SEARCH_REQUEST)); try { SpansListResponse result = apiInstance.listSpans(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansApi#listSpans"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Search spans returns "OK" response with pagination ``` // Search spans returns "OK" response with pagination import com.datadog.api.client.ApiClient; import com.datadog.api.client.PaginationIterable; 
import com.datadog.api.client.v2.api.SpansApi; import com.datadog.api.client.v2.model.Span; import com.datadog.api.client.v2.model.SpansListRequest; import com.datadog.api.client.v2.model.SpansListRequestAttributes; import com.datadog.api.client.v2.model.SpansListRequestData; import com.datadog.api.client.v2.model.SpansListRequestPage; import com.datadog.api.client.v2.model.SpansListRequestType; import com.datadog.api.client.v2.model.SpansQueryFilter; import com.datadog.api.client.v2.model.SpansQueryOptions; import com.datadog.api.client.v2.model.SpansSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansApi apiInstance = new SpansApi(defaultClient); SpansListRequest body = new SpansListRequest() .data( new SpansListRequestData() .attributes( new SpansListRequestAttributes() .filter( new SpansQueryFilter() .from("now-15m") .query("service:python*") .to("now")) .options(new SpansQueryOptions().timezone("GMT")) .page(new SpansListRequestPage().limit(2)) .sort(SpansSort.TIMESTAMP_ASCENDING)) .type(SpansListRequestType.SEARCH_REQUEST)); try { PaginationIterable iterable = apiInstance.listSpansWithPagination(body); for (Span item : iterable) { System.out.println(item); } } catch (RuntimeException e) { System.err.println("Exception when calling SpansApi#listSpansWithPagination"); System.err.println("Reason: " + e.getMessage()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search spans returns "OK" response ``` """ Search spans returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_api import SpansApi from datadog_api_client.v2.model.spans_list_request import SpansListRequest from datadog_api_client.v2.model.spans_list_request_attributes import SpansListRequestAttributes from datadog_api_client.v2.model.spans_list_request_data import SpansListRequestData from datadog_api_client.v2.model.spans_list_request_page import SpansListRequestPage from datadog_api_client.v2.model.spans_list_request_type import SpansListRequestType from datadog_api_client.v2.model.spans_query_filter import SpansQueryFilter from datadog_api_client.v2.model.spans_query_options import SpansQueryOptions from datadog_api_client.v2.model.spans_sort import SpansSort body = SpansListRequest( data=SpansListRequestData( attributes=SpansListRequestAttributes( filter=SpansQueryFilter( _from="now-15m", query="*", to="now", ), options=SpansQueryOptions( timezone="GMT", ), page=SpansListRequestPage( limit=25, ), sort=SpansSort.TIMESTAMP_ASCENDING, ), type=SpansListRequestType.SEARCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansApi(api_client) response = api_instance.list_spans(body=body) print(response) ``` Copy ##### Search spans returns "OK" response with pagination ``` """ Search spans returns "OK" response with pagination """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_api import SpansApi from datadog_api_client.v2.model.spans_list_request import SpansListRequest from datadog_api_client.v2.model.spans_list_request_attributes 
import SpansListRequestAttributes from datadog_api_client.v2.model.spans_list_request_data import SpansListRequestData from datadog_api_client.v2.model.spans_list_request_page import SpansListRequestPage from datadog_api_client.v2.model.spans_list_request_type import SpansListRequestType from datadog_api_client.v2.model.spans_query_filter import SpansQueryFilter from datadog_api_client.v2.model.spans_query_options import SpansQueryOptions from datadog_api_client.v2.model.spans_sort import SpansSort body = SpansListRequest( data=SpansListRequestData( attributes=SpansListRequestAttributes( filter=SpansQueryFilter( _from="now-15m", query="service:python*", to="now", ), options=SpansQueryOptions( timezone="GMT", ), page=SpansListRequestPage( limit=2, ), sort=SpansSort.TIMESTAMP_ASCENDING, ), type=SpansListRequestType.SEARCH_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansApi(api_client) items = api_instance.list_spans_with_pagination(body=body) for item in items: print(item) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search spans returns "OK" response ``` # Search spans returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansAPI.new body = DatadogAPIClient::V2::SpansListRequest.new({ data: DatadogAPIClient::V2::SpansListRequestData.new({ attributes: DatadogAPIClient::V2::SpansListRequestAttributes.new({ filter: DatadogAPIClient::V2::SpansQueryFilter.new({ from: "now-15m", query: "*", to: "now", }), options: DatadogAPIClient::V2::SpansQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::SpansListRequestPage.new({ limit: 25, }), sort: DatadogAPIClient::V2::SpansSort::TIMESTAMP_ASCENDING, }), type: DatadogAPIClient::V2::SpansListRequestType::SEARCH_REQUEST, }), }) p api_instance.list_spans(body) ``` Copy ##### Search spans returns "OK" response with pagination ``` # Search spans returns "OK" response with pagination require "datadog_api_client" api_instance = DatadogAPIClient::V2::SpansAPI.new body = DatadogAPIClient::V2::SpansListRequest.new({ data: DatadogAPIClient::V2::SpansListRequestData.new({ attributes: DatadogAPIClient::V2::SpansListRequestAttributes.new({ filter: DatadogAPIClient::V2::SpansQueryFilter.new({ from: "now-15m", query: "service:python*", to: "now", }), options: DatadogAPIClient::V2::SpansQueryOptions.new({ timezone: "GMT", }), page: DatadogAPIClient::V2::SpansListRequestPage.new({ limit: 2, }), sort: DatadogAPIClient::V2::SpansSort::TIMESTAMP_ASCENDING, }), type: DatadogAPIClient::V2::SpansListRequestType::SEARCH_REQUEST, }), }) api_instance.list_spans_with_pagination(body) { |item| puts item } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search spans returns "OK" response ``` // Search spans returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans::SpansAPI; use 
datadog_api_client::datadogV2::model::SpansListRequest; use datadog_api_client::datadogV2::model::SpansListRequestAttributes; use datadog_api_client::datadogV2::model::SpansListRequestData; use datadog_api_client::datadogV2::model::SpansListRequestPage; use datadog_api_client::datadogV2::model::SpansListRequestType; use datadog_api_client::datadogV2::model::SpansQueryFilter; use datadog_api_client::datadogV2::model::SpansQueryOptions; use datadog_api_client::datadogV2::model::SpansSort; #[tokio::main] async fn main() { let body = SpansListRequest::new().data( SpansListRequestData::new() .attributes( SpansListRequestAttributes::new() .filter( SpansQueryFilter::new() .from("now-15m".to_string()) .query("*".to_string()) .to("now".to_string()), ) .options(SpansQueryOptions::new().timezone("GMT".to_string())) .page(SpansListRequestPage::new().limit(25)) .sort(SpansSort::TIMESTAMP_ASCENDING), ) .type_(SpansListRequestType::SEARCH_REQUEST), ); let configuration = datadog::Configuration::new(); let api = SpansAPI::with_config(configuration); let resp = api.list_spans(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Search spans returns "OK" response with pagination ``` // Search spans returns "OK" response with pagination use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans::SpansAPI; use datadog_api_client::datadogV2::model::SpansListRequest; use datadog_api_client::datadogV2::model::SpansListRequestAttributes; use datadog_api_client::datadogV2::model::SpansListRequestData; use datadog_api_client::datadogV2::model::SpansListRequestPage; use datadog_api_client::datadogV2::model::SpansListRequestType; use datadog_api_client::datadogV2::model::SpansQueryFilter; use datadog_api_client::datadogV2::model::SpansQueryOptions; use datadog_api_client::datadogV2::model::SpansSort; use futures_util::pin_mut; use futures_util::stream::StreamExt; #[tokio::main] async fn main() { let body = SpansListRequest::new().data( SpansListRequestData::new() .attributes( SpansListRequestAttributes::new() .filter( SpansQueryFilter::new() .from("now-15m".to_string()) .query("service:python*".to_string()) .to("now".to_string()), ) .options(SpansQueryOptions::new().timezone("GMT".to_string())) .page(SpansListRequestPage::new().limit(2)) .sort(SpansSort::TIMESTAMP_ASCENDING), ) .type_(SpansListRequestType::SEARCH_REQUEST), ); let configuration = datadog::Configuration::new(); let api = SpansAPI::with_config(configuration); let response = api.list_spans_with_pagination(body); pin_mut!(response); while let Some(resp) = response.next().await { if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search spans returns "OK" response ``` /** * Search spans returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansApi(configuration); const params: v2.SpansApiListSpansRequest = { body: { data: { attributes: { filter: { from: "now-15m", query: "*", to: "now", }, options: { timezone: "GMT", }, page: { limit: 25, 
}, sort: "timestamp", }, type: "search_request", }, }, }; apiInstance .listSpans(params) .then((data: v2.SpansListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Search spans returns "OK" response with pagination ``` /** * Search spans returns "OK" response with pagination */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansApi(configuration); const params: v2.SpansApiListSpansRequest = { body: { data: { attributes: { filter: { from: "now-15m", query: "service:python*", to: "now", }, options: { timezone: "GMT", }, page: { limit: 2, }, sort: "timestamp", }, type: "search_request", }, }, }; (async () => { try { for await (const item of apiInstance.listSpansWithPagination(params)) { console.log(item); } } catch (error) { console.error(error); } })(); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Aggregate spans](https://docs.datadoghq.com/api/latest/spans/#aggregate-spans) * [v2 (latest)](https://docs.datadoghq.com/api/latest/spans/#aggregate-spans-v2) POST https://api.ap1.datadoghq.com/api/v2/spans/analytics/aggregatehttps://api.ap2.datadoghq.com/api/v2/spans/analytics/aggregatehttps://api.datadoghq.eu/api/v2/spans/analytics/aggregatehttps://api.ddog-gov.com/api/v2/spans/analytics/aggregatehttps://api.datadoghq.com/api/v2/spans/analytics/aggregatehttps://api.us3.datadoghq.com/api/v2/spans/analytics/aggregatehttps://api.us5.datadoghq.com/api/v2/spans/analytics/aggregate ### Overview The API endpoint to aggregate spans into buckets and compute metrics and timeseries. This endpoint is rate limited to `300` requests per hour. This endpoint requires the `apm_read` permission. OAuth apps require the `apm_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#spans) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) Field Type Description data object The object containing the query content. attributes object The object containing all the query parameters. compute [object] The list of metrics or timeseries to compute for the retrieved buckets. aggregation [_required_] enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` interval string The time buckets' size (only used for type=timeseries) Defaults to a resolution of 150 points. metric string The metric to use. type enum The type of compute. Allowed enum values: `timeseries,total` default: `total` filter object The search and filter query settings. from string The minimum time for the requested spans, supports date-time ISO8601, date math, and regular timestamps (milliseconds). default: `now-15m` query string The search query - following the span search syntax. default: `*` to string The maximum time for the requested spans, supports date-time ISO8601, date math, and regular timestamps (milliseconds). default: `now` group_by [object] The rules for the group by. 
facet [_required_] string The name of the facet to use (required). histogram object Used to perform a histogram computation (only for measure facets). Note: At most 100 buckets are allowed, the number of buckets is (max - min)/interval. interval [_required_] double The bin size of the histogram buckets. max [_required_] double The maximum value for the measure used in the histogram (values greater than this one are filtered out). min [_required_] double The minimum value for the measure used in the histogram (values smaller than this one are filtered out). limit int64 The maximum buckets to return for this group by. default: `10` missing The value to use for spans that don't have the facet used to group by. Option 1 string The missing value to use if there is string valued facet. Option 2 double The missing value to use if there is a number valued facet. sort object A sort rule. aggregation enum An aggregation function. Allowed enum values: `count,cardinality,pc75,pc90,pc95,pc98,pc99,sum,min,max,avg,median` metric string The metric to sort by (only used for `type=measure`). order enum The order to use, ascending or descending. Allowed enum values: `asc,desc` type enum The type of sorting algorithm. Allowed enum values: `alphabetical,measure` default: `alphabetical` total A resulting object to put the given computes in over all the matching records. Option 1 boolean If set to true, creates an additional bucket labeled "$facet_total". Option 2 string A string to use as the key value for the total bucket. Option 3 double A number to use as the key value for the total bucket. options object Global query options that are used during the query. Note: You should only supply timezone or time offset but not both otherwise the query will fail. timeOffset int64 The time offset (in seconds) to apply to the query. timezone string The timezone can be specified as GMT, UTC, an offset from UTC (like UTC+1), or as a Timezone Database identifier (like America/New_York). default: `UTC` type enum The type of resource. The value should always be aggregate_request. Allowed enum values: `aggregate_request` default: `aggregate_request` ``` { "data": { "attributes": { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "query": "*", "to": "now" } }, "type": "aggregate_request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/spans/#AggregateSpans-200-v2) * [400](https://docs.datadoghq.com/api/latest/spans/#AggregateSpans-400-v2) * [403](https://docs.datadoghq.com/api/latest/spans/#AggregateSpans-403-v2) * [429](https://docs.datadoghq.com/api/latest/spans/#AggregateSpans-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) The response object for the spans aggregate API endpoint. Field Type Description data [object] The list of matching buckets, one item per bucket. attributes object A bucket values. by object The key, value pairs for each group by. The values for each group by. compute object The compute data. computes object A map of the metric name -> value for regular compute or list of values for a timeseries. A bucket value, can be either a timeseries or a single value. Option 1 string A single string value. Option 2 double A single number value. Option 3 [object] A timeseries array. time string The time value for this point. value double The value for this point. id string ID of the spans aggregate. type enum The spans aggregate bucket type. 
Allowed enum values: `bucket` meta object The metadata associated with a request. elapsed int64 The time elapsed in milliseconds. request_id string The identifier of the request. status enum The status of the response. Allowed enum values: `done,timeout` warnings [object] A list of warnings (non fatal errors) encountered, partial results might be returned if warnings are present in the response. code string A unique code for this type of warning. detail string A detailed explanation of this specific warning. title string A short human-readable summary of the warning. ``` { "data": [ { "attributes": { "by": { "": "undefined" }, "compute": {}, "computes": { "": { "description": "undefined", "type": "undefined" } } }, "id": "string", "type": "bucket" } ], "meta": { "elapsed": 132, "request_id": "MWlFUjVaWGZTTTZPYzM0VXp1OXU2d3xLSVpEMjZKQ0VKUTI0dEYtM3RSOFVR", "status": "done", "warnings": [ { "code": "unknown_index", "detail": "indexes: foo, bar", "title": "One or several indexes are missing or invalid, results hold data from the other indexes" } ] } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/spans/) * [Example](https://docs.datadoghq.com/api/latest/spans/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/spans/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/spans/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/spans/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/spans/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/spans/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/spans/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/spans/?code-lang=typescript) ##### Aggregate spans returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/spans/analytics/aggregate" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "compute": [ { "aggregation": "count", "interval": "5m", "type": "timeseries" } ], "filter": { "from": "now-15m", "query": "*", "to": "now" } }, "type": "aggregate_request" } } EOF ``` ##### Aggregate spans returns "OK" response ``` // Aggregate spans returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.SpansAggregateRequest{ Data: &datadogV2.SpansAggregateData{ Attributes: &datadogV2.SpansAggregateRequestAttributes{ Compute: []datadogV2.SpansCompute{ { Aggregation: datadogV2.SPANSAGGREGATIONFUNCTION_COUNT, Interval: datadog.PtrString("5m"), Type: datadogV2.SPANSCOMPUTETYPE_TIMESERIES.Ptr(), }, }, Filter: &datadogV2.SpansQueryFilter{ From: datadog.PtrString("now-15m"), Query: datadog.PtrString("*"), To: datadog.PtrString("now"), }, }, Type: datadogV2.SPANSAGGREGATEREQUESTTYPE_AGGREGATE_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSpansApi(apiClient) resp, r, err := api.AggregateSpans(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SpansApi.AggregateSpans`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SpansApi.AggregateSpans`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Aggregate spans returns "OK" response ``` // Aggregate spans returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SpansApi; import com.datadog.api.client.v2.model.SpansAggregateData; import com.datadog.api.client.v2.model.SpansAggregateRequest; import com.datadog.api.client.v2.model.SpansAggregateRequestAttributes; import com.datadog.api.client.v2.model.SpansAggregateRequestType; import com.datadog.api.client.v2.model.SpansAggregateResponse; import 
com.datadog.api.client.v2.model.SpansAggregationFunction; import com.datadog.api.client.v2.model.SpansCompute; import com.datadog.api.client.v2.model.SpansComputeType; import com.datadog.api.client.v2.model.SpansQueryFilter; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SpansApi apiInstance = new SpansApi(defaultClient); SpansAggregateRequest body = new SpansAggregateRequest() .data( new SpansAggregateData() .attributes( new SpansAggregateRequestAttributes() .compute( Collections.singletonList( new SpansCompute() .aggregation(SpansAggregationFunction.COUNT) .interval("5m") .type(SpansComputeType.TIMESERIES))) .filter(new SpansQueryFilter().from("now-15m").query("*").to("now"))) .type(SpansAggregateRequestType.AGGREGATE_REQUEST)); try { SpansAggregateResponse result = apiInstance.aggregateSpans(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SpansApi#aggregateSpans"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Aggregate spans returns "OK" response ``` """ Aggregate spans returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.spans_api import SpansApi from datadog_api_client.v2.model.spans_aggregate_data import SpansAggregateData from datadog_api_client.v2.model.spans_aggregate_request import SpansAggregateRequest from datadog_api_client.v2.model.spans_aggregate_request_attributes import SpansAggregateRequestAttributes from datadog_api_client.v2.model.spans_aggregate_request_type import SpansAggregateRequestType from datadog_api_client.v2.model.spans_aggregation_function import SpansAggregationFunction from datadog_api_client.v2.model.spans_compute import SpansCompute from datadog_api_client.v2.model.spans_compute_type import SpansComputeType from datadog_api_client.v2.model.spans_query_filter import SpansQueryFilter body = SpansAggregateRequest( data=SpansAggregateData( attributes=SpansAggregateRequestAttributes( compute=[ SpansCompute( aggregation=SpansAggregationFunction.COUNT, interval="5m", type=SpansComputeType.TIMESERIES, ), ], filter=SpansQueryFilter( _from="now-15m", query="*", to="now", ), ), type=SpansAggregateRequestType.AGGREGATE_REQUEST, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SpansApi(api_client) response = api_instance.aggregate_spans(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Aggregate spans returns "OK" response ``` # Aggregate spans returns "OK" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V2::SpansAPI.new body = DatadogAPIClient::V2::SpansAggregateRequest.new({ data: DatadogAPIClient::V2::SpansAggregateData.new({ attributes: DatadogAPIClient::V2::SpansAggregateRequestAttributes.new({ compute: [ DatadogAPIClient::V2::SpansCompute.new({ aggregation: DatadogAPIClient::V2::SpansAggregationFunction::COUNT, interval: "5m", type: DatadogAPIClient::V2::SpansComputeType::TIMESERIES, }), ], filter: DatadogAPIClient::V2::SpansQueryFilter.new({ from: "now-15m", query: "*", to: "now", }), }), type: DatadogAPIClient::V2::SpansAggregateRequestType::AGGREGATE_REQUEST, }), }) p api_instance.aggregate_spans(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Aggregate spans returns "OK" response ``` // Aggregate spans returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_spans::SpansAPI; use datadog_api_client::datadogV2::model::SpansAggregateData; use datadog_api_client::datadogV2::model::SpansAggregateRequest; use datadog_api_client::datadogV2::model::SpansAggregateRequestAttributes; use datadog_api_client::datadogV2::model::SpansAggregateRequestType; use datadog_api_client::datadogV2::model::SpansAggregationFunction; use datadog_api_client::datadogV2::model::SpansCompute; use datadog_api_client::datadogV2::model::SpansComputeType; use datadog_api_client::datadogV2::model::SpansQueryFilter; #[tokio::main] async fn main() { let body = SpansAggregateRequest::new().data( SpansAggregateData::new() .attributes( SpansAggregateRequestAttributes::new() .compute(vec![SpansCompute::new(SpansAggregationFunction::COUNT) .interval("5m".to_string()) .type_(SpansComputeType::TIMESERIES)]) .filter( SpansQueryFilter::new() .from("now-15m".to_string()) .query("*".to_string()) .to("now".to_string()), ), ) .type_(SpansAggregateRequestType::AGGREGATE_REQUEST), ); let configuration = datadog::Configuration::new(); let api = SpansAPI::with_config(configuration); let resp = api.aggregate_spans(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Aggregate spans returns "OK" response ``` /** * Aggregate spans returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SpansApi(configuration); const params: v2.SpansApiAggregateSpansRequest = { body: { data: { attributes: { compute: [ { aggregation: "count", interval: "5m", type: "timeseries", }, ], filter: { from: "now-15m", query: "*", to: "now", }, }, type: "aggregate_request", }, }, }; apiInstance .aggregateSpans(params) .then((data: v2.SpansAggregateResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d5f5b39f-7b93-46dc-9f7a-1d49f13be603&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d7a19fc3-cce8-4282-8140-55ff586f3989&pt=Spans&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspans%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=d5f5b39f-7b93-46dc-9f7a-1d49f13be603&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=d7a19fc3-cce8-4282-8140-55ff586f3989&pt=Spans&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspans%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=b4e94658-6cb7-4e39-ad2f-fa846362759c&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Spans&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fspans%2F&r=<=10866&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=958333) --- # Source: https://docs.datadoghq.com/api/latest/static-analysis # Static Analysis API for static analysis ## [POST request to resolve vulnerable symbols](https://docs.datadoghq.com/api/latest/static-analysis/#post-request-to-resolve-vulnerable-symbols) * [v2 (latest)](https://docs.datadoghq.com/api/latest/static-analysis/#post-request-to-resolve-vulnerable-symbols-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.ap2.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.datadoghq.eu/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.ddog-gov.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.us3.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbolshttps://api.us5.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbols ### Overview OAuth apps require the `code_analysis_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#static-analysis) to access this endpoint. 
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/static-analysis/) * [Example](https://docs.datadoghq.com/api/latest/static-analysis/) Field Type Description data object attributes object purls [string] id string type [_required_] enum default: `resolve-vulnerable-symbols-request` ``` { "data": { "attributes": { "purls": [] }, "id": "string", "type": "resolve-vulnerable-symbols-request" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/static-analysis/#CreateSCAResolveVulnerableSymbols-200-v2) * [429](https://docs.datadoghq.com/api/latest/static-analysis/#CreateSCAResolveVulnerableSymbols-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/static-analysis/) * [Example](https://docs.datadoghq.com/api/latest/static-analysis/) Field Type Description data object attributes object results [object] purl string vulnerable_symbols [object] advisory_id string symbols [object] name string type string value string id string type [_required_] enum default: `resolve-vulnerable-symbols-response` ``` { "data": { "attributes": { "results": [ { "purl": "string", "vulnerable_symbols": [ { "advisory_id": "string", "symbols": [ { "name": "string", "type": "string", "value": "string" } ] } ] } ] }, "id": "string", "type": "resolve-vulnerable-symbols-response" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/static-analysis/) * [Example](https://docs.datadoghq.com/api/latest/static-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=typescript) ##### POST request to resolve vulnerable symbols Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/static-analysis-sca/vulnerabilities/resolve-vulnerable-symbols" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "resolve-vulnerable-symbols-request" } } EOF ``` ##### POST request to resolve vulnerable symbols ``` """ POST request to resolve vulnerable symbols returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.static_analysis_api import StaticAnalysisApi from datadog_api_client.v2.model.resolve_vulnerable_symbols_request import ResolveVulnerableSymbolsRequest from datadog_api_client.v2.model.resolve_vulnerable_symbols_request_data import ResolveVulnerableSymbolsRequestData from datadog_api_client.v2.model.resolve_vulnerable_symbols_request_data_attributes import ( ResolveVulnerableSymbolsRequestDataAttributes, ) from datadog_api_client.v2.model.resolve_vulnerable_symbols_request_data_type import ( 
ResolveVulnerableSymbolsRequestDataType, ) body = ResolveVulnerableSymbolsRequest( data=ResolveVulnerableSymbolsRequestData( attributes=ResolveVulnerableSymbolsRequestDataAttributes( purls=[], ), type=ResolveVulnerableSymbolsRequestDataType.RESOLVE_VULNERABLE_SYMBOLS_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["create_sca_resolve_vulnerable_symbols"] = True with ApiClient(configuration) as api_client: api_instance = StaticAnalysisApi(api_client) response = api_instance.create_sca_resolve_vulnerable_symbols(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### POST request to resolve vulnerable symbols ``` # POST request to resolve vulnerable symbols returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_sca_resolve_vulnerable_symbols".to_sym] = true end api_instance = DatadogAPIClient::V2::StaticAnalysisAPI.new body = DatadogAPIClient::V2::ResolveVulnerableSymbolsRequest.new({ data: DatadogAPIClient::V2::ResolveVulnerableSymbolsRequestData.new({ attributes: DatadogAPIClient::V2::ResolveVulnerableSymbolsRequestDataAttributes.new({ purls: [], }), type: DatadogAPIClient::V2::ResolveVulnerableSymbolsRequestDataType::RESOLVE_VULNERABLE_SYMBOLS_REQUEST, }), }) p api_instance.create_sca_resolve_vulnerable_symbols(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### POST request to resolve vulnerable symbols ``` // POST request to resolve vulnerable symbols returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ResolveVulnerableSymbolsRequest{ Data: &datadogV2.ResolveVulnerableSymbolsRequestData{ Attributes: &datadogV2.ResolveVulnerableSymbolsRequestDataAttributes{ Purls: []string{}, }, Type: datadogV2.RESOLVEVULNERABLESYMBOLSREQUESTDATATYPE_RESOLVE_VULNERABLE_SYMBOLS_REQUEST, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateSCAResolveVulnerableSymbols", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewStaticAnalysisApi(apiClient) resp, r, err := api.CreateSCAResolveVulnerableSymbols(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `StaticAnalysisApi.CreateSCAResolveVulnerableSymbols`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `StaticAnalysisApi.CreateSCAResolveVulnerableSymbols`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### POST request to resolve vulnerable symbols ``` // POST request to resolve vulnerable symbols returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.StaticAnalysisApi; import com.datadog.api.client.v2.model.ResolveVulnerableSymbolsRequest; import com.datadog.api.client.v2.model.ResolveVulnerableSymbolsRequestData; import com.datadog.api.client.v2.model.ResolveVulnerableSymbolsRequestDataAttributes; import com.datadog.api.client.v2.model.ResolveVulnerableSymbolsRequestDataType; import com.datadog.api.client.v2.model.ResolveVulnerableSymbolsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createSCAResolveVulnerableSymbols", true); StaticAnalysisApi apiInstance = new StaticAnalysisApi(defaultClient); ResolveVulnerableSymbolsRequest body = new ResolveVulnerableSymbolsRequest() .data( new ResolveVulnerableSymbolsRequestData() .attributes(new ResolveVulnerableSymbolsRequestDataAttributes()) .type( ResolveVulnerableSymbolsRequestDataType .RESOLVE_VULNERABLE_SYMBOLS_REQUEST)); try { ResolveVulnerableSymbolsResponse result = apiInstance.createSCAResolveVulnerableSymbols(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling StaticAnalysisApi#createSCAResolveVulnerableSymbols"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### POST request to resolve vulnerable symbols ``` // POST request to resolve vulnerable symbols returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_static_analysis::StaticAnalysisAPI; use datadog_api_client::datadogV2::model::ResolveVulnerableSymbolsRequest; use datadog_api_client::datadogV2::model::ResolveVulnerableSymbolsRequestData; use datadog_api_client::datadogV2::model::ResolveVulnerableSymbolsRequestDataAttributes; use datadog_api_client::datadogV2::model::ResolveVulnerableSymbolsRequestDataType; #[tokio::main] async fn main() { let body = ResolveVulnerableSymbolsRequest::new().data( ResolveVulnerableSymbolsRequestData::new( ResolveVulnerableSymbolsRequestDataType::RESOLVE_VULNERABLE_SYMBOLS_REQUEST, ) .attributes(ResolveVulnerableSymbolsRequestDataAttributes::new().purls(vec![])), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateSCAResolveVulnerableSymbols", true); let api = StaticAnalysisAPI::with_config(configuration); let resp = api.create_sca_resolve_vulnerable_symbols(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### POST request to resolve vulnerable symbols ``` /** * POST request to resolve vulnerable symbols returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createSCAResolveVulnerableSymbols"] = true; const apiInstance = new v2.StaticAnalysisApi(configuration); const params: v2.StaticAnalysisApiCreateSCAResolveVulnerableSymbolsRequest = { body: { data: { attributes: { purls: [], }, type: "resolve-vulnerable-symbols-request", }, }, }; apiInstance .createSCAResolveVulnerableSymbols(params) .then((data: v2.ResolveVulnerableSymbolsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Post dependencies for analysis](https://docs.datadoghq.com/api/latest/static-analysis/#post-dependencies-for-analysis) * [v2 (latest)](https://docs.datadoghq.com/api/latest/static-analysis/#post-dependencies-for-analysis-v2) **Note** : This endpoint may be subject to changes. POST https://api.ap1.datadoghq.com/api/v2/static-analysis-sca/dependencieshttps://api.ap2.datadoghq.com/api/v2/static-analysis-sca/dependencieshttps://api.datadoghq.eu/api/v2/static-analysis-sca/dependencieshttps://api.ddog-gov.com/api/v2/static-analysis-sca/dependencieshttps://api.datadoghq.com/api/v2/static-analysis-sca/dependencieshttps://api.us3.datadoghq.com/api/v2/static-analysis-sca/dependencieshttps://api.us5.datadoghq.com/api/v2/static-analysis-sca/dependencies ### Overview OAuth apps require the `code_analysis_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#static-analysis) to access this endpoint. 
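The field reference that follows is large; for orientation, here is a hedged Python sketch of a small but populated payload that reports one resolved dependency together with the commit, repository, and service it was observed in. All values are illustrative, and the keyword arguments on the generated model classes are inferred from the field names documented below rather than taken from an official example.

```
# Hedged sketch: submit one dependency for Software Composition Analysis.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.static_analysis_api import StaticAnalysisApi
from datadog_api_client.v2.model.sca_request import ScaRequest
from datadog_api_client.v2.model.sca_request_data import ScaRequestData
from datadog_api_client.v2.model.sca_request_data_attributes import ScaRequestDataAttributes
from datadog_api_client.v2.model.sca_request_data_attributes_commit import ScaRequestDataAttributesCommit
from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items import (
    ScaRequestDataAttributesDependenciesItems,
)
from datadog_api_client.v2.model.sca_request_data_attributes_repository import ScaRequestDataAttributesRepository
from datadog_api_client.v2.model.sca_request_data_type import ScaRequestDataType

body = ScaRequest(
    data=ScaRequestData(
        attributes=ScaRequestDataAttributes(
            # Commit and repository identify where the dependency list was collected (illustrative values).
            commit=ScaRequestDataAttributesCommit(
                sha="0123456789abcdef0123456789abcdef01234567",
                branch="main",
            ),
            repository=ScaRequestDataAttributesRepository(url="https://github.com/example/app"),
            service="example-service",
            env="prod",
            dependencies=[
                ScaRequestDataAttributesDependenciesItems(
                    # Keyword arguments mirror the documented dependency fields.
                    name="jackson-databind",
                    version="2.13.0",
                    purl="pkg:maven/com.fasterxml.jackson.core/jackson-databind@2.13.0",
                    package_manager="maven",
                    is_direct=True,
                    is_dev=False,
                ),
            ],
        ),
        type=ScaRequestDataType.SCAREQUESTS,
    ),
)

configuration = Configuration()
# The operation is marked unstable, so it must be enabled explicitly.
configuration.unstable_operations["create_sca_result"] = True

with ApiClient(configuration) as api_client:
    api_instance = StaticAnalysisApi(api_client)
    # Submit the payload; the 200 response is documented only as OK.
    api_instance.create_sca_result(body=body)
```

Locations, relations, files, and vulnerabilities can be attached in the same way when richer SBOM data is available, as the complete per-language examples below demonstrate.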
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/static-analysis/) * [Example](https://docs.datadoghq.com/api/latest/static-analysis/) Field Type Description data object attributes object commit object author_date string author_email string author_name string branch string committer_email string committer_name string sha string dependencies [object] exclusions [string] group string is_dev boolean is_direct boolean language string locations [object] block object end object col int32 line int32 file_name string start object col int32 line int32 name object end object col int32 line int32 file_name string start object col int32 line int32 namespace object end object col int32 line int32 file_name string start object col int32 line int32 version object end object col int32 line int32 file_name string start object col int32 line int32 name string package_manager string purl string reachable_symbol_properties [object] name string value string version string env string files [object] name string purl string relations [object] depends_on [string] ref string repository object url string service string tags object string vulnerabilities [object] affects [object] ref string bom_ref string id string id string type [_required_] enum default: `scarequests` ``` { "data": { "attributes": { "commit": { "author_date": "string", "author_email": "string", "author_name": "string", "branch": "string", "committer_email": "string", "committer_name": "string", "sha": "string" }, "dependencies": [ { "exclusions": [], "group": "string", "is_dev": false, "is_direct": false, "language": "string", "locations": [ { "block": { "end": { "col": "integer", "line": "integer" }, "file_name": "string", "start": { "col": "integer", "line": "integer" } }, "name": { "end": { "col": "integer", "line": "integer" }, "file_name": "string", "start": { "col": "integer", "line": "integer" } }, "namespace": { "end": { "col": "integer", "line": "integer" }, "file_name": "string", "start": { "col": "integer", "line": "integer" } }, "version": { "end": { "col": "integer", "line": "integer" }, "file_name": "string", "start": { "col": "integer", "line": "integer" } } } ], "name": "string", "package_manager": "string", "purl": "string", "reachable_symbol_properties": [ { "name": "string", "value": "string" } ], "version": "string" } ], "env": "string", "files": [ { "name": "string", "purl": "string" } ], "relations": [ { "depends_on": [], "ref": "string" } ], "repository": { "url": "string" }, "service": "string", "tags": { "": "string" }, "vulnerabilities": [ { "affects": [ { "ref": "string" } ], "bom_ref": "string", "id": "string" } ] }, "id": "string", "type": "scarequests" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/static-analysis/#CreateSCAResult-200-v2) * [429](https://docs.datadoghq.com/api/latest/static-analysis/#CreateSCAResult-429-v2) OK Too many requests * [Model](https://docs.datadoghq.com/api/latest/static-analysis/) * [Example](https://docs.datadoghq.com/api/latest/static-analysis/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/static-analysis/?code-lang=typescript) ##### Post dependencies for analysis Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/static-analysis-sca/dependencies" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "scarequests" } } EOF ``` ##### Post dependencies for analysis ``` """ Post dependencies for analysis returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.static_analysis_api import StaticAnalysisApi from datadog_api_client.v2.model.sca_request import ScaRequest from datadog_api_client.v2.model.sca_request_data import ScaRequestData from datadog_api_client.v2.model.sca_request_data_attributes import ScaRequestDataAttributes from datadog_api_client.v2.model.sca_request_data_attributes_commit import ScaRequestDataAttributesCommit from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items import ( ScaRequestDataAttributesDependenciesItems, ) from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items_locations_items import ( ScaRequestDataAttributesDependenciesItemsLocationsItems, ) from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items_locations_items_file_position import ( ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition, ) from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items_locations_items_position import ( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition, ) from datadog_api_client.v2.model.sca_request_data_attributes_dependencies_items_reachable_symbol_properties_items import ( ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems, ) from datadog_api_client.v2.model.sca_request_data_attributes_files_items import ScaRequestDataAttributesFilesItems from datadog_api_client.v2.model.sca_request_data_attributes_relations_items import ( ScaRequestDataAttributesRelationsItems, ) from datadog_api_client.v2.model.sca_request_data_attributes_repository import ScaRequestDataAttributesRepository from datadog_api_client.v2.model.sca_request_data_attributes_vulnerabilities_items import ( ScaRequestDataAttributesVulnerabilitiesItems, ) from datadog_api_client.v2.model.sca_request_data_attributes_vulnerabilities_items_affects_items import ( ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems, ) from datadog_api_client.v2.model.sca_request_data_type import ScaRequestDataType body = ScaRequest( data=ScaRequestData( attributes=ScaRequestDataAttributes( commit=ScaRequestDataAttributesCommit(), dependencies=[ ScaRequestDataAttributesDependenciesItems( exclusions=[], locations=[ 
ScaRequestDataAttributesDependenciesItemsLocationsItems( block=ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition( end=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), start=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), ), name=ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition( end=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), start=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), ), namespace=ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition( end=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), start=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), ), version=ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition( end=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), start=ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition(), ), ), ], reachable_symbol_properties=[ ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems(), ], ), ], files=[ ScaRequestDataAttributesFilesItems(), ], relations=[ ScaRequestDataAttributesRelationsItems( depends_on=[], ), ], repository=ScaRequestDataAttributesRepository(), vulnerabilities=[ ScaRequestDataAttributesVulnerabilitiesItems( affects=[ ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems(), ], ), ], ), type=ScaRequestDataType.SCAREQUESTS, ), ) configuration = Configuration() configuration.unstable_operations["create_sca_result"] = True with ApiClient(configuration) as api_client: api_instance = StaticAnalysisApi(api_client) api_instance.create_sca_result(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Post dependencies for analysis ``` # Post dependencies for analysis returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_sca_result".to_sym] = true end api_instance = DatadogAPIClient::V2::StaticAnalysisAPI.new body = DatadogAPIClient::V2::ScaRequest.new({ data: DatadogAPIClient::V2::ScaRequestData.new({ attributes: DatadogAPIClient::V2::ScaRequestDataAttributes.new({ commit: DatadogAPIClient::V2::ScaRequestDataAttributesCommit.new({}), dependencies: [ DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItems.new({ exclusions: [], locations: [ DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItems.new({ block: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition.new({ _end: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), start: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), }), name: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition.new({ _end: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), start: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), }), namespace: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition.new({ _end: 
DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), start: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), }), version: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition.new({ _end: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), start: DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition.new({}), }), }), ], reachable_symbol_properties: [ DatadogAPIClient::V2::ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems.new({}), ], }), ], files: [ DatadogAPIClient::V2::ScaRequestDataAttributesFilesItems.new({}), ], relations: [ DatadogAPIClient::V2::ScaRequestDataAttributesRelationsItems.new({ depends_on: [], }), ], repository: DatadogAPIClient::V2::ScaRequestDataAttributesRepository.new({}), vulnerabilities: [ DatadogAPIClient::V2::ScaRequestDataAttributesVulnerabilitiesItems.new({ affects: [ DatadogAPIClient::V2::ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems.new({}), ], }), ], }), type: DatadogAPIClient::V2::ScaRequestDataType::SCAREQUESTS, }), }) p api_instance.create_sca_result(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Post dependencies for analysis ``` // Post dependencies for analysis returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.ScaRequest{ Data: &datadogV2.ScaRequestData{ Attributes: &datadogV2.ScaRequestDataAttributes{ Commit: &datadogV2.ScaRequestDataAttributesCommit{}, Dependencies: []datadogV2.ScaRequestDataAttributesDependenciesItems{ { Exclusions: []string{}, Locations: []datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItems{ { Block: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition{ End: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, Start: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, }, Name: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition{ End: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, Start: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, }, Namespace: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition{ End: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, Start: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, }, Version: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition{ End: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, Start: &datadogV2.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition{}, }, }, }, ReachableSymbolProperties: []datadogV2.ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems{ {}, }, }, }, Files: []datadogV2.ScaRequestDataAttributesFilesItems{ {}, }, Relations: []datadogV2.ScaRequestDataAttributesRelationsItems{ { DependsOn: []string{}, }, }, 
Repository: &datadogV2.ScaRequestDataAttributesRepository{}, Vulnerabilities: []datadogV2.ScaRequestDataAttributesVulnerabilitiesItems{ { Affects: []datadogV2.ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems{ {}, }, }, }, }, Type: datadogV2.SCAREQUESTDATATYPE_SCAREQUESTS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateSCAResult", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewStaticAnalysisApi(apiClient) r, err := api.CreateSCAResult(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `StaticAnalysisApi.CreateSCAResult`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Post dependencies for analysis ``` // Post dependencies for analysis returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.StaticAnalysisApi; import com.datadog.api.client.v2.model.ScaRequest; import com.datadog.api.client.v2.model.ScaRequestData; import com.datadog.api.client.v2.model.ScaRequestDataAttributes; import com.datadog.api.client.v2.model.ScaRequestDataAttributesCommit; import com.datadog.api.client.v2.model.ScaRequestDataAttributesDependenciesItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesDependenciesItemsLocationsItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition; import com.datadog.api.client.v2.model.ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition; import com.datadog.api.client.v2.model.ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesFilesItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesRelationsItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesRepository; import com.datadog.api.client.v2.model.ScaRequestDataAttributesVulnerabilitiesItems; import com.datadog.api.client.v2.model.ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems; import com.datadog.api.client.v2.model.ScaRequestDataType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createSCAResult", true); StaticAnalysisApi apiInstance = new StaticAnalysisApi(defaultClient); ScaRequest body = new ScaRequest() .data( new ScaRequestData() .attributes( new ScaRequestDataAttributes() .commit(new ScaRequestDataAttributesCommit()) .dependencies( Collections.singletonList( new ScaRequestDataAttributesDependenciesItems() .locations( Collections.singletonList( new ScaRequestDataAttributesDependenciesItemsLocationsItems() .block( new ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition() .end( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition()) .start( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition())) .name( new ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition() .end( new 
ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition()) .start( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition())) .namespace( new ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition() .end( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition()) .start( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition())) .version( new ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition() .end( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition()) .start( new ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition())))) .reachableSymbolProperties( Collections.singletonList( new ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems())))) .files( Collections.singletonList(new ScaRequestDataAttributesFilesItems())) .relations( Collections.singletonList( new ScaRequestDataAttributesRelationsItems())) .repository(new ScaRequestDataAttributesRepository()) .vulnerabilities( Collections.singletonList( new ScaRequestDataAttributesVulnerabilitiesItems() .affects( Collections.singletonList( new ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems()))))) .type(ScaRequestDataType.SCAREQUESTS)); try { apiInstance.createSCAResult(body); } catch (ApiException e) { System.err.println("Exception when calling StaticAnalysisApi#createSCAResult"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Post dependencies for analysis ``` // Post dependencies for analysis returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_static_analysis::StaticAnalysisAPI; use datadog_api_client::datadogV2::model::ScaRequest; use datadog_api_client::datadogV2::model::ScaRequestData; use datadog_api_client::datadogV2::model::ScaRequestDataAttributes; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesCommit; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesDependenciesItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesDependenciesItemsLocationsItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesFilesItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesRelationsItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesRepository; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesVulnerabilitiesItems; use datadog_api_client::datadogV2::model::ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems; use datadog_api_client::datadogV2::model::ScaRequestDataType; #[tokio::main] async fn main() { let body = ScaRequest ::new().data( ScaRequestData::new( ScaRequestDataType::SCAREQUESTS, ).attributes( 
ScaRequestDataAttributes::new() .commit(ScaRequestDataAttributesCommit::new()) .dependencies( vec![ ScaRequestDataAttributesDependenciesItems::new() .exclusions(vec![]) .locations( vec![ ScaRequestDataAttributesDependenciesItemsLocationsItems::new() .block( ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition ::new() .end( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ) .start( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ), ) .name( ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition ::new() .end( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ) .start( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ), ) .namespace( ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition ::new() .end( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ) .start( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ), ) .version( ScaRequestDataAttributesDependenciesItemsLocationsItemsFilePosition ::new() .end( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ) .start( ScaRequestDataAttributesDependenciesItemsLocationsItemsPosition ::new(), ), ) ], ) .reachable_symbol_properties( vec![ ScaRequestDataAttributesDependenciesItemsReachableSymbolPropertiesItems::new() ], ) ], ) .files(vec![ScaRequestDataAttributesFilesItems::new()]) .relations(vec![ScaRequestDataAttributesRelationsItems::new().depends_on(vec![])]) .repository(ScaRequestDataAttributesRepository::new()) .vulnerabilities( vec![ ScaRequestDataAttributesVulnerabilitiesItems ::new().affects(vec![ScaRequestDataAttributesVulnerabilitiesItemsAffectsItems::new()]) ], ), ), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateSCAResult", true); let api = StaticAnalysisAPI::with_config(configuration); let resp = api.create_sca_result(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Post dependencies for analysis ``` /** * Post dependencies for analysis returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createSCAResult"] = true; const apiInstance = new v2.StaticAnalysisApi(configuration); const params: v2.StaticAnalysisApiCreateSCAResultRequest = { body: { data: { attributes: { commit: {}, dependencies: [ { exclusions: [], locations: [ { block: { end: {}, start: {}, }, name: { end: {}, start: {}, }, namespace: { end: {}, start: {}, }, version: { end: {}, start: {}, }, }, ], reachableSymbolProperties: [{}], }, ], files: [{}], relations: [ { dependsOn: [], }, ], repository: {}, vulnerabilities: [ { affects: [{}], }, ], }, type: "scarequests", }, }, }; apiInstance .createSCAResult(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://id.rlcdn.com/464526.gif)![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=cf65a979-d183-4b7d-a723-f756b26cee53&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=28c4c534-4020-4337-b0f5-85e67c351d17&pt=Static%20Analysis&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fstatic-analysis%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=cf65a979-d183-4b7d-a723-f756b26cee53&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=28c4c534-4020-4337-b0f5-85e67c351d17&pt=Static%20Analysis&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fstatic-analysis%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=e154694d-f455-4d0b-8bc6-b6180e9cdeac&bo=2&sid=e9375cd0f0bd11f08bcff787fa0fba6f&vid=e9375d80f0bd11f0b5075ff6ae9d9677&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Static%20Analysis&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Fstatic-analysis%2F&r=<=10031&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=760814) --- # Source: https://docs.datadoghq.com/api/latest/synthetics # Synthetics Datadog Synthetic Monitoring uses simulated user requests and browser rendering to help you ensure uptime, identify regional issues, and track your application performance. Synthetic tests come in two different flavors, [API tests](https://docs.datadoghq.com/synthetics/api_tests/?tab=httptest) and [browser tests](https://docs.datadoghq.com/synthetics/browser_tests). You can use Datadog’s API to manage both test types programmatically. For more information, see the [Synthetic Monitoring documentation](https://docs.datadoghq.com/synthetics/). ## [Create an API test](https://docs.datadoghq.com/api/latest/synthetics/#create-an-api-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#create-an-api-test-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/apihttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/apihttps://api.datadoghq.eu/api/v1/synthetics/tests/apihttps://api.ddog-gov.com/api/v1/synthetics/tests/apihttps://api.datadoghq.com/api/v1/synthetics/tests/apihttps://api.us3.datadoghq.com/api/v1/synthetics/tests/apihttps://api.us5.datadoghq.com/api/v1/synthetics/tests/api ### Overview Create a Synthetic API test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the test to create. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic API test. 
assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. 
Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. 
accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. 
Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. steps [ ] When the test subtype is `multi`, the steps of the test. Option 1 object The Test step used in a Synthetic multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. 
Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. 
code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValues [object] Array of values to parse and save as variables from the response. field string When type is `http_header` or `grpc_metadata`, name of the header or metadatum to extract. name string Name of the variable to extract. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. secure boolean Determines whether or not the extracted value will be obfuscated. type enum Property of the Synthetic Test Response to extract into a local variable. Allowed enum values: `grpc_message,grpc_metadata,http_body,http_header,http_status_code` extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. 
audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. 
name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtype [_required_] enum The subtype of the Synthetic multi-step API test step. Allowed enum values: `http,grpc,ssl,dns,tcp,udp,icmp,websocket` Option 2 object The Wait step used in a Synthetic multi-step API test. id string ID of the step. name [_required_] string The name of the step. subtype [_required_] enum The subtype of the Synthetic multi-step API wait step. Allowed enum values: `wait` value [_required_] int32 The time to wait in seconds. Minimum value: 0. Maximum value: 180. Option 3 object The subtest step used in a Synthetics multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. 
name [_required_] string The name of the step. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtestPublicId [_required_] string Public ID of the test to be played as part of a `playSubTest` step type. subtype [_required_] enum The subtype of the Synthetic multi-step API subtest step. Allowed enum values: `playSubTest` variablesFromScript string Variables defined from JavaScript code. locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. 
restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID for the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `api`. Allowed enum values: `api` default: `api` ##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." 
response ``` { "config": { "steps": [ { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "password": "password", "username": "username" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "password": "password", "username": "username", "type": "web" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "accessKey": "accessKey", "secretKey": "secretKey", "type": "sigv4" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "type": "ntlm" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "password": "password", "username": "username", "type": "digest" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "accessTokenUrl": "accessTokenUrl", "tokenApiAuthentication": "header", "clientId": "clientId", "clientSecret": "clientSecret", "type": "oauth-client" } }, "subtype": "http" }, { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "accessTokenUrl": "accessTokenUrl", "password": "password", "tokenApiAuthentication": "header", "username": "username", "type": "oauth-rop" } }, "subtype": "http" } ] }, "locations": [ "aws:us-east-2" ], "message": "BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json", "name": "Example-Synthetic", "options": { "tick_every": 60 }, "subtype": "multi", "type": "api" } ``` Copy ##### Create a multistep test with subtest returns "OK" response ``` { "config": { "steps": [ { "assertions": [ { "operator": "is", "type": "statusCode", "target": 200 } ], "name": "request is sent", "request": { "url": "https://httpbin.org/status/200", "method": "GET", "basicAuth": { "password": "password", "username": "username" } }, "subtype": "http" }, { "subtype": "playSubTest", "subtestPublicId": "123-abc-456", "name": "subtest step" } ] }, "locations": [ "aws:us-east-2" ], "message": "BDD test payload: synthetics_api_test_multi_step_with_subtest.json", "name": "Example-Synthetic", "options": { "tick_every": 60 }, "subtype": "multi", "type": "api" } ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." 
response ``` { "config": { "assertions": [ { "operator": "is", "target": 1, "type": "grpcHealthcheckStatus" }, { "operator": "is", "target": "proto target", "type": "grpcProto" }, { "operator": "is", "target": "123", "property": "property", "type": "grpcMetadata" } ], "request": { "host": "localhost", "port": 50051, "service": "Hello", "method": "GET", "message": "", "metadata": {} } }, "locations": [ "aws:us-east-2" ], "message": "BDD test payload: synthetics_api_grpc_test_payload.json", "name": "Example-Synthetic", "options": { "min_failure_duration": 0, "min_location_failed": 1, "monitor_options": { "renotify_interval": 0 }, "monitor_name": "Example-Synthetic", "tick_every": 60 }, "subtype": "grpc", "tags": [ "testing:api" ], "type": "api" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsAPITest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsAPITest-400-v1) * [402](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsAPITest-402-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsAPITest-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsAPITest-429-v1) OK - Returns the created test details. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic API test. Field Type Description config [_required_] object Configuration object for a Synthetic API test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. 
To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. 
Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. 
Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. steps [ ] When the test subtype is `multi`, the steps of the test. Option 1 object The Test step used in a Synthetic multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. 
Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValues [object] Array of values to parse and save as variables from the response. field string When type is `http_header` or `grpc_metadata`, name of the header or metadatum to extract. name string Name of the variable to extract. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. secure boolean Determines whether or not the extracted value will be obfuscated. type enum Property of the Synthetic Test Response to extract into a local variable. Allowed enum values: `grpc_message,grpc_metadata,http_body,http_header,http_status_code` extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. 
region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. 
content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. 
timeout double Timeout in seconds for the test. url string URL to perform the test with. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtype [_required_] enum The subtype of the Synthetic multi-step API test step. Allowed enum values: `http,grpc,ssl,dns,tcp,udp,icmp,websocket` Option 2 object The Wait step used in a Synthetic multi-step API test. id string ID of the step. name [_required_] string The name of the step. subtype [_required_] enum The subtype of the Synthetic multi-step API wait step. Allowed enum values: `wait` value [_required_] int32 The time to wait in seconds. Minimum value: 0. Maximum value: 180. Option 3 object The subtest step used in a Synthetics multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtestPublicId [_required_] string Public ID of the test to be played as part of a `playSubTest` step type. subtype [_required_] enum The subtype of the Synthetic multi-step API subtest step. Allowed enum values: `playSubTest` variablesFromScript string Variables defined from JavaScript code. locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. 
follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID for the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `api`. 
Allowed enum values: `api` default: `api` ``` { "config": { "assertions": [ [] ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "steps": [], "variablesFromScript": "dd.variable.set(\"FOO\", \"foo\")" }, "locations": [ "aws:eu-west-3" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "123-abc-456", "status": "live", "subtype": "http", "tags": [ "env:production" ], "type": "api" } ``` Copy - JSON format is wrong - Creation failed * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Test quota is reached * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response Copy ``` ## Create an API test. # Example of an API test. # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/api" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "assertions": [ { "operator": "lessThan", "target": 1000, "type": "responseTime" }, { "operator": "is", "target": 200, "type": "statusCode" }, { "operator": "is", "property": "content-type", "target": "text/html; charset=UTF-8", "type": "header" } ], "request": { "method": "GET", "url": "https://example.com" } }, "locations": [ "azure:eastus", "aws:eu-west-3" ], "message": "MY_NOTIFICATION_MESSAGE", "name": "MY_TEST_NAME", "options": { "min_failure_duration": 0, "min_location_failed": 1, "monitor_options": { "include_tags": true, "locked": false, "new_host_delay": 300, "notify_audit": false, "notify_no_data": false, "renotify_interval": 0 }, "tick_every": 60 }, "status": "live", "subtype": "http", "tags": [ "env:production" ], "type": "api" } EOF ## Create a Multistep API test # Example of a multistep API test running on a fake furniture store. # It creates a card, select a product and then add the product to the card. 
# Curl command (api.datadoghq.com is shown; use the API endpoint for your Datadog site)
curl -X POST "https://api.datadoghq.com/api/v1/synthetics/tests/api" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "steps": [ { "assertions": [ { "operator": "lessThan", "target": 30000, "type": "responseTime" } ], "extractedValues": [ { "field": "location", "name": "CART_ID", "parser": { "type": "regex", "value": "(?:[^\\\\/](?!(\\\\|/)))+$" }, "type": "http_header" } ], "name": "Get a cart", "request": { "method": "POST", "timeout": 30, "url": "https://api.shopist.io/carts" }, "subtype": "http" }, { "assertions": [ { "operator": "is", "target": 200, "type": "statusCode" } ], "extractedValues": [ { "name": "PRODUCT_ID", "parser": { "type": "json_path", "value": "$[0].id['$oid']" }, "type": "http_body" } ], "name": "Get a product", "request": { "method": "GET", "timeout": 30, "url": "https://api.shopist.io/products.json" }, "subtype": "http" }, { "assertions": [ { "operator": "is", "target": 201, "type": "statusCode" } ], "name": "Add product to cart", "request": { "body": "{\n \"cart_item\": {\n \"product_id\": \"{{ PRODUCT_ID }}\",\n \"amount_paid\": 500,\n \"quantity\": 1\n },\n \"cart_id\": \"{{ CART_ID }}\"\n}", "headers": { "content-type": "application/json" }, "method": "POST", "timeout": 30, "url": "https://api.shopist.io/add_item.json" }, "subtype": "http" } ] }, "locations": [ "aws:us-west-2" ], "message": "MY_NOTIFICATION_MESSAGE", "name": "MY_TEST_NAME", "options": { "ci": { "executionRule": "blocking" }, "min_failure_duration": 5400, "min_location_failed": 1, "monitor_options": { "renotify_interval": 0 }, "retry": { "count": 3, "interval": 300 }, "tick_every": 900 }, "status": "live", "subtype": "multi", "tags": [ "env:prod" ], "type": "api" } EOF ```
##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response ``` // Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsAPITest{ Config: datadogV1.SyntheticsAPITestConfig{ Steps: []datadogV1.SyntheticsAPIStep{ datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthWeb: &datadogV1.SyntheticsBasicAuthWeb{ Password: datadog.PtrString("password"), Username: datadog.PtrString("username"), }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthWeb: &datadogV1.SyntheticsBasicAuthWeb{ Password: datadog.PtrString("password"), Username: datadog.PtrString("username"), Type: datadogV1.SYNTHETICSBASICAUTHWEBTYPE_WEB.Ptr(), }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthSigv4: &datadogV1.SyntheticsBasicAuthSigv4{ AccessKey: "accessKey", SecretKey: "secretKey", Type: datadogV1.SYNTHETICSBASICAUTHSIGV4TYPE_SIGV4, }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: 
datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthNTLM: &datadogV1.SyntheticsBasicAuthNTLM{ Type: datadogV1.SYNTHETICSBASICAUTHNTLMTYPE_NTLM, }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthDigest: &datadogV1.SyntheticsBasicAuthDigest{ Password: "password", Username: "username", Type: datadogV1.SYNTHETICSBASICAUTHDIGESTTYPE_DIGEST, }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthOauthClient: &datadogV1.SyntheticsBasicAuthOauthClient{ AccessTokenUrl: "accessTokenUrl", TokenApiAuthentication: datadogV1.SYNTHETICSBASICAUTHOAUTHTOKENAPIAUTHENTICATION_HEADER, ClientId: "clientId", ClientSecret: "clientSecret", Type: datadogV1.SYNTHETICSBASICAUTHOAUTHCLIENTTYPE_OAUTH_CLIENT, }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, datadogV1.SyntheticsAPIStep{ SyntheticsAPITestStep: &datadogV1.SyntheticsAPITestStep{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Type: datadogV1.SYNTHETICSASSERTIONTYPE_STATUS_CODE, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(200)}, }}, }, Name: "request is sent", Request: datadogV1.SyntheticsTestRequest{ Url: datadog.PtrString("https://httpbin.org/status/200"), Method: datadog.PtrString("GET"), BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthOauthROP: &datadogV1.SyntheticsBasicAuthOauthROP{ AccessTokenUrl: "accessTokenUrl", Password: "password", TokenApiAuthentication: datadogV1.SYNTHETICSBASICAUTHOAUTHTOKENAPIAUTHENTICATION_HEADER, Username: "username", Type: datadogV1.SYNTHETICSBASICAUTHOAUTHROPTYPE_OAUTH_ROP, }}, }, Subtype: datadogV1.SYNTHETICSAPITESTSTEPSUBTYPE_HTTP, }}, }, }, Locations: []string{ "aws:us-east-2", }, Message: "BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ TickEvery: datadog.PtrInt64(60), }, Subtype: datadogV1.SYNTHETICSTESTDETAILSSUBTYPE_MULTI.Ptr(), Type: 
datadogV1.SYNTHETICSAPITESTTYPE_API, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsAPITest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsAPITest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsAPITest`:\n%s\n", responseContent) } ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` // Create an API GRPC test returns "OK - Returns the created test details." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsAPITest{ Config: datadogV1.SyntheticsAPITestConfig{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(1)}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_GRPC_HEALTHCHECK_STATUS, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("proto target")}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_GRPC_PROTO, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("123")}, Property: datadog.PtrString("property"), Type: datadogV1.SYNTHETICSASSERTIONTYPE_GRPC_METADATA, }}, }, Request: &datadogV1.SyntheticsTestRequest{ Host: datadog.PtrString("localhost"), Port: &datadogV1.SyntheticsTestRequestPort{ SyntheticsTestRequestNumericalPort: datadog.PtrInt64(50051)}, Service: datadog.PtrString("Hello"), Method: datadog.PtrString("GET"), Message: datadog.PtrString(""), Metadata: map[string]string{}, }, }, Locations: []string{ "aws:us-east-2", }, Message: "BDD test payload: synthetics_api_grpc_test_payload.json", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ MinFailureDuration: datadog.PtrInt64(0), MinLocationFailed: datadog.PtrInt64(1), MonitorOptions: &datadogV1.SyntheticsTestOptionsMonitorOptions{ RenotifyInterval: datadog.PtrInt64(0), }, MonitorName: datadog.PtrString("Example-Synthetic"), TickEvery: datadog.PtrInt64(60), }, Subtype: datadogV1.SYNTHETICSTESTDETAILSSUBTYPE_GRPC.Ptr(), Tags: []string{ "testing:api", }, Type: datadogV1.SYNTHETICSAPITESTTYPE_API, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsAPITest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsAPITest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") 
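// resp holds the created test returned by the API ("OK - Returns the created test details."), including
// server-assigned fields from the response model above such as public_id and monitor_id; MarshalIndent
// pretty-prints it before it is written to stdout below.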
fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsAPITest`:\n%s\n", responseContent) } ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` // Create an API HTTP test has bodyHash filled out package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsAPITest{ Config: datadogV1.SyntheticsAPITestConfig{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Property: datadog.PtrString("{{ PROPERTY }}"), Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("text/html")}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_HEADER, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_LESS_THAN, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(2000)}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_RESPONSE_TIME, TimingsScope: datadogV1.SYNTHETICSASSERTIONTIMINGSSCOPE_WITHOUT_DNS.Ptr(), }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJSONPathTarget: &datadogV1.SyntheticsAssertionJSONPathTarget{ Operator: datadogV1.SYNTHETICSASSERTIONJSONPATHOPERATOR_VALIDATES_JSON_PATH, Target: &datadogV1.SyntheticsAssertionJSONPathTargetTarget{ JsonPath: datadog.PtrString("topKey"), Operator: datadog.PtrString("isNot"), TargetValue: &datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("0")}, }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJSONPathTarget: &datadogV1.SyntheticsAssertionJSONPathTarget{ Operator: datadogV1.SYNTHETICSASSERTIONJSONPATHOPERATOR_VALIDATES_JSON_PATH, Target: &datadogV1.SyntheticsAssertionJSONPathTargetTarget{ ElementsOperator: datadog.PtrString("atLeastOneElementMatches"), JsonPath: datadog.PtrString("topKey"), Operator: datadog.PtrString("isNot"), TargetValue: &datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("0")}, }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJSONSchemaTarget: &datadogV1.SyntheticsAssertionJSONSchemaTarget{ Operator: datadogV1.SYNTHETICSASSERTIONJSONSCHEMAOPERATOR_VALIDATES_JSON_SCHEMA, Target: &datadogV1.SyntheticsAssertionJSONSchemaTargetTarget{ MetaSchema: datadogV1.SYNTHETICSASSERTIONJSONSCHEMAMETASCHEMA_DRAFT_07.Ptr(), JsonSchema: datadog.PtrString(`{"type": "object", "properties":{"slideshow":{"type":"object"}}}`), }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionXPathTarget: &datadogV1.SyntheticsAssertionXPathTarget{ Operator: datadogV1.SYNTHETICSASSERTIONXPATHOPERATOR_VALIDATES_X_PATH, Target: &datadogV1.SyntheticsAssertionXPathTargetTarget{ XPath: datadog.PtrString("target-xpath"), TargetValue: &datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("0")}, Operator: datadog.PtrString("contains"), }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionBodyHashTarget: &datadogV1.SyntheticsAssertionBodyHashTarget{ Operator: datadogV1.SYNTHETICSASSERTIONBODYHASHOPERATOR_MD5, Target: 
datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("a")}, Type: datadogV1.SYNTHETICSASSERTIONBODYHASHTYPE_BODY_HASH, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJavascript: &datadogV1.SyntheticsAssertionJavascript{ Code: "const hello = 'world';", Type: datadogV1.SYNTHETICSASSERTIONJAVASCRIPTTYPE_JAVASCRIPT, }}, }, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Example: datadog.PtrString("content-type"), Name: "PROPERTY", Pattern: datadog.PtrString("content-type"), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, }, }, VariablesFromScript: datadog.PtrString(`dd.variable.set("FOO", "foo")`), Request: &datadogV1.SyntheticsTestRequest{ Certificate: &datadogV1.SyntheticsTestRequestCertificate{ Cert: &datadogV1.SyntheticsTestRequestCertificateItem{ Content: datadog.PtrString("cert-content"), Filename: datadog.PtrString("cert-filename"), UpdatedAt: datadog.PtrString("2020-10-16T09:23:24.857Z"), }, Key: &datadogV1.SyntheticsTestRequestCertificateItem{ Content: datadog.PtrString("key-content"), Filename: datadog.PtrString("key-filename"), UpdatedAt: datadog.PtrString("2020-10-16T09:23:24.857Z"), }, }, Headers: map[string]string{ "unique": "examplesynthetic", }, Method: datadog.PtrString("GET"), Timeout: datadog.PtrFloat64(10), Url: datadog.PtrString("https://datadoghq.com"), Proxy: &datadogV1.SyntheticsTestRequestProxy{ Url: "https://datadoghq.com", Headers: map[string]string{}, }, BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthOauthClient: &datadogV1.SyntheticsBasicAuthOauthClient{ AccessTokenUrl: "https://datadog-token.com", Audience: datadog.PtrString("audience"), ClientId: "client-id", ClientSecret: "client-secret", Resource: datadog.PtrString("resource"), Scope: datadog.PtrString("yoyo"), TokenApiAuthentication: datadogV1.SYNTHETICSBASICAUTHOAUTHTOKENAPIAUTHENTICATION_HEADER, Type: datadogV1.SYNTHETICSBASICAUTHOAUTHCLIENTTYPE_OAUTH_CLIENT, }}, PersistCookies: datadog.PtrBool(true), }, }, Locations: []string{ "aws:us-east-2", }, Message: "BDD test payload: synthetics_api_http_test_payload.json", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ AcceptSelfSigned: datadog.PtrBool(false), AllowInsecure: datadog.PtrBool(true), FollowRedirects: datadog.PtrBool(true), MinFailureDuration: datadog.PtrInt64(10), MinLocationFailed: datadog.PtrInt64(1), MonitorName: datadog.PtrString("Example-Synthetic"), MonitorPriority: datadog.PtrInt32(5), Retry: &datadogV1.SyntheticsTestOptionsRetry{ Count: datadog.PtrInt64(3), Interval: datadog.PtrFloat64(10), }, TickEvery: datadog.PtrInt64(60), HttpVersion: datadogV1.SYNTHETICSTESTOPTIONSHTTPVERSION_HTTP2.Ptr(), }, Subtype: datadogV1.SYNTHETICSTESTDETAILSSUBTYPE_HTTP.Ptr(), Tags: []string{ "testing:api", }, Type: datadogV1.SYNTHETICSAPITESTTYPE_API, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsAPITest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsAPITest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsAPITest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the 
example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response ``` // Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test // details." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPIStep; import com.datadog.api.client.v1.model.SyntheticsAPITest; import com.datadog.api.client.v1.model.SyntheticsAPITestConfig; import com.datadog.api.client.v1.model.SyntheticsAPITestStep; import com.datadog.api.client.v1.model.SyntheticsAPITestStepSubtype; import com.datadog.api.client.v1.model.SyntheticsAPITestType; import com.datadog.api.client.v1.model.SyntheticsAssertion; import com.datadog.api.client.v1.model.SyntheticsAssertionOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionTargetValue; import com.datadog.api.client.v1.model.SyntheticsAssertionType; import com.datadog.api.client.v1.model.SyntheticsBasicAuth; import com.datadog.api.client.v1.model.SyntheticsBasicAuthDigest; import com.datadog.api.client.v1.model.SyntheticsBasicAuthDigestType; import com.datadog.api.client.v1.model.SyntheticsBasicAuthNTLM; import com.datadog.api.client.v1.model.SyntheticsBasicAuthNTLMType; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthClient; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthClientType; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthROP; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthROPType; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthTokenApiAuthentication; import com.datadog.api.client.v1.model.SyntheticsBasicAuthSigv4; import com.datadog.api.client.v1.model.SyntheticsBasicAuthSigv4Type; import com.datadog.api.client.v1.model.SyntheticsBasicAuthWeb; import com.datadog.api.client.v1.model.SyntheticsBasicAuthWebType; import com.datadog.api.client.v1.model.SyntheticsTestDetailsSubType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsAPITest body = new SyntheticsAPITest() .config( new SyntheticsAPITestConfig() .steps( Arrays.asList( new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthWeb() .password("password") .username("username")))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() 
.operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthWeb() .password("password") .username("username") .type(SyntheticsBasicAuthWebType.WEB)))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthSigv4() .accessKey("accessKey") .secretKey("secretKey") .type(SyntheticsBasicAuthSigv4Type.SIGV4)))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthNTLM() .type(SyntheticsBasicAuthNTLMType.NTLM)))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthDigest() .password("password") .username("username") .type( SyntheticsBasicAuthDigestType.DIGEST)))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthOauthClient() .accessTokenUrl("accessTokenUrl") .tokenApiAuthentication( SyntheticsBasicAuthOauthTokenApiAuthentication .HEADER) .clientId("clientId") .clientSecret("clientSecret") .type( SyntheticsBasicAuthOauthClientType .OAUTH_CLIENT)))) .subtype(SyntheticsAPITestStepSubtype.HTTP)), new SyntheticsAPIStep( new SyntheticsAPITestStep() .assertions( Collections.singletonList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .type(SyntheticsAssertionType.STATUS_CODE) .target( new SyntheticsAssertionTargetValue( 200.0))))) .name("request is sent") .request( new SyntheticsTestRequest() .url("https://httpbin.org/status/200") .method("GET") .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthOauthROP() .accessTokenUrl("accessTokenUrl") .password("password") .tokenApiAuthentication( 
SyntheticsBasicAuthOauthTokenApiAuthentication .HEADER) .username("username") .type( SyntheticsBasicAuthOauthROPType .OAUTH_ROP)))) .subtype(SyntheticsAPITestStepSubtype.HTTP))))) .locations(Collections.singletonList("aws:us-east-2")) .message( "BDD test payload:" + " synthetics_api_test_multi_step_with_every_type_of_basic_auth.json") .name("Example-Synthetic") .options(new SyntheticsTestOptions().tickEvery(60L)) .subtype(SyntheticsTestDetailsSubType.MULTI) .type(SyntheticsAPITestType.API); try { SyntheticsAPITest result = apiInstance.createSyntheticsAPITest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsAPITest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` // Create an API GRPC test returns "OK - Returns the created test details." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPITest; import com.datadog.api.client.v1.model.SyntheticsAPITestConfig; import com.datadog.api.client.v1.model.SyntheticsAPITestType; import com.datadog.api.client.v1.model.SyntheticsAssertion; import com.datadog.api.client.v1.model.SyntheticsAssertionOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionTargetValue; import com.datadog.api.client.v1.model.SyntheticsAssertionType; import com.datadog.api.client.v1.model.SyntheticsTestDetailsSubType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsMonitorOptions; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import com.datadog.api.client.v1.model.SyntheticsTestRequestPort; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsAPITest body = new SyntheticsAPITest() .config( new SyntheticsAPITestConfig() .assertions( Arrays.asList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .target(new SyntheticsAssertionTargetValue(1.0)) .type(SyntheticsAssertionType.GRPC_HEALTHCHECK_STATUS)), new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .target(new SyntheticsAssertionTargetValue("proto target")) .type(SyntheticsAssertionType.GRPC_PROTO)), new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .target(new SyntheticsAssertionTargetValue("123")) .property("property") .type(SyntheticsAssertionType.GRPC_METADATA)))) .request( new SyntheticsTestRequest() .host("localhost") .port(new SyntheticsTestRequestPort(50051L)) .service("Hello") .method("GET") .message("") .metadata(Map.ofEntries()))) .locations(Collections.singletonList("aws:us-east-2")) .message("BDD test payload: synthetics_api_grpc_test_payload.json") .name("Example-Synthetic") .options( new SyntheticsTestOptions() .minFailureDuration(0L) .minLocationFailed(1L) .monitorOptions(new 
SyntheticsTestOptionsMonitorOptions().renotifyInterval(0L)) .monitorName("Example-Synthetic") .tickEvery(60L)) .subtype(SyntheticsTestDetailsSubType.GRPC) .tags(Collections.singletonList("testing:api")) .type(SyntheticsAPITestType.API); try { SyntheticsAPITest result = apiInstance.createSyntheticsAPITest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsAPITest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` // Create an API HTTP test has bodyHash filled out import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPITest; import com.datadog.api.client.v1.model.SyntheticsAPITestConfig; import com.datadog.api.client.v1.model.SyntheticsAPITestType; import com.datadog.api.client.v1.model.SyntheticsAssertion; import com.datadog.api.client.v1.model.SyntheticsAssertionBodyHashOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionBodyHashTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionBodyHashType; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathTargetTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaMetaSchema; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaTargetTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJavascript; import com.datadog.api.client.v1.model.SyntheticsAssertionJavascriptType; import com.datadog.api.client.v1.model.SyntheticsAssertionOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionTargetValue; import com.datadog.api.client.v1.model.SyntheticsAssertionTimingsScope; import com.datadog.api.client.v1.model.SyntheticsAssertionType; import com.datadog.api.client.v1.model.SyntheticsAssertionXPathOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionXPathTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionXPathTargetTarget; import com.datadog.api.client.v1.model.SyntheticsBasicAuth; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthClient; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthClientType; import com.datadog.api.client.v1.model.SyntheticsBasicAuthOauthTokenApiAuthentication; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsTestDetailsSubType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsHTTPVersion; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import com.datadog.api.client.v1.model.SyntheticsTestRequestCertificate; import com.datadog.api.client.v1.model.SyntheticsTestRequestCertificateItem; import 
com.datadog.api.client.v1.model.SyntheticsTestRequestProxy; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsAPITest body = new SyntheticsAPITest() .config( new SyntheticsAPITestConfig() .assertions( Arrays.asList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .property("{{ PROPERTY }}") .target(new SyntheticsAssertionTargetValue("text/html")) .type(SyntheticsAssertionType.HEADER)), new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.LESS_THAN) .target(new SyntheticsAssertionTargetValue(2000.0)) .type(SyntheticsAssertionType.RESPONSE_TIME) .timingsScope(SyntheticsAssertionTimingsScope.WITHOUT_DNS)), new SyntheticsAssertion( new SyntheticsAssertionJSONPathTarget() .operator( SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH) .target( new SyntheticsAssertionJSONPathTargetTarget() .jsonPath("topKey") .operator("isNot") .targetValue(new SyntheticsAssertionTargetValue("0"))) .type(SyntheticsAssertionType.BODY)), new SyntheticsAssertion( new SyntheticsAssertionJSONPathTarget() .operator( SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH) .target( new SyntheticsAssertionJSONPathTargetTarget() .elementsOperator("atLeastOneElementMatches") .jsonPath("topKey") .operator("isNot") .targetValue(new SyntheticsAssertionTargetValue("0"))) .type(SyntheticsAssertionType.BODY)), new SyntheticsAssertion( new SyntheticsAssertionJSONSchemaTarget() .operator( SyntheticsAssertionJSONSchemaOperator.VALIDATES_JSON_SCHEMA) .target( new SyntheticsAssertionJSONSchemaTargetTarget() .metaSchema( SyntheticsAssertionJSONSchemaMetaSchema.DRAFT_07) .jsonSchema( """ {"type": "object", "properties":{"slideshow":{"type":"object"}}} """)) .type(SyntheticsAssertionType.BODY)), new SyntheticsAssertion( new SyntheticsAssertionXPathTarget() .operator(SyntheticsAssertionXPathOperator.VALIDATES_X_PATH) .target( new SyntheticsAssertionXPathTargetTarget() .xPath("target-xpath") .targetValue(new SyntheticsAssertionTargetValue("0")) .operator("contains")) .type(SyntheticsAssertionType.BODY)), new SyntheticsAssertion( new SyntheticsAssertionBodyHashTarget() .operator(SyntheticsAssertionBodyHashOperator.MD5) .target(new SyntheticsAssertionTargetValue("a")) .type(SyntheticsAssertionBodyHashType.BODY_HASH)), new SyntheticsAssertion( new SyntheticsAssertionJavascript() .code("const hello = 'world';") .type(SyntheticsAssertionJavascriptType.JAVASCRIPT)))) .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .example("content-type") .name("PROPERTY") .pattern("content-type") .type(SyntheticsConfigVariableType.TEXT))) .variablesFromScript(""" dd.variable.set("FOO", "foo") """) .request( new SyntheticsTestRequest() .certificate( new SyntheticsTestRequestCertificate() .cert( new SyntheticsTestRequestCertificateItem() .content("cert-content") .filename("cert-filename") .updatedAt("2020-10-16T09:23:24.857Z")) .key( new SyntheticsTestRequestCertificateItem() .content("key-content") .filename("key-filename") .updatedAt("2020-10-16T09:23:24.857Z"))) .headers(Map.ofEntries(Map.entry("unique", "examplesynthetic"))) .method("GET") .timeout(10.0) .url("https://datadoghq.com") .proxy( new SyntheticsTestRequestProxy() .url("https://datadoghq.com") .headers(Map.ofEntries())) .basicAuth( new SyntheticsBasicAuth( 
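// OAuth 2.0 client-credentials auth: a token is requested from accessTokenUrl, with the client credentials sent as a header (tokenApiAuthentication = HEADER).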
new SyntheticsBasicAuthOauthClient() .accessTokenUrl("https://datadog-token.com") .audience("audience") .clientId("client-id") .clientSecret("client-secret") .resource("resource") .scope("yoyo") .tokenApiAuthentication( SyntheticsBasicAuthOauthTokenApiAuthentication.HEADER) .type(SyntheticsBasicAuthOauthClientType.OAUTH_CLIENT))) .persistCookies(true))) .locations(Collections.singletonList("aws:us-east-2")) .message("BDD test payload: synthetics_api_http_test_payload.json") .name("Example-Synthetic") .options( new SyntheticsTestOptions() .acceptSelfSigned(false) .allowInsecure(true) .followRedirects(true) .minFailureDuration(10L) .minLocationFailed(1L) .monitorName("Example-Synthetic") .monitorPriority(5) .retry(new SyntheticsTestOptionsRetry().count(3L).interval(10.0)) .tickEvery(60L) .httpVersion(SyntheticsTestOptionsHTTPVersion.HTTP2)) .subtype(SyntheticsTestDetailsSubType.HTTP) .tags(Collections.singletonList("testing:api")) .type(SyntheticsAPITestType.API); try { SyntheticsAPITest result = apiInstance.createSyntheticsAPITest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsAPITest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following command: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response ``` """ Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details."
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_api_test import SyntheticsAPITest from datadog_api_client.v1.model.synthetics_api_test_config import SyntheticsAPITestConfig from datadog_api_client.v1.model.synthetics_api_test_step import SyntheticsAPITestStep from datadog_api_client.v1.model.synthetics_api_test_step_subtype import SyntheticsAPITestStepSubtype from datadog_api_client.v1.model.synthetics_api_test_type import SyntheticsAPITestType from datadog_api_client.v1.model.synthetics_assertion_operator import SyntheticsAssertionOperator from datadog_api_client.v1.model.synthetics_assertion_target import SyntheticsAssertionTarget from datadog_api_client.v1.model.synthetics_assertion_type import SyntheticsAssertionType from datadog_api_client.v1.model.synthetics_basic_auth_digest import SyntheticsBasicAuthDigest from datadog_api_client.v1.model.synthetics_basic_auth_digest_type import SyntheticsBasicAuthDigestType from datadog_api_client.v1.model.synthetics_basic_auth_ntlm import SyntheticsBasicAuthNTLM from datadog_api_client.v1.model.synthetics_basic_auth_ntlm_type import SyntheticsBasicAuthNTLMType from datadog_api_client.v1.model.synthetics_basic_auth_oauth_client import SyntheticsBasicAuthOauthClient from datadog_api_client.v1.model.synthetics_basic_auth_oauth_client_type import SyntheticsBasicAuthOauthClientType from datadog_api_client.v1.model.synthetics_basic_auth_oauth_rop import SyntheticsBasicAuthOauthROP from datadog_api_client.v1.model.synthetics_basic_auth_oauth_rop_type import SyntheticsBasicAuthOauthROPType from datadog_api_client.v1.model.synthetics_basic_auth_oauth_token_api_authentication import ( SyntheticsBasicAuthOauthTokenApiAuthentication, ) from datadog_api_client.v1.model.synthetics_basic_auth_sigv4 import SyntheticsBasicAuthSigv4 from datadog_api_client.v1.model.synthetics_basic_auth_sigv4_type import SyntheticsBasicAuthSigv4Type from datadog_api_client.v1.model.synthetics_basic_auth_web import SyntheticsBasicAuthWeb from datadog_api_client.v1.model.synthetics_basic_auth_web_type import SyntheticsBasicAuthWebType from datadog_api_client.v1.model.synthetics_test_details_sub_type import SyntheticsTestDetailsSubType from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest body = SyntheticsAPITest( config=SyntheticsAPITestConfig( steps=[ SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthWeb( password="password", username="username", ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthWeb( password="password", username="username", type=SyntheticsBasicAuthWebType.WEB, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], 
name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthSigv4( access_key="accessKey", secret_key="secretKey", type=SyntheticsBasicAuthSigv4Type.SIGV4, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthNTLM( type=SyntheticsBasicAuthNTLMType.NTLM, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthDigest( password="password", username="username", type=SyntheticsBasicAuthDigestType.DIGEST, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthOauthClient( access_token_url="accessTokenUrl", token_api_authentication=SyntheticsBasicAuthOauthTokenApiAuthentication.HEADER, client_id="clientId", client_secret="clientSecret", type=SyntheticsBasicAuthOauthClientType.OAUTH_CLIENT, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), SyntheticsAPITestStep( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, type=SyntheticsAssertionType.STATUS_CODE, target=200, ), ], name="request is sent", request=SyntheticsTestRequest( url="https://httpbin.org/status/200", method="GET", basic_auth=SyntheticsBasicAuthOauthROP( access_token_url="accessTokenUrl", password="password", token_api_authentication=SyntheticsBasicAuthOauthTokenApiAuthentication.HEADER, username="username", type=SyntheticsBasicAuthOauthROPType.OAUTH_ROP, ), ), subtype=SyntheticsAPITestStepSubtype.HTTP, ), ], ), locations=[ "aws:us-east-2", ], message="BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json", name="Example-Synthetic", options=SyntheticsTestOptions( tick_every=60, ), subtype=SyntheticsTestDetailsSubType.MULTI, type=SyntheticsAPITestType.API, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_api_test(body=body) print(response) ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` """ Create an API GRPC test returns "OK - Returns the created test details." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_api_test import SyntheticsAPITest from datadog_api_client.v1.model.synthetics_api_test_config import SyntheticsAPITestConfig from datadog_api_client.v1.model.synthetics_api_test_type import SyntheticsAPITestType from datadog_api_client.v1.model.synthetics_assertion_operator import SyntheticsAssertionOperator from datadog_api_client.v1.model.synthetics_assertion_target import SyntheticsAssertionTarget from datadog_api_client.v1.model.synthetics_assertion_type import SyntheticsAssertionType from datadog_api_client.v1.model.synthetics_test_details_sub_type import SyntheticsTestDetailsSubType from datadog_api_client.v1.model.synthetics_test_metadata import SyntheticsTestMetadata from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_monitor_options import SyntheticsTestOptionsMonitorOptions from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest body = SyntheticsAPITest( config=SyntheticsAPITestConfig( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, target=1, type=SyntheticsAssertionType.GRPC_HEALTHCHECK_STATUS, ), SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, target="proto target", type=SyntheticsAssertionType.GRPC_PROTO, ), SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, target="123", _property="property", type=SyntheticsAssertionType.GRPC_METADATA, ), ], request=SyntheticsTestRequest( host="localhost", port=50051, service="Hello", method="GET", message="", metadata=SyntheticsTestMetadata(), ), ), locations=[ "aws:us-east-2", ], message="BDD test payload: synthetics_api_grpc_test_payload.json", name="Example-Synthetic", options=SyntheticsTestOptions( min_failure_duration=0, min_location_failed=1, monitor_options=SyntheticsTestOptionsMonitorOptions( renotify_interval=0, ), monitor_name="Example-Synthetic", tick_every=60, ), subtype=SyntheticsTestDetailsSubType.GRPC, tags=[ "testing:api", ], type=SyntheticsAPITestType.API, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_api_test(body=body) print(response) ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` """ Create an API HTTP test has bodyHash filled out """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_api_test import SyntheticsAPITest from datadog_api_client.v1.model.synthetics_api_test_config import SyntheticsAPITestConfig from datadog_api_client.v1.model.synthetics_api_test_type import SyntheticsAPITestType from datadog_api_client.v1.model.synthetics_assertion_body_hash_operator import SyntheticsAssertionBodyHashOperator from datadog_api_client.v1.model.synthetics_assertion_body_hash_target import SyntheticsAssertionBodyHashTarget from datadog_api_client.v1.model.synthetics_assertion_body_hash_type import SyntheticsAssertionBodyHashType from datadog_api_client.v1.model.synthetics_assertion_javascript import SyntheticsAssertionJavascript from datadog_api_client.v1.model.synthetics_assertion_javascript_type import SyntheticsAssertionJavascriptType from datadog_api_client.v1.model.synthetics_assertion_json_path_operator import 
SyntheticsAssertionJSONPathOperator from datadog_api_client.v1.model.synthetics_assertion_json_path_target import SyntheticsAssertionJSONPathTarget from datadog_api_client.v1.model.synthetics_assertion_json_path_target_target import ( SyntheticsAssertionJSONPathTargetTarget, ) from datadog_api_client.v1.model.synthetics_assertion_json_schema_meta_schema import ( SyntheticsAssertionJSONSchemaMetaSchema, ) from datadog_api_client.v1.model.synthetics_assertion_json_schema_operator import SyntheticsAssertionJSONSchemaOperator from datadog_api_client.v1.model.synthetics_assertion_json_schema_target import SyntheticsAssertionJSONSchemaTarget from datadog_api_client.v1.model.synthetics_assertion_json_schema_target_target import ( SyntheticsAssertionJSONSchemaTargetTarget, ) from datadog_api_client.v1.model.synthetics_assertion_operator import SyntheticsAssertionOperator from datadog_api_client.v1.model.synthetics_assertion_target import SyntheticsAssertionTarget from datadog_api_client.v1.model.synthetics_assertion_timings_scope import SyntheticsAssertionTimingsScope from datadog_api_client.v1.model.synthetics_assertion_type import SyntheticsAssertionType from datadog_api_client.v1.model.synthetics_assertion_x_path_operator import SyntheticsAssertionXPathOperator from datadog_api_client.v1.model.synthetics_assertion_x_path_target import SyntheticsAssertionXPathTarget from datadog_api_client.v1.model.synthetics_assertion_x_path_target_target import SyntheticsAssertionXPathTargetTarget from datadog_api_client.v1.model.synthetics_basic_auth_oauth_client import SyntheticsBasicAuthOauthClient from datadog_api_client.v1.model.synthetics_basic_auth_oauth_client_type import SyntheticsBasicAuthOauthClientType from datadog_api_client.v1.model.synthetics_basic_auth_oauth_token_api_authentication import ( SyntheticsBasicAuthOauthTokenApiAuthentication, ) from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType from datadog_api_client.v1.model.synthetics_test_details_sub_type import SyntheticsTestDetailsSubType from datadog_api_client.v1.model.synthetics_test_headers import SyntheticsTestHeaders from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_http_version import SyntheticsTestOptionsHTTPVersion from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest from datadog_api_client.v1.model.synthetics_test_request_certificate import SyntheticsTestRequestCertificate from datadog_api_client.v1.model.synthetics_test_request_certificate_item import SyntheticsTestRequestCertificateItem from datadog_api_client.v1.model.synthetics_test_request_proxy import SyntheticsTestRequestProxy body = SyntheticsAPITest( config=SyntheticsAPITestConfig( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, _property="{{ PROPERTY }}", target="text/html", type=SyntheticsAssertionType.HEADER, ), SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.LESS_THAN, target=2000, type=SyntheticsAssertionType.RESPONSE_TIME, timings_scope=SyntheticsAssertionTimingsScope.WITHOUT_DNS, ), SyntheticsAssertionJSONPathTarget( operator=SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH, target=SyntheticsAssertionJSONPathTargetTarget( json_path="topKey", 
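# JSONPath assertion: the value at "topKey" must not be "0".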
operator="isNot", target_value="0", ), type=SyntheticsAssertionType.BODY, ), SyntheticsAssertionJSONPathTarget( operator=SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH, target=SyntheticsAssertionJSONPathTargetTarget( elements_operator="atLeastOneElementMatches", json_path="topKey", operator="isNot", target_value="0", ), type=SyntheticsAssertionType.BODY, ), SyntheticsAssertionJSONSchemaTarget( operator=SyntheticsAssertionJSONSchemaOperator.VALIDATES_JSON_SCHEMA, target=SyntheticsAssertionJSONSchemaTargetTarget( meta_schema=SyntheticsAssertionJSONSchemaMetaSchema.DRAFT_07, json_schema='{"type": "object", "properties":{"slideshow":{"type":"object"}}}', ), type=SyntheticsAssertionType.BODY, ), SyntheticsAssertionXPathTarget( operator=SyntheticsAssertionXPathOperator.VALIDATES_X_PATH, target=SyntheticsAssertionXPathTargetTarget( x_path="target-xpath", target_value="0", operator="contains", ), type=SyntheticsAssertionType.BODY, ), SyntheticsAssertionBodyHashTarget( operator=SyntheticsAssertionBodyHashOperator.MD5, target="a", type=SyntheticsAssertionBodyHashType.BODY_HASH, ), SyntheticsAssertionJavascript( code="const hello = 'world';", type=SyntheticsAssertionJavascriptType.JAVASCRIPT, ), ], config_variables=[ SyntheticsConfigVariable( example="content-type", name="PROPERTY", pattern="content-type", type=SyntheticsConfigVariableType.TEXT, ), ], variables_from_script='dd.variable.set("FOO", "foo")', request=SyntheticsTestRequest( certificate=SyntheticsTestRequestCertificate( cert=SyntheticsTestRequestCertificateItem( content="cert-content", filename="cert-filename", updated_at="2020-10-16T09:23:24.857Z", ), key=SyntheticsTestRequestCertificateItem( content="key-content", filename="key-filename", updated_at="2020-10-16T09:23:24.857Z", ), ), headers=SyntheticsTestHeaders( unique="examplesynthetic", ), method="GET", timeout=10.0, url="https://datadoghq.com", proxy=SyntheticsTestRequestProxy( url="https://datadoghq.com", headers=SyntheticsTestHeaders(), ), basic_auth=SyntheticsBasicAuthOauthClient( access_token_url="https://datadog-token.com", audience="audience", client_id="client-id", client_secret="client-secret", resource="resource", scope="yoyo", token_api_authentication=SyntheticsBasicAuthOauthTokenApiAuthentication.HEADER, type=SyntheticsBasicAuthOauthClientType.OAUTH_CLIENT, ), persist_cookies=True, ), ), locations=[ "aws:us-east-2", ], message="BDD test payload: synthetics_api_http_test_payload.json", name="Example-Synthetic", options=SyntheticsTestOptions( accept_self_signed=False, allow_insecure=True, follow_redirects=True, min_failure_duration=10, min_location_failed=1, monitor_name="Example-Synthetic", monitor_priority=5, retry=SyntheticsTestOptionsRetry( count=3, interval=10.0, ), tick_every=60, http_version=SyntheticsTestOptionsHTTPVersion.HTTP2, ), subtype=SyntheticsTestDetailsSubType.HTTP, tags=[ "testing:api", ], type=SyntheticsAPITestType.API, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_api_test(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a multi-step api test with every type of basicAuth 
returns "OK - Returns the created test details." response ``` # Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsAPITest.new({ config: DatadogAPIClient::V1::SyntheticsAPITestConfig.new({ steps: [ DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthWeb.new({ password: "password", username: "username", }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthWeb.new({ password: "password", username: "username", type: DatadogAPIClient::V1::SyntheticsBasicAuthWebType::WEB, }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthSigv4.new({ access_key: "accessKey", secret_key: "secretKey", type: DatadogAPIClient::V1::SyntheticsBasicAuthSigv4Type::SIGV4, }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthNTLM.new({ type: DatadogAPIClient::V1::SyntheticsBasicAuthNTLMType::NTLM, }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthDigest.new({ password: "password", username: "username", type: DatadogAPIClient::V1::SyntheticsBasicAuthDigestType::DIGEST, }), }), subtype: 
DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthOauthClient.new({ access_token_url: "accessTokenUrl", token_api_authentication: DatadogAPIClient::V1::SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, client_id: "clientId", client_secret: "clientSecret", type: DatadogAPIClient::V1::SyntheticsBasicAuthOauthClientType::OAUTH_CLIENT, }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), DatadogAPIClient::V1::SyntheticsAPITestStep.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, type: DatadogAPIClient::V1::SyntheticsAssertionType::STATUS_CODE, target: 200, }), ], name: "request is sent", request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ url: "https://httpbin.org/status/200", method: "GET", basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthOauthROP.new({ access_token_url: "accessTokenUrl", password: "password", token_api_authentication: DatadogAPIClient::V1::SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, username: "username", type: DatadogAPIClient::V1::SyntheticsBasicAuthOauthROPType::OAUTH_ROP, }), }), subtype: DatadogAPIClient::V1::SyntheticsAPITestStepSubtype::HTTP, }), ], }), locations: [ "aws:us-east-2", ], message: "BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ tick_every: 60, }), subtype: DatadogAPIClient::V1::SyntheticsTestDetailsSubType::MULTI, type: DatadogAPIClient::V1::SyntheticsAPITestType::API, }) p api_instance.create_synthetics_api_test(body) ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` # Create an API GRPC test returns "OK - Returns the created test details." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsAPITest.new({ config: DatadogAPIClient::V1::SyntheticsAPITestConfig.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, target: 1, type: DatadogAPIClient::V1::SyntheticsAssertionType::GRPC_HEALTHCHECK_STATUS, }), DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, target: "proto target", type: DatadogAPIClient::V1::SyntheticsAssertionType::GRPC_PROTO, }), DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, target: "123", property: "property", type: DatadogAPIClient::V1::SyntheticsAssertionType::GRPC_METADATA, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ host: "localhost", port: 50051, service: "Hello", method: "GET", message: "", metadata: {}, }), }), locations: [ "aws:us-east-2", ], message: "BDD test payload: synthetics_api_grpc_test_payload.json", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ min_failure_duration: 0, min_location_failed: 1, monitor_options: DatadogAPIClient::V1::SyntheticsTestOptionsMonitorOptions.new({ renotify_interval: 0, }), monitor_name: "Example-Synthetic", tick_every: 60, }), subtype: DatadogAPIClient::V1::SyntheticsTestDetailsSubType::GRPC, tags: [ "testing:api", ], type: DatadogAPIClient::V1::SyntheticsAPITestType::API, }) p api_instance.create_synthetics_api_test(body) ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` # Create an API HTTP test has bodyHash filled out require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsAPITest.new({ config: DatadogAPIClient::V1::SyntheticsAPITestConfig.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, property: "{{ PROPERTY }}", target: "text/html", type: DatadogAPIClient::V1::SyntheticsAssertionType::HEADER, }), DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::LESS_THAN, target: 2000, type: DatadogAPIClient::V1::SyntheticsAssertionType::RESPONSE_TIME, timings_scope: DatadogAPIClient::V1::SyntheticsAssertionTimingsScope::WITHOUT_DNS, }), DatadogAPIClient::V1::SyntheticsAssertionJSONPathTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, target: DatadogAPIClient::V1::SyntheticsAssertionJSONPathTargetTarget.new({ json_path: "topKey", operator: "isNot", target_value: "0", }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), DatadogAPIClient::V1::SyntheticsAssertionJSONPathTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, target: DatadogAPIClient::V1::SyntheticsAssertionJSONPathTargetTarget.new({ elements_operator: "atLeastOneElementMatches", json_path: "topKey", operator: "isNot", target_value: "0", }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaOperator::VALIDATES_JSON_SCHEMA, target: DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaTargetTarget.new({ meta_schema: 
DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaMetaSchema::DRAFT_07, json_schema: '{"type": "object", "properties":{"slideshow":{"type":"object"}}}', }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), DatadogAPIClient::V1::SyntheticsAssertionXPathTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionXPathOperator::VALIDATES_X_PATH, target: DatadogAPIClient::V1::SyntheticsAssertionXPathTargetTarget.new({ x_path: "target-xpath", target_value: "0", operator: "contains", }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), DatadogAPIClient::V1::SyntheticsAssertionBodyHashTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionBodyHashOperator::MD5, target: "a", type: DatadogAPIClient::V1::SyntheticsAssertionBodyHashType::BODY_HASH, }), DatadogAPIClient::V1::SyntheticsAssertionJavascript.new({ code: "const hello = 'world';", type: DatadogAPIClient::V1::SyntheticsAssertionJavascriptType::JAVASCRIPT, }), ], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ example: "content-type", name: "PROPERTY", pattern: "content-type", type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, }), ], variables_from_script: 'dd.variable.set("FOO", "foo")', request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ certificate: DatadogAPIClient::V1::SyntheticsTestRequestCertificate.new({ cert: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({ content: "cert-content", filename: "cert-filename", updated_at: "2020-10-16T09:23:24.857Z", }), key: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({ content: "key-content", filename: "key-filename", updated_at: "2020-10-16T09:23:24.857Z", }), }), headers: { unique: "examplesynthetic", }, method: "GET", timeout: 10, url: "https://datadoghq.com", proxy: DatadogAPIClient::V1::SyntheticsTestRequestProxy.new({ url: "https://datadoghq.com", headers: {}, }), basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthOauthClient.new({ access_token_url: "https://datadog-token.com", audience: "audience", client_id: "client-id", client_secret: "client-secret", resource: "resource", scope: "yoyo", token_api_authentication: DatadogAPIClient::V1::SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, type: DatadogAPIClient::V1::SyntheticsBasicAuthOauthClientType::OAUTH_CLIENT, }), persist_cookies: true, }), }), locations: [ "aws:us-east-2", ], message: "BDD test payload: synthetics_api_http_test_payload.json", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ accept_self_signed: false, allow_insecure: true, follow_redirects: true, min_failure_duration: 10, min_location_failed: 1, monitor_name: "Example-Synthetic", monitor_priority: 5, _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({ count: 3, interval: 10, }), tick_every: 60, http_version: DatadogAPIClient::V1::SyntheticsTestOptionsHTTPVersion::HTTP2, }), subtype: DatadogAPIClient::V1::SyntheticsTestDetailsSubType::HTTP, tags: [ "testing:api", ], type: DatadogAPIClient::V1::SyntheticsAPITestType::API, }) p api_instance.create_synthetics_api_test(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following command: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a multi-step api test with every type of basicAuth
returns "OK - Returns the created test details." response ``` // Create a multi-step api test with every type of basicAuth returns "OK - Returns // the created test details." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsAPIStep; use datadog_api_client::datadogV1::model::SyntheticsAPITest; use datadog_api_client::datadogV1::model::SyntheticsAPITestConfig; use datadog_api_client::datadogV1::model::SyntheticsAPITestStep; use datadog_api_client::datadogV1::model::SyntheticsAPITestStepSubtype; use datadog_api_client::datadogV1::model::SyntheticsAPITestType; use datadog_api_client::datadogV1::model::SyntheticsAssertion; use datadog_api_client::datadogV1::model::SyntheticsAssertionOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionTargetValue; use datadog_api_client::datadogV1::model::SyntheticsAssertionType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuth; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthDigest; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthDigestType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthNTLM; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthNTLMType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthClient; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthClientType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthROP; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthROPType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthTokenApiAuthentication; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthSigv4; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthSigv4Type; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWeb; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWebType; use datadog_api_client::datadogV1::model::SyntheticsTestDetailsSubType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; #[tokio::main] async fn main() { let body = SyntheticsAPITest::new( SyntheticsAPITestConfig::new().steps(vec![ SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthWeb(Box::new( SyntheticsBasicAuthWeb::new() .password("password".to_string()) .username("username".to_string()), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthWeb(Box::new( SyntheticsBasicAuthWeb::new() .password("password".to_string()) 
.type_(SyntheticsBasicAuthWebType::WEB) .username("username".to_string()), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthSigv4(Box::new( SyntheticsBasicAuthSigv4::new( "accessKey".to_string(), "secretKey".to_string(), SyntheticsBasicAuthSigv4Type::SIGV4, ), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthNTLM(Box::new( SyntheticsBasicAuthNTLM::new(SyntheticsBasicAuthNTLMType::NTLM), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthDigest(Box::new( SyntheticsBasicAuthDigest::new( "password".to_string(), SyntheticsBasicAuthDigestType::DIGEST, "username".to_string(), ), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthOauthClient( Box::new(SyntheticsBasicAuthOauthClient::new( "accessTokenUrl".to_string(), "clientId".to_string(), "clientSecret".to_string(), SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, SyntheticsBasicAuthOauthClientType::OAUTH_CLIENT, )), )) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), SyntheticsAPIStep::SyntheticsAPITestStep(Box::new(SyntheticsAPITestStep::new( vec![SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 200.0 as f64, ), SyntheticsAssertionType::STATUS_CODE, ), ))], "request is sent".to_string(), SyntheticsTestRequest::new() 
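// Final step: OAuth ROP (resource owner password) credentials exchanged for a token at the access token URL.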
.basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthOauthROP(Box::new( SyntheticsBasicAuthOauthROP::new( "accessTokenUrl".to_string(), "password".to_string(), SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, SyntheticsBasicAuthOauthROPType::OAUTH_ROP, "username".to_string(), ), ))) .method("GET".to_string()) .url("https://httpbin.org/status/200".to_string()), SyntheticsAPITestStepSubtype::HTTP, ))), ]), vec!["aws:us-east-2".to_string()], "BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json" .to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new().tick_every(60), SyntheticsAPITestType::API, ) .subtype(SyntheticsTestDetailsSubType::MULTI); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_api_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` // Create an API GRPC test returns "OK - Returns the created test details." // response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsAPITest; use datadog_api_client::datadogV1::model::SyntheticsAPITestConfig; use datadog_api_client::datadogV1::model::SyntheticsAPITestType; use datadog_api_client::datadogV1::model::SyntheticsAssertion; use datadog_api_client::datadogV1::model::SyntheticsAssertionOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionTargetValue; use datadog_api_client::datadogV1::model::SyntheticsAssertionType; use datadog_api_client::datadogV1::model::SyntheticsTestDetailsSubType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsMonitorOptions; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use datadog_api_client::datadogV1::model::SyntheticsTestRequestPort; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = SyntheticsAPITest::new( SyntheticsAPITestConfig::new() .assertions(vec![ SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 1.0 as f64, ), SyntheticsAssertionType::GRPC_HEALTHCHECK_STATUS, ), )), SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "proto target".to_string(), ), SyntheticsAssertionType::GRPC_PROTO, ), )), SyntheticsAssertion::SyntheticsAssertionTarget(Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "123".to_string(), ), SyntheticsAssertionType::GRPC_METADATA, ) .property("property".to_string()), )), ]) .request( SyntheticsTestRequest::new() .host("localhost".to_string()) .message("".to_string()) .metadata(BTreeMap::from([])) .method("GET".to_string()) .port(SyntheticsTestRequestPort::SyntheticsTestRequestNumericalPort(50051)) .service("Hello".to_string()), ), vec!["aws:us-east-2".to_string()], "BDD test payload: synthetics_api_grpc_test_payload.json".to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new() 
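// Scheduling and monitor options; tick_every is the run frequency in seconds.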
.min_failure_duration(0) .min_location_failed(1) .monitor_name("Example-Synthetic".to_string()) .monitor_options(SyntheticsTestOptionsMonitorOptions::new().renotify_interval(0)) .tick_every(60), SyntheticsAPITestType::API, ) .subtype(SyntheticsTestDetailsSubType::GRPC) .tags(vec!["testing:api".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_api_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` // Create an API HTTP test has bodyHash filled out use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsAPITest; use datadog_api_client::datadogV1::model::SyntheticsAPITestConfig; use datadog_api_client::datadogV1::model::SyntheticsAPITestType; use datadog_api_client::datadogV1::model::SyntheticsAssertion; use datadog_api_client::datadogV1::model::SyntheticsAssertionBodyHashOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionBodyHashTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionBodyHashType; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathTargetTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaMetaSchema; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaTargetTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJavascript; use datadog_api_client::datadogV1::model::SyntheticsAssertionJavascriptType; use datadog_api_client::datadogV1::model::SyntheticsAssertionOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionTargetValue; use datadog_api_client::datadogV1::model::SyntheticsAssertionTimingsScope; use datadog_api_client::datadogV1::model::SyntheticsAssertionType; use datadog_api_client::datadogV1::model::SyntheticsAssertionXPathOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionXPathTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionXPathTargetTarget; use datadog_api_client::datadogV1::model::SyntheticsBasicAuth; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthClient; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthClientType; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthOauthTokenApiAuthentication; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsTestDetailsSubType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsHTTPVersion; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificate; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificateItem; use 
datadog_api_client::datadogV1::model::SyntheticsTestRequestProxy; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = SyntheticsAPITest::new( SyntheticsAPITestConfig::new() .assertions( vec![ SyntheticsAssertion::SyntheticsAssertionTarget( Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "text/html".to_string(), ), SyntheticsAssertionType::HEADER, ).property("{{ PROPERTY }}".to_string()), ), ), SyntheticsAssertion::SyntheticsAssertionTarget( Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::LESS_THAN, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 2000.0 as f64, ), SyntheticsAssertionType::RESPONSE_TIME, ).timings_scope(SyntheticsAssertionTimingsScope::WITHOUT_DNS), ), ), SyntheticsAssertion::SyntheticsAssertionJSONPathTarget( Box::new( SyntheticsAssertionJSONPathTarget::new( SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionJSONPathTargetTarget::new() .json_path("topKey".to_string()) .operator("isNot".to_string()) .target_value( SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "0".to_string(), ), ), ), ), ), SyntheticsAssertion::SyntheticsAssertionJSONPathTarget( Box::new( SyntheticsAssertionJSONPathTarget::new( SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionJSONPathTargetTarget::new() .elements_operator("atLeastOneElementMatches".to_string()) .json_path("topKey".to_string()) .operator("isNot".to_string()) .target_value( SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "0".to_string(), ), ), ), ), ), SyntheticsAssertion::SyntheticsAssertionJSONSchemaTarget( Box::new( SyntheticsAssertionJSONSchemaTarget::new( SyntheticsAssertionJSONSchemaOperator::VALIDATES_JSON_SCHEMA, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionJSONSchemaTargetTarget::new() .json_schema( r#"{"type": "object", "properties":{"slideshow":{"type":"object"}}}"#.to_string(), ) .meta_schema(SyntheticsAssertionJSONSchemaMetaSchema::DRAFT_07), ), ), ), SyntheticsAssertion::SyntheticsAssertionXPathTarget( Box::new( SyntheticsAssertionXPathTarget::new( SyntheticsAssertionXPathOperator::VALIDATES_X_PATH, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionXPathTargetTarget::new() .operator("contains".to_string()) .target_value( SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "0".to_string(), ), ) .x_path("target-xpath".to_string()), ), ), ), SyntheticsAssertion::SyntheticsAssertionBodyHashTarget( Box::new( SyntheticsAssertionBodyHashTarget::new( SyntheticsAssertionBodyHashOperator::MD5, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "a".to_string(), ), SyntheticsAssertionBodyHashType::BODY_HASH, ), ), ), SyntheticsAssertion::SyntheticsAssertionJavascript( Box::new( SyntheticsAssertionJavascript::new( "const hello = 'world';".to_string(), SyntheticsAssertionJavascriptType::JAVASCRIPT, ), ), ) ], ) .config_variables( vec![ SyntheticsConfigVariable::new("PROPERTY".to_string(), SyntheticsConfigVariableType::TEXT) .example("content-type".to_string()) .pattern("content-type".to_string()) ], ) .request( SyntheticsTestRequest::new() .basic_auth( SyntheticsBasicAuth::SyntheticsBasicAuthOauthClient( Box::new( SyntheticsBasicAuthOauthClient::new( "https://datadog-token.com".to_string(), "client-id".to_string(), 
"client-secret".to_string(), SyntheticsBasicAuthOauthTokenApiAuthentication::HEADER, SyntheticsBasicAuthOauthClientType::OAUTH_CLIENT, ) .audience("audience".to_string()) .resource("resource".to_string()) .scope("yoyo".to_string()), ), ), ) .certificate( SyntheticsTestRequestCertificate::new() .cert( SyntheticsTestRequestCertificateItem::new() .content("cert-content".to_string()) .filename("cert-filename".to_string()) .updated_at("2020-10-16T09:23:24.857Z".to_string()), ) .key( SyntheticsTestRequestCertificateItem::new() .content("key-content".to_string()) .filename("key-filename".to_string()) .updated_at("2020-10-16T09:23:24.857Z".to_string()), ), ) .headers(BTreeMap::from([("unique".to_string(), "examplesynthetic".to_string())])) .method("GET".to_string()) .persist_cookies(true) .proxy( SyntheticsTestRequestProxy::new( "https://datadoghq.com".to_string(), ).headers(BTreeMap::from([])), ) .timeout(10.0 as f64) .url("https://datadoghq.com".to_string()), ) .variables_from_script(r#"dd.variable.set("FOO", "foo")"#.to_string()), vec!["aws:us-east-2".to_string()], "BDD test payload: synthetics_api_http_test_payload.json".to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new() .accept_self_signed(false) .allow_insecure(true) .follow_redirects(true) .http_version(SyntheticsTestOptionsHTTPVersion::HTTP2) .min_failure_duration(10) .min_location_failed(1) .monitor_name("Example-Synthetic".to_string()) .monitor_priority(5) .retry(SyntheticsTestOptionsRetry::new().count(3).interval(10.0 as f64)) .tick_every(60), SyntheticsAPITestType::API, ) .subtype(SyntheticsTestDetailsSubType::HTTP) .tags(vec!["testing:api".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_api_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." response ``` /** * Create a multi-step api test with every type of basicAuth returns "OK - Returns the created test details." 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsAPITestRequest = { body: { config: { steps: [ { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { password: "password", username: "username", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { password: "password", username: "username", type: "web", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { accessKey: "accessKey", secretKey: "secretKey", type: "sigv4", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { type: "ntlm", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { password: "password", username: "username", type: "digest", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { accessTokenUrl: "accessTokenUrl", tokenApiAuthentication: "header", clientId: "clientId", clientSecret: "clientSecret", type: "oauth-client", }, }, subtype: "http", }, { assertions: [ { operator: "is", type: "statusCode", target: 200, }, ], name: "request is sent", request: { url: "https://httpbin.org/status/200", method: "GET", basicAuth: { accessTokenUrl: "accessTokenUrl", password: "password", tokenApiAuthentication: "header", username: "username", type: "oauth-rop", }, }, subtype: "http", }, ], }, locations: ["aws:us-east-2"], message: "BDD test payload: synthetics_api_test_multi_step_with_every_type_of_basic_auth.json", name: "Example-Synthetic", options: { tickEvery: 60, }, subtype: "multi", type: "api", }, }; apiInstance .createSyntheticsAPITest(params) .then((data: v1.SyntheticsAPITest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an API GRPC test returns "OK - Returns the created test details." response ``` /** * Create an API GRPC test returns "OK - Returns the created test details." 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsAPITestRequest = { body: { config: { assertions: [ { operator: "is", target: 1, type: "grpcHealthcheckStatus", }, { operator: "is", target: "proto target", type: "grpcProto", }, { operator: "is", target: "123", property: "property", type: "grpcMetadata", }, ], request: { host: "localhost", port: 50051, service: "Hello", method: "GET", message: "", metadata: {}, }, }, locations: ["aws:us-east-2"], message: "BDD test payload: synthetics_api_grpc_test_payload.json", name: "Example-Synthetic", options: { minFailureDuration: 0, minLocationFailed: 1, monitorOptions: { renotifyInterval: 0, }, monitorName: "Example-Synthetic", tickEvery: 60, }, subtype: "grpc", tags: ["testing:api"], type: "api", }, }; apiInstance .createSyntheticsAPITest(params) .then((data: v1.SyntheticsAPITest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create an API HTTP test has bodyHash filled out ``` /** * Create an API HTTP test has bodyHash filled out */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsAPITestRequest = { body: { config: { assertions: [ { operator: "is", property: "{{ PROPERTY }}", target: "text/html", type: "header", }, { operator: "lessThan", target: 2000, type: "responseTime", timingsScope: "withoutDNS", }, { operator: "validatesJSONPath", target: { jsonPath: "topKey", operator: "isNot", targetValue: "0", }, type: "body", }, { operator: "validatesJSONPath", target: { elementsOperator: "atLeastOneElementMatches", jsonPath: "topKey", operator: "isNot", targetValue: "0", }, type: "body", }, { operator: "validatesJSONSchema", target: { metaSchema: "draft-07", jsonSchema: `{"type": "object", "properties":{"slideshow":{"type":"object"}}}`, }, type: "body", }, { operator: "validatesXPath", target: { xPath: "target-xpath", targetValue: "0", operator: "contains", }, type: "body", }, { operator: "md5", target: "a", type: "bodyHash", }, { code: "const hello = 'world';", type: "javascript", }, ], configVariables: [ { example: "content-type", name: "PROPERTY", pattern: "content-type", type: "text", }, ], variablesFromScript: `dd.variable.set("FOO", "foo")`, request: { certificate: { cert: { content: "cert-content", filename: "cert-filename", updatedAt: "2020-10-16T09:23:24.857Z", }, key: { content: "key-content", filename: "key-filename", updatedAt: "2020-10-16T09:23:24.857Z", }, }, headers: { unique: "examplesynthetic", }, method: "GET", timeout: 10, url: "https://datadoghq.com", proxy: { url: "https://datadoghq.com", headers: {}, }, basicAuth: { accessTokenUrl: "https://datadog-token.com", audience: "audience", clientId: "client-id", clientSecret: "client-secret", resource: "resource", scope: "yoyo", tokenApiAuthentication: "header", type: "oauth-client", }, persistCookies: true, }, }, locations: ["aws:us-east-2"], message: "BDD test payload: synthetics_api_http_test_payload.json", name: "Example-Synthetic", options: { acceptSelfSigned: false, allowInsecure: true, followRedirects: true, minFailureDuration: 10, minLocationFailed: 1, monitorName: "Example-Synthetic", monitorPriority: 5, retry: { count: 3, interval: 10, }, 
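// retry: up to 3 retries, 10 ms apart (interval is expressed in milliseconds); tickEvery below is in seconds, and httpVersion selects HTTP/2 for the request.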
tickEvery: 60, httpVersion: "http2", }, subtype: "http", tags: ["testing:api"], type: "api", }, }; apiInstance .createSyntheticsAPITest(params) .then((data: v1.SyntheticsAPITest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a browser test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-browser-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#create-a-browser-test-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/browserhttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/browserhttps://api.datadoghq.eu/api/v1/synthetics/tests/browserhttps://api.ddog-gov.com/api/v1/synthetics/tests/browserhttps://api.datadoghq.com/api/v1/synthetics/tests/browserhttps://api.us3.datadoghq.com/api/v1/synthetics/tests/browserhttps://api.us5.datadoghq.com/api/v1/synthetics/tests/browser ### Overview Create a Synthetic browser test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the test to create. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic browser test. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. 
target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. 
password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. 
bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. setCookie string Cookies to be used for the request, using the [Set-Cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie) syntax. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. Message can either be text or an empty string. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). 
escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. 
Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `browser`. Allowed enum values: `browser` default: `browser` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` { "config": { "assertions": [], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "method": "GET", "url": "https://datadoghq.com", "certificateDomains": [ "https://datadoghq.com" ] }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "tablet" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "rumSettings": { "isEnabled": true, "applicationId": "mockApplicationId", "clientTokenId": 12345 }, "tick_every": 300, "ci": { "executionRule": "skipped" }, "ignoreServerCertificateError": true, "disableCsp": true, "initialNavigationTimeout": 200 }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` { "config": { "assertions": [], "variables": [ { "type": "text", "name": "TEST_VARIABLE", "pattern": "secret", "secure": true, "example": "secret" } ], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text", "secure": true } ], "request": { "method": "GET", "url": "https://datadoghq.com" }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "chrome.laptop_large" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "tick_every": 300, "enableProfiling": true, "enableSecurityTesting": true }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "alwaysExecute": true, "exitIfSucceed": true, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." 
response ``` { "config": { "assertions": [], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "method": "GET", "url": "https://datadoghq.com" }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "tablet" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "tick_every": 300, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" }, { "day": 3, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" } }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsBrowserTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsBrowserTest-400-v1) * [402](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsBrowserTest-402-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsBrowserTest-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsBrowserTest-429-v1) OK - Returns the created test details. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic browser test. Field Type Description config [_required_] object Configuration object for a Synthetic browser test. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. 
elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. 
type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. 
Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. setCookie string Cookies to be used for the request, using the [Set-Cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie) syntax. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. Message can either be text or an empty string. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). 
escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. 
Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `browser`. Allowed enum values: `browser` default: `browser` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "setCookie": "string", "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "", "monitor_id": "integer", "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, 
"isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:prod" ], "type": "browser" } ``` Copy - JSON format is wrong - Creation failed * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Test quota is reached * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Create a browser test returns "OK - Returns saved rumSettings." 
response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "assertions": [], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "method": "GET", "url": "https://datadoghq.com", "certificateDomains": [ "https://datadoghq.com" ] }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "tablet" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "rumSettings": { "isEnabled": true, "applicationId": "mockApplicationId", "clientTokenId": 12345 }, "tick_every": 300, "ci": { "executionRule": "skipped" }, "ignoreServerCertificateError": true, "disableCsp": true, "initialNavigationTimeout": 200 }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } EOF ``` ##### Create a browser test returns "OK - Returns the created test details." response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "assertions": [], "variables": [ { "type": "text", "name": "TEST_VARIABLE", "pattern": "secret", "secure": true, "example": "secret" } ], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text", "secure": true } ], "request": { "method": "GET", "url": "https://datadoghq.com" }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "chrome.laptop_large" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "tick_every": 300, "enableProfiling": true, "enableSecurityTesting": true }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "alwaysExecute": true, "exitIfSucceed": true, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } EOF ``` ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." 
response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "assertions": [], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "method": "GET", "url": "https://datadoghq.com" }, "setCookie": "name:test" }, "locations": [ "aws:us-east-2" ], "message": "Test message", "name": "Example-Synthetic", "options": { "accept_self_signed": false, "allow_insecure": true, "device_ids": [ "tablet" ], "disableCors": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "noScreenshot": true, "retry": { "count": 2, "interval": 10 }, "tick_every": 300, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" }, { "day": 3, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" } }, "tags": [ "testing:browser" ], "type": "browser", "steps": [ { "allowFailure": false, "isCritical": true, "name": "Refresh page", "params": {}, "type": "refresh" } ] } EOF ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` // Create a browser test returns "OK - Returns saved rumSettings." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsBrowserTest{ Config: datadogV1.SyntheticsBrowserTestConfig{ Assertions: []datadogV1.SyntheticsAssertion{}, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Example: datadog.PtrString("content-type"), Name: "PROPERTY", Pattern: datadog.PtrString("content-type"), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, }, }, Request: datadogV1.SyntheticsTestRequest{ Method: datadog.PtrString("GET"), Url: datadog.PtrString("https://datadoghq.com"), CertificateDomains: []string{ "https://datadoghq.com", }, }, SetCookie: datadog.PtrString("name:test"), }, Locations: []string{ "aws:us-east-2", }, Message: "Test message", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ AcceptSelfSigned: datadog.PtrBool(false), AllowInsecure: datadog.PtrBool(true), DeviceIds: []string{ "tablet", }, DisableCors: datadog.PtrBool(true), FollowRedirects: datadog.PtrBool(true), MinFailureDuration: datadog.PtrInt64(10), MinLocationFailed: datadog.PtrInt64(1), NoScreenshot: datadog.PtrBool(true), Retry: &datadogV1.SyntheticsTestOptionsRetry{ Count: datadog.PtrInt64(2), Interval: datadog.PtrFloat64(10), }, RumSettings: &datadogV1.SyntheticsBrowserTestRumSettings{ IsEnabled: true, ApplicationId: datadog.PtrString("mockApplicationId"), ClientTokenId: datadog.PtrInt64(12345), }, TickEvery: datadog.PtrInt64(300), Ci: &datadogV1.SyntheticsTestCiOptions{ ExecutionRule: datadogV1.SYNTHETICSTESTEXECUTIONRULE_SKIPPED, }, IgnoreServerCertificateError: datadog.PtrBool(true), DisableCsp: datadog.PtrBool(true), InitialNavigationTimeout: datadog.PtrInt64(200), }, Tags: []string{ "testing:browser", }, Type: datadogV1.SYNTHETICSBROWSERTESTTYPE_BROWSER, Steps: []datadogV1.SyntheticsStep{ { AllowFailure: datadog.PtrBool(false), IsCritical: datadog.PtrBool(true), Name: 
datadog.PtrString("Refresh page"), Params: new(interface{}), Type: datadogV1.SYNTHETICSSTEPTYPE_REFRESH.Ptr(), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsBrowserTest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsBrowserTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsBrowserTest`:\n%s\n", responseContent) } ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` // Create a browser test returns "OK - Returns the created test details." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsBrowserTest{ Config: datadogV1.SyntheticsBrowserTestConfig{ Assertions: []datadogV1.SyntheticsAssertion{}, Variables: []datadogV1.SyntheticsBrowserVariable{ { Type: datadogV1.SYNTHETICSBROWSERVARIABLETYPE_TEXT, Name: "TEST_VARIABLE", Pattern: datadog.PtrString("secret"), Secure: datadog.PtrBool(true), Example: datadog.PtrString("secret"), }, }, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Example: datadog.PtrString("content-type"), Name: "PROPERTY", Pattern: datadog.PtrString("content-type"), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, Secure: datadog.PtrBool(true), }, }, Request: datadogV1.SyntheticsTestRequest{ Method: datadog.PtrString("GET"), Url: datadog.PtrString("https://datadoghq.com"), }, SetCookie: datadog.PtrString("name:test"), }, Locations: []string{ "aws:us-east-2", }, Message: "Test message", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ AcceptSelfSigned: datadog.PtrBool(false), AllowInsecure: datadog.PtrBool(true), DeviceIds: []string{ "chrome.laptop_large", }, DisableCors: datadog.PtrBool(true), FollowRedirects: datadog.PtrBool(true), MinFailureDuration: datadog.PtrInt64(10), MinLocationFailed: datadog.PtrInt64(1), NoScreenshot: datadog.PtrBool(true), Retry: &datadogV1.SyntheticsTestOptionsRetry{ Count: datadog.PtrInt64(2), Interval: datadog.PtrFloat64(10), }, TickEvery: datadog.PtrInt64(300), EnableProfiling: datadog.PtrBool(true), EnableSecurityTesting: datadog.PtrBool(true), }, Tags: []string{ "testing:browser", }, Type: datadogV1.SYNTHETICSBROWSERTESTTYPE_BROWSER, Steps: []datadogV1.SyntheticsStep{ { AllowFailure: datadog.PtrBool(false), AlwaysExecute: datadog.PtrBool(true), ExitIfSucceed: datadog.PtrBool(true), IsCritical: datadog.PtrBool(true), Name: datadog.PtrString("Refresh page"), Params: new(interface{}), Type: datadogV1.SYNTHETICSSTEPTYPE_REFRESH.Ptr(), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsBrowserTest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsBrowserTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsBrowserTest`:\n%s\n", 
responseContent) } ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` // Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsBrowserTest{ Config: datadogV1.SyntheticsBrowserTestConfig{ Assertions: []datadogV1.SyntheticsAssertion{}, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Example: datadog.PtrString("content-type"), Name: "PROPERTY", Pattern: datadog.PtrString("content-type"), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, }, }, Request: datadogV1.SyntheticsTestRequest{ Method: datadog.PtrString("GET"), Url: datadog.PtrString("https://datadoghq.com"), }, SetCookie: datadog.PtrString("name:test"), }, Locations: []string{ "aws:us-east-2", }, Message: "Test message", Name: "Example-Synthetic", Options: datadogV1.SyntheticsTestOptions{ AcceptSelfSigned: datadog.PtrBool(false), AllowInsecure: datadog.PtrBool(true), DeviceIds: []string{ "tablet", }, DisableCors: datadog.PtrBool(true), FollowRedirects: datadog.PtrBool(true), MinFailureDuration: datadog.PtrInt64(10), MinLocationFailed: datadog.PtrInt64(1), NoScreenshot: datadog.PtrBool(true), Retry: &datadogV1.SyntheticsTestOptionsRetry{ Count: datadog.PtrInt64(2), Interval: datadog.PtrFloat64(10), }, TickEvery: datadog.PtrInt64(300), Scheduling: &datadogV1.SyntheticsTestOptionsScheduling{ Timeframes: []datadogV1.SyntheticsTestOptionsSchedulingTimeframe{ { Day: 1, From: "07:00", To: "16:00", }, { Day: 3, From: "07:00", To: "16:00", }, }, Timezone: "America/New_York", }, }, Tags: []string{ "testing:browser", }, Type: datadogV1.SYNTHETICSBROWSERTESTTYPE_BROWSER, Steps: []datadogV1.SyntheticsStep{ { AllowFailure: datadog.PtrBool(false), IsCritical: datadog.PtrBool(true), Name: datadog.PtrString("Refresh page"), Params: new(interface{}), Type: datadogV1.SYNTHETICSSTEPTYPE_REFRESH.Ptr(), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsBrowserTest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsBrowserTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsBrowserTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` // Create a browser test returns "OK - Returns saved rumSettings." 
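// As the Instructions after these examples note, run this example with DD_SITE,
// DD_API_KEY, and DD_APP_KEY set in the environment; no credentials are
// hard-coded in the code below.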
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBrowserTest; import com.datadog.api.client.v1.model.SyntheticsBrowserTestConfig; import com.datadog.api.client.v1.model.SyntheticsBrowserTestRumSettings; import com.datadog.api.client.v1.model.SyntheticsBrowserTestType; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsStep; import com.datadog.api.client.v1.model.SyntheticsStepType; import com.datadog.api.client.v1.model.SyntheticsTestCiOptions; import com.datadog.api.client.v1.model.SyntheticsTestExecutionRule; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsBrowserTest body = new SyntheticsBrowserTest() .config( new SyntheticsBrowserTestConfig() .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .example("content-type") .name("PROPERTY") .pattern("content-type") .type(SyntheticsConfigVariableType.TEXT))) .request( new SyntheticsTestRequest() .method("GET") .url("https://datadoghq.com") .certificateDomains(Collections.singletonList("https://datadoghq.com"))) .setCookie("name:test")) .locations(Collections.singletonList("aws:us-east-2")) .message("Test message") .name("Example-Synthetic") .options( new SyntheticsTestOptions() .acceptSelfSigned(false) .allowInsecure(true) .deviceIds(Collections.singletonList("tablet")) .disableCors(true) .followRedirects(true) .minFailureDuration(10L) .minLocationFailed(1L) .noScreenshot(true) .retry(new SyntheticsTestOptionsRetry().count(2L).interval(10.0)) .rumSettings( new SyntheticsBrowserTestRumSettings() .isEnabled(true) .applicationId("mockApplicationId") .clientTokenId(12345L)) .tickEvery(300L) .ci( new SyntheticsTestCiOptions() .executionRule(SyntheticsTestExecutionRule.SKIPPED)) .ignoreServerCertificateError(true) .disableCsp(true) .initialNavigationTimeout(200L)) .tags(Collections.singletonList("testing:browser")) .type(SyntheticsBrowserTestType.BROWSER) .steps( Collections.singletonList( new SyntheticsStep() .allowFailure(false) .isCritical(true) .name("Refresh page") .params(new Object()) .type(SyntheticsStepType.REFRESH))); try { SyntheticsBrowserTest result = apiInstance.createSyntheticsBrowserTest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsBrowserTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` // Create a browser test returns "OK - Returns the created test details." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBrowserTest; import com.datadog.api.client.v1.model.SyntheticsBrowserTestConfig; import com.datadog.api.client.v1.model.SyntheticsBrowserTestType; import com.datadog.api.client.v1.model.SyntheticsBrowserVariable; import com.datadog.api.client.v1.model.SyntheticsBrowserVariableType; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsStep; import com.datadog.api.client.v1.model.SyntheticsStepType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsBrowserTest body = new SyntheticsBrowserTest() .config( new SyntheticsBrowserTestConfig() .variables( Collections.singletonList( new SyntheticsBrowserVariable() .type(SyntheticsBrowserVariableType.TEXT) .name("TEST_VARIABLE") .pattern("secret") .secure(true) .example("secret"))) .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .example("content-type") .name("PROPERTY") .pattern("content-type") .type(SyntheticsConfigVariableType.TEXT) .secure(true))) .request(new SyntheticsTestRequest().method("GET").url("https://datadoghq.com")) .setCookie("name:test")) .locations(Collections.singletonList("aws:us-east-2")) .message("Test message") .name("Example-Synthetic") .options( new SyntheticsTestOptions() .acceptSelfSigned(false) .allowInsecure(true) .deviceIds(Collections.singletonList("chrome.laptop_large")) .disableCors(true) .followRedirects(true) .minFailureDuration(10L) .minLocationFailed(1L) .noScreenshot(true) .retry(new SyntheticsTestOptionsRetry().count(2L).interval(10.0)) .tickEvery(300L) .enableProfiling(true) .enableSecurityTesting(true)) .tags(Collections.singletonList("testing:browser")) .type(SyntheticsBrowserTestType.BROWSER) .steps( Collections.singletonList( new SyntheticsStep() .allowFailure(false) .alwaysExecute(true) .exitIfSucceed(true) .isCritical(true) .name("Refresh page") .params(new Object()) .type(SyntheticsStepType.REFRESH))); try { SyntheticsBrowserTest result = apiInstance.createSyntheticsBrowserTest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsBrowserTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` // Create a browser test with advanced scheduling options returns "OK - Returns the created test // details." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBrowserTest; import com.datadog.api.client.v1.model.SyntheticsBrowserTestConfig; import com.datadog.api.client.v1.model.SyntheticsBrowserTestType; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsStep; import com.datadog.api.client.v1.model.SyntheticsStepType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestOptionsScheduling; import com.datadog.api.client.v1.model.SyntheticsTestOptionsSchedulingTimeframe; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsBrowserTest body = new SyntheticsBrowserTest() .config( new SyntheticsBrowserTestConfig() .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .example("content-type") .name("PROPERTY") .pattern("content-type") .type(SyntheticsConfigVariableType.TEXT))) .request(new SyntheticsTestRequest().method("GET").url("https://datadoghq.com")) .setCookie("name:test")) .locations(Collections.singletonList("aws:us-east-2")) .message("Test message") .name("Example-Synthetic") .options( new SyntheticsTestOptions() .acceptSelfSigned(false) .allowInsecure(true) .deviceIds(Collections.singletonList("tablet")) .disableCors(true) .followRedirects(true) .minFailureDuration(10L) .minLocationFailed(1L) .noScreenshot(true) .retry(new SyntheticsTestOptionsRetry().count(2L).interval(10.0)) .tickEvery(300L) .scheduling( new SyntheticsTestOptionsScheduling() .timeframes( Arrays.asList( new SyntheticsTestOptionsSchedulingTimeframe() .day(1) .from("07:00") .to("16:00"), new SyntheticsTestOptionsSchedulingTimeframe() .day(3) .from("07:00") .to("16:00"))) .timezone("America/New_York"))) .tags(Collections.singletonList("testing:browser")) .type(SyntheticsBrowserTestType.BROWSER) .steps( Collections.singletonList( new SyntheticsStep() .allowFailure(false) .isCritical(true) .name("Refresh page") .params(new Object()) .type(SyntheticsStepType.REFRESH))); try { SyntheticsBrowserTest result = apiInstance.createSyntheticsBrowserTest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsBrowserTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` """ Create a browser test returns "OK - Returns saved rumSettings." 
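Note: "mockApplicationId" and client_token_id 12345 under rum_settings are placeholder values; replace them with the application ID and client token of an existing RUM application before running this example. The ci option's execution_rule=SKIPPED uses one of the documented CI/CD execution rules (blocking, non_blocking, skipped).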
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_browser_test import SyntheticsBrowserTest from datadog_api_client.v1.model.synthetics_browser_test_config import SyntheticsBrowserTestConfig from datadog_api_client.v1.model.synthetics_browser_test_rum_settings import SyntheticsBrowserTestRumSettings from datadog_api_client.v1.model.synthetics_browser_test_type import SyntheticsBrowserTestType from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType from datadog_api_client.v1.model.synthetics_step import SyntheticsStep from datadog_api_client.v1.model.synthetics_step_type import SyntheticsStepType from datadog_api_client.v1.model.synthetics_test_ci_options import SyntheticsTestCiOptions from datadog_api_client.v1.model.synthetics_test_execution_rule import SyntheticsTestExecutionRule from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest body = SyntheticsBrowserTest( config=SyntheticsBrowserTestConfig( assertions=[], config_variables=[ SyntheticsConfigVariable( example="content-type", name="PROPERTY", pattern="content-type", type=SyntheticsConfigVariableType.TEXT, ), ], request=SyntheticsTestRequest( method="GET", url="https://datadoghq.com", certificate_domains=[ "https://datadoghq.com", ], ), set_cookie="name:test", ), locations=[ "aws:us-east-2", ], message="Test message", name="Example-Synthetic", options=SyntheticsTestOptions( accept_self_signed=False, allow_insecure=True, device_ids=[ "tablet", ], disable_cors=True, follow_redirects=True, min_failure_duration=10, min_location_failed=1, no_screenshot=True, retry=SyntheticsTestOptionsRetry( count=2, interval=10.0, ), rum_settings=SyntheticsBrowserTestRumSettings( is_enabled=True, application_id="mockApplicationId", client_token_id=12345, ), tick_every=300, ci=SyntheticsTestCiOptions( execution_rule=SyntheticsTestExecutionRule.SKIPPED, ), ignore_server_certificate_error=True, disable_csp=True, initial_navigation_timeout=200, ), tags=[ "testing:browser", ], type=SyntheticsBrowserTestType.BROWSER, steps=[ SyntheticsStep( allow_failure=False, is_critical=True, name="Refresh page", params=dict(), type=SyntheticsStepType.REFRESH, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_browser_test(body=body) print(response) ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` """ Create a browser test returns "OK - Returns the created test details." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_browser_test import SyntheticsBrowserTest from datadog_api_client.v1.model.synthetics_browser_test_config import SyntheticsBrowserTestConfig from datadog_api_client.v1.model.synthetics_browser_test_type import SyntheticsBrowserTestType from datadog_api_client.v1.model.synthetics_browser_variable import SyntheticsBrowserVariable from datadog_api_client.v1.model.synthetics_browser_variable_type import SyntheticsBrowserVariableType from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType from datadog_api_client.v1.model.synthetics_step import SyntheticsStep from datadog_api_client.v1.model.synthetics_step_type import SyntheticsStepType from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest body = SyntheticsBrowserTest( config=SyntheticsBrowserTestConfig( assertions=[], variables=[ SyntheticsBrowserVariable( type=SyntheticsBrowserVariableType.TEXT, name="TEST_VARIABLE", pattern="secret", secure=True, example="secret", ), ], config_variables=[ SyntheticsConfigVariable( example="content-type", name="PROPERTY", pattern="content-type", type=SyntheticsConfigVariableType.TEXT, secure=True, ), ], request=SyntheticsTestRequest( method="GET", url="https://datadoghq.com", ), set_cookie="name:test", ), locations=[ "aws:us-east-2", ], message="Test message", name="Example-Synthetic", options=SyntheticsTestOptions( accept_self_signed=False, allow_insecure=True, device_ids=[ "chrome.laptop_large", ], disable_cors=True, follow_redirects=True, min_failure_duration=10, min_location_failed=1, no_screenshot=True, retry=SyntheticsTestOptionsRetry( count=2, interval=10.0, ), tick_every=300, enable_profiling=True, enable_security_testing=True, ), tags=[ "testing:browser", ], type=SyntheticsBrowserTestType.BROWSER, steps=[ SyntheticsStep( allow_failure=False, always_execute=True, exit_if_succeed=True, is_critical=True, name="Refresh page", params=dict(), type=SyntheticsStepType.REFRESH, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_browser_test(body=body) print(response) ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` """ Create a browser test with advanced scheduling options returns "OK - Returns the created test details." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_browser_test import SyntheticsBrowserTest from datadog_api_client.v1.model.synthetics_browser_test_config import SyntheticsBrowserTestConfig from datadog_api_client.v1.model.synthetics_browser_test_type import SyntheticsBrowserTestType from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType from datadog_api_client.v1.model.synthetics_step import SyntheticsStep from datadog_api_client.v1.model.synthetics_step_type import SyntheticsStepType from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry from datadog_api_client.v1.model.synthetics_test_options_scheduling import SyntheticsTestOptionsScheduling from datadog_api_client.v1.model.synthetics_test_options_scheduling_timeframe import ( SyntheticsTestOptionsSchedulingTimeframe, ) from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest body = SyntheticsBrowserTest( config=SyntheticsBrowserTestConfig( assertions=[], config_variables=[ SyntheticsConfigVariable( example="content-type", name="PROPERTY", pattern="content-type", type=SyntheticsConfigVariableType.TEXT, ), ], request=SyntheticsTestRequest( method="GET", url="https://datadoghq.com", ), set_cookie="name:test", ), locations=[ "aws:us-east-2", ], message="Test message", name="Example-Synthetic", options=SyntheticsTestOptions( accept_self_signed=False, allow_insecure=True, device_ids=[ "tablet", ], disable_cors=True, follow_redirects=True, min_failure_duration=10, min_location_failed=1, no_screenshot=True, retry=SyntheticsTestOptionsRetry( count=2, interval=10.0, ), tick_every=300, scheduling=SyntheticsTestOptionsScheduling( timeframes=[ SyntheticsTestOptionsSchedulingTimeframe( day=1, _from="07:00", to="16:00", ), SyntheticsTestOptionsSchedulingTimeframe( day=3, _from="07:00", to="16:00", ), ], timezone="America/New_York", ), ), tags=[ "testing:browser", ], type=SyntheticsBrowserTestType.BROWSER, steps=[ SyntheticsStep( allow_failure=False, is_critical=True, name="Refresh page", params=dict(), type=SyntheticsStepType.REFRESH, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_browser_test(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` # Create a browser test returns "OK - Returns saved rumSettings." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsBrowserTest.new({ config: DatadogAPIClient::V1::SyntheticsBrowserTestConfig.new({ assertions: [], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ example: "content-type", name: "PROPERTY", pattern: "content-type", type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ method: "GET", url: "https://datadoghq.com", certificate_domains: [ "https://datadoghq.com", ], }), set_cookie: "name:test", }), locations: [ "aws:us-east-2", ], message: "Test message", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ accept_self_signed: false, allow_insecure: true, device_ids: [ "tablet", ], disable_cors: true, follow_redirects: true, min_failure_duration: 10, min_location_failed: 1, no_screenshot: true, _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({ count: 2, interval: 10, }), rum_settings: DatadogAPIClient::V1::SyntheticsBrowserTestRumSettings.new({ is_enabled: true, application_id: "mockApplicationId", client_token_id: 12345, }), tick_every: 300, ci: DatadogAPIClient::V1::SyntheticsTestCiOptions.new({ execution_rule: DatadogAPIClient::V1::SyntheticsTestExecutionRule::SKIPPED, }), ignore_server_certificate_error: true, disable_csp: true, initial_navigation_timeout: 200, }), tags: [ "testing:browser", ], type: DatadogAPIClient::V1::SyntheticsBrowserTestType::BROWSER, steps: [ DatadogAPIClient::V1::SyntheticsStep.new({ allow_failure: false, is_critical: true, name: "Refresh page", params: {}, type: DatadogAPIClient::V1::SyntheticsStepType::REFRESH, }), ], }) p api_instance.create_synthetics_browser_test(body) ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` # Create a browser test returns "OK - Returns the created test details." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsBrowserTest.new({ config: DatadogAPIClient::V1::SyntheticsBrowserTestConfig.new({ assertions: [], variables: [ DatadogAPIClient::V1::SyntheticsBrowserVariable.new({ type: DatadogAPIClient::V1::SyntheticsBrowserVariableType::TEXT, name: "TEST_VARIABLE", pattern: "secret", secure: true, example: "secret", }), ], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ example: "content-type", name: "PROPERTY", pattern: "content-type", type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, secure: true, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ method: "GET", url: "https://datadoghq.com", }), set_cookie: "name:test", }), locations: [ "aws:us-east-2", ], message: "Test message", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ accept_self_signed: false, allow_insecure: true, device_ids: [ "chrome.laptop_large", ], disable_cors: true, follow_redirects: true, min_failure_duration: 10, min_location_failed: 1, no_screenshot: true, _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({ count: 2, interval: 10, }), tick_every: 300, enable_profiling: true, enable_security_testing: true, }), tags: [ "testing:browser", ], type: DatadogAPIClient::V1::SyntheticsBrowserTestType::BROWSER, steps: [ DatadogAPIClient::V1::SyntheticsStep.new({ allow_failure: false, always_execute: true, exit_if_succeed: true, is_critical: true, name: "Refresh page", params: {}, type: DatadogAPIClient::V1::SyntheticsStepType::REFRESH, }), ], }) p api_instance.create_synthetics_browser_test(body) ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` # Create a browser test with advanced scheduling options returns "OK - Returns the created test details." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsBrowserTest.new({ config: DatadogAPIClient::V1::SyntheticsBrowserTestConfig.new({ assertions: [], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ example: "content-type", name: "PROPERTY", pattern: "content-type", type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ method: "GET", url: "https://datadoghq.com", }), set_cookie: "name:test", }), locations: [ "aws:us-east-2", ], message: "Test message", name: "Example-Synthetic", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ accept_self_signed: false, allow_insecure: true, device_ids: [ "tablet", ], disable_cors: true, follow_redirects: true, min_failure_duration: 10, min_location_failed: 1, no_screenshot: true, _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({ count: 2, interval: 10, }), tick_every: 300, scheduling: DatadogAPIClient::V1::SyntheticsTestOptionsScheduling.new({ timeframes: [ DatadogAPIClient::V1::SyntheticsTestOptionsSchedulingTimeframe.new({ day: 1, from: "07:00", to: "16:00", }), DatadogAPIClient::V1::SyntheticsTestOptionsSchedulingTimeframe.new({ day: 3, from: "07:00", to: "16:00", }), ], timezone: "America/New_York", }), }), tags: [ "testing:browser", ], type: DatadogAPIClient::V1::SyntheticsBrowserTestType::BROWSER, steps: [ DatadogAPIClient::V1::SyntheticsStep.new({ allow_failure: false, is_critical: true, name: "Refresh page", params: {}, type: DatadogAPIClient::V1::SyntheticsStepType::REFRESH, }), ], }) p api_instance.create_synthetics_browser_test(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` // Create a browser test returns "OK - Returns saved rumSettings." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsBrowserTest; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestConfig; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestRumSettings; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestType; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsStep; use datadog_api_client::datadogV1::model::SyntheticsStepType; use datadog_api_client::datadogV1::model::SyntheticsTestCiOptions; use datadog_api_client::datadogV1::model::SyntheticsTestExecutionRule; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = SyntheticsBrowserTest::new( SyntheticsBrowserTestConfig::new( vec![], SyntheticsTestRequest::new() .certificate_domains(vec!["https://datadoghq.com".to_string()]) .method("GET".to_string()) .url("https://datadoghq.com".to_string()), ) .config_variables(vec![SyntheticsConfigVariable::new( "PROPERTY".to_string(), SyntheticsConfigVariableType::TEXT, ) .example("content-type".to_string()) .pattern("content-type".to_string())]) .set_cookie("name:test".to_string()), vec!["aws:us-east-2".to_string()], "Test message".to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new() .accept_self_signed(false) .allow_insecure(true) .ci(SyntheticsTestCiOptions::new( SyntheticsTestExecutionRule::SKIPPED, )) .device_ids(vec!["tablet".to_string()]) .disable_cors(true) .disable_csp(true) .follow_redirects(true) .ignore_server_certificate_error(true) .initial_navigation_timeout(200) .min_failure_duration(10) .min_location_failed(1) .no_screenshot(true) .retry( SyntheticsTestOptionsRetry::new() .count(2) .interval(10.0 as f64), ) .rum_settings( SyntheticsBrowserTestRumSettings::new(true) .application_id("mockApplicationId".to_string()) .client_token_id(12345), ) .tick_every(300), SyntheticsBrowserTestType::BROWSER, ) .steps(vec![SyntheticsStep::new() .allow_failure(false) .is_critical(true) .name("Refresh page".to_string()) .params(BTreeMap::new()) .type_(SyntheticsStepType::REFRESH)]) .tags(vec!["testing:browser".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_browser_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` // Create a browser test returns "OK - Returns the created test details." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsBrowserTest; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestConfig; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestType; use datadog_api_client::datadogV1::model::SyntheticsBrowserVariable; use datadog_api_client::datadogV1::model::SyntheticsBrowserVariableType; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsStep; use datadog_api_client::datadogV1::model::SyntheticsStepType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = SyntheticsBrowserTest::new( SyntheticsBrowserTestConfig::new( vec![], SyntheticsTestRequest::new() .method("GET".to_string()) .url("https://datadoghq.com".to_string()), ) .config_variables(vec![SyntheticsConfigVariable::new( "PROPERTY".to_string(), SyntheticsConfigVariableType::TEXT, ) .example("content-type".to_string()) .pattern("content-type".to_string()) .secure(true)]) .set_cookie("name:test".to_string()) .variables(vec![SyntheticsBrowserVariable::new( "TEST_VARIABLE".to_string(), SyntheticsBrowserVariableType::TEXT, ) .example("secret".to_string()) .pattern("secret".to_string()) .secure(true)]), vec!["aws:us-east-2".to_string()], "Test message".to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new() .accept_self_signed(false) .allow_insecure(true) .device_ids(vec!["chrome.laptop_large".to_string()]) .disable_cors(true) .enable_profiling(true) .enable_security_testing(true) .follow_redirects(true) .min_failure_duration(10) .min_location_failed(1) .no_screenshot(true) .retry( SyntheticsTestOptionsRetry::new() .count(2) .interval(10.0 as f64), ) .tick_every(300), SyntheticsBrowserTestType::BROWSER, ) .steps(vec![SyntheticsStep::new() .allow_failure(false) .always_execute(true) .exit_if_succeed(true) .is_critical(true) .name("Refresh page".to_string()) .params(BTreeMap::new()) .type_(SyntheticsStepType::REFRESH)]) .tags(vec!["testing:browser".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_browser_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` // Create a browser test with advanced scheduling options returns "OK - Returns // the created test details." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsBrowserTest; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestConfig; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestType; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsStep; use datadog_api_client::datadogV1::model::SyntheticsStepType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsScheduling; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsSchedulingTimeframe; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = SyntheticsBrowserTest::new( SyntheticsBrowserTestConfig::new( vec![], SyntheticsTestRequest::new() .method("GET".to_string()) .url("https://datadoghq.com".to_string()), ) .config_variables(vec![SyntheticsConfigVariable::new( "PROPERTY".to_string(), SyntheticsConfigVariableType::TEXT, ) .example("content-type".to_string()) .pattern("content-type".to_string())]) .set_cookie("name:test".to_string()), vec!["aws:us-east-2".to_string()], "Test message".to_string(), "Example-Synthetic".to_string(), SyntheticsTestOptions::new() .accept_self_signed(false) .allow_insecure(true) .device_ids(vec!["tablet".to_string()]) .disable_cors(true) .follow_redirects(true) .min_failure_duration(10) .min_location_failed(1) .no_screenshot(true) .retry( SyntheticsTestOptionsRetry::new() .count(2) .interval(10.0 as f64), ) .scheduling(SyntheticsTestOptionsScheduling::new( vec![ SyntheticsTestOptionsSchedulingTimeframe::new( 1, "07:00".to_string(), "16:00".to_string(), ), SyntheticsTestOptionsSchedulingTimeframe::new( 3, "07:00".to_string(), "16:00".to_string(), ), ], "America/New_York".to_string(), )) .tick_every(300), SyntheticsBrowserTestType::BROWSER, ) .steps(vec![SyntheticsStep::new() .allow_failure(false) .is_critical(true) .name("Refresh page".to_string()) .params(BTreeMap::new()) .type_(SyntheticsStepType::REFRESH)]) .tags(vec!["testing:browser".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_browser_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a browser test returns "OK - Returns saved rumSettings." response ``` /** * Create a browser test returns "OK - Returns saved rumSettings." 
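 * Note: the TypeScript client takes enum-like values as plain string literals,
 * for example type: "browser", type: "refresh", and executionRule: "skipped"
 * in the request body below.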
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsBrowserTestRequest = { body: { config: { assertions: [], configVariables: [ { example: "content-type", name: "PROPERTY", pattern: "content-type", type: "text", }, ], request: { method: "GET", url: "https://datadoghq.com", certificateDomains: ["https://datadoghq.com"], }, setCookie: "name:test", }, locations: ["aws:us-east-2"], message: "Test message", name: "Example-Synthetic", options: { acceptSelfSigned: false, allowInsecure: true, deviceIds: ["tablet"], disableCors: true, followRedirects: true, minFailureDuration: 10, minLocationFailed: 1, noScreenshot: true, retry: { count: 2, interval: 10, }, rumSettings: { isEnabled: true, applicationId: "mockApplicationId", clientTokenId: 12345, }, tickEvery: 300, ci: { executionRule: "skipped", }, ignoreServerCertificateError: true, disableCsp: true, initialNavigationTimeout: 200, }, tags: ["testing:browser"], type: "browser", steps: [ { allowFailure: false, isCritical: true, name: "Refresh page", params: {}, type: "refresh", }, ], }, }; apiInstance .createSyntheticsBrowserTest(params) .then((data: v1.SyntheticsBrowserTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a browser test returns "OK - Returns the created test details." response ``` /** * Create a browser test returns "OK - Returns the created test details." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsBrowserTestRequest = { body: { config: { assertions: [], variables: [ { type: "text", name: "TEST_VARIABLE", pattern: "secret", secure: true, example: "secret", }, ], configVariables: [ { example: "content-type", name: "PROPERTY", pattern: "content-type", type: "text", secure: true, }, ], request: { method: "GET", url: "https://datadoghq.com", }, setCookie: "name:test", }, locations: ["aws:us-east-2"], message: "Test message", name: "Example-Synthetic", options: { acceptSelfSigned: false, allowInsecure: true, deviceIds: ["chrome.laptop_large"], disableCors: true, followRedirects: true, minFailureDuration: 10, minLocationFailed: 1, noScreenshot: true, retry: { count: 2, interval: 10, }, tickEvery: 300, enableProfiling: true, enableSecurityTesting: true, }, tags: ["testing:browser"], type: "browser", steps: [ { allowFailure: false, alwaysExecute: true, exitIfSucceed: true, isCritical: true, name: "Refresh page", params: {}, type: "refresh", }, ], }, }; apiInstance .createSyntheticsBrowserTest(params) .then((data: v1.SyntheticsBrowserTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a browser test with advanced scheduling options returns "OK - Returns the created test details." response ``` /** * Create a browser test with advanced scheduling options returns "OK - Returns the created test details." 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsBrowserTestRequest = { body: { config: { assertions: [], configVariables: [ { example: "content-type", name: "PROPERTY", pattern: "content-type", type: "text", }, ], request: { method: "GET", url: "https://datadoghq.com", }, setCookie: "name:test", }, locations: ["aws:us-east-2"], message: "Test message", name: "Example-Synthetic", options: { acceptSelfSigned: false, allowInsecure: true, deviceIds: ["tablet"], disableCors: true, followRedirects: true, minFailureDuration: 10, minLocationFailed: 1, noScreenshot: true, retry: { count: 2, interval: 10, }, tickEvery: 300, scheduling: { timeframes: [ { day: 1, from: "07:00", to: "16:00", }, { day: 3, from: "07:00", to: "16:00", }, ], timezone: "America/New_York", }, }, tags: ["testing:browser"], type: "browser", steps: [ { allowFailure: false, isCritical: true, name: "Refresh page", params: {}, type: "refresh", }, ], }, }; apiInstance .createSyntheticsBrowserTest(params) .then((data: v1.SyntheticsBrowserTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a mobile test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-mobile-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#create-a-mobile-test-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/mobilehttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/mobilehttps://api.datadoghq.eu/api/v1/synthetics/tests/mobilehttps://api.ddog-gov.com/api/v1/synthetics/tests/mobilehttps://api.datadoghq.com/api/v1/synthetics/tests/mobilehttps://api.us3.datadoghq.com/api/v1/synthetics/tests/mobilehttps://api.us5.datadoghq.com/api/v1/synthetics/tests/mobile ### Overview Create a Synthetic mobile test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the test to create. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic mobile test. initialApplicationArguments object Initial application arguments for a mobile test. string A single application argument. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` device_ids [string] Array with the different device IDs used to run the test. 
message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. allowApplicationCrash boolean A boolean to set if an application crash would mark the test as failed. bindings [object] Array of bindings used for the mobile test. principals [string] List of principals for a mobile test binding. relation enum The type of relation for the binding. Allowed enum values: `editor,viewer` ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` defaultStepTimeout int32 The default timeout for steps in the test (in seconds). device_ids [_required_] [string] For mobile test, array with the different device IDs used to run the test. disableAutoAcceptAlert boolean A boolean to disable auto accepting alerts. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. mobileApplication [_required_] object Mobile application for mobile synthetics test. applicationId [_required_] string Application ID of the mobile application. referenceId [_required_] string Reference ID of the mobile application. referenceType [_required_] enum Reference type for the mobile application for a mobile synthetics test. Allowed enum values: `latest,version` monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean A boolean set to not take a screenshot for the step. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every [_required_] int64 The frequency at which to run the Synthetic test (in seconds). verbosity int32 The level of verbosity for the mobile test. This field can not be set by a user. public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. 
Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. hasNewStepElement boolean A boolean set to determine if the step has a new step element. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name [_required_] string The name of the step. noScreenshot boolean A boolean set to not take a screenshot for the step. params [_required_] object The parameters of a mobile step. check enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` delay int64 Number of milliseconds to wait between inputs in a `typeText` step type. direction enum The direction of the scroll for a `scrollToElement` step type. Allowed enum values: `up,down,left,right` element object Information about the element used for a step. context string Context of the element. contextType enum Type of the context that the element is in. Allowed enum values: `native,web` elementDescription string Description of the element. multiLocator object Multi-locator to find the element. relativePosition object Position of the action relative to the element. x double The `relativePosition` on the `x` axis for the element. y double The `relativePosition` on the `y` axis for the element. textContent string Text content of the element. userLocator object User locator to find the element. failTestOnCannotLocate boolean Whether if the test should fail if the element cannot be found. values [object] Values of the user locator. type enum Type of a user locator. Allowed enum values: `accessibility-id,id,ios-predicate-string,ios-class-chain,xpath` value string Value of a user locator. viewName string Name of the view of the element. enabled boolean Boolean to change the state of the wifi for a `toggleWiFi` step type. maxScrolls int64 Maximum number of scrolls to do for a `scrollToElement` step type. positions [object] List of positions for the `flick` step type. The maximum is 10 flicks per step x double The `x` position for the flick. y double The `y` position for the flick. subtestPublicId string Public ID of the test to be played as part of a `playSubTest` step type. value Values used in the step for in multiple step types. Option 1 string Value used in the step for in multiple step types. Option 2 int64 Value used in the step for in multiple step types. variable object Variable object for `extractVariable` step type. example [_required_] string An example for the variable. name [_required_] string The variable name. withEnter boolean Boolean to indicate if `Enter` should be pressed at the end of the `typeText` step type. x double Amount to scroll by on the `x` axis for a `scroll` step type. y double Amount to scroll by on the `y` axis for a `scroll` step type. publicId string The public ID of the step. timeout int64 The time before declaring a step failed. type [_required_] enum Step type used in your mobile Synthetic test. Allowed enum values: `assertElementContent,assertScreenContains,assertScreenLacks,doubleTap,extractVariable,flick,openDeeplink,playSubTest,pressBack,restartApplication,rotate,scroll,scrollToElement,tap,toggleWiFi,typeText,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `mobile`. 
Allowed enum values: `mobile` default: `mobile` ``` { "name": "Example-Synthetic", "status": "paused", "type": "mobile", "config": { "variables": [] }, "message": "", "options": { "device_ids": [ "synthetics:mobile:device:iphone_15_ios_17" ], "mobileApplication": { "applicationId": "ab0e0aed-536d-411a-9a99-5428c27d8f8e", "referenceId": "6115922a-5f5d-455e-bc7e-7955a57f3815", "referenceType": "version" }, "tick_every": 3600 }, "steps": [] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsMobileTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsMobileTest-400-v1) * [402](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsMobileTest-402-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsMobileTest-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreateSyntheticsMobileTest-429-v1) OK - Returns the created test details. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic mobile test. Field Type Description config [_required_] object Configuration object for a Synthetic mobile test. initialApplicationArguments object Initial application arguments for a mobile test. string A single application argument. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` device_ids [string] Array with the different device IDs used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. allowApplicationCrash boolean A boolean to set if an application crash would mark the test as failed. bindings [object] Array of bindings used for the mobile test. principals [string] List of principals for a mobile test binding. relation enum The type of relation for the binding. Allowed enum values: `editor,viewer` ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` defaultStepTimeout int32 The default timeout for steps in the test (in seconds). device_ids [_required_] [string] For mobile test, array with the different device IDs used to run the test. disableAutoAcceptAlert boolean A boolean to disable auto accepting alerts. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. mobileApplication [_required_] object Mobile application for mobile synthetics test. applicationId [_required_] string Application ID of the mobile application. referenceId [_required_] string Reference ID of the mobile application. referenceType [_required_] enum Reference type for the mobile application for a mobile synthetics test. Allowed enum values: `latest,version` monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. 
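The request example above sets only the required `options` fields (`device_ids`, `mobileApplication`, and `tick_every`). As a hedged sketch of the optional scheduling, retry, and monitor fields documented for the request body, the snippet below assembles a fuller `options` object; every concrete value is illustrative, not a recommended setting.

```python
import json

# Sketch of a fuller "options" object for the mobile test request body.
# All literal values are illustrative placeholders.
options = {
    "device_ids": ["synthetics:mobile:device:iphone_15_ios_17"],
    "mobileApplication": {
        "applicationId": "ab0e0aed-536d-411a-9a99-5428c27d8f8e",  # from the example above
        "referenceId": "6115922a-5f5d-455e-bc7e-7955a57f3815",
        "referenceType": "version",  # or "latest"
    },
    "tick_every": 3600,  # run once per hour (seconds)
    "monitor_priority": 3,  # 1 (high) to 5 (low)
    "retry": {"count": 1, "interval": 300},  # interval in milliseconds
    "scheduling": {
        "timeframes": [{"day": 1, "from": "07:00", "to": "16:00"}],
        "timezone": "America/New_York",
    },
    "monitor_options": {"renotify_interval": 120},  # minutes
}

# This dict would replace the "options" value in the create request shown above.
print(json.dumps(options, indent=2))
```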
monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean A boolean set to not take a screenshot for the step. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every [_required_] int64 The frequency at which to run the Synthetic test (in seconds). verbosity int32 The level of verbosity for the mobile test. This field can not be set by a user. public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. hasNewStepElement boolean A boolean set to determine if the step has a new step element. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name [_required_] string The name of the step. noScreenshot boolean A boolean set to not take a screenshot for the step. params [_required_] object The parameters of a mobile step. check enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` delay int64 Number of milliseconds to wait between inputs in a `typeText` step type. direction enum The direction of the scroll for a `scrollToElement` step type. Allowed enum values: `up,down,left,right` element object Information about the element used for a step. context string Context of the element. contextType enum Type of the context that the element is in. Allowed enum values: `native,web` elementDescription string Description of the element. multiLocator object Multi-locator to find the element. relativePosition object Position of the action relative to the element. x double The `relativePosition` on the `x` axis for the element. y double The `relativePosition` on the `y` axis for the element. textContent string Text content of the element. userLocator object User locator to find the element. 
failTestOnCannotLocate boolean Whether if the test should fail if the element cannot be found. values [object] Values of the user locator. type enum Type of a user locator. Allowed enum values: `accessibility-id,id,ios-predicate-string,ios-class-chain,xpath` value string Value of a user locator. viewName string Name of the view of the element. enabled boolean Boolean to change the state of the wifi for a `toggleWiFi` step type. maxScrolls int64 Maximum number of scrolls to do for a `scrollToElement` step type. positions [object] List of positions for the `flick` step type. The maximum is 10 flicks per step x double The `x` position for the flick. y double The `y` position for the flick. subtestPublicId string Public ID of the test to be played as part of a `playSubTest` step type. value Values used in the step for in multiple step types. Option 1 string Value used in the step for in multiple step types. Option 2 int64 Value used in the step for in multiple step types. variable object Variable object for `extractVariable` step type. example [_required_] string An example for the variable. name [_required_] string The variable name. withEnter boolean Boolean to indicate if `Enter` should be pressed at the end of the `typeText` step type. x double Amount to scroll by on the `x` axis for a `scroll` step type. y double Amount to scroll by on the `y` axis for a `scroll` step type. publicId string The public ID of the step. timeout int64 The time before declaring a step failed. type [_required_] enum Step type used in your mobile Synthetic test. Allowed enum values: `assertElementContent,assertScreenContains,assertScreenLacks,doubleTap,extractVariable,flick,openDeeplink,playSubTest,pressBack,restartApplication,rotate,scroll,scrollToElement,tap,toggleWiFi,typeText,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `mobile`. 
Allowed enum values: `mobile` default: `mobile` ``` { "config": { "initialApplicationArguments": { "": "string" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "device_ids": [ "chrome.laptop_large" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "allowApplicationCrash": false, "bindings": [ { "principals": [], "relation": "string" } ], "ci": { "executionRule": "blocking" }, "defaultStepTimeout": "integer", "device_ids": [ "synthetics:mobile:device:apple_ipad_10th_gen_2022_ios_16" ], "disableAutoAcceptAlert": false, "min_failure_duration": "integer", "mobileApplication": { "applicationId": "00000000-0000-0000-0000-aaaaaaaaaaaa", "referenceId": "00000000-0000-0000-0000-aaaaaaaaaaab", "referenceType": "latest" }, "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": 300, "verbosity": "integer" }, "public_id": "123-abc-456", "status": "live", "steps": [ { "allowFailure": false, "hasNewStepElement": false, "isCritical": false, "name": "", "noScreenshot": false, "params": { "check": "string", "delay": "integer", "direction": "string", "element": { "context": "string", "contextType": "string", "elementDescription": "string", "multiLocator": {}, "relativePosition": { "x": "number", "y": "number" }, "textContent": "string", "userLocator": { "failTestOnCannotLocate": false, "values": [ { "type": "string", "value": "string" } ] }, "viewName": "string" }, "enabled": false, "maxScrolls": "integer", "positions": [ { "x": "number", "y": "number" } ], "subtestPublicId": "string", "value": { "description": "undefined", "type": "undefined" }, "variable": { "example": "", "name": "VAR_NAME" }, "withEnter": false, "x": "number", "y": "number" }, "publicId": "pub-lic-id0", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:production" ], "type": "mobile" } ``` Copy - JSON format is wrong - Creation failed * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Test quota is reached * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Create a mobile test returns "OK - Returns the created test details." response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/mobile" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Synthetic", "status": "paused", "type": "mobile", "config": { "variables": [] }, "message": "", "options": { "device_ids": [ "synthetics:mobile:device:iphone_15_ios_17" ], "mobileApplication": { "applicationId": "ab0e0aed-536d-411a-9a99-5428c27d8f8e", "referenceId": "6115922a-5f5d-455e-bc7e-7955a57f3815", "referenceType": "version" }, "tick_every": 3600 }, "steps": [] } EOF ``` ##### Create a mobile test returns "OK - Returns the created test details." response ``` // Create a mobile test returns "OK - Returns the created test details." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsMobileTest{ Name: "Example-Synthetic", Status: datadogV1.SYNTHETICSTESTPAUSESTATUS_PAUSED.Ptr(), Type: datadogV1.SYNTHETICSMOBILETESTTYPE_MOBILE, Config: datadogV1.SyntheticsMobileTestConfig{ Variables: []datadogV1.SyntheticsConfigVariable{}, }, Message: "", Options: datadogV1.SyntheticsMobileTestOptions{ DeviceIds: []string{ "synthetics:mobile:device:iphone_15_ios_17", }, MobileApplication: datadogV1.SyntheticsMobileTestsMobileApplication{ ApplicationId: "ab0e0aed-536d-411a-9a99-5428c27d8f8e", ReferenceId: "6115922a-5f5d-455e-bc7e-7955a57f3815", ReferenceType: datadogV1.SYNTHETICSMOBILETESTSMOBILEAPPLICATIONREFERENCETYPE_VERSION, }, TickEvery: 3600, }, Steps: []datadogV1.SyntheticsMobileStep{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateSyntheticsMobileTest(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateSyntheticsMobileTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateSyntheticsMobileTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
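# Set DD_SITE to the single Datadog site you use, for example DD_SITE="datadoghq.com";
# DD_API_KEY and DD_APP_KEY must hold a valid API key and application key.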
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a mobile test returns "OK - Returns the created test details." response ``` // Create a mobile test returns "OK - Returns the created test details." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsMobileTest; import com.datadog.api.client.v1.model.SyntheticsMobileTestConfig; import com.datadog.api.client.v1.model.SyntheticsMobileTestOptions; import com.datadog.api.client.v1.model.SyntheticsMobileTestType; import com.datadog.api.client.v1.model.SyntheticsMobileTestsMobileApplication; import com.datadog.api.client.v1.model.SyntheticsMobileTestsMobileApplicationReferenceType; import com.datadog.api.client.v1.model.SyntheticsTestPauseStatus; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsMobileTest body = new SyntheticsMobileTest() .name("Example-Synthetic") .status(SyntheticsTestPauseStatus.PAUSED) .type(SyntheticsMobileTestType.MOBILE) .config(new SyntheticsMobileTestConfig()) .message("") .options( new SyntheticsMobileTestOptions() .deviceIds( Collections.singletonList("synthetics:mobile:device:iphone_15_ios_17")) .mobileApplication( new SyntheticsMobileTestsMobileApplication() .applicationId("ab0e0aed-536d-411a-9a99-5428c27d8f8e") .referenceId("6115922a-5f5d-455e-bc7e-7955a57f3815") .referenceType( SyntheticsMobileTestsMobileApplicationReferenceType.VERSION)) .tickEvery(3600L)); try { SyntheticsMobileTest result = apiInstance.createSyntheticsMobileTest(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createSyntheticsMobileTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a mobile test returns "OK - Returns the created test details." response ``` """ Create a mobile test returns "OK - Returns the created test details." 
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_mobile_test import SyntheticsMobileTest from datadog_api_client.v1.model.synthetics_mobile_test_config import SyntheticsMobileTestConfig from datadog_api_client.v1.model.synthetics_mobile_test_options import SyntheticsMobileTestOptions from datadog_api_client.v1.model.synthetics_mobile_test_type import SyntheticsMobileTestType from datadog_api_client.v1.model.synthetics_mobile_tests_mobile_application import ( SyntheticsMobileTestsMobileApplication, ) from datadog_api_client.v1.model.synthetics_mobile_tests_mobile_application_reference_type import ( SyntheticsMobileTestsMobileApplicationReferenceType, ) from datadog_api_client.v1.model.synthetics_test_pause_status import SyntheticsTestPauseStatus body = SyntheticsMobileTest( name="Example-Synthetic", status=SyntheticsTestPauseStatus.PAUSED, type=SyntheticsMobileTestType.MOBILE, config=SyntheticsMobileTestConfig( variables=[], ), message="", options=SyntheticsMobileTestOptions( device_ids=[ "synthetics:mobile:device:iphone_15_ios_17", ], mobile_application=SyntheticsMobileTestsMobileApplication( application_id="ab0e0aed-536d-411a-9a99-5428c27d8f8e", reference_id="6115922a-5f5d-455e-bc7e-7955a57f3815", reference_type=SyntheticsMobileTestsMobileApplicationReferenceType.VERSION, ), tick_every=3600, ), steps=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_synthetics_mobile_test(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a mobile test returns "OK - Returns the created test details." response ``` # Create a mobile test returns "OK - Returns the created test details." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsMobileTest.new({ name: "Example-Synthetic", status: DatadogAPIClient::V1::SyntheticsTestPauseStatus::PAUSED, type: DatadogAPIClient::V1::SyntheticsMobileTestType::MOBILE, config: DatadogAPIClient::V1::SyntheticsMobileTestConfig.new({ variables: [], }), message: "", options: DatadogAPIClient::V1::SyntheticsMobileTestOptions.new({ device_ids: [ "synthetics:mobile:device:iphone_15_ios_17", ], mobile_application: DatadogAPIClient::V1::SyntheticsMobileTestsMobileApplication.new({ application_id: "ab0e0aed-536d-411a-9a99-5428c27d8f8e", reference_id: "6115922a-5f5d-455e-bc7e-7955a57f3815", reference_type: DatadogAPIClient::V1::SyntheticsMobileTestsMobileApplicationReferenceType::VERSION, }), tick_every: 3600, }), steps: [], }) p api_instance.create_synthetics_mobile_test(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a mobile test returns "OK - Returns the created test details." 
response ``` // Create a mobile test returns "OK - Returns the created test details." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsMobileTest; use datadog_api_client::datadogV1::model::SyntheticsMobileTestConfig; use datadog_api_client::datadogV1::model::SyntheticsMobileTestOptions; use datadog_api_client::datadogV1::model::SyntheticsMobileTestType; use datadog_api_client::datadogV1::model::SyntheticsMobileTestsMobileApplication; use datadog_api_client::datadogV1::model::SyntheticsMobileTestsMobileApplicationReferenceType; use datadog_api_client::datadogV1::model::SyntheticsTestPauseStatus; #[tokio::main] async fn main() { let body = SyntheticsMobileTest::new( SyntheticsMobileTestConfig::new().variables(vec![]), "".to_string(), "Example-Synthetic".to_string(), SyntheticsMobileTestOptions::new( vec!["synthetics:mobile:device:iphone_15_ios_17".to_string()], SyntheticsMobileTestsMobileApplication::new( "ab0e0aed-536d-411a-9a99-5428c27d8f8e".to_string(), "6115922a-5f5d-455e-bc7e-7955a57f3815".to_string(), SyntheticsMobileTestsMobileApplicationReferenceType::VERSION, ), 3600, ), SyntheticsMobileTestType::MOBILE, ) .status(SyntheticsTestPauseStatus::PAUSED) .steps(vec![]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_synthetics_mobile_test(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a mobile test returns "OK - Returns the created test details." response ``` /** * Create a mobile test returns "OK - Returns the created test details." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateSyntheticsMobileTestRequest = { body: { name: "Example-Synthetic", status: "paused", type: "mobile", config: { variables: [], }, message: "", options: { deviceIds: ["synthetics:mobile:device:iphone_15_ios_17"], mobileApplication: { applicationId: "ab0e0aed-536d-411a-9a99-5428c27d8f8e", referenceId: "6115922a-5f5d-455e-bc7e-7955a57f3815", referenceType: "version", }, tickEvery: 3600, }, steps: [], }, }; apiInstance .createSyntheticsMobileTest(params) .then((data: v1.SyntheticsMobileTest) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit a Mobile test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-mobile-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-mobile-test-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/mobile/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/mobile/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id} ### Overview Edit the configuration of a Synthetic Mobile test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Request #### Body Data (required) New test details to be saved. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic mobile test. initialApplicationArguments object Initial application arguments for a mobile test. string A single application argument. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` device_ids [string] Array with the different device IDs used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. allowApplicationCrash boolean A boolean to set if an application crash would mark the test as failed. bindings [object] Array of bindings used for the mobile test. principals [string] List of principals for a mobile test binding. relation enum The type of relation for the binding. Allowed enum values: `editor,viewer` ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` defaultStepTimeout int32 The default timeout for steps in the test (in seconds). device_ids [_required_] [string] For mobile test, array with the different device IDs used to run the test. disableAutoAcceptAlert boolean A boolean to disable auto accepting alerts. 
min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. mobileApplication [_required_] object Mobile application for mobile synthetics test. applicationId [_required_] string Application ID of the mobile application. referenceId [_required_] string Reference ID of the mobile application. referenceType [_required_] enum Reference type for the mobile application for a mobile synthetics test. Allowed enum values: `latest,version` monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean A boolean set to not take a screenshot for the step. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every [_required_] int64 The frequency at which to run the Synthetic test (in seconds). verbosity int32 The level of verbosity for the mobile test. This field can not be set by a user. public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. hasNewStepElement boolean A boolean set to determine if the step has a new step element. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name [_required_] string The name of the step. noScreenshot boolean A boolean set to not take a screenshot for the step. params [_required_] object The parameters of a mobile step. check enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` delay int64 Number of milliseconds to wait between inputs in a `typeText` step type. direction enum The direction of the scroll for a `scrollToElement` step type. Allowed enum values: `up,down,left,right` element object Information about the element used for a step. 
context string Context of the element. contextType enum Type of the context that the element is in. Allowed enum values: `native,web` elementDescription string Description of the element. multiLocator object Multi-locator to find the element. relativePosition object Position of the action relative to the element. x double The `relativePosition` on the `x` axis for the element. y double The `relativePosition` on the `y` axis for the element. textContent string Text content of the element. userLocator object User locator to find the element. failTestOnCannotLocate boolean Whether if the test should fail if the element cannot be found. values [object] Values of the user locator. type enum Type of a user locator. Allowed enum values: `accessibility-id,id,ios-predicate-string,ios-class-chain,xpath` value string Value of a user locator. viewName string Name of the view of the element. enabled boolean Boolean to change the state of the wifi for a `toggleWiFi` step type. maxScrolls int64 Maximum number of scrolls to do for a `scrollToElement` step type. positions [object] List of positions for the `flick` step type. The maximum is 10 flicks per step x double The `x` position for the flick. y double The `y` position for the flick. subtestPublicId string Public ID of the test to be played as part of a `playSubTest` step type. value Values used in the step for in multiple step types. Option 1 string Value used in the step for in multiple step types. Option 2 int64 Value used in the step for in multiple step types. variable object Variable object for `extractVariable` step type. example [_required_] string An example for the variable. name [_required_] string The variable name. withEnter boolean Boolean to indicate if `Enter` should be pressed at the end of the `typeText` step type. x double Amount to scroll by on the `x` axis for a `scroll` step type. y double Amount to scroll by on the `y` axis for a `scroll` step type. publicId string The public ID of the step. timeout int64 The time before declaring a step failed. type [_required_] enum Step type used in your mobile Synthetic test. Allowed enum values: `assertElementContent,assertScreenContains,assertScreenLacks,doubleTap,extractVariable,flick,openDeeplink,playSubTest,pressBack,restartApplication,rotate,scroll,scrollToElement,tap,toggleWiFi,typeText,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `mobile`. Allowed enum values: `mobile` default: `mobile` ``` { "name": "Example-Synthetic-updated", "status": "paused", "type": "mobile", "config": { "variables": [] }, "message": "", "options": { "device_ids": [ "synthetics:mobile:device:iphone_15_ios_17" ], "mobileApplication": { "applicationId": "ab0e0aed-536d-411a-9a99-5428c27d8f8e", "referenceId": "6115922a-5f5d-455e-bc7e-7955a57f3815", "referenceType": "version" }, "tick_every": 3600 }, "steps": [] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdateMobileTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#UpdateMobileTest-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#UpdateMobileTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdateMobileTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdateMobileTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic mobile test. 
Field Type Description config [_required_] object Configuration object for a Synthetic mobile test. initialApplicationArguments object Initial application arguments for a mobile test. string A single application argument. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` device_ids [string] Array with the different device IDs used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. allowApplicationCrash boolean A boolean to set if an application crash would mark the test as failed. bindings [object] Array of bindings used for the mobile test. principals [string] List of principals for a mobile test binding. relation enum The type of relation for the binding. Allowed enum values: `editor,viewer` ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` defaultStepTimeout int32 The default timeout for steps in the test (in seconds). device_ids [_required_] [string] For mobile test, array with the different device IDs used to run the test. disableAutoAcceptAlert boolean A boolean to disable auto accepting alerts. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. mobileApplication [_required_] object Mobile application for mobile synthetics test. applicationId [_required_] string Application ID of the mobile application. referenceId [_required_] string Reference ID of the mobile application. referenceType [_required_] enum Reference type for the mobile application for a mobile synthetics test. Allowed enum values: `latest,version` monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean A boolean set to not take a screenshot for the step. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. scheduling object Object containing timeframes and timezone used for advanced scheduling. 
timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every [_required_] int64 The frequency at which to run the Synthetic test (in seconds). verbosity int32 The level of verbosity for the mobile test. This field can not be set by a user. public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. hasNewStepElement boolean A boolean set to determine if the step has a new step element. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name [_required_] string The name of the step. noScreenshot boolean A boolean set to not take a screenshot for the step. params [_required_] object The parameters of a mobile step. check enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` delay int64 Number of milliseconds to wait between inputs in a `typeText` step type. direction enum The direction of the scroll for a `scrollToElement` step type. Allowed enum values: `up,down,left,right` element object Information about the element used for a step. context string Context of the element. contextType enum Type of the context that the element is in. Allowed enum values: `native,web` elementDescription string Description of the element. multiLocator object Multi-locator to find the element. relativePosition object Position of the action relative to the element. x double The `relativePosition` on the `x` axis for the element. y double The `relativePosition` on the `y` axis for the element. textContent string Text content of the element. userLocator object User locator to find the element. failTestOnCannotLocate boolean Whether if the test should fail if the element cannot be found. values [object] Values of the user locator. type enum Type of a user locator. Allowed enum values: `accessibility-id,id,ios-predicate-string,ios-class-chain,xpath` value string Value of a user locator. viewName string Name of the view of the element. enabled boolean Boolean to change the state of the wifi for a `toggleWiFi` step type. maxScrolls int64 Maximum number of scrolls to do for a `scrollToElement` step type. positions [object] List of positions for the `flick` step type. The maximum is 10 flicks per step x double The `x` position for the flick. y double The `y` position for the flick. subtestPublicId string Public ID of the test to be played as part of a `playSubTest` step type. value Values used in the step for in multiple step types. Option 1 string Value used in the step for in multiple step types. Option 2 int64 Value used in the step for in multiple step types. variable object Variable object for `extractVariable` step type. example [_required_] string An example for the variable. name [_required_] string The variable name. withEnter boolean Boolean to indicate if `Enter` should be pressed at the end of the `typeText` step type. 
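As an illustrative aside, the sketch below shapes a single `typeText` step using the `element`, `userLocator`, `delay`, and `withEnter` parameters described above; the step name, view name, and locator value are hypothetical.

```python
import json

# Hypothetical "typeText" step for the "steps" array of a mobile test.
type_text_step = {
    "name": "Type the username",  # hypothetical step name
    "type": "typeText",
    "allowFailure": False,
    "params": {
        "value": "test-user",  # text to type (placeholder)
        "delay": 50,  # milliseconds between inputs
        "withEnter": True,  # press Enter after typing
        "element": {
            "contextType": "native",
            "viewName": "LoginView",  # hypothetical view name
            "userLocator": {
                "failTestOnCannotLocate": True,
                "values": [
                    {"type": "accessibility-id", "value": "username_field"}  # hypothetical locator
                ],
            },
        },
    },
}

print(json.dumps(type_text_step, indent=2))
```

A `scrollToElement` step would instead set `direction` and `maxScrolls`, and a `flick` step would use the `positions` array, as described in the same parameter list.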
x double Amount to scroll by on the `x` axis for a `scroll` step type. y double Amount to scroll by on the `y` axis for a `scroll` step type. publicId string The public ID of the step. timeout int64 The time before declaring a step failed. type [_required_] enum Step type used in your mobile Synthetic test. Allowed enum values: `assertElementContent,assertScreenContains,assertScreenLacks,doubleTap,extractVariable,flick,openDeeplink,playSubTest,pressBack,restartApplication,rotate,scroll,scrollToElement,tap,toggleWiFi,typeText,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `mobile`. Allowed enum values: `mobile` default: `mobile` ``` { "config": { "initialApplicationArguments": { "": "string" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "device_ids": [ "chrome.laptop_large" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "allowApplicationCrash": false, "bindings": [ { "principals": [], "relation": "string" } ], "ci": { "executionRule": "blocking" }, "defaultStepTimeout": "integer", "device_ids": [ "synthetics:mobile:device:apple_ipad_10th_gen_2022_ios_16" ], "disableAutoAcceptAlert": false, "min_failure_duration": "integer", "mobileApplication": { "applicationId": "00000000-0000-0000-0000-aaaaaaaaaaaa", "referenceId": "00000000-0000-0000-0000-aaaaaaaaaaab", "referenceType": "latest" }, "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": 300, "verbosity": "integer" }, "public_id": "123-abc-456", "status": "live", "steps": [ { "allowFailure": false, "hasNewStepElement": false, "isCritical": false, "name": "", "noScreenshot": false, "params": { "check": "string", "delay": "integer", "direction": "string", "element": { "context": "string", "contextType": "string", "elementDescription": "string", "multiLocator": {}, "relativePosition": { "x": "number", "y": "number" }, "textContent": "string", "userLocator": { "failTestOnCannotLocate": false, "values": [ { "type": "string", "value": "string" } ] }, "viewName": "string" }, "enabled": false, "maxScrolls": "integer", "positions": [ { "x": "number", "y": "number" } ], "subtestPublicId": "string", "value": { "description": "undefined", "type": "undefined" }, "variable": { "example": "", "name": "VAR_NAME" }, "withEnter": false, "x": "number", "y": "number" }, "publicId": "pub-lic-id0", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:production" ], "type": "mobile" } ``` Copy - JSON format is wrong - Updating sub-type is forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user - Test can't be found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Edit a Mobile test returns "OK" response Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/mobile/${public_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Synthetic-updated", "status": "paused", "type": "mobile", "config": { "variables": [] }, "message": "", "options": { "device_ids": [ "synthetics:mobile:device:iphone_15_ios_17" ], "mobileApplication": { "applicationId": "ab0e0aed-536d-411a-9a99-5428c27d8f8e", "referenceId": "6115922a-5f5d-455e-bc7e-7955a57f3815", "referenceType": "version" }, "tick_every": 3600 }, "steps": [] } EOF ``` ##### Edit a Mobile test returns "OK" response ``` // Edit a Mobile test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_mobile_test" in the system SyntheticsMobileTestPublicID := os.Getenv("SYNTHETICS_MOBILE_TEST_PUBLIC_ID") body := datadogV1.SyntheticsMobileTest{ Name: "Example-Synthetic-updated", Status: datadogV1.SYNTHETICSTESTPAUSESTATUS_PAUSED.Ptr(), Type: datadogV1.SYNTHETICSMOBILETESTTYPE_MOBILE, Config: datadogV1.SyntheticsMobileTestConfig{ Variables: []datadogV1.SyntheticsConfigVariable{}, }, Message: "", Options: datadogV1.SyntheticsMobileTestOptions{ DeviceIds: []string{ "synthetics:mobile:device:iphone_15_ios_17", }, MobileApplication: datadogV1.SyntheticsMobileTestsMobileApplication{ ApplicationId: "ab0e0aed-536d-411a-9a99-5428c27d8f8e", ReferenceId: "6115922a-5f5d-455e-bc7e-7955a57f3815", ReferenceType: datadogV1.SYNTHETICSMOBILETESTSMOBILEAPPLICATIONREFERENCETYPE_VERSION, }, TickEvery: 3600, }, Steps: []datadogV1.SyntheticsMobileStep{}, } ctx := 
datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.UpdateMobileTest(ctx, SyntheticsMobileTestPublicID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.UpdateMobileTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.UpdateMobileTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a Mobile test returns "OK" response ``` // Edit a Mobile test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsMobileTest; import com.datadog.api.client.v1.model.SyntheticsMobileTestConfig; import com.datadog.api.client.v1.model.SyntheticsMobileTestOptions; import com.datadog.api.client.v1.model.SyntheticsMobileTestType; import com.datadog.api.client.v1.model.SyntheticsMobileTestsMobileApplication; import com.datadog.api.client.v1.model.SyntheticsMobileTestsMobileApplicationReferenceType; import com.datadog.api.client.v1.model.SyntheticsTestPauseStatus; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_mobile_test" in the system String SYNTHETICS_MOBILE_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_MOBILE_TEST_PUBLIC_ID"); SyntheticsMobileTest body = new SyntheticsMobileTest() .name("Example-Synthetic-updated") .status(SyntheticsTestPauseStatus.PAUSED) .type(SyntheticsMobileTestType.MOBILE) .config(new SyntheticsMobileTestConfig()) .message("") .options( new SyntheticsMobileTestOptions() .deviceIds( Collections.singletonList("synthetics:mobile:device:iphone_15_ios_17")) .mobileApplication( new SyntheticsMobileTestsMobileApplication() .applicationId("ab0e0aed-536d-411a-9a99-5428c27d8f8e") .referenceId("6115922a-5f5d-455e-bc7e-7955a57f3815") .referenceType( SyntheticsMobileTestsMobileApplicationReferenceType.VERSION)) .tickEvery(3600L)); try { SyntheticsMobileTest result = apiInstance.updateMobileTest(SYNTHETICS_MOBILE_TEST_PUBLIC_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#updateMobileTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a Mobile test returns "OK" response ``` """ Edit a Mobile test 
returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_mobile_test import SyntheticsMobileTest from datadog_api_client.v1.model.synthetics_mobile_test_config import SyntheticsMobileTestConfig from datadog_api_client.v1.model.synthetics_mobile_test_options import SyntheticsMobileTestOptions from datadog_api_client.v1.model.synthetics_mobile_test_type import SyntheticsMobileTestType from datadog_api_client.v1.model.synthetics_mobile_tests_mobile_application import ( SyntheticsMobileTestsMobileApplication, ) from datadog_api_client.v1.model.synthetics_mobile_tests_mobile_application_reference_type import ( SyntheticsMobileTestsMobileApplicationReferenceType, ) from datadog_api_client.v1.model.synthetics_test_pause_status import SyntheticsTestPauseStatus # there is a valid "synthetics_mobile_test" in the system SYNTHETICS_MOBILE_TEST_PUBLIC_ID = environ["SYNTHETICS_MOBILE_TEST_PUBLIC_ID"] body = SyntheticsMobileTest( name="Example-Synthetic-updated", status=SyntheticsTestPauseStatus.PAUSED, type=SyntheticsMobileTestType.MOBILE, config=SyntheticsMobileTestConfig( variables=[], ), message="", options=SyntheticsMobileTestOptions( device_ids=[ "synthetics:mobile:device:iphone_15_ios_17", ], mobile_application=SyntheticsMobileTestsMobileApplication( application_id="ab0e0aed-536d-411a-9a99-5428c27d8f8e", reference_id="6115922a-5f5d-455e-bc7e-7955a57f3815", reference_type=SyntheticsMobileTestsMobileApplicationReferenceType.VERSION, ), tick_every=3600, ), steps=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.update_mobile_test(public_id=SYNTHETICS_MOBILE_TEST_PUBLIC_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a Mobile test returns "OK" response ``` # Edit a Mobile test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_mobile_test" in the system SYNTHETICS_MOBILE_TEST_PUBLIC_ID = ENV["SYNTHETICS_MOBILE_TEST_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsMobileTest.new({ name: "Example-Synthetic-updated", status: DatadogAPIClient::V1::SyntheticsTestPauseStatus::PAUSED, type: DatadogAPIClient::V1::SyntheticsMobileTestType::MOBILE, config: DatadogAPIClient::V1::SyntheticsMobileTestConfig.new({ variables: [], }), message: "", options: DatadogAPIClient::V1::SyntheticsMobileTestOptions.new({ device_ids: [ "synthetics:mobile:device:iphone_15_ios_17", ], mobile_application: DatadogAPIClient::V1::SyntheticsMobileTestsMobileApplication.new({ application_id: "ab0e0aed-536d-411a-9a99-5428c27d8f8e", reference_id: "6115922a-5f5d-455e-bc7e-7955a57f3815", reference_type: DatadogAPIClient::V1::SyntheticsMobileTestsMobileApplicationReferenceType::VERSION, }), tick_every: 3600, }), steps: [], }) p api_instance.update_mobile_test(SYNTHETICS_MOBILE_TEST_PUBLIC_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then 
##### Edit a Mobile test returns "OK" response

```
// Edit a Mobile test returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI;
use datadog_api_client::datadogV1::model::SyntheticsMobileTest;
use datadog_api_client::datadogV1::model::SyntheticsMobileTestConfig;
use datadog_api_client::datadogV1::model::SyntheticsMobileTestOptions;
use datadog_api_client::datadogV1::model::SyntheticsMobileTestType;
use datadog_api_client::datadogV1::model::SyntheticsMobileTestsMobileApplication;
use datadog_api_client::datadogV1::model::SyntheticsMobileTestsMobileApplicationReferenceType;
use datadog_api_client::datadogV1::model::SyntheticsTestPauseStatus;

#[tokio::main]
async fn main() {
    // there is a valid "synthetics_mobile_test" in the system
    let synthetics_mobile_test_public_id =
        std::env::var("SYNTHETICS_MOBILE_TEST_PUBLIC_ID").unwrap();
    let body = SyntheticsMobileTest::new(
        SyntheticsMobileTestConfig::new().variables(vec![]),
        "".to_string(),
        "Example-Synthetic-updated".to_string(),
        SyntheticsMobileTestOptions::new(
            vec!["synthetics:mobile:device:iphone_15_ios_17".to_string()],
            SyntheticsMobileTestsMobileApplication::new(
                "ab0e0aed-536d-411a-9a99-5428c27d8f8e".to_string(),
                "6115922a-5f5d-455e-bc7e-7955a57f3815".to_string(),
                SyntheticsMobileTestsMobileApplicationReferenceType::VERSION,
            ),
            3600,
        ),
        SyntheticsMobileTestType::MOBILE,
    )
    .status(SyntheticsTestPauseStatus::PAUSED)
    .steps(vec![]);
    let configuration = datadog::Configuration::new();
    let api = SyntheticsAPI::with_config(configuration);
    let resp = api
        .update_mobile_test(synthetics_mobile_test_public_id.clone(), body)
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Edit a Mobile test returns "OK" response

```
/**
 * Edit a Mobile test returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.SyntheticsApi(configuration);

// there is a valid "synthetics_mobile_test" in the system
const SYNTHETICS_MOBILE_TEST_PUBLIC_ID = process.env
  .SYNTHETICS_MOBILE_TEST_PUBLIC_ID as string;

const params: v1.SyntheticsApiUpdateMobileTestRequest = {
  body: {
    name: "Example-Synthetic-updated",
    status: "paused",
    type: "mobile",
    config: {
      variables: [],
    },
    message: "",
    options: {
      deviceIds: ["synthetics:mobile:device:iphone_15_ios_17"],
      mobileApplication: {
        applicationId: "ab0e0aed-536d-411a-9a99-5428c27d8f8e",
        referenceId: "6115922a-5f5d-455e-bc7e-7955a57f3815",
        referenceType: "version",
      },
      tickEvery: 3600,
    },
    steps: [],
  },
  publicId: SYNTHETICS_MOBILE_TEST_PUBLIC_ID,
};

apiInstance
  .updateMobileTest(params)
  .then((data: v1.SyntheticsMobileTest) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
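Because the edit endpoint saves the full new test details, every example above rebuilds the whole mobile test configuration. When you only want to change a few fields, a common pattern is to fetch the current definition first and resubmit it with your changes. The sketch below illustrates that pattern with the Python client; it assumes the v1 `get_mobile_test` operation (not shown in this section) and that read-only fields returned by the GET are accepted or ignored on update.

```
# Sketch: read-modify-write update of a mobile test. Assumes the v1 client's
# get_mobile_test operation (not shown in this section); depending on the test,
# read-only fields returned by the GET (for example monitor_id or public_id)
# may need to be cleared before resubmitting.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

public_id = environ["SYNTHETICS_MOBILE_TEST_PUBLIC_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = SyntheticsApi(api_client)

    # Fetch the current definition so unrelated settings are preserved.
    test = api.get_mobile_test(public_id=public_id)

    # Change only what you need; the edit endpoint saves the full definition.
    test.message = "Updated via the API"
    test.options.tick_every = 1800

    response = api.update_mobile_test(public_id=public_id, body=test)
    print(response.name, response.status)
```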
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit an API test](https://docs.datadoghq.com/api/latest/synthetics/#edit-an-api-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-an-api-test-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/api/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/api/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/api/{public_id} ### Overview Edit the configuration of a Synthetic API test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Request #### Body Data (required) New test details to be saved. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic API test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. 
target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. 
password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. 
bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. steps [ ] When the test subtype is `multi`, the steps of the test. Option 1 object The Test step used in a Synthetic multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. 
Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValues [object] Array of values to parse and save as variables from the response. field string When type is `http_header` or `grpc_metadata`, name of the header or metadatum to extract. name string Name of the variable to extract. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. secure boolean Determines whether or not the extracted value will be obfuscated. type enum Property of the Synthetic Test Response to extract into a local variable. Allowed enum values: `grpc_message,grpc_metadata,http_body,http_header,http_status_code` extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. 
region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. 
content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. 
timeout double Timeout in seconds for the test. url string URL to perform the test with. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtype [_required_] enum The subtype of the Synthetic multi-step API test step. Allowed enum values: `http,grpc,ssl,dns,tcp,udp,icmp,websocket` Option 2 object The Wait step used in a Synthetic multi-step API test. id string ID of the step. name [_required_] string The name of the step. subtype [_required_] enum The subtype of the Synthetic multi-step API wait step. Allowed enum values: `wait` value [_required_] int32 The time to wait in seconds. Minimum value: 0. Maximum value: 180. Option 3 object The subtest step used in a Synthetics multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtestPublicId [_required_] string Public ID of the test to be played as part of a `playSubTest` step type. subtype [_required_] enum The subtype of the Synthetic multi-step API subtest step. Allowed enum values: `playSubTest` variablesFromScript string Variables defined from JavaScript code. locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. 
follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID for the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `api`. 
Allowed enum values: `api` default: `api` ``` { "config": { "assertions": [ { "operator": "is", "property": "{{ PROPERTY }}", "target": "text/html", "type": "header" }, { "operator": "lessThan", "target": 2000, "type": "responseTime" }, { "operator": "validatesJSONPath", "target": { "jsonPath": "topKey", "operator": "isNot", "targetValue": "0" }, "type": "body" }, { "operator": "validatesJSONSchema", "target": { "metaSchema": "draft-07", "jsonSchema": "{\"type\": \"object\", \"properties\":{\"slideshow\":{\"type\":\"object\"}}}" }, "type": "body" } ], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "certificate": { "cert": { "filename": "cert-filename", "updatedAt": "2020-10-16T09:23:24.857Z" }, "key": { "filename": "key-filename", "updatedAt": "2020-10-16T09:23:24.857Z" } }, "headers": { "unique": "examplesynthetic" }, "method": "GET", "timeout": 10, "url": "https://datadoghq.com" } }, "locations": [ "aws:us-east-2" ], "message": "BDD test payload: synthetics_api_test_payload.json", "name": "Example-Synthetic-updated", "options": { "accept_self_signed": false, "allow_insecure": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "monitor_name": "Test-TestSyntheticsAPITestLifecycle-1623076664", "monitor_priority": 5, "retry": { "count": 3, "interval": 10 }, "tick_every": 60 }, "status": "live", "subtype": "http", "tags": [ "testing:api" ], "type": "api" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdateAPITest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#UpdateAPITest-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#UpdateAPITest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdateAPITest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdateAPITest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic API test. Field Type Description config [_required_] object Configuration object for a Synthetic API test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. 
Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. 
type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. 
Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. steps [ ] When the test subtype is `multi`, the steps of the test. Option 1 object The Test step used in a Synthetic multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. 
Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValues [object] Array of values to parse and save as variables from the response. field string When type is `http_header` or `grpc_metadata`, name of the header or metadatum to extract. name string Name of the variable to extract. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. secure boolean Determines whether or not the extracted value will be obfuscated. type enum Property of the Synthetic Test Response to extract into a local variable. Allowed enum values: `grpc_message,grpc_metadata,http_body,http_header,http_status_code` extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. 
password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. 
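To make the `basicAuth` union described above concrete, here is a minimal sketch of the `SIGV4` variant; the URL and credential values are placeholders, not real examples from the API. The other variants (`web`, `ntlm`, `digest`, `oauth-client`, `oauth-rop`) follow the same pattern and are discriminated by their `type` field.

```
{
  "request": {
    "method": "GET",
    "url": "https://example.execute-api.us-east-1.amazonaws.com/prod/status",
    "basicAuth": {
      "type": "sigv4",
      "accessKey": "EXAMPLE_ACCESS_KEY",
      "secretKey": "EXAMPLE_SECRET_KEY",
      "region": "us-east-1",
      "serviceName": "execute-api"
    }
  }
}
```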
bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
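As an illustrative sketch of the request fields above (all values are placeholders), a `multipart/form-data` request combines `form` entries with a `files` array:

```
{
  "request": {
    "method": "POST",
    "url": "https://example.com/upload",
    "bodyType": "multipart/form-data",
    "form": {
      "campaign": "nightly"
    },
    "files": [
      {
        "name": "payload",
        "originalFileName": "payload.csv",
        "content": "id,value\n1,42\n",
        "type": "text/csv",
        "size": 14
      }
    ]
  }
}
```

Note also that `port` and `dnsServerPort` accept either an integer (for example `53`) or a templated string such as `"{{ DNS_PORT }}"`.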
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtype [_required_] enum The subtype of the Synthetic multi-step API test step. Allowed enum values: `http,grpc,ssl,dns,tcp,udp,icmp,websocket` Option 2 object The Wait step used in a Synthetic multi-step API test. id string ID of the step. name [_required_] string The name of the step. subtype [_required_] enum The subtype of the Synthetic multi-step API wait step. Allowed enum values: `wait` value [_required_] int32 The time to wait in seconds. Minimum value: 0. Maximum value: 180. Option 3 object The subtest step used in a Synthetics multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtestPublicId [_required_] string Public ID of the test to be played as part of a `playSubTest` step type. subtype [_required_] enum The subtype of the Synthetic multi-step API subtest step. Allowed enum values: `playSubTest` variablesFromScript string Variables defined from JavaScript code. locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. 
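Pulling together the step variants described earlier in this model (an API request step, a `wait` step, and a `playSubTest` step), a multi-step test's `steps` array could look like the following sketch; the names, URL, JSON path, and subtest public ID are placeholders.

```
{
  "steps": [
    {
      "name": "Log in",
      "subtype": "http",
      "allowFailure": false,
      "request": {
        "method": "POST",
        "url": "https://example.com/api/login",
        "bodyType": "application/json",
        "body": "{\"user\": \"synthetic\"}"
      },
      "assertions": [
        { "type": "statusCode", "operator": "is", "target": 200 }
      ],
      "extractedValues": [
        {
          "name": "AUTH_TOKEN",
          "type": "http_body",
          "parser": { "type": "json_path", "value": "$.token" },
          "secure": true
        }
      ],
      "retry": { "count": 1, "interval": 500 }
    },
    { "name": "Wait for propagation", "subtype": "wait", "value": 10 },
    {
      "name": "Run shared subtest",
      "subtype": "playSubTest",
      "subtestPublicId": "abc-123-def"
    }
  ]
}
```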
disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. 
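As a sketch of the options just described: `rumSettings` accepts the three shapes listed above (`{ "isEnabled": false }`, `{ "isEnabled": true }` for the default application, or `isEnabled` plus an explicit `applicationId` and `clientTokenId`), and `scheduling` restricts runs to per-day timeframes. The placeholder below shows the explicit-application form together with a two-day schedule.

```
{
  "options": {
    "rumSettings": {
      "isEnabled": true,
      "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
      "clientTokenId": 12345
    },
    "scheduling": {
      "timeframes": [
        { "day": 1, "from": "07:00", "to": "16:00" },
        { "day": 5, "from": "07:00", "to": "12:00" }
      ],
      "timezone": "America/New_York"
    }
  }
}
```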
tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID for the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `api`. Allowed enum values: `api` default: `api` ``` { "config": { "assertions": [ [] ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "steps": [], "variablesFromScript": "dd.variable.set(\"FOO\", \"foo\")" }, "locations": [ "aws:eu-west-3" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": 
"integer" }, "public_id": "123-abc-456", "status": "live", "subtype": "http", "tags": [ "env:production" ], "type": "api" } ``` Copy - JSON format is wrong - Updating sub-type is forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user - Test can't be found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Edit an API test returns "OK" response Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/api/${public_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "assertions": [ { "operator": "is", "property": "{{ PROPERTY }}", "target": "text/html", "type": "header" }, { "operator": "lessThan", "target": 2000, "type": "responseTime" }, { "operator": "validatesJSONPath", "target": { "jsonPath": "topKey", "operator": "isNot", "targetValue": "0" }, "type": "body" }, { "operator": "validatesJSONSchema", "target": { "metaSchema": "draft-07", "jsonSchema": "{\"type\": \"object\", \"properties\":{\"slideshow\":{\"type\":\"object\"}}}" }, "type": "body" } ], "configVariables": [ { "example": "content-type", "name": "PROPERTY", "pattern": "content-type", "type": "text" } ], "request": { "certificate": { "cert": { "filename": "cert-filename", "updatedAt": "2020-10-16T09:23:24.857Z" }, "key": { "filename": "key-filename", "updatedAt": "2020-10-16T09:23:24.857Z" } }, "headers": { "unique": "examplesynthetic" }, "method": "GET", "timeout": 10, "url": "https://datadoghq.com" } }, "locations": [ "aws:us-east-2" ], "message": 
"BDD test payload: synthetics_api_test_payload.json", "name": "Example-Synthetic-updated", "options": { "accept_self_signed": false, "allow_insecure": true, "follow_redirects": true, "min_failure_duration": 10, "min_location_failed": 1, "monitor_name": "Test-TestSyntheticsAPITestLifecycle-1623076664", "monitor_priority": 5, "retry": { "count": 3, "interval": 10 }, "tick_every": 60 }, "status": "live", "subtype": "http", "tags": [ "testing:api" ], "type": "api" } EOF ``` ##### Edit an API test returns "OK" response ``` // Edit an API test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_api_test" in the system SyntheticsAPITestPublicID := os.Getenv("SYNTHETICS_API_TEST_PUBLIC_ID") body := datadogV1.SyntheticsAPITest{ Config: datadogV1.SyntheticsAPITestConfig{ Assertions: []datadogV1.SyntheticsAssertion{ datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_IS, Property: datadog.PtrString("{{ PROPERTY }}"), Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("text/html")}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_HEADER, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionTarget: &datadogV1.SyntheticsAssertionTarget{ Operator: datadogV1.SYNTHETICSASSERTIONOPERATOR_LESS_THAN, Target: datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueNumber: datadog.PtrFloat64(2000)}, Type: datadogV1.SYNTHETICSASSERTIONTYPE_RESPONSE_TIME, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJSONPathTarget: &datadogV1.SyntheticsAssertionJSONPathTarget{ Operator: datadogV1.SYNTHETICSASSERTIONJSONPATHOPERATOR_VALIDATES_JSON_PATH, Target: &datadogV1.SyntheticsAssertionJSONPathTargetTarget{ JsonPath: datadog.PtrString("topKey"), Operator: datadog.PtrString("isNot"), TargetValue: &datadogV1.SyntheticsAssertionTargetValue{ SyntheticsAssertionTargetValueString: datadog.PtrString("0")}, }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, datadogV1.SyntheticsAssertion{ SyntheticsAssertionJSONSchemaTarget: &datadogV1.SyntheticsAssertionJSONSchemaTarget{ Operator: datadogV1.SYNTHETICSASSERTIONJSONSCHEMAOPERATOR_VALIDATES_JSON_SCHEMA, Target: &datadogV1.SyntheticsAssertionJSONSchemaTargetTarget{ MetaSchema: datadogV1.SYNTHETICSASSERTIONJSONSCHEMAMETASCHEMA_DRAFT_07.Ptr(), JsonSchema: datadog.PtrString(`{"type": "object", "properties":{"slideshow":{"type":"object"}}}`), }, Type: datadogV1.SYNTHETICSASSERTIONTYPE_BODY, }}, }, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Example: datadog.PtrString("content-type"), Name: "PROPERTY", Pattern: datadog.PtrString("content-type"), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, }, }, Request: &datadogV1.SyntheticsTestRequest{ Certificate: &datadogV1.SyntheticsTestRequestCertificate{ Cert: &datadogV1.SyntheticsTestRequestCertificateItem{ Filename: datadog.PtrString("cert-filename"), UpdatedAt: datadog.PtrString("2020-10-16T09:23:24.857Z"), }, Key: &datadogV1.SyntheticsTestRequestCertificateItem{ Filename: datadog.PtrString("key-filename"), UpdatedAt: datadog.PtrString("2020-10-16T09:23:24.857Z"), }, }, Headers: map[string]string{ "unique": "examplesynthetic", }, Method: datadog.PtrString("GET"), Timeout: datadog.PtrFloat64(10), Url: datadog.PtrString("https://datadoghq.com"), }, }, Locations: []string{ 
"aws:us-east-2", }, Message: "BDD test payload: synthetics_api_test_payload.json", Name: "Example-Synthetic-updated", Options: datadogV1.SyntheticsTestOptions{ AcceptSelfSigned: datadog.PtrBool(false), AllowInsecure: datadog.PtrBool(true), FollowRedirects: datadog.PtrBool(true), MinFailureDuration: datadog.PtrInt64(10), MinLocationFailed: datadog.PtrInt64(1), MonitorName: datadog.PtrString("Test-TestSyntheticsAPITestLifecycle-1623076664"), MonitorPriority: datadog.PtrInt32(5), Retry: &datadogV1.SyntheticsTestOptionsRetry{ Count: datadog.PtrInt64(3), Interval: datadog.PtrFloat64(10), }, TickEvery: datadog.PtrInt64(60), }, Status: datadogV1.SYNTHETICSTESTPAUSESTATUS_LIVE.Ptr(), Subtype: datadogV1.SYNTHETICSTESTDETAILSSUBTYPE_HTTP.Ptr(), Tags: []string{ "testing:api", }, Type: datadogV1.SYNTHETICSAPITESTTYPE_API, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.UpdateAPITest(ctx, SyntheticsAPITestPublicID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.UpdateAPITest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.UpdateAPITest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit an API test returns "OK" response ``` // Edit an API test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPITest; import com.datadog.api.client.v1.model.SyntheticsAPITestConfig; import com.datadog.api.client.v1.model.SyntheticsAPITestType; import com.datadog.api.client.v1.model.SyntheticsAssertion; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONPathTargetTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaMetaSchema; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionJSONSchemaTargetTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionOperator; import com.datadog.api.client.v1.model.SyntheticsAssertionTarget; import com.datadog.api.client.v1.model.SyntheticsAssertionTargetValue; import com.datadog.api.client.v1.model.SyntheticsAssertionType; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsTestDetailsSubType; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestPauseStatus; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import 
com.datadog.api.client.v1.model.SyntheticsTestRequestCertificate; import com.datadog.api.client.v1.model.SyntheticsTestRequestCertificateItem; import java.util.Arrays; import java.util.Collections; import java.util.Map; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_api_test" in the system String SYNTHETICS_API_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_API_TEST_PUBLIC_ID"); SyntheticsAPITest body = new SyntheticsAPITest() .config( new SyntheticsAPITestConfig() .assertions( Arrays.asList( new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.IS) .property("{{ PROPERTY }}") .target(new SyntheticsAssertionTargetValue("text/html")) .type(SyntheticsAssertionType.HEADER)), new SyntheticsAssertion( new SyntheticsAssertionTarget() .operator(SyntheticsAssertionOperator.LESS_THAN) .target(new SyntheticsAssertionTargetValue(2000.0)) .type(SyntheticsAssertionType.RESPONSE_TIME)), new SyntheticsAssertion( new SyntheticsAssertionJSONPathTarget() .operator( SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH) .target( new SyntheticsAssertionJSONPathTargetTarget() .jsonPath("topKey") .operator("isNot") .targetValue(new SyntheticsAssertionTargetValue("0"))) .type(SyntheticsAssertionType.BODY)), new SyntheticsAssertion( new SyntheticsAssertionJSONSchemaTarget() .operator( SyntheticsAssertionJSONSchemaOperator.VALIDATES_JSON_SCHEMA) .target( new SyntheticsAssertionJSONSchemaTargetTarget() .metaSchema( SyntheticsAssertionJSONSchemaMetaSchema.DRAFT_07) .jsonSchema( """ {"type": "object", "properties":{"slideshow":{"type":"object"}}} """)) .type(SyntheticsAssertionType.BODY)))) .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .example("content-type") .name("PROPERTY") .pattern("content-type") .type(SyntheticsConfigVariableType.TEXT))) .request( new SyntheticsTestRequest() .certificate( new SyntheticsTestRequestCertificate() .cert( new SyntheticsTestRequestCertificateItem() .filename("cert-filename") .updatedAt("2020-10-16T09:23:24.857Z")) .key( new SyntheticsTestRequestCertificateItem() .filename("key-filename") .updatedAt("2020-10-16T09:23:24.857Z"))) .headers(Map.ofEntries(Map.entry("unique", "examplesynthetic"))) .method("GET") .timeout(10.0) .url("https://datadoghq.com"))) .locations(Collections.singletonList("aws:us-east-2")) .message("BDD test payload: synthetics_api_test_payload.json") .name("Example-Synthetic-updated") .options( new SyntheticsTestOptions() .acceptSelfSigned(false) .allowInsecure(true) .followRedirects(true) .minFailureDuration(10L) .minLocationFailed(1L) .monitorName("Test-TestSyntheticsAPITestLifecycle-1623076664") .monitorPriority(5) .retry(new SyntheticsTestOptionsRetry().count(3L).interval(10.0)) .tickEvery(60L)) .status(SyntheticsTestPauseStatus.LIVE) .subtype(SyntheticsTestDetailsSubType.HTTP) .tags(Collections.singletonList("testing:api")) .type(SyntheticsAPITestType.API); try { SyntheticsAPITest result = apiInstance.updateAPITest(SYNTHETICS_API_TEST_PUBLIC_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#updateAPITest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit an API test returns "OK" response ``` """ Edit an API test returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_api_test import SyntheticsAPITest from datadog_api_client.v1.model.synthetics_api_test_config import SyntheticsAPITestConfig from datadog_api_client.v1.model.synthetics_api_test_type import SyntheticsAPITestType from datadog_api_client.v1.model.synthetics_assertion_json_path_operator import SyntheticsAssertionJSONPathOperator from datadog_api_client.v1.model.synthetics_assertion_json_path_target import SyntheticsAssertionJSONPathTarget from datadog_api_client.v1.model.synthetics_assertion_json_path_target_target import ( SyntheticsAssertionJSONPathTargetTarget, ) from datadog_api_client.v1.model.synthetics_assertion_json_schema_meta_schema import ( SyntheticsAssertionJSONSchemaMetaSchema, ) from datadog_api_client.v1.model.synthetics_assertion_json_schema_operator import SyntheticsAssertionJSONSchemaOperator from datadog_api_client.v1.model.synthetics_assertion_json_schema_target import SyntheticsAssertionJSONSchemaTarget from datadog_api_client.v1.model.synthetics_assertion_json_schema_target_target import ( SyntheticsAssertionJSONSchemaTargetTarget, ) from datadog_api_client.v1.model.synthetics_assertion_operator import SyntheticsAssertionOperator from datadog_api_client.v1.model.synthetics_assertion_target import SyntheticsAssertionTarget from datadog_api_client.v1.model.synthetics_assertion_type import SyntheticsAssertionType from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType from datadog_api_client.v1.model.synthetics_test_details_sub_type import SyntheticsTestDetailsSubType from datadog_api_client.v1.model.synthetics_test_headers import SyntheticsTestHeaders from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry from datadog_api_client.v1.model.synthetics_test_pause_status import SyntheticsTestPauseStatus from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest from datadog_api_client.v1.model.synthetics_test_request_certificate import SyntheticsTestRequestCertificate from datadog_api_client.v1.model.synthetics_test_request_certificate_item import SyntheticsTestRequestCertificateItem # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = environ["SYNTHETICS_API_TEST_PUBLIC_ID"] body = SyntheticsAPITest( config=SyntheticsAPITestConfig( assertions=[ SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.IS, _property="{{ PROPERTY }}", target="text/html", type=SyntheticsAssertionType.HEADER, ), SyntheticsAssertionTarget( operator=SyntheticsAssertionOperator.LESS_THAN, target=2000, type=SyntheticsAssertionType.RESPONSE_TIME, ), SyntheticsAssertionJSONPathTarget( operator=SyntheticsAssertionJSONPathOperator.VALIDATES_JSON_PATH, 
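# Assert that the value at JSON path "topKey" in the response body is not "0".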
target=SyntheticsAssertionJSONPathTargetTarget( json_path="topKey", operator="isNot", target_value="0", ), type=SyntheticsAssertionType.BODY, ), SyntheticsAssertionJSONSchemaTarget( operator=SyntheticsAssertionJSONSchemaOperator.VALIDATES_JSON_SCHEMA, target=SyntheticsAssertionJSONSchemaTargetTarget( meta_schema=SyntheticsAssertionJSONSchemaMetaSchema.DRAFT_07, json_schema='{"type": "object", "properties":{"slideshow":{"type":"object"}}}', ), type=SyntheticsAssertionType.BODY, ), ], config_variables=[ SyntheticsConfigVariable( example="content-type", name="PROPERTY", pattern="content-type", type=SyntheticsConfigVariableType.TEXT, ), ], request=SyntheticsTestRequest( certificate=SyntheticsTestRequestCertificate( cert=SyntheticsTestRequestCertificateItem( filename="cert-filename", updated_at="2020-10-16T09:23:24.857Z", ), key=SyntheticsTestRequestCertificateItem( filename="key-filename", updated_at="2020-10-16T09:23:24.857Z", ), ), headers=SyntheticsTestHeaders( unique="examplesynthetic", ), method="GET", timeout=10.0, url="https://datadoghq.com", ), ), locations=[ "aws:us-east-2", ], message="BDD test payload: synthetics_api_test_payload.json", name="Example-Synthetic-updated", options=SyntheticsTestOptions( accept_self_signed=False, allow_insecure=True, follow_redirects=True, min_failure_duration=10, min_location_failed=1, monitor_name="Test-TestSyntheticsAPITestLifecycle-1623076664", monitor_priority=5, retry=SyntheticsTestOptionsRetry( count=3, interval=10.0, ), tick_every=60, ), status=SyntheticsTestPauseStatus.LIVE, subtype=SyntheticsTestDetailsSubType.HTTP, tags=[ "testing:api", ], type=SyntheticsAPITestType.API, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.update_api_test(public_id=SYNTHETICS_API_TEST_PUBLIC_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit an API test returns "OK" response ``` # Edit an API test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = ENV["SYNTHETICS_API_TEST_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsAPITest.new({ config: DatadogAPIClient::V1::SyntheticsAPITestConfig.new({ assertions: [ DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::IS, property: "{{ PROPERTY }}", target: "text/html", type: DatadogAPIClient::V1::SyntheticsAssertionType::HEADER, }), DatadogAPIClient::V1::SyntheticsAssertionTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionOperator::LESS_THAN, target: 2000, type: DatadogAPIClient::V1::SyntheticsAssertionType::RESPONSE_TIME, }), DatadogAPIClient::V1::SyntheticsAssertionJSONPathTarget.new({ operator: DatadogAPIClient::V1::SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, target: DatadogAPIClient::V1::SyntheticsAssertionJSONPathTargetTarget.new({ json_path: "topKey", operator: "isNot", target_value: "0", }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaTarget.new({ operator: 
DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaOperator::VALIDATES_JSON_SCHEMA, target: DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaTargetTarget.new({ meta_schema: DatadogAPIClient::V1::SyntheticsAssertionJSONSchemaMetaSchema::DRAFT_07, json_schema: '{"type": "object", "properties":{"slideshow":{"type":"object"}}}', }), type: DatadogAPIClient::V1::SyntheticsAssertionType::BODY, }), ], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ example: "content-type", name: "PROPERTY", pattern: "content-type", type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ certificate: DatadogAPIClient::V1::SyntheticsTestRequestCertificate.new({ cert: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({ filename: "cert-filename", updated_at: "2020-10-16T09:23:24.857Z", }), key: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({ filename: "key-filename", updated_at: "2020-10-16T09:23:24.857Z", }), }), headers: { unique: "examplesynthetic", }, method: "GET", timeout: 10, url: "https://datadoghq.com", }), }), locations: [ "aws:us-east-2", ], message: "BDD test payload: synthetics_api_test_payload.json", name: "Example-Synthetic-updated", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ accept_self_signed: false, allow_insecure: true, follow_redirects: true, min_failure_duration: 10, min_location_failed: 1, monitor_name: "Test-TestSyntheticsAPITestLifecycle-1623076664", monitor_priority: 5, _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({ count: 3, interval: 10, }), tick_every: 60, }), status: DatadogAPIClient::V1::SyntheticsTestPauseStatus::LIVE, subtype: DatadogAPIClient::V1::SyntheticsTestDetailsSubType::HTTP, tags: [ "testing:api", ], type: DatadogAPIClient::V1::SyntheticsAPITestType::API, }) p api_instance.update_api_test(SYNTHETICS_API_TEST_PUBLIC_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit an API test returns "OK" response ``` // Edit an API test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsAPITest; use datadog_api_client::datadogV1::model::SyntheticsAPITestConfig; use datadog_api_client::datadogV1::model::SyntheticsAPITestType; use datadog_api_client::datadogV1::model::SyntheticsAssertion; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONPathTargetTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaMetaSchema; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionJSONSchemaTargetTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionOperator; use datadog_api_client::datadogV1::model::SyntheticsAssertionTarget; use datadog_api_client::datadogV1::model::SyntheticsAssertionTargetValue; use 
datadog_api_client::datadogV1::model::SyntheticsAssertionType; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsTestDetailsSubType; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestPauseStatus; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificate; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificateItem; use std::collections::BTreeMap; #[tokio::main] async fn main() { // there is a valid "synthetics_api_test" in the system let synthetics_api_test_public_id = std::env::var("SYNTHETICS_API_TEST_PUBLIC_ID").unwrap(); let body = SyntheticsAPITest::new( SyntheticsAPITestConfig::new() .assertions( vec![ SyntheticsAssertion::SyntheticsAssertionTarget( Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::IS, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "text/html".to_string(), ), SyntheticsAssertionType::HEADER, ).property("{{ PROPERTY }}".to_string()), ), ), SyntheticsAssertion::SyntheticsAssertionTarget( Box::new( SyntheticsAssertionTarget::new( SyntheticsAssertionOperator::LESS_THAN, SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueNumber( 2000.0 as f64, ), SyntheticsAssertionType::RESPONSE_TIME, ), ), ), SyntheticsAssertion::SyntheticsAssertionJSONPathTarget( Box::new( SyntheticsAssertionJSONPathTarget::new( SyntheticsAssertionJSONPathOperator::VALIDATES_JSON_PATH, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionJSONPathTargetTarget::new() .json_path("topKey".to_string()) .operator("isNot".to_string()) .target_value( SyntheticsAssertionTargetValue::SyntheticsAssertionTargetValueString( "0".to_string(), ), ), ), ), ), SyntheticsAssertion::SyntheticsAssertionJSONSchemaTarget( Box::new( SyntheticsAssertionJSONSchemaTarget::new( SyntheticsAssertionJSONSchemaOperator::VALIDATES_JSON_SCHEMA, SyntheticsAssertionType::BODY, ).target( SyntheticsAssertionJSONSchemaTargetTarget::new() .json_schema( r#"{"type": "object", "properties":{"slideshow":{"type":"object"}}}"#.to_string(), ) .meta_schema(SyntheticsAssertionJSONSchemaMetaSchema::DRAFT_07), ), ), ) ], ) .config_variables( vec![ SyntheticsConfigVariable::new("PROPERTY".to_string(), SyntheticsConfigVariableType::TEXT) .example("content-type".to_string()) .pattern("content-type".to_string()) ], ) .request( SyntheticsTestRequest::new() .certificate( SyntheticsTestRequestCertificate::new() .cert( SyntheticsTestRequestCertificateItem::new() .filename("cert-filename".to_string()) .updated_at("2020-10-16T09:23:24.857Z".to_string()), ) .key( SyntheticsTestRequestCertificateItem::new() .filename("key-filename".to_string()) .updated_at("2020-10-16T09:23:24.857Z".to_string()), ), ) .headers(BTreeMap::from([("unique".to_string(), "examplesynthetic".to_string())])) .method("GET".to_string()) .timeout(10.0 as f64) .url("https://datadoghq.com".to_string()), ), vec!["aws:us-east-2".to_string()], "BDD test payload: synthetics_api_test_payload.json".to_string(), "Example-Synthetic-updated".to_string(), SyntheticsTestOptions::new() .accept_self_signed(false) .allow_insecure(true) .follow_redirects(true) .min_failure_duration(10) .min_location_failed(1) 
.monitor_name("Test-TestSyntheticsAPITestLifecycle-1623076664".to_string()) .monitor_priority(5) .retry(SyntheticsTestOptionsRetry::new().count(3).interval(10.0 as f64)) .tick_every(60), SyntheticsAPITestType::API, ) .status(SyntheticsTestPauseStatus::LIVE) .subtype(SyntheticsTestDetailsSubType::HTTP) .tags(vec!["testing:api".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .update_api_test(synthetics_api_test_public_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit an API test returns "OK" response ``` /** * Edit an API test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_api_test" in the system const SYNTHETICS_API_TEST_PUBLIC_ID = process.env .SYNTHETICS_API_TEST_PUBLIC_ID as string; const params: v1.SyntheticsApiUpdateAPITestRequest = { body: { config: { assertions: [ { operator: "is", property: "{{ PROPERTY }}", target: "text/html", type: "header", }, { operator: "lessThan", target: 2000, type: "responseTime", }, { operator: "validatesJSONPath", target: { jsonPath: "topKey", operator: "isNot", targetValue: "0", }, type: "body", }, { operator: "validatesJSONSchema", target: { metaSchema: "draft-07", jsonSchema: `{"type": "object", "properties":{"slideshow":{"type":"object"}}}`, }, type: "body", }, ], configVariables: [ { example: "content-type", name: "PROPERTY", pattern: "content-type", type: "text", }, ], request: { certificate: { cert: { filename: "cert-filename", updatedAt: "2020-10-16T09:23:24.857Z", }, key: { filename: "key-filename", updatedAt: "2020-10-16T09:23:24.857Z", }, }, headers: { unique: "examplesynthetic", }, method: "GET", timeout: 10, url: "https://datadoghq.com", }, }, locations: ["aws:us-east-2"], message: "BDD test payload: synthetics_api_test_payload.json", name: "Example-Synthetic-updated", options: { acceptSelfSigned: false, allowInsecure: true, followRedirects: true, minFailureDuration: 10, minLocationFailed: 1, monitorName: "Test-TestSyntheticsAPITestLifecycle-1623076664", monitorPriority: 5, retry: { count: 3, interval: 10, }, tickEvery: 60, }, status: "live", subtype: "http", tags: ["testing:api"], type: "api", }, publicId: SYNTHETICS_API_TEST_PUBLIC_ID, }; apiInstance .updateAPITest(params) .then((data: v1.SyntheticsAPITest) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit a browser test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-browser-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-browser-test-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/browser/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/browser/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/{public_id} ### Overview Edit the configuration of a Synthetic browser test. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to edit. ### Request #### Body Data (required) New test details to be saved. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config [_required_] object Configuration object for a Synthetic browser test. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. 
target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. 
password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. 
bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. 
servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. setCookie string Cookies to be used for the request, using the [Set-Cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie) syntax. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. Message can either be text or an empty string. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). 
escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. 
Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `browser`. Allowed enum values: `browser` default: `browser` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "setCookie": "string", "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "", "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { 
"timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:prod" ], "type": "browser" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdateBrowserTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#UpdateBrowserTest-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#UpdateBrowserTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdateBrowserTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdateBrowserTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic browser test. Field Type Description config [_required_] object Configuration object for a Synthetic browser test. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. 
type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. 
type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. 
filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. setCookie string Cookies to be used for the request, using the [Set-Cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie) syntax. 
variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. Message can either be text or an empty string. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. 
retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `browser`. 
Allowed enum values: `browser` default: `browser` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "setCookie": "string", "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "", "monitor_id": "integer", "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:prod" ], "type": "browser" } ``` Copy - JSON format is wrong - Updating 
sub-type is forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` - Synthetic Monitoring is not activated for the user - Test is not owned by the user - Test can't be found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Edit a browser test
```
# Path parameters
export public_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the Datadog site for your account,
# for example api.datadoghq.eu or api.us3.datadoghq.com)
curl -X PUT "https://api.datadoghq.com/api/v1/synthetics/tests/browser/${public_id}" \
 -H "Accept: application/json" \
 -H "Content-Type: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
 -d @- << EOF
{ "config": { "assertions": [ {} ], "configVariables": [ { "name": "VARIABLE_NAME", "type": "text" } ], "request": { "proxy": { "url": "https://example.com" } }, "variables": [ { "name": "VARIABLE_NAME", "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "", "name": "Example test name", "options": { "ci": { "executionRule": "blocking" }, "rumSettings": { "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" } }, "type": "browser" }
EOF
```
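The `compressedProtoFile` and `compressedJsonDescriptor` request fields documented above expect a payload that is gzipped first and then base64 encoded. The snippet below is a minimal sketch of that preparation step; it is not part of the generated client examples, and the file name `service.proto` is only a placeholder:

```
import base64
import gzip

# Gzip the .proto file, then base64 encode the result, as required by the
# `compressedProtoFile` request field. The same encoding applies to
# `compressedJsonDescriptor`.
with open("service.proto", "rb") as f:  # placeholder file name
    compressed_proto_file = base64.b64encode(gzip.compress(f.read())).decode("ascii")

# Pass compressed_proto_file as the value of `compressedProtoFile` in the request payload.
```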
##### Edit a browser test
```
"""
Edit a browser test returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi
from datadog_api_client.v1.model.synthetics_basic_auth_web import SyntheticsBasicAuthWeb
from datadog_api_client.v1.model.synthetics_basic_auth_web_type import SyntheticsBasicAuthWebType
from datadog_api_client.v1.model.synthetics_browser_test import SyntheticsBrowserTest
from datadog_api_client.v1.model.synthetics_browser_test_config import SyntheticsBrowserTestConfig
from datadog_api_client.v1.model.synthetics_browser_test_rum_settings import SyntheticsBrowserTestRumSettings
from datadog_api_client.v1.model.synthetics_browser_test_type import SyntheticsBrowserTestType
from datadog_api_client.v1.model.synthetics_browser_variable import SyntheticsBrowserVariable
from datadog_api_client.v1.model.synthetics_browser_variable_type import SyntheticsBrowserVariableType
from datadog_api_client.v1.model.synthetics_config_variable import SyntheticsConfigVariable
from datadog_api_client.v1.model.synthetics_config_variable_type import SyntheticsConfigVariableType
from datadog_api_client.v1.model.synthetics_restricted_roles import SyntheticsRestrictedRoles
from datadog_api_client.v1.model.synthetics_step import SyntheticsStep
from datadog_api_client.v1.model.synthetics_step_type import SyntheticsStepType
from datadog_api_client.v1.model.synthetics_test_call_type import SyntheticsTestCallType
from datadog_api_client.v1.model.synthetics_test_ci_options import SyntheticsTestCiOptions
from datadog_api_client.v1.model.synthetics_test_execution_rule import SyntheticsTestExecutionRule
from datadog_api_client.v1.model.synthetics_test_options import SyntheticsTestOptions
from datadog_api_client.v1.model.synthetics_test_options_http_version import SyntheticsTestOptionsHTTPVersion
from datadog_api_client.v1.model.synthetics_test_options_monitor_options import SyntheticsTestOptionsMonitorOptions
from datadog_api_client.v1.model.synthetics_test_options_monitor_options_notification_preset_name import (
    SyntheticsTestOptionsMonitorOptionsNotificationPresetName,
)
from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry
from datadog_api_client.v1.model.synthetics_test_options_scheduling import SyntheticsTestOptionsScheduling
from datadog_api_client.v1.model.synthetics_test_options_scheduling_timeframe import (
    SyntheticsTestOptionsSchedulingTimeframe,
)
from datadog_api_client.v1.model.synthetics_test_pause_status import SyntheticsTestPauseStatus
from datadog_api_client.v1.model.synthetics_test_request import SyntheticsTestRequest
from datadog_api_client.v1.model.synthetics_test_request_body_file import SyntheticsTestRequestBodyFile
from datadog_api_client.v1.model.synthetics_test_request_body_type import SyntheticsTestRequestBodyType
from datadog_api_client.v1.model.synthetics_test_request_certificate import SyntheticsTestRequestCertificate
from datadog_api_client.v1.model.synthetics_test_request_certificate_item import SyntheticsTestRequestCertificateItem
from datadog_api_client.v1.model.synthetics_test_request_proxy import SyntheticsTestRequestProxy

body = SyntheticsBrowserTest( config=SyntheticsBrowserTestConfig( assertions=[], config_variables=[ SyntheticsConfigVariable( name="VARIABLE_NAME", secure=False, type=SyntheticsConfigVariableType.TEXT, ), ], request=SyntheticsTestRequest( basic_auth=SyntheticsBasicAuthWeb( password="PaSSw0RD!", type=SyntheticsBasicAuthWebType.WEB, username="my_username", ), body_type=SyntheticsTestRequestBodyType.TEXT_PLAIN, call_type=SyntheticsTestCallType.UNARY, certificate=SyntheticsTestRequestCertificate( cert=SyntheticsTestRequestCertificateItem(), key=SyntheticsTestRequestCertificateItem(), ), certificate_domains=[], files=[ SyntheticsTestRequestBodyFile(), ], http_version=SyntheticsTestOptionsHTTPVersion.HTTP1, proxy=SyntheticsTestRequestProxy( url="https://example.com", ), service="Greeter", url="https://example.com", ), variables=[ SyntheticsBrowserVariable( name="VARIABLE_NAME",
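# Browser test variables of type "text" can also be marked as secure to obfuscate their values in test results.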
type=SyntheticsBrowserVariableType.TEXT, ), ], ), locations=[ "aws:eu-west-3", ], message="", name="Example test name", options=SyntheticsTestOptions( blocked_request_patterns=[], ci=SyntheticsTestCiOptions( execution_rule=SyntheticsTestExecutionRule.BLOCKING, ), device_ids=[ "chrome.laptop_large", ], http_version=SyntheticsTestOptionsHTTPVersion.HTTP1, monitor_options=SyntheticsTestOptionsMonitorOptions( notification_preset_name=SyntheticsTestOptionsMonitorOptionsNotificationPresetName.SHOW_ALL, ), restricted_roles=SyntheticsRestrictedRoles( [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ] ), retry=SyntheticsTestOptionsRetry(), rum_settings=SyntheticsBrowserTestRumSettings( application_id="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", client_token_id=12345, is_enabled=True, ), scheduling=SyntheticsTestOptionsScheduling( timeframes=[ SyntheticsTestOptionsSchedulingTimeframe( day=1, _from="07:00", to="16:00", ), SyntheticsTestOptionsSchedulingTimeframe( day=3, _from="07:00", to="16:00", ), ], timezone="America/New_York", ), ), status=SyntheticsTestPauseStatus.LIVE, steps=[ SyntheticsStep( type=SyntheticsStepType.ASSERT_ELEMENT_CONTENT, ), ], tags=[ "env:prod", ], type=SyntheticsBrowserTestType.BROWSER, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.update_browser_test(public_id="public_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a browser test ``` # Edit a browser test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsBrowserTest.new({ config: DatadogAPIClient::V1::SyntheticsBrowserTestConfig.new({ assertions: [], config_variables: [ DatadogAPIClient::V1::SyntheticsConfigVariable.new({ name: "VARIABLE_NAME", secure: false, type: DatadogAPIClient::V1::SyntheticsConfigVariableType::TEXT, }), ], request: DatadogAPIClient::V1::SyntheticsTestRequest.new({ basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthWeb.new({ password: "PaSSw0RD!", type: DatadogAPIClient::V1::SyntheticsBasicAuthWebType::WEB, username: "my_username", }), body_type: DatadogAPIClient::V1::SyntheticsTestRequestBodyType::TEXT_PLAIN, call_type: DatadogAPIClient::V1::SyntheticsTestCallType::UNARY, certificate: DatadogAPIClient::V1::SyntheticsTestRequestCertificate.new({ cert: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({}), key: DatadogAPIClient::V1::SyntheticsTestRequestCertificateItem.new({}), }), certificate_domains: [], files: [ DatadogAPIClient::V1::SyntheticsTestRequestBodyFile.new({}), ], http_version: DatadogAPIClient::V1::SyntheticsTestOptionsHTTPVersion::HTTP1, proxy: DatadogAPIClient::V1::SyntheticsTestRequestProxy.new({ url: "https://example.com", }), service: "Greeter", url: "https://example.com", }), variables: [ DatadogAPIClient::V1::SyntheticsBrowserVariable.new({ name: "VARIABLE_NAME", type: DatadogAPIClient::V1::SyntheticsBrowserVariableType::TEXT, }), ], }), locations: [ "aws:eu-west-3", ], message: "", name: "Example test name", options: DatadogAPIClient::V1::SyntheticsTestOptions.new({ blocked_request_patterns: [], ci: 
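# blocked_request_patterns takes an array of URL patterns to block during the test run.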
DatadogAPIClient::V1::SyntheticsTestCiOptions.new({ execution_rule: DatadogAPIClient::V1::SyntheticsTestExecutionRule::BLOCKING, }), device_ids: [ "chrome.laptop_large", ], http_version: DatadogAPIClient::V1::SyntheticsTestOptionsHTTPVersion::HTTP1, monitor_options: DatadogAPIClient::V1::SyntheticsTestOptionsMonitorOptions.new({ notification_preset_name: DatadogAPIClient::V1::SyntheticsTestOptionsMonitorOptionsNotificationPresetName::SHOW_ALL, }), restricted_roles: [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ], _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({}), rum_settings: DatadogAPIClient::V1::SyntheticsBrowserTestRumSettings.new({ application_id: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", client_token_id: 12345, is_enabled: true, }), scheduling: DatadogAPIClient::V1::SyntheticsTestOptionsScheduling.new({ timeframes: [ DatadogAPIClient::V1::SyntheticsTestOptionsSchedulingTimeframe.new({ day: 1, from: "07:00", to: "16:00", }), DatadogAPIClient::V1::SyntheticsTestOptionsSchedulingTimeframe.new({ day: 3, from: "07:00", to: "16:00", }), ], timezone: "America/New_York", }), }), status: DatadogAPIClient::V1::SyntheticsTestPauseStatus::LIVE, steps: [ DatadogAPIClient::V1::SyntheticsStep.new({ type: DatadogAPIClient::V1::SyntheticsStepType::ASSERT_ELEMENT_CONTENT, }), ], tags: [ "env:prod", ], type: DatadogAPIClient::V1::SyntheticsBrowserTestType::BROWSER, }) p api_instance.update_browser_test("public_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a browser test ``` // Edit a browser test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsBrowserTest{ Config: datadogV1.SyntheticsBrowserTestConfig{ Assertions: []datadogV1.SyntheticsAssertion{}, ConfigVariables: []datadogV1.SyntheticsConfigVariable{ { Name: "VARIABLE_NAME", Secure: datadog.PtrBool(false), Type: datadogV1.SYNTHETICSCONFIGVARIABLETYPE_TEXT, }, }, Request: datadogV1.SyntheticsTestRequest{ BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthWeb: &datadogV1.SyntheticsBasicAuthWeb{ Password: datadog.PtrString("PaSSw0RD!"), Type: datadogV1.SYNTHETICSBASICAUTHWEBTYPE_WEB.Ptr(), Username: datadog.PtrString("my_username"), }}, BodyType: datadogV1.SYNTHETICSTESTREQUESTBODYTYPE_TEXT_PLAIN.Ptr(), CallType: datadogV1.SYNTHETICSTESTCALLTYPE_UNARY.Ptr(), Certificate: &datadogV1.SyntheticsTestRequestCertificate{ Cert: &datadogV1.SyntheticsTestRequestCertificateItem{}, Key: &datadogV1.SyntheticsTestRequestCertificateItem{}, }, CertificateDomains: []string{}, Files: []datadogV1.SyntheticsTestRequestBodyFile{ {}, }, HttpVersion: datadogV1.SYNTHETICSTESTOPTIONSHTTPVERSION_HTTP1.Ptr(), Proxy: &datadogV1.SyntheticsTestRequestProxy{ Url: "https://example.com", }, Service: datadog.PtrString("Greeter"), Url: datadog.PtrString("https://example.com"), }, Variables: []datadogV1.SyntheticsBrowserVariable{ { Name: "VARIABLE_NAME", Type: datadogV1.SYNTHETICSBROWSERVARIABLETYPE_TEXT, }, }, }, Locations: []string{ "aws:eu-west-3", }, Message: "", Name: "Example test name", Options: datadogV1.SyntheticsTestOptions{ 
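// Retry defaults to a count of 0 with a 300 ms interval between retries; tick_every sets how often the test runs, in seconds.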
BlockedRequestPatterns: []string{}, Ci: &datadogV1.SyntheticsTestCiOptions{ ExecutionRule: datadogV1.SYNTHETICSTESTEXECUTIONRULE_BLOCKING, }, DeviceIds: []string{ "chrome.laptop_large", }, HttpVersion: datadogV1.SYNTHETICSTESTOPTIONSHTTPVERSION_HTTP1.Ptr(), MonitorOptions: &datadogV1.SyntheticsTestOptionsMonitorOptions{ NotificationPresetName: datadogV1.SYNTHETICSTESTOPTIONSMONITOROPTIONSNOTIFICATIONPRESETNAME_SHOW_ALL.Ptr(), }, RestrictedRoles: []string{ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", }, Retry: &datadogV1.SyntheticsTestOptionsRetry{}, RumSettings: &datadogV1.SyntheticsBrowserTestRumSettings{ ApplicationId: datadog.PtrString("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"), ClientTokenId: datadog.PtrInt64(12345), IsEnabled: true, }, Scheduling: &datadogV1.SyntheticsTestOptionsScheduling{ Timeframes: []datadogV1.SyntheticsTestOptionsSchedulingTimeframe{ { Day: 1, From: "07:00", To: "16:00", }, { Day: 3, From: "07:00", To: "16:00", }, }, Timezone: "America/New_York", }, }, Status: datadogV1.SYNTHETICSTESTPAUSESTATUS_LIVE.Ptr(), Steps: []datadogV1.SyntheticsStep{ { Type: datadogV1.SYNTHETICSSTEPTYPE_ASSERT_ELEMENT_CONTENT.Ptr(), }, }, Tags: []string{ "env:prod", }, Type: datadogV1.SYNTHETICSBROWSERTESTTYPE_BROWSER, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.UpdateBrowserTest(ctx, "public_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.UpdateBrowserTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.UpdateBrowserTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a browser test ``` // Edit a browser test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBasicAuth; import com.datadog.api.client.v1.model.SyntheticsBasicAuthWeb; import com.datadog.api.client.v1.model.SyntheticsBasicAuthWebType; import com.datadog.api.client.v1.model.SyntheticsBrowserTest; import com.datadog.api.client.v1.model.SyntheticsBrowserTestConfig; import com.datadog.api.client.v1.model.SyntheticsBrowserTestRumSettings; import com.datadog.api.client.v1.model.SyntheticsBrowserTestType; import com.datadog.api.client.v1.model.SyntheticsBrowserVariable; import com.datadog.api.client.v1.model.SyntheticsBrowserVariableType; import com.datadog.api.client.v1.model.SyntheticsConfigVariable; import com.datadog.api.client.v1.model.SyntheticsConfigVariableType; import com.datadog.api.client.v1.model.SyntheticsStep; import com.datadog.api.client.v1.model.SyntheticsStepType; import com.datadog.api.client.v1.model.SyntheticsTestCallType; import com.datadog.api.client.v1.model.SyntheticsTestCiOptions; import com.datadog.api.client.v1.model.SyntheticsTestExecutionRule; import com.datadog.api.client.v1.model.SyntheticsTestOptions; import 
com.datadog.api.client.v1.model.SyntheticsTestOptionsHTTPVersion; import com.datadog.api.client.v1.model.SyntheticsTestOptionsMonitorOptions; import com.datadog.api.client.v1.model.SyntheticsTestOptionsMonitorOptionsNotificationPresetName; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTestOptionsScheduling; import com.datadog.api.client.v1.model.SyntheticsTestOptionsSchedulingTimeframe; import com.datadog.api.client.v1.model.SyntheticsTestPauseStatus; import com.datadog.api.client.v1.model.SyntheticsTestRequest; import com.datadog.api.client.v1.model.SyntheticsTestRequestBodyFile; import com.datadog.api.client.v1.model.SyntheticsTestRequestBodyType; import com.datadog.api.client.v1.model.SyntheticsTestRequestCertificate; import com.datadog.api.client.v1.model.SyntheticsTestRequestCertificateItem; import com.datadog.api.client.v1.model.SyntheticsTestRequestProxy; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsBrowserTest body = new SyntheticsBrowserTest() .config( new SyntheticsBrowserTestConfig() .configVariables( Collections.singletonList( new SyntheticsConfigVariable() .name("VARIABLE_NAME") .secure(false) .type(SyntheticsConfigVariableType.TEXT))) .request( new SyntheticsTestRequest() .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthWeb() .password("PaSSw0RD!") .type(SyntheticsBasicAuthWebType.WEB) .username("my_username"))) .bodyType(SyntheticsTestRequestBodyType.TEXT_PLAIN) .callType(SyntheticsTestCallType.UNARY) .certificate( new SyntheticsTestRequestCertificate() .cert(new SyntheticsTestRequestCertificateItem()) .key(new SyntheticsTestRequestCertificateItem())) .files(Collections.singletonList(new SyntheticsTestRequestBodyFile())) .httpVersion(SyntheticsTestOptionsHTTPVersion.HTTP1) .proxy(new SyntheticsTestRequestProxy().url("https://example.com")) .service("Greeter") .url("https://example.com")) .variables( Collections.singletonList( new SyntheticsBrowserVariable() .name("VARIABLE_NAME") .type(SyntheticsBrowserVariableType.TEXT)))) .locations(Collections.singletonList("aws:eu-west-3")) .message("") .name("Example test name") .options( new SyntheticsTestOptions() .ci( new SyntheticsTestCiOptions() .executionRule(SyntheticsTestExecutionRule.BLOCKING)) .deviceIds(Collections.singletonList("chrome.laptop_large")) .httpVersion(SyntheticsTestOptionsHTTPVersion.HTTP1) .monitorOptions( new SyntheticsTestOptionsMonitorOptions() .notificationPresetName( SyntheticsTestOptionsMonitorOptionsNotificationPresetName.SHOW_ALL)) .restrictedRoles( Collections.singletonList("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx")) .retry(new SyntheticsTestOptionsRetry()) .rumSettings( new SyntheticsBrowserTestRumSettings() .applicationId("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx") .clientTokenId(12345L) .isEnabled(true)) .scheduling( new SyntheticsTestOptionsScheduling() .timeframes( Arrays.asList( new SyntheticsTestOptionsSchedulingTimeframe() .day(1) .from("07:00") .to("16:00"), new SyntheticsTestOptionsSchedulingTimeframe() .day(3) .from("07:00") .to("16:00"))) .timezone("America/New_York"))) .status(SyntheticsTestPauseStatus.LIVE) .steps( Collections.singletonList( new SyntheticsStep().type(SyntheticsStepType.ASSERT_ELEMENT_CONTENT))) .tags(Collections.singletonList("env:prod")) .type(SyntheticsBrowserTestType.BROWSER); try { 
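// "public_id" identifies the browser test to update; a 404 response means the test cannot be found or is not owned by the user.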
SyntheticsBrowserTest result = apiInstance.updateBrowserTest("public_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#updateBrowserTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a browser test ``` // Edit a browser test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsBasicAuth; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWeb; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWebType; use datadog_api_client::datadogV1::model::SyntheticsBrowserTest; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestConfig; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestRumSettings; use datadog_api_client::datadogV1::model::SyntheticsBrowserTestType; use datadog_api_client::datadogV1::model::SyntheticsBrowserVariable; use datadog_api_client::datadogV1::model::SyntheticsBrowserVariableType; use datadog_api_client::datadogV1::model::SyntheticsConfigVariable; use datadog_api_client::datadogV1::model::SyntheticsConfigVariableType; use datadog_api_client::datadogV1::model::SyntheticsStep; use datadog_api_client::datadogV1::model::SyntheticsStepType; use datadog_api_client::datadogV1::model::SyntheticsTestCallType; use datadog_api_client::datadogV1::model::SyntheticsTestCiOptions; use datadog_api_client::datadogV1::model::SyntheticsTestExecutionRule; use datadog_api_client::datadogV1::model::SyntheticsTestOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsHTTPVersion; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsMonitorOptions; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsMonitorOptionsNotificationPresetName; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsScheduling; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsSchedulingTimeframe; use datadog_api_client::datadogV1::model::SyntheticsTestPauseStatus; use datadog_api_client::datadogV1::model::SyntheticsTestRequest; use datadog_api_client::datadogV1::model::SyntheticsTestRequestBodyFile; use datadog_api_client::datadogV1::model::SyntheticsTestRequestBodyType; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificate; use datadog_api_client::datadogV1::model::SyntheticsTestRequestCertificateItem; use datadog_api_client::datadogV1::model::SyntheticsTestRequestProxy; #[tokio::main] async fn main() { let body = SyntheticsBrowserTest::new( SyntheticsBrowserTestConfig::new( vec![], SyntheticsTestRequest::new() .basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthWeb(Box::new( SyntheticsBasicAuthWeb::new() .password("PaSSw0RD!".to_string()) .type_(SyntheticsBasicAuthWebType::WEB) .username("my_username".to_string()), ))) .body_type(SyntheticsTestRequestBodyType::TEXT_PLAIN) 
.call_type(SyntheticsTestCallType::UNARY) .certificate( SyntheticsTestRequestCertificate::new() .cert(SyntheticsTestRequestCertificateItem::new()) .key(SyntheticsTestRequestCertificateItem::new()), ) .certificate_domains(vec![]) .files(vec![SyntheticsTestRequestBodyFile::new()]) .http_version(SyntheticsTestOptionsHTTPVersion::HTTP1) .proxy(SyntheticsTestRequestProxy::new( "https://example.com".to_string(), )) .service("Greeter".to_string()) .url("https://example.com".to_string()), ) .config_variables(vec![SyntheticsConfigVariable::new( "VARIABLE_NAME".to_string(), SyntheticsConfigVariableType::TEXT, ) .secure(false)]) .variables(vec![SyntheticsBrowserVariable::new( "VARIABLE_NAME".to_string(), SyntheticsBrowserVariableType::TEXT, )]), vec!["aws:eu-west-3".to_string()], "".to_string(), "Example test name".to_string(), SyntheticsTestOptions::new() .blocked_request_patterns(vec![]) .ci(SyntheticsTestCiOptions::new( SyntheticsTestExecutionRule::BLOCKING, )) .device_ids(vec!["chrome.laptop_large".to_string()]) .http_version(SyntheticsTestOptionsHTTPVersion::HTTP1) .monitor_options( SyntheticsTestOptionsMonitorOptions::new().notification_preset_name( SyntheticsTestOptionsMonitorOptionsNotificationPresetName::SHOW_ALL, ), ) .restricted_roles(vec!["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx".to_string()]) .retry(SyntheticsTestOptionsRetry::new()) .rum_settings( SyntheticsBrowserTestRumSettings::new(true) .application_id("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx".to_string()) .client_token_id(12345), ) .scheduling(SyntheticsTestOptionsScheduling::new( vec![ SyntheticsTestOptionsSchedulingTimeframe::new( 1, "07:00".to_string(), "16:00".to_string(), ), SyntheticsTestOptionsSchedulingTimeframe::new( 3, "07:00".to_string(), "16:00".to_string(), ), ], "America/New_York".to_string(), )), SyntheticsBrowserTestType::BROWSER, ) .status(SyntheticsTestPauseStatus::LIVE) .steps(vec![ SyntheticsStep::new().type_(SyntheticsStepType::ASSERT_ELEMENT_CONTENT) ]) .tags(vec!["env:prod".to_string()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.update_browser_test("public_id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a browser test ``` /** * Edit a browser test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiUpdateBrowserTestRequest = { body: { config: { assertions: [], configVariables: [ { name: "VARIABLE_NAME", secure: false, type: "text", }, ], request: { basicAuth: { password: "PaSSw0RD!", type: "web", username: "my_username", }, bodyType: "text/plain", callType: "unary", certificate: { cert: {}, key: {}, }, certificateDomains: [], files: [{}], httpVersion: "http1", proxy: { url: "https://example.com", }, service: "Greeter", url: "https://example.com", }, variables: [ { name: "VARIABLE_NAME", type: "text", }, ], }, locations: ["aws:eu-west-3"], message: "", name: "Example test name", options: { blockedRequestPatterns: [], ci: 
{ executionRule: "blocking", }, deviceIds: ["chrome.laptop_large"], httpVersion: "http1", monitorOptions: { notificationPresetName: "show_all", }, restrictedRoles: ["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"], retry: {}, rumSettings: { applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345, isEnabled: true, }, scheduling: { timeframes: [ { day: 1, from: "07:00", to: "16:00", }, { day: 3, from: "07:00", to: "16:00", }, ], timezone: "America/New_York", }, }, status: "live", steps: [ { type: "assertElementContent", }, ], tags: ["env:prod"], type: "browser", }, publicId: "public_id", }; apiInstance .updateBrowserTest(params) .then((data: v1.SyntheticsBrowserTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Patch a Synthetic test](https://docs.datadoghq.com/api/latest/synthetics/#patch-a-synthetic-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#patch-a-synthetic-test-v1) PATCH https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id} ### Overview Patch the configuration of a Synthetic test with partial data. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to patch. ### Request #### Body Data (required) [JSON Patch](https://jsonpatch.com/) compliant list of operations * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description data [object] Array of [JSON Patch](https://jsonpatch.com) operations to perform on the test op enum The operation to perform Allowed enum values: `add,remove,replace,move,copy,test` path string The path to the value to modify value A value to use in a [JSON Patch](https://jsonpatch.com) operation ``` { "data": [ { "op": "replace", "path": "/name", "value": "New test name" }, { "op": "remove", "path": "/config/assertions/0" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#PatchTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#PatchTest-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#PatchTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#PatchTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#PatchTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about your Synthetic test. 
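The successful response is the complete test definition, not only the fields touched by the patch. As a quick orientation before the field-by-field model below, here is a minimal TypeScript sketch; it mirrors the client conventions used in the other examples on this page, and the response fields it reads (which assume the client's usual camelCase mapping of the model below) are purely illustrative.

```
/**
 * Illustrative sketch only: apply a JSON Patch to a Synthetic test and read
 * back a few fields from the returned test details. The request mirrors the
 * TypeScript example later in this section; the response field names assume
 * the client's usual camelCase mapping of the model described below.
 */
import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.SyntheticsApi(configuration);

const params: v1.SyntheticsApiPatchTestRequest = {
  body: {
    data: [
      { op: "replace", path: "/name", value: "New test name" },
      { op: "remove", path: "/config/assertions/0" },
    ],
  },
  publicId: "public_id",
};

apiInstance
  .patchTest(params)
  .then((data: v1.SyntheticsTestDetails) => {
    // The patched test comes back in full, matching the model below.
    console.log(data.publicId, data.name, data.status);
    console.log(data.monitorId, data.options?.tickEvery);
  })
  .catch((error: any) => console.error(error));
```

The full response model is described field by field below.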
Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. 
operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. 
Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. 
files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. 
blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. 
timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] The steps of the test if they exist. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. 
Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "subtype": "http", "tags": [], "type": "string" } 
``` Copy - JSON format is wrong - Updating sub-type is forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user - Test can't be found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Patch a Synthetic test returns "OK" response Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your site's API host if needed) curl -X PATCH "https://api.datadoghq.com/api/v1/synthetics/tests/${public_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "op": "replace", "path": "/name", "value": "New test name" }, { "op": "remove", "path": "/config/assertions/0" } ] } EOF ``` ##### Patch a Synthetic test returns "OK" response ``` // Patch a Synthetic test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_api_test" in the system SyntheticsAPITestPublicID := os.Getenv("SYNTHETICS_API_TEST_PUBLIC_ID") body := datadogV1.SyntheticsPatchTestBody{ Data: []datadogV1.SyntheticsPatchTestOperation{ { Op: datadogV1.SYNTHETICSPATCHTESTOPERATIONNAME_REPLACE.Ptr(), Path: datadog.PtrString("/name"), Value: "New test name", }, { Op: datadogV1.SYNTHETICSPATCHTESTOPERATIONNAME_REMOVE.Ptr(), Path: datadog.PtrString("/config/assertions/0"), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, 
err := api.PatchTest(ctx, SyntheticsAPITestPublicID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.PatchTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.PatchTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Patch a Synthetic test returns "OK" response ``` // Patch a Synthetic test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsPatchTestBody; import com.datadog.api.client.v1.model.SyntheticsPatchTestOperation; import com.datadog.api.client.v1.model.SyntheticsPatchTestOperationName; import com.datadog.api.client.v1.model.SyntheticsTestDetails; import java.util.Arrays; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_api_test" in the system String SYNTHETICS_API_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_API_TEST_PUBLIC_ID"); SyntheticsPatchTestBody body = new SyntheticsPatchTestBody() .data( Arrays.asList( new SyntheticsPatchTestOperation() .op(SyntheticsPatchTestOperationName.REPLACE) .path("/name") .value("New test name"), new SyntheticsPatchTestOperation() .op(SyntheticsPatchTestOperationName.REMOVE) .path("/config/assertions/0"))); try { SyntheticsTestDetails result = apiInstance.patchTest(SYNTHETICS_API_TEST_PUBLIC_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#patchTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Patch a Synthetic test returns "OK" response ``` """ Patch a Synthetic test returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_patch_test_body import SyntheticsPatchTestBody from datadog_api_client.v1.model.synthetics_patch_test_operation import SyntheticsPatchTestOperation from datadog_api_client.v1.model.synthetics_patch_test_operation_name import SyntheticsPatchTestOperationName # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = environ["SYNTHETICS_API_TEST_PUBLIC_ID"] body = SyntheticsPatchTestBody( data=[ SyntheticsPatchTestOperation( op=SyntheticsPatchTestOperationName.REPLACE, path="/name", value="New test name", ), 
SyntheticsPatchTestOperation( op=SyntheticsPatchTestOperationName.REMOVE, path="/config/assertions/0", ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.patch_test(public_id=SYNTHETICS_API_TEST_PUBLIC_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Patch a Synthetic test returns "OK" response ``` # Patch a Synthetic test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = ENV["SYNTHETICS_API_TEST_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsPatchTestBody.new({ data: [ DatadogAPIClient::V1::SyntheticsPatchTestOperation.new({ op: DatadogAPIClient::V1::SyntheticsPatchTestOperationName::REPLACE, path: "/name", value: "New test name", }), DatadogAPIClient::V1::SyntheticsPatchTestOperation.new({ op: DatadogAPIClient::V1::SyntheticsPatchTestOperationName::REMOVE, path: "/config/assertions/0", }), ], }) p api_instance.patch_test(SYNTHETICS_API_TEST_PUBLIC_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Patch a Synthetic test returns "OK" response ``` // Patch a Synthetic test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsPatchTestBody; use datadog_api_client::datadogV1::model::SyntheticsPatchTestOperation; use datadog_api_client::datadogV1::model::SyntheticsPatchTestOperationName; use serde_json::Value; #[tokio::main] async fn main() { // there is a valid "synthetics_api_test" in the system let synthetics_api_test_public_id = std::env::var("SYNTHETICS_API_TEST_PUBLIC_ID").unwrap(); let body = SyntheticsPatchTestBody::new().data(vec![ SyntheticsPatchTestOperation::new() .op(SyntheticsPatchTestOperationName::REPLACE) .path("/name".to_string()) .value(Value::from("New test name")), SyntheticsPatchTestOperation::new() .op(SyntheticsPatchTestOperationName::REMOVE) .path("/config/assertions/0".to_string()), ]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .patch_test(synthetics_api_test_public_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Patch a Synthetic test returns "OK" response ``` /** * Patch a Synthetic test returns "OK" 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_api_test" in the system const SYNTHETICS_API_TEST_PUBLIC_ID = process.env .SYNTHETICS_API_TEST_PUBLIC_ID as string; const params: v1.SyntheticsApiPatchTestRequest = { body: { data: [ { op: "replace", path: "/name", value: "New test name", }, { op: "remove", path: "/config/assertions/0", }, ], }, publicId: SYNTHETICS_API_TEST_PUBLIC_ID, }; apiInstance .patchTest(params) .then((data: v1.SyntheticsTestDetails) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Pause or start a test](https://docs.datadoghq.com/api/latest/synthetics/#pause-or-start-a-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#pause-or-start-a-test-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}/statushttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}/statushttps://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}/statushttps://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}/statushttps://api.datadoghq.com/api/v1/synthetics/tests/{public_id}/statushttps://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}/statushttps://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id}/status ### Overview Pause or start a Synthetic test by changing the status. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the Synthetic test to update. ### Request #### Body Data (required) Status to set the given Synthetic test to. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Expand All Field Type Description new_status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` ``` { "new_status": "live" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTestPauseStatus-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTestPauseStatus-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTestPauseStatus-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTestPauseStatus-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTestPauseStatus-429-v1) OK - Returns a boolean indicating if the update was successful. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Expand All Field Type Description No response body ``` {} ``` Copy JSON format is wrong. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Pause or start a test Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your site's API host if needed) curl -X PUT "https://api.datadoghq.com/api/v1/synthetics/tests/${public_id}/status" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "new_status": "live" } EOF ``` ##### Pause or start a test ``` """ Pause or start a test returns "OK - Returns a boolean indicating if the update was successful." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_test_pause_status import SyntheticsTestPauseStatus from datadog_api_client.v1.model.synthetics_update_test_pause_status_payload import ( SyntheticsUpdateTestPauseStatusPayload, ) body = SyntheticsUpdateTestPauseStatusPayload( new_status=SyntheticsTestPauseStatus.LIVE, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.update_test_pause_status(public_id="public_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Pause or start a test ``` # Pause or start a test returns "OK - Returns a boolean indicating if the update was successful." response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsUpdateTestPauseStatusPayload.new({ new_status: DatadogAPIClient::V1::SyntheticsTestPauseStatus::LIVE, }) p api_instance.update_test_pause_status("public_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Pause or start a test ``` // Pause or start a test returns "OK - Returns a boolean indicating if the update was successful." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsUpdateTestPauseStatusPayload{ NewStatus: datadogV1.SYNTHETICSTESTPAUSESTATUS_LIVE.Ptr(), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.UpdateTestPauseStatus(ctx, "public_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.UpdateTestPauseStatus`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.UpdateTestPauseStatus`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Pause or start a test ``` // Pause or start a test returns "OK - Returns a boolean indicating if the update was successful." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsTestPauseStatus; import com.datadog.api.client.v1.model.SyntheticsUpdateTestPauseStatusPayload; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsUpdateTestPauseStatusPayload body = new SyntheticsUpdateTestPauseStatusPayload().newStatus(SyntheticsTestPauseStatus.LIVE); try { Boolean result = apiInstance.updateTestPauseStatus("public_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#updateTestPauseStatus"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Pause or start a test ``` // Pause or start a test returns "OK - Returns a boolean indicating if the update // was successful." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsTestPauseStatus; use datadog_api_client::datadogV1::model::SyntheticsUpdateTestPauseStatusPayload; #[tokio::main] async fn main() { let body = SyntheticsUpdateTestPauseStatusPayload::new().new_status(SyntheticsTestPauseStatus::LIVE); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .update_test_pause_status("public_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Pause or start a test ``` /** * Pause or start a test returns "OK - Returns a boolean indicating if the update was successful." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiUpdateTestPauseStatusRequest = { body: { newStatus: "live", }, publicId: "public_id", }; apiInstance .updateTestPauseStatus(params) .then((data: boolean) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Trigger tests from CI/CD pipelines](https://docs.datadoghq.com/api/latest/synthetics/#trigger-tests-from-cicd-pipelines) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#trigger-tests-from-cicd-pipelines-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/trigger/cihttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/trigger/cihttps://api.datadoghq.eu/api/v1/synthetics/tests/trigger/cihttps://api.ddog-gov.com/api/v1/synthetics/tests/trigger/cihttps://api.datadoghq.com/api/v1/synthetics/tests/trigger/cihttps://api.us3.datadoghq.com/api/v1/synthetics/tests/trigger/cihttps://api.us5.datadoghq.com/api/v1/synthetics/tests/trigger/ci ### Overview Trigger a set of Synthetic tests for continuous integration. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the test to trigger. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description tests [object] List of Synthetic tests with overrides. allowInsecureCertificates boolean Disable certificate checks in API tests. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. 
Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType string Type of the data sent in a Synthetic API test. cookies string Cookies for the request. deviceIds [string] For browser test, array with the different device IDs used to run the test. followRedirects boolean For API HTTP test, whether or not the test should follow redirects. headers object Headers to include when performing the test. string A single Header. locations [string] Array of locations used to run the test. metadata object Metadata for the Synthetic tests run. ci object Description of the CI provider. pipeline object Description of the CI pipeline. url string URL of the pipeline. provider object Description of the CI provider. name string Name of the CI provider. git object Git information. branch string Branch name. commitSha string The commit SHA. public_id [_required_] string The public ID of the Synthetic test to trigger. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. startUrl string Starting URL for the browser test. variables object Variables to replace in the test. string A single variable. version int64 The version number of the Synthetic test version to trigger. 
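In addition to `public_id`, each entry in `tests` accepts per-run overrides such as `startUrl`, `variables`, `retry`, and `version`; the SDK examples later in this section leave these at their defaults. The sketch below shows one such override using the Python client from the examples that follow. It is an illustration only: the public ID, start URL, and variable name are placeholders, and the snake_case keyword names (`start_url`, `variables`, `retry`) are assumed to follow the Python client's mapping of the fields above.

```
# Minimal sketch: trigger one CI test while overriding its starting URL,
# one test variable, and the retry policy (placeholder values throughout).
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi
from datadog_api_client.v1.model.synthetics_ci_test import SyntheticsCITest
from datadog_api_client.v1.model.synthetics_ci_test_body import SyntheticsCITestBody
from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry

body = SyntheticsCITestBody(
    tests=[
        SyntheticsCITest(
            public_id="aaa-aaa-aaa",  # placeholder public ID of the test to trigger
            start_url="https://staging.example.com",  # hypothetical starting URL override
            variables={"ENV": "staging"},  # hypothetical variable override
            retry=SyntheticsTestOptionsRetry(count=2, interval=300.0),  # retry twice, 300 ms apart
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    # The response carries a batch_id and per-location results, described in the response section below.
    print(api_instance.trigger_ci_tests(body=body))
```

The example payload that follows shows every available request field with placeholder values.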
``` { "tests": [ { "allowInsecureCertificates": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "string", "cookies": "string", "deviceIds": [ "chrome.laptop_large" ], "followRedirects": false, "headers": { "": "string" }, "locations": [ "aws:eu-west-3" ], "metadata": { "ci": { "pipeline": { "url": "string" }, "provider": { "name": "string" } }, "git": { "branch": "string", "commitSha": "string" } }, "public_id": "aaa-aaa-aaa", "retry": { "count": "integer", "interval": "number" }, "startUrl": "string", "variables": { "": "string" }, "version": "integer" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#TriggerCITests-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#TriggerCITests-400-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#TriggerCITests-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing information about the tests triggered. Field Type Description batch_id string The public ID of the batch triggered. locations [object] List of Synthetic locations. id int64 Unique identifier of the location. name string Name of the location. results [object] Information about the tests runs. device string The device ID. location int64 The location ID of the test run. public_id string The public ID of the Synthetic test. result_id string ID of the result. triggered_check_ids [string] The public IDs of the Synthetic test triggered. ``` { "batch_id": "string", "locations": [ { "id": "integer", "name": "string" } ], "results": [ { "device": "chrome.laptop_large", "location": "integer", "public_id": "string", "result_id": "string" } ], "triggered_check_ids": [] } ``` Copy JSON format is wrong * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Trigger tests from CI/CD pipelines Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/trigger/ci" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "tests": [ { "public_id": "aaa-aaa-aaa" } ] } EOF ``` ##### Trigger tests from CI/CD pipelines ``` """ Trigger tests from CI/CD pipelines returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_basic_auth_web import SyntheticsBasicAuthWeb from datadog_api_client.v1.model.synthetics_basic_auth_web_type import SyntheticsBasicAuthWebType from datadog_api_client.v1.model.synthetics_ci_batch_metadata import SyntheticsCIBatchMetadata from datadog_api_client.v1.model.synthetics_ci_batch_metadata_ci import SyntheticsCIBatchMetadataCI from datadog_api_client.v1.model.synthetics_ci_batch_metadata_git import SyntheticsCIBatchMetadataGit from datadog_api_client.v1.model.synthetics_ci_batch_metadata_pipeline import SyntheticsCIBatchMetadataPipeline from datadog_api_client.v1.model.synthetics_ci_batch_metadata_provider import SyntheticsCIBatchMetadataProvider from datadog_api_client.v1.model.synthetics_ci_test import SyntheticsCITest from datadog_api_client.v1.model.synthetics_ci_test_body import SyntheticsCITestBody from datadog_api_client.v1.model.synthetics_test_options_retry import SyntheticsTestOptionsRetry body = SyntheticsCITestBody( tests=[ SyntheticsCITest( basic_auth=SyntheticsBasicAuthWeb( password="PaSSw0RD!", type=SyntheticsBasicAuthWebType.WEB, username="my_username", ), device_ids=[ "chrome.laptop_large", ], locations=[ "aws:eu-west-3", ], metadata=SyntheticsCIBatchMetadata( ci=SyntheticsCIBatchMetadataCI( pipeline=SyntheticsCIBatchMetadataPipeline(), provider=SyntheticsCIBatchMetadataProvider(), ), git=SyntheticsCIBatchMetadataGit(), ), public_id="aaa-aaa-aaa", retry=SyntheticsTestOptionsRetry(), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.trigger_ci_tests(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Trigger tests from CI/CD pipelines ``` # Trigger tests from CI/CD pipelines returns "OK" response require "datadog_api_client" api_instance = 
DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsCITestBody.new({ tests: [ DatadogAPIClient::V1::SyntheticsCITest.new({ basic_auth: DatadogAPIClient::V1::SyntheticsBasicAuthWeb.new({ password: "PaSSw0RD!", type: DatadogAPIClient::V1::SyntheticsBasicAuthWebType::WEB, username: "my_username", }), device_ids: [ "chrome.laptop_large", ], locations: [ "aws:eu-west-3", ], metadata: DatadogAPIClient::V1::SyntheticsCIBatchMetadata.new({ ci: DatadogAPIClient::V1::SyntheticsCIBatchMetadataCI.new({ pipeline: DatadogAPIClient::V1::SyntheticsCIBatchMetadataPipeline.new({}), provider: DatadogAPIClient::V1::SyntheticsCIBatchMetadataProvider.new({}), }), git: DatadogAPIClient::V1::SyntheticsCIBatchMetadataGit.new({}), }), public_id: "aaa-aaa-aaa", _retry: DatadogAPIClient::V1::SyntheticsTestOptionsRetry.new({}), }), ], }) p api_instance.trigger_ci_tests(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Trigger tests from CI/CD pipelines ``` // Trigger tests from CI/CD pipelines returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsCITestBody{ Tests: []datadogV1.SyntheticsCITest{ { BasicAuth: &datadogV1.SyntheticsBasicAuth{ SyntheticsBasicAuthWeb: &datadogV1.SyntheticsBasicAuthWeb{ Password: datadog.PtrString("PaSSw0RD!"), Type: datadogV1.SYNTHETICSBASICAUTHWEBTYPE_WEB.Ptr(), Username: datadog.PtrString("my_username"), }}, DeviceIds: []string{ "chrome.laptop_large", }, Locations: []string{ "aws:eu-west-3", }, Metadata: &datadogV1.SyntheticsCIBatchMetadata{ Ci: &datadogV1.SyntheticsCIBatchMetadataCI{ Pipeline: &datadogV1.SyntheticsCIBatchMetadataPipeline{}, Provider: &datadogV1.SyntheticsCIBatchMetadataProvider{}, }, Git: &datadogV1.SyntheticsCIBatchMetadataGit{}, }, PublicId: "aaa-aaa-aaa", Retry: &datadogV1.SyntheticsTestOptionsRetry{}, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.TriggerCITests(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.TriggerCITests`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.TriggerCITests`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Trigger tests from CI/CD pipelines ``` // Trigger tests from CI/CD pipelines returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBasicAuth; import 
com.datadog.api.client.v1.model.SyntheticsBasicAuthWeb; import com.datadog.api.client.v1.model.SyntheticsBasicAuthWebType; import com.datadog.api.client.v1.model.SyntheticsCIBatchMetadata; import com.datadog.api.client.v1.model.SyntheticsCIBatchMetadataCI; import com.datadog.api.client.v1.model.SyntheticsCIBatchMetadataGit; import com.datadog.api.client.v1.model.SyntheticsCIBatchMetadataPipeline; import com.datadog.api.client.v1.model.SyntheticsCIBatchMetadataProvider; import com.datadog.api.client.v1.model.SyntheticsCITest; import com.datadog.api.client.v1.model.SyntheticsCITestBody; import com.datadog.api.client.v1.model.SyntheticsTestOptionsRetry; import com.datadog.api.client.v1.model.SyntheticsTriggerCITestsResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsCITestBody body = new SyntheticsCITestBody() .tests( Collections.singletonList( new SyntheticsCITest() .basicAuth( new SyntheticsBasicAuth( new SyntheticsBasicAuthWeb() .password("PaSSw0RD!") .type(SyntheticsBasicAuthWebType.WEB) .username("my_username"))) .deviceIds(Collections.singletonList("chrome.laptop_large")) .locations(Collections.singletonList("aws:eu-west-3")) .metadata( new SyntheticsCIBatchMetadata() .ci( new SyntheticsCIBatchMetadataCI() .pipeline(new SyntheticsCIBatchMetadataPipeline()) .provider(new SyntheticsCIBatchMetadataProvider())) .git(new SyntheticsCIBatchMetadataGit())) .publicId("aaa-aaa-aaa") .retry(new SyntheticsTestOptionsRetry()))); try { SyntheticsTriggerCITestsResponse result = apiInstance.triggerCITests(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#triggerCITests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Trigger tests from CI/CD pipelines ``` // Trigger tests from CI/CD pipelines returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsBasicAuth; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWeb; use datadog_api_client::datadogV1::model::SyntheticsBasicAuthWebType; use datadog_api_client::datadogV1::model::SyntheticsCIBatchMetadata; use datadog_api_client::datadogV1::model::SyntheticsCIBatchMetadataCI; use datadog_api_client::datadogV1::model::SyntheticsCIBatchMetadataGit; use datadog_api_client::datadogV1::model::SyntheticsCIBatchMetadataPipeline; use datadog_api_client::datadogV1::model::SyntheticsCIBatchMetadataProvider; use datadog_api_client::datadogV1::model::SyntheticsCITest; use datadog_api_client::datadogV1::model::SyntheticsCITestBody; use datadog_api_client::datadogV1::model::SyntheticsTestOptionsRetry; #[tokio::main] async fn main() { let body = SyntheticsCITestBody::new().tests(vec![SyntheticsCITest::new("aaa-aaa-aaa".to_string()) 
.basic_auth(SyntheticsBasicAuth::SyntheticsBasicAuthWeb(Box::new( SyntheticsBasicAuthWeb::new() .password("PaSSw0RD!".to_string()) .type_(SyntheticsBasicAuthWebType::WEB) .username("my_username".to_string()), ))) .device_ids(vec!["chrome.laptop_large".to_string()]) .locations(vec!["aws:eu-west-3".to_string()]) .metadata( SyntheticsCIBatchMetadata::new() .ci(SyntheticsCIBatchMetadataCI::new() .pipeline(SyntheticsCIBatchMetadataPipeline::new()) .provider(SyntheticsCIBatchMetadataProvider::new())) .git(SyntheticsCIBatchMetadataGit::new()), ) .retry(SyntheticsTestOptionsRetry::new())]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.trigger_ci_tests(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Trigger tests from CI/CD pipelines ``` /** * Trigger tests from CI/CD pipelines returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiTriggerCITestsRequest = { body: { tests: [ { basicAuth: { password: "PaSSw0RD!", type: "web", username: "my_username", }, deviceIds: ["chrome.laptop_large"], locations: ["aws:eu-west-3"], metadata: { ci: { pipeline: {}, provider: {}, }, git: {}, }, publicId: "aaa-aaa-aaa", retry: {}, }, ], }, }; apiInstance .triggerCITests(params) .then((data: v1.SyntheticsTriggerCITestsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Trigger Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#trigger-synthetic-tests) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#trigger-synthetic-tests-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/triggerhttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/triggerhttps://api.datadoghq.eu/api/v1/synthetics/tests/triggerhttps://api.ddog-gov.com/api/v1/synthetics/tests/triggerhttps://api.datadoghq.com/api/v1/synthetics/tests/triggerhttps://api.us3.datadoghq.com/api/v1/synthetics/tests/triggerhttps://api.us5.datadoghq.com/api/v1/synthetics/tests/trigger ### Overview Trigger a set of Synthetic tests. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) The identifiers of the tests to trigger. 
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description tests [_required_] [object] List of Synthetic tests. metadata object Metadata for the Synthetic tests run. ci object Description of the CI provider. pipeline object Description of the CI pipeline. url string URL of the pipeline. provider object Description of the CI provider. name string Name of the CI provider. git object Git information. branch string Branch name. commitSha string The commit SHA. public_id [_required_] string The public ID of the Synthetic test to trigger. ``` { "tests": [ { "public_id": "123-abc-456" } ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#TriggerTests-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#TriggerTests-400-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#TriggerTests-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing information about the tests triggered. Field Type Description batch_id string The public ID of the batch triggered. locations [object] List of Synthetic locations. id int64 Unique identifier of the location. name string Name of the location. results [object] Information about the tests runs. device string The device ID. location int64 The location ID of the test run. public_id string The public ID of the Synthetic test. result_id string ID of the result. triggered_check_ids [string] The public IDs of the Synthetic test triggered. ``` { "batch_id": "string", "locations": [ { "id": "integer", "name": "string" } ], "results": [ { "device": "chrome.laptop_large", "location": "integer", "public_id": "string", "result_id": "string" } ], "triggered_check_ids": [] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)

##### Trigger Synthetic tests returns "OK" response

```
# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com,
# api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X POST "https://api.datadoghq.com/api/v1/synthetics/tests/trigger" \
-H "Accept: application/json" \
-H "Content-Type: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
-d @- << EOF
{
  "tests": [
    {
      "public_id": "123-abc-456"
    }
  ]
}
EOF
```

##### Trigger Synthetic tests returns "OK" response

```
// Trigger Synthetic tests returns "OK" response
package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    // there is a valid "synthetics_api_test" in the system
    SyntheticsAPITestPublicID := os.Getenv("SYNTHETICS_API_TEST_PUBLIC_ID")

    body := datadogV1.SyntheticsTriggerBody{
        Tests: []datadogV1.SyntheticsTriggerTest{
            {
                PublicId: SyntheticsAPITestPublicID,
            },
        },
    }
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV1.NewSyntheticsApi(apiClient)
    resp, r, err := api.TriggerTests(ctx, body)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.TriggerTests`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.TriggerTests`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Trigger Synthetic tests returns "OK" response

``` // Trigger Synthetic tests returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsTriggerBody; import com.datadog.api.client.v1.model.SyntheticsTriggerCITestsResponse; import com.datadog.api.client.v1.model.SyntheticsTriggerTest; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_api_test" in the system String SYNTHETICS_API_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_API_TEST_PUBLIC_ID"); SyntheticsTriggerBody body = new SyntheticsTriggerBody() .tests( Collections.singletonList( new
SyntheticsTriggerTest().publicId(SYNTHETICS_API_TEST_PUBLIC_ID))); try { SyntheticsTriggerCITestsResponse result = apiInstance.triggerTests(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#triggerTests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Trigger Synthetic tests returns "OK" response ``` """ Trigger Synthetic tests returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_trigger_body import SyntheticsTriggerBody from datadog_api_client.v1.model.synthetics_trigger_test import SyntheticsTriggerTest # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = environ["SYNTHETICS_API_TEST_PUBLIC_ID"] body = SyntheticsTriggerBody( tests=[ SyntheticsTriggerTest( public_id=SYNTHETICS_API_TEST_PUBLIC_ID, ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.trigger_tests(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Trigger Synthetic tests returns "OK" response ``` # Trigger Synthetic tests returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = ENV["SYNTHETICS_API_TEST_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsTriggerBody.new({ tests: [ DatadogAPIClient::V1::SyntheticsTriggerTest.new({ public_id: SYNTHETICS_API_TEST_PUBLIC_ID, }), ], }) p api_instance.trigger_tests(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Trigger Synthetic tests returns "OK" response ``` // Trigger Synthetic tests returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsTriggerBody; use datadog_api_client::datadogV1::model::SyntheticsTriggerTest; #[tokio::main] async fn main() { // there is a valid "synthetics_api_test" in the system let synthetics_api_test_public_id = std::env::var("SYNTHETICS_API_TEST_PUBLIC_ID").unwrap(); let body = SyntheticsTriggerBody::new(vec![SyntheticsTriggerTest::new( 
synthetics_api_test_public_id.clone(), )]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.trigger_tests(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Trigger Synthetic tests returns "OK" response ``` /** * Trigger Synthetic tests returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_api_test" in the system const SYNTHETICS_API_TEST_PUBLIC_ID = process.env .SYNTHETICS_API_TEST_PUBLIC_ID as string; const params: v1.SyntheticsApiTriggerTestsRequest = { body: { tests: [ { publicId: SYNTHETICS_API_TEST_PUBLIC_ID, }, ], }, }; apiInstance .triggerTests(params) .then((data: v1.SyntheticsTriggerCITestsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a Mobile test](https://docs.datadoghq.com/api/latest/synthetics/#get-a-mobile-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-mobile-test-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/mobile/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/mobile/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/mobile/{public_id} ### Overview Get the detailed configuration associated with a Synthetic Mobile test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetMobileTest-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetMobileTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetMobileTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetMobileTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic mobile test. Field Type Description config [_required_] object Configuration object for a Synthetic mobile test. 
initialApplicationArguments object Initial application arguments for a mobile test. string A single application argument. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` device_ids [string] Array with the different device IDs used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. allowApplicationCrash boolean A boolean to set if an application crash would mark the test as failed. bindings [object] Array of bindings used for the mobile test. principals [string] List of principals for a mobile test binding. relation enum The type of relation for the binding. Allowed enum values: `editor,viewer` ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` defaultStepTimeout int32 The default timeout for steps in the test (in seconds). device_ids [_required_] [string] For mobile test, array with the different device IDs used to run the test. disableAutoAcceptAlert boolean A boolean to disable auto accepting alerts. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. mobileApplication [_required_] object Mobile application for mobile synthetics test. applicationId [_required_] string Application ID of the mobile application. referenceId [_required_] string Reference ID of the mobile application. referenceType [_required_] enum Reference type for the mobile application for a mobile synthetics test. Allowed enum values: `latest,version` monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean A boolean set to not take a screenshot for the step. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. scheduling object Object containing timeframes and timezone used for advanced scheduling. 
timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every [_required_] int64 The frequency at which to run the Synthetic test (in seconds). verbosity int32 The level of verbosity for the mobile test. This field can not be set by a user. public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. hasNewStepElement boolean A boolean set to determine if the step has a new step element. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name [_required_] string The name of the step. noScreenshot boolean A boolean set to not take a screenshot for the step. params [_required_] object The parameters of a mobile step. check enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` delay int64 Number of milliseconds to wait between inputs in a `typeText` step type. direction enum The direction of the scroll for a `scrollToElement` step type. Allowed enum values: `up,down,left,right` element object Information about the element used for a step. context string Context of the element. contextType enum Type of the context that the element is in. Allowed enum values: `native,web` elementDescription string Description of the element. multiLocator object Multi-locator to find the element. relativePosition object Position of the action relative to the element. x double The `relativePosition` on the `x` axis for the element. y double The `relativePosition` on the `y` axis for the element. textContent string Text content of the element. userLocator object User locator to find the element. failTestOnCannotLocate boolean Whether if the test should fail if the element cannot be found. values [object] Values of the user locator. type enum Type of a user locator. Allowed enum values: `accessibility-id,id,ios-predicate-string,ios-class-chain,xpath` value string Value of a user locator. viewName string Name of the view of the element. enabled boolean Boolean to change the state of the wifi for a `toggleWiFi` step type. maxScrolls int64 Maximum number of scrolls to do for a `scrollToElement` step type. positions [object] List of positions for the `flick` step type. The maximum is 10 flicks per step x double The `x` position for the flick. y double The `y` position for the flick. subtestPublicId string Public ID of the test to be played as part of a `playSubTest` step type. value Values used in the step for in multiple step types. Option 1 string Value used in the step for in multiple step types. Option 2 int64 Value used in the step for in multiple step types. variable object Variable object for `extractVariable` step type. example [_required_] string An example for the variable. name [_required_] string The variable name. withEnter boolean Boolean to indicate if `Enter` should be pressed at the end of the `typeText` step type. 
x double Amount to scroll by on the `x` axis for a `scroll` step type. y double Amount to scroll by on the `y` axis for a `scroll` step type. publicId string The public ID of the step. timeout int64 The time before declaring a step failed. type [_required_] enum Step type used in your mobile Synthetic test. Allowed enum values: `assertElementContent,assertScreenContains,assertScreenLacks,doubleTap,extractVariable,flick,openDeeplink,playSubTest,pressBack,restartApplication,rotate,scroll,scrollToElement,tap,toggleWiFi,typeText,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `mobile`. Allowed enum values: `mobile` default: `mobile` ``` { "config": { "initialApplicationArguments": { "": "string" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "device_ids": [ "chrome.laptop_large" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "allowApplicationCrash": false, "bindings": [ { "principals": [], "relation": "string" } ], "ci": { "executionRule": "blocking" }, "defaultStepTimeout": "integer", "device_ids": [ "synthetics:mobile:device:apple_ipad_10th_gen_2022_ios_16" ], "disableAutoAcceptAlert": false, "min_failure_duration": "integer", "mobileApplication": { "applicationId": "00000000-0000-0000-0000-aaaaaaaaaaaa", "referenceId": "00000000-0000-0000-0000-aaaaaaaaaaab", "referenceType": "latest" }, "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": 300, "verbosity": "integer" }, "public_id": "123-abc-456", "status": "live", "steps": [ { "allowFailure": false, "hasNewStepElement": false, "isCritical": false, "name": "", "noScreenshot": false, "params": { "check": "string", "delay": "integer", "direction": "string", "element": { "context": "string", "contextType": "string", "elementDescription": "string", "multiLocator": {}, "relativePosition": { "x": "number", "y": "number" }, "textContent": "string", "userLocator": { "failTestOnCannotLocate": false, "values": [ { "type": "string", "value": "string" } ] }, "viewName": "string" }, "enabled": false, "maxScrolls": "integer", "positions": [ { "x": "number", "y": "number" } ], "subtestPublicId": "string", "value": { "description": "undefined", "type": "undefined" }, "variable": { "example": "", "name": "VAR_NAME" }, "withEnter": false, "x": "number", "y": "number" }, "publicId": "pub-lic-id0", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:production" ], "type": "mobile" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

- Synthetic Monitoring is not activated for the user
- Test is not owned by the user

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)

##### Get a Mobile test

```
# Path parameters
export public_id="CHANGE_ME"
# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com,
# api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X GET "https://api.datadoghq.com/api/v1/synthetics/tests/mobile/${public_id}" \
-H "Accept: application/json" \
-H "DD-API-KEY: ${DD_API_KEY}" \
-H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a Mobile test

```
"""
Get a Mobile test returns "OK" response
"""

from os import environ
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

# there is a valid "synthetics_mobile_test" in the system
SYNTHETICS_MOBILE_TEST_PUBLIC_ID = environ["SYNTHETICS_MOBILE_TEST_PUBLIC_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    response = api_instance.get_mobile_test(
        public_id=SYNTHETICS_MOBILE_TEST_PUBLIC_ID,
    )
    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get a Mobile test

```
# Get a Mobile test returns "OK" response
require "datadog_api_client"

api_instance = DatadogAPIClient::V1::SyntheticsAPI.new

# there is a valid "synthetics_mobile_test" in the system
SYNTHETICS_MOBILE_TEST_PUBLIC_ID = ENV["SYNTHETICS_MOBILE_TEST_PUBLIC_ID"]
p api_instance.get_mobile_test(SYNTHETICS_MOBILE_TEST_PUBLIC_ID)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a Mobile test

``` // Get a Mobile test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os"
"github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_mobile_test" in the system SyntheticsMobileTestPublicID := os.Getenv("SYNTHETICS_MOBILE_TEST_PUBLIC_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetMobileTest(ctx, SyntheticsMobileTestPublicID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetMobileTest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetMobileTest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a Mobile test ``` // Get a Mobile test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsMobileTest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_mobile_test" in the system String SYNTHETICS_MOBILE_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_MOBILE_TEST_PUBLIC_ID"); try { SyntheticsMobileTest result = apiInstance.getMobileTest(SYNTHETICS_MOBILE_TEST_PUBLIC_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getMobileTest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a Mobile test ``` // Get a Mobile test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { // there is a valid "synthetics_mobile_test" in the system let synthetics_mobile_test_public_id = std::env::var("SYNTHETICS_MOBILE_TEST_PUBLIC_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .get_mobile_test(synthetics_mobile_test_public_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a Mobile test ``` /** * Get a Mobile test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_mobile_test" in the system const SYNTHETICS_MOBILE_TEST_PUBLIC_ID = process.env .SYNTHETICS_MOBILE_TEST_PUBLIC_ID as string; const params: v1.SyntheticsApiGetMobileTestRequest = { publicId: SYNTHETICS_MOBILE_TEST_PUBLIC_ID, }; apiInstance .getMobileTest(params) .then((data: v1.SyntheticsMobileTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an API test](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/api/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/api/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/api/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/api/{public_id} ### Overview Get the detailed configuration associated with a Synthetic API test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITest-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic API test. Field Type Description config [_required_] object Configuration object for a Synthetic API test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. 
Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. 
Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. 
headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. steps [ ] When the test subtype is `multi`, the steps of the test. Option 1 object The Test step used in a Synthetic multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. 
Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValues [object] Array of values to parse and save as variables from the response. field string When type is `http_header` or `grpc_metadata`, name of the header or metadatum to extract. name string Name of the variable to extract. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. 
Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. secure boolean Determines whether or not the extracted value will be obfuscated. type enum Property of the Synthetic Test Response to extract into a local variable. Allowed enum values: `grpc_message,grpc_metadata,http_body,http_header,http_status_code` extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. 
Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. 
Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtype [_required_] enum The subtype of the Synthetic multi-step API test step. Allowed enum values: `http,grpc,ssl,dns,tcp,udp,icmp,websocket` Option 2 object The Wait step used in a Synthetic multi-step API test. id string ID of the step. name [_required_] string The name of the step. subtype [_required_] enum The subtype of the Synthetic multi-step API wait step. Allowed enum values: `wait` value [_required_] int32 The time to wait in seconds. Minimum value: 0. Maximum value: 180. Option 3 object The subtest step used in a Synthetics multi-step API test. allowFailure boolean Determines whether or not to continue with test if this step fails. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean Determines whether or not to exit the test if the step succeeds. extractedValuesFromScript string Generate variables using JavaScript. id string ID of the step. isCritical boolean Determines whether or not to consider the entire test as failed if this step fails. Can be used only if `allowFailure` is `true`. name [_required_] string The name of the step. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. subtestPublicId [_required_] string Public ID of the test to be played as part of a `playSubTest` step type. subtype [_required_] enum The subtype of the Synthetic multi-step API subtest step. 
Allowed enum values: `playSubTest` variablesFromScript string Variables defined from JavaScript code. locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. 
`{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID for the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `api`. Allowed enum values: `api` default: `api` ``` { "config": { "assertions": [ [] ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "steps": [], "variablesFromScript": "dd.variable.set(\"FOO\", \"foo\")" }, "locations": [ "aws:eu-west-3" ], "message": "Notification message", "monitor_id": 12345678, "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" 
}, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "123-abc-456", "status": "live", "subtype": "http", "tags": [ "env:production" ], "type": "api" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get an API test Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X GET "https://api.datadoghq.com/api/v1/synthetics/tests/api/${public_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an API test ``` """ Get an API test returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_api_test( public_id="public_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an API test ``` # Get an API test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_api_test("public_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get an API test ``` // Get an API test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetAPITest(ctx, "public_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetAPITest`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetAPITest`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ```
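The Go example above serializes the entire response back to JSON. When you only need a few fields, the call also returns a typed `SyntheticsAPITest` model that can be read directly. The snippet below is a minimal sketch rather than an official example: it assumes the same client setup and environment variables as above, a placeholder `public_id`, and the generated `Get...()` accessors that the Go client exposes for each field of the model.

```
// Minimal sketch (not an official example): read selected fields from the
// typed SyntheticsAPITest model returned by GetAPITest. Assumes the same
// credential/site environment variables as the examples above and a
// placeholder "public_id".
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	apiClient := datadog.NewAPIClient(datadog.NewConfiguration())
	api := datadogV1.NewSyntheticsApi(apiClient)

	test, _, err := api.GetAPITest(ctx, "public_id")
	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetAPITest`: %v\n", err)
		return
	}

	// Generated accessors return the zero value when an optional field is unset.
	fmt.Printf("name=%q status=%s monitor_id=%d\n", test.GetName(), test.GetStatus(), test.GetMonitorId())
	fmt.Printf("locations=%v subtype=%s\n", test.GetLocations(), test.GetSubtype())
}
```

The other client libraries follow the same pattern: each returns a typed test model whose fields mirror the response schema documented above, so dumping the full JSON is only one way to consume the result.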
##### Get an API test ``` // Get an API test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPITest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsAPITest result = apiInstance.getAPITest("public_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getAPITest"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an API test ``` // Get an API test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.get_api_test("public_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an API test ``` /** * Get an API test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetAPITestRequest = { publicId: "public_id", }; apiInstance .getAPITest(params) .then((data: v1.SyntheticsAPITest) => { console.log( "API called successfully.
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a browser test](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/browser/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/browser/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/{public_id} ### Overview Get the detailed configuration (including steps) associated with a Synthetic browser test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTest-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about a Synthetic browser test. Field Type Description config [_required_] object Configuration object for a Synthetic browser test. assertions [_required_] [ ] Array of assertions used for the test. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions.
Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. 
type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request [_required_] object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. 
scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. 
persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. setCookie string Cookies to be used for the request, using the [Set-Cookie](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Set-Cookie) syntax. variables [object] Array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` locations [_required_] [string] Array of locations used to run the test. message [_required_] string Notification message associated with the test. Message can either be text or an empty string. monitor_id int64 The associated monitor ID. name [_required_] string Name of the test. options [_required_] object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. 
initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The public ID of the test. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] Array of steps for the test. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. 
public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` tags [string] Array of tags attached to the test. type [_required_] enum Type of the Synthetic test, `browser`. Allowed enum values: `browser` default: `browser` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "setCookie": "string", "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "", "monitor_id": "integer", "name": "Example test name", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { 
"count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "tags": [ "env:prod" ], "type": "browser" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get a browser test Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/${public_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a browser test ``` """ Get a browser test returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_browser_test( public_id="public_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a browser test ``` # Get a browser test returns "OK" response require "datadog_api_client" api_instance = 
##### Get a browser test

```
# Get a browser test returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::SyntheticsAPI.new
p api_instance.get_browser_test("public_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a browser test

```
// Get a browser test returns "OK" response

package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV1.NewSyntheticsApi(apiClient)
    resp, r, err := api.GetBrowserTest(ctx, "public_id")

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetBrowserTest`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetBrowserTest`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a browser test

```
// Get a browser test returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.SyntheticsApi;
import com.datadog.api.client.v1.model.SyntheticsBrowserTest;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    SyntheticsApi apiInstance = new SyntheticsApi(defaultClient);

    try {
      SyntheticsBrowserTest result = apiInstance.getBrowserTest("public_id");
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling SyntheticsApi#getBrowserTest");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get a browser test

```
// Get a browser test returns "OK" response

use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = SyntheticsAPI::with_config(configuration);
    let resp = api.get_browser_test("public_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following
commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a browser test ``` /** * Get a browser test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetBrowserTestRequest = { publicId: "public_id", }; apiInstance .getBrowserTest(params) .then((data: v1.SyntheticsBrowserTest) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the on-demand concurrency cap](https://docs.datadoghq.com/api/latest/synthetics/#get-the-on-demand-concurrency-cap) * [v2 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-the-on-demand-concurrency-cap-v2) GET https://api.ap1.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.ap2.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.datadoghq.eu/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.ddog-gov.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.us3.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.us5.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_cap ### Overview Get the on-demand concurrency cap. This endpoint requires the `billing_read` permission. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetOnDemandConcurrencyCap-200-v2) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetOnDemandConcurrencyCap-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) On-demand concurrency cap response. Field Type Description data object On-demand concurrency cap. attributes object On-demand concurrency cap attributes. on_demand_concurrency_cap double Value of the on-demand concurrency cap. type enum On-demand concurrency cap type. Allowed enum values: `on_demand_concurrency_cap` ``` { "data": { "attributes": { "on_demand_concurrency_cap": "number" }, "type": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get the on-demand concurrency cap Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_cap" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the on-demand concurrency cap ``` """ Get the on-demand concurrency cap returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_on_demand_concurrency_cap() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the on-demand concurrency cap ``` # Get the on-demand concurrency cap returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SyntheticsAPI.new p api_instance.get_on_demand_concurrency_cap() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the on-demand concurrency cap ``` // Get the on-demand concurrency cap returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSyntheticsApi(apiClient) resp, r, err := api.GetOnDemandConcurrencyCap(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetOnDemandConcurrencyCap`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetOnDemandConcurrencyCap`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the on-demand concurrency cap ``` // Get the on-demand concurrency cap returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SyntheticsApi; import com.datadog.api.client.v2.model.OnDemandConcurrencyCapResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { OnDemandConcurrencyCapResponse result = apiInstance.getOnDemandConcurrencyCap(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getOnDemandConcurrencyCap"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the on-demand concurrency cap ``` // Get the on-demand concurrency cap returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.get_on_demand_concurrency_cap().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the on-demand concurrency cap ``` /** * Get the on-demand concurrency cap returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SyntheticsApi(configuration); apiInstance .getOnDemandConcurrencyCap() .then((data: v2.OnDemandConcurrencyCapResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of all Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#get-the-list-of-all-synthetic-tests) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-the-list-of-all-synthetic-tests-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/testshttps://api.ap2.datadoghq.com/api/v1/synthetics/testshttps://api.datadoghq.eu/api/v1/synthetics/testshttps://api.ddog-gov.com/api/v1/synthetics/testshttps://api.datadoghq.com/api/v1/synthetics/testshttps://api.us3.datadoghq.com/api/v1/synthetics/testshttps://api.us5.datadoghq.com/api/v1/synthetics/tests ### Overview Get the list of all Synthetic tests. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Query Strings Name Type Description page_size integer Used for pagination. The number of tests returned in the page. page_number integer Used for pagination. Which page you want to retrieve. Starts at zero. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#ListTests-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#ListTests-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#ListTests-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#ListTests-429-v1) OK - Returns the list of all Synthetic tests. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing an array of Synthetic tests configuration. Field Type Description tests [object] Array of Synthetic tests configuration. config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. 
Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. 
type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. 
Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. 
min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. 
Allowed enum values: `api,browser,mobile` ``` { "tests": [ { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "subtype": "http", "tags": [], "type": "string" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Synthetic Monitoring is not activated for the user.

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object.

Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby-legacy)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python-legacy)

##### Get the list of all Synthetic tests

```
# Curl command (replace api.datadoghq.com with the API host for your Datadog site)
curl -X GET "https://api.datadoghq.com/api/v1/synthetics/tests" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get the list of all Synthetic tests

```
"""
Get the list of all Synthetic tests returns "OK - Returns the list of all Synthetic tests." response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    response = api_instance.list_tests()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
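The `page_size` and `page_number` query strings documented above can be combined to walk the full list of tests; `page_number` starts at zero. The sketch below calls the plain REST endpoint from the curl example with `requests`; the `iter_all_tests` helper and the stop-on-short-page convention are assumptions of this sketch rather than documented behavior.

```
# Minimal pagination sketch using the documented page_size / page_number
# query parameters. Stopping when a page comes back short is a convention
# of this sketch, not something the API reference guarantees.
import os

import requests

BASE_URL = "https://api.datadoghq.com"  # replace with the API host for your site
HEADERS = {
    "Accept": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}


def iter_all_tests(page_size=100):
    page_number = 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/api/v1/synthetics/tests",
            headers=HEADERS,
            params={"page_size": page_size, "page_number": page_number},
        )
        resp.raise_for_status()
        tests = resp.json().get("tests", [])
        yield from tests
        if len(tests) < page_size:
            break
        page_number += 1


for test in iter_all_tests():
    print(test.get("public_id"), test.get("name"))
```

The Go example below passes `*datadogV1.NewListTestsOptionalParameters()` for the same purpose when using the official client.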
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.list_tests() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of all Synthetic tests ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) dog.get_all_synthetics_tests() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of all Synthetic tests ``` // Get the list of all Synthetic tests returns "OK - Returns the list of all Synthetic tests." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.ListTests(ctx, *datadogV1.NewListTestsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.ListTests`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.ListTests`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of all Synthetic tests ``` // Get the list of all Synthetic tests returns "OK - Returns the list of all Synthetic tests." 
// response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsListTestsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsListTestsResponse result = apiInstance.listTests(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#listTests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of all Synthetic tests ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } initialize(**options) api.Synthetics.get_all_tests() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get the list of all Synthetic tests ``` // Get the list of all Synthetic tests returns "OK - Returns the list of all // Synthetic tests." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::ListTestsOptionalParams; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.list_tests(ListTestsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of all Synthetic tests ``` /** * Get the list of all Synthetic tests returns "OK - Returns the list of all Synthetic tests." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); apiInstance .listTests() .then((data: v1.SyntheticsListTestsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Save new value for on-demand concurrency cap](https://docs.datadoghq.com/api/latest/synthetics/#save-new-value-for-on-demand-concurrency-cap) * [v2 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#save-new-value-for-on-demand-concurrency-cap-v2) POST https://api.ap1.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.ap2.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.datadoghq.eu/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.ddog-gov.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.us3.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_caphttps://api.us5.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_cap ### Overview Save new value for on-demand concurrency cap. This endpoint requires the `billing_edit` permission. ### Request #### Body Data (required) . * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Expand All Field Type Description on_demand_concurrency_cap double Value of the on-demand concurrency cap. ``` { "on_demand_concurrency_cap": 20 } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#SetOnDemandConcurrencyCap-200-v2) * [429](https://docs.datadoghq.com/api/latest/synthetics/#SetOnDemandConcurrencyCap-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) On-demand concurrency cap response. Field Type Description data object On-demand concurrency cap. attributes object On-demand concurrency cap attributes. on_demand_concurrency_cap double Value of the on-demand concurrency cap. type enum On-demand concurrency cap type. Allowed enum values: `on_demand_concurrency_cap` ``` { "data": { "attributes": { "on_demand_concurrency_cap": "number" }, "type": "string" } } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Save new value for on-demand concurrency cap returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/synthetics/settings/on_demand_concurrency_cap" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "on_demand_concurrency_cap": 20 } EOF ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` // Save new value for on-demand concurrency cap returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.OnDemandConcurrencyCapAttributes{ OnDemandConcurrencyCap: datadog.PtrFloat64(20), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewSyntheticsApi(apiClient) resp, r, err := api.SetOnDemandConcurrencyCap(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.SetOnDemandConcurrencyCap`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.SetOnDemandConcurrencyCap`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` // Save new value for on-demand concurrency cap returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.SyntheticsApi; import com.datadog.api.client.v2.model.OnDemandConcurrencyCapAttributes; import com.datadog.api.client.v2.model.OnDemandConcurrencyCapResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); OnDemandConcurrencyCapAttributes body = new OnDemandConcurrencyCapAttributes().onDemandConcurrencyCap(20.0); try { OnDemandConcurrencyCapResponse result = apiInstance.setOnDemandConcurrencyCap(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#setOnDemandConcurrencyCap"); System.err.println("Status code: " + 
e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` """ Save new value for on-demand concurrency cap returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.synthetics_api import SyntheticsApi from datadog_api_client.v2.model.on_demand_concurrency_cap_attributes import OnDemandConcurrencyCapAttributes body = OnDemandConcurrencyCapAttributes( on_demand_concurrency_cap=20.0, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.set_on_demand_concurrency_cap(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` # Save new value for on-demand concurrency cap returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::SyntheticsAPI.new body = DatadogAPIClient::V2::OnDemandConcurrencyCapAttributes.new({ on_demand_concurrency_cap: 20, }) p api_instance.set_on_demand_concurrency_cap(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` // Save new value for on-demand concurrency cap returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV2::model::OnDemandConcurrencyCapAttributes; #[tokio::main] async fn main() { let body = OnDemandConcurrencyCapAttributes::new().on_demand_concurrency_cap(20.0 as f64); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.set_on_demand_concurrency_cap(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Save new value for on-demand concurrency cap returns "OK" response ``` /** * Save new value for on-demand concurrency cap returns "OK" response */ 
import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.SyntheticsApi(configuration); const params: v2.SyntheticsApiSetOnDemandConcurrencyCapRequest = { body: { onDemandConcurrencyCap: 20, }, }; apiInstance .setOnDemandConcurrencyCap(params) .then((data: v2.OnDemandConcurrencyCapResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an API test result](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test-result) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-test-result-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.datadoghq.com/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}/results/{result_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id}/results/{result_id} ### Overview Get a specific full result from a given Synthetic API test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the API test to which the target result belongs. result_id [_required_] string The ID of the result to get. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestResult-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestResult-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestResult-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestResult-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object returned describing a API test result. Field Type Description check object Object describing the API test configuration. config [_required_] object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. 
Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. 
Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. 
headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` check_time double When the API test was conducted. check_version int64 Version of the API test used. probe_dc string Locations for which to query the API test results. result object Object containing results for your Synthetic API test. cert object Object describing the SSL certificate used for a Synthetic test. cipher string Cipher used for the connection. exponent double Exponent associated to the certificate. extKeyUsage [string] Array of extensions and details used for the certificate. fingerprint string MD5 digest of the DER-encoded Certificate information. fingerprint256 string SHA-1 digest of the DER-encoded Certificate information. issuer object Object describing the issuer of a SSL certificate. C string Country Name that issued the certificate. CN string Common Name that issued certificate. L string Locality that issued the certificate. O string Organization that issued the certificate. OU string Organizational Unit that issued the certificate. ST string State Or Province Name that issued the certificate. modulus string Modulus associated to the SSL certificate private key. protocol string TLS protocol used for the test. serialNumber string Serial Number assigned by Symantec to the SSL certificate. 
subject object Object describing the SSL certificate used for the test. C string Country Name associated with the certificate. CN string Common Name associated with the certificate. L string Locality associated with the certificate. O string Organization associated with the certificate. OU string Organizational Unit associated with the certificate. ST string State Or Province Name associated with the certificate. altName string Subject Alternative Name associated with the certificate. validFrom date-time Date from which the SSL certificate is valid. validTo date-time Date until which the SSL certificate is valid. eventType enum Status of a Synthetic test. Allowed enum values: `not_scheduled,scheduled,finished,finished_with_error` failure object The API test failure details. code enum Error code that can be returned by a Synthetic test. Allowed enum values: `BODY_TOO_LARGE,DENIED,TOO_MANY_REDIRECTS,AUTHENTICATION_ERROR,DECRYPTION,INVALID_CHAR_IN_HEADER,HEADER_TOO_LARGE,HEADERS_INCOMPATIBLE_CONTENT_LENGTH,INVALID_REQUEST,REQUIRES_UPDATE,UNESCAPED_CHARACTERS_IN_REQUEST_PATH,MALFORMED_RESPONSE,INCORRECT_ASSERTION,CONNREFUSED,CONNRESET,DNS,HOSTUNREACH,NETUNREACH,TIMEOUT,SSL,OCSP,INVALID_TEST,TUNNEL,WEBSOCKET,UNKNOWN,INTERNAL_ERROR` message string The API test error message. httpStatusCode int64 The API test HTTP status code. requestHeaders object Request header object used for the API test. object Requested request header. responseBody string Response body returned for the API test. responseHeaders object Response headers returned for the API test. Returned request header. responseSize int64 Global size in bytes of the API test response. timings object Object containing all metrics and their values collected for a Synthetic API test. See the [Synthetic Monitoring Metrics documentation](https://docs.datadoghq.com/synthetics/metrics/). dns double The duration in milliseconds of the DNS lookup. download double The time in milliseconds to download the response. firstByte double The time in milliseconds to first byte. handshake double The duration in milliseconds of the TLS handshake. redirect double The time in milliseconds spent during redirections. ssl double The duration in milliseconds of the TLS handshake. tcp double Time in milliseconds to establish the TCP connection. total double The overall time in milliseconds the request took to be processed. wait double Time spent in milliseconds waiting for a response. result_id string ID of the API test result. status enum The status of your Synthetic monitor. 
* `0` for not triggered * `1` for triggered * `2` for no data Allowed enum values: `0,1,2` ``` { "check": { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] } }, "check_time": "number", "check_version": "integer", "probe_dc": "string", "result": { "cert": { "cipher": "string", "exponent": "number", "extKeyUsage": [], "fingerprint": "string", "fingerprint256": "string", "issuer": { "C": "string", "CN": "string", "L": "string", "O": "string", "OU": "string", "ST": "string" }, "modulus": "string", "protocol": "string", "serialNumber": "string", "subject": { "C": "string", "CN": "string", "L": "string", "O": "string", "OU": "string", "ST": "string", "altName": "string" }, "validFrom": "2019-09-19T10:00:00.000Z", "validTo": "2019-09-19T10:00:00.000Z" }, "eventType": "string", "failure": { "code": "string", "message": "Error during DNS resolution (ENOTFOUND)." }, "httpStatusCode": "integer", "requestHeaders": { "": {} }, "responseBody": "string", "responseHeaders": { "": "undefined" }, "responseSize": "integer", "timings": { "dns": "number", "download": "number", "firstByte": "number", "handshake": "number", "redirect": "number", "ssl": "number", "tcp": "number", "total": "number", "wait": "number" } }, "result_id": "string", "status": "integer" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test or result is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python-legacy) ##### Get an API test result Copy ``` # Path parameters export public_id="CHANGE_ME" export result_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/${public_id}/results/${result_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an API test result ``` """ Get an API test result returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_api_test_result( public_id="hwb-332-3xe", result_id="3420446318379485707", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an API test result ``` # Get an API test result returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_api_test_result("hwb-332-3xe", "3420446318379485707") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an API test result ``` require 'dogapi' api_key = '' app_key = '' test_id = '' result_id = '' dog = 
Dogapi::Client.new(api_key, app_key) dog.get_synthetics_result('test_id' => test_id , 'result_id' => result_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an API test result ``` // Get an API test result returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetAPITestResult(ctx, "hwb-332-3xe", "3420446318379485707") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetAPITestResult`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetAPITestResult`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an API test result ``` // Get an API test result returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsAPITestResultFull; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsAPITestResultFull result = apiInstance.getAPITestResult("hwb-332-3xe", "3420446318379485707"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getAPITestResult"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an API test result ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } test_id = '' result_id = '' initialize(**options) api.Synthetics.get_result(id=test_id, result_id=result_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get an API test result ``` // Get an API test result returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .get_api_test_result("hwb-332-3xe".to_string(), "3420446318379485707".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an API test result ``` /** * Get an API test result returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetAPITestResultRequest = { publicId: "hwb-332-3xe", resultId: "3420446318379485707", }; apiInstance .getAPITestResult(params) .then((data: v1.SyntheticsAPITestResultFull) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a browser test result](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test-result) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-test-result-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.datadoghq.eu/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.ddog-gov.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results/{result_id} ### Overview Get a specific full result from a given Synthetic browser test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the browser test to which the target result belongs. result_id [_required_] string The ID of the result to get. 
### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestResult-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestResult-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestResult-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestResult-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object returned describing a browser test result. Field Type Description check object Object describing the browser test configuration. config [_required_] object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. 
Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. 
Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. 
compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` check_time double When the browser test was conducted. 
check_version int64 Version of the browser test used. probe_dc string Location from which the browser test was performed. result object Object containing results for your Synthetic browser test. browserType string Type of browser device used for the browser test. browserVersion string Browser version used for the browser test. device object Object describing the device used to perform the Synthetic test. height [_required_] int64 Screen height of the device. id [_required_] string The device ID. isMobile boolean Whether or not the device is a mobile. name [_required_] string The device name. width [_required_] int64 Screen width of the device. duration double Global duration in second of the browser test. error string Error returned for the browser test. failure object The browser test failure details. code enum Error code that can be returned by a Synthetic test. Allowed enum values: `API_REQUEST_FAILURE,ASSERTION_FAILURE,DOWNLOAD_FILE_TOO_LARGE,ELEMENT_NOT_INTERACTABLE,EMAIL_VARIABLE_NOT_DEFINED,EVALUATE_JAVASCRIPT,EVALUATE_JAVASCRIPT_CONTEXT,EXTRACT_VARIABLE,FORBIDDEN_URL,FRAME_DETACHED,INCONSISTENCIES,INTERNAL_ERROR,INVALID_TYPE_TEXT_DELAY,INVALID_URL,INVALID_VARIABLE_PATTERN,INVISIBLE_ELEMENT,LOCATE_ELEMENT,NAVIGATE_TO_LINK,OPEN_URL,PRESS_KEY,SERVER_CERTIFICATE,SELECT_OPTION,STEP_TIMEOUT,SUB_TEST_NOT_PASSED,TEST_TIMEOUT,TOO_MANY_HTTP_REQUESTS,UNAVAILABLE_BROWSER,UNKNOWN,UNSUPPORTED_AUTH_SCHEMA,UPLOAD_FILES_ELEMENT_TYPE,UPLOAD_FILES_DIALOG,UPLOAD_FILES_DYNAMIC_ELEMENT,UPLOAD_FILES_NAME` message string The browser test error message. passed boolean Whether or not the browser test was conducted. receivedEmailCount int64 The amount of email received during the browser test. startUrl string Starting URL for the browser test. stepDetails [object] Array containing the different browser test steps. allowFailure boolean Whether or not the step was allowed to fail. browserErrors [object] Array of errors collected for a browser test. description [_required_] string Description of the error. name [_required_] string Name of the error. status int64 Status Code of the error. type [_required_] enum Error type returned by a browser test. Allowed enum values: `network,js` checkType enum Type of assertion to apply in an API test. Allowed enum values: `equals,notEquals,contains,notContains,startsWith,notStartsWith,greater,lower,greaterEquals,lowerEquals,matchRegex,between,isEmpty,notIsEmpty` description string Description of the test. duration double Total duration in millisecond of the test. error string Error returned by the test. failure object The browser test failure details. code enum Error code that can be returned by a Synthetic test. Allowed enum values: `API_REQUEST_FAILURE,ASSERTION_FAILURE,DOWNLOAD_FILE_TOO_LARGE,ELEMENT_NOT_INTERACTABLE,EMAIL_VARIABLE_NOT_DEFINED,EVALUATE_JAVASCRIPT,EVALUATE_JAVASCRIPT_CONTEXT,EXTRACT_VARIABLE,FORBIDDEN_URL,FRAME_DETACHED,INCONSISTENCIES,INTERNAL_ERROR,INVALID_TYPE_TEXT_DELAY,INVALID_URL,INVALID_VARIABLE_PATTERN,INVISIBLE_ELEMENT,LOCATE_ELEMENT,NAVIGATE_TO_LINK,OPEN_URL,PRESS_KEY,SERVER_CERTIFICATE,SELECT_OPTION,STEP_TIMEOUT,SUB_TEST_NOT_PASSED,TEST_TIMEOUT,TOO_MANY_HTTP_REQUESTS,UNAVAILABLE_BROWSER,UNKNOWN,UNSUPPORTED_AUTH_SCHEMA,UPLOAD_FILES_ELEMENT_TYPE,UPLOAD_FILES_DIALOG,UPLOAD_FILES_DYNAMIC_ELEMENT,UPLOAD_FILES_NAME` message string The browser test error message. playingTab enum Navigate between different tabs for your browser test. Allowed enum values: `-1,0,1,2,3` screenshotBucketKey boolean Whether or not screenshots where collected by the test. 
skipped boolean Whether or not to skip this step. snapshotBucketKey boolean Whether or not snapshots were collected by the test. stepId int64 The step ID. subTestStepDetails [object] If this step includes a sub-test. [Subtests documentation](https://docs.datadoghq.com/synthetics/browser_tests/advanced_options/#subtests). timeToInteractive double Time before starting the step. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` url string URL to perform the step against. value Value for the step. vitalsMetrics [object] Array of Core Web Vitals metrics for the step. cls double Cumulative Layout Shift. lcp double Largest Contentful Paint in milliseconds. url string URL attached to the metrics. warnings [object] Warnings collected that did not fail the step. message [_required_] string Message for the warning. type [_required_] enum User locator used. Allowed enum values: `user_locator` thumbnailsBucketKey boolean Whether or not a thumbnail is associated with the browser test. timeToInteractive double Time in seconds to wait before the browser test starts after reaching the start URL. result_id string ID of the browser test result. status enum The status of your Synthetic monitor. * `0` for not triggered * `1` for triggered * `2` for no data Allowed enum values: `0,1,2` ``` { "check": { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] } }, "check_time": 
"number", "check_version": "integer", "probe_dc": "string", "result": { "browserType": "string", "browserVersion": "string", "device": { "height": 0, "id": "chrome.laptop_large", "isMobile": false, "name": "", "width": 0 }, "duration": "number", "error": "string", "failure": { "code": "string", "message": "Error during DNS resolution (ENOTFOUND)." }, "passed": false, "receivedEmailCount": "integer", "startUrl": "string", "stepDetails": [ { "allowFailure": false, "browserErrors": [ { "description": "Example error message", "name": "Failed test", "status": 500, "type": "network" } ], "checkType": "string", "description": "string", "duration": "number", "error": "string", "failure": { "code": "string", "message": "Error during DNS resolution (ENOTFOUND)." }, "playingTab": "integer", "screenshotBucketKey": false, "skipped": false, "snapshotBucketKey": false, "stepId": "integer", "subTestStepDetails": [], "timeToInteractive": "number", "type": "assertElementContent", "url": "string", "value": "undefined", "vitalsMetrics": [ { "cls": "number", "lcp": "number", "url": "string" } ], "warnings": [ { "message": "", "type": "user_locator" } ] } ], "thumbnailsBucketKey": false, "timeToInteractive": "number" }, "result_id": "string", "status": "integer" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test or result is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get a browser test result Copy ``` # Path parameters export public_id="CHANGE_ME" export result_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/${public_id}/results/${result_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a browser test result ``` """ Get a browser test result returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_browser_test_result( public_id="2yy-sem-mjh", result_id="5671719892074090418", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a browser test result ``` # Get a browser test result returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_browser_test_result("2yy-sem-mjh", "5671719892074090418") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a browser test result ``` // Get a browser test result returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetBrowserTestResult(ctx, "2yy-sem-mjh", "5671719892074090418") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetBrowserTestResult`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetBrowserTestResult`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a browser test result ``` // Get a browser test result returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBrowserTestResultFull; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsBrowserTestResultFull result = apiInstance.getBrowserTestResult("2yy-sem-mjh", "5671719892074090418"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getBrowserTestResult"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a browser test result ``` // Get a browser test result returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .get_browser_test_result("2yy-sem-mjh".to_string(), "5671719892074090418".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a browser test result ``` /** * Get a browser test result returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetBrowserTestResultRequest = { publicId: "2yy-sem-mjh", resultId: "5671719892074090418", }; apiInstance .getBrowserTestResult(params) .then((data: v1.SyntheticsBrowserTestResultFull) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get an API test's latest results summaries](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-tests-latest-results-summaries) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-an-api-tests-latest-results-summaries-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}/resultshttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}/resultshttps://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}/resultshttps://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}/resultshttps://api.datadoghq.com/api/v1/synthetics/tests/{public_id}/resultshttps://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}/resultshttps://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id}/results ### Overview Get the last 150 test results summaries for a given Synthetic API test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test for which to search results for. #### Query Strings Name Type Description from_ts integer Timestamp in milliseconds from which to start querying results. to_ts integer Timestamp in milliseconds up to which to query results. probe_dc array Locations for which to query results. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestLatestResults-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestLatestResults-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestLatestResults-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetAPITestLatestResults-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object with the latest Synthetic API test run. Field Type Description last_timestamp_fetched int64 Timestamp of the latest API test run. results [object] Result of the latest API test run. check_time double Last time the API test was performed. probe_dc string Location from which the API test was performed. result object Result of the last API test run. passed boolean Describes if the test run has passed or failed. timings object Object containing all metrics and their values collected for a Synthetic API test. See the [Synthetic Monitoring Metrics documentation](https://docs.datadoghq.com/synthetics/metrics/). dns double The duration in millisecond of the DNS lookup. download double The time in millisecond to download the response. firstByte double The time in millisecond to first byte. handshake double The duration in millisecond of the TLS handshake. redirect double The time in millisecond spent during redirections. ssl double The duration in millisecond of the TLS handshake. tcp double Time in millisecond to establish the TCP connection. total double The overall time in millisecond the request took to be processed. 
wait double Time spent in millisecond waiting for a response. result_id string ID of the API test result. status enum The status of your Synthetic monitor. * `O` for not triggered * `1` for triggered * `2` for no data Allowed enum values: `0,1,2` ``` { "last_timestamp_fetched": "integer", "results": [ { "check_time": "number", "probe_dc": "string", "result": { "passed": false, "timings": { "dns": "number", "download": "number", "firstByte": "number", "handshake": "number", "redirect": "number", "ssl": "number", "tcp": "number", "total": "number", "wait": "number" } }, "result_id": "string", "status": "integer" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic is not activated for the user - Test is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python-legacy) ##### Get an API test's latest results summaries Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/${public_id}/results" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an API test's latest results summaries ``` """ Get an API test's latest results summaries returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_api_test_latest_results( public_id="hwb-332-3xe", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an API test's latest results summaries ``` # Get an API test's latest results summaries returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_api_test_latest_results("hwb-332-3xe") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an API test's latest results summaries ``` require 'dogapi' api_key = '' app_key = '' test_id = '' dog = Dogapi::Client.new(api_key, app_key) dog.get_synthetics_results('test_id' => test_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an API test's latest results summaries ``` // Get an API test's latest results summaries returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetAPITestLatestResults(ctx, "hwb-332-3xe", *datadogV1.NewGetAPITestLatestResultsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetAPITestLatestResults`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetAPITestLatestResults`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an API test's latest results summaries ``` // Get an API test's latest results summaries returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGetAPITestLatestResultsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsGetAPITestLatestResultsResponse result = apiInstance.getAPITestLatestResults("hwb-332-3xe"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getAPITestLatestResults"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: 
" + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an API test's latest results summaries ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } test_id = '' initialize(**options) api.Synthetics.get_results(id=test_id) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get an API test's latest results summaries ``` // Get an API test's latest results summaries returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::GetAPITestLatestResultsOptionalParams; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .get_api_test_latest_results( "hwb-332-3xe".to_string(), GetAPITestLatestResultsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an API test's latest results summaries ``` /** * Get an API test's latest results summaries returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetAPITestLatestResultsRequest = { publicId: "hwb-332-3xe", }; apiInstance .getAPITestLatestResults(params) .then((data: v1.SyntheticsGetAPITestLatestResultsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a browser test's latest results summaries](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-tests-latest-results-summaries) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-browser-tests-latest-results-summaries-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.datadoghq.eu/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.ddog-gov.com/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.us3.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/resultshttps://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/{public_id}/results ### Overview Get the last 150 test results summaries for a given Synthetic browser test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the browser test for which to search results for. #### Query Strings Name Type Description from_ts integer Timestamp in milliseconds from which to start querying results. to_ts integer Timestamp in milliseconds up to which to query results. probe_dc array Locations for which to query results. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestLatestResults-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestLatestResults-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestLatestResults-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetBrowserTestLatestResults-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object with the latest Synthetic browser test run. Field Type Description last_timestamp_fetched int64 Timestamp of the latest browser test run. results [object] Result of the latest browser test run. check_time double Last time the browser test was performed. probe_dc string Location from which the Browser test was performed. result object Object with the result of the last browser test run. device object Object describing the device used to perform the Synthetic test. height [_required_] int64 Screen height of the device. id [_required_] string The device ID. isMobile boolean Whether or not the device is a mobile. name [_required_] string The device name. width [_required_] int64 Screen width of the device. duration double Length in milliseconds of the browser test run. errorCount int64 Amount of errors collected for a single browser test run. stepCountCompleted int64 Amount of browser test steps completed before failing. stepCountTotal int64 Total amount of browser test steps. result_id string ID of the browser test result. 
status enum The status of your Synthetic monitor. * `O` for not triggered * `1` for triggered * `2` for no data Allowed enum values: `0,1,2` ``` { "last_timestamp_fetched": "integer", "results": [ { "check_time": "number", "probe_dc": "string", "result": { "device": { "height": 0, "id": "chrome.laptop_large", "isMobile": false, "name": "", "width": 0 }, "duration": "number", "errorCount": "integer", "stepCountCompleted": "integer", "stepCountTotal": "integer" }, "result_id": "string", "status": "integer" } ] } ``` Copy forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy - Synthetic Monitoring is not activated for the user - Test is not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get a browser test's latest results summaries Copy ``` # Path parameters export public_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/browser/${public_id}/results" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a browser test's latest results summaries ``` """ Get a browser test's latest results summaries returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_browser_test_latest_results( public_id="2yy-sem-mjh", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a browser test's latest results summaries ``` # Get a browser test's latest results summaries returns "OK" response require 
"datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_browser_test_latest_results("2yy-sem-mjh") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a browser test's latest results summaries ``` // Get a browser test's latest results summaries returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetBrowserTestLatestResults(ctx, "2yy-sem-mjh", *datadogV1.NewGetBrowserTestLatestResultsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetBrowserTestLatestResults`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetBrowserTestLatestResults`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a browser test's latest results summaries ``` // Get a browser test's latest results summaries returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGetBrowserTestLatestResultsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsGetBrowserTestLatestResultsResponse result = apiInstance.getBrowserTestLatestResults("2yy-sem-mjh"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getBrowserTestLatestResults"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a browser test's latest results summaries ``` // Get a browser test's latest results summaries returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::GetBrowserTestLatestResultsOptionalParams; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; 
#[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .get_browser_test_latest_results( "2yy-sem-mjh".to_string(), GetBrowserTestLatestResultsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a browser test's latest results summaries ``` /** * Get a browser test's latest results summaries returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetBrowserTestLatestResultsRequest = { publicId: "2yy-sem-mjh", }; apiInstance .getBrowserTestLatestResults(params) .then((data: v1.SyntheticsGetBrowserTestLatestResultsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get details of batch](https://docs.datadoghq.com/api/latest/synthetics/#get-details-of-batch) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-details-of-batch-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/ci/batch/{batch_id}https://api.ap2.datadoghq.com/api/v1/synthetics/ci/batch/{batch_id}https://api.datadoghq.eu/api/v1/synthetics/ci/batch/{batch_id}https://api.ddog-gov.com/api/v1/synthetics/ci/batch/{batch_id}https://api.datadoghq.com/api/v1/synthetics/ci/batch/{batch_id}https://api.us3.datadoghq.com/api/v1/synthetics/ci/batch/{batch_id}https://api.us5.datadoghq.com/api/v1/synthetics/ci/batch/{batch_id} ### Overview Get a batch’s updated details. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description batch_id [_required_] string The ID of the batch. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetSyntheticsCIBatch-200-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetSyntheticsCIBatch-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetSyntheticsCIBatch-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Details about a batch response. Field Type Description data object Wrapper object that contains the details of a batch. metadata object Metadata for the Synthetic tests run. ci object Description of the CI provider. pipeline object Description of the CI pipeline. url string URL of the pipeline. provider object Description of the CI provider. 
name string Name of the CI provider. git object Git information. branch string Branch name. commitSha string The commit SHA. results [object] List of results for the batch. device string The device ID. duration double Total duration in millisecond of the test. execution_rule enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` location string Name of the location. result_id string The ID of the result to get. retries double Number of times this result has been retried. status enum Determines whether the batch has passed, failed, or is in progress. Allowed enum values: `passed,skipped,failed` test_name string Name of the test. test_public_id string The public ID of the Synthetic test. test_type enum Type of the Synthetic test, either `api` or `browser`. Allowed enum values: `api,browser,mobile` status enum Determines whether the batch has passed, failed, or is in progress. Allowed enum values: `passed,skipped,failed` ``` { "data": { "metadata": { "ci": { "pipeline": { "url": "string" }, "provider": { "name": "string" } }, "git": { "branch": "string", "commitSha": "string" } }, "results": [ { "device": "chrome.laptop_large", "duration": "number", "execution_rule": "blocking", "location": "string", "result_id": "string", "retries": "number", "status": "string", "test_name": "string", "test_public_id": "string", "test_type": "string" } ], "status": "string" } } ``` Copy Batch does not exist. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
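A common use of this endpoint is to check whether any blocking test in a CI batch failed. The sketch below is illustrative only; it reads the `data.results` array using the field names from the model above, and the `batch_id` value is a placeholder.

```
# Sketch: inspect a CI batch and list failed results whose execution_rule is
# "blocking". Field names follow the batch model above.
import os
import requests

site = os.environ.get("DD_SITE", "datadoghq.com")
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
url = f"https://api.{site}/api/v1/synthetics/ci/batch/batch_id"  # replace batch_id
data = requests.get(url, headers=headers, timeout=30).json().get("data", {})

print(f"batch status: {data.get('status')}")
for result in data.get("results", []):
    if result.get("status") == "failed" and result.get("execution_rule") == "blocking":
        print(
            f"blocking failure: {result.get('test_name')} "
            f"({result.get('test_public_id')}) at {result.get('location')}"
        )
```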
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get details of batch Copy ``` # Path parameters export batch_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/ci/batch/${batch_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get details of batch ``` """ Get details of batch returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_synthetics_ci_batch( batch_id="batch_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get details of batch ``` # Get details of batch returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_synthetics_ci_batch("batch_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get details of batch ``` // Get details of batch returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetSyntheticsCIBatch(ctx, "batch_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetSyntheticsCIBatch`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetSyntheticsCIBatch`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get details of batch ``` // Get details of batch returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsBatchDetails; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsBatchDetails result = apiInstance.getSyntheticsCIBatch("batch_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getSyntheticsCIBatch"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get details of batch ``` // Get details of batch returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.get_synthetics_ci_batch("batch_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get details of batch ``` /** * Get details of batch returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetSyntheticsCIBatchRequest = { batchId: "batch_id", }; apiInstance .getSyntheticsCIBatch(params) .then((data: v1.SyntheticsBatchDetails) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete tests](https://docs.datadoghq.com/api/latest/synthetics/#delete-tests) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#delete-tests-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/deletehttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/deletehttps://api.datadoghq.eu/api/v1/synthetics/tests/deletehttps://api.ddog-gov.com/api/v1/synthetics/tests/deletehttps://api.datadoghq.com/api/v1/synthetics/tests/deletehttps://api.us3.datadoghq.com/api/v1/synthetics/tests/deletehttps://api.us5.datadoghq.com/api/v1/synthetics/tests/delete ### Overview Delete multiple Synthetic tests by ID. This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Public ID list of the Synthetic tests to be deleted. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Expand All Field Type Description force_delete_dependencies boolean Delete the Synthetic test even if it's referenced by other resources (for example, SLOs and composite monitors). public_ids [string] An array of Synthetic test IDs you want to delete. ``` { "force_delete_dependencies": false, "public_ids": [ [] ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#DeleteTests-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#DeleteTests-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#DeleteTests-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#DeleteTests-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#DeleteTests-429-v1) OK. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Response object for deleting Synthetic tests. Field Type Description deleted_tests [object] Array of objects containing a deleted Synthetic test ID with the associated deletion timestamp. deleted_at date-time Deletion timestamp of the Synthetic test ID. public_id string The Synthetic test ID deleted. ``` { "deleted_tests": [ { "deleted_at": "2019-09-19T10:00:00.000Z", "public_id": "string" } ] } ``` Copy - JSON format is wrong - Test cannot be deleted as it's used elsewhere (as a sub-test or in an uptime widget) - Some IDs are not owned by the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy - Tests to be deleted can't be found - Synthetic is not activated for the user * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python-legacy) ##### Delete tests Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/delete" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Delete tests ``` """ Delete tests returns "OK." response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_delete_tests_payload import SyntheticsDeleteTestsPayload # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = environ["SYNTHETICS_API_TEST_PUBLIC_ID"] body = SyntheticsDeleteTestsPayload( public_ids=[ SYNTHETICS_API_TEST_PUBLIC_ID, ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.delete_tests(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete tests ``` # Delete tests returns "OK." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_api_test" in the system SYNTHETICS_API_TEST_PUBLIC_ID = ENV["SYNTHETICS_API_TEST_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsDeleteTestsPayload.new({ public_ids: [ SYNTHETICS_API_TEST_PUBLIC_ID, ], }) p api_instance.delete_tests(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete tests ``` require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) test_ids = ['',''] dog.delete_synthetics_tests('test_ids' => test_ids) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete tests ``` // Delete tests returns "OK." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_api_test" in the system SyntheticsAPITestPublicID := os.Getenv("SYNTHETICS_API_TEST_PUBLIC_ID") body := datadogV1.SyntheticsDeleteTestsPayload{ PublicIds: []string{ SyntheticsAPITestPublicID, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.DeleteTests(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.DeleteTests`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.DeleteTests`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete tests ``` // Delete tests returns "OK." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsDeleteTestsPayload; import com.datadog.api.client.v1.model.SyntheticsDeleteTestsResponse; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a valid "synthetics_api_test" in the system String SYNTHETICS_API_TEST_PUBLIC_ID = System.getenv("SYNTHETICS_API_TEST_PUBLIC_ID"); SyntheticsDeleteTestsPayload body = new SyntheticsDeleteTestsPayload() .publicIds(Collections.singletonList(SYNTHETICS_API_TEST_PUBLIC_ID)); try { SyntheticsDeleteTestsResponse result = apiInstance.deleteTests(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#deleteTests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete tests ``` from datadog import initialize, api options = { 'api_key': '', 'app_key': '' } test_ids = ['',''] initialize(**options) api.Synthetics.delete_test(public_ids=test_ids) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Delete tests ``` // Delete tests returns "OK." response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsDeleteTestsPayload; #[tokio::main] async fn main() { // there is a valid "synthetics_api_test" in the system let synthetics_api_test_public_id = std::env::var("SYNTHETICS_API_TEST_PUBLIC_ID").unwrap(); let body = SyntheticsDeleteTestsPayload::new().public_ids(vec![synthetics_api_test_public_id.clone()]); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.delete_tests(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete tests ``` /** * Delete tests returns "OK." 
response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_api_test" in the system const SYNTHETICS_API_TEST_PUBLIC_ID = process.env .SYNTHETICS_API_TEST_PUBLIC_ID as string; const params: v1.SyntheticsApiDeleteTestsRequest = { body: { publicIds: [SYNTHETICS_API_TEST_PUBLIC_ID], }, }; apiInstance .deleteTests(params) .then((data: v1.SyntheticsDeleteTestsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all global variables](https://docs.datadoghq.com/api/latest/synthetics/#get-all-global-variables) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-all-global-variables-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/variableshttps://api.ap2.datadoghq.com/api/v1/synthetics/variableshttps://api.datadoghq.eu/api/v1/synthetics/variableshttps://api.ddog-gov.com/api/v1/synthetics/variableshttps://api.datadoghq.com/api/v1/synthetics/variableshttps://api.us3.datadoghq.com/api/v1/synthetics/variableshttps://api.us5.datadoghq.com/api/v1/synthetics/variables ### Overview Get the list of all Synthetic global variables. This endpoint requires any of the following permissions: * `synthetics_global_variable_read` * `apm_api_catalog_read` OAuth apps require the `apm_api_catalog_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#ListGlobalVariables-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#ListGlobalVariables-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#ListGlobalVariables-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing an array of Synthetic global variables. Field Type Description variables [object] Array of Synthetic global variables. attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. 
type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value [_required_] object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. When reading a global variable, the value will not be present if the variable is hidden with the `secure` property. ``` { "variables": [ { "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "id": "string", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get all global variables Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all global variables ``` """ Get all global variables returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.list_global_variables() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all global variables ``` # Get all global variables returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.list_global_variables() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all global variables ``` // Get all global variables returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.ListGlobalVariables(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.ListGlobalVariables`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.ListGlobalVariables`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` 
##### Get all global variables ``` // Get all global variables returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsListGlobalVariablesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsListGlobalVariablesResponse result = apiInstance.listGlobalVariables(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#listGlobalVariables"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all global variables ``` // Get all global variables returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.list_global_variables().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all global variables ``` /** * Get all global variables returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); apiInstance .listGlobalVariables() .then((data: v1.SyntheticsListGlobalVariablesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a global variable](https://docs.datadoghq.com/api/latest/synthetics/#create-a-global-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#create-a-global-variable-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/variableshttps://api.ap2.datadoghq.com/api/v1/synthetics/variableshttps://api.datadoghq.eu/api/v1/synthetics/variableshttps://api.ddog-gov.com/api/v1/synthetics/variableshttps://api.datadoghq.com/api/v1/synthetics/variableshttps://api.us3.datadoghq.com/api/v1/synthetics/variableshttps://api.us5.datadoghq.com/api/v1/synthetics/variables ### Overview Create a Synthetic global variable. This endpoint requires the `synthetics_global_variable_write` permission. OAuth apps require the `synthetics_global_variable_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the global variable to create. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. 
When reading a global variable, the value will not be present if the variable is hidden with the `secure` property. ##### Create a FIDO global variable returns "OK" response ``` { "description": "", "is_fido": true, "name": "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", "tags": [] } ``` Copy ##### Create a TOTP global variable returns "OK" response ``` { "description": "", "is_totp": true, "name": "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", "tags": [], "value": { "secure": false, "value": "", "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } } } } ``` Copy ##### Create a global variable from test returns "OK" response ``` { "description": "", "name": "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", "tags": [], "value": { "secure": false, "value": "" }, "parse_test_public_id": "123-abc-456", "parse_test_options": { "type": "local_variable", "localVariableName": "EXTRACTED_VALUE" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreateGlobalVariable-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#CreateGlobalVariable-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#CreateGlobalVariable-403-v1) * [409](https://docs.datadoghq.com/api/latest/synthetics/#CreateGlobalVariable-409-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreateGlobalVariable-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Synthetic global variable. Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value [_required_] object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. 
When reading a global variable, the value will not be present if the variable is hidden with the `secure` property. ``` { "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "id": "string", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } } ``` Copy Invalid request * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Create a FIDO global variable returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "description": "", "is_fido": true, "name": "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", "tags": [] } EOF ``` ##### Create a TOTP global variable returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "description": "", "is_totp": true, "name": "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", "tags": [], "value": { "secure": false, "value": "", "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } } } } EOF ``` ##### Create a global variable from test returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "description": "", "name": "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", "tags": [], "value": { "secure": false, "value": "" }, "parse_test_public_id": "123-abc-456", "parse_test_options": { "type": "local_variable", "localVariableName": "EXTRACTED_VALUE" } } EOF ``` ##### Create a FIDO global variable returns "OK" response ``` // Create a FIDO global variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsGlobalVariableRequest{ Description: "", IsFido: datadog.PtrBool(true), Name: "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", Tags: []string{}, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateGlobalVariable(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: 
%v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateGlobalVariable`:\n%s\n", responseContent) } ``` Copy ##### Create a TOTP global variable returns "OK" response ``` // Create a TOTP global variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsGlobalVariableRequest{ Description: "", IsTotp: datadog.PtrBool(true), Name: "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", Tags: []string{}, Value: &datadogV1.SyntheticsGlobalVariableValue{ Secure: datadog.PtrBool(false), Value: datadog.PtrString(""), Options: &datadogV1.SyntheticsGlobalVariableOptions{ TotpParameters: &datadogV1.SyntheticsGlobalVariableTOTPParameters{ Digits: datadog.PtrInt32(6), RefreshInterval: datadog.PtrInt32(30), }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateGlobalVariable(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateGlobalVariable`:\n%s\n", responseContent) } ``` Copy ##### Create a global variable from test returns "OK" response ``` // Create a global variable from test returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "synthetics_api_test_multi_step" in the system SyntheticsAPITestMultiStepPublicID := os.Getenv("SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID") body := datadogV1.SyntheticsGlobalVariableRequest{ Description: "", Name: "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", Tags: []string{}, Value: &datadogV1.SyntheticsGlobalVariableValue{ Secure: datadog.PtrBool(false), Value: datadog.PtrString(""), }, ParseTestPublicId: datadog.PtrString(SyntheticsAPITestMultiStepPublicID), ParseTestOptions: &datadogV1.SyntheticsGlobalVariableParseTestOptions{ Type: datadogV1.SYNTHETICSGLOBALVARIABLEPARSETESTOPTIONSTYPE_LOCAL_VARIABLE, LocalVariableName: datadog.PtrString("EXTRACTED_VALUE"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreateGlobalVariable(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreateGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreateGlobalVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a FIDO global variable returns 
"OK" response ``` // Create a FIDO global variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGlobalVariable; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsGlobalVariableRequest body = new SyntheticsGlobalVariableRequest() .description("") .isFido(true) .name("GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC"); try { SyntheticsGlobalVariable result = apiInstance.createGlobalVariable(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a TOTP global variable returns "OK" response ``` // Create a TOTP global variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGlobalVariable; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableOptions; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableRequest; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableTOTPParameters; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableValue; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsGlobalVariableRequest body = new SyntheticsGlobalVariableRequest() .description("") .isTotp(true) .name("GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC") .value( new SyntheticsGlobalVariableValue() .secure(false) .value("") .options( new SyntheticsGlobalVariableOptions() .totpParameters( new SyntheticsGlobalVariableTOTPParameters() .digits(6) .refreshInterval(30)))); try { SyntheticsGlobalVariable result = apiInstance.createGlobalVariable(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a global variable from test returns "OK" response ``` // Create a global variable from test returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGlobalVariable; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableParseTestOptions; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableParseTestOptionsType; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableRequest; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableValue; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is 
a valid "synthetics_api_test_multi_step" in the system String SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID = System.getenv("SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID"); SyntheticsGlobalVariableRequest body = new SyntheticsGlobalVariableRequest() .description("") .name("GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC") .value(new SyntheticsGlobalVariableValue().secure(false).value("")) .parseTestPublicId(SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID) .parseTestOptions( new SyntheticsGlobalVariableParseTestOptions() .type(SyntheticsGlobalVariableParseTestOptionsType.LOCAL_VARIABLE) .localVariableName("EXTRACTED_VALUE")); try { SyntheticsGlobalVariable result = apiInstance.createGlobalVariable(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a FIDO global variable returns "OK" response ``` """ Create a FIDO global variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_global_variable_request import SyntheticsGlobalVariableRequest body = SyntheticsGlobalVariableRequest( description="", is_fido=True, name="GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", tags=[], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_global_variable(body=body) print(response) ``` Copy ##### Create a TOTP global variable returns "OK" response ``` """ Create a TOTP global variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_global_variable_options import SyntheticsGlobalVariableOptions from datadog_api_client.v1.model.synthetics_global_variable_request import SyntheticsGlobalVariableRequest from datadog_api_client.v1.model.synthetics_global_variable_totp_parameters import ( SyntheticsGlobalVariableTOTPParameters, ) from datadog_api_client.v1.model.synthetics_global_variable_value import SyntheticsGlobalVariableValue body = SyntheticsGlobalVariableRequest( description="", is_totp=True, name="GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", tags=[], value=SyntheticsGlobalVariableValue( secure=False, value="", options=SyntheticsGlobalVariableOptions( totp_parameters=SyntheticsGlobalVariableTOTPParameters( digits=6, refresh_interval=30, ), ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_global_variable(body=body) print(response) ``` Copy ##### Create a global variable from test returns "OK" response ``` """ Create a global variable from test returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from 
datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_global_variable_parse_test_options import ( SyntheticsGlobalVariableParseTestOptions, ) from datadog_api_client.v1.model.synthetics_global_variable_parse_test_options_type import ( SyntheticsGlobalVariableParseTestOptionsType, ) from datadog_api_client.v1.model.synthetics_global_variable_request import SyntheticsGlobalVariableRequest from datadog_api_client.v1.model.synthetics_global_variable_value import SyntheticsGlobalVariableValue # there is a valid "synthetics_api_test_multi_step" in the system SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID = environ["SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID"] body = SyntheticsGlobalVariableRequest( description="", name="GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", tags=[], value=SyntheticsGlobalVariableValue( secure=False, value="", ), parse_test_public_id=SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID, parse_test_options=SyntheticsGlobalVariableParseTestOptions( type=SyntheticsGlobalVariableParseTestOptionsType.LOCAL_VARIABLE, local_variable_name="EXTRACTED_VALUE", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_global_variable(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a FIDO global variable returns "OK" response ``` # Create a FIDO global variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsGlobalVariableRequest.new({ description: "", is_fido: true, name: "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", tags: [], }) p api_instance.create_global_variable(body) ``` Copy ##### Create a TOTP global variable returns "OK" response ``` # Create a TOTP global variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsGlobalVariableRequest.new({ description: "", is_totp: true, name: "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", tags: [], value: DatadogAPIClient::V1::SyntheticsGlobalVariableValue.new({ secure: false, value: "", options: DatadogAPIClient::V1::SyntheticsGlobalVariableOptions.new({ totp_parameters: DatadogAPIClient::V1::SyntheticsGlobalVariableTOTPParameters.new({ digits: 6, refresh_interval: 30, }), }), }), }) p api_instance.create_global_variable(body) ``` Copy ##### Create a global variable from test returns "OK" response ``` # Create a global variable from test returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "synthetics_api_test_multi_step" in the system SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID = ENV["SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID"] body = DatadogAPIClient::V1::SyntheticsGlobalVariableRequest.new({ description: "", name: "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", tags: [], value: DatadogAPIClient::V1::SyntheticsGlobalVariableValue.new({ secure: false, value: "", }), parse_test_public_id: SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID, parse_test_options: 
DatadogAPIClient::V1::SyntheticsGlobalVariableParseTestOptions.new({ type: DatadogAPIClient::V1::SyntheticsGlobalVariableParseTestOptionsType::LOCAL_VARIABLE, local_variable_name: "EXTRACTED_VALUE", }), }) p api_instance.create_global_variable(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a FIDO global variable returns "OK" response ``` // Create a FIDO global variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableRequest; #[tokio::main] async fn main() { let body = SyntheticsGlobalVariableRequest::new( "".to_string(), "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC".to_string(), vec![], ) .is_fido(true); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_global_variable(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a TOTP global variable returns "OK" response ``` // Create a TOTP global variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableOptions; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableRequest; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableTOTPParameters; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableValue; #[tokio::main] async fn main() { let body = SyntheticsGlobalVariableRequest::new( "".to_string(), "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC".to_string(), vec![], ) .is_totp(true) .value( SyntheticsGlobalVariableValue::new() .options( SyntheticsGlobalVariableOptions::new().totp_parameters( SyntheticsGlobalVariableTOTPParameters::new() .digits(6) .refresh_interval(30), ), ) .secure(false) .value("".to_string()), ); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_global_variable(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a global variable from test returns "OK" response ``` // Create a global variable from test returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableParseTestOptions; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableParseTestOptionsType; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableRequest; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableValue; #[tokio::main] async fn main() { // there is a valid "synthetics_api_test_multi_step" in the system let synthetics_api_test_multi_step_public_id = std::env::var("SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID").unwrap(); let body = SyntheticsGlobalVariableRequest::new( "".to_string(), "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC".to_string(), vec![], ) .parse_test_options( SyntheticsGlobalVariableParseTestOptions::new( 
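            // `local_variable` means the variable value is extracted from the local
            // variable named below ("EXTRACTED_VALUE") in the test referenced by
            // `parse_test_public_id`, per the `parse_test_options` fields documented above.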
SyntheticsGlobalVariableParseTestOptionsType::LOCAL_VARIABLE, ) .local_variable_name("EXTRACTED_VALUE".to_string()), ) .parse_test_public_id(synthetics_api_test_multi_step_public_id.clone()) .value( SyntheticsGlobalVariableValue::new() .secure(false) .value("".to_string()), ); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_global_variable(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a FIDO global variable returns "OK" response ``` /** * Create a FIDO global variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateGlobalVariableRequest = { body: { description: "", isFido: true, name: "GLOBAL_VARIABLE_FIDO_PAYLOAD_EXAMPLESYNTHETIC", tags: [], }, }; apiInstance .createGlobalVariable(params) .then((data: v1.SyntheticsGlobalVariable) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a TOTP global variable returns "OK" response ``` /** * Create a TOTP global variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiCreateGlobalVariableRequest = { body: { description: "", isTotp: true, name: "GLOBAL_VARIABLE_TOTP_PAYLOAD_EXAMPLESYNTHETIC", tags: [], value: { secure: false, value: "", options: { totpParameters: { digits: 6, refreshInterval: 30, }, }, }, }, }; apiInstance .createGlobalVariable(params) .then((data: v1.SyntheticsGlobalVariable) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy ##### Create a global variable from test returns "OK" response ``` /** * Create a global variable from test returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "synthetics_api_test_multi_step" in the system const SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID = process.env .SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID as string; const params: v1.SyntheticsApiCreateGlobalVariableRequest = { body: { description: "", name: "GLOBAL_VARIABLE_FROM_TEST_PAYLOAD_EXAMPLESYNTHETIC", tags: [], value: { secure: false, value: "", }, parseTestPublicId: SYNTHETICS_API_TEST_MULTI_STEP_PUBLIC_ID, parseTestOptions: { type: "local_variable", localVariableName: "EXTRACTED_VALUE", }, }, }; apiInstance .createGlobalVariable(params) .then((data: v1.SyntheticsGlobalVariable) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a global variable](https://docs.datadoghq.com/api/latest/synthetics/#get-a-global-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-global-variable-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.ap2.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.eu/api/v1/synthetics/variables/{variable_id}https://api.ddog-gov.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us3.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us5.datadoghq.com/api/v1/synthetics/variables/{variable_id} ### Overview Get the detailed configuration of a global variable. OAuth apps require the `synthetics_global_variable_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description variable_id [_required_] string The ID of the global variable. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetGlobalVariable-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetGlobalVariable-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetGlobalVariable-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetGlobalVariable-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Synthetic global variable. Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. 
value [_required_] object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. When reading a global variable, the value will not be present if the variable is hidden with the `secure` property. ``` { "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "id": "string", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get a global variable Copy ``` # Path parameters export variable_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables/${variable_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a global variable ``` """ Get a global variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_global_variable( variable_id="variable_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a global variable ``` # Get a global variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_global_variable("variable_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a global variable ``` // Get a global variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetGlobalVariable(ctx, "variable_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetGlobalVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a global variable ``` // Get a global variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGlobalVariable; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsGlobalVariable result = apiInstance.getGlobalVariable("variable_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a global variable ``` // Get a global variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.get_global_variable("variable_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a global variable ``` /** * Get a global variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetGlobalVariableRequest = { variableId: "variable_id", }; apiInstance .getGlobalVariable(params) .then((data: v1.SyntheticsGlobalVariable) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Patch a global variable](https://docs.datadoghq.com/api/latest/synthetics/#patch-a-global-variable) * [v2 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#patch-a-global-variable-v2) PATCH https://api.ap1.datadoghq.com/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.ap2.datadoghq.com/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.datadoghq.eu/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.ddog-gov.com/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.datadoghq.com/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.us3.datadoghq.com/api/v2/synthetics/variables/{variable_id}/jsonpatchhttps://api.us5.datadoghq.com/api/v2/synthetics/variables/{variable_id}/jsonpatch ### Overview Patch a global variable using JSON Patch (RFC 6902). This endpoint allows partial updates to a global variable by specifying only the fields to modify. Common operations include: * Replace field values: `{"op": "replace", "path": "/name", "value": "new_name"}` * Update nested values: `{"op": "replace", "path": "/value/value", "value": "new_value"}` * Add/update tags: `{"op": "add", "path": "/tags/-", "value": "new_tag"}` * Remove fields: `{"op": "remove", "path": "/description"}` This endpoint requires the `synthetics_global_variable_write` permission. ### Arguments #### Path Parameters Name Type Description variable_id [_required_] string The ID of the global variable. ### Request #### Body Data (required) JSON Patch document with operations to apply. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description data [_required_] object attributes object json_patch [object] JSON Patch operations following RFC 6902. op [_required_] enum The operation to perform. Allowed enum values: `add,remove,replace,move,copy,test` path [_required_] string A JSON Pointer path (e.g., "/name", "/value/secure"). value The value to use for the operation (not applicable for "remove" and "test" operations). type enum Global variable JSON Patch type. Allowed enum values: `global_variables_json_patch` ``` { "data": { "attributes": { "json_patch": [ { "op": "add", "path": "/name", "value": "undefined" } ] }, "type": "string" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#PatchGlobalVariable-200-v2) * [400](https://docs.datadoghq.com/api/latest/synthetics/#PatchGlobalVariable-400-v2) * [404](https://docs.datadoghq.com/api/latest/synthetics/#PatchGlobalVariable-404-v2) * [429](https://docs.datadoghq.com/api/latest/synthetics/#PatchGlobalVariable-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Global variable response. Field Type Description data object Synthetics global variable data. Wrapper around the global variable object. attributes object Synthetic global variable. attributes object Attributes of the global variable. 
restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value [_required_] object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. When reading a global variable, the value will not be present if the variable is hidden with the `secure` property. id string Global variable identifier. type enum Global variable type. Allowed enum values: `global_variables` ``` { "data": { "attributes": { "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "id": "string", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } }, "id": "string", "type": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) ##### Patch a global variable Copy ``` # Path parameters export variable_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/synthetics/variables/${variable_id}/jsonpatch" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "json_patch": [ { "op": "add", "path": "/name" } ] } } } EOF ``` * * * ## [Edit a global variable](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-global-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-global-variable-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.ap2.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.eu/api/v1/synthetics/variables/{variable_id}https://api.ddog-gov.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us3.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us5.datadoghq.com/api/v1/synthetics/variables/{variable_id} ### Overview Edit a Synthetic global variable. This endpoint requires the `synthetics_global_variable_write` permission. OAuth apps require the `synthetics_global_variable_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description variable_id [_required_] string The ID of the global variable. ### Request #### Body Data (required) Details of the global variable to update. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. 
* * *

## [Edit a global variable](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-global-variable)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-global-variable-v1)

PUT https://api.ap1.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.ap2.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.eu/api/v1/synthetics/variables/{variable_id}https://api.ddog-gov.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us3.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us5.datadoghq.com/api/v1/synthetics/variables/{variable_id}

### Overview

Edit a Synthetic global variable.

This endpoint requires the `synthetics_global_variable_write` permission.

OAuth apps require the `synthetics_global_variable_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description variable_id [_required_] string The ID of the global variable.

### Request

#### Body Data (required)

Details of the global variable to update.

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable. digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. When reading a global variable, the value will not be present if the variable is hidden with the `secure` property.

```
{ "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } }
```
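Because reads omit the `value` of a variable marked `secure`, a read-modify-write edit can end up submitting a body without the secret. The following sketch is illustrative only: it assumes the third-party `requests` package (not the official client), assumes the body returned by GET can be resubmitted to PUT, and uses a hypothetical `MY_VARIABLE_SECRET` environment variable as the local source of the hidden value.

```
# Minimal sketch: read-modify-write edit of a Synthetic global variable.
# Assumes the `requests` package and DD_API_KEY / DD_APP_KEY environment variables.
import os
import requests

API = "https://api.datadoghq.com/api/v1/synthetics/variables"
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}

variable_id = "variable_id"  # placeholder

# Read the current definition; for secure variables, value.value is not returned.
current = requests.get(f"{API}/{variable_id}", headers=HEADERS)
current.raise_for_status()
body = current.json()

# Modify only what you need; here, append a tag.
body.setdefault("tags", []).append("test:workflow-1")

# Re-supply the hidden secret from a local source before writing the variable back.
value = body.get("value") or {}
if value.get("secure") and "value" not in value:
    value["value"] = os.environ["MY_VARIABLE_SECRET"]  # hypothetical environment variable
    body["value"] = value

updated = requests.put(f"{API}/{variable_id}", json=body, headers=HEADERS)
updated.raise_for_status()
print(updated.json())
```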
### Response

* [200](https://docs.datadoghq.com/api/latest/synthetics/#EditGlobalVariable-200-v1)
* [400](https://docs.datadoghq.com/api/latest/synthetics/#EditGlobalVariable-400-v1)
* [403](https://docs.datadoghq.com/api/latest/synthetics/#EditGlobalVariable-403-v1)
* [429](https://docs.datadoghq.com/api/latest/synthetics/#EditGlobalVariable-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Synthetic global variable. Field Type Description attributes object Attributes of the global variable. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. description [_required_] string Description of the global variable. id string Unique identifier of the global variable. is_fido boolean Determines if the global variable is a FIDO variable. is_totp boolean Determines if the global variable is a TOTP/MFA variable. name [_required_] string Name of the global variable. Unique across Synthetic global variables. parse_test_options object Parser options to use for retrieving a Synthetic global variable from a Synthetic test. Used in conjunction with `parse_test_public_id`. field string When type is `http_header`, name of the header to use to extract the value. localVariableName string When type is `local_variable`, name of the local variable to use to extract the value. parser object Details of the parser to use for the global variable. type [_required_] enum Type of parser for a Synthetic global variable from a synthetics test. Allowed enum values: `raw,json_path,regex,x_path` value string Regex or JSON path used for the parser. Not used with type `raw`. type [_required_] enum Type of value to extract from a test for a Synthetic global variable. Allowed enum values: `http_body,http_header,http_status_code,local_variable` parse_test_public_id string A Synthetic test ID to use as a test to generate the variable value. tags [_required_] [string] Tags of the global variable. value [_required_] object Value of the global variable. options object Options for the Global Variable for MFA. totp_parameters object Parameters for the TOTP/MFA variable. digits int32 Number of digits for the OTP code. refresh_interval int32 Interval for which to refresh the token (in seconds). secure boolean Determines if the value of the variable is hidden. value string Value of the global variable. When reading a global variable, the value will not be present if the variable is hidden with the `secure` property.

```
{ "attributes": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "description": "Example description", "id": "string", "is_fido": false, "is_totp": false, "name": "MY_VARIABLE", "parse_test_options": { "field": "content-type", "localVariableName": "LOCAL_VARIABLE", "parser": { "type": "raw", "value": "string" }, "type": "http_body" }, "parse_test_public_id": "abc-def-123", "tags": [ "team:front", "test:workflow-1" ], "value": { "options": { "totp_parameters": { "digits": 6, "refresh_interval": 30 } }, "secure": false, "value": "example-value" } }
```

Invalid request

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)

##### Edit a global variable

```
# Path parameters
export variable_id="CHANGE_ME"
# Curl command (use the endpoint for your Datadog site)
curl -X PUT "https://api.datadoghq.com/api/v1/synthetics/variables/${variable_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "description": "Example description", "name": "MY_VARIABLE", "parse_test_options": { "parser": { "type": "raw" }, "type": "http_body" }, "tags": [ "team:front", "test:workflow-1" ] }
EOF
```

##### Edit a global variable

```
""" Edit a global variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_global_variable_attributes import SyntheticsGlobalVariableAttributes from datadog_api_client.v1.model.synthetics_global_variable_parse_test_options import ( SyntheticsGlobalVariableParseTestOptions, ) from datadog_api_client.v1.model.synthetics_global_variable_parse_test_options_type import ( SyntheticsGlobalVariableParseTestOptionsType, ) from
datadog_api_client.v1.model.synthetics_global_variable_parser_type import SyntheticsGlobalVariableParserType from datadog_api_client.v1.model.synthetics_global_variable_request import SyntheticsGlobalVariableRequest from datadog_api_client.v1.model.synthetics_global_variable_value import SyntheticsGlobalVariableValue from datadog_api_client.v1.model.synthetics_restricted_roles import SyntheticsRestrictedRoles from datadog_api_client.v1.model.synthetics_variable_parser import SyntheticsVariableParser body = SyntheticsGlobalVariableRequest( attributes=SyntheticsGlobalVariableAttributes( restricted_roles=SyntheticsRestrictedRoles( [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ] ), ), description="Example description", name="MY_VARIABLE", parse_test_options=SyntheticsGlobalVariableParseTestOptions( field="content-type", local_variable_name="LOCAL_VARIABLE", parser=SyntheticsVariableParser( type=SyntheticsGlobalVariableParserType.REGEX, value=".*", ), type=SyntheticsGlobalVariableParseTestOptionsType.HTTP_BODY, ), parse_test_public_id="abc-def-123", tags=[ "team:front", "test:workflow-1", ], value=SyntheticsGlobalVariableValue( secure=True, value="value", ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.edit_global_variable(variable_id="variable_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a global variable ``` # Edit a global variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsGlobalVariableRequest.new({ attributes: DatadogAPIClient::V1::SyntheticsGlobalVariableAttributes.new({ restricted_roles: [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ], }), description: "Example description", name: "MY_VARIABLE", parse_test_options: DatadogAPIClient::V1::SyntheticsGlobalVariableParseTestOptions.new({ field: "content-type", local_variable_name: "LOCAL_VARIABLE", parser: DatadogAPIClient::V1::SyntheticsVariableParser.new({ type: DatadogAPIClient::V1::SyntheticsGlobalVariableParserType::REGEX, value: ".*", }), type: DatadogAPIClient::V1::SyntheticsGlobalVariableParseTestOptionsType::HTTP_BODY, }), parse_test_public_id: "abc-def-123", tags: [ "team:front", "test:workflow-1", ], value: DatadogAPIClient::V1::SyntheticsGlobalVariableValue.new({ secure: true, value: "value", }), }) p api_instance.edit_global_variable("variable_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a global variable ``` // Edit a global variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsGlobalVariableRequest{ Attributes: 
&datadogV1.SyntheticsGlobalVariableAttributes{ RestrictedRoles: []string{ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", }, }, Description: "Example description", Name: "MY_VARIABLE", ParseTestOptions: &datadogV1.SyntheticsGlobalVariableParseTestOptions{ Field: datadog.PtrString("content-type"), LocalVariableName: datadog.PtrString("LOCAL_VARIABLE"), Parser: &datadogV1.SyntheticsVariableParser{ Type: datadogV1.SYNTHETICSGLOBALVARIABLEPARSERTYPE_REGEX, Value: datadog.PtrString(".*"), }, Type: datadogV1.SYNTHETICSGLOBALVARIABLEPARSETESTOPTIONSTYPE_HTTP_BODY, }, ParseTestPublicId: datadog.PtrString("abc-def-123"), Tags: []string{ "team:front", "test:workflow-1", }, Value: &datadogV1.SyntheticsGlobalVariableValue{ Secure: datadog.PtrBool(true), Value: datadog.PtrString("value"), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.EditGlobalVariable(ctx, "variable_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.EditGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.EditGlobalVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a global variable ``` // Edit a global variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsGlobalVariable; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableAttributes; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableParseTestOptions; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableParseTestOptionsType; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableParserType; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableRequest; import com.datadog.api.client.v1.model.SyntheticsGlobalVariableValue; import com.datadog.api.client.v1.model.SyntheticsVariableParser; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsGlobalVariableRequest body = new SyntheticsGlobalVariableRequest() .attributes( new SyntheticsGlobalVariableAttributes() .restrictedRoles( Collections.singletonList("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"))) .description("Example description") .name("MY_VARIABLE") .parseTestOptions( new SyntheticsGlobalVariableParseTestOptions() .field("content-type") .localVariableName("LOCAL_VARIABLE") .parser( new SyntheticsVariableParser() .type(SyntheticsGlobalVariableParserType.REGEX) .value(".*")) .type(SyntheticsGlobalVariableParseTestOptionsType.HTTP_BODY)) .parseTestPublicId("abc-def-123") .tags(Arrays.asList("team:front", "test:workflow-1")) .value(new SyntheticsGlobalVariableValue().secure(true).value("value")); try { SyntheticsGlobalVariable result = 
apiInstance.editGlobalVariable("variable_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#editGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a global variable ``` // Edit a global variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableAttributes; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableParseTestOptions; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableParseTestOptionsType; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableParserType; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableRequest; use datadog_api_client::datadogV1::model::SyntheticsGlobalVariableValue; use datadog_api_client::datadogV1::model::SyntheticsVariableParser; #[tokio::main] async fn main() { let body = SyntheticsGlobalVariableRequest::new( "Example description".to_string(), "MY_VARIABLE".to_string(), vec!["team:front".to_string(), "test:workflow-1".to_string()], ) .attributes( SyntheticsGlobalVariableAttributes::new() .restricted_roles(vec!["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx".to_string()]), ) .parse_test_options( SyntheticsGlobalVariableParseTestOptions::new( SyntheticsGlobalVariableParseTestOptionsType::HTTP_BODY, ) .field("content-type".to_string()) .local_variable_name("LOCAL_VARIABLE".to_string()) .parser( SyntheticsVariableParser::new(SyntheticsGlobalVariableParserType::REGEX) .value(".*".to_string()), ), ) .parse_test_public_id("abc-def-123".to_string()) .value( SyntheticsGlobalVariableValue::new() .secure(true) .value("value".to_string()), ); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .edit_global_variable("variable_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a global variable ``` /** * Edit a global variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiEditGlobalVariableRequest = { body: { attributes: { restrictedRoles: ["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"], }, description: "Example description", name: "MY_VARIABLE", parseTestOptions: { field: "content-type", localVariableName: "LOCAL_VARIABLE", parser: { type: "regex", value: ".*", }, type: "http_body", }, 
parseTestPublicId: "abc-def-123", tags: ["team:front", "test:workflow-1"], value: { secure: true, value: "value", }, }, variableId: "variable_id", }; apiInstance .editGlobalVariable(params) .then((data: v1.SyntheticsGlobalVariable) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Delete a global variable](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-global-variable)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-global-variable-v1)

DELETE https://api.ap1.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.ap2.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.eu/api/v1/synthetics/variables/{variable_id}https://api.ddog-gov.com/api/v1/synthetics/variables/{variable_id}https://api.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us3.datadoghq.com/api/v1/synthetics/variables/{variable_id}https://api.us5.datadoghq.com/api/v1/synthetics/variables/{variable_id}

### Overview

Delete a Synthetic global variable.

This endpoint requires the `synthetics_global_variable_write` permission.

OAuth apps require the `synthetics_global_variable_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description variable_id [_required_] string The ID of the global variable.

### Response

* [200](https://docs.datadoghq.com/api/latest/synthetics/#DeleteGlobalVariable-200-v1)
* [400](https://docs.datadoghq.com/api/latest/synthetics/#DeleteGlobalVariable-400-v1)
* [403](https://docs.datadoghq.com/api/latest/synthetics/#DeleteGlobalVariable-403-v1)
* [404](https://docs.datadoghq.com/api/latest/synthetics/#DeleteGlobalVariable-404-v1)
* [429](https://docs.datadoghq.com/api/latest/synthetics/#DeleteGlobalVariable-429-v1)

OK

JSON format is wrong

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Not found

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Delete a global variable Copy ``` # Path parameters export variable_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/variables/${variable_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a global variable ``` """ Delete a global variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) api_instance.delete_global_variable( variable_id="variable_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a global variable ``` # Delete a global variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.delete_global_variable("variable_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a global variable ``` // Delete a global variable returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) r, err := api.DeleteGlobalVariable(ctx, "variable_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.DeleteGlobalVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a global variable ``` // Delete a global variable returns "OK" 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { apiInstance.deleteGlobalVariable("variable_id"); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#deleteGlobalVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a global variable ``` // Delete a global variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.delete_global_variable("variable_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a global variable ``` /** * Delete a global variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiDeleteGlobalVariableRequest = { variableId: "variable_id", }; apiInstance .deleteGlobalVariable(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a private location](https://docs.datadoghq.com/api/latest/synthetics/#create-a-private-location) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#create-a-private-location-v1) POST https://api.ap1.datadoghq.com/api/v1/synthetics/private-locationshttps://api.ap2.datadoghq.com/api/v1/synthetics/private-locationshttps://api.datadoghq.eu/api/v1/synthetics/private-locationshttps://api.ddog-gov.com/api/v1/synthetics/private-locationshttps://api.datadoghq.com/api/v1/synthetics/private-locationshttps://api.us3.datadoghq.com/api/v1/synthetics/private-locationshttps://api.us5.datadoghq.com/api/v1/synthetics/private-locations ### Overview Create a new Synthetic private location. This endpoint requires the `synthetics_private_location_write` permission. OAuth apps require the `synthetics_private_location_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Request #### Body Data (required) Details of the private location to create. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description description [_required_] string Description of the private location. id string Unique identifier of the private location. metadata object Object containing metadata about the private location. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. name [_required_] string Name of the private location. secrets object Secrets for the private location. Only present in the response when creating the private location. authentication object Authentication part of the secrets. id string Access key for the private location. key string Secret access key for the private location. config_decryption object Private key for the private location. key string Private key for the private location. tags [_required_] [string] Array of tags attached to the private location. ``` { "description": "Test Example-Synthetic description", "metadata": { "restricted_roles": [ "string" ] }, "name": "Example-Synthetic", "tags": [ "test:examplesynthetic" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-200-v1) * [402](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-402-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object that contains the new private location, the public key for result encryption, and the configuration skeleton. Field Type Description config object Configuration skeleton for the private location. See installation instructions of the private location on how to use this configuration. private_location object Object containing information about the private location to create. 
### Response

* [200](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-200-v1)
* [402](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-402-v1)
* [404](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-404-v1)
* [429](https://docs.datadoghq.com/api/latest/synthetics/#CreatePrivateLocation-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Object that contains the new private location, the public key for result encryption, and the configuration skeleton. Field Type Description config object Configuration skeleton for the private location. See installation instructions of the private location on how to use this configuration. private_location object Object containing information about the private location to create. description [_required_] string Description of the private location. id string Unique identifier of the private location. metadata object Object containing metadata about the private location. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. name [_required_] string Name of the private location. secrets object Secrets for the private location. Only present in the response when creating the private location. authentication object Authentication part of the secrets. id string Access key for the private location. key string Secret access key for the private location. config_decryption object Private key for the private location. key string Private key for the private location. tags [_required_] [string] Array of tags attached to the private location. result_encryption object Public key for the result encryption. id string Fingerprint for the encryption key. key string Public key for result encryption.

```
{ "config": {}, "private_location": { "description": "Description of private location", "id": "string", "metadata": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "name": "New private location", "secrets": { "authentication": { "id": "string", "key": "string" }, "config_decryption": { "key": "string" } }, "tags": [ "team:front" ] }, "result_encryption": { "id": "string", "key": "string" } }
```

Quota reached for private locations

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Private locations are not activated for the user

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Create a private location returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/private-locations" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "description": "Test Example-Synthetic description", "metadata": { "restricted_roles": [ "string" ] }, "name": "Example-Synthetic", "tags": [ "test:examplesynthetic" ] } EOF ``` ##### Create a private location returns "OK" response ``` // Create a private location returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "role" in the system RoleDataID := os.Getenv("ROLE_DATA_ID") body := datadogV1.SyntheticsPrivateLocation{ Description: "Test Example-Synthetic description", Metadata: &datadogV1.SyntheticsPrivateLocationMetadata{ RestrictedRoles: []string{ RoleDataID, }, }, Name: "Example-Synthetic", Tags: []string{ "test:examplesynthetic", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.CreatePrivateLocation(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.CreatePrivateLocation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.CreatePrivateLocation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a private location returns "OK" response ``` // Create a private location returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsPrivateLocation; import com.datadog.api.client.v1.model.SyntheticsPrivateLocationCreationResponse; import com.datadog.api.client.v1.model.SyntheticsPrivateLocationMetadata; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); // there is a 
valid "role" in the system String ROLE_DATA_ID = System.getenv("ROLE_DATA_ID"); SyntheticsPrivateLocation body = new SyntheticsPrivateLocation() .description("Test Example-Synthetic description") .metadata( new SyntheticsPrivateLocationMetadata() .restrictedRoles(Collections.singletonList(ROLE_DATA_ID))) .name("Example-Synthetic") .tags(Collections.singletonList("test:examplesynthetic")); try { SyntheticsPrivateLocationCreationResponse result = apiInstance.createPrivateLocation(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#createPrivateLocation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a private location returns "OK" response ``` """ Create a private location returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_private_location import SyntheticsPrivateLocation from datadog_api_client.v1.model.synthetics_private_location_metadata import SyntheticsPrivateLocationMetadata from datadog_api_client.v1.model.synthetics_restricted_roles import SyntheticsRestrictedRoles # there is a valid "role" in the system ROLE_DATA_ID = environ["ROLE_DATA_ID"] body = SyntheticsPrivateLocation( description="Test Example-Synthetic description", metadata=SyntheticsPrivateLocationMetadata( restricted_roles=SyntheticsRestrictedRoles( [ ROLE_DATA_ID, ] ), ), name="Example-Synthetic", tags=[ "test:examplesynthetic", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.create_private_location(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a private location returns "OK" response ``` # Create a private location returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new # there is a valid "role" in the system ROLE_DATA_ID = ENV["ROLE_DATA_ID"] body = DatadogAPIClient::V1::SyntheticsPrivateLocation.new({ description: "Test Example-Synthetic description", metadata: DatadogAPIClient::V1::SyntheticsPrivateLocationMetadata.new({ restricted_roles: [ ROLE_DATA_ID, ], }), name: "Example-Synthetic", tags: [ "test:examplesynthetic", ], }) p api_instance.create_private_location(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a private location returns "OK" response ``` // Create a private location returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsPrivateLocation; use datadog_api_client::datadogV1::model::SyntheticsPrivateLocationMetadata; #[tokio::main] async fn main() { // there is a valid "role" in the system let role_data_id = std::env::var("ROLE_DATA_ID").unwrap(); let body = SyntheticsPrivateLocation::new( "Test Example-Synthetic description".to_string(), "Example-Synthetic".to_string(), vec!["test:examplesynthetic".to_string()], ) .metadata( SyntheticsPrivateLocationMetadata::new().restricted_roles(vec![role_data_id.clone()]), ); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.create_private_location(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a private location returns "OK" response ``` /** * Create a private location returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); // there is a valid "role" in the system const ROLE_DATA_ID = process.env.ROLE_DATA_ID as string; const params: v1.SyntheticsApiCreatePrivateLocationRequest = { body: { description: "Test Example-Synthetic description", metadata: { restrictedRoles: [ROLE_DATA_ID], }, name: "Example-Synthetic", tags: ["test:examplesynthetic"], }, }; apiInstance .createPrivateLocation(params) .then((data: v1.SyntheticsPrivateLocationCreationResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a private location](https://docs.datadoghq.com/api/latest/synthetics/#get-a-private-location) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-private-location-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.ap2.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.eu/api/v1/synthetics/private-locations/{location_id}https://api.ddog-gov.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us3.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/{location_id} ### Overview Get a Synthetic private location. This endpoint requires the `synthetics_private_location_read` permission. OAuth apps require the `synthetics_private_location_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description location_id [_required_] string The ID of the private location. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetPrivateLocation-200-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetPrivateLocation-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetPrivateLocation-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing information about the private location to create. Field Type Description description [_required_] string Description of the private location. id string Unique identifier of the private location. metadata object Object containing metadata about the private location. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. name [_required_] string Name of the private location. secrets object Secrets for the private location. Only present in the response when creating the private location. authentication object Authentication part of the secrets. id string Access key for the private location. key string Secret access key for the private location. config_decryption object Private key for the private location. key string Private key for the private location. tags [_required_] [string] Array of tags attached to the private location. ``` { "description": "Description of private location", "id": "string", "metadata": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "name": "New private location", "secrets": { "authentication": { "id": "string", "key": "string" }, "config_decryption": { "key": "string" } }, "tags": [ "team:front" ] } ``` Copy - Synthetic private locations are not activated for the user - Private location does not exist * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get a private location Copy ``` # Path parameters export location_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/${location_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a private location ``` """ Get a private location returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.get_private_location( location_id="location_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a private location ``` # Get a private location returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.get_private_location("location_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a private location ``` // Get a private location returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.GetPrivateLocation(ctx, "location_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetPrivateLocation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
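	// resp is the SyntheticsPrivateLocation described in the 200 response above.
	// The "secrets" block is only included when a private location is first created,
	// so it is not expected in this GET response.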
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetPrivateLocation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a private location ``` // Get a private location returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsPrivateLocation; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsPrivateLocation result = apiInstance.getPrivateLocation("location_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#getPrivateLocation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a private location ``` // Get a private location returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.get_private_location("location_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a private location ``` /** * Get a private location returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiGetPrivateLocationRequest = { locationId: "location_id", }; apiInstance .getPrivateLocation(params) .then((data: v1.SyntheticsPrivateLocation) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Edit a private location](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-private-location) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-private-location-v1) PUT https://api.ap1.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.ap2.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.eu/api/v1/synthetics/private-locations/{location_id}https://api.ddog-gov.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us3.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/{location_id} ### Overview Edit a Synthetic private location. This endpoint requires the `synthetics_private_location_write` permission. OAuth apps require the `synthetics_private_location_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description location_id [_required_] string The ID of the private location. ### Request #### Body Data (required) Details of the private location to be updated. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description description [_required_] string Description of the private location. id string Unique identifier of the private location. metadata object Object containing metadata about the private location. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. name [_required_] string Name of the private location. secrets object Secrets for the private location. Only present in the response when creating the private location. authentication object Authentication part of the secrets. id string Access key for the private location. key string Secret access key for the private location. config_decryption object Private key for the private location. key string Private key for the private location. tags [_required_] [string] Array of tags attached to the private location. ``` { "description": "Description of private location", "metadata": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "name": "New private location", "tags": [ "team:front" ] } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdatePrivateLocation-200-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdatePrivateLocation-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdatePrivateLocation-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing information about the private location to create. Field Type Description description [_required_] string Description of the private location. id string Unique identifier of the private location. 
metadata object Object containing metadata about the private location. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. name [_required_] string Name of the private location. secrets object Secrets for the private location. Only present in the response when creating the private location. authentication object Authentication part of the secrets. id string Access key for the private location. key string Secret access key for the private location. config_decryption object Private key for the private location. key string Private key for the private location. tags [_required_] [string] Array of tags attached to the private location. ``` { "description": "Description of private location", "id": "string", "metadata": { "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ] }, "name": "New private location", "secrets": { "authentication": { "id": "string", "key": "string" }, "config_decryption": { "key": "string" } }, "tags": [ "team:front" ] } ``` Copy - Private locations are not activated for the user - Private location does not exist * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Edit a private location Copy ``` # Path parameters export location_id="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/${location_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "description": "Description of private location", "name": "New private location", "tags": [ "team:front" ] } EOF ``` ##### Edit a private location ``` """ Edit a private location returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi from datadog_api_client.v1.model.synthetics_private_location import SyntheticsPrivateLocation from datadog_api_client.v1.model.synthetics_private_location_metadata import SyntheticsPrivateLocationMetadata from datadog_api_client.v1.model.synthetics_restricted_roles import SyntheticsRestrictedRoles body = SyntheticsPrivateLocation( description="Description of private location", 
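    # description, name, and tags are required in the PUT body (see the request schema above).
    # restricted_roles lists role UUIDs from the Roles API and restricts read/write
    # access to this private location.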
metadata=SyntheticsPrivateLocationMetadata( restricted_roles=SyntheticsRestrictedRoles( [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ] ), ), name="New private location", tags=[ "team:front", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.update_private_location(location_id="location_id", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Edit a private location ``` # Edit a private location returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new body = DatadogAPIClient::V1::SyntheticsPrivateLocation.new({ description: "Description of private location", metadata: DatadogAPIClient::V1::SyntheticsPrivateLocationMetadata.new({ restricted_roles: [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", ], }), name: "New private location", tags: [ "team:front", ], }) p api_instance.update_private_location("location_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Edit a private location ``` // Edit a private location returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.SyntheticsPrivateLocation{ Description: "Description of private location", Metadata: &datadogV1.SyntheticsPrivateLocationMetadata{ RestrictedRoles: []string{ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", }, }, Name: "New private location", Tags: []string{ "team:front", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.UpdatePrivateLocation(ctx, "location_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.UpdatePrivateLocation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.UpdatePrivateLocation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Edit a private location ``` // Edit a private location returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsPrivateLocation; import 
com.datadog.api.client.v1.model.SyntheticsPrivateLocationMetadata; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); SyntheticsPrivateLocation body = new SyntheticsPrivateLocation() .description("Description of private location") .metadata( new SyntheticsPrivateLocationMetadata() .restrictedRoles( Collections.singletonList("xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"))) .name("New private location") .tags(Collections.singletonList("team:front")); try { SyntheticsPrivateLocation result = apiInstance.updatePrivateLocation("location_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#updatePrivateLocation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Edit a private location ``` // Edit a private location returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; use datadog_api_client::datadogV1::model::SyntheticsPrivateLocation; use datadog_api_client::datadogV1::model::SyntheticsPrivateLocationMetadata; #[tokio::main] async fn main() { let body = SyntheticsPrivateLocation::new( "Description of private location".to_string(), "New private location".to_string(), vec!["team:front".to_string()], ) .metadata( SyntheticsPrivateLocationMetadata::new() .restricted_roles(vec!["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx".to_string()]), ); let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api .update_private_location("location_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Edit a private location ``` /** * Edit a private location returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiUpdatePrivateLocationRequest = { body: { description: "Description of private location", metadata: { restrictedRoles: ["xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"], }, name: "New private location", tags: ["team:front"], }, locationId: "location_id", }; apiInstance .updatePrivateLocation(params) .then((data: v1.SyntheticsPrivateLocation) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get all locations (public and private)](https://docs.datadoghq.com/api/latest/synthetics/#get-all-locations-public-and-private) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-all-locations-public-and-private-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/locationshttps://api.ap2.datadoghq.com/api/v1/synthetics/locationshttps://api.datadoghq.eu/api/v1/synthetics/locationshttps://api.ddog-gov.com/api/v1/synthetics/locationshttps://api.datadoghq.com/api/v1/synthetics/locationshttps://api.us3.datadoghq.com/api/v1/synthetics/locationshttps://api.us5.datadoghq.com/api/v1/synthetics/locations ### Overview Get the list of public and private locations available for Synthetic tests. No arguments required. This endpoint requires the `synthetics_private_location_read` permission. OAuth apps require the `synthetics_private_location_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#ListLocations-200-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#ListLocations-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) List of Synthetic locations. Field Type Description locations [object] List of Synthetic locations. id string Unique identifier of the location. name string Name of the location. ``` { "locations": [ { "id": "string", "name": "string" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Get all locations (public and private) Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/locations" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all locations (public and private) ``` """ Get all locations (public and private) returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.list_locations() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all locations (public and private) ``` # Get all locations (public and private) returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.list_locations() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all locations (public and private) ``` // Get all locations (public and private) returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.ListLocations(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.ListLocations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.ListLocations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all locations (public and private) ``` // Get all locations (public and private) returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsLocations; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsLocations result = apiInstance.listLocations(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#listLocations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all locations (public and private) ``` // Get all locations (public and private) returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.list_locations().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all locations (public and private) ``` /** * Get all locations (public and private) returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); apiInstance .listLocations() .then((data: v1.SyntheticsLocations) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a private location](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-private-location) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#delete-a-private-location-v1) DELETE https://api.ap1.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.ap2.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.eu/api/v1/synthetics/private-locations/{location_id}https://api.ddog-gov.com/api/v1/synthetics/private-locations/{location_id}https://api.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us3.datadoghq.com/api/v1/synthetics/private-locations/{location_id}https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/{location_id} ### Overview Delete a Synthetic private location. This endpoint requires the `synthetics_private_location_write` permission. OAuth apps require the `synthetics_private_location_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description location_id [_required_] string The ID of the private location. ### Response * [204](https://docs.datadoghq.com/api/latest/synthetics/#DeletePrivateLocation-204-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#DeletePrivateLocation-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#DeletePrivateLocation-429-v1) OK - Private locations are not activated for the user - Private location does not exist * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Delete a private location Copy ``` # Path parameters export location_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/private-locations/${location_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a private location ``` """ Delete a private location returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) api_instance.delete_private_location( location_id="location_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a private location ``` # Delete a private location returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new api_instance.delete_private_location("location_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a private location ``` // Delete a private location returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) r, err := api.DeletePrivateLocation(ctx, "location_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.DeletePrivateLocation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a private location ``` // Delete a private location returns "OK" response import 
com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { apiInstance.deletePrivateLocation("location_id"); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#deletePrivateLocation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a private location ``` // Delete a private location returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.delete_private_location("location_id".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a private location ``` /** * Delete a private location returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); const params: v1.SyntheticsApiDeletePrivateLocationRequest = { locationId: "location_id", }; apiInstance .deletePrivateLocation(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a test configuration](https://docs.datadoghq.com/api/latest/synthetics/#get-a-test-configuration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-a-test-configuration-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id} ### Overview Get the detailed configuration associated with a Synthetic test. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Path Parameters Name Type Description public_id [_required_] string The public ID of the test to get details from. ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#GetTest-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#GetTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#GetTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#GetTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about your Synthetic test, without test steps. Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. 
type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. 
Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. 
Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. 
min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. 
Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "subtype": "http", "tags": [], "type": "string" } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Not found:

- Synthetics is not activated for the user
- Test is not owned by the user

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)
* [Ruby [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby-legacy)
* [Python [legacy]](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python-legacy)

##### Get a test configuration

```
# Path parameters
export public_id="CHANGE_ME"
# Curl command (replace api.datadoghq.com with your Datadog site, for example api.datadoghq.eu)
curl -X GET "https://api.datadoghq.com/api/v1/synthetics/tests/${public_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get a test configuration

```
"""
Get a test configuration returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    response = api_instance.get_test(
        public_id="public_id",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
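The Python example above raises an exception for the error responses documented earlier (403, 404, 429) instead of returning an `errors` array. The following is a minimal sketch of handling those errors; it assumes the client exposes `ApiException` with `status` and `body` attributes under `datadog_api_client.exceptions`, which may differ between client versions.

```
# Hedged sketch: fetch a test and surface the documented error responses.
# ApiException (and its .status / .body attributes) is an assumption about
# the Python client's exception surface; adjust to your client version.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.exceptions import ApiException
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    try:
        response = api_instance.get_test(public_id="public_id")  # placeholder ID
        print(response)
    except ApiException as e:
        # 403 Forbidden, 404 (test not found or Synthetics not activated),
        # and 429 Too many requests all return an {"errors": [...]} body.
        print(f"get_test failed with HTTP {e.status}: {e.body}")
```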
##### Get a test configuration

```
# Get a test configuration returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::SyntheticsAPI.new
p api_instance.get_test("public_id")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a test configuration

```
require 'dogapi'

api_key = ''
app_key = ''
test_id = ''

dog = Dogapi::Client.new(api_key, app_key)

dog.get_synthetics_test('test_id' => test_id)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get a test configuration

```
// Get a test configuration returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewSyntheticsApi(apiClient)
	resp, r, err := api.GetTest(ctx, "public_id")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetTest`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetTest`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get a test configuration

```
// Get a test configuration returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.SyntheticsApi;
import com.datadog.api.client.v1.model.SyntheticsTestDetails;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    SyntheticsApi apiInstance = new SyntheticsApi(defaultClient);

    try {
      SyntheticsTestDetails result = apiInstance.getTest("public_id");
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling SyntheticsApi#getTest");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get a test configuration

```
from datadog import initialize, api

options = {
    'api_key': '',
    'app_key': ''
}

test_id = ''

initialize(**options)

api.Synthetics.get_test(id=test_id)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python "example.py"
```
##### Get a test configuration

```
// Get a test configuration returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = SyntheticsAPI::with_config(configuration);
    let resp = api.get_test("public_id".to_string()).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get a test configuration

```
/**
 * Get a test configuration returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.SyntheticsApi(configuration);

const params: v1.SyntheticsApiGetTestRequest = {
  publicId: "public_id",
};

apiInstance
  .getTest(params)
  .then((data: v1.SyntheticsTestDetails) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
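The non-legacy client examples above pick up `DD_API_KEY`, `DD_APP_KEY`, and optionally `DD_SITE` from the environment. If you prefer to configure the site and keys in code, the following is a hedged sketch for the Python client; the `server_variables["site"]` and `api_key` attributes are assumptions about the client's `Configuration` object and may differ between client versions.

```
# Hedged sketch: configure the Datadog site and keys programmatically
# instead of via the DD_SITE / DD_API_KEY / DD_APP_KEY environment variables.
# The server_variables / api_key attributes are assumptions about the
# Python client's Configuration object.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
configuration.server_variables["site"] = "datadoghq.eu"  # for example, the EU site
configuration.api_key["apiKeyAuth"] = "<YOUR_API_KEY>"
configuration.api_key["appKeyAuth"] = "<YOUR_APPLICATION_KEY>"

with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    print(api_instance.get_test(public_id="public_id"))  # placeholder ID
```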
* * *

## [Get the default locations](https://docs.datadoghq.com/api/latest/synthetics/#get-the-default-locations)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#get-the-default-locations-v1)

GET https://api.ap1.datadoghq.com/api/v1/synthetics/settings/default_locationshttps://api.ap2.datadoghq.com/api/v1/synthetics/settings/default_locationshttps://api.datadoghq.eu/api/v1/synthetics/settings/default_locationshttps://api.ddog-gov.com/api/v1/synthetics/settings/default_locationshttps://api.datadoghq.com/api/v1/synthetics/settings/default_locationshttps://api.us3.datadoghq.com/api/v1/synthetics/settings/default_locationshttps://api.us5.datadoghq.com/api/v1/synthetics/settings/default_locations

### Overview

Get the default locations settings. This endpoint requires the `synthetics_default_settings_read` permission.

### Response

* [200](https://docs.datadoghq.com/api/latest/synthetics/#GetSyntheticsDefaultLocations-200-v1)
* [429](https://docs.datadoghq.com/api/latest/synthetics/#GetSyntheticsDefaultLocations-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

List of Synthetics default locations settings. Field Type Description string Name of the location.

```
[ "aws:eu-west-3" ]
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)

##### Get the default locations

```
# Curl command (replace api.datadoghq.com with your Datadog site, for example api.datadoghq.eu)
curl -X GET "https://api.datadoghq.com/api/v1/synthetics/settings/default_locations" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get the default locations

```
"""
Get the default locations returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    response = api_instance.get_synthetics_default_locations()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```
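The endpoint returns a flat array of location strings (for example `aws:eu-west-3`), which is the same format used by a test's `locations` field. The following small sketch, not part of the reference, combines the two Python calls documented on this page to list the default locations an existing test does not run from; the `to_dict()` call on the returned model is an assumption about the client.

```
# Hedged sketch: compare an existing test's locations against the defaults.
# "public_id" is a placeholder; to_dict() on the returned model is assumed.
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    defaults = set(api_instance.get_synthetics_default_locations())
    test = api_instance.get_test(public_id="public_id").to_dict()
    missing = defaults - set(test.get("locations", []))
    print(f"Default locations not used by this test: {sorted(missing)}")
```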
##### Get the default locations

```
# Get the default locations returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::SyntheticsAPI.new
p api_instance.get_synthetics_default_locations()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get the default locations

```
// Get the default locations returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewSyntheticsApi(apiClient)
	resp, r, err := api.GetSyntheticsDefaultLocations(ctx)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.GetSyntheticsDefaultLocations`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.GetSyntheticsDefaultLocations`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get the default locations

```
// Get the default locations returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.SyntheticsApi;
import java.util.List;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    SyntheticsApi apiInstance = new SyntheticsApi(defaultClient);

    try {
      List result = apiInstance.getSyntheticsDefaultLocations();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling SyntheticsApi#getSyntheticsDefaultLocations");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get the default locations

```
// Get the default locations returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = SyntheticsAPI::with_config(configuration);
    let resp = api.get_synthetics_default_locations().await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get the default locations

```
/**
 * Get the default locations returns "OK" response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.SyntheticsApi(configuration);

apiInstance
  .getSyntheticsDefaultLocations()
  .then((data: string[]) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Edit a test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-test)

* [v1 (deprecated)](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-test-v1)

PUT https://api.ap1.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.ap2.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.eu/api/v1/synthetics/tests/{public_id}https://api.ddog-gov.com/api/v1/synthetics/tests/{public_id}https://api.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us3.datadoghq.com/api/v1/synthetics/tests/{public_id}https://api.us5.datadoghq.com/api/v1/synthetics/tests/{public_id}

### Overview

This endpoint is deprecated. To edit a test, use [Edit an API test](https://docs.datadoghq.com/api/latest/synthetics/#edit-an-api-test) or [Edit a browser test](https://docs.datadoghq.com/api/latest/synthetics/#edit-a-browser-test). This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint.

### Arguments

#### Path Parameters

Name Type Description public_id [_required_] string The public ID of the test to edit.

### Request

#### Body Data (required)

New test details to be saved.
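The request body is the full test definition, documented field by field in the reference that follows, but only a handful of fields are needed for a simple API HTTP test. The following is a minimal, hedged sketch of such a payload sent to this PUT endpoint using only the Python standard library; `public_id` and the field values are illustrative placeholders. Because the endpoint is deprecated, prefer the Edit an API test or Edit a browser test endpoints linked above for new automation; the sketch only illustrates the body structure described below.

```
# Hedged sketch: PUT a minimal "new test details" payload to the endpoint above.
# Field names come from the reference below; the values are illustrative only.
import json
import os
import urllib.request

public_id = "public_id"  # placeholder
body = {
    "name": "Example API test",
    "type": "api",
    "subtype": "http",
    "config": {
        "assertions": [{"type": "statusCode", "operator": "is", "target": 200}],
        "request": {"method": "GET", "url": "https://example.com", "timeout": 30},
    },
    "locations": ["aws:eu-west-3"],
    "options": {"tick_every": 300},
    "message": "Test failed.",
}

req = urllib.request.Request(
    url=f"https://api.datadoghq.com/api/v1/synthetics/tests/{public_id}",
    data=json.dumps(body).encode("utf-8"),
    method="PUT",
    headers={
        "Content-Type": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read()))
```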
* [Model](https://docs.datadoghq.com/api/latest/synthetics/)
* [Example](https://docs.datadoghq.com/api/latest/synthetics/)

Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator.
operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. 
basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. 
Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. 
headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. 
monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] The steps of the test if they exist. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. 
Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "string", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": 
"integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "subtype": "http", "tags": [], "type": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTest-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTest-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTest-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#UpdateTest-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about your Synthetic test. Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. 
Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. 
serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. 
updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. 
variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. 
restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] The steps of the test if they exist. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. 
Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "subtype": "http", "tags": [], "type": "string" } 
```
- JSON format is wrong
- Updating sub-type is forbidden
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
Forbidden
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
- Synthetic is not activated for the user
- Test is not owned by the user
- Test can't be found
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
Too many requests
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl)
##### Edit a test
```
# Path parameters
export public_id="CHANGE_ME"
# Curl command (use the endpoint for your Datadog site, for example api.datadoghq.com)
curl -X PUT "https://api.datadoghq.com/api/v1/synthetics/tests/${public_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "config": {
    "configVariables": [ { "name": "VARIABLE_NAME", "type": "text" } ],
    "request": { "proxy": { "url": "https://example.com" } },
    "variables": [ { "name": "VARIABLE_NAME", "type": "text" } ]
  },
  "options": {
    "ci": { "executionRule": "blocking" },
    "rumSettings": { "isEnabled": true },
    "scheduling": {
      "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ],
      "timezone": "America/New_York"
    }
  }
}
EOF
```
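Because this endpoint takes a full test definition (see the body model above) and rejects changes to `subtype`, a common pattern is to read the current configuration, change only the fields you need, and send the whole object back. The sketch below is a minimal illustration in Python with `requests`, not an official client example; it assumes the companion `GET /api/v1/synthetics/tests/{public_id}` endpoint documented elsewhere on this page, and the list of fields to strip and the `public_id` value are placeholders to adapt to your account.
```
# Minimal sketch: read-modify-write a Synthetic test via the HTTP API.
# Assumptions: GET /api/v1/synthetics/tests/{public_id} returns the current definition,
# and some read-only fields may need to be removed before the PUT; adjust as needed.
import os
import requests

site = os.environ.get("DD_SITE", "datadoghq.com")
base = f"https://api.{site}/api/v1/synthetics/tests"
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}
public_id = "abc-def-123"  # hypothetical test public ID

# 1. Fetch the current test definition.
current = requests.get(f"{base}/{public_id}", headers=headers, timeout=30)
current.raise_for_status()
definition = current.json()

# 2. Strip fields the PUT endpoint may not accept back (assumed list).
for field in ("public_id", "monitor_id", "creator", "created_at", "modified_at"):
    definition.pop(field, None)

# 3. Edit only what you need; `subtype` must stay unchanged
#    ("Updating sub-type is forbidden" in the 400 response above).
definition["name"] = "Checkout flow - staging"

# 4. Send the full definition back.
resp = requests.put(f"{base}/{public_id}", headers=headers, json=definition, timeout=30)
resp.raise_for_status()
print(resp.json().get("public_id"))
```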
* * *
## [Fetch uptime for multiple tests](https://docs.datadoghq.com/api/latest/synthetics/#fetch-uptime-for-multiple-tests)
* [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#fetch-uptime-for-multiple-tests-v1)
POST https://api.ap1.datadoghq.com/api/v1/synthetics/tests/uptimeshttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/uptimeshttps://api.datadoghq.eu/api/v1/synthetics/tests/uptimeshttps://api.ddog-gov.com/api/v1/synthetics/tests/uptimeshttps://api.datadoghq.com/api/v1/synthetics/tests/uptimeshttps://api.us3.datadoghq.com/api/v1/synthetics/tests/uptimeshttps://api.us5.datadoghq.com/api/v1/synthetics/tests/uptimes
### Overview
Fetch uptime for multiple Synthetic tests by ID. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint.
### Request
#### Body Data (required)
Public ID list of the Synthetic tests and timeframe.
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description from_ts [_required_] int64 Timestamp in seconds (Unix epoch) for the start of uptime. public_ids [_required_] [string] An array of Synthetic test IDs you want uptimes for. to_ts [_required_] int64 Timestamp in seconds (Unix epoch) for the end of uptime.
```
{ "from_ts": 1726041488, "public_ids": [ "p8m-9gw-nte" ], "to_ts": 1726055954 }
```
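`from_ts` and `to_ts` are plain Unix epoch timestamps in seconds. As a small, hedged illustration (plain Python, standard library only; the 24-hour window and the test ID are arbitrary choices, not API defaults), this is one way to build the request body for the last day:
```
# Sketch: build a fetch-uptimes request body covering the last 24 hours.
import json
import time

now = int(time.time())              # to_ts: end of the window, epoch seconds
one_day = 24 * 60 * 60
body = {
    "from_ts": now - one_day,       # start of the window
    "public_ids": ["p8m-9gw-nte"],  # IDs of the Synthetic tests to query
    "to_ts": now,
}
print(json.dumps(body))
```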
### Response
* [200](https://docs.datadoghq.com/api/latest/synthetics/#FetchUptimes-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#FetchUptimes-400-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#FetchUptimes-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#FetchUptimes-429-v1)
OK.
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description from_ts int64 Timestamp in seconds for the start of uptime. overall object Object containing the uptime information. errors [object] An array of error objects returned while querying the history data for the service level objective. error_message [_required_] string A message with more details about the error. error_type [_required_] string Type of the error. group string The location name. history [array] The state transition history for the monitor, represented as an array of pairs. Each pair is an array where the first element is the transition timestamp in Unix epoch format (integer) and the second element is the state (integer). For the state, an integer value of `0` indicates uptime, `1` indicates downtime, and `2` indicates no data. span_precision double The number of decimal places to which the SLI value is accurate for the given from-to timestamps. uptime double The overall uptime. public_id string A Synthetic test ID. to_ts int64 Timestamp in seconds for the end of uptime.
```
{ "from_ts": "integer", "overall": { "errors": [ { "error_message": "", "error_type": "" } ], "group": "name", "history": [ [ 1579212382, 0 ] ], "span_precision": 2, "uptime": 99.99 }, "public_id": "abc-def-123", "to_ts": "integer" }
```
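The `overall.history` pairs are compact and easy to misread. The sketch below (plain Python; the response payload is a made-up example, not real data) shows how the documented encoding, `[timestamp, state]` with `0` = up, `1` = down, and `2` = no data, can be turned into readable transitions:
```
# Sketch: decode the `overall.history` pairs from a fetch-uptimes response.
# `uptime_result` stands in for one element of the API's response array.
from datetime import datetime, timezone

uptime_result = {
    "public_id": "p8m-9gw-nte",
    "overall": {
        "uptime": 99.99,
        "history": [[1726041488, 0], [1726048000, 1], [1726050000, 0]],
    },
}

STATES = {0: "up", 1: "down", 2: "no data"}

print(f"{uptime_result['public_id']}: {uptime_result['overall']['uptime']}% uptime")
for ts, state in uptime_result["overall"]["history"]:
    when = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    print(f"  {when} -> {STATES.get(state, 'unknown')}")
```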
- JSON format is wrong
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
Forbidden
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
Too many requests
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
```
{ "errors": [ "Bad Request" ] }
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript)
##### Fetch uptime for multiple tests returns "OK." response
```
# Curl command (use the endpoint for your Datadog site, for example api.datadoghq.com)
curl -X POST "https://api.datadoghq.com/api/v1/synthetics/tests/uptimes" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "from_ts": 1726041488,
  "public_ids": [
    "p8m-9gw-nte"
  ],
  "to_ts": 1726055954
}
EOF
```
##### Fetch uptime for multiple tests returns "OK." response
```
// Fetch uptime for multiple tests returns "OK." response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1"
)

func main() {
	body := datadogV1.SyntheticsFetchUptimesPayload{
		FromTs: 1726041488,
		PublicIds: []string{
			"p8m-9gw-nte",
		},
		ToTs: 1726055954,
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV1.NewSyntheticsApi(apiClient)
	resp, r, err := api.FetchUptimes(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.FetchUptimes`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.FetchUptimes`:\n%s\n", responseContent)
}
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" go run "main.go"
```
##### Fetch uptime for multiple tests returns "OK." response
```
// Fetch uptime for multiple tests returns "OK." response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v1.api.SyntheticsApi;
import com.datadog.api.client.v1.model.SyntheticsFetchUptimesPayload;
import com.datadog.api.client.v1.model.SyntheticsTestUptime;
import java.util.Collections;
import java.util.List;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    SyntheticsApi apiInstance = new SyntheticsApi(defaultClient);

    SyntheticsFetchUptimesPayload body =
        new SyntheticsFetchUptimesPayload()
            .fromTs(1726041488L)
            .publicIds(Collections.singletonList("p8m-9gw-nte"))
            .toTs(1726055954L);

    try {
      List<SyntheticsTestUptime> result = apiInstance.fetchUptimes(body);
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling SyntheticsApi#fetchUptimes");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" java "Example.java"
```
##### Fetch uptime for multiple tests returns "OK." response
```
"""
Fetch uptime for multiple tests returns "OK." response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.synthetics_api import SyntheticsApi
from datadog_api_client.v1.model.synthetics_fetch_uptimes_payload import SyntheticsFetchUptimesPayload

body = SyntheticsFetchUptimesPayload(
    from_ts=1726041488,
    public_ids=[
        "p8m-9gw-nte",
    ],
    to_ts=1726055954,
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = SyntheticsApi(api_client)
    response = api_instance.fetch_uptimes(body=body)

    print(response)
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py"
```
##### Fetch uptime for multiple tests returns "OK." response
```
# Fetch uptime for multiple tests returns "OK." response

require "datadog_api_client"
api_instance = DatadogAPIClient::V1::SyntheticsAPI.new

body = DatadogAPIClient::V1::SyntheticsFetchUptimesPayload.new({
  from_ts: 1726041488,
  public_ids: [
    "p8m-9gw-nte",
  ],
  to_ts: 1726055954,
})
p api_instance.fetch_uptimes(body)
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb"
```
##### Fetch uptime for multiple tests returns "OK." response
```
// Fetch uptime for multiple tests returns "OK." response
use datadog_api_client::datadog;
use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI;
use datadog_api_client::datadogV1::model::SyntheticsFetchUptimesPayload;

#[tokio::main]
async fn main() {
    let body =
        SyntheticsFetchUptimesPayload::new(1726041488, vec!["p8m-9gw-nte".to_string()], 1726055954);
    let configuration = datadog::Configuration::new();
    let api = SyntheticsAPI::with_config(configuration);
    let resp = api.fetch_uptimes(body).await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" cargo run
```
##### Fetch uptime for multiple tests returns "OK." response
```
/**
 * Fetch uptime for multiple tests returns "OK." response
 */

import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.SyntheticsApi(configuration);

const params: v1.SyntheticsApiFetchUptimesRequest = {
  body: {
    fromTs: 1726041488,
    publicIds: ["p8m-9gw-nte"],
    toTs: 1726055954,
  },
};

apiInstance
  .fetchUptimes(params)
  .then((data: v1.SyntheticsTestUptime[]) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" tsc "example.ts"
```
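Every endpoint on this page can also answer with 429 Too many requests. As a hedged sketch (plain Python with `requests`; the `X-RateLimit-Reset` header and the retry budget are assumptions to verify against your account's rate limits), one way to wrap these calls with a simple backoff:
```
# Sketch: POST to a Synthetics endpoint and back off on 429 responses.
# Assumption: an X-RateLimit-Reset header (seconds until the limit resets) may be
# present on throttled responses; if it is not, fall back to exponential backoff.
import os
import time
import requests

def post_with_backoff(path: str, payload: dict, max_attempts: int = 5) -> dict:
    site = os.environ.get("DD_SITE", "datadoghq.com")
    headers = {
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        "Content-Type": "application/json",
    }
    for attempt in range(max_attempts):
        resp = requests.post(f"https://api.{site}{path}", headers=headers, json=payload, timeout=30)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        wait = float(resp.headers.get("X-RateLimit-Reset", 2 ** attempt))
        time.sleep(wait)
    raise RuntimeError(f"Still rate limited after {max_attempts} attempts")

# Example call using the documented sample window and test ID.
print(post_with_backoff(
    "/api/v1/synthetics/tests/uptimes",
    {"from_ts": 1726041488, "public_ids": ["p8m-9gw-nte"], "to_ts": 1726055954},
))
```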
* * *
## [Create a test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-test)
* [v1 (deprecated)](https://docs.datadoghq.com/api/latest/synthetics/#create-a-test-v1)
POST https://api.ap1.datadoghq.com/api/v1/synthetics/testshttps://api.ap2.datadoghq.com/api/v1/synthetics/testshttps://api.datadoghq.eu/api/v1/synthetics/testshttps://api.ddog-gov.com/api/v1/synthetics/testshttps://api.datadoghq.com/api/v1/synthetics/testshttps://api.us3.datadoghq.com/api/v1/synthetics/testshttps://api.us5.datadoghq.com/api/v1/synthetics/tests
### Overview
This endpoint is deprecated. To create a test, use [Create an API test](https://docs.datadoghq.com/api/latest/synthetics/#create-an-api-test) or [Create a browser test](https://docs.datadoghq.com/api/latest/synthetics/#create-a-browser-test). This endpoint requires the `synthetics_write` permission. OAuth apps require the `synthetics_write` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint.
### Request
#### Body Data (required)
Details of the test to create.
* [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply.
Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. 
targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. 
clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. 
follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. 
Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. 
timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] The steps of the test if they exist. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", 
"id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "locations": [ "aws:eu-west-3" ], "message": "string", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "subtype": "http", "tags": [], "type": "string" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#CreateTest-200-v1) * [400](https://docs.datadoghq.com/api/latest/synthetics/#CreateTest-400-v1) * [402](https://docs.datadoghq.com/api/latest/synthetics/#CreateTest-402-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#CreateTest-403-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#CreateTest-429-v1) OK - Returns the created test details. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing details about your Synthetic test. Field Type Description config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. 
Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. 
secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. 
resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. 
numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. 
initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. Allowed enum values: `live,paused` steps [object] The steps of the test if they exist. allowFailure boolean A boolean set to allow this step to fail. alwaysExecute boolean A boolean set to always execute this step even if the previous step failed or was skipped. exitIfSucceed boolean A boolean set to exit the test if the step succeeds. isCritical boolean A boolean to use in addition to `allowFailure` to determine if the test should be marked as failed when the step fails. name string The name of the step. noScreenshot boolean A boolean set to skip taking a screenshot for the step. params object The parameters of the step. 
public_id string The public ID of the step. timeout int64 The time before declaring a step failed. type enum Step type used in your Synthetic test. Allowed enum values: `assertCurrentUrl,assertElementAttribute,assertElementContent,assertElementPresent,assertEmail,assertFileDownload,assertFromJavascript,assertPageContains,assertPageLacks,assertRequests,click,extractFromJavascript,extractFromEmailBody,extractVariable,goToEmailLink,goToUrl,goToUrlAndMeasureTti,hover,playSubTest,pressKey,refresh,runApiTest,scroll,selectOption,typeText,uploadFiles,wait` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. Allowed enum values: `api,browser,mobile` ``` { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", 
"notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" }, "tick_every": "integer" }, "public_id": "string", "status": "live", "steps": [ { "allowFailure": false, "alwaysExecute": false, "exitIfSucceed": false, "isCritical": false, "name": "string", "noScreenshot": false, "params": {}, "public_id": "string", "timeout": "integer", "type": "assertElementContent" } ], "subtype": "http", "tags": [], "type": "string" } ``` Copy - JSON format is wrong - Creation failed * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Test quota is reached * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) ##### Create a test Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "config": { "configVariables": [ { "name": "VARIABLE_NAME", "type": "text" } ], "request": { "proxy": { "url": "https://example.com" } }, "variables": [ { "name": "VARIABLE_NAME", "type": "text" } ] }, "options": { "ci": { "executionRule": "blocking" }, "rumSettings": { "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" } } } EOF ``` * * * ## [Search Synthetic tests](https://docs.datadoghq.com/api/latest/synthetics/#search-synthetic-tests) * [v1 (latest)](https://docs.datadoghq.com/api/latest/synthetics/#search-synthetic-tests-v1) GET https://api.ap1.datadoghq.com/api/v1/synthetics/tests/searchhttps://api.ap2.datadoghq.com/api/v1/synthetics/tests/searchhttps://api.datadoghq.eu/api/v1/synthetics/tests/searchhttps://api.ddog-gov.com/api/v1/synthetics/tests/searchhttps://api.datadoghq.com/api/v1/synthetics/tests/searchhttps://api.us3.datadoghq.com/api/v1/synthetics/tests/searchhttps://api.us5.datadoghq.com/api/v1/synthetics/tests/search ### Overview Search for Synthetic tests. This endpoint requires the `synthetics_read` permission. OAuth apps require the `synthetics_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#synthetics) to access this endpoint. ### Arguments #### Query Strings Name Type Description text string The search query. include_full_config boolean If true, include the full configuration for each test in the response. facets_only boolean If true, return only facets instead of full test details. start integer The offset from which to start returning results. count integer The maximum number of results to return. sort string The sort order for the results (e.g., `name,asc` or `name,desc`). ### Response * [200](https://docs.datadoghq.com/api/latest/synthetics/#SearchTests-200-v1) * [403](https://docs.datadoghq.com/api/latest/synthetics/#SearchTests-403-v1) * [404](https://docs.datadoghq.com/api/latest/synthetics/#SearchTests-404-v1) * [429](https://docs.datadoghq.com/api/latest/synthetics/#SearchTests-429-v1) OK - Returns the list of Synthetic tests matching the search. * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Object containing an array of Synthetic tests configuration. Field Type Description tests [object] Array of Synthetic tests configuration. config object Configuration object for a Synthetic test. assertions [ ] Array of assertions used for the test. Required for single API tests. default: Option 1 object An assertion which uses a simple target. operator [_required_] enum Assertion operator to apply. Allowed enum values: `contains,doesNotContain,is,isNot,lessThan,lessThanOrEqual,moreThan,moreThanOrEqual,matches,doesNotMatch,validates,isInMoreThan,isInLessThan,doesNotExist,isUndefined` property string The associated assertion property. target [_required_] Value used by the operator. 
Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. timingsScope enum Timings scope for response time assertions. Allowed enum values: `all,withoutDNS` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 2 object An assertion which targets body hash. operator [_required_] enum Assertion operator to apply. Allowed enum values: `md5,sha1,sha256` target [_required_] Value used by the operator. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `bodyHash` Option 3 object An assertion for the `validatesJSONPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONPath` property string The associated assertion property. target object Composed target for `validatesJSONPath` operator. elementsOperator string The element from the list of results to assert on. To choose from the first element in the list `firstElementMatches`, every element in the list `everyElementMatches`, at least one element in the list `atLeastOneElementMatches` or the serialized value of the list `serializationMatches`. jsonPath string The JSON path to assert. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 4 object An assertion for the `validatesJSONSchema` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesJSONSchema` target object Composed target for `validatesJSONSchema` operator. jsonSchema string The JSON Schema to assert. metaSchema enum The JSON Schema meta-schema version used in the assertion. Allowed enum values: `draft-07,draft-06` type [_required_] enum Type of the assertion. Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 5 object An assertion for the `validatesXPath` operator. operator [_required_] enum Assertion operator to apply. Allowed enum values: `validatesXPath` property string The associated assertion property. target object Composed target for `validatesXPath` operator. operator string The specific operator to use on the path. targetValue The path target value to compare to. Option 1 double Numeric value used by the operator in assertions. Option 2 string String value used by the operator in assertions. Supports templated variables. xPath string The X path to assert. type [_required_] enum Type of the assertion. 
Allowed enum values: `body,header,statusCode,certificate,responseTime,property,recordEvery,recordSome,tlsVersion,minTlsVersion,latency,packetLossPercentage,packetsReceived,networkHop,receivedMessage,grpcHealthcheckStatus,grpcMetadata,grpcProto,connection` Option 6 object A JavaScript assertion. code [_required_] string The JavaScript code that performs the assertions. type [_required_] enum Type of the assertion. Allowed enum values: `javascript` configVariables [object] Array of variables used for the test. example string Example for the variable. id string ID of the variable for global variables. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Whether the value of this variable will be obfuscated in test results. Only for config variables of type `text`. type [_required_] enum Type of the configuration variable. Allowed enum values: `global,text,email` request object Object describing the Synthetic test request. allow_insecure boolean Allows loading insecure content for an HTTP request in a multistep test step. basicAuth Object to handle basic authentication when performing the test. Option 1 object Object to handle basic authentication when performing the test. password string Password to use for the basic authentication. type enum The type of basic authentication to use when performing the test. Allowed enum values: `web` default: `web` username string Username to use for the basic authentication. Option 2 object Object to handle `SIGV4` authentication when performing the test. accessKey [_required_] string Access key for the `SIGV4` authentication. region string Region for the `SIGV4` authentication. secretKey [_required_] string Secret key for the `SIGV4` authentication. serviceName string Service name for the `SIGV4` authentication. sessionToken string Session token for the `SIGV4` authentication. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `sigv4` default: `sigv4` Option 3 object Object to handle `NTLM` authentication when performing the test. domain string Domain for the authentication to use when performing the test. password string Password for the authentication to use when performing the test. type [_required_] enum The type of authentication to use when performing the test. Allowed enum values: `ntlm` default: `ntlm` username string Username for the authentication to use when performing the test. workstation string Workstation for the authentication to use when performing the test. Option 4 object Object to handle digest authentication when performing the test. password [_required_] string Password to use for the digest authentication. type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `digest` default: `digest` username [_required_] string Username to use for the digest authentication. Option 5 object Object to handle `oauth client` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId [_required_] string Client ID to use when performing the authentication. clientSecret [_required_] string Client secret to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. 
tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-client` default: `oauth-client` Option 6 object Object to handle `oauth rop` authentication when performing the test. accessTokenUrl [_required_] string Access token URL to use when performing the authentication. audience string Audience to use when performing the authentication. clientId string Client ID to use when performing the authentication. clientSecret string Client secret to use when performing the authentication. password [_required_] string Password to use when performing the authentication. resource string Resource to use when performing the authentication. scope string Scope to use when performing the authentication. tokenApiAuthentication [_required_] enum Type of token to use when performing the authentication. Allowed enum values: `header,body` type [_required_] enum The type of basic authentication to use when performing the test. Allowed enum values: `oauth-rop` default: `oauth-rop` username [_required_] string Username to use when performing the authentication. body string Body to include in the test. bodyType enum Type of the request body. Allowed enum values: `text/plain,application/json,text/xml,text/html,application/x-www-form-urlencoded,graphql,application/octet-stream,multipart/form-data` callType enum The type of gRPC call to perform. Allowed enum values: `healthcheck,unary` certificate object Client certificate to use when performing the test request. cert object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. key object Define a request certificate. content string Content of the certificate or key. filename string File name for the certificate or key. updatedAt string Date of update of the certificate or key, ISO format. certificateDomains [string] By default, the client certificate is applied on the domain of the starting URL for browser tests. If you want your client certificate to be applied on other domains instead, add them in `certificateDomains`. default: checkCertificateRevocation boolean Check for certificate revocation. compressedJsonDescriptor string A protobuf JSON descriptor that needs to be gzipped first then base64 encoded. compressedProtoFile string A protobuf file that needs to be gzipped first then base64 encoded. disableAiaIntermediateFetching boolean Disable fetching intermediate certificates from AIA. dnsServer string DNS server to use for DNS tests. dnsServerPort DNS server port to use for DNS tests. Option 1 int64 Integer DNS server port number to use when performing the test. Option 2 string String DNS server port number to use when performing the test. Supports templated variables. files [object] Files to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. bucketKey string Bucket key of the file. content string Content of the file. name string Name of the file. originalFileName string Original name of the file. size int64 Size of the file. type string Type of the file. follow_redirects boolean Specifies whether or not the request follows redirects. form object Form to be used as part of the request in the test. Only valid if `bodyType` is `multipart/form-data`. string A single form entry. 
headers object Headers to include when performing the test. string A single Header. host string Host name to perform the test with. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` isMessageBase64Encoded boolean Whether the message is base64 encoded. message string Message to send for UDP or WebSocket tests. metadata object Metadata to include when performing the gRPC test. string A single metadatum. method string Either the HTTP method/verb to use or a gRPC method available on the service set in the `service` field. Required if `subtype` is `HTTP` or if `subtype` is `grpc` and `callType` is `unary`. noSavingResponseBody boolean Determines whether or not to save the response body. numberOfPackets int32 Number of pings to use per test. persistCookies boolean Persist cookies across redirects. port Port to use when performing the test. Option 1 int64 Integer Port number to use when performing the test. Option 2 string String Port number to use when performing the test. Supports templated variables. proxy object The proxy to perform the test. headers object Headers to include when performing the test. string A single Header. url [_required_] string URL of the proxy to perform the test. query object Query to use for the test. servername string For SSL tests, it specifies on which server you want to initiate the TLS handshake, allowing the server to present one of multiple possible certificates on the same IP address and TCP port number. service string The gRPC service on which you want to perform the gRPC call. shouldTrackHops boolean Turns on a traceroute probe to discover all gateways along the path to the host destination. timeout double Timeout in seconds for the test. url string URL to perform the test with. variables [object] Browser tests only - array of variables used for the test steps. example string Example for the variable. id string ID for the variable. Global variables require an ID. name [_required_] string Name of the variable. pattern string Pattern of the variable. secure boolean Determines whether or not the browser test variable is obfuscated. Can only be used with browser variables of type `text`. type [_required_] enum Type of browser test variable. Allowed enum values: `element,email,global,text` creator object Object describing the creator of the shared element. email string Email of the creator. handle string Handle of the creator. name string Name of the creator. locations [string] Array of locations used to run the test. message string Notification message associated with the test. monitor_id int64 The associated monitor ID. name string Name of the test. options object Object describing the extra options for a Synthetic test. accept_self_signed boolean For SSL tests, whether or not the test should allow self signed certificates. allow_insecure boolean Allows loading insecure content for an HTTP request in an API test. blockedRequestPatterns [string] Array of URL patterns to block. checkCertificateRevocation boolean For SSL tests, whether or not the test should fail on revoked certificate in stapled OCSP. ci object CI/CD options for a Synthetic test. executionRule [_required_] enum Execution rule for a Synthetic test. Allowed enum values: `blocking,non_blocking,skipped` device_ids [string] For browser test, array with the different device IDs used to run the test. disableAiaIntermediateFetching boolean For SSL tests, whether or not the test should disable fetching intermediate certificates from AIA. 
disableCors boolean Whether or not to disable CORS mechanism. disableCsp boolean Disable Content Security Policy for browser tests. enableProfiling boolean Enable profiling for browser tests. enableSecurityTesting boolean **DEPRECATED** : Enable security testing for browser tests. Security testing is not available anymore. This field is deprecated and won't be used. follow_redirects boolean For API HTTP test, whether or not the test should follow redirects. httpVersion enum HTTP version to use for a Synthetic test. Allowed enum values: `http1,http2,any` ignoreServerCertificateError boolean Ignore server certificate error for browser tests. initialNavigationTimeout int64 Timeout before declaring the initial step as failed (in seconds) for browser tests. min_failure_duration int64 Minimum amount of time in failure required to trigger an alert. min_location_failed int64 Minimum number of locations in failure required to trigger an alert. monitor_name string The monitor name is used for the alert title as well as for all monitor dashboard widgets and SLOs. monitor_options object Object containing the options for a Synthetic test as a monitor (for example, renotification). escalation_message string Message to include in the escalation notification. notification_preset_name enum The name of the preset for the notification for the monitor. Allowed enum values: `show_all,hide_all,hide_query,hide_handles` renotify_interval int64 Time interval before renotifying if the test is still failing (in minutes). renotify_occurrences int64 The number of times to renotify if the test is still failing. monitor_priority int32 Integer from 1 (high) to 5 (low) indicating alert severity. noScreenshot boolean Prevents saving screenshots of the steps. restricted_roles [string] A list of role identifiers that can be pulled from the Roles API, for restricting read and write access. retry object Object describing the retry strategy to apply to a Synthetic test. count int64 Number of times a test needs to be retried before marking a location as failed. Defaults to 0. interval double Time interval between retries (in milliseconds). Defaults to 300ms. rumSettings object The RUM data collection settings for the Synthetic browser test. **Note:** There are 3 ways to format RUM settings: `{ isEnabled: false }` RUM data is not collected. `{ isEnabled: true }` RUM data is collected from the Synthetic test's default application. `{ isEnabled: true, applicationId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", clientTokenId: 12345 }` RUM data is collected using the specified application. applicationId string RUM application ID used to collect RUM data for the browser test. clientTokenId int64 RUM application API key ID used to collect RUM data for the browser test. isEnabled [_required_] boolean Determines whether RUM data is collected during test runs. scheduling object Object containing timeframes and timezone used for advanced scheduling. timeframes [_required_] [object] Array containing objects describing the scheduling pattern to apply to each day. day [_required_] int32 Number representing the day of the week. from [_required_] string The hour of the day on which scheduling starts. to [_required_] string The hour of the day on which scheduling ends. timezone [_required_] string Timezone in which the timeframe is based. tick_every int64 The frequency at which to run the Synthetic test (in seconds). public_id string The test public ID. status enum Define whether you want to start (`live`) or pause (`paused`) a Synthetic test. 
Allowed enum values: `live,paused` subtype enum The subtype of the Synthetic API test, `http`, `ssl`, `tcp`, `dns`, `icmp`, `udp`, `websocket`, `grpc` or `multi`. Allowed enum values: `http,ssl,tcp,dns,multi,icmp,udp,websocket,grpc` tags [string] Array of tags attached to the test. type enum Type of the Synthetic test, either `api` or `browser`. Allowed enum values: `api,browser,mobile` ``` { "tests": [ { "config": { "assertions": [ { "operator": "contains", "property": "string", "target": 0, "timingsScope": "string", "type": "statusCode" } ], "configVariables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ], "request": { "allow_insecure": false, "basicAuth": { "password": "PaSSw0RD!", "type": "web", "username": "my_username" }, "body": "string", "bodyType": "text/plain", "callType": "unary", "certificate": { "cert": { "content": "string", "filename": "string", "updatedAt": "string" }, "key": { "content": "string", "filename": "string", "updatedAt": "string" } }, "certificateDomains": [], "checkCertificateRevocation": false, "compressedJsonDescriptor": "string", "compressedProtoFile": "string", "disableAiaIntermediateFetching": false, "dnsServer": "string", "dnsServerPort": { "description": "undefined", "format": "undefined", "type": "undefined" }, "files": [ { "bucketKey": "string", "content": "string", "name": "string", "originalFileName": "string", "size": "integer", "type": "string" } ], "follow_redirects": false, "form": { "": "string" }, "headers": { "": "string" }, "host": "string", "httpVersion": "string", "isMessageBase64Encoded": false, "message": "string", "metadata": { "": "string" }, "method": "string", "noSavingResponseBody": false, "numberOfPackets": "integer", "persistCookies": false, "port": { "description": "undefined", "format": "undefined", "type": "undefined" }, "proxy": { "headers": { "": "string" }, "url": "https://example.com" }, "query": {}, "servername": "string", "service": "Greeter", "shouldTrackHops": false, "timeout": "number", "url": "https://example.com" }, "variables": [ { "example": "string", "id": "string", "name": "VARIABLE_NAME", "pattern": "string", "secure": false, "type": "text" } ] }, "creator": { "email": "string", "handle": "string", "name": "string" }, "locations": [ "aws:eu-west-3" ], "message": "string", "monitor_id": "integer", "name": "string", "options": { "accept_self_signed": false, "allow_insecure": false, "blockedRequestPatterns": [], "checkCertificateRevocation": false, "ci": { "executionRule": "blocking" }, "device_ids": [ "chrome.laptop_large" ], "disableAiaIntermediateFetching": false, "disableCors": false, "disableCsp": false, "enableProfiling": false, "enableSecurityTesting": false, "follow_redirects": false, "httpVersion": "string", "ignoreServerCertificateError": false, "initialNavigationTimeout": "integer", "min_failure_duration": "integer", "min_location_failed": "integer", "monitor_name": "string", "monitor_options": { "escalation_message": "string", "notification_preset_name": "string", "renotify_interval": "integer", "renotify_occurrences": "integer" }, "monitor_priority": "integer", "noScreenshot": false, "restricted_roles": [ "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" ], "retry": { "count": "integer", "interval": "number" }, "rumSettings": { "applicationId": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", "clientTokenId": 12345, "isEnabled": true }, "scheduling": { "timeframes": [ { "day": 1, "from": "07:00", "to": "16:00" } ], "timezone": "America/New_York" 
}, "tick_every": "integer" }, "public_id": "string", "status": "live", "subtype": "http", "tags": [], "type": "string" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/synthetics/) * [Example](https://docs.datadoghq.com/api/latest/synthetics/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/synthetics/?code-lang=typescript) ##### Search Synthetic tests Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/synthetics/tests/search" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Search Synthetic tests ``` """ Search Synthetic tests returns "OK - Returns the list of Synthetic tests matching the search." response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.synthetics_api import SyntheticsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = SyntheticsApi(api_client) response = api_instance.search_tests() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search Synthetic tests ``` # Search Synthetic tests returns "OK - Returns the list of Synthetic tests matching the search." 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::SyntheticsAPI.new p api_instance.search_tests() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search Synthetic tests ``` // Search Synthetic tests returns "OK - Returns the list of Synthetic tests matching the search." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewSyntheticsApi(apiClient) resp, r, err := api.SearchTests(ctx, *datadogV1.NewSearchTestsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `SyntheticsApi.SearchTests`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `SyntheticsApi.SearchTests`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search Synthetic tests ``` // Search Synthetic tests returns "OK - Returns the list of Synthetic tests matching the search." // response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.SyntheticsApi; import com.datadog.api.client.v1.model.SyntheticsListTestsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); SyntheticsApi apiInstance = new SyntheticsApi(defaultClient); try { SyntheticsListTestsResponse result = apiInstance.searchTests(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling SyntheticsApi#searchTests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search Synthetic tests ``` // Search Synthetic tests returns "OK - Returns the list of Synthetic tests // matching the search." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_synthetics::SearchTestsOptionalParams; use datadog_api_client::datadogV1::api_synthetics::SyntheticsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = SyntheticsAPI::with_config(configuration); let resp = api.search_tests(SearchTestsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Search Synthetic tests ``` /** * Search Synthetic tests returns "OK - Returns the list of Synthetic tests matching the search." response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.SyntheticsApi(configuration); apiInstance .searchTests() .then((data: v1.SyntheticsListTestsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ```
* * *

--- # Source: https://docs.datadoghq.com/api/latest/tags/ # Tags The tag endpoint allows you to assign tags to hosts, for example: `role:database`. Those tags are applied to all metrics sent by the host. Refer to hosts by name (`yourhost.example.com`) when fetching and applying tags to a particular host. The component of your infrastructure responsible for a tag is identified by a source.
For example, some valid sources include nagios, hudson, jenkins, users, feed, chef, puppet, git, bitbucket, fabric, capistrano, etc. Read more about tags on [Getting Started with Tags](https://docs.datadoghq.com/getting_started/tagging/). ## [Get Tags](https://docs.datadoghq.com/api/latest/tags/#get-tags) * [v1 (latest)](https://docs.datadoghq.com/api/latest/tags/#get-tags-v1) GET https://api.ap1.datadoghq.com/api/v1/tags/hostshttps://api.ap2.datadoghq.com/api/v1/tags/hostshttps://api.datadoghq.eu/api/v1/tags/hostshttps://api.ddog-gov.com/api/v1/tags/hostshttps://api.datadoghq.com/api/v1/tags/hostshttps://api.us3.datadoghq.com/api/v1/tags/hostshttps://api.us5.datadoghq.com/api/v1/tags/hosts ### Overview Return a mapping of tags to hosts for your whole infrastructure. ### Arguments #### Query Strings Name Type Description source string When specified, filters host list to those tags with the specified source. ### Response * [200](https://docs.datadoghq.com/api/latest/tags/#ListHostTags-200-v1) * [403](https://docs.datadoghq.com/api/latest/tags/#ListHostTags-403-v1) * [404](https://docs.datadoghq.com/api/latest/tags/#ListHostTags-404-v1) * [429](https://docs.datadoghq.com/api/latest/tags/#ListHostTags-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) In this object, the key is the tag and the value is a list of host names that are reporting that tag. Field Type Description tags object A list of tags to apply to the host. [string] A list of additional properties for tags. ``` { "tags": { "": [ "test.metric.host" ] } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/tags/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/tags/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/tags/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/tags/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/tags/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/tags/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/tags/?code-lang=typescript) ##### Get Tags Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/tags/hosts" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Tags ``` """ Get Tags returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.tags_api import TagsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TagsApi(api_client) response = api_instance.list_host_tags() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get Tags ``` # Get Tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::TagsAPI.new p api_instance.list_host_tags() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Tags ``` // Get Tags returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewTagsApi(apiClient) resp, r, err := api.ListHostTags(ctx, *datadogV1.NewListHostTagsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TagsApi.ListHostTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TagsApi.ListHostTags`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Tags ``` // Get Tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.TagsApi; import 
com.datadog.api.client.v1.model.TagToHosts; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TagsApi apiInstance = new TagsApi(defaultClient); try { TagToHosts result = apiInstance.listHostTags(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TagsApi#listHostTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Tags ``` // Get Tags returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_tags::ListHostTagsOptionalParams; use datadog_api_client::datadogV1::api_tags::TagsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = TagsAPI::with_config(configuration); let resp = api .list_host_tags(ListHostTagsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Tags ``` /** * Get Tags returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.TagsApi(configuration); apiInstance .listHostTags() .then((data: v1.TagToHosts) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get host tags](https://docs.datadoghq.com/api/latest/tags/#get-host-tags) * [v1 (latest)](https://docs.datadoghq.com/api/latest/tags/#get-host-tags-v1) GET https://api.ap1.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.ap2.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.eu/api/v1/tags/hosts/{host_name}https://api.ddog-gov.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us3.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us5.datadoghq.com/api/v1/tags/hosts/{host_name} ### Overview Return the list of tags that apply to a given host. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string When specified, filters list of tags to those tags with the specified source. #### Query Strings Name Type Description source string Source to filter. 
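For example, to return only the tags applied by a single source (such as `chef`), combine the `source` query string with the `host_name` path parameter. The sketch below is an illustration only: it assumes the third-party `requests` package and uses a placeholder host name.

```
# Illustrative sketch only: list the tags that one source applies to a given host.
# Assumes the third-party `requests` package and DD_API_KEY / DD_APP_KEY in the environment.
import os

import requests

site = os.environ.get("DD_SITE", "datadoghq.com")  # for example, us3.datadoghq.com or datadoghq.eu
host_name = "yourhost.example.com"                 # placeholder host name

response = requests.get(
    f"https://api.{site}/api/v1/tags/hosts/{host_name}",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={"source": "chef"},  # only return tags applied by this source
)
response.raise_for_status()
print(response.json().get("tags", []))
```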
### Response * [200](https://docs.datadoghq.com/api/latest/tags/#GetHostTags-200-v1) * [403](https://docs.datadoghq.com/api/latest/tags/#GetHostTags-403-v1) * [404](https://docs.datadoghq.com/api/latest/tags/#GetHostTags-404-v1) * [429](https://docs.datadoghq.com/api/latest/tags/#GetHostTags-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Set of tags to associate with your host. Expand All Field Type Description host string Your host name. tags [string] A list of tags to apply to the host. ``` { "host": "test.host", "tags": [ "environment:production" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/tags/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/tags/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/tags/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/tags/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/tags/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/tags/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/tags/?code-lang=typescript) ##### Get host tags Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/tags/hosts/${host_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get host tags ``` """ Get host tags returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.tags_api import TagsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TagsApi(api_client) response = api_instance.get_host_tags( host_name="host_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get host tags ``` # Get host tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::TagsAPI.new p api_instance.get_host_tags("host_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the 
example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get host tags ``` // Get host tags returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewTagsApi(apiClient) resp, r, err := api.GetHostTags(ctx, "host_name", *datadogV1.NewGetHostTagsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TagsApi.GetHostTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TagsApi.GetHostTags`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get host tags ``` // Get host tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.TagsApi; import com.datadog.api.client.v1.model.HostTags; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TagsApi apiInstance = new TagsApi(defaultClient); try { HostTags result = apiInstance.getHostTags("host_name"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TagsApi#getHostTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get host tags ``` // Get host tags returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_tags::GetHostTagsOptionalParams; use datadog_api_client::datadogV1::api_tags::TagsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = TagsAPI::with_config(configuration); let resp = api .get_host_tags( "host_name".to_string(), GetHostTagsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get host tags ``` /** * Get host 
tags returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.TagsApi(configuration); const params: v1.TagsApiGetHostTagsRequest = { hostName: "host_name", }; apiInstance .getHostTags(params) .then((data: v1.HostTags) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add tags to a host](https://docs.datadoghq.com/api/latest/tags/#add-tags-to-a-host) * [v1 (latest)](https://docs.datadoghq.com/api/latest/tags/#add-tags-to-a-host-v1) POST https://api.ap1.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.ap2.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.eu/api/v1/tags/hosts/{host_name}https://api.ddog-gov.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us3.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us5.datadoghq.com/api/v1/tags/hosts/{host_name} ### Overview This endpoint allows you to add new tags to a host, optionally specifying where these tags come from. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string This endpoint allows you to add new tags to a host, optionally specifying where the tags came from. #### Query Strings Name Type Description source string The source of the tags. [Complete list of source attribute values](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). ### Request #### Body Data (required) Update host tags request body. * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Expand All Field Type Description host string Your host name. tags [string] A list of tags to apply to the host. ``` { "host": "test.host", "tags": [ "environment:production" ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/tags/#CreateHostTags-201-v1) * [403](https://docs.datadoghq.com/api/latest/tags/#CreateHostTags-403-v1) * [404](https://docs.datadoghq.com/api/latest/tags/#CreateHostTags-404-v1) * [429](https://docs.datadoghq.com/api/latest/tags/#CreateHostTags-429-v1) Created * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Set of tags to associate with your host. Expand All Field Type Description host string Your host name. tags [string] A list of tags to apply to the host. ``` { "host": "test.host", "tags": [ "environment:production" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/tags/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/tags/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/tags/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/tags/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/tags/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/tags/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/tags/?code-lang=typescript) ##### Add tags to a host Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/tags/hosts/${host_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Add tags to a host ``` """ Add tags to a host returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.tags_api import TagsApi from datadog_api_client.v1.model.host_tags import HostTags body = HostTags( host="test.host", tags=[ "environment:production", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TagsApi(api_client) response = api_instance.create_host_tags(host_name="host_name", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add tags to a host ``` # Add tags to a host returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::TagsAPI.new body = DatadogAPIClient::V1::HostTags.new({ host: "test.host", tags: [ "environment:production", ], }) p api_instance.create_host_tags("host_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add tags to a host ``` // Add tags to a host returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.HostTags{ Host: datadog.PtrString("test.host"), Tags: []string{ "environment:production", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewTagsApi(apiClient) resp, r, err := 
api.CreateHostTags(ctx, "host_name", body, *datadogV1.NewCreateHostTagsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TagsApi.CreateHostTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TagsApi.CreateHostTags`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add tags to a host ``` // Add tags to a host returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.TagsApi; import com.datadog.api.client.v1.model.HostTags; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TagsApi apiInstance = new TagsApi(defaultClient); HostTags body = new HostTags().host("test.host").tags(Collections.singletonList("environment:production")); try { HostTags result = apiInstance.createHostTags("host_name", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TagsApi#createHostTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add tags to a host ``` // Add tags to a host returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_tags::CreateHostTagsOptionalParams; use datadog_api_client::datadogV1::api_tags::TagsAPI; use datadog_api_client::datadogV1::model::HostTags; #[tokio::main] async fn main() { let body = HostTags::new() .host("test.host".to_string()) .tags(vec!["environment:production".to_string()]); let configuration = datadog::Configuration::new(); let api = TagsAPI::with_config(configuration); let resp = api .create_host_tags( "host_name".to_string(), body, CreateHostTagsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add tags to a host ``` /** * Add tags to a host returns "Created" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.TagsApi(configuration); const params: v1.TagsApiCreateHostTagsRequest = { body: { host: "test.host", tags: 
["environment:production"], }, hostName: "host_name", }; apiInstance .createHostTags(params) .then((data: v1.HostTags) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update host tags](https://docs.datadoghq.com/api/latest/tags/#update-host-tags) * [v1 (latest)](https://docs.datadoghq.com/api/latest/tags/#update-host-tags-v1) PUT https://api.ap1.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.ap2.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.eu/api/v1/tags/hosts/{host_name}https://api.ddog-gov.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us3.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us5.datadoghq.com/api/v1/tags/hosts/{host_name} ### Overview This endpoint allows you to update/replace all tags in an integration source with those supplied in the request. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string This endpoint allows you to update/replace all in an integration source with those supplied in the request. #### Query Strings Name Type Description source string The source of the tags (for example chef, puppet). [Complete list of source attribute values](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value) ### Request #### Body Data (required) Add tags to host * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Expand All Field Type Description host string Your host name. tags [string] A list of tags to apply to the host. ``` { "host": "test.host", "tags": [ "environment:production" ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/tags/#UpdateHostTags-201-v1) * [403](https://docs.datadoghq.com/api/latest/tags/#UpdateHostTags-403-v1) * [404](https://docs.datadoghq.com/api/latest/tags/#UpdateHostTags-404-v1) * [429](https://docs.datadoghq.com/api/latest/tags/#UpdateHostTags-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Set of tags to associate with your host. Expand All Field Type Description host string Your host name. tags [string] A list of tags to apply to the host. ``` { "host": "test.host", "tags": [ "environment:production" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. 
Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/tags/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/tags/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/tags/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/tags/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/tags/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/tags/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/tags/?code-lang=typescript) ##### Update host tags Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/tags/hosts/${host_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Update host tags ``` """ Update host tags returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.tags_api import TagsApi from datadog_api_client.v1.model.host_tags import HostTags body = HostTags( host="test.host", tags=[ "environment:production", ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TagsApi(api_client) response = api_instance.update_host_tags(host_name="host_name", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update host tags ``` # Update host tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::TagsAPI.new body = DatadogAPIClient::V1::HostTags.new({ host: "test.host", tags: [ "environment:production", ], }) p api_instance.update_host_tags("host_name", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update host tags ``` // Update host tags returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.HostTags{ Host: datadog.PtrString("test.host"), Tags: []string{ "environment:production", }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewTagsApi(apiClient) resp, r, err := api.UpdateHostTags(ctx, "host_name", body, *datadogV1.NewUpdateHostTagsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TagsApi.UpdateHostTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } 
responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TagsApi.UpdateHostTags`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update host tags ``` // Update host tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.TagsApi; import com.datadog.api.client.v1.model.HostTags; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TagsApi apiInstance = new TagsApi(defaultClient); HostTags body = new HostTags().host("test.host").tags(Collections.singletonList("environment:production")); try { HostTags result = apiInstance.updateHostTags("host_name", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TagsApi#updateHostTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update host tags ``` // Update host tags returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_tags::TagsAPI; use datadog_api_client::datadogV1::api_tags::UpdateHostTagsOptionalParams; use datadog_api_client::datadogV1::model::HostTags; #[tokio::main] async fn main() { let body = HostTags::new() .host("test.host".to_string()) .tags(vec!["environment:production".to_string()]); let configuration = datadog::Configuration::new(); let api = TagsAPI::with_config(configuration); let resp = api .update_host_tags( "host_name".to_string(), body, UpdateHostTagsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update host tags ``` /** * Update host tags returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.TagsApi(configuration); const params: v1.TagsApiUpdateHostTagsRequest = { body: { host: "test.host", tags: ["environment:production"], }, hostName: "host_name", }; apiInstance .updateHostTags(params) .then((data: v1.HostTags) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove host tags](https://docs.datadoghq.com/api/latest/tags/#remove-host-tags) * [v1 (latest)](https://docs.datadoghq.com/api/latest/tags/#remove-host-tags-v1) DELETE https://api.ap1.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.ap2.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.eu/api/v1/tags/hosts/{host_name}https://api.ddog-gov.com/api/v1/tags/hosts/{host_name}https://api.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us3.datadoghq.com/api/v1/tags/hosts/{host_name}https://api.us5.datadoghq.com/api/v1/tags/hosts/{host_name} ### Overview This endpoint allows you to remove all user-assigned tags for a single host. ### Arguments #### Path Parameters Name Type Description host_name [_required_] string This endpoint allows you to remove all user-assigned tags for a single host. #### Query Strings Name Type Description source string The source of the tags (for example chef, puppet). [Complete list of source attribute values](https://docs.datadoghq.com/integrations/faq/list-of-api-source-attribute-value). ### Response * [204](https://docs.datadoghq.com/api/latest/tags/#DeleteHostTags-204-v1) * [403](https://docs.datadoghq.com/api/latest/tags/#DeleteHostTags-403-v1) * [404](https://docs.datadoghq.com/api/latest/tags/#DeleteHostTags-404-v1) * [429](https://docs.datadoghq.com/api/latest/tags/#DeleteHostTags-429-v1) OK Forbidden * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/tags/) * [Example](https://docs.datadoghq.com/api/latest/tags/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/tags/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/tags/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/tags/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/tags/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/tags/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/tags/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/tags/?code-lang=typescript) ##### Remove host tags Copy ``` # Path parameters export host_name="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/tags/hosts/${host_name}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove host tags ``` """ Remove host tags returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.tags_api import TagsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TagsApi(api_client) api_instance.delete_host_tags( host_name="host_name", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove host tags ``` # Remove host tags returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::TagsAPI.new api_instance.delete_host_tags("host_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove host tags ``` // Remove host tags returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewTagsApi(apiClient) r, err := api.DeleteHostTags(ctx, "host_name", *datadogV1.NewDeleteHostTagsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TagsApi.DeleteHostTags`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove host tags ``` // Remove host tags returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.TagsApi; public class Example { public static void 
main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TagsApi apiInstance = new TagsApi(defaultClient); try { apiInstance.deleteHostTags("host_name"); } catch (ApiException e) { System.err.println("Exception when calling TagsApi#deleteHostTags"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove host tags ``` // Remove host tags returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_tags::DeleteHostTagsOptionalParams; use datadog_api_client::datadogV1::api_tags::TagsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = TagsAPI::with_config(configuration); let resp = api .delete_host_tags( "host_name".to_string(), DeleteHostTagsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove host tags ``` /** * Remove host tags returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.TagsApi(configuration); const params: v1.TagsApiDeleteHostTagsRequest = { hostName: "host_name", }; apiInstance .deleteHostTags(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ![](https://t.co/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=650750b7-42d4-4040-bdd0-fc2d10637412&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=49d6692c-c019-4cb9-9df2-4d3379cac334&pt=Tags&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ftags%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://analytics.twitter.com/1/i/adsct?bci=4&dv=UTC%26en-US%26Google%20Inc.%26Linux%20x86_64%26255%261080%26600%264%2624%261080%26600%260%26na&eci=3&event=%7B%7D&event_id=650750b7-42d4-4040-bdd0-fc2d10637412&integration=gtm&p_id=Twitter&p_user_id=0&pl_id=49d6692c-c019-4cb9-9df2-4d3379cac334&pt=Tags&tw_document_href=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ftags%2F&tw_iframe_status=0&txn_id=nui23&type=javascript&version=2.3.35)![](https://id.rlcdn.com/464526.gif) ![](https://bat.bing.com/action/0?ti=4061438&Ver=2&mid=b8735af3-5db4-4602-b2a2-a67515a5a379&bo=2&sid=c4ba0d50f0bf11f09afde9c0412ba014&vid=c4ba67e0f0bf11f099fcdb9b6c7a6b3e&vids=0&msclkid=N&pi=918639831&lg=en-US&sw=1080&sh=600&sc=24&tl=Tags&p=https%3A%2F%2Fdocs.datadoghq.com%2Fapi%2Flatest%2Ftags%2F&r=<=988&evt=pageLoad&sv=2&asc=G&cdb=AQAS&rn=183701) --- # Source: https://docs.datadoghq.com/api/latest/teams # Teams View and manage teams within Datadog. See the [Teams page](https://docs.datadoghq.com/account_management/teams/) for more information. ## [Get all teams](https://docs.datadoghq.com/api/latest/teams/#get-all-teams) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-all-teams-v2) GET https://api.ap1.datadoghq.com/api/v2/teamhttps://api.ap2.datadoghq.com/api/v2/teamhttps://api.datadoghq.eu/api/v2/teamhttps://api.ddog-gov.com/api/v2/teamhttps://api.datadoghq.com/api/v2/teamhttps://api.us3.datadoghq.com/api/v2/teamhttps://api.us5.datadoghq.com/api/v2/team ### Overview Get all teams. Can be used to search for teams using the `filter[keyword]` and `filter[me]` query parameters. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[number] integer Specific page number to return. page[size] integer Size for a given page. The maximum allowed value is 100. sort enum Specifies the order of the returned teams Allowed enum values: `name, -name, user_count, -user_count` include array Included related resources optionally requested. Allowed enum values: `team_links, user_team_permissions` filter[keyword] string Search query. Can be team name, team handle, or email of team member filter[me] boolean When true, only returns teams the current user belongs to fields[team] array List of fields that need to be fetched. 
### Response * [200](https://docs.datadoghq.com/api/latest/teams/#ListTeams-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#ListTeams-403-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#ListTeams-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response with multiple teams Field Type Description data [object] Teams response data attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` included [ ] Resources related to the team Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. 
data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Team link attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` Option 3 object A user's permissions for a given team attributes object User team permission attributes permissions object Object of team permission actions and boolean values that a logged in user can perform on this team. id [_required_] string The user team permission's identifier type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Teams response links. first string First link. last string Last link. next string Next link. prev string Previous link. self string Current link. meta object Teams response metadata. pagination object Teams response metadata. first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. 
``` { "data": [ { "attributes": { "avatar": "🥑", "banner": "integer", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "handle": "example-team", "hidden_modules": [], "is_managed": false, "link_count": "integer", "modified_at": "2019-09-19T10:00:00.000Z", "name": "Example Team", "summary": "string", "user_count": "integer", "visible_modules": [] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "team_links": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "user_team_permissions": { "data": { "id": "UserTeamPermissions-aeadc05e-98a8-11ec-ac2c-da7ad0900001-416595", "type": "user_team_permissions" }, "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } } }, "type": "team" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get all teams Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all teams ``` """ Get all teams returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.list_teams() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all teams ``` # Get all teams returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new p api_instance.list_teams() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all teams ``` // Get all teams returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.ListTeams(ctx, *datadogV2.NewListTeamsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.ListTeams`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.ListTeams`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all teams ``` // Get all teams returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import 
com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); try { TeamsResponse result = apiInstance.listTeams(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#listTeams"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all teams ``` // Get all teams returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::ListTeamsOptionalParams; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.list_teams(ListTeamsOptionalParams::default()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all teams ``` /** * Get all teams returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); apiInstance .listTeams() .then((data: v2.TeamsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a team](https://docs.datadoghq.com/api/latest/teams/#create-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#create-a-team-v2) POST https://api.ap1.datadoghq.com/api/v2/teamhttps://api.ap2.datadoghq.com/api/v2/teamhttps://api.datadoghq.eu/api/v2/teamhttps://api.ddog-gov.com/api/v2/teamhttps://api.datadoghq.com/api/v2/teamhttps://api.us3.datadoghq.com/api/v2/teamhttps://api.us5.datadoghq.com/api/v2/team ### Overview Create a new team. User IDs passed through the `users` relationship field are added to the team. This endpoint requires all of the following permissions: * `teams_read` * `teams_manage` OAuth apps require the `teams_read, teams_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. 
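As the overview notes, user IDs passed through the `users` relationship are added to the team at creation time. The following is a minimal sketch with the Python client, assuming the generated v2 team models named below (model import paths can differ between client versions); the user ID is a placeholder UUID.

```
"""
Create a team and add a member on creation - a minimal sketch.

Assumes the v2 Python client's generated team models. The user ID below
is a placeholder for a real user's UUID.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.teams_api import TeamsApi
from datadog_api_client.v2.model.relationship_to_user_data import RelationshipToUserData
from datadog_api_client.v2.model.relationship_to_users import RelationshipToUsers
from datadog_api_client.v2.model.team_create import TeamCreate
from datadog_api_client.v2.model.team_create_attributes import TeamCreateAttributes
from datadog_api_client.v2.model.team_create_relationships import TeamCreateRelationships
from datadog_api_client.v2.model.team_create_request import TeamCreateRequest
from datadog_api_client.v2.model.team_type import TeamType
from datadog_api_client.v2.model.users_type import UsersType

body = TeamCreateRequest(
    data=TeamCreate(
        attributes=TeamCreateAttributes(
            handle="example-team",
            name="Example Team",
        ),
        relationships=TeamCreateRelationships(
            users=RelationshipToUsers(
                data=[
                    # Placeholder user ID - replace with a real user UUID.
                    RelationshipToUserData(
                        id="00000000-0000-0000-0000-000000000000",
                        type=UsersType.USERS,
                    ),
                ],
            ),
        ),
        type=TeamType.TEAM,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = TeamsApi(api_client)
    response = api_instance.create_team(body=body)
    print(response)
```

Team membership can also be managed after creation through the team membership endpoints.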
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team create attributes [_required_] object Team creation attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team name [_required_] string The name of the team visible_modules [string] Collection of visible modules for the team relationships object Relationships formed with the team on creation users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum Team type Allowed enum values: `team` default: `team` ##### Create a team returns "CREATED" response ``` { "data": { "attributes": { "handle": "test-handle-a0fc0297eb519635", "name": "test-name-a0fc0297eb519635" }, "relationships": { "users": { "data": [] } }, "type": "team" } } ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` { "data": { "attributes": { "handle": "test-handle-a0fc0297eb519635", "name": "test-name-a0fc0297eb519635", "avatar": "\ud83e\udd51", "banner": 7, "visible_modules": [ "m1", "m2" ], "hidden_modules": [ "m3" ] }, "type": "team" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/teams/#CreateTeam-201-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#CreateTeam-403-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#CreateTeam-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#CreateTeam-429-v2) CREATED * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response with a team Field Type Description data object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. 
user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "avatar": "🥑", "banner": "integer", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "handle": "example-team", "hidden_modules": [], "is_managed": false, "link_count": "integer", "modified_at": "2019-09-19T10:00:00.000Z", "name": "Example Team", "summary": "string", "user_count": "integer", "visible_modules": [] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "team_links": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "user_team_permissions": { "data": { "id": "UserTeamPermissions-aeadc05e-98a8-11ec-ac2c-da7ad0900001-416595", "type": "user_team_permissions" }, "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } } }, "type": "team" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Create a team returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "handle": "test-handle-a0fc0297eb519635", "name": "test-name-a0fc0297eb519635" }, "relationships": { "users": { "data": [] } }, "type": "team" } } EOF ``` ##### Create a team with V2 fields returns "CREATED" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "handle": "test-handle-a0fc0297eb519635", "name": "test-name-a0fc0297eb519635", "avatar": "\ud83e\udd51", "banner": 7, "visible_modules": [ "m1", "m2" ], "hidden_modules": [ "m3" ] }, "type": "team" } } EOF ``` ##### Create a team returns "CREATED" response ``` // Create a team returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.TeamCreateRequest{ Data: datadogV2.TeamCreate{ Attributes: datadogV2.TeamCreateAttributes{ Handle: "test-handle-a0fc0297eb519635", Name: "test-name-a0fc0297eb519635", }, Relationships: &datadogV2.TeamCreateRelationships{ Users: &datadogV2.RelationshipToUsers{ Data: []datadogV2.RelationshipToUserData{}, }, }, Type: datadogV2.TEAMTYPE_TEAM, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.CreateTeam(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.CreateTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.CreateTeam`:\n%s\n", responseContent) } ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` // Create a team with V2 fields returns "CREATED" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.TeamCreateRequest{ Data: datadogV2.TeamCreate{ Attributes: datadogV2.TeamCreateAttributes{ Handle: 
"test-handle-a0fc0297eb519635", Name: "test-name-a0fc0297eb519635", Avatar: *datadog.NewNullableString(datadog.PtrString("🥑")), Banner: *datadog.NewNullableInt64(datadog.PtrInt64(7)), VisibleModules: []string{ "m1", "m2", }, HiddenModules: []string{ "m3", }, }, Type: datadogV2.TEAMTYPE_TEAM, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.CreateTeam(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.CreateTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.CreateTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a team returns "CREATED" response ``` // Create a team returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.RelationshipToUsers; import com.datadog.api.client.v2.model.TeamCreate; import com.datadog.api.client.v2.model.TeamCreateAttributes; import com.datadog.api.client.v2.model.TeamCreateRelationships; import com.datadog.api.client.v2.model.TeamCreateRequest; import com.datadog.api.client.v2.model.TeamResponse; import com.datadog.api.client.v2.model.TeamType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); TeamCreateRequest body = new TeamCreateRequest() .data( new TeamCreate() .attributes( new TeamCreateAttributes() .handle("test-handle-a0fc0297eb519635") .name("test-name-a0fc0297eb519635")) .relationships(new TeamCreateRelationships().users(new RelationshipToUsers())) .type(TeamType.TEAM)); try { TeamResponse result = apiInstance.createTeam(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#createTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` // Create a team with V2 fields returns "CREATED" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamCreate; import com.datadog.api.client.v2.model.TeamCreateAttributes; import com.datadog.api.client.v2.model.TeamCreateRequest; import com.datadog.api.client.v2.model.TeamResponse; import com.datadog.api.client.v2.model.TeamType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); TeamCreateRequest body = new TeamCreateRequest() .data( new TeamCreate() .attributes( new TeamCreateAttributes() 
.handle("test-handle-a0fc0297eb519635") .name("test-name-a0fc0297eb519635") .avatar("🥑") .banner(7L) .visibleModules(Arrays.asList("m1", "m2")) .hiddenModules(Collections.singletonList("m3"))) .type(TeamType.TEAM)); try { TeamResponse result = apiInstance.createTeam(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#createTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a team returns "CREATED" response ``` """ Create a team returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.relationship_to_users import RelationshipToUsers from datadog_api_client.v2.model.team_create import TeamCreate from datadog_api_client.v2.model.team_create_attributes import TeamCreateAttributes from datadog_api_client.v2.model.team_create_relationships import TeamCreateRelationships from datadog_api_client.v2.model.team_create_request import TeamCreateRequest from datadog_api_client.v2.model.team_type import TeamType body = TeamCreateRequest( data=TeamCreate( attributes=TeamCreateAttributes( handle="test-handle-a0fc0297eb519635", name="test-name-a0fc0297eb519635", ), relationships=TeamCreateRelationships( users=RelationshipToUsers( data=[], ), ), type=TeamType.TEAM, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.create_team(body=body) print(response) ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` """ Create a team with V2 fields returns "CREATED" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_create import TeamCreate from datadog_api_client.v2.model.team_create_attributes import TeamCreateAttributes from datadog_api_client.v2.model.team_create_request import TeamCreateRequest from datadog_api_client.v2.model.team_type import TeamType body = TeamCreateRequest( data=TeamCreate( attributes=TeamCreateAttributes( handle="test-handle-a0fc0297eb519635", name="test-name-a0fc0297eb519635", avatar="🥑", banner=7, visible_modules=[ "m1", "m2", ], hidden_modules=[ "m3", ], ), type=TeamType.TEAM, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.create_team(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a team returns "CREATED" response ``` # Create a team returns "CREATED" response require "datadog_api_client" 
api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::TeamCreateRequest.new({ data: DatadogAPIClient::V2::TeamCreate.new({ attributes: DatadogAPIClient::V2::TeamCreateAttributes.new({ handle: "test-handle-a0fc0297eb519635", name: "test-name-a0fc0297eb519635", }), relationships: DatadogAPIClient::V2::TeamCreateRelationships.new({ users: DatadogAPIClient::V2::RelationshipToUsers.new({ data: [], }), }), type: DatadogAPIClient::V2::TeamType::TEAM, }), }) p api_instance.create_team(body) ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` # Create a team with V2 fields returns "CREATED" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::TeamCreateRequest.new({ data: DatadogAPIClient::V2::TeamCreate.new({ attributes: DatadogAPIClient::V2::TeamCreateAttributes.new({ handle: "test-handle-a0fc0297eb519635", name: "test-name-a0fc0297eb519635", avatar: "🥑", banner: 7, visible_modules: [ "m1", "m2", ], hidden_modules: [ "m3", ], }), type: DatadogAPIClient::V2::TeamType::TEAM, }), }) p api_instance.create_team(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a team returns "CREATED" response ``` // Create a team returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::RelationshipToUsers; use datadog_api_client::datadogV2::model::TeamCreate; use datadog_api_client::datadogV2::model::TeamCreateAttributes; use datadog_api_client::datadogV2::model::TeamCreateRelationships; use datadog_api_client::datadogV2::model::TeamCreateRequest; use datadog_api_client::datadogV2::model::TeamType; #[tokio::main] async fn main() { let body = TeamCreateRequest::new( TeamCreate::new( TeamCreateAttributes::new( "test-handle-a0fc0297eb519635".to_string(), "test-name-a0fc0297eb519635".to_string(), ), TeamType::TEAM, ) .relationships(TeamCreateRelationships::new().users(RelationshipToUsers::new(vec![]))), ); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.create_team(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy ##### Create a team with V2 fields returns "CREATED" response ``` // Create a team with V2 fields returns "CREATED" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamCreate; use datadog_api_client::datadogV2::model::TeamCreateAttributes; use datadog_api_client::datadogV2::model::TeamCreateRequest; use datadog_api_client::datadogV2::model::TeamType; #[tokio::main] async fn main() { let body = TeamCreateRequest::new(TeamCreate::new( TeamCreateAttributes::new( "test-handle-a0fc0297eb519635".to_string(), "test-name-a0fc0297eb519635".to_string(), ) .avatar(Some("🥑".to_string())) .banner(Some(7)) .hidden_modules(vec!["m3".to_string()]) .visible_modules(vec!["m1".to_string(), "m2".to_string()]), TeamType::TEAM, )); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = 
api.create_team(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } }
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" # or us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com
DD_API_KEY=""
DD_APP_KEY=""
cargo run
```

##### Create a team returns "CREATED" response

```
/**
 * Create a team returns "CREATED" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.TeamsApi(configuration);

const params: v2.TeamsApiCreateTeamRequest = {
  body: {
    data: {
      attributes: {
        handle: "test-handle-a0fc0297eb519635",
        name: "test-name-a0fc0297eb519635",
      },
      relationships: {
        users: {
          data: [],
        },
      },
      type: "team",
    },
  },
};

apiInstance
  .createTeam(params)
  .then((data: v2.TeamResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

##### Create a team with V2 fields returns "CREATED" response

```
/**
 * Create a team with V2 fields returns "CREATED" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.TeamsApi(configuration);

const params: v2.TeamsApiCreateTeamRequest = {
  body: {
    data: {
      attributes: {
        handle: "test-handle-a0fc0297eb519635",
        name: "test-name-a0fc0297eb519635",
        avatar: "🥑",
        banner: 7,
        visibleModules: ["m1", "m2"],
        hiddenModules: ["m3"],
      },
      type: "team",
    },
  },
};

apiInstance
  .createTeam(params)
  .then((data: v2.TeamResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" # or us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, ddog-gov.com
DD_API_KEY=""
DD_APP_KEY=""
tsc "example.ts"
```

* * *

## [Get a team](https://docs.datadoghq.com/api/latest/teams/#get-a-team)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-a-team-v2)

GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}https://api.datadoghq.eu/api/v2/team/{team_id}https://api.ddog-gov.com/api/v2/team/{team_id}https://api.datadoghq.com/api/v2/team/{team_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}

### Overview

Get a single team using the team’s `id`.

This endpoint requires the `teams_read` permission.

OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint.
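Alongside the client-library examples that follow, here is a hedged raw-HTTP sketch of the same call. It assumes Node 18+ (global `fetch`), the `datadoghq.com` site, the `DD_API_KEY`/`DD_APP_KEY` environment variables used throughout this page, and a hypothetical `TEAM_ID` variable for the path parameter; the 403/404 handling mirrors the error shape documented below:

```
/**
 * Sketch: fetch a single team by ID over raw HTTP (Node 18+, global fetch).
 * Assumes DD_API_KEY and DD_APP_KEY are set and uses the datadoghq.com API
 * host; substitute your own site's host if needed.
 */
const teamId = process.env.TEAM_ID as string; // placeholder: a team's UUID

async function getTeam(id: string): Promise<void> {
  const resp = await fetch(`https://api.datadoghq.com/api/v2/team/${id}`, {
    headers: {
      "Accept": "application/json",
      "DD-API-KEY": process.env.DD_API_KEY as string,
      "DD-APPLICATION-KEY": process.env.DD_APP_KEY as string,
    },
  });

  if (resp.status === 403 || resp.status === 404) {
    // Error responses carry an `errors` array of strings.
    const { errors } = (await resp.json()) as { errors: string[] };
    console.error(`GET /api/v2/team/${id} failed:`, errors);
    return;
  }

  const body = await resp.json();
  // The team's attributes (handle, name, and so on) live under data.attributes.
  console.log(body.data.id, body.data.attributes.handle);
}

getTeam(teamId);
```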
### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeam-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeam-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response with a team Field Type Description data object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "avatar": "🥑", "banner": "integer", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "handle": "example-team", "hidden_modules": [], "is_managed": false, "link_count": "integer", "modified_at": "2019-09-19T10:00:00.000Z", "name": "Example Team", "summary": "string", "user_count": "integer", "visible_modules": [] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "team_links": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "user_team_permissions": { "data": { "id": "UserTeamPermissions-aeadc05e-98a8-11ec-ac2c-da7ad0900001-416595", "type": "user_team_permissions" }, "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } } }, "type": "team" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. 
* [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get a team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a team ``` """ Get a team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team( team_id=DD_TEAM_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a team ``` # Get a team returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] p api_instance.get_team(DD_TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a team ``` // Get a team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeam(ctx, DdTeamDataID) if err != nil 
{ fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a team ``` // Get a team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { TeamResponse result = apiInstance.getTeam(DD_TEAM_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a team ``` // Get a team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.get_team(dd_team_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a team ``` /** * Get a team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiGetTeamRequest = { teamId: DD_TEAM_DATA_ID, }; apiInstance .getTeam(params) .then((data: v2.TeamResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a team](https://docs.datadoghq.com/api/latest/teams/#update-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#update-a-team-v2) PATCH https://api.ap1.datadoghq.com/api/v2/team/{team_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}https://api.datadoghq.eu/api/v2/team/{team_id}https://api.ddog-gov.com/api/v2/team/{team_id}https://api.datadoghq.com/api/v2/team/{team_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}https://api.us5.datadoghq.com/api/v2/team/{team_id} ### Overview Update a team using the team’s `id`. If the `team_links` relationship is present, the associated links are updated to be in the order they appear in the array, and any existing team links not present are removed. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team update request attributes [_required_] object Team update attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team name [_required_] string The name of the team visible_modules [string] Collection of visible modules for the team relationships object Team update relationships team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. 
type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "handle": "example-team", "name": "Example Team updated", "avatar": "\ud83e\udd51", "banner": 7, "hidden_modules": [ "m3" ], "visible_modules": [ "m1", "m2" ] }, "type": "team" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-200-v2) * [400](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-400-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-404-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#UpdateTeam-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response with a team Field Type Description data object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "avatar": "🥑", "banner": "integer", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "handle": "example-team", "hidden_modules": [], "is_managed": false, "link_count": "integer", "modified_at": "2019-09-19T10:00:00.000Z", "name": "Example Team", "summary": "string", "user_count": "integer", "visible_modules": [] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "team_links": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "user_team_permissions": { "data": { "id": "UserTeamPermissions-aeadc05e-98a8-11ec-ac2c-da7ad0900001-416595", "type": "user_team_permissions" }, "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } } }, "type": "team" } } ``` Copy API error response. 
* [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Update a team returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "handle": "example-team", "name": "Example Team updated", "avatar": "\ud83e\udd51", "banner": 7, "hidden_modules": [ "m3" ], "visible_modules": [ "m1", "m2" ] }, "type": "team" } } EOF ``` ##### Update a team returns "OK" response ``` // Update a team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataAttributesHandle := os.Getenv("DD_TEAM_DATA_ATTRIBUTES_HANDLE") DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.TeamUpdateRequest{ Data: datadogV2.TeamUpdate{ Attributes: datadogV2.TeamUpdateAttributes{ Handle: DdTeamDataAttributesHandle, Name: "Example Team updated", Avatar: *datadog.NewNullableString(datadog.PtrString("🥑")), Banner: *datadog.NewNullableInt64(datadog.PtrInt64(7)), HiddenModules: []string{ "m3", }, VisibleModules: []string{ "m1", "m2", }, }, Type: datadogV2.TEAMTYPE_TEAM, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() 
apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.UpdateTeam(ctx, DdTeamDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.UpdateTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.UpdateTeam`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a team returns "OK" response ``` // Update a team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamResponse; import com.datadog.api.client.v2.model.TeamType; import com.datadog.api.client.v2.model.TeamUpdate; import com.datadog.api.client.v2.model.TeamUpdateAttributes; import com.datadog.api.client.v2.model.TeamUpdateRequest; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ATTRIBUTES_HANDLE = System.getenv("DD_TEAM_DATA_ATTRIBUTES_HANDLE"); String DD_TEAM_DATA_ATTRIBUTES_NAME = System.getenv("DD_TEAM_DATA_ATTRIBUTES_NAME"); String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); TeamUpdateRequest body = new TeamUpdateRequest() .data( new TeamUpdate() .attributes( new TeamUpdateAttributes() .handle(DD_TEAM_DATA_ATTRIBUTES_HANDLE) .name("Example Team updated") .avatar("🥑") .banner(7L) .hiddenModules(Collections.singletonList("m3")) .visibleModules(Arrays.asList("m1", "m2"))) .type(TeamType.TEAM)); try { TeamResponse result = apiInstance.updateTeam(DD_TEAM_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#updateTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a team returns "OK" response ``` """ Update a team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_type import TeamType from datadog_api_client.v2.model.team_update import TeamUpdate from datadog_api_client.v2.model.team_update_attributes import TeamUpdateAttributes from datadog_api_client.v2.model.team_update_request import TeamUpdateRequest # there is a valid "dd_team" in the system DD_TEAM_DATA_ATTRIBUTES_HANDLE = environ["DD_TEAM_DATA_ATTRIBUTES_HANDLE"] DD_TEAM_DATA_ATTRIBUTES_NAME = 
environ["DD_TEAM_DATA_ATTRIBUTES_NAME"] DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = TeamUpdateRequest( data=TeamUpdate( attributes=TeamUpdateAttributes( handle=DD_TEAM_DATA_ATTRIBUTES_HANDLE, name="Example Team updated", avatar="🥑", banner=7, hidden_modules=[ "m3", ], visible_modules=[ "m1", "m2", ], ), type=TeamType.TEAM, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.update_team(team_id=DD_TEAM_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a team returns "OK" response ``` # Update a team returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ATTRIBUTES_HANDLE = ENV["DD_TEAM_DATA_ATTRIBUTES_HANDLE"] DD_TEAM_DATA_ATTRIBUTES_NAME = ENV["DD_TEAM_DATA_ATTRIBUTES_NAME"] DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::TeamUpdateRequest.new({ data: DatadogAPIClient::V2::TeamUpdate.new({ attributes: DatadogAPIClient::V2::TeamUpdateAttributes.new({ handle: DD_TEAM_DATA_ATTRIBUTES_HANDLE, name: "Example Team updated", avatar: "🥑", banner: 7, hidden_modules: [ "m3", ], visible_modules: [ "m1", "m2", ], }), type: DatadogAPIClient::V2::TeamType::TEAM, }), }) p api_instance.update_team(DD_TEAM_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a team returns "OK" response ``` // Update a team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamType; use datadog_api_client::datadogV2::model::TeamUpdate; use datadog_api_client::datadogV2::model::TeamUpdateAttributes; use datadog_api_client::datadogV2::model::TeamUpdateRequest; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_attributes_handle = std::env::var("DD_TEAM_DATA_ATTRIBUTES_HANDLE").unwrap(); let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = TeamUpdateRequest::new(TeamUpdate::new( TeamUpdateAttributes::new( dd_team_data_attributes_handle.clone(), "Example Team updated".to_string(), ) .avatar(Some("🥑".to_string())) .banner(Some(7)) .hidden_modules(vec!["m3".to_string()]) .visible_modules(vec!["m1".to_string(), "m2".to_string()]), TeamType::TEAM, )); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.update_team(dd_team_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a team returns "OK" response ``` /** * Update a team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ATTRIBUTES_HANDLE = process.env .DD_TEAM_DATA_ATTRIBUTES_HANDLE as string; const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiUpdateTeamRequest = { body: { data: { attributes: { handle: DD_TEAM_DATA_ATTRIBUTES_HANDLE, name: "Example Team updated", avatar: "🥑", banner: 7, hiddenModules: ["m3"], visibleModules: ["m1", "m2"], }, type: "team", }, }, teamId: DD_TEAM_DATA_ID, }; apiInstance .updateTeam(params) .then((data: v2.TeamResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a team](https://docs.datadoghq.com/api/latest/teams/#remove-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-v2) DELETE https://api.ap1.datadoghq.com/api/v2/team/{team_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}https://api.datadoghq.eu/api/v2/team/{team_id}https://api.ddog-gov.com/api/v2/team/{team_id}https://api.datadoghq.com/api/v2/team/{team_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}https://api.us5.datadoghq.com/api/v2/team/{team_id} ### Overview Remove a team using the team’s `id`. This endpoint requires all of the following permissions: * `teams_read` * `teams_manage` OAuth apps require the `teams_read, teams_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#DeleteTeam-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#DeleteTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#DeleteTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#DeleteTeam-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Remove a team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a team ``` """ Remove a team returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.delete_team( team_id=DD_TEAM_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a team ``` # Remove a team returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] api_instance.delete_team(DD_TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a team ``` // Remove a team returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.DeleteTeam(ctx, DdTeamDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.DeleteTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a 
team ``` // Remove a team returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { apiInstance.deleteTeam(DD_TEAM_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#deleteTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a team ``` // Remove a team returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.delete_team(dd_team_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a team ``` /** * Remove a team returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiDeleteTeamRequest = { teamId: DD_TEAM_DATA_ID, }; apiInstance .deleteTeam(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get team memberships](https://docs.datadoghq.com/api/latest/teams/#get-team-memberships) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-team-memberships-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.datadoghq.eu/api/v2/team/{team_id}/membershipshttps://api.ddog-gov.com/api/v2/team/{team_id}/membershipshttps://api.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/memberships ### Overview Get a paginated list of members for a team This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort enum Specifies the order of returned team memberships Allowed enum values: `manager_name, -manager_name, name, -name, handle, -handle, email, -email` filter[keyword] string Search query, can be user email or name ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamMemberships-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamMemberships-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamMemberships-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamMemberships-429-v2) Represents a user's association to a team * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team memberships response Field Type Description data [object] Team memberships response data attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. 
role enum The user's role within the team Allowed enum values: `admin` id [_required_] string The ID of a user's relationship with a team relationships object Relationship between membership and a user team object Relationship between team membership and team data [_required_] object The team associated with the membership id [_required_] string The ID of the team associated with the membership type [_required_] enum User team team type Allowed enum values: `team` default: `team` user object Relationship between team membership and user data [_required_] object A user's relationship with a team id [_required_] string The ID of the user associated with the team type [_required_] enum User team user type Allowed enum values: `users` default: `users` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` included [ ] Resources related to the team memberships Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. 
Allowed enum values: `users` default: `users` Option 2 object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` links object Teams response links. first string First link. last string Last link. next string Next link. prev string Previous link. self string Current link. meta object Teams response metadata. pagination object Teams response metadata. first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. 
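The `meta.pagination` block above mirrors the `page[size]` and `page[number]` query strings, and member details arrive as separate `included` user resources that must be joined on `data[].relationships.user.data.id` (as shown in the example payload that follows). The sketch below illustrates one way to page through a team's memberships and resolve each membership to a user email over the plain HTTP endpoint; it is a minimal illustration rather than the official client, and the use of the `requests` package, the `datadoghq.com` site choice, and the reuse of the `DD_TEAM_DATA_ID` variable name are assumptions.
```
# Minimal sketch (assumptions: requests is installed, DD_API_KEY / DD_APP_KEY are set,
# and api.datadoghq.com is the correct regional endpoint for your org).
import os
import requests

TEAM_ID = os.environ["DD_TEAM_DATA_ID"]  # same env var name as the client examples
BASE = "https://api.datadoghq.com"
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

page_size, page_number = 100, 0  # page[size] max is 100 per the query-string table
memberships, users = [], {}
while True:
    resp = requests.get(
        f"{BASE}/api/v2/team/{TEAM_ID}/memberships",
        headers=HEADERS,
        params={"page[size]": page_size, "page[number]": page_number, "sort": "name"},
    )
    resp.raise_for_status()
    payload = resp.json()
    memberships.extend(payload.get("data", []))
    # "included" carries the related user objects; index them by id for the join below.
    for inc in payload.get("included", []):
        if inc.get("type") == "users":
            users[inc["id"]] = inc.get("attributes", {})
    if len(payload.get("data", [])) < page_size:
        break
    page_number += 1

for m in memberships:
    user_id = m["relationships"]["user"]["data"]["id"]
    print(user_id, users.get(user_id, {}).get("email"), m["attributes"].get("role"))
```
The same loop works with `filter[keyword]` added to `params` if you only need memberships matching a user email or name.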
``` { "data": [ { "attributes": { "provisioned_by": "string", "provisioned_by_id": "string", "role": "string" }, "id": "TeamMembership-aeadc05e-98a8-11ec-ac2c-da7ad0900001-38835", "relationships": { "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } }, "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } } }, "type": "team_memberships" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get team memberships ```
# Path parameters
export team_id="CHANGE_ME"
# Curl command (use your regional endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v2/team/${team_id}/memberships" \
 -H "Accept: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
``` ##### Get team memberships ``` """ Get team memberships returns "Represents a user's association to a team" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_memberships( team_id=DD_TEAM_DATA_ID, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py"
``` ##### Get team memberships ``` # Get team memberships returns "Represents a user's association to a team" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] p api_instance.get_team_memberships(DD_TEAM_DATA_ID) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb"
``` ##### Get team memberships ``` // Get team memberships returns "Represents a user's association to a team" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeamMemberships(ctx, DdTeamDataID, *datadogV2.NewGetTeamMembershipsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeamMemberships`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamMemberships`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get team memberships ``` // Get team memberships returns "Represents a user's association to a team" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.UserTeamsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { UserTeamsResponse result = apiInstance.getTeamMemberships(DD_TEAM_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamMemberships"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get team memberships ``` // Get team memberships returns "Represents a user's association to a team" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::GetTeamMembershipsOptionalParams; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .get_team_memberships( dd_team_data_id.clone(), GetTeamMembershipsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get team memberships ``` /** * Get team memberships returns "Represents a user's association to a team" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiGetTeamMembershipsRequest = { teamId: DD_TEAM_DATA_ID, }; apiInstance .getTeamMemberships(params) .then((data: v2.UserTeamsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add a user to a team](https://docs.datadoghq.com/api/latest/teams/#add-a-user-to-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#add-a-user-to-a-team-v2) POST https://api.ap1.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.datadoghq.eu/api/v2/team/{team_id}/membershipshttps://api.ddog-gov.com/api/v2/team/{team_id}/membershipshttps://api.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/membershipshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/memberships ### Overview Add a user to a team. **Note** : Each team has a setting that determines who is allowed to modify membership of the team. The `user_access_manage` permission generally grants access to modify membership of any team. To get the full picture, see [Team Membership documentation](https://docs.datadoghq.com/account_management/teams/manage/#team-membership). This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object A user's relationship with a team attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. 
role enum The user's role within the team Allowed enum values: `admin` relationships object Relationship between membership and a user team object Relationship between team membership and team data [_required_] object The team associated with the membership id [_required_] string The ID of the team associated with the membership type [_required_] enum User team team type Allowed enum values: `team` default: `team` user object Relationship between team membership and user data [_required_] object A user's relationship with a team id [_required_] string The ID of the user associated with the team type [_required_] enum User team user type Allowed enum values: `users` default: `users` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` ``` { "data": { "attributes": { "role": "string" }, "relationships": { "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } }, "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } } }, "type": "team_memberships" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#CreateTeamMembership-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#CreateTeamMembership-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#CreateTeamMembership-404-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#CreateTeamMembership-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#CreateTeamMembership-429-v2) Represents a user's association to a team * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team membership response Field Type Description data object A user's relationship with a team attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. role enum The user's role within the team Allowed enum values: `admin` id [_required_] string The ID of a user's relationship with a team relationships object Relationship between membership and a user team object Relationship between team membership and team data [_required_] object The team associated with the membership id [_required_] string The ID of the team associated with the membership type [_required_] enum User team team type Allowed enum values: `team` default: `team` user object Relationship between team membership and user data [_required_] object A user's relationship with a team id [_required_] string The ID of the user associated with the team type [_required_] enum User team user type Allowed enum values: `users` default: `users` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` included [ ] Resources related to the team memberships Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. 
modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. 
type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "provisioned_by": "string", "provisioned_by_id": "string", "role": "string" }, "id": "TeamMembership-aeadc05e-98a8-11ec-ac2c-da7ad0900001-38835", "relationships": { "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } }, "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } } }, "type": "team_memberships" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Add a user to a team ```
# Path parameters
export team_id="CHANGE_ME"
# Curl command (use your regional endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/team/${team_id}/memberships" \
 -H "Accept: application/json" \
 -H "Content-Type: application/json" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
 -d @- << EOF
{
  "data": {
    "relationships": {
      "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } },
      "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } }
    },
    "type": "team_memberships"
  }
}
EOF
``` ##### Add a user to a team ``` """ Add a user to a team returns "Represents a user's association to a team" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.relationship_to_user_team_team import RelationshipToUserTeamTeam from datadog_api_client.v2.model.relationship_to_user_team_team_data import RelationshipToUserTeamTeamData from datadog_api_client.v2.model.relationship_to_user_team_user import RelationshipToUserTeamUser from datadog_api_client.v2.model.relationship_to_user_team_user_data import RelationshipToUserTeamUserData from datadog_api_client.v2.model.user_team_attributes import UserTeamAttributes from datadog_api_client.v2.model.user_team_create import UserTeamCreate from datadog_api_client.v2.model.user_team_relationships import UserTeamRelationships from datadog_api_client.v2.model.user_team_request import UserTeamRequest from datadog_api_client.v2.model.user_team_role import UserTeamRole from datadog_api_client.v2.model.user_team_team_type import UserTeamTeamType from datadog_api_client.v2.model.user_team_type import UserTeamType from datadog_api_client.v2.model.user_team_user_type import UserTeamUserType body = UserTeamRequest( data=UserTeamCreate( attributes=UserTeamAttributes( role=UserTeamRole.ADMIN, ), relationships=UserTeamRelationships( team=RelationshipToUserTeamTeam( data=RelationshipToUserTeamTeamData( id="d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", type=UserTeamTeamType.TEAM, ), ), user=RelationshipToUserTeamUser( data=RelationshipToUserTeamUserData( id="b8626d7e-cedd-11eb-abf5-da7ad0900001", type=UserTeamUserType.USERS, ), ), ), type=UserTeamType.TEAM_MEMBERSHIPS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.create_team_membership(team_id="team_id", body=body) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py"
```
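The examples above and below all build the same JSON:API body shown in the curl request: a `team_memberships` resource whose `relationships` point at an existing `team` and `users` resource by UUID. As a rough plain-HTTP illustration of that body and of the documented 409 and 403 responses, here is a hedged sketch; the `requests` usage, the `datadoghq.com` site choice, and the interpretation of 409 as an existing membership are assumptions, and the UUIDs are the placeholders from the reference.
```
# Minimal sketch (assumptions: requests is installed, DD_API_KEY / DD_APP_KEY are set,
# api.datadoghq.com is the right site, and team_id / user_id are valid UUIDs in your org).
import os
import requests

BASE = "https://api.datadoghq.com"
HEADERS = {
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

team_id = "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9"  # placeholder from the docs examples
user_id = "b8626d7e-cedd-11eb-abf5-da7ad0900001"  # placeholder from the docs examples

body = {
    "data": {
        "type": "team_memberships",
        "attributes": {"role": "admin"},  # optional; "admin" is the only documented role value
        "relationships": {
            "team": {"data": {"id": team_id, "type": "team"}},
            "user": {"data": {"id": user_id, "type": "users"}},
        },
    }
}

resp = requests.post(f"{BASE}/api/v2/team/{team_id}/memberships", headers=HEADERS, json=body)
if resp.status_code == 409:
    print("Conflict (409): the API rejected the request; the membership may already exist.")
elif resp.status_code == 403:
    print("Forbidden (403): check the team's membership settings and the user_access_manage permission.")
else:
    resp.raise_for_status()
    print(resp.json()["data"]["id"])
```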
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add a user to a team ``` # Add a user to a team returns "Represents a user's association to a team" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::UserTeamRequest.new({ data: DatadogAPIClient::V2::UserTeamCreate.new({ attributes: DatadogAPIClient::V2::UserTeamAttributes.new({ role: DatadogAPIClient::V2::UserTeamRole::ADMIN, }), relationships: DatadogAPIClient::V2::UserTeamRelationships.new({ team: DatadogAPIClient::V2::RelationshipToUserTeamTeam.new({ data: DatadogAPIClient::V2::RelationshipToUserTeamTeamData.new({ id: "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", type: DatadogAPIClient::V2::UserTeamTeamType::TEAM, }), }), user: DatadogAPIClient::V2::RelationshipToUserTeamUser.new({ data: DatadogAPIClient::V2::RelationshipToUserTeamUserData.new({ id: "b8626d7e-cedd-11eb-abf5-da7ad0900001", type: DatadogAPIClient::V2::UserTeamUserType::USERS, }), }), }), type: DatadogAPIClient::V2::UserTeamType::TEAM_MEMBERSHIPS, }), }) p api_instance.create_team_membership("team_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add a user to a team ``` // Add a user to a team returns "Represents a user's association to a team" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UserTeamRequest{ Data: datadogV2.UserTeamCreate{ Attributes: &datadogV2.UserTeamAttributes{ Role: *datadogV2.NewNullableUserTeamRole(datadogV2.USERTEAMROLE_ADMIN.Ptr()), }, Relationships: &datadogV2.UserTeamRelationships{ Team: &datadogV2.RelationshipToUserTeamTeam{ Data: datadogV2.RelationshipToUserTeamTeamData{ Id: "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", Type: datadogV2.USERTEAMTEAMTYPE_TEAM, }, }, User: &datadogV2.RelationshipToUserTeamUser{ Data: datadogV2.RelationshipToUserTeamUserData{ Id: "b8626d7e-cedd-11eb-abf5-da7ad0900001", Type: datadogV2.USERTEAMUSERTYPE_USERS, }, }, }, Type: datadogV2.USERTEAMTYPE_TEAM_MEMBERSHIPS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.CreateTeamMembership(ctx, "team_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.CreateTeamMembership`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.CreateTeamMembership`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add a user to a team ``` // Add a user to a team 
returns "Represents a user's association to a team" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.RelationshipToUserTeamTeam; import com.datadog.api.client.v2.model.RelationshipToUserTeamTeamData; import com.datadog.api.client.v2.model.RelationshipToUserTeamUser; import com.datadog.api.client.v2.model.RelationshipToUserTeamUserData; import com.datadog.api.client.v2.model.UserTeamAttributes; import com.datadog.api.client.v2.model.UserTeamCreate; import com.datadog.api.client.v2.model.UserTeamRelationships; import com.datadog.api.client.v2.model.UserTeamRequest; import com.datadog.api.client.v2.model.UserTeamResponse; import com.datadog.api.client.v2.model.UserTeamRole; import com.datadog.api.client.v2.model.UserTeamTeamType; import com.datadog.api.client.v2.model.UserTeamType; import com.datadog.api.client.v2.model.UserTeamUserType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); UserTeamRequest body = new UserTeamRequest() .data( new UserTeamCreate() .attributes(new UserTeamAttributes().role(UserTeamRole.ADMIN)) .relationships( new UserTeamRelationships() .team( new RelationshipToUserTeamTeam() .data( new RelationshipToUserTeamTeamData() .id("d7e15d9d-d346-43da-81d8-3d9e71d9a5e9") .type(UserTeamTeamType.TEAM))) .user( new RelationshipToUserTeamUser() .data( new RelationshipToUserTeamUserData() .id("b8626d7e-cedd-11eb-abf5-da7ad0900001") .type(UserTeamUserType.USERS)))) .type(UserTeamType.TEAM_MEMBERSHIPS)); try { UserTeamResponse result = apiInstance.createTeamMembership("team_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#createTeamMembership"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add a user to a team ``` // Add a user to a team returns "Represents a user's association to a team" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::RelationshipToUserTeamTeam; use datadog_api_client::datadogV2::model::RelationshipToUserTeamTeamData; use datadog_api_client::datadogV2::model::RelationshipToUserTeamUser; use datadog_api_client::datadogV2::model::RelationshipToUserTeamUserData; use datadog_api_client::datadogV2::model::UserTeamAttributes; use datadog_api_client::datadogV2::model::UserTeamCreate; use datadog_api_client::datadogV2::model::UserTeamRelationships; use datadog_api_client::datadogV2::model::UserTeamRequest; use datadog_api_client::datadogV2::model::UserTeamRole; use datadog_api_client::datadogV2::model::UserTeamTeamType; use datadog_api_client::datadogV2::model::UserTeamType; use datadog_api_client::datadogV2::model::UserTeamUserType; #[tokio::main] async fn main() { let body = UserTeamRequest::new( 
UserTeamCreate::new(UserTeamType::TEAM_MEMBERSHIPS) .attributes(UserTeamAttributes::new().role(Some(UserTeamRole::ADMIN))) .relationships( UserTeamRelationships::new() .team(RelationshipToUserTeamTeam::new( RelationshipToUserTeamTeamData::new( "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9".to_string(), UserTeamTeamType::TEAM, ), )) .user(RelationshipToUserTeamUser::new( RelationshipToUserTeamUserData::new( "b8626d7e-cedd-11eb-abf5-da7ad0900001".to_string(), UserTeamUserType::USERS, ), )), ), ); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .create_team_membership("team_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add a user to a team ``` /** * Add a user to a team returns "Represents a user's association to a team" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiCreateTeamMembershipRequest = { body: { data: { attributes: { role: "admin", }, relationships: { team: { data: { id: "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", type: "team", }, }, user: { data: { id: "b8626d7e-cedd-11eb-abf5-da7ad0900001", type: "users", }, }, }, type: "team_memberships", }, }, teamId: "team_id", }; apiInstance .createTeamMembership(params) .then((data: v2.UserTeamResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a user from a team](https://docs.datadoghq.com/api/latest/teams/#remove-a-user-from-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#remove-a-user-from-a-team-v2) DELETE https://api.ap1.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.datadoghq.eu/api/v2/team/{team_id}/memberships/{user_id}https://api.ddog-gov.com/api/v2/team/{team_id}/memberships/{user_id}https://api.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id} ### Overview Remove a user from a team. **Note** : Each team has a setting that determines who is allowed to modify membership of the team. The `user_access_manage` permission generally grants access to modify membership of any team. To get the full picture, see [Team Membership documentation](https://docs.datadoghq.com/account_management/teams/manage/#team-membership). This endpoint requires the `teams_read` permission. 
OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None user_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamMembership-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamMembership-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamMembership-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamMembership-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Remove a user from a team ```
# Path parameters
export team_id="CHANGE_ME"
export user_id="CHANGE_ME"
# Curl command (use your regional endpoint: api.datadoghq.com, api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X DELETE "https://api.datadoghq.com/api/v2/team/${team_id}/memberships/${user_id}" \
 -H "DD-API-KEY: ${DD_API_KEY}" \
 -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
``` ##### Remove a user from a team ``` """ Remove a user from a team returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.delete_team_membership( team_id=DD_TEAM_DATA_ID, user_id="user_id", ) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ```
DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py"
```
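The examples above and below hard-code a `user_id`, but in practice you often start from an email address. Because the memberships listing endpoint documented earlier accepts `filter[keyword]` (a user email or name), one workable pattern is to look the membership up first and then issue the DELETE. The sketch below is an illustration over the plain HTTP endpoints, not the official client; the `requests` usage, the `datadoghq.com` site choice, the hypothetical email address, and the assumption that the keyword match returns the intended user are all assumptions to verify.
```
# Minimal sketch (assumptions: requests installed, DD_API_KEY / DD_APP_KEY set,
# api.datadoghq.com is the correct site, and the email filter narrows to one membership).
import os
import requests

BASE = "https://api.datadoghq.com"
HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}
team_id = os.environ["DD_TEAM_DATA_ID"]
email = "jane.doe@example.com"  # hypothetical address

# 1. Find the membership via the GET /memberships endpoint's filter[keyword] query string.
lookup = requests.get(
    f"{BASE}/api/v2/team/{team_id}/memberships",
    headers=HEADERS,
    params={"filter[keyword]": email},
)
lookup.raise_for_status()
matches = lookup.json().get("data", [])
if not matches:
    raise SystemExit(f"No membership found for {email}")

# 2. The DELETE path needs the user ID, which lives under relationships.user.data.id.
user_id = matches[0]["relationships"]["user"]["data"]["id"]
resp = requests.delete(f"{BASE}/api/v2/team/{team_id}/memberships/{user_id}", headers=HEADERS)
if resp.status_code == 204:
    print(f"Removed {email} from team {team_id}")
else:
    resp.raise_for_status()
```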
ENV["DD_TEAM_DATA_ID"] api_instance.delete_team_membership(DD_TEAM_DATA_ID, "user_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a user from a team ``` // Remove a user from a team returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.DeleteTeamMembership(ctx, DdTeamDataID, "user_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.DeleteTeamMembership`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a user from a team ``` // Remove a user from a team returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { apiInstance.deleteTeamMembership(DD_TEAM_DATA_ID, "user_id"); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#deleteTeamMembership"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a user from a team ``` // Remove a user from a team returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .delete_team_membership(dd_team_data_id.clone(), "user_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a user from a team ``` /** * Remove a user from a team returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiDeleteTeamMembershipRequest = { teamId: DD_TEAM_DATA_ID, userId: "user_id", }; apiInstance .deleteTeamMembership(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a user's membership attributes on a team](https://docs.datadoghq.com/api/latest/teams/#update-a-users-membership-attributes-on-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#update-a-users-membership-attributes-on-a-team-v2) PATCH https://api.ap1.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.datadoghq.eu/api/v2/team/{team_id}/memberships/{user_id}https://api.ddog-gov.com/api/v2/team/{team_id}/memberships/{user_id}https://api.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/memberships/{user_id} ### Overview Update a user’s membership attributes on a team. **Note** : Each team has a setting that determines who is allowed to modify membership of the team. The `user_access_manage` permission generally grants access to modify membership of any team. To get the full picture, see [Team Membership documentation](https://docs.datadoghq.com/account_management/teams/manage/#team-membership). This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None user_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object A user's relationship with a team attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. 
role enum The user's role within the team Allowed enum values: `admin` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` ``` { "data": { "attributes": { "role": "string" }, "type": "team_memberships" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamMembership-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamMembership-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamMembership-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamMembership-429-v2) Represents a user's association to a team * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team membership response Field Type Description data object A user's relationship with a team attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. role enum The user's role within the team Allowed enum values: `admin` id [_required_] string The ID of a user's relationship with a team relationships object Relationship between membership and a user team object Relationship between team membership and team data [_required_] object The team associated with the membership id [_required_] string The ID of the team associated with the membership type [_required_] enum User team team type Allowed enum values: `team` default: `team` user object Relationship between team membership and user data [_required_] object A user's relationship with a team id [_required_] string The ID of the user associated with the team type [_required_] enum User team user type Allowed enum values: `users` default: `users` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` included [ ] Resources related to the team memberships Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. 
data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. 
type [_required_] enum Team type Allowed enum values: `team` default: `team` ``` { "data": { "attributes": { "provisioned_by": "string", "provisioned_by_id": "string", "role": "string" }, "id": "TeamMembership-aeadc05e-98a8-11ec-ac2c-da7ad0900001-38835", "relationships": { "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } }, "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } } }, "type": "team_memberships" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript)

##### Update a user's membership attributes on a team

```
# Path parameters
export team_id="CHANGE_ME"
export user_id="CHANGE_ME"
# Curl command (use the API endpoint for your Datadog site, for example api.datadoghq.com, api.datadoghq.eu, api.us3.datadoghq.com, api.us5.datadoghq.com, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X PATCH "https://api.datadoghq.com/api/v2/team/${team_id}/memberships/${user_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "role": "admin"
    },
    "type": "team_memberships"
  }
}
EOF
```

##### Update a user's membership attributes on a team

```
"""
Update a user's membership attributes on a team returns "Represents a user's association to a team" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.teams_api import TeamsApi
from datadog_api_client.v2.model.user_team_attributes import UserTeamAttributes
from datadog_api_client.v2.model.user_team_role import UserTeamRole
from datadog_api_client.v2.model.user_team_type import UserTeamType
from datadog_api_client.v2.model.user_team_update import UserTeamUpdate
from datadog_api_client.v2.model.user_team_update_request import UserTeamUpdateRequest

body = UserTeamUpdateRequest(
    data=UserTeamUpdate(
        attributes=UserTeamAttributes(
            role=UserTeamRole.ADMIN,
        ),
        type=UserTeamType.TEAM_MEMBERSHIPS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = TeamsApi(api_client)
    response = api_instance.update_team_membership(team_id="team_id", user_id="user_id", body=body)

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site (datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Update a user's membership attributes on a team

```
# Update a user's membership attributes on a team returns "Represents a user's association to a team" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::TeamsAPI.new

body = DatadogAPIClient::V2::UserTeamUpdateRequest.new({
  data: DatadogAPIClient::V2::UserTeamUpdate.new({
    attributes: DatadogAPIClient::V2::UserTeamAttributes.new({
      role: DatadogAPIClient::V2::UserTeamRole::ADMIN,
    }),
    type: DatadogAPIClient::V2::UserTeamType::TEAM_MEMBERSHIPS,
  }),
})
p api_instance.update_team_membership("team_id", "user_id", body)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site (datadoghq.com, us3.datadoghq.com, us5.datadoghq.com, datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com)
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby \
"example.rb" ``` ##### Update a user's membership attributes on a team ``` // Update a user's membership attributes on a team returns "Represents a user's association to a team" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UserTeamUpdateRequest{ Data: datadogV2.UserTeamUpdate{ Attributes: &datadogV2.UserTeamAttributes{ Role: *datadogV2.NewNullableUserTeamRole(datadogV2.USERTEAMROLE_ADMIN.Ptr()), }, Type: datadogV2.USERTEAMTYPE_TEAM_MEMBERSHIPS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.UpdateTeamMembership(ctx, "team_id", "user_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.UpdateTeamMembership`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.UpdateTeamMembership`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a user's membership attributes on a team ``` // Update a user's membership attributes on a team returns "Represents a user's association to a // team" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.UserTeamAttributes; import com.datadog.api.client.v2.model.UserTeamResponse; import com.datadog.api.client.v2.model.UserTeamRole; import com.datadog.api.client.v2.model.UserTeamType; import com.datadog.api.client.v2.model.UserTeamUpdate; import com.datadog.api.client.v2.model.UserTeamUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); UserTeamUpdateRequest body = new UserTeamUpdateRequest() .data( new UserTeamUpdate() .attributes(new UserTeamAttributes().role(UserTeamRole.ADMIN)) .type(UserTeamType.TEAM_MEMBERSHIPS)); try { UserTeamResponse result = apiInstance.updateTeamMembership("team_id", "user_id", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#updateTeamMembership"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a user's membership attributes on a team ``` // Update a user's membership attributes on a team returns "Represents a user's // association to a team" response use 
datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::UserTeamAttributes; use datadog_api_client::datadogV2::model::UserTeamRole; use datadog_api_client::datadogV2::model::UserTeamType; use datadog_api_client::datadogV2::model::UserTeamUpdate; use datadog_api_client::datadogV2::model::UserTeamUpdateRequest; #[tokio::main] async fn main() { let body = UserTeamUpdateRequest::new( UserTeamUpdate::new(UserTeamType::TEAM_MEMBERSHIPS) .attributes(UserTeamAttributes::new().role(Some(UserTeamRole::ADMIN))), ); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .update_team_membership("team_id".to_string(), "user_id".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a user's membership attributes on a team ``` /** * Update a user's membership attributes on a team returns "Represents a user's association to a team" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiUpdateTeamMembershipRequest = { body: { data: { attributes: { role: "admin", }, type: "team_memberships", }, }, teamId: "team_id", userId: "user_id", }; apiInstance .updateTeamMembership(params) .then((data: v2.UserTeamResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get links for a team](https://docs.datadoghq.com/api/latest/teams/#get-links-for-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-links-for-a-team-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.datadoghq.eu/api/v2/team/{team_id}/linkshttps://api.ddog-gov.com/api/v2/team/{team_id}/linkshttps://api.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/links ### Overview Get all links for a given team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. 
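Before the per-language request examples below, here is a short, hedged sketch of how the response can be consumed. It is not one of the official examples: it assumes the `datadog-api-client` Python package (the same client used in the Python example further down) and that the generated response models expose `data` and, per link, `attributes.label`, `attributes.url`, and `attributes.position` as attributes. The `position` field is what sorts a team's links.

```
"""
Sketch only: list a team's links and print them in `position` order.
Assumes DD_API_KEY and DD_APP_KEY (and optionally DD_SITE) are set in the
environment, and that "team_id" below is replaced with a real team ID.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.teams_api import TeamsApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = TeamsApi(api_client)
    # Same endpoint as the examples below: GET /api/v2/team/{team_id}/links
    response = api_instance.get_team_links(team_id="team_id")

    def link_position(link):
        # `position` is optional; links without one sort last in this sketch.
        pos = getattr(link.attributes, "position", None)
        return pos if pos is not None else float("inf")

    for link in sorted(response.data or [], key=link_position):
        print(f"{link.attributes.label}: {link.attributes.url}")
```

The same `position` value can be set when creating or updating a link (see the create and update endpoints below) to control where it appears in the team's link list.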
### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| team_id [_required_] | string | None |

### Response

* [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamLinks-200-v2)
* [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamLinks-403-v2)
* [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamLinks-404-v2)
* [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamLinks-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

Team links response

| Field | Type | Description |
| --- | --- | --- |
| data | [object] | Team links response data |
| attributes [_required_] | object | Team link attributes |
| label [_required_] | string | The link's label |
| position | int32 | The link's position, used to sort links for the team |
| team_id | string | ID of the team the link is associated with |
| url [_required_] | string | The URL for the link |
| id [_required_] | string | The team link's identifier |
| type [_required_] | enum | Team link type. Allowed enum values: `team_links`. Default: `team_links` |

```
{
  "data": [
    {
      "attributes": {
        "label": "Link label",
        "position": "integer",
        "team_id": "string",
        "url": "https://example.com"
      },
      "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001",
      "type": "team_links"
    }
  ]
}
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

API error response.

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get links for a team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/links" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get links for a team ``` """ Get links for a team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_links( team_id=DD_TEAM_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get links for a team ``` # Get links for a team returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] p api_instance.get_team_links(DD_TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get links for a team ``` // Get links for a team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeamLinks(ctx, DdTeamDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeamLinks`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamLinks`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get links for a team ``` // Get links for a team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamLinksResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { TeamLinksResponse result = apiInstance.getTeamLinks(DD_TEAM_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamLinks"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get links for a team ``` // Get links for a team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.get_team_links(dd_team_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get links for a team ``` /** * Get links for a team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiGetTeamLinksRequest = { teamId: DD_TEAM_DATA_ID, }; apiInstance .getTeamLinks(params) .then((data: v2.TeamLinksResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a team link](https://docs.datadoghq.com/api/latest/teams/#create-a-team-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#create-a-team-link-v2) POST https://api.ap1.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.datadoghq.eu/api/v2/team/{team_id}/linkshttps://api.ddog-gov.com/api/v2/team/{team_id}/linkshttps://api.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/linkshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/links ### Overview Add a new link to a team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team link create attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` ``` { "data": { "attributes": { "label": "Link label", "url": "https://example.com", "position": 0 }, "type": "team_links" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#CreateTeamLink-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#CreateTeamLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#CreateTeamLink-404-v2) * [422](https://docs.datadoghq.com/api/latest/teams/#CreateTeamLink-422-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#CreateTeamLink-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team link response Field Type Description data object Team link attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` ``` { "data": { "attributes": { "label": "Link label", "position": "integer", "team_id": "string", "url": "https://example.com" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_links" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. 
* [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Create a team link returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/links" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "label": "Link label", "url": "https://example.com", "position": 0 }, "type": "team_links" } } EOF ``` ##### Create a team link returns "OK" response ``` // Create a team link returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.TeamLinkCreateRequest{ Data: datadogV2.TeamLinkCreate{ Attributes: datadogV2.TeamLinkAttributes{ Label: "Link label", Url: "https://example.com", Position: datadog.PtrInt32(0), }, Type: datadogV2.TEAMLINKTYPE_TEAM_LINKS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.CreateTeamLink(ctx, DdTeamDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.CreateTeamLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.CreateTeamLink`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a team link returns 
"OK" response ``` // Create a team link returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamLinkAttributes; import com.datadog.api.client.v2.model.TeamLinkCreate; import com.datadog.api.client.v2.model.TeamLinkCreateRequest; import com.datadog.api.client.v2.model.TeamLinkResponse; import com.datadog.api.client.v2.model.TeamLinkType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); TeamLinkCreateRequest body = new TeamLinkCreateRequest() .data( new TeamLinkCreate() .attributes( new TeamLinkAttributes() .label("Link label") .url("https://example.com") .position(0)) .type(TeamLinkType.TEAM_LINKS)); try { TeamLinkResponse result = apiInstance.createTeamLink(DD_TEAM_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#createTeamLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a team link returns "OK" response ``` """ Create a team link returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_link_attributes import TeamLinkAttributes from datadog_api_client.v2.model.team_link_create import TeamLinkCreate from datadog_api_client.v2.model.team_link_create_request import TeamLinkCreateRequest from datadog_api_client.v2.model.team_link_type import TeamLinkType # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = TeamLinkCreateRequest( data=TeamLinkCreate( attributes=TeamLinkAttributes( label="Link label", url="https://example.com", position=0, ), type=TeamLinkType.TEAM_LINKS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.create_team_link(team_id=DD_TEAM_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a team link returns "OK" response ``` # Create a team link returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::TeamLinkCreateRequest.new({ data: DatadogAPIClient::V2::TeamLinkCreate.new({ attributes: 
DatadogAPIClient::V2::TeamLinkAttributes.new({ label: "Link label", url: "https://example.com", position: 0, }), type: DatadogAPIClient::V2::TeamLinkType::TEAM_LINKS, }), }) p api_instance.create_team_link(DD_TEAM_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a team link returns "OK" response ``` // Create a team link returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamLinkAttributes; use datadog_api_client::datadogV2::model::TeamLinkCreate; use datadog_api_client::datadogV2::model::TeamLinkCreateRequest; use datadog_api_client::datadogV2::model::TeamLinkType; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = TeamLinkCreateRequest::new(TeamLinkCreate::new( TeamLinkAttributes::new("Link label".to_string(), "https://example.com".to_string()) .position(0), TeamLinkType::TEAM_LINKS, )); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.create_team_link(dd_team_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a team link returns "OK" response ``` /** * Create a team link returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiCreateTeamLinkRequest = { body: { data: { attributes: { label: "Link label", url: "https://example.com", position: 0, }, type: "team_links", }, }, teamId: DD_TEAM_DATA_ID, }; apiInstance .createTeamLink(params) .then((data: v2.TeamLinkResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a team link](https://docs.datadoghq.com/api/latest/teams/#get-a-team-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-a-team-link-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.eu/api/v2/team/{team_id}/links/{link_id}https://api.ddog-gov.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/links/{link_id} ### Overview Get a single link for a team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None link_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamLink-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamLink-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamLink-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team link response Field Type Description data object Team link attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` ``` { "data": { "attributes": { "label": "Link label", "position": "integer", "team_id": "string", "url": "https://example.com" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_links" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get a team link Copy ``` # Path parameters export team_id="CHANGE_ME" export link_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/links/${link_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a team link ``` """ Get a team link returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = environ["TEAM_LINK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_link( team_id=DD_TEAM_DATA_ID, link_id=TEAM_LINK_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a team link ``` # Get a team link returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = ENV["TEAM_LINK_DATA_ID"] p api_instance.get_team_link(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a team link ``` // Get a team link returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") // there is a valid "team_link" in the system TeamLinkDataID := os.Getenv("TEAM_LINK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeamLink(ctx, DdTeamDataID, TeamLinkDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`TeamsApi.GetTeamLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamLink`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a team link ``` // Get a team link returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamLinkResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); // there is a valid "team_link" in the system String TEAM_LINK_DATA_ID = System.getenv("TEAM_LINK_DATA_ID"); try { TeamLinkResponse result = apiInstance.getTeamLink(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a team link ``` // Get a team link returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); // there is a valid "team_link" in the system let team_link_data_id = std::env::var("TEAM_LINK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .get_team_link(dd_team_data_id.clone(), team_link_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a team link ``` /** * Get a team link returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; // there is a valid "team_link" in the system const TEAM_LINK_DATA_ID = process.env.TEAM_LINK_DATA_ID as 
string; const params: v2.TeamsApiGetTeamLinkRequest = { teamId: DD_TEAM_DATA_ID, linkId: TEAM_LINK_DATA_ID, }; apiInstance .getTeamLink(params) .then((data: v2.TeamLinkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a team link](https://docs.datadoghq.com/api/latest/teams/#update-a-team-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#update-a-team-link-v2) PATCH https://api.ap1.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.eu/api/v2/team/{team_id}/links/{link_id}https://api.ddog-gov.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/links/{link_id} ### Overview Update a team link. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None link_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team link create attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` ``` { "data": { "attributes": { "label": "New Label", "url": "https://example.com" }, "type": "team_links" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamLink-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamLink-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamLink-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team link response Field Type Description data object Team link attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` ``` { "data": { "attributes": { "label": "Link label", "position": "integer", "team_id": "string", "url": "https://example.com" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_links" } } ``` Copy Forbidden * 
[Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Update a team link returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" export link_id="CHANGE_ME" # Curl command curl -X PATCH "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/links/${link_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "label": "New Label", "url": "https://example.com" }, "type": "team_links" } } EOF ``` ##### Update a team link returns "OK" response ``` // Update a team link returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") // there is a valid "team_link" in the system TeamLinkDataID := os.Getenv("TEAM_LINK_DATA_ID") body := datadogV2.TeamLinkCreateRequest{ Data: datadogV2.TeamLinkCreate{ Attributes: datadogV2.TeamLinkAttributes{ Label: "New Label", Url: "https://example.com", }, Type: datadogV2.TEAMLINKTYPE_TEAM_LINKS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.UpdateTeamLink(ctx, DdTeamDataID, TeamLinkDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.UpdateTeamLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.UpdateTeamLink`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a team link returns "OK" response ``` // Update a team link returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamLinkAttributes; import com.datadog.api.client.v2.model.TeamLinkCreate; import com.datadog.api.client.v2.model.TeamLinkCreateRequest; import com.datadog.api.client.v2.model.TeamLinkResponse; import com.datadog.api.client.v2.model.TeamLinkType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); // there is a valid "team_link" in the system String TEAM_LINK_DATA_ID = System.getenv("TEAM_LINK_DATA_ID"); TeamLinkCreateRequest body = new TeamLinkCreateRequest() .data( new TeamLinkCreate() .attributes( new TeamLinkAttributes().label("New Label").url("https://example.com")) .type(TeamLinkType.TEAM_LINKS)); try { TeamLinkResponse result = apiInstance.updateTeamLink(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#updateTeamLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a team link returns "OK" response ``` """ Update a team link returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_link_attributes import TeamLinkAttributes from datadog_api_client.v2.model.team_link_create import TeamLinkCreate from datadog_api_client.v2.model.team_link_create_request import TeamLinkCreateRequest from datadog_api_client.v2.model.team_link_type import TeamLinkType # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = environ["TEAM_LINK_DATA_ID"] body = TeamLinkCreateRequest( data=TeamLinkCreate( attributes=TeamLinkAttributes( label="New Label", url="https://example.com", ), type=TeamLinkType.TEAM_LINKS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.update_team_link(team_id=DD_TEAM_DATA_ID, link_id=TEAM_LINK_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 
"example.py" ``` ##### Update a team link returns "OK" response ``` # Update a team link returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = ENV["TEAM_LINK_DATA_ID"] body = DatadogAPIClient::V2::TeamLinkCreateRequest.new({ data: DatadogAPIClient::V2::TeamLinkCreate.new({ attributes: DatadogAPIClient::V2::TeamLinkAttributes.new({ label: "New Label", url: "https://example.com", }), type: DatadogAPIClient::V2::TeamLinkType::TEAM_LINKS, }), }) p api_instance.update_team_link(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a team link returns "OK" response ``` // Update a team link returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamLinkAttributes; use datadog_api_client::datadogV2::model::TeamLinkCreate; use datadog_api_client::datadogV2::model::TeamLinkCreateRequest; use datadog_api_client::datadogV2::model::TeamLinkType; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); // there is a valid "team_link" in the system let team_link_data_id = std::env::var("TEAM_LINK_DATA_ID").unwrap(); let body = TeamLinkCreateRequest::new(TeamLinkCreate::new( TeamLinkAttributes::new("New Label".to_string(), "https://example.com".to_string()), TeamLinkType::TEAM_LINKS, )); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .update_team_link(dd_team_data_id.clone(), team_link_data_id.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a team link returns "OK" response ``` /** * Update a team link returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; // there is a valid "team_link" in the system const TEAM_LINK_DATA_ID = process.env.TEAM_LINK_DATA_ID as string; const params: v2.TeamsApiUpdateTeamLinkRequest = { body: { data: { attributes: { label: "New Label", url: "https://example.com", }, type: "team_links", }, }, teamId: DD_TEAM_DATA_ID, linkId: TEAM_LINK_DATA_ID, }; apiInstance .updateTeamLink(params) .then((data: v2.TeamLinkResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a team link](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-link-v2) DELETE https://api.ap1.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.eu/api/v2/team/{team_id}/links/{link_id}https://api.ddog-gov.com/api/v2/team/{team_id}/links/{link_id}https://api.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/links/{link_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/links/{link_id} ### Overview Remove a link from a team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None link_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamLink-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamLink-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamLink-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Remove a team link Copy ``` # Path parameters export team_id="CHANGE_ME" export link_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/links/${link_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a team link ``` """ Remove a team link returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = environ["TEAM_LINK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.delete_team_link( team_id=DD_TEAM_DATA_ID, link_id=TEAM_LINK_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a team link ``` # Remove a team link returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] # there is a valid "team_link" in the system TEAM_LINK_DATA_ID = ENV["TEAM_LINK_DATA_ID"] api_instance.delete_team_link(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a team link ``` // Remove a team link returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") // there is a valid "team_link" in the system TeamLinkDataID := os.Getenv("TEAM_LINK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.DeleteTeamLink(ctx, DdTeamDataID, TeamLinkDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.DeleteTeamLink`: %v\n", err) 
fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a team link ``` // Remove a team link returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); // there is a valid "team_link" in the system String TEAM_LINK_DATA_ID = System.getenv("TEAM_LINK_DATA_ID"); try { apiInstance.deleteTeamLink(DD_TEAM_DATA_ID, TEAM_LINK_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#deleteTeamLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a team link ``` // Remove a team link returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); // there is a valid "team_link" in the system let team_link_data_id = std::env::var("TEAM_LINK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .delete_team_link(dd_team_data_id.clone(), team_link_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a team link ``` /** * Remove a team link returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; // there is a valid "team_link" in the system const TEAM_LINK_DATA_ID = process.env.TEAM_LINK_DATA_ID as string; const params: v2.TeamsApiDeleteTeamLinkRequest = { teamId: DD_TEAM_DATA_ID, linkId: TEAM_LINK_DATA_ID, }; apiInstance .deleteTeamLink(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get permission settings for a team](https://docs.datadoghq.com/api/latest/teams/#get-permission-settings-for-a-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-permission-settings-for-a-team-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/permission-settingshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/permission-settingshttps://api.datadoghq.eu/api/v2/team/{team_id}/permission-settingshttps://api.ddog-gov.com/api/v2/team/{team_id}/permission-settingshttps://api.datadoghq.com/api/v2/team/{team_id}/permission-settingshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/permission-settingshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/permission-settings ### Overview Get all permission settings for a given team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamPermissionSettings-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamPermissionSettings-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamPermissionSettings-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamPermissionSettings-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team permission settings response Field Type Description data [object] Team permission settings response data attributes object Team permission setting attributes action enum The identifier for the action Allowed enum values: `manage_membership,edit` editable boolean Whether or not the permission setting is editable by the current user options [string] Possible values for action title string The team permission name value enum What type of user is allowed to perform the specified action Allowed enum values: `admins,members,organization,user_access_manage,teams_manage` id [_required_] string The team permission setting's identifier type [_required_] enum Team permission setting type Allowed enum values: `team_permission_settings` default: `team_permission_settings` ``` { "data": [ { "attributes": { "action": "string", "editable": false, "options": [], "title": "string", "value": "string" }, "id": "TeamPermission-aeadc05e-98a8-11ec-ac2c-da7ad0900001-edit", "type": "team_permission_settings" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get permission settings for a team Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/permission-settings" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get permission settings for a team ``` """ Get permission settings for a team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_permission_settings( team_id=DD_TEAM_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get permission settings for a team ``` # Get permission settings for a team returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] p api_instance.get_team_permission_settings(DD_TEAM_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get permission settings for a team ``` // Get permission settings for a team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := 
api.GetTeamPermissionSettings(ctx, DdTeamDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeamPermissionSettings`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamPermissionSettings`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get permission settings for a team ``` // Get permission settings for a team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamPermissionSettingsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); try { TeamPermissionSettingsResponse result = apiInstance.getTeamPermissionSettings(DD_TEAM_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamPermissionSettings"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get permission settings for a team ``` // Get permission settings for a team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .get_team_permission_settings(dd_team_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get permission settings for a team ``` /** * Get permission settings for a team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiGetTeamPermissionSettingsRequest 
= { teamId: DD_TEAM_DATA_ID, }; apiInstance .getTeamPermissionSettings(params) .then((data: v2.TeamPermissionSettingsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get team sync configurations](https://docs.datadoghq.com/api/latest/teams/#get-team-sync-configurations) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-team-sync-configurations-v2) GET https://api.ap1.datadoghq.com/api/v2/team/synchttps://api.ap2.datadoghq.com/api/v2/team/synchttps://api.datadoghq.eu/api/v2/team/synchttps://api.ddog-gov.com/api/v2/team/synchttps://api.datadoghq.com/api/v2/team/synchttps://api.us3.datadoghq.com/api/v2/team/synchttps://api.us5.datadoghq.com/api/v2/team/sync ### Overview Get all team synchronization configurations. Returns a list of configurations used for linking or provisioning teams with external sources like GitHub. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[source] [_required_] enum Filter by the external source platform for team synchronization Allowed enum values: `github` ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamSync-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamSync-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamSync-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamSync-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team sync configurations response. Field Type Description data [object] List of team sync configurations attributes [_required_] object Team sync attributes. frequency enum How often the sync process should be run. Defaults to `once` when not provided. Allowed enum values: `once,continuously,paused` source [_required_] enum The external source platform for team synchronization. Only "github" is supported. Allowed enum values: `github` sync_membership boolean Whether to sync members from the external team to the Datadog team. Defaults to `false` when not provided. type [_required_] enum The type of synchronization operation. "link" connects teams by matching names. "provision" creates new teams when no match is found. Allowed enum values: `link,provision` id string The sync's identifier type [_required_] enum Team sync bulk type. Allowed enum values: `team_sync_bulk` ``` { "data": [ { "attributes": { "frequency": "once", "source": "github", "sync_membership": true, "type": "link" }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "team_sync_bulk" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Team sync configurations not found * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get team sync configurations Copy ``` # Required query arguments export filter[source]="github" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/sync?filter[source]=${filter[source]}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get team sync configurations ``` """ Get team sync configurations returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_sync_attributes_source import TeamSyncAttributesSource configuration = Configuration() configuration.unstable_operations["get_team_sync"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_sync( filter_source=TeamSyncAttributesSource.GITHUB, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get team sync configurations ``` # Get team sync configurations returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.get_team_sync".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new p api_instance.get_team_sync(TeamSyncAttributesSource::GITHUB) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get team sync configurations ``` // Get team sync configurations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.GetTeamSync", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeamSync(ctx, datadogV2.TEAMSYNCATTRIBUTESSOURCE_GITHUB) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeamSync`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamSync`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get team sync configurations ``` // Get team sync configurations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamSyncAttributesSource; import com.datadog.api.client.v2.model.TeamSyncResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.getTeamSync", true); TeamsApi apiInstance = new TeamsApi(defaultClient); try { TeamSyncResponse result = apiInstance.getTeamSync(TeamSyncAttributesSource.GITHUB); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamSync"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get team sync configurations ``` // Get team sync configurations returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamSyncAttributesSource; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.GetTeamSync", true); let api = TeamsAPI::with_config(configuration); let resp = api.get_team_sync(TeamSyncAttributesSource::GITHUB).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get team sync configurations ``` /** * Get team sync configurations returns "OK" response */ import { client, 
v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.getTeamSync"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiGetTeamSyncRequest = { filterSource: "github", }; apiInstance .getTeamSync(params) .then((data: v2.TeamSyncResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update permission setting for team](https://docs.datadoghq.com/api/latest/teams/#update-permission-setting-for-team) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#update-permission-setting-for-team-v2) PUT https://api.ap1.datadoghq.com/api/v2/team/{team_id}/permission-settings/{action}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/permission-settings/{action}https://api.datadoghq.eu/api/v2/team/{team_id}/permission-settings/{action}https://api.ddog-gov.com/api/v2/team/{team_id}/permission-settings/{action}https://api.datadoghq.com/api/v2/team/{team_id}/permission-settings/{action}https://api.us3.datadoghq.com/api/v2/team/{team_id}/permission-settings/{action}https://api.us5.datadoghq.com/api/v2/team/{team_id}/permission-settings/{action} ### Overview Update a team permission setting for a given team. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. 
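As a quick orientation before the full schema reference and per-language examples below, here is a minimal TypeScript sketch of a call to this endpoint. It assumes the same client setup as the official examples later in this section; the `action` path value (`edit`) and the `value` attribute (`members`) are illustrative picks from the allowed enum values documented below, so substitute the setting you actually want to change.

```
/**
 * Sketch: restrict editing of a team's resources to team members.
 * Assumes DD_API_KEY / DD_APP_KEY are set in the environment and that
 * DD_TEAM_DATA_ID points at an existing team, as in the examples below.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.TeamsApi(configuration);

const params: v2.TeamsApiUpdateTeamPermissionSettingRequest = {
  body: {
    data: {
      // "members" is one of: admins, members, organization, user_access_manage, teams_manage
      attributes: { value: "members" },
      type: "team_permission_settings",
    },
  },
  teamId: process.env.DD_TEAM_DATA_ID as string,
  // The action path parameter names the setting to update: "manage_membership" or "edit"
  action: "edit",
};

apiInstance
  .updateTeamPermissionSetting(params)
  .then((data: v2.TeamPermissionSettingResponse) =>
    console.log("Updated setting: " + JSON.stringify(data))
  )
  .catch((error: any) => console.error(error));
```

The same request can be made with any of the clients listed under Code Example; only the `action` and `value` enum values differ between use cases.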
### Arguments #### Path Parameters Name Type Description team_id [_required_] string None action [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team permission setting update attributes object Team permission setting update attributes value enum What type of user is allowed to perform the specified action Allowed enum values: `admins,members,organization,user_access_manage,teams_manage` type [_required_] enum Team permission setting type Allowed enum values: `team_permission_settings` default: `team_permission_settings` ``` { "data": { "attributes": { "value": "admins" }, "type": "team_permission_settings" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamPermissionSetting-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamPermissionSetting-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamPermissionSetting-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamPermissionSetting-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team permission setting response Field Type Description data object Team permission setting attributes object Team permission setting attributes action enum The identifier for the action Allowed enum values: `manage_membership,edit` editable boolean Whether or not the permission setting is editable by the current user options [string] Possible values for action title string The team permission name value enum What type of user is allowed to perform the specified action Allowed enum values: `admins,members,organization,user_access_manage,teams_manage` id [_required_] string The team permission setting's identifier type [_required_] enum Team permission setting type Allowed enum values: `team_permission_settings` default: `team_permission_settings` ``` { "data": { "attributes": { "action": "string", "editable": false, "options": [], "title": "string", "value": "string" }, "id": "TeamPermission-aeadc05e-98a8-11ec-ac2c-da7ad0900001-edit", "type": "team_permission_settings" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Update permission setting for team returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" export action="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/permission-settings/${action}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "value": "admins" }, "type": "team_permission_settings" } } EOF ``` ##### Update permission setting for team returns "OK" response ``` // Update permission setting for team returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.TeamPermissionSettingUpdateRequest{ Data: datadogV2.TeamPermissionSettingUpdate{ Attributes: &datadogV2.TeamPermissionSettingUpdateAttributes{ Value: datadogV2.TEAMPERMISSIONSETTINGVALUE_ADMINS.Ptr(), }, Type: datadogV2.TEAMPERMISSIONSETTINGTYPE_TEAM_PERMISSION_SETTINGS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.UpdateTeamPermissionSetting(ctx, DdTeamDataID, "manage_membership", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.UpdateTeamPermissionSetting`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.UpdateTeamPermissionSetting`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update permission setting for team returns "OK" response ``` // Update permission setting for team returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamPermissionSettingResponse; import com.datadog.api.client.v2.model.TeamPermissionSettingType; import com.datadog.api.client.v2.model.TeamPermissionSettingUpdate; import com.datadog.api.client.v2.model.TeamPermissionSettingUpdateAttributes; import com.datadog.api.client.v2.model.TeamPermissionSettingUpdateRequest; import 
com.datadog.api.client.v2.model.TeamPermissionSettingValue; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); TeamPermissionSettingUpdateRequest body = new TeamPermissionSettingUpdateRequest() .data( new TeamPermissionSettingUpdate() .attributes( new TeamPermissionSettingUpdateAttributes() .value(TeamPermissionSettingValue.ADMINS)) .type(TeamPermissionSettingType.TEAM_PERMISSION_SETTINGS)); try { TeamPermissionSettingResponse result = apiInstance.updateTeamPermissionSetting(DD_TEAM_DATA_ID, "manage_membership", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#updateTeamPermissionSetting"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update permission setting for team returns "OK" response ``` """ Update permission setting for team returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_permission_setting_type import TeamPermissionSettingType from datadog_api_client.v2.model.team_permission_setting_update import TeamPermissionSettingUpdate from datadog_api_client.v2.model.team_permission_setting_update_attributes import TeamPermissionSettingUpdateAttributes from datadog_api_client.v2.model.team_permission_setting_update_request import TeamPermissionSettingUpdateRequest from datadog_api_client.v2.model.team_permission_setting_value import TeamPermissionSettingValue # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = TeamPermissionSettingUpdateRequest( data=TeamPermissionSettingUpdate( attributes=TeamPermissionSettingUpdateAttributes( value=TeamPermissionSettingValue.ADMINS, ), type=TeamPermissionSettingType.TEAM_PERMISSION_SETTINGS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.update_team_permission_setting( team_id=DD_TEAM_DATA_ID, action="manage_membership", body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update permission setting for team returns "OK" response ``` # Update permission setting for team returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = 
DatadogAPIClient::V2::TeamPermissionSettingUpdateRequest.new({ data: DatadogAPIClient::V2::TeamPermissionSettingUpdate.new({ attributes: DatadogAPIClient::V2::TeamPermissionSettingUpdateAttributes.new({ value: DatadogAPIClient::V2::TeamPermissionSettingValue::ADMINS, }), type: DatadogAPIClient::V2::TeamPermissionSettingType::TEAM_PERMISSION_SETTINGS, }), }) p api_instance.update_team_permission_setting(DD_TEAM_DATA_ID, "manage_membership", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update permission setting for team returns "OK" response ``` // Update permission setting for team returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamPermissionSettingType; use datadog_api_client::datadogV2::model::TeamPermissionSettingUpdate; use datadog_api_client::datadogV2::model::TeamPermissionSettingUpdateAttributes; use datadog_api_client::datadogV2::model::TeamPermissionSettingUpdateRequest; use datadog_api_client::datadogV2::model::TeamPermissionSettingValue; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = TeamPermissionSettingUpdateRequest::new( TeamPermissionSettingUpdate::new(TeamPermissionSettingType::TEAM_PERMISSION_SETTINGS) .attributes( TeamPermissionSettingUpdateAttributes::new() .value(TeamPermissionSettingValue::ADMINS), ), ); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .update_team_permission_setting( dd_team_data_id.clone(), "manage_membership".to_string(), body, ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update permission setting for team returns "OK" response ``` /** * Update permission setting for team returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiUpdateTeamPermissionSettingRequest = { body: { data: { attributes: { value: "admins", }, type: "team_permission_settings", }, }, teamId: DD_TEAM_DATA_ID, action: "manage_membership", }; apiInstance .updateTeamPermissionSetting(params) .then((data: v2.TeamPermissionSettingResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get user memberships](https://docs.datadoghq.com/api/latest/teams/#get-user-memberships) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-user-memberships-v2) GET https://api.ap1.datadoghq.com/api/v2/users/{user_uuid}/membershipshttps://api.ap2.datadoghq.com/api/v2/users/{user_uuid}/membershipshttps://api.datadoghq.eu/api/v2/users/{user_uuid}/membershipshttps://api.ddog-gov.com/api/v2/users/{user_uuid}/membershipshttps://api.datadoghq.com/api/v2/users/{user_uuid}/membershipshttps://api.us3.datadoghq.com/api/v2/users/{user_uuid}/membershipshttps://api.us5.datadoghq.com/api/v2/users/{user_uuid}/memberships ### Overview Get a list of memberships for a user This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_uuid [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetUserMemberships-200-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetUserMemberships-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetUserMemberships-429-v2) Represents a user's association to a team * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team memberships response Field Type Description data [object] Team memberships response data attributes object Team membership attributes provisioned_by string The mechanism responsible for provisioning the team relationship. Possible values: null for added by a user, "service_account" if added by a service account, and "saml_mapping" if provisioned via SAML mapping. provisioned_by_id string UUID of the User or Service Account who provisioned this team membership, or null if provisioned via SAML mapping. role enum The user's role within the team Allowed enum values: `admin` id [_required_] string The ID of a user's relationship with a team relationships object Relationship between membership and a user team object Relationship between team membership and team data [_required_] object The team associated with the membership id [_required_] string The ID of the team associated with the membership type [_required_] enum User team team type Allowed enum values: `team` default: `team` user object Relationship between team membership and user data [_required_] object A user's relationship with a team id [_required_] string The ID of the user associated with the team type [_required_] enum User team user type Allowed enum values: `users` default: `users` type [_required_] enum Team membership type Allowed enum values: `team_memberships` default: `team_memberships` included [ ] Resources related to the team memberships Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. 
handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object A team attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` links object Teams response links. first string First link. last string Last link. next string Next link. prev string Previous link. self string Current link. meta object Teams response metadata. 
pagination object Teams response metadata. first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. ``` { "data": [ { "attributes": { "provisioned_by": "string", "provisioned_by_id": "string", "role": "string" }, "id": "TeamMembership-aeadc05e-98a8-11ec-ac2c-da7ad0900001-38835", "relationships": { "team": { "data": { "id": "d7e15d9d-d346-43da-81d8-3d9e71d9a5e9", "type": "team" } }, "user": { "data": { "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "users" } } }, "type": "team_memberships" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get user memberships Copy ``` # Path parameters export user_uuid="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/users/${user_uuid}/memberships" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get user memberships ``` """ Get user memberships returns "Represents a user's association to a team" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_user_memberships( user_uuid=USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get user memberships ``` # Get user memberships returns "Represents a user's association to a team" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] p api_instance.get_user_memberships(USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get user memberships ``` // Get user memberships returns "Represents a user's association to a team" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetUserMemberships(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetUserMemberships`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetUserMemberships`:\n%s\n", 
responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get user memberships ``` // Get user memberships returns "Represents a user's association to a team" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.UserTeamsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { UserTeamsResponse result = apiInstance.getUserMemberships(USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getUserMemberships"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get user memberships ``` // Get user memberships returns "Represents a user's association to a team" // response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.get_user_memberships(user_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get user memberships ``` /** * Get user memberships returns "Represents a user's association to a team" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.TeamsApiGetUserMembershipsRequest = { userUuid: USER_DATA_ID, }; apiInstance .getUserMemberships(params) .then((data: v2.UserTeamsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Link Teams with GitHub Teams](https://docs.datadoghq.com/api/latest/teams/#link-teams-with-github-teams) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#link-teams-with-github-teams-v2) POST https://api.ap1.datadoghq.com/api/v2/team/synchttps://api.ap2.datadoghq.com/api/v2/team/synchttps://api.datadoghq.eu/api/v2/team/synchttps://api.ddog-gov.com/api/v2/team/synchttps://api.datadoghq.com/api/v2/team/synchttps://api.us3.datadoghq.com/api/v2/team/synchttps://api.us5.datadoghq.com/api/v2/team/sync ### Overview This endpoint attempts to link your existing Datadog teams with GitHub teams by matching their names. It evaluates all current Datadog teams and compares them against teams in the GitHub organization connected to your Datadog account, based on Datadog Team handle and GitHub Team slug (lowercased and kebab-cased). This operation is read-only on the GitHub side, no teams will be modified or created. [A GitHub organization must be connected to your Datadog account](https://docs.datadoghq.com/integrations/github/), and the GitHub App integrated with Datadog must have the `Members Read` permission. Matching is performed by comparing the Datadog team handle to the GitHub team slug using a normalized exact match; case is ignored and spaces are removed. No modifications are made to teams in GitHub. This only creates new teams in Datadog when type is set to `provision`. This endpoint requires the `teams_manage` permission. OAuth apps require the `teams_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object A configuration governing syncing between Datadog teams and teams from an external system. attributes [_required_] object Team sync attributes. frequency enum How often the sync process should be run. Defaults to `once` when not provided. Allowed enum values: `once,continuously,paused` source [_required_] enum The external source platform for team synchronization. Only "github" is supported. Allowed enum values: `github` sync_membership boolean Whether to sync members from the external team to the Datadog team. Defaults to `false` when not provided. type [_required_] enum The type of synchronization operation. "link" connects teams by matching names. "provision" creates new teams when no match is found. Allowed enum values: `link,provision` id string The sync's identifier type [_required_] enum Team sync bulk type. 
Allowed enum values: `team_sync_bulk`

```
{
  "data": {
    "attributes": {
      "source": "github",
      "type": "link"
    },
    "type": "team_sync_bulk"
  }
}
```

### Response

* [200](https://docs.datadoghq.com/api/latest/teams/#SyncTeams-200-v2)
* [403](https://docs.datadoghq.com/api/latest/teams/#SyncTeams-403-v2)
* [429](https://docs.datadoghq.com/api/latest/teams/#SyncTeams-429-v2)
* [500](https://docs.datadoghq.com/api/latest/teams/#SyncTeams-500-v2)

OK

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Internal Server Error - Unexpected error during linking.

* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)

API error response.

Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl)
* [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java)
* [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby)
* [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript)

##### Sync teams returns "OK" response

```
# Curl command (replace api.datadoghq.com with your site's host: api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X POST "https://api.datadoghq.com/api/v2/team/sync" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{
  "data": {
    "attributes": {
      "source": "github",
      "type": "link"
    },
    "type": "team_sync_bulk"
  }
}
EOF
```

##### Sync teams returns "OK" response

```
// Sync teams returns "OK" response
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	body := datadogV2.TeamSyncRequest{
		Data: datadogV2.TeamSyncData{
			Attributes: datadogV2.TeamSyncAttributes{
				Source: datadogV2.TEAMSYNCATTRIBUTESSOURCE_GITHUB,
				Type:   datadogV2.TEAMSYNCATTRIBUTESTYPE_LINK,
			},
			Type: datadogV2.TEAMSYNCBULKTYPE_TEAM_SYNC_BULK,
		},
	}
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	configuration.SetUnstableOperationEnabled("v2.SyncTeams", true)
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewTeamsApi(apiClient)
	r, err := api.SyncTeams(ctx, body)

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.SyncTeams`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands (set `DD_SITE` to your Datadog site): ```
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Sync teams returns "OK" response ``` // Sync teams returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamSyncAttributes; import com.datadog.api.client.v2.model.TeamSyncAttributesSource; import com.datadog.api.client.v2.model.TeamSyncAttributesType; import com.datadog.api.client.v2.model.TeamSyncBulkType; import com.datadog.api.client.v2.model.TeamSyncData; import com.datadog.api.client.v2.model.TeamSyncRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.syncTeams", true); TeamsApi apiInstance = new TeamsApi(defaultClient); TeamSyncRequest body = new TeamSyncRequest() .data( new TeamSyncData() .attributes( new TeamSyncAttributes() .source(TeamSyncAttributesSource.GITHUB) .type(TeamSyncAttributesType.LINK)) .type(TeamSyncBulkType.TEAM_SYNC_BULK)); try { apiInstance.syncTeams(body); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#syncTeams"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Sync teams returns "OK" response ``` """ Sync teams returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_sync_attributes import TeamSyncAttributes from datadog_api_client.v2.model.team_sync_attributes_source import TeamSyncAttributesSource from datadog_api_client.v2.model.team_sync_attributes_type import TeamSyncAttributesType from datadog_api_client.v2.model.team_sync_bulk_type import TeamSyncBulkType from datadog_api_client.v2.model.team_sync_data import TeamSyncData from datadog_api_client.v2.model.team_sync_request import TeamSyncRequest body = TeamSyncRequest( data=TeamSyncData( attributes=TeamSyncAttributes( source=TeamSyncAttributesSource.GITHUB, type=TeamSyncAttributesType.LINK, ), type=TeamSyncBulkType.TEAM_SYNC_BULK, ), ) configuration = Configuration() configuration.unstable_operations["sync_teams"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.sync_teams(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Sync teams returns "OK" response ``` # Sync teams returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.sync_teams".to_sym] = true 
end api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::TeamSyncRequest.new({ data: DatadogAPIClient::V2::TeamSyncData.new({ attributes: DatadogAPIClient::V2::TeamSyncAttributes.new({ source: DatadogAPIClient::V2::TeamSyncAttributesSource::GITHUB, type: DatadogAPIClient::V2::TeamSyncAttributesType::LINK, }), type: DatadogAPIClient::V2::TeamSyncBulkType::TEAM_SYNC_BULK, }), }) p api_instance.sync_teams(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Sync teams returns "OK" response ``` // Sync teams returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamSyncAttributes; use datadog_api_client::datadogV2::model::TeamSyncAttributesSource; use datadog_api_client::datadogV2::model::TeamSyncAttributesType; use datadog_api_client::datadogV2::model::TeamSyncBulkType; use datadog_api_client::datadogV2::model::TeamSyncData; use datadog_api_client::datadogV2::model::TeamSyncRequest; #[tokio::main] async fn main() { let body = TeamSyncRequest::new(TeamSyncData::new( TeamSyncAttributes::new( TeamSyncAttributesSource::GITHUB, TeamSyncAttributesType::LINK, ), TeamSyncBulkType::TEAM_SYNC_BULK, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.SyncTeams", true); let api = TeamsAPI::with_config(configuration); let resp = api.sync_teams(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Sync teams returns "OK" response ``` /** * Sync teams returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.syncTeams"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiSyncTeamsRequest = { body: { data: { attributes: { source: "github", type: "link", }, type: "team_sync_bulk", }, }, }; apiInstance .syncTeams(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Add a member team](https://docs.datadoghq.com/api/latest/teams/#add-a-member-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/teams/#add-a-member-team-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). 
POST https://api.ap1.datadoghq.com/api/v2/team/{super_team_id}/member_teamshttps://api.ap2.datadoghq.com/api/v2/team/{super_team_id}/member_teamshttps://api.datadoghq.eu/api/v2/team/{super_team_id}/member_teamshttps://api.ddog-gov.com/api/v2/team/{super_team_id}/member_teamshttps://api.datadoghq.com/api/v2/team/{super_team_id}/member_teamshttps://api.us3.datadoghq.com/api/v2/team/{super_team_id}/member_teamshttps://api.us5.datadoghq.com/api/v2/team/{super_team_id}/member_teams ### Overview Add a member team. Adds the team given by the `id` in the body as a member team of the super team. **Note** : This API is deprecated. For creating team hierarchy links, use the team hierarchy links API: `POST /api/v2/team-hierarchy-links`. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description super_team_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object A member team id [_required_] string The member team's identifier type [_required_] enum Member team type Allowed enum values: `member_teams` default: `member_teams` ``` { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "member_teams" } } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#AddMemberTeam-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#AddMemberTeam-403-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#AddMemberTeam-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#AddMemberTeam-429-v2) Added Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Add a member team Copy ``` # Path parameters export super_team_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${super_team_id}/member_teams" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "member_teams" } } EOF ``` ##### Add a member team ``` """ Add a member team returns "Added" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.add_member_team_request import AddMemberTeamRequest from datadog_api_client.v2.model.member_team import MemberTeam from datadog_api_client.v2.model.member_team_type import MemberTeamType body = AddMemberTeamRequest( data=MemberTeam( id="aeadc05e-98a8-11ec-ac2c-da7ad0900001", type=MemberTeamType.MEMBER_TEAMS, ), ) configuration = Configuration() configuration.unstable_operations["add_member_team"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.add_member_team(super_team_id="super_team_id", body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Add a member team ``` # Add a member team returns "Added" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.add_member_team".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::AddMemberTeamRequest.new({ data: DatadogAPIClient::V2::MemberTeam.new({ id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", type: DatadogAPIClient::V2::MemberTeamType::MEMBER_TEAMS, }), }) api_instance.add_member_team("super_team_id", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Add a member team ``` // Add a member team returns "Added" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.AddMemberTeamRequest{ Data: datadogV2.MemberTeam{ Id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", Type: 
datadogV2.MEMBERTEAMTYPE_MEMBER_TEAMS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.AddMemberTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.AddMemberTeam(ctx, "super_team_id", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.AddMemberTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Add a member team ``` // Add a member team returns "Added" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.AddMemberTeamRequest; import com.datadog.api.client.v2.model.MemberTeam; import com.datadog.api.client.v2.model.MemberTeamType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.addMemberTeam", true); TeamsApi apiInstance = new TeamsApi(defaultClient); AddMemberTeamRequest body = new AddMemberTeamRequest() .data( new MemberTeam() .id("aeadc05e-98a8-11ec-ac2c-da7ad0900001") .type(MemberTeamType.MEMBER_TEAMS)); try { apiInstance.addMemberTeam("super_team_id", body); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#addMemberTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Add a member team ``` // Add a member team returns "Added" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::AddMemberTeamRequest; use datadog_api_client::datadogV2::model::MemberTeam; use datadog_api_client::datadogV2::model::MemberTeamType; #[tokio::main] async fn main() { let body = AddMemberTeamRequest::new(MemberTeam::new( "aeadc05e-98a8-11ec-ac2c-da7ad0900001".to_string(), MemberTeamType::MEMBER_TEAMS, )); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.AddMemberTeam", true); let api = TeamsAPI::with_config(configuration); let resp = api.add_member_team("super_team_id".to_string(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Add a member team ``` /** * Add a member team returns "Added" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.addMemberTeam"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiAddMemberTeamRequest = { body: { data: { id: "aeadc05e-98a8-11ec-ac2c-da7ad0900001", type: "member_teams", }, }, superTeamId: "super_team_id", }; apiInstance .addMemberTeam(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get team hierarchy links](https://docs.datadoghq.com/api/latest/teams/#get-team-hierarchy-links) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-team-hierarchy-links-v2) GET https://api.ap1.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.ap2.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.datadoghq.eu/api/v2/team-hierarchy-linkshttps://api.ddog-gov.com/api/v2/team-hierarchy-linkshttps://api.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.us3.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.us5.datadoghq.com/api/v2/team-hierarchy-links ### Overview List all team hierarchy links that match the provided filters. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[number] integer Specific page number to return. page[size] integer Size for a given page. The maximum allowed value is 100. filter[parent_team] string Filter by parent team ID filter[sub_team] string Filter by sub team ID ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#ListTeamHierarchyLinks-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#ListTeamHierarchyLinks-403-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#ListTeamHierarchyLinks-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team hierarchy links response Field Type Description data [object] Team hierarchy links response data attributes [_required_] object Team hierarchy link attributes created_at [_required_] date-time Timestamp when the team hierarchy link was created provisioned_by [_required_] string The provisioner of the team hierarchy link id [_required_] string The team hierarchy link's identifier relationships object Team hierarchy link relationships parent_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. 
avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` sub_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` type [_required_] enum Team hierarchy link type Allowed enum values: `team_hierarchy_links` default: `team_hierarchy_links` included [object] Included teams attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` links object When querying team hierarchy links, a set of links for navigation between different pages is included first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to the previous page. self string Link to the current object. meta object Metadata that is included in the response when querying the team hierarchy links page object Metadata related to paging information that is included in the response when querying the team hierarchy links first_number int64 First page number. last_number int64 Last page number. next_number int64 Next page number. number int64 Page number. prev_number int64 Previous page number. size int64 Page size. total int64 Total number of results. type string Pagination type. 
``` { "data": [ { "attributes": { "created_at": "", "provisioned_by": "system" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "relationships": { "parent_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } }, "sub_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } } }, "type": "team_hierarchy_links" } ], "included": [ { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "page": { "first_number": "integer", "last_number": "integer", "next_number": "integer", "number": "integer", "prev_number": "integer", "size": "integer", "total": "integer", "type": "number_size" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get team hierarchy links Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team-hierarchy-links" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get team hierarchy links ``` """ Get team hierarchy links returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID = environ[ "TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID" ] TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID = environ[ "TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID" ] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.list_team_hierarchy_links( page_number=0, page_size=100, filter_parent_team=TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID, filter_sub_team=TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get team hierarchy links ``` # Get team hierarchy links returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID = ENV["TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID"] TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID = ENV["TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID"] opts = { filter_parent_team: TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID, filter_sub_team: TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID, page_number: 0, page_size: 100, } p api_instance.list_team_hierarchy_links(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get team hierarchy links ``` // Get team hierarchy links returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team_hierarchy_link" in the system TeamHierarchyLinkDataRelationshipsParentTeamDataID := os.Getenv("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID") TeamHierarchyLinkDataRelationshipsSubTeamDataID := os.Getenv("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.ListTeamHierarchyLinks(ctx, *datadogV2.NewListTeamHierarchyLinksOptionalParameters().WithFilterParentTeam(TeamHierarchyLinkDataRelationshipsParentTeamDataID).WithFilterSubTeam(TeamHierarchyLinkDataRelationshipsSubTeamDataID).WithPageNumber(0).WithPageSize(100)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.ListTeamHierarchyLinks`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.ListTeamHierarchyLinks`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get team hierarchy links ``` // Get team hierarchy links returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.api.TeamsApi.ListTeamHierarchyLinksOptionalParameters; import com.datadog.api.client.v2.model.TeamHierarchyLinksResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "team_hierarchy_link" in the system String TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID = System.getenv("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID"); String TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID = System.getenv("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID"); try { TeamHierarchyLinksResponse result = apiInstance.listTeamHierarchyLinks( new ListTeamHierarchyLinksOptionalParameters() .filterParentTeam(TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID) .filterSubTeam(TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID) .pageNumber(0L) .pageSize(100L)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#listTeamHierarchyLinks"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get team hierarchy links ``` // Get team hierarchy links returns "OK" response use datadog_api_client::datadog; 
use datadog_api_client::datadogV2::api_teams::ListTeamHierarchyLinksOptionalParams; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "team_hierarchy_link" in the system let team_hierarchy_link_data_relationships_parent_team_data_id = std::env::var("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID").unwrap(); let team_hierarchy_link_data_relationships_sub_team_data_id = std::env::var("TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .list_team_hierarchy_links( ListTeamHierarchyLinksOptionalParams::default() .filter_parent_team( team_hierarchy_link_data_relationships_parent_team_data_id.clone(), ) .filter_sub_team(team_hierarchy_link_data_relationships_sub_team_data_id.clone()) .page_number(0) .page_size(100), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get team hierarchy links ``` /** * Get team hierarchy links returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "team_hierarchy_link" in the system const TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID = process.env .TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID as string; const TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID = process.env .TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID as string; const params: v2.TeamsApiListTeamHierarchyLinksRequest = { pageNumber: 0, pageSize: 100, filterParentTeam: TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_PARENT_TEAM_DATA_ID, filterSubTeam: TEAM_HIERARCHY_LINK_DATA_RELATIONSHIPS_SUB_TEAM_DATA_ID, }; apiInstance .listTeamHierarchyLinks(params) .then((data: v2.TeamHierarchyLinksResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#get-a-team-hierarchy-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-a-team-hierarchy-link-v2) GET https://api.ap1.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.ap2.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.datadoghq.eu/api/v2/team-hierarchy-links/{link_id}https://api.ddog-gov.com/api/v2/team-hierarchy-links/{link_id}https://api.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.us3.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.us5.datadoghq.com/api/v2/team-hierarchy-links/{link_id} ### Overview Get a single team hierarchy link for the given link_id. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description link_id [_required_] string The team hierarchy link’s identifier ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamHierarchyLink-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamHierarchyLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamHierarchyLink-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamHierarchyLink-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team hierarchy link response Field Type Description data object Team hierarchy link attributes [_required_] object Team hierarchy link attributes created_at [_required_] date-time Timestamp when the team hierarchy link was created provisioned_by [_required_] string The provisioner of the team hierarchy link id [_required_] string The team hierarchy link's identifier relationships object Team hierarchy link relationships parent_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` sub_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. 
This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` type [_required_] enum Team hierarchy link type Allowed enum values: `team_hierarchy_links` default: `team_hierarchy_links` included [object] Included teams attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` links object When querying team hierarchy links, a set of links for navigation between different pages is included first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to the previous page. self string Link to the current object. ``` { "data": { "attributes": { "created_at": "", "provisioned_by": "system" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "relationships": { "parent_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } }, "sub_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } } }, "type": "team_hierarchy_links" }, "included": [ { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get a team hierarchy link Copy ``` # Path parameters export link_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team-hierarchy-links/${link_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a team hierarchy link ``` """ Get a team hierarchy link returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_ID = environ["TEAM_HIERARCHY_LINK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.get_team_hierarchy_link( link_id=TEAM_HIERARCHY_LINK_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a team hierarchy link ``` # Get a team hierarchy link returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_ID = ENV["TEAM_HIERARCHY_LINK_DATA_ID"] p api_instance.get_team_hierarchy_link(TEAM_HIERARCHY_LINK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a team hierarchy link ``` // Get a team hierarchy link returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team_hierarchy_link" in the system TeamHierarchyLinkDataID := os.Getenv("TEAM_HIERARCHY_LINK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := 
datadogV2.NewTeamsApi(apiClient) resp, r, err := api.GetTeamHierarchyLink(ctx, TeamHierarchyLinkDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.GetTeamHierarchyLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.GetTeamHierarchyLink`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a team hierarchy link ``` // Get a team hierarchy link returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamHierarchyLinkResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "team_hierarchy_link" in the system String TEAM_HIERARCHY_LINK_DATA_ID = System.getenv("TEAM_HIERARCHY_LINK_DATA_ID"); try { TeamHierarchyLinkResponse result = apiInstance.getTeamHierarchyLink(TEAM_HIERARCHY_LINK_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#getTeamHierarchyLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a team hierarchy link ``` // Get a team hierarchy link returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "team_hierarchy_link" in the system let team_hierarchy_link_data_id = std::env::var("TEAM_HIERARCHY_LINK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .get_team_hierarchy_link(team_hierarchy_link_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a team hierarchy link ``` /** * Get a team hierarchy link returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "team_hierarchy_link" in the system const TEAM_HIERARCHY_LINK_DATA_ID = process.env 
.TEAM_HIERARCHY_LINK_DATA_ID as string; const params: v2.TeamsApiGetTeamHierarchyLinkRequest = { linkId: TEAM_HIERARCHY_LINK_DATA_ID, }; apiInstance .getTeamHierarchyLink(params) .then((data: v2.TeamHierarchyLinkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Get all member teams](https://docs.datadoghq.com/api/latest/teams/#get-all-member-teams)

* [v2 (deprecated)](https://docs.datadoghq.com/api/latest/teams/#get-all-member-teams-v2)

**Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/).

GET /api/v2/team/{super_team_id}/member_teams

Available on: https://api.ap1.datadoghq.com, https://api.ap2.datadoghq.com, https://api.datadoghq.eu, https://api.ddog-gov.com, https://api.datadoghq.com, https://api.us3.datadoghq.com, https://api.us5.datadoghq.com

### Overview

Get all member teams.

**Note** : This API is deprecated. For team hierarchy relationships (parent-child teams), use the team hierarchy links API: `GET /api/v2/team-hierarchy-links`. A migration sketch follows the argument tables below.

This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint.

### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| super_team_id [_required_] | string | None |

#### Query Strings

| Name | Type | Description |
| --- | --- | --- |
| page[size] | integer | Size for a given page. The maximum allowed value is 100. |
| page[number] | integer | Specific page number to return. |
| fields[team] | array | List of fields that need to be fetched. |
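Because this endpoint is deprecated in favor of team hierarchy links, the sketch below shows one possible migration path using the same Python client as the other examples on this page: filter hierarchy links by the former super team and read each link's `sub_team` relationship. The `PARENT_TEAM_ID` environment variable and the attribute access on the generated response models are assumptions for illustration, not part of an official example.

```
"""
Migration sketch (illustrative, not an official example): list a team's member
teams through the team hierarchy links API instead of the deprecated
GET /api/v2/team/{super_team_id}/member_teams endpoint.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.teams_api import TeamsApi

# Hypothetical environment variable holding the former super team's ID.
PARENT_TEAM_ID = environ["PARENT_TEAM_ID"]

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = TeamsApi(api_client)
    # Each hierarchy link whose parent_team matches is one member (sub) team.
    response = api_instance.list_team_hierarchy_links(
        filter_parent_team=PARENT_TEAM_ID,
        page_size=100,
    )
    for link in response.data:
        # relationships is optional in the response model, so guard the access.
        relationships = getattr(link, "relationships", None)
        if relationships is not None:
            print(relationships.sub_team.data.id)
```

Paging works the same way as in the `GET /api/v2/team-hierarchy-links` examples above, through the `page[number]` and `page[size]` query strings.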
### Response * [200](https://docs.datadoghq.com/api/latest/teams/#ListMemberTeams-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#ListMemberTeams-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#ListMemberTeams-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#ListMemberTeams-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response with multiple teams Field Type Description data [object] Teams response data attributes [_required_] object Team attributes avatar string Unicode representation of the avatar for the team, limited to a single grapheme banner int64 Banner selection for the team created_at date-time Creation date of the team description string Free-form markdown description/content for the team's homepage handle [_required_] string The team's identifier hidden_modules [string] Collection of hidden modules for the team is_managed boolean Whether the team is managed from an external source link_count int32 The number of links belonging to the team modified_at date-time Modification date of the team name [_required_] string The name of the team summary string A brief summary of the team, derived from the `description` user_count int32 The number of users belonging to the team visible_modules [string] Collection of visible modules for the team id [_required_] string The team's identifier relationships object Resources related to a team team_links object Relationship between a team and a team link data [object] Related team links id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` links object Links attributes. related string Related link. user_team_permissions object Relationship between a user team permission and a team data object Related user team permission data id [_required_] string The ID of the user team permission type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Links attributes. related string Related link. type [_required_] enum Team type Allowed enum values: `team` default: `team` included [ ] Resources related to the team Option 1 object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. 
Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` Option 2 object Team link attributes [_required_] object Team link attributes label [_required_] string The link's label position int32 The link's position, used to sort links for the team team_id string ID of the team the link is associated with url [_required_] string The URL for the link id [_required_] string The team link's identifier type [_required_] enum Team link type Allowed enum values: `team_links` default: `team_links` Option 3 object A user's permissions for a given team attributes object User team permission attributes permissions object Object of team permission actions and boolean values that a logged in user can perform on this team. id [_required_] string The user team permission's identifier type [_required_] enum User team permission type Allowed enum values: `user_team_permissions` default: `user_team_permissions` links object Teams response links. first string First link. last string Last link. next string Next link. prev string Previous link. self string Current link. meta object Teams response metadata. pagination object Teams response metadata. first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. 
``` { "data": [ { "attributes": { "avatar": "🥑", "banner": "integer", "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "handle": "example-team", "hidden_modules": [], "is_managed": false, "link_count": "integer", "modified_at": "2019-09-19T10:00:00.000Z", "name": "Example Team", "summary": "string", "user_count": "integer", "visible_modules": [] }, "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "relationships": { "team_links": { "data": [ { "id": "f9bb8444-af7f-11ec-ac2c-da7ad0900001", "type": "team_links" } ], "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } }, "user_team_permissions": { "data": { "id": "UserTeamPermissions-aeadc05e-98a8-11ec-ac2c-da7ad0900001-416595", "type": "user_team_permissions" }, "links": { "related": "/api/v2/team/c75a4a8e-20c7-11ee-a3a5-da7ad0900002/links" } } }, "type": "team" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" }, "meta": { "pagination": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Get all member teams Copy ``` # Path parameters export super_team_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X GET "https://api.datadoghq.com/api/v2/team/${super_team_id}/member_teams" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all member teams ``` """ Get all member teams returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi configuration = Configuration() configuration.unstable_operations["list_member_teams"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.list_member_teams( super_team_id="super_team_id", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all member teams ``` # Get all member teams returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_member_teams".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new p api_instance.list_member_teams("super_team_id") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get all member teams ``` // Get all member teams returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListMemberTeams", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.ListMemberTeams(ctx, "super_team_id", *datadogV2.NewListMemberTeamsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.ListMemberTeams`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.ListMemberTeams`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all member teams ``` // Get all member teams returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listMemberTeams", true); TeamsApi apiInstance = new TeamsApi(defaultClient); try { TeamsResponse result = apiInstance.listMemberTeams("super_team_id"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#listMemberTeams"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all member teams ``` // Get all member teams returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::ListMemberTeamsOptionalParams; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListMemberTeams", true); let api = TeamsAPI::with_config(configuration); let resp = api .list_member_teams( "super_team_id".to_string(), ListMemberTeamsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all member teams ``` /** * Get all member teams returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listMemberTeams"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiListMemberTeamsRequest = { superTeamId: "super_team_id", }; apiInstance .listMemberTeams(params) .then((data: v2.TeamsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#create-a-team-hierarchy-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#create-a-team-hierarchy-link-v2) POST https://api.ap1.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.ap2.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.datadoghq.eu/api/v2/team-hierarchy-linkshttps://api.ddog-gov.com/api/v2/team-hierarchy-linkshttps://api.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.us3.datadoghq.com/api/v2/team-hierarchy-linkshttps://api.us5.datadoghq.com/api/v2/team-hierarchy-links ### Overview Create a new team hierarchy link between a parent team and a sub team. This endpoint requires all of the following permissions: * `teams_read` * `teams_manage` OAuth apps require the `teams_read, teams_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Data provided when creating a team hierarchy link relationships [_required_] object The related teams that will be connected by the team hierarchy link parent_team [_required_] object Data about each team that will be connected by the team hierarchy link data [_required_] object This schema defines the attributes about each team that has to be provided when creating a team hierarchy link id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` sub_team [_required_] object Data about each team that will be connected by the team hierarchy link data [_required_] object This schema defines the attributes about each team that has to be provided when creating a team hierarchy link id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` type [_required_] enum Team hierarchy link type Allowed enum values: `team_hierarchy_links` default: `team_hierarchy_links` ``` { "data": { "relationships": { "parent_team": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "team" } }, "sub_team": { "data": { "id": "7c47c39d-7740-6408-d686-7870a744701c", "type": "team" } } }, "type": "team_hierarchy_links" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#AddTeamHierarchyLink-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#AddTeamHierarchyLink-403-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#AddTeamHierarchyLink-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#AddTeamHierarchyLink-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team hierarchy link response Field Type Description data object Team hierarchy link attributes [_required_] object Team hierarchy link attributes created_at [_required_] date-time Timestamp when the team hierarchy link was created provisioned_by 
[_required_] string The provisioner of the team hierarchy link id [_required_] string The team hierarchy link's identifier relationships object Team hierarchy link relationships parent_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` sub_team [_required_] object Team hierarchy link team relationship data [_required_] object Team hierarchy links connect different teams. This represents team objects that are connected by the team hierarchy link. attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` type [_required_] enum Team hierarchy link type Allowed enum values: `team_hierarchy_links` default: `team_hierarchy_links` included [object] Included teams attributes object Team hierarchy links connect different teams. This represents attributes from teams that are connected by the team hierarchy link. avatar string The team's avatar banner int64 The team's banner handle [_required_] string The team's handle is_managed boolean Whether the team is managed is_open_membership boolean Whether the team has open membership link_count int64 The number of links for the team name [_required_] string The team's name summary string The team's summary user_count int64 The number of users in the team id [_required_] string The team's identifier type [_required_] enum Team type Allowed enum values: `team` default: `team` links object When querying team hierarchy links, a set of links for navigation between different pages is included first string Link to the first page. last string Link to the last page. next string Link to the next page. prev string Link to the previous page. self string Link to the current object. 
``` { "data": { "attributes": { "created_at": "", "provisioned_by": "system" }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "relationships": { "parent_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } }, "sub_team": { "data": { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } } }, "type": "team_hierarchy_links" }, "included": [ { "attributes": { "avatar": "string", "banner": "integer", "handle": "team-handle", "is_managed": false, "is_open_membership": false, "link_count": "integer", "name": "Team Name", "summary": "string", "user_count": "integer" }, "id": "692e8073-12c4-4c71-8408-5090bd44c9c8", "type": "team" } ], "links": { "first": "string", "last": "string", "next": "string", "prev": "string", "self": "string" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Create a team hierarchy link returns "OK" response Copy ``` # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X POST "https://api.datadoghq.com/api/v2/team-hierarchy-links" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "relationships": { "parent_team": { "data": { "id": "aeadc05e-98a8-11ec-ac2c-da7ad0900001", "type": "team" } }, "sub_team": { "data": { "id": "7c47c39d-7740-6408-d686-7870a744701c", "type": "team" } } }, "type": "team_hierarchy_links" } } EOF ``` ##### Create a team hierarchy link returns "OK" response ``` // Create a team hierarchy link returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") // there is a valid "dd_team_2" in the system DdTeam2DataID := os.Getenv("DD_TEAM_2_DATA_ID") body := datadogV2.TeamHierarchyLinkCreateRequest{ Data: datadogV2.TeamHierarchyLinkCreate{ Relationships: datadogV2.TeamHierarchyLinkCreateRelationships{ ParentTeam: datadogV2.TeamHierarchyLinkCreateTeamRelationship{ Data: datadogV2.TeamHierarchyLinkCreateTeam{ Id: DdTeamDataID, Type: datadogV2.TEAMTYPE_TEAM, }, }, SubTeam: datadogV2.TeamHierarchyLinkCreateTeamRelationship{ Data: datadogV2.TeamHierarchyLinkCreateTeam{ Id: DdTeam2DataID, Type: datadogV2.TEAMTYPE_TEAM, }, }, }, Type: datadogV2.TEAMHIERARCHYLINKTYPE_TEAM_HIERARCHY_LINKS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.AddTeamHierarchyLink(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.AddTeamHierarchyLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.AddTeamHierarchyLink`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a team hierarchy link returns "OK" response ``` // Create a team hierarchy link returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import
com.datadog.api.client.v2.model.TeamHierarchyLinkCreate; import com.datadog.api.client.v2.model.TeamHierarchyLinkCreateRelationships; import com.datadog.api.client.v2.model.TeamHierarchyLinkCreateRequest; import com.datadog.api.client.v2.model.TeamHierarchyLinkCreateTeam; import com.datadog.api.client.v2.model.TeamHierarchyLinkCreateTeamRelationship; import com.datadog.api.client.v2.model.TeamHierarchyLinkResponse; import com.datadog.api.client.v2.model.TeamHierarchyLinkType; import com.datadog.api.client.v2.model.TeamType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); // there is a valid "dd_team_2" in the system String DD_TEAM_2_DATA_ID = System.getenv("DD_TEAM_2_DATA_ID"); TeamHierarchyLinkCreateRequest body = new TeamHierarchyLinkCreateRequest() .data( new TeamHierarchyLinkCreate() .relationships( new TeamHierarchyLinkCreateRelationships() .parentTeam( new TeamHierarchyLinkCreateTeamRelationship() .data( new TeamHierarchyLinkCreateTeam() .id(DD_TEAM_DATA_ID) .type(TeamType.TEAM))) .subTeam( new TeamHierarchyLinkCreateTeamRelationship() .data( new TeamHierarchyLinkCreateTeam() .id(DD_TEAM_2_DATA_ID) .type(TeamType.TEAM)))) .type(TeamHierarchyLinkType.TEAM_HIERARCHY_LINKS)); try { TeamHierarchyLinkResponse result = apiInstance.addTeamHierarchyLink(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#addTeamHierarchyLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a team hierarchy link returns "OK" response ``` """ Create a team hierarchy link returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_hierarchy_link_create import TeamHierarchyLinkCreate from datadog_api_client.v2.model.team_hierarchy_link_create_relationships import TeamHierarchyLinkCreateRelationships from datadog_api_client.v2.model.team_hierarchy_link_create_request import TeamHierarchyLinkCreateRequest from datadog_api_client.v2.model.team_hierarchy_link_create_team import TeamHierarchyLinkCreateTeam from datadog_api_client.v2.model.team_hierarchy_link_create_team_relationship import ( TeamHierarchyLinkCreateTeamRelationship, ) from datadog_api_client.v2.model.team_hierarchy_link_type import TeamHierarchyLinkType from datadog_api_client.v2.model.team_type import TeamType # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] # there is a valid "dd_team_2" in the system DD_TEAM_2_DATA_ID = environ["DD_TEAM_2_DATA_ID"] body = TeamHierarchyLinkCreateRequest( data=TeamHierarchyLinkCreate( relationships=TeamHierarchyLinkCreateRelationships( parent_team=TeamHierarchyLinkCreateTeamRelationship( data=TeamHierarchyLinkCreateTeam( 
id=DD_TEAM_DATA_ID, type=TeamType.TEAM, ), ), sub_team=TeamHierarchyLinkCreateTeamRelationship( data=TeamHierarchyLinkCreateTeam( id=DD_TEAM_2_DATA_ID, type=TeamType.TEAM, ), ), ), type=TeamHierarchyLinkType.TEAM_HIERARCHY_LINKS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.add_team_hierarchy_link(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a team hierarchy link returns "OK" response ``` # Create a team hierarchy link returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] # there is a valid "dd_team_2" in the system DD_TEAM_2_DATA_ID = ENV["DD_TEAM_2_DATA_ID"] body = DatadogAPIClient::V2::TeamHierarchyLinkCreateRequest.new({ data: DatadogAPIClient::V2::TeamHierarchyLinkCreate.new({ relationships: DatadogAPIClient::V2::TeamHierarchyLinkCreateRelationships.new({ parent_team: DatadogAPIClient::V2::TeamHierarchyLinkCreateTeamRelationship.new({ data: DatadogAPIClient::V2::TeamHierarchyLinkCreateTeam.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::TeamType::TEAM, }), }), sub_team: DatadogAPIClient::V2::TeamHierarchyLinkCreateTeamRelationship.new({ data: DatadogAPIClient::V2::TeamHierarchyLinkCreateTeam.new({ id: DD_TEAM_2_DATA_ID, type: DatadogAPIClient::V2::TeamType::TEAM, }), }), }), type: DatadogAPIClient::V2::TeamHierarchyLinkType::TEAM_HIERARCHY_LINKS, }), }) p api_instance.add_team_hierarchy_link(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a team hierarchy link returns "OK" response ``` // Create a team hierarchy link returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamHierarchyLinkCreate; use datadog_api_client::datadogV2::model::TeamHierarchyLinkCreateRelationships; use datadog_api_client::datadogV2::model::TeamHierarchyLinkCreateRequest; use datadog_api_client::datadogV2::model::TeamHierarchyLinkCreateTeam; use datadog_api_client::datadogV2::model::TeamHierarchyLinkCreateTeamRelationship; use datadog_api_client::datadogV2::model::TeamHierarchyLinkType; use datadog_api_client::datadogV2::model::TeamType; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); // there is a valid "dd_team_2" in the system let dd_team_2_data_id = std::env::var("DD_TEAM_2_DATA_ID").unwrap(); let body = TeamHierarchyLinkCreateRequest::new(TeamHierarchyLinkCreate::new( TeamHierarchyLinkCreateRelationships::new( TeamHierarchyLinkCreateTeamRelationship::new(TeamHierarchyLinkCreateTeam::new( dd_team_data_id.clone(), TeamType::TEAM, )), 
TeamHierarchyLinkCreateTeamRelationship::new(TeamHierarchyLinkCreateTeam::new( dd_team_2_data_id.clone(), TeamType::TEAM, )), ), TeamHierarchyLinkType::TEAM_HIERARCHY_LINKS, )); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api.add_team_hierarchy_link(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a team hierarchy link returns "OK" response ``` /** * Create a team hierarchy link returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; // there is a valid "dd_team_2" in the system const DD_TEAM_2_DATA_ID = process.env.DD_TEAM_2_DATA_ID as string; const params: v2.TeamsApiAddTeamHierarchyLinkRequest = { body: { data: { relationships: { parentTeam: { data: { id: DD_TEAM_DATA_ID, type: "team", }, }, subTeam: { data: { id: DD_TEAM_2_DATA_ID, type: "team", }, }, }, type: "team_hierarchy_links", }, }, }; apiInstance .addTeamHierarchyLink(params) .then((data: v2.TeamHierarchyLinkResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a member team](https://docs.datadoghq.com/api/latest/teams/#remove-a-member-team) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/teams/#remove-a-member-team-v2) **Note** : This endpoint is in Preview. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). DELETE https://api.ap1.datadoghq.com/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.ap2.datadoghq.com/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.datadoghq.eu/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.ddog-gov.com/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.datadoghq.com/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.us3.datadoghq.com/api/v2/team/{super_team_id}/member_teams/{member_team_id}https://api.us5.datadoghq.com/api/v2/team/{super_team_id}/member_teams/{member_team_id} ### Overview Remove a super team’s member team identified by `member_team_id`. **Note** : This API is deprecated. For deleting team hierarchy links, use the team hierarchy links API: `DELETE /api/v2/team-hierarchy-links/{link_id}`. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. 
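Because this endpoint is deprecated, the equivalent operation on the replacement API is a single DELETE against the hierarchy link itself. A minimal curl sketch (the `link_id` value is a placeholder; assumes the default `datadoghq.com` site — see the full "Remove a team hierarchy link" section below for the complete reference):

```
# Sketch: delete the parent/sub-team relationship through the replacement
# team hierarchy links endpoint instead of the deprecated member_teams path.
export link_id="CHANGE_ME"

curl -X DELETE "https://api.datadoghq.com/api/v2/team-hierarchy-links/${link_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```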
### Arguments #### Path Parameters Name Type Description super_team_id [_required_] string None member_team_id [_required_] string None ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#RemoveMemberTeam-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#RemoveMemberTeam-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#RemoveMemberTeam-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#RemoveMemberTeam-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Remove a member team Copy ``` # Path parameters export super_team_id="CHANGE_ME" export member_team_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X DELETE "https://api.datadoghq.com/api/v2/team/${super_team_id}/member_teams/${member_team_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a member team ``` """ Remove a member team returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi configuration = Configuration() configuration.unstable_operations["remove_member_team"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.remove_member_team( super_team_id="super_team_id", member_team_id="member_team_id", ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a member team ``` # Remove a member team returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.remove_member_team".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new api_instance.remove_member_team("super_team_id", "member_team_id") ``` Copy #### Instructions First [install the library
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Remove a member team ``` // Remove a member team returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.RemoveMemberTeam", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.RemoveMemberTeam(ctx, "super_team_id", "member_team_id") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.RemoveMemberTeam`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a member team ``` // Remove a member team returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.removeMemberTeam", true); TeamsApi apiInstance = new TeamsApi(defaultClient); try { apiInstance.removeMemberTeam("super_team_id", "member_team_id"); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#removeMemberTeam"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a member team ``` // Remove a member team returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.RemoveMemberTeam", true); let api = TeamsAPI::with_config(configuration); let resp = api .remove_member_team("super_team_id".to_string(), "member_team_id".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a member team ``` /** * Remove a member team returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.removeMemberTeam"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiRemoveMemberTeamRequest = { superTeamId: "super_team_id", memberTeamId: "member_team_id", }; apiInstance .removeMemberTeam(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Remove a team hierarchy link](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-hierarchy-link) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#remove-a-team-hierarchy-link-v2) DELETE https://api.ap1.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.ap2.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.datadoghq.eu/api/v2/team-hierarchy-links/{link_id}https://api.ddog-gov.com/api/v2/team-hierarchy-links/{link_id}https://api.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.us3.datadoghq.com/api/v2/team-hierarchy-links/{link_id}https://api.us5.datadoghq.com/api/v2/team-hierarchy-links/{link_id} ### Overview Remove a team hierarchy link by the given link_id. This endpoint requires all of the following permissions: * `teams_read` * `teams_manage` OAuth apps require the `teams_read, teams_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description link_id [_required_] string The team hierarchy link’s identifier ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#RemoveTeamHierarchyLink-204-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#RemoveTeamHierarchyLink-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#RemoveTeamHierarchyLink-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#RemoveTeamHierarchyLink-429-v2) No Content Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Remove a team hierarchy link Copy ``` # Path parameters export link_id="CHANGE_ME" # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X DELETE "https://api.datadoghq.com/api/v2/team-hierarchy-links/${link_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Remove a team hierarchy link ``` """ Remove a team hierarchy link returns "No Content" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_ID = environ["TEAM_HIERARCHY_LINK_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.remove_team_hierarchy_link( link_id=TEAM_HIERARCHY_LINK_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Remove a team hierarchy link ``` # Remove a team hierarchy link returns "No Content" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "team_hierarchy_link" in the system TEAM_HIERARCHY_LINK_DATA_ID = ENV["TEAM_HIERARCHY_LINK_DATA_ID"] api_instance.remove_team_hierarchy_link(TEAM_HIERARCHY_LINK_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Remove a team hierarchy link ``` // Remove a team hierarchy link returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "team_hierarchy_link" in the system TeamHierarchyLinkDataID := os.Getenv("TEAM_HIERARCHY_LINK_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.RemoveTeamHierarchyLink(ctx, TeamHierarchyLinkDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.RemoveTeamHierarchyLink`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Remove a team hierarchy link ``` // Remove a team hierarchy link returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "team_hierarchy_link" in the system String TEAM_HIERARCHY_LINK_DATA_ID = System.getenv("TEAM_HIERARCHY_LINK_DATA_ID"); try { apiInstance.removeTeamHierarchyLink(TEAM_HIERARCHY_LINK_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#removeTeamHierarchyLink"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Remove a team hierarchy link ``` // Remove a team hierarchy link returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { // there is a valid "team_hierarchy_link" in the system let team_hierarchy_link_data_id = std::env::var("TEAM_HIERARCHY_LINK_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = TeamsAPI::with_config(configuration); let resp = api .remove_team_hierarchy_link(team_hierarchy_link_data_id.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Remove a team hierarchy link ``` /** * Remove a team hierarchy link returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.TeamsApi(configuration); // there is a valid "team_hierarchy_link" in the system const TEAM_HIERARCHY_LINK_DATA_ID = process.env .TEAM_HIERARCHY_LINK_DATA_ID as string; const params: v2.TeamsApiRemoveTeamHierarchyLinkRequest = { linkId: TEAM_HIERARCHY_LINK_DATA_ID, }; apiInstance .removeTeamHierarchyLink(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List team connections](https://docs.datadoghq.com/api/latest/teams/#list-team-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#list-team-connections-v2) GET https://api.ap1.datadoghq.com/api/v2/team/connectionshttps://api.ap2.datadoghq.com/api/v2/team/connectionshttps://api.datadoghq.eu/api/v2/team/connectionshttps://api.ddog-gov.com/api/v2/team/connectionshttps://api.datadoghq.com/api/v2/team/connectionshttps://api.us3.datadoghq.com/api/v2/team/connectionshttps://api.us5.datadoghq.com/api/v2/team/connections ### Overview Returns all team connections. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. filter[sources] array Filter team connections by external source systems. filter[team_ids] array Filter team connections by Datadog team IDs. filter[connected_team_ids] array Filter team connections by connected team IDs from external systems. filter[connection_ids] array Filter team connections by connection IDs. ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#ListTeamConnections-200-v2) * [400](https://docs.datadoghq.com/api/latest/teams/#ListTeamConnections-400-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#ListTeamConnections-403-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#ListTeamConnections-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response containing information about multiple team connections. Field Type Description data [object] Array of team connections. attributes object Attributes of the team connection. managed_by string The entity that manages this team connection. source string The name of the external source. id [_required_] string The unique identifier of the team connection. relationships object Relationships of the team connection. connected_team object Reference to a team from an external system. data object Reference to connected external team. id [_required_] string The connected team ID as it is referenced throughout the Datadog ecosystem. type [_required_] enum External team resource type. Allowed enum values: `github_team` default: `github_team` team object Reference to a Datadog team. data object Reference to a Datadog team. id [_required_] string The Datadog team ID. type [_required_] enum Datadog team resource type. Allowed enum values: `team` default: `team` type [_required_] enum Team connection resource type. Allowed enum values: `team_connection` default: `team_connection` meta object Connections response metadata. page object Page-based pagination metadata. first_number int64 The first page number. last_number int64 The last page number. next_number int64 The next page number. number int64 The current page number. 
prev_number int64 The previous page number. size int64 The page size. total int64 Total connections matching request. type string Pagination type. ``` { "data": [ { "attributes": { "managed_by": "github_sync", "source": "github" }, "id": "12345678-1234-5678-9abc-123456789012", "relationships": { "connected_team": { "data": { "id": "@GitHubOrg/team-handle", "type": "github_team" } }, "team": { "data": { "id": "87654321-4321-8765-dcba-210987654321", "type": "team" } } }, "type": "team_connection" } ], "meta": { "page": { "first_number": "integer", "last_number": "integer", "next_number": "integer", "number": "integer", "prev_number": "integer", "size": "integer", "total": "integer", "type": "number_size" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### List team connections Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/connections" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List team connections ``` """ List team connections returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi configuration = Configuration() configuration.unstable_operations["list_team_connections"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.list_team_connections() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List team connections ``` # List team connections returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.list_team_connections".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new p 
api_instance.list_team_connections() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List team connections ``` // List team connections returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.ListTeamConnections", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.ListTeamConnections(ctx, *datadogV2.NewListTeamConnectionsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.ListTeamConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.ListTeamConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List team connections ``` // List team connections returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamConnectionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.listTeamConnections", true); TeamsApi apiInstance = new TeamsApi(defaultClient); try { TeamConnectionsResponse result = apiInstance.listTeamConnections(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#listTeamConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List team connections ``` // List team connections returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::ListTeamConnectionsOptionalParams; use datadog_api_client::datadogV2::api_teams::TeamsAPI; #[tokio::main] async fn main() { let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.ListTeamConnections", true); let api = TeamsAPI::with_config(configuration); let resp = api 
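// list_team_connections must be enabled as an unstable operation (see set_unstable_operation_enabled above) before it can be called.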
.list_team_connections(ListTeamConnectionsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List team connections ``` /** * List team connections returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.listTeamConnections"] = true; const apiInstance = new v2.TeamsApi(configuration); apiInstance .listTeamConnections() .then((data: v2.TeamConnectionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create team connections](https://docs.datadoghq.com/api/latest/teams/#create-team-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#create-team-connections-v2) POST https://api.ap1.datadoghq.com/api/v2/team/connectionshttps://api.ap2.datadoghq.com/api/v2/team/connectionshttps://api.datadoghq.eu/api/v2/team/connectionshttps://api.ddog-gov.com/api/v2/team/connectionshttps://api.datadoghq.com/api/v2/team/connectionshttps://api.us3.datadoghq.com/api/v2/team/connectionshttps://api.us5.datadoghq.com/api/v2/team/connections ### Overview Create multiple team connections. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] [object] Array of team connections to create. attributes object Attributes of the team connection. managed_by string The entity that manages this team connection. source string The name of the external source. relationships object Relationships of the team connection. connected_team object Reference to a team from an external system. data object Reference to connected external team. id [_required_] string The connected team ID as it is referenced throughout the Datadog ecosystem. type [_required_] enum External team resource type. Allowed enum values: `github_team` default: `github_team` team object Reference to a Datadog team. data object Reference to a Datadog team. id [_required_] string The Datadog team ID. type [_required_] enum Datadog team resource type. Allowed enum values: `team` default: `team` type [_required_] enum Team connection resource type. 
Allowed enum values: `team_connection` default: `team_connection` ``` { "data": [ { "attributes": { "managed_by": "github_sync", "source": "github" }, "relationships": { "connected_team": { "data": { "id": "@GitHubOrg/team-handle", "type": "github_team" } }, "team": { "data": { "id": "87654321-4321-8765-dcba-210987654321", "type": "team" } } }, "type": "team_connection" } ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/teams/#CreateTeamConnections-201-v2) * [400](https://docs.datadoghq.com/api/latest/teams/#CreateTeamConnections-400-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#CreateTeamConnections-403-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#CreateTeamConnections-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#CreateTeamConnections-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Response containing information about multiple team connections. Field Type Description data [object] Array of team connections. attributes object Attributes of the team connection. managed_by string The entity that manages this team connection. source string The name of the external source. id [_required_] string The unique identifier of the team connection. relationships object Relationships of the team connection. connected_team object Reference to a team from an external system. data object Reference to connected external team. id [_required_] string The connected team ID as it is referenced throughout the Datadog ecosystem. type [_required_] enum External team resource type. Allowed enum values: `github_team` default: `github_team` team object Reference to a Datadog team. data object Reference to a Datadog team. id [_required_] string The Datadog team ID. type [_required_] enum Datadog team resource type. Allowed enum values: `team` default: `team` type [_required_] enum Team connection resource type. Allowed enum values: `team_connection` default: `team_connection` meta object Connections response metadata. page object Page-based pagination metadata. first_number int64 The first page number. last_number int64 The last page number. next_number int64 The next page number. number int64 The current page number. prev_number int64 The previous page number. size int64 The page size. total int64 Total connections matching request. type string Pagination type. ``` { "data": [ { "attributes": { "managed_by": "github_sync", "source": "github" }, "id": "12345678-1234-5678-9abc-123456789012", "relationships": { "connected_team": { "data": { "id": "@GitHubOrg/team-handle", "type": "github_team" } }, "team": { "data": { "id": "87654321-4321-8765-dcba-210987654321", "type": "team" } } }, "type": "team_connection" } ], "meta": { "page": { "first_number": "integer", "last_number": "integer", "next_number": "integer", "number": "integer", "prev_number": "integer", "size": "integer", "total": "integer", "type": "number_size" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Conflict * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Create team connections Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/connections" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "relationships": { "connected_team": { "data": { "id": "@GitHubOrg/team-handle", "type": "github_team" } }, "team": { "data": { "id": "87654321-4321-8765-dcba-210987654321", "type": "team" } } }, "type": "team_connection" } ] } EOF ``` ##### Create team connections ``` """ Create team connections returns "Created" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.connected_team_ref import ConnectedTeamRef from datadog_api_client.v2.model.connected_team_ref_data import ConnectedTeamRefData from datadog_api_client.v2.model.connected_team_ref_data_type import ConnectedTeamRefDataType from datadog_api_client.v2.model.team_connection_attributes import TeamConnectionAttributes from datadog_api_client.v2.model.team_connection_create_data import TeamConnectionCreateData from datadog_api_client.v2.model.team_connection_create_request import TeamConnectionCreateRequest from datadog_api_client.v2.model.team_connection_relationships import TeamConnectionRelationships from datadog_api_client.v2.model.team_connection_type import TeamConnectionType from datadog_api_client.v2.model.team_ref import TeamRef from datadog_api_client.v2.model.team_ref_data import TeamRefData from datadog_api_client.v2.model.team_ref_data_type import TeamRefDataType # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = environ["DD_TEAM_DATA_ID"] body = TeamConnectionCreateRequest( data=[ TeamConnectionCreateData( type=TeamConnectionType.TEAM_CONNECTION, attributes=TeamConnectionAttributes( source="github", managed_by="datadog", ), relationships=TeamConnectionRelationships( team=TeamRef( data=TeamRefData( id=DD_TEAM_DATA_ID, type=TeamRefDataType.TEAM, ), ), connected_team=ConnectedTeamRef( data=ConnectedTeamRefData( id="@MyGitHubAccount/my-team-name", type=ConnectedTeamRefDataType.GITHUB_TEAM, ), ), ), ), ], ) configuration = Configuration() 
configuration.unstable_operations["create_team_connections"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) response = api_instance.create_team_connections(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create team connections ``` # Create team connections returns "Created" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.create_team_connections".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new # there is a valid "dd_team" in the system DD_TEAM_DATA_ID = ENV["DD_TEAM_DATA_ID"] body = DatadogAPIClient::V2::TeamConnectionCreateRequest.new({ data: [ DatadogAPIClient::V2::TeamConnectionCreateData.new({ type: DatadogAPIClient::V2::TeamConnectionType::TEAM_CONNECTION, attributes: DatadogAPIClient::V2::TeamConnectionAttributes.new({ source: "github", managed_by: "datadog", }), relationships: DatadogAPIClient::V2::TeamConnectionRelationships.new({ team: DatadogAPIClient::V2::TeamRef.new({ data: DatadogAPIClient::V2::TeamRefData.new({ id: DD_TEAM_DATA_ID, type: DatadogAPIClient::V2::TeamRefDataType::TEAM, }), }), connected_team: DatadogAPIClient::V2::ConnectedTeamRef.new({ data: DatadogAPIClient::V2::ConnectedTeamRefData.new({ id: "@MyGitHubAccount/my-team-name", type: DatadogAPIClient::V2::ConnectedTeamRefDataType::GITHUB_TEAM, }), }), }), }), ], }) p api_instance.create_team_connections(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create team connections ``` // Create team connections returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "dd_team" in the system DdTeamDataID := os.Getenv("DD_TEAM_DATA_ID") body := datadogV2.TeamConnectionCreateRequest{ Data: []datadogV2.TeamConnectionCreateData{ { Type: datadogV2.TEAMCONNECTIONTYPE_TEAM_CONNECTION, Attributes: &datadogV2.TeamConnectionAttributes{ Source: datadog.PtrString("github"), ManagedBy: datadog.PtrString("datadog"), }, Relationships: &datadogV2.TeamConnectionRelationships{ Team: &datadogV2.TeamRef{ Data: &datadogV2.TeamRefData{ Id: DdTeamDataID, Type: datadogV2.TEAMREFDATATYPE_TEAM, }, }, ConnectedTeam: &datadogV2.ConnectedTeamRef{ Data: &datadogV2.ConnectedTeamRefData{ Id: "@MyGitHubAccount/my-team-name", Type: datadogV2.CONNECTEDTEAMREFDATATYPE_GITHUB_TEAM, }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.CreateTeamConnections", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) resp, r, err := api.CreateTeamConnections(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`TeamsApi.CreateTeamConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TeamsApi.CreateTeamConnections`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create team connections ``` // Create team connections returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.ConnectedTeamRef; import com.datadog.api.client.v2.model.ConnectedTeamRefData; import com.datadog.api.client.v2.model.ConnectedTeamRefDataType; import com.datadog.api.client.v2.model.TeamConnectionAttributes; import com.datadog.api.client.v2.model.TeamConnectionCreateData; import com.datadog.api.client.v2.model.TeamConnectionCreateRequest; import com.datadog.api.client.v2.model.TeamConnectionRelationships; import com.datadog.api.client.v2.model.TeamConnectionType; import com.datadog.api.client.v2.model.TeamConnectionsResponse; import com.datadog.api.client.v2.model.TeamRef; import com.datadog.api.client.v2.model.TeamRefData; import com.datadog.api.client.v2.model.TeamRefDataType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.createTeamConnections", true); TeamsApi apiInstance = new TeamsApi(defaultClient); // there is a valid "dd_team" in the system String DD_TEAM_DATA_ID = System.getenv("DD_TEAM_DATA_ID"); TeamConnectionCreateRequest body = new TeamConnectionCreateRequest() .data( Collections.singletonList( new TeamConnectionCreateData() .type(TeamConnectionType.TEAM_CONNECTION) .attributes( new TeamConnectionAttributes().source("github").managedBy("datadog")) .relationships( new TeamConnectionRelationships() .team( new TeamRef() .data( new TeamRefData() .id(DD_TEAM_DATA_ID) .type(TeamRefDataType.TEAM))) .connectedTeam( new ConnectedTeamRef() .data( new ConnectedTeamRefData() .id("@MyGitHubAccount/my-team-name") .type(ConnectedTeamRefDataType.GITHUB_TEAM)))))); try { TeamConnectionsResponse result = apiInstance.createTeamConnections(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#createTeamConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create team connections ``` // Create team connections returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::ConnectedTeamRef; use 
datadog_api_client::datadogV2::model::ConnectedTeamRefData; use datadog_api_client::datadogV2::model::ConnectedTeamRefDataType; use datadog_api_client::datadogV2::model::TeamConnectionAttributes; use datadog_api_client::datadogV2::model::TeamConnectionCreateData; use datadog_api_client::datadogV2::model::TeamConnectionCreateRequest; use datadog_api_client::datadogV2::model::TeamConnectionRelationships; use datadog_api_client::datadogV2::model::TeamConnectionType; use datadog_api_client::datadogV2::model::TeamRef; use datadog_api_client::datadogV2::model::TeamRefData; use datadog_api_client::datadogV2::model::TeamRefDataType; #[tokio::main] async fn main() { // there is a valid "dd_team" in the system let dd_team_data_id = std::env::var("DD_TEAM_DATA_ID").unwrap(); let body = TeamConnectionCreateRequest::new(vec![TeamConnectionCreateData::new( TeamConnectionType::TEAM_CONNECTION, ) .attributes( TeamConnectionAttributes::new() .managed_by("datadog".to_string()) .source("github".to_string()), ) .relationships( TeamConnectionRelationships::new() .connected_team(ConnectedTeamRef::new().data(ConnectedTeamRefData::new( "@MyGitHubAccount/my-team-name".to_string(), ConnectedTeamRefDataType::GITHUB_TEAM, ))) .team(TeamRef::new().data(TeamRefData::new( dd_team_data_id.clone(), TeamRefDataType::TEAM, ))), )]); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.CreateTeamConnections", true); let api = TeamsAPI::with_config(configuration); let resp = api.create_team_connections(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create team connections ``` /** * Create team connections returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.createTeamConnections"] = true; const apiInstance = new v2.TeamsApi(configuration); // there is a valid "dd_team" in the system const DD_TEAM_DATA_ID = process.env.DD_TEAM_DATA_ID as string; const params: v2.TeamsApiCreateTeamConnectionsRequest = { body: { data: [ { type: "team_connection", attributes: { source: "github", managedBy: "datadog", }, relationships: { team: { data: { id: DD_TEAM_DATA_ID, type: "team", }, }, connectedTeam: { data: { id: "@MyGitHubAccount/my-team-name", type: "github_team", }, }, }, }, ], }, }; apiInstance .createTeamConnections(params) .then((data: v2.TeamConnectionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete team connections](https://docs.datadoghq.com/api/latest/teams/#delete-team-connections) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#delete-team-connections-v2) DELETE https://api.ap1.datadoghq.com/api/v2/team/connectionshttps://api.ap2.datadoghq.com/api/v2/team/connectionshttps://api.datadoghq.eu/api/v2/team/connectionshttps://api.ddog-gov.com/api/v2/team/connectionshttps://api.datadoghq.com/api/v2/team/connectionshttps://api.us3.datadoghq.com/api/v2/team/connectionshttps://api.us5.datadoghq.com/api/v2/team/connections ### Overview Delete multiple team connections. This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] [object] Array of team connection IDs to delete. id [_required_] string The unique identifier of the team connection to delete. type [_required_] enum Team connection resource type. Allowed enum values: `team_connection` default: `team_connection` ``` { "data": [ { "id": "12345678-1234-5678-9abc-123456789012", "type": "team_connection" } ] } ``` Copy ### Response * [204](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamConnections-204-v2) * [400](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamConnections-400-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamConnections-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamConnections-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamConnections-429-v2) No Content Bad Request * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/teams/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/teams/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/teams/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/teams/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/teams/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/teams/?code-lang=typescript) ##### Delete team connections Copy ``` # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/connections" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "id": "12345678-1234-5678-9abc-123456789012", "type": "team_connection" } ] } EOF ``` ##### Delete team connections ``` """ Delete team connections returns "No Content" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.teams_api import TeamsApi from datadog_api_client.v2.model.team_connection_delete_request import TeamConnectionDeleteRequest from datadog_api_client.v2.model.team_connection_delete_request_data_item import TeamConnectionDeleteRequestDataItem from datadog_api_client.v2.model.team_connection_type import TeamConnectionType body = TeamConnectionDeleteRequest( data=[ TeamConnectionDeleteRequestDataItem( id="12345678-1234-5678-9abc-123456789012", type=TeamConnectionType.TEAM_CONNECTION, ), ], ) configuration = Configuration() configuration.unstable_operations["delete_team_connections"] = True with ApiClient(configuration) as api_client: api_instance = TeamsApi(api_client) api_instance.delete_team_connections(body=body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete team connections ``` # Delete team connections returns "No Content" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.delete_team_connections".to_sym] = true end api_instance = DatadogAPIClient::V2::TeamsAPI.new body = DatadogAPIClient::V2::TeamConnectionDeleteRequest.new({ data: [ DatadogAPIClient::V2::TeamConnectionDeleteRequestDataItem.new({ id: "12345678-1234-5678-9abc-123456789012", type: DatadogAPIClient::V2::TeamConnectionType::TEAM_CONNECTION, }), ], }) api_instance.delete_team_connections(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete team connections ``` // Delete team connections returns "No Content" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.TeamConnectionDeleteRequest{ Data: []datadogV2.TeamConnectionDeleteRequestDataItem{ { Id: "12345678-1234-5678-9abc-123456789012", Type: datadogV2.TEAMCONNECTIONTYPE_TEAM_CONNECTION, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.DeleteTeamConnections", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTeamsApi(apiClient) r, err := api.DeleteTeamConnections(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TeamsApi.DeleteTeamConnections`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete team connections ``` // Delete team connections returns "No Content" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TeamsApi; import com.datadog.api.client.v2.model.TeamConnectionDeleteRequest; import com.datadog.api.client.v2.model.TeamConnectionDeleteRequestDataItem; import com.datadog.api.client.v2.model.TeamConnectionType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.deleteTeamConnections", true); TeamsApi apiInstance = new TeamsApi(defaultClient); TeamConnectionDeleteRequest body = new TeamConnectionDeleteRequest() .data( Collections.singletonList( new TeamConnectionDeleteRequestDataItem() .id("12345678-1234-5678-9abc-123456789012") .type(TeamConnectionType.TEAM_CONNECTION))); try { apiInstance.deleteTeamConnections(body); } catch (ApiException e) { System.err.println("Exception when calling TeamsApi#deleteTeamConnections"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete team connections ``` // Delete team connections returns "No Content" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_teams::TeamsAPI; use datadog_api_client::datadogV2::model::TeamConnectionDeleteRequest; use datadog_api_client::datadogV2::model::TeamConnectionDeleteRequestDataItem; use datadog_api_client::datadogV2::model::TeamConnectionType; #[tokio::main] async fn main() { let body = TeamConnectionDeleteRequest::new(vec![TeamConnectionDeleteRequestDataItem::new( "12345678-1234-5678-9abc-123456789012".to_string(), TeamConnectionType::TEAM_CONNECTION, )]); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.DeleteTeamConnections", true); let api = 
TeamsAPI::with_config(configuration); let resp = api.delete_team_connections(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete team connections ``` /** * Delete team connections returns "No Content" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); configuration.unstableOperations["v2.deleteTeamConnections"] = true; const apiInstance = new v2.TeamsApi(configuration); const params: v2.TeamsApiDeleteTeamConnectionsRequest = { body: { data: [ { id: "12345678-1234-5678-9abc-123456789012", type: "team_connection", }, ], }, }; apiInstance .deleteTeamConnections(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get team notification rules](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rules) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rules-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.datadoghq.eu/api/v2/team/{team_id}/notification-ruleshttps://api.ddog-gov.com/api/v2/team/{team_id}/notification-ruleshttps://api.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/notification-rules ### Overview This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. 
### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRules-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRules-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRules-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRules-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team notification rules response Field Type Description data [object] Team notification rules response data attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules` meta object Metadata that is included in the response when querying the team notification rules page object Metadata related to paging information that is included in the response when querying the team notification rules first_offset int64 The first offset. last_offset int64 The last offset. limit int64 Pagination limit. next_offset int64 The next offset. offset int64 The offset. prev_offset int64 The previous offset. total int64 Total results. type string Offset type. ``` { "data": [ { "attributes": { "email": { "enabled": false }, "ms_teams": { "connector_name": "string" }, "pagerduty": { "service_name": "string" }, "slack": { "channel": "string", "workspace": "string" } }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_notification_rules" } ], "meta": { "page": { "first_offset": "integer", "last_offset": "integer", "limit": "integer", "next_offset": "integer", "offset": "integer", "prev_offset": "integer", "total": "integer", "type": "string" } } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) ##### Get team notification rules Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/notification-rules" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` * * * ## [Create team notification rule](https://docs.datadoghq.com/api/latest/teams/#create-team-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#create-team-notification-rule-v2) POST https://api.ap1.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.ap2.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.datadoghq.eu/api/v2/team/{team_id}/notification-ruleshttps://api.ddog-gov.com/api/v2/team/{team_id}/notification-ruleshttps://api.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.us3.datadoghq.com/api/v2/team/{team_id}/notification-ruleshttps://api.us5.datadoghq.com/api/v2/team/{team_id}/notification-rules ### Overview This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Field Type Description data [_required_] object Team notification rule attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules` ``` { "data": { "type": "team_notification_rules", "attributes": { "email": { "enabled": true }, "slack": { "workspace": "Datadog", "channel": "aaa-omg-ops" } } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/teams/#CreateTeamNotificationRule-201-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#CreateTeamNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#CreateTeamNotificationRule-404-v2) * [409](https://docs.datadoghq.com/api/latest/teams/#CreateTeamNotificationRule-409-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#CreateTeamNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team notification rule response Field Type Description data object Team notification rule attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams 
notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules` ``` { "data": { "attributes": { "email": { "enabled": false }, "ms_teams": { "connector_name": "string" }, "pagerduty": { "service_name": "string" }, "slack": { "channel": "string", "workspace": "string" } }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_notification_rules" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl) ##### Create team notification rule returns "OK" response Copy ``` # Path parameters export team_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/team/${team_id}/notification-rules" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "team_notification_rules", "attributes": { "email": { "enabled": true }, "slack": { "workspace": "Datadog", "channel": "aaa-omg-ops" } } } } EOF ``` * * * ## [Get team notification rule](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rule) * [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#get-team-notification-rule-v2) GET https://api.ap1.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id} ### Overview This endpoint requires the `teams_read` permission. 
OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint. ### Arguments #### Path Parameters Name Type Description team_id [_required_] string None rule_id [_required_] string None ### Response * [200](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRule-200-v2) * [403](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRule-403-v2) * [404](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRule-404-v2) * [429](https://docs.datadoghq.com/api/latest/teams/#GetTeamNotificationRule-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) Team notification rule response Field Type Description data object Team notification rule attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules` ``` { "data": { "attributes": { "email": { "enabled": false }, "ms_teams": { "connector_name": "string" }, "pagerduty": { "service_name": "string" }, "slack": { "channel": "string", "workspace": "string" } }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_notification_rules" } } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy API error response. * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/teams/) * [Example](https://docs.datadoghq.com/api/latest/teams/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
```
{ "errors": [ "Bad Request" ] }
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl)
##### Get team notification rule
```
# Path parameters
export team_id="CHANGE_ME"
export rule_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site; api.datadoghq.com is shown here)
curl -X GET "https://api.datadoghq.com/api/v2/team/${team_id}/notification-rules/${rule_id}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
* * *
## [Update team notification rule](https://docs.datadoghq.com/api/latest/teams/#update-team-notification-rule)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#update-team-notification-rule-v2)
PUT https://api.ap1.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}
### Overview
This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint.
### Arguments
#### Path Parameters
Name Type Description team_id [_required_] string None rule_id [_required_] string None
### Request
#### Body Data (required)
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
Field Type Description data [_required_] object Team notification rule attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules`
```
{ "data": { "type": "team_notification_rules", "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "attributes": { "pagerduty": { "service_name": "Datadog-prod" }, "slack": { "workspace": "Datadog", "channel": "aaa-governance-ops" } } } }
```
### Response
* [200](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamNotificationRule-200-v2)
* [403](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamNotificationRule-403-v2)
* [404](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamNotificationRule-404-v2)
* [429](https://docs.datadoghq.com/api/latest/teams/#UpdateTeamNotificationRule-429-v2)
OK
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
Team notification rule response Field Type Description data object Team notification rule attributes [_required_] object Team notification rule attributes email object Email notification settings for the team enabled boolean Flag indicating email notification ms_teams object MS Teams notification settings for the team connector_name string Handle for MS Teams pagerduty object PagerDuty notification settings for the team service_name string Service name for PagerDuty slack object Slack notification settings for the team channel string Channel for Slack notification workspace string Workspace for Slack notification id string The identifier of the team notification rule type [_required_] enum Team notification rule type Allowed enum values: `team_notification_rules` default: `team_notification_rules`
```
{ "data": { "attributes": { "email": { "enabled": false }, "ms_teams": { "connector_name": "string" }, "pagerduty": { "service_name": "string" }, "slack": { "channel": "string", "workspace": "string" } }, "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "type": "team_notification_rules" } }
```
Forbidden
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
Not Found
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
Too many requests
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl)
##### Update team notification rule returns "OK" response
```
# Path parameters
export team_id="CHANGE_ME"
export rule_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site; api.datadoghq.com is shown here)
curl -X PUT "https://api.datadoghq.com/api/v2/team/${team_id}/notification-rules/${rule_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "data": { "type": "team_notification_rules", "id": "b8626d7e-cedd-11eb-abf5-da7ad0900001", "attributes": { "pagerduty": { "service_name": "Datadog-prod" }, "slack": { "workspace": "Datadog", "channel": "aaa-governance-ops" } } } }
EOF
```
* * *
## [Delete team notification rule](https://docs.datadoghq.com/api/latest/teams/#delete-team-notification-rule)
* [v2 (latest)](https://docs.datadoghq.com/api/latest/teams/#delete-team-notification-rule-v2)
DELETE https://api.ap1.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ap2.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.eu/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.ddog-gov.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us3.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}https://api.us5.datadoghq.com/api/v2/team/{team_id}/notification-rules/{rule_id}
### Overview
This endpoint requires the `teams_read` permission. OAuth apps require the `teams_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#teams) to access this endpoint.
### Arguments
#### Path Parameters
Name Type Description team_id [_required_] string None rule_id [_required_] string None
### Response
* [204](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamNotificationRule-204-v2)
* [403](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamNotificationRule-403-v2)
* [404](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamNotificationRule-404-v2)
* [429](https://docs.datadoghq.com/api/latest/teams/#DeleteTeamNotificationRule-429-v2)
No Content
Forbidden
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
Not Found
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
Too many requests
* [Model](https://docs.datadoghq.com/api/latest/teams/)
* [Example](https://docs.datadoghq.com/api/latest/teams/)
API error response. Field Type Description errors [_required_] [string] A list of errors.
```
{ "errors": [ "Bad Request" ] }
```
### Code Example
* [Curl](https://docs.datadoghq.com/api/latest/teams/?code-lang=curl)
##### Delete team notification rule
```
# Path parameters
export team_id="CHANGE_ME"
export rule_id="CHANGE_ME"
# Curl command (use the API host for your Datadog site; api.datadoghq.com is shown here)
curl -X DELETE "https://api.datadoghq.com/api/v2/team/${team_id}/notification-rules/${rule_id}" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```
* * *
---
# Source: https://docs.datadoghq.com/api/latest/test-optimization
# Test Optimization
Search and manage flaky tests through Test Optimization. See the [Test Optimization page](https://docs.datadoghq.com/tests/) for more information.
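Before the endpoint reference below, here is a rough orientation for paging through search results. The search endpoint returns a cursor in `meta.pagination.next_page`, which is passed back as `page.cursor` on the next request. The sketch below uses the Python `requests` package against the raw HTTP API on the datadoghq.com site; the query, page size, helper name, and termination check are illustrative assumptions rather than part of the official examples that follow.
```
# Sketch: page through flaky test search results using the raw HTTP API.
# Assumes the `requests` package, DD_API_KEY / DD_APP_KEY in the environment,
# and the datadoghq.com site; adjust the host for other regions.
import os
import requests

URL = "https://api.datadoghq.com/api/v2/test/flaky-test-management/tests"
HEADERS = {
    "Accept": "application/json",
    "Content-Type": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

def search_flaky_tests(query="flaky_test_state:active", limit=25):
    """Yield flaky tests page by page, following meta.pagination.next_page."""
    cursor = None
    while True:
        page = {"limit": limit}
        if cursor:
            page["cursor"] = cursor
        body = {
            "data": {
                "type": "search_flaky_tests_request",
                "attributes": {"filter": {"query": query}, "page": page},
            }
        }
        resp = requests.post(URL, headers=HEADERS, json=body, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        tests = payload.get("data", [])
        yield from tests
        # Stop when no cursor comes back or a page is empty (assumed
        # termination condition; see the response model documented below).
        cursor = payload.get("meta", {}).get("pagination", {}).get("next_page")
        if not cursor or not tests:
            break

if __name__ == "__main__":
    for test in search_flaky_tests():
        attrs = test.get("attributes", {})
        print(test.get("id"), attrs.get("name"), attrs.get("flaky_state"))
```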
## [Search flaky tests](https://docs.datadoghq.com/api/latest/test-optimization/#search-flaky-tests) * [v2 (latest)](https://docs.datadoghq.com/api/latest/test-optimization/#search-flaky-tests-v2) **Note** : This endpoint is in preview and may be subject to change. If you have any feedback, contact [Datadog support](https://docs.datadoghq.com/help/). POST https://api.ap1.datadoghq.com/api/v2/test/flaky-test-management/testshttps://api.ap2.datadoghq.com/api/v2/test/flaky-test-management/testshttps://api.datadoghq.eu/api/v2/test/flaky-test-management/testshttps://api.ddog-gov.com/api/v2/test/flaky-test-management/testshttps://api.datadoghq.com/api/v2/test/flaky-test-management/testshttps://api.us3.datadoghq.com/api/v2/test/flaky-test-management/testshttps://api.us5.datadoghq.com/api/v2/test/flaky-test-management/tests ### Overview List endpoint returning flaky tests from Flaky Test Management. Results are paginated. This endpoint requires the `test_optimization_read` permission. OAuth apps require the `test_optimization_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#test-optimization) to access this endpoint. ### Request #### Body Data * [Model](https://docs.datadoghq.com/api/latest/test-optimization/) * [Example](https://docs.datadoghq.com/api/latest/test-optimization/) Field Type Description data object The JSON:API data for flaky tests search request. attributes object Attributes for the flaky tests search request. filter object Search filter settings. query string Search query following log syntax used to filter flaky tests, same as on Flaky Tests Management UI. The supported search keys are: * `flaky_test_state` * `flaky_test_category` * `@test.name` * `@test.suite` * `@test.module` * `@test.service` * `@git.repository.id_v2` * `@git.branch` * `@test.codeowners` * `env` default: `*` page object Pagination attributes for listing flaky tests. cursor string List following results with a cursor provided in the previous request. limit int64 Maximum number of flaky tests in the response. default: `10` sort enum Parameter for sorting flaky test results. The default sort is by ascending Fully Qualified Name (FQN). The FQN is the concatenation of the test module, suite, and name. Allowed enum values: `fqn,-fqn,first_flaked,-first_flaked,last_flaked,-last_flaked,failure_rate,-failure_rate,pipelines_failed,-pipelines_failed,pipelines_duration_lost,-pipelines_duration_lost` type enum The definition of `FlakyTestsSearchRequestDataType` object. Allowed enum values: `search_flaky_tests_request` ``` { "data": { "attributes": { "filter": { "query": "flaky_test_state:active @git.repository.id_v2:\"github.com/datadog/shopist\"" }, "page": { "cursor": "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", "limit": 25 }, "sort": "failure_rate" }, "type": "string" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/test-optimization/#SearchFlakyTests-200-v2) * [400](https://docs.datadoghq.com/api/latest/test-optimization/#SearchFlakyTests-400-v2) * [403](https://docs.datadoghq.com/api/latest/test-optimization/#SearchFlakyTests-403-v2) * [429](https://docs.datadoghq.com/api/latest/test-optimization/#SearchFlakyTests-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/test-optimization/) * [Example](https://docs.datadoghq.com/api/latest/test-optimization/) Response object with flaky tests matching the search request. Field Type Description data [object] Array of flaky tests matching the request. 
attributes object Attributes of a flaky test. attempt_to_fix_id string Unique identifier for the attempt to fix this flaky test. Use this ID in the Git commit message in order to trigger the attempt to fix workflow. When the workflow is triggered the test is automatically retried by the tracer a certain number of configurable times. When all retries pass, the test is automatically marked as fixed in Flaky Test Management. Test runs are tagged with @test.test_management.attempt_to_fix_passed and @test.test_management.is_attempt_to_fix when the attempt to fix workflow is triggered. codeowners [string] The name of the test's code owners as inferred from the repository configuration. envs [string] List of environments where this test has been flaky. first_flaked_branch string The branch name where the test exhibited flakiness for the first time. first_flaked_sha string The commit SHA where the test exhibited flakiness for the first time. first_flaked_ts int64 Unix timestamp when the test exhibited flakiness for the first time. flaky_category string The category of a flaky test. flaky_state enum The current state of the flaky test. Allowed enum values: `active,fixed,quarantined,disabled` last_flaked_branch string The branch name where the test exhibited flakiness for the last time. last_flaked_sha string The commit SHA where the test exhibited flakiness for the last time. last_flaked_ts int64 Unix timestamp when the test exhibited flakiness for the last time. module string The name of the test module. The definition of module changes slightly per language: * In .NET, a test module groups every test that is run under the same unit test project. * In Swift, a test module groups every test that is run for a given bundle. * In JavaScript, the test modules map one-to-one to test sessions. * In Java, a test module groups every test that is run by the same Maven Surefire/Failsafe or Gradle Test task execution. * In Python, a test module groups every test that is run under the same `.py` file as part of a test suite, which is typically managed by a framework like `unittest` or `pytest`. * In Ruby, a test module groups every test that is run within the same test file, which is typically managed by a framework like `RSpec` or `Minitest`. name string The test name. A concise name for a test case. Defined in the test itself. pipeline_stats object CI pipeline related statistics for the flaky test. This information is only available if test runs are associated with CI pipeline events from CI Visibility. failed_pipelines int64 The number of pipelines that failed due to this test for the past 7 days. This is computed as the sum of failed CI pipeline events associated with test runs where the flaky test failed. total_lost_time_ms int64 The total time lost by CI pipelines due to this flaky test in milliseconds. This is computed as the sum of the duration of failed CI pipeline events associated with test runs where the flaky test failed. services [string] List of test service names where this test has been flaky. A test service is a group of tests associated with a project or repository. It contains all the individual tests for your code, optionally organized into test suites, which are like folders for your tests. suite string The name of the test suite. A group of tests exercising the same unit of code depending on your language and testing framework. test_run_metadata object Metadata about the latest failed test run of the flaky test. duration_ms int64 The duration of the test run in milliseconds. 
error_message string The error message from the test failure. error_stack string The stack trace from the test failure. source_end int64 The line number where the test ends in the source file. source_file string The source file where the test is defined. source_start int64 The line number where the test starts in the source file. test_stats object Test statistics for the flaky test. failure_rate_pct double The failure rate percentage of the test for the past 7 days. This is the number of failed test runs divided by the total number of test runs (excluding skipped test runs). id string Test's ID. This ID is the hash of the test's Fully Qualified Name and Git repository ID. On the Test Runs UI it is the same as the `test_fingerprint_fqn` tag. type enum The type of the flaky test from Flaky Test Management. Allowed enum values: `flaky_test` meta object Metadata for the flaky tests search response. pagination object Pagination metadata for flaky tests. next_page string Cursor for the next page of results. ``` { "data": [ { "attributes": { "attempt_to_fix_id": "I42TEO", "codeowners": [ "@foo", "@bar" ], "envs": "prod", "first_flaked_branch": "main", "first_flaked_sha": "0c6be03165b7f7ffe96e076ffb29afb2825616c3", "first_flaked_ts": 1757688149, "flaky_category": "Timeout", "flaky_state": "active", "last_flaked_branch": "main", "last_flaked_sha": "0c6be03165b7f7ffe96e076ffb29afb2825616c3", "last_flaked_ts": 1757688149, "module": "TestModule", "name": "TestName", "pipeline_stats": { "failed_pipelines": 319, "total_lost_time_ms": 1527550000 }, "services": [ "foo", "bar" ], "suite": "TestSuite", "test_run_metadata": { "duration_ms": 27398, "error_message": "Expecting actual not to be empty", "error_stack": "Traceback (most recent call last):\n File \"test_foo.py\", line 10, in test_foo\n assert actual == expected\nAssertionError: Expecting actual not to be empty", "source_end": 20, "source_file": "test_foo.py", "source_start": 10 }, "test_stats": { "failure_rate_pct": 0.1 } }, "id": "string", "type": "string" } ], "meta": { "pagination": { "next_page": "string" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/test-optimization/) * [Example](https://docs.datadoghq.com/api/latest/test-optimization/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Authorized * [Model](https://docs.datadoghq.com/api/latest/test-optimization/) * [Example](https://docs.datadoghq.com/api/latest/test-optimization/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/test-optimization/) * [Example](https://docs.datadoghq.com/api/latest/test-optimization/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/test-optimization/?code-lang=typescript) ##### Search flaky tests Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/test/flaky-test-management/tests" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Search flaky tests ``` """ Search flaky tests returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.test_optimization_api import TestOptimizationApi from datadog_api_client.v2.model.flaky_tests_search_filter import FlakyTestsSearchFilter from datadog_api_client.v2.model.flaky_tests_search_page_options import FlakyTestsSearchPageOptions from datadog_api_client.v2.model.flaky_tests_search_request import FlakyTestsSearchRequest from datadog_api_client.v2.model.flaky_tests_search_request_attributes import FlakyTestsSearchRequestAttributes from datadog_api_client.v2.model.flaky_tests_search_request_data import FlakyTestsSearchRequestData from datadog_api_client.v2.model.flaky_tests_search_request_data_type import FlakyTestsSearchRequestDataType from datadog_api_client.v2.model.flaky_tests_search_sort import FlakyTestsSearchSort body = FlakyTestsSearchRequest( data=FlakyTestsSearchRequestData( attributes=FlakyTestsSearchRequestAttributes( filter=FlakyTestsSearchFilter( query='flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist"', ), page=FlakyTestsSearchPageOptions( cursor="eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", limit=25, ), sort=FlakyTestsSearchSort.FAILURE_RATE_ASCENDING, ), type=FlakyTestsSearchRequestDataType.SEARCH_FLAKY_TESTS_REQUEST, ), ) configuration = Configuration() configuration.unstable_operations["search_flaky_tests"] = True with ApiClient(configuration) as api_client: api_instance = TestOptimizationApi(api_client) response = api_instance.search_flaky_tests(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Search flaky tests ``` # Search flaky tests returns "OK" response require "datadog_api_client" DatadogAPIClient.configure do |config| config.unstable_operations["v2.search_flaky_tests".to_sym] = true end api_instance = DatadogAPIClient::V2::TestOptimizationAPI.new body = DatadogAPIClient::V2::FlakyTestsSearchRequest.new({ data: DatadogAPIClient::V2::FlakyTestsSearchRequestData.new({ attributes: 
DatadogAPIClient::V2::FlakyTestsSearchRequestAttributes.new({ filter: DatadogAPIClient::V2::FlakyTestsSearchFilter.new({ query: 'flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist"', }), page: DatadogAPIClient::V2::FlakyTestsSearchPageOptions.new({ cursor: "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==", limit: 25, }), sort: DatadogAPIClient::V2::FlakyTestsSearchSort::FAILURE_RATE_ASCENDING, }), type: DatadogAPIClient::V2::FlakyTestsSearchRequestDataType::SEARCH_FLAKY_TESTS_REQUEST, }), }) opts = { body: body, } p api_instance.search_flaky_tests(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Search flaky tests ``` // Search flaky tests returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.FlakyTestsSearchRequest{ Data: &datadogV2.FlakyTestsSearchRequestData{ Attributes: &datadogV2.FlakyTestsSearchRequestAttributes{ Filter: &datadogV2.FlakyTestsSearchFilter{ Query: datadog.PtrString(`flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist"`), }, Page: &datadogV2.FlakyTestsSearchPageOptions{ Cursor: datadog.PtrString("eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ=="), Limit: datadog.PtrInt64(25), }, Sort: datadogV2.FLAKYTESTSSEARCHSORT_FAILURE_RATE_ASCENDING.Ptr(), }, Type: datadogV2.FLAKYTESTSSEARCHREQUESTDATATYPE_SEARCH_FLAKY_TESTS_REQUEST.Ptr(), }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() configuration.SetUnstableOperationEnabled("v2.SearchFlakyTests", true) apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewTestOptimizationApi(apiClient) resp, r, err := api.SearchFlakyTests(ctx, *datadogV2.NewSearchFlakyTestsOptionalParameters().WithBody(body)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `TestOptimizationApi.SearchFlakyTests`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `TestOptimizationApi.SearchFlakyTests`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Search flaky tests ``` // Search flaky tests returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.TestOptimizationApi; import com.datadog.api.client.v2.api.TestOptimizationApi.SearchFlakyTestsOptionalParameters; import com.datadog.api.client.v2.model.FlakyTestsSearchFilter; import com.datadog.api.client.v2.model.FlakyTestsSearchPageOptions; import com.datadog.api.client.v2.model.FlakyTestsSearchRequest; import 
com.datadog.api.client.v2.model.FlakyTestsSearchRequestAttributes; import com.datadog.api.client.v2.model.FlakyTestsSearchRequestData; import com.datadog.api.client.v2.model.FlakyTestsSearchRequestDataType; import com.datadog.api.client.v2.model.FlakyTestsSearchResponse; import com.datadog.api.client.v2.model.FlakyTestsSearchSort; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); defaultClient.setUnstableOperationEnabled("v2.searchFlakyTests", true); TestOptimizationApi apiInstance = new TestOptimizationApi(defaultClient); FlakyTestsSearchRequest body = new FlakyTestsSearchRequest() .data( new FlakyTestsSearchRequestData() .attributes( new FlakyTestsSearchRequestAttributes() .filter( new FlakyTestsSearchFilter() .query( """ flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist" """)) .page( new FlakyTestsSearchPageOptions() .cursor( "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==") .limit(25L)) .sort(FlakyTestsSearchSort.FAILURE_RATE_ASCENDING)) .type(FlakyTestsSearchRequestDataType.SEARCH_FLAKY_TESTS_REQUEST)); try { FlakyTestsSearchResponse result = apiInstance.searchFlakyTests(new SearchFlakyTestsOptionalParameters().body(body)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling TestOptimizationApi#searchFlakyTests"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Search flaky tests ``` // Search flaky tests returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_test_optimization::SearchFlakyTestsOptionalParams; use datadog_api_client::datadogV2::api_test_optimization::TestOptimizationAPI; use datadog_api_client::datadogV2::model::FlakyTestsSearchFilter; use datadog_api_client::datadogV2::model::FlakyTestsSearchPageOptions; use datadog_api_client::datadogV2::model::FlakyTestsSearchRequest; use datadog_api_client::datadogV2::model::FlakyTestsSearchRequestAttributes; use datadog_api_client::datadogV2::model::FlakyTestsSearchRequestData; use datadog_api_client::datadogV2::model::FlakyTestsSearchRequestDataType; use datadog_api_client::datadogV2::model::FlakyTestsSearchSort; #[tokio::main] async fn main() { let body = FlakyTestsSearchRequest ::new().data( FlakyTestsSearchRequestData::new() .attributes( FlakyTestsSearchRequestAttributes::new() .filter( FlakyTestsSearchFilter ::new().query( r#"flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist""#.to_string(), ), ) .page( FlakyTestsSearchPageOptions::new() .cursor( "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==".to_string(), ) .limit(25), ) .sort(FlakyTestsSearchSort::FAILURE_RATE_ASCENDING), ) .type_(FlakyTestsSearchRequestDataType::SEARCH_FLAKY_TESTS_REQUEST), ); let mut configuration = datadog::Configuration::new(); configuration.set_unstable_operation_enabled("v2.SearchFlakyTests", true); let api = 
TestOptimizationAPI::with_config(configuration); let resp = api .search_flaky_tests(SearchFlakyTestsOptionalParams::default().body(body)) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } }
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example:
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Search flaky tests
```
/**
 * Search flaky tests returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
configuration.unstableOperations["v2.searchFlakyTests"] = true;
const apiInstance = new v2.TestOptimizationApi(configuration);

const params: v2.TestOptimizationApiSearchFlakyTestsRequest = {
  body: {
    data: {
      attributes: {
        filter: {
          query: `flaky_test_state:active @git.repository.id_v2:"github.com/datadog/shopist"`,
        },
        page: {
          cursor:
            "eyJzdGFydEF0IjoiQVFBQUFYS2tMS3pPbm40NGV3QUFBQUJCV0V0clRFdDZVbG8zY3pCRmNsbHJiVmxDWlEifQ==",
          limit: 25,
        },
        sort: "failure_rate",
      },
      type: "search_flaky_tests_request",
    },
  },
};

apiInstance
  .searchFlakyTests(params)
  .then((data: v2.FlakyTestsSearchResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```
#### Instructions
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:
```
# Set DD_SITE to your Datadog site, for example:
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```
* * *
---
# Source: https://docs.datadoghq.com/api/latest/timeboards/
# Timeboards
**DEPRECATED**: This endpoint is outdated. Use the [new Dashboard endpoint](https://docs.datadoghq.com/api/latest/dashboards/) instead. The Timeboards API has been replaced by the unified Dashboards API.
All timeboard functionality is now available through the Dashboards endpoint. See: [Dashboards API Documentation](https://docs.datadoghq.com/api/latest/dashboards/) --- # Source: https://docs.datadoghq.com/api/latest/usage-metering # Usage Metering The usage metering API allows you to get hourly, daily, and monthly usage across multiple facets of Datadog. This API is available to all Pro and Enterprise customers. **Note** : Usage data is delayed by up to 72 hours from when it was incurred. It is retained for 15 months. You can retrieve up to 24 hours of hourly usage data for multiple organizations, and up to two months of hourly usage data for a single organization in one request. Learn more on the [usage details documentation](https://docs.datadoghq.com/account_management/billing/usage_details/). ## [Get billing dimension mapping for usage endpoints](https://docs.datadoghq.com/api/latest/usage-metering/#get-billing-dimension-mapping-for-usage-endpoints) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-billing-dimension-mapping-for-usage-endpoints-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/billing_dimension_mappinghttps://api.ap2.datadoghq.com/api/v2/usage/billing_dimension_mappinghttps://api.datadoghq.eu/api/v2/usage/billing_dimension_mappinghttps://api.ddog-gov.com/api/v2/usage/billing_dimension_mappinghttps://api.datadoghq.com/api/v2/usage/billing_dimension_mappinghttps://api.us3.datadoghq.com/api/v2/usage/billing_dimension_mappinghttps://api.us5.datadoghq.com/api/v2/usage/billing_dimension_mapping ### Overview Get a mapping of billing dimensions to the corresponding keys for the supported usage metering public API endpoints. Mapping data is updated on a monthly cadence. This endpoint is only accessible to [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[month] string Datetime in ISO-8601 format, UTC, and for mappings beginning this month. Defaults to the current month. filter[view] string String to specify whether to retrieve active billing dimension mappings for the contract or for all available mappings. Allowed views have the string `active` or `all`. Defaults to `active`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetBillingDimensionMapping-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetBillingDimensionMapping-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetBillingDimensionMapping-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetBillingDimensionMapping-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Billing dimensions mapping response. Field Type Description data [object] Billing dimensions mapping data. attributes object Mapping of billing dimensions to endpoint keys. endpoints [object] List of supported endpoints with their keys mapped to the billing_dimension. id string The URL for the endpoint. keys [string] The billing dimension. status enum Denotes whether mapping keys were available for this endpoint. Allowed enum values: `OK,NOT_FOUND` in_app_label string Label used for the billing dimension in the Plan & Usage charts. 
timestamp date-time Month in ISO-8601 format, UTC, and precise to the second: `[YYYY-MM-DDThh:mm:ss]`. id string ID of the billing dimension. type enum Type of active billing dimensions data. Allowed enum values: `billing_dimensions` default: `billing_dimensions` ``` { "data": [ { "attributes": { "endpoints": [ { "id": "api/v1/usage/billable-summary", "keys": [ "apm_host_top99p", "apm_host_sum" ], "status": "string" } ], "in_app_label": "APM Hosts", "timestamp": "2019-09-19T10:00:00.000Z" }, "id": "string", "type": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get billing dimension mapping for usage endpoints Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/billing_dimension_mapping" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get billing dimension mapping for usage endpoints ``` """ Get billing dimension mapping for usage endpoints returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_billing_dimension_mapping() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get billing dimension mapping for usage endpoints ``` # Get billing dimension mapping for usage endpoints returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new p 
api_instance.get_billing_dimension_mapping() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get billing dimension mapping for usage endpoints ``` // Get billing dimension mapping for usage endpoints returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetBillingDimensionMapping(ctx, *datadogV2.NewGetBillingDimensionMappingOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetBillingDimensionMapping`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetBillingDimensionMapping`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get billing dimension mapping for usage endpoints ``` // Get billing dimension mapping for usage endpoints returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.model.BillingDimensionsMappingResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { BillingDimensionsMappingResponse result = apiInstance.getBillingDimensionMapping(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getBillingDimensionMapping"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get billing dimension mapping for usage endpoints ``` // Get billing dimension mapping for usage endpoints returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetBillingDimensionMappingOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = 
UsageMeteringAPI::with_config(configuration); let resp = api .get_billing_dimension_mapping(GetBillingDimensionMappingOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get billing dimension mapping for usage endpoints ``` /** * Get billing dimension mapping for usage endpoints returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); apiInstance .getBillingDimensionMapping() .then((data: v2.BillingDimensionsMappingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage by product family](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/hourly_usagehttps://api.ap2.datadoghq.com/api/v2/usage/hourly_usagehttps://api.datadoghq.eu/api/v2/usage/hourly_usagehttps://api.ddog-gov.com/api/v2/usage/hourly_usagehttps://api.datadoghq.com/api/v2/usage/hourly_usagehttps://api.us3.datadoghq.com/api/v2/usage/hourly_usagehttps://api.us5.datadoghq.com/api/v2/usage/hourly_usage ### Overview Get hourly usage by product family. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description filter[timestamp][start] [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. filter[timestamp][end] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. filter[product_families] [_required_] string Comma separated list of product families to retrieve. 
Available families are `all`, `analyzed_logs`, `application_security`, `audit_trail`, `bits_ai`, `serverless`, `ci_app`, `cloud_cost_management`, `cloud_siem`, `csm_container_enterprise`, `csm_host_enterprise`, `cspm`, `custom_events`, `cws`, `dbm`, `error_tracking`, `fargate`, `infra_hosts`, `incident_management`, `indexed_logs`, `indexed_spans`, `ingested_spans`, `iot`, `lambda_traced_invocations`, `llm_observability`, `logs`, `network_flows`, `network_hosts`, `network_monitoring`, `observability_pipelines`, `online_archive`, `profiling`, `product_analytics`, `rum`, `rum_browser_sessions`, `rum_mobile_sessions`, `sds`, `snmp`, `software_delivery`, `synthetics_api`, `synthetics_browser`, `synthetics_mobile`, `synthetics_parallel_testing`, `timeseries`, `vuln_management` and `workflow_executions`. The following product family has been **deprecated** : `audit_logs`. filter[include_descendants] boolean Include child org usage in the response. Defaults to false. filter[include_connected_accounts] boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to false. filter[include_breakdown] boolean Include breakdown of usage by subcategories where applicable (for product family logs only). Defaults to false. filter[versions] string Comma separated list of product family versions to use in the format `product_family:version`. For example, `infra_hosts:1.0.0`. If this parameter is not used, the API will use the latest version of each requested product family. Currently all families have one version `1.0.0`. page[limit] integer Maximum number of results to return (between 1 and 500) - defaults to 500 if limit not specified. page[next_record_id] string List following results with a next_record_id provided in the previous query. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsage-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsage-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsage-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsage-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Hourly usage response. Field Type Description data [object] Response containing hourly usage. attributes object Attributes of hourly usage for a product family for an org for a time period. account_name string The account name. account_public_id string The account public ID. measurements [object] List of the measured usage values for the product family for the org for the time period. usage_type string Type of usage. value int64 Contains the number measured for the given usage_type during the hour. org_name string The organization name. product_family string The product for which usage is being reported. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. timestamp date-time Datetime in ISO-8601 format, UTC. The hour for the usage. id string Unique ID of the response. type enum Type of usage data. Allowed enum values: `usage_timeseries` default: `usage_timeseries` meta object The object containing document metadata. pagination object The metadata for the current pagination. next_record_id string The cursor to get the next results (if any). To make the next request, use the same parameters and add `next_record_id`. 
``` { "data": [ { "attributes": { "account_name": "string", "account_public_id": "string", "measurements": [ { "usage_type": "string", "value": "integer" } ], "org_name": "string", "product_family": "string", "public_id": "string", "region": "string", "timestamp": "2019-09-19T10:00:00.000Z" }, "id": "string", "type": "usage_timeseries" } ], "meta": { "pagination": { "next_record_id": "string" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage by product family Copy ``` # Required query arguments export filter[timestamp][start]="CHANGE_ME" export filter[product_families]="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/hourly_usage?filter[timestamp][start]=${filter[timestamp][start]}&filter[product_families]=${filter[product_families]}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage by product family ``` """ Get hourly usage by product family returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_hourly_usage( filter_timestamp_start=(datetime.now() + relativedelta(days=-3)), filter_product_families="infra_hosts", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage by product family ``` # 
Get hourly usage by product family returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new p api_instance.get_hourly_usage((Time.now + -3 * 86400), "infra_hosts") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage by product family ``` // Get hourly usage by product family returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetHourlyUsage(ctx, time.Now().AddDate(0, 0, -3), "infra_hosts", *datadogV2.NewGetHourlyUsageOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetHourlyUsage`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetHourlyUsage`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage by product family ``` // Get hourly usage by product family returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.model.HourlyUsageResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { HourlyUsageResponse result = apiInstance.getHourlyUsage(OffsetDateTime.now().plusDays(-3), "infra_hosts"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getHourlyUsage"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage by product family ``` // Get hourly usage by product family returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetHourlyUsageOptionalParams; use 
datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_hourly_usage( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), "infra_hosts".to_string(), GetHourlyUsageOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage by product family ``` /** * Get hourly usage by product family returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetHourlyUsageRequest = { filterTimestampStart: new Date(new Date().getTime() + -3 * 86400 * 1000), filterProductFamilies: "infra_hosts", }; apiInstance .getHourlyUsage(params) .then((data: v2.HourlyUsageResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-attribution) * [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-attribution-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/hourly-attributionhttps://api.ap2.datadoghq.com/api/v1/usage/hourly-attributionhttps://api.datadoghq.eu/api/v1/usage/hourly-attributionhttps://api.ddog-gov.com/api/v1/usage/hourly-attributionhttps://api.datadoghq.com/api/v1/usage/hourly-attributionhttps://api.us3.datadoghq.com/api/v1/usage/hourly-attributionhttps://api.us5.datadoghq.com/api/v1/usage/hourly-attribution ### Overview Get hourly usage attribution. Multi-region data is available starting March 1, 2023. This API endpoint is paginated. To make sure you receive all records, check if the value of `next_record_id` is set in the response. If it is, make another request and pass `next_record_id` as a parameter. Pseudo code example: ``` response := GetHourlyUsageAttribution(start_month) cursor := response.metadata.pagination.next_record_id WHILE cursor != null BEGIN sleep(5 seconds) # Avoid running into rate limit response := GetHourlyUsageAttribution(start_month, next_record_id=cursor) cursor := response.metadata.pagination.next_record_id END ``` The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_indexed_spans_percentage`, `estimated_ingested_spans_usage`, `estimated_ingested_spans_percentage`, `llm_observability_usage`, `llm_observability_percentage`. This endpoint requires the `usage_read` permission. 
OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. usage_type [_required_] enum Usage type to retrieve. The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_ingested_spans_usage`. Allowed enum values: `api_usage, apm_fargate_usage, apm_host_usage, apm_usm_usage, appsec_fargate_usage, appsec_usage, asm_serverless_traced_invocations_usage, asm_serverless_traced_invocations_percentage, bits_ai_investigations_usage, browser_usage, ci_pipeline_indexed_spans_usage, ci_test_indexed_spans_usage, ci_visibility_itr_usage, cloud_siem_usage, code_security_host_usage, container_excl_agent_usage, container_usage, cspm_containers_usage, cspm_hosts_usage, custom_event_usage, custom_ingested_timeseries_usage, custom_timeseries_usage, cws_containers_usage, cws_fargate_task_usage, cws_hosts_usage, data_jobs_monitoring_usage, data_stream_monitoring_usage, dbm_hosts_usage, dbm_queries_usage, error_tracking_usage, error_tracking_percentage, estimated_indexed_spans_usage, estimated_ingested_spans_usage, fargate_usage, flex_stored_logs, functions_usage, incident_management_monthly_active_users_usage, indexed_spans_usage, infra_host_usage, ingested_logs_bytes_usage, ingested_spans_bytes_usage, invocations_usage, lambda_traced_invocations_usage, llm_observability_usage, llm_spans_usage, logs_indexed_15day_usage, logs_indexed_180day_usage, logs_indexed_1day_usage, logs_indexed_30day_usage, logs_indexed_360day_usage, logs_indexed_3day_usage, logs_indexed_45day_usage, logs_indexed_60day_usage, logs_indexed_7day_usage, logs_indexed_90day_usage, logs_indexed_custom_retention_usage, mobile_app_testing_usage, ndm_netflow_usage, npm_host_usage, network_device_wireless_usage, obs_pipeline_bytes_usage, obs_pipelines_vcpu_usage, online_archive_usage, product_analytics_session_usage, profiled_container_usage, profiled_fargate_usage, profiled_host_usage, published_app, rum_browser_mobile_sessions_usage, rum_ingested_usage, rum_investigate_usage, rum_replay_sessions_usage, rum_session_replay_add_on_usage, sca_fargate_usage, sds_scanned_bytes_usage, serverless_apps_usage, siem_analyzed_logs_add_on_usage, siem_ingested_bytes_usage, snmp_usage, universal_service_monitoring_usage, vuln_management_hosts_usage, workflow_executions_usage` next_record_id string List following results with a next_record_id provided in the previous query. tag_breakdown_keys string Comma separated list of tags used to group usage. If no value is provided the usage will not be broken down by tags. To see which tags are available, look for the value of `tag_config_source` in the API response. include_descendants boolean Include child org usage in the response. Defaults to `true`. 
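The overview above sketches this cursor loop in pseudo code. A minimal concrete version, using the Python `requests` package against the raw v1 endpoint on the datadoghq.com site and the query strings documented here, might look like the following; the start hour, usage type, and helper name are illustrative assumptions, not an official client example.
```
# Sketch of the cursor loop from the overview above, against the raw v1 endpoint.
# Assumes `requests`, DD_API_KEY / DD_APP_KEY in the environment, and the
# datadoghq.com site; adjust the host for other regions.
import os
import time
import requests

URL = "https://api.datadoghq.com/api/v1/usage/hourly-attribution"
HEADERS = {
    "Accept": "application/json",
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

def hourly_usage_attribution(start_hr, usage_type):
    """Yield usage attribution records, following metadata.pagination.next_record_id."""
    params = {"start_hr": start_hr, "usage_type": usage_type}
    while True:
        resp = requests.get(URL, headers=HEADERS, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        yield from payload.get("usage", [])
        pagination = payload.get("metadata", {}).get("pagination", {}) or {}
        cursor = pagination.get("next_record_id")
        if not cursor:
            break
        params["next_record_id"] = cursor
        time.sleep(5)  # pause between pages, as the pseudo code suggests, to avoid rate limits

if __name__ == "__main__":
    # Illustrative values: an ISO-8601 hour and one of the documented usage types.
    for record in hourly_usage_attribution("2024-06-01T00", "infra_host_usage"):
        print(record.get("hour"), record.get("org_name"), record.get("total_usage_sum"))
```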
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsageAttribution-200-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsageAttribution-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetHourlyUsageAttribution-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the hourly usage attribution by tag(s). Field Type Description metadata object The object containing document metadata. pagination object The metadata for the current pagination. next_record_id string The cursor to get the next results (if any). To make the next request, use the same parameters and add `next_record_id`. usage [object] Get the hourly usage attribution by tag(s). hour date-time The hour for the usage. org_name string The name of the organization. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. tag_config_source string The source of the usage attribution tag configuration and the selected tags in the format of `::://////`. tags object Tag keys and values. A `null` value here means that the requested tag breakdown cannot be applied because it does not match the [tags configured for usage attribution](https://docs.datadoghq.com/account_management/billing/usage_attribution/#getting-started). In this scenario the API returns the total usage, not broken down by tags. [string] A list of values that are associated with each tag key. * An empty list means the resource use wasn't tagged with the respective tag. * Multiple values means the respective tag was applied multiple times on the resource. * An `` value means the resource was tagged with the respective tag but did not have a value. total_usage_sum double Total product usage for the given tags within the hour. updated_at string Shows the most recent hour in the current month for all organizations where usages are calculated. usage_type enum Supported products for hourly usage attribution requests. The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_ingested_spans_usage`. 
Allowed enum values: `api_usage,apm_fargate_usage,apm_host_usage,apm_usm_usage,appsec_fargate_usage,appsec_usage,asm_serverless_traced_invocations_usage,asm_serverless_traced_invocations_percentage,bits_ai_investigations_usage,browser_usage,ci_pipeline_indexed_spans_usage,ci_test_indexed_spans_usage,ci_visibility_itr_usage,cloud_siem_usage,code_security_host_usage,container_excl_agent_usage,container_usage,cspm_containers_usage,cspm_hosts_usage,custom_event_usage,custom_ingested_timeseries_usage,custom_timeseries_usage,cws_containers_usage,cws_fargate_task_usage,cws_hosts_usage,data_jobs_monitoring_usage,data_stream_monitoring_usage,dbm_hosts_usage,dbm_queries_usage,error_tracking_usage,error_tracking_percentage,estimated_indexed_spans_usage,estimated_ingested_spans_usage,fargate_usage,flex_stored_logs,functions_usage,incident_management_monthly_active_users_usage,indexed_spans_usage,infra_host_usage,ingested_logs_bytes_usage,ingested_spans_bytes_usage,invocations_usage,lambda_traced_invocations_usage,llm_observability_usage,llm_spans_usage,logs_indexed_15day_usage,logs_indexed_180day_usage,logs_indexed_1day_usage,logs_indexed_30day_usage,logs_indexed_360day_usage,logs_indexed_3day_usage,logs_indexed_45day_usage,logs_indexed_60day_usage,logs_indexed_7day_usage,logs_indexed_90day_usage,logs_indexed_custom_retention_usage,mobile_app_testing_usage,ndm_netflow_usage,npm_host_usage,network_device_wireless_usage,obs_pipeline_bytes_usage,obs_pipelines_vcpu_usage,online_archive_usage,product_analytics_session_usage,profiled_container_usage,profiled_fargate_usage,profiled_host_usage,published_app,rum_browser_mobile_sessions_usage,rum_ingested_usage,rum_investigate_usage,rum_replay_sessions_usage,rum_session_replay_add_on_usage,sca_fargate_usage,sds_scanned_bytes_usage,serverless_apps_usage,siem_analyzed_logs_add_on_usage,siem_ingested_bytes_usage,snmp_usage,universal_service_monitoring_usage,vuln_management_hosts_usage,workflow_executions_usage` ``` { "metadata": { "pagination": { "next_record_id": "string" } }, "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "region": "string", "tag_config_source": "string", "tags": { "": [ "datadog-integrations-lab" ] }, "total_usage_sum": "number", "updated_at": "string", "usage_type": "string" } ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage attribution Copy ``` # Required query arguments export start_hr="CHANGE_ME" export usage_type="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/hourly-attribution?start_hr=${start_hr}&usage_type=${usage_type}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage attribution ``` """ Get hourly usage attribution returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datadog_api_client.v1.model.hourly_usage_attribution_usage_type import HourlyUsageAttributionUsageType configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_hourly_usage_attribution( start_hr=(datetime.now() + relativedelta(days=-3)), usage_type=HourlyUsageAttributionUsageType.INFRA_HOST_USAGE, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage attribution ``` # Get hourly usage attribution returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_hourly_usage_attribution((Time.now + -3 * 86400), HourlyUsageAttributionUsageType::INFRA_HOST_USAGE) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage attribution ``` // Get hourly usage attribution returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetHourlyUsageAttribution(ctx, time.Now().AddDate(0, 0, -3), 
datadogV1.HOURLYUSAGEATTRIBUTIONUSAGETYPE_INFRA_HOST_USAGE, *datadogV1.NewGetHourlyUsageAttributionOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetHourlyUsageAttribution`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetHourlyUsageAttribution`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage attribution ``` // Get hourly usage attribution returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.HourlyUsageAttributionResponse; import com.datadog.api.client.v1.model.HourlyUsageAttributionUsageType; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { HourlyUsageAttributionResponse result = apiInstance.getHourlyUsageAttribution( OffsetDateTime.now().plusDays(-3), HourlyUsageAttributionUsageType.INFRA_HOST_USAGE); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getHourlyUsageAttribution"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage attribution ``` // Get hourly usage attribution returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetHourlyUsageAttributionOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; use datadog_api_client::datadogV1::model::HourlyUsageAttributionUsageType; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_hourly_usage_attribution( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), HourlyUsageAttributionUsageType::INFRA_HOST_USAGE, GetHourlyUsageAttributionOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" 
DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage attribution ``` /** * Get hourly usage attribution returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetHourlyUsageAttributionRequest = { startHr: new Date(new Date().getTime() + -3 * 86400 * 1000), usageType: "infra_host_usage", }; apiInstance .getHourlyUsageAttribution(params) .then((data: v1.HourlyUsageAttributionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get monthly usage attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-usage-attribution) * [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-usage-attribution-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/monthly-attributionhttps://api.ap2.datadoghq.com/api/v1/usage/monthly-attributionhttps://api.datadoghq.eu/api/v1/usage/monthly-attributionhttps://api.ddog-gov.com/api/v1/usage/monthly-attributionhttps://api.datadoghq.com/api/v1/usage/monthly-attributionhttps://api.us3.datadoghq.com/api/v1/usage/monthly-attributionhttps://api.us5.datadoghq.com/api/v1/usage/monthly-attribution ### Overview Get monthly usage attribution. Multi-region data is available starting March 1, 2023. This API endpoint is paginated. To make sure you receive all records, check if the value of `next_record_id` is set in the response. If it is, make another request and pass `next_record_id` as a parameter. Pseudo code example: ``` response := GetMonthlyUsageAttribution(start_month) cursor := response.metadata.pagination.next_record_id WHILE cursor != null BEGIN sleep(5 seconds) # Avoid running into rate limit response := GetMonthlyUsageAttribution(start_month, next_record_id=cursor) cursor := response.metadata.pagination.next_record_id END ``` This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_month [_required_] string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for usage beginning in this month. Maximum of 15 months ago. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for usage ending this month. fields [_required_] enum Comma-separated list of usage types to return, or `*` for all usage types. The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_indexed_spans_percentage`, `estimated_ingested_spans_usage`, `estimated_ingested_spans_percentage`, `llm_observability_usage`, `llm_observability_percentage`. 
Allowed enum values: `api_usage, api_percentage, apm_fargate_usage, apm_fargate_percentage, appsec_fargate_usage, appsec_fargate_percentage, apm_host_usage, apm_host_percentage, apm_usm_usage, apm_usm_percentage, appsec_usage, appsec_percentage, asm_serverless_traced_invocations_usage, asm_serverless_traced_invocations_percentage, bits_ai_investigations_usage, bits_ai_investigations_percentage, browser_usage, browser_percentage, ci_visibility_itr_usage, ci_visibility_itr_percentage, cloud_siem_usage, cloud_siem_percentage, code_security_host_usage, code_security_host_percentage, container_excl_agent_usage, container_excl_agent_percentage, container_usage, container_percentage, cspm_containers_percentage, cspm_containers_usage, cspm_hosts_percentage, cspm_hosts_usage, custom_timeseries_usage, custom_timeseries_percentage, custom_ingested_timeseries_usage, custom_ingested_timeseries_percentage, cws_containers_percentage, cws_containers_usage, cws_fargate_task_percentage, cws_fargate_task_usage, cws_hosts_percentage, cws_hosts_usage, data_jobs_monitoring_usage, data_jobs_monitoring_percentage, data_stream_monitoring_usage, data_stream_monitoring_percentage, dbm_hosts_percentage, dbm_hosts_usage, dbm_queries_percentage, dbm_queries_usage, error_tracking_usage, error_tracking_percentage, estimated_indexed_spans_usage, estimated_indexed_spans_percentage, estimated_ingested_spans_usage, estimated_ingested_spans_percentage, fargate_usage, fargate_percentage, flex_stored_logs_usage, flex_stored_logs_percentage, functions_usage, functions_percentage, incident_management_monthly_active_users_usage, incident_management_monthly_active_users_percentage, infra_host_usage, infra_host_percentage, invocations_usage, invocations_percentage, lambda_traced_invocations_usage, lambda_traced_invocations_percentage, llm_observability_usage, llm_observability_percentage, llm_spans_usage, llm_spans_percentage, mobile_app_testing_percentage, mobile_app_testing_usage, ndm_netflow_usage, ndm_netflow_percentage, network_device_wireless_usage, network_device_wireless_percentage, npm_host_usage, npm_host_percentage, obs_pipeline_bytes_usage, obs_pipeline_bytes_percentage, obs_pipelines_vcpu_usage, obs_pipelines_vcpu_percentage, online_archive_usage, online_archive_percentage, product_analytics_session_usage, product_analytics_session_percentage, profiled_container_usage, profiled_container_percentage, profiled_fargate_usage, profiled_fargate_percentage, profiled_host_usage, profiled_host_percentage, published_app_usage, published_app_percentage, serverless_apps_usage, serverless_apps_percentage, snmp_usage, snmp_percentage, universal_service_monitoring_usage, universal_service_monitoring_percentage, vuln_management_hosts_usage, vuln_management_hosts_percentage, sds_scanned_bytes_usage, sds_scanned_bytes_percentage, ci_test_indexed_spans_usage, ci_test_indexed_spans_percentage, ingested_logs_bytes_usage, ingested_logs_bytes_percentage, ci_pipeline_indexed_spans_usage, ci_pipeline_indexed_spans_percentage, indexed_spans_usage, indexed_spans_percentage, custom_event_usage, custom_event_percentage, logs_indexed_custom_retention_usage, logs_indexed_custom_retention_percentage, logs_indexed_360day_usage, logs_indexed_360day_percentage, logs_indexed_180day_usage, logs_indexed_180day_percentage, logs_indexed_90day_usage, logs_indexed_90day_percentage, logs_indexed_60day_usage, logs_indexed_60day_percentage, logs_indexed_45day_usage, logs_indexed_45day_percentage, logs_indexed_30day_usage, logs_indexed_30day_percentage, 
logs_indexed_15day_usage, logs_indexed_15day_percentage, logs_indexed_7day_usage, logs_indexed_7day_percentage, logs_indexed_3day_usage, logs_indexed_3day_percentage, logs_indexed_1day_usage, logs_indexed_1day_percentage, rum_ingested_usage, rum_ingested_percentage, rum_investigate_usage, rum_investigate_percentage, rum_replay_sessions_usage, rum_replay_sessions_percentage, rum_session_replay_add_on_usage, rum_session_replay_add_on_percentage, rum_browser_mobile_sessions_usage, rum_browser_mobile_sessions_percentage, ingested_spans_bytes_usage, ingested_spans_bytes_percentage, siem_analyzed_logs_add_on_usage, siem_analyzed_logs_add_on_percentage, siem_ingested_bytes_usage, siem_ingested_bytes_percentage, workflow_executions_usage, workflow_executions_percentage, sca_fargate_usage, sca_fargate_percentage, *` sort_direction enum The direction to sort by: `[desc, asc]`. Allowed enum values: `desc, asc` sort_name enum The field to sort by. The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_indexed_spans_percentage`, `estimated_ingested_spans_usage`, `estimated_ingested_spans_percentage`. Allowed enum values: `api_usage, api_percentage, apm_fargate_usage, apm_fargate_percentage, appsec_fargate_usage, appsec_fargate_percentage, apm_host_usage, apm_host_percentage, apm_usm_usage, apm_usm_percentage, appsec_usage, appsec_percentage, asm_serverless_traced_invocations_usage, asm_serverless_traced_invocations_percentage, bits_ai_investigations_usage, bits_ai_investigations_percentage, browser_usage, browser_percentage, ci_visibility_itr_usage, ci_visibility_itr_percentage, cloud_siem_usage, cloud_siem_percentage, code_security_host_usage, code_security_host_percentage, container_excl_agent_usage, container_excl_agent_percentage, container_usage, container_percentage, cspm_containers_percentage, cspm_containers_usage, cspm_hosts_percentage, cspm_hosts_usage, custom_timeseries_usage, custom_timeseries_percentage, custom_ingested_timeseries_usage, custom_ingested_timeseries_percentage, cws_containers_percentage, cws_containers_usage, cws_fargate_task_percentage, cws_fargate_task_usage, cws_hosts_percentage, cws_hosts_usage, data_jobs_monitoring_usage, data_jobs_monitoring_percentage, data_stream_monitoring_usage, data_stream_monitoring_percentage, dbm_hosts_percentage, dbm_hosts_usage, dbm_queries_percentage, dbm_queries_usage, error_tracking_usage, error_tracking_percentage, estimated_indexed_spans_usage, estimated_indexed_spans_percentage, estimated_ingested_spans_usage, estimated_ingested_spans_percentage, fargate_usage, fargate_percentage, flex_stored_logs_usage, flex_stored_logs_percentage, functions_usage, functions_percentage, incident_management_monthly_active_users_usage, incident_management_monthly_active_users_percentage, infra_host_usage, infra_host_percentage, invocations_usage, invocations_percentage, lambda_traced_invocations_usage, lambda_traced_invocations_percentage, llm_observability_usage, llm_observability_percentage, llm_spans_usage, llm_spans_percentage, mobile_app_testing_percentage, mobile_app_testing_usage, ndm_netflow_usage, ndm_netflow_percentage, network_device_wireless_usage, network_device_wireless_percentage, npm_host_usage, npm_host_percentage, obs_pipeline_bytes_usage, obs_pipeline_bytes_percentage, obs_pipelines_vcpu_usage, obs_pipelines_vcpu_percentage, online_archive_usage, online_archive_percentage, product_analytics_session_usage, product_analytics_session_percentage, profiled_container_usage, profiled_container_percentage, 
profiled_fargate_usage, profiled_fargate_percentage, profiled_host_usage, profiled_host_percentage, published_app_usage, published_app_percentage, serverless_apps_usage, serverless_apps_percentage, snmp_usage, snmp_percentage, universal_service_monitoring_usage, universal_service_monitoring_percentage, vuln_management_hosts_usage, vuln_management_hosts_percentage, sds_scanned_bytes_usage, sds_scanned_bytes_percentage, ci_test_indexed_spans_usage, ci_test_indexed_spans_percentage, ingested_logs_bytes_usage, ingested_logs_bytes_percentage, ci_pipeline_indexed_spans_usage, ci_pipeline_indexed_spans_percentage, indexed_spans_usage, indexed_spans_percentage, custom_event_usage, custom_event_percentage, logs_indexed_custom_retention_usage, logs_indexed_custom_retention_percentage, logs_indexed_360day_usage, logs_indexed_360day_percentage, logs_indexed_180day_usage, logs_indexed_180day_percentage, logs_indexed_90day_usage, logs_indexed_90day_percentage, logs_indexed_60day_usage, logs_indexed_60day_percentage, logs_indexed_45day_usage, logs_indexed_45day_percentage, logs_indexed_30day_usage, logs_indexed_30day_percentage, logs_indexed_15day_usage, logs_indexed_15day_percentage, logs_indexed_7day_usage, logs_indexed_7day_percentage, logs_indexed_3day_usage, logs_indexed_3day_percentage, logs_indexed_1day_usage, logs_indexed_1day_percentage, rum_ingested_usage, rum_ingested_percentage, rum_investigate_usage, rum_investigate_percentage, rum_replay_sessions_usage, rum_replay_sessions_percentage, rum_session_replay_add_on_usage, rum_session_replay_add_on_percentage, rum_browser_mobile_sessions_usage, rum_browser_mobile_sessions_percentage, ingested_spans_bytes_usage, ingested_spans_bytes_percentage, siem_analyzed_logs_add_on_usage, siem_analyzed_logs_add_on_percentage, siem_ingested_bytes_usage, siem_ingested_bytes_percentage, workflow_executions_usage, workflow_executions_percentage, sca_fargate_usage, sca_fargate_percentage, *` tag_breakdown_keys string Comma separated list of tag keys used to group usage. If no value is provided the usage will not be broken down by tags. To see which tags are available, look for the value of `tag_config_source` in the API response. next_record_id string List following results with a next_record_id provided in the previous query. include_descendants boolean Include child org usage in the response. Defaults to `true`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyUsageAttribution-200-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyUsageAttribution-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyUsageAttribution-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the monthly Usage Summary by tag(s). Field Type Description metadata object The object containing document metadata. aggregates [object] An array of available aggregates. agg_type string The aggregate type. field string The field. value double The value for a given field. pagination object The metadata for the current pagination. next_record_id string The cursor to use to get the next results, if any. To make the next request, use the same parameters with the addition of the `next_record_id`. usage [object] Get usage summary by tag(s). month date-time Datetime in ISO-8601 format, UTC, precise to month: [YYYY-MM]. org_name string The name of the organization. public_id string The organization public ID. 
region string The region of the Datadog instance that the organization belongs to. tag_config_source string The source of the usage attribution tag configuration and the selected tags in the format `::://////`. tags object Tag keys and values. A `null` value here means that the requested tag breakdown cannot be applied because it does not match the [tags configured for usage attribution](https://docs.datadoghq.com/account_management/billing/usage_attribution/#getting-started). In this scenario the API returns the total usage, not broken down by tags. [string] A list of values that are associated with each tag key. * An empty list means the resource use wasn't tagged with the respective tag. * Multiple values means the respective tag was applied multiple times on the resource. * An `` value means the resource was tagged with the respective tag but did not have a value. updated_at date-time Datetime of the most recent update to the usage values. values object Fields in Usage Summary by tag(s). The following values have been **deprecated** : `estimated_indexed_spans_usage`, `estimated_indexed_spans_percentage`, `estimated_ingested_spans_usage`, `estimated_ingested_spans_percentage`. api_percentage double The percentage of synthetic API test usage by tag(s). api_usage double The synthetic API test usage by tag(s). apm_fargate_percentage double The percentage of APM ECS Fargate task usage by tag(s). apm_fargate_usage double The APM ECS Fargate task usage by tag(s). apm_host_percentage double The percentage of APM host usage by tag(s). apm_host_usage double The APM host usage by tag(s). apm_usm_percentage double The percentage of APM and Universal Service Monitoring host usage by tag(s). apm_usm_usage double The APM and Universal Service Monitoring host usage by tag(s). appsec_fargate_percentage double The percentage of Application Security Monitoring ECS Fargate task usage by tag(s). appsec_fargate_usage double The Application Security Monitoring ECS Fargate task usage by tag(s). appsec_percentage double The percentage of Application Security Monitoring host usage by tag(s). appsec_usage double The Application Security Monitoring host usage by tag(s). asm_serverless_traced_invocations_percentage double The percentage of Application Security Monitoring Serverless traced invocations usage by tag(s). asm_serverless_traced_invocations_usage double The Application Security Monitoring Serverless traced invocations usage by tag(s). bits_ai_investigations_percentage double The percentage of Bits AI `SRE` investigation usage by tag(s). bits_ai_investigations_usage double The Bits AI `SRE` investigation usage by tag(s). browser_percentage double The percentage of synthetic browser test usage by tag(s). browser_usage double The synthetic browser test usage by tag(s). ci_pipeline_indexed_spans_percentage double The percentage of CI Pipeline Indexed Spans usage by tag(s). ci_pipeline_indexed_spans_usage double The total CI Pipeline Indexed Spans usage by tag(s). ci_test_indexed_spans_percentage double The percentage of CI Test Indexed Spans usage by tag(s). ci_test_indexed_spans_usage double The total CI Test Indexed Spans usage by tag(s). ci_visibility_itr_percentage double The percentage of Git committers for Intelligent Test Runner usage by tag(s). ci_visibility_itr_usage double The Git committers for Intelligent Test Runner usage by tag(s). cloud_siem_percentage double The percentage of Cloud Security Information and Event Management usage by tag(s). 
cloud_siem_usage double The Cloud Security Information and Event Management usage by tag(s). code_security_host_percentage double The percentage of Code Security host usage by tags. code_security_host_usage double The Code Security host usage by tags. container_excl_agent_percentage double The percentage of container usage without the Datadog Agent by tag(s). container_excl_agent_usage double The container usage without the Datadog Agent by tag(s). container_percentage double The percentage of container usage by tag(s). container_usage double The container usage by tag(s). cspm_containers_percentage double The percentage of Cloud Security Management Pro container usage by tag(s). cspm_containers_usage double The Cloud Security Management Pro container usage by tag(s). cspm_hosts_percentage double The percentage of Cloud Security Management Pro host usage by tag(s). cspm_hosts_usage double The Cloud Security Management Pro host usage by tag(s). custom_event_percentage double The percentage of Custom Events usage by tag(s). custom_event_usage double The total Custom Events usage by tag(s). custom_ingested_timeseries_percentage double The percentage of ingested custom metrics usage by tag(s). custom_ingested_timeseries_usage double The ingested custom metrics usage by tag(s). custom_timeseries_percentage double The percentage of indexed custom metrics usage by tag(s). custom_timeseries_usage double The indexed custom metrics usage by tag(s). cws_containers_percentage double The percentage of Cloud Workload Security container usage by tag(s). cws_containers_usage double The Cloud Workload Security container usage by tag(s). cws_fargate_task_percentage double The percentage of Cloud Workload Security Fargate task usage by tag(s). cws_fargate_task_usage double The Cloud Workload Security Fargate task usage by tag(s). cws_hosts_percentage double The percentage of Cloud Workload Security host usage by tag(s). cws_hosts_usage double The Cloud Workload Security host usage by tag(s). data_jobs_monitoring_usage double The Data Jobs Monitoring usage by tag(s). data_stream_monitoring_usage double The Data Stream Monitoring usage by tag(s). dbm_hosts_percentage double The percentage of Database Monitoring host usage by tag(s). dbm_hosts_usage double The Database Monitoring host usage by tag(s). dbm_queries_percentage double The percentage of Database Monitoring queries usage by tag(s). dbm_queries_usage double The Database Monitoring queries usage by tag(s). error_tracking_percentage double The percentage of error tracking events usage by tag(s). error_tracking_usage double The error tracking events usage by tag(s). estimated_indexed_spans_percentage double The percentage of estimated indexed spans usage by tag(s). estimated_indexed_spans_usage double The estimated indexed spans usage by tag(s). estimated_ingested_spans_percentage double The percentage of estimated ingested spans usage by tag(s). estimated_ingested_spans_usage double The estimated ingested spans usage by tag(s). fargate_percentage double The percentage of Fargate usage by tags. fargate_usage double The Fargate usage by tags. flex_stored_logs_percentage double The percentage of Flex Stored Logs usage by tags. flex_stored_logs_usage double The Flex Stored Logs usage by tags. functions_percentage double The percentage of Lambda function usage by tag(s). functions_usage double The Lambda function usage by tag(s). 
incident_management_monthly_active_users_percentage double The percentage of Incident Management monthly active users usage by tag(s). incident_management_monthly_active_users_usage double The Incident Management monthly active users usage by tag(s). indexed_spans_percentage double The percentage of APM Indexed Spans usage by tag(s). indexed_spans_usage double The total APM Indexed Spans usage by tag(s). infra_host_percentage double The percentage of infrastructure host usage by tag(s). infra_host_usage double The infrastructure host usage by tag(s). ingested_logs_bytes_percentage double The percentage of Ingested Logs usage by tag(s). ingested_logs_bytes_usage double The total Ingested Logs usage by tag(s). ingested_spans_bytes_percentage double The percentage of APM Ingested Spans usage by tag(s). ingested_spans_bytes_usage double The total APM Ingested Spans usage by tag(s). invocations_percentage double The percentage of Lambda invocation usage by tag(s). invocations_usage double The Lambda invocation usage by tag(s). lambda_traced_invocations_percentage double The percentage of Serverless APM usage by tag(s). lambda_traced_invocations_usage double The Serverless APM usage by tag(s). llm_observability_percentage double The percentage of LLM Observability usage by tag(s). llm_observability_usage double The LLM Observability usage by tag(s). llm_spans_percentage double The percentage of LLM Spans usage by tag(s). llm_spans_usage double The LLM Spans usage by tag(s). logs_indexed_15day_percentage double The percentage of Indexed Logs (15-day Retention) usage by tag(s). logs_indexed_15day_usage double The total Indexed Logs (15-day Retention) usage by tag(s). logs_indexed_180day_percentage double The percentage of Indexed Logs (180-day Retention) usage by tag(s). logs_indexed_180day_usage double The total Indexed Logs (180-day Retention) usage by tag(s). logs_indexed_1day_percentage double The percentage of Indexed Logs (1-day Retention) usage by tag(s). logs_indexed_1day_usage double The total Indexed Logs (1-day Retention) usage by tag(s). logs_indexed_30day_percentage double The percentage of Indexed Logs (30-day Retention) usage by tag(s). logs_indexed_30day_usage double The total Indexed Logs (30-day Retention) usage by tag(s). logs_indexed_360day_percentage double The percentage of Indexed Logs (360-day Retention) usage by tag(s). logs_indexed_360day_usage double The total Indexed Logs (360-day Retention) usage by tag(s). logs_indexed_3day_percentage double The percentage of Indexed Logs (3-day Retention) usage by tag(s). logs_indexed_3day_usage double The total Indexed Logs (3-day Retention) usage by tag(s). logs_indexed_45day_percentage double The percentage of Indexed Logs (45-day Retention) usage by tag(s). logs_indexed_45day_usage double The total Indexed Logs (45-day Retention) usage by tag(s). logs_indexed_60day_percentage double The percentage of Indexed Logs (60-day Retention) usage by tag(s). logs_indexed_60day_usage double The total Indexed Logs (60-day Retention) usage by tag(s). logs_indexed_7day_percentage double The percentage of Indexed Logs (7-day Retention) usage by tag(s). logs_indexed_7day_usage double The total Indexed Logs (7-day Retention) usage by tag(s). logs_indexed_90day_percentage double The percentage of Indexed Logs (90-day Retention) usage by tag(s). logs_indexed_90day_usage double The total Indexed Logs (90-day Retention) usage by tag(s). logs_indexed_custom_retention_percentage double The percentage of Indexed Logs (Custom Retention) usage by tag(s). 
logs_indexed_custom_retention_usage double The total Indexed Logs (Custom Retention) usage by tag(s). mobile_app_testing_percentage double The percentage of Synthetic mobile application test usage by tag(s). mobile_app_testing_usage double The Synthetic mobile application test usage by tag(s). ndm_netflow_percentage double The percentage of Network Device Monitoring NetFlow usage by tag(s). ndm_netflow_usage double The Network Device Monitoring NetFlow usage by tag(s). network_device_wireless_percentage double The percentage of network device wireless usage by tag(s). network_device_wireless_usage double The network device wireless usage by tag(s). npm_host_percentage double The percentage of network host usage by tag(s). npm_host_usage double The network host usage by tag(s). obs_pipeline_bytes_percentage double The percentage of observability pipeline bytes usage by tag(s). obs_pipeline_bytes_usage double The observability pipeline bytes usage by tag(s). obs_pipelines_vcpu_percentage double The percentage of observability pipeline per core usage by tag(s). obs_pipelines_vcpu_usage double The observability pipeline per core usage by tag(s). online_archive_percentage double The percentage of online archive usage by tag(s). online_archive_usage double The online archive usage by tag(s). product_analytics_session_percentage double The percentage of Product Analytics session usage by tag(s). product_analytics_session_usage double The Product Analytics session usage by tag(s). profiled_container_percentage double The percentage of profiled container usage by tag(s). profiled_container_usage double The profiled container usage by tag(s). profiled_fargate_percentage double The percentage of profiled Fargate task usage by tag(s). profiled_fargate_usage double The profiled Fargate task usage by tag(s). profiled_host_percentage double The percentage of profiled hosts usage by tag(s). profiled_host_usage double The profiled hosts usage by tag(s). published_app_percentage double The percentage of published application usage by tag(s). published_app_usage double The published application usage by tag(s). rum_browser_mobile_sessions_percentage double The percentage of RUM Browser and Mobile usage by tag(s). rum_browser_mobile_sessions_usage double The total RUM Browser and Mobile usage by tag(s). rum_ingested_percentage double The percentage of RUM Ingested usage by tag(s). rum_ingested_usage double The total RUM Ingested usage by tag(s). rum_investigate_percentage double The percentage of RUM Investigate usage by tag(s). rum_investigate_usage double The total RUM Investigate usage by tag(s). rum_replay_sessions_percentage double The percentage of RUM Session Replay usage by tag(s). rum_replay_sessions_usage double The total RUM Session Replay usage by tag(s). rum_session_replay_add_on_percentage double The percentage of RUM Session Replay Add-On usage by tag(s). rum_session_replay_add_on_usage double The total RUM Session Replay Add-On usage by tag(s). sca_fargate_percentage double The percentage of Software Composition Analysis Fargate task usage by tag(s). sca_fargate_usage double The total Software Composition Analysis Fargate task usage by tag(s). sds_scanned_bytes_percentage double The percentage of Sensitive Data Scanner usage by tag(s). sds_scanned_bytes_usage double The total Sensitive Data Scanner usage by tag(s). serverless_apps_percentage double The percentage of Serverless Apps usage by tag(s). serverless_apps_usage double The total Serverless Apps usage by tag(s). 
siem_analyzed_logs_add_on_percentage double The percentage of log events analyzed by Cloud SIEM usage by tag(s). siem_analyzed_logs_add_on_usage double The log events analyzed by Cloud SIEM usage by tag(s). siem_ingested_bytes_percentage double The percentage of SIEM usage by tag(s). siem_ingested_bytes_usage double The total SIEM usage by tag(s). snmp_percentage double The percentage of network device usage by tag(s). snmp_usage double The network device usage by tag(s). universal_service_monitoring_percentage double The percentage of universal service monitoring usage by tag(s). universal_service_monitoring_usage double The universal service monitoring usage by tag(s). vuln_management_hosts_percentage double The percentage of Application Vulnerability Management usage by tag(s). vuln_management_hosts_usage double The Application Vulnerability Management usage by tag(s). workflow_executions_percentage double The percentage of workflow executions usage by tag(s). workflow_executions_usage double The total workflow executions usage by tag(s). ``` { "metadata": { "aggregates": [ { "agg_type": "sum", "field": "custom_timeseries_usage", "value": "number" } ], "pagination": { "next_record_id": "string" } }, "usage": [ { "month": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "region": "string", "tag_config_source": "string", "tags": { "": [ "datadog-integrations-lab" ] }, "updated_at": "2019-09-19T10:00:00.000Z", "values": { "api_percentage": "number", "api_usage": "number", "apm_fargate_percentage": "number", "apm_fargate_usage": "number", "apm_host_percentage": "number", "apm_host_usage": "number", "apm_usm_percentage": "number", "apm_usm_usage": "number", "appsec_fargate_percentage": "number", "appsec_fargate_usage": "number", "appsec_percentage": "number", "appsec_usage": "number", "asm_serverless_traced_invocations_percentage": "number", "asm_serverless_traced_invocations_usage": "number", "bits_ai_investigations_percentage": "number", "bits_ai_investigations_usage": "number", "browser_percentage": "number", "browser_usage": "number", "ci_pipeline_indexed_spans_percentage": "number", "ci_pipeline_indexed_spans_usage": "number", "ci_test_indexed_spans_percentage": "number", "ci_test_indexed_spans_usage": "number", "ci_visibility_itr_percentage": "number", "ci_visibility_itr_usage": "number", "cloud_siem_percentage": "number", "cloud_siem_usage": "number", "code_security_host_percentage": "number", "code_security_host_usage": "number", "container_excl_agent_percentage": "number", "container_excl_agent_usage": "number", "container_percentage": "number", "container_usage": "number", "cspm_containers_percentage": "number", "cspm_containers_usage": "number", "cspm_hosts_percentage": "number", "cspm_hosts_usage": "number", "custom_event_percentage": "number", "custom_event_usage": "number", "custom_ingested_timeseries_percentage": "number", "custom_ingested_timeseries_usage": "number", "custom_timeseries_percentage": "number", "custom_timeseries_usage": "number", "cws_containers_percentage": "number", "cws_containers_usage": "number", "cws_fargate_task_percentage": "number", "cws_fargate_task_usage": "number", "cws_hosts_percentage": "number", "cws_hosts_usage": "number", "data_jobs_monitoring_usage": "number", "data_stream_monitoring_usage": "number", "dbm_hosts_percentage": "number", "dbm_hosts_usage": "number", "dbm_queries_percentage": "number", "dbm_queries_usage": "number", "error_tracking_percentage": "number", "error_tracking_usage": "number", 
"estimated_indexed_spans_percentage": "number", "estimated_indexed_spans_usage": "number", "estimated_ingested_spans_percentage": "number", "estimated_ingested_spans_usage": "number", "fargate_percentage": "number", "fargate_usage": "number", "flex_stored_logs_percentage": "number", "flex_stored_logs_usage": "number", "functions_percentage": "number", "functions_usage": "number", "incident_management_monthly_active_users_percentage": "number", "incident_management_monthly_active_users_usage": "number", "indexed_spans_percentage": "number", "indexed_spans_usage": "number", "infra_host_percentage": "number", "infra_host_usage": "number", "ingested_logs_bytes_percentage": "number", "ingested_logs_bytes_usage": "number", "ingested_spans_bytes_percentage": "number", "ingested_spans_bytes_usage": "number", "invocations_percentage": "number", "invocations_usage": "number", "lambda_traced_invocations_percentage": "number", "lambda_traced_invocations_usage": "number", "llm_observability_percentage": "number", "llm_observability_usage": "number", "llm_spans_percentage": "number", "llm_spans_usage": "number", "logs_indexed_15day_percentage": "number", "logs_indexed_15day_usage": "number", "logs_indexed_180day_percentage": "number", "logs_indexed_180day_usage": "number", "logs_indexed_1day_percentage": "number", "logs_indexed_1day_usage": "number", "logs_indexed_30day_percentage": "number", "logs_indexed_30day_usage": "number", "logs_indexed_360day_percentage": "number", "logs_indexed_360day_usage": "number", "logs_indexed_3day_percentage": "number", "logs_indexed_3day_usage": "number", "logs_indexed_45day_percentage": "number", "logs_indexed_45day_usage": "number", "logs_indexed_60day_percentage": "number", "logs_indexed_60day_usage": "number", "logs_indexed_7day_percentage": "number", "logs_indexed_7day_usage": "number", "logs_indexed_90day_percentage": "number", "logs_indexed_90day_usage": "number", "logs_indexed_custom_retention_percentage": "number", "logs_indexed_custom_retention_usage": "number", "mobile_app_testing_percentage": "number", "mobile_app_testing_usage": "number", "ndm_netflow_percentage": "number", "ndm_netflow_usage": "number", "network_device_wireless_percentage": "number", "network_device_wireless_usage": "number", "npm_host_percentage": "number", "npm_host_usage": "number", "obs_pipeline_bytes_percentage": "number", "obs_pipeline_bytes_usage": "number", "obs_pipelines_vcpu_percentage": "number", "obs_pipelines_vcpu_usage": "number", "online_archive_percentage": "number", "online_archive_usage": "number", "product_analytics_session_percentage": "number", "product_analytics_session_usage": "number", "profiled_container_percentage": "number", "profiled_container_usage": "number", "profiled_fargate_percentage": "number", "profiled_fargate_usage": "number", "profiled_host_percentage": "number", "profiled_host_usage": "number", "published_app_percentage": "number", "published_app_usage": "number", "rum_browser_mobile_sessions_percentage": "number", "rum_browser_mobile_sessions_usage": "number", "rum_ingested_percentage": "number", "rum_ingested_usage": "number", "rum_investigate_percentage": "number", "rum_investigate_usage": "number", "rum_replay_sessions_percentage": "number", "rum_replay_sessions_usage": "number", "rum_session_replay_add_on_percentage": "number", "rum_session_replay_add_on_usage": "number", "sca_fargate_percentage": "number", "sca_fargate_usage": "number", "sds_scanned_bytes_percentage": "number", "sds_scanned_bytes_usage": "number", "serverless_apps_percentage": 
"number", "serverless_apps_usage": "number", "siem_analyzed_logs_add_on_percentage": "number", "siem_analyzed_logs_add_on_usage": "number", "siem_ingested_bytes_percentage": "number", "siem_ingested_bytes_usage": "number", "snmp_percentage": "number", "snmp_usage": "number", "universal_service_monitoring_percentage": "number", "universal_service_monitoring_usage": "number", "vuln_management_hosts_percentage": "number", "vuln_management_hosts_usage": "number", "workflow_executions_percentage": "number", "workflow_executions_usage": "number" } } ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get monthly usage attribution Copy ``` # Required query arguments export start_month="CHANGE_ME" export fields="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/monthly-attribution?start_month=${start_month}&fields=${fields}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get monthly usage attribution ``` """ Get monthly usage attribution returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datadog_api_client.v1.model.monthly_usage_attribution_supported_metrics import ( MonthlyUsageAttributionSupportedMetrics, ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_monthly_usage_attribution( start_month=(datetime.now() + relativedelta(days=-3)), fields=MonthlyUsageAttributionSupportedMetrics.INFRA_HOST_USAGE, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get monthly usage attribution ``` # Get monthly usage attribution returns "OK" 
response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_monthly_usage_attribution((Time.now + -3 * 86400), MonthlyUsageAttributionSupportedMetrics::INFRA_HOST_USAGE) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get monthly usage attribution ``` // Get monthly usage attribution returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetMonthlyUsageAttribution(ctx, time.Now().AddDate(0, 0, -3), datadogV1.MONTHLYUSAGEATTRIBUTIONSUPPORTEDMETRICS_INFRA_HOST_USAGE, *datadogV1.NewGetMonthlyUsageAttributionOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetMonthlyUsageAttribution`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetMonthlyUsageAttribution`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get monthly usage attribution ``` // Get monthly usage attribution returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.MonthlyUsageAttributionResponse; import com.datadog.api.client.v1.model.MonthlyUsageAttributionSupportedMetrics; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { MonthlyUsageAttributionResponse result = apiInstance.getMonthlyUsageAttribution( OffsetDateTime.now().plusDays(-3), MonthlyUsageAttributionSupportedMetrics.INFRA_HOST_USAGE); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getMonthlyUsageAttribution"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get monthly usage attribution ``` 
// Get monthly usage attribution returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetMonthlyUsageAttributionOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; use datadog_api_client::datadogV1::model::MonthlyUsageAttributionSupportedMetrics; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_monthly_usage_attribution( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), MonthlyUsageAttributionSupportedMetrics::INFRA_HOST_USAGE, GetMonthlyUsageAttributionOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get monthly usage attribution ``` /** * Get monthly usage attribution returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetMonthlyUsageAttributionRequest = { startMonth: new Date(new Date().getTime() + -3 * 86400 * 1000), fields: "infra_host_usage", }; apiInstance .getMonthlyUsageAttribution(params) .then((data: v1.MonthlyUsageAttributionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get active billing dimensions for cost attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-active-billing-dimensions-for-cost-attribution) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-active-billing-dimensions-for-cost-attribution-v2) GET https://api.ap1.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensionshttps://api.ap2.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensionshttps://api.datadoghq.eu/api/v2/cost_by_tag/active_billing_dimensionshttps://api.ddog-gov.com/api/v2/cost_by_tag/active_billing_dimensionshttps://api.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensionshttps://api.us3.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensionshttps://api.us5.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensions ### Overview Get active billing dimensions for cost attribution. Cost data for a given month becomes available no later than the 19th of the following month. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. 
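The Python example later in this section prints the raw response; as a complement, the short sketch below pulls out the `month` and `values` fields described in the 200 response model that follows. The `to_dict()` navigation is an assumption about the client models, not part of the official example.

```
"""
List the active billing dimensions available for cost attribution.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_active_billing_dimensions()

    # `data.attributes.month` and `data.attributes.values` mirror the 200 response model below.
    attributes = (response.to_dict().get("data") or {}).get("attributes") or {}
    print("Month:", attributes.get("month"))
    for dimension in attributes.get("values") or []:
        print("Active billing dimension:", dimension)
```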
### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetActiveBillingDimensions-200-v2)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetActiveBillingDimensions-400-v2)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetActiveBillingDimensions-403-v2)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetActiveBillingDimensions-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Active billing dimensions response. Field Type Description data object Active billing dimensions data. attributes object List of active billing dimensions. month date-time Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]`. values [string] List of active billing dimensions. Example: `[infra_host, apm_host, serverless_infra]`. id string Unique ID of the response. type enum Type of active billing dimensions data. Allowed enum values: `billing_dimensions` default: `billing_dimensions`

```
{
  "data": {
    "attributes": {
      "month": "2019-09-19T10:00:00.000Z",
      "values": [
        "infra_host"
      ]
    },
    "id": "string",
    "type": "string"
  }
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden - User is not authorized

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```
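The `values` array in the 200 response above lists the billing dimensions currently active for your account; these dimension names are typically what you pass to the cost attribution endpoints as their `fields` parameter. Below is a hedged sketch built on the Python example that follows: the client calls (`Configuration`, `ApiClient`, `UsageMeteringApi`, `get_active_billing_dimensions`) are taken from that example, while the `data.attributes.values` attribute path is an assumption that mirrors the JSON model above and may differ in your client version.

```
"""
Sketch: list the active billing dimensions and build a comma-separated
string suitable for a cost attribution `fields` parameter.
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_active_billing_dimensions()

    # Assumed attribute path, mirroring the JSON model above:
    # data.attributes.values holds names such as ["infra_host", "apm_host"].
    dimensions = response.data.attributes.values

    # A comma-separated list is the usual shape of a `fields` query parameter
    # for the cost attribution endpoints.
    print(",".join(dimensions))
```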
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript)

##### Get active billing dimensions for cost attribution

```
# Curl command
curl -X GET "https://api.datadoghq.com/api/v2/cost_by_tag/active_billing_dimensions" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

Replace `api.datadoghq.com` with the API host for your Datadog site (for example `api.us3.datadoghq.com`, `api.us5.datadoghq.com`, `api.datadoghq.eu`, `api.ap1.datadoghq.com`, `api.ap2.datadoghq.com`, or `api.ddog-gov.com`).

##### Get active billing dimensions for cost attribution

```
"""
Get active billing dimensions for cost attribution returns "OK" response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_active_billing_dimensions()

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get active billing dimensions for cost attribution

```
# Get active billing dimensions for cost attribution returns "OK" response
require "datadog_api_client"

api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new
p api_instance.get_active_billing_dimensions()
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get active billing dimensions for cost attribution

```
// Get active billing dimensions for cost attribution returns "OK" response
package main

import (
    "context"
    "encoding/json"
    "fmt"
    "os"

    "github.com/DataDog/datadog-api-client-go/v2/api/datadog"
    "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
    ctx := datadog.NewDefaultContext(context.Background())
    configuration := datadog.NewConfiguration()
    apiClient := datadog.NewAPIClient(configuration)
    api := datadogV2.NewUsageMeteringApi(apiClient)
    resp, r, err := api.GetActiveBillingDimensions(ctx)

    if err != nil {
        fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetActiveBillingDimensions`: %v\n", err)
        fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
    }

    responseContent, _ := json.MarshalIndent(resp, "", " ")
    fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetActiveBillingDimensions`:\n%s\n", responseContent)
}
```
#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get active billing dimensions for cost attribution

```
// Get active billing dimensions for cost attribution returns "OK" response
import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.UsageMeteringApi;
import com.datadog.api.client.v2.model.ActiveBillingDimensionsResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient);

    try {
      ActiveBillingDimensionsResponse result = apiInstance.getActiveBillingDimensions();
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling UsageMeteringApi#getActiveBillingDimensions");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get active billing dimensions for cost attribution

```
// Get active billing dimensions for cost attribution returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = UsageMeteringAPI::with_config(configuration);
    let resp = api.get_active_billing_dimensions().await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Get active billing dimensions for cost attribution

```
/**
 * Get active billing dimensions for cost attribution returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.UsageMeteringApi(configuration);

apiInstance
  .getActiveBillingDimensions()
  .then((data: v2.ActiveBillingDimensionsResponse) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Get billable usage across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-billable-usage-across-your-account)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-billable-usage-across-your-account-v1)

GET https://api.ap1.datadoghq.com/api/v1/usage/billable-summary
https://api.ap2.datadoghq.com/api/v1/usage/billable-summary
https://api.datadoghq.eu/api/v1/usage/billable-summary
https://api.ddog-gov.com/api/v1/usage/billable-summary
https://api.datadoghq.com/api/v1/usage/billable-summary
https://api.us3.datadoghq.com/api/v1/usage/billable-summary
https://api.us5.datadoghq.com/api/v1/usage/billable-summary

### Overview

Get billable usage across your account. This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint.

### Arguments

#### Query Strings

Name Type Description month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for usage starting this month. include_connected_accounts boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to `false`.

### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageBillableSummary-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageBillableSummary-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageBillableSummary-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageBillableSummary-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Response with monthly summary of data billed by Datadog. Field Type Description usage [object] An array of objects regarding usage of billable summary. account_name string The account name. account_public_id string The account public ID. billing_plan string The billing plan. end_date date-time Shows the last date of usage. num_orgs int64 The number of organizations. org_name string The organization name. public_id string The organization public ID. ratio_in_month double Shows usage aggregation for a billing period. region string The region of the organization. start_date date-time Shows the first date of usage. usage object Response with aggregated usage types. apm_fargate_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org.
org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_fargate_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_profiler_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_profiler_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. 
percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. apm_trace_search_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. application_security_fargate_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. application_security_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. application_security_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_pipeline_indexed_spans_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. 
percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_pipeline_maximum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_pipeline_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_test_indexed_spans_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_testing_maximum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ci_testing_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. 
percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cloud_cost_management_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cloud_cost_management_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cspm_container_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cspm_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cspm_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. 
percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. custom_event_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cws_container_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cws_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. cws_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. dbm_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. 
usage_unit string Units pertaining to the usage. dbm_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. dbm_normalized_queries_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. dbm_normalized_queries_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. fargate_container_apm_and_profiler_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. fargate_container_apm_and_profiler_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. 
usage_unit string Units pertaining to the usage. fargate_container_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. fargate_container_profiler_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. fargate_container_profiler_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. fargate_container_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. incident_management_maximum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. 
usage_unit string Units pertaining to the usage. incident_management_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. infra_and_apm_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. infra_and_apm_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. infra_container_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. infra_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. 
infra_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ingested_spans_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ingested_timeseries_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ingested_timeseries_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. iot_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. iot_top99p object Response with properties for each aggregated usage type. 
account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. lambda_function_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. lambda_function_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_forwarding_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_15day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_180day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. 
account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_1day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_30day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_360day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_3day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_45day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. 
account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_60day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_7day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_90day_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_custom_retention_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_indexed_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. 
account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. logs_ingested_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. network_device_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. network_device_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. npm_flow_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. npm_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. 
account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. npm_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. observability_pipeline_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. online_archive_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. prof_container_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. prof_host_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. 
elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. prof_host_top99p object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. rum_lite_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. rum_replay_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. rum_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. rum_units_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. 
first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. sensitive_data_scanner_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. serverless_apm_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. serverless_infra_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. serverless_infra_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. serverless_invocation_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. 
first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. siem_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. standard_timeseries_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. synthetics_api_tests_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. synthetics_app_testing_maximum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. synthetics_browser_checks_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. 
first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. timeseries_average object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. timeseries_sum object Response with properties for each aggregated usage type. account_billable_usage int64 The total account usage. account_committed_usage int64 The total account committed usage. account_on_demand_usage int64 The total account on-demand usage. elapsed_usage_hours int64 Elapsed usage hours for some billable product. first_billable_usage_hour date-time The first billable hour for the org. last_billable_usage_hour date-time The last billable hour for the org. org_billable_usage int64 The number of units used within the billable timeframe. percentage_in_account double The percentage of account usage the org represents. usage_unit string Units pertaining to the usage. ``` { "usage": [ { "account_name": "string", "account_public_id": "string", "billing_plan": "string", "end_date": "2019-09-19T10:00:00.000Z", "num_orgs": "integer", "org_name": "string", "public_id": "string", "ratio_in_month": "number", "region": "string", "start_date": "2019-09-19T10:00:00.000Z", "usage": { "apm_fargate_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "apm_fargate_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "apm_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "apm_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, 
"apm_profiler_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "apm_profiler_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "apm_trace_search_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "application_security_fargate_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "application_security_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "application_security_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_pipeline_indexed_spans_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_pipeline_maximum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_pipeline_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_test_indexed_spans_sum": { "account_billable_usage": "integer", 
"account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_testing_maximum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ci_testing_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cloud_cost_management_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cloud_cost_management_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cspm_container_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cspm_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cspm_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "custom_event_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cws_container_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": 
"2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cws_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "cws_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "dbm_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "dbm_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "dbm_normalized_queries_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "dbm_normalized_queries_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_apm_and_profiler_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_apm_and_profiler_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", 
"org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_profiler_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_profiler_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "fargate_container_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "incident_management_maximum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "incident_management_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "infra_and_apm_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "infra_and_apm_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "infra_container_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "infra_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, 
"infra_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ingested_spans_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ingested_timeseries_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "ingested_timeseries_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "iot_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "iot_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "lambda_function_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "lambda_function_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_forwarding_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_15day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", 
"elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_180day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_1day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_30day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_360day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_3day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_45day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_60day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_7day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_90day_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", 
"last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_custom_retention_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_indexed_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "logs_ingested_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "network_device_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "network_device_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "npm_flow_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "npm_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "npm_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "observability_pipeline_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": 
"string" }, "online_archive_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "prof_container_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "prof_host_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "prof_host_top99p": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "rum_lite_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "rum_replay_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "rum_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "rum_units_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "sensitive_data_scanner_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "serverless_apm_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": 
"integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "serverless_infra_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "serverless_infra_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "serverless_invocation_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "siem_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "standard_timeseries_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "synthetics_api_tests_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "synthetics_app_testing_maximum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "synthetics_browser_checks_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "timeseries_average": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": 
"2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" }, "timeseries_sum": { "account_billable_usage": "integer", "account_committed_usage": "integer", "account_on_demand_usage": "integer", "elapsed_usage_hours": "integer", "first_billable_usage_hour": "2019-09-19T10:00:00.000Z", "last_billable_usage_hour": "2019-09-19T10:00:00.000Z", "org_billable_usage": "integer", "percentage_in_account": "number", "usage_unit": "string" } } } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get billable usage across your account Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/billable-summary" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get billable usage across your account ``` """ Get billable usage across your account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_billable_summary() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get billable usage across your account ``` # Get billable usage across your account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_billable_summary() ``` Copy #### 
Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get billable usage across your account ``` // Get billable usage across your account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageBillableSummary(ctx, *datadogV1.NewGetUsageBillableSummaryOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageBillableSummary`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageBillableSummary`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get billable usage across your account ``` // Get billable usage across your account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageBillableSummaryResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageBillableSummaryResponse result = apiInstance.getUsageBillableSummary(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageBillableSummary"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get billable usage across your account ``` // Get billable usage across your account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageBillableSummaryOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_billable_summary(GetUsageBillableSummaryOptionalParams::default()) .await; if let Ok(value) = 
resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get billable usage across your account ``` /** * Get billable usage across your account returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); apiInstance .getUsageBillableSummary() .then((data: v1.UsageBillableSummaryResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get historical cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-historical-cost-across-your-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-historical-cost-across-your-account-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/historical_costhttps://api.ap2.datadoghq.com/api/v2/usage/historical_costhttps://api.datadoghq.eu/api/v2/usage/historical_costhttps://api.ddog-gov.com/api/v2/usage/historical_costhttps://api.datadoghq.com/api/v2/usage/historical_costhttps://api.us3.datadoghq.com/api/v2/usage/historical_costhttps://api.us5.datadoghq.com/api/v2/usage/historical_cost ### Overview Get historical cost across multi-org and single root-org accounts. Cost data for a given month becomes available no later than the 16th of the following month. This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires all of the following permissions: * `usage_read` * `billing_read` OAuth apps require the `usage_read, billing_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_month [_required_] string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost beginning this month. view string String to specify whether cost is broken down at a parent-org level or at the sub-org level. Available views are `summary` and `sub-org`. Defaults to `summary`. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost ending this month. include_connected_accounts boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to `false`. 
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetHistoricalCostByOrg-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetHistoricalCostByOrg-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetHistoricalCostByOrg-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetHistoricalCostByOrg-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Chargeback Summary response. Field Type Description data [object] Response containing Chargeback Summary. attributes object Cost attributes data. account_name string The account name. account_public_id string The account public ID. charges [object] List of charges data reported for the requested month. charge_type string The type of charge for a particular product. cost double The cost for a particular product and charge type during a given month. product_name string The product for which cost is being reported. date date-time The month requested. org_name string The organization name. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. total_cost double The total cost of products for the month. id string Unique ID of the response. type enum Type of cost data. Allowed enum values: `cost_by_org` default: `cost_by_org` ``` { "data": [ { "attributes": { "account_name": "string", "account_public_id": "string", "charges": [ { "charge_type": "on_demand", "cost": "number", "product_name": "infra_host" } ], "date": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "region": "string", "total_cost": "number" }, "id": "string", "type": "cost_by_org" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get historical cost across your account Copy ``` # Required query arguments export start_month="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/historical_cost?start_month=${start_month}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get historical cost across your account ``` """ Get historical cost across your account returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_historical_cost_by_org( start_month=(datetime.now() + relativedelta(months=-2)), view="sub-org", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get historical cost across your account ``` # Get historical cost across your account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new opts = { view: "sub-org", } p api_instance.get_historical_cost_by_org((Time.now + -2 * 86400 * 30), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get historical cost across your account ``` // Get historical cost across your account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetHistoricalCostByOrg(ctx, time.Now().AddDate(0, -2, 0), *datadogV2.NewGetHistoricalCostByOrgOptionalParameters().WithView("sub-org")) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetHistoricalCostByOrg`: %v\n", 
err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetHistoricalCostByOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get historical cost across your account ``` // Get historical cost across your account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.api.UsageMeteringApi.GetHistoricalCostByOrgOptionalParameters; import com.datadog.api.client.v2.model.CostByOrgResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { CostByOrgResponse result = apiInstance.getHistoricalCostByOrg( OffsetDateTime.now().plusMonths(-2), new GetHistoricalCostByOrgOptionalParameters().view("sub-org")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getHistoricalCostByOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get historical cost across your account ``` // Get historical cost across your account returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetHistoricalCostByOrgOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_historical_cost_by_org( DateTime::parse_from_rfc3339("2021-09-11T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetHistoricalCostByOrgOptionalParams::default().view("sub-org".to_string()), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get historical cost across your account ``` /** * Get historical cost across your account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance 
= new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetHistoricalCostByOrgRequest = { startMonth: new Date(new Date().getTime() + -2 * 86400 * 30 * 1000), view: "sub-org", }; apiInstance .getHistoricalCostByOrg(params) .then((data: v2.CostByOrgResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get Monthly Cost Attribution](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-cost-attribution) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-monthly-cost-attribution-v2) GET https://api.ap1.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.ap2.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.datadoghq.eu/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.ddog-gov.com/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.us3.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attributionhttps://api.us5.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attribution ### Overview Get monthly cost attribution by tag across multi-org and single root-org accounts. Cost Attribution data for a given month becomes available no later than the 19th of the following month. This API endpoint is paginated. To make sure you receive all records, check if the value of `next_record_id` is set in the response. If it is, make another request and pass `next_record_id` as a parameter. Pseudo code example: ``` response := GetMonthlyCostAttribution(start_month, end_month) cursor := response.metadata.pagination.next_record_id WHILE cursor != null BEGIN sleep(5 seconds) # Avoid running into rate limit response := GetMonthlyCostAttribution(start_month, end_month, next_record_id=cursor) cursor := response.metadata.pagination.next_record_id END ``` This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint is not available in the Government (US1-FED) site. This endpoint requires all of the following permissions: * `usage_read` * `billing_read` OAuth apps require the `usage_read, billing_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_month [_required_] string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost beginning in this month. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost ending this month. fields [_required_] string Comma-separated list specifying cost types (e.g., `_on_demand_cost`, `_committed_cost`, `_total_cost`) and the proportions (`_percentage_in_org`, `_percentage_in_account`). Use `*` to retrieve all fields. 
Example: `infra_host_on_demand_cost,infra_host_percentage_in_account` To obtain the complete list of active billing dimensions that can be used to replace `` in the field names, make a request to the [Get active billing dimensions API](https://docs.datadoghq.com/api/latest/usage-metering/#get-active-billing-dimensions-for-cost-attribution). sort_direction enum The direction to sort by: `[desc, asc]`. Allowed enum values: `desc, asc` sort_name string The billing dimension to sort by. Always sorted by total cost. Example: `infra_host`. tag_breakdown_keys string Comma separated list of tag keys used to group cost. If no value is provided the cost will not be broken down by tags. To see which tags are available, look for the value of `tag_config_source` in the API response. next_record_id string List following results with a next_record_id provided in the previous query. include_descendants boolean Include child org cost in the response. Defaults to `true`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCostAttribution-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCostAttribution-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCostAttribution-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCostAttribution-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the monthly cost attribution by tag(s). Field Type Description data [object] Response containing cost attribution. attributes object Cost Attribution by Tag for a given organization. month date-time Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]`. org_name string The name of the organization. public_id string The organization public ID. tag_config_source string The source of the cost attribution tag configuration and the selected tags in the format `::://////`. tags object Tag keys and values. A `null` value here means that the requested tag breakdown cannot be applied because it does not match the [tags configured for usage attribution](https://docs.datadoghq.com/account_management/billing/usage_attribution/#getting-started). In this scenario the API returns the total cost, not broken down by tags. [string] A list of values that are associated with each tag key. * An empty list means the resource use wasn't tagged with the respective tag. * Multiple values means the respective tag was applied multiple times on the resource. * An `` value means the resource was tagged with the respective tag but did not have a value. updated_at string Shows the most recent hour in the current months for all organizations for which all costs were calculated. values object Fields in Cost Attribution by tag(s). Example: `infra_host_on_demand_cost`, `infra_host_committed_cost`, `infra_host_total_cost`, `infra_host_percentage_in_org`, `infra_host_percentage_in_account`. id string Unique ID of the response. type enum Type of cost attribution data. Allowed enum values: `cost_by_tag` default: `cost_by_tag` meta object The object containing document metadata. aggregates [object] An array of available aggregates. agg_type string The aggregate type. field string The field. value double The value for a given field. pagination object The metadata for the current pagination. next_record_id string The cursor to use to get the next results, if any. 
To make the next request, use the same parameters with the addition of the `next_record_id`. ``` { "data": [ { "attributes": { "month": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "tag_config_source": "string", "tags": { "": [ "datadog-integrations-lab" ] }, "updated_at": "string", "values": {} }, "id": "string", "type": "cost_by_tag" } ], "meta": { "aggregates": [ { "agg_type": "sum", "field": "infra_host_committed_cost", "value": "number" } ], "pagination": { "next_record_id": "string" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get Monthly Cost Attribution Copy ``` # Required query arguments export start_month="CHANGE_ME" export fields="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/cost_by_tag/monthly_cost_attribution?start_month=${start_month}&fields=${fields}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get Monthly Cost Attribution ``` """ Get Monthly Cost Attribution returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_monthly_cost_attribution( start_month=(datetime.now() + relativedelta(days=-5)), end_month=(datetime.now() + relativedelta(days=-3)), fields="infra_host_total_cost", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" 
DD_APP_KEY="" python3 "example.py" ``` ##### Get Monthly Cost Attribution ``` # Get Monthly Cost Attribution returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new opts = { end_month: (Time.now + -3 * 86400), } p api_instance.get_monthly_cost_attribution((Time.now + -5 * 86400), "infra_host_total_cost", opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get Monthly Cost Attribution ``` // Get Monthly Cost Attribution returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetMonthlyCostAttribution(ctx, time.Now().AddDate(0, 0, -5), "infra_host_total_cost", *datadogV2.NewGetMonthlyCostAttributionOptionalParameters().WithEndMonth(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetMonthlyCostAttribution`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetMonthlyCostAttribution`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get Monthly Cost Attribution ``` // Get Monthly Cost Attribution returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.api.UsageMeteringApi.GetMonthlyCostAttributionOptionalParameters; import com.datadog.api.client.v2.model.MonthlyCostAttributionResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { MonthlyCostAttributionResponse result = apiInstance.getMonthlyCostAttribution( OffsetDateTime.now().plusDays(-5), "infra_host_total_cost", new GetMonthlyCostAttributionOptionalParameters() .endMonth(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getMonthlyCostAttribution"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get Monthly Cost Attribution ``` // Get Monthly Cost Attribution returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetMonthlyCostAttributionOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_monthly_cost_attribution( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), "infra_host_total_cost".to_string(), GetMonthlyCostAttributionOptionalParams::default().end_month( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get Monthly Cost Attribution ``` /** * Get Monthly Cost Attribution returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetMonthlyCostAttributionRequest = { startMonth: new Date(new Date().getTime() + -5 * 86400 * 1000), endMonth: new Date(new Date().getTime() + -3 * 86400 * 1000), fields: "infra_host_total_cost", }; apiInstance .getMonthlyCostAttribution(params) .then((data: v2.MonthlyCostAttributionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get estimated cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-estimated-cost-across-your-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-estimated-cost-across-your-account-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/estimated_costhttps://api.ap2.datadoghq.com/api/v2/usage/estimated_costhttps://api.datadoghq.eu/api/v2/usage/estimated_costhttps://api.ddog-gov.com/api/v2/usage/estimated_costhttps://api.datadoghq.com/api/v2/usage/estimated_costhttps://api.us3.datadoghq.com/api/v2/usage/estimated_costhttps://api.us5.datadoghq.com/api/v2/usage/estimated_cost ### Overview Get estimated cost across multi-org and single root-org accounts. Estimated cost data is only available for the current month and previous month and is delayed by up to 72 hours from when it was incurred. To access historical costs prior to this, use the `/historical_cost` endpoint. 
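Because only the current and previous month are covered, one option is to anchor `start_month` at the first day of the current month. The sketch below is a hedged illustration with the Python client (the official examples further down call the endpoint with no arguments); it assumes `datadog-api-client` is installed, `DD_API_KEY`/`DD_APP_KEY` are exported, and that `start_month` and `view` are exposed as keyword arguments mirroring the query strings documented under Arguments.

```
"""
Sketch: estimated cost for the current month, broken down by sub-org.
Assumes datadog-api-client is installed and DD_API_KEY / DD_APP_KEY are exported.
"""
from datetime import datetime, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

# Estimated cost only covers the current and previous month, so anchor the
# request at the first day of the current month (keyword names assumed).
now = datetime.now(timezone.utc)
current_month = datetime(now.year, now.month, 1, tzinfo=timezone.utc)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_estimated_cost_by_org(
        start_month=current_month,
        view="sub-org",
    )
    print(response)
```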
This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires all of the following permissions: * `usage_read` * `billing_read` OAuth apps require the `usage_read, billing_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description view string String to specify whether cost is broken down at a parent-org level or at the sub-org level. Available views are `summary` and `sub-org`. Defaults to `summary`. start_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost beginning this month. **Either start_month or start_date should be specified, but not both.** (start_month cannot go beyond two months in the past). Provide an `end_month` to view month-over-month cost. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost ending this month. start_date string Datetime in ISO-8601 format, UTC, precise to day: `[YYYY-MM-DD]` for cost beginning this day. **Either start_month or start_date should be specified, but not both.** (start_date cannot go beyond two months in the past). Provide an `end_date` to view day-over-day cumulative cost. end_date string Datetime in ISO-8601 format, UTC, precise to day: `[YYYY-MM-DD]` for cost ending this day. include_connected_accounts boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to `false`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetEstimatedCostByOrg-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetEstimatedCostByOrg-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetEstimatedCostByOrg-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetEstimatedCostByOrg-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Chargeback Summary response. Field Type Description data [object] Response containing Chargeback Summary. attributes object Cost attributes data. account_name string The account name. account_public_id string The account public ID. charges [object] List of charges data reported for the requested month. charge_type string The type of charge for a particular product. cost double The cost for a particular product and charge type during a given month. product_name string The product for which cost is being reported. date date-time The month requested. org_name string The organization name. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. total_cost double The total cost of products for the month. id string Unique ID of the response. type enum Type of cost data. 
Allowed enum values: `cost_by_org` default: `cost_by_org` ``` { "data": [ { "attributes": { "account_name": "string", "account_public_id": "string", "charges": [ { "charge_type": "on_demand", "cost": "number", "product_name": "infra_host" } ], "date": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "region": "string", "total_cost": "number" }, "id": "string", "type": "cost_by_org" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get estimated cost across your account Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/estimated_cost" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get estimated cost across your account ``` """ Get estimated cost across your account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_estimated_cost_by_org() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get estimated cost across your account ``` # Get estimated cost across your account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new p api_instance.get_estimated_cost_by_org() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and 
run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get estimated cost across your account ``` // Get estimated cost across your account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetEstimatedCostByOrg(ctx, *datadogV2.NewGetEstimatedCostByOrgOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetEstimatedCostByOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetEstimatedCostByOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get estimated cost across your account ``` // Get estimated cost across your account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.model.CostByOrgResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { CostByOrgResponse result = apiInstance.getEstimatedCostByOrg(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getEstimatedCostByOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get estimated cost across your account ``` // Get estimated cost across your account returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetEstimatedCostByOrgOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_estimated_cost_by_org(GetEstimatedCostByOrgOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site (`datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`):

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get estimated cost across your account

```
/**
 * Get estimated cost across your account returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.UsageMeteringApi(configuration);

apiInstance
  .getEstimatedCostByOrg()
  .then((data: v2.CostByOrgResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` as above:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Get all custom metrics by hourly average](https://docs.datadoghq.com/api/latest/usage-metering/#get-all-custom-metrics-by-hourly-average)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-all-custom-metrics-by-hourly-average-v1)

GET
https://api.ap1.datadoghq.com/api/v1/usage/top_avg_metrics
https://api.ap2.datadoghq.com/api/v1/usage/top_avg_metrics
https://api.datadoghq.eu/api/v1/usage/top_avg_metrics
https://api.ddog-gov.com/api/v1/usage/top_avg_metrics
https://api.datadoghq.com/api/v1/usage/top_avg_metrics
https://api.us3.datadoghq.com/api/v1/usage/top_avg_metrics
https://api.us5.datadoghq.com/api/v1/usage/top_avg_metrics

### Overview

Get all [custom metrics](https://docs.datadoghq.com/developers/metrics/custom_metrics/) by hourly average. Use the month parameter to get a month-to-date data resolution or use the day parameter to get a daily resolution. One of the two is required, and only one of the two is allowed.

This endpoint requires the `usage_read` permission.

OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint.

### Arguments

#### Query Strings

* `month` (string): Datetime in ISO-8601 format, UTC, precise to month: [YYYY-MM] for usage beginning in this month. (Either month or day should be specified, but not both.)
* `day` (string): Datetime in ISO-8601 format, UTC, precise to day: [YYYY-MM-DD] for usage beginning on this day. (Either month or day should be specified, but not both.)
* `names` (array): Comma-separated list of metric names.
* `limit` (integer): Maximum number of results to return (between 1 and 5000). Defaults to 500 results if no limit is specified.
* `next_record_id` (string): List the following results using the `next_record_id` provided in the previous query, as sketched after this list.
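Because results are capped by `limit`, retrieving the full metric list means following `next_record_id` until it is no longer returned. The sketch below is a hedged illustration of that loop with the Python client (it is not one of the official examples); it assumes `datadog-api-client` is installed, `DD_API_KEY`/`DD_APP_KEY` are exported, that `limit` and `next_record_id` are exposed as keyword arguments, and that the response object mirrors the documented `usage` and `metadata.pagination.next_record_id` fields.

```
"""
Sketch: page through hourly-average custom metrics by following next_record_id.
Assumes datadog-api-client is installed and DD_API_KEY / DD_APP_KEY are exported.
"""
import time
from datetime import datetime, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    cursor = None
    while True:
        kwargs = {"day": datetime(2024, 1, 15, tzinfo=timezone.utc), "limit": 500}
        if cursor:
            kwargs["next_record_id"] = cursor  # Continue from the previous page.
        response = api_instance.get_usage_top_avg_metrics(**kwargs)
        for entry in getattr(response, "usage", None) or []:
            print(entry.metric_name, entry.avg_metric_hour)
        # Follow the documented pagination cursor; attribute path is assumed.
        metadata = getattr(response, "metadata", None)
        pagination = getattr(metadata, "pagination", None) if metadata else None
        cursor = getattr(pagination, "next_record_id", None) if pagination else None
        if not cursor:
            break
        time.sleep(5)  # Pause between pages to stay under the endpoint's rate limit.
```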
### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTopAvgMetrics-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTopAvgMetrics-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTopAvgMetrics-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTopAvgMetrics-429-v1)

OK

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Response containing the number of hourly recorded custom metrics for a given organization.

* `metadata` (object): The object containing document metadata.
  * `day` (date-time): The day value from the user request that contains the returned usage data (if day was used in the request).
  * `month` (date-time): The month value from the user request that contains the returned usage data (if month was used in the request).
  * `pagination` (object): The metadata for the current pagination.
    * `limit` (int64): Maximum amount of records to be returned.
    * `next_record_id` (string): The cursor to get the next results (if any). To make the next request, use the same parameters and add `next_record_id`.
    * `total_number_of_records` (int64): Total number of records.
* `usage` ([object]): Number of hourly recorded custom metrics for a given organization.
  * `avg_metric_hour` (int64): Average number of timeseries per hour in which the metric occurs.
  * `max_metric_hour` (int64): Maximum number of timeseries per hour in which the metric occurs.
  * `metric_category` (enum): Contains the metric category. Allowed enum values: `standard`, `custom`.
  * `metric_name` (string): Contains the custom metric name.

```
{
  "metadata": {
    "day": "2019-09-19T10:00:00.000Z",
    "month": "2019-09-19T10:00:00.000Z",
    "pagination": {
      "limit": "integer",
      "next_record_id": "string",
      "total_number_of_records": "integer"
    }
  },
  "usage": [
    {
      "avg_metric_hour": "integer",
      "max_metric_hour": "integer",
      "metric_category": "string",
      "metric_name": "string"
    }
  ]
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Error response object.

* `errors` [_required_] ([string]): Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Forbidden - User is not authorized

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Error response object.

* `errors` [_required_] ([string]): Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/usage-metering/)
* [Example](https://docs.datadoghq.com/api/latest/usage-metering/)

Error response object.

* `errors` [_required_] ([string]): Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get all custom metrics by hourly average Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/top_avg_metrics" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get all custom metrics by hourly average ``` """ Get all custom metrics by hourly average returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_top_avg_metrics( day=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get all custom metrics by hourly average ``` # Get all custom metrics by hourly average returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { day: (Time.now + -3 * 86400), } p api_instance.get_usage_top_avg_metrics(opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get all custom metrics by hourly average ``` // Get all custom metrics by hourly average returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageTopAvgMetrics(ctx, *datadogV1.NewGetUsageTopAvgMetricsOptionalParameters().WithDay(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageTopAvgMetrics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, 
"Response from `UsageMeteringApi.GetUsageTopAvgMetrics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get all custom metrics by hourly average ``` // Get all custom metrics by hourly average returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageTopAvgMetricsOptionalParameters; import com.datadog.api.client.v1.model.UsageTopAvgMetricsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageTopAvgMetricsResponse result = apiInstance.getUsageTopAvgMetrics( new GetUsageTopAvgMetricsOptionalParameters().day(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageTopAvgMetrics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get all custom metrics by hourly average ``` // Get all custom metrics by hourly average returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageTopAvgMetricsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_top_avg_metrics( GetUsageTopAvgMetricsOptionalParams::default().day( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get all custom metrics by hourly average ``` /** * Get all custom metrics by hourly average returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageTopAvgMetricsRequest = { day: new Date(new Date().getTime() + -3 * 86400 * 
1000), }; apiInstance .getUsageTopAvgMetrics(params) .then((data: v1.UsageTopAvgMetricsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get projected cost across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-projected-cost-across-your-account) * [v2 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-projected-cost-across-your-account-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/projected_costhttps://api.ap2.datadoghq.com/api/v2/usage/projected_costhttps://api.datadoghq.eu/api/v2/usage/projected_costhttps://api.ddog-gov.com/api/v2/usage/projected_costhttps://api.datadoghq.com/api/v2/usage/projected_costhttps://api.us3.datadoghq.com/api/v2/usage/projected_costhttps://api.us5.datadoghq.com/api/v2/usage/projected_cost ### Overview Get projected cost across multi-org and single root-org accounts. Projected cost data is only available for the current month and becomes available around the 12th of the month. This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires all of the following permissions: * `usage_read` * `billing_read` OAuth apps require the `usage_read, billing_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description view string String to specify whether cost is broken down at a parent-org level or at the sub-org level. Available views are `summary` and `sub-org`. Defaults to `summary`. include_connected_accounts boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to `false`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetProjectedCost-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetProjectedCost-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetProjectedCost-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetProjectedCost-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Projected Cost response. Field Type Description data [object] Response containing Projected Cost. attributes object Projected Cost attributes data. account_name string The account name. account_public_id string The account public ID. charges [object] List of charges data reported for the requested month. charge_type string The type of charge for a particular product. cost double The cost for a particular product and charge type during a given month. product_name string The product for which cost is being reported. date date-time The month requested. org_name string The organization name. projected_total_cost double The total projected cost of products for the month. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. 
id string Unique ID of the response. type enum Type of cost data. Allowed enum values: `projected_cost` default: `projected_cost`

```
{
  "data": [
    {
      "attributes": {
        "account_name": "string",
        "account_public_id": "string",
        "charges": [
          {
            "charge_type": "on_demand",
            "cost": "number",
            "product_name": "infra_host"
          }
        ],
        "date": "2019-09-19T10:00:00.000Z",
        "org_name": "string",
        "projected_total_cost": "number",
        "public_id": "string",
        "region": "string"
      },
      "id": "string",
      "type": "projected_cost"
    }
  ]
}
```

Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Field Type Description errors [_required_] [string] A list of errors.

```
{
  "errors": [
    "Bad Request"
  ]
}
```

### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript)

##### Get projected cost across your account

```
# Curl command
# Use the endpoint for your Datadog site, for example api.datadoghq.com,
# api.us3.datadoghq.com, api.us5.datadoghq.com, api.datadoghq.eu,
# api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X GET "https://api.datadoghq.com/api/v2/usage/projected_cost" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get projected cost across your account

```
"""
Get projected cost across your account returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_projected_cost(
        view="sub-org",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Get projected cost across your account

```
# Get projected cost across your account returns "OK" response

require "datadog_api_client"
api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new
opts = {
  view: "sub-org",
}
p api_instance.get_projected_cost(opts)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get projected cost across your account

```
// Get projected cost across your account returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewUsageMeteringApi(apiClient)
	resp, r, err := api.GetProjectedCost(ctx, *datadogV2.NewGetProjectedCostOptionalParameters().WithView("sub-org"))

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetProjectedCost`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", "  ")
	fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetProjectedCost`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Get projected cost across your account

```
// Get projected cost across your account returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.UsageMeteringApi;
import com.datadog.api.client.v2.api.UsageMeteringApi.GetProjectedCostOptionalParameters;
import com.datadog.api.client.v2.model.ProjectedCostResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient);

    try {
      ProjectedCostResponse result =
          apiInstance.getProjectedCost(new GetProjectedCostOptionalParameters().view("sub-org"));
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling UsageMeteringApi#getProjectedCost");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Get projected cost across your account

```
// Get projected cost across your account returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_usage_metering::GetProjectedCostOptionalParams;
use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = UsageMeteringAPI::with_config(configuration);
    let resp = api
        .get_projected_cost(GetProjectedCostOptionalParams::default().view("sub-org".to_string()))
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```

##### Get projected cost across your account

```
/**
 * Get projected cost across your account returns "OK" response
 */

import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.UsageMeteringApi(configuration);

const params: v2.UsageMeteringApiGetProjectedCostRequest = {
  view: "sub-org",
};

apiInstance
  .getProjectedCost(params)
  .then((data: v2.ProjectedCostResponse) => {
    console.log(
      "API called successfully. Returned data: " + JSON.stringify(data)
    );
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands:

```
# Set DD_SITE to your Datadog site: datadoghq.com, us3.datadoghq.com, us5.datadoghq.com,
# datadoghq.eu, ap1.datadoghq.com, ap2.datadoghq.com, or ddog-gov.com
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *

## [Get usage across your account](https://docs.datadoghq.com/api/latest/usage-metering/#get-usage-across-your-account)

* [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-usage-across-your-account-v1)

GET https://api.ap1.datadoghq.com/api/v1/usage/summaryhttps://api.ap2.datadoghq.com/api/v1/usage/summaryhttps://api.datadoghq.eu/api/v1/usage/summaryhttps://api.ddog-gov.com/api/v1/usage/summaryhttps://api.datadoghq.com/api/v1/usage/summaryhttps://api.us3.datadoghq.com/api/v1/usage/summaryhttps://api.us5.datadoghq.com/api/v1/usage/summary

### Overview

Get all usage across your account.

This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/).

This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint.

### Arguments

#### Query Strings

Name Type Description start_month [_required_] string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for usage beginning in this month. Maximum of 15 months ago. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for usage ending this month. include_org_details boolean Include usage summaries for each sub-org. include_connected_accounts boolean Boolean to specify whether to include accounts connected to the current account as partner customers in the Datadog partner network program. Defaults to `false`.
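For reference, the sketch below shows how these query strings map onto the Python client used elsewhere on this page. It is a minimal, illustrative example and assumes the v1 `UsageMeteringApi` in `datadog-api-client` exposes a `get_usage_summary` method whose keyword arguments mirror `start_month` and `include_org_details`; adjust it to the client version you have installed.

```
# A minimal sketch, assuming the v1 UsageMeteringApi client exposes
# get_usage_summary() with keyword arguments that mirror the query
# strings documented above.
from datetime import datetime

from dateutil.relativedelta import relativedelta

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    # start_month: ISO-8601, UTC, precise to month; at most 15 months ago.
    response = api_instance.get_usage_summary(
        start_month=(datetime.now() + relativedelta(months=-2)),
        include_org_details=True,  # include a usage summary for each sub-org
    )
    print(response)
```

Run it with the same `DD_SITE`, `DD_API_KEY`, and `DD_APP_KEY` environment variables as the examples above; the response follows the summary model described below.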
### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSummary-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSummary-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSummary-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSummary-429-v1)

OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response summarizing all usage aggregated across the months in the request for all organizations, and broken down by month and by organization.

Field Type Description agent_host_top99p_sum int64 Shows the 99th percentile of all agent hosts over all hours in the current month for all organizations. apm_azure_app_service_host_top99p_sum int64 Shows the 99th percentile of all Azure app services using APM over all hours in the current month for all organizations. apm_devsecops_host_top99p_sum int64 Shows the 99th percentile of all APM DevSecOps hosts over all hours in the current month for all organizations. apm_enterprise_standalone_hosts_top99p_sum int64 Shows the sum of the 99th percentile of all distinct standalone Enterprise hosts over all hours in the current month for all organizations. apm_fargate_count_avg_sum int64 Shows the average of all APM ECS Fargate tasks over all hours in the current month for all organizations. apm_host_top99p_sum int64 Shows the 99th percentile of all distinct APM hosts over all hours in the current month for all organizations. apm_pro_standalone_hosts_top99p_sum int64 Shows the sum of the 99th percentile of all distinct standalone Pro hosts over all hours in the current month for all organizations. appsec_fargate_count_avg_sum int64 Shows the average of all Application Security Monitoring ECS Fargate tasks over all hours in the current month for all organizations. asm_serverless_agg_sum int64 Shows the sum of all Application Security Monitoring Serverless invocations over all hours in the current months for all organizations. audit_logs_lines_indexed_agg_sum int64 **DEPRECATED** : Shows the sum of all audit logs lines indexed over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). audit_trail_enabled_hwm_sum int64 Shows the total number of organizations that had Audit Trail enabled over a specific number of months. avg_profiled_fargate_tasks_sum int64 The average total count for Fargate Container Profiler over all hours in the current month for all organizations. aws_host_top99p_sum int64 Shows the 99th percentile of all AWS hosts over all hours in the current month for all organizations. aws_lambda_func_count int64 Shows the average of the number of functions that executed 1 or more times each hour in the current month for all organizations. aws_lambda_invocations_sum int64 Shows the sum of all AWS Lambda invocations over all hours in the current month for all organizations. azure_app_service_top99p_sum int64 Shows the 99th percentile of all Azure app services over all hours in the current month for all organizations. azure_host_top99p_sum int64 Shows the 99th percentile of all Azure hosts over all hours in the current month for all organizations. billable_ingested_bytes_agg_sum int64 Shows the sum of all log bytes ingested over all hours in the current month for all organizations. bits_ai_investigations_agg_sum int64 Shows the sum of all Bits AI Investigations over all hours in the current month for all organizations.
browser_rum_lite_session_count_agg_sum int64 **DEPRECATED** : Shows the sum of all browser lite sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). browser_rum_replay_session_count_agg_sum int64 Shows the sum of all browser replay sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). browser_rum_units_agg_sum int64 **DEPRECATED** : Shows the sum of all browser RUM units over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). ci_pipeline_indexed_spans_agg_sum int64 Shows the sum of all CI pipeline indexed spans over all hours in the current month for all organizations. ci_test_indexed_spans_agg_sum int64 Shows the sum of all CI test indexed spans over all hours in the current month for all organizations. ci_visibility_itr_committers_hwm_sum int64 Shows the high-water mark of all CI visibility intelligent test runner committers over all hours in the current month for all organizations. ci_visibility_pipeline_committers_hwm_sum int64 Shows the high-water mark of all CI visibility pipeline committers over all hours in the current month for all organizations. ci_visibility_test_committers_hwm_sum int64 Shows the high-water mark of all CI visibility test committers over all hours in the current month for all organizations. cloud_cost_management_aws_host_count_avg_sum int64 Sum of the host count average for Cloud Cost Management for AWS. cloud_cost_management_azure_host_count_avg_sum int64 Sum of the host count average for Cloud Cost Management for Azure. cloud_cost_management_gcp_host_count_avg_sum int64 Sum of the host count average for Cloud Cost Management for GCP. cloud_cost_management_host_count_avg_sum int64 Sum of the host count average for Cloud Cost Management for all cloud providers. cloud_cost_management_oci_host_count_avg_sum int64 Sum of the average host counts for Cloud Cost Management on OCI. cloud_siem_events_agg_sum int64 Shows the sum of all Cloud Security Information and Event Management events over all hours in the current month for all organizations. code_analysis_sa_committers_hwm_sum int64 Shows the high-water mark of all Static Analysis committers over all hours in the current month for all organizations. code_analysis_sca_committers_hwm_sum int64 Shows the high-water mark of all static Software Composition Analysis committers over all hours in the current month for all organizations. code_security_host_top99p_sum int64 Shows the 99th percentile of all Code Security hosts over all hours in the current month for all organizations. container_avg_sum int64 Shows the average of all distinct containers over all hours in the current month for all organizations. container_excl_agent_avg_sum int64 Shows the average of the containers without the Datadog Agent over all hours in the current month for all organizations. container_hwm_sum int64 Shows the sum of the high-water marks of all distinct containers over all hours in the current month for all organizations. csm_container_enterprise_compliance_count_agg_sum int64 Shows the sum of all Cloud Security Management Enterprise compliance containers over all hours in the current month for all organizations. csm_container_enterprise_cws_count_agg_sum int64 Shows the sum of all Cloud Security Management Enterprise Cloud Workload Security containers over all hours in the current month for all organizations. 
csm_container_enterprise_total_count_agg_sum int64 Shows the sum of all Cloud Security Management Enterprise containers over all hours in the current month for all organizations. csm_host_enterprise_aas_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure app services hosts over all hours in the current month for all organizations. csm_host_enterprise_aws_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise AWS hosts over all hours in the current month for all organizations. csm_host_enterprise_azure_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure hosts over all hours in the current month for all organizations. csm_host_enterprise_compliance_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise compliance hosts over all hours in the current month for all organizations. csm_host_enterprise_cws_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise Cloud Workload Security hosts over all hours in the current month for all organizations. csm_host_enterprise_gcp_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise GCP hosts over all hours in the current month for all organizations. csm_host_enterprise_total_host_count_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Enterprise hosts over all hours in the current month for all organizations. cspm_aas_host_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Pro Azure app services hosts over all hours in the current month for all organizations. cspm_aws_host_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Pro AWS hosts over all hours in the current month for all organizations. cspm_azure_host_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Pro Azure hosts over all hours in the current month for all organizations. cspm_container_avg_sum int64 Shows the average number of Cloud Security Management Pro containers over all hours in the current month for all organizations. cspm_container_hwm_sum int64 Shows the sum of the high-water marks of Cloud Security Management Pro containers over all hours in the current month for all organizations. cspm_gcp_host_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Pro GCP hosts over all hours in the current month for all organizations. cspm_host_top99p_sum int64 Shows the 99th percentile of all Cloud Security Management Pro hosts over all hours in the current month for all organizations. custom_historical_ts_sum int64 Shows the average number of distinct historical custom metrics over all hours in the current month for all organizations. custom_live_ts_sum int64 Shows the average number of distinct live custom metrics over all hours in the current month for all organizations. custom_ts_sum int64 Shows the average number of distinct custom metrics over all hours in the current month for all organizations. cws_container_avg_sum int64 Shows the average of all distinct Cloud Workload Security containers over all hours in the current month for all organizations. cws_fargate_task_avg_sum int64 Shows the average of all distinct Cloud Workload Security Fargate tasks over all hours in the current month for all organizations. 
cws_host_top99p_sum int64 Shows the 99th percentile of all Cloud Workload Security hosts over all hours in the current month for all organizations. data_jobs_monitoring_host_hr_agg_sum int64 Shows the sum of Data Jobs Monitoring hosts over all hours in the current months for all organizations dbm_host_top99p_sum int64 Shows the 99th percentile of all Database Monitoring hosts over all hours in the current month for all organizations. dbm_queries_avg_sum int64 Shows the average of all distinct Database Monitoring Normalized Queries over all hours in the current month for all organizations. end_date date-time Shows the last date of usage in the current month for all organizations. eph_infra_host_agent_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts with the Datadog Agent over all hours in the current month for all organizations. eph_infra_host_alibaba_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts on Alibaba over all hours in the current month for all organizations. eph_infra_host_aws_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts on AWS over all hours in the current month for all organizations. eph_infra_host_azure_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts on Azure over all hours in the current month for all organizations. eph_infra_host_ent_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts for Enterprise over all hours in the current month for all organizations. eph_infra_host_gcp_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts on GCP over all hours in the current month for all organizations. eph_infra_host_heroku_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts on Heroku over all hours in the current month for all organizations. eph_infra_host_only_aas_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts with only Azure App Services over all hours in the current month for all organizations. eph_infra_host_only_vsphere_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts with only vSphere over all hours in the current month for all organizations. eph_infra_host_opentelemetry_agg_sum int64 Shows the sum of all ephemeral hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current month for all organizations. eph_infra_host_opentelemetry_apm_agg_sum int64 Shows the sum of all ephemeral APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current month for all organizations. eph_infra_host_pro_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro over all hours in the current month for all organizations. eph_infra_host_proplus_agg_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro Plus over all hours in the current month for all organizations. eph_infra_host_proxmox_agg_sum int64 Sum of all ephemeral infrastructure hosts for Proxmox over all hours in the current month for all organizations. error_tracking_apm_error_events_agg_sum int64 Shows the sum of all Error Tracking APM error events over all hours in the current month for all organizations. error_tracking_error_events_agg_sum int64 Shows the sum of all Error Tracking error events over all hours in the current month for all organizations. error_tracking_events_agg_sum int64 Shows the sum of all Error Tracking events over all hours in the current months for all organizations. 
error_tracking_rum_error_events_agg_sum int64 Shows the sum of all Error Tracking RUM error events over all hours in the current month for all organizations. event_management_correlation_agg_sum int64 Shows the sum of all Event Management correlations over all hours in the current month for all organizations. event_management_correlation_correlated_events_agg_sum int64 Shows the sum of all Event Management correlated events over all hours in the current month for all organizations. event_management_correlation_correlated_related_events_agg_sum int64 Shows the sum of all Event Management correlated related events over all hours in the current month for all organizations. fargate_container_profiler_profiling_fargate_avg_sum int64 The average number of Profiling Fargate tasks over all hours in the current month for all organizations. fargate_container_profiler_profiling_fargate_eks_avg_sum int64 The average number of Profiling Fargate Elastic Kubernetes Service tasks over all hours in the current month for all organizations. fargate_tasks_count_avg_sum int64 Shows the average of all Fargate tasks over all hours in the current month for all organizations. fargate_tasks_count_hwm_sum int64 Shows the sum of the high-water marks of all Fargate tasks over all hours in the current month for all organizations. flex_logs_compute_large_avg_sum int64 Shows the average number of Flex Logs Compute Large Instances over all hours in the current months for all organizations. flex_logs_compute_medium_avg_sum int64 Shows the average number of Flex Logs Compute Medium Instances over all hours in the current months for all organizations. flex_logs_compute_small_avg_sum int64 Shows the average number of Flex Logs Compute Small Instances over all hours in the current months for all organizations. flex_logs_compute_xlarge_avg_sum int64 Shows the average number of Flex Logs Compute Extra Large Instances over all hours in the current months for all organizations. flex_logs_compute_xsmall_avg_sum int64 Shows the average number of Flex Logs Compute Extra Small Instances over all hours in the current months for all organizations. flex_logs_starter_avg_sum int64 Shows the average number of Flex Logs Starter Instances over all hours in the current months for all organizations. flex_logs_starter_storage_index_avg_sum int64 Shows the average number of Flex Logs Starter Storage Index Instances over all hours in the current months for all organizations. flex_logs_starter_storage_retention_adjustment_avg_sum int64 Shows the average number of Flex Logs Starter Storage Retention Adjustment Instances over all hours in the current months for all organizations. flex_stored_logs_avg_sum int64 Shows the average of all Flex Stored Logs over all hours in the current months for all organizations. forwarding_events_bytes_agg_sum int64 Shows the sum of all logs forwarding bytes over all hours in the current month for all organizations (data available as of April 1, 2023) gcp_host_top99p_sum int64 Shows the 99th percentile of all GCP hosts over all hours in the current month for all organizations. heroku_host_top99p_sum int64 Shows the 99th percentile of all Heroku dynos over all hours in the current month for all organizations. incident_management_monthly_active_users_hwm_sum int64 Shows sum of the high-water marks of incident management monthly active users in the current month for all organizations. 
incident_management_seats_hwm_sum int64 Shows the sum of the high-water marks of Incident Management seats over all hours in the current month for all organizations. indexed_events_count_agg_sum int64 **DEPRECATED** : Shows the sum of all log events indexed over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). infra_host_top99p_sum int64 Shows the 99th percentile of all distinct infrastructure hosts over all hours in the current month for all organizations. ingested_events_bytes_agg_sum int64 Shows the sum of all log bytes ingested over all hours in the current month for all organizations. iot_device_agg_sum int64 Shows the sum of all IoT devices over all hours in the current month for all organizations. iot_device_top99p_sum int64 Shows the 99th percentile of all IoT devices over all hours in the current month of all organizations. last_updated date-time Shows the most recent hour in the current month for all organizations for which all usages were calculated. live_indexed_events_agg_sum int64 **DEPRECATED** : Shows the sum of all live logs indexed over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). live_ingested_bytes_agg_sum int64 Shows the sum of all live logs bytes ingested over all hours in the current month for all organizations (data available as of December 1, 2020). llm_observability_agg_sum int64 Sum of all LLM observability sessions for all hours in the current month for all organizations. llm_observability_min_spend_agg_sum int64 Minimum spend for LLM observability sessions for all hours in the current month for all organizations. logs_by_retention object Object containing logs usage data broken down by retention period. orgs object Indexed logs usage summary for each organization for each retention period with usage. usage [object] Indexed logs usage summary for each organization. usage [object] Indexed logs usage for each active retention for the organization. logs_indexed_logs_usage_sum int64 Total indexed logs for this retention period. logs_live_indexed_logs_usage_sum int64 Live indexed logs for this retention period. logs_rehydrated_indexed_logs_usage_sum int64 Rehydrated indexed logs for this retention period. retention string The retention period in days or "custom" for all custom retention periods. usage [object] Aggregated index logs usage for each retention period with usage. logs_indexed_logs_usage_agg_sum int64 Total indexed logs for this retention period. logs_live_indexed_logs_usage_agg_sum int64 Live indexed logs for this retention period. logs_rehydrated_indexed_logs_usage_agg_sum int64 Rehydrated indexed logs for this retention period. retention string The retention period in days or "custom" for all custom retention periods. usage_by_month object Object containing a summary of indexed logs usage by retention period for a single month. date date-time The month for the usage. usage [object] Indexed logs usage for each active retention for the month. logs_indexed_logs_usage_sum int64 Total indexed logs for this retention period. logs_live_indexed_logs_usage_sum int64 Live indexed logs for this retention period. logs_rehydrated_indexed_logs_usage_sum int64 Rehydrated indexed logs for this retention period. retention string The retention period in days or "custom" for all custom retention periods.
mobile_rum_lite_session_count_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile lite sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_android_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Android over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_flutter_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Flutter over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_ios_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on iOS over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_reactnative_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on React Native over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_roku_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Roku over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). mobile_rum_units_agg_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM units over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). ndm_netflow_events_agg_sum int64 Shows the sum of all Network Device Monitoring NetFlow events over all hours in the current month for all organizations. netflow_indexed_events_count_agg_sum int64 **DEPRECATED** : Shows the sum of all Network flows indexed over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). network_device_wireless_top99p_sum int64 Shows the 99th percentile of all Network Device Monitoring wireless devices over all hours in the current month for all organizations. npm_host_top99p_sum int64 Shows the 99th percentile of all distinct Cloud Network Monitoring hosts (formerly known as Network hosts) over all hours in the current month for all organizations. observability_pipelines_bytes_processed_agg_sum int64 Sum of all observability pipelines bytes processed over all hours in the current month for all organizations. oci_host_agg_sum int64 Shows the sum of Oracle Cloud Infrastructure hosts over all hours in the current months for all organizations oci_host_top99p_sum int64 Shows the 99th percentile of Oracle Cloud Infrastructure hosts over all hours in the current months for all organizations on_call_seat_hwm_sum int64 Shows the sum of the high-water marks of On-Call seats over all hours in the current month for all organizations. online_archive_events_count_agg_sum int64 Sum of all online archived events over all hours in the current month for all organizations. opentelemetry_apm_host_top99p_sum int64 Shows the 99th percentile of APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current month for all organizations. opentelemetry_host_top99p_sum int64 Shows the 99th percentile of all hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current month for all organizations. 
product_analytics_agg_sum int64 Sum of all product analytics sessions for all hours in the current month for all organizations. profiling_aas_count_top99p_sum int64 Shows the 99th percentile of all profiled Azure app services over all hours in the current month for all organizations. profiling_container_agent_count_avg int64 Shows the average number of profiled containers over all hours in the current month for all organizations. profiling_host_count_top99p_sum int64 Shows the 99th percentile of all profiled hosts over all hours in the current month for all organizations. proxmox_host_agg_sum int64 Sum of all Proxmox hosts over all hours in the current month for all organizations. proxmox_host_top99p_sum int64 Sum of the 99th percentile of all Proxmox hosts over all hours in the current month for all organizations. published_app_hwm_sum int64 Shows the high-water mark of all published applications over all hours in the current month for all organizations. rehydrated_indexed_events_agg_sum int64 **DEPRECATED** : Shows the sum of all rehydrated logs indexed over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). rehydrated_ingested_bytes_agg_sum int64 Shows the sum of all rehydrated logs bytes ingested over all hours in the current month for all organizations (data available as of December 1, 2020). rum_browser_and_mobile_session_count int64 Shows the sum of all mobile sessions and all browser lite and legacy sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). rum_browser_legacy_session_count_agg_sum int64 Shows the sum of all browser RUM legacy sessions over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_browser_lite_session_count_agg_sum int64 Shows the sum of all browser RUM lite sessions over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_browser_replay_session_count_agg_sum int64 Shows the sum of all browser RUM Session Replay counts over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_indexed_sessions_agg_sum int64 Sum of all RUM indexed sessions for all hours in the current month for all organizations. rum_ingested_sessions_agg_sum int64 Sum of all RUM ingested sessions for all hours in the current month for all organizations. rum_lite_session_count_agg_sum int64 Shows the sum of all RUM lite sessions (browser and mobile) over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_android_agg_sum int64 Shows the sum of all mobile RUM legacy sessions on Android over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_flutter_agg_sum int64 Shows the sum of all mobile RUM legacy sessions on Flutter over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_ios_agg_sum int64 Shows the sum of all mobile RUM legacy sessions on iOS over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_reactnative_agg_sum int64 Shows the sum of all mobile RUM legacy sessions on React Native over all hours in the current month for all organizations (To be introduced on October 1st, 2024). 
rum_mobile_legacy_session_count_roku_agg_sum int64 Shows the sum of all mobile RUM legacy sessions on Roku over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_android_agg_sum int64 Shows the sum of all mobile RUM lite sessions on Android over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_flutter_agg_sum int64 Shows the sum of all mobile RUM lite sessions on Flutter over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_ios_agg_sum int64 Shows the sum of all mobile RUM lite sessions on iOS over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_kotlinmultiplatform_agg_sum int64 Shows the sum of all mobile RUM lite sessions on Kotlin Multiplatform over all hours within the current month for all organizations. rum_mobile_lite_session_count_reactnative_agg_sum int64 Shows the sum of all mobile RUM lite sessions on React Native over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_roku_agg_sum int64 Shows the sum of all mobile RUM lite sessions on Roku over all hours within the current month for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_unity_agg_sum int64 Shows the sum of all mobile RUM lite sessions on Unity over all hours within the current month for all organizations. rum_mobile_replay_session_count_android_agg_sum int64 Shows the sum of all mobile RUM replay sessions on Android over all hours within the current month for all organizations. rum_mobile_replay_session_count_ios_agg_sum int64 Shows the sum of all mobile RUM replay sessions on iOS over all hours within the current month for all organizations. rum_mobile_replay_session_count_kotlinmultiplatform_agg_sum int64 Shows the sum of all mobile RUM replay sessions on Kotlin Multiplatform over all hours within the current month for all organizations. rum_mobile_replay_session_count_reactnative_agg_sum int64 Shows the sum of all mobile RUM replay sessions on React Native over all hours within the current month for all organizations. rum_replay_session_count_agg_sum int64 Shows the sum of all RUM Session Replay counts over all hours in the current month for all organizations (To be introduced on October 1st, 2024). rum_session_count_agg_sum int64 **DEPRECATED** : Shows the sum of all browser RUM lite sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). rum_session_replay_add_on_agg_sum int64 Sum of all RUM session replay add-on sessions for all hours in the current month for all organizations. rum_total_session_count_agg_sum int64 Shows the sum of RUM sessions (browser and mobile) over all hours in the current month for all organizations. rum_units_agg_sum int64 **DEPRECATED** : Shows the sum of all browser and mobile RUM units over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). sca_fargate_count_avg_sum int64 Shows the average of all Software Composition Analysis Fargate tasks over all hours in the current months for all organizations. sca_fargate_count_hwm_sum int64 Shows the sum of the high-water marks of all Software Composition Analysis Fargate tasks over all hours in the current months for all organizations. 
sds_apm_scanned_bytes_sum int64 Sum of all APM bytes scanned with sensitive data scanner in the current month for all organizations. sds_events_scanned_bytes_sum int64 Sum of all event stream events bytes scanned with sensitive data scanner in the current month for all organizations. sds_logs_scanned_bytes_sum int64 Shows the sum of all bytes scanned of logs usage by the Sensitive Data Scanner over all hours in the current month for all organizations. sds_rum_scanned_bytes_sum int64 Sum of all RUM bytes scanned with sensitive data scanner in the current month for all organizations. sds_total_scanned_bytes_sum int64 Shows the sum of all bytes scanned across all usage types by the Sensitive Data Scanner over all hours in the current month for all organizations. serverless_apps_apm_apm_azure_appservice_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Azure App Service instances in the current month for all organizations. serverless_apps_apm_apm_azure_azurefunction_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Azure Function instances in the current month for all organizations. serverless_apps_apm_apm_azure_containerapp_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Azure Container App instances in the current month for all organizations. serverless_apps_apm_apm_fargate_ecs_tasks_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Fargate Elastic Container Service tasks in the current month for all organizations. serverless_apps_apm_apm_gcp_cloudfunction_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Function instances in the current month for all organizations. serverless_apps_apm_apm_gcp_cloudrun_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Run instances in the current month for all organizations. serverless_apps_apm_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring in the current month for all organizations. serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure App Service instances in the current month for all organizations. serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Function instances in the current month for all organizations. serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Container App instances in the current month for all organizations. serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Function instances in the current month for all organizations. 
serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Run instances in the current month for all organizations. serverless_apps_apm_excl_fargate_avg_sum int64 Sum of the average number of Serverless Apps with Application Performance Monitoring excluding Fargate in the current month for all organizations. serverless_apps_azure_container_app_instances_avg_sum int64 Sum of the average number of Serverless Apps for Azure Container App instances in the current month for all organizations. serverless_apps_azure_count_avg_sum int64 Sum of the average number of Serverless Apps for Azure in the current month for all organizations. serverless_apps_azure_function_app_instances_avg_sum int64 Sum of the average number of Serverless Apps for Azure Function App instances in the current month for all organizations. serverless_apps_azure_web_app_instances_avg_sum int64 Sum of the average number of Serverless Apps for Azure Web App instances in the current month for all organizations. serverless_apps_ecs_avg_sum int64 Sum of the average number of Serverless Apps for Elastic Container Service in the current month for all organizations. serverless_apps_eks_avg_sum int64 Sum of the average number of Serverless Apps for Elastic Kubernetes Service in the current month for all organizations. serverless_apps_excl_fargate_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate in the current month for all organizations. serverless_apps_excl_fargate_azure_container_app_instances_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate for Azure Container App instances in the current month for all organizations. serverless_apps_excl_fargate_azure_function_app_instances_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate for Azure Function App instances in the current month for all organizations. serverless_apps_excl_fargate_azure_web_app_instances_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate for Azure Web App instances in the current month for all organizations. serverless_apps_excl_fargate_google_cloud_functions_instances_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Functions instances in the current month for all organizations. serverless_apps_excl_fargate_google_cloud_run_instances_avg_sum int64 Sum of the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Run instances in the current month for all organizations. serverless_apps_google_cloud_functions_instances_avg_sum int64 Sum of the average number of Serverless Apps for Google Cloud Platform Cloud Functions instances in the current month for all organizations. serverless_apps_google_cloud_run_instances_avg_sum int64 Sum of the average number of Serverless Apps for Google Cloud Platform Cloud Run instances in the current month for all organizations. serverless_apps_google_count_avg_sum int64 Sum of the average number of Serverless Apps for Google Cloud in the current month for all organizations. serverless_apps_total_count_avg_sum int64 Sum of the average number of Serverless Apps for Azure and Google Cloud in the current month for all organizations. siem_analyzed_logs_add_on_count_agg_sum int64 Shows the sum of all log events analyzed by Cloud SIEM over all hours in the current month for all organizations. 
start_date date-time Shows the first date of usage in the current month for all organizations. synthetics_browser_check_calls_count_agg_sum int64 Shows the sum of all Synthetic browser tests over all hours in the current month for all organizations. synthetics_check_calls_count_agg_sum int64 Shows the sum of all Synthetic API tests over all hours in the current month for all organizations. synthetics_mobile_test_runs_agg_sum int64 Shows the sum of Synthetic mobile application tests over all hours in the current month for all organizations. synthetics_parallel_testing_max_slots_hwm_sum int64 Shows the sum of the high-water marks of used synthetics parallel testing slots over all hours in the current month for all organizations. trace_search_indexed_events_count_agg_sum int64 Shows the sum of all Indexed Spans indexed over all hours in the current month for all organizations. twol_ingested_events_bytes_agg_sum int64 Shows the sum of all ingested APM span bytes over all hours in the current month for all organizations. universal_service_monitoring_host_top99p_sum int64 Shows the 99th percentile of all Universal Service Monitoring hosts over all hours in the current month for all organizations. usage [object] An array of objects regarding hourly usage. agent_host_top99p int64 Shows the 99th percentile of all agent hosts over all hours in the current date for all organizations. apm_azure_app_service_host_top99p int64 Shows the 99th percentile of all Azure app services using APM over all hours in the current date for all organizations. apm_devsecops_host_top99p int64 Shows the 99th percentile of all APM DevSecOps hosts over all hours in the current date for the given org. apm_enterprise_standalone_hosts_top99p int64 Shows the 99th percentile of all distinct standalone Enterprise hosts over all hours in the current date for all organizations. apm_fargate_count_avg int64 Shows the average of all APM ECS Fargate tasks over all hours in the current date for all organizations. apm_host_top99p int64 Shows the 99th percentile of all distinct APM hosts over all hours in the current date for all organizations. apm_pro_standalone_hosts_top99p int64 Shows the 99th percentile of all distinct standalone Pro hosts over all hours in the current date for all organizations. appsec_fargate_count_avg int64 Shows the average of all Application Security Monitoring ECS Fargate tasks over all hours in the current date for all organizations. asm_serverless_sum int64 Shows the sum of all Application Security Monitoring Serverless invocations over all hours in the current date for all organizations. audit_logs_lines_indexed_sum int64 **DEPRECATED** : Shows the sum of audit logs lines indexed over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). audit_trail_enabled_hwm int64 Shows the number of organizations that had Audit Trail enabled in the current date. avg_profiled_fargate_tasks int64 The average total count for Fargate Container Profiler over all hours in the current date for all organizations. aws_host_top99p int64 Shows the 99th percentile of all AWS hosts over all hours in the current date for all organizations. aws_lambda_func_count int64 Shows the average of the number of functions that executed 1 or more times each hour in the current date for all organizations. aws_lambda_invocations_sum int64 Shows the sum of all AWS Lambda invocations over all hours in the current date for all organizations.
azure_app_service_top99p int64 Shows the 99th percentile of all Azure app services over all hours in the current date for all organizations. billable_ingested_bytes_sum int64 Shows the sum of all log bytes ingested over all hours in the current date for all organizations. bits_ai_investigations_sum int64 Shows the sum of all Bits AI Investigations over all hours in the current date for all organizations. browser_rum_lite_session_count_sum int64 **DEPRECATED** : Shows the sum of all browser lite sessions over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). browser_rum_replay_session_count_sum int64 Shows the sum of all browser replay sessions over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). browser_rum_units_sum int64 **DEPRECATED** : Shows the sum of all browser RUM units over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). ci_pipeline_indexed_spans_sum int64 Shows the sum of all CI pipeline indexed spans over all hours in the current month for all organizations. ci_test_indexed_spans_sum int64 Shows the sum of all CI test indexed spans over all hours in the current month for all organizations. ci_visibility_itr_committers_hwm int64 Shows the high-water mark of all CI visibility intelligent test runner committers over all hours in the current month for all organizations. ci_visibility_pipeline_committers_hwm int64 Shows the high-water mark of all CI visibility pipeline committers over all hours in the current month for all organizations. ci_visibility_test_committers_hwm int64 Shows the high-water mark of all CI visibility test committers over all hours in the current month for all organizations. cloud_cost_management_aws_host_count_avg int64 Host count average of Cloud Cost Management for AWS for the given date and given organization. cloud_cost_management_azure_host_count_avg int64 Host count average of Cloud Cost Management for Azure for the given date and given organization. cloud_cost_management_gcp_host_count_avg int64 Host count average of Cloud Cost Management for GCP for the given date and given organization. cloud_cost_management_host_count_avg int64 Host count average of Cloud Cost Management for all cloud providers for the given date and given organization. cloud_cost_management_oci_host_count_avg int64 Average host count for Cloud Cost Management on OCI for the given date and organization. cloud_siem_events_sum int64 Shows the sum of all Cloud Security Information and Event Management events over all hours in the current date for the given org. code_analysis_sa_committers_hwm int64 Shows the high-water mark of all Static Analysis committers over all hours in the current date for the given org. code_analysis_sca_committers_hwm int64 Shows the high-water mark of all static Software Composition Analysis committers over all hours in the current date for the given org. code_security_host_top99p int64 Shows the 99th percentile of all Code Security hosts over all hours in the current date for the given org. container_avg int64 Shows the average of all distinct containers over all hours in the current date for all organizations. container_excl_agent_avg int64 Shows the average of containers without the Datadog Agent over all hours in the current date for all organizations. container_hwm int64 Shows the high-water mark of all distinct containers over all hours in the current date for all organizations. 
csm_container_enterprise_compliance_count_sum int64 Shows the sum of all Cloud Security Management Enterprise compliance containers over all hours in the current date for the given org. csm_container_enterprise_cws_count_sum int64 Shows the sum of all Cloud Security Management Enterprise Cloud Workload Security containers over all hours in the current date for the given org. csm_container_enterprise_total_count_sum int64 Shows the sum of all Cloud Security Management Enterprise containers over all hours in the current date for the given org. csm_host_enterprise_aas_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure app services hosts over all hours in the current date for the given org. csm_host_enterprise_aws_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise AWS hosts over all hours in the current date for the given org. csm_host_enterprise_azure_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure hosts over all hours in the current date for the given org. csm_host_enterprise_compliance_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise compliance hosts over all hours in the current date for the given org. csm_host_enterprise_cws_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Cloud Workload Security hosts over all hours in the current date for the given org. csm_host_enterprise_gcp_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise GCP hosts over all hours in the current date for the given org. csm_host_enterprise_total_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise hosts over all hours in the current date for the given org. cspm_aas_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro Azure app services hosts over all hours in the current date for all organizations. cspm_aws_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro AWS hosts over all hours in the current date for all organizations. cspm_azure_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro Azure hosts over all hours in the current date for all organizations. cspm_container_avg int64 Shows the average number of Cloud Security Management Pro containers over all hours in the current date for all organizations. cspm_container_hwm int64 Shows the high-water mark of Cloud Security Management Pro containers over all hours in the current date for all organizations. cspm_gcp_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro GCP hosts over all hours in the current date for all organizations. cspm_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro hosts over all hours in the current date for all organizations. custom_ts_avg int64 Shows the average number of distinct custom metrics over all hours in the current date for all organizations. cws_container_count_avg int64 Shows the average of all distinct Cloud Workload Security containers over all hours in the current date for all organizations. cws_fargate_task_avg int64 Shows the average of all distinct Cloud Workload Security Fargate tasks over all hours in the current date for all organizations. 
cws_host_top99p int64 Shows the 99th percentile of all Cloud Workload Security hosts over all hours in the current date for all organizations. data_jobs_monitoring_host_hr_sum int64 Shows the sum of all Data Jobs Monitoring hosts over all hours in the current date for the given org. date date-time The date for the usage. dbm_host_top99p int64 Shows the 99th percentile of all Database Monitoring hosts over all hours in the current date for all organizations. dbm_queries_count_avg int64 Shows the average of all normalized Database Monitoring queries over all hours in the current date for all organizations. eph_infra_host_agent_sum int64 Shows the sum of all ephemeral infrastructure hosts with the Datadog Agent over all hours in the current date for the given org. eph_infra_host_alibaba_sum int64 Shows the sum of all ephemeral infrastructure hosts on Alibaba over all hours in the current date for the given org. eph_infra_host_aws_sum int64 Shows the sum of all ephemeral infrastructure hosts on AWS over all hours in the current date for the given org. eph_infra_host_azure_sum int64 Shows the sum of all ephemeral infrastructure hosts on Azure over all hours in the current date for the given org. eph_infra_host_ent_sum int64 Shows the sum of all ephemeral infrastructure hosts for Enterprise over all hours in the current date for the given org. eph_infra_host_gcp_sum int64 Shows the sum of all ephemeral infrastructure hosts on GCP over all hours in the current date for the given org. eph_infra_host_heroku_sum int64 Shows the sum of all ephemeral infrastructure hosts on Heroku over all hours in the current date for the given org. eph_infra_host_only_aas_sum int64 Shows the sum of all ephemeral infrastructure hosts with only Azure App Services over all hours in the current date for the given org. eph_infra_host_only_vsphere_sum int64 Shows the sum of all ephemeral infrastructure hosts with only vSphere over all hours in the current date for the given org. eph_infra_host_opentelemetry_apm_sum int64 Shows the sum of all ephemeral APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. eph_infra_host_opentelemetry_sum int64 Shows the sum of all ephemeral hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. eph_infra_host_pro_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro over all hours in the current date for the given org. eph_infra_host_proplus_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro Plus over all hours in the current date for the given org. eph_infra_host_proxmox_sum int64 Sum of all ephemeral infrastructure hosts for Proxmox over all hours in the current date for all organizations. error_tracking_apm_error_events_sum int64 Shows the sum of all Error Tracking APM error events over all hours in the current date for the given org. error_tracking_error_events_sum int64 Shows the sum of all Error Tracking error events over all hours in the current date for the given org. error_tracking_events_sum int64 Shows the sum of all Error Tracking events over all hours in the current date for the given org. error_tracking_rum_error_events_sum int64 Shows the sum of all Error Tracking RUM error events over all hours in the current date for the given org. 
event_management_correlation_correlated_events_sum int64 Shows the sum of all Event Management correlated events over all hours in the current date for all organizations. event_management_correlation_correlated_related_events_sum int64 Shows the sum of all Event Management correlated related events over all hours in the current date for all organizations. event_management_correlation_sum int64 Shows the sum of all Event Management correlations over all hours in the current date for all organizations. fargate_container_profiler_profiling_fargate_avg int64 The average number of Profiling Fargate tasks over all hours in the current date for all organizations. fargate_container_profiler_profiling_fargate_eks_avg int64 The average number of Profiling Fargate Elastic Kubernetes Service tasks over all hours in the current date for all organizations. fargate_tasks_count_avg int64 Shows the average of all Fargate tasks over all hours in the current date for all organizations. fargate_tasks_count_hwm int64 Shows the high-water mark of all Fargate tasks over all hours in the current date for all organizations. flex_logs_compute_large_avg int64 Shows the average number of Flex Logs Compute Large Instances over all hours in the current date for the given org. flex_logs_compute_medium_avg int64 Shows the average number of Flex Logs Compute Medium Instances over all hours in the current date for the given org. flex_logs_compute_small_avg int64 Shows the average number of Flex Logs Compute Small Instances over all hours in the current date for the given org. flex_logs_compute_xlarge_avg int64 Shows the average number of Flex Logs Compute Extra Large Instances over all hours in the current date for the given org. flex_logs_compute_xsmall_avg int64 Shows the average number of Flex Logs Compute Extra Small Instances over all hours in the current date for the given org. flex_logs_starter_avg int64 Shows the average number of Flex Logs Starter Instances over all hours in the current date for the given org. flex_logs_starter_storage_index_avg int64 Shows the average number of Flex Logs Starter Storage Index Instances over all hours in the current date for the given org. flex_logs_starter_storage_retention_adjustment_avg int64 Shows the average number of Flex Logs Starter Storage Retention Adjustment Instances over all hours in the current date for the given org. flex_stored_logs_avg int64 Shows the average of all Flex Stored Logs over all hours in the current date for the given org. forwarding_events_bytes_sum int64 Shows the sum of all log bytes forwarded over all hours in the current date for all organizations. gcp_host_top99p int64 Shows the 99th percentile of all GCP hosts over all hours in the current date for all organizations. heroku_host_top99p int64 Shows the 99th percentile of all Heroku dynos over all hours in the current date for all organizations. incident_management_monthly_active_users_hwm int64 Shows the high-water mark of incident management monthly active users over all hours in the current date for all organizations. incident_management_seats_hwm int64 Shows the high-water mark of Incident Management seats over all hours on the current date for all organizations. indexed_events_count_sum int64 Shows the sum of all log events indexed over all hours in the current date for all organizations. infra_host_top99p int64 Shows the 99th percentile of all distinct infrastructure hosts over all hours in the current date for all organizations.
ingested_events_bytes_sum int64 Shows the sum of all log bytes ingested over all hours in the current date for all organizations. iot_device_sum int64 Shows the sum of all IoT devices over all hours in the current date for all organizations. iot_device_top99p int64 Shows the 99th percentile of all IoT devices over all hours in the current date for all organizations. llm_observability_min_spend_sum int64 Sum of all LLM observability minimum spend over all hours in the current date for all organizations. llm_observability_sum int64 Sum of all LLM observability sessions over all hours in the current date for all organizations. mobile_rum_lite_session_count_sum int64 **DEPRECATED** : Shows the sum of all mobile lite sessions over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_android_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Android over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_flutter_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Flutter over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_ios_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on iOS over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_reactnative_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on React Native over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_roku_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Roku over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_session_count_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). mobile_rum_units_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM units over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). ndm_netflow_events_sum int64 Shows the sum of all Network Device Monitoring NetFlow events over all hours in the current date for the given org. netflow_indexed_events_count_sum int64 **DEPRECATED** : Shows the sum of all Network flows indexed over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). network_device_wireless_top99p int64 Shows the 99th percentile of all Network Device Monitoring wireless devices over all hours in the current date for all organizations. npm_host_top99p int64 Shows the 99th percentile of all distinct Cloud Network Monitoring hosts (formerly known as Network hosts) over all hours in the current date for all organizations. observability_pipelines_bytes_processed_sum int64 Sum of all observability pipelines bytes processed over all hours in the current date for the given org. oci_host_sum int64 Shows the sum of all Oracle Cloud Infrastructure hosts over all hours in the current date for the given org. oci_host_top99p int64 Shows the 99th percentile of all Oracle Cloud Infrastructure hosts over all hours in the current date for the given org. on_call_seat_hwm int64 Shows the high-water mark of On-Call seats over all hours in the current date for all organizations.
online_archive_events_count_sum int64 Sum of all online archived events over all hours in the current date for all organizations. opentelemetry_apm_host_top99p int64 Shows the 99th percentile of APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for all organizations. opentelemetry_host_top99p int64 Shows the 99th percentile of all hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for all organizations. orgs [object] Organizations associated with a user. account_name string The account name. account_public_id string The account public id. agent_host_top99p int64 Shows the 99th percentile of all agent hosts over all hours in the current date for the given org. apm_azure_app_service_host_top99p int64 Shows the 99th percentile of all Azure app services using APM over all hours in the current date for the given org. apm_devsecops_host_top99p int64 Shows the 99th percentile of all APM DevSecOps hosts over all hours in the current date for the given org. apm_enterprise_standalone_hosts_top99p int64 Shows the 99th percentile of all distinct standalone Enterprise hosts over all hours in the current date for the given org. apm_fargate_count_avg int64 Shows the average of all APM ECS Fargate tasks over all hours in the current month for the given org. apm_host_top99p int64 Shows the 99th percentile of all distinct APM hosts over all hours in the current date for the given org. apm_pro_standalone_hosts_top99p int64 Shows the 99th percentile of all distinct standalone Pro hosts over all hours in the current date for the given org. appsec_fargate_count_avg int64 Shows the average of all Application Security Monitoring ECS Fargate tasks over all hours in the current month for the given org. asm_serverless_sum int64 Shows the sum of all Application Security Monitoring Serverless invocations over all hours in the current month for the given org. audit_logs_lines_indexed_sum int64 **DEPRECATED** : Shows the sum of all audit logs lines indexed over all hours in the current date for the given org (To be deprecated on October 1st, 2024). audit_trail_enabled_hwm int64 Shows whether Audit Trail is enabled for the current date for the given org. avg_profiled_fargate_tasks int64 The average total count for Fargate Container Profiler over all hours in the current month for the given org. aws_host_top99p int64 Shows the 99th percentile of all AWS hosts over all hours in the current date for the given org. aws_lambda_func_count int64 Shows the average of the number of functions that executed 1 or more times each hour in the current date for the given org. aws_lambda_invocations_sum int64 Shows the sum of all AWS Lambda invocations over all hours in the current date for the given org. azure_app_service_top99p int64 Shows the 99th percentile of all Azure app services over all hours in the current date for the given org. billable_ingested_bytes_sum int64 Shows the sum of all log bytes ingested over all hours in the current date for the given org. bits_ai_investigations_sum int64 Shows the sum of all Bits AI Investigations over all hours in the current date for the given org. browser_rum_lite_session_count_sum int64 **DEPRECATED** : Shows the sum of all browser lite sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024).
browser_rum_replay_session_count_sum int64 Shows the sum of all browser replay sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024). browser_rum_units_sum int64 **DEPRECATED** : Shows the sum of all browser RUM units over all hours in the current date for the given org (To be deprecated on October 1st, 2024). ci_pipeline_indexed_spans_sum int64 Shows the sum of all CI pipeline indexed spans over all hours in the current date for the given org. ci_test_indexed_spans_sum int64 Shows the sum of all CI test indexed spans over all hours in the current date for the given org. ci_visibility_itr_committers_hwm int64 Shows the high-water mark of all CI visibility intelligent test runner committers over all hours in the current date for the given org. ci_visibility_pipeline_committers_hwm int64 Shows the high-water mark of all CI visibility pipeline committers over all hours in the current date for the given org. ci_visibility_test_committers_hwm int64 Shows the high-water mark of all CI visibility test committers over all hours in the current date for the given org. cloud_cost_management_aws_host_count_avg int64 Host count average of Cloud Cost Management for AWS for the given date and given org. cloud_cost_management_azure_host_count_avg int64 Host count average of Cloud Cost Management for Azure for the given date and given org. cloud_cost_management_gcp_host_count_avg int64 Host count average of Cloud Cost Management for GCP for the given date and given org. cloud_cost_management_host_count_avg int64 Host count average of Cloud Cost Management for all cloud providers for the given date and given org. cloud_cost_management_oci_host_count_avg int64 Average host count for Cloud Cost Management on OCI for the given date and organization. cloud_siem_events_sum int64 Shows the sum of all Cloud Security Information and Event Management events over all hours in the current date for the given org. code_analysis_sa_committers_hwm int64 Shows the high-water mark of all Static Analysis committers over all hours in the current date for the given org. code_analysis_sca_committers_hwm int64 Shows the high-water mark of all static Software Composition Analysis committers over all hours in the current date for the given org. code_security_host_top99p int64 Shows the 99th percentile of all Code Security hosts over all hours in the current date for the given org. container_avg int64 Shows the average of all distinct containers over all hours in the current date for the given org. container_excl_agent_avg int64 Shows the average of containers without the Datadog Agent over all hours in the current date for the given organization. container_hwm int64 Shows the high-water mark of all distinct containers over all hours in the current date for the given org. csm_container_enterprise_compliance_count_sum int64 Shows the sum of all Cloud Security Management Enterprise compliance containers over all hours in the current date for the given org. csm_container_enterprise_cws_count_sum int64 Shows the sum of all Cloud Security Management Enterprise Cloud Workload Security containers over all hours in the current date for the given org. csm_container_enterprise_total_count_sum int64 Shows the sum of all Cloud Security Management Enterprise containers over all hours in the current date for the given org. 
csm_host_enterprise_aas_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure app services hosts over all hours in the current date for the given org. csm_host_enterprise_aws_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise AWS hosts over all hours in the current date for the given org. csm_host_enterprise_azure_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Azure hosts over all hours in the current date for the given org. csm_host_enterprise_compliance_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise compliance hosts over all hours in the current date for the given org. csm_host_enterprise_cws_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise Cloud Workload Security hosts over all hours in the current date for the given org. csm_host_enterprise_gcp_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise GCP hosts over all hours in the current date for the given org. csm_host_enterprise_total_host_count_top99p int64 Shows the 99th percentile of all Cloud Security Management Enterprise hosts over all hours in the current date for the given org. cspm_aas_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro Azure app services hosts over all hours in the current date for the given org. cspm_aws_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro AWS hosts over all hours in the current date for the given org. cspm_azure_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro Azure hosts over all hours in the current date for the given org. cspm_container_avg int64 Shows the average number of Cloud Security Management Pro containers over all hours in the current date for the given org. cspm_container_hwm int64 Shows the high-water mark of Cloud Security Management Pro containers over all hours in the current date for the given org. cspm_gcp_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro GCP hosts over all hours in the current date for the given org. cspm_host_top99p int64 Shows the 99th percentile of all Cloud Security Management Pro hosts over all hours in the current date for the given org. custom_historical_ts_avg int64 Shows the average number of distinct historical custom metrics over all hours in the current date for the given org. custom_live_ts_avg int64 Shows the average number of distinct live custom metrics over all hours in the current date for the given org. custom_ts_avg int64 Shows the average number of distinct custom metrics over all hours in the current date for the given org. cws_container_count_avg int64 Shows the average of all distinct Cloud Workload Security containers over all hours in the current date for the given org. cws_fargate_task_avg int64 Shows the average of all distinct Cloud Workload Security Fargate tasks over all hours in the current date for the given org. cws_host_top99p int64 Shows the 99th percentile of all Cloud Workload Security hosts over all hours in the current date for the given org. data_jobs_monitoring_host_hr_sum int64 Shows the sum of all Data Jobs Monitoring hosts over all hours in the current date for the given org. dbm_host_top99p_sum int64 Shows the 99th percentile of all Database Monitoring hosts over all hours in the current month for the given org. 
dbm_queries_avg_sum int64 Shows the average of all distinct Database Monitoring normalized queries over all hours in the current month for the given org. eph_infra_host_agent_sum int64 Shows the sum of all ephemeral infrastructure hosts with the Datadog Agent over all hours in the current date for the given org. eph_infra_host_alibaba_sum int64 Shows the sum of all ephemeral infrastructure hosts on Alibaba over all hours in the current date for the given org. eph_infra_host_aws_sum int64 Shows the sum of all ephemeral infrastructure hosts on AWS over all hours in the current date for the given org. eph_infra_host_azure_sum int64 Shows the sum of all ephemeral infrastructure hosts on Azure over all hours in the current date for the given org. eph_infra_host_ent_sum int64 Shows the sum of all ephemeral infrastructure hosts for Enterprise over all hours in the current date for the given org. eph_infra_host_gcp_sum int64 Shows the sum of all ephemeral infrastructure hosts on GCP over all hours in the current date for the given org. eph_infra_host_heroku_sum int64 Shows the sum of all ephemeral infrastructure hosts on Heroku over all hours in the current date for the given org. eph_infra_host_only_aas_sum int64 Shows the sum of all ephemeral infrastructure hosts with only Azure App Services over all hours in the current date for the given org. eph_infra_host_only_vsphere_sum int64 Shows the sum of all ephemeral infrastructure hosts with only vSphere over all hours in the current date for the given org. eph_infra_host_opentelemetry_apm_sum int64 Shows the sum of all ephemeral APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. eph_infra_host_opentelemetry_sum int64 Shows the sum of all ephemeral hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. eph_infra_host_pro_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro over all hours in the current date for the given org. eph_infra_host_proplus_sum int64 Shows the sum of all ephemeral infrastructure hosts for Pro Plus over all hours in the current date for the given org. eph_infra_host_proxmox_sum int64 Sum of all ephemeral infrastructure hosts for Proxmox over all hours in the current date for the given organization. error_tracking_apm_error_events_sum int64 Shows the sum of all Error Tracking APM error events over all hours in the current date for the given org. error_tracking_error_events_sum int64 Shows the sum of all Error Tracking error events over all hours in the current date for the given org. error_tracking_events_sum int64 Shows the sum of all Error Tracking events over all hours in the current date for the given org. error_tracking_rum_error_events_sum int64 Shows the sum of all Error Tracking RUM error events over all hours in the current date for the given org. event_management_correlation_correlated_events_sum int64 Shows the sum of all Event Management correlated events over all hours in the current date for the given org. event_management_correlation_correlated_related_events_sum int64 Shows the sum of all Event Management correlated related events over all hours in the current date for the given org. event_management_correlation_sum int64 Shows the sum of all Event Management correlations over all hours in the current date for the given org. 
fargate_container_profiler_profiling_fargate_avg int64 The average number of Profiling Fargate tasks over all hours in the current month for the given org. fargate_container_profiler_profiling_fargate_eks_avg int64 The average number of Profiling Fargate Elastic Kubernetes Service tasks over all hours in the current month for the given org. fargate_tasks_count_avg int64 The average task count for Fargate. fargate_tasks_count_hwm int64 Shows the high-water mark of all Fargate tasks over all hours in the current date for the given org. flex_logs_compute_large_avg int64 Shows the average number of Flex Logs Compute Large Instances over all hours in the current date for the given org. flex_logs_compute_medium_avg int64 Shows the average number of Flex Logs Compute Medium Instances over all hours in the current date for the given org. flex_logs_compute_small_avg int64 Shows the average number of Flex Logs Compute Small Instances over all hours in the current date for the given org. flex_logs_compute_xlarge_avg int64 Shows the average number of Flex Logs Compute Extra Large Instances over all hours in the current date for the given org. flex_logs_compute_xsmall_avg int64 Shows the average number of Flex Logs Compute Extra Small Instances over all hours in the current date for the given org. flex_logs_starter_avg int64 Shows the average number of Flex Logs Starter Instances over all hours in the current date for the given org. flex_logs_starter_storage_index_avg int64 Shows the average number of Flex Logs Starter Storage Index Instances over all hours in the current date for the given org. flex_logs_starter_storage_retention_adjustment_avg int64 Shows the average number of Flex Logs Starter Storage Retention Adjustment Instances over all hours in the current date for the given org. flex_stored_logs_avg int64 Shows the average of all Flex Stored Logs over all hours in the current date for the given org. forwarding_events_bytes_sum int64 Shows the sum of all log bytes forwarded over all hours in the current date for the given org. gcp_host_top99p int64 Shows the 99th percentile of all GCP hosts over all hours in the current date for the given org. heroku_host_top99p int64 Shows the 99th percentile of all Heroku dynos over all hours in the current date for the given org. id string The organization id. incident_management_monthly_active_users_hwm int64 Shows the high-water mark of incident management monthly active users over all hours in the current date for the given org. incident_management_seats_hwm int64 Shows the high-water mark of Incident Management seats over all hours on the current date for the given organization. indexed_events_count_sum int64 **DEPRECATED** : Shows the sum of all log events indexed over all hours in the current date for the given org (To be deprecated on October 1st, 2024). infra_host_top99p int64 Shows the 99th percentile of all distinct infrastructure hosts over all hours in the current date for the given org. ingested_events_bytes_sum int64 Shows the sum of all log bytes ingested over all hours in the current date for the given org. iot_device_agg_sum int64 Shows the sum of all IoT devices over all hours in the current date for the given org. iot_device_top99p_sum int64 Shows the 99th percentile of all IoT devices over all hours in the current date for the given org. llm_observability_min_spend_sum int64 Shows the sum of all LLM Observability minimum spend over all hours in the current date for the given org. 
llm_observability_sum int64 Shows the sum of all LLM observability sessions over all hours in the current date for the given org. mobile_rum_lite_session_count_sum int64 **DEPRECATED** : Shows the sum of all mobile lite sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_android_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Android over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_flutter_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Flutter over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_ios_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on iOS over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_reactnative_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on React Native over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_roku_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions on Roku over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_session_count_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024). mobile_rum_units_sum int64 **DEPRECATED** : Shows the sum of all mobile RUM units over all hours in the current date for the given org (To be deprecated on October 1st, 2024). name string The organization name. ndm_netflow_events_sum int64 Shows the sum of all Network Device Monitoring NetFlow events over all hours in the current date for the given org. netflow_indexed_events_count_sum int64 **DEPRECATED** : Shows the sum of all Network flows indexed over all hours in the current date for the given org (To be deprecated on October 1st, 2024). network_device_wireless_top99p int64 Shows the 99th percentile of all Network Device Monitoring wireless devices over all hours in the current date for the given org. npm_host_top99p int64 Shows the 99th percentile of all distinct Cloud Network Monitoring hosts (formerly known as Network hosts) over all hours in the current date for the given org. observability_pipelines_bytes_processed_sum int64 Sum of all observability pipelines bytes processed over all hours in the current date for the given org. oci_host_sum int64 Shows the sum of all Oracle Cloud Infrastructure hosts over all hours in the current date for the given org. oci_host_top99p int64 Shows the 99th percentile of all Oracle Cloud Infrastructure hosts over all hours in the current date for the given org. on_call_seat_hwm int64 Shows the high-water mark of On-Call seats over all hours in the current date for the given org. online_archive_events_count_sum int64 Sum of all online archived events over all hours in the current date for the given org. opentelemetry_apm_host_top99p int64 Shows the 99th percentile of APM hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. opentelemetry_host_top99p int64 Shows the 99th percentile of all hosts reported by the Datadog exporter for the OpenTelemetry Collector over all hours in the current date for the given org. 
product_analytics_sum int64 Shows the sum of all product analytics sessions over all hours in the current date for the given org. profiling_aas_count_top99p int64 Shows the 99th percentile of all profiled Azure app services over all hours in the current date for all organizations. profiling_host_top99p int64 Shows the 99th percentile of all profiled hosts over all hours within the current date for the given org. proxmox_host_sum int64 Sum of all Proxmox hosts over all hours in the current date for the given organization. proxmox_host_top99p int64 99th percentile of all Proxmox hosts over all hours in the current date for the given organization. public_id string The organization public id. published_app_hwm int64 Shows the high-water mark of all published applications over all hours in the current date for the given org. region string The region of the organization. rum_browser_and_mobile_session_count int64 Shows the sum of all mobile sessions and all browser lite and legacy sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024). rum_browser_legacy_session_count_sum int64 Shows the sum of all browser RUM legacy sessions over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_browser_lite_session_count_sum int64 Shows the sum of all browser RUM lite sessions over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_browser_replay_session_count_sum int64 Shows the sum of all browser RUM Session Replay counts over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_indexed_sessions_sum int64 Shows the sum of all RUM indexed sessions over all hours in the current date for the given org. rum_ingested_sessions_sum int64 Shows the sum of all RUM ingested sessions over all hours in the current date for the given org. rum_lite_session_count_sum int64 Shows the sum of all RUM lite sessions (browser and mobile) over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_android_sum int64 Shows the sum of all mobile RUM legacy sessions on Android over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_flutter_sum int64 Shows the sum of all mobile RUM legacy sessions on Flutter over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_ios_sum int64 Shows the sum of all mobile RUM legacy sessions on iOS over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_reactnative_sum int64 Shows the sum of all mobile RUM legacy sessions on React Native over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_roku_sum int64 Shows the sum of all mobile RUM legacy sessions on Roku over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_android_sum int64 Shows the sum of all mobile RUM lite sessions on Android over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_flutter_sum int64 Shows the sum of all mobile RUM lite sessions on Flutter over all hours in the current date for the given org (To be introduced on October 1st, 2024). 
rum_mobile_lite_session_count_ios_sum int64 Shows the sum of all mobile RUM lite sessions on iOS over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_kotlinmultiplatform_sum int64 Shows the sum of all mobile RUM lite sessions on Kotlin Multiplatform over all hours within the current date for the given org. rum_mobile_lite_session_count_reactnative_sum int64 Shows the sum of all mobile RUM lite sessions on React Native over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_roku_sum int64 Shows the sum of all mobile RUM lite sessions on Roku over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_unity_sum int64 Shows the sum of all mobile RUM lite sessions on Unity over all hours within the current date for the given org. rum_mobile_replay_session_count_android_sum int64 Shows the sum of all mobile RUM replay sessions on Android over all hours within the current date for the given org. rum_mobile_replay_session_count_ios_sum int64 Shows the sum of all mobile RUM replay sessions on iOS over all hours within the current date for the given org. rum_mobile_replay_session_count_kotlinmultiplatform_sum int64 Shows the sum of all mobile RUM replay sessions on Kotlin Multiplatform over all hours within the current date for the given org. rum_mobile_replay_session_count_reactnative_sum int64 Shows the sum of all mobile RUM replay sessions on React Native over all hours within the current date for the given org. rum_replay_session_count_sum int64 Shows the sum of all RUM Session Replay counts over all hours in the current date for the given org (To be introduced on October 1st, 2024). rum_session_count_sum int64 **DEPRECATED** : Shows the sum of all browser RUM lite sessions over all hours in the current date for the given org (To be deprecated on October 1st, 2024). rum_session_replay_add_on_sum int64 Shows the sum of all RUM session replay add-on sessions over all hours in the current date for the given org. rum_total_session_count_sum int64 Shows the sum of RUM sessions (browser and mobile) over all hours in the current date for the given org. rum_units_sum int64 **DEPRECATED** : Shows the sum of all browser and mobile RUM units over all hours in the current date for the given org (To be deprecated on October 1st, 2024). sca_fargate_count_avg int64 Shows the average of all Software Composition Analysis Fargate tasks over all hours in the current date for the given org. sca_fargate_count_hwm int64 Shows the sum of the high-water marks of all Software Composition Analysis Fargate tasks over all hours in the current date for the given org. sds_apm_scanned_bytes_sum int64 Sum of all APM bytes scanned with sensitive data scanner over all hours in the current date for the given org. sds_events_scanned_bytes_sum int64 Sum of all event stream events bytes scanned with sensitive data scanner over all hours in the current date for the given org. sds_logs_scanned_bytes_sum int64 Shows the sum of all bytes scanned of logs usage by the Sensitive Data Scanner over all hours in the current month for the given org. sds_rum_scanned_bytes_sum int64 Sum of all RUM bytes scanned with sensitive data scanner over all hours in the current date for the given org. 
sds_total_scanned_bytes_sum int64 Shows the sum of all bytes scanned across all usage types by the Sensitive Data Scanner over all hours in the current month for the given org. serverless_apps_apm_apm_azure_appservice_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure App Service instances for the given date and given org. serverless_apps_apm_apm_azure_azurefunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure Function instances for the given date and given org. serverless_apps_apm_apm_azure_containerapp_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure Container App instances for the given date and given org. serverless_apps_apm_apm_fargate_ecs_tasks_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Fargate Elastic Container Service tasks for the given date and given org. serverless_apps_apm_apm_gcp_cloudfunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Function instances for the given date and given org. serverless_apps_apm_apm_gcp_cloudrun_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Run instances for the given date and given org. serverless_apps_apm_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for the given date and given org. serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure App Service instances for the given date and given org. serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Function instances for the given date and given org. serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Container App instances for the given date and given org. serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Function instances for the given date and given org. serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Run instances for the given date and given org. serverless_apps_apm_excl_fargate_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for the given date and given org. serverless_apps_azure_container_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Container App instances for the given date and given org. serverless_apps_azure_count_avg int64 Shows the average number of Serverless Apps for Azure for the given date and given org. serverless_apps_azure_function_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Function App instances for the given date and given org. 
serverless_apps_azure_web_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Web App instances for the given date and given org. serverless_apps_ecs_avg int64 Shows the average number of Serverless Apps for Elastic Container Service for the given date and given org. serverless_apps_eks_avg int64 Shows the average number of Serverless Apps for Elastic Kubernetes Service for the given date and given org. serverless_apps_excl_fargate_avg int64 Shows the average number of Serverless Apps excluding Fargate for the given date and given org. serverless_apps_excl_fargate_azure_container_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Container App instances for the given date and given org. serverless_apps_excl_fargate_azure_function_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Function App instances for the given date and given org. serverless_apps_excl_fargate_azure_web_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Web App instances for the given date and given org. serverless_apps_excl_fargate_google_cloud_functions_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Functions instances for the given date and given org. serverless_apps_excl_fargate_google_cloud_run_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Run instances for the given date and given org. serverless_apps_google_cloud_functions_instances_avg int64 Shows the average number of Serverless Apps for Google Cloud Platform Cloud Functions instances for the given date and given org. serverless_apps_google_cloud_run_instances_avg int64 Shows the average number of Serverless Apps for Google Cloud Platform Cloud Run instances for the given date and given org. serverless_apps_google_count_avg int64 Shows the average number of Serverless Apps for Google Cloud for the given date and given org. serverless_apps_total_count_avg int64 Shows the average number of Serverless Apps for Azure and Google Cloud for the given date and given org. siem_analyzed_logs_add_on_count_sum int64 Shows the sum of all log events analyzed by Cloud SIEM over all hours in the current date for the given org. synthetics_browser_check_calls_count_sum int64 Shows the sum of all Synthetic browser tests over all hours in the current date for the given org. synthetics_check_calls_count_sum int64 Shows the sum of all Synthetic API tests over all hours in the current date for the given org. synthetics_mobile_test_runs_sum int64 Shows the sum of all Synthetic mobile application tests over all hours in the current date for the given org. synthetics_parallel_testing_max_slots_hwm int64 Shows the high-water mark of used synthetics parallel testing slots over all hours in the current date for the given org. trace_search_indexed_events_count_sum int64 Shows the sum of all Indexed Spans indexed over all hours in the current date for the given org. twol_ingested_events_bytes_sum int64 Shows the sum of all ingested APM span bytes over all hours in the current date for the given org. universal_service_monitoring_host_top99p int64 Shows the 99th percentile of all Universal Service Monitoring hosts over all hours in the current date for the given org. vsphere_host_top99p int64 Shows the 99th percentile of all vSphere hosts over all hours in the current date for the given org. 
vuln_management_host_count_top99p int64 Shows the 99th percentile of all Application Vulnerability Management hosts over all hours in the current date for the given org. workflow_executions_usage_sum int64 Sum of all workflows executed over all hours in the current date for the given org. product_analytics_sum int64 Sum of all product analytics sessions over all hours in the current date for all organizations. profiling_aas_count_top99p int64 Shows the 99th percentile of all profiled Azure app services over all hours in the current date for all organizations. profiling_host_top99p int64 Shows the 99th percentile of all profiled hosts over all hours within the current date for all organizations. proxmox_host_sum int64 Sum of all Proxmox hosts over all hours in the current date for all organizations. proxmox_host_top99p int64 99th percentile of all Proxmox hosts over all hours in the current date for all organizations. published_app_hwm int64 Shows the high-water mark of all published applications over all hours in the current date for all organizations. rum_browser_and_mobile_session_count int64 Shows the sum of all mobile sessions and all browser lite and legacy sessions over all hours in the current month for all organizations (To be deprecated on October 1st, 2024). rum_browser_legacy_session_count_sum int64 Shows the sum of all browser RUM legacy sessions over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_browser_lite_session_count_sum int64 Shows the sum of all browser RUM lite sessions over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_browser_replay_session_count_sum int64 Shows the sum of all browser RUM Session Replay counts over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_indexed_sessions_sum int64 Sum of all RUM indexed sessions over all hours in the current date for all organizations. rum_ingested_sessions_sum int64 Sum of all RUM ingested sessions over all hours in the current date for all organizations. rum_lite_session_count_sum int64 Shows the sum of all RUM lite sessions (browser and mobile) over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_android_sum int64 Shows the sum of all mobile RUM legacy sessions on Android over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_flutter_sum int64 Shows the sum of all mobile RUM legacy sessions on Flutter over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_ios_sum int64 Shows the sum of all mobile RUM legacy sessions on iOS over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_reactnative_sum int64 Shows the sum of all mobile RUM legacy sessions on React Native over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_legacy_session_count_roku_sum int64 Shows the sum of all mobile RUM legacy sessions on Roku over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_android_sum int64 Shows the sum of all mobile RUM lite sessions on Android over all hours in the current date for all organizations (To be introduced on October 1st, 2024).
rum_mobile_lite_session_count_flutter_sum int64 Shows the sum of all mobile RUM lite sessions on Flutter over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_ios_sum int64 Shows the sum of all mobile RUM lite sessions on iOS over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_kotlinmultiplatform_sum int64 Shows the sum of all mobile RUM lite sessions on Kotlin Multiplatform over all hours within the current date for all organizations. rum_mobile_lite_session_count_reactnative_sum int64 Shows the sum of all mobile RUM lite sessions on React Native over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_roku_sum int64 Shows the sum of all mobile RUM lite sessions on Roku over all hours within the current date for all organizations (To be introduced on October 1st, 2024). rum_mobile_lite_session_count_unity_sum int64 Shows the sum of all mobile RUM lite sessions on Unity over all hours within the current date for all organizations. rum_mobile_replay_session_count_android_sum int64 Shows the sum of all mobile RUM replay sessions on Android over all hours within the current date for the given org. rum_mobile_replay_session_count_ios_sum int64 Shows the sum of all mobile RUM replay sessions on iOS over all hours within the current date for the given org. rum_mobile_replay_session_count_kotlinmultiplatform_sum int64 Shows the sum of all mobile RUM replay sessions on Kotlin Multiplatform over all hours within the current date for all organizations. rum_mobile_replay_session_count_reactnative_sum int64 Shows the sum of all mobile RUM replay sessions on React Native over all hours within the current date for the given org. rum_replay_session_count_sum int64 Shows the sum of all RUM Session Replay counts over all hours in the current date for all organizations (To be introduced on October 1st, 2024). rum_session_count_sum int64 **DEPRECATED** : Shows the sum of all browser RUM lite sessions over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). rum_session_replay_add_on_sum int64 Sum of all RUM session replay add-on sessions over all hours in the current date for all organizations. rum_total_session_count_sum int64 Shows the sum of RUM sessions (browser and mobile) over all hours in the current date for all organizations. rum_units_sum int64 **DEPRECATED** : Shows the sum of all browser and mobile RUM units over all hours in the current date for all organizations (To be deprecated on October 1st, 2024). sca_fargate_count_avg int64 Shows the average of all Software Composition Analysis Fargate tasks over all hours in the current date for the given org. sca_fargate_count_hwm int64 Shows the sum of the high-water marks of all Software Composition Analysis Fargate tasks over all hours in the current date for the given org. sds_apm_scanned_bytes_sum int64 Sum of all APM bytes scanned with sensitive data scanner over all hours in the current date for all organizations. sds_events_scanned_bytes_sum int64 Sum of all event stream events bytes scanned with sensitive data scanner over all hours in the current date for all organizations. sds_logs_scanned_bytes_sum int64 Shows the sum of all bytes scanned of logs usage by the Sensitive Data Scanner over all hours in the current month for all organizations. 
sds_rum_scanned_bytes_sum int64 Sum of all RUM bytes scanned with sensitive data scanner over all hours in the current date for all organizations. sds_total_scanned_bytes_sum int64 Shows the sum of all bytes scanned across all usage types by the Sensitive Data Scanner over all hours in the current month for all organizations. serverless_apps_apm_apm_azure_appservice_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure App Service instances for the current date for all organizations. serverless_apps_apm_apm_azure_azurefunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure Function instances for the current date for all organizations. serverless_apps_apm_apm_azure_containerapp_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Azure Container App instances for the current date for all organizations. serverless_apps_apm_apm_fargate_ecs_tasks_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Fargate Elastic Container Service tasks for the current date for all organizations. serverless_apps_apm_apm_gcp_cloudfunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Function instances for the current date for all organizations. serverless_apps_apm_apm_gcp_cloudrun_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for Google Cloud Platform Cloud Run instances for the current date for all organizations. serverless_apps_apm_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring for the current date for all organizations. serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure App Service instances for the current date for all organizations. serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Function instances for the current date for all organizations. serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Azure Container App instances for the current date for all organizations. serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Function instances for the current date for all organizations. serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for Google Cloud Platform Cloud Run instances for the current date for all organizations. serverless_apps_apm_excl_fargate_avg int64 Shows the average number of Serverless Apps with Application Performance Monitoring excluding Fargate for the current date for all organizations. serverless_apps_azure_container_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Container App instances for the current date for all organizations. 
serverless_apps_azure_count_avg int64 Shows the average number of Serverless Apps for Azure for the given date and given org. serverless_apps_azure_function_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Function App instances for the current date for all organizations. serverless_apps_azure_web_app_instances_avg int64 Shows the average number of Serverless Apps for Azure Web App instances for the current date for all organizations. serverless_apps_ecs_avg int64 Shows the average number of Serverless Apps for Elastic Container Service for the current date for all organizations. serverless_apps_eks_avg int64 Shows the average number of Serverless Apps for Elastic Kubernetes Service for the current date for all organizations. serverless_apps_excl_fargate_avg int64 Shows the average number of Serverless Apps excluding Fargate for the current date for all organizations. serverless_apps_excl_fargate_azure_container_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Container App instances for the current date for all organizations. serverless_apps_excl_fargate_azure_function_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Function App instances for the current date for all organizations. serverless_apps_excl_fargate_azure_web_app_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Azure Web App instances for the current date for all organizations. serverless_apps_excl_fargate_google_cloud_functions_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Functions instances for the current date for all organizations. serverless_apps_excl_fargate_google_cloud_run_instances_avg int64 Shows the average number of Serverless Apps excluding Fargate for Google Cloud Platform Cloud Run instances for the current date for all organizations. serverless_apps_google_cloud_functions_instances_avg int64 Shows the average number of Serverless Apps for Google Cloud Platform Cloud Functions instances for the current date for all organizations. serverless_apps_google_cloud_run_instances_avg int64 Shows the average number of Serverless Apps for Google Cloud Platform Cloud Run instances for the current date for all organizations. serverless_apps_google_count_avg int64 Shows the average number of Serverless Apps for Google Cloud for the given date and given org. serverless_apps_total_count_avg int64 Shows the average number of Serverless Apps for Azure and Google Cloud for the given date and given org. siem_analyzed_logs_add_on_count_sum int64 Shows the sum of all log events analyzed by Cloud SIEM over all hours in the current date for the given org. synthetics_browser_check_calls_count_sum int64 Shows the sum of all Synthetic browser tests over all hours in the current date for all organizations. synthetics_check_calls_count_sum int64 Shows the sum of all Synthetic API tests over all hours in the current date for all organizations. synthetics_mobile_test_runs_sum int64 Shows the sum of all Synthetic mobile application tests over all hours in the current date for all organizations. synthetics_parallel_testing_max_slots_hwm int64 Shows the high-water mark of used synthetics parallel testing slots over all hours in the current date for all organizations. trace_search_indexed_events_count_sum int64 Shows the sum of all Indexed Spans indexed over all hours in the current date for all organizations. 
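To make the usage fields above easier to work with in practice, the following sketch shows one way to request a usage summary and read a few of these values before the remaining aggregate fields and the response model example below. It is illustrative only: the endpoint path (`/api/v1/usage/summary`), the query parameters, the `DD_SITE`/`DD_API_KEY`/`DD_APP_KEY` environment variable names, and the assumption that per-day entries are returned under a top-level `usage` array are not taken from this reference and should be checked against the usage metering API documentation for your site.

```python
# Illustrative sketch only. Assumptions (not from this reference): the v1 usage summary
# endpoint path and query parameters, the DD_SITE/DD_API_KEY/DD_APP_KEY environment
# variables, and that per-day entries are returned under a top-level "usage" array.
import os

import requests

site = os.environ.get("DD_SITE", "datadoghq.com")  # for example "datadoghq.eu" or "us3.datadoghq.com"
headers = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
}

resp = requests.get(
    f"https://api.{site}/api/v1/usage/summary",
    headers=headers,
    params={"start_month": "2024-01", "include_org_details": "true"},
)
resp.raise_for_status()
summary = resp.json()

# Account-level monthly aggregates use *_agg_sum / *_top99p_sum style names; per-day values
# carry the field names listed above, with per-org breakdowns nested under each day's "orgs"
# array. Fields that do not apply to the account are omitted, so read them with .get().
for day in summary.get("usage", []):
    print(day.get("date"), "infra_host_top99p:", day.get("infra_host_top99p"))
    for org in day.get("orgs", []):
        print("  ", org.get("name"), "container_avg:", org.get("container_avg"))
```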
twol_ingested_events_bytes_sum int64 Shows the sum of all ingested APM span bytes over all hours in the current date for all organizations. universal_service_monitoring_host_top99p int64 Shows the 99th percentile of all universal service management hosts over all hours in the current date for the given org. vsphere_host_top99p int64 Shows the 99th percentile of all vSphere hosts over all hours in the current date for all organizations. vuln_management_host_count_top99p int64 Shows the 99th percentile of all Application Vulnerability Management hosts over all hours in the current date for the given org. workflow_executions_usage_sum int64 Sum of all workflows executed over all hours in the current date for all organizations. vsphere_host_top99p_sum int64 Shows the 99th percentile of all vSphere hosts over all hours in the current month for all organizations. vuln_management_host_count_top99p_sum int64 Shows the 99th percentile of all Application Vulnerability Management hosts over all hours in the current month for all organizations. workflow_executions_usage_agg_sum int64 Sum of all workflows executed over all hours in the current month for all organizations. ``` { "agent_host_top99p_sum": "integer", "apm_azure_app_service_host_top99p_sum": "integer", "apm_devsecops_host_top99p_sum": "integer", "apm_enterprise_standalone_hosts_top99p_sum": "integer", "apm_fargate_count_avg_sum": "integer", "apm_host_top99p_sum": "integer", "apm_pro_standalone_hosts_top99p_sum": "integer", "appsec_fargate_count_avg_sum": "integer", "asm_serverless_agg_sum": "integer", "audit_logs_lines_indexed_agg_sum": "integer", "audit_trail_enabled_hwm_sum": "integer", "avg_profiled_fargate_tasks_sum": "integer", "aws_host_top99p_sum": "integer", "aws_lambda_func_count": "integer", "aws_lambda_invocations_sum": "integer", "azure_app_service_top99p_sum": "integer", "azure_host_top99p_sum": "integer", "billable_ingested_bytes_agg_sum": "integer", "bits_ai_investigations_agg_sum": "integer", "browser_rum_lite_session_count_agg_sum": "integer", "browser_rum_replay_session_count_agg_sum": "integer", "browser_rum_units_agg_sum": "integer", "ci_pipeline_indexed_spans_agg_sum": "integer", "ci_test_indexed_spans_agg_sum": "integer", "ci_visibility_itr_committers_hwm_sum": "integer", "ci_visibility_pipeline_committers_hwm_sum": "integer", "ci_visibility_test_committers_hwm_sum": "integer", "cloud_cost_management_aws_host_count_avg_sum": "integer", "cloud_cost_management_azure_host_count_avg_sum": "integer", "cloud_cost_management_gcp_host_count_avg_sum": "integer", "cloud_cost_management_host_count_avg_sum": "integer", "cloud_cost_management_oci_host_count_avg_sum": "integer", "cloud_siem_events_agg_sum": "integer", "code_analysis_sa_committers_hwm_sum": "integer", "code_analysis_sca_committers_hwm_sum": "integer", "code_security_host_top99p_sum": "integer", "container_avg_sum": "integer", "container_excl_agent_avg_sum": "integer", "container_hwm_sum": "integer", "csm_container_enterprise_compliance_count_agg_sum": "integer", "csm_container_enterprise_cws_count_agg_sum": "integer", "csm_container_enterprise_total_count_agg_sum": "integer", "csm_host_enterprise_aas_host_count_top99p_sum": "integer", "csm_host_enterprise_aws_host_count_top99p_sum": "integer", "csm_host_enterprise_azure_host_count_top99p_sum": "integer", "csm_host_enterprise_compliance_host_count_top99p_sum": "integer", "csm_host_enterprise_cws_host_count_top99p_sum": "integer", "csm_host_enterprise_gcp_host_count_top99p_sum": "integer", 
"csm_host_enterprise_total_host_count_top99p_sum": "integer", "cspm_aas_host_top99p_sum": "integer", "cspm_aws_host_top99p_sum": "integer", "cspm_azure_host_top99p_sum": "integer", "cspm_container_avg_sum": "integer", "cspm_container_hwm_sum": "integer", "cspm_gcp_host_top99p_sum": "integer", "cspm_host_top99p_sum": "integer", "custom_historical_ts_sum": "integer", "custom_live_ts_sum": "integer", "custom_ts_sum": "integer", "cws_container_avg_sum": "integer", "cws_fargate_task_avg_sum": "integer", "cws_host_top99p_sum": "integer", "data_jobs_monitoring_host_hr_agg_sum": "integer", "dbm_host_top99p_sum": "integer", "dbm_queries_avg_sum": "integer", "end_date": "2019-09-19T10:00:00.000Z", "eph_infra_host_agent_agg_sum": "integer", "eph_infra_host_alibaba_agg_sum": "integer", "eph_infra_host_aws_agg_sum": "integer", "eph_infra_host_azure_agg_sum": "integer", "eph_infra_host_ent_agg_sum": "integer", "eph_infra_host_gcp_agg_sum": "integer", "eph_infra_host_heroku_agg_sum": "integer", "eph_infra_host_only_aas_agg_sum": "integer", "eph_infra_host_only_vsphere_agg_sum": "integer", "eph_infra_host_opentelemetry_agg_sum": "integer", "eph_infra_host_opentelemetry_apm_agg_sum": "integer", "eph_infra_host_pro_agg_sum": "integer", "eph_infra_host_proplus_agg_sum": "integer", "eph_infra_host_proxmox_agg_sum": "integer", "error_tracking_apm_error_events_agg_sum": "integer", "error_tracking_error_events_agg_sum": "integer", "error_tracking_events_agg_sum": "integer", "error_tracking_rum_error_events_agg_sum": "integer", "event_management_correlation_agg_sum": "integer", "event_management_correlation_correlated_events_agg_sum": "integer", "event_management_correlation_correlated_related_events_agg_sum": "integer", "fargate_container_profiler_profiling_fargate_avg_sum": "integer", "fargate_container_profiler_profiling_fargate_eks_avg_sum": "integer", "fargate_tasks_count_avg_sum": "integer", "fargate_tasks_count_hwm_sum": "integer", "flex_logs_compute_large_avg_sum": "integer", "flex_logs_compute_medium_avg_sum": "integer", "flex_logs_compute_small_avg_sum": "integer", "flex_logs_compute_xlarge_avg_sum": "integer", "flex_logs_compute_xsmall_avg_sum": "integer", "flex_logs_starter_avg_sum": "integer", "flex_logs_starter_storage_index_avg_sum": "integer", "flex_logs_starter_storage_retention_adjustment_avg_sum": "integer", "flex_stored_logs_avg_sum": "integer", "forwarding_events_bytes_agg_sum": "integer", "gcp_host_top99p_sum": "integer", "heroku_host_top99p_sum": "integer", "incident_management_monthly_active_users_hwm_sum": "integer", "incident_management_seats_hwm_sum": "integer", "indexed_events_count_agg_sum": "integer", "infra_host_top99p_sum": "integer", "ingested_events_bytes_agg_sum": "integer", "iot_device_agg_sum": "integer", "iot_device_top99p_sum": "integer", "last_updated": "2019-09-19T10:00:00.000Z", "live_indexed_events_agg_sum": "integer", "live_ingested_bytes_agg_sum": "integer", "llm_observability_agg_sum": "integer", "llm_observability_min_spend_agg_sum": "integer", "logs_by_retention": { "orgs": { "usage": [ { "usage": [ { "logs_indexed_logs_usage_sum": "integer", "logs_live_indexed_logs_usage_sum": "integer", "logs_rehydrated_indexed_logs_usage_sum": "integer", "retention": "string" } ] } ] }, "usage": [ { "logs_indexed_logs_usage_agg_sum": "integer", "logs_live_indexed_logs_usage_agg_sum": "integer", "logs_rehydrated_indexed_logs_usage_agg_sum": "integer", "retention": "string" } ], "usage_by_month": { "date": "2019-09-19T10:00:00.000Z", "usage": [ { "logs_indexed_logs_usage_sum": 
"integer", "logs_live_indexed_logs_usage_sum": "integer", "logs_rehydrated_indexed_logs_usage_sum": "integer", "retention": "string" } ] } }, "mobile_rum_lite_session_count_agg_sum": "integer", "mobile_rum_session_count_agg_sum": "integer", "mobile_rum_session_count_android_agg_sum": "integer", "mobile_rum_session_count_flutter_agg_sum": "integer", "mobile_rum_session_count_ios_agg_sum": "integer", "mobile_rum_session_count_reactnative_agg_sum": "integer", "mobile_rum_session_count_roku_agg_sum": "integer", "mobile_rum_units_agg_sum": "integer", "ndm_netflow_events_agg_sum": "integer", "netflow_indexed_events_count_agg_sum": "integer", "network_device_wireless_top99p_sum": "integer", "npm_host_top99p_sum": "integer", "observability_pipelines_bytes_processed_agg_sum": "integer", "oci_host_agg_sum": "integer", "oci_host_top99p_sum": "integer", "on_call_seat_hwm_sum": "integer", "online_archive_events_count_agg_sum": "integer", "opentelemetry_apm_host_top99p_sum": "integer", "opentelemetry_host_top99p_sum": "integer", "product_analytics_agg_sum": "integer", "profiling_aas_count_top99p_sum": "integer", "profiling_container_agent_count_avg": "integer", "profiling_host_count_top99p_sum": "integer", "proxmox_host_agg_sum": "integer", "proxmox_host_top99p_sum": "integer", "published_app_hwm_sum": "integer", "rehydrated_indexed_events_agg_sum": "integer", "rehydrated_ingested_bytes_agg_sum": "integer", "rum_browser_and_mobile_session_count": "integer", "rum_browser_legacy_session_count_agg_sum": "integer", "rum_browser_lite_session_count_agg_sum": "integer", "rum_browser_replay_session_count_agg_sum": "integer", "rum_indexed_sessions_agg_sum": "integer", "rum_ingested_sessions_agg_sum": "integer", "rum_lite_session_count_agg_sum": "integer", "rum_mobile_legacy_session_count_android_agg_sum": "integer", "rum_mobile_legacy_session_count_flutter_agg_sum": "integer", "rum_mobile_legacy_session_count_ios_agg_sum": "integer", "rum_mobile_legacy_session_count_reactnative_agg_sum": "integer", "rum_mobile_legacy_session_count_roku_agg_sum": "integer", "rum_mobile_lite_session_count_android_agg_sum": "integer", "rum_mobile_lite_session_count_flutter_agg_sum": "integer", "rum_mobile_lite_session_count_ios_agg_sum": "integer", "rum_mobile_lite_session_count_kotlinmultiplatform_agg_sum": "integer", "rum_mobile_lite_session_count_reactnative_agg_sum": "integer", "rum_mobile_lite_session_count_roku_agg_sum": "integer", "rum_mobile_lite_session_count_unity_agg_sum": "integer", "rum_mobile_replay_session_count_android_agg_sum": "integer", "rum_mobile_replay_session_count_ios_agg_sum": "integer", "rum_mobile_replay_session_count_kotlinmultiplatform_agg_sum": "integer", "rum_mobile_replay_session_count_reactnative_agg_sum": "integer", "rum_replay_session_count_agg_sum": "integer", "rum_session_count_agg_sum": "integer", "rum_session_replay_add_on_agg_sum": "integer", "rum_total_session_count_agg_sum": "integer", "rum_units_agg_sum": "integer", "sca_fargate_count_avg_sum": "integer", "sca_fargate_count_hwm_sum": "integer", "sds_apm_scanned_bytes_sum": "integer", "sds_events_scanned_bytes_sum": "integer", "sds_logs_scanned_bytes_sum": "integer", "sds_rum_scanned_bytes_sum": "integer", "sds_total_scanned_bytes_sum": "integer", "serverless_apps_apm_apm_azure_appservice_instances_avg_sum": "integer", "serverless_apps_apm_apm_azure_azurefunction_instances_avg_sum": "integer", "serverless_apps_apm_apm_azure_containerapp_instances_avg_sum": "integer", "serverless_apps_apm_apm_fargate_ecs_tasks_avg_sum": "integer", 
"serverless_apps_apm_apm_gcp_cloudfunction_instances_avg_sum": "integer", "serverless_apps_apm_apm_gcp_cloudrun_instances_avg_sum": "integer", "serverless_apps_apm_avg_sum": "integer", "serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg_sum": "integer", "serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg_sum": "integer", "serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg_sum": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg_sum": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg_sum": "integer", "serverless_apps_apm_excl_fargate_avg_sum": "integer", "serverless_apps_azure_container_app_instances_avg_sum": "integer", "serverless_apps_azure_count_avg_sum": "integer", "serverless_apps_azure_function_app_instances_avg_sum": "integer", "serverless_apps_azure_web_app_instances_avg_sum": "integer", "serverless_apps_ecs_avg_sum": "integer", "serverless_apps_eks_avg_sum": "integer", "serverless_apps_excl_fargate_avg_sum": "integer", "serverless_apps_excl_fargate_azure_container_app_instances_avg_sum": "integer", "serverless_apps_excl_fargate_azure_function_app_instances_avg_sum": "integer", "serverless_apps_excl_fargate_azure_web_app_instances_avg_sum": "integer", "serverless_apps_excl_fargate_google_cloud_functions_instances_avg_sum": "integer", "serverless_apps_excl_fargate_google_cloud_run_instances_avg_sum": "integer", "serverless_apps_google_cloud_functions_instances_avg_sum": "integer", "serverless_apps_google_cloud_run_instances_avg_sum": "integer", "serverless_apps_google_count_avg_sum": "integer", "serverless_apps_total_count_avg_sum": "integer", "siem_analyzed_logs_add_on_count_agg_sum": "integer", "start_date": "2019-09-19T10:00:00.000Z", "synthetics_browser_check_calls_count_agg_sum": "integer", "synthetics_check_calls_count_agg_sum": "integer", "synthetics_mobile_test_runs_agg_sum": "integer", "synthetics_parallel_testing_max_slots_hwm_sum": "integer", "trace_search_indexed_events_count_agg_sum": "integer", "twol_ingested_events_bytes_agg_sum": "integer", "universal_service_monitoring_host_top99p_sum": "integer", "usage": [ { "agent_host_top99p": "integer", "apm_azure_app_service_host_top99p": "integer", "apm_devsecops_host_top99p": "integer", "apm_enterprise_standalone_hosts_top99p": "integer", "apm_fargate_count_avg": "integer", "apm_host_top99p": "integer", "apm_pro_standalone_hosts_top99p": "integer", "appsec_fargate_count_avg": "integer", "asm_serverless_sum": "integer", "audit_logs_lines_indexed_sum": "integer", "audit_trail_enabled_hwm": "integer", "avg_profiled_fargate_tasks": "integer", "aws_host_top99p": "integer", "aws_lambda_func_count": "integer", "aws_lambda_invocations_sum": "integer", "azure_app_service_top99p": "integer", "billable_ingested_bytes_sum": "integer", "bits_ai_investigations_sum": "integer", "browser_rum_lite_session_count_sum": "integer", "browser_rum_replay_session_count_sum": "integer", "browser_rum_units_sum": "integer", "ci_pipeline_indexed_spans_sum": "integer", "ci_test_indexed_spans_sum": "integer", "ci_visibility_itr_committers_hwm": "integer", "ci_visibility_pipeline_committers_hwm": "integer", "ci_visibility_test_committers_hwm": "integer", "cloud_cost_management_aws_host_count_avg": "integer", "cloud_cost_management_azure_host_count_avg": "integer", "cloud_cost_management_gcp_host_count_avg": "integer", "cloud_cost_management_host_count_avg": "integer", "cloud_cost_management_oci_host_count_avg": "integer", "cloud_siem_events_sum": 
"integer", "code_analysis_sa_committers_hwm": "integer", "code_analysis_sca_committers_hwm": "integer", "code_security_host_top99p": "integer", "container_avg": "integer", "container_excl_agent_avg": "integer", "container_hwm": "integer", "csm_container_enterprise_compliance_count_sum": "integer", "csm_container_enterprise_cws_count_sum": "integer", "csm_container_enterprise_total_count_sum": "integer", "csm_host_enterprise_aas_host_count_top99p": "integer", "csm_host_enterprise_aws_host_count_top99p": "integer", "csm_host_enterprise_azure_host_count_top99p": "integer", "csm_host_enterprise_compliance_host_count_top99p": "integer", "csm_host_enterprise_cws_host_count_top99p": "integer", "csm_host_enterprise_gcp_host_count_top99p": "integer", "csm_host_enterprise_total_host_count_top99p": "integer", "cspm_aas_host_top99p": "integer", "cspm_aws_host_top99p": "integer", "cspm_azure_host_top99p": "integer", "cspm_container_avg": "integer", "cspm_container_hwm": "integer", "cspm_gcp_host_top99p": "integer", "cspm_host_top99p": "integer", "custom_ts_avg": "integer", "cws_container_count_avg": "integer", "cws_fargate_task_avg": "integer", "cws_host_top99p": "integer", "data_jobs_monitoring_host_hr_sum": "integer", "date": "2019-09-19T10:00:00.000Z", "dbm_host_top99p": "integer", "dbm_queries_count_avg": "integer", "eph_infra_host_agent_sum": "integer", "eph_infra_host_alibaba_sum": "integer", "eph_infra_host_aws_sum": "integer", "eph_infra_host_azure_sum": "integer", "eph_infra_host_ent_sum": "integer", "eph_infra_host_gcp_sum": "integer", "eph_infra_host_heroku_sum": "integer", "eph_infra_host_only_aas_sum": "integer", "eph_infra_host_only_vsphere_sum": "integer", "eph_infra_host_opentelemetry_apm_sum": "integer", "eph_infra_host_opentelemetry_sum": "integer", "eph_infra_host_pro_sum": "integer", "eph_infra_host_proplus_sum": "integer", "eph_infra_host_proxmox_sum": "integer", "error_tracking_apm_error_events_sum": "integer", "error_tracking_error_events_sum": "integer", "error_tracking_events_sum": "integer", "error_tracking_rum_error_events_sum": "integer", "event_management_correlation_correlated_events_sum": "integer", "event_management_correlation_correlated_related_events_sum": "integer", "event_management_correlation_sum": "integer", "fargate_container_profiler_profiling_fargate_avg": "integer", "fargate_container_profiler_profiling_fargate_eks_avg": "integer", "fargate_tasks_count_avg": "integer", "fargate_tasks_count_hwm": "integer", "flex_logs_compute_large_avg": "integer", "flex_logs_compute_medium_avg": "integer", "flex_logs_compute_small_avg": "integer", "flex_logs_compute_xlarge_avg": "integer", "flex_logs_compute_xsmall_avg": "integer", "flex_logs_starter_avg": "integer", "flex_logs_starter_storage_index_avg": "integer", "flex_logs_starter_storage_retention_adjustment_avg": "integer", "flex_stored_logs_avg": "integer", "forwarding_events_bytes_sum": "integer", "gcp_host_top99p": "integer", "heroku_host_top99p": "integer", "incident_management_monthly_active_users_hwm": "integer", "incident_management_seats_hwm": "integer", "indexed_events_count_sum": "integer", "infra_host_top99p": "integer", "ingested_events_bytes_sum": "integer", "iot_device_sum": "integer", "iot_device_top99p": "integer", "llm_observability_min_spend_sum": "integer", "llm_observability_sum": "integer", "mobile_rum_lite_session_count_sum": "integer", "mobile_rum_session_count_android_sum": "integer", "mobile_rum_session_count_flutter_sum": "integer", "mobile_rum_session_count_ios_sum": "integer", 
"mobile_rum_session_count_reactnative_sum": "integer", "mobile_rum_session_count_roku_sum": "integer", "mobile_rum_session_count_sum": "integer", "mobile_rum_units_sum": "integer", "ndm_netflow_events_sum": "integer", "netflow_indexed_events_count_sum": "integer", "network_device_wireless_top99p": "integer", "npm_host_top99p": "integer", "observability_pipelines_bytes_processed_sum": "integer", "oci_host_sum": "integer", "oci_host_top99p": "integer", "on_call_seat_hwm": "integer", "online_archive_events_count_sum": "integer", "opentelemetry_apm_host_top99p": "integer", "opentelemetry_host_top99p": "integer", "orgs": [ { "account_name": "string", "account_public_id": "string", "agent_host_top99p": "integer", "apm_azure_app_service_host_top99p": "integer", "apm_devsecops_host_top99p": "integer", "apm_enterprise_standalone_hosts_top99p": "integer", "apm_fargate_count_avg": "integer", "apm_host_top99p": "integer", "apm_pro_standalone_hosts_top99p": "integer", "appsec_fargate_count_avg": "integer", "asm_serverless_sum": "integer", "audit_logs_lines_indexed_sum": "integer", "audit_trail_enabled_hwm": "integer", "avg_profiled_fargate_tasks": "integer", "aws_host_top99p": "integer", "aws_lambda_func_count": "integer", "aws_lambda_invocations_sum": "integer", "azure_app_service_top99p": "integer", "billable_ingested_bytes_sum": "integer", "bits_ai_investigations_sum": "integer", "browser_rum_lite_session_count_sum": "integer", "browser_rum_replay_session_count_sum": "integer", "browser_rum_units_sum": "integer", "ci_pipeline_indexed_spans_sum": "integer", "ci_test_indexed_spans_sum": "integer", "ci_visibility_itr_committers_hwm": "integer", "ci_visibility_pipeline_committers_hwm": "integer", "ci_visibility_test_committers_hwm": "integer", "cloud_cost_management_aws_host_count_avg": "integer", "cloud_cost_management_azure_host_count_avg": "integer", "cloud_cost_management_gcp_host_count_avg": "integer", "cloud_cost_management_host_count_avg": "integer", "cloud_cost_management_oci_host_count_avg": "integer", "cloud_siem_events_sum": "integer", "code_analysis_sa_committers_hwm": "integer", "code_analysis_sca_committers_hwm": "integer", "code_security_host_top99p": "integer", "container_avg": "integer", "container_excl_agent_avg": "integer", "container_hwm": "integer", "csm_container_enterprise_compliance_count_sum": "integer", "csm_container_enterprise_cws_count_sum": "integer", "csm_container_enterprise_total_count_sum": "integer", "csm_host_enterprise_aas_host_count_top99p": "integer", "csm_host_enterprise_aws_host_count_top99p": "integer", "csm_host_enterprise_azure_host_count_top99p": "integer", "csm_host_enterprise_compliance_host_count_top99p": "integer", "csm_host_enterprise_cws_host_count_top99p": "integer", "csm_host_enterprise_gcp_host_count_top99p": "integer", "csm_host_enterprise_total_host_count_top99p": "integer", "cspm_aas_host_top99p": "integer", "cspm_aws_host_top99p": "integer", "cspm_azure_host_top99p": "integer", "cspm_container_avg": "integer", "cspm_container_hwm": "integer", "cspm_gcp_host_top99p": "integer", "cspm_host_top99p": "integer", "custom_historical_ts_avg": "integer", "custom_live_ts_avg": "integer", "custom_ts_avg": "integer", "cws_container_count_avg": "integer", "cws_fargate_task_avg": "integer", "cws_host_top99p": "integer", "data_jobs_monitoring_host_hr_sum": "integer", "dbm_host_top99p_sum": "integer", "dbm_queries_avg_sum": "integer", "eph_infra_host_agent_sum": "integer", "eph_infra_host_alibaba_sum": "integer", "eph_infra_host_aws_sum": "integer", 
"eph_infra_host_azure_sum": "integer", "eph_infra_host_ent_sum": "integer", "eph_infra_host_gcp_sum": "integer", "eph_infra_host_heroku_sum": "integer", "eph_infra_host_only_aas_sum": "integer", "eph_infra_host_only_vsphere_sum": "integer", "eph_infra_host_opentelemetry_apm_sum": "integer", "eph_infra_host_opentelemetry_sum": "integer", "eph_infra_host_pro_sum": "integer", "eph_infra_host_proplus_sum": "integer", "eph_infra_host_proxmox_sum": "integer", "error_tracking_apm_error_events_sum": "integer", "error_tracking_error_events_sum": "integer", "error_tracking_events_sum": "integer", "error_tracking_rum_error_events_sum": "integer", "event_management_correlation_correlated_events_sum": "integer", "event_management_correlation_correlated_related_events_sum": "integer", "event_management_correlation_sum": "integer", "fargate_container_profiler_profiling_fargate_avg": "integer", "fargate_container_profiler_profiling_fargate_eks_avg": "integer", "fargate_tasks_count_avg": "integer", "fargate_tasks_count_hwm": "integer", "flex_logs_compute_large_avg": "integer", "flex_logs_compute_medium_avg": "integer", "flex_logs_compute_small_avg": "integer", "flex_logs_compute_xlarge_avg": "integer", "flex_logs_compute_xsmall_avg": "integer", "flex_logs_starter_avg": "integer", "flex_logs_starter_storage_index_avg": "integer", "flex_logs_starter_storage_retention_adjustment_avg": "integer", "flex_stored_logs_avg": "integer", "forwarding_events_bytes_sum": "integer", "gcp_host_top99p": "integer", "heroku_host_top99p": "integer", "id": "string", "incident_management_monthly_active_users_hwm": "integer", "incident_management_seats_hwm": "integer", "indexed_events_count_sum": "integer", "infra_host_top99p": "integer", "ingested_events_bytes_sum": "integer", "iot_device_agg_sum": "integer", "iot_device_top99p_sum": "integer", "llm_observability_min_spend_sum": "integer", "llm_observability_sum": "integer", "mobile_rum_lite_session_count_sum": "integer", "mobile_rum_session_count_android_sum": "integer", "mobile_rum_session_count_flutter_sum": "integer", "mobile_rum_session_count_ios_sum": "integer", "mobile_rum_session_count_reactnative_sum": "integer", "mobile_rum_session_count_roku_sum": "integer", "mobile_rum_session_count_sum": "integer", "mobile_rum_units_sum": "integer", "name": "string", "ndm_netflow_events_sum": "integer", "netflow_indexed_events_count_sum": "integer", "network_device_wireless_top99p": "integer", "npm_host_top99p": "integer", "observability_pipelines_bytes_processed_sum": "integer", "oci_host_sum": "integer", "oci_host_top99p": "integer", "on_call_seat_hwm": "integer", "online_archive_events_count_sum": "integer", "opentelemetry_apm_host_top99p": "integer", "opentelemetry_host_top99p": "integer", "product_analytics_sum": "integer", "profiling_aas_count_top99p": "integer", "profiling_host_top99p": "integer", "proxmox_host_sum": "integer", "proxmox_host_top99p": "integer", "public_id": "string", "published_app_hwm": "integer", "region": "string", "rum_browser_and_mobile_session_count": "integer", "rum_browser_legacy_session_count_sum": "integer", "rum_browser_lite_session_count_sum": "integer", "rum_browser_replay_session_count_sum": "integer", "rum_indexed_sessions_sum": "integer", "rum_ingested_sessions_sum": "integer", "rum_lite_session_count_sum": "integer", "rum_mobile_legacy_session_count_android_sum": "integer", "rum_mobile_legacy_session_count_flutter_sum": "integer", "rum_mobile_legacy_session_count_ios_sum": "integer", "rum_mobile_legacy_session_count_reactnative_sum": 
"integer", "rum_mobile_legacy_session_count_roku_sum": "integer", "rum_mobile_lite_session_count_android_sum": "integer", "rum_mobile_lite_session_count_flutter_sum": "integer", "rum_mobile_lite_session_count_ios_sum": "integer", "rum_mobile_lite_session_count_kotlinmultiplatform_sum": "integer", "rum_mobile_lite_session_count_reactnative_sum": "integer", "rum_mobile_lite_session_count_roku_sum": "integer", "rum_mobile_lite_session_count_unity_sum": "integer", "rum_mobile_replay_session_count_android_sum": "integer", "rum_mobile_replay_session_count_ios_sum": "integer", "rum_mobile_replay_session_count_kotlinmultiplatform_sum": "integer", "rum_mobile_replay_session_count_reactnative_sum": "integer", "rum_replay_session_count_sum": "integer", "rum_session_count_sum": "integer", "rum_session_replay_add_on_sum": "integer", "rum_total_session_count_sum": "integer", "rum_units_sum": "integer", "sca_fargate_count_avg": "integer", "sca_fargate_count_hwm": "integer", "sds_apm_scanned_bytes_sum": "integer", "sds_events_scanned_bytes_sum": "integer", "sds_logs_scanned_bytes_sum": "integer", "sds_rum_scanned_bytes_sum": "integer", "sds_total_scanned_bytes_sum": "integer", "serverless_apps_apm_apm_azure_appservice_instances_avg": "integer", "serverless_apps_apm_apm_azure_azurefunction_instances_avg": "integer", "serverless_apps_apm_apm_azure_containerapp_instances_avg": "integer", "serverless_apps_apm_apm_fargate_ecs_tasks_avg": "integer", "serverless_apps_apm_apm_gcp_cloudfunction_instances_avg": "integer", "serverless_apps_apm_apm_gcp_cloudrun_instances_avg": "integer", "serverless_apps_apm_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg": "integer", "serverless_apps_apm_excl_fargate_avg": "integer", "serverless_apps_azure_container_app_instances_avg": "integer", "serverless_apps_azure_count_avg": "integer", "serverless_apps_azure_function_app_instances_avg": "integer", "serverless_apps_azure_web_app_instances_avg": "integer", "serverless_apps_ecs_avg": "integer", "serverless_apps_eks_avg": "integer", "serverless_apps_excl_fargate_avg": "integer", "serverless_apps_excl_fargate_azure_container_app_instances_avg": "integer", "serverless_apps_excl_fargate_azure_function_app_instances_avg": "integer", "serverless_apps_excl_fargate_azure_web_app_instances_avg": "integer", "serverless_apps_excl_fargate_google_cloud_functions_instances_avg": "integer", "serverless_apps_excl_fargate_google_cloud_run_instances_avg": "integer", "serverless_apps_google_cloud_functions_instances_avg": "integer", "serverless_apps_google_cloud_run_instances_avg": "integer", "serverless_apps_google_count_avg": "integer", "serverless_apps_total_count_avg": "integer", "siem_analyzed_logs_add_on_count_sum": "integer", "synthetics_browser_check_calls_count_sum": "integer", "synthetics_check_calls_count_sum": "integer", "synthetics_mobile_test_runs_sum": "integer", "synthetics_parallel_testing_max_slots_hwm": "integer", "trace_search_indexed_events_count_sum": "integer", "twol_ingested_events_bytes_sum": "integer", "universal_service_monitoring_host_top99p": "integer", "vsphere_host_top99p": "integer", "vuln_management_host_count_top99p": "integer", 
"workflow_executions_usage_sum": "integer" } ], "product_analytics_sum": "integer", "profiling_aas_count_top99p": "integer", "profiling_host_top99p": "integer", "proxmox_host_sum": "integer", "proxmox_host_top99p": "integer", "published_app_hwm": "integer", "rum_browser_and_mobile_session_count": "integer", "rum_browser_legacy_session_count_sum": "integer", "rum_browser_lite_session_count_sum": "integer", "rum_browser_replay_session_count_sum": "integer", "rum_indexed_sessions_sum": "integer", "rum_ingested_sessions_sum": "integer", "rum_lite_session_count_sum": "integer", "rum_mobile_legacy_session_count_android_sum": "integer", "rum_mobile_legacy_session_count_flutter_sum": "integer", "rum_mobile_legacy_session_count_ios_sum": "integer", "rum_mobile_legacy_session_count_reactnative_sum": "integer", "rum_mobile_legacy_session_count_roku_sum": "integer", "rum_mobile_lite_session_count_android_sum": "integer", "rum_mobile_lite_session_count_flutter_sum": "integer", "rum_mobile_lite_session_count_ios_sum": "integer", "rum_mobile_lite_session_count_kotlinmultiplatform_sum": "integer", "rum_mobile_lite_session_count_reactnative_sum": "integer", "rum_mobile_lite_session_count_roku_sum": "integer", "rum_mobile_lite_session_count_unity_sum": "integer", "rum_mobile_replay_session_count_android_sum": "integer", "rum_mobile_replay_session_count_ios_sum": "integer", "rum_mobile_replay_session_count_kotlinmultiplatform_sum": "integer", "rum_mobile_replay_session_count_reactnative_sum": "integer", "rum_replay_session_count_sum": "integer", "rum_session_count_sum": "integer", "rum_session_replay_add_on_sum": "integer", "rum_total_session_count_sum": "integer", "rum_units_sum": "integer", "sca_fargate_count_avg": "integer", "sca_fargate_count_hwm": "integer", "sds_apm_scanned_bytes_sum": "integer", "sds_events_scanned_bytes_sum": "integer", "sds_logs_scanned_bytes_sum": "integer", "sds_rum_scanned_bytes_sum": "integer", "sds_total_scanned_bytes_sum": "integer", "serverless_apps_apm_apm_azure_appservice_instances_avg": "integer", "serverless_apps_apm_apm_azure_azurefunction_instances_avg": "integer", "serverless_apps_apm_apm_azure_containerapp_instances_avg": "integer", "serverless_apps_apm_apm_fargate_ecs_tasks_avg": "integer", "serverless_apps_apm_apm_gcp_cloudfunction_instances_avg": "integer", "serverless_apps_apm_apm_gcp_cloudrun_instances_avg": "integer", "serverless_apps_apm_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_appservice_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_azurefunction_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_azure_containerapp_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudfunction_instances_avg": "integer", "serverless_apps_apm_excl_fargate_apm_gcp_cloudrun_instances_avg": "integer", "serverless_apps_apm_excl_fargate_avg": "integer", "serverless_apps_azure_container_app_instances_avg": "integer", "serverless_apps_azure_count_avg": "integer", "serverless_apps_azure_function_app_instances_avg": "integer", "serverless_apps_azure_web_app_instances_avg": "integer", "serverless_apps_ecs_avg": "integer", "serverless_apps_eks_avg": "integer", "serverless_apps_excl_fargate_avg": "integer", "serverless_apps_excl_fargate_azure_container_app_instances_avg": "integer", "serverless_apps_excl_fargate_azure_function_app_instances_avg": "integer", "serverless_apps_excl_fargate_azure_web_app_instances_avg": "integer", "serverless_apps_excl_fargate_google_cloud_functions_instances_avg": "integer", 
"serverless_apps_excl_fargate_google_cloud_run_instances_avg": "integer", "serverless_apps_google_cloud_functions_instances_avg": "integer", "serverless_apps_google_cloud_run_instances_avg": "integer", "serverless_apps_google_count_avg": "integer", "serverless_apps_total_count_avg": "integer", "siem_analyzed_logs_add_on_count_sum": "integer", "synthetics_browser_check_calls_count_sum": "integer", "synthetics_check_calls_count_sum": "integer", "synthetics_mobile_test_runs_sum": "integer", "synthetics_parallel_testing_max_slots_hwm": "integer", "trace_search_indexed_events_count_sum": "integer", "twol_ingested_events_bytes_sum": "integer", "universal_service_monitoring_host_top99p": "integer", "vsphere_host_top99p": "integer", "vuln_management_host_count_top99p": "integer", "workflow_executions_usage_sum": "integer" } ], "vsphere_host_top99p_sum": "integer", "vuln_management_host_count_top99p_sum": "integer", "workflow_executions_usage_agg_sum": "integer" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get usage across your account Copy ``` # Required query arguments export start_month="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/summary?start_month=${start_month}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get usage across your account ``` """ Get usage across your account returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_summary( start_month=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get usage across your account ``` # Get usage across your account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_summary("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get usage across your account ``` // Get usage across your account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSummary(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageSummaryOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSummary`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`UsageMeteringApi.GetUsageSummary`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get usage across your account ``` // Get usage across your account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSummaryResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSummaryResponse result = apiInstance.getUsageSummary(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSummary"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get usage across your account ``` // Get usage across your account returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageSummaryOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_summary( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSummaryOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get usage across your account ``` /** * Get usage across your account returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageSummaryRequest = { startMonth: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageSummary(params) .then((data: v1.UsageSummaryResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for logs by index](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs-by-index) * [v1 (latest)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs-by-index-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/logs_by_indexhttps://api.ap2.datadoghq.com/api/v1/usage/logs_by_indexhttps://api.datadoghq.eu/api/v1/usage/logs_by_indexhttps://api.ddog-gov.com/api/v1/usage/logs_by_indexhttps://api.datadoghq.com/api/v1/usage/logs_by_indexhttps://api.us3.datadoghq.com/api/v1/usage/logs_by_indexhttps://api.us5.datadoghq.com/api/v1/usage/logs_by_index ### Overview Get hourly usage for logs by index. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. index_name array Comma-separated list of log index names. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByIndex-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByIndex-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByIndex-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByIndex-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of indexed logs for each hour and index for a given organization. Field Type Description usage [object] An array of objects regarding hourly usage of logs by index response. event_count int64 The total number of indexed logs for the queried hour. hour date-time The hour for the usage. index_id string The index ID for this usage. index_name string The user specified name for this index ID. org_name string The organization name. public_id string The organization public ID. retention int64 The retention period (in days) for this index ID. ``` { "usage": [ { "event_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "index_id": "string", "index_name": "string", "org_name": "string", "public_id": "string", "retention": "integer" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.

```
{ "errors": [ "Bad Request" ] }
```

### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) * [Python [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python-legacy)

##### Get hourly usage for logs by index

```
# Required query arguments
export start_hr="CHANGE_ME"

# Curl command -- use the endpoint for your Datadog site (api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com)
curl -X GET "https://api.datadoghq.com/api/v1/usage/logs_by_index?start_hr=${start_hr}" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Get hourly usage for logs by index

```
"""
Get hourly usage for logs by index returns "OK" response
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi
from datetime import datetime
from dateutil.tz import tzutc

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_logs_by_index(
        start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()),
    )
    print(response)
```

#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

(Set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`.)

##### Get hourly usage for logs by index

```
# Get hourly usage for logs by index returns "OK" response
require "datadog_api_client"
api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new
p api_instance.get_usage_logs_by_index("2021-11-11T11:11:11.111+00:00")
```

#### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Get hourly usage for logs by index ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date = '2020-01-01T00' end_date = '2020-01-01T02' index_name = 'main, marketing'
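# Fetch hourly indexed-log usage between start_date and end_date (exclusive) for the indexes in index_name;
# end_date must be later than start_date.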
dog.get_logs_by_index_usage(start_date, end_date, index_name) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for logs by index ``` // Get hourly usage for logs by index returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageLogsByIndex(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageLogsByIndexOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageLogsByIndex`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageLogsByIndex`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for logs by index ``` // Get hourly usage for logs by index returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageLogsByIndexResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageLogsByIndexResponse result = apiInstance.getUsageLogsByIndex(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageLogsByIndex"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for logs by index ``` # Consult the ruby example ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python-legacy) and then save the example to `example.py` and run following commands: ``` 
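# Set DD_SITE to exactly one Datadog site, for example datadoghq.com, datadoghq.eu, or us5.datadoghq.com.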
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python "example.py" ``` ##### Get hourly usage for logs by index ``` // Get hourly usage for logs by index returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageLogsByIndexOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_logs_by_index( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageLogsByIndexOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for logs by index ``` /** * Get hourly usage for logs by index returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageLogsByIndexRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageLogsByIndex(params) .then((data: v1.UsageLogsByIndexResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly logs usage by retention](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-logs-usage-by-retention) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-logs-usage-by-retention-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/logs-by-retentionhttps://api.ap2.datadoghq.com/api/v1/usage/logs-by-retentionhttps://api.datadoghq.eu/api/v1/usage/logs-by-retentionhttps://api.ddog-gov.com/api/v1/usage/logs-by-retentionhttps://api.datadoghq.com/api/v1/usage/logs-by-retentionhttps://api.us3.datadoghq.com/api/v1/usage/logs-by-retentionhttps://api.us5.datadoghq.com/api/v1/usage/logs-by-retention ### Overview Get hourly usage for indexed logs by retention period. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. 
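Because this v1 endpoint is deprecated in favor of the hourly-usage-by-product-family API linked in the note above, new code is usually better served by the v2 client. The following is a minimal Python sketch of that migration path, not taken from this page; the `get_hourly_usage` method name and the `"indexed_logs"` product family value are assumptions to verify against the v2 Usage Metering reference:

```
# Hypothetical migration sketch: pull hourly indexed-log usage from the v2
# hourly usage endpoint instead of the deprecated v1 logs-by-retention endpoint.
# `get_hourly_usage` and the "indexed_logs" family name are assumptions here.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api = UsageMeteringApi(api_client)
    response = api.get_hourly_usage(
        filter_timestamp_start=datetime.now(timezone.utc) - timedelta(days=5),
        filter_product_families="indexed_logs",
    )
    print(response)
```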
OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByRetention-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByRetention-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByRetention-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogsByRetention-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the indexed logs usage broken down by retention period for an organization during a given hour. Field Type Description usage [object] Get hourly usage for indexed logs by retention period. indexed_events_count int64 Total logs indexed with this retention period during a given hour. live_indexed_events_count int64 Live logs indexed with this retention period during a given hour. org_name string The organization name. public_id string The organization public ID. rehydrated_indexed_events_count int64 Rehydrated logs indexed with this retention period during a given hour. retention string The retention period in days or "custom" for all custom retention usage. ``` { "usage": [ { "indexed_events_count": "integer", "live_indexed_events_count": "integer", "org_name": "string", "public_id": "string", "rehydrated_indexed_events_count": "integer", "retention": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly logs usage by retention Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/logs-by-retention?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly logs usage by retention ``` """ Get hourly logs usage by retention returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_logs_by_retention( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly logs usage by retention ``` # Get hourly logs usage by retention returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_logs_by_retention((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly logs usage by retention ``` // Get hourly logs usage by retention returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageLogsByRetention(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageLogsByRetentionOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling 
`UsageMeteringApi.GetUsageLogsByRetention`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageLogsByRetention`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly logs usage by retention ``` // Get hourly logs usage by retention returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageLogsByRetentionOptionalParameters; import com.datadog.api.client.v1.model.UsageLogsByRetentionResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageLogsByRetentionResponse result = apiInstance.getUsageLogsByRetention( OffsetDateTime.now().plusDays(-5), new GetUsageLogsByRetentionOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageLogsByRetention"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly logs usage by retention ``` // Get hourly logs usage by retention returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageLogsByRetentionOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_logs_by_retention( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageLogsByRetentionOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly logs usage by retention ``` /** * Get hourly 
logs usage by retention returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageLogsByRetentionRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageLogsByRetention(params) .then((data: v1.UsageLogsByRetentionResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for hosts and containers](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-hosts-and-containers) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-hosts-and-containers-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/hostshttps://api.ap2.datadoghq.com/api/v1/usage/hostshttps://api.datadoghq.eu/api/v1/usage/hostshttps://api.ddog-gov.com/api/v1/usage/hostshttps://api.datadoghq.com/api/v1/usage/hostshttps://api.us3.datadoghq.com/api/v1/usage/hostshttps://api.us5.datadoghq.com/api/v1/usage/hosts ### Overview Get hourly usage for hosts and containers. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageHosts-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageHosts-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageHosts-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageHosts-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Host usage response. Field Type Description usage [object] An array of objects related to host usage. agent_host_count int64 Contains the total number of infrastructure hosts reporting during a given hour that were running the Datadog Agent. alibaba_host_count int64 Contains the total number of hosts that reported through Alibaba integration (and were NOT running the Datadog Agent). apm_azure_app_service_host_count int64 Contains the total number of Azure App Services hosts using APM. 
apm_host_count int64 Shows the total number of hosts using APM during the hour, these are counted as billable (except during trial periods). aws_host_count int64 Contains the total number of hosts that reported through the AWS integration (and were NOT running the Datadog Agent). azure_host_count int64 Contains the total number of hosts that reported through Azure integration (and were NOT running the Datadog Agent). container_count int64 Shows the total number of containers reported by the Docker integration during the hour. gcp_host_count int64 Contains the total number of hosts that reported through the Google Cloud integration (and were NOT running the Datadog Agent). heroku_host_count int64 Contains the total number of Heroku dynos reported by the Datadog Agent. host_count int64 Contains the total number of billable infrastructure hosts reporting during a given hour. This is the sum of `agent_host_count`, `aws_host_count`, and `gcp_host_count`. hour date-time The hour for the usage. infra_azure_app_service int64 Contains the total number of hosts that reported through the Azure App Services integration (and were NOT running the Datadog Agent). opentelemetry_apm_host_count int64 Contains the total number of hosts using APM reported by Datadog exporter for the OpenTelemetry Collector. opentelemetry_host_count int64 Contains the total number of hosts reported by Datadog exporter for the OpenTelemetry Collector. org_name string The organization name. public_id string The organization public ID. vsphere_host_count int64 Contains the total number of hosts that reported through vSphere integration (and were NOT running the Datadog Agent). ``` { "usage": [ { "agent_host_count": "integer", "alibaba_host_count": "integer", "apm_azure_app_service_host_count": "integer", "apm_host_count": "integer", "aws_host_count": "integer", "azure_host_count": "integer", "container_count": "integer", "gcp_host_count": "integer", "heroku_host_count": "integer", "host_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "infra_azure_app_service": "integer", "opentelemetry_apm_host_count": "integer", "opentelemetry_host_count": "integer", "org_name": "string", "public_id": "string", "vsphere_host_count": "integer" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for hosts and containers Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/hosts?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for hosts and containers ``` """ Get hourly usage for hosts and containers returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_hosts( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for hosts and containers ``` # Get hourly usage for hosts and containers returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_hosts((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for hosts and containers ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_hosts_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for hosts and 
containers ``` // Get hourly usage for hosts and containers returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageHosts(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageHostsOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageHosts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageHosts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for hosts and containers ``` // Get hourly usage for hosts and containers returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageHostsOptionalParameters; import com.datadog.api.client.v1.model.UsageHostsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageHostsResponse result = apiInstance.getUsageHosts( OffsetDateTime.now().plusDays(-5), new GetUsageHostsOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageHosts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for hosts and containers ``` // Get hourly usage for hosts and containers returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageHostsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_hosts( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageHostsOptionalParams::default().end_hr( 
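// end_hr is optional and exclusive: the response covers usage strictly before this hour (see the Arguments table above)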
DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for hosts and containers ``` /** * Get hourly usage for hosts and containers returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageHostsRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageHosts(params) .then((data: v1.UsageHostsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-logs-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/logshttps://api.ap2.datadoghq.com/api/v1/usage/logshttps://api.datadoghq.eu/api/v1/usage/logshttps://api.ddog-gov.com/api/v1/usage/logshttps://api.datadoghq.com/api/v1/usage/logshttps://api.us3.datadoghq.com/api/v1/usage/logshttps://api.us5.datadoghq.com/api/v1/usage/logs ### Overview Get hourly usage for logs. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. 
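As a concrete illustration of the `[YYYY-MM-DDThh]` format described above, the sketch below builds hour-precision UTC timestamps and calls this endpoint directly over HTTP with the same headers used in the cURL example further down. It is a minimal sketch, not part of the official examples: it assumes the third-party `requests` package, the `datadoghq.com` site, and `DD_API_KEY`/`DD_APP_KEY` environment variables.

```
# Minimal sketch (not from the official examples): query GET /api/v1/usage/logs
# with hour-precision [YYYY-MM-DDThh] timestamps. Assumes the `requests` package
# and the datadoghq.com site; substitute your own site if needed.
import os
from datetime import datetime, timedelta, timezone

import requests

now = datetime.now(timezone.utc)
params = {
    # Usage beginning at this hour (UTC)
    "start_hr": (now - timedelta(hours=24)).strftime("%Y-%m-%dT%H"),
    # Optional: usage ending *before* this hour
    "end_hr": now.strftime("%Y-%m-%dT%H"),
}
resp = requests.get(
    "https://api.datadoghq.com/api/v1/usage/logs",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params=params,
    timeout=30,
)
resp.raise_for_status()
for hourly in resp.json().get("usage", []):
    print(hourly.get("hour"), hourly.get("indexed_events_count"))
```

The `usage` array and its `indexed_events_count` field are described in the response model that follows.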
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogs-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogs-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogs-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLogs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of logs for each hour. Field Type Description usage [object] An array of objects regarding hourly usage of logs. billable_ingested_bytes int64 Contains the number of billable log bytes ingested. hour date-time The hour for the usage. indexed_events_count int64 Contains the number of log events indexed. ingested_events_bytes int64 Contains the number of log bytes ingested. logs_forwarding_events_bytes int64 Contains the number of logs forwarded bytes (data available as of April 1st 2023) logs_live_indexed_count int64 Contains the number of live log events indexed (data available as of December 1, 2020). logs_live_ingested_bytes int64 Contains the number of live log bytes ingested (data available as of December 1, 2020). logs_rehydrated_indexed_count int64 Contains the number of rehydrated log events indexed (data available as of December 1, 2020). logs_rehydrated_ingested_bytes int64 Contains the number of rehydrated log bytes ingested (data available as of December 1, 2020). org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "billable_ingested_bytes": "integer", "hour": "2019-09-19T10:00:00.000Z", "indexed_events_count": "integer", "ingested_events_bytes": "integer", "logs_forwarding_events_bytes": "integer", "logs_live_indexed_count": "integer", "logs_live_ingested_bytes": "integer", "logs_rehydrated_indexed_count": "integer", "logs_rehydrated_ingested_bytes": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for logs Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/logs?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for logs ``` """ Get hourly usage for logs returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_logs( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for logs ``` # Get hourly usage for logs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_logs("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for logs ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_logs_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for logs ``` // Get hourly usage for logs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
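// datadogV1 provides the deprecated v1 UsageMeteringApi used in this example; the v2 "Get hourly usage by product family" endpoint is the documented replacement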
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageLogs(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageLogsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for logs ``` // Get hourly usage for logs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageLogsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageLogsResponse result = apiInstance.getUsageLogs(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for logs ``` // Get hourly usage for logs returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageLogsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_logs( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageLogsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly 
usage for logs ``` /** * Get hourly usage for logs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageLogsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageLogs(params) .then((data: v1.UsageLogsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for custom metrics](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-custom-metrics) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-custom-metrics-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/timeserieshttps://api.ap2.datadoghq.com/api/v1/usage/timeserieshttps://api.datadoghq.eu/api/v1/usage/timeserieshttps://api.ddog-gov.com/api/v1/usage/timeserieshttps://api.datadoghq.com/api/v1/usage/timeserieshttps://api.us3.datadoghq.com/api/v1/usage/timeserieshttps://api.us5.datadoghq.com/api/v1/usage/timeseries ### Overview Get hourly usage for [custom metrics](https://docs.datadoghq.com/developers/metrics/custom_metrics/). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTimeseries-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTimeseries-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTimeseries-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageTimeseries-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing hourly usage of timeseries. Field Type Description usage [object] An array of objects regarding hourly usage of timeseries. hour date-time The hour for the usage. num_custom_input_timeseries int64 Contains the number of custom metrics that are inputs for aggregations (metric configured is custom). num_custom_output_timeseries int64 Contains the number of custom metrics that are outputs for aggregations (metric configured is custom). 
num_custom_timeseries int64 Contains sum of non-aggregation custom metrics and custom metrics that are outputs for aggregations. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "num_custom_input_timeseries": "integer", "num_custom_output_timeseries": "integer", "num_custom_timeseries": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for custom metrics Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/timeseries?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for custom metrics ``` """ Get hourly usage for custom metrics returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_timeseries( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 
"example.py" ``` ##### Get hourly usage for custom metrics ``` # Get hourly usage for custom metrics returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_timeseries((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for custom metrics ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_custom_metrics_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for custom metrics ``` // Get hourly usage for custom metrics returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageTimeseries(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageTimeseriesOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageTimeseries`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageTimeseries`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for custom metrics ``` // Get hourly usage for custom metrics returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageTimeseriesOptionalParameters; import com.datadog.api.client.v1.model.UsageTimeseriesResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageTimeseriesResponse result = apiInstance.getUsageTimeseries( OffsetDateTime.now().plusDays(-5), new GetUsageTimeseriesOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) 
{ System.err.println("Exception when calling UsageMeteringApi#getUsageTimeseries"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for custom metrics ``` // Get hourly usage for custom metrics returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageTimeseriesOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_timeseries( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageTimeseriesOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for custom metrics ``` /** * Get hourly usage for custom metrics returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageTimeseriesRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageTimeseries(params) .then((data: v1.UsageTimeseriesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for indexed spans](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-indexed-spans) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-indexed-spans-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/indexed-spanshttps://api.ap2.datadoghq.com/api/v1/usage/indexed-spanshttps://api.datadoghq.eu/api/v1/usage/indexed-spanshttps://api.ddog-gov.com/api/v1/usage/indexed-spanshttps://api.datadoghq.com/api/v1/usage/indexed-spanshttps://api.us3.datadoghq.com/api/v1/usage/indexed-spanshttps://api.us5.datadoghq.com/api/v1/usage/indexed-spans ### Overview Get hourly usage for indexed spans. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageIndexedSpans-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageIndexedSpans-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageIndexedSpans-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageIndexedSpans-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) A response containing indexed spans usage. Field Type Description usage [object] Array with the number of hourly traces indexed for a given organization. hour date-time The hour for the usage. indexed_events_count int64 Contains the number of spans indexed. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "indexed_events_count": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for indexed spans Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/indexed-spans?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for indexed spans ``` """ Get hourly usage for indexed spans returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_indexed_spans( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for indexed spans ``` # Get hourly usage for indexed spans returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_indexed_spans((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for indexed spans ``` // Get hourly usage for indexed spans returns "OK" response package main import ( 
"context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageIndexedSpans(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageIndexedSpansOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageIndexedSpans`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageIndexedSpans`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for indexed spans ``` // Get hourly usage for indexed spans returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageIndexedSpansOptionalParameters; import com.datadog.api.client.v1.model.UsageIndexedSpansResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageIndexedSpansResponse result = apiInstance.getUsageIndexedSpans( OffsetDateTime.now().plusDays(-5), new GetUsageIndexedSpansOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageIndexedSpans"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for indexed spans ``` // Get hourly usage for indexed spans returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageIndexedSpansOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_indexed_spans( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageIndexedSpansOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse 
datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for indexed spans ``` /** * Get hourly usage for indexed spans returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageIndexedSpansRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageIndexedSpans(params) .then((data: v1.UsageIndexedSpansResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for synthetics checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-checks) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-checks-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/syntheticshttps://api.ap2.datadoghq.com/api/v1/usage/syntheticshttps://api.datadoghq.eu/api/v1/usage/syntheticshttps://api.ddog-gov.com/api/v1/usage/syntheticshttps://api.datadoghq.com/api/v1/usage/syntheticshttps://api.us3.datadoghq.com/api/v1/usage/syntheticshttps://api.us5.datadoghq.com/api/v1/usage/synthetics ### Overview Get hourly usage for [synthetics checks](https://docs.datadoghq.com/synthetics/). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. 
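Because each entry in the response covers a single hour, totals over a longer window have to be computed client-side. The sketch below is illustrative only: it reuses the v1 Python client call shown in the code example further down, assumes the client accepts the optional `end_hr` keyword documented in the Arguments table above, and sums the `check_calls_count` field described in the response model that follows.

```
# Illustrative sketch: total Synthetics API test runs over the last 24 hours.
# Reuses the v1 Python client shown below; the aggregation is not part of the API.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

end = datetime.now(timezone.utc)
start = end - timedelta(hours=24)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    # start_hr is required; end_hr is optional and exclusive (usage before this hour)
    response = api_instance.get_usage_synthetics(start_hr=start, end_hr=end)

hourly_entries = getattr(response, "usage", None) or []
total = sum(getattr(entry, "check_calls_count", 0) or 0 for entry in hourly_entries)
print(f"Synthetics API test runs from {start:%Y-%m-%dT%H} to {end:%Y-%m-%dT%H} UTC: {total}")
```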
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSynthetics-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSynthetics-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSynthetics-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSynthetics-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of Synthetics API tests run for each hour for a given organization. Field Type Description usage [object] Array with the number of hourly Synthetics test run for a given organization. check_calls_count int64 Contains the number of Synthetics API tests run. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "check_calls_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for synthetics checks ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X GET "https://api.datadoghq.com/api/v1/usage/synthetics?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for synthetics checks ``` """ Get hourly usage for synthetics checks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_synthetics( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py" ``` ##### Get hourly usage for synthetics checks ``` # Get hourly usage for synthetics checks returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_synthetics("2021-11-11T11:11:11.111+00:00") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb" ``` ##### Get hourly usage for synthetics checks ``` // Get hourly usage for synthetics checks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSynthetics(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageSyntheticsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSynthetics`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ :=
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageSynthetics`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for synthetics checks ``` // Get hourly usage for synthetics checks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSyntheticsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSyntheticsResponse result = apiInstance.getUsageSynthetics(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSynthetics"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for synthetics checks ``` // Get hourly usage for synthetics checks returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageSyntheticsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_synthetics( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSyntheticsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for synthetics checks ``` /** * Get hourly usage for synthetics checks returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageSyntheticsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageSynthetics(params) .then((data: v1.UsageSyntheticsResponse) => { console.log( "API 
called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for synthetics API checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-api-checks) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-api-checks-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/synthetics_apihttps://api.ap2.datadoghq.com/api/v1/usage/synthetics_apihttps://api.datadoghq.eu/api/v1/usage/synthetics_apihttps://api.ddog-gov.com/api/v1/usage/synthetics_apihttps://api.datadoghq.com/api/v1/usage/synthetics_apihttps://api.us3.datadoghq.com/api/v1/usage/synthetics_apihttps://api.us5.datadoghq.com/api/v1/usage/synthetics_api ### Overview Get hourly usage for [synthetics API checks](https://docs.datadoghq.com/synthetics/). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsAPI-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsAPI-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsAPI-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsAPI-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of Synthetics API tests run for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Synthetics API tests. check_calls_count int64 Contains the number of Synthetics API tests run. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "check_calls_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for synthetics API checks Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/synthetics_api?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for synthetics API checks ``` """ Get hourly usage for synthetics API checks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_synthetics_api( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for synthetics API checks ``` # Get hourly usage for synthetics API checks returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_synthetics_api("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for synthetics API checks ``` require 'rubygems' require 
'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_synthetics_api_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for synthetics API checks ``` // Get hourly usage for synthetics API checks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSyntheticsAPI(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageSyntheticsAPIOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSyntheticsAPI`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageSyntheticsAPI`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for synthetics API checks ``` // Get hourly usage for synthetics API checks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSyntheticsAPIResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSyntheticsAPIResponse result = apiInstance.getUsageSyntheticsAPI(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSyntheticsAPI"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for synthetics API checks ``` // Get hourly usage for synthetics API checks returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use 
datadog_api_client::datadogV1::api_usage_metering::GetUsageSyntheticsAPIOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_synthetics_api( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSyntheticsAPIOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for synthetics API checks ``` /** * Get hourly usage for synthetics API checks returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageSyntheticsAPIRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageSyntheticsAPI(params) .then((data: v1.UsageSyntheticsAPIResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for synthetics browser checks](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-browser-checks) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-synthetics-browser-checks-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/synthetics_browserhttps://api.ap2.datadoghq.com/api/v1/usage/synthetics_browserhttps://api.datadoghq.eu/api/v1/usage/synthetics_browserhttps://api.ddog-gov.com/api/v1/usage/synthetics_browserhttps://api.datadoghq.com/api/v1/usage/synthetics_browserhttps://api.us3.datadoghq.com/api/v1/usage/synthetics_browserhttps://api.us5.datadoghq.com/api/v1/usage/synthetics_browser ### Overview Get hourly usage for synthetics browser checks. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. 
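Because this endpoint is deprecated, new integrations should read the same data from the hourly usage by product family endpoint referenced above. As a hedged sketch of that migration using the Python client's v2 `UsageMeteringApi.get_hourly_usage` method, where the `synthetics_browser` product family string is an assumption to verify against the product family list for your org:

```
# Possible v2 replacement for this deprecated v1 browser-check endpoint.
# Assumption: "synthetics_browser" is the product family name to filter on.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_hourly_usage(
        filter_timestamp_start=datetime.now(timezone.utc) - timedelta(days=5),
        filter_product_families="synthetics_browser",  # assumed family name
    )
    print(response)
```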
### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsBrowser-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsBrowser-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsBrowser-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSyntheticsBrowser-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of Synthetics Browser tests run for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Synthetics Browser tests. browser_check_calls_count int64 Contains the number of Synthetics Browser tests run. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "browser_check_calls_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for synthetics browser checks Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/synthetics_browser?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for synthetics browser checks ``` """ Get hourly usage for synthetics browser checks returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_synthetics_browser( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for synthetics browser checks ``` # Get hourly usage for synthetics browser checks returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_synthetics_browser("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for synthetics browser checks ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_synthetics_browser_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for synthetics browser checks ``` 
// Get hourly usage for synthetics browser checks returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSyntheticsBrowser(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageSyntheticsBrowserOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSyntheticsBrowser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageSyntheticsBrowser`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for synthetics browser checks ``` // Get hourly usage for synthetics browser checks returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSyntheticsBrowserResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSyntheticsBrowserResponse result = apiInstance.getUsageSyntheticsBrowser( OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSyntheticsBrowser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for synthetics browser checks ``` // Get hourly usage for synthetics browser checks returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageSyntheticsBrowserOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_synthetics_browser( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSyntheticsBrowserOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for synthetics browser checks ``` /** * Get hourly usage for synthetics browser checks returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageSyntheticsBrowserRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageSyntheticsBrowser(params) .then((data: v1.UsageSyntheticsBrowserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for Fargate](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-fargate) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-fargate-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/fargatehttps://api.ap2.datadoghq.com/api/v1/usage/fargatehttps://api.datadoghq.eu/api/v1/usage/fargatehttps://api.ddog-gov.com/api/v1/usage/fargatehttps://api.datadoghq.com/api/v1/usage/fargatehttps://api.us3.datadoghq.com/api/v1/usage/fargatehttps://api.us5.datadoghq.com/api/v1/usage/fargate ### Overview Get hourly usage for [Fargate](https://docs.datadoghq.com/integrations/ecs_fargate/). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. 
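Note that `end_hr` is exclusive: usage is returned for hours up to, but not including, that hour. A minimal sketch using the Python client's `get_usage_fargate` (shown in the code examples below) fetches exactly one UTC day by ending at midnight of the following day; the `usage` field name follows the response model documented in the next section.

```
# Sketch: one full UTC day of Fargate usage. end_hr is exclusive, so ending
# at the next midnight returns the 24 hours of the chosen day.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

day_start = datetime(2021, 11, 11, tzinfo=timezone.utc)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_fargate(
        start_hr=day_start,
        end_hr=day_start + timedelta(days=1),  # exclusive upper bound
    )
    hours = getattr(response, "usage", []) or []
    print(f"{len(hours)} hourly entries returned")  # typically 24 for a full day
```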
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageFargate-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageFargate-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageFargate-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageFargate-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of Fargate tasks run and hourly usage. Field Type Description usage [object] Array with the number of hourly Fargate tasks recorded for a given organization. apm_fargate_count int64 The high-water mark of APM ECS Fargate tasks during the given hour. appsec_fargate_count int64 The Application Security Monitoring ECS Fargate tasks during the given hour. avg_profiled_fargate_tasks int64 The average profiled task count for Fargate Profiling. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. tasks_count int64 The number of Fargate tasks run. ``` { "usage": [ { "apm_fargate_count": "integer", "appsec_fargate_count": "integer", "avg_profiled_fargate_tasks": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "tasks_count": "integer" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for Fargate Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/fargate?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for Fargate ``` """ Get hourly usage for Fargate returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_fargate( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for Fargate ``` # Get hourly usage for Fargate returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_fargate((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for Fargate ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_fargate_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for Fargate ``` // Get hourly usage for Fargate returns "OK" response package main import ( 
"context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageFargate(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageFargateOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageFargate`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageFargate`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for Fargate ``` // Get hourly usage for Fargate returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageFargateOptionalParameters; import com.datadog.api.client.v1.model.UsageFargateResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageFargateResponse result = apiInstance.getUsageFargate( OffsetDateTime.now().plusDays(-5), new GetUsageFargateOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageFargate"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for Fargate ``` // Get hourly usage for Fargate returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageFargateOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_fargate( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageFargateOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); 
} else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for Fargate ``` /** * Get hourly usage for Fargate returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageFargateRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageFargate(params) .then((data: v1.UsageFargateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for Lambda](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/aws_lambdahttps://api.ap2.datadoghq.com/api/v1/usage/aws_lambdahttps://api.datadoghq.eu/api/v1/usage/aws_lambdahttps://api.ddog-gov.com/api/v1/usage/aws_lambdahttps://api.datadoghq.com/api/v1/usage/aws_lambdahttps://api.us3.datadoghq.com/api/v1/usage/aws_lambdahttps://api.us5.datadoghq.com/api/v1/usage/aws_lambda ### Overview Get hourly usage for Lambda. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. 
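The response (documented below) contains one entry per hour, so window totals have to be aggregated client-side. A hedged sketch, assuming the Python client and the `usage`/`invocations_sum` field names from the response model:

```
# Sketch: sum Lambda invocations over the requested window from the hourly
# entries. getattr guards against hours where a field is not set.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_lambda(
        start_hr=datetime.now(timezone.utc) - timedelta(days=5),
        end_hr=datetime.now(timezone.utc) - timedelta(days=3),
    )
    total = sum(
        getattr(hour, "invocations_sum", 0) or 0
        for hour in (getattr(response, "usage", []) or [])
    )
    print(f"Total Lambda invocations in window: {total}")
```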
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambda-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambda-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambda-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambda-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of Lambda functions and sum of the invocations of all Lambda functions for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Lambda. func_count int64 Contains the number of different functions for each region and AWS account. hour date-time The hour for the usage. invocations_sum int64 Contains the sum of invocations of all functions. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "func_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "invocations_sum": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for Lambda ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with your Datadog site's API host if needed) curl -X GET "https://api.datadoghq.com/api/v1/usage/aws_lambda?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for Lambda ``` """ Get hourly usage for Lambda returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_lambda( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands (set `DD_SITE` to your Datadog site: `datadoghq.com`, `us3.datadoghq.com`, `us5.datadoghq.com`, `datadoghq.eu`, `ap1.datadoghq.com`, `ap2.datadoghq.com`, or `ddog-gov.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" python3 "example.py" ``` ##### Get hourly usage for Lambda ``` # Get hourly usage for Lambda returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_lambda((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb" ``` ##### Get hourly usage for Lambda ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date = '2019-10-07T00' end_date = '2019-10-07T02' dog.get_aws_lambda_usage(start_date, end_date) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="<DD_API_KEY>" DD_APP_KEY="<DD_APP_KEY>" ruby "example.rb" ``` ##### Get hourly usage for Lambda ``` // Get hourly usage for Lambda returns "OK" response package main import (
"context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageLambda(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageLambdaOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageLambda`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageLambda`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for Lambda ``` // Get hourly usage for Lambda returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageLambdaOptionalParameters; import com.datadog.api.client.v1.model.UsageLambdaResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageLambdaResponse result = apiInstance.getUsageLambda( OffsetDateTime.now().plusDays(-5), new GetUsageLambdaOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageLambda"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for Lambda ``` // Get hourly usage for Lambda returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageLambdaOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_lambda( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageLambdaOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for Lambda ``` /** * Get hourly usage for Lambda returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageLambdaRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageLambda(params) .then((data: v1.UsageLambdaResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for RUM sessions](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-sessions) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-sessions-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/rum_sessionshttps://api.ap2.datadoghq.com/api/v1/usage/rum_sessionshttps://api.datadoghq.eu/api/v1/usage/rum_sessionshttps://api.ddog-gov.com/api/v1/usage/rum_sessionshttps://api.datadoghq.com/api/v1/usage/rum_sessionshttps://api.us3.datadoghq.com/api/v1/usage/rum_sessionshttps://api.us5.datadoghq.com/api/v1/usage/rum_sessions ### Overview Get hourly usage for [RUM](https://docs.datadoghq.com/real_user_monitoring/) Sessions. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. type string RUM type: `[browser, mobile]`. Defaults to `browser`. 
### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumSessions-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumSessions-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumSessions-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumSessions-429-v1)

**OK**

Response containing the number of RUM sessions for each hour for a given organization.

| Field | Type | Description |
| --- | --- | --- |
| usage | [object] | Get hourly usage for RUM sessions. |
| hour | date-time | The hour for the usage. |
| org_name | string | The organization name. |
| public_id | string | The organization public ID. |
| replay_session_count | int64 | Contains the number of RUM Session Replay counts (data available beginning November 1, 2021). |
| session_count | int64 | Contains the number of browser RUM lite Sessions. |
| session_count_android | int64 | Contains the number of mobile RUM sessions on Android (data available beginning December 1, 2020). |
| session_count_flutter | int64 | Contains the number of mobile RUM sessions on Flutter (data available beginning March 1, 2023). |
| session_count_ios | int64 | Contains the number of mobile RUM sessions on iOS (data available beginning December 1, 2020). |
| session_count_reactnative | int64 | Contains the number of mobile RUM sessions on React Native (data available beginning May 1, 2022). |

```
{
  "usage": [
    {
      "hour": "2019-09-19T10:00:00.000Z",
      "org_name": "string",
      "public_id": "string",
      "replay_session_count": "integer",
      "session_count": "integer",
      "session_count_android": "integer",
      "session_count_flutter": "integer",
      "session_count_ios": "integer",
      "session_count_reactnative": "integer"
    }
  ]
}
```

**Bad Request**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Forbidden - User is not authorized**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Too many requests**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for RUM sessions Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/rum_sessions?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for RUM sessions ``` """ Get hourly usage for RUM sessions returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_rum_sessions( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for RUM sessions ``` # Get hourly usage for RUM sessions returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_rum_sessions("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for RUM sessions ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_rum_sessions_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for RUM sessions ``` // Get hourly usage for RUM sessions returns "OK" response package main import ( "context" "encoding/json" "fmt" 
"os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageRumSessions(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageRumSessionsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageRumSessions`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageRumSessions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for RUM sessions ``` // Get hourly usage for RUM sessions returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageRumSessionsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageRumSessionsResponse result = apiInstance.getUsageRumSessions(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageRumSessions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for RUM sessions ``` // Get hourly usage for RUM sessions returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageRumSessionsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_rum_sessions( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageRumSessionsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: 
``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for RUM sessions ``` /** * Get hourly usage for RUM sessions returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageRumSessionsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageRumSessions(params) .then((data: v1.UsageRumSessionsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for network hosts](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-hosts) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-hosts-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/network_hostshttps://api.ap2.datadoghq.com/api/v1/usage/network_hostshttps://api.datadoghq.eu/api/v1/usage/network_hostshttps://api.ddog-gov.com/api/v1/usage/network_hostshttps://api.datadoghq.com/api/v1/usage/network_hostshttps://api.us3.datadoghq.com/api/v1/usage/network_hostshttps://api.us5.datadoghq.com/api/v1/usage/network_hosts ### Overview Get hourly usage for network hosts. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkHosts-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkHosts-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkHosts-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkHosts-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of active NPM hosts for each hour for a given organization. Field Type Description usage [object] Get hourly usage for NPM hosts. host_count int64 Contains the number of active NPM hosts. hour date-time The hour for the usage. org_name string The organization name. 
public_id string The organization public ID. ``` { "usage": [ { "host_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### Get hourly usage for network hosts Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/network_hosts?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for network hosts ``` """ Get hourly usage for network hosts returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_network_hosts( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for network hosts ``` # Get hourly usage for network hosts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_network_hosts("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install 
the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for network hosts ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_network_hosts_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for network hosts ``` // Get hourly usage for network hosts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageNetworkHosts(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageNetworkHostsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageNetworkHosts`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageNetworkHosts`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for network hosts ``` // Get hourly usage for network hosts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageNetworkHostsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageNetworkHostsResponse result = apiInstance.getUsageNetworkHosts(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageNetworkHosts"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for network hosts ``` // Get hourly usage for network hosts returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageNetworkHostsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_network_hosts( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageNetworkHostsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for network hosts ``` /** * Get hourly usage for network hosts returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageNetworkHostsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageNetworkHosts(params) .then((data: v1.UsageNetworkHostsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [get hourly usage for network flows](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-flows) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-network-flows-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/network_flowshttps://api.ap2.datadoghq.com/api/v1/usage/network_flowshttps://api.datadoghq.eu/api/v1/usage/network_flowshttps://api.ddog-gov.com/api/v1/usage/network_flowshttps://api.datadoghq.com/api/v1/usage/network_flowshttps://api.us3.datadoghq.com/api/v1/usage/network_flowshttps://api.us5.datadoghq.com/api/v1/usage/network_flows ### Overview Get hourly usage for network flows. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. 
OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint.

### Arguments

#### Query Strings

| Name | Type | Description |
| --- | --- | --- |
| start_hr [_required_] | string | Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. |
| end_hr | string | Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. |

### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkFlows-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkFlows-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkFlows-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageNetworkFlows-429-v1)

**OK**

Response containing the number of netflow events indexed for each hour for a given organization.

| Field | Type | Description |
| --- | --- | --- |
| usage | [object] | Get hourly usage for Network Flows. |
| hour | date-time | The hour for the usage. |
| indexed_events_count | int64 | Contains the number of netflow events indexed. |
| org_name | string | The organization name. |
| public_id | string | The organization public ID. |

```
{
  "usage": [
    {
      "hour": "2019-09-19T10:00:00.000Z",
      "indexed_events_count": "integer",
      "org_name": "string",
      "public_id": "string"
    }
  ]
}
```

**Bad Request**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Forbidden - User is not authorized**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Too many requests**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) * [Ruby [legacy]](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby-legacy) ##### get hourly usage for network flows Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/network_flows?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### get hourly usage for network flows ``` """ get hourly usage for network flows returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_network_flows( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### get hourly usage for network flows ``` # get hourly usage for network flows returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_network_flows("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### get hourly usage for network flows ``` require 'rubygems' require 'dogapi' api_key = '' app_key = '' dog = Dogapi::Client.new(api_key, app_key) start_date= '2019-10-07T00' end_date='2019-10-07T02' dog.get_network_flows_usage(start_date, end_date) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby-legacy) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### get hourly usage for network flows ``` // get hourly usage for network flows returns "OK" response package main import ( "context" 
"encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageNetworkFlows(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageNetworkFlowsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageNetworkFlows`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageNetworkFlows`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### get hourly usage for network flows ``` // get hourly usage for network flows returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageNetworkFlowsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageNetworkFlowsResponse result = apiInstance.getUsageNetworkFlows(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageNetworkFlows"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### get hourly usage for network flows ``` // get hourly usage for network flows returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageNetworkFlowsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_network_flows( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageNetworkFlowsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to 
`src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### get hourly usage for network flows ``` /** * get hourly usage for network flows returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageNetworkFlowsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageNetworkFlows(params) .then((data: v1.UsageNetworkFlowsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for analyzed logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-analyzed-logs) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-analyzed-logs-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/analyzed_logshttps://api.ap2.datadoghq.com/api/v1/usage/analyzed_logshttps://api.datadoghq.eu/api/v1/usage/analyzed_logshttps://api.ddog-gov.com/api/v1/usage/analyzed_logshttps://api.datadoghq.com/api/v1/usage/analyzed_logshttps://api.us3.datadoghq.com/api/v1/usage/analyzed_logshttps://api.us5.datadoghq.com/api/v1/usage/analyzed_logs ### Overview Get hourly usage for analyzed logs (Security Monitoring). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAnalyzedLogs-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAnalyzedLogs-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAnalyzedLogs-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAnalyzedLogs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) A response containing the number of analyzed logs for each hour for a given organization. Field Type Description usage [object] Get hourly usage for analyzed logs. analyzed_logs int64 Contains the number of analyzed logs. 
hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "analyzed_logs": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for analyzed logs Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/analyzed_logs?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for analyzed logs ``` """ Get hourly usage for analyzed logs returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_analyzed_logs( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for analyzed logs ``` # Get hourly usage for analyzed logs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p 
api_instance.get_usage_analyzed_logs((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for analyzed logs ``` // Get hourly usage for analyzed logs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageAnalyzedLogs(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageAnalyzedLogsOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageAnalyzedLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageAnalyzedLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for analyzed logs ``` // Get hourly usage for analyzed logs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageAnalyzedLogsOptionalParameters; import com.datadog.api.client.v1.model.UsageAnalyzedLogsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageAnalyzedLogsResponse result = apiInstance.getUsageAnalyzedLogs( OffsetDateTime.now().plusDays(-5), new GetUsageAnalyzedLogsOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageAnalyzedLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for analyzed logs ``` // Get hourly usage for analyzed logs returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use 
datadog_api_client::datadogV1::api_usage_metering::GetUsageAnalyzedLogsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_analyzed_logs( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageAnalyzedLogsOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for analyzed logs ``` /** * Get hourly usage for analyzed logs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageAnalyzedLogsRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageAnalyzedLogs(params) .then((data: v1.UsageAnalyzedLogsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for SNMP devices](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-snmp-devices) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-snmp-devices-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/snmphttps://api.ap2.datadoghq.com/api/v1/usage/snmphttps://api.datadoghq.eu/api/v1/usage/snmphttps://api.ddog-gov.com/api/v1/usage/snmphttps://api.datadoghq.com/api/v1/usage/snmphttps://api.us3.datadoghq.com/api/v1/usage/snmphttps://api.us5.datadoghq.com/api/v1/usage/snmp ### Overview Get hourly usage for SNMP devices. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. 
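Because this v1 endpoint is deprecated, new code should generally target the v2 "Get hourly usage by product family" endpoint referenced in the note above. Below is a minimal sketch of that v2 request scoped to SNMP usage; it assumes the third-party `requests` package and that `snmp` is the product family key to filter on (an assumption, since this page does not list the v2 family keys).

```python
# Minimal sketch of the equivalent v2 "hourly usage by product family" call
# for SNMP devices. Assumes the third-party `requests` package and that
# "snmp" is the product family key (not confirmed on this page).
import os

import requests

response = requests.get(
    "https://api.datadoghq.com/api/v2/usage/hourly_usage",
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
    params={
        # Hour-precision ISO-8601 timestamps, as in the v1 examples
        "filter[timestamp][start]": "2019-10-07T00",
        "filter[timestamp][end]": "2019-10-07T02",
        "filter[product_families]": "snmp",
    },
)
response.raise_for_status()
print(response.json())
```

Note that the v2 response groups measurements by product family and organization rather than returning the flat `usage` array documented for v1 below, so downstream parsing differs.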
### Arguments

#### Query Strings

| Name | Type | Description |
| --- | --- | --- |
| start_hr [_required_] | string | Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. |
| end_hr | string | Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. |

### Response

* [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSNMP-200-v1)
* [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSNMP-400-v1)
* [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSNMP-403-v1)
* [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSNMP-429-v1)

**OK**

Response containing the number of SNMP devices for each hour for a given organization.

| Field | Type | Description |
| --- | --- | --- |
| usage | [object] | Get hourly usage for SNMP devices. |
| hour | date-time | The hour for the usage. |
| org_name | string | The organization name. |
| public_id | string | The organization public ID. |
| snmp_devices | int64 | Contains the number of SNMP devices. |

```
{
  "usage": [
    {
      "hour": "2019-09-19T10:00:00.000Z",
      "org_name": "string",
      "public_id": "string",
      "snmp_devices": "integer"
    }
  ]
}
```

**Bad Request**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Forbidden - User is not authorized**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |

```
{ "errors": [ "Bad Request" ] }
```

**Too many requests**

Error response object.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | Array of errors returned by the API. |
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for SNMP devices Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/snmp?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for SNMP devices ``` """ Get hourly usage for SNMP devices returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_snmp( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for SNMP devices ``` # Get hourly usage for SNMP devices returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_snmp((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for SNMP devices ``` // Get hourly usage for SNMP devices returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSNMP(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageSNMPOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSNMP`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) 
} responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageSNMP`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for SNMP devices ``` // Get hourly usage for SNMP devices returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageSNMPOptionalParameters; import com.datadog.api.client.v1.model.UsageSNMPResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSNMPResponse result = apiInstance.getUsageSNMP( OffsetDateTime.now().plusDays(-5), new GetUsageSNMPOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSNMP"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for SNMP devices ``` // Get hourly usage for SNMP devices returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageSNMPOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_snmp( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSNMPOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for SNMP devices ``` /** * Get hourly usage for SNMP devices returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: 
v1.UsageMeteringApiGetUsageSNMPRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageSNMP(params) .then((data: v1.UsageSNMPResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for ingested spans](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ingested-spans) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ingested-spans-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/ingested-spanshttps://api.ap2.datadoghq.com/api/v1/usage/ingested-spanshttps://api.datadoghq.eu/api/v1/usage/ingested-spanshttps://api.ddog-gov.com/api/v1/usage/ingested-spanshttps://api.datadoghq.com/api/v1/usage/ingested-spanshttps://api.us3.datadoghq.com/api/v1/usage/ingested-spanshttps://api.us5.datadoghq.com/api/v1/usage/ingested-spans ### Overview Get hourly usage for ingested spans. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetIngestedSpans-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetIngestedSpans-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetIngestedSpans-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetIngestedSpans-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the ingested spans usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for ingested spans. hour date-time The hour for the usage. ingested_events_bytes int64 Contains the total number of bytes ingested for APM spans during a given hour. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "ingested_events_bytes": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for ingested spans ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (use the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/ingested-spans?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for ingested spans ``` """ Get hourly usage for ingested spans returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_ingested_spans( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for ingested spans ``` # Get hourly usage for ingested spans returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_ingested_spans((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for
ingested spans ``` // Get hourly usage for ingested spans returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetIngestedSpans(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetIngestedSpansOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetIngestedSpans`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetIngestedSpans`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for ingested spans ``` // Get hourly usage for ingested spans returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetIngestedSpansOptionalParameters; import com.datadog.api.client.v1.model.UsageIngestedSpansResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageIngestedSpansResponse result = apiInstance.getIngestedSpans( OffsetDateTime.now().plusDays(-5), new GetIngestedSpansOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getIngestedSpans"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for ingested spans ``` // Get hourly usage for ingested spans returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetIngestedSpansOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_ingested_spans( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetIngestedSpansOptionalParams::default().end_hr( 
DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for ingested spans ``` /** * Get hourly usage for ingested spans returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetIngestedSpansRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getIngestedSpans(params) .then((data: v1.UsageIngestedSpansResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for incident management](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-incident-management) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-incident-management-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/incident-managementhttps://api.ap2.datadoghq.com/api/v1/usage/incident-managementhttps://api.datadoghq.eu/api/v1/usage/incident-managementhttps://api.ddog-gov.com/api/v1/usage/incident-managementhttps://api.datadoghq.com/api/v1/usage/incident-managementhttps://api.us3.datadoghq.com/api/v1/usage/incident-managementhttps://api.us5.datadoghq.com/api/v1/usage/incident-management ### Overview Get hourly usage for incident management. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
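Both parameters are plain hour strings rather than full timestamps. Below is a minimal sketch of what a raw request looks like, assuming `DD_API_KEY` and `DD_APP_KEY` are exported and using the `api.datadoghq.com` host as an example; the official client libraries in the Code Example section handle this formatting for you.

```
# Minimal sketch: build [YYYY-MM-DDThh] hour strings and call the endpoint directly.
# Assumes DD_API_KEY and DD_APP_KEY are set in the environment and the US1 site host;
# the official client libraries shown below handle this for you.
import os
import urllib.parse
import urllib.request
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)
params = {
    "start_hr": (now - timedelta(days=5)).strftime("%Y-%m-%dT%H"),  # usage starting at this hour
    "end_hr": (now - timedelta(days=3)).strftime("%Y-%m-%dT%H"),    # usage ending before this hour
}
url = "https://api.datadoghq.com/api/v1/usage/incident-management?" + urllib.parse.urlencode(params)

request = urllib.request.Request(
    url,
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
with urllib.request.urlopen(request) as response:
    print(response.read().decode())
```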
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetIncidentManagement-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetIncidentManagement-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetIncidentManagement-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetIncidentManagement-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the incident management usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for incident management. hour date-time The hour for the usage. monthly_active_users int64 Contains the total number of monthly active users from the start of the given hour's month until the given hour. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "monthly_active_users": "integer", "org_name": "string", "public_id": "string" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for incident management ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (use the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/incident-management?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for incident management ``` """ Get hourly usage for incident management returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_incident_management( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for incident management ``` # Get hourly usage for incident management returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_incident_management((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for incident management ``` // Get hourly usage for incident management returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetIncidentManagement(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetIncidentManagementOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when
calling `UsageMeteringApi.GetIncidentManagement`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetIncidentManagement`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for incident management ``` // Get hourly usage for incident management returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetIncidentManagementOptionalParameters; import com.datadog.api.client.v1.model.UsageIncidentManagementResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageIncidentManagementResponse result = apiInstance.getIncidentManagement( OffsetDateTime.now().plusDays(-5), new GetIncidentManagementOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getIncidentManagement"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for incident management ``` // Get hourly usage for incident management returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetIncidentManagementOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_incident_management( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetIncidentManagementOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for incident management 
``` /** * Get hourly usage for incident management returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetIncidentManagementRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getIncidentManagement(params) .then((data: v1.UsageIncidentManagementResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for IoT](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-iot) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-iot-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/iothttps://api.ap2.datadoghq.com/api/v1/usage/iothttps://api.datadoghq.eu/api/v1/usage/iothttps://api.ddog-gov.com/api/v1/usage/iothttps://api.datadoghq.com/api/v1/usage/iothttps://api.us3.datadoghq.com/api/v1/usage/iothttps://api.us5.datadoghq.com/api/v1/usage/iot ### Overview Get hourly usage for IoT. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageInternetOfThings-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageInternetOfThings-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageInternetOfThings-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageInternetOfThings-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the IoT usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for IoT. hour date-time The hour for the usage. iot_device_count int64 The total number of IoT devices during a given hour. org_name string The organization name. public_id string The organization public ID. 
``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "iot_device_count": "integer", "org_name": "string", "public_id": "string" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for IoT ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (use the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/iot?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for IoT ``` """ Get hourly usage for IoT returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_internet_of_things( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for IoT ``` # Get hourly usage for IoT returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_internet_of_things((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and
then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for IoT ``` // Get hourly usage for IoT returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageInternetOfThings(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageInternetOfThingsOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageInternetOfThings`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageInternetOfThings`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for IoT ``` // Get hourly usage for IoT returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageInternetOfThingsOptionalParameters; import com.datadog.api.client.v1.model.UsageIoTResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageIoTResponse result = apiInstance.getUsageInternetOfThings( OffsetDateTime.now().plusDays(-5), new GetUsageInternetOfThingsOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageInternetOfThings"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for IoT ``` // Get hourly usage for IoT returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageInternetOfThingsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = 
UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_internet_of_things( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageInternetOfThingsOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for IoT ``` /** * Get hourly usage for IoT returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageInternetOfThingsRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageInternetOfThings(params) .then((data: v1.UsageIoTResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for CSM Pro](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-csm-pro) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-csm-pro-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/cspmhttps://api.ap2.datadoghq.com/api/v1/usage/cspmhttps://api.datadoghq.eu/api/v1/usage/cspmhttps://api.ddog-gov.com/api/v1/usage/cspmhttps://api.datadoghq.com/api/v1/usage/cspmhttps://api.us3.datadoghq.com/api/v1/usage/cspmhttps://api.us5.datadoghq.com/api/v1/usage/cspm ### Overview Get hourly usage for cloud security management (CSM) pro. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
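For reference, here is a sketch of bounding the query window with both parameters from the Python client and totaling the hosts it reports. It assumes the generated client exposes the optional `end_hr` query parameter as a keyword argument, as it does for the other hourly usage endpoints above, and that response objects mirror the field names in the model below; treat it as a sketch rather than the canonical example.

```
# Sketch (assumptions noted above): request a bounded window of CSM Pro usage and
# total the reported host counts across the returned hours.
from datetime import datetime
from dateutil.relativedelta import relativedelta

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_cloud_security_posture_management(
        start_hr=(datetime.now() + relativedelta(days=-5)),
        end_hr=(datetime.now() + relativedelta(days=-3)),  # assumed optional keyword argument
    )
    hourly = response.usage or []
    print(f"hours returned: {len(hourly)}")
    print(f"total host-hours: {sum(entry.host_count or 0 for entry in hourly)}")
```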
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCloudSecurityPostureManagement-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCloudSecurityPostureManagement-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCloudSecurityPostureManagement-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCloudSecurityPostureManagement-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) The response containing the Cloud Security Management Pro usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Cloud Security Management Pro. aas_host_count double The number of Cloud Security Management Pro Azure app services hosts during a given hour. aws_host_count double The number of Cloud Security Management Pro AWS hosts during a given hour. azure_host_count double The number of Cloud Security Management Pro Azure hosts during a given hour. compliance_host_count double The number of Cloud Security Management Pro hosts during a given hour. container_count double The total number of Cloud Security Management Pro containers during a given hour. gcp_host_count double The number of Cloud Security Management Pro GCP hosts during a given hour. host_count double The total number of Cloud Security Management Pro hosts during a given hour. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "aas_host_count": "number", "aws_host_count": "number", "azure_host_count": "number", "compliance_host_count": "number", "container_count": "number", "gcp_host_count": "number", "host_count": "number", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for CSM Pro ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (use the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/cspm?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for CSM Pro ``` """ Get hourly usage for CSM Pro returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_cloud_security_posture_management( start_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for CSM Pro ``` # Get hourly usage for CSM Pro returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_cloud_security_posture_management((Time.now + -3 * 86400)) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for CSM Pro ``` // Get hourly usage for CSM Pro returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageCloudSecurityPostureManagement(ctx, time.Now().AddDate(0, 0, -3), *datadogV1.NewGetUsageCloudSecurityPostureManagementOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageCloudSecurityPostureManagement`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ :=
json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageCloudSecurityPostureManagement`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for CSM Pro ``` // Get hourly usage for CSM Pro returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageCloudSecurityPostureManagementResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageCloudSecurityPostureManagementResponse result = apiInstance.getUsageCloudSecurityPostureManagement(OffsetDateTime.now().plusDays(-3)); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling UsageMeteringApi#getUsageCloudSecurityPostureManagement"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for CSM Pro ``` // Get hourly usage for CSM Pro returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageCloudSecurityPostureManagementOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_cloud_security_posture_management( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageCloudSecurityPostureManagementOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for CSM Pro ``` /** * Get hourly usage for CSM Pro returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageCloudSecurityPostureManagementRequest = { startHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; 
apiInstance .getUsageCloudSecurityPostureManagement(params) .then((data: v1.UsageCloudSecurityPostureManagementResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for cloud workload security](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-cloud-workload-security) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-cloud-workload-security-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/cwshttps://api.ap2.datadoghq.com/api/v1/usage/cwshttps://api.datadoghq.eu/api/v1/usage/cwshttps://api.ddog-gov.com/api/v1/usage/cwshttps://api.datadoghq.com/api/v1/usage/cwshttps://api.us3.datadoghq.com/api/v1/usage/cwshttps://api.us5.datadoghq.com/api/v1/usage/cws ### Overview Get hourly usage for cloud workload security. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCWS-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCWS-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCWS-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCWS-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the Cloud Workload Security usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Cloud Workload Security. cws_container_count int64 The total number of Cloud Workload Security container hours from the start of the given hour’s month until the given hour. cws_host_count int64 The total number of Cloud Workload Security host hours from the start of the given hour’s month until the given hour. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. 
``` { "usage": [ { "cws_container_count": "integer", "cws_host_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for cloud workload security ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (use the API endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/cws?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for cloud workload security ``` """ Get hourly usage for cloud workload security returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_cws( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for cloud workload security ``` # Get hourly usage for cloud workload security returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_cws((Time.now + -5 * 86400), opts) ``` #### Instructions First
[install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for cloud workload security ``` // Get hourly usage for cloud workload security returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageCWS(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageCWSOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageCWS`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageCWS`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for cloud workload security ``` // Get hourly usage for cloud workload security returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageCWSOptionalParameters; import com.datadog.api.client.v1.model.UsageCWSResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageCWSResponse result = apiInstance.getUsageCWS( OffsetDateTime.now().plusDays(-5), new GetUsageCWSOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageCWS"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for cloud workload security ``` // Get hourly usage for cloud workload security returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageCWSOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] 
async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_cws( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageCWSOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for cloud workload security ``` /** * Get hourly usage for cloud workload security returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageCWSRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageCWS(params) .then((data: v1.UsageCWSResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for database monitoring](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-database-monitoring) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-database-monitoring-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/dbmhttps://api.ap2.datadoghq.com/api/v1/usage/dbmhttps://api.datadoghq.eu/api/v1/usage/dbmhttps://api.ddog-gov.com/api/v1/usage/dbmhttps://api.datadoghq.com/api/v1/usage/dbmhttps://api.us3.datadoghq.com/api/v1/usage/dbmhttps://api.us5.datadoghq.com/api/v1/usage/dbm ### Overview Get hourly usage for database monitoring **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
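Both `start_hr` and `end_hr` are plain strings truncated to the hour. As a rough sketch of what a direct HTTP call to this endpoint looks like outside the official client examples, the snippet below builds `[YYYY-MM-DDThh]` values with Python's standard library; the `requests` dependency, the default `api.datadoghq.com` endpoint, and the `DD_API_KEY`/`DD_APP_KEY` environment variable names are assumptions.

```
# Sketch only: build hour-precision [YYYY-MM-DDThh] values and query /api/v1/usage/dbm directly.
import os
from datetime import datetime, timedelta, timezone

import requests  # assumed third-party dependency

now = datetime.now(timezone.utc).replace(minute=0, second=0, microsecond=0)
params = {
    # usage beginning at this hour
    "start_hr": (now - timedelta(days=5)).strftime("%Y-%m-%dT%H"),
    # usage ending *before* this hour
    "end_hr": (now - timedelta(days=3)).strftime("%Y-%m-%dT%H"),
}

resp = requests.get(
    "https://api.datadoghq.com/api/v1/usage/dbm",  # substitute your site's endpoint if different
    params=params,
    headers={
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    },
)
resp.raise_for_status()
for hour in resp.json().get("usage", []):
    print(hour.get("hour"), hour.get("dbm_host_count"), hour.get("dbm_queries_count"))
```

The same hour-truncated format applies to the `start_hr` and `end_hr` parameters of the other deprecated v1 hourly usage endpoints in this section.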
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageDBM-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageDBM-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageDBM-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageDBM-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the Database Monitoring usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Database Monitoring dbm_host_count int64 The total number of Database Monitoring host hours from the start of the given hour’s month until the given hour. dbm_queries_count int64 The total number of normalized Database Monitoring queries from the start of the given hour’s month until the given hour. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "dbm_host_count": "integer", "dbm_queries_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for database monitoring ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/dbm?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for database monitoring ``` """ Get hourly usage for database monitoring returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_dbm( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for database monitoring ``` # Get hourly usage for database monitoring returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_dbm("2021-11-11T11:11:11.111+00:00") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for database monitoring ``` // Get hourly usage for database monitoring returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageDBM(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageDBMOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageDBM`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ")
fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageDBM`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for database monitoring ``` // Get hourly usage for database monitoring returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageDBMResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageDBMResponse result = apiInstance.getUsageDBM(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageDBM"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for database monitoring ``` // Get hourly usage for database monitoring returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageDBMOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_dbm( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageDBMOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for database monitoring ``` /** * Get hourly usage for database monitoring returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageDBMRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageDBM(params) .then((data: v1.UsageDBMResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for sensitive data scanner](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-sensitive-data-scanner) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-sensitive-data-scanner-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/sdshttps://api.ap2.datadoghq.com/api/v1/usage/sdshttps://api.datadoghq.eu/api/v1/usage/sdshttps://api.ddog-gov.com/api/v1/usage/sdshttps://api.datadoghq.com/api/v1/usage/sdshttps://api.us3.datadoghq.com/api/v1/usage/sdshttps://api.us5.datadoghq.com/api/v1/usage/sds ### Overview Get hourly usage for sensitive data scanner. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSDS-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSDS-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSDS-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageSDS-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the Sensitive Data Scanner usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for Sensitive Data Scanner. apm_scanned_bytes int64 The total number of bytes scanned of APM usage across all usage types by the Sensitive Data Scanner from the start of the given hour’s month until the given hour. events_scanned_bytes int64 The total number of bytes scanned of Events usage across all usage types by the Sensitive Data Scanner from the start of the given hour’s month until the given hour. hour date-time The hour for the usage. logs_scanned_bytes int64 The total number of bytes scanned of logs usage by the Sensitive Data Scanner from the start of the given hour’s month until the given hour. org_name string The organization name. public_id string The organization public ID. rum_scanned_bytes int64 The total number of bytes scanned of RUM usage across all usage types by the Sensitive Data Scanner from the start of the given hour’s month until the given hour. 
total_scanned_bytes int64 The total number of bytes scanned across all usage types by the Sensitive Data Scanner from the start of the given hour’s month until the given hour. ``` { "usage": [ { "apm_scanned_bytes": "integer", "events_scanned_bytes": "integer", "hour": "2019-09-19T10:00:00.000Z", "logs_scanned_bytes": "integer", "org_name": "string", "public_id": "string", "rum_scanned_bytes": "integer", "total_scanned_bytes": "integer" } ] } ``` Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for sensitive data scanner ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/sds?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for sensitive data scanner ``` """ Get hourly usage for sensitive data scanner returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_sds( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for sensitive data scanner ``` # Get hourly usage for sensitive data scanner returns "OK" response
require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_sds("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for sensitive data scanner ``` // Get hourly usage for sensitive data scanner returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageSDS(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageSDSOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageSDS`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageSDS`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for sensitive data scanner ``` // Get hourly usage for sensitive data scanner returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSDSResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSDSResponse result = apiInstance.getUsageSDS(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageSDS"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for sensitive data scanner ``` // Get hourly usage for sensitive data scanner returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageSDSOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; 
#[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_sds( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageSDSOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for sensitive data scanner ``` /** * Get hourly usage for sensitive data scanner returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageSDSRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageSDS(params) .then((data: v1.UsageSDSResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for RUM units](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-units) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-rum-units-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/rumhttps://api.ap2.datadoghq.com/api/v1/usage/rumhttps://api.datadoghq.eu/api/v1/usage/rumhttps://api.ddog-gov.com/api/v1/usage/rumhttps://api.datadoghq.com/api/v1/usage/rumhttps://api.us3.datadoghq.com/api/v1/usage/rumhttps://api.us5.datadoghq.com/api/v1/usage/rum ### Overview Get hourly usage for [RUM](https://docs.datadoghq.com/real_user_monitoring/) Units. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: [YYYY-MM-DDThh] for usage ending **before** this hour. 
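If you need a single figure rather than the per-hour breakdown shown in the response below, the hourly entries can be aggregated client-side. The following is a minimal sketch using the official Python client, assuming the usual `DD_API_KEY`/`DD_APP_KEY` environment variables; the aggregation and the guarded attribute access are illustrative additions, not part of the documented examples.

```
# Sketch only: total the RUM units returned for a recent window.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_rum_units(
        start_hr=datetime.now(timezone.utc) - timedelta(days=5),
        end_hr=datetime.now(timezone.utc) - timedelta(days=3),
    )

# Fields may be unset for hours with no recorded usage, so guard the access.
total = sum(
    getattr(hour, "rum_units", 0) or 0
    for hour in (getattr(response, "usage", []) or [])
)
print(f"Total RUM units over the window: {total}")
```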
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumUnits-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumUnits-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumUnits-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageRumUnits-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of RUM Units for each hour for a given organization. Field Type Description usage [object] Get hourly usage for RUM Units. browser_rum_units int64 The number of browser RUM units. mobile_rum_units int64 The number of mobile RUM units. org_name string The organization name. public_id string The organization public ID. rum_units int64 Total RUM units across mobile and browser RUM. ``` { "usage": [ { "browser_rum_units": "integer", "mobile_rum_units": "integer", "org_name": "string", "public_id": "string", "rum_units": "integer" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for RUM units ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/rum?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for RUM units ``` """ Get hourly usage for RUM units returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_rum_units( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for RUM units ``` # Get hourly usage for RUM units returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_rum_units("2021-11-11T11:11:11.111+00:00") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for RUM units ``` // Get hourly usage for RUM units returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageRumUnits(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageRumUnitsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageRumUnits`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from
`UsageMeteringApi.GetUsageRumUnits`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for RUM units ``` // Get hourly usage for RUM units returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageRumUnitsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageRumUnitsResponse result = apiInstance.getUsageRumUnits(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageRumUnits"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for RUM units ``` // Get hourly usage for RUM units returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageRumUnitsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_rum_units( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageRumUnitsOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for RUM units ``` /** * Get hourly usage for RUM units returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageRumUnitsRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageRumUnits(params) .then((data: v1.UsageRumUnitsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for profiled hosts](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-profiled-hosts) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-profiled-hosts-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/profilinghttps://api.ap2.datadoghq.com/api/v1/usage/profilinghttps://api.datadoghq.eu/api/v1/usage/profilinghttps://api.ddog-gov.com/api/v1/usage/profilinghttps://api.datadoghq.com/api/v1/usage/profilinghttps://api.us3.datadoghq.com/api/v1/usage/profilinghttps://api.us5.datadoghq.com/api/v1/usage/profiling ### Overview Get hourly usage for profiled hosts. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageProfiling-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageProfiling-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageProfiling-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageProfiling-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the number of profiled hosts for each hour for a given organization. Field Type Description usage [object] Get hourly usage for profiled hosts. aas_count int64 Contains the total number of profiled Azure app services reporting during a given hour. avg_container_agent_count int64 Get average number of container agents for that hour. host_count int64 Contains the total number of profiled hosts reporting during a given hour. hour date-time The hour for the usage. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "aas_count": "integer", "avg_container_agent_count": "integer", "host_count": "integer", "hour": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for profiled hosts ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/profiling?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for profiled hosts ``` """ Get hourly usage for profiled hosts returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_profiling( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for profiled hosts ``` # Get hourly usage for profiled hosts returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_profiling((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for
profiled hosts ``` // Get hourly usage for profiled hosts returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageProfiling(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageProfilingOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageProfiling`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageProfiling`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for profiled hosts ``` // Get hourly usage for profiled hosts returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageProfilingOptionalParameters; import com.datadog.api.client.v1.model.UsageProfilingResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageProfilingResponse result = apiInstance.getUsageProfiling( OffsetDateTime.now().plusDays(-5), new GetUsageProfilingOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageProfiling"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for profiled hosts ``` // Get hourly usage for profiled hosts returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageProfilingOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_profiling( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageProfilingOptionalParams::default().end_hr( 
DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for profiled hosts ``` /** * Get hourly usage for profiled hosts returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageProfilingRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageProfiling(params) .then((data: v1.UsageProfilingResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for CI visibility](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ci-visibility) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-ci-visibility-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/ci-apphttps://api.ap2.datadoghq.com/api/v1/usage/ci-apphttps://api.datadoghq.eu/api/v1/usage/ci-apphttps://api.ddog-gov.com/api/v1/usage/ci-apphttps://api.datadoghq.com/api/v1/usage/ci-apphttps://api.us3.datadoghq.com/api/v1/usage/ci-apphttps://api.us5.datadoghq.com/api/v1/usage/ci-app ### Overview Get hourly usage for CI visibility (tests, pipeline, and spans). **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
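Pipeline and test spans are reported as separate fields in the response below, so totalling them over a window is a client-side step. The sketch here uses the official Python client and assumes the usual `DD_API_KEY`/`DD_APP_KEY` environment variables; the aggregation itself is illustrative rather than part of the documented examples.

```
# Sketch only: total indexed pipeline and test spans over a recent window.
from datetime import datetime, timedelta, timezone

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = UsageMeteringApi(api_client)
    response = api_instance.get_usage_ci_app(
        start_hr=datetime.now(timezone.utc) - timedelta(days=5),
        end_hr=datetime.now(timezone.utc) - timedelta(days=3),
    )

pipeline_spans = 0
test_spans = 0
for hour in getattr(response, "usage", []) or []:
    # Fields may be unset for hours with no recorded usage, so guard the access.
    pipeline_spans += getattr(hour, "ci_pipeline_indexed_spans", 0) or 0
    test_spans += getattr(hour, "ci_test_indexed_spans", 0) or 0
print(f"Indexed pipeline spans: {pipeline_spans}, indexed test spans: {test_spans}")
```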
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCIApp-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCIApp-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCIApp-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageCIApp-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) CI visibility usage response Field Type Description usage [object] Response containing CI visibility usage. ci_pipeline_indexed_spans int64 The number of spans for pipelines in the queried hour. ci_test_indexed_spans int64 The number of spans for tests in the queried hour. ci_visibility_itr_committers int64 Shows the total count of all active Git committers for Intelligent Test Runner in the current month. A committer is active if they commit at least 3 times in a given month. ci_visibility_pipeline_committers int64 Shows the total count of all active Git committers for Pipelines in the current month. A committer is active if they commit at least 3 times in a given month. ci_visibility_test_committers int64 The total count of all active Git committers for tests in the current month. A committer is active if they commit at least 3 times in a given month. org_name string The organization name. public_id string The organization public ID. ``` { "usage": [ { "ci_pipeline_indexed_spans": "integer", "ci_test_indexed_spans": "integer", "ci_visibility_itr_committers": "integer", "ci_visibility_pipeline_committers": "integer", "ci_visibility_test_committers": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for CI visibility ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command (replace api.datadoghq.com with the endpoint for your Datadog site) curl -X GET "https://api.datadoghq.com/api/v1/usage/ci-app?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for CI visibility ``` """ Get hourly usage for CI visibility returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_ci_app( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for CI visibility ``` # Get hourly usage for CI visibility returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_ci_app((Time.now + -5 * 86400), opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get hourly usage for CI visibility ``` // Get hourly usage for CI visibility returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageCIApp(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageCIAppOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageCIApp`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP
response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageCIApp`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for CI visibility ``` // Get hourly usage for CI visibility returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageCIAppOptionalParameters; import com.datadog.api.client.v1.model.UsageCIVisibilityResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageCIVisibilityResponse result = apiInstance.getUsageCIApp( OffsetDateTime.now().plusDays(-5), new GetUsageCIAppOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageCIApp"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for CI visibility ``` // Get hourly usage for CI visibility returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageCIAppOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_ci_app( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageCIAppOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for CI visibility ``` /** * Get hourly usage for CI visibility returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = 
new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageCIAppRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageCIApp(params) .then((data: v1.UsageCIVisibilityResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for online archive](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-online-archive) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-online-archive-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/online-archivehttps://api.ap2.datadoghq.com/api/v1/usage/online-archivehttps://api.datadoghq.eu/api/v1/usage/online-archivehttps://api.ddog-gov.com/api/v1/usage/online-archivehttps://api.datadoghq.com/api/v1/usage/online-archivehttps://api.us3.datadoghq.com/api/v1/usage/online-archivehttps://api.us5.datadoghq.com/api/v1/usage/online-archive ### Overview Get hourly usage for online archive. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). Refer to [Migrating from the V1 Hourly Usage APIs to V2](https://docs.datadoghq.com/account_management/guide/hourly-usage-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageOnlineArchive-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageOnlineArchive-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageOnlineArchive-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageOnlineArchive-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Online Archive usage response. Field Type Description usage [object] Response containing Online Archive usage. hour date-time The hour for the usage. online_archive_events_count int64 Total count of online archived events within the hour. org_name string The organization name. public_id string The organization public ID. 
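For orientation, here is a minimal sketch (not part of the official client) that totals the hourly event counts from a response body that has already been parsed into a Python dict. Field names follow the response model above; the raw JSON example of the same shape follows below.

```python
# Minimal illustrative helper (assumption: the response body has already been
# parsed into a plain dict matching the documented online archive usage model).
from typing import Any, Dict


def total_online_archive_events(body: Dict[str, Any]) -> int:
    """Sum `online_archive_events_count` across all hourly `usage` entries."""
    total = 0
    for entry in body.get("usage", []):
        count = entry.get("online_archive_events_count")
        if count is not None:
            total += int(count)
    return total
```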
``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "online_archive_events_count": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for online archive Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/online-archive?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for online archive ``` """ Get hourly usage for online archive returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi from datetime import datetime from dateutil.tz import tzutc configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_online_archive( start_hr=datetime(2021, 11, 11, 11, 11, 11, 111000, tzinfo=tzutc()), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for online archive ``` # Get hourly usage for online archive returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_usage_online_archive("2021-11-11T11:11:11.111+00:00") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to 
`example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for online archive ``` // Get hourly usage for online archive returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageOnlineArchive(ctx, time.Date(2021, 11, 11, 11, 11, 11, 111000, time.UTC), *datadogV1.NewGetUsageOnlineArchiveOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageOnlineArchive`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageOnlineArchive`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for online archive ``` // Get hourly usage for online archive returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageOnlineArchiveResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageOnlineArchiveResponse result = apiInstance.getUsageOnlineArchive(OffsetDateTime.parse("2021-11-11T11:11:11.111+00:00")); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageOnlineArchive"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for online archive ``` // Get hourly usage for online archive returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageOnlineArchiveOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_online_archive( DateTime::parse_from_rfc3339("2021-11-11T11:11:11.111000+00:00") .expect("Failed to 
parse datetime") .with_timezone(&Utc), GetUsageOnlineArchiveOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for online archive ``` /** * Get hourly usage for online archive returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageOnlineArchiveRequest = { startHr: new Date(2021, 11, 11, 11, 11, 11, 111000), }; apiInstance .getUsageOnlineArchive(params) .then((data: v1.UsageOnlineArchiveResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for Lambda traced invocations](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda-traced-invocations) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-lambda-traced-invocations-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/lambda_traced_invocationshttps://api.ap2.datadoghq.com/api/v2/usage/lambda_traced_invocationshttps://api.datadoghq.eu/api/v2/usage/lambda_traced_invocationshttps://api.ddog-gov.com/api/v2/usage/lambda_traced_invocationshttps://api.datadoghq.com/api/v2/usage/lambda_traced_invocationshttps://api.us3.datadoghq.com/api/v2/usage/lambda_traced_invocationshttps://api.us5.datadoghq.com/api/v2/usage/lambda_traced_invocations ### Overview Get hourly usage for Lambda traced invocations. **Note:** This endpoint has been deprecated.. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family) This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambdaTracedInvocations-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambdaTracedInvocations-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambdaTracedInvocations-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageLambdaTracedInvocations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Lambda Traced Invocations usage response. Field Type Description data [object] Response containing Lambda Traced Invocations usage. attributes object Usage attributes data. org_name string The organization name. product_family string The product for which usage is being reported. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. timeseries [object] List of usage data reported for each requested hour. timestamp date-time Datetime in ISO-8601 format, UTC. The hour for the usage. value int64 Contains the number measured for the given usage_type during the hour. usage_type enum Usage type that is being measured. Allowed enum values: `app_sec_host_count,observability_pipelines_bytes_processed,lambda_traced_invocations_count` id string Unique ID of the response. type enum Type of usage data. Allowed enum values: `usage_timeseries` default: `usage_timeseries` ``` { "data": [ { "attributes": { "org_name": "string", "product_family": "string", "public_id": "string", "region": "string", "timeseries": [ { "timestamp": "2019-09-19T10:00:00.000Z", "value": "integer" } ], "usage_type": "observability_pipelines_bytes_processed" }, "id": "string", "type": "usage_timeseries" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for Lambda traced invocations Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/lambda_traced_invocations?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for Lambda traced invocations ``` """ Get hourly usage for Lambda traced invocations returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_lambda_traced_invocations( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for Lambda traced invocations ``` # Get hourly usage for Lambda traced invocations returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_lambda_traced_invocations((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for Lambda traced invocations ``` // Get hourly usage for Lambda traced invocations returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageLambdaTracedInvocations(ctx, time.Now().AddDate(0, 0, -5), 
*datadogV2.NewGetUsageLambdaTracedInvocationsOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageLambdaTracedInvocations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageLambdaTracedInvocations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for Lambda traced invocations ``` // Get hourly usage for Lambda traced invocations returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.api.UsageMeteringApi.GetUsageLambdaTracedInvocationsOptionalParameters; import com.datadog.api.client.v2.model.UsageLambdaTracedInvocationsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageLambdaTracedInvocationsResponse result = apiInstance.getUsageLambdaTracedInvocations( OffsetDateTime.now().plusDays(-5), new GetUsageLambdaTracedInvocationsOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageLambdaTracedInvocations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for Lambda traced invocations ``` // Get hourly usage for Lambda traced invocations returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetUsageLambdaTracedInvocationsOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_lambda_traced_invocations( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageLambdaTracedInvocationsOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site (for example, `datadoghq.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for Lambda traced invocations ``` /** * Get hourly usage for Lambda traced invocations returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetUsageLambdaTracedInvocationsRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageLambdaTracedInvocations(params) .then((data: v2.UsageLambdaTracedInvocationsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site (for example, `datadoghq.com`): ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for application security](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-application-security) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-application-security-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/application_securityhttps://api.ap2.datadoghq.com/api/v2/usage/application_securityhttps://api.datadoghq.eu/api/v2/usage/application_securityhttps://api.ddog-gov.com/api/v2/usage/application_securityhttps://api.datadoghq.com/api/v2/usage/application_securityhttps://api.us3.datadoghq.com/api/v2/usage/application_securityhttps://api.us5.datadoghq.com/api/v2/usage/application_security ### Overview Get hourly usage for application security. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family). This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageApplicationSecurityMonitoring-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageApplicationSecurityMonitoring-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageApplicationSecurityMonitoring-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageApplicationSecurityMonitoring-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Application Security Monitoring usage response. 
Field Type Description data [object] Response containing Application Security Monitoring usage. attributes object Usage attributes data. org_name string The organization name. product_family string The product for which usage is being reported. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. timeseries [object] List of usage data reported for each requested hour. timestamp date-time Datetime in ISO-8601 format, UTC. The hour for the usage. value int64 Contains the number measured for the given usage_type during the hour. usage_type enum Usage type that is being measured. Allowed enum values: `app_sec_host_count,observability_pipelines_bytes_processed,lambda_traced_invocations_count` id string Unique ID of the response. type enum Type of usage data. Allowed enum values: `usage_timeseries` default: `usage_timeseries` ``` { "data": [ { "attributes": { "org_name": "string", "product_family": "string", "public_id": "string", "region": "string", "timeseries": [ { "timestamp": "2019-09-19T10:00:00.000Z", "value": "integer" } ], "usage_type": "observability_pipelines_bytes_processed" }, "id": "string", "type": "usage_timeseries" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for application security Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/application_security?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for application security ``` """ Get hourly usage for application security returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_application_security_monitoring( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for application security ``` # Get hourly usage for application security returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_application_security_monitoring((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for application security ``` // Get hourly usage for application security returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageApplicationSecurityMonitoring(ctx, time.Now().AddDate(0, 0, -5), 
*datadogV2.NewGetUsageApplicationSecurityMonitoringOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageApplicationSecurityMonitoring`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageApplicationSecurityMonitoring`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for application security ``` // Get hourly usage for application security returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.api.UsageMeteringApi.GetUsageApplicationSecurityMonitoringOptionalParameters; import com.datadog.api.client.v2.model.UsageApplicationSecurityMonitoringResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageApplicationSecurityMonitoringResponse result = apiInstance.getUsageApplicationSecurityMonitoring( OffsetDateTime.now().plusDays(-5), new GetUsageApplicationSecurityMonitoringOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling UsageMeteringApi#getUsageApplicationSecurityMonitoring"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for application security ``` // Get hourly usage for application security returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetUsageApplicationSecurityMonitoringOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_application_security_monitoring( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageApplicationSecurityMonitoringOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for application security ``` /** * Get hourly usage for application security returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetUsageApplicationSecurityMonitoringRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageApplicationSecurityMonitoring(params) .then((data: v2.UsageApplicationSecurityMonitoringResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for observability pipelines](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-observability-pipelines) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-observability-pipelines-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/observability_pipelineshttps://api.ap2.datadoghq.com/api/v2/usage/observability_pipelineshttps://api.datadoghq.eu/api/v2/usage/observability_pipelineshttps://api.ddog-gov.com/api/v2/usage/observability_pipelineshttps://api.datadoghq.com/api/v2/usage/observability_pipelineshttps://api.us3.datadoghq.com/api/v2/usage/observability_pipelineshttps://api.us5.datadoghq.com/api/v2/usage/observability_pipelines ### Overview Get hourly usage for observability pipelines. **Note:** This endpoint has been deprecated. Hourly usage data for all products is now available in the [Get hourly usage by product family API](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-by-product-family) This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. 
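The response uses the same `usage_timeseries` model as the other deprecated v2 usage endpoints above (documented under Response below). As a minimal sketch, assuming the official Python client call shown in the Code Example section and that the generated models expose the documented fields as snake_case attributes, the hourly bytes processed can be totalled like this:

```python
# Minimal sketch: total hourly observability pipelines bytes processed from the
# object returned by get_usage_observability_pipelines (see the Python example
# below). Attribute names are assumed to match the documented response model;
# getattr guards against fields that are not set.
def total_bytes_processed(response) -> int:
    total = 0
    for item in getattr(response, "data", None) or []:
        attributes = getattr(item, "attributes", None)
        for point in getattr(attributes, "timeseries", None) or []:
            value = getattr(point, "value", None)
            if value is not None:
                total += value
    return total
```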
### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageObservabilityPipelines-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageObservabilityPipelines-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageObservabilityPipelines-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageObservabilityPipelines-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Observability Pipelines usage response. Field Type Description data [object] Response containing Observability Pipelines usage. attributes object Usage attributes data. org_name string The organization name. product_family string The product for which usage is being reported. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. timeseries [object] List of usage data reported for each requested hour. timestamp date-time Datetime in ISO-8601 format, UTC. The hour for the usage. value int64 Contains the number measured for the given usage_type during the hour. usage_type enum Usage type that is being measured. Allowed enum values: `app_sec_host_count,observability_pipelines_bytes_processed,lambda_traced_invocations_count` id string Unique ID of the response. type enum Type of usage data. Allowed enum values: `usage_timeseries` default: `usage_timeseries` ``` { "data": [ { "attributes": { "org_name": "string", "product_family": "string", "public_id": "string", "region": "string", "timeseries": [ { "timestamp": "2019-09-19T10:00:00.000Z", "value": "integer" } ], "usage_type": "observability_pipelines_bytes_processed" }, "id": "string", "type": "usage_timeseries" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for observability pipelines Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/observability_pipelines?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for observability pipelines ``` """ Get hourly usage for observability pipelines returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_observability_pipelines( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for observability pipelines ``` # Get hourly usage for observability pipelines returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_observability_pipelines((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for observability pipelines ``` // Get hourly usage for observability pipelines returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageObservabilityPipelines(ctx, time.Now().AddDate(0, 0, -5), 
*datadogV2.NewGetUsageObservabilityPipelinesOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageObservabilityPipelines`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageObservabilityPipelines`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for observability pipelines ``` // Get hourly usage for observability pipelines returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.api.UsageMeteringApi.GetUsageObservabilityPipelinesOptionalParameters; import com.datadog.api.client.v2.model.UsageObservabilityPipelinesResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageObservabilityPipelinesResponse result = apiInstance.getUsageObservabilityPipelines( OffsetDateTime.now().plusDays(-5), new GetUsageObservabilityPipelinesOptionalParameters() .endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageObservabilityPipelines"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for observability pipelines ``` // Get hourly usage for observability pipelines returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetUsageObservabilityPipelinesOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_observability_pipelines( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageObservabilityPipelinesOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the 
example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for observability pipelines ``` /** * Get hourly usage for observability pipelines returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetUsageObservabilityPipelinesRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageObservabilityPipelines(params) .then((data: v2.UsageObservabilityPipelinesResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get hourly usage for audit logs](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-audit-logs) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-hourly-usage-for-audit-logs-v1) GET https://api.ap1.datadoghq.com/api/v1/usage/audit_logshttps://api.ap2.datadoghq.com/api/v1/usage/audit_logshttps://api.datadoghq.eu/api/v1/usage/audit_logshttps://api.ddog-gov.com/api/v1/usage/audit_logshttps://api.datadoghq.com/api/v1/usage/audit_logshttps://api.us3.datadoghq.com/api/v1/usage/audit_logshttps://api.us5.datadoghq.com/api/v1/usage/audit_logs ### Overview Get hourly usage for audit logs. **Note:** This endpoint has been deprecated. This endpoint requires the `usage_read` permission. OAuth apps require the `usage_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_hr [_required_] string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage beginning at this hour. end_hr string Datetime in ISO-8601 format, UTC, precise to hour: `[YYYY-MM-DDThh]` for usage ending **before** this hour. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAuditLogs-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAuditLogs-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAuditLogs-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetUsageAuditLogs-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing the audit logs usage for each hour for a given organization. Field Type Description usage [object] Get hourly usage for audit logs. hour date-time The hour for the usage. lines_indexed int64 The total number of audit logs lines indexed during a given hour. org_name string The organization name. public_id string The organization public ID. 
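As with the other v1 usage responses, each entry in `usage` covers one hour. Below is a minimal illustrative sketch (assuming the body has been parsed into a plain dict matching the model above) that maps each hour to its indexed line count; the raw JSON example follows.

```python
# Minimal illustrative helper (assumption: the response body is a plain dict
# matching the documented audit logs usage model).
from typing import Any, Dict


def audit_log_lines_by_hour(body: Dict[str, Any]) -> Dict[str, int]:
    """Map each usage hour to its `lines_indexed` count."""
    by_hour: Dict[str, int] = {}
    for entry in body.get("usage", []):
        hour = entry.get("hour")
        lines = entry.get("lines_indexed")
        if hour is not None and lines is not None:
            by_hour[hour] = int(lines)
    return by_hour
```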
``` { "usage": [ { "hour": "2019-09-19T10:00:00.000Z", "lines_indexed": "integer", "org_name": "string", "public_id": "string" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get hourly usage for audit logs Copy ``` # Required query arguments export start_hr="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/usage/audit_logs?start_hr=${start_hr}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get hourly usage for audit logs ``` """ Get hourly usage for audit logs returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_usage_audit_logs( start_hr=(datetime.now() + relativedelta(days=-5)), end_hr=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get hourly usage for audit logs ``` # Get hourly usage for audit logs returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new opts = { end_hr: (Time.now + -3 * 86400), } p api_instance.get_usage_audit_logs((Time.now + -5 * 86400), opts) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get hourly usage for audit logs ``` // Get hourly usage for audit logs returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetUsageAuditLogs(ctx, time.Now().AddDate(0, 0, -5), *datadogV1.NewGetUsageAuditLogsOptionalParameters().WithEndHr(time.Now().AddDate(0, 0, -3))) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetUsageAuditLogs`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetUsageAuditLogs`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get hourly usage for audit logs ``` // Get hourly usage for audit logs returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.api.UsageMeteringApi.GetUsageAuditLogsOptionalParameters; import com.datadog.api.client.v1.model.UsageAuditLogsResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageAuditLogsResponse result = apiInstance.getUsageAuditLogs( OffsetDateTime.now().plusDays(-5), new GetUsageAuditLogsOptionalParameters().endHr(OffsetDateTime.now().plusDays(-3))); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getUsageAuditLogs"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get hourly usage for audit logs ``` // Get hourly usage for audit logs returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetUsageAuditLogsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = 
datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_usage_audit_logs( DateTime::parse_from_rfc3339("2021-11-06T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetUsageAuditLogsOptionalParams::default().end_hr( DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), ), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get hourly usage for audit logs ``` /** * Get hourly usage for audit logs returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetUsageAuditLogsRequest = { startHr: new Date(new Date().getTime() + -5 * 86400 * 1000), endHr: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getUsageAuditLogs(params) .then((data: v1.UsageAuditLogsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of available daily custom reports](https://docs.datadoghq.com/api/latest/usage-metering/#get-the-list-of-available-daily-custom-reports) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-the-list-of-available-daily-custom-reports-v1) GET https://api.ap1.datadoghq.com/api/v1/daily_custom_reportshttps://api.ap2.datadoghq.com/api/v1/daily_custom_reportshttps://api.datadoghq.eu/api/v1/daily_custom_reportshttps://api.ddog-gov.com/api/v1/daily_custom_reportshttps://api.datadoghq.com/api/v1/daily_custom_reportshttps://api.us3.datadoghq.com/api/v1/daily_custom_reportshttps://api.us5.datadoghq.com/api/v1/daily_custom_reports ### Overview Get daily custom reports. **Note:** This endpoint will be fully deprecated on December 1, 2022. Refer to [Migrating from v1 to v2 of the Usage Attribution API](https://docs.datadoghq.com/account_management/guide/usage-attribution-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer The number of files to return in the response. `[default=60]`. page[number] integer The identifier of the first page to return. This parameter is used for the pagination feature `[default=0]`. sort_dir enum The direction to sort by: `[desc, asc]`. Allowed enum values: `desc, asc` sort enum The field to sort by: `[computed_on, size, start_date, end_date]`. 
Allowed enum values: `computed_on, size, start_date, end_date` ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetDailyCustomReports-200-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetDailyCustomReports-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetDailyCustomReports-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing available custom reports. Field Type Description data [object] An array of available custom reports. attributes object The response containing attributes for custom reports. computed_on string The date the specified custom report was computed. end_date string The ending date of custom report. size int64 size start_date string The starting date of custom report. tags [string] A list of tags to apply to custom reports. id string The date for specified custom reports. type enum The type of reports. Allowed enum values: `reports` default: `reports` meta object The object containing document metadata. page object The object containing page total count. total_count int64 Total page count. ``` { "data": [ { "attributes": { "computed_on": "string", "end_date": "string", "size": "integer", "start_date": "string", "tags": [ "env" ] }, "id": "string", "type": "reports" } ], "meta": { "page": { "total_count": "integer" } } } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get the list of available daily custom reports Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/daily_custom_reports" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of available daily custom reports ``` """ Get the list of available daily custom reports returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_daily_custom_reports() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of available daily custom reports ``` # Get the list of available daily custom reports returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_daily_custom_reports() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of available daily custom reports ``` // Get the list of available daily custom reports returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetDailyCustomReports(ctx, *datadogV1.NewGetDailyCustomReportsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetDailyCustomReports`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetDailyCustomReports`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of available daily custom reports ``` // Get the list of available daily custom reports returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageCustomReportsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageCustomReportsResponse result = apiInstance.getDailyCustomReports(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getDailyCustomReports"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of available daily custom reports ``` // Get the list of available daily custom reports returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetDailyCustomReportsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_daily_custom_reports(GetDailyCustomReportsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of available daily custom reports ``` /** * Get the list of available daily custom reports returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); apiInstance .getDailyCustomReports() .then((data: v1.UsageCustomReportsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get specified daily custom reports](https://docs.datadoghq.com/api/latest/usage-metering/#get-specified-daily-custom-reports) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-specified-daily-custom-reports-v1) GET https://api.ap1.datadoghq.com/api/v1/daily_custom_reports/{report_id}https://api.ap2.datadoghq.com/api/v1/daily_custom_reports/{report_id}https://api.datadoghq.eu/api/v1/daily_custom_reports/{report_id}https://api.ddog-gov.com/api/v1/daily_custom_reports/{report_id}https://api.datadoghq.com/api/v1/daily_custom_reports/{report_id}https://api.us3.datadoghq.com/api/v1/daily_custom_reports/{report_id}https://api.us5.datadoghq.com/api/v1/daily_custom_reports/{report_id} ### Overview Get specified daily custom reports. **Note:** This endpoint will be fully deprecated on December 1, 2022. Refer to [Migrating from v1 to v2 of the Usage Attribution API](https://docs.datadoghq.com/account_management/guide/usage-attribution-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. ### Arguments #### Path Parameters Name Type Description report_id [_required_] string Date of the report in the format `YYYY-MM-DD`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedDailyCustomReports-200-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedDailyCustomReports-403-v1) * [404](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedDailyCustomReports-404-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedDailyCustomReports-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Returns available specified custom reports. Field Type Description data object Response containing date and type for specified custom reports. attributes object The response containing attributes for specified custom reports. computed_on string The date the specified custom report was computed. end_date string The ending date of specified custom report. location string A downloadable file for the specified custom reporting file. size int64 size start_date string The starting date of specified custom report. tags [string] A list of tags to apply to specified custom reports. id string The date for specified custom reports. type enum The type of reports. Allowed enum values: `reports` default: `reports` meta object The object containing document metadata. page object The object containing page total count for specified ID. total_count int64 Total page count. 
``` { "data": { "attributes": { "computed_on": "string", "end_date": "string", "location": "https://an-s3-or-gs-bucket.s3.amazonaws.com", "size": "integer", "start_date": "string", "tags": [ "env" ] }, "id": "string", "type": "reports" }, "meta": { "page": { "total_count": "integer" } } } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get specified daily custom reports Copy ``` # Path parameters export report_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/daily_custom_reports/${report_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get specified daily custom reports ``` """ Get specified daily custom reports returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_specified_daily_custom_reports( report_id="2022-03-20", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get specified daily custom reports ``` # Get specified daily custom reports returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_specified_daily_custom_reports("2022-03-20") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then 
save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get specified daily custom reports ``` // Get specified daily custom reports returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetSpecifiedDailyCustomReports(ctx, "2022-03-20") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetSpecifiedDailyCustomReports`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetSpecifiedDailyCustomReports`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get specified daily custom reports ``` // Get specified daily custom reports returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSpecifiedCustomReportsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSpecifiedCustomReportsResponse result = apiInstance.getSpecifiedDailyCustomReports("2022-03-20"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getSpecifiedDailyCustomReports"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get specified daily custom reports ``` // Get specified daily custom reports returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_specified_daily_custom_reports("2022-03-20".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to 
`src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get specified daily custom reports ``` /** * Get specified daily custom reports returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetSpecifiedDailyCustomReportsRequest = { reportId: "2022-03-20", }; apiInstance .getSpecifiedDailyCustomReports(params) .then((data: v1.UsageSpecifiedCustomReportsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get the list of available monthly custom reports](https://docs.datadoghq.com/api/latest/usage-metering/#get-the-list-of-available-monthly-custom-reports) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-the-list-of-available-monthly-custom-reports-v1) GET https://api.ap1.datadoghq.com/api/v1/monthly_custom_reportshttps://api.ap2.datadoghq.com/api/v1/monthly_custom_reportshttps://api.datadoghq.eu/api/v1/monthly_custom_reportshttps://api.ddog-gov.com/api/v1/monthly_custom_reportshttps://api.datadoghq.com/api/v1/monthly_custom_reportshttps://api.us3.datadoghq.com/api/v1/monthly_custom_reportshttps://api.us5.datadoghq.com/api/v1/monthly_custom_reports ### Overview Get monthly custom reports. **Note:** This endpoint will be fully deprecated on December 1, 2022. Refer to [Migrating from v1 to v2 of the Usage Attribution API](https://docs.datadoghq.com/account_management/guide/usage-attribution-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. ### Arguments #### Query Strings Name Type Description page[size] integer The number of files to return in the response `[default=60].` page[number] integer The identifier of the first page to return. This parameter is used for the pagination feature `[default=0]`. sort_dir enum The direction to sort by: `[desc, asc]`. Allowed enum values: `desc, asc` sort enum The field to sort by: `[computed_on, size, start_date, end_date]`. Allowed enum values: `computed_on, size, start_date, end_date` ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCustomReports-200-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCustomReports-403-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetMonthlyCustomReports-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Response containing available custom reports. Field Type Description data [object] An array of available custom reports. attributes object The response containing attributes for custom reports. computed_on string The date the specified custom report was computed. end_date string The ending date of custom report. size int64 size start_date string The starting date of custom report. 
tags [string] A list of tags to apply to custom reports. id string The date for specified custom reports. type enum The type of reports. Allowed enum values: `reports` default: `reports` meta object The object containing document metadata. page object The object containing page total count. total_count int64 Total page count. ``` { "data": [ { "attributes": { "computed_on": "string", "end_date": "string", "size": "integer", "start_date": "string", "tags": [ "env" ] }, "id": "string", "type": "reports" } ], "meta": { "page": { "total_count": "integer" } } } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get the list of available monthly custom reports Copy ``` # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monthly_custom_reports" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get the list of available monthly custom reports ``` """ Get the list of available monthly custom reports returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_monthly_custom_reports() print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get the list of available monthly custom reports ``` # Get the list of available monthly custom reports returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_monthly_custom_reports() ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` 
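# DD_SITE below shows the available Datadog sites run together; set it to a single
# site (for example datadoghq.com or datadoghq.eu) before running.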
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get the list of available monthly custom reports ``` // Get the list of available monthly custom reports returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetMonthlyCustomReports(ctx, *datadogV1.NewGetMonthlyCustomReportsOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetMonthlyCustomReports`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetMonthlyCustomReports`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get the list of available monthly custom reports ``` // Get the list of available monthly custom reports returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageCustomReportsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageCustomReportsResponse result = apiInstance.getMonthlyCustomReports(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getMonthlyCustomReports"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get the list of available monthly custom reports ``` // Get the list of available monthly custom reports returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::GetMonthlyCustomReportsOptionalParams; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_monthly_custom_reports(GetMonthlyCustomReportsOptionalParams::default()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library 
and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get the list of available monthly custom reports ``` /** * Get the list of available monthly custom reports returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); apiInstance .getMonthlyCustomReports() .then((data: v1.UsageCustomReportsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get specified monthly custom reports](https://docs.datadoghq.com/api/latest/usage-metering/#get-specified-monthly-custom-reports) * [v1 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-specified-monthly-custom-reports-v1) GET https://api.ap1.datadoghq.com/api/v1/monthly_custom_reports/{report_id}https://api.ap2.datadoghq.com/api/v1/monthly_custom_reports/{report_id}https://api.datadoghq.eu/api/v1/monthly_custom_reports/{report_id}https://api.ddog-gov.com/api/v1/monthly_custom_reports/{report_id}https://api.datadoghq.com/api/v1/monthly_custom_reports/{report_id}https://api.us3.datadoghq.com/api/v1/monthly_custom_reports/{report_id}https://api.us5.datadoghq.com/api/v1/monthly_custom_reports/{report_id} ### Overview Get specified monthly custom reports. **Note:** This endpoint will be fully deprecated on December 1, 2022. Refer to [Migrating from v1 to v2 of the Usage Attribution API](https://docs.datadoghq.com/account_management/guide/usage-attribution-migration/) for the associated migration guide. This endpoint requires the `usage_read` permission. ### Arguments #### Path Parameters Name Type Description report_id [_required_] string Date of the report in the format `YYYY-MM-DD`. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedMonthlyCustomReports-200-v1) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedMonthlyCustomReports-400-v1) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedMonthlyCustomReports-403-v1) * [404](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedMonthlyCustomReports-404-v1) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetSpecifiedMonthlyCustomReports-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Returns available specified custom reports. Field Type Description data object Response containing date and type for specified custom reports. attributes object The response containing attributes for specified custom reports. computed_on string The date the specified custom report was computed. end_date string The ending date of specified custom report. location string A downloadable file for the specified custom reporting file. 
size int64 size start_date string The starting date of specified custom report. tags [string] A list of tags to apply to specified custom reports. id string The date for specified custom reports. type enum The type of reports. Allowed enum values: `reports` default: `reports` meta object The object containing document metadata. page object The object containing page total count for specified ID. total_count int64 Total page count. ``` { "data": { "attributes": { "computed_on": "string", "end_date": "string", "location": "https://an-s3-or-gs-bucket.s3.amazonaws.com", "size": "integer", "start_date": "string", "tags": [ "env" ] }, "id": "string", "type": "reports" }, "meta": { "page": { "total_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get specified monthly custom reports Copy ``` # Path parameters export report_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/monthly_custom_reports/${report_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get specified monthly custom reports ``` """ Get specified monthly custom reports returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_specified_monthly_custom_reports( report_id="2021-05-01", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get specified monthly custom reports ``` # Get specified monthly custom reports returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsageMeteringAPI.new p api_instance.get_specified_monthly_custom_reports("2021-05-01") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get specified monthly custom reports ``` // Get specified monthly custom reports returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsageMeteringApi(apiClient) resp, r, err := api.GetSpecifiedMonthlyCustomReports(ctx, "2021-05-01") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetSpecifiedMonthlyCustomReports`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetSpecifiedMonthlyCustomReports`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get specified monthly custom reports ``` // Get specified monthly custom reports returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsageMeteringApi; import com.datadog.api.client.v1.model.UsageSpecifiedCustomReportsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { UsageSpecifiedCustomReportsResponse result = apiInstance.getSpecifiedMonthlyCustomReports("2021-05-01"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling UsageMeteringApi#getSpecifiedMonthlyCustomReports"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get specified monthly custom reports ``` // Get specified monthly custom reports returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_specified_monthly_custom_reports("2021-05-01".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get specified monthly custom reports ``` /** * Get specified monthly custom reports returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsageMeteringApi(configuration); const params: v1.UsageMeteringApiGetSpecifiedMonthlyCustomReportsRequest = { reportId: "2021-05-01", }; apiInstance .getSpecifiedMonthlyCustomReports(params) .then((data: v1.UsageSpecifiedCustomReportsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get cost across multi-org account](https://docs.datadoghq.com/api/latest/usage-metering/#get-cost-across-multi-org-account) * [v2 (deprecated)](https://docs.datadoghq.com/api/latest/usage-metering/#get-cost-across-multi-org-account-v2) GET https://api.ap1.datadoghq.com/api/v2/usage/cost_by_orghttps://api.ap2.datadoghq.com/api/v2/usage/cost_by_orghttps://api.datadoghq.eu/api/v2/usage/cost_by_orghttps://api.ddog-gov.com/api/v2/usage/cost_by_orghttps://api.datadoghq.com/api/v2/usage/cost_by_orghttps://api.us3.datadoghq.com/api/v2/usage/cost_by_orghttps://api.us5.datadoghq.com/api/v2/usage/cost_by_org ### Overview Get cost across multi-org account. Cost by org data for a given month becomes available no later than the 16th of the following month. **Note:** This endpoint has been deprecated. Please use the new endpoint [`/historical_cost`](https://docs.datadoghq.com/api/latest/usage-metering/#get-historical-cost-across-your-account) instead. This endpoint is only accessible for [parent-level organizations](https://docs.datadoghq.com/account_management/multi_organization/). This endpoint requires all of the following permissions: * `usage_read` * `billing_read` OAuth apps require the `usage_read, billing_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#usage-metering) to access this endpoint. ### Arguments #### Query Strings Name Type Description start_month [_required_] string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost beginning this month. end_month string Datetime in ISO-8601 format, UTC, precise to month: `[YYYY-MM]` for cost ending this month. ### Response * [200](https://docs.datadoghq.com/api/latest/usage-metering/#GetCostByOrg-200-v2) * [400](https://docs.datadoghq.com/api/latest/usage-metering/#GetCostByOrg-400-v2) * [403](https://docs.datadoghq.com/api/latest/usage-metering/#GetCostByOrg-403-v2) * [429](https://docs.datadoghq.com/api/latest/usage-metering/#GetCostByOrg-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) Chargeback Summary response. Field Type Description data [object] Response containing Chargeback Summary. attributes object Cost attributes data. account_name string The account name. account_public_id string The account public ID. charges [object] List of charges data reported for the requested month. charge_type string The type of charge for a particular product. cost double The cost for a particular product and charge type during a given month. product_name string The product for which cost is being reported. date date-time The month requested. org_name string The organization name. public_id string The organization public ID. region string The region of the Datadog instance that the organization belongs to. total_cost double The total cost of products for the month. id string Unique ID of the response. type enum Type of cost data. 
Allowed enum values: `cost_by_org` default: `cost_by_org` ``` { "data": [ { "attributes": { "account_name": "string", "account_public_id": "string", "charges": [ { "charge_type": "on_demand", "cost": "number", "product_name": "infra_host" } ], "date": "2019-09-19T10:00:00.000Z", "org_name": "string", "public_id": "string", "region": "string", "total_cost": "number" }, "id": "string", "type": "cost_by_org" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden - User is not authorized * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/usage-metering/) * [Example](https://docs.datadoghq.com/api/latest/usage-metering/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/usage-metering/?code-lang=typescript) ##### Get cost across multi-org account Copy ``` # Required query arguments export start_month="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/usage/cost_by_org?start_month=${start_month}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get cost across multi-org account ``` """ Get cost across multi-org account returns "OK" response """ from datetime import datetime from dateutil.relativedelta import relativedelta from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.usage_metering_api import UsageMeteringApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsageMeteringApi(api_client) response = api_instance.get_cost_by_org( start_month=(datetime.now() + relativedelta(days=-3)), ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get cost across multi-org account ``` # Get cost across multi-org account returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsageMeteringAPI.new p 
api_instance.get_cost_by_org((Time.now + -3 * 86400)) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get cost across multi-org account ``` // Get cost across multi-org account returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "time" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsageMeteringApi(apiClient) resp, r, err := api.GetCostByOrg(ctx, time.Now().AddDate(0, 0, -3), *datadogV2.NewGetCostByOrgOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsageMeteringApi.GetCostByOrg`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsageMeteringApi.GetCostByOrg`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get cost across multi-org account ``` // Get cost across multi-org account returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsageMeteringApi; import com.datadog.api.client.v2.model.CostByOrgResponse; import java.time.OffsetDateTime; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsageMeteringApi apiInstance = new UsageMeteringApi(defaultClient); try { CostByOrgResponse result = apiInstance.getCostByOrg(OffsetDateTime.now().plusDays(-3)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsageMeteringApi#getCostByOrg"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get cost across multi-org account ``` // Get cost across multi-org account returns "OK" response use chrono::{DateTime, Utc}; use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_usage_metering::GetCostByOrgOptionalParams; use datadog_api_client::datadogV2::api_usage_metering::UsageMeteringAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsageMeteringAPI::with_config(configuration); let resp = api .get_cost_by_org( 
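// start_month: ISO-8601 datetime in UTC, precise to the month ([YYYY-MM]), for cost
// beginning this month; end_month is optional and left unset here via
// GetCostByOrgOptionalParams::default().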
DateTime::parse_from_rfc3339("2021-11-08T11:11:11+00:00") .expect("Failed to parse datetime") .with_timezone(&Utc), GetCostByOrgOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get cost across multi-org account ``` /** * Get cost across multi-org account returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsageMeteringApi(configuration); const params: v2.UsageMeteringApiGetCostByOrgRequest = { startMonth: new Date(new Date().getTime() + -3 * 86400 * 1000), }; apiInstance .getCostByOrg(params) .then((data: v2.CostByOrgResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/users # Users Create, edit, and disable users.
## [Create a user](https://docs.datadoghq.com/api/latest/users/#create-a-user) * [v1](https://docs.datadoghq.com/api/latest/users/#create-a-user-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#create-a-user-v2) POST https://api.ap1.datadoghq.com/api/v1/userhttps://api.ap2.datadoghq.com/api/v1/userhttps://api.datadoghq.eu/api/v1/userhttps://api.ddog-gov.com/api/v1/userhttps://api.datadoghq.com/api/v1/userhttps://api.us3.datadoghq.com/api/v1/userhttps://api.us5.datadoghq.com/api/v1/user ### Overview Create a user for your organization. **Note** : Users can only be created with the admin access role if application keys belong to administrators. This endpoint requires the `user_access_invite` permission. ### Request #### Body Data (required) User object that needs to be created. * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Expand All Field Type Description access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "access_role": null, "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "name": "test user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/users/#CreateUser-200-v1) * [400](https://docs.datadoghq.com/api/latest/users/#CreateUser-400-v1) * [403](https://docs.datadoghq.com/api/latest/users/#CreateUser-403-v1) * [409](https://docs.datadoghq.com/api/latest/users/#CreateUser-409-v1) * [429](https://docs.datadoghq.com/api/latest/users/#CreateUser-429-v1) User created * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) A Datadog User. Field Type Description user object Create, edit, and disable users. access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "user": { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "icon": "/path/to/matching/gravatar/icon", "name": "test user", "verified": true } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Conflict * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Create a user returns null access role ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v1/user" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "access_role": null, "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "name": "test user" } EOF ``` ##### Create a user returns null access role ``` // Create a user returns null access role package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.User{ AccessRole: *datadogV1.NewNullableAccessRole(nil), Disabled: datadog.PtrBool(false), Email: datadog.PtrString("test@datadoghq.com"), Handle: datadog.PtrString("test@datadoghq.com"), Name: datadog.PtrString("test user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsersApi(apiClient) resp, r, err := api.CreateUser(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.CreateUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.CreateUser`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a user returns null access role ``` // Create a user returns null access role import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsersApi; import com.datadog.api.client.v1.model.User; import com.datadog.api.client.v1.model.UserResponse; public class Example { public static void main(String[] args) { ApiClient
defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); User body = new User() .accessRole(null) .disabled(false) .email("test@datadoghq.com") .handle("test@datadoghq.com") .name("test user"); try { UserResponse result = apiInstance.createUser(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#createUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a user returns null access role ``` """ Create a user returns null access role """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.users_api import UsersApi from datadog_api_client.v1.model.user import User body = User( access_role=None, disabled=False, email="test@datadoghq.com", handle="test@datadoghq.com", name="test user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.create_user(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a user returns null access role ``` # Create a user returns null access role require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsersAPI.new body = DatadogAPIClient::V1::User.new({ access_role: nil, disabled: false, email: "test@datadoghq.com", handle: "test@datadoghq.com", name: "test user", }) p api_instance.create_user(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a user returns null access role ``` // Create a user returns null access role use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_users::UsersAPI; use datadog_api_client::datadogV1::model::User; #[tokio::main] async fn main() { let body = User::new() .access_role(None) .disabled(false) .email("test@datadoghq.com".to_string()) .handle("test@datadoghq.com".to_string()) .name("test user".to_string()); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.create_user(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a user returns null access role ``` /** * Create a user returns null access role */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsersApi(configuration); const params: v1.UsersApiCreateUserRequest = { body: { accessRole: undefined, disabled: false, email: "test@datadoghq.com", handle: "test@datadoghq.com", name: "test user", }, }; apiInstance .createUser(params) .then((data: v1.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` POST https://api.ap1.datadoghq.com/api/v2/usershttps://api.ap2.datadoghq.com/api/v2/usershttps://api.datadoghq.eu/api/v2/usershttps://api.ddog-gov.com/api/v2/usershttps://api.datadoghq.com/api/v2/usershttps://api.us3.datadoghq.com/api/v2/usershttps://api.us5.datadoghq.com/api/v2/users ### Overview Create a user for your organization. This endpoint requires the `user_access_invite` permission. OAuth apps require the `user_access_invite` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Field Type Description data [_required_] object Object to create a user. attributes [_required_] object Attributes of the created user. email [_required_] string The email of the user. name string The name of the user. title string The title of the user. relationships object Relationships of the user object. roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` ``` { "data": { "type": "users", "attributes": { "name": "Datadog API Client Python", "email": "Example-User@datadoghq.com" } } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/users/#CreateUser-201-v2) * [400](https://docs.datadoghq.com/api/latest/users/#CreateUser-400-v2) * [403](https://docs.datadoghq.com/api/latest/users/#CreateUser-403-v2) * [429](https://docs.datadoghq.com/api/latest/users/#CreateUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Response containing information about a single user. Field Type Description data object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. 
modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the user. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Create a user returns "OK" response ``` # Curl command curl -X POST "https://api.datadoghq.com/api/v2/users" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "type": "users", "attributes": { "name": "Datadog API Client Python", "email": "Example-User@datadoghq.com" } } } EOF ``` ##### Create a user returns "OK" response ``` // Create a user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.UserCreateRequest{ Data: datadogV2.UserCreateData{ Type: datadogV2.USERSTYPE_USERS, Attributes: datadogV2.UserCreateAttributes{ Name: datadog.PtrString("Datadog API Client Python"), Email: "Example-User@datadoghq.com", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.CreateUser(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.CreateUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.CreateUser`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a user returns "OK" response ``` // Create a user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.UserCreateAttributes; import com.datadog.api.client.v2.model.UserCreateData; import com.datadog.api.client.v2.model.UserCreateRequest; import com.datadog.api.client.v2.model.UserResponse; import com.datadog.api.client.v2.model.UsersType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); UserCreateRequest body = new UserCreateRequest() .data( new UserCreateData() .type(UsersType.USERS) .attributes( new UserCreateAttributes() .name("Datadog API Client Python") .email("Example-User@datadoghq.com"))); try { UserResponse result = apiInstance.createUser(body); System.out.println(result); } catch (ApiException e)
{ System.err.println("Exception when calling UsersApi#createUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a user returns "OK" response ``` """ Create a user returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi from datadog_api_client.v2.model.user_create_attributes import UserCreateAttributes from datadog_api_client.v2.model.user_create_data import UserCreateData from datadog_api_client.v2.model.user_create_request import UserCreateRequest from datadog_api_client.v2.model.users_type import UsersType body = UserCreateRequest( data=UserCreateData( type=UsersType.USERS, attributes=UserCreateAttributes( name="Datadog API Client Python", email="Example-User@datadoghq.com", ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.create_user(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a user returns "OK" response ``` # Create a user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new body = DatadogAPIClient::V2::UserCreateRequest.new({ data: DatadogAPIClient::V2::UserCreateData.new({ type: DatadogAPIClient::V2::UsersType::USERS, attributes: DatadogAPIClient::V2::UserCreateAttributes.new({ name: "Datadog API Client Python", email: "Example-User@datadoghq.com", }), }), }) p api_instance.create_user(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a user returns "OK" response ``` // Create a user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; use datadog_api_client::datadogV2::model::UserCreateAttributes; use datadog_api_client::datadogV2::model::UserCreateData; use datadog_api_client::datadogV2::model::UserCreateRequest; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { let body = UserCreateRequest::new(UserCreateData::new( UserCreateAttributes::new("Example-User@datadoghq.com".to_string()) .name("Datadog API Client Python".to_string()), UsersType::USERS, )); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.create_user(body).await; if let Ok(value) = resp { println!("{:#?}", 
value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a user returns "OK" response ``` /** * Create a user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); const params: v2.UsersApiCreateUserRequest = { body: { data: { type: "users", attributes: { name: "Datadog API Client Python", email: "Example-User@datadoghq.com", }, }, }, }; apiInstance .createUser(params) .then((data: v2.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List all users](https://docs.datadoghq.com/api/latest/users/#list-all-users) * [v1](https://docs.datadoghq.com/api/latest/users/#list-all-users-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#list-all-users-v2) GET https://api.ap1.datadoghq.com/api/v1/userhttps://api.ap2.datadoghq.com/api/v1/userhttps://api.datadoghq.eu/api/v1/userhttps://api.ddog-gov.com/api/v1/userhttps://api.datadoghq.com/api/v1/userhttps://api.us3.datadoghq.com/api/v1/userhttps://api.us5.datadoghq.com/api/v1/user ### Overview List all users for your organization. This endpoint requires the `user_access_read` permission. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#ListUsers-200-v1) * [403](https://docs.datadoghq.com/api/latest/users/#ListUsers-403-v1) * [429](https://docs.datadoghq.com/api/latest/users/#ListUsers-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Array of Datadog users for a given organization. Field Type Description users [object] Array of users. access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "users": [ { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "icon": "/path/to/matching/gravatar/icon", "name": "test user", "verified": true } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. 
Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### List all users ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v1/user" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all users ``` """ List all users returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.users_api import UsersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.list_users() print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all users ``` # List all users returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsersAPI.new p api_instance.list_users() ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List all users ``` // List all users returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsersApi(apiClient) resp, r, err := api.ListUsers(ctx) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.ListUsers`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.ListUsers`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all users ``` // List all users returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsersApi; import com.datadog.api.client.v1.model.UserListResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); try { UserListResponse result = apiInstance.listUsers(); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#listUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all users ``` // List all users returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_users::UsersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.list_users().await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all users ``` /** * List all users returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsersApi(configuration); apiInstance .listUsers() .then((data: v1.UserListResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/usershttps://api.ap2.datadoghq.com/api/v2/usershttps://api.datadoghq.eu/api/v2/usershttps://api.ddog-gov.com/api/v2/usershttps://api.datadoghq.com/api/v2/usershttps://api.us3.datadoghq.com/api/v2/usershttps://api.us5.datadoghq.com/api/v2/users ### Overview Get the list of all users in the organization. This list includes all users even if they are deactivated or unverified. This endpoint requires the `user_access_read` permission. 
OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. sort string User attribute to order results by. Sort order is ascending by default. Sort order is descending if the field is prefixed by a negative sign, for example `sort=-name`. Options: `name`, `modified_at`, `user_count`. sort_dir enum Direction of sort. Options: `asc`, `desc`. Allowed enum values: `asc, desc` filter string Filter all users by the given string. Defaults to no filtering. filter[status] string Filter on status attribute. Comma separated list, with possible values `Active`, `Pending`, and `Disabled`. Defaults to no filtering. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#ListUsers-200-v2) * [400](https://docs.datadoghq.com/api/latest/users/#ListUsers-400-v2) * [403](https://docs.datadoghq.com/api/latest/users/#ListUsers-403-v2) * [429](https://docs.datadoghq.com/api/latest/users/#ListUsers-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Response containing information about multiple users. Field Type Description data [object] Array of returned users. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the users. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. 
modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` meta object Object describing meta attributes of response. page object Pagination object. total_count int64 Total count. total_filtered_count int64 Total count of elements matched by the filter. ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" } ], "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ], "meta": { "page": { "total_count": "integer", "total_filtered_count": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### List all users ``` # Curl command curl -X GET "https://api.datadoghq.com/api/v2/users" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List all users ``` """ List all users returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi # there is a valid "user" in the system USER_DATA_ATTRIBUTES_EMAIL = environ["USER_DATA_ATTRIBUTES_EMAIL"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.list_users( filter=USER_DATA_ATTRIBUTES_EMAIL, ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List all users ``` # List all users returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ATTRIBUTES_EMAIL = ENV["USER_DATA_ATTRIBUTES_EMAIL"] opts = { filter: USER_DATA_ATTRIBUTES_EMAIL, } p api_instance.list_users(opts) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### List all users ``` // List all users returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataAttributesEmail := os.Getenv("USER_DATA_ATTRIBUTES_EMAIL") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.ListUsers(ctx,
*datadogV2.NewListUsersOptionalParameters().WithFilter(UserDataAttributesEmail)) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.ListUsers`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.ListUsers`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List all users ``` // List all users returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.api.UsersApi.ListUsersOptionalParameters; import com.datadog.api.client.v2.model.UsersResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ATTRIBUTES_EMAIL = System.getenv("USER_DATA_ATTRIBUTES_EMAIL"); try { UsersResponse result = apiInstance.listUsers( new ListUsersOptionalParameters().filter(USER_DATA_ATTRIBUTES_EMAIL)); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#listUsers"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List all users ``` // List all users returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::ListUsersOptionalParams; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_attributes_email = std::env::var("USER_DATA_ATTRIBUTES_EMAIL").unwrap(); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api .list_users(ListUsersOptionalParams::default().filter(user_data_attributes_email.clone())) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List all users ``` /** * List all users returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ATTRIBUTES_EMAIL = 
process.env .USER_DATA_ATTRIBUTES_EMAIL as string; const params: v2.UsersApiListUsersRequest = { filter: USER_DATA_ATTRIBUTES_EMAIL, }; apiInstance .listUsers(params) .then((data: v2.UsersResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get user details](https://docs.datadoghq.com/api/latest/users/#get-user-details) * [v1](https://docs.datadoghq.com/api/latest/users/#get-user-details-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#get-user-details-v2) GET https://api.ap1.datadoghq.com/api/v1/user/{user_handle}https://api.ap2.datadoghq.com/api/v1/user/{user_handle}https://api.datadoghq.eu/api/v1/user/{user_handle}https://api.ddog-gov.com/api/v1/user/{user_handle}https://api.datadoghq.com/api/v1/user/{user_handle}https://api.us3.datadoghq.com/api/v1/user/{user_handle}https://api.us5.datadoghq.com/api/v1/user/{user_handle} ### Overview Get a user’s details. ### Arguments #### Path Parameters Name Type Description user_handle [_required_] string The ID of the user. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#GetUser-200-v1) * [403](https://docs.datadoghq.com/api/latest/users/#GetUser-403-v1) * [404](https://docs.datadoghq.com/api/latest/users/#GetUser-404-v1) * [429](https://docs.datadoghq.com/api/latest/users/#GetUser-429-v1) OK for get user * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) A Datadog User. Field Type Description user object Create, edit, and disable users. access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "user": { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "icon": "/path/to/matching/gravatar/icon", "name": "test user", "verified": true } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Get user details ``` # Path parameters export user_handle="test@datadoghq.com" # Curl command curl -X GET "https://api.datadoghq.com/api/v1/user/${user_handle}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get user details ``` """ Get user details returns "OK for get user" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.users_api import UsersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.get_user( user_handle="test@datadoghq.com", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get user details ``` # Get user details returns "OK for get user" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsersAPI.new p api_instance.get_user("test@datadoghq.com") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Get user details ``` // Get user details returns "OK for get user" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsersApi(apiClient) resp, r, err := api.GetUser(ctx, "test@datadoghq.com") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.GetUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.GetUser`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get user
details ``` // Get user details returns "OK for get user" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsersApi; import com.datadog.api.client.v1.model.UserResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); try { UserResponse result = apiInstance.getUser("test@datadoghq.com"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#getUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get user details ``` // Get user details returns "OK for get user" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_users::UsersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.get_user("test@datadoghq.com".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get user details ``` /** * Get user details returns "OK for get user" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsersApi(configuration); const params: v1.UsersApiGetUserRequest = { userHandle: "test@datadoghq.com", }; apiInstance .getUser(params) .then((data: v1.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` GET https://api.ap1.datadoghq.com/api/v2/users/{user_id}https://api.ap2.datadoghq.com/api/v2/users/{user_id}https://api.datadoghq.eu/api/v2/users/{user_id}https://api.ddog-gov.com/api/v2/users/{user_id}https://api.datadoghq.com/api/v2/users/{user_id}https://api.us3.datadoghq.com/api/v2/users/{user_id}https://api.us5.datadoghq.com/api/v2/users/{user_id} ### Overview Get a user in the organization specified by the user’s `user_id`. This endpoint requires the `user_access_read` permission. 
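Unlike the v1 endpoint above, which is addressed by the user's handle (an email), this endpoint takes the user's ID (a UUID). If you only have a handle, one way to resolve the ID is to filter the v2 list endpoint by email and then fetch the user. The following TypeScript sketch illustrates that flow; it is not an official example, the `getUserByHandle` helper and the hard-coded handle are assumptions, and it reuses the `listUsers` filter pattern from the list-users example earlier on this page.

```
/**
 * Hypothetical helper (not from the official docs): resolve a v2 user by handle.
 * Assumes DD_API_KEY and DD_APP_KEY are set in the environment, as in the
 * examples further down.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.UsersApi(configuration);

async function getUserByHandle(handle: string): Promise<v2.UserResponse> {
  // Filter the user list by email, the same filter pattern used in the
  // list-users TypeScript example earlier on this page.
  const list = await apiInstance.listUsers({ filter: handle });
  const match = list.data?.find((u) => u.attributes?.handle === handle);
  if (!match?.id) {
    throw new Error(`No user found for handle ${handle}`);
  }
  // The v2 get-user call is addressed by the user's UUID, not the handle.
  return apiInstance.getUser({ userId: match.id });
}

getUserByHandle("test@datadoghq.com")
  .then((data) => console.log(JSON.stringify(data)))
  .catch((error) => console.error(error));
```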
OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#GetUser-200-v2) * [403](https://docs.datadoghq.com/api/latest/users/#GetUser-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#GetUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/users/#GetUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Response containing information about a single user. Field Type Description data object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the user. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. 
group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Get user details Copy ``` # Path parameters export user_id="00000000-0000-9999-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/users/${user_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get user details ``` """ Get user details returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.get_user( user_id=USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get user details ``` # Get user details returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] p api_instance.get_user(USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get user details ``` // Get user details returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.GetUser(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.GetUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.GetUser`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get user details ``` // Get user details returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.UserResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { UserResponse result = apiInstance.getUser(USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#getUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get user details ``` // Get user details returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.get_user(user_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get user details ``` /** * Get user details returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.UsersApiGetUserRequest = { userId: USER_DATA_ID, }; apiInstance .getUser(params) .then((data: v2.UserResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a user](https://docs.datadoghq.com/api/latest/users/#update-a-user) * [v1](https://docs.datadoghq.com/api/latest/users/#update-a-user-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#update-a-user-v2) PUT https://api.ap1.datadoghq.com/api/v1/user/{user_handle}https://api.ap2.datadoghq.com/api/v1/user/{user_handle}https://api.datadoghq.eu/api/v1/user/{user_handle}https://api.ddog-gov.com/api/v1/user/{user_handle}https://api.datadoghq.com/api/v1/user/{user_handle}https://api.us3.datadoghq.com/api/v1/user/{user_handle}https://api.us5.datadoghq.com/api/v1/user/{user_handle} ### Overview Update a user information. **Note** : It can only be used with application keys belonging to administrators. ### Arguments #### Path Parameters Name Type Description user_handle [_required_] string The ID of the user. ### Request #### Body Data (required) Description of the update. * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Expand All Field Type Description access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. ``` { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "name": "test user" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/users/#UpdateUser-200-v1) * [400](https://docs.datadoghq.com/api/latest/users/#UpdateUser-400-v1) * [403](https://docs.datadoghq.com/api/latest/users/#UpdateUser-403-v1) * [404](https://docs.datadoghq.com/api/latest/users/#UpdateUser-404-v1) * [429](https://docs.datadoghq.com/api/latest/users/#UpdateUser-429-v1) User updated * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) A Datadog User. Field Type Description user object Create, edit, and disable users. access_role enum The access role of the user. Options are **st** (standard user), **adm** (admin user), or **ro** (read-only user). Allowed enum values: `st,adm,ro,ERROR` disabled boolean The new disabled status of the user. email string The new email of the user. handle string The user handle, must be a valid email. icon string Gravatar icon associated to the user. name string The name of the user. verified boolean Whether or not the user logged in Datadog at least once. 
``` { "user": { "access_role": "ro", "disabled": false, "email": "test@datadoghq.com", "handle": "test@datadoghq.com", "icon": "/path/to/matching/gravatar/icon", "name": "test user", "verified": true } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Update a user Copy ``` # Path parameters export user_handle="test@datadoghq.com" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/user/${user_handle}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF {} EOF ``` ##### Update a user ``` """ Update a user returns "User updated" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.users_api import UsersApi from datadog_api_client.v1.model.access_role import AccessRole from datadog_api_client.v1.model.user import User body = User( access_role=AccessRole.READ_ONLY, disabled=False, email="test@datadoghq.com", handle="test@datadoghq.com", name="test user", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.update_user(user_handle="test@datadoghq.com", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a user ``` # Update a user returns "User updated" response require 
"datadog_api_client" api_instance = DatadogAPIClient::V1::UsersAPI.new body = DatadogAPIClient::V1::User.new({ access_role: DatadogAPIClient::V1::AccessRole::READ_ONLY, disabled: false, email: "test@datadoghq.com", handle: "test@datadoghq.com", name: "test user", }) p api_instance.update_user("test@datadoghq.com", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a user ``` // Update a user returns "User updated" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.User{ AccessRole: *datadogV1.NewNullableAccessRole(datadogV1.ACCESSROLE_READ_ONLY.Ptr()), Disabled: datadog.PtrBool(false), Email: datadog.PtrString("test@datadoghq.com"), Handle: datadog.PtrString("test@datadoghq.com"), Name: datadog.PtrString("test user"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsersApi(apiClient) resp, r, err := api.UpdateUser(ctx, "test@datadoghq.com", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.UpdateUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.UpdateUser`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a user ``` // Update a user returns "User updated" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsersApi; import com.datadog.api.client.v1.model.AccessRole; import com.datadog.api.client.v1.model.User; import com.datadog.api.client.v1.model.UserResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); User body = new User() .accessRole(AccessRole.READ_ONLY) .disabled(false) .email("test@datadoghq.com") .handle("test@datadoghq.com") .name("test user"); try { UserResponse result = apiInstance.updateUser("test@datadoghq.com", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#updateUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" 
DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a user ``` // Update a user returns "User updated" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_users::UsersAPI; use datadog_api_client::datadogV1::model::AccessRole; use datadog_api_client::datadogV1::model::User; #[tokio::main] async fn main() { let body = User::new() .access_role(Some(AccessRole::READ_ONLY)) .disabled(false) .email("test@datadoghq.com".to_string()) .handle("test@datadoghq.com".to_string()) .name("test user".to_string()); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api .update_user("test@datadoghq.com".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a user ``` /** * Update a user returns "User updated" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsersApi(configuration); const params: v1.UsersApiUpdateUserRequest = { body: { accessRole: "ro", disabled: false, email: "test@datadoghq.com", handle: "test@datadoghq.com", name: "test user", }, userHandle: "test@datadoghq.com", }; apiInstance .updateUser(params) .then((data: v1.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` PATCH https://api.ap1.datadoghq.com/api/v2/users/{user_id}https://api.ap2.datadoghq.com/api/v2/users/{user_id}https://api.datadoghq.eu/api/v2/users/{user_id}https://api.ddog-gov.com/api/v2/users/{user_id}https://api.datadoghq.com/api/v2/users/{user_id}https://api.us3.datadoghq.com/api/v2/users/{user_id}https://api.us5.datadoghq.com/api/v2/users/{user_id} ### Overview Edit a user. Can only be used with an application key belonging to an administrator user. This endpoint requires any of the following permissions: * `user_access_manage` * `service_account_write` OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Field Type Description data [_required_] object Object to update a user. attributes [_required_] object Attributes of the edited user. disabled boolean If the user is enabled or disabled. email string The email of the user. name string The name of the user. id [_required_] string ID of the user. type [_required_] enum Users resource type. 
Allowed enum values: `users` default: `users` ``` { "data": { "id": "string", "type": "users", "attributes": { "name": "updated", "disabled": true } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/users/#UpdateUser-200-v2) * [400](https://docs.datadoghq.com/api/latest/users/#UpdateUser-400-v2) * [403](https://docs.datadoghq.com/api/latest/users/#UpdateUser-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#UpdateUser-404-v2) * [422](https://docs.datadoghq.com/api/latest/users/#UpdateUser-422-v2) * [429](https://docs.datadoghq.com/api/latest/users/#UpdateUser-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Response containing information about a single user. Field Type Description data object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the user. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. 
display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Unprocessable Entity * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. 
Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Update a user returns "OK" response ``` # Path parameters export user_id="00000000-0000-9999-0000-000000000000" # Curl command curl -X PATCH "https://api.datadoghq.com/api/v2/users/${user_id}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "id": "string", "type": "users", "attributes": { "name": "updated", "disabled": true } } } EOF ``` ##### Update a user returns "OK" response ``` // Update a user returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.UserUpdateRequest{ Data: datadogV2.UserUpdateData{ Id: UserDataID, Type: datadogV2.USERSTYPE_USERS, Attributes: datadogV2.UserUpdateAttributes{ Name: datadog.PtrString("updated"), Disabled: datadog.PtrBool(true), }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.UpdateUser(ctx, UserDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.UpdateUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.UpdateUser`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a user returns "OK" response ``` // Update a user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.UserResponse; import com.datadog.api.client.v2.model.UserUpdateAttributes; import com.datadog.api.client.v2.model.UserUpdateData; import com.datadog.api.client.v2.model.UserUpdateRequest; import com.datadog.api.client.v2.model.UsersType; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID");
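// The request body below follows the JSON:API shape documented above: `id` must
// match the {user_id} path parameter, `type` is always "users", and only the
// attributes provided here (name, disabled) are sent in the update.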
UserUpdateRequest body = new UserUpdateRequest() .data( new UserUpdateData() .id(USER_DATA_ID) .type(UsersType.USERS) .attributes(new UserUpdateAttributes().name("updated").disabled(true))); try { UserResponse result = apiInstance.updateUser(USER_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#updateUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a user returns "OK" response ``` """ Update a user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi from datadog_api_client.v2.model.user_update_attributes import UserUpdateAttributes from datadog_api_client.v2.model.user_update_data import UserUpdateData from datadog_api_client.v2.model.user_update_request import UserUpdateRequest from datadog_api_client.v2.model.users_type import UsersType # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = UserUpdateRequest( data=UserUpdateData( id=USER_DATA_ID, type=UsersType.USERS, attributes=UserUpdateAttributes( name="updated", disabled=True, ), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.update_user(user_id=USER_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a user returns "OK" response ``` # Update a user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::UserUpdateRequest.new({ data: DatadogAPIClient::V2::UserUpdateData.new({ id: USER_DATA_ID, type: DatadogAPIClient::V2::UsersType::USERS, attributes: DatadogAPIClient::V2::UserUpdateAttributes.new({ name: "updated", disabled: true, }), }), }) p api_instance.update_user(USER_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a user returns "OK" response ``` // Update a user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; use datadog_api_client::datadogV2::model::UserUpdateAttributes; use datadog_api_client::datadogV2::model::UserUpdateData; use 
datadog_api_client::datadogV2::model::UserUpdateRequest; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = UserUpdateRequest::new(UserUpdateData::new( UserUpdateAttributes::new() .disabled(true) .name("updated".to_string()), user_data_id.clone(), UsersType::USERS, )); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.update_user(user_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a user returns "OK" response ``` /** * Update a user returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.UsersApiUpdateUserRequest = { body: { data: { id: USER_DATA_ID, type: "users", attributes: { name: "updated", disabled: true, }, }, }, userId: USER_DATA_ID, }; apiInstance .updateUser(params) .then((data: v2.UserResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Disable a user](https://docs.datadoghq.com/api/latest/users/#disable-a-user) * [v1](https://docs.datadoghq.com/api/latest/users/#disable-a-user-v1) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#disable-a-user-v2) DELETE https://api.ap1.datadoghq.com/api/v1/user/{user_handle}https://api.ap2.datadoghq.com/api/v1/user/{user_handle}https://api.datadoghq.eu/api/v1/user/{user_handle}https://api.ddog-gov.com/api/v1/user/{user_handle}https://api.datadoghq.com/api/v1/user/{user_handle}https://api.us3.datadoghq.com/api/v1/user/{user_handle}https://api.us5.datadoghq.com/api/v1/user/{user_handle} ### Overview Delete a user from an organization. **Note** : This endpoint can only be used with application keys belonging to administrators. ### Arguments #### Path Parameters Name Type Description user_handle [_required_] string The handle of the user. 
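Disabling through this endpoint is not necessarily permanent: the v1 update endpoint shown earlier accepts a `disabled` flag, which should allow a disabled user to be re-enabled by handle. A minimal TypeScript sketch under that assumption, using the same client setup as the official examples (the handle below is illustrative):

```
/**
 * Sketch only: re-enable a previously disabled user through the v1 update endpoint.
 */
import { client, v1 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v1.UsersApi(configuration);

const params: v1.UsersApiUpdateUserRequest = {
  userHandle: "test@datadoghq.com", // illustrative handle
  body: {
    disabled: false, // clear the disabled flag set by the DELETE call documented here
  },
};

apiInstance
  .updateUser(params)
  .then((data: v1.UserResponse) => console.log(JSON.stringify(data)))
  .catch((error: any) => console.error(error));
```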
### Response * [200](https://docs.datadoghq.com/api/latest/users/#DisableUser-200-v1) * [400](https://docs.datadoghq.com/api/latest/users/#DisableUser-400-v1) * [403](https://docs.datadoghq.com/api/latest/users/#DisableUser-403-v1) * [404](https://docs.datadoghq.com/api/latest/users/#DisableUser-404-v1) * [429](https://docs.datadoghq.com/api/latest/users/#DisableUser-429-v1) User disabled * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Array of user disabled for a given organization. Expand All Field Type Description message string Information pertaining to a user disabled for a given organization. ``` { "message": "string" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Disable a user ``` # Path parameters export user_handle="test@datadoghq.com" # Curl command curl -X DELETE "https://api.datadoghq.com/api/v1/user/${user_handle}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Disable a user ``` """ Disable a user returns "User disabled" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.users_api import UsersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.disable_user( user_handle="test@datadoghq.com", ) print(response) ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Disable a user ``` # Disable a user returns "User disabled" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::UsersAPI.new p api_instance.disable_user("test@datadoghq.com") ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Disable a user ``` // Disable a user returns "User disabled" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewUsersApi(apiClient) resp, r, err := api.DisableUser(ctx, "test@datadoghq.com") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.DisableUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.DisableUser`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Disable
a user ``` // Disable a user returns "User disabled" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.UsersApi; import com.datadog.api.client.v1.model.UserDisableResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); try { UserDisableResponse result = apiInstance.disableUser("test@datadoghq.com"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#disableUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Disable a user ``` // Disable a user returns "User disabled" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_users::UsersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.disable_user("test@datadoghq.com".to_string()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Disable a user ``` /** * Disable a user returns "User disabled" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.UsersApi(configuration); const params: v1.UsersApiDisableUserRequest = { userHandle: "test@datadoghq.com", }; apiInstance .disableUser(params) .then((data: v1.UserDisableResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` DELETE https://api.ap1.datadoghq.com/api/v2/users/{user_id}https://api.ap2.datadoghq.com/api/v2/users/{user_id}https://api.datadoghq.eu/api/v2/users/{user_id}https://api.ddog-gov.com/api/v2/users/{user_id}https://api.datadoghq.com/api/v2/users/{user_id}https://api.us3.datadoghq.com/api/v2/users/{user_id}https://api.us5.datadoghq.com/api/v2/users/{user_id} ### Overview Disable a user. Can only be used with an application key belonging to an administrator user. 
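A successful call returns 204 with no response body (see the response list below), so one way to confirm the change is to read the user back and check its `disabled` attribute. A TypeScript sketch under that assumption, reusing the v2 client from the examples above (the `USER_DATA_ID` environment variable mirrors the official samples):

```
/**
 * Sketch only: disable a user, then read it back to confirm the `disabled` attribute.
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.UsersApi(configuration);

// USER_DATA_ID mirrors the environment variable used by the official examples.
const userId = process.env.USER_DATA_ID as string;

async function disableAndVerify(): Promise<void> {
  await apiInstance.disableUser({ userId }); // 204 No Content on success
  const user = await apiInstance.getUser({ userId });
  console.log("disabled:", user.data?.attributes?.disabled);
}

disableAndVerify().catch((error: any) => console.error(error));
```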
This endpoint requires any of the following permissions: * `user_access_manage` * `service_account_write` OAuth apps require the `user_access_manage` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Response * [204](https://docs.datadoghq.com/api/latest/users/#DisableUser-204-v2) * [403](https://docs.datadoghq.com/api/latest/users/#DisableUser-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#DisableUser-404-v2) * [429](https://docs.datadoghq.com/api/latest/users/#DisableUser-429-v2) OK Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Disable a user Copy ``` # Path parameters export user_id="00000000-0000-9999-0000-000000000000" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/users/${user_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Disable a user ``` """ Disable a user returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) api_instance.disable_user( user_id=USER_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Disable a user ``` # Disable a user returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] api_instance.disable_user(USER_DATA_ID) ``` Copy #### Instructions 
First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Disable a user ``` // Disable a user returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) r, err := api.DisableUser(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.DisableUser`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Disable a user ``` // Disable a user returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { apiInstance.disableUser(USER_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#disableUser"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Disable a user ``` // Disable a user returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.disable_user(user_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Disable a user ``` /** * Disable a user 
returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.UsersApiDisableUserRequest = { userId: USER_DATA_ID, }; apiInstance .disableUser(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a user organization](https://docs.datadoghq.com/api/latest/users/#get-a-user-organization) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#get-a-user-organization-v2) GET https://api.ap1.datadoghq.com/api/v2/users/{user_id}/orgshttps://api.ap2.datadoghq.com/api/v2/users/{user_id}/orgshttps://api.datadoghq.eu/api/v2/users/{user_id}/orgshttps://api.ddog-gov.com/api/v2/users/{user_id}/orgshttps://api.datadoghq.com/api/v2/users/{user_id}/orgshttps://api.us3.datadoghq.com/api/v2/users/{user_id}/orgshttps://api.us5.datadoghq.com/api/v2/users/{user_id}/orgs ### Overview Get a user organization. Returns the user information and all organizations joined by this user. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#ListUserOrganizations-200-v2) * [403](https://docs.datadoghq.com/api/latest/users/#ListUserOrganizations-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#ListUserOrganizations-404-v2) * [429](https://docs.datadoghq.com/api/latest/users/#ListUserOrganizations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Response containing information about a single user. Field Type Description data object User object returned by the API. attributes object Attributes of user object returned by the API. created_at date-time Creation time of the user. disabled boolean Whether the user is disabled. email string Email of the user. handle string Handle of the user. icon string URL of the user's icon. last_login_time date-time The last time the user logged in. mfa_enabled boolean If user has MFA enabled. modified_at date-time Time that the user was last modified. name string Name of the user. service_account boolean Whether the user is a service account. status string Status of the user. title string Title of the user. verified boolean Whether the user is verified. id string ID of the user. relationships object Relationships of the user object returned by the API. org object Relationship to an organization. data [_required_] object Relationship to organization object. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` other_orgs object Relationship to organizations. data [_required_] [object] Relationships to organization objects. id [_required_] string ID of the organization. type [_required_] enum Organizations resource type. 
Allowed enum values: `orgs` default: `orgs` other_users object Relationship to users. data [_required_] [object] Relationships to user objects. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` roles object Relationship to roles. data [object] An array containing type and the unique identifier of a role. id string The unique identifier of the role. type enum Roles type. Allowed enum values: `roles` default: `roles` type enum Users resource type. Allowed enum values: `users` default: `users` included [ ] Array of objects related to the user. Option 1 object Organization object. attributes object Attributes of the organization. created_at date-time Creation time of the organization. description string Description of the organization. disabled boolean Whether or not the organization is disabled. modified_at date-time Time of last organization modification. name string Name of the organization. public_id string Public ID of the organization. sharing string Sharing type of the organization. url string URL of the site that this organization exists at. id string ID of the organization. type [_required_] enum Organizations resource type. Allowed enum values: `orgs` default: `orgs` Option 2 object Permission object. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` Option 3 object Role object returned by the API. attributes object Attributes of the role. created_at date-time Creation time of the role. modified_at date-time Time of last role modification. name string The name of the role. The name is neither unique nor a stable identifier of the role. user_count int64 Number of users with that role. id string The unique identifier of the role. relationships object Relationships of the role object returned by the API. permissions object Relationship to multiple permissions objects. data [object] Relationships to permission objects. id string ID of the permission. type enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` type [_required_] enum Roles type. 
Allowed enum values: `roles` default: `roles` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "disabled": false, "email": "string", "handle": "string", "icon": "string", "last_login_time": "2019-09-19T10:00:00.000Z", "mfa_enabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "service_account": false, "status": "string", "title": "string", "verified": false }, "id": "string", "relationships": { "org": { "data": { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } }, "other_orgs": { "data": [ { "id": "00000000-0000-beef-0000-000000000000", "type": "orgs" } ] }, "other_users": { "data": [ { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } ] }, "roles": { "data": [ { "id": "3653d3c6-0c75-11ea-ad28-fb5701eabc7d", "type": "roles" } ] } }, "type": "users" }, "included": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "description": "string", "disabled": false, "modified_at": "2019-09-19T10:00:00.000Z", "name": "string", "public_id": "string", "sharing": "string", "url": "string" }, "id": "string", "type": "orgs" } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Get a user organization Copy ``` # Path parameters export user_id="00000000-0000-9999-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/users/${user_id}/orgs" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a user organization ``` """ Get a user organization returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.list_user_organizations( user_id="00000000-0000-9999-0000-000000000000", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a user organization ``` # Get a user organization returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new p api_instance.list_user_organizations("00000000-0000-9999-0000-000000000000") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a user organization ``` // Get a user organization returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.ListUserOrganizations(ctx, "00000000-0000-9999-0000-000000000000") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.ListUserOrganizations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.ListUserOrganizations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` 
DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a user organization ``` // Get a user organization returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.UserResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); try { UserResponse result = apiInstance.listUserOrganizations("00000000-0000-9999-0000-000000000000"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#listUserOrganizations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a user organization ``` // Get a user organization returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api .list_user_organizations("00000000-0000-9999-0000-000000000000".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a user organization ``` /** * Get a user organization returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); const params: v2.UsersApiListUserOrganizationsRequest = { userId: "00000000-0000-9999-0000-000000000000", }; apiInstance .listUserOrganizations(params) .then((data: v2.UserResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a user permissions](https://docs.datadoghq.com/api/latest/users/#get-a-user-permissions) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#get-a-user-permissions-v2) GET https://api.ap1.datadoghq.com/api/v2/users/{user_id}/permissionshttps://api.ap2.datadoghq.com/api/v2/users/{user_id}/permissionshttps://api.datadoghq.eu/api/v2/users/{user_id}/permissionshttps://api.ddog-gov.com/api/v2/users/{user_id}/permissionshttps://api.datadoghq.com/api/v2/users/{user_id}/permissionshttps://api.us3.datadoghq.com/api/v2/users/{user_id}/permissionshttps://api.us5.datadoghq.com/api/v2/users/{user_id}/permissions ### Overview Get a user permission set. Returns a list of the user’s permissions granted by the associated user’s roles. This endpoint requires the `user_access_read` permission. OAuth apps require the `user_access_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_id [_required_] string The ID of the user. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#ListUserPermissions-200-v2) * [403](https://docs.datadoghq.com/api/latest/users/#ListUserPermissions-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#ListUserPermissions-404-v2) * [429](https://docs.datadoghq.com/api/latest/users/#ListUserPermissions-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Payload with API-returned permissions. Field Type Description data [object] Array of permissions. attributes object Attributes of a permission. created date-time Creation time of the permission. description string Description of the permission. display_name string Displayed name for the permission. display_type string Display type. group_name string Name of the permission group. name string Name of the permission. restricted boolean Whether or not the permission is restricted. id string ID of the permission. type [_required_] enum Permissions resource type. Allowed enum values: `permissions` default: `permissions` ``` { "data": [ { "attributes": { "created": "2019-09-19T10:00:00.000Z", "description": "string", "display_name": "string", "display_type": "string", "group_name": "string", "name": "string", "restricted": false }, "id": "string", "type": "permissions" } ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Get a user permissions Copy ``` # Path parameters export user_id="00000000-0000-9999-0000-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/users/${user_id}/permissions" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a user permissions ``` """ Get a user permissions returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.list_user_permissions( user_id=USER_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a user permissions ``` # Get a user permissions returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] p api_instance.list_user_permissions(USER_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a user permissions ``` // Get a user permissions returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.ListUserPermissions(ctx, UserDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.ListUserPermissions`: %v\n", 
err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.ListUserPermissions`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a user permissions ``` // Get a user permissions returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.PermissionsResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); try { PermissionsResponse result = apiInstance.listUserPermissions(USER_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#listUserPermissions"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a user permissions ``` // Get a user permissions returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.list_user_permissions(user_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a user permissions ``` /** * Get a user permissions returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.UsersApiListUserPermissionsRequest = { userId: USER_DATA_ID, }; apiInstance .listUserPermissions(params) .then((data: v2.PermissionsResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Send invitation emails](https://docs.datadoghq.com/api/latest/users/#send-invitation-emails) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#send-invitation-emails-v2) POST https://api.ap1.datadoghq.com/api/v2/user_invitationshttps://api.ap2.datadoghq.com/api/v2/user_invitationshttps://api.datadoghq.eu/api/v2/user_invitationshttps://api.ddog-gov.com/api/v2/user_invitationshttps://api.datadoghq.com/api/v2/user_invitationshttps://api.us3.datadoghq.com/api/v2/user_invitationshttps://api.us5.datadoghq.com/api/v2/user_invitations ### Overview Sends emails to one or more users inviting them to join the organization. This endpoint requires the `user_access_invite` permission. OAuth apps require the `user_access_invite` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) Field Type Description data [_required_] [object] List of user invitations. relationships [_required_] object Relationships data for user invitation. user [_required_] object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type [_required_] enum User invitations type. Allowed enum values: `user_invitations` default: `user_invitations` ``` { "data": [ { "type": "user_invitations", "relationships": { "user": { "data": { "type": "users", "id": "string" } } } } ] } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/users/#SendInvitations-201-v2) * [400](https://docs.datadoghq.com/api/latest/users/#SendInvitations-400-v2) * [403](https://docs.datadoghq.com/api/latest/users/#SendInvitations-403-v2) * [429](https://docs.datadoghq.com/api/latest/users/#SendInvitations-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) User invitations as returned by the API. Field Type Description data [object] Array of user invitations. attributes object Attributes of a user invitation. created_at date-time Creation time of the user invitation. expires_at date-time Time of invitation expiration. invite_type string Type of invitation. uuid string UUID of the user invitation. id string ID of the user invitation. relationships object Relationships data for user invitation. user [_required_] object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum User invitations type. 
Allowed enum values: `user_invitations` default: `user_invitations` ``` { "data": [ { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "expires_at": "2019-09-19T10:00:00.000Z", "invite_type": "string", "uuid": "string" }, "id": "string", "relationships": { "user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "user_invitations" } ] } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Send invitation emails returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/user_invitations" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": [ { "type": "user_invitations", "relationships": { "user": { "data": { "type": "users", "id": "string" } } } } ] } EOF ``` ##### Send invitation emails returns "OK" response ``` // Send invitation emails returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "user" in the system UserDataID := os.Getenv("USER_DATA_ID") body := datadogV2.UserInvitationsRequest{ Data: []datadogV2.UserInvitationData{ { Type: datadogV2.USERINVITATIONSTYPE_USER_INVITATIONS, Relationships: datadogV2.UserInvitationRelationships{ User: datadogV2.RelationshipToUser{ Data: datadogV2.RelationshipToUserData{ Type: datadogV2.USERSTYPE_USERS, Id: UserDataID, }, }, }, }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.SendInvitations(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.SendInvitations`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`UsersApi.SendInvitations`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Send invitation emails returns "OK" response ``` // Send invitation emails returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.RelationshipToUser; import com.datadog.api.client.v2.model.RelationshipToUserData; import com.datadog.api.client.v2.model.UserInvitationData; import com.datadog.api.client.v2.model.UserInvitationRelationships; import com.datadog.api.client.v2.model.UserInvitationsRequest; import com.datadog.api.client.v2.model.UserInvitationsResponse; import com.datadog.api.client.v2.model.UserInvitationsType; import com.datadog.api.client.v2.model.UsersType; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // there is a valid "user" in the system String USER_DATA_ID = System.getenv("USER_DATA_ID"); UserInvitationsRequest body = new UserInvitationsRequest() .data( Collections.singletonList( new UserInvitationData() .type(UserInvitationsType.USER_INVITATIONS) .relationships( new UserInvitationRelationships() .user( new RelationshipToUser() .data( new RelationshipToUserData() .type(UsersType.USERS) .id(USER_DATA_ID)))))); try { UserInvitationsResponse result = apiInstance.sendInvitations(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#sendInvitations"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Send invitation emails returns "OK" response ``` """ Send invitation emails returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi from datadog_api_client.v2.model.relationship_to_user import RelationshipToUser from datadog_api_client.v2.model.relationship_to_user_data import RelationshipToUserData from datadog_api_client.v2.model.user_invitation_data import UserInvitationData from datadog_api_client.v2.model.user_invitation_relationships import UserInvitationRelationships from datadog_api_client.v2.model.user_invitations_request import UserInvitationsRequest from datadog_api_client.v2.model.user_invitations_type import UserInvitationsType from datadog_api_client.v2.model.users_type import UsersType # there is a valid "user" in the system USER_DATA_ID = environ["USER_DATA_ID"] body = UserInvitationsRequest( data=[ UserInvitationData( type=UserInvitationsType.USER_INVITATIONS, 
relationships=UserInvitationRelationships( user=RelationshipToUser( data=RelationshipToUserData( type=UsersType.USERS, id=USER_DATA_ID, ), ), ), ), ], ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.send_invitations(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Send invitation emails returns "OK" response ``` # Send invitation emails returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # there is a valid "user" in the system USER_DATA_ID = ENV["USER_DATA_ID"] body = DatadogAPIClient::V2::UserInvitationsRequest.new({ data: [ DatadogAPIClient::V2::UserInvitationData.new({ type: DatadogAPIClient::V2::UserInvitationsType::USER_INVITATIONS, relationships: DatadogAPIClient::V2::UserInvitationRelationships.new({ user: DatadogAPIClient::V2::RelationshipToUser.new({ data: DatadogAPIClient::V2::RelationshipToUserData.new({ type: DatadogAPIClient::V2::UsersType::USERS, id: USER_DATA_ID, }), }), }), }), ], }) p api_instance.send_invitations(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Send invitation emails returns "OK" response ``` // Send invitation emails returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; use datadog_api_client::datadogV2::model::RelationshipToUser; use datadog_api_client::datadogV2::model::RelationshipToUserData; use datadog_api_client::datadogV2::model::UserInvitationData; use datadog_api_client::datadogV2::model::UserInvitationRelationships; use datadog_api_client::datadogV2::model::UserInvitationsRequest; use datadog_api_client::datadogV2::model::UserInvitationsType; use datadog_api_client::datadogV2::model::UsersType; #[tokio::main] async fn main() { // there is a valid "user" in the system let user_data_id = std::env::var("USER_DATA_ID").unwrap(); let body = UserInvitationsRequest::new(vec![UserInvitationData::new( UserInvitationRelationships::new(RelationshipToUser::new(RelationshipToUserData::new( user_data_id.clone(), UsersType::USERS, ))), UserInvitationsType::USER_INVITATIONS, )]); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.send_invitations(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Send invitation emails returns "OK" response ``` /** * Send invitation emails returns "OK" response */ 
import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // there is a valid "user" in the system const USER_DATA_ID = process.env.USER_DATA_ID as string; const params: v2.UsersApiSendInvitationsRequest = { body: { data: [ { type: "user_invitations", relationships: { user: { data: { type: "users", id: USER_DATA_ID, }, }, }, }, ], }, }; apiInstance .sendInvitations(params) .then((data: v2.UserInvitationsResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a user invitation](https://docs.datadoghq.com/api/latest/users/#get-a-user-invitation) * [v2 (latest)](https://docs.datadoghq.com/api/latest/users/#get-a-user-invitation-v2) GET https://api.ap1.datadoghq.com/api/v2/user_invitations/{user_invitation_uuid}https://api.ap2.datadoghq.com/api/v2/user_invitations/{user_invitation_uuid}https://api.datadoghq.eu/api/v2/user_invitations/{user_invitation_uuid}https://api.ddog-gov.com/api/v2/user_invitations/{user_invitation_uuid}https://api.datadoghq.com/api/v2/user_invitations/{user_invitation_uuid}https://api.us3.datadoghq.com/api/v2/user_invitations/{user_invitation_uuid}https://api.us5.datadoghq.com/api/v2/user_invitations/{user_invitation_uuid} ### Overview Returns a single user invitation by its UUID. This endpoint requires the `user_access_invite` permission. OAuth apps require the `user_access_invite` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#users) to access this endpoint. ### Arguments #### Path Parameters Name Type Description user_invitation_uuid [_required_] string The UUID of the user invitation. ### Response * [200](https://docs.datadoghq.com/api/latest/users/#GetInvitation-200-v2) * [403](https://docs.datadoghq.com/api/latest/users/#GetInvitation-403-v2) * [404](https://docs.datadoghq.com/api/latest/users/#GetInvitation-404-v2) * [429](https://docs.datadoghq.com/api/latest/users/#GetInvitation-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) User invitation as returned by the API. Field Type Description data object Object of a user invitation returned by the API. attributes object Attributes of a user invitation. created_at date-time Creation time of the user invitation. expires_at date-time Time of invitation expiration. invite_type string Type of invitation. uuid string UUID of the user invitation. id string ID of the user invitation. relationships object Relationships data for user invitation. user [_required_] object Relationship to user. data [_required_] object Relationship to user object. id [_required_] string A unique identifier that represents the user. type [_required_] enum Users resource type. Allowed enum values: `users` default: `users` type enum User invitations type. 
Allowed enum values: `user_invitations` default: `user_invitations` ``` { "data": { "attributes": { "created_at": "2019-09-19T10:00:00.000Z", "expires_at": "2019-09-19T10:00:00.000Z", "invite_type": "string", "uuid": "string" }, "id": "string", "relationships": { "user": { "data": { "id": "00000000-0000-0000-2345-000000000000", "type": "users" } } }, "type": "user_invitations" } } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/users/) * [Example](https://docs.datadoghq.com/api/latest/users/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/users/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/users/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/users/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/users/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/users/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/users/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/users/?code-lang=typescript) ##### Get a user invitation Copy ``` # Path parameters export user_invitation_uuid="00000000-0000-0000-3456-000000000000" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/user_invitations/${user_invitation_uuid}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a user invitation ``` """ Get a user invitation returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.users_api import UsersApi # the "user" has a "user_invitation" USER_INVITATION_ID = environ["USER_INVITATION_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = UsersApi(api_client) response = api_instance.get_invitation( user_invitation_uuid=USER_INVITATION_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a user invitation ``` # Get a user invitation returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::UsersAPI.new # the "user" has a "user_invitation" USER_INVITATION_ID = ENV["USER_INVITATION_ID"] p api_instance.get_invitation(USER_INVITATION_ID) ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a user invitation ``` // Get a user invitation returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // the "user" has a "user_invitation" UserInvitationID := os.Getenv("USER_INVITATION_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewUsersApi(apiClient) resp, r, err := api.GetInvitation(ctx, UserInvitationID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `UsersApi.GetInvitation`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `UsersApi.GetInvitation`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a user invitation ``` // Get a user invitation returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.UsersApi; import com.datadog.api.client.v2.model.UserInvitationResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); UsersApi apiInstance = new UsersApi(defaultClient); // the "user" has a "user_invitation" String USER_INVITATION_ID = System.getenv("USER_INVITATION_ID"); try { UserInvitationResponse result = apiInstance.getInvitation(USER_INVITATION_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling UsersApi#getInvitation"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a user invitation ``` // Get a user invitation returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_users::UsersAPI; #[tokio::main] async fn main() { // the "user" has a "user_invitation" let user_invitation_id = std::env::var("USER_INVITATION_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = UsersAPI::with_config(configuration); let resp = api.get_invitation(user_invitation_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its 
dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a user invitation ``` /** * Get a user invitation returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.UsersApi(configuration); // the "user" has a "user_invitation" const USER_INVITATION_ID = process.env.USER_INVITATION_ID as string; const params: v2.UsersApiGetInvitationRequest = { userInvitationUuid: USER_INVITATION_ID, }; apiInstance .getInvitation(params) .then((data: v2.UserInvitationResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/using-the-api ## [Using the API](https://docs.datadoghq.com/api/latest/using-the-api/#using-the-api) Use the Datadog HTTP API to access the Datadog platform programmatically. You can use the API to send data to Datadog, build data visualizations, and manage your account. ## [Send data to Datadog](https://docs.datadoghq.com/api/latest/using-the-api/#send-data-to-datadog) Use the API to begin to send integrations data to Datadog. With some additional setup of the Agent, you can also use the API to send Synthetic test data, Logs, and Traces to Datadog.
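As a quick illustration of the pattern used throughout this reference, the sketch below submits a single custom metric point through the v2 Metrics endpoint with the same Python client as the other examples on this page; the metric name, value, and host resource are placeholder values, not part of any endpoint reference above.

```python
"""
Submit one custom metric point (illustrative sketch; metric and host names are placeholders)
"""
from datetime import datetime

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.metrics_api import MetricsApi
from datadog_api_client.v2.model.metric_intake_type import MetricIntakeType
from datadog_api_client.v2.model.metric_payload import MetricPayload
from datadog_api_client.v2.model.metric_point import MetricPoint
from datadog_api_client.v2.model.metric_resource import MetricResource
from datadog_api_client.v2.model.metric_series import MetricSeries

# One gauge point, timestamped "now", attached to a placeholder host resource
body = MetricPayload(
    series=[
        MetricSeries(
            metric="example.app.request_latency",  # placeholder metric name
            type=MetricIntakeType.GAUGE,
            points=[
                MetricPoint(
                    timestamp=int(datetime.now().timestamp()),
                    value=0.42,
                ),
            ],
            resources=[
                MetricResource(name="example-host", type="host"),  # placeholder resource
            ],
        ),
    ],
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = MetricsApi(api_client)
    response = api_instance.submit_metrics(body=body)
    print(response)
```

Run it the same way as the other Python examples on this page, with `DD_SITE` and `DD_API_KEY` exported; metric submission authenticates with the API key alone, so an application key is not required for this particular call.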
**Integrations endpoints** Available integrations endpoints: * [AWS Integration](https://docs.datadoghq.com/api/v1/aws-integration/) * [AWS Logs Integration](https://docs.datadoghq.com/api/v1/aws-logs-integration/) * [Azure Integration](https://docs.datadoghq.com/api/v1/azure-integration/) * [Google Cloud Integration](https://docs.datadoghq.com/api/v1/gcp-integration/) * [Slack Integration](https://docs.datadoghq.com/api/v1/slack-integration/) * [PagerDuty Integration](https://docs.datadoghq.com/api/v1/pagerduty-integration/) * [Webhooks Integration](https://docs.datadoghq.com/api/v1/webhooks-integration/) **Platform endpoints** Use these endpoints to post and fetch data to and from other parts of the Datadog platform: * The [metrics](https://docs.datadoghq.com/api/v1/metrics/) endpoints allow you to post [metrics](https://docs.datadoghq.com/metrics/introduction/) data so it can be graphed on Datadog’s dashboards and query metrics from any time period. * The [events](https://docs.datadoghq.com/api/v1/events/) endpoints allow you to post and fetch events to and from the [Datadog event explorer](https://docs.datadoghq.com/events/). * Use the [Synthetic Monitoring](https://docs.datadoghq.com/api/v1/synthetics/) endpoints to create, start, stop, and see [Synthetic tests](https://docs.datadoghq.com/synthetics/) results. * Use the [Tracing Agent API](https://docs.datadoghq.com/tracing/guide/send_traces_to_agent_by_api/) to send traces to your Datadog Agent, which then forwards them to Datadog. * Use the [LLM Observability Export API](https://docs.datadoghq.com/llm_observability/evaluations/export_api) to access your LLM Observability data for running external evaluations and exporting spans for offline storage. ## [Visualize your data](https://docs.datadoghq.com/api/latest/using-the-api/#visualize-your-data) Once you are sending data to Datadog, you can use the API to build data visualizations programmatically: * Build [Dashboards](https://docs.datadoghq.com/api/v1/dashboards/) and view [Dashboard Lists](https://docs.datadoghq.com/api/v1/dashboard-lists/) * Manage [host tags](https://docs.datadoghq.com/api/v1/hosts/) * Create [Embeddable Graphs](https://docs.datadoghq.com/api/v1/embeddable-graphs/) * Take a [graph snapshot](https://docs.datadoghq.com/api/v1/snapshots/) * [Service Dependencies](https://docs.datadoghq.com/api/v1/service-dependencies/) - see a list of your APM services and their dependencies * Create [Monitors](https://docs.datadoghq.com/api/v1/monitors/) * [Service Checks](https://docs.datadoghq.com/api/v1/service-checks/) - post check statuses for use with monitors * Create and manage [Logs](https://docs.datadoghq.com/api/v1/logs/), [Logs Indexes](https://docs.datadoghq.com/api/v1/logs-indexes/), and [Logs Pipelines](https://docs.datadoghq.com/api/v1/logs-pipelines/) * Get [Host](https://docs.datadoghq.com/api/v1/hosts/) information for your organization * Create and manage [Service Level Objectives](https://docs.datadoghq.com/api/v1/service-level-objectives/) * Generate [Security Monitoring](https://docs.datadoghq.com/api/v2/security-monitoring/) signals ## [Manage your account](https://docs.datadoghq.com/api/latest/using-the-api/#manage-your-account) You can also use the Datadog API to manage your account programmatically: * Manage [Users](https://docs.datadoghq.com/api/v1/users/) * Manage [Roles](https://docs.datadoghq.com/api/v1/roles/) * Manage your [Organization](https://docs.datadoghq.com/api/v1/organizations/) * Verify API and app keys with the 
[Authentication](https://docs.datadoghq.com/api/v1/authentication/) endpoint * Grant specific logs access with the [Logs Restriction Queries](https://docs.datadoghq.com/api/v2/logs-restriction-queries/) * Manage existing keys with [Key Management](https://docs.datadoghq.com/api/v1/key-management/) * Get hourly, daily, and monthly usage across multiple facets of Datadog with the [Usage Metering](https://docs.datadoghq.com/api/v1/usage-metering/) endpoints * See the list of IP prefixes belonging to Datadog with [IP Ranges](https://docs.datadoghq.com/api/v1/ip-ranges/) --- # Source: https://docs.datadoghq.com/api/latest/webhooks-integration # Webhooks Integration Configure your Datadog-Webhooks integration directly through the Datadog API. See the [Webhooks integration page](https://docs.datadoghq.com/integrations/webhooks) for more information. ## [Create a webhooks integration](https://docs.datadoghq.com/api/latest/webhooks-integration/#create-a-webhooks-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#create-a-webhooks-integration-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/webhookshttps://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/webhookshttps://api.datadoghq.eu/api/v1/integration/webhooks/configuration/webhookshttps://api.ddog-gov.com/api/v1/integration/webhooks/configuration/webhookshttps://api.datadoghq.com/api/v1/integration/webhooks/configuration/webhookshttps://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/webhookshttps://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks ### Overview Creates an endpoint with the name ``. This endpoint requires the `create_webhooks` permission. OAuth apps require the `create_webhooks` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#webhooks-integration) to access this endpoint. ### Request #### Body Data (required) Create a webhooks integration request body. * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Expand All Field Type Description custom_headers string If `null`, uses no header.
If given a JSON payload, these will be headers attached to your webhook. encode_as enum Encoding type. Can be given either `json` or `form`. Allowed enum values: `json,form` default: `json` name [_required_] string The name of the webhook. It corresponds with ``. Learn more on how to use it in [monitor notifications](https://docs.datadoghq.com/monitors/notify). payload string If `null`, uses the default payload. If given a JSON payload, the webhook returns the payload specified by the given payload. [Webhooks variable usage](https://docs.datadoghq.com/integrations/webhooks/#usage). url [_required_] string URL of the webhook. ``` { "name": "Example-Webhooks-Integration", "url": "https://example.com/webhook" } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegration-201-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegration-403-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Datadog-Webhooks integration. Expand All Field Type Description custom_headers string If `null`, uses no header. If given a JSON payload, these will be headers attached to your webhook. encode_as enum Encoding type. Can be given either `json` or `form`. Allowed enum values: `json,form` default: `json` name [_required_] string The name of the webhook. It corresponds with ``. Learn more on how to use it in [monitor notifications](https://docs.datadoghq.com/monitors/notify). payload string If `null`, uses the default payload. If given a JSON payload, the webhook returns the payload specified by the given payload. [Webhooks variable usage](https://docs.datadoghq.com/integrations/webhooks/#usage). url [_required_] string URL of the webhook. ``` { "custom_headers": "string", "encode_as": "string", "name": "WEBHOOK_NAME", "payload": "string", "url": "https://example.com/webhook" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Create a webhooks integration returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "name": "Example-Webhooks-Integration", "url": "https://example.com/webhook" } EOF ``` ##### Create a webhooks integration returns "OK" response ``` // Create a webhooks integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.WebhooksIntegration{ Name: "Example-Webhooks-Integration", Url: "https://example.com/webhook", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.CreateWebhooksIntegration(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.CreateWebhooksIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WebhooksIntegrationApi.CreateWebhooksIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a webhooks integration returns "OK" response ``` // Create a webhooks integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegration; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); WebhooksIntegration body = new WebhooksIntegration() .name("Example-Webhooks-Integration") .url("https://example.com/webhook"); try { WebhooksIntegration result = apiInstance.createWebhooksIntegration(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WebhooksIntegrationApi#createWebhooksIntegration"); 
System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a webhooks integration returns "OK" response ``` """ Create a webhooks integration returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi from datadog_api_client.v1.model.webhooks_integration import WebhooksIntegration body = WebhooksIntegration( name="Example-Webhooks-Integration", url="https://example.com/webhook", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.create_webhooks_integration(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a webhooks integration returns "OK" response ``` # Create a webhooks integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new body = DatadogAPIClient::V1::WebhooksIntegration.new({ name: "Example-Webhooks-Integration", url: "https://example.com/webhook", }) p api_instance.create_webhooks_integration(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a webhooks integration returns "OK" response ``` // Create a webhooks integration returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; use datadog_api_client::datadogV1::model::WebhooksIntegration; #[tokio::main] async fn main() { let body = WebhooksIntegration::new( "Example-Webhooks-Integration".to_string(), "https://example.com/webhook".to_string(), ); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api.create_webhooks_integration(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a webhooks integration returns "OK" response ``` /** * Create a webhooks integration 
returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); const params: v1.WebhooksIntegrationApiCreateWebhooksIntegrationRequest = { body: { name: "Example-Webhooks-Integration", url: "https://example.com/webhook", }, }; apiInstance .createWebhooksIntegration(params) .then((data: v1.WebhooksIntegration) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a webhook integration](https://docs.datadoghq.com/api/latest/webhooks-integration/#get-a-webhook-integration) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#get-a-webhook-integration-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name} ### Overview Gets the content of the webhook with the name ``. This endpoint requires the `integrations_read` permission. ### Arguments #### Path Parameters Name Type Description webhook_name [_required_] string The name of the webhook. ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegration-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegration-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Datadog-Webhooks integration. Expand All Field Type Description custom_headers string If `null`, uses no header. If given a JSON payload, these will be headers attached to your webhook. encode_as enum Encoding type. Can be given either `json` or `form`. Allowed enum values: `json,form` default: `json` name [_required_] string The name of the webhook. It corresponds with ``. Learn more on how to use it in [monitor notifications](https://docs.datadoghq.com/monitors/notify). payload string If `null`, uses the default payload. If given a JSON payload, the webhook returns the payload specified by the given payload. [Webhooks variable usage](https://docs.datadoghq.com/integrations/webhooks/#usage). url [_required_] string URL of the webhook. 
``` { "custom_headers": "string", "encode_as": "string", "name": "WEBHOOK_NAME", "payload": "string", "url": "https://example.com/webhook" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Get a webhook integration Copy ``` # Path parameters export webhook_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/${webhook_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a webhook integration ``` """ Get a webhook integration returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi # there is a valid "webhook" in the system WEBHOOK_NAME = environ["WEBHOOK_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.get_webhooks_integration( webhook_name=WEBHOOK_NAME, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a webhook integration ``` 
# Get a webhook integration returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new # there is a valid "webhook" in the system WEBHOOK_NAME = ENV["WEBHOOK_NAME"] p api_instance.get_webhooks_integration(WEBHOOK_NAME) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a webhook integration ``` // Get a webhook integration returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "webhook" in the system WebhookName := os.Getenv("WEBHOOK_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.GetWebhooksIntegration(ctx, WebhookName) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.GetWebhooksIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WebhooksIntegrationApi.GetWebhooksIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a webhook integration ``` // Get a webhook integration returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegration; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); // there is a valid "webhook" in the system String WEBHOOK_NAME = System.getenv("WEBHOOK_NAME"); try { WebhooksIntegration result = apiInstance.getWebhooksIntegration(WEBHOOK_NAME); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WebhooksIntegrationApi#getWebhooksIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a webhook integration ``` // Get a webhook integration returns "OK" response use datadog_api_client::datadog; use 
datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "webhook" in the system let webhook_name = std::env::var("WEBHOOK_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api.get_webhooks_integration(webhook_name.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a webhook integration ``` /** * Get a webhook integration returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); // there is a valid "webhook" in the system const WEBHOOK_NAME = process.env.WEBHOOK_NAME as string; const params: v1.WebhooksIntegrationApiGetWebhooksIntegrationRequest = { webhookName: WEBHOOK_NAME, }; apiInstance .getWebhooksIntegration(params) .then((data: v1.WebhooksIntegration) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a webhook](https://docs.datadoghq.com/api/latest/webhooks-integration/#update-a-webhook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#update-a-webhook-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name} ### Overview Updates the endpoint with the name ``. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description webhook_name [_required_] string The name of the webhook. ### Request #### Body Data (required) Update an existing Datadog-Webhooks integration. * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Expand All Field Type Description custom_headers string If `null`, uses no header. If given a JSON payload, these will be headers attached to your webhook. encode_as enum Encoding type. Can be given either `json` or `form`. 
Allowed enum values: `json,form` default: `json` name string The name of the webhook. It corresponds with ``. Learn more on how to use it in [monitor notifications](https://docs.datadoghq.com/monitors/notify). payload string If `null`, uses the default payload. If given a JSON payload, the webhook returns the payload specified by the given payload. [Webhooks variable usage](https://docs.datadoghq.com/integrations/webhooks/#usage). url string URL of the webhook. ``` { "url": "https://example.com/webhook-updated" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegration-200-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegration-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegration-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegration-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegration-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Datadog-Webhooks integration. Expand All Field Type Description custom_headers string If `null`, uses no header. If given a JSON payload, these will be headers attached to your webhook. encode_as enum Encoding type. Can be given either `json` or `form`. Allowed enum values: `json,form` default: `json` name [_required_] string The name of the webhook. It corresponds with ``. Learn more on how to use it in [monitor notifications](https://docs.datadoghq.com/monitors/notify). payload string If `null`, uses the default payload. If given a JSON payload, the webhook returns the payload specified by the given payload. [Webhooks variable usage](https://docs.datadoghq.com/integrations/webhooks/#usage). url [_required_] string URL of the webhook. ``` { "custom_headers": "string", "encode_as": "string", "name": "WEBHOOK_NAME", "payload": "string", "url": "https://example.com/webhook" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Update a webhook returns "OK" response Copy ``` # Path parameters export webhook_name="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/${webhook_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "url": "https://example.com/webhook-updated" } EOF ``` ##### Update a webhook returns "OK" response ``` // Update a webhook returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "webhook" in the system WebhookName := os.Getenv("WEBHOOK_NAME") body := datadogV1.WebhooksIntegrationUpdateRequest{ Url: datadog.PtrString("https://example.com/webhook-updated"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.UpdateWebhooksIntegration(ctx, WebhookName, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.UpdateWebhooksIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WebhooksIntegrationApi.UpdateWebhooksIntegration`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a webhook returns "OK" response ``` // Update a webhook returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegration; import com.datadog.api.client.v1.model.WebhooksIntegrationUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); // there is a valid "webhook" in the system String WEBHOOK_NAME = System.getenv("WEBHOOK_NAME"); WebhooksIntegrationUpdateRequest body = new WebhooksIntegrationUpdateRequest().url("https://example.com/webhook-updated"); try { 
WebhooksIntegration result = apiInstance.updateWebhooksIntegration(WEBHOOK_NAME, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WebhooksIntegrationApi#updateWebhooksIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a webhook returns "OK" response ``` """ Update a webhook returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi from datadog_api_client.v1.model.webhooks_integration_update_request import WebhooksIntegrationUpdateRequest # there is a valid "webhook" in the system WEBHOOK_NAME = environ["WEBHOOK_NAME"] body = WebhooksIntegrationUpdateRequest( url="https://example.com/webhook-updated", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.update_webhooks_integration(webhook_name=WEBHOOK_NAME, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a webhook returns "OK" response ``` # Update a webhook returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new # there is a valid "webhook" in the system WEBHOOK_NAME = ENV["WEBHOOK_NAME"] body = DatadogAPIClient::V1::WebhooksIntegrationUpdateRequest.new({ url: "https://example.com/webhook-updated", }) p api_instance.update_webhooks_integration(WEBHOOK_NAME, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a webhook returns "OK" response ``` // Update a webhook returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; use datadog_api_client::datadogV1::model::WebhooksIntegrationUpdateRequest; #[tokio::main] async fn main() { // there is a valid "webhook" in the system let webhook_name = std::env::var("WEBHOOK_NAME").unwrap(); let body = WebhooksIntegrationUpdateRequest::new() .url("https://example.com/webhook-updated".to_string()); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api .update_webhooks_integration(webhook_name.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { 
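        // Added note: error responses (such as the documented 400, 403, 404, and 429 cases) take this branch and are printed via unwrap_err().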
println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a webhook returns "OK" response ``` /** * Update a webhook returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); // there is a valid "webhook" in the system const WEBHOOK_NAME = process.env.WEBHOOK_NAME as string; const params: v1.WebhooksIntegrationApiUpdateWebhooksIntegrationRequest = { body: { url: "https://example.com/webhook-updated", }, webhookName: WEBHOOK_NAME, }; apiInstance .updateWebhooksIntegration(params) .then((data: v1.WebhooksIntegration) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a webhook](https://docs.datadoghq.com/api/latest/webhooks-integration/#delete-a-webhook) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#delete-a-webhook-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/{webhook_name} ### Overview Deletes the endpoint with the name ``. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description webhook_name [_required_] string The name of the webhook. ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegration-200-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegration-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegration-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegration-429-v1) OK Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. 
``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Delete a webhook Copy ``` # Path parameters export webhook_name="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/webhooks/${webhook_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a webhook ``` """ Delete a webhook returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi # there is a valid "webhook" in the system WEBHOOK_NAME = environ["WEBHOOK_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) api_instance.delete_webhooks_integration( webhook_name=WEBHOOK_NAME, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a webhook ``` # Delete a webhook returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new # there is a valid "webhook" in the system WEBHOOK_NAME = ENV["WEBHOOK_NAME"] p api_instance.delete_webhooks_integration(WEBHOOK_NAME) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a webhook ``` // Delete a webhook returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" 
"github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "webhook" in the system WebhookName := os.Getenv("WEBHOOK_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) r, err := api.DeleteWebhooksIntegration(ctx, WebhookName) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.DeleteWebhooksIntegration`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a webhook ``` // Delete a webhook returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); // there is a valid "webhook" in the system String WEBHOOK_NAME = System.getenv("WEBHOOK_NAME"); try { apiInstance.deleteWebhooksIntegration(WEBHOOK_NAME); } catch (ApiException e) { System.err.println("Exception when calling WebhooksIntegrationApi#deleteWebhooksIntegration"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a webhook ``` // Delete a webhook returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "webhook" in the system let webhook_name = std::env::var("WEBHOOK_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api.delete_webhooks_integration(webhook_name.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a webhook ``` /** * Delete a webhook returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); // there is a valid "webhook" in the system const WEBHOOK_NAME = process.env.WEBHOOK_NAME 
as string; const params: v1.WebhooksIntegrationApiDeleteWebhooksIntegrationRequest = { webhookName: WEBHOOK_NAME, }; apiInstance .deleteWebhooksIntegration(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a custom variable](https://docs.datadoghq.com/api/latest/webhooks-integration/#create-a-custom-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#create-a-custom-variable-v1) POST https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variableshttps://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variableshttps://api.datadoghq.eu/api/v1/integration/webhooks/configuration/custom-variableshttps://api.ddog-gov.com/api/v1/integration/webhooks/configuration/custom-variableshttps://api.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variableshttps://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variableshttps://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables ### Overview Creates an endpoint with the name ``. This endpoint requires the `manage_integrations` permission. ### Request #### Body Data (required) Define a custom variable request body. * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Expand All Field Type Description is_secret [_required_] boolean Make custom variable is secret or not. If the custom variable is secret, the value is not returned in the response payload. name [_required_] string The name of the variable. It corresponds with ``. value [_required_] string Value of the custom variable. ``` { "is_secret": true, "name": "EXAMPLEWEBHOOKSINTEGRATION", "value": "CUSTOM_VARIABLE_VALUE" } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegrationCustomVariable-201-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegrationCustomVariable-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegrationCustomVariable-403-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#CreateWebhooksIntegrationCustomVariable-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Custom variable for Webhook integration. Expand All Field Type Description is_secret [_required_] boolean Make custom variable is secret or not. If the custom variable is secret, the value is not returned in the response payload. name [_required_] string The name of the variable. It corresponds with ``. It must only contains upper-case characters, integers or underscores. value string Value of the custom variable. It won't be returned if the variable is secret. 
``` { "is_secret": true, "name": "CUSTOM_VARIABLE_NAME", "value": "CUSTOM_VARIABLE_VALUE" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Create a custom variable returns "OK" response Copy ``` # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "is_secret": true, "name": "EXAMPLEWEBHOOKSINTEGRATION", "value": "CUSTOM_VARIABLE_VALUE" } EOF ``` ##### Create a custom variable returns "OK" response ``` // Create a custom variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { body := datadogV1.WebhooksIntegrationCustomVariable{ IsSecret: true, Name: "EXAMPLEWEBHOOKSINTEGRATION", Value: "CUSTOM_VARIABLE_VALUE", } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.CreateWebhooksIntegrationCustomVariable(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.CreateWebhooksIntegrationCustomVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WebhooksIntegrationApi.CreateWebhooksIntegrationCustomVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save 
the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a custom variable returns "OK" response ``` // Create a custom variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegrationCustomVariable; import com.datadog.api.client.v1.model.WebhooksIntegrationCustomVariableResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); WebhooksIntegrationCustomVariable body = new WebhooksIntegrationCustomVariable() .isSecret(true) .name("EXAMPLEWEBHOOKSINTEGRATION") .value("CUSTOM_VARIABLE_VALUE"); try { WebhooksIntegrationCustomVariableResponse result = apiInstance.createWebhooksIntegrationCustomVariable(body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling WebhooksIntegrationApi#createWebhooksIntegrationCustomVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a custom variable returns "OK" response ``` """ Create a custom variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi from datadog_api_client.v1.model.webhooks_integration_custom_variable import WebhooksIntegrationCustomVariable body = WebhooksIntegrationCustomVariable( is_secret=True, name="EXAMPLEWEBHOOKSINTEGRATION", value="CUSTOM_VARIABLE_VALUE", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.create_webhooks_integration_custom_variable(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a custom variable returns "OK" response ``` # Create a custom variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new body = DatadogAPIClient::V1::WebhooksIntegrationCustomVariable.new({ is_secret: true, name: "EXAMPLEWEBHOOKSINTEGRATION", value: "CUSTOM_VARIABLE_VALUE", }) p api_instance.create_webhooks_integration_custom_variable(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run 
following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb" ``` ##### Create a custom variable returns "OK" response ``` // Create a custom variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; use datadog_api_client::datadogV1::model::WebhooksIntegrationCustomVariable; #[tokio::main] async fn main() { let body = WebhooksIntegrationCustomVariable::new( true, "EXAMPLEWEBHOOKSINTEGRATION".to_string(), "CUSTOM_VARIABLE_VALUE".to_string(), ); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api.create_webhooks_integration_custom_variable(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a custom variable returns "OK" response ``` /** * Create a custom variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); const params: v1.WebhooksIntegrationApiCreateWebhooksIntegrationCustomVariableRequest = { body: { isSecret: true, name: "EXAMPLEWEBHOOKSINTEGRATION", value: "CUSTOM_VARIABLE_VALUE", }, }; apiInstance .createWebhooksIntegrationCustomVariable(params) .then((data: v1.WebhooksIntegrationCustomVariableResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a custom variable](https://docs.datadoghq.com/api/latest/webhooks-integration/#get-a-custom-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#get-a-custom-variable-v1) GET https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name} ### Overview Shows the content of the custom variable with the name `<custom_variable_name>`. If the custom variable is secret, its value is not returned in the response payload. This endpoint requires the `integrations_read` permission.
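As an informal sketch (not one of the official client examples below): because a secret variable's `value` is omitted from the response, code that reads a custom variable should tolerate the missing field. The snippet assumes the `datadog-api-client` Python package and a hypothetical variable named `MY_WEBHOOK_TOKEN`.

```python
"""
Sketch: read a custom variable and tolerate a missing `value`
(secret variables omit it from the response payload).
"""
from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = WebhooksIntegrationApi(api_client)
    # "MY_WEBHOOK_TOKEN" is a hypothetical variable name.
    variable = api_instance.get_webhooks_integration_custom_variable(
        custom_variable_name="MY_WEBHOOK_TOKEN",
    )

# `value` is not set on the response when `is_secret` is true.
value = getattr(variable, "value", None)
if variable.is_secret and value is None:
    print(f"{variable.name} is secret; its value is not returned by the API.")
else:
    print(f"{variable.name} = {value}")
```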
### Arguments #### Path Parameters Name Type Description custom_variable_name [_required_] string The name of the custom variable. ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegrationCustomVariable-200-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegrationCustomVariable-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegrationCustomVariable-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegrationCustomVariable-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#GetWebhooksIntegrationCustomVariable-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Custom variable for Webhook integration. Expand All Field Type Description is_secret [_required_] boolean Whether the custom variable is secret or not. If the custom variable is secret, the value is not returned in the response payload. name [_required_] string The name of the variable. It corresponds with `<CUSTOM_VARIABLE_NAME>`. It must only contain uppercase characters, integers, or underscores. value string Value of the custom variable. It won't be returned if the variable is secret. ``` { "is_secret": true, "name": "CUSTOM_VARIABLE_NAME", "value": "CUSTOM_VARIABLE_VALUE" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Get a custom variable Copy ``` # Path parameters export custom_variable_name="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/${custom_variable_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a custom variable ``` """ Get a custom variable returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.get_webhooks_integration_custom_variable( custom_variable_name="custom_variable_name", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a custom variable ``` # Get a custom variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new p api_instance.get_webhooks_integration_custom_variable("custom_variable_name") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a custom variable ``` // Get a custom variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.GetWebhooksIntegrationCustomVariable(ctx, "custom_variable_name") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.GetWebhooksIntegrationCustomVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from 
`WebhooksIntegrationApi.GetWebhooksIntegrationCustomVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a custom variable ``` // Get a custom variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegrationCustomVariableResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); try { WebhooksIntegrationCustomVariableResponse result = apiInstance.getWebhooksIntegrationCustomVariable("custom_variable_name"); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling WebhooksIntegrationApi#getWebhooksIntegrationCustomVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a custom variable ``` // Get a custom variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api .get_webhooks_integration_custom_variable("custom_variable_name".to_string()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a custom variable ``` /** * Get a custom variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); const params: v1.WebhooksIntegrationApiGetWebhooksIntegrationCustomVariableRequest = { customVariableName: "custom_variable_name", }; apiInstance .getWebhooksIntegrationCustomVariable(params) .then((data: v1.WebhooksIntegrationCustomVariableResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update a custom variable](https://docs.datadoghq.com/api/latest/webhooks-integration/#update-a-custom-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#update-a-custom-variable-v1) PUT https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name} ### Overview Updates the custom variable with the name `<custom_variable_name>`. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description custom_variable_name [_required_] string The name of the custom variable. ### Request #### Body Data (required) Update an existing custom variable request body. * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Expand All Field Type Description is_secret boolean Whether the custom variable is secret or not. If the custom variable is secret, the value is not returned in the response payload. name string The name of the variable. It corresponds with `<CUSTOM_VARIABLE_NAME>`. It must only contain uppercase characters, integers, or underscores. value string Value of the custom variable. ``` { "value": "variable-updated" } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegrationCustomVariable-200-v1) * [400](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegrationCustomVariable-400-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegrationCustomVariable-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegrationCustomVariable-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#UpdateWebhooksIntegrationCustomVariable-429-v1) OK * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Custom variable for Webhook integration. Expand All Field Type Description is_secret [_required_] boolean Whether the custom variable is secret or not. If the custom variable is secret, the value is not returned in the response payload. name [_required_] string The name of the variable. It corresponds with `<CUSTOM_VARIABLE_NAME>`. It must only contain uppercase characters, integers, or underscores. value string Value of the custom variable. It won't be returned if the variable is secret.
``` { "is_secret": true, "name": "CUSTOM_VARIABLE_NAME", "value": "CUSTOM_VARIABLE_VALUE" } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Update a custom variable returns "OK" response Copy ``` # Path parameters export custom_variable_name="CHANGE_ME" # Curl command curl -X PUT "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/${custom_variable_name}" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "value": "variable-updated" } EOF ``` ##### Update a custom variable returns "OK" response ``` // Update a custom variable returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "webhook_custom_variable" in the system WebhookCustomVariableName := os.Getenv("WEBHOOK_CUSTOM_VARIABLE_NAME") body := datadogV1.WebhooksIntegrationCustomVariableUpdateRequest{ Value: datadog.PtrString("variable-updated"), } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) resp, r, err := api.UpdateWebhooksIntegrationCustomVariable(ctx, WebhookCustomVariableName, body) if err != nil { fmt.Fprintf(os.Stderr, 
"Error when calling `WebhooksIntegrationApi.UpdateWebhooksIntegrationCustomVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WebhooksIntegrationApi.UpdateWebhooksIntegrationCustomVariable`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update a custom variable returns "OK" response ``` // Update a custom variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; import com.datadog.api.client.v1.model.WebhooksIntegrationCustomVariableResponse; import com.datadog.api.client.v1.model.WebhooksIntegrationCustomVariableUpdateRequest; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); // there is a valid "webhook_custom_variable" in the system String WEBHOOK_CUSTOM_VARIABLE_NAME = System.getenv("WEBHOOK_CUSTOM_VARIABLE_NAME"); WebhooksIntegrationCustomVariableUpdateRequest body = new WebhooksIntegrationCustomVariableUpdateRequest().value("variable-updated"); try { WebhooksIntegrationCustomVariableResponse result = apiInstance.updateWebhooksIntegrationCustomVariable(WEBHOOK_CUSTOM_VARIABLE_NAME, body); System.out.println(result); } catch (ApiException e) { System.err.println( "Exception when calling WebhooksIntegrationApi#updateWebhooksIntegrationCustomVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update a custom variable returns "OK" response ``` """ Update a custom variable returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi from datadog_api_client.v1.model.webhooks_integration_custom_variable_update_request import ( WebhooksIntegrationCustomVariableUpdateRequest, ) # there is a valid "webhook_custom_variable" in the system WEBHOOK_CUSTOM_VARIABLE_NAME = environ["WEBHOOK_CUSTOM_VARIABLE_NAME"] body = WebhooksIntegrationCustomVariableUpdateRequest( value="variable-updated", ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) response = api_instance.update_webhooks_integration_custom_variable( custom_variable_name=WEBHOOK_CUSTOM_VARIABLE_NAME, body=body ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the 
example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update a custom variable returns "OK" response ``` # Update a custom variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new # there is a valid "webhook_custom_variable" in the system WEBHOOK_CUSTOM_VARIABLE_NAME = ENV["WEBHOOK_CUSTOM_VARIABLE_NAME"] body = DatadogAPIClient::V1::WebhooksIntegrationCustomVariableUpdateRequest.new({ value: "variable-updated", }) p api_instance.update_webhooks_integration_custom_variable(WEBHOOK_CUSTOM_VARIABLE_NAME, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update a custom variable returns "OK" response ``` // Update a custom variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; use datadog_api_client::datadogV1::model::WebhooksIntegrationCustomVariableUpdateRequest; #[tokio::main] async fn main() { // there is a valid "webhook_custom_variable" in the system let webhook_custom_variable_name = std::env::var("WEBHOOK_CUSTOM_VARIABLE_NAME").unwrap(); let body = WebhooksIntegrationCustomVariableUpdateRequest::new().value("variable-updated".to_string()); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api .update_webhooks_integration_custom_variable(webhook_custom_variable_name.clone(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update a custom variable returns "OK" response ``` /** * Update a custom variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); // there is a valid "webhook_custom_variable" in the system const WEBHOOK_CUSTOM_VARIABLE_NAME = process.env .WEBHOOK_CUSTOM_VARIABLE_NAME as string; const params: v1.WebhooksIntegrationApiUpdateWebhooksIntegrationCustomVariableRequest = { body: { value: "variable-updated", }, customVariableName: WEBHOOK_CUSTOM_VARIABLE_NAME, }; apiInstance .updateWebhooksIntegrationCustomVariable(params) .then((data: v1.WebhooksIntegrationCustomVariableResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete a custom variable](https://docs.datadoghq.com/api/latest/webhooks-integration/#delete-a-custom-variable) * [v1 (latest)](https://docs.datadoghq.com/api/latest/webhooks-integration/#delete-a-custom-variable-v1) DELETE https://api.ap1.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ap2.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.eu/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.ddog-gov.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us3.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name}https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/{custom_variable_name} ### Overview Deletes the custom variable with the name `<custom_variable_name>`. This endpoint requires the `manage_integrations` permission. ### Arguments #### Path Parameters Name Type Description custom_variable_name [_required_] string The name of the custom variable. ### Response * [200](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegrationCustomVariable-200-v1) * [403](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegrationCustomVariable-403-v1) * [404](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegrationCustomVariable-404-v1) * [429](https://docs.datadoghq.com/api/latest/webhooks-integration/#DeleteWebhooksIntegrationCustomVariable-429-v1) OK Authentication error * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Item Not Found * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/webhooks-integration/) * [Example](https://docs.datadoghq.com/api/latest/webhooks-integration/) Error response object. Expand All Field Type Description errors [_required_] [string] Array of errors returned by the API.
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/webhooks-integration/?code-lang=typescript) ##### Delete a custom variable Copy ``` # Path parameters export custom_variable_name="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v1/integration/webhooks/configuration/custom-variables/${custom_variable_name}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete a custom variable ``` """ Delete a custom variable returns "OK" response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v1.api.webhooks_integration_api import WebhooksIntegrationApi # there is a valid "webhook_custom_variable" in the system WEBHOOK_CUSTOM_VARIABLE_NAME = environ["WEBHOOK_CUSTOM_VARIABLE_NAME"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WebhooksIntegrationApi(api_client) api_instance.delete_webhooks_integration_custom_variable( custom_variable_name=WEBHOOK_CUSTOM_VARIABLE_NAME, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete a custom variable ``` # Delete a custom variable returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V1::WebhooksIntegrationAPI.new # there is a valid "webhook_custom_variable" in the system WEBHOOK_CUSTOM_VARIABLE_NAME = ENV["WEBHOOK_CUSTOM_VARIABLE_NAME"] p api_instance.delete_webhooks_integration_custom_variable(WEBHOOK_CUSTOM_VARIABLE_NAME) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete a custom variable ``` // Delete a custom variable returns "OK" response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV1" ) func main() { // there is a valid "webhook_custom_variable" in the system WebhookCustomVariableName := os.Getenv("WEBHOOK_CUSTOM_VARIABLE_NAME") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV1.NewWebhooksIntegrationApi(apiClient) r, err := 
api.DeleteWebhooksIntegrationCustomVariable(ctx, WebhookCustomVariableName) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WebhooksIntegrationApi.DeleteWebhooksIntegrationCustomVariable`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete a custom variable ``` // Delete a custom variable returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v1.api.WebhooksIntegrationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WebhooksIntegrationApi apiInstance = new WebhooksIntegrationApi(defaultClient); // there is a valid "webhook_custom_variable" in the system String WEBHOOK_CUSTOM_VARIABLE_NAME = System.getenv("WEBHOOK_CUSTOM_VARIABLE_NAME"); try { apiInstance.deleteWebhooksIntegrationCustomVariable(WEBHOOK_CUSTOM_VARIABLE_NAME); } catch (ApiException e) { System.err.println( "Exception when calling WebhooksIntegrationApi#deleteWebhooksIntegrationCustomVariable"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete a custom variable ``` // Delete a custom variable returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV1::api_webhooks_integration::WebhooksIntegrationAPI; #[tokio::main] async fn main() { // there is a valid "webhook_custom_variable" in the system let webhook_custom_variable_name = std::env::var("WEBHOOK_CUSTOM_VARIABLE_NAME").unwrap(); let configuration = datadog::Configuration::new(); let api = WebhooksIntegrationAPI::with_config(configuration); let resp = api .delete_webhooks_integration_custom_variable(webhook_custom_variable_name.clone()) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete a custom variable ``` /** * Delete a custom variable returns "OK" response */ import { client, v1 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v1.WebhooksIntegrationApi(configuration); // there is a valid "webhook_custom_variable" in the system const WEBHOOK_CUSTOM_VARIABLE_NAME = process.env .WEBHOOK_CUSTOM_VARIABLE_NAME as string; const params: 
v1.WebhooksIntegrationApiDeleteWebhooksIntegrationCustomVariableRequest = { customVariableName: WEBHOOK_CUSTOM_VARIABLE_NAME, }; apiInstance .deleteWebhooksIntegrationCustomVariable(params) .then((data: any) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * --- # Source: https://docs.datadoghq.com/api/latest/workflow-automation # Workflow Automation Datadog Workflow Automation allows you to automate your end-to-end processes by connecting Datadog with the rest of your tech stack. Build workflows to auto-remediate your alerts, streamline your incident and security processes, and reduce manual toil. Workflow Automation supports over 1,000 OOTB actions, including AWS, JIRA, ServiceNow, GitHub, and OpenAI. Learn more in our Workflow Automation docs [here](https://docs.datadoghq.com/service_management/workflows/). ## [Get an existing Workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#get-an-existing-workflow) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#get-an-existing-workflow-v2) GET https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}https://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}https://api.datadoghq.eu/api/v2/workflows/{workflow_id}https://api.ddog-gov.com/api/v2/workflows/{workflow_id}https://api.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us5.datadoghq.com/api/v2/workflows/{workflow_id} ### Overview Get a workflow by ID. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_read` permission.
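As an informal sketch (not one of the official client examples below), the snippet shows how the nested `spec` object documented in the response model can be navigated once a workflow has been fetched with the Python client. It assumes the `datadog-api-client` package, a valid workflow ID exported as `WORKFLOW_DATA_ID`, and that the client exposes the fields listed below as snake_case attributes.

```python
"""
Sketch: fetch a workflow and walk its spec (name, published state, steps).
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = WorkflowAutomationApi(api_client)
    workflow = api_instance.get_workflow(workflow_id=environ["WORKFLOW_DATA_ID"])

attributes = workflow.data.attributes
# `published` is optional, so fall back to None if the field is absent.
print(f"{attributes.name} (published={getattr(attributes, 'published', None)})")

# Each step runs one action; outbound edges name the steps that run next.
for step in getattr(attributes.spec, "steps", None) or []:
    print(f"  step {step.name} -> action {step.action_id}")
```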
### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. ### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflow-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflow-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflow-403-v2) * [404](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflow-404-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflow-429-v2) Successfully got a workflow. * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) The response object after getting a workflow. Field Type Description data object Data related to the workflow. attributes [_required_] object The definition of `WorkflowDataAttributes` object. createdAt date-time When the workflow was created. description string Description of the workflow. name [_required_] string Name of the workflow. published boolean Set the workflow to published or unpublished. Workflows in an unpublished state will only be executable via manual runs. Automatic triggers such as Schedule will not execute the workflow until it is published. spec [_required_] object The spec defines what the workflow does. annotations [object] A list of annotations used in the workflow. These are like sticky notes for your workflow! display [_required_] object The definition of `AnnotationDisplay` object. bounds object The definition of `AnnotationDisplayBounds` object. height double The `bounds` `height`. width double The `bounds` `width`. x double The `bounds` `x`. y double The `bounds` `y`. id [_required_] string The `Annotation` `id`. markdownTextAnnotation [_required_] object The definition of `AnnotationMarkdownTextAnnotation` object. text string The `markdownTextAnnotation` `text`. connectionEnvs [object] A list of connections or connection groups used in the workflow. connectionGroups [object] The `ConnectionEnv` `connectionGroups`. connectionGroupId [_required_] string The `ConnectionGroup` `connectionGroupId`. label [_required_] string The `ConnectionGroup` `label`. tags [_required_] [string] The `ConnectionGroup` `tags`. connections [object] The `ConnectionEnv` `connections`. connectionId [_required_] string The `Connection` `connectionId`. label [_required_] string The `Connection` `label`. env [_required_] enum The definition of `ConnectionEnvEnv` object. Allowed enum values: `default` handle string Unique identifier used to trigger workflows automatically in Datadog. inputSchema object A list of input parameters for the workflow. These can be used as dynamic runtime values in your workflow. parameters [object] The `InputSchema` `parameters`. defaultValue The `InputSchemaParameters` `defaultValue`. description string The `InputSchemaParameters` `description`. label string The `InputSchemaParameters` `label`. name [_required_] string The `InputSchemaParameters` `name`. type [_required_] enum The definition of `InputSchemaParametersType` object.
Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` outputSchema object A list of output parameters for the workflow. parameters [object] The `OutputSchema` `parameters`. defaultValue The `OutputSchemaParameters` `defaultValue`. description string The `OutputSchemaParameters` `description`. label string The `OutputSchemaParameters` `label`. name [_required_] string The `OutputSchemaParameters` `name`. type [_required_] enum The definition of `OutputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` value The `OutputSchemaParameters` `value`. steps [object] A `Step` is a sub-component of a workflow. Each `Step` performs an action. actionId [_required_] string The unique identifier of an action. completionGate object Used to create conditions before running subsequent actions. completionCondition [_required_] object The definition of `CompletionCondition` object. operand1 [_required_] The `CompletionCondition` `operand1`. operand2 The `CompletionCondition` `operand2`. operator [_required_] enum The definition of `CompletionConditionOperator` object. Allowed enum values: `OPERATOR_EQUAL,OPERATOR_NOT_EQUAL,OPERATOR_GREATER_THAN,OPERATOR_LESS_THAN,OPERATOR_GREATER_THAN_OR_EQUAL_TO,OPERATOR_LESS_THAN_OR_EQUAL_TO,OPERATOR_CONTAINS,OPERATOR_DOES_NOT_CONTAIN,OPERATOR_IS_NULL,OPERATOR_IS_NOT_NULL,OPERATOR_IS_EMPTY,OPERATOR_IS_NOT_EMPTY` retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. connectionLabel string The unique identifier of a connection defined in the spec. display object The definition of `StepDisplay` object. bounds object The definition of `StepDisplayBounds` object. x double The `bounds` `x`. y double The `bounds` `y`. errorHandlers [object] The `Step` `errorHandlers`. fallbackStepName [_required_] string The `ErrorHandler` `fallbackStepName`. retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. name [_required_] string Name of the step. outboundEdges [object] A list of subsequent actions to run. branchName [_required_] string The `OutboundEdge` `branchName`. nextStepName [_required_] string The `OutboundEdge` `nextStepName`. parameters [object] A list of inputs for an action. name [_required_] string The `Parameter` `name`. value [_required_] The `Parameter` `value`. readinessGate object Used to merge multiple branches into a single branch. thresholdType [_required_] enum The definition of `ReadinessGateThresholdType` object. Allowed enum values: `ANY,ALL` triggers [ ] The list of triggers that activate this workflow. At least one trigger is required, and each trigger type may appear at most once. 
Option 1 object Schema for an API-based trigger. apiTrigger [_required_] object Trigger a workflow from an API request. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 2 object Schema for an App-based trigger. appTrigger [_required_] object Trigger a workflow from an App. startStepNames [string] A list of steps that run first after a trigger fires. Option 3 object Schema for a Case-based trigger. caseTrigger [_required_] object Trigger a workflow from a Case. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 4 object Schema for a Change Event-based trigger. changeEventTrigger [_required_] object Trigger a workflow from a Change Event. startStepNames [string] A list of steps that run first after a trigger fires. Option 5 object Schema for a Database Monitoring-based trigger. databaseMonitoringTrigger [_required_] object Trigger a workflow from Database Monitoring. startStepNames [string] A list of steps that run first after a trigger fires. Option 6 object Schema for a Datastore-based trigger. datastoreTrigger [_required_] object Trigger a workflow from a Datastore. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 7 object Schema for a Dashboard-based trigger. dashboardTrigger [_required_] object Trigger a workflow from a Dashboard. startStepNames [string] A list of steps that run first after a trigger fires. Option 8 object Schema for a GitHub webhook-based trigger. githubWebhookTrigger [_required_] object Trigger a workflow from a GitHub webhook. To trigger a workflow from GitHub, you must set a `webhookSecret`. In your GitHub Webhook Settings, set the Payload URL to `{base_url}/api/v2/workflows/{workflow_id}/webhook?orgId={org_id}`, select `application/json` for the content type, and we highly recommend enabling SSL verification for security. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 9 object Schema for an Incident-based trigger. incidentTrigger [_required_] object Trigger a workflow from an Incident. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`.
The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 10 object Schema for a Monitor-based trigger. monitorTrigger [_required_] object Trigger a workflow from a Monitor. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 11 object Schema for a Notebook-based trigger. notebookTrigger [_required_] object Trigger a workflow from a Notebook. startStepNames [string] A list of steps that run first after a trigger fires. Option 12 object Schema for a Schedule-based trigger. scheduleTrigger [_required_] object Trigger a workflow from a Schedule. The workflow must be published. rruleExpression [_required_] string Recurrence rule expression for scheduling. startStepNames [string] A list of steps that run first after a trigger fires. Option 13 object Schema for a Security-based trigger. securityTrigger [_required_] object Trigger a workflow from a Security Signal or Finding. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 14 object Schema for a Self Service-based trigger. selfServiceTrigger [_required_] object Trigger a workflow from Self Service. startStepNames [string] A list of steps that run first after a trigger fires. Option 15 object Schema for a Slack-based trigger. slackTrigger [_required_] object Trigger a workflow from Slack. The workflow must be published. startStepNames [string] A list of steps that run first after a trigger fires. Option 16 object Schema for a Software Catalog-based trigger. softwareCatalogTrigger [_required_] object Trigger a workflow from Software Catalog. startStepNames [string] A list of steps that run first after a trigger fires. Option 17 object Schema for a Workflow-based trigger. startStepNames [string] A list of steps that run first after a trigger fires. workflowTrigger [_required_] object Trigger a workflow from the Datadog UI. Only required if no other trigger exists. tags [string] Tags of the workflow. updatedAt date-time When the workflow was last updated. webhookSecret string If a Webhook trigger is defined on this workflow, a webhookSecret is required and should be provided here. id string The workflow identifier relationships object The definition of `WorkflowDataRelationships` object. creator object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` owner object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. 
id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` type [_required_] enum The definition of `WorkflowDataType` object. Allowed enum values: `workflows` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "description": "string", "name": "", "published": false, "spec": { "annotations": [ { "display": { "bounds": { "height": "number", "width": "number", "x": "number", "y": "number" } }, "id": "", "markdownTextAnnotation": { "text": "string" } } ], "connectionEnvs": [ { "connectionGroups": [ { "connectionGroupId": "", "label": "", "tags": [ "" ] } ], "connections": [ { "connectionId": "", "label": "" } ], "env": "default" } ], "handle": "string", "inputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING", "value": "undefined" } ] }, "steps": [ { "actionId": "", "completionGate": { "completionCondition": { "operand1": "undefined", "operand2": "undefined", "operator": "OPERATOR_EQUAL" }, "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } }, "connectionLabel": "string", "display": { "bounds": { "x": "number", "y": "number" } }, "errorHandlers": [ { "fallbackStepName": "", "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } } ], "name": "", "outboundEdges": [ { "branchName": "", "nextStepName": "" } ], "parameters": [ { "name": "", "value": "undefined" } ], "readinessGate": { "thresholdType": "ANY" } } ], "triggers": [ { "apiTrigger": { "rateLimit": { "count": "integer", "interval": "string" } }, "startStepNames": [ "" ] } ] }, "tags": [], "updatedAt": "2019-09-19T10:00:00.000Z", "webhookSecret": "string" }, "id": "string", "relationships": { "creator": { "data": { "id": "", "type": "users" } }, "owner": { "data": { "id": "", "type": "users" } } }, "type": "workflows" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. 
meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
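The 400, 403, 404, and 429 responses for this endpoint all share the error envelope described above. The snippet below is only an illustrative sketch, not part of this reference: it assumes the third-party `requests` package, reuses the endpoint path and the DD-API-KEY/DD-APPLICATION-KEY headers shown in the curl example further down, and the exponential backoff on 429 is a suggested client-side pattern rather than documented behavior. The canned payload that follows shows what the `errors` array in such a response looks like.

```python
# Illustrative only: fetch a workflow and surface the documented error envelope.
# The endpoint and the DD-API-KEY / DD-APPLICATION-KEY headers come from the curl
# example below; the backoff policy is an assumption, not documented behavior.
import os
import time

import requests

API_HOST = "https://api.datadoghq.com"  # replace with the API host for your Datadog site


def get_workflow(workflow_id: str, max_attempts: int = 3) -> dict:
    url = f"{API_HOST}/api/v2/workflows/{workflow_id}"
    headers = {
        "Accept": "application/json",
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    }
    for attempt in range(1, max_attempts + 1):
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 429 and attempt < max_attempts:
            time.sleep(2 ** attempt)  # simple exponential backoff (assumed policy)
            continue
        if resp.ok:
            return resp.json()
        # Every documented error status returns the same {"errors": [...]} envelope.
        for err in resp.json().get("errors", []):
            print(f"{resp.status_code}: {err.get('title')} - {err.get('detail')}")
        resp.raise_for_status()
    raise RuntimeError("unreachable")


# Example (hypothetical ID): get_workflow("11111111-1111-1111-1111-111111111111")
```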
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Get an existing Workflow Copy ``` # Path parameters export workflow_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/workflows/${workflow_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get an existing Workflow ``` """ Get an existing Workflow returns "Successfully got a workflow." response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi # there is a valid "workflow" in the system WORKFLOW_DATA_ID = environ["WORKFLOW_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.get_workflow( workflow_id=WORKFLOW_DATA_ID, ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get an existing Workflow ``` # Get an existing Workflow returns "Successfully got a workflow." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new # there is a valid "workflow" in the system WORKFLOW_DATA_ID = ENV["WORKFLOW_DATA_ID"] p api_instance.get_workflow(WORKFLOW_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get an existing Workflow ``` // Get an existing Workflow returns "Successfully got a workflow." 
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "workflow" in the system WorkflowDataID := os.Getenv("WORKFLOW_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.GetWorkflow(ctx, WorkflowDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.GetWorkflow`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.GetWorkflow`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get an existing Workflow ``` // Get an existing Workflow returns "Successfully got a workflow." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.GetWorkflowResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); // there is a valid "workflow" in the system String WORKFLOW_DATA_ID = System.getenv("WORKFLOW_DATA_ID"); try { GetWorkflowResponse result = apiInstance.getWorkflow(WORKFLOW_DATA_ID); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#getWorkflow"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get an existing Workflow ``` // Get an existing Workflow returns "Successfully got a workflow."
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; #[tokio::main] async fn main() { // there is a valid "workflow" in the system let workflow_data_id = std::env::var("WORKFLOW_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api.get_workflow(workflow_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get an existing Workflow ``` /** * Get an existing Workflow returns "Successfully got a workflow." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); // there is a valid "workflow" in the system const WORKFLOW_DATA_ID = process.env.WORKFLOW_DATA_ID as string; const params: v2.WorkflowAutomationApiGetWorkflowRequest = { workflowId: WORKFLOW_DATA_ID, }; apiInstance .getWorkflow(params) .then((data: v2.GetWorkflowResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Create a Workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#create-a-workflow) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#create-a-workflow-v2) POST https://api.ap1.datadoghq.com/api/v2/workflowshttps://api.ap2.datadoghq.com/api/v2/workflowshttps://api.datadoghq.eu/api/v2/workflowshttps://api.ddog-gov.com/api/v2/workflowshttps://api.datadoghq.com/api/v2/workflowshttps://api.us3.datadoghq.com/api/v2/workflowshttps://api.us5.datadoghq.com/api/v2/workflows ### Overview Create a new workflow, returning the workflow ID. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_write` permission. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Field Type Description data [_required_] object Data related to the workflow. attributes [_required_] object The definition of `WorkflowDataAttributes` object. createdAt date-time When the workflow was created. description string Description of the workflow. name [_required_] string Name of the workflow. published boolean Set the workflow to published or unpublished. Workflows in an unpublished state will only be executable via manual runs.
Automatic triggers such as Schedule will not execute the workflow until it is published. spec [_required_] object The spec defines what the workflow does. annotations [object] A list of annotations used in the workflow. These are like sticky notes for your workflow! display [_required_] object The definition of `AnnotationDisplay` object. bounds object The definition of `AnnotationDisplayBounds` object. height double The `bounds` `height`. width double The `bounds` `width`. x double The `bounds` `x`. y double The `bounds` `y`. id [_required_] string The `Annotation` `id`. markdownTextAnnotation [_required_] object The definition of `AnnotationMarkdownTextAnnotation` object. text string The `markdownTextAnnotation` `text`. connectionEnvs [object] A list of connections or connection groups used in the workflow. connectionGroups [object] The `ConnectionEnv` `connectionGroups`. connectionGroupId [_required_] string The `ConnectionGroup` `connectionGroupId`. label [_required_] string The `ConnectionGroup` `label`. tags [_required_] [string] The `ConnectionGroup` `tags`. connections [object] The `ConnectionEnv` `connections`. connectionId [_required_] string The `Connection` `connectionId`. label [_required_] string The `Connection` `label`. env [_required_] enum The definition of `ConnectionEnvEnv` object. Allowed enum values: `default` handle string Unique identifier used to trigger workflows automatically in Datadog. inputSchema object A list of input parameters for the workflow. These can be used as dynamic runtime values in your workflow. parameters [object] The `InputSchema` `parameters`. defaultValue The `InputSchemaParameters` `defaultValue`. description string The `InputSchemaParameters` `description`. label string The `InputSchemaParameters` `label`. name [_required_] string The `InputSchemaParameters` `name`. type [_required_] enum The definition of `InputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` outputSchema object A list of output parameters for the workflow. parameters [object] The `OutputSchema` `parameters`. defaultValue The `OutputSchemaParameters` `defaultValue`. description string The `OutputSchemaParameters` `description`. label string The `OutputSchemaParameters` `label`. name [_required_] string The `OutputSchemaParameters` `name`. type [_required_] enum The definition of `OutputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` value The `OutputSchemaParameters` `value`. steps [object] A `Step` is a sub-component of a workflow. Each `Step` performs an action. actionId [_required_] string The unique identifier of an action. completionGate object Used to create conditions before running subsequent actions. completionCondition [_required_] object The definition of `CompletionCondition` object. operand1 [_required_] The `CompletionCondition` `operand1`. operand2 The `CompletionCondition` `operand2`. operator [_required_] enum The definition of `CompletionConditionOperator` object. Allowed enum values: `OPERATOR_EQUAL,OPERATOR_NOT_EQUAL,OPERATOR_GREATER_THAN,OPERATOR_LESS_THAN,OPERATOR_GREATER_THAN_OR_EQUAL_TO,OPERATOR_LESS_THAN_OR_EQUAL_TO,OPERATOR_CONTAINS,OPERATOR_DOES_NOT_CONTAIN,OPERATOR_IS_NULL,OPERATOR_IS_NOT_NULL,OPERATOR_IS_EMPTY,OPERATOR_IS_NOT_EMPTY` retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. 
Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. connectionLabel string The unique identifier of a connection defined in the spec. display object The definition of `StepDisplay` object. bounds object The definition of `StepDisplayBounds` object. x double The `bounds` `x`. y double The `bounds` `y`. errorHandlers [object] The `Step` `errorHandlers`. fallbackStepName [_required_] string The `ErrorHandler` `fallbackStepName`. retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. name [_required_] string Name of the step. outboundEdges [object] A list of subsequent actions to run. branchName [_required_] string The `OutboundEdge` `branchName`. nextStepName [_required_] string The `OutboundEdge` `nextStepName`. parameters [object] A list of inputs for an action. name [_required_] string The `Parameter` `name`. value [_required_] The `Parameter` `value`. readinessGate object Used to merge multiple branches into a single branch. thresholdType [_required_] enum The definition of `ReadinessGateThresholdType` object. Allowed enum values: `ANY,ALL` triggers [ ] The list of triggers that activate this workflow. At least one trigger is required, and each trigger type may appear at most once. Option 1 object Schema for an API-based trigger. apiTrigger [_required_] object Trigger a workflow from an API request. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 2 object Schema for an App-based trigger. appTrigger [_required_] object Trigger a workflow from an App. startStepNames [string] A list of steps that run first after a trigger fires. Option 3 object Schema for a Case-based trigger. caseTrigger [_required_] object Trigger a workflow from a Case. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 4 object Schema for a Change Event-based trigger. changeEventTrigger [_required_] object Trigger a workflow from a Change Event. startStepNames [string] A list of steps that run first after a trigger fires. Option 5 object Schema for a Database Monitoring-based trigger. databaseMonitoringTrigger [_required_] object Trigger a workflow from Database Monitoring. startStepNames [string] A list of steps that run first after a trigger fires. 
Option 6 object Schema for a Datastore-based trigger. datastoreTrigger [_required_] object Trigger a workflow from a Datastore. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 7 object Schema for a Dashboard-based trigger. dashboardTrigger [_required_] object Trigger a workflow from a Dashboard. startStepNames [string] A list of steps that run first after a trigger fires. Option 8 object Schema for a GitHub webhook-based trigger. githubWebhookTrigger [_required_] object Trigger a workflow from a GitHub webhook. To trigger a workflow from GitHub, you must set a `webhookSecret`. In your GitHub Webhook Settings, set the Payload URL to "base_url"/api/v2/workflows/"workflow_id"/webhook?orgId="org_id", select application/json for the content type, and enable SSL verification, which is highly recommended for security. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 9 object Schema for an Incident-based trigger. incidentTrigger [_required_] object Trigger a workflow from an Incident. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 10 object Schema for a Monitor-based trigger. monitorTrigger [_required_] object Trigger a workflow from a Monitor. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 11 object Schema for a Notebook-based trigger. notebookTrigger [_required_] object Trigger a workflow from a Notebook. startStepNames [string] A list of steps that run first after a trigger fires. Option 12 object Schema for a Schedule-based trigger. scheduleTrigger [_required_] object Trigger a workflow from a Schedule. The workflow must be published. rruleExpression [_required_] string Recurrence rule expression for scheduling. startStepNames [string] A list of steps that run first after a trigger fires. Option 13 object Schema for a Security-based trigger. securityTrigger [_required_] object Trigger a workflow from a Security Signal or Finding. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`.
The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 14 object Schema for a Self Service-based trigger. selfServiceTrigger [_required_] object Trigger a workflow from Self Service. startStepNames [string] A list of steps that run first after a trigger fires. Option 15 object Schema for a Slack-based trigger. slackTrigger [_required_] object Trigger a workflow from Slack. The workflow must be published. startStepNames [string] A list of steps that run first after a trigger fires. Option 16 object Schema for a Software Catalog-based trigger. softwareCatalogTrigger [_required_] object Trigger a workflow from Software Catalog. startStepNames [string] A list of steps that run first after a trigger fires. Option 17 object Schema for a Workflow-based trigger. startStepNames [string] A list of steps that run first after a trigger fires. workflowTrigger [_required_] object Trigger a workflow from the Datadog UI. Only required if no other trigger exists. tags [string] Tags of the workflow. updatedAt date-time When the workflow was last updated. webhookSecret string If a Webhook trigger is defined on this workflow, a webhookSecret is required and should be provided here. id string The workflow identifier relationships object The definition of `WorkflowDataRelationships` object. creator object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` owner object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` type [_required_] enum The definition of `WorkflowDataType` object. 
Allowed enum values: `workflows` ``` { "data": { "attributes": { "description": "A sample workflow.", "name": "Example Workflow", "published": true, "spec": { "connectionEnvs": [ { "connections": [ { "connectionId": "11111111-1111-1111-1111-111111111111", "label": "INTEGRATION_DATADOG" } ], "env": "default" } ], "inputSchema": { "parameters": [ { "defaultValue": "default", "name": "input", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "name": "output", "type": "ARRAY_OBJECT", "value": "outputValue" } ] }, "steps": [ { "actionId": "com.datadoghq.dd.monitor.listMonitors", "connectionLabel": "INTEGRATION_DATADOG", "name": "Step1", "outboundEdges": [ { "branchName": "main", "nextStepName": "Step2" } ], "parameters": [ { "name": "tags", "value": "service:monitoring" } ] }, { "actionId": "com.datadoghq.core.noop", "name": "Step2" } ], "triggers": [ { "monitorTrigger": { "rateLimit": { "count": 1, "interval": "3600s" } }, "startStepNames": [ "Step1" ] }, { "startStepNames": [ "Step1" ], "githubWebhookTrigger": {} } ] }, "tags": [ "team:infra", "service:monitoring", "foo:bar" ] }, "type": "workflows" } } ``` Copy ### Response * [201](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflow-201-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflow-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflow-403-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflow-429-v2) Successfully created a workflow. * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) The response object after creating a new workflow. Field Type Description data [_required_] object Data related to the workflow. attributes [_required_] object The definition of `WorkflowDataAttributes` object. createdAt date-time When the workflow was created. description string Description of the workflow. name [_required_] string Name of the workflow. published boolean Set the workflow to published or unpublished. Workflows in an unpublished state will only be executable via manual runs. Automatic triggers such as Schedule will not execute the workflow until it is published. spec [_required_] object The spec defines what the workflow does. annotations [object] A list of annotations used in the workflow. These are like sticky notes for your workflow! display [_required_] object The definition of `AnnotationDisplay` object. bounds object The definition of `AnnotationDisplayBounds` object. height double The `bounds` `height`. width double The `bounds` `width`. x double The `bounds` `x`. y double The `bounds` `y`. id [_required_] string The `Annotation` `id`. markdownTextAnnotation [_required_] object The definition of `AnnotationMarkdownTextAnnotation` object. text string The `markdownTextAnnotation` `text`. connectionEnvs [object] A list of connections or connection groups used in the workflow. connectionGroups [object] The `ConnectionEnv` `connectionGroups`. connectionGroupId [_required_] string The `ConnectionGroup` `connectionGroupId`. label [_required_] string The `ConnectionGroup` `label`. tags [_required_] [string] The `ConnectionGroup` `tags`. connections [object] The `ConnectionEnv` `connections`. connectionId [_required_] string The `Connection` `connectionId`. label [_required_] string The `Connection` `label`. env [_required_] enum The definition of `ConnectionEnvEnv` object. 
Allowed enum values: `default` handle string Unique identifier used to trigger workflows automatically in Datadog. inputSchema object A list of input parameters for the workflow. These can be used as dynamic runtime values in your workflow. parameters [object] The `InputSchema` `parameters`. defaultValue The `InputSchemaParameters` `defaultValue`. description string The `InputSchemaParameters` `description`. label string The `InputSchemaParameters` `label`. name [_required_] string The `InputSchemaParameters` `name`. type [_required_] enum The definition of `InputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` outputSchema object A list of output parameters for the workflow. parameters [object] The `OutputSchema` `parameters`. defaultValue The `OutputSchemaParameters` `defaultValue`. description string The `OutputSchemaParameters` `description`. label string The `OutputSchemaParameters` `label`. name [_required_] string The `OutputSchemaParameters` `name`. type [_required_] enum The definition of `OutputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` value The `OutputSchemaParameters` `value`. steps [object] A `Step` is a sub-component of a workflow. Each `Step` performs an action. actionId [_required_] string The unique identifier of an action. completionGate object Used to create conditions before running subsequent actions. completionCondition [_required_] object The definition of `CompletionCondition` object. operand1 [_required_] The `CompletionCondition` `operand1`. operand2 The `CompletionCondition` `operand2`. operator [_required_] enum The definition of `CompletionConditionOperator` object. Allowed enum values: `OPERATOR_EQUAL,OPERATOR_NOT_EQUAL,OPERATOR_GREATER_THAN,OPERATOR_LESS_THAN,OPERATOR_GREATER_THAN_OR_EQUAL_TO,OPERATOR_LESS_THAN_OR_EQUAL_TO,OPERATOR_CONTAINS,OPERATOR_DOES_NOT_CONTAIN,OPERATOR_IS_NULL,OPERATOR_IS_NOT_NULL,OPERATOR_IS_EMPTY,OPERATOR_IS_NOT_EMPTY` retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. connectionLabel string The unique identifier of a connection defined in the spec. display object The definition of `StepDisplay` object. bounds object The definition of `StepDisplayBounds` object. x double The `bounds` `x`. y double The `bounds` `y`. errorHandlers [object] The `Step` `errorHandlers`. fallbackStepName [_required_] string The `ErrorHandler` `fallbackStepName`. retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. name [_required_] string Name of the step. outboundEdges [object] A list of subsequent actions to run. 
branchName [_required_] string The `OutboundEdge` `branchName`. nextStepName [_required_] string The `OutboundEdge` `nextStepName`. parameters [object] A list of inputs for an action. name [_required_] string The `Parameter` `name`. value [_required_] The `Parameter` `value`. readinessGate object Used to merge multiple branches into a single branch. thresholdType [_required_] enum The definition of `ReadinessGateThresholdType` object. Allowed enum values: `ANY,ALL` triggers [ ] The list of triggers that activate this workflow. At least one trigger is required, and each trigger type may appear at most once. Option 1 object Schema for an API-based trigger. apiTrigger [_required_] object Trigger a workflow from an API request. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 2 object Schema for an App-based trigger. appTrigger [_required_] object Trigger a workflow from an App. startStepNames [string] A list of steps that run first after a trigger fires. Option 3 object Schema for a Case-based trigger. caseTrigger [_required_] object Trigger a workflow from a Case. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 4 object Schema for a Change Event-based trigger. changeEventTrigger [_required_] object Trigger a workflow from a Change Event. startStepNames [string] A list of steps that run first after a trigger fires. Option 5 object Schema for a Database Monitoring-based trigger. databaseMonitoringTrigger [_required_] object Trigger a workflow from Database Monitoring. startStepNames [string] A list of steps that run first after a trigger fires. Option 6 object Schema for a Datastore-based trigger. datastoreTrigger [_required_] object Trigger a workflow from a Datastore. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 7 object Schema for a Dashboard-based trigger. dashboardTrigger [_required_] object Trigger a workflow from a Dashboard. startStepNames [string] A list of steps that run first after a trigger fires. Option 8 object Schema for a GitHub webhook-based trigger. githubWebhookTrigger [_required_] object Trigger a workflow from a GitHub webhook. To trigger a workflow from GitHub, you must set a `webhookSecret`. In your GitHub Webhook Settings, set the Payload URL to "base_url"/api/v2/workflows/"workflow_id"/webhook?orgId="org_id", select application/json for the content type, and enable SSL verification, which is highly recommended for security. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`.
interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 9 object Schema for an Incident-based trigger. incidentTrigger [_required_] object Trigger a workflow from an Incident. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 10 object Schema for a Monitor-based trigger. monitorTrigger [_required_] object Trigger a workflow from a Monitor. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 11 object Schema for a Notebook-based trigger. notebookTrigger [_required_] object Trigger a workflow from a Notebook. startStepNames [string] A list of steps that run first after a trigger fires. Option 12 object Schema for a Schedule-based trigger. scheduleTrigger [_required_] object Trigger a workflow from a Schedule. The workflow must be published. rruleExpression [_required_] string Recurrence rule expression for scheduling. startStepNames [string] A list of steps that run first after a trigger fires. Option 13 object Schema for a Security-based trigger. securityTrigger [_required_] object Trigger a workflow from a Security Signal or Finding. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 14 object Schema for a Self Service-based trigger. selfServiceTrigger [_required_] object Trigger a workflow from Self Service. startStepNames [string] A list of steps that run first after a trigger fires. Option 15 object Schema for a Slack-based trigger. slackTrigger [_required_] object Trigger a workflow from Slack. The workflow must be published. startStepNames [string] A list of steps that run first after a trigger fires. Option 16 object Schema for a Software Catalog-based trigger. softwareCatalogTrigger [_required_] object Trigger a workflow from Software Catalog. startStepNames [string] A list of steps that run first after a trigger fires. Option 17 object Schema for a Workflow-based trigger. startStepNames [string] A list of steps that run first after a trigger fires. workflowTrigger [_required_] object Trigger a workflow from the Datadog UI. Only required if no other trigger exists. tags [string] Tags of the workflow. updatedAt date-time When the workflow was last updated. webhookSecret string If a Webhook trigger is defined on this workflow, a webhookSecret is required and should be provided here. 
id string The workflow identifier relationships object The definition of `WorkflowDataRelationships` object. creator object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` owner object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` type [_required_] enum The definition of `WorkflowDataType` object. Allowed enum values: `workflows` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "description": "string", "name": "", "published": false, "spec": { "annotations": [ { "display": { "bounds": { "height": "number", "width": "number", "x": "number", "y": "number" } }, "id": "", "markdownTextAnnotation": { "text": "string" } } ], "connectionEnvs": [ { "connectionGroups": [ { "connectionGroupId": "", "label": "", "tags": [ "" ] } ], "connections": [ { "connectionId": "", "label": "" } ], "env": "default" } ], "handle": "string", "inputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING", "value": "undefined" } ] }, "steps": [ { "actionId": "", "completionGate": { "completionCondition": { "operand1": "undefined", "operand2": "undefined", "operator": "OPERATOR_EQUAL" }, "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } }, "connectionLabel": "string", "display": { "bounds": { "x": "number", "y": "number" } }, "errorHandlers": [ { "fallbackStepName": "", "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } } ], "name": "", "outboundEdges": [ { "branchName": "", "nextStepName": "" } ], "parameters": [ { "name": "", "value": "undefined" } ], "readinessGate": { "thresholdType": "ANY" } } ], "triggers": [ { "apiTrigger": { "rateLimit": { "count": "integer", "interval": "string" } }, "startStepNames": [ "" ] } ] }, "tags": [], "updatedAt": "2019-09-19T10:00:00.000Z", "webhookSecret": "string" }, "id": "string", "relationships": { "creator": { "data": { "id": "", "type": "users" } }, "owner": { "data": { "id": "", "type": "users" } } }, "type": "workflows" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
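The `source.pointer` value in this error envelope is a JSON pointer into the request document that was sent, so a 400 from this endpoint can usually be traced back to a specific attribute in the body. The helper below is only an illustrative sketch (it is not part of the Datadog client libraries, and it assumes standard RFC 6901 pointer semantics); it resolves a returned pointer, such as the `/data/attributes/title` shown in the example that follows, against the payload you submitted:

```python
# Illustrative helper (not part of the Datadog client libraries): resolve the
# "source.pointer" from an error response against the request body you sent.
from typing import Any


def resolve_json_pointer(document: Any, pointer: str) -> Any:
    """Follow an RFC 6901-style pointer such as "/data/attributes/name"."""
    value = document
    if not pointer:
        return value
    for token in pointer.lstrip("/").split("/"):
        token = token.replace("~1", "/").replace("~0", "~")  # unescape per RFC 6901
        value = value[int(token)] if isinstance(value, list) else value[token]
    return value


# Example with a trimmed-down request body and a hypothetical pointer:
request_body = {"data": {"attributes": {"name": "Example Workflow"}, "type": "workflows"}}
print(resolve_json_pointer(request_body, "/data/attributes/name"))  # -> "Example Workflow"
```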
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Create a Workflow returns "Successfully created a workflow." 
response ``` # Curl command (replace api.datadoghq.com with the API host for your Datadog site) curl -X POST "https://api.datadoghq.com/api/v2/workflows" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "data": { "attributes": { "description": "A sample workflow.", "name": "Example Workflow", "published": true, "spec": { "connectionEnvs": [ { "connections": [ { "connectionId": "11111111-1111-1111-1111-111111111111", "label": "INTEGRATION_DATADOG" } ], "env": "default" } ], "inputSchema": { "parameters": [ { "defaultValue": "default", "name": "input", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "name": "output", "type": "ARRAY_OBJECT", "value": "outputValue" } ] }, "steps": [ { "actionId": "com.datadoghq.dd.monitor.listMonitors", "connectionLabel": "INTEGRATION_DATADOG", "name": "Step1", "outboundEdges": [ { "branchName": "main", "nextStepName": "Step2" } ], "parameters": [ { "name": "tags", "value": "service:monitoring" } ] }, { "actionId": "com.datadoghq.core.noop", "name": "Step2" } ], "triggers": [ { "monitorTrigger": { "rateLimit": { "count": 1, "interval": "3600s" } }, "startStepNames": [ "Step1" ] }, { "startStepNames": [ "Step1" ], "githubWebhookTrigger": {} } ] }, "tags": [ "team:infra", "service:monitoring", "foo:bar" ] }, "type": "workflows" } } EOF ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` // Create a Workflow returns "Successfully created a workflow." response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.CreateWorkflowRequest{ Data: datadogV2.WorkflowData{ Attributes: datadogV2.WorkflowDataAttributes{ Description: datadog.PtrString("A sample workflow."), Name: "Example Workflow", Published: datadog.PtrBool(true), Spec: datadogV2.Spec{ ConnectionEnvs: []datadogV2.ConnectionEnv{ { Connections: []datadogV2.Connection{ { ConnectionId: "11111111-1111-1111-1111-111111111111", Label: "INTEGRATION_DATADOG", }, }, Env: datadogV2.CONNECTIONENVENV_DEFAULT, }, }, InputSchema: &datadogV2.InputSchema{ Parameters: []datadogV2.InputSchemaParameters{ { DefaultValue: "default", Name: "input", Type: datadogV2.INPUTSCHEMAPARAMETERSTYPE_STRING, }, }, }, OutputSchema: &datadogV2.OutputSchema{ Parameters: []datadogV2.OutputSchemaParameters{ { Name: "output", Type: datadogV2.OUTPUTSCHEMAPARAMETERSTYPE_ARRAY_OBJECT, Value: "outputValue", }, }, }, Steps: []datadogV2.Step{ { ActionId: "com.datadoghq.dd.monitor.listMonitors", ConnectionLabel: datadog.PtrString("INTEGRATION_DATADOG"), Name: "Step1", OutboundEdges: []datadogV2.OutboundEdge{ { BranchName: "main", NextStepName: "Step2", }, }, Parameters: []datadogV2.Parameter{ { Name: "tags", Value: "service:monitoring", }, }, }, { ActionId: "com.datadoghq.core.noop", Name: "Step2", }, }, Triggers: []datadogV2.Trigger{ datadogV2.Trigger{ MonitorTriggerWrapper: &datadogV2.MonitorTriggerWrapper{ MonitorTrigger: datadogV2.MonitorTrigger{ RateLimit: &datadogV2.TriggerRateLimit{ Count: datadog.PtrInt64(1), Interval: datadog.PtrString("3600s"), }, }, StartStepNames: []string{ "Step1", }, }}, datadogV2.Trigger{ GithubWebhookTriggerWrapper: &datadogV2.GithubWebhookTriggerWrapper{ StartStepNames: []string{ "Step1", },
GithubWebhookTrigger: datadogV2.GithubWebhookTrigger{}, }}, }, }, Tags: []string{ "team:infra", "service:monitoring", "foo:bar", }, }, Type: datadogV2.WORKFLOWDATATYPE_WORKFLOWS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.CreateWorkflow(ctx, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.CreateWorkflow`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.CreateWorkflow`:\n%s\n", responseContent) } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` // Create a Workflow returns "Successfully created a workflow." response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.Connection; import com.datadog.api.client.v2.model.ConnectionEnv; import com.datadog.api.client.v2.model.ConnectionEnvEnv; import com.datadog.api.client.v2.model.CreateWorkflowRequest; import com.datadog.api.client.v2.model.CreateWorkflowResponse; import com.datadog.api.client.v2.model.GithubWebhookTrigger; import com.datadog.api.client.v2.model.GithubWebhookTriggerWrapper; import com.datadog.api.client.v2.model.InputSchema; import com.datadog.api.client.v2.model.InputSchemaParameters; import com.datadog.api.client.v2.model.InputSchemaParametersType; import com.datadog.api.client.v2.model.MonitorTrigger; import com.datadog.api.client.v2.model.MonitorTriggerWrapper; import com.datadog.api.client.v2.model.OutboundEdge; import com.datadog.api.client.v2.model.OutputSchema; import com.datadog.api.client.v2.model.OutputSchemaParameters; import com.datadog.api.client.v2.model.OutputSchemaParametersType; import com.datadog.api.client.v2.model.Parameter; import com.datadog.api.client.v2.model.Spec; import com.datadog.api.client.v2.model.Step; import com.datadog.api.client.v2.model.Trigger; import com.datadog.api.client.v2.model.TriggerRateLimit; import com.datadog.api.client.v2.model.WorkflowData; import com.datadog.api.client.v2.model.WorkflowDataAttributes; import com.datadog.api.client.v2.model.WorkflowDataType; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); CreateWorkflowRequest body = new CreateWorkflowRequest() .data( new WorkflowData() .attributes( new WorkflowDataAttributes() .description("A sample workflow.") .name("Example Workflow") .published(true) .spec( new Spec() .connectionEnvs( Collections.singletonList( new ConnectionEnv() .connections( Collections.singletonList( new Connection() .connectionId( "11111111-1111-1111-1111-111111111111") .label("INTEGRATION_DATADOG"))) .env(ConnectionEnvEnv.DEFAULT))) .inputSchema( new InputSchema() .parameters(
Collections.singletonList( new InputSchemaParameters() .defaultValue("default") .name("input") .type(InputSchemaParametersType.STRING)))) .outputSchema( new OutputSchema() .parameters( Collections.singletonList( new OutputSchemaParameters() .name("output") .type( OutputSchemaParametersType.ARRAY_OBJECT) .value("outputValue")))) .steps( Arrays.asList( new Step() .actionId("com.datadoghq.dd.monitor.listMonitors") .connectionLabel("INTEGRATION_DATADOG") .name("Step1") .outboundEdges( Collections.singletonList( new OutboundEdge() .branchName("main") .nextStepName("Step2"))) .parameters( Collections.singletonList( new Parameter() .name("tags") .value("service:monitoring"))), new Step() .actionId("com.datadoghq.core.noop") .name("Step2"))) .triggers( Arrays.asList( new Trigger( new MonitorTriggerWrapper() .monitorTrigger( new MonitorTrigger() .rateLimit( new TriggerRateLimit() .count(1L) .interval("3600s"))) .startStepNames( Collections.singletonList("Step1"))), new Trigger( new GithubWebhookTriggerWrapper() .startStepNames( Collections.singletonList("Step1")) .githubWebhookTrigger( new GithubWebhookTrigger()))))) .tags(Arrays.asList("team:infra", "service:monitoring", "foo:bar"))) .type(WorkflowDataType.WORKFLOWS)); try { CreateWorkflowResponse result = apiInstance.createWorkflow(body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#createWorkflow"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run the following commands, setting `DD_SITE` to your Datadog site: ``` DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` """ Create a Workflow returns "Successfully created a workflow."
response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi from datadog_api_client.v2.model.connection import Connection from datadog_api_client.v2.model.connection_env import ConnectionEnv from datadog_api_client.v2.model.connection_env_env import ConnectionEnvEnv from datadog_api_client.v2.model.create_workflow_request import CreateWorkflowRequest from datadog_api_client.v2.model.github_webhook_trigger import GithubWebhookTrigger from datadog_api_client.v2.model.github_webhook_trigger_wrapper import GithubWebhookTriggerWrapper from datadog_api_client.v2.model.input_schema import InputSchema from datadog_api_client.v2.model.input_schema_parameters import InputSchemaParameters from datadog_api_client.v2.model.input_schema_parameters_type import InputSchemaParametersType from datadog_api_client.v2.model.monitor_trigger import MonitorTrigger from datadog_api_client.v2.model.monitor_trigger_wrapper import MonitorTriggerWrapper from datadog_api_client.v2.model.outbound_edge import OutboundEdge from datadog_api_client.v2.model.output_schema import OutputSchema from datadog_api_client.v2.model.output_schema_parameters import OutputSchemaParameters from datadog_api_client.v2.model.output_schema_parameters_type import OutputSchemaParametersType from datadog_api_client.v2.model.parameter import Parameter from datadog_api_client.v2.model.spec import Spec from datadog_api_client.v2.model.step import Step from datadog_api_client.v2.model.trigger_rate_limit import TriggerRateLimit from datadog_api_client.v2.model.workflow_data import WorkflowData from datadog_api_client.v2.model.workflow_data_attributes import WorkflowDataAttributes from datadog_api_client.v2.model.workflow_data_type import WorkflowDataType body = CreateWorkflowRequest( data=WorkflowData( attributes=WorkflowDataAttributes( description="A sample workflow.", name="Example Workflow", published=True, spec=Spec( connection_envs=[ ConnectionEnv( connections=[ Connection( connection_id="11111111-1111-1111-1111-111111111111", label="INTEGRATION_DATADOG", ), ], env=ConnectionEnvEnv.DEFAULT, ), ], input_schema=InputSchema( parameters=[ InputSchemaParameters( default_value="default", name="input", type=InputSchemaParametersType.STRING, ), ], ), output_schema=OutputSchema( parameters=[ OutputSchemaParameters( name="output", type=OutputSchemaParametersType.ARRAY_OBJECT, value="outputValue", ), ], ), steps=[ Step( action_id="com.datadoghq.dd.monitor.listMonitors", connection_label="INTEGRATION_DATADOG", name="Step1", outbound_edges=[ OutboundEdge( branch_name="main", next_step_name="Step2", ), ], parameters=[ Parameter( name="tags", value="service:monitoring", ), ], ), Step( action_id="com.datadoghq.core.noop", name="Step2", ), ], triggers=[ MonitorTriggerWrapper( monitor_trigger=MonitorTrigger( rate_limit=TriggerRateLimit( count=1, interval="3600s", ), ), start_step_names=[ "Step1", ], ), GithubWebhookTriggerWrapper( start_step_names=[ "Step1", ], github_webhook_trigger=GithubWebhookTrigger(), ), ], ), tags=[ "team:infra", "service:monitoring", "foo:bar", ], ), type=WorkflowDataType.WORKFLOWS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.create_workflow(body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to 
`example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` # Create a Workflow returns "Successfully created a workflow." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new body = DatadogAPIClient::V2::CreateWorkflowRequest.new({ data: DatadogAPIClient::V2::WorkflowData.new({ attributes: DatadogAPIClient::V2::WorkflowDataAttributes.new({ description: "A sample workflow.", name: "Example Workflow", published: true, spec: DatadogAPIClient::V2::Spec.new({ connection_envs: [ DatadogAPIClient::V2::ConnectionEnv.new({ connections: [ DatadogAPIClient::V2::Connection.new({ connection_id: "11111111-1111-1111-1111-111111111111", label: "INTEGRATION_DATADOG", }), ], env: DatadogAPIClient::V2::ConnectionEnvEnv::DEFAULT, }), ], input_schema: DatadogAPIClient::V2::InputSchema.new({ parameters: [ DatadogAPIClient::V2::InputSchemaParameters.new({ default_value: "default", name: "input", type: DatadogAPIClient::V2::InputSchemaParametersType::STRING, }), ], }), output_schema: DatadogAPIClient::V2::OutputSchema.new({ parameters: [ DatadogAPIClient::V2::OutputSchemaParameters.new({ name: "output", type: DatadogAPIClient::V2::OutputSchemaParametersType::ARRAY_OBJECT, value: "outputValue", }), ], }), steps: [ DatadogAPIClient::V2::Step.new({ action_id: "com.datadoghq.dd.monitor.listMonitors", connection_label: "INTEGRATION_DATADOG", name: "Step1", outbound_edges: [ DatadogAPIClient::V2::OutboundEdge.new({ branch_name: "main", next_step_name: "Step2", }), ], parameters: [ DatadogAPIClient::V2::Parameter.new({ name: "tags", value: "service:monitoring", }), ], }), DatadogAPIClient::V2::Step.new({ action_id: "com.datadoghq.core.noop", name: "Step2", }), ], triggers: [ DatadogAPIClient::V2::MonitorTriggerWrapper.new({ monitor_trigger: DatadogAPIClient::V2::MonitorTrigger.new({ rate_limit: DatadogAPIClient::V2::TriggerRateLimit.new({ count: 1, interval: "3600s", }), }), start_step_names: [ "Step1", ], }), DatadogAPIClient::V2::GithubWebhookTriggerWrapper.new({ start_step_names: [ "Step1", ], github_webhook_trigger: DatadogAPIClient::V2::GithubWebhookTrigger.new({}), }), ], }), tags: [ "team:infra", "service:monitoring", "foo:bar", ], }), type: DatadogAPIClient::V2::WorkflowDataType::WORKFLOWS, }), }) p api_instance.create_workflow(body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` // Create a Workflow returns "Successfully created a workflow." 
response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; use datadog_api_client::datadogV2::model::Connection; use datadog_api_client::datadogV2::model::ConnectionEnv; use datadog_api_client::datadogV2::model::ConnectionEnvEnv; use datadog_api_client::datadogV2::model::CreateWorkflowRequest; use datadog_api_client::datadogV2::model::GithubWebhookTrigger; use datadog_api_client::datadogV2::model::GithubWebhookTriggerWrapper; use datadog_api_client::datadogV2::model::InputSchema; use datadog_api_client::datadogV2::model::InputSchemaParameters; use datadog_api_client::datadogV2::model::InputSchemaParametersType; use datadog_api_client::datadogV2::model::MonitorTrigger; use datadog_api_client::datadogV2::model::MonitorTriggerWrapper; use datadog_api_client::datadogV2::model::OutboundEdge; use datadog_api_client::datadogV2::model::OutputSchema; use datadog_api_client::datadogV2::model::OutputSchemaParameters; use datadog_api_client::datadogV2::model::OutputSchemaParametersType; use datadog_api_client::datadogV2::model::Parameter; use datadog_api_client::datadogV2::model::Spec; use datadog_api_client::datadogV2::model::Step; use datadog_api_client::datadogV2::model::Trigger; use datadog_api_client::datadogV2::model::TriggerRateLimit; use datadog_api_client::datadogV2::model::WorkflowData; use datadog_api_client::datadogV2::model::WorkflowDataAttributes; use datadog_api_client::datadogV2::model::WorkflowDataType; use serde_json::Value; #[tokio::main] async fn main() { let body = CreateWorkflowRequest::new(WorkflowData::new( WorkflowDataAttributes::new( "Example Workflow".to_string(), Spec::new() .connection_envs(vec![ConnectionEnv::new(ConnectionEnvEnv::DEFAULT) .connections(vec![Connection::new( "11111111-1111-1111-1111-111111111111".to_string(), "INTEGRATION_DATADOG".to_string(), )])]) .input_schema(InputSchema::new().parameters(vec![ InputSchemaParameters::new( "input".to_string(), InputSchemaParametersType::STRING, ).default_value(Value::from("default")) ])) .output_schema(OutputSchema::new().parameters(vec![ OutputSchemaParameters::new( "output".to_string(), OutputSchemaParametersType::ARRAY_OBJECT, ).value(Value::from("outputValue")) ])) .steps(vec![ Step::new( "com.datadoghq.dd.monitor.listMonitors".to_string(), "Step1".to_string(), ) .connection_label("INTEGRATION_DATADOG".to_string()) .outbound_edges(vec![OutboundEdge::new( "main".to_string(), "Step2".to_string(), )]) .parameters(vec![Parameter::new( "tags".to_string(), Value::from("service:monitoring"), )]), Step::new("com.datadoghq.core.noop".to_string(), "Step2".to_string()), ]) .triggers(vec![ Trigger::MonitorTriggerWrapper(Box::new( MonitorTriggerWrapper::new( MonitorTrigger::new().rate_limit( TriggerRateLimit::new() .count(1) .interval("3600s".to_string()), ), ) .start_step_names(vec!["Step1".to_string()]), )), Trigger::GithubWebhookTriggerWrapper(Box::new( GithubWebhookTriggerWrapper::new(GithubWebhookTrigger::new()) .start_step_names(vec!["Step1".to_string()]), )), ]), ) .description("A sample workflow.".to_string()) .published(true) .tags(vec![ "team:infra".to_string(), "service:monitoring".to_string(), "foo:bar".to_string(), ]), WorkflowDataType::WORKFLOWS, )); let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api.create_workflow(body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the 
library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Create a Workflow returns "Successfully created a workflow." response ``` /** * Create a Workflow returns "Successfully created a workflow." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); const params: v2.WorkflowAutomationApiCreateWorkflowRequest = { body: { data: { attributes: { description: "A sample workflow.", name: "Example Workflow", published: true, spec: { connectionEnvs: [ { connections: [ { connectionId: "11111111-1111-1111-1111-111111111111", label: "INTEGRATION_DATADOG", }, ], env: "default", }, ], inputSchema: { parameters: [ { defaultValue: "default", name: "input", type: "STRING", }, ], }, outputSchema: { parameters: [ { name: "output", type: "ARRAY_OBJECT", value: "outputValue", }, ], }, steps: [ { actionId: "com.datadoghq.dd.monitor.listMonitors", connectionLabel: "INTEGRATION_DATADOG", name: "Step1", outboundEdges: [ { branchName: "main", nextStepName: "Step2", }, ], parameters: [ { name: "tags", value: "service:monitoring", }, ], }, { actionId: "com.datadoghq.core.noop", name: "Step2", }, ], triggers: [ { monitorTrigger: { rateLimit: { count: 1, interval: "3600s", }, }, startStepNames: ["Step1"], }, { startStepNames: ["Step1"], githubWebhookTrigger: {}, }, ], }, tags: ["team:infra", "service:monitoring", "foo:bar"], }, type: "workflows", }, }, }; apiInstance .createWorkflow(params) .then((data: v2.CreateWorkflowResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Update an existing Workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#update-an-existing-workflow) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#update-an-existing-workflow-v2) PATCH https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}https://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}https://api.datadoghq.eu/api/v2/workflows/{workflow_id}https://api.ddog-gov.com/api/v2/workflows/{workflow_id}https://api.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us5.datadoghq.com/api/v2/workflows/{workflow_id} ### Overview Update a workflow by ID. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_write` permission. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. 
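Because none of the fields inside the update attributes are individually required, a PATCH body only needs to carry the attributes you want to change. The sketch below is modelled on the Python example further down in this section; it assumes that a partial body containing only `published` is accepted (which the optional fields in the request schema below suggest), and it reads the workflow ID from a `WORKFLOW_DATA_ID` environment variable as the full examples do. Treat it as an illustration rather than a drop-in script.

```
# Minimal sketch (not a full example): unpublish an existing workflow by sending
# only the attribute being changed. Assumes a partial body is accepted, as the
# optional fields in the request schema below suggest.
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi
from datadog_api_client.v2.model.update_workflow_request import UpdateWorkflowRequest
from datadog_api_client.v2.model.workflow_data_type import WorkflowDataType
from datadog_api_client.v2.model.workflow_data_update import WorkflowDataUpdate
from datadog_api_client.v2.model.workflow_data_update_attributes import WorkflowDataUpdateAttributes

# The ID of an existing workflow, taken from the environment as in the full examples.
WORKFLOW_DATA_ID = environ["WORKFLOW_DATA_ID"]

body = UpdateWorkflowRequest(
    data=WorkflowDataUpdate(
        attributes=WorkflowDataUpdateAttributes(
            published=False,  # unpublished workflows can only be run manually
        ),
        type=WorkflowDataType.WORKFLOWS,
    ),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = WorkflowAutomationApi(api_client)
    # workflow_id is the path parameter described above.
    response = api_instance.update_workflow(workflow_id=WORKFLOW_DATA_ID, body=body)
    print(response)
```

Setting `published` to false leaves the workflow executable only through manual runs, as described for the `published` field in the request schema below.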
### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Field Type Description data [_required_] object Data related to the workflow being updated. attributes [_required_] object The definition of `WorkflowDataUpdateAttributes` object. createdAt date-time When the workflow was created. description string Description of the workflow. name string Name of the workflow. published boolean Set the workflow to published or unpublished. Workflows in an unpublished state will only be executable via manual runs. Automatic triggers such as Schedule will not execute the workflow until it is published. spec object The spec defines what the workflow does. annotations [object] A list of annotations used in the workflow. These are like sticky notes for your workflow! display [_required_] object The definition of `AnnotationDisplay` object. bounds object The definition of `AnnotationDisplayBounds` object. height double The `bounds` `height`. width double The `bounds` `width`. x double The `bounds` `x`. y double The `bounds` `y`. id [_required_] string The `Annotation` `id`. markdownTextAnnotation [_required_] object The definition of `AnnotationMarkdownTextAnnotation` object. text string The `markdownTextAnnotation` `text`. connectionEnvs [object] A list of connections or connection groups used in the workflow. connectionGroups [object] The `ConnectionEnv` `connectionGroups`. connectionGroupId [_required_] string The `ConnectionGroup` `connectionGroupId`. label [_required_] string The `ConnectionGroup` `label`. tags [_required_] [string] The `ConnectionGroup` `tags`. connections [object] The `ConnectionEnv` `connections`. connectionId [_required_] string The `Connection` `connectionId`. label [_required_] string The `Connection` `label`. env [_required_] enum The definition of `ConnectionEnvEnv` object. Allowed enum values: `default` handle string Unique identifier used to trigger workflows automatically in Datadog. inputSchema object A list of input parameters for the workflow. These can be used as dynamic runtime values in your workflow. parameters [object] The `InputSchema` `parameters`. defaultValue The `InputSchemaParameters` `defaultValue`. description string The `InputSchemaParameters` `description`. label string The `InputSchemaParameters` `label`. name [_required_] string The `InputSchemaParameters` `name`. type [_required_] enum The definition of `InputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` outputSchema object A list of output parameters for the workflow. parameters [object] The `OutputSchema` `parameters`. defaultValue The `OutputSchemaParameters` `defaultValue`. description string The `OutputSchemaParameters` `description`. label string The `OutputSchemaParameters` `label`. name [_required_] string The `OutputSchemaParameters` `name`. type [_required_] enum The definition of `OutputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` value The `OutputSchemaParameters` `value`. steps [object] A `Step` is a sub-component of a workflow. Each `Step` performs an action. actionId [_required_] string The unique identifier of an action. completionGate object Used to create conditions before running subsequent actions. completionCondition [_required_] object The definition of `CompletionCondition` object. 
operand1 [_required_] The `CompletionCondition` `operand1`. operand2 The `CompletionCondition` `operand2`. operator [_required_] enum The definition of `CompletionConditionOperator` object. Allowed enum values: `OPERATOR_EQUAL,OPERATOR_NOT_EQUAL,OPERATOR_GREATER_THAN,OPERATOR_LESS_THAN,OPERATOR_GREATER_THAN_OR_EQUAL_TO,OPERATOR_LESS_THAN_OR_EQUAL_TO,OPERATOR_CONTAINS,OPERATOR_DOES_NOT_CONTAIN,OPERATOR_IS_NULL,OPERATOR_IS_NOT_NULL,OPERATOR_IS_EMPTY,OPERATOR_IS_NOT_EMPTY` retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. connectionLabel string The unique identifier of a connection defined in the spec. display object The definition of `StepDisplay` object. bounds object The definition of `StepDisplayBounds` object. x double The `bounds` `x`. y double The `bounds` `y`. errorHandlers [object] The `Step` `errorHandlers`. fallbackStepName [_required_] string The `ErrorHandler` `fallbackStepName`. retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. name [_required_] string Name of the step. outboundEdges [object] A list of subsequent actions to run. branchName [_required_] string The `OutboundEdge` `branchName`. nextStepName [_required_] string The `OutboundEdge` `nextStepName`. parameters [object] A list of inputs for an action. name [_required_] string The `Parameter` `name`. value [_required_] The `Parameter` `value`. readinessGate object Used to merge multiple branches into a single branch. thresholdType [_required_] enum The definition of `ReadinessGateThresholdType` object. Allowed enum values: `ANY,ALL` triggers [ ] The list of triggers that activate this workflow. At least one trigger is required, and each trigger type may appear at most once. Option 1 object Schema for an API-based trigger. apiTrigger [_required_] object Trigger a workflow from an API request. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 2 object Schema for an App-based trigger. appTrigger [_required_] object Trigger a workflow from an App. startStepNames [string] A list of steps that run first after a trigger fires. Option 3 object Schema for a Case-based trigger. caseTrigger [_required_] object Trigger a workflow from a Case. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. 
The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 4 object Schema for a Change Event-based trigger. changeEventTrigger [_required_] object Trigger a workflow from a Change Event. startStepNames [string] A list of steps that run first after a trigger fires. Option 5 object Schema for a Database Monitoring-based trigger. databaseMonitoringTrigger [_required_] object Trigger a workflow from Database Monitoring. startStepNames [string] A list of steps that run first after a trigger fires. Option 6 object Schema for a Datastore-based trigger. datastoreTrigger [_required_] object Trigger a workflow from a Datastore. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 7 object Schema for a Dashboard-based trigger. dashboardTrigger [_required_] object Trigger a workflow from a Dashboard. startStepNames [string] A list of steps that run first after a trigger fires. Option 8 object Schema for a GitHub webhook-based trigger. githubWebhookTrigger [_required_] object Trigger a workflow from a GitHub webhook. To trigger a workflow from GitHub, you must set a `webhookSecret`. In your GitHub Webhook Settings, set the Payload URL to "base_url"/api/v2/workflows/"workflow_id"/webhook?orgId="org_id", select application/json for the content type, and be highly recommend enabling SSL verification for security. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 9 object Schema for an Incident-based trigger. incidentTrigger [_required_] object Trigger a workflow from an Incident. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 10 object Schema for a Monitor-based trigger. monitorTrigger [_required_] object Trigger a workflow from a Monitor. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 11 object Schema for a Notebook-based trigger. notebookTrigger [_required_] object Trigger a workflow from a Notebook. startStepNames [string] A list of steps that run first after a trigger fires. Option 12 object Schema for a Schedule-based trigger. scheduleTrigger [_required_] object Trigger a workflow from a Schedule. 
The workflow must be published. rruleExpression [_required_] string Recurrence rule expression for scheduling. startStepNames [string] A list of steps that run first after a trigger fires. Option 13 object Schema for a Security-based trigger. securityTrigger [_required_] object Trigger a workflow from a Security Signal or Finding. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 14 object Schema for a Self Service-based trigger. selfServiceTrigger [_required_] object Trigger a workflow from Self Service. startStepNames [string] A list of steps that run first after a trigger fires. Option 15 object Schema for a Slack-based trigger. slackTrigger [_required_] object Trigger a workflow from Slack. The workflow must be published. startStepNames [string] A list of steps that run first after a trigger fires. Option 16 object Schema for a Software Catalog-based trigger. softwareCatalogTrigger [_required_] object Trigger a workflow from Software Catalog. startStepNames [string] A list of steps that run first after a trigger fires. Option 17 object Schema for a Workflow-based trigger. startStepNames [string] A list of steps that run first after a trigger fires. workflowTrigger [_required_] object Trigger a workflow from the Datadog UI. Only required if no other trigger exists. tags [string] Tags of the workflow. updatedAt date-time When the workflow was last updated. webhookSecret string If a Webhook trigger is defined on this workflow, a webhookSecret is required and should be provided here. id string The workflow identifier relationships object The definition of `WorkflowDataRelationships` object. creator object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` owner object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` type [_required_] enum The definition of `WorkflowDataType` object. 
Allowed enum values: `workflows` ``` { "data": { "attributes": { "description": "A sample workflow.", "name": "Example Workflow", "published": true, "spec": { "connectionEnvs": [ { "connections": [ { "connectionId": "11111111-1111-1111-1111-111111111111", "label": "INTEGRATION_DATADOG" } ], "env": "default" } ], "inputSchema": { "parameters": [ { "defaultValue": "default", "name": "input", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "name": "output", "type": "ARRAY_OBJECT", "value": "outputValue" } ] }, "steps": [ { "actionId": "com.datadoghq.dd.monitor.listMonitors", "connectionLabel": "INTEGRATION_DATADOG", "name": "Step1", "outboundEdges": [ { "branchName": "main", "nextStepName": "Step2" } ], "parameters": [ { "name": "tags", "value": "service:monitoring" } ] }, { "actionId": "com.datadoghq.core.noop", "name": "Step2" } ], "triggers": [ { "monitorTrigger": { "rateLimit": { "count": 1, "interval": "3600s" } }, "startStepNames": [ "Step1" ] }, { "startStepNames": [ "Step1" ], "githubWebhookTrigger": {} } ] }, "tags": [ "team:infra", "service:monitoring", "foo:bar" ] }, "id": "22222222-2222-2222-2222-222222222222", "type": "workflows" } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#UpdateWorkflow-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#UpdateWorkflow-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#UpdateWorkflow-403-v2) * [404](https://docs.datadoghq.com/api/latest/workflow-automation/#UpdateWorkflow-404-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#UpdateWorkflow-429-v2) Successfully updated a workflow. * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) The response object after updating a workflow. Field Type Description data object Data related to the workflow being updated. attributes [_required_] object The definition of `WorkflowDataUpdateAttributes` object. createdAt date-time When the workflow was created. description string Description of the workflow. name string Name of the workflow. published boolean Set the workflow to published or unpublished. Workflows in an unpublished state will only be executable via manual runs. Automatic triggers such as Schedule will not execute the workflow until it is published. spec object The spec defines what the workflow does. annotations [object] A list of annotations used in the workflow. These are like sticky notes for your workflow! display [_required_] object The definition of `AnnotationDisplay` object. bounds object The definition of `AnnotationDisplayBounds` object. height double The `bounds` `height`. width double The `bounds` `width`. x double The `bounds` `x`. y double The `bounds` `y`. id [_required_] string The `Annotation` `id`. markdownTextAnnotation [_required_] object The definition of `AnnotationMarkdownTextAnnotation` object. text string The `markdownTextAnnotation` `text`. connectionEnvs [object] A list of connections or connection groups used in the workflow. connectionGroups [object] The `ConnectionEnv` `connectionGroups`. connectionGroupId [_required_] string The `ConnectionGroup` `connectionGroupId`. label [_required_] string The `ConnectionGroup` `label`. tags [_required_] [string] The `ConnectionGroup` `tags`. connections [object] The `ConnectionEnv` `connections`. connectionId [_required_] string The `Connection` `connectionId`. label [_required_] string The `Connection` `label`. 
env [_required_] enum The definition of `ConnectionEnvEnv` object. Allowed enum values: `default` handle string Unique identifier used to trigger workflows automatically in Datadog. inputSchema object A list of input parameters for the workflow. These can be used as dynamic runtime values in your workflow. parameters [object] The `InputSchema` `parameters`. defaultValue The `InputSchemaParameters` `defaultValue`. description string The `InputSchemaParameters` `description`. label string The `InputSchemaParameters` `label`. name [_required_] string The `InputSchemaParameters` `name`. type [_required_] enum The definition of `InputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` outputSchema object A list of output parameters for the workflow. parameters [object] The `OutputSchema` `parameters`. defaultValue The `OutputSchemaParameters` `defaultValue`. description string The `OutputSchemaParameters` `description`. label string The `OutputSchemaParameters` `label`. name [_required_] string The `OutputSchemaParameters` `name`. type [_required_] enum The definition of `OutputSchemaParametersType` object. Allowed enum values: `STRING,NUMBER,BOOLEAN,OBJECT,ARRAY_STRING,ARRAY_NUMBER,ARRAY_BOOLEAN,ARRAY_OBJECT` value The `OutputSchemaParameters` `value`. steps [object] A `Step` is a sub-component of a workflow. Each `Step` performs an action. actionId [_required_] string The unique identifier of an action. completionGate object Used to create conditions before running subsequent actions. completionCondition [_required_] object The definition of `CompletionCondition` object. operand1 [_required_] The `CompletionCondition` `operand1`. operand2 The `CompletionCondition` `operand2`. operator [_required_] enum The definition of `CompletionConditionOperator` object. Allowed enum values: `OPERATOR_EQUAL,OPERATOR_NOT_EQUAL,OPERATOR_GREATER_THAN,OPERATOR_LESS_THAN,OPERATOR_GREATER_THAN_OR_EQUAL_TO,OPERATOR_LESS_THAN_OR_EQUAL_TO,OPERATOR_CONTAINS,OPERATOR_DOES_NOT_CONTAIN,OPERATOR_IS_NULL,OPERATOR_IS_NOT_NULL,OPERATOR_IS_EMPTY,OPERATOR_IS_NOT_EMPTY` retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. connectionLabel string The unique identifier of a connection defined in the spec. display object The definition of `StepDisplay` object. bounds object The definition of `StepDisplayBounds` object. x double The `bounds` `x`. y double The `bounds` `y`. errorHandlers [object] The `Step` `errorHandlers`. fallbackStepName [_required_] string The `ErrorHandler` `fallbackStepName`. retryStrategy [_required_] object The definition of `RetryStrategy` object. kind [_required_] enum The definition of `RetryStrategyKind` object. Allowed enum values: `RETRY_STRATEGY_LINEAR` linear object The definition of `RetryStrategyLinear` object. interval [_required_] string The `RetryStrategyLinear` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s maxRetries [_required_] double The `RetryStrategyLinear` `maxRetries`. name [_required_] string Name of the step. 
outboundEdges [object] A list of subsequent actions to run. branchName [_required_] string The `OutboundEdge` `branchName`. nextStepName [_required_] string The `OutboundEdge` `nextStepName`. parameters [object] A list of inputs for an action. name [_required_] string The `Parameter` `name`. value [_required_] The `Parameter` `value`. readinessGate object Used to merge multiple branches into a single branch. thresholdType [_required_] enum The definition of `ReadinessGateThresholdType` object. Allowed enum values: `ANY,ALL` triggers [ ] The list of triggers that activate this workflow. At least one trigger is required, and each trigger type may appear at most once. Option 1 object Schema for an API-based trigger. apiTrigger [_required_] object Trigger a workflow from an API request. The workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 2 object Schema for an App-based trigger. appTrigger [_required_] object Trigger a workflow from an App. startStepNames [string] A list of steps that run first after a trigger fires. Option 3 object Schema for a Case-based trigger. caseTrigger [_required_] object Trigger a workflow from a Case. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 4 object Schema for a Change Event-based trigger. changeEventTrigger [_required_] object Trigger a workflow from a Change Event. startStepNames [string] A list of steps that run first after a trigger fires. Option 5 object Schema for a Database Monitoring-based trigger. databaseMonitoringTrigger [_required_] object Trigger a workflow from Database Monitoring. startStepNames [string] A list of steps that run first after a trigger fires. Option 6 object Schema for a Datastore-based trigger. datastoreTrigger [_required_] object Trigger a workflow from a Datastore. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 7 object Schema for a Dashboard-based trigger. dashboardTrigger [_required_] object Trigger a workflow from a Dashboard. startStepNames [string] A list of steps that run first after a trigger fires. Option 8 object Schema for a GitHub webhook-based trigger. githubWebhookTrigger [_required_] object Trigger a workflow from a GitHub webhook. To trigger a workflow from GitHub, you must set a `webhookSecret`. In your GitHub Webhook Settings, set the Payload URL to "base_url"/api/v2/workflows/"workflow_id"/webhook?orgId="org_id", select application/json for the content type, and be highly recommend enabling SSL verification for security. The workflow must be published. 
rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 9 object Schema for an Incident-based trigger. incidentTrigger [_required_] object Trigger a workflow from an Incident. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 10 object Schema for a Monitor-based trigger. monitorTrigger [_required_] object Trigger a workflow from a Monitor. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 11 object Schema for a Notebook-based trigger. notebookTrigger [_required_] object Trigger a workflow from a Notebook. startStepNames [string] A list of steps that run first after a trigger fires. Option 12 object Schema for a Schedule-based trigger. scheduleTrigger [_required_] object Trigger a workflow from a Schedule. The workflow must be published. rruleExpression [_required_] string Recurrence rule expression for scheduling. startStepNames [string] A list of steps that run first after a trigger fires. Option 13 object Schema for a Security-based trigger. securityTrigger [_required_] object Trigger a workflow from a Security Signal or Finding. For automatic triggering a handle must be configured and the workflow must be published. rateLimit object Defines a rate limit for a trigger. count int64 The `TriggerRateLimit` `count`. interval string The `TriggerRateLimit` `interval`. The expected format is the number of seconds ending with an s. For example, 1 day is 86400s startStepNames [string] A list of steps that run first after a trigger fires. Option 14 object Schema for a Self Service-based trigger. selfServiceTrigger [_required_] object Trigger a workflow from Self Service. startStepNames [string] A list of steps that run first after a trigger fires. Option 15 object Schema for a Slack-based trigger. slackTrigger [_required_] object Trigger a workflow from Slack. The workflow must be published. startStepNames [string] A list of steps that run first after a trigger fires. Option 16 object Schema for a Software Catalog-based trigger. softwareCatalogTrigger [_required_] object Trigger a workflow from Software Catalog. startStepNames [string] A list of steps that run first after a trigger fires. Option 17 object Schema for a Workflow-based trigger. startStepNames [string] A list of steps that run first after a trigger fires. workflowTrigger [_required_] object Trigger a workflow from the Datadog UI. Only required if no other trigger exists. tags [string] Tags of the workflow. updatedAt date-time When the workflow was last updated. 
webhookSecret string If a Webhook trigger is defined on this workflow, a webhookSecret is required and should be provided here. id string The workflow identifier relationships object The definition of `WorkflowDataRelationships` object. creator object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` owner object The definition of `WorkflowUserRelationship` object. data object The definition of `WorkflowUserRelationshipData` object. id [_required_] string The user identifier type [_required_] enum The definition of `WorkflowUserRelationshipType` object. Allowed enum values: `users` type [_required_] enum The definition of `WorkflowDataType` object. Allowed enum values: `workflows` ``` { "data": { "attributes": { "createdAt": "2019-09-19T10:00:00.000Z", "description": "string", "name": "string", "published": false, "spec": { "annotations": [ { "display": { "bounds": { "height": "number", "width": "number", "x": "number", "y": "number" } }, "id": "", "markdownTextAnnotation": { "text": "string" } } ], "connectionEnvs": [ { "connectionGroups": [ { "connectionGroupId": "", "label": "", "tags": [ "" ] } ], "connections": [ { "connectionId": "", "label": "" } ], "env": "default" } ], "handle": "string", "inputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "defaultValue": "undefined", "description": "string", "label": "string", "name": "", "type": "STRING", "value": "undefined" } ] }, "steps": [ { "actionId": "", "completionGate": { "completionCondition": { "operand1": "undefined", "operand2": "undefined", "operator": "OPERATOR_EQUAL" }, "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } }, "connectionLabel": "string", "display": { "bounds": { "x": "number", "y": "number" } }, "errorHandlers": [ { "fallbackStepName": "", "retryStrategy": { "kind": "RETRY_STRATEGY_LINEAR", "linear": { "interval": "", "maxRetries": 0 } } } ], "name": "", "outboundEdges": [ { "branchName": "", "nextStepName": "" } ], "parameters": [ { "name": "", "value": "undefined" } ], "readinessGate": { "thresholdType": "ANY" } } ], "triggers": [ { "apiTrigger": { "rateLimit": { "count": "integer", "interval": "string" } }, "startStepNames": [ "" ] } ] }, "tags": [], "updatedAt": "2019-09-19T10:00:00.000Z", "webhookSecret": "string" }, "id": "string", "relationships": { "creator": { "data": { "id": "", "type": "users" } }, "owner": { "data": { "id": "", "type": "users" } } }, "type": "workflows" } } ``` Copy Bad request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. 
status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. 
``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Update an existing Workflow returns "Successfully updated a workflow." response Copy

```
# Path parameters
export workflow_id="CHANGE_ME"

# Curl command
# Use the API host for your Datadog site: api.datadoghq.com, api.us3.datadoghq.com,
# api.us5.datadoghq.com, api.datadoghq.eu, api.ap1.datadoghq.com, api.ap2.datadoghq.com, or api.ddog-gov.com
curl -X PATCH "https://api.datadoghq.com/api/v2/workflows/${workflow_id}" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \
  -d @- << EOF
{ "data": { "attributes": { "description": "A sample workflow.", "name": "Example Workflow", "published": true, "spec": { "connectionEnvs": [ { "connections": [ { "connectionId": "11111111-1111-1111-1111-111111111111", "label": "INTEGRATION_DATADOG" } ], "env": "default" } ], "inputSchema": { "parameters": [ { "defaultValue": "default", "name": "input", "type": "STRING" } ] }, "outputSchema": { "parameters": [ { "name": "output", "type": "ARRAY_OBJECT", "value": "outputValue" } ] }, "steps": [ { "actionId": "com.datadoghq.dd.monitor.listMonitors", "connectionLabel": "INTEGRATION_DATADOG", "name": "Step1", "outboundEdges": [ { "branchName": "main", "nextStepName": "Step2" } ], "parameters": [ { "name": "tags", "value": "service:monitoring" } ] }, { "actionId": "com.datadoghq.core.noop", "name": "Step2" } ], "triggers": [ { "monitorTrigger": { "rateLimit": { "count": 1, "interval": "3600s" } }, "startStepNames": [ "Step1" ] }, { "startStepNames": [ "Step1" ], "githubWebhookTrigger": {} } ] }, "tags": [ "team:infra", "service:monitoring", "foo:bar" ] }, "id": "22222222-2222-2222-2222-222222222222", "type": "workflows" } }
EOF
```

##### Update an existing Workflow returns "Successfully updated a workflow." response ``` // Update an existing Workflow returns "Successfully updated a workflow."
response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "workflow" in the system WorkflowDataID := os.Getenv("WORKFLOW_DATA_ID") body := datadogV2.UpdateWorkflowRequest{ Data: datadogV2.WorkflowDataUpdate{ Attributes: datadogV2.WorkflowDataUpdateAttributes{ Description: datadog.PtrString("A sample workflow."), Name: datadog.PtrString("Example Workflow"), Published: datadog.PtrBool(true), Spec: &datadogV2.Spec{ ConnectionEnvs: []datadogV2.ConnectionEnv{ { Connections: []datadogV2.Connection{ { ConnectionId: "11111111-1111-1111-1111-111111111111", Label: "INTEGRATION_DATADOG", }, }, Env: datadogV2.CONNECTIONENVENV_DEFAULT, }, }, InputSchema: &datadogV2.InputSchema{ Parameters: []datadogV2.InputSchemaParameters{ { DefaultValue: "default", Name: "input", Type: datadogV2.INPUTSCHEMAPARAMETERSTYPE_STRING, }, }, }, OutputSchema: &datadogV2.OutputSchema{ Parameters: []datadogV2.OutputSchemaParameters{ { Name: "output", Type: datadogV2.OUTPUTSCHEMAPARAMETERSTYPE_ARRAY_OBJECT, Value: "outputValue", }, }, }, Steps: []datadogV2.Step{ { ActionId: "com.datadoghq.dd.monitor.listMonitors", ConnectionLabel: datadog.PtrString("INTEGRATION_DATADOG"), Name: "Step1", OutboundEdges: []datadogV2.OutboundEdge{ { BranchName: "main", NextStepName: "Step2", }, }, Parameters: []datadogV2.Parameter{ { Name: "tags", Value: "service:monitoring", }, }, }, { ActionId: "com.datadoghq.core.noop", Name: "Step2", }, }, Triggers: []datadogV2.Trigger{ datadogV2.Trigger{ MonitorTriggerWrapper: &datadogV2.MonitorTriggerWrapper{ MonitorTrigger: datadogV2.MonitorTrigger{ RateLimit: &datadogV2.TriggerRateLimit{ Count: datadog.PtrInt64(1), Interval: datadog.PtrString("3600s"), }, }, StartStepNames: []string{ "Step1", }, }}, datadogV2.Trigger{ GithubWebhookTriggerWrapper: &datadogV2.GithubWebhookTriggerWrapper{ StartStepNames: []string{ "Step1", }, GithubWebhookTrigger: datadogV2.GithubWebhookTrigger{}, }}, }, }, Tags: []string{ "team:infra", "service:monitoring", "foo:bar", }, }, Id: datadog.PtrString("22222222-2222-2222-2222-222222222222"), Type: datadogV2.WORKFLOWDATATYPE_WORKFLOWS, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.UpdateWorkflow(ctx, WorkflowDataID, body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.UpdateWorkflow`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.UpdateWorkflow`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Update an existing Workflow returns "Successfully updated a workflow." response ``` // Update an existing Workflow returns "Successfully updated a workflow." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.Connection; import com.datadog.api.client.v2.model.ConnectionEnv; import com.datadog.api.client.v2.model.ConnectionEnvEnv; import com.datadog.api.client.v2.model.GithubWebhookTrigger; import com.datadog.api.client.v2.model.GithubWebhookTriggerWrapper; import com.datadog.api.client.v2.model.InputSchema; import com.datadog.api.client.v2.model.InputSchemaParameters; import com.datadog.api.client.v2.model.InputSchemaParametersType; import com.datadog.api.client.v2.model.MonitorTrigger; import com.datadog.api.client.v2.model.MonitorTriggerWrapper; import com.datadog.api.client.v2.model.OutboundEdge; import com.datadog.api.client.v2.model.OutputSchema; import com.datadog.api.client.v2.model.OutputSchemaParameters; import com.datadog.api.client.v2.model.OutputSchemaParametersType; import com.datadog.api.client.v2.model.Parameter; import com.datadog.api.client.v2.model.Spec; import com.datadog.api.client.v2.model.Step; import com.datadog.api.client.v2.model.Trigger; import com.datadog.api.client.v2.model.TriggerRateLimit; import com.datadog.api.client.v2.model.UpdateWorkflowRequest; import com.datadog.api.client.v2.model.UpdateWorkflowResponse; import com.datadog.api.client.v2.model.WorkflowDataType; import com.datadog.api.client.v2.model.WorkflowDataUpdate; import com.datadog.api.client.v2.model.WorkflowDataUpdateAttributes; import java.util.Arrays; import java.util.Collections; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); // there is a valid "workflow" in the system String WORKFLOW_DATA_ID = System.getenv("WORKFLOW_DATA_ID"); UpdateWorkflowRequest body = new UpdateWorkflowRequest() .data( new WorkflowDataUpdate() .attributes( new WorkflowDataUpdateAttributes() .description("A sample workflow.") .name("Example Workflow") .published(true) .spec( new Spec() .connectionEnvs( Collections.singletonList( new ConnectionEnv() .connections( Collections.singletonList( new Connection() .connectionId( "11111111-1111-1111-1111-111111111111") .label("INTEGRATION_DATADOG"))) .env(ConnectionEnvEnv.DEFAULT))) .inputSchema( new InputSchema() .parameters( Collections.singletonList( new InputSchemaParameters() .defaultValue("default") .name("input") .type(InputSchemaParametersType.STRING)))) .outputSchema( new OutputSchema() .parameters( Collections.singletonList( new OutputSchemaParameters() .name("output") .type( OutputSchemaParametersType.ARRAY_OBJECT) .value("outputValue")))) .steps( Arrays.asList( new Step() .actionId("com.datadoghq.dd.monitor.listMonitors") .connectionLabel("INTEGRATION_DATADOG") .name("Step1") .outboundEdges( Collections.singletonList( new OutboundEdge() .branchName("main") .nextStepName("Step2"))) .parameters( Collections.singletonList( new Parameter() .name("tags") .value("service:monitoring"))), new Step() .actionId("com.datadoghq.core.noop") .name("Step2"))) .triggers( Arrays.asList( new Trigger( new MonitorTriggerWrapper() .monitorTrigger( new MonitorTrigger() .rateLimit( new TriggerRateLimit() .count(1L) .interval("3600s"))) .startStepNames( Collections.singletonList("Step1"))), new Trigger( new GithubWebhookTriggerWrapper() .startStepNames( Collections.singletonList("Step1")) .githubWebhookTrigger( new GithubWebhookTrigger()))))) 
.tags(Arrays.asList("team:infra", "service:monitoring", "foo:bar"))) .id("22222222-2222-2222-2222-222222222222") .type(WorkflowDataType.WORKFLOWS)); try { UpdateWorkflowResponse result = apiInstance.updateWorkflow(WORKFLOW_DATA_ID, body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#updateWorkflow"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Update an existing Workflow returns "Successfully updated a workflow." response ``` """ Update an existing Workflow returns "Successfully updated a workflow." response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi from datadog_api_client.v2.model.connection import Connection from datadog_api_client.v2.model.connection_env import ConnectionEnv from datadog_api_client.v2.model.connection_env_env import ConnectionEnvEnv from datadog_api_client.v2.model.github_webhook_trigger import GithubWebhookTrigger from datadog_api_client.v2.model.github_webhook_trigger_wrapper import GithubWebhookTriggerWrapper from datadog_api_client.v2.model.input_schema import InputSchema from datadog_api_client.v2.model.input_schema_parameters import InputSchemaParameters from datadog_api_client.v2.model.input_schema_parameters_type import InputSchemaParametersType from datadog_api_client.v2.model.monitor_trigger import MonitorTrigger from datadog_api_client.v2.model.monitor_trigger_wrapper import MonitorTriggerWrapper from datadog_api_client.v2.model.outbound_edge import OutboundEdge from datadog_api_client.v2.model.output_schema import OutputSchema from datadog_api_client.v2.model.output_schema_parameters import OutputSchemaParameters from datadog_api_client.v2.model.output_schema_parameters_type import OutputSchemaParametersType from datadog_api_client.v2.model.parameter import Parameter from datadog_api_client.v2.model.spec import Spec from datadog_api_client.v2.model.step import Step from datadog_api_client.v2.model.trigger_rate_limit import TriggerRateLimit from datadog_api_client.v2.model.update_workflow_request import UpdateWorkflowRequest from datadog_api_client.v2.model.workflow_data_type import WorkflowDataType from datadog_api_client.v2.model.workflow_data_update import WorkflowDataUpdate from datadog_api_client.v2.model.workflow_data_update_attributes import WorkflowDataUpdateAttributes # there is a valid "workflow" in the system WORKFLOW_DATA_ID = environ["WORKFLOW_DATA_ID"] body = UpdateWorkflowRequest( data=WorkflowDataUpdate( attributes=WorkflowDataUpdateAttributes( description="A sample workflow.", name="Example Workflow", published=True, spec=Spec( connection_envs=[ ConnectionEnv( connections=[ Connection( connection_id="11111111-1111-1111-1111-111111111111", label="INTEGRATION_DATADOG", ), ], env=ConnectionEnvEnv.DEFAULT, ), ], input_schema=InputSchema( parameters=[ InputSchemaParameters( default_value="default", name="input", 
type=InputSchemaParametersType.STRING, ), ], ), output_schema=OutputSchema( parameters=[ OutputSchemaParameters( name="output", type=OutputSchemaParametersType.ARRAY_OBJECT, value="outputValue", ), ], ), steps=[ Step( action_id="com.datadoghq.dd.monitor.listMonitors", connection_label="INTEGRATION_DATADOG", name="Step1", outbound_edges=[ OutboundEdge( branch_name="main", next_step_name="Step2", ), ], parameters=[ Parameter( name="tags", value="service:monitoring", ), ], ), Step( action_id="com.datadoghq.core.noop", name="Step2", ), ], triggers=[ MonitorTriggerWrapper( monitor_trigger=MonitorTrigger( rate_limit=TriggerRateLimit( count=1, interval="3600s", ), ), start_step_names=[ "Step1", ], ), GithubWebhookTriggerWrapper( start_step_names=[ "Step1", ], github_webhook_trigger=GithubWebhookTrigger(), ), ], ), tags=[ "team:infra", "service:monitoring", "foo:bar", ], ), id="22222222-2222-2222-2222-222222222222", type=WorkflowDataType.WORKFLOWS, ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.update_workflow(workflow_id=WORKFLOW_DATA_ID, body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Update an existing Workflow returns "Successfully updated a workflow." response ``` # Update an existing Workflow returns "Successfully updated a workflow." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new # there is a valid "workflow" in the system WORKFLOW_DATA_ID = ENV["WORKFLOW_DATA_ID"] body = DatadogAPIClient::V2::UpdateWorkflowRequest.new({ data: DatadogAPIClient::V2::WorkflowDataUpdate.new({ attributes: DatadogAPIClient::V2::WorkflowDataUpdateAttributes.new({ description: "A sample workflow.", name: "Example Workflow", published: true, spec: DatadogAPIClient::V2::Spec.new({ connection_envs: [ DatadogAPIClient::V2::ConnectionEnv.new({ connections: [ DatadogAPIClient::V2::Connection.new({ connection_id: "11111111-1111-1111-1111-111111111111", label: "INTEGRATION_DATADOG", }), ], env: DatadogAPIClient::V2::ConnectionEnvEnv::DEFAULT, }), ], input_schema: DatadogAPIClient::V2::InputSchema.new({ parameters: [ DatadogAPIClient::V2::InputSchemaParameters.new({ default_value: "default", name: "input", type: DatadogAPIClient::V2::InputSchemaParametersType::STRING, }), ], }), output_schema: DatadogAPIClient::V2::OutputSchema.new({ parameters: [ DatadogAPIClient::V2::OutputSchemaParameters.new({ name: "output", type: DatadogAPIClient::V2::OutputSchemaParametersType::ARRAY_OBJECT, value: "outputValue", }), ], }), steps: [ DatadogAPIClient::V2::Step.new({ action_id: "com.datadoghq.dd.monitor.listMonitors", connection_label: "INTEGRATION_DATADOG", name: "Step1", outbound_edges: [ DatadogAPIClient::V2::OutboundEdge.new({ branch_name: "main", next_step_name: "Step2", }), ], parameters: [ DatadogAPIClient::V2::Parameter.new({ name: "tags", value: "service:monitoring", }), ], }), DatadogAPIClient::V2::Step.new({ action_id: "com.datadoghq.core.noop", name: "Step2", }), ], triggers: [ DatadogAPIClient::V2::MonitorTriggerWrapper.new({ monitor_trigger: DatadogAPIClient::V2::MonitorTrigger.new({ rate_limit: 
DatadogAPIClient::V2::TriggerRateLimit.new({ count: 1, interval: "3600s", }), }), start_step_names: [ "Step1", ], }), DatadogAPIClient::V2::GithubWebhookTriggerWrapper.new({ start_step_names: [ "Step1", ], github_webhook_trigger: DatadogAPIClient::V2::GithubWebhookTrigger.new({}), }), ], }), tags: [ "team:infra", "service:monitoring", "foo:bar", ], }), id: "22222222-2222-2222-2222-222222222222", type: DatadogAPIClient::V2::WorkflowDataType::WORKFLOWS, }), }) p api_instance.update_workflow(WORKFLOW_DATA_ID, body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Update an existing Workflow returns "Successfully updated a workflow." response ``` // Update an existing Workflow returns "Successfully updated a workflow." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; use datadog_api_client::datadogV2::model::Connection; use datadog_api_client::datadogV2::model::ConnectionEnv; use datadog_api_client::datadogV2::model::ConnectionEnvEnv; use datadog_api_client::datadogV2::model::GithubWebhookTrigger; use datadog_api_client::datadogV2::model::GithubWebhookTriggerWrapper; use datadog_api_client::datadogV2::model::InputSchema; use datadog_api_client::datadogV2::model::InputSchemaParameters; use datadog_api_client::datadogV2::model::InputSchemaParametersType; use datadog_api_client::datadogV2::model::MonitorTrigger; use datadog_api_client::datadogV2::model::MonitorTriggerWrapper; use datadog_api_client::datadogV2::model::OutboundEdge; use datadog_api_client::datadogV2::model::OutputSchema; use datadog_api_client::datadogV2::model::OutputSchemaParameters; use datadog_api_client::datadogV2::model::OutputSchemaParametersType; use datadog_api_client::datadogV2::model::Parameter; use datadog_api_client::datadogV2::model::Spec; use datadog_api_client::datadogV2::model::Step; use datadog_api_client::datadogV2::model::Trigger; use datadog_api_client::datadogV2::model::TriggerRateLimit; use datadog_api_client::datadogV2::model::UpdateWorkflowRequest; use datadog_api_client::datadogV2::model::WorkflowDataType; use datadog_api_client::datadogV2::model::WorkflowDataUpdate; use datadog_api_client::datadogV2::model::WorkflowDataUpdateAttributes; use serde_json::Value; #[tokio::main] async fn main() { // there is a valid "workflow" in the system let workflow_data_id = std::env::var("WORKFLOW_DATA_ID").unwrap(); let body = UpdateWorkflowRequest::new( WorkflowDataUpdate::new( WorkflowDataUpdateAttributes::new() .description("A sample workflow.".to_string()) .name("Example Workflow".to_string()) .published(true) .spec( Spec::new() .connection_envs(vec![ConnectionEnv::new(ConnectionEnvEnv::DEFAULT) .connections(vec![Connection::new( "11111111-1111-1111-1111-111111111111".to_string(), "INTEGRATION_DATADOG".to_string(), )])]) .input_schema(InputSchema::new().parameters(vec![ InputSchemaParameters::new( "input".to_string(), InputSchemaParametersType::STRING, ).default_value(Value::from("default")) ])) .output_schema(OutputSchema::new().parameters(vec![ OutputSchemaParameters::new( "output".to_string(), OutputSchemaParametersType::ARRAY_OBJECT, ).value(Value::from("outputValue")) ])) .steps(vec![ Step::new( 
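// Step1 runs the Datadog "list monitors" action, then branches to Step2 (a no-op).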
"com.datadoghq.dd.monitor.listMonitors".to_string(), "Step1".to_string(), ) .connection_label("INTEGRATION_DATADOG".to_string()) .outbound_edges(vec![OutboundEdge::new( "main".to_string(), "Step2".to_string(), )]) .parameters(vec![Parameter::new( "tags".to_string(), Value::from("service:monitoring"), )]), Step::new("com.datadoghq.core.noop".to_string(), "Step2".to_string()), ]) .triggers(vec![ Trigger::MonitorTriggerWrapper(Box::new( MonitorTriggerWrapper::new( MonitorTrigger::new().rate_limit( TriggerRateLimit::new() .count(1) .interval("3600s".to_string()), ), ) .start_step_names(vec!["Step1".to_string()]), )), Trigger::GithubWebhookTriggerWrapper(Box::new( GithubWebhookTriggerWrapper::new(GithubWebhookTrigger::new()) .start_step_names(vec!["Step1".to_string()]), )), ]), ) .tags(vec![ "team:infra".to_string(), "service:monitoring".to_string(), "foo:bar".to_string(), ]), WorkflowDataType::WORKFLOWS, ) .id("22222222-2222-2222-2222-222222222222".to_string()), ); let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api.update_workflow(workflow_data_id.clone(), body).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Update an existing Workflow returns "Successfully updated a workflow." response ``` /** * Update an existing Workflow returns "Successfully updated a workflow." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); // there is a valid "workflow" in the system const WORKFLOW_DATA_ID = process.env.WORKFLOW_DATA_ID as string; const params: v2.WorkflowAutomationApiUpdateWorkflowRequest = { body: { data: { attributes: { description: "A sample workflow.", name: "Example Workflow", published: true, spec: { connectionEnvs: [ { connections: [ { connectionId: "11111111-1111-1111-1111-111111111111", label: "INTEGRATION_DATADOG", }, ], env: "default", }, ], inputSchema: { parameters: [ { defaultValue: "default", name: "input", type: "STRING", }, ], }, outputSchema: { parameters: [ { name: "output", type: "ARRAY_OBJECT", value: "outputValue", }, ], }, steps: [ { actionId: "com.datadoghq.dd.monitor.listMonitors", connectionLabel: "INTEGRATION_DATADOG", name: "Step1", outboundEdges: [ { branchName: "main", nextStepName: "Step2", }, ], parameters: [ { name: "tags", value: "service:monitoring", }, ], }, { actionId: "com.datadoghq.core.noop", name: "Step2", }, ], triggers: [ { monitorTrigger: { rateLimit: { count: 1, interval: "3600s", }, }, startStepNames: ["Step1"], }, { startStepNames: ["Step1"], githubWebhookTrigger: {}, }, ], }, tags: ["team:infra", "service:monitoring", "foo:bar"], }, id: "22222222-2222-2222-2222-222222222222", type: "workflows", }, }, workflowId: WORKFLOW_DATA_ID, }; apiInstance .updateWorkflow(params) .then((data: v2.UpdateWorkflowResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Delete an existing Workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#delete-an-existing-workflow) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#delete-an-existing-workflow-v2) DELETE https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}https://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}https://api.datadoghq.eu/api/v2/workflows/{workflow_id}https://api.ddog-gov.com/api/v2/workflows/{workflow_id}https://api.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}https://api.us5.datadoghq.com/api/v2/workflows/{workflow_id} ### Overview Delete a workflow by ID. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_write` permission. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. ### Response * [204](https://docs.datadoghq.com/api/latest/workflow-automation/#DeleteWorkflow-204-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#DeleteWorkflow-403-v2) * [404](https://docs.datadoghq.com/api/latest/workflow-automation/#DeleteWorkflow-404-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#DeleteWorkflow-429-v2) Successfully deleted a workflow. Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Not found * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. 
header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Field Type Description errors [_required_] [object] A list of errors. detail string A human-readable explanation specific to this occurrence of the error. meta object Non-standard meta-information about the error source object References to the source of the error. header string A string indicating the name of a single request header which caused the error. parameter string A string indicating which URI query parameter caused the error. pointer string A JSON pointer to the value in the request document that caused the error. status string Status code of the response. title string Short human-readable summary of the error. ``` { "errors": [ { "detail": "Missing required attribute in body", "meta": {}, "source": { "header": "Authorization", "parameter": "limit", "pointer": "/data/attributes/title" }, "status": "400", "title": "Bad Request" } ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Delete an existing Workflow Copy ``` # Path parameters export workflow_id="CHANGE_ME" # Curl command curl -X DELETE "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/workflows/${workflow_id}" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Delete an existing Workflow ``` """ Delete an existing Workflow returns "Successfully deleted a workflow." 
response """ from os import environ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi # there is a valid "workflow" in the system WORKFLOW_DATA_ID = environ["WORKFLOW_DATA_ID"] configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) api_instance.delete_workflow( workflow_id=WORKFLOW_DATA_ID, ) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Delete an existing Workflow ``` # Delete an existing Workflow returns "Successfully deleted a workflow." response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new # there is a valid "workflow" in the system WORKFLOW_DATA_ID = ENV["WORKFLOW_DATA_ID"] api_instance.delete_workflow(WORKFLOW_DATA_ID) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Delete an existing Workflow ``` // Delete an existing Workflow returns "Successfully deleted a workflow." response package main import ( "context" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { // there is a valid "workflow" in the system WorkflowDataID := os.Getenv("WORKFLOW_DATA_ID") ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) r, err := api.DeleteWorkflow(ctx, WorkflowDataID) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.DeleteWorkflow`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Delete an existing Workflow ``` // Delete an existing Workflow returns "Successfully deleted a workflow." 
response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); // there is a valid "workflow" in the system String WORKFLOW_DATA_ID = System.getenv("WORKFLOW_DATA_ID"); try { apiInstance.deleteWorkflow(WORKFLOW_DATA_ID); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#deleteWorkflow"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Delete an existing Workflow ``` // Delete an existing Workflow returns "Successfully deleted a workflow." response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; #[tokio::main] async fn main() { // there is a valid "workflow" in the system let workflow_data_id = std::env::var("WORKFLOW_DATA_ID").unwrap(); let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api.delete_workflow(workflow_data_id.clone()).await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Delete an existing Workflow ``` /** * Delete an existing Workflow returns "Successfully deleted a workflow." response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); // there is a valid "workflow" in the system const WORKFLOW_DATA_ID = process.env.WORKFLOW_DATA_ID as string; const params: v2.WorkflowAutomationApiDeleteWorkflowRequest = { workflowId: WORKFLOW_DATA_ID, }; apiInstance .deleteWorkflow(params) .then((data: any) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [List workflow instances](https://docs.datadoghq.com/api/latest/workflow-automation/#list-workflow-instances) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#list-workflow-instances-v2) GET https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.datadoghq.eu/api/v2/workflows/{workflow_id}/instanceshttps://api.ddog-gov.com/api/v2/workflows/{workflow_id}/instanceshttps://api.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.us5.datadoghq.com/api/v2/workflows/{workflow_id}/instances ### Overview List all instances of a given workflow. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_read` permission. OAuth apps require the `workflows_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#workflow-automation) to access this endpoint. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. #### Query Strings Name Type Description page[size] integer Size for a given page. The maximum allowed value is 100. page[number] integer Specific page number to return. ### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#ListWorkflowInstances-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#ListWorkflowInstances-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#ListWorkflowInstances-403-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#ListWorkflowInstances-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Response returned when listing workflow instances. Expand All Field Type Description ``` { "data": [ { "id": "string" } ], "meta": { "page": { "totalCount": "integer" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. 
Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### List workflow instances Copy ``` # Path parameters export workflow_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/workflows/${workflow_id}/instances" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### List workflow instances ``` """ List workflow instances returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.list_workflow_instances( workflow_id="ccf73164-1998-4785-a7a3-8d06c7e5f558", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### List workflow instances ``` # List workflow instances returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new p api_instance.list_workflow_instances("ccf73164-1998-4785-a7a3-8d06c7e5f558") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### List workflow instances ``` // List workflow instances returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.ListWorkflowInstances(ctx, "ccf73164-1998-4785-a7a3-8d06c7e5f558", *datadogV2.NewListWorkflowInstancesOptionalParameters()) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.ListWorkflowInstances`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, 
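// Pretty-print the JSON response returned by ListWorkflowInstances.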
"Response from `WorkflowAutomationApi.ListWorkflowInstances`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### List workflow instances ``` // List workflow instances returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.WorkflowListInstancesResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); try { WorkflowListInstancesResponse result = apiInstance.listWorkflowInstances("ccf73164-1998-4785-a7a3-8d06c7e5f558"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#listWorkflowInstances"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### List workflow instances ``` // List workflow instances returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::ListWorkflowInstancesOptionalParams; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api .list_workflow_instances( "ccf73164-1998-4785-a7a3-8d06c7e5f558".to_string(), ListWorkflowInstancesOptionalParams::default(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### List workflow instances ``` /** * List workflow instances returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); const params: v2.WorkflowAutomationApiListWorkflowInstancesRequest = { workflowId: "ccf73164-1998-4785-a7a3-8d06c7e5f558", }; apiInstance .listWorkflowInstances(params) .then((data: v2.WorkflowListInstancesResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Execute a workflow](https://docs.datadoghq.com/api/latest/workflow-automation/#execute-a-workflow) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#execute-a-workflow-v2) POST https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.datadoghq.eu/api/v2/workflows/{workflow_id}/instanceshttps://api.ddog-gov.com/api/v2/workflows/{workflow_id}/instanceshttps://api.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}/instanceshttps://api.us5.datadoghq.com/api/v2/workflows/{workflow_id}/instances ### Overview Execute the given workflow. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_run` permission. OAuth apps require the `workflows_run` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#workflow-automation) to access this endpoint. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. ### Request #### Body Data (required) * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Field Type Description meta object Additional information for creating a workflow instance. payload object The input parameters to the workflow. ``` { "meta": { "payload": { "input": "value" } } } ``` Copy ### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflowInstance-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflowInstance-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflowInstance-403-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#CreateWorkflowInstance-429-v2) Created * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Response returned upon successful workflow instance creation. Expand All Field Type Description ``` { "data": { "id": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Execute a workflow returns "Created" response Copy ``` # Path parameters export workflow_id="CHANGE_ME" # Curl command curl -X POST "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/workflows/${workflow_id}/instances" \ -H "Accept: application/json" \ -H "Content-Type: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" \ -d @- << EOF { "meta": { "payload": { "input": "value" } } } EOF ``` ##### Execute a workflow returns "Created" response ``` // Execute a workflow returns "Created" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { body := datadogV2.WorkflowInstanceCreateRequest{ Meta: &datadogV2.WorkflowInstanceCreateMeta{ Payload: map[string]interface{}{ "input": "value", }, }, } ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.CreateWorkflowInstance(ctx, "ccf73164-1998-4785-a7a3-8d06c7e5f558", body) if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.CreateWorkflowInstance`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, "", " ") fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.CreateWorkflowInstance`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Execute a workflow returns "Created" response ``` // Execute a workflow returns "Created" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.WorkflowInstanceCreateMeta; import com.datadog.api.client.v2.model.WorkflowInstanceCreateRequest; import com.datadog.api.client.v2.model.WorkflowInstanceCreateResponse; import java.util.Map; public class Example { public static void main(String[] 
args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); WorkflowInstanceCreateRequest body = new WorkflowInstanceCreateRequest() .meta( new WorkflowInstanceCreateMeta() .payload(Map.ofEntries(Map.entry("input", "value")))); try { WorkflowInstanceCreateResponse result = apiInstance.createWorkflowInstance("ccf73164-1998-4785-a7a3-8d06c7e5f558", body); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#createWorkflowInstance"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Execute a workflow returns "Created" response ``` """ Execute a workflow returns "Created" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi from datadog_api_client.v2.model.workflow_instance_create_meta import WorkflowInstanceCreateMeta from datadog_api_client.v2.model.workflow_instance_create_request import WorkflowInstanceCreateRequest body = WorkflowInstanceCreateRequest( meta=WorkflowInstanceCreateMeta( payload=dict([("input", "value")]), ), ) configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.create_workflow_instance(workflow_id="ccf73164-1998-4785-a7a3-8d06c7e5f558", body=body) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Execute a workflow returns "Created" response ``` # Execute a workflow returns "Created" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new body = DatadogAPIClient::V2::WorkflowInstanceCreateRequest.new({ meta: DatadogAPIClient::V2::WorkflowInstanceCreateMeta.new({ payload: { "input": "value", }, }), }) p api_instance.create_workflow_instance("ccf73164-1998-4785-a7a3-8d06c7e5f558", body) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Execute a workflow returns "Created" response ``` // Execute a workflow returns "Created" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; use datadog_api_client::datadogV2::model::WorkflowInstanceCreateMeta; use datadog_api_client::datadogV2::model::WorkflowInstanceCreateRequest; use 
serde_json::Value; use std::collections::BTreeMap; #[tokio::main] async fn main() { let body = WorkflowInstanceCreateRequest::new().meta(WorkflowInstanceCreateMeta::new().payload( BTreeMap::from([("input".to_string(), Value::from("value"))]), )); let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api .create_workflow_instance("ccf73164-1998-4785-a7a3-8d06c7e5f558".to_string(), body) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Execute a workflow returns "Created" response ``` /** * Execute a workflow returns "Created" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); const params: v2.WorkflowAutomationApiCreateWorkflowInstanceRequest = { body: { meta: { payload: { input: "value", }, }, }, workflowId: "ccf73164-1998-4785-a7a3-8d06c7e5f558", }; apiInstance .createWorkflowInstance(params) .then((data: v2.WorkflowInstanceCreateResponse) => { console.log( "API called successfully. Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Get a workflow instance](https://docs.datadoghq.com/api/latest/workflow-automation/#get-a-workflow-instance) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#get-a-workflow-instance-v2) GET https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.datadoghq.eu/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.ddog-gov.com/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}https://api.us5.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id} ### Overview Get a specific execution of a given workflow. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_read` permission. OAuth apps require the `workflows_read` authorization [scope](https://docs.datadoghq.com/api/latest/scopes/#workflow-automation) to access this endpoint. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. instance_id [_required_] string The ID of the workflow instance. 
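The `instance_id` path parameter is the identifier returned when you execute the workflow (the `data.id` field of the Execute a workflow 200 response). As a minimal sketch, assuming the Python client exposes that field as `response.data.id` and that `WORKFLOW_DATA_ID` refers to an existing, published workflow, the two endpoints can be chained to start an execution and then read back its state:

```
"""
Sketch: execute a workflow, then fetch the state of that execution.
Assumes WORKFLOW_DATA_ID refers to an existing, published workflow.
"""
from os import environ

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi
from datadog_api_client.v2.model.workflow_instance_create_meta import WorkflowInstanceCreateMeta
from datadog_api_client.v2.model.workflow_instance_create_request import WorkflowInstanceCreateRequest

WORKFLOW_DATA_ID = environ["WORKFLOW_DATA_ID"]

# Input parameters for the run; keys should match the workflow's input schema.
body = WorkflowInstanceCreateRequest(
    meta=WorkflowInstanceCreateMeta(payload={"input": "value"}),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = WorkflowAutomationApi(api_client)
    # Start a new instance (requires the workflows_run permission).
    created = api_instance.create_workflow_instance(workflow_id=WORKFLOW_DATA_ID, body=body)
    instance_id = created.data.id  # assumed attribute path, mirroring the response model above
    # Read back the state of that specific execution (requires workflows_read).
    instance = api_instance.get_workflow_instance(workflow_id=WORKFLOW_DATA_ID, instance_id=instance_id)
    print(instance)
```

The same `workflow_id`/`instance_id` pair is what the Cancel a workflow instance endpoint below expects.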
### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflowInstance-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflowInstance-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflowInstance-403-v2) * [404](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflowInstance-404-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#GetWorkflowInstance-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) The state of the given workflow instance. Expand All Field Type Description ``` { "data": { "attributes": { "id": "string" } } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Too many requests * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. 
``` { "errors": [ "Bad Request" ] } ``` Copy ### Code Example * [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl) * [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python) * [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby) * [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go) * [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java) * [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust) * [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript) ##### Get a workflow instance Copy ``` # Path parameters export workflow_id="CHANGE_ME" export instance_id="CHANGE_ME" # Curl command curl -X GET "https://api.ap1.datadoghq.com"https://api.ap2.datadoghq.com"https://api.datadoghq.eu"https://api.ddog-gov.com"https://api.datadoghq.com"https://api.us3.datadoghq.com"https://api.us5.datadoghq.com/api/v2/workflows/${workflow_id}/instances/${instance_id}" \ -H "Accept: application/json" \ -H "DD-API-KEY: ${DD_API_KEY}" \ -H "DD-APPLICATION-KEY: ${DD_APP_KEY}" ``` ##### Get a workflow instance ``` """ Get a workflow instance returns "OK" response """ from datadog_api_client import ApiClient, Configuration from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi configuration = Configuration() with ApiClient(configuration) as api_client: api_instance = WorkflowAutomationApi(api_client) response = api_instance.get_workflow_instance( workflow_id="ccf73164-1998-4785-a7a3-8d06c7e5f558", instance_id="305a472b-71ab-4ce8-8f8d-75db635627b5", ) print(response) ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python) and then save the example to `example.py` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py" ``` ##### Get a workflow instance ``` # Get a workflow instance returns "OK" response require "datadog_api_client" api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new p api_instance.get_workflow_instance("ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5") ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby) and then save the example to `example.rb` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" rb "example.rb" ``` ##### Get a workflow instance ``` // Get a workflow instance returns "OK" response package main import ( "context" "encoding/json" "fmt" "os" "github.com/DataDog/datadog-api-client-go/v2/api/datadog" "github.com/DataDog/datadog-api-client-go/v2/api/datadogV2" ) func main() { ctx := datadog.NewDefaultContext(context.Background()) configuration := datadog.NewConfiguration() apiClient := datadog.NewAPIClient(configuration) api := datadogV2.NewWorkflowAutomationApi(apiClient) resp, r, err := api.GetWorkflowInstance(ctx, "ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5") if err != nil { fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.GetWorkflowInstance`: %v\n", err) fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r) } responseContent, _ := json.MarshalIndent(resp, 
"", " ") fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.GetWorkflowInstance`:\n%s\n", responseContent) } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go) and then save the example to `main.go` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go" ``` ##### Get a workflow instance ``` // Get a workflow instance returns "OK" response import com.datadog.api.client.ApiClient; import com.datadog.api.client.ApiException; import com.datadog.api.client.v2.api.WorkflowAutomationApi; import com.datadog.api.client.v2.model.WorklflowGetInstanceResponse; public class Example { public static void main(String[] args) { ApiClient defaultClient = ApiClient.getDefaultApiClient(); WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient); try { WorklflowGetInstanceResponse result = apiInstance.getWorkflowInstance( "ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5"); System.out.println(result); } catch (ApiException e) { System.err.println("Exception when calling WorkflowAutomationApi#getWorkflowInstance"); System.err.println("Status code: " + e.getCode()); System.err.println("Reason: " + e.getResponseBody()); System.err.println("Response headers: " + e.getResponseHeaders()); e.printStackTrace(); } } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java) and then save the example to `Example.java` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java" ``` ##### Get a workflow instance ``` // Get a workflow instance returns "OK" response use datadog_api_client::datadog; use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI; #[tokio::main] async fn main() { let configuration = datadog::Configuration::new(); let api = WorkflowAutomationAPI::with_config(configuration); let resp = api .get_workflow_instance( "ccf73164-1998-4785-a7a3-8d06c7e5f558".to_string(), "305a472b-71ab-4ce8-8f8d-75db635627b5".to_string(), ) .await; if let Ok(value) = resp { println!("{:#?}", value); } else { println!("{:#?}", resp.unwrap_err()); } } ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust) and then save the example to `src/main.rs` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" cargo run ``` ##### Get a workflow instance ``` /** * Get a workflow instance returns "OK" response */ import { client, v2 } from "@datadog/datadog-api-client"; const configuration = client.createConfiguration(); const apiInstance = new v2.WorkflowAutomationApi(configuration); const params: v2.WorkflowAutomationApiGetWorkflowInstanceRequest = { workflowId: "ccf73164-1998-4785-a7a3-8d06c7e5f558", instanceId: "305a472b-71ab-4ce8-8f8d-75db635627b5", }; apiInstance .getWorkflowInstance(params) .then((data: v2.WorklflowGetInstanceResponse) => { console.log( "API called successfully. 
Returned data: " + JSON.stringify(data) ); }) .catch((error: any) => console.error(error)); ``` Copy #### Instructions First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript) and then save the example to `example.ts` and run following commands: ``` DD_SITE="datadoghq.comus3.datadoghq.comus5.datadoghq.comdatadoghq.euap1.datadoghq.comap2.datadoghq.comddog-gov.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts" ``` * * * ## [Cancel a workflow instance](https://docs.datadoghq.com/api/latest/workflow-automation/#cancel-a-workflow-instance) * [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#cancel-a-workflow-instance-v2) PUT https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.datadoghq.eu/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.ddog-gov.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancelhttps://api.us5.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel ### Overview Cancels a specific execution of a given workflow. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key). Alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_run` permission. ### Arguments #### Path Parameters Name Type Description workflow_id [_required_] string The ID of the workflow. instance_id [_required_] string The ID of the workflow instance. ### Response * [200](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-200-v2) * [400](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-400-v2) * [403](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-403-v2) * [404](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-404-v2) * [429](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-429-v2) OK * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) Information about the canceled instance. Field Type Description data object Data about the canceled instance. id string The id of the canceled instance ``` { "data": { "id": "string" } } ``` Copy Bad Request * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Forbidden * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. Expand All Field Type Description errors [_required_] [string] A list of errors. ``` { "errors": [ "Bad Request" ] } ``` Copy Not Found * [Model](https://docs.datadoghq.com/api/latest/workflow-automation/) * [Example](https://docs.datadoghq.com/api/latest/workflow-automation/) API error response. 
## [Cancel a workflow instance](https://docs.datadoghq.com/api/latest/workflow-automation/#cancel-a-workflow-instance)

* [v2 (latest)](https://docs.datadoghq.com/api/latest/workflow-automation/#cancel-a-workflow-instance-v2)

PUT https://api.ap1.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.ap2.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.datadoghq.eu/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.ddog-gov.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.us3.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel
https://api.us5.datadoghq.com/api/v2/workflows/{workflow_id}/instances/{instance_id}/cancel

### Overview

Cancels a specific execution of a given workflow. This API requires a [registered application key](https://docs.datadoghq.com/api/latest/action-connection/#register-a-new-app-key); alternatively, you can configure these permissions [in the UI](https://docs.datadoghq.com/account_management/api-app-keys/#actions-api-access). This endpoint requires the `workflows_run` permission.

### Arguments

#### Path Parameters

| Name | Type | Description |
| --- | --- | --- |
| workflow_id [_required_] | string | The ID of the workflow. |
| instance_id [_required_] | string | The ID of the workflow instance. |

### Response

* [200](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-200-v2)
* [400](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-400-v2)
* [403](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-403-v2)
* [404](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-404-v2)
* [429](https://docs.datadoghq.com/api/latest/workflow-automation/#CancelWorkflowInstance-429-v2)

OK

* [Model](https://docs.datadoghq.com/api/latest/workflow-automation/)
* [Example](https://docs.datadoghq.com/api/latest/workflow-automation/)

Information about the canceled instance.

| Field | Type | Description |
| --- | --- | --- |
| data | object | Data about the canceled instance. |
| data.id | string | The ID of the canceled instance. |

```
{
  "data": {
    "id": "string"
  }
}
```

Bad Request

* [Model](https://docs.datadoghq.com/api/latest/workflow-automation/)
* [Example](https://docs.datadoghq.com/api/latest/workflow-automation/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Forbidden

* [Model](https://docs.datadoghq.com/api/latest/workflow-automation/)
* [Example](https://docs.datadoghq.com/api/latest/workflow-automation/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Not Found

* [Model](https://docs.datadoghq.com/api/latest/workflow-automation/)
* [Example](https://docs.datadoghq.com/api/latest/workflow-automation/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```

Too many requests

* [Model](https://docs.datadoghq.com/api/latest/workflow-automation/)
* [Example](https://docs.datadoghq.com/api/latest/workflow-automation/)

API error response.

| Field | Type | Description |
| --- | --- | --- |
| errors [_required_] | [string] | A list of errors. |

```
{
  "errors": [
    "Bad Request"
  ]
}
```
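A 429 response above means the request was rate limited. The sketch below shows one way to retry a cancellation with a short backoff using the TypeScript client from the code examples that follow; the `code` property read off the caught error is an assumption about the client's error shape, and the attempt count and delay are arbitrary.

```
import { client, v2 } from "@datadog/datadog-api-client";

const apiInstance = new v2.WorkflowAutomationApi(client.createConfiguration());

// Retry a cancel request a few times when the API answers 429 (Too many requests).
async function cancelWithRetry(
  workflowId: string,
  instanceId: string,
  attempts = 3
): Promise<v2.WorklflowCancelInstanceResponse> {
  for (let i = 0; i < attempts; i++) {
    try {
      return await apiInstance.cancelWorkflowInstance({ workflowId, instanceId });
    } catch (error: any) {
      // Reading the HTTP status from `error.code` is an assumption about the
      // client's error object; adjust to how your client version reports it.
      if (error?.code !== 429 || i === attempts - 1) throw error;
      await new Promise((resolve) => setTimeout(resolve, 1000 * (i + 1))); // simple linear backoff
    }
  }
  throw new Error("unreachable");
}

cancelWithRetry(
  "ccf73164-1998-4785-a7a3-8d06c7e5f558",
  "305a472b-71ab-4ce8-8f8d-75db635627b5"
).then((data) => console.log(JSON.stringify(data)));
```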
### Code Example

* [Curl](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=curl)
* [Python](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=python)
* [Ruby](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=ruby)
* [Go](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=go)
* [Java](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=java)
* [Rust](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=rust)
* [Typescript](https://docs.datadoghq.com/api/latest/workflow-automation/?code-lang=typescript)

##### Cancel a workflow instance

```
# Path parameters
export workflow_id="CHANGE_ME"
export instance_id="CHANGE_ME"

# Curl command (replace api.datadoghq.com with the endpoint for your Datadog site)
curl -X PUT "https://api.datadoghq.com/api/v2/workflows/${workflow_id}/instances/${instance_id}/cancel" \
  -H "Accept: application/json" \
  -H "DD-API-KEY: ${DD_API_KEY}" \
  -H "DD-APPLICATION-KEY: ${DD_APP_KEY}"
```

##### Cancel a workflow instance

```
"""
Cancel a workflow instance returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.workflow_automation_api import WorkflowAutomationApi

configuration = Configuration()
with ApiClient(configuration) as api_client:
    api_instance = WorkflowAutomationApi(api_client)
    response = api_instance.cancel_workflow_instance(
        workflow_id="ccf73164-1998-4785-a7a3-8d06c7e5f558",
        instance_id="305a472b-71ab-4ce8-8f8d-75db635627b5",
    )

    print(response)
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=python), then save the example to `example.py` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" python3 "example.py"
```

##### Cancel a workflow instance

```
# Cancel a workflow instance returns "OK" response

require "datadog_api_client"

api_instance = DatadogAPIClient::V2::WorkflowAutomationAPI.new
p api_instance.cancel_workflow_instance("ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5")
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=ruby), then save the example to `example.rb` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" ruby "example.rb"
```

##### Cancel a workflow instance

```
// Cancel a workflow instance returns "OK" response

package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/DataDog/datadog-api-client-go/v2/api/datadog"
	"github.com/DataDog/datadog-api-client-go/v2/api/datadogV2"
)

func main() {
	ctx := datadog.NewDefaultContext(context.Background())
	configuration := datadog.NewConfiguration()
	apiClient := datadog.NewAPIClient(configuration)
	api := datadogV2.NewWorkflowAutomationApi(apiClient)
	resp, r, err := api.CancelWorkflowInstance(ctx, "ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5")

	if err != nil {
		fmt.Fprintf(os.Stderr, "Error when calling `WorkflowAutomationApi.CancelWorkflowInstance`: %v\n", err)
		fmt.Fprintf(os.Stderr, "Full HTTP response: %v\n", r)
	}

	responseContent, _ := json.MarshalIndent(resp, "", " ")
	fmt.Fprintf(os.Stdout, "Response from `WorkflowAutomationApi.CancelWorkflowInstance`:\n%s\n", responseContent)
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=go), then save the example to `main.go` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" go run "main.go"
```

##### Cancel a workflow instance

```
// Cancel a workflow instance returns "OK" response

import com.datadog.api.client.ApiClient;
import com.datadog.api.client.ApiException;
import com.datadog.api.client.v2.api.WorkflowAutomationApi;
import com.datadog.api.client.v2.model.WorklflowCancelInstanceResponse;

public class Example {
  public static void main(String[] args) {
    ApiClient defaultClient = ApiClient.getDefaultApiClient();
    WorkflowAutomationApi apiInstance = new WorkflowAutomationApi(defaultClient);

    try {
      WorklflowCancelInstanceResponse result =
          apiInstance.cancelWorkflowInstance(
              "ccf73164-1998-4785-a7a3-8d06c7e5f558", "305a472b-71ab-4ce8-8f8d-75db635627b5");
      System.out.println(result);
    } catch (ApiException e) {
      System.err.println("Exception when calling WorkflowAutomationApi#cancelWorkflowInstance");
      System.err.println("Status code: " + e.getCode());
      System.err.println("Reason: " + e.getResponseBody());
      System.err.println("Response headers: " + e.getResponseHeaders());
      e.printStackTrace();
    }
  }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=java), then save the example to `Example.java` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" java "Example.java"
```

##### Cancel a workflow instance

```
// Cancel a workflow instance returns "OK" response
use datadog_api_client::datadog;
use datadog_api_client::datadogV2::api_workflow_automation::WorkflowAutomationAPI;

#[tokio::main]
async fn main() {
    let configuration = datadog::Configuration::new();
    let api = WorkflowAutomationAPI::with_config(configuration);
    let resp = api
        .cancel_workflow_instance(
            "ccf73164-1998-4785-a7a3-8d06c7e5f558".to_string(),
            "305a472b-71ab-4ce8-8f8d-75db635627b5".to_string(),
        )
        .await;
    if let Ok(value) = resp {
        println!("{:#?}", value);
    } else {
        println!("{:#?}", resp.unwrap_err());
    }
}
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=rust), then save the example to `src/main.rs` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" cargo run
```
##### Cancel a workflow instance

```
/**
 * Cancel a workflow instance returns "OK" response
 */
import { client, v2 } from "@datadog/datadog-api-client";

const configuration = client.createConfiguration();
const apiInstance = new v2.WorkflowAutomationApi(configuration);

const params: v2.WorkflowAutomationApiCancelWorkflowInstanceRequest = {
  workflowId: "ccf73164-1998-4785-a7a3-8d06c7e5f558",
  instanceId: "305a472b-71ab-4ce8-8f8d-75db635627b5",
};

apiInstance
  .cancelWorkflowInstance(params)
  .then((data: v2.WorklflowCancelInstanceResponse) => {
    console.log("API called successfully. Returned data: " + JSON.stringify(data));
  })
  .catch((error: any) => console.error(error));
```

#### Instructions

First [install the library and its dependencies](https://docs.datadoghq.com/api/latest/?code-lang=typescript), then save the example to `example.ts` and run the following commands:

```
DD_SITE="datadoghq.com" DD_API_KEY="" DD_APP_KEY="" tsc "example.ts"
```

* * *
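Putting the two endpoints on this page together, the following sketch fetches a workflow instance and then cancels it with the TypeScript client. The IDs are the sample values used throughout the examples above, and the `data.id` access on the cancel response assumes the TypeScript model mirrors the 200 response JSON documented earlier.

```
import { client, v2 } from "@datadog/datadog-api-client";

const apiInstance = new v2.WorkflowAutomationApi(client.createConfiguration());

// Sample IDs reused from the examples on this page.
const ids = {
  workflowId: "ccf73164-1998-4785-a7a3-8d06c7e5f558",
  instanceId: "305a472b-71ab-4ce8-8f8d-75db635627b5",
};

async function main(): Promise<void> {
  // Look the instance up first, for example to log its current state
  // before deciding to cancel it.
  const instance: v2.WorklflowGetInstanceResponse = await apiInstance.getWorkflowInstance(ids);
  console.log("Instance: " + JSON.stringify(instance));

  // Cancel the same execution. Per the 200 response model above, the
  // payload carries the ID of the canceled instance under `data.id`
  // (property names assumed to mirror that JSON).
  const canceled: v2.WorklflowCancelInstanceResponse = await apiInstance.cancelWorkflowInstance(ids);
  console.log("Canceled instance: " + JSON.stringify(canceled.data?.id));
}

main().catch((error: any) => console.error(error));
```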